
Do TIME-COST trade-offs have any relevance in planning practice?

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Dear all,

Probably you are all aware of the fact that modern project management is tied to the development of PERT and CPM. I guess you also know that the original CPM method was a sophisticated cost optimization method, which could optimize the so-called direct costs for a given project duration.

Question 1: Why did we forget this technique? Was it due to:

  • computational problems
  • inadequate knowledge at the user's side
  • or the tremendous effort that we should put into the preparation phase

Question 2: Would you use it if this feature were included in your application?

Thanks in advance

Miklos

P.S. I'm still looking for applications that can handle maximal relationships :)) (so far I have found two)

Replies

Rafael Davila
User offline. Last seen 1 week 3 days ago. Offline
Joined: 1 Mar 2004
Posts: 5241

It is a mess!!!

For modeling even a simple min-max relationship, if you create a wrong link it will invalidate your schedule; it is too difficult to be practical.

Well, I guess it goes with my point: maybe it can be modeled using traditional CPM, but it is not practical.

Perhaps something less ambitious can be of practical use, like creating a combined min-max link that becomes a strict link when min = max, with only one allowed per activity; a workaround for resource leveling would have to be developed.

In Spider we can toggle each link on and off, so some can be left defined but inactive and activated when needed, though because of the limitation of only one such link per activity, only once the previous one has been deactivated. Perhaps conditional scheduling can be applied here.

Best regards,

Rafael

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Rafael,

Huh, it took some time even for me, but I think it works from this point of view.

You are right, it is error prone, and besides it is not easy to explain to an average client what this is all about.

Miklós

Rafael Davila
User offline. Last seen 1 week 3 days ago. Offline
Joined: 1 Mar 2004
Posts: 5241

I made some changes to fit the model needs as follows.

[screenshot]

http://www.mediafire.com/?5kpnl92j12vy5b5

Even if it works, it is no good: too error prone and time consuming. Go for it.

Please display ProJack results showing Free Float and Total Float values.

Regards,

Rafael

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Rafael,

While they were working on my knee, I had a couple of minutes to analyze your solution. I've found another difference:

If you have an FS0 from A to B, a maxFS2 from A to B, and an FS0 from C to B (this last link will force the start of B far away from the finish of A if the maxFS2 from A to B is not applied), then A will have float, because it can finish 2 days, 1 day or 0 days before the start of B. So if B is critical, then A has to have two days of total float, and of free float as well. I think in your solution there is no float for A.
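A minimal numeric sketch of this float argument, with hypothetical dates just to make the arithmetic explicit:

```python
# Hypothetical dates: the FS0 from C forces B to start on day 10 and B is critical.
start_B = 10
max_FS_lag = 2   # maxFS2 from A to B: B must start no more than 2 days after A finishes

early_finish_A = start_B - max_FS_lag   # the max link pulls A's earliest finish to day 8
late_finish_A = start_B                 # the min FS0 link caps A's latest finish at day 10

total_float_A = late_finish_A - early_finish_A
free_float_A = start_B - early_finish_A
print(total_float_A, free_float_A)      # both are 2 days, as argued above
```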

Miklós

Rafael Davila
User offline. Last seen 1 week 3 days ago. Offline
Joined: 1 Mar 2004
Posts: 5241

Miklos,

But my approach is time consuming and error prone. I still advocate for functionalities that reduce the probability of error and at the same time make your intentions transparent.

If later on I change successors or predecessors and link to the wrong node, the model will be invalidated; in a model with hundreds of activities there is a big chance you will eventually hit the wrong node. As Murphy says, if something can go wrong it will eventually go wrong, and at the worst moment. Maybe at a meeting with the PM and the project team.

Best regards,

Rafael

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Rafael,

Now it's clear to me. You are right, these types of problems can be modelled using ALAP and introducing a couple of dummy activities. If ALAP could be defined in a way such as "ALAP minus 2 days", then even the dummy activities wouldn't be necessary. So if someone has no maximal relations in his application, this is a real option for modeling this kind of problem. (But in the end everything can be modelled with 0s and 1s in the IT world :-))

Got to go. My physiotherapist is waiting for me.

Miklos

Rafael Davila
User offline. Last seen 1 week 3 days ago. Offline
Joined: 1 Mar 2004
Posts: 5241

Just take a look at the links table: if the finish of activity C changes then activity AA changes, and if activity AA changes then the finish of A will follow, due to the ALAP constraint.

The following video capture shows how moving C moves AA and A.

http://www.youtube.com/watch?v=2uk9kFa3jFI

You can try modeling it in the Spider Demo so you can debug it, or just download it from the following link and play with it.

http://www.mediafire.com/?3985q0kbd26rt7e

PP is for people willing to take on challenges and learn something in return. You are welcome to question it; if it is wrong, the Recycle Bin on my computer will dispose of it free of charge.

Your Glue example is great, a classic.

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Edgar, Anoon (which one do you prefer?),

Maximal relations are not about resources; maximal relations are about describing logical relations between tasks.

A couple of examples:

1) You have to glue two pieces together. For this you have to put the adhesive onto the two surfaces. In the user manual there is a warning that after 5 minutes you can start to compress the pieces, but you must do it within 8 minutes because the glue will dry. Now imagine that you have only minimal relations: an FS 5 minutes from gluing piece #1 to the compression, and an FS 5 minutes from gluing piece #2 to the compression. What if the other predecessors of the compression push the compression to 1 hour after putting the adhesive on the two pieces? The real relation in this case is what the manual says: compression should start after 5 but before 8 minutes. If your software cannot handle the latter you cannot make an adequate model, and you can get results where compression follows gluing by one hour.

2) Shoring can start right after finishing the hole; however, the shoring material will only be available when a refill at another place has been finished. If this material becomes available too late, the side walls of the hole will collapse. The right relation is: shoring can start right after the finish of the excavation, and it should start within two days to avoid collapse.

3) An example where maximal relations can help in case of leveling. Imagine that you have a city with 110 streets where you have to build a sewage system. For each task you have only one resource, so after finishing the demolition of the pavement in one street you can go to another street to do the same. Without further limits all the streets could be done in parallel. Resource levelling can lead to a result where demolishing the pavement in street A starts at the start of the project and laying the new asphalt layer starts at the end of the project, say two years later. (A single street, according to the technological relations, would take only 2 months.) Here there is a maximal relation for this street: the duration of the work is two months, but let it be smaller than 2.5 months. In this case you will get a result where, during leveling, the streets are kept together. (Of course there are other ways of describing that during leveling roads should be kept closed to traffic only for the required minimum time, but this is the easiest.)

4) Hundreds of different problems can be cited from construction, and if you start to investigate your projects you will soon find a lot of examples of this. As maximal relations were taught at the university when I was a student, it was always obvious for me to think in this way. Since so far you didn't know of the existence of these relations, you didn't miss them.

But time-cost trade-off is not about this. Time-cost trade-off is about choosing among the different options you have.

To make the problem very simple, imagine only one path with a length of 200 days. Your deadline allows 170 days. What can you do? A lot of things; one among them is to shorten the length of the critical path by changing activity durations. (From here on I'll deal only with this option.) You are a general contractor and you subcontracted all the tasks. So you can ask all your subcontractors whether there is a chance to shorten the durations of their activities. They will all say yes, but they will say it has cost consequences. If you accept all the offers it will result in 140 days, and a lot of extra cost. So you have to choose which offers to accept and which ones to refuse in order to achieve 170 days. In making this decision a time-cost trade-off can help you. Real-life situations are much more complex; for example, you usually have hundreds of paths from start to end, and crashing the original critical path will soon create more critical paths in your network, so hand computation is almost impossible. Time-cost trade-off is a method that can help you in these situations.
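A minimal sketch of the single-path case above, with made-up activities and prices; it only illustrates the bookkeeping, and on a real network with many parallel paths this simple greedy choice is no longer sufficient:

```python
# Hypothetical subcontractor offers: (activity, days that can be cut, extra cost per day cut)
offers = [
    ("piling",         8, 1500),
    ("structure",     15,  900),
    ("facade",        10, 1200),
    ("fit-out",       20,  700),
    ("commissioning",  7, 2000),
]

path_length, deadline = 200, 170
days_to_cut = path_length - deadline          # 30 days must come out of the single path
extra_cost = 0

# On a single path with linear per-day costs, taking the cheapest offers first is optimal.
for activity, max_cut, cost_per_day in sorted(offers, key=lambda o: o[2]):
    if days_to_cut == 0:
        break
    cut = min(max_cut, days_to_cut)
    days_to_cut -= cut
    extra_cost += cut * cost_per_day
    print(f"crash {activity} by {cut} days (+${cut * cost_per_day})")

print(f"total acceleration cost: ${extra_cost}")
```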

Miklós

Anoon Iimos
User offline. Last seen 2 years 45 weeks ago. Offline
Joined: 22 Sep 2006
Posts: 1422

Miklos,

Just curious... when you are working with limited resources, for sure you won't get too many options, so why the need for maximal relations?

To simplify the job, you need to narrow down your options, and you have to choose only one, right?

So how can you think of too many options, and then decide on a single one?

Are you always going to think of the unprecedented? Or do you just narrow down to the tested ones and then decide?

cheers!

Edgar

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Rafael,

The solution you have provided is wrong, I'm quite sure about this.

It should work in such a way that if the finish of activity C changes (due to the delay of its predecessors), then the finish of activity A has to change as well. I can't see this in your model. Probably I was not clear in defining maximal relations and what they are good for, so it's my fault, and it is due to the limited opportunities of this site. However, there are dozens of books available where they are explained in detail, and which explain the situations that cannot be resolved, as Vladimir said.

You could survey a completely new software package within a couple of hours; I admire you for this. Even in the case of MSP, which I have known for almost twenty years, discovering new functionalities can take weeks for me. Maybe because I'm getting older and older :)). So far I've posted some problems which could easily be modelled with ProJack and not with others. So I think the modelling capabilities of ProJack, of which max relations are only one among many others, are really unique compared to any other solution.

But its biggest advantage is its very sophisticated approach to costs: rates, norms, BOQs, handling different margins on primary cost, and the attached production rate database, which contains 160,000 rates for practically all types of construction work, handling the procurement process, etc. This is where its biggest advantage can be found.

But of course use the software you prefer; I do not want to convince you to use this or that.

This whole topic is about collecting the list of software packages that can handle maximal relations; so far I know only ProJack and Acos. The latter is the German one.

Regards

Miklos

Rafael Davila
User offline. Last seen 1 week 3 days ago. Offline
Joined: 1 Mar 2004
Posts: 5241

Based on my understanding of the requested schedule the following is a model in Spider Project of how I would do it.

Because resource leveling could break the modeling of the strict links, I would have to use some workaround to manually fix a resource demand that can easily conflict with the strict link.

[screenshot]

As Vladimir said, "maximal relationships easily lead to the situations that cannot be resolved and require permanent diagnostics if anything changes (links, durations, volumes, etc.)".

In reality most of our jobs have little need for min-max relationships; until a better procedure to handle this comes out I will not consider moving to other software. Maybe a procedure that will not solve all issues but will provide some warnings and suggestions can be developed. I do not expect the computer to solve all issues at a click of the mouse, but to better assist us in our decision making; it is possible, go for it.

I uninstalled ProJack as it falls too short on other functionalities; still, the issue of addressing min-max relationships is of interest, and I believe ProJack can still improve their approach. I am still interested in what the Germans are doing.

Regards,

Rafael

PS. For every predecessor negative lag there is a successor negative lag, think out of the box.

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Just two hints.

Adding an FS2 link from A to B:

  • just click on the finish of A and release at the start of B. This will be an FS0. If you click or double-click the relation you can edit the lag.
  • press Ctrl+Q to get the quick task editor and choose the predecessors or successors tab (is "tab" the correct English name for this?)

To add a maxFS2, press SHIFT while drawing the relation from the finish to the start.

Miklos

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Rafael,

It's good that the software is available in English. I have the feeling that not everything is translated, in some cases it won't work properly, and I do not know whether any help is available in English. I'll check it and report on it.

Maximal relations....

I've never tried before to model maximal relations with minimal ones, although it is definitely possible in the following way.

Imagine two activities A and B and, say, a min FS2 from A to B.

In this case you can formulate the following:

     StartB - FinishA >= 2 days

A maximal FS2 relation from A to B is:

     StartB - FinishA <= 2 days

So the only difference is the direction of the inequality. If you multiply the second one by (-1) you convert it into a minimal relation.

It will look as follows:

     FinishA - StartB >= -2 days

And this is nothing else but a minSF with a lag of -2 days from B to A. The only problem is that these two relations, FS2 from A to B and SF-2 from B to A, form a loop, which is not welcome in most commercial applications.

What you did, as I see it, is something else, with dummy activities, but I cannot follow it. Please explain it using my sample. We have to shore the hole (B) right after the finish of the excavation (A). This is a minFS0 from A to B. The shoring material will be available after the refill (C) of another hole, so there is a minFS0 from C to B. However, due to the predecessors of C, C will finish much later than A, which means B will start much later than A. During this time the wall of the hole will collapse; therefore the right relations are: you can start the shoring right after the excavation, but start it within two days, that is, an additional maxFS2 from A to B is necessary. Try it in ProJack, and after that try to model it with minimal relations.

I'm really curious if you can do it.
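For concreteness, a minimal sketch of the constraint system this example creates, with hypothetical durations (excavation A = 3 days, refill C = 10 days, shoring B = 4 days). It is a plain fixpoint iteration, not the algorithm of ProJack, Spider or any other package:

```python
# Constraints of the shoring example (times in days from project start):
#   start_B >= finish_A        (min FS0 from A to B)
#   start_B >= finish_C        (min FS0 from C to B)
#   start_B <= finish_A + 2    (max FS2 from A to B), i.e. finish_A >= start_B - 2
dur = {"A": 3, "C": 10, "B": 4}
start = {k: 0 for k in dur}                  # earliest starts before any constraint is applied

for _ in range(10):                          # simple fixpoint iteration
    finish = {k: start[k] + dur[k] for k in dur}
    new_start_B = max(finish["A"], finish["C"])                # the two min FS0 links into B
    new_start_A = max(start["A"], new_start_B - 2 - dur["A"])  # the max FS2 pulls A later
    if new_start_B == start["B"] and new_start_A == start["A"]:
        break
    start["B"], start["A"] = new_start_B, new_start_A

finish = {k: start[k] + dur[k] for k in dur}
print(start, finish)
# C finishes on day 10, so B starts on day 10; the max FS2 then delays A so that it
# finishes on day 8 instead of day 3 (otherwise the wall would stand open too long).
```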

Miklos

Rafael Davila
User offline. Last seen 1 week 3 days ago. Offline
Joined: 1 Mar 2004
Posts: 5241

[screenshot]

It seems that with the use of double links you can model min-max relationships, as shown above. With the use of fragments you can save useful fragments for future re-use.

Rafael Davila
User offline. Last seen 1 week 3 days ago. Offline
Joined: 1 Mar 2004
Posts: 5241

[screenshot]

It is already in English and I will be looking at it in my spare time; it does not matter whether the software is as powerful as others, as long as it is enough to showcase the functionality.

Very interesting.

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Rafael,

The first is a Hungarian one, and I played some role in the development. I do not know when it will be available in English, if it will be at all. (www.projackmanager.com)

The second one is a German development. I have a Ph.D. student who has the information about it, but she is in Romania at a conference and not available at this moment. I'll send the homepage of this latter one this week.

Regards

   Miklos

Rafael Davila
User offline. Last seen 1 week 3 days ago. Offline
Joined: 1 Mar 2004
Posts: 5241

Miklos,

Please don't ever stop asking why we do not use some features; this is what we need, developers who look for our opinion and will move in the direction of our priorities and needs. But we have to be honest with you.

I do not use the Spider Project optimization algorithm unless I am in trouble. I start with an algorithm that lets me define activity and WBS priorities, at times Standard, at times Advanced; these give me some buffer and some control.

Perhaps some compromise between full resource leveling and an implementation of maximal and strict relationships can be developed, for example where you define a train of activities with strict dependencies and only one of the user-selected activities is resource leveled while the remaining ones are ignored.

In case your software bypassed (ignored) some activities during resource leveling, it should warn you and let you filter the bypassed activities; in such a case the schedule will most probably still be unreal unless you take measures to avoid it. Perhaps a resource leveling check can be run to determine whether all over-allocations, including those bypassed by the algorithm, have been resolved. Issues with bypassed activities must be resolved; no schedule that does not pass the over-allocation test shall be accepted. Similar to out-of-sequence events, the software continues with the calculations based on some rules, but you still have to solve it with some user intervention.

A plan that does not satisfy time limits is not uncommon, as the cost of "liquidated damages" can be less than the cost of acceleration or crashing to get back on time; what does not make sense is a plan you cannot fulfill because of resource availability.

Please let me know which two commercial applications can handle maximal relationships.

Regards,

Rafael

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Rafael,

"twenty options for crashing in the first two month" Wow, twenty options, do you mean the software will tell us how many possible options are there and the cost of each one? Don't ask for a single day crashing, the options will be on the hundreds.

I mean that you start to investigate the crashing options, and you find 20 activities in the first two months where crashing is possible. You define how, and with what cost consequences, and after this you let the software help you choose the best combination in terms of money. After that it is your decision. It could be like resource planning. Maybe a plan that satisfies the limits is not acceptable because of some consideration you cannot put into your model, but the decision is yours whether to accept it, modify it manually, or refuse the whole thing as it is. Models unfortunately offer limited capabilities for modeling real-life situations.

Concerning Vico...

At this moment I'm not their biggest fan (I do not say that they stole my original idea, they have probably completely forgotten our meeting 15 years ago, but there are bitter feelings inside me), but I think this is one of the possible ways for the future. We have a shortage of good models that can help in decision making, and on the other hand we have a shortage of good graphical visualization methods for projects. Vico aims at this latter one. The software is not easy to use, not convenient, and has limited possibilities, but the final goal is OK for me.

But I feel that with a small group of programmers I could prevail over them within two years. So I'm thinking about this.

Miklós

Rafael Davila
User offline. Last seen 1 week 3 days ago. Offline
Joined: 1 Mar 2004
Posts: 5241

Miklos,

 "and let me decide whether to use it or not."

Impossible for me to say otherwise. The issue becomes: is it worth it, is it practical or a distraction?

I believe the nature of the problem, the discontinuity of the time-cost function and many other factors make it impractical, at least with the rudimentary approach of the 70s; something much better has to evolve, otherwise it will still be the same mummy, just a bit older. It seems the model does not fit the real issues.

"twenty options for crashing in the first two month" Wow, twenty options, do you mean the software will tell us how many possible options are there and the cost of each one? Don't ask for a single day crashing, the options will be on the hundreds.

A single-seat license for Vico's software is priced at $12,500 and a typical purchase is worth about $75,000. I do not believe Vico will survive this economy; as time goes on the available market will become smaller and smaller. Sorry, but I perceive Vico more as a marketing tool than anything else. You do not need Vico to spot 99% of potential construction problems, and 3D modeling in most cases will be enough to identify the remaining 1%; there is no need for a 3D movie on most construction jobs, though a 3D movie for IT jobs would be novel.

I doubt Vico will disclose hidden soil problems or structural design errors such as the one that ended in the collapse of the Minnesota bridge.

Still the following "Vico" movie is entertaining.

http://www.youtube.com/watch?v=P0Fi1VcbpAI

Vladimir,

Double links I rarely use, but no doubt they are worth it; they are as easy as 1-2-3 to set up.

I like Miklos's open mind for change and improvement. Even the mouse was rejected and criticized when Apple started using it as an integral part of their operating system. No doubt there is much room for improvement.

Regards,

Rafael

Miklos,

in your example we will use 50m lags.

Using Spider Project you can set any number of dependencies between two activities and link not only start and finish points but also any other.

Entering actual data, we shall reschedule the remaining work, restoring the distance in your example if something went wrong on the preceding activity.

And certainly the project shall be rescheduled if resource assignments are changed.

Any model is an approximation of reality. When people set the restriction of a 50m distance it usually means that 50m is comfortable and 45m is still acceptable. So deviations in the distance during one shift are usually acceptable. Planning is done at discrete moments and the performance is continuous.

Regards,

Vladimir

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Rafael,

I agree with you; in this case every smart contractor would choose crashing in the first two months. But what if you have twenty options for crashing in the first two months? All in all, we can close it by saying that in some cases it can be a help, but the decisions are always ours.

It is also true that adding functionalities can lead to problems in some cases, and with some planners. But not all planners are the same. There are planners who worry because there are 'critical' activities in the plan, and they do not want any critical stuff in their schedule, and there are planners who know the model very well, who know its limitations, and who are eager for new tools, e.g. in resource and cost planning, decision nodes in order to model different options in one plan, cost optimization to choose the cheapest of hundreds of different options, defining the right sequence in case of repetitive work, simulating it in 3D, etc. We still do not have solutions for these problems in commercial software, and we suffer because of the limitations of the available models. As you said, "Give me the flexibility", and I continue in this way: "and let me decide whether to use it or not."

Just a short story to close this message. In 1996, 15 years ago, when I was young and enthusiastic and handsome, there was a company in Hungary called Graphisoft. They had, and still have, a CAD package for architects called ArchiCAD. You can check it on Google; they are one of the biggest even these days. I went to the deputy CEO in charge of all the development work and told him: "I've got a great idea, let's connect your software with a scheduling software, so that the drawings can be generated according to the schedule." I also told him that this simulation could be a great help in the visualization of the project. They asked for two weeks, and after that they said: "We do not see any potential in this; this can be interesting only for academic people (like me) and not for practice." Ten years later they made a software product, now called Vico, and they deal with 4D and 5D modelling.

Miklós

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Vladimir,

I totally agree with you. We do not play with activity durations and cost. We make decisions based on technology, change the shifts, increase or decrease the number of crews. These decisions have to be based solely on our expertise. But in the end it leads to this: I have two options for A, X days and X-3 days, but the latter costs me an extra $1000. And I have this kind of option for activity B, and for C as well. I would enjoy a feature that could help me choose which combination is the cheapest for me. After that the solution would still be mine, and I could decide whether to accept it or not. But for this I would expect options from the computer.

But I think we know each other's standpoints well, so let's not waste more time on this topic. I know hundreds of problems which can arise during scheduling or resource planning and cannot be solved with the commercial software packages, although the theory has solutions for them.

Back to your question.

In Precedence Diagramming there is a strong hypothesis on activities. (Every mathematical model has hypotheses in order to simplify the world, e.g. the linear time-cost curve in the original CPM model.) In Precedence Diagramming (the one I described before) the hypothesis is the following: activities have to be carried out with the same intensity, without interruption.

It means that each activity is drawn as a straight line in linear scheduling. If there is a planned change in intensity (e.g. double the crew when it is fifty percent finished), then the right way to model it is to cut it into two pieces.

What is the reason for that hypothesis (which, by the way, we do not assume in CPM)? The reason lies behind the existence of precedence relations. With precedence diagramming you want to define relations between activities. However, precedence relations exist only between the start and finish points of activities (SS, FS, SF, FF, maxSS, maxSF, maxFF, maxFS).

How can these relations be extended from the start and finish points to the whole activity? Only if you have the correct function of percent complete over time. This was assumed to be linear in Precedence Diagramming, and I have never seen any paper where other time/completeness functions have been determined.

Just an example to make it easier. You dig a long ditch (A) and you lay the pipe down (B). It can be done with overlapping (some safety distance, e.g. 50 m, is necessary), so you define the necessary lag between the starts and between the finishes as well, say SS 2 days and FF 2 days. During excavation the machine can stop for 10 minutes, in some places it can go faster or slower due to the soil conditions, and the same is true for pipe laying. So the distance is sometimes less than 50 m, but it can be acceptable. Activities are never carried out with exactly the same intensity, but it is our decision whether to accept these small changes in intensity or not. But imagine that you plan to start A with a machine with small productivity and after that you change to a better one that doubles the rate. And in the case of B, you plan to start with a crew with higher productivity and after 50% you cut the productivity back to 50% of the original. Both activities are finished within 30 days, so SS2 and FF2 are fulfilled, but if you follow this there will be places where you lay down the pipes and only afterwards dig the earth from beneath the pipes. Yet the original relations are fulfilled.

This is because you wanted to formulate a relation between the activities (e.g. there should be a 50 m safety distance everywhere), but the model only allows you to define relations between the start and finish points. These relations can be extended to the activities as a whole only if there is an assumption about how the work progresses. That is why the "same intensity, without interruption" hypothesis is so important. One always has to keep this hypothesis in mind, because it is never fulfilled, and it is our decision either to accept it in certain cases or to say that this can cause problems and cut the activity into pieces.
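A numeric sketch of the ditch-and-pipe example above (all rates hypothetical): SS+2 and FF+2 are both satisfied, yet with the changed intensities the pipe crew overtakes the excavator in the middle of the job:

```python
# 300 m of trench (A) and pipe laying (B); positions in metres as a function of the day.
def pos_A(t):
    # excavation: slow machine (7.5 m/day) for 20 days, then a faster one (15 m/day) for 10 days
    return 7.5 * t if t <= 20 else 150.0 + 15.0 * (t - 20)

def pos_B(t):
    # pipe laying: starts on day 2; fast crew (15 m/day) for 10 days, then half speed (7.5 m/day)
    if t < 2:
        return 0.0
    return 15.0 * (t - 2) if t <= 12 else 150.0 + 7.5 * (t - 12)

# SS2 holds (B starts on day 2) and FF2 holds (A finishes on day 30, B on day 32),
# yet for a while the pipe crew is ahead of the excavator:
for day in range(33):
    if pos_B(day) > pos_A(day):
        print(f"day {day}: pipe at {pos_B(day):.0f} m, trench only at {pos_A(day):.0f} m")
```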

Miklós

Rafael Davila
User offline. Last seen 1 week 3 days ago. Offline
Joined: 1 Mar 2004
Posts: 5241

Miklos,

There are many ways to crash a job, but the least expensive is not always the best option. This shall not be decided by the computer, a good tool but not an omnipotent one.

Suppose you can crash the schedule activities during the first two months at cost X, and another option is to crash the schedule during months 8 and 9 at cost 0.90X.

It might be the better choice to crash during the first two months, so that if later on you get into trouble you still have feasible options.

Do not underestimate contractors; they make decisions after much brainstorming with their engineers and supervisors, and this way of decision making has proved better than hiring an external "expert" who knows nothing about what is being scheduled. This is especially true in the USA, where on every job each owner specifies different software and the contractors have no option other than to hire an external consultant available for very limited time.

I believe that if you start adding functionalities many do not understand, or that are too difficult and time consuming to set up, users will move to software they can handle.

Vladimir,

Miklos said, "In construction the same crew will not do the same job without interruption and the same intensity". This is true when you have poor resource planning; otherwise crews will perform as expected. Usually interruptions occur because poor planning creates the need to improvise and move available resources to other activities. We do not plan for sudden changes in crew size or production; in any case we split the activity and proceed with good planning practices. Genuine unforeseens are not that common; bad planning is.

This has been my experience on jobs where no resource loading and constraining was implemented. Keep in mind that contractors avoid resource loading when the software makes it difficult to model their resource loading needs, and this is particularly true of software lacking partial assignment of resources across hundreds of activities in a transparent way; total work hours per day is not transparent, because an assignment of 8 hours per day performed by two resources working 4 hours a day is not the same as 8 hours by a single resource.

Regards,

Rafael

Miklos,

of course any change has a cost impact and may change activity durations. And calculating the schedule with the new initial data you will get different activity durations and costs, and a different project duration and budget. What I am trying to explain: you play not with activity durations and costs but with resources, technologies and calendars, and you get new activity durations and costs as the result of the new settings and project scheduling.

In construction the same crew will do the same work with the same intensity and without interruptions. Please explain what you meant when you wrote that this is not true.

Best Regards,

Vladimir

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Vladimir,

I'm talking about exactly the same things as you. When I was talking about data preparation I was talking about increasing or decreasing the number of crews, increasing the working hours, changing technologies, etc. Sometimes it is easy, sometimes it is difficult.

But most of these changes have a cost impact. You cannot pay the same for extra working hours, so these extra hours (that is, a shortening of the activity duration) will cost extra money. Changing technology in order to shorten the duration usually costs extra as well. So all your decisions concerning time have cost consequences. For an activity, these kinds of options will result in different activity durations with different costs.

Time-cost trade-off helps you to decide which activities should be shortened (or, in precedence diagramming, lengthened) in order to shorten the project duration.

Models are interesting things. For what you described in your letter at least hundreds of different models exist. One should be very careful about their pros and cons. In my experience, for example, only a few percent of schedulers know that the very basic assumption of the precedence diagramming method (the one I briefly explained, used by MSP and Primavera) is that activities should be carried out with the same intensity and without interruption. It is very hard to accept, because has anybody ever seen a construction activity accomplished with the same intensity and without interruption? But that is the most important assumption of the model. And we face the well-known modeling problems because this assumption is never fulfilled in real life.

Miklos

Miklos,

I don't understand what data preparation you mean. Changing resource crews leads to different crew productivity and thus different activity durations. It is easy enough and certainly not time consuming. It may take seconds or minutes, certainly not hours. Adding shifts is more complicated and takes more time. Changing technologies means changing the scope and the project model. None of these options means creating activity duration and cost distributions.

Any project model includes restrictions on the order of activity execution. Why the model is called Diagramming I don't understand.

I think that we talk about the same model: activities, relationships, resource (renewable, consumable, space) requirements and availabilities, costs and financing, calendars, uncertainties, risks, conditions, targets.

Regards,

Vladimir

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Vladimir,

It's true, duration is affected by resources, crew, skill, shifts, calendars, etc. And in the end it results in different costs for different durations. This preparation has to be done by an expert, and after that the what-ifs could be automated and left to the computer. I never said that preparing these data is always easy; it is as difficult as preparing the optimistic or pessimistic data for PERT.

The Precedence Diagramming Method is a mathematical model in which we can formulate different relations among the distinguished points (that is, the start and finish) of the activities. Activities should be carried out with the same intensity, without interruption. Relations can describe the minimal times that should elapse between the (start/finish) points of activities, and can describe the maximum allowable time between the points of activities. The graphical interpretation is irrelevant; usually we display it as an activity-on-node network, or with a logic Gantt (bar) chart. The primary goal of the calculation is time analysis, that is, to define the project duration and the early and late times for activities.

In this method, shortening an activity on a critical path can result in a longer project duration. (MSP and Primavera use this model.)

I do not want to write out the mathematical model here (I've got a book on this, published by Kluwer), but this is more or less adequate.

Let me know if we are speaking about different models.

Miklos

Miklos,

when we estimate options it may be different crews (resources) on certain types of activities, different resource calendars, additional shifts, different technologies, different directions of work, additional streams, a different resource pool, or a different financing or supply schedule. They never look like different durations for different costs.

Precedence diagramming is one of many ways to present the project model. The mathematical model itself does not depend on the way it is presented.

Regards,

Vladimir

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Vladimir,

You said:

"Time-cost analysis is done by what if calculations comparing different user defined options. Cost estimates for different activity durations do not make any practical sense."

I feel confused. Your second sentence seems to contradict the first one. Time-cost trade-off cannot be done without cost estimates.

My example on optimistic and pessimistic durations was not an example of time-cost trade-off; it was an example showing that a sound estimate of optimistic and pessimistic durations is at least as hard to produce as the cost estimates for a time-cost trade-off. And it is slowly but definitely coming back into planning practice.

But it's good to know that you make time-cost trade-offs by what-if analysis. For this, check the example I put in a couple of comments ago, and give a solution for an 8-day project duration. After that, imagine how many what-ifs would have to be done in the case of thousands of activities.

In addition, it is much more difficult in precedence diagramming, where shortening an activity on the critical path can result in a longer project duration, etc.

As this conversation goes on I'm more and more convinced that this would be an interesting feature.

Regards

Miklos

Miklos,

we use different approaches for analyzing schedule and cost uncertainties, and I never suggested estimating optimistic, most probable and pessimistic values for each activity.

When we make resource plans all renewable resources and major consumable resources are included.

Time-cost analysis is done by what if calculations comparing different user defined options. Cost estimates for different activity durations do not make any practical sense.

Regards,

Vladimir

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Vladimir,

What you say is true, but in PERT, if we follow your suggestion, we should make estimates for thousands of activities when we perform calculations with pessimistic and optimistic durations. A sound estimate for an activity takes at least as much time as defining costs for different activity durations.
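For reference (a standard textbook fact, not something from this thread), the classical PERT formulas that would have to be fed with an optimistic (o), most likely (m) and pessimistic (p) duration for each activity are:

```latex
E[d] = \frac{o + 4m + p}{6}, \qquad \sigma^{2}[d] = \left(\frac{p - o}{6}\right)^{2}
```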

Here in Hungary we have a consulting firm and they do it in such a way that only a fraction of the activities is analyzed in detail, and for the rest they just make assumptions. They say it works.

When you make resource plans you don't have to make them for all the resources; you select first, then make the plan only for the selected resources.

In the case of time-cost trade-off you can follow the same method: analyze only the important ones, and make assumptions for the rest of them.

I think time-cost trade-off can be useful; we just have to find its right place in the methodology.

Cheers

Miklós

Miklos,

I wrote that people make discrete decisions, for example whether to use a second shift or not. Cost curves for activities are not practical. You will not create these curves for every activity as you proposed, especially if there are many thousands of them.

Time-cost trade-off is always done, but by people. I suggested a set of examples to explain that these decisions cannot be automated, except for evaluating options created by people.

Regards,

Vladimir

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Rafael,

Your "Give me the flexibility and let me worry about the validity of the model"

was a good cue.

All in all I benefited from this, but you are right, let's move further.

I declare this topic CLOSED

Miklós

Rafael Davila
User offline. Last seen 1 week 3 days ago. Offline
Joined: 1 Mar 2004
Posts: 5241

You wrote, "I have a feeling that within twenty years all these models we use now for scheduling will be history". I hope you are right, because during the last 30 years I have seen no real progress in commercial applications. Only Spider Project has provided me with enough functionality to move from "soft logic" to true computer-driven resource constraining.

I will be very happy when traditional CPM is replaced by something better; the model is too limited for many real-life needs, and I hope it will evolve from a dinosaur into a fast-flying bird.

Yes, I can accept time-cost trade-off with resource leveling, with the understanding of the limitations, as some other scenarios need to be supplied by the user. Go for it; I even advocated for it before in PP forums, but I still have some priorities with regard to other functionalities I have a greater need for in everyday situations. Like, for example:

  • Maximal and minimal relations can be one of these: when feasible apply them, when not, just break the link and mark it as broken.
  • Modeling of circular references when feasible; when not, just break the link and mark it as broken.
  • More intelligent link options, more transparent, like a link to a group of activities that is activated once a certain number, percentage of the number, or percentage of volume is attained.
  • More formulas like Max, Min, Ave, still missing in the software of my preference.
  • Groups of activities (WBS) that can be linked to other activities and be driven by or drive other activities. Similar to the topic activity previously available in SureTrak, or like the summary activities in MSP.

Give me the flexibility and let me worry about the validity of the model.

This is a never ending story, just keep moving.

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Rafael,

I've read your comments, and I think I understand and accept all of them.

But let me make a few notes, just to push you toward further considerations.

You said that different options almost always result in discontinuous cost curves. This is 1000% true. But even the original model resulted in integer-day solutions, that is, it never said that this activity will take 11 hours and 36 minutes. The integrality property of the linear program ensures that durations are chosen as if the cost curve were discrete. Sorry, I'll never make such mathematical notes again :)) And I mentioned that hundreds of relaxations were added to the original model, like non-continuous and non-linear curves. So this problem seems to be solved.

Crashing an activity may require crashing some other activities. Well, can additions to the original model handle this problem, at least partly? I'm not quite sure about it. (I have read around a hundred papers on network scheduling, including time-cost trade-off, which is only a fraction of the published ones.)

Time-cost trade-off works (I mean there are algorithms for this) under resource constraints and precedence relations as well.

The third point, which is a general comment not concerning time-cost trade-off, is that recently we planners have started to use the PERT model (you know, this optimistic/pessimistic stuff). But if one were to dig deep inside himself, I'm sure they wouldn't bet even a bottle of beer to defend their approximations of the pessimistic or optimistic durations, yet we accept this.

I still think that in some cases it would be a great help, and I admit that there are cases, due to inadequate modelling capabilities (though we do not know where the theory stands at this moment), where a human decision is the solution.

But do not forget, we now have computers more intelligent than men, and I know that there are labs where researchers are working on how to use artificial intelligence in scheduling. I have a feeling that within twenty years all these models we use now for scheduling will be history.

Until then we have these tools and the human brain...

Take care

 Miklos

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Vladimir!

I've presented the original model, but I've added somewhere that hundreds of relaxations and new developments have been added to the original model since then. Non-continuous and non-linear cost curves were the first.

I cannot follow your second point. Time-cost trade-off works in the case of thousands of activities as well. In the case of five activities anybody can figure out the optimal solution, so we do not need a mathematical model, but in the case of thousands you really need a computer. (Anyway, it is not so easy even in the case of only five activities; see the sample that was sent to Anoon.)

So it seems to me, and I'm surprised, that you wouldn't use a feature like time-cost trade-off even if your software provided this option.

I'm surprised because I do time-cost trade-off in my practice. It's true that, in the absence of an application, I practice it by following my senses only. As I mentioned, when I work on a schedule, on the first try the project duration is usually far beyond the deadline, and after that I start crashing, that is, shortening the project duration. I try to do that in a way that minimizes the extra cost. And this is nothing but time-cost trade-off.

Miklós

Rafael Davila
User offline. Last seen 1 week 3 days ago. Offline
Joined: 1 Mar 2004
Posts: 5241

I would say, more often than not it is a discontinuous set of individual points.

Like when your options are to increase or decrease the amount of concrete forms: it is not linear, but determined by discrete points depending on acceptable pour limits or construction joints.

Also, the model does not take into account the relationships between the activities: when crashing one, other related activities must also be crashed. A similar objection many have with regard to the perceived precision of Monte Carlo analysis that does not take these relationships into consideration.

My experience is that in a real-life project, when your job is lagging, the best option is a recovery schedule that will include some changes in resources and activity logic.

At home more overtime is counterproductive; it means employees will create excuses for the overtime to continue, they love it $$$$ (2x after 8 hours, 3x if at the designated lunch time on each shift), and additional shifts mean some change in logic, as daytime resources and night-time resources and operations are not the same. The issue goes further than just a matter of linearity or non-linearity of the curve; even if the model accepts a logarithmic spiral, time-cost falls short. It is about time-cost-resources-logic and human decisions and commitment.

I use resource leveling as a way to find a practical solution to resource smoothing and reducing idle time in practical ways; we work in crews, not half crews with half cranes, it is either a full crew, or two crews, or another crew composition. For this reason the also-abandoned functionality to smooth changes in resource demand as a sum of least squares is as far from reality as the simplified time-cost trade-off found in the literature.

Miklos,

in real life it does not work this way.

The task is certainly non-linear. You can add a crane, or a second shift, or create a new stream in a pipeline project. And it is never a normal distribution.

Besides, I am used to working with schedules consisting of many thousands of activities.

Vladimir

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Anoon,

Here is the example.

You have the so-called normal durations and the cost for all the activities. The project duration will be 10 days, the cost will be $500.

You do some value engineering and define for each activity how, by how many days, and at what cost it can be speeded up. All these are given in the picture below. (For the sake of simplicity you can assume that the change of cost is linear between the normal and crash durations.)

Now the question is: what would be the optimum cost if my deadline is 8 days?

This is time cost trade off.

http://www.projackmanager.com/sites/www.projackmanager.com/files/images/...

Miklós

Anoon Iimos
User offline. Last seen 2 years 45 weeks ago. Offline
Joined: 22 Sep 2006
Posts: 1422

Miklos,

I further believe that it will affect engineering specifications; construction methodologies; contracts; etc., etc., which can become very complicated.

In order to save money, make it as simple as you can.

cheers!

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Vladimir!

That's right, it is our decision whether to use better machines or to increase the number of working hours in order to decrease the duration of an activity.

However, if you make these considerations you can come to a conclusion like: I can speed up A by 5 days for $10 per day, I can speed up B by 5 days for $12 per day, and so on... If you have these data for all the activities, a time-cost trade-off procedure can help you in selecting the activities that have to be sped up for the desired project duration, resulting in the smallest increment in project direct cost.

Miklós

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Anoon,

Here I explain the original CPM model from 1958, and then you can decide. I think time-cost trade-off could be a tool for value engineering, but I leave this decision up to you. Maybe we make these kinds of considerations in our minds without the help of mathematical algorithms.

So here comes the original CPM network, as it was defined by Kelley and Walker in 1958:

1) Imagine an activity-on-arrow network: one start node, one finish node, no loops, etc.

2) For each and every activity you define a normal duration (this is the duration when you work in the normal way). You can define a cost for these durations; these costs are called the normal cost. Now imagine that you perform the time analysis, and with this your project duration will be ten days.

3) Well, you recognise that this is beyond your deadline, therefore you ask your subcontractors to improve their speed on each and every activity. They come back with the following data for each activity: they define their fastest achievable duration, which is called the crash duration, but they say this will cost you more (because of the big rush, working around the clock, etc.). This cost is called the crash cost. (There is an assumption that cost changes linearly between the normal and crash points.)

4) If you perform the time analysis with crash durations you will get a minimal project duration, say 5 days.

5) Here comes the question: if 7 days were the desired project duration, which activities should be sped up? The answer: those that result in the smallest cost increment.

THIS IS THE ORIGINAL TIME-COST TRADE-OFF PROBLEM, AND THIS IS THE ORIGINAL (but forgotten) CPM PROBLEM.
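In modern notation (a standard textbook restatement, not taken from this thread), the Kelley-Walker problem is the linear program below, where t_i are the event times of the arrow network, d_ij is the chosen duration of activity (i,j) between its crash value m_ij and its normal value D_ij, c_ij is the linear cost slope, and T is the required project duration:

```latex
\begin{aligned}
\min\;& \sum_{(i,j)} c_{ij}\,\bigl(D_{ij} - d_{ij}\bigr) \\
\text{s.t.}\;& t_j - t_i \ge d_{ij} && \text{for every activity } (i,j),\\
& m_{ij} \le d_{ij} \le D_{ij} && \text{for every activity } (i,j),\\
& t_{\text{finish}} - t_{\text{start}} \le T.
\end{aligned}
```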

Now dozens of developments have been added: for example, it works with precedence relations, with maximal relations, with constraints, with non-linear and discrete cost functions, with project calendars, with penalty milestones, etc. In spite of this it's not part of practice.

I'm trying to put together a sample and I'll include it in my next comment.

Miklós

Hi Miklos,

The answer to your question is easy: because the computer does not know whether it is possible to work in two shifts instead of one or to set a 12-hour work day, whether a more productive and expensive crane is available, whether you have enough resources to start building a road assigning separate crews for each 100m, etc. People shall define the options and set the success criteria. The usual way to estimate different options is to play "what if" scenarios. This process can be automated, but usually the number of options is limited (you will not suggest a work day that lasts 11 hours 23 minutes and 19 seconds) and people like to take decisions by comparing different options and taking into account those factors that are not measurable.

Some small options like using 3 to 5 workers on some task, selection among available machines, etc. can be automated.

Regards,

Vladimir

Anoon Iimos
User offline. Last seen 2 years 45 weeks ago. Offline
Joined: 22 Sep 2006
Posts: 1422

Miklos,

I believe "value engineering" is practised all the time, isn't it the same?

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Rafael,

Thanks for your comment, but I have the feeling you have misunderstood me. (Maybe it's due to my inadequate English.)

Here we protect workers as well, maybe better than anywhere outside Europe.

I was talking about the fact that we have a vast number of subcontractors, therefore it is not a problem to get enough subcontractors for a job. Here in Hungary we make contracts with subcontractors for a job, and after they have finished it and have been paid, they can go away, and not even a trade union can help if they want to stay and get money just because they are there.

I have also tried to explain that we have this situation because construction firms are very small, so no construction firm exists that can carry out even a small project alone. This is the reason that a project OBS can contain hundreds of subcontractors, according to my experience.

(Out of 80 thousand construction firms we have only ten with more than a hundred manual workers, and their average number is below ten.) That's the reason that labour and machines are practically no constraint in our schedules; they are practically unlimited (of course there are exceptions, like physical constraints... you cannot put in 10 cranes if you have space for only one, etc.)

Back to the original topic.....

In my experience, when I make a network I do not care about the deadline; I make a plan, which means that I establish a system of logical dependencies. After that I make the schedule. (A couple of years ago the methodology was that planning and scheduling should be two separate processes. In some cases I still follow this methodology; in this case the deadline won't affect your estimates of duration, etc. The reason for that methodology was the observation that as we get closer to the deadline our activity bars become shorter and shorter and the overlaps greater and greater, because the deadline affects our minds, even if we fight against it. Of course this methodology has fewer and fewer followers, because software calculates the schedule at the very moment you enter new data, which is also very good.)

So after finishing my first plan I usually find that instead of 13 months, which is the deadline, my project will finish in 18 months. So I start to modify it, and after not dozens but sometimes hundreds of modifications I will achieve the desired deadline. In this process a time-cost trade-off algorithm would be a help.

So my investigation is about why we don't use time-cost trade-off in practice. It was invented more than 50 years ago, and tremendous embellishments have been developed since then to make it more realistic.

Regards

    Miklos

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Vladimir!

I do not want to argue with you because you are right, sometimes (but less then "most of the cases") resource constraints are phisycal, because you do not have enough place on site (like space for one crane only). Situation is not so clear in case of roads, pipelines etc, here you can start in a couple of different places parallel.

But even in your crane example you have a choice: get a faster crane (which will of course cost more money), or decide on a 10-, 12- or 16-hour working day instead of 8, which can also improve daily productivity.

Time-cost trade off is about deciding which activity or activities to speed up in order to achieve the desired project duration for the minimum extra cost (a minimal sketch of the usual classroom version is below). I think it is a good basic idea, but we do not practise it, and I would like to figure out why.
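To make the idea concrete, here is a minimal Python sketch of the greedy "crashing" heuristic usually taught with time-cost trade off: repeatedly shorten the cheapest critical activity one day at a time until the target duration is reached. The tiny network, durations and cost slopes are all hypothetical, and this is the simple classroom heuristic, not the exact original optimisation (it can overspend when several critical paths run in parallel).

# Greedy crashing sketch: hypothetical data, not a real project.
ACTS = {
    # normal duration, crash (minimum) duration, cost slope per day, predecessors
    "A": dict(normal=6, crash=4, slope=300, preds=[]),
    "B": dict(normal=8, crash=5, slope=200, preds=["A"]),
    "C": dict(normal=5, crash=5, slope=0,   preds=["A"]),   # cannot be shortened
    "D": dict(normal=7, crash=4, slope=450, preds=["B", "C"]),
}

def cpm(dur):
    """Forward and backward pass: project duration and the set of critical activities."""
    es, ef = {}, {}
    for a in ACTS:                                    # insertion order is topological here
        es[a] = max((ef[p] for p in ACTS[a]["preds"]), default=0)
        ef[a] = es[a] + dur[a]
    proj = max(ef.values())
    lf = {a: proj for a in ACTS}
    for a in reversed(list(ACTS)):                    # reverse topological order
        for p in ACTS[a]["preds"]:
            lf[p] = min(lf[p], lf[a] - dur[a])
    critical = {a for a in ACTS if lf[a] - dur[a] == es[a]}   # zero total float
    return proj, critical

def crash_to(target):
    """Shorten the cheapest critical activity, one day at a time, until the target is met."""
    dur = {a: ACTS[a]["normal"] for a in ACTS}
    extra_cost = 0
    proj, crit = cpm(dur)
    while proj > target:
        candidates = [a for a in crit if dur[a] > ACTS[a]["crash"]]
        if not candidates:
            break                                     # target cannot be reached
        a = min(candidates, key=lambda x: ACTS[x]["slope"])   # cheapest cost slope
        dur[a] -= 1
        extra_cost += ACTS[a]["slope"]
        proj, crit = cpm(dur)
    return proj, extra_cost, dur

print(crash_to(18))   # compress a 21-day plan to 18 days at minimum extra direct cost

With these made-up numbers the heuristic crashes activity B by three days for 600 units of extra direct cost; once resource limits or several parallel critical paths enter the picture, a proper optimisation model is needed instead.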

Regards

Miklós

 

Rafael Davila
User offline. Last seen 1 week 3 days ago. Offline
Joined: 1 Mar 2004
Posts: 5241

Maybe in Hungary you can hire and fire workers as if they were toilet tissue paper, but here the laws that protect labour and the unions would burn you at the very beginning of the job. Here you simply cannot do that; hiring and firing at will is considered a medieval practice. Here, if you are an open shop, the union will take over the next day, or perhaps the labour department, if you survive that long.

Some trades are scarce, like epoxy terrazzo installers, PLC programmers, instrumentation installers, laboratory cabinet installers, or rare trades like gold-leaf lettering on glass doors. Even operators of some particular heavy equipment are scarce and will not move to your job unless it is for a reasonable amount of time.

Even aligning the cardan shaft of some water and wastewater pumps requires specialized people trained at the factory, and frequently the initial start-up must be done under the supervision of a factory technician.


Some specialized labor has to be imported from the mainland USA, for example for the calibration of liquor machines that send the liquor via underground plastic piping to each station.

Huge cranes for heavy lifts are scarce, as the few available are kept busy enough to justify the investment.

We do not have an unlimited supply of concrete at our construction sites; it is limited by haul distance and plant capacity, among other things. The same goes for asphalt and borrow materials; the supply of these is somewhat limited.

In a portfolio of multiple jobs, shared resources can be an issue.

Yes we live in a very different world, Disney is at Orlando and nowhere else.

Hi Miklos,

you wrote: "In Hungary at this moment we have the experience that lack of resources is simply not an issue, especially when you need labour or machine."

Resource constraints in most cases are physical (example: there is a place for only one crane) or the result of time-cost trade off (if the job can be done on time by two resource units why use three?)

In most cases faster project completion with the same resources means lower project cost.

Regards,

Vladimir

Miklos Hajdu
User offline. Last seen 11 years 49 weeks ago. Offline
Joined: 13 May 2011
Posts: 97
Groups: GPC Malaysia

Hi Rafael,

thanks for your answer. My comments on yours will be in Italic

 

My answer to Question no 1.

How can people forget something that was not available in the most-used software packages? At the time it came out, in the mid 70's, perhaps only a couple of very expensive mainframe packages had it; among them, I believe, ARTEMIS.

Here in Hungary, when we start to teach network techniques, we start without computers and with the original CPM model. It is a real surprise to me that years later, when the students come back for extra courses or we do some consultancy at companies, the sad experience is that they have forgotten everything about the original model.

The other issue is that it is very theoretical, as most construction activities are performed by standard crews; you do not have 80% of a slab-pouring crew, and the crew is usually the same for all pour sizes unless the equipment changes. Therefore, to be closer to reality, the software must take into account variations in resources for different activity durations. Yes, contractors usually optimize the individual activities using available float and then look at the critical path for time-cost savings to apply manually.

You are absolutely right, we have a crew and not 80% of a crew, but you have the freedom to say that instead of 8 hours they will work 10 or 12 hours at a well-paid overtime rate, or you can apply two pouring crews, etc. So I think the option to make these kinds of considerations exists, but nobody works this way. Contractors could, by the way, reason that "oh, this activity has a lot of float, so I can make it slower and cheaper", but in my experience they do not act like this either.

 

Tight schedules usually do not allow the Contractor to select an optimum job duration; we are required to use the full contract duration in our Baseline Schedules. The CPM is mainly a claim tool, and you cannot submit a schedule without considering this fact.

Our most common "optimization" of activities was the time of the pour: we used to pour slabs early in the morning and finish the work during regular hours, then in the afternoon start pouring walls with a crane and a crew of four at 3 pm, with some overtime that makes that particular activity a bit more expensive but allows the wall-forms crew to work with the crane and the forming during regular hours. Yes, contractors are aware that the optimization of the whole is not necessarily the same as the optimization of the parts.

Finally, but not less relevant, I wonder whether time-cost trade-off functionality considers resource constraints or whether it allows unleveled schedules that are impractical to implement. I do not know, as I never had software with such functionality, but for some reason I believe it does not, since we studied it as a problem solved by linear programming while resource leveling was performed using other mathematical approaches.

What you are pointing to is really interesting; I wanted to exchange ideas with you on this last time as well. In Hungary at this moment our experience is that a lack of resources is simply not an issue, especially when you need labour or machines. This is true from the client side (they are simply not interested in the contractors' resources), but it is true from the contractors' side as well; we say "they go to the corner, whistle, and within five minutes they can choose among hundreds of subcontractors." We have 80,000 construction firms in Hungary, but only ten have more than a hundred physical workers. So during my years working on either the contractor or the client side, resources were never an issue. This is not the first time you cite resources and resource limits, so I guess the situation is completely different in your environment.

Resource levelling algorithms are 99% based on heuristics; exact models that optimize resource levelling, for example with integer programming, are not easy to solve in polynomial time. That is why we use heuristics instead. CPM time-cost trade off can be modelled with linear and integer programming as well (a rough sketch of the classic formulation is below), and there are hundreds of papers that discuss time-cost trade off with resource constraints, but I have never found any commercial solutions so far.
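For reference, the classic deadline version of the problem (my notation, roughly following the original linear-programming formulation; not taken from any particular package) can be written as:

\[
\begin{aligned}
\min \; & \sum_j c_j \,(D_j - d_j) \\
\text{s.t.}\; & S_j \ge S_i + d_i && \text{for every precedence link } (i,j),\\
& d_j^{\min} \le d_j \le D_j && \text{for every activity } j,\\
& S_{\text{finish}} \le T,
\end{aligned}
\]

where \(D_j\) is the normal duration, \(d_j^{\min}\) the crash duration, \(c_j\) the cost slope, \(S_j\) the start time and \(T\) the required project completion. As long as everything stays linear this solves quickly; the trouble starts when resource constraints are added and the model effectively becomes an integer program.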

 

My answer to Question no 2.

Because as Contractors we are not the ones who select the software (the Owner selects it by means of the Contract Specifications), based on our experience it seems we will never be able to use time-cost trade offs.

Because of the requirement to use all the contract time in our baseline schedules, the use of time-cost trade offs is ruled out.

Extensions of the original time-cost trade off allow handling milestones with penalties in case of delays, so from this point of view the models are close to reality.

Even if we were allowed to define the schedule duration at the optimum with some buffer time, I doubt it would see widespread use; it is still too theoretical, especially if it cannot handle resource-constraining issues at the same time.

In my practice I have found that a lot of main contractors do not care about resources; they employ everything (labour, machines, etc.) through subcontractors. Moreover, I have heard about contracts where not only is delay punished but early completion is rewarded. In such cases time-cost trade off would be an interesting tool to discover whether or not it is worth speeding up the project beyond a certain project duration (one way to write the objective is sketched below). But I agree with you: I have never heard of any project practising CPM time-cost trade offs since the personal computer age.
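One possible way to fold such penalty and bonus terms into the objective (my own notation, purely as an illustration) is:

\[
\min \; \sum_j c_j\,(D_j - d_j) \;+\; p \,\max(0,\, S_{\text{finish}} - T_m) \;-\; b \,\max(0,\, T_m - S_{\text{finish}}),
\]

where \(T_m\) is the contractual milestone, \(p\) the daily delay penalty and \(b\) the daily early-completion bonus. Roughly speaking, speeding up beyond the contract date pays off only while the daily bonus exceeds the marginal crashing cost.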

We realize this is not an exact science, and our actual performance makes the procedure impractical; our activity durations are in general too variable, though fortunately our costs are more on the bull's eye.

I have a feeling this is, in large part, a failure of our education systems. Look at risk analysis: optimistic and pessimistic durations came back after 50 years (first with Risky Project, then with the others), and we planners are slowly starting to understand and use them. Maybe this will be the situation with time-cost trade off as well.

Regards ,

    Miklós

Rafael Davila
User offline. Last seen 1 week 3 days ago. Offline
Joined: 1 Mar 2004
Posts: 5241

My answer to Question no 1.

How can people forget something that was not available in the most-used software packages at the time it came out? In the mid 70's, perhaps only a couple of very expensive mainframe packages had it; among them, I believe, ARTEMIS.

The other issue is that it is very theoretical, as most construction activities are performed by standard crews; you do not have 80% of a slab-pouring crew, and the crew is usually the same for all pour sizes unless the equipment changes. Therefore, to be closer to reality, the software must take into account variations in resource demand and availability for different activity durations and timing. Yes, contractors usually optimize the individual activities using available float and then look at the critical path for time-cost savings to apply manually.

Tight schedules usually do not allow the Contractor to select an optimum job duration; we are required to use the full contract duration in our Baseline Schedules. The CPM is mainly a claim tool, and you cannot submit a schedule without considering this fact.

Finally, but not less relevant, I wonder whether time-cost trade-off functionality considers resource constraints or whether it allows unleveled schedules that are impractical to implement. I do not know, as I never had software with such functionality, but for some reason I believe it does not, since we studied it as a problem solved by linear programming while resource leveling was performed using other mathematical approaches.

My answer to Question no 2.

Because as Contractors we are not the ones who select the software (the Owner selects it by means of the Contract Specifications), based on our experience it seems we will never be able to use time-cost trade offs.

Because of the requirement to use all the contract time in our baseline schedules, the use of time-cost trade offs is ruled out.

Even if we were allowed to define the schedule duration at the optimum with some buffer time, I doubt it would see widespread use; it is still too theoretical, especially if it cannot handle resource-constraining issues at the same time. We realize this is not an exact science, and our actual performance makes the procedure impractical; our activity durations are in general too variable, though fortunately our costs are more on the bull's eye.

I believe the same happened to true resource smoothing: not the unique and different interpretation in P3, but the reduction of peaks and changes in resource demand using mathematical procedures that minimize the sum of squares (a small sketch of that objective is below). This also disappeared as a scheduling option.
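A small Python sketch (all numbers hypothetical) of what that smoothing objective means: shift the non-critical activities within their float and keep the combination of start dates whose daily demand profile has the smallest sum of squares.

# Resource smoothing by sum of squares: brute-force toy example, hypothetical data.
from itertools import product

ACTS = {
    # duration, workers per day, earliest start, total float
    "form":   dict(dur=3, res=4, es=0, tf=0),   # critical, cannot move
    "rebar":  dict(dur=2, res=3, es=1, tf=3),
    "embeds": dict(dur=2, res=2, es=1, tf=4),
}
HORIZON = 10   # days

def profile(starts):
    """Daily resource demand for a given choice of start days."""
    load = [0] * HORIZON
    for act, start in starts.items():
        for day in range(start, start + ACTS[act]["dur"]):
            load[day] += ACTS[act]["res"]
    return load

def smooth():
    """Try every start within each activity's float; keep the flattest profile.
    Precedence between the movable activities is ignored to keep the sketch short."""
    names = list(ACTS)
    windows = [range(ACTS[a]["es"], ACTS[a]["es"] + ACTS[a]["tf"] + 1) for a in names]
    best_score, best_starts = None, None
    for combo in product(*windows):
        starts = dict(zip(names, combo))
        score = sum(r * r for r in profile(starts))   # sum of squares penalises peaks
        if best_score is None or score < best_score:
            best_score, best_starts = score, starts
    return best_starts, best_score

print(smooth())

Real smoothing algorithms obviously respect the network logic and use smarter search than brute force, but the squared-demand objective is the part that quietly disappeared from the tools.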

Regards,

Rafael