Planning Fallacy, Optimism Bias, and the Anchoring Effect - the struggle is real

To explore the topics of the planning fallacy, optimism bias, and the anchoring effect, I have chosen to communicate them through my own learning curve in the early 1990s. I was just getting introduced to estimating by a seasoned (crusty) sheet metal worker who had worked his way up to the role of General Manager; we will call him GM. GM told me what production multipliers to use, but I wanted to know how he knew they were accurate, and he would say he "just knew." I had no estimating experience and, like a toddler, drove him crazy with my constant "why" questions. Finally, out of frustration, he put me in the field as part of a roofing crew so "I would know why." This started my love of production numbers, variables, job costing, WBS, and tracking actuals against estimates. He unintentionally created a statistical monster.

 

What GM's lesson (punishment) did was begin to clearly demonstrate all three of our subject topics. I learned from data gathering that GM's production numbers were way off, but the projects he bid tended to come in within +/- 5% of his total bid costs. The basis for GM's estimates was (what he believed were) his production numbers from when he was doing similar work over twenty years prior. He estimated that metal roofing could be installed at 25 square feet per hour and flashings at 100 to 120 lineal feet per hour. He had a definite optimism bias when it came to flashings. What my data showed (depending on a lot of circumstances) was that the roofing was being installed at 40 to 60 square feet per hour and the flashings at 10 to 40 lineal feet per hour. Because his numbers were so skewed, when we had a job with a lot of flashings, we lost money; when he bid a job that was heavy on the square footage of roofing, we were always too high. As we fine-tuned the estimating process based on real data, we still consistently battled his planning fallacy. Even when I proved his production rates were clearly inaccurate, he continued to use his methods for forecasting. GM understood and even accepted the data, but consistently repeated the erroneous calculations; it drove me crazy. At the time, I didn't realize there was a name for it: the planning fallacy. "The intriguing aspect of this phenomenon is the ability of people to hold two seemingly contradictory beliefs: Although aware that most of their previous predictions were overly optimistic, they believe that their current forecasts are realistic" (Buehler, Griffin, & Ross, 1994).
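To put numbers to it, here is a minimal sketch using the production rates from the story; the two job mixes are hypothetical, and midpoints stand in for the measured ranges. It shows why the bid error flipped sign depending on the type of job:

```python
# A minimal sketch: GM's rates vs. field-measured rates on two hypothetical
# job mixes. Quantities are invented; rates come from the story above.

def labor_hours(roof_sqft, flash_lf, roof_rate, flash_rate):
    """Labor hours = quantity / production rate, summed across work types."""
    return roof_sqft / roof_rate + flash_lf / flash_rate

jobs = {
    "flashing-heavy": {"roof_sqft": 2_000, "flash_lf": 3_000},
    "roofing-heavy":  {"roof_sqft": 20_000, "flash_lf": 500},
}

for name, q in jobs.items():
    # GM's rates: 25 sq ft/hr roofing, ~110 lineal ft/hr flashings
    bid = labor_hours(**q, roof_rate=25, flash_rate=110)
    # Field-measured midpoints: ~50 sq ft/hr roofing, ~25 lineal ft/hr flashings
    actual = labor_hours(**q, roof_rate=50, flash_rate=25)
    print(f"{name}: bid {bid:.0f} hrs, actual {actual:.0f} hrs, "
          f"error {bid - actual:+.0f} hrs")

# flashing-heavy: bid 107 hrs, actual 160 hrs, error -53 hrs  -> underbid, lost money
# roofing-heavy:  bid 805 hrs, actual 420 hrs, error +385 hrs -> overbid, too high
```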

 

As my career advanced and work became more complex, I still gathered data like a fiend to help with estimating, but that doesn't provide a real-time picture of what is happening in the field for forecasting (it's just another source of information). When updating schedules and working on balance-to-complete estimates, I always relied heavily on the people performing the work as the key information source; they are, after all, the ones doing the work. Not only do I find I am able to obtain better information, but involving them in the process builds trust and their commitment to the dates. I found that even when they are a bit too optimistic (which is what we are talking about), "they typically meet deadlines" (Buehler, Griffin, & Ross, 1994), so it makes sense to develop forecasts and schedules this way. It is important that we understand our role as PMs in this process: to be the voice of reason, and to recognize that the planning fallacy is normal and must be challenged, though delicately. We must not allow unrealistic commitments from those who are telling us what we want to hear, and what they will (wish to) accomplish (under perfect conditions). We cannot let our own optimism bias get in the way of a realistic view of the conditions and facts. We cannot allow our team to set themselves up for failure with unrealistic expectations, even from themselves. Learning how to facilitate and draw out reasonably accurate information from experienced people can take time and patience; it's a learning curve because people are individuals. We also must recognize and deal with our own biases in the process.
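As one way to picture how the two information sources play together, here is a minimal sketch of a balance-to-complete check built on the standard earned value formulas (Humphreys, 2018); the idea of setting the data-driven figure beside the crew's own number is purely illustrative, and all the values are made up:

```python
# A minimal sketch of a balance-to-complete (estimate-to-complete) check
# using standard earned value formulas. All numbers are hypothetical.

def data_driven_etc(bac, ev, ac):
    """Estimate to complete: remaining budgeted work scaled by cost efficiency."""
    cpi = ev / ac                    # cost performance index = earned / actual
    return (bac - ev) / cpi

bac = 1_000.0  # budget at completion, labor hours
ev = 400.0     # earned value: budgeted hours for the work actually done
ac = 500.0     # actual hours spent to date

etc_data = data_driven_etc(bac, ev, ac)  # (1000 - 400) / 0.8 = 750 hrs
etc_crew = 700.0                         # the crew's own bottom-up look-ahead

# A wide gap between the two is the conversation starter: the data says one
# thing, the people doing the work say another, and both deserve a hearing.
print(f"ETC from data: {etc_data:.0f} hrs | ETC from crew: {etc_crew:.0f} hrs")
print(f"EAC (data-driven): {ac + etc_data:.0f} hrs vs. budget {bac:.0f} hrs")
```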

 

Optimism bias was one of the hardest lessons to learn about and overcome. Interestingly, another seasoned (even more crusty) sheet metal worker helped our team overcome this obstacle. Being young, the only female on the crew, and eager to build a positive team, I planned like a mad woman to make sure we had everything we could possibly need before we started a job. I planned for everything. Slowly, over time, I gained the trust of the guys and we experienced great success together, but we also fell victim to optimism bias one winter. With years of experience under my belt, I watched the weather like a hawk because it was a risk I could not control. Steep-slope metal roofing in freezing conditions is not something that should be attempted; we all knew better. There was no way to maintain adequate production rates and stay safe, and safety was our number one priority. So, one unseasonably bad winter, we let our optimism bias convince us to try it; after all, we weren't like most crews, we were special. After weeks of poor production, I discussed stopping work with the crew. Everyone thought they could pick up the pace and wanted to keep trying. My bobble head bobbing, I drank the optimism Kool-Aid, and we continued until someone fell. Luckily, his fall protection did its job and he wasn't hurt, but that was enough to wake me up. I stopped the work and we conducted a lessons-learned session. During the debrief, one of the crew members (more crusty, Bill) let us all have it. Bill told us how stupid everyone was for attempting to work in those conditions. He was right. When I asked him why he hadn't spoken up prior, he said that no one had asked him directly. Bill and I became pretty good buddies after that. I learned that Bill was my doom-and-gloom guy; no optimism bias in his entire body. Anytime I wanted a worst-case picture of a project Bill was on, he and I would have a private discussion. It's hard to recognize our own optimism biases, so it's helpful to consult someone with different optics to check yourself.

 

I also learned the risks of sharing certain information with the field crews. Not because I didn't trust them, but because it impeded their ability to provide me accurate and unbiased information. Early in my PM career, when I would ask the crew how long they thought it would take to complete a task, they would immediately ask me how much time they had left in the estimate. Inevitably, their responses would be anchored around whatever number I provided. I was disappointed several times before I realized the issue was the anchoring effect. When I stopped providing that information, I received much more accurate results, often very different from my estimate. This opened the door for examination of risks that were lurking, not yet visible in the data. When they gave me numbers that were way off from what I expected, it sparked conversations that needed to take place, alerting the team to pending issues while we still had time to mitigate impacts. Once the field staff understood why I wouldn't share the information, this simple step significantly improved our communication, collaboration, risk planning, and overall team dynamic. Simply stated, I valued their knowledge and input, without influence.
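For those who like to see the mechanics, here is a minimal sketch of the idea; the 20% divergence threshold is hypothetical, and the point is only that the comparison happens after the crew's number arrives, never before:

```python
# A minimal sketch of the unanchored-estimate check. The crew never sees
# budget_remaining_hrs, so their number arrives free of the anchor; the
# comparison against the budget happens privately, afterward.

def needs_risk_talk(crew_estimate_hrs, budget_remaining_hrs, threshold=0.20):
    """Flag a risk conversation when the crew's independent estimate diverges
    from the remaining budget by more than the threshold, either direction."""
    gap = abs(crew_estimate_hrs - budget_remaining_hrs) / budget_remaining_hrs
    return gap > threshold

# Crew says 320 hrs; the estimate has 200 hrs left -> 60% over, time to talk.
print(needs_risk_talk(crew_estimate_hrs=320, budget_remaining_hrs=200))  # True
```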

 

In some situations, when sourcing input from your team, you will need to avoid providing base data that may unintentionally corrupt their ability to think independently and to remain objective. The struggle is real.

 

~ Khris Beyer

References

Buehler, R., Griffin, D., & Ross, M. (1994). Exploring the "planning fallacy": Why people underestimate their task completion times. Journal of Personality and Social Psychology, 67(3), 366-381.

Forsyth, D. K., & Burt, C. D. (2008). Allocating time to future tasks: The effect of task segmentation on planning fallacy bias. Memory & Cognition, 36(4), 791-798.

Gardner, D. (2009). The science of fear: How the culture of fear manipulates your brain. New York, NY: Plume.

Humphreys, G. C. (2018). Project management using earned value (4th ed.). Irvine, CA: Humphreys & Associates.

#behavioralprojectmanagement #neuroprojectmanagement #decisionscience #behavioralscience #cognitivebiases #planningfallacy #optimismbias #anchoringeffect #behavioraleconomics #neuralplan #behaviorinformedprojectdesign 

