Have you ever had someone on your project team who seemed eternally optimistic, so the team struggled against an unrealistic deadline and inevitably missed the schedule? Have you noticed how we tend to look for information that confirms what we already believe? Or how we tend to prefer our own team over "those other guys"?
Those are three different cognitive biases: optimism bias, confirmation bias, and ingroup bias. All three have been studied in psychology (and some in neuroscience), with measurable results that show how they impact real-life decision making.
A cognitive bias is a systematic pattern of thinking that deviates from rational judgment: an error in reasoning, evaluating, or recalling that enables faster decisions. The trouble with biases is that we are often unaware we are using them. One reason we fail to recognize them is that they are part of the automatic system in our brain. Our minds generally operate using a dual-process model (Kahneman, 2011; Stanovich & West, 2000): System one, the automatic, quick-thinking system, and System two, the slower, more deliberative system.
System one is often described as the unconscious system: automatic, fast, and efficient at running on data and experiences it already knows (or thinks it knows). It requires far less energy and attention, and you operate in this mode for roughly 95% of your day. System one is also more prone to errors, with little awareness that it is making them. It is partially responsible for your "gut feeling" and intuition: System one compiles memories of past experiences in fractions of a second to give you an instant, intuitive sense of 'good' or 'bad.' However, as any reading on confirmation bias will tell you, the brain is constantly searching for information that supports what it already believes. Your memories and experiences are filtered through the lens of what you believe the situation to be, and those biased, filtered memories are then fed back into System one later, further biasing your present and future decisions.
System one is a complex animal, wouldn’t you agree?
So, with over 150 documented biases, how do we begin to solve the problem? And why do these biases matter? To give one example of the real, material impact bias can have: each year of delay stemming from an optimistic project plan has been shown to increase cost overruns by about 4.6% (Flyvbjerg et al., 2004).
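To make that figure concrete, here is a minimal back-of-the-envelope sketch. The budget and delay are hypothetical numbers of my choosing; it simply assumes the roughly 4.6%-per-year-of-delay effect applies linearly, in line with how Flyvbjerg et al. report it.

```python
# Hypothetical illustration of the Flyvbjerg et al. (2004) finding that
# each year of delay adds roughly 4.6% to a project's cost overrun.
# Budget and delay below are made-up example values.

OVERRUN_PER_YEAR = 0.046  # ~4.6% additional overrun per year of delay

def delay_overrun(budget, years_of_delay, rate=OVERRUN_PER_YEAR):
    """Estimate the extra cost attributable to schedule delay alone,
    assuming a linear per-year effect."""
    return budget * rate * years_of_delay

budget = 100_000_000  # hypothetical $100M project
extra = delay_overrun(budget, years_of_delay=2)
print(f"Extra overrun from a 2-year delay: ${extra:,.0f}")
```

On a $100M project, a two-year delay alone would account for roughly $9.2M of overrun before any other risk factor is considered.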
Though there is no single cure-all for biases, there are strategies that can keep some of them at bay. Here are a few:
- Raise awareness. People will self-monitor to a degree, as long as they know what to monitor. This begins with training the organization on biases that may be applicable to their discipline.
- Reduce time pressure when big decisions need to be made, especially decisions involving risk, uncertainty, and anything associated with prediction and forecasting. Time pressure induces greater use of System one, and with it comes a whole host of decision errors and increased reliance on biases (Kirchler et al., 2017).
- Be aware of the automatic system (System one) and its effects on decision making. You must make a conscious effort to think through a problem and use System two, or else the automatic system may prevail.
- Consider choice architecture as a way of creating defaults that influence decisions in the right direction.
- Train specifically for certain activities and the biases prevalent in those areas. Planning and forecasting, for example, have specific steps that can be taken to increase accuracy through bias mitigation, reduce risk exposure, and create more reliable plans.
With roughly 70% of project performance attributable to human factors, de-biasing the project is of utmost importance. And since about half of all projects miss their cost and schedule objectives, we have room for improvement, especially given that projects account for roughly $16 trillion of global Gross Domestic Product each year. The next frontier in project management is de-biasing and managing the human factor through Behavioral Project Management!
Elsbach, K. D., & Hargadon, A. B. (2006). Enhancing Creativity Through "Mindless" Work: A Framework of Workday Design. Organization Science, 17(4), 470-483.

Flyvbjerg, B., Holm, M. S., & Buhl, S. L. (2004). What Causes Cost Overrun in Transport Infrastructure Projects? Transport Reviews, 24(1), 3-18.

Kahneman, D. (2011). Thinking, fast and slow. London: Allen Lane.

Kirchler, M., Andersson, D., Bonn, C., Johannesson, M., Sørensen, E. Ø., Stefan, M., Tinghög, G., … Västfjäll, D. (2017). The effect of fast and slow decisions on risk taking. Journal of Risk and Uncertainty, 54(1), 37-59.

Stanovich, K., & West, R. (2000). Individual differences in reasoning: Implications for the rationality debate? Behavioral and Brain Sciences, 23(5), 645-665.
Josh Ramirez, PMP, MSM-PM, is a consultant at Evanclaer with experience in business operations management, project management, and project controls. He has worked at several national laboratories and on other projects throughout the Department of Energy, and is pursuing a Ph.D. in business psychology. He holds a master's degree in project management, is an adjunct professor of project management, and conducts training courses that integrate the behavioral sciences with project management. Josh writes about culture and behavior, as well as metrics and KPIs.