Behavioral Concepts

Here is a list of major behavioral concepts that apply to project management, with brief definitions. There are over 100 categories, constructs, theories, biases, and other behavioral and neuroscience elements applicable to project management; they will be listed here as each is ready to be published on the site.

(Institute members will have access to additional recommended mitigations for some behavioral issues as those recommendations become available.)

The image below is an example of the bias mapping the Institute is currently researching to match cognitive biases to project management processes:

 

Time Pressure

Time pressure can cause System 1 processes to become more dominant (Kahneman, 2011). Because the automatic System 1 relies on fast thinking and mental shortcuts, time pressure changes cognitive processing, and its impact on decision making has been studied in many disciplines (Nepal, Park, & Son, 2006; Sarter & Schroeder, 2001). Time pressure also increases affective (emotional) decision making (Finucane, Alhakami, Slovic, & Johnson, 2000), and creativity may be reduced under the perceived pressure of time (Elsbach & Hargadon, 2006). As time pressure increases, System 1 becomes dominant, biases and heuristics increase, creativity declines, affective decision making increases, and risky decision making can increase (Kirchler et al., 2017). An exception to erroneous decision making under time pressure occurs when recognition-primed decisions rely on learned patterns (Klein, 2008).

Because a project is, by definition, constrained by time, time pressure is one of the key behavioral elements in Behavioral Project Management.

 

Planning Fallacy

Kahneman and Tversky (1979) found that people tend to underestimate task durations. This finding is critical to project management, because projects and temporary organizations are made up of a series of tasks (Lundin & Söderholm, 1995) and rely on the completion of those tasks to deliver an outcome within a specific period of time. The planning fallacy is a form of optimism bias that contributes to unrealistic project planning (Peetz, Buehler, & Wilson, 2010). This biased tendency toward optimism has also been identified in fMRI studies: people expect their future to carry a decreased risk of negative events despite having little evidence to support that expectation (Sharot, Riccardi, Raio, & Phelps, 2007). Optimism bias (Costa-Font, Mossialos, & Rudisill, 2009) is also known as unrealistic optimism (Weinstein, 1980).

It should be noted that many elements can contribute to the planning fallacy, such as optimism bias, the overconfidence effect, deliberate ignorance (also known as the ostrich effect), and the anchoring effect, to name a few.
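One practical way to counter the planning fallacy is to confront the team's intuitive estimate with historical actuals. The Python sketch below is a minimal, hypothetical illustration of that idea (the task data and the use of a median ratio are assumptions, not a method prescribed by the sources above): it derives an "optimism factor" from past estimated-versus-actual durations and applies it to a new estimate.

```python
# Minimal sketch: tempering an optimistic estimate with historical actuals.
# All task data below are hypothetical, for illustration only.

from statistics import median

# (estimated_days, actual_days) for previously completed, similar tasks
history = [
    (5, 8),
    (10, 13),
    (3, 3),
    (8, 12),
]

# Median ratio of actual to estimated duration (the "optimism factor")
optimism_factor = median(actual / estimated for estimated, actual in history)

new_estimate_days = 6            # the team's intuitive (inside-view) estimate
adjusted_days = new_estimate_days * optimism_factor

print(f"Optimism factor: {optimism_factor:.2f}")
print(f"Adjusted estimate: {adjusted_days:.1f} days")
```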

 

Dual-System Theory

There are two primary systems of cognition. The automatic, fast, and non-conscious system is referred to as intuitive thinking, or System 1; the more controlled, slow, and conscious thinking is known as System 2 (Stanovich & West, 2000). Many cognitive biases and heuristics result from the automatic and intuitive System 1 (Kahneman, 2011).

 

Inertia

Inertia is the endurance of a steady, unchanged state; it is associated with inaction and closely related to status quo bias (Madrian & Shea, 2001). Decision inertia can be mitigated through choice-architecture methods such as setting defaults.

Inertia can be seen in many areas of project management. In a planning session, for example, inertia appears when familiar, standard processes take precedence over new processes that demand more mental energy.
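As a small illustration of setting a default, the Python sketch below makes a risk-review step opt-out rather than opt-in, so that inaction leaves the safer behavior in place. This is a hypothetical example; the names and structure are assumptions for illustration, not a prescribed implementation.

```python
# Minimal sketch of a choice-architecture default: the risk review is
# scheduled unless someone actively opts out, so inertia works in favor
# of the safer behavior. Names and structure are hypothetical.

from dataclasses import dataclass

@dataclass
class PhasePlan:
    name: str
    include_risk_review: bool = True   # default: review happens unless opted out

def schedule_phase(plan: PhasePlan) -> list[str]:
    activities = [f"Execute {plan.name}"]
    if plan.include_risk_review:
        activities.append(f"Risk review for {plan.name}")
    return activities

# Inaction (no explicit choice) still yields a risk review:
print(schedule_phase(PhasePlan("Design")))
# Opting out requires a deliberate decision:
print(schedule_phase(PhasePlan("Design", include_risk_review=False)))
```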

 

Ostrich Effect (Deliberate Ignorance)

Have you ever turned your head away from something you don't want to see? Or had a thought of failure enter your mind and quickly tried to clear it from your head in an effort to deny its existence?

This is known in several different disciplines as:

  • Deliberate Ignorance (Kutsch & Hall, 2010)
  • Willful Ignorance (Ramasesh & Browning, 2014)
  • The Ostrich Effect (Karlsson, Loewenstein, & Seppi, 2009)
  • Strategic Ignorance (Van der Weele, 2012)

According to Kutsch and Hall (2010), there are really two ways to look at ignorance: plain and simple error (unintentional) and irrelevance (more intentional).

Irrelevance is broken down into three subdomains:

  1. Untopicability - Information that is considered off topic, the most obvious kind of irrelevance. Risks and other potentially pertinent information are limited or set aside because they are judged to fall outside the range of importance in the given scenario.
  2. Taboo - A "moral and/or cautionary restriction placed on action based on what is deemed inappropriate." This is a significant subdomain: a subject is socially uncomfortable, and raising it might challenge our unrealistic view of the project, so exposure to project risk causes anxiety and no one discusses the issues.
  3. Undecidability - The inability to settle on a true-or-false answer. If there is a lack of data for predicting a risk, it is easy for stakeholders to take the 'out' of not knowing whether the risk is real. The team deems the risk not pertinent, and it gets scratched from the risk register.

 

Availability Heuristic

Availability is a judgmental heuristic in which one evaluates the probability of events based on the ease with which instances come to mind (Tversky & Kahneman, 1973). The relevance of an event is affected by size, frequency, likelihood, and vividness (Rothman & Hardin, 2016; Shedler & Manis, 1986; Tversky & Kahneman, 1973).

The availability heuristic functions in several ways:

  • Memory of an event is more prevalent with larger, more frequent instances (Tversky & Kahneman, 1973).
  • Likelier events are easier to imagine (Tversky & Kahneman, 1973).
  • Events that are most “vivid” in memory are easier to recall (Shedler & Manis, 1986). Vivid is defined as “emotionally interesting,” “concrete and imagery provoking,” and “proximate in a sensory, temporal, or spatial way.”

Previous research has identified risk awareness as a mediator between overconfidence and risk assessment in projects, and the availability heuristic has been suggested as one explanation for why project managers might be overconfident and perceive risks as less threatening. Overconfidence could reduce the availability of risks in memory, and reduced availability of risks might decrease the perceived threat of risk occurrence. If the availability of risks in memory is low, project managers may find it difficult to recall occasions when risks occurred, reducing estimated risk occurrence probabilities (Fabricius & Büttgen, 2015). This is consistent with the availability heuristic, which suggests that the estimated likelihood of an event is based on the availability of that event in memory (Schwarz, Bless, Strack, & Klumpp, 1991; Tversky & Kahneman, 1973).

The formulation of activity durations may also be affected by availability. As people plan projects, task duration estimates are based on their ability to recall similar relevant events. In the face of planning uncertainty, judgments based on the instances that can be brought to mind are limited by the vividness and recency of those events (Son & Rojas, 2011, p. 148).

As a project comes to a close, documenting lessons learned is recognized as a best practice for transferring information to future phases or projects (Project Management Institute, 2017). The tendency to recall the events most recent and available in memory may affect the documentation of lessons learned, especially at the end of a long project or phase. Given that the relevance and recall of an event are affected by frequency, likelihood, and vividness (Rothman & Hardin, 2016; Shedler & Manis, 1986; Tversky & Kahneman, 1973), future phases and projects may be at greater risk because of limited recall and, consequently, limited documentation of events. When the documenter can recall only some of the relevant risks and opportunities, future projects lose part of their potential to apply risk mitigations and take advantage of opportunities to improve outcomes.
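One way to reduce reliance on what the planner or documenter happens to remember is to ground probability estimates in recorded frequencies rather than recall. The Python sketch below is a hypothetical illustration of that idea; the risk categories, counts, and field layout are invented for the example.

```python
# Minimal sketch: estimating risk occurrence from a logged register rather
# than from memory. All categories and counts are hypothetical.

from collections import Counter

# Each entry: (risk_category, occurred) recorded across past projects
risk_log = [
    ("supplier_delay", True),
    ("supplier_delay", False),
    ("scope_change", True),
    ("scope_change", True),
    ("key_staff_loss", False),
    ("supplier_delay", True),
]

logged = Counter(category for category, _ in risk_log)
occurred = Counter(category for category, hit in risk_log if hit)

for category in logged:
    frequency = occurred[category] / logged[category]
    print(f"{category}: occurred in {frequency:.0%} of logged instances")
```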

 

Algorithm Aversion

Algorithms are all around us, from doctors relying on models to identify outliers in breast cancer scans to project managers using a statistical model to predict how long the next phase of their project will last. While some look at algorithms and see nothing but benefits, others are more hesitant. In fact, some of us are distrustful of algorithms in general, especially after we have watched them perform and, inevitably, seen them make one mistake or another (this has recently been termed algorithm aversion; Dietvorst, Simmons, & Massey, 2015). In contrast to fellow human beings, we don’t expect algorithms to make mistakes. We have some sort of ‘perfection scheme’ in our minds (Prahl & Van Swol, 2017): it is a computer program, therefore it must be logical and infallible. When in fact we know that, no matter the source, forecasts are always wrong! Should we then throw away our prediction algorithms? Of course not. Studies show time and again that algorithms outperform human judgment.

However, we don’t need to trust algorithms blindly either. Remember the man who followed his GPS and drove straight off a cliff? In a similar fashion, we need to trust, but not blindly trust, algorithms in our project management adventures. A prediction model can form a solid base for your plan; however, you may know additional information that the model does not. Let’s say you heard through the grapevine that your main supplier is having stock issues. That warrants some adjustment to the resource acquisition phase your model has laid out. But it is exactly in these adjustments that the danger lies: we are prone to biases that translate into fiddling with the numbers in ways that harm accuracy. We are, for instance, extraordinarily optimistic, and we may adjust the prediction toward much shorter project phases when that optimism isn’t based on reality. So when should we adjust and when not? When should we trust the system and when not? In our NeuralPlan certification course we discuss the way forward: how do we trust algorithms, when do we trust them, and when can we trust ourselves?
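One middle ground between blind trust and outright rejection is to let human judgment adjust the model’s output only within an explicit bound, so that domain knowledge (such as the supplier rumor above) can move the forecast without optimism quietly overwhelming it. The Python sketch below is a hypothetical illustration; the 20% bound and the figures are assumptions, not recommendations from the sources cited.

```python
# Minimal sketch: combining a model forecast with a bounded human adjustment.
# The figures and the 20% bound are hypothetical, for illustration only.

def bounded_adjustment(model_days: float, human_days: float,
                       max_relative_change: float = 0.20) -> float:
    """Accept the human override only within ±max_relative_change of the model."""
    lower = model_days * (1 - max_relative_change)
    upper = model_days * (1 + max_relative_change)
    return min(max(human_days, lower), upper)

model_forecast = 40.0       # days predicted by the statistical model
optimistic_override = 25.0  # the planner's gut-feel revision

final = bounded_adjustment(model_forecast, optimistic_override)
print(f"Final duration used for the plan: {final:.1f} days")  # 32.0
```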

 

 

 

Join your peers and become a member of the most advanced project management endeavor, the building of #projectscience through the neuro, behavioral, and cognitive sciences! Behavioral Economics has made great strides, so what are we waiting for?

 

 

References

Costa-Font, J., Mossialos, E., & Rudisill, C. (2009). Optimism and the perceptions of new risks. Journal of Risk Research, 12(1), 27-41.

Dietvorst, B. J., Simmons, J. P., & Massey, C. (2015). Algorithm aversion: People erroneously avoid algorithms after seeing them err. Journal of Experimental Psychology: General, 144(1), 114-126.

Elsbach, K. D., & Hargadon, A. B. (2006). Enhancing creativity through “mindless” work: A framework of workday design. Organization Science, 17(4), 470-483.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1-17.

Kahneman, D., & Tversky, A. (1979). Intuitive prediction: Biases and corrective procedures. TIMS Studies in Management Science, 12, 313-327.

Kahneman, D. (2011). Thinking, fast and slow. London: Allen Lane.

Karlsson, N., Loewenstein, G., & Seppi, D. (2009). The ostrich effect: Selective attention to information. Journal of Risk and Uncertainty, 38(2), 95-115.

Kirchler, M., Andersson, D., Bonn, C., Johannesson, M., Sørensen, E. Ø., Stefan, M., Tinghög, G., ... Västfjäll, D. (2017). The effect of fast and slow decisions on risk taking. Journal of Risk and Uncertainty, 54(1), 37-59.

Klein, G. (2008). Naturalistic decision making. Human Factors, 50(3), 456-460.

Kutsch, E., & Hall, M. (2010). Deliberate ignorance in project risk management. International Journal of Project Management, 28(3), 245-255.

Lundin, R. A., & Söderholm, A. (1995). A theory of the temporary organization. Scandinavian Journal of Management, 11(4), 437-455.

Madrian, B., & Shea, D. (2001). The power of suggestion: Inertia in 401(k) participation and savings behavior. Quarterly Journal of Economics, 116, 1149-1187.

Nepal, M. P., Park, M., & Son, B. (2006). Effects of schedule pressure on construction performance. Journal of Construction Engineering and Management, 132(2), 182-188.

Peetz, J., Buehler, R., & Wilson, A. (2010). Planning for the near and distant future: How does temporal distance affect task completion predictions? Journal of Experimental Social Psychology, 46(5), 709-720.

Prahl, A., & Van Swol, L. M. (2017). Towards an understanding of algorithm aversion: Why do decision-makers discount advice from automation? Journal of Forecasting, 36(6), 691-702.

Project Management Institute. (2017). A Guide to the Project Management Body of Knowledge (6th ed.). Newtown Square: Project Management Institute.

Ramasesh, R. V., & Browning, T. R. (2014). A conceptual framework for tackling knowable unknown unknowns in project management. Journal of Operations Management, 32(4), 190-204.

Rothman, A. J., & Hardin, C. D. (2016). Differential use of the availability heuristic in social judgment. Personality and Social Psychology Bulletin, 23(2), 123-138.

Sarter, N. B., & Schroeder, B. (2001). Supporting decision making and action selection under time pressure and uncertainty: The case of in-flight icing. Human Factors, 43(4), 573-583.

Schwarz, N., Bless, H., Strack, F., Klumpp, G., et al. (1991). Ease of retrieval as information: Another look at the availability heuristic. Journal of Personality and Social Psychology, 61(2), 195-202.

Sharot, T., Riccardi, A. M., Raio, C. M., & Phelps, E. A. (2007). Neural mechanisms mediating optimism bias. Nature, 450(7166), 102-105. https://doi.org/10.1038/nature06280

Shedler, J., & Manis, M. (1986). Can the availability heuristic explain vividness effects? Journal of Personality and Social Psychology, 51(1), 26-36. https://doi.org/10.1037/0022-3514.51.1.26

Son, J., & Rojas, E. M. (2011). Impact of optimism bias regarding organizational dynamics on project planning and control. Journal of Construction Engineering and Management, 137(2), 147-157. https://doi.org/10.1061/(ASCE)CO.1943-7862.0000260

Stanovich, K., & West, R. (2000). Individual differences in reasoning: Implications for the rationality debate? Behavioral and Brain Sciences, 23(5), 645-665.

Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207-232.

Van der Weele, J. J. (2012). When ignorance is innocence: On information avoidance in moral dilemmas. SSRN. https://doi.org/10.2139/ssrn.1844702

Weinstein, N. D. (1980). Unrealistic optimism about future life events. Journal of Personality and Social Psychology, 39(5), 806-820.

 

 

