Safety, Risk, and Accident Prevention

Projects are inherently riskier

Projects are time-pressured environments, and that time constraint pushes the brain toward decisions that are usually more error-prone than those made in an operations environment, where time pressure may be lower (Forstmann et al., 2008).

Time pressure:

  • Increases reliance on cognitive biases
  • Pushes people back onto old habits and mental rules of thumb, known as heuristics
  • May impair ethical behavior
  • Can increase risk-taking
  • Reduces prediction accuracy in forward-looking activities and behaviors

Forstmann, B. U., Dutilh, G., Brown, S., Neumann, J., von Cramon, D. Y., Ridderinkhof, K. R., & Wagenmakers, E.-J. (2008). Striatum and pre-SMA facilitate decision-making under time pressure. Proceedings of the National Academy of Sciences of the United States of America, 105(45), 17538-17542.

 

Project cultures and the social dynamics of organizations

Organizations are also subject to social dynamics that increase risky behavior and, with it, the probability of accidents. A whole host of factors can make an organizational culture more prone to safety and risk problems; some stem from how much risk the culture collectively accepts as normal, and some link directly to leadership and management. Examples include:

  • Organizational silence - the tendency of employees to stay quiet rather than speak up about potentially dangerous safety and risk issues
  • Normalization of deviance - the tendency of people in an organization to gradually come to accept risky behaviors as socially normal

 

Indicators may be in places you hadn't thought to look

Some issues are measurable, and their indicators can sit in obscure places. Optimism bias and deliberate ignorance, for example, can be strong indicators of risk exposure in your organization (Sharot et al., 2007; Kutsch & Hall, 2010). Planning and forecasting accuracy can also reveal signs of social pressure, such as strategic misrepresentation, which stems from the same psychological dynamics that produce organizational silence (Flyvbjerg, 2008).
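
One concrete illustration: the gap between planned and actual cost across completed projects is a measurable indicator of optimism bias in planning, the kind of signal that reference class forecasting (Flyvbjerg, 2008) builds on. The short Python sketch below uses hypothetical project names and figures to show the idea; the calculation, not the numbers, is the point.

```python
# Minimal sketch: estimating optimism bias from planned vs. actual project costs.
# The project names and cost figures are hypothetical, for illustration only.

planned_vs_actual = {
    "Project A": (1_000_000, 1_350_000),  # (planned cost, actual cost)
    "Project B": (2_500_000, 2_400_000),
    "Project C": (800_000, 1_200_000),
    "Project D": (1_600_000, 2_100_000),
}

overruns = []
for name, (planned, actual) in planned_vs_actual.items():
    overrun = (actual - planned) / planned  # +0.35 means 35% over the plan
    overruns.append(overrun)
    print(f"{name}: {overrun:+.0%} vs. plan")

mean_overrun = sum(overruns) / len(overruns)
share_over_budget = sum(o > 0 for o in overruns) / len(overruns)

# A persistently positive mean overrun across many projects is one measurable
# signal that plans are systematically optimistic (or strategically misrepresented).
print(f"Mean overrun: {mean_overrun:+.0%}")
print(f"Share of projects over budget: {share_over_budget:.0%}")
```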


References

Flyvbjerg, B. (2008). Curbing optimism bias and strategic misrepresentation in planning: Reference class forecasting in practice. European Planning Studies, 16(1), 3-21.

Grossman, Z. (2014). Strategic ignorance and the robustness of social preferences. Management Science, 60(11), 2659-2665.

Kutsch, E., & Hall, M. (2010). Deliberate ignorance in project risk management. International Journal of Project Management, 28(3), 245-255.

Pessemier, W. (2008). Improving safety performance by understanding perceptions of risk and improving safety management systems. Public Entity Risk Institute.

Sharot, T., Riccardi, A. M., Raio, C. M., & Phelps, E. A. (2007). Neural mechanisms mediating optimism bias. Nature, 450(7166), 102-105. https://doi.org/10.1038/nature06280

