Meltdown: What Plane Crashes, Oil Spills, and Dumb Business Decisions Can Teach Us About How to Succeed at Work and at Home

Penguin Press

15 min read
9 take-aways
Audio & text

What's inside?

Advanced technological progress puts society at risk of catastrophic system failure.

Editorial Rating



  • Eye Opening
  • Concrete Examples
  • Engaging


Former derivatives trader Chris Clearfield and management professor András Tilcsik assess modern systems’ frightening potential for failures such as nuclear reactor meltdowns, transportation disasters, contaminated water supplies and hacked defibrillators. They explore what went wrong in various crises without becoming too technical, and they devote two-thirds of this coherent, well-organized argument to research-backed strategies for addressing the risk of failure. A nuclear meltdown might not seem related to daily life, but these failure-prevention strategies apply to many issues. The authors provide tools for staving off disaster in complex environments.


Complexity makes modern systems more useful yet more vulnerable to catastrophic failure.

Many complex systems intertwine with your life. They include utilities such as electricity and water, transportation systems, communication networks and many more.

Sometimes, these systems fail. Such failures are more common than you may realize, and they relate to society’s technological progress. Many technologically advanced systems are incredibly complex with slim tolerances for error. Mistakes can become catastrophic.

Acknowledging small problems and learning from them helps you find larger threats.

The 1979 meltdown of the Three Mile Island nuclear reactor in Harrisburg, Pennsylvania, was a complex system failure, but it wasn't caused by a major system malfunction. The source was an interaction among small problems with the plumbing, a valve and an indicator light. 

The small problems caused such an outsized failure – a reactor meltdown – for two reasons: a nuclear reactor is a technologically complex system, and the system had no room for error...

About the Authors

Former derivatives trader Chris Clearfield writes about complexity and failure for general and academic publications. András Tilcsik, a professor at the University of Toronto’s Rotman School of Management, teaches organizational failure and disaster risk management.
