Drift into Failure

by Sidney Dekker · 2011 · 234 pages · 4.01 (343 ratings)

Key Takeaways

1. Beyond Broken Parts: The Limits of Newtonian Thinking in Accidents

The Newtonian vision has had enormous consequences for our thinking, even in the case of systems that are not as linear and closed as Newton's basic model – the planetary system.

Outdated worldview. Our understanding of accidents is largely trapped in a Newtonian-Cartesian worldview, which emphasizes reductionism, linear cause-and-effect, and the search for a single "broken part" or "bad actor." This perspective, rooted in 17th-century scientific thought, assumes that complex phenomena can be understood by dissecting them into their simplest components. For example, in the Deepwater Horizon disaster, the immediate response was a hunt for culprits and broken components, rather than a deeper systemic inquiry.

Reductionist approach. This reductionist approach leads to investigations that catalog broken components—be they mechanical parts, human errors, or deficient procedures—and assume a direct, unproblematic relationship between these failures and the overall system breakdown. This is evident in accident reports that list findings as a series of broken and unbroken parts, expecting the reader to connect them into a coherent narrative. However, this method often fails to explain how things got to that broken state.

Foreseeability and blame. The Newtonian model also underpins our notions of foreseeability and accountability. It assumes that with enough knowledge of initial conditions and governing laws, outcomes are predictable. Therefore, if harm occurs, it implies negligence or a failure to foresee, leading to blame and punishment, as seen in the criminal charges against a nurse for a medication error. This framework, while offering comfort by identifying a fixable cause, severely limits our ability to understand and prevent failures in truly complex systems.

2. Failure is a Gradual Drift, Not a Sudden Break

Drifting into failure is a gradual, incremental decline into disaster driven by environmental pressure, unruly technology and social processes that normalize growing risk.

Slow, incremental process. "Drift into Failure" describes a slow, incremental decline into disaster, where organizations, in pursuit of their mandate, gradually borrow from safety margins. This isn't a sudden collapse due to an abnormal dysfunction, but an inevitable byproduct of normal functioning under pressure. The Deepwater Horizon disaster, for instance, was not a single catastrophic event but the culmination of years of cost-cutting and risk-taking that became normalized.

Normalizing deviance. This drift involves a continuous organizational and operational adaptation around conflicting goals and uncertainties, leading to small, step-wise normalizations of what was previously considered deviant or risky. Each small deviation, if followed by continued operational success, is taken as proof that the adaptation is safe, reinforcing the behavior. The extended lubrication intervals for the Alaska Airlines Flight 261 jackscrew, from 300 to 2,550 flight hours, exemplify this decrementalism.
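
To make that mechanism concrete, here is a minimal Python sketch (mine, not the book's; every number except the 300 and 2,550 flight-hour endpoints is invented) in which each interval that passes without a visible failure is taken as proof that the next extension is safe:

```python
import random

random.seed(1)

WEAR_RATE = 1.0            # hypothetical wear per flight hour (invented)
FAILURE_THRESHOLD = 3000   # hypothetical wear budget between lubrications

interval = 300             # starting lubrication interval, per the summary
while interval <= 2550:    # 2,550 h is where the real intervals ended up
    # Wear accrued since the last service; lubrication resets it.
    wear = interval * WEAR_RATE * random.uniform(0.8, 1.2)
    if wear >= FAILURE_THRESHOLD:
        print(f"{interval:>5} h: wear {wear:5.0f} -> failure")
        break
    # Continued success is read as proof the longer interval is safe,
    # so the next small, locally rational extension follows.
    margin = 1 - wear / FAILURE_THRESHOLD
    print(f"{interval:>5} h: wear {wear:5.0f}, margin {margin:.0%} -> extend")
    interval = int(interval * 1.5)
```

In this toy run the failure branch never triggers inside the observed window; the only signal is a safety margin that quietly shrinks with each extension, which is exactly what makes drift so hard to see from within.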

Invisible from within. One of the greatest challenges is that this drift is often invisible to those inside the system, especially without a catastrophic outcome. What appears as a slide into disaster in hindsight was, at the time, a series of rational, common-sense decisions made within local contexts. This makes it difficult for internal reporting systems to capture the true nature of the accumulating risk, as deviations become the new normal.

3. Five Interacting Features Fuel the Drift into Failure

The potential for drift into failure can be baked into a very small decision or event.

Interconnected drivers. Drift into failure is shaped by five interconnected features that interact in complex ways. These are not isolated causes but dynamic influences that collectively push a system towards its margins. Understanding these interactions is crucial for moving beyond simplistic explanations of accidents.

Key features of drift:

  • Scarcity and competition: Environmental pressures force trade-offs between efficiency and safety.
  • Decrementalism (small steps): Gradual, seemingly minor deviations from norms become accepted.
  • Sensitive dependence on initial conditions: Small early decisions can have disproportionately large, unforeseen consequences later.
  • Unruly technology: Technology behaves unpredictably in real-world contexts, defying design assumptions.
  • Contribution of the protective structure: Regulators and internal safety mechanisms can inadvertently facilitate drift.

Cumulative impact. These features don't act in isolation; their interactions create a self-reinforcing cycle. For example, competitive pressure (scarcity) can lead to small steps in normalizing risk, which, when combined with unruly technology, can have amplified effects due to sensitive dependence on initial conditions, often overlooked by a compromised protective structure. This cumulative effect makes the system increasingly brittle.

4. Complexity: More Than Just a Complicated System

Complexity means that a huge number of interacting and diverse parts give rise to outcomes that are really hard, if not impossible, to foresee.

Beyond complication. A complex system is fundamentally different from a merely complicated one. While a complicated system (like a Boeing 737) has many parts and interactions that are, in principle, exhaustively describable, a complex system's behavior cannot be fully predicted or understood by analyzing its individual components. This is because complex systems are open, adaptive, and their parts interact non-linearly.

Emergent properties. In complex systems, the whole is more than the sum of its parts; it exhibits emergent properties. These are behaviors or characteristics that cannot be explained or predicted from the properties of the individual components. For example, the "wetness" of water is an emergent property not found in individual H2O molecules. Similarly, accidents in complex systems are often emergent, arising from the interactions of normally functioning parts.
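
The book's example is water's wetness; as a runnable stand-in (my illustration, not the book's), the sketch below uses Conway's Game of Life, where a "glider" pattern travels across the grid even though the rule governing each individual cell says nothing about movement:

```python
from collections import Counter

def step(live):
    """One Game of Life generation; `live` is a set of (x, y) cells."""
    counts = Counter((x + dx, y + dy)
                     for x, y in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Alive next step: exactly 3 live neighbours, or 2 if already alive.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
cells = glider
for _ in range(4):
    cells = step(cells)

# After four generations the whole pattern has moved one cell diagonally:
# "movement" emerges from local interactions, not from any component cell.
assert cells == {(x + 1, y + 1) for x, y in glider}
print(sorted(cells))
```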

Unforeseeable outcomes. The non-linear interactions and sensitive dependence on initial conditions mean that small changes can lead to large, unpredictable outcomes—the "butterfly effect." This makes traditional control and prediction difficult, as the system's future states are not deterministically knowable. The example of a cell phone's systemic impact, linking its production to Coltan mining, civil wars, and ecological destruction, illustrates how far-reaching and unpredictable these emergent relationships can be.
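
A minimal numerical illustration of sensitive dependence (again my sketch, not the book's): the chaotic logistic map x_{n+1} = 4x_n(1 - x_n), run from two starting points that differ by only 10^-10, produces trajectories that soon bear no relation to each other:

```python
# Two runs of the chaotic logistic map x_{n+1} = 4 * x * (1 - x),
# started 1e-10 apart: identical rules, near-identical starts.
a, b = 0.2, 0.2 + 1e-10

for n in range(60):
    a, b = 4 * a * (1 - a), 4 * b * (1 - b)
    if n % 10 == 9:
        print(f"step {n + 1:2d}: |a - b| = {abs(a - b):.3e}")
# The gap grows roughly exponentially (doubling per step on average),
# so after a few dozen steps the trajectories are completely unrelated.
```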

5. Accidents Emerge from Normal Operations, Not Just Errors

The harmful outcome is not reducible to the acts or decisions by individuals in the system, but a routine by-product of the characteristics of the complex system itself.

Normal work, abnormal outcomes. Accidents in complex systems are often a routine by-product of normal functioning, not solely the result of abnormal dysfunctions or individual errors. People come to work intending to do a good job, making locally rational decisions that, when aggregated across the system and over time, can inadvertently lead to catastrophic outcomes. The Alaska 261 accident, in which maintenance intervals were extended and procedures were followed even as underlying risks grew, illustrates this.

Beyond human error. The focus on "human error" as a primary cause is a Newtonian simplification. In complex systems, individual actions, even if seemingly flawed in hindsight, are often sensible given the immediate context, pressures, and available information. The problem isn't necessarily that people are making "bad" decisions, but that the system's inherent complexity allows these locally rational actions to accumulate into systemic vulnerability.

Banality of accidents. This concept, sometimes called the "banality of accidents thesis," suggests that incidents do not necessarily precede accidents in a linear fashion. Instead, normal work, with its inherent trade-offs and adaptations, creates the conditions for failure. What is considered "report-worthy" or an "incident" from within the system often shifts as deviance becomes normalized, making early detection of drift extremely challenging.

6. Local Rationality Drives Systemic Vulnerability

The frame of reference for understanding people's ideas and people's decisions, then, should be their own local work context, the context in which they were embedded, and from whose (limited) point of view assessments and decisions were made.

Bounded rationality. People in complex systems operate with "local rationality," meaning their decisions make sense given their immediate situational indications, operational pressures, and organizational norms. They do not possess perfect, global knowledge of the entire system or all possible outcomes. This contrasts sharply with rational choice theory, which assumes fully informed, objective decision-makers.

Macro-micro connection. Global pressures like resource scarcity and competition trickle down, influencing local decision-making. These macro-level forces become internalized, shaping what individuals perceive as rational or unremarkable trade-offs between efficiency and safety. For example, NASA's "Faster, Better, Cheaper" philosophy translated into local decisions by engineers and managers to prioritize schedules, even if it meant normalizing risks like foam strikes.

Structural secrecy. Large, bureaucratic organizations inherently create "structural secrecy," where patterns of information, organizational structure, and processes undermine attempts to understand the whole. Specialized knowledge, physical distance, and formal communication channels can inadvertently obscure critical information, making it difficult for anyone, even experts, to grasp the full implications of local decisions. This secrecy allows practical drift to occur unnoticed across different units.

7. Unruly Technology Defies Simple Prediction and Control

Unruly technology introduces and sustains uncertainties about how and when things may develop and fail.

Beyond design specifications. Technology, especially in complex systems, often behaves in ways that defy its original design assumptions and predictions. This "unruliness" means that uncertainties about its operation and failure modes cannot be fully reduced by traditional engineering calculations or fault trees. The Gaussian copula, designed to price debt obligations, became unruly when released into the complex financial market, contributing to a global crisis.
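
To give a concrete sense of that unruliness, the sketch below implements the standard one-factor Gaussian copula (a textbook form, not code from the book; the portfolio size, default probability, and correlation values are all invented) and shows how the tail of the loss distribution swings on a single assumed correlation:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n_names, n_sims, p = 100, 20_000, 0.02   # portfolio size, sims, default prob
threshold = norm.ppf(p)                   # default when latent X_i < threshold

for rho in (0.05, 0.30):                  # two assumed asset correlations
    M = rng.standard_normal((n_sims, 1))         # common market factor
    Z = rng.standard_normal((n_sims, n_names))   # idiosyncratic factors
    X = np.sqrt(rho) * M + np.sqrt(1 - rho) * Z  # Gaussian-copula latents
    losses = (X < threshold).mean(axis=1)        # fraction defaulting per sim
    print(f"rho={rho:.2f}: mean loss {losses.mean():.3f}, "
          f"99th percentile {np.quantile(losses, 0.99):.3f}")
```

The average loss is essentially identical in both runs; only the assumed correlation differs, yet the extreme-loss picture changes sharply. That sensitivity is invisible in the formula itself and only shows up once the model meets a real market.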

Operational context matters. Technology's behavior is highly context-dependent. Universal assurances about reliability figures often become moot when designs are put into real-world situations. The MD-80 jackscrew, designed for a 30,000-hour service life without inspection, showed excessive wear within a year of operation, forcing a re-evaluation of maintenance. This highlights the gap between theoretical design and practical reality.

Tool for discovery. Instead of viewing unruly technology as a problem to be "fixed" to meet ideal specifications, complexity theory suggests seeing it as a tool for discovery. Its unexpected behaviors offer windows into the complex system's workings and our own understanding of them. The surprises generated by unruly technology provide opportunities for learning, not just about the technology itself, but about the broader socio-technical environment in which it operates.

8. Even Safety Structures Can Contribute to Drift

The protective structure can actively contribute to drift into failure, rather than just not intervening when it should.

Paradox of protection. Structures designed to ensure safety—such as regulators, quality review boards, and internal safety departments—can paradoxically contribute to drift into failure. These protective structures are themselves complex systems, influenced by societal expectations, resource constraints, and conflicting goals, which can affect how they define and rationalize "acceptable" system performance.

Co-optation and normalization. Regulators, needing insider knowledge to be effective, can become too integrated with the industries they oversee, leading to a decline in diverse viewpoints. The federal Minerals Management Service, for example, was criticized for its close ties to the oil industry, leading to "alternative compliance" and exemptions for deep-water drilling projects like BP's Liberty. This co-optation normalizes risky practices.

Mismatched tools. Traditional regulatory tools like compliance checks and oversight are often mismatched to the dynamic, emergent nature of complex systems. Rules, often lagging behind evolving practices and unruly technology, can become obsolete. A compliance-based approach may simply apply existing rules to new, complex situations, inadvertently legitimizing practices that are drifting towards risk, rather than proactively addressing emergent vulnerabilities.

9. Diversity is the Antidote to Systemic Brittleness

Diversity is a critical ingredient for resilience, because it gives a system the requisite variety that allows it to respond to disturbances.

Requisite variety. Diversity is paramount for managing safety under uncertainty in complex systems. It provides a system with the necessary variety of perspectives and responses to recognize, adapt to, and absorb disturbances without catastrophic failure. High-reliability organizations, for instance, foster diversity through decentralized decision-making and encouraging dissent, empowering those closest to the problem to respond effectively.

Challenging dominant logic. Systems lacking diversity tend to over-exploit existing knowledge and dominant logics, becoming brittle and less adaptive. This can manifest as "groupthink" or a "take-over by dominant logic," where alternative viewpoints are suppressed or ignored. The departure of Rich Kinder from Enron, who offered a counter-balance to Skilling's aggressive strategies, is cited as a loss of diversity that contributed to Enron's drift.

Fostering diverse perspectives:

  • Empowerment: Granting authority to lower-ranking employees to halt operations if they perceive risk.
  • Rotation: Rotating personnel to bring fresh perspectives and challenge existing norms.
  • Critical reflection: Encouraging continuous questioning of "normal" operations and small steps.
  • Multiple narratives: Valuing different interpretations of events, rather than seeking a single "true" story.

By actively cultivating diversity, organizations can enhance their capacity for exploration and innovation, making them more resilient to unforeseen challenges and less prone to drifting into failure.

10. A New Ethic for Accountability in Complex Systems

The problem of accountability in complex systems is no less complex than the complexity of the system itself.

Beyond linear blame. The Newtonian quest for a single "broken part" to assign blame is existentially and morally comforting but fundamentally flawed in complex systems. Since accidents are emergent properties of normal operations, not reducible to individual actions, traditional notions of accountability become problematic. Jeffrey Skilling's conviction for Enron's collapse provided a sense of closure, yet no clear, linear causal chain from his decisions to the disaster can actually be pinpointed.

Unforeseeable outcomes. In complex systems, outcomes are uncertain and often unforeseeable due to non-linearity and sensitive dependence on initial conditions. This challenges the ethical premise that individuals can be held accountable for consequences they could not reasonably predict. Trivial, everyday decisions, only deemed significant in hindsight, cannot fairly bear the full weight of catastrophic outcomes.

Authentic stories over accurate ones. A post-Newtonian ethic for failure moves beyond the pursuit of a single, "accurate" story of what happened, which is impossible in complex systems. Instead, it advocates for multiple "authentic" stories that capture the vitality and phenomenology of being inside the system. This approach acknowledges that:

  • No objective truth exists; knowledge is locally constructed.
  • Diversity of narratives is a source of resilience, not weakness.
  • Judgments are constructions, influenced by observer biases.
  • Consequences of judgments must be considered, even if not fully foreseeable.
  • Judgments should be revisable as new understanding emerges.

This new ethic invites more voices into the conversation, celebrating diversity and acknowledging the inherent limitations of our understanding and control. It shifts the focus from finding a singular culprit to understanding the intricate web of relationships that lead to both success and failure.

Review Summary

4.01 out of 5
Average of 343 ratings from Goodreads and Amazon.

Drift into Failure receives mixed reviews, averaging 4.01/5 stars. Readers praise Dekker's exploration of complexity theory and systems thinking, finding it thought-provoking and applicable beyond industrial safety. The book challenges traditional Newtonian-Cartesian approaches to understanding failures, arguing complex systems drift gradually through small adaptations rather than single broken components. Common criticisms include repetitiveness, dense academic writing, high price for length, lack of practical guidance, and occasional editing issues. Many appreciate real-world disaster examples (Challenger, Columbia, Alaska Airlines) but desire more actionable solutions. Overall, readers recommend it for those interested in organizational safety and systems thinking despite its flaws.

About the Author

Sidney W. A. Dekker, born 1969 near Amsterdam, is Professor at Griffith University in Brisbane, Australia, where he founded the Safety Science Innovation Lab. He also holds an Honorary Professorship of Psychology at the University of Queensland. Previously, Dekker was Professor of human factors and system safety at Lund University in Sweden, where he founded the Leonardo da Vinci Laboratory for Complexity and Systems Thinking. He flew as First Officer on Boeing 737s for Sterling and Cimber Airlines out of Copenhagen. Dekker is internationally recognized for pioneering the "Safety Differently" and "Restorative Just Culture" movements, challenging bureaucratic, blame-oriented safety approaches with an emphasis on resilience, compassion, and organizational learning.
