The Human Factor in Fundraising: What Scuba Diving Teaches Us About Campaign Failures

Human factors principles from technical diving reveal why fundraising campaigns fail—not from single catastrophes, but from cascading small errors that compound into organizational disaster.


When a fundraising campaign fails, boards and executives typically hunt for the culprit. Was it the economy? A competitor's campaign? Bad timing? This forensic approach assumes that organizational failures, like plane crashes, have a single root cause waiting to be discovered. The assumption is wrong, and it's costing nonprofits their ability to learn from failure.

The field of human factors—developed in aviation safety, nuclear power, and technical diving—offers a more accurate model. Failures rarely stem from one catastrophic decision. They emerge from multiple small problems that align at precisely the wrong moment. Understanding this pattern transforms how organizations can prevent the next failure before it happens.

The Incident Pit: How Small Problems Compound

In technical diving, Gareth Lock teaches about the "Incident Pit"—a mental model for understanding how divers get into life-threatening situations. Rarely does one dramatic event kill a diver. Instead, a sequence of minor issues accumulates: a fogged mask reduces visibility, cold water slows cognitive processing, an unfamiliar current demands extra attention. Each problem is manageable alone. Together, they drag the diver into a situation where any additional problem becomes fatal.

The Incident Pit

A cascading failure model where multiple minor issues—each survivable in isolation—compound to create conditions where any additional problem triggers catastrophe. The pit deepens gradually, making escape increasingly difficult as problems accumulate.

Safety engineers call this the Swiss cheese model, a framework developed by psychologist James Reason. Imagine multiple slices of cheese stacked together, each representing a defense layer in your system. Each slice has random holes—imperfections, gaps, moments of inattention. Most of the time, the holes don't align; a problem gets caught by the next layer. But occasionally, every hole lines up perfectly, and the hazard passes straight through every defense.
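The model lends itself to a quick back-of-the-envelope simulation. In the sketch below, the layer count and hole probabilities are invented for illustration, not drawn from any real campaign data; the point is how sharply the odds of aligned holes rise once even two layers quietly degrade.

```python
import random

TRIALS = 1_000_000

def breach_rate(hole_probs, trials=TRIALS):
    """Fraction of trials in which a hazard slips through every layer."""
    breaches = 0
    for _ in range(trials):
        # Harm occurs only if every layer's hole lines up at once.
        if all(random.random() < p for p in hole_probs):
            breaches += 1
    return breaches / trials

# Illustrative assumptions: five defenses, each with a 10% chance
# of a hole at any given moment.
healthy = [0.10] * 5

# The same system after two defenses quietly degrade (say, a stale
# database and a late thank-you cycle): those layers now fail half the time.
degraded = [0.50, 0.50, 0.10, 0.10, 0.10]

print(f"healthy:  {breach_rate(healthy):.5f}")   # ~0.00001 (0.1 ** 5)
print(f"degraded: {breach_rate(degraded):.5f}")  # ~0.00025 -- 25x riskier
```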

When nonprofits diagnose a "failed campaign," they almost always identify an external shark: the economy crashed, a major donor died, a competitor launched at the same time. But the post-mortem rarely examines the silt, the murk the organization kicked up itself: the database that wasn't cleaned before the appeal, the mobile-unfriendly donation page, the thank-you notes that went out three weeks late. These internal messes clouded the organization's vision long before the external shock arrived.

Task Loading and Tunnel Vision

Consider a technical diver attempting to photograph marine life while simultaneously checking navigation, monitoring air supply, and controlling buoyancy. Each task is straightforward. Combined, they exceed the brain's processing capacity. The diver develops tunnel vision—hyperfocused on the immediate task (getting the shot) while critical background processes (checking the air gauge) fade from awareness.

Task Loading

A cognitive state where the cumulative demand of simultaneous tasks exceeds available mental bandwidth, causing the brain to narrow focus to immediate concerns while losing awareness of peripheral but critical information.

This pattern replicates precisely in nonprofit development offices. Organizations hire a single Development Director and assign them grant writing, gala planning, social media management, major donor cultivation, database maintenance, and board reporting. Each responsibility is reasonable. The total exceeds human cognitive capacity.

Traditional Response

When the overwhelmed Development Director misses deadlines or drops tasks, management responds with performance improvement plans, productivity tools, or replacement hiring. The assumption: the individual failed to meet reasonable expectations.

Human Factors Response

Recognize that task loading is a systems problem, not a personnel problem. You cannot instruct a task-loaded diver to "try harder"—you must reduce the task load. The same applies to staff operating beyond cognitive capacity.

The task-loaded fundraiser enters tunnel vision just like the task-loaded diver. They focus entirely on the immediate deliverable—the gala, the grant deadline, the board presentation—while donor stewardship, the operation's air supply, depletes unnoticed. By the time retention numbers surface in next year's reports, the damage is irreversible.

Normalization of Deviance: The Drift into Failure

Perhaps the most insidious human factor is what sociologist Diane Vaughan termed "normalization of deviance." A diver skips the pre-dive safety check once—maybe they're running late, maybe conditions seem benign. Nothing bad happens. So they skip it again. Gradually, the omitted step stops feeling like a deviation and starts feeling normal. The new baseline becomes "we don't do safety checks."

Normalization of Deviance

The gradual process by which unacceptable practices become acceptable as deviations from established standards become routine without producing immediate negative consequences. Each successful deviation reinforces the behavior until the original standard is forgotten.

Nonprofit operations drift in exactly this pattern. An organization implements personalized thank-you notes as a retention strategy. One busy month, they send generic acknowledgments instead. Donations don't drop immediately—donor behavior has latency. So the next month, under continued pressure, they send generic notes again. Within a year, personalization has disappeared entirely. Two years later, when retention finally collapses, nobody connects the failure to the accumulated deviations. They blame the economy.

The drift happens because short-term outcomes don't reveal long-term damage. You can skip the safety check dozens of times before the one dive where it would have mattered. You can send generic thank-you notes for quarters before donor fatigue manifests in lapsed giving. The absence of immediate consequences doesn't mean the practice is safe—it means you're accumulating risk that will compound when conditions change.
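That compounding is simple arithmetic. Assuming, purely for illustration, that any single skipped step carries just a 2% chance of being the occasion where it mattered, the cumulative odds climb fast:

```python
# Illustrative assumption: each individual deviation has only a 2%
# chance of being the one occasion where the skipped step mattered.
p_matters = 0.02

for skips in (1, 10, 50, 100):
    # Chance that at least one deviation in the run proves costly.
    cumulative = 1 - (1 - p_matters) ** skips
    print(f"{skips:>3} deviations -> {cumulative:4.0%} cumulative risk")

# Output:
#   1 deviations ->   2% cumulative risk
#  10 deviations ->  18% cumulative risk
#  50 deviations ->  64% cumulative risk
# 100 deviations ->  87% cumulative risk
```

Each individual skip still looks safe, since 98 times out of 100 nothing happens, which is exactly why the drift feels rational from the inside.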

Psychological Safety and "Thumbing the Dive"

Technical diving teams operate with a critical protocol: any team member can "thumb the dive"—signal to abort—at any time, for any reason, without providing justification or facing criticism. The practice exists because diving teams recognized that the alternative—pressure to continue when someone senses danger—produces fatalities.

The fundraising parallel is uncomfortable but necessary. Does your organization have a culture where a Major Gift Officer can say, "This campaign goal is mathematically impossible given our current pipeline," without being labeled negative or uncommitted? Can a program manager flag that the gala timeline has become dangerous without being accused of insufficient creativity?

Typical Nonprofit Culture

Concerns about feasibility are met with "we need can-do attitudes" or "let's focus on solutions, not problems." Staff learn that raising red flags damages their standing. Problems go unreported until they become crises.

Psychologically Safe Culture

Team members are expected to flag concerns early and specifically. Raising a problem is treated as valuable information, not negativity. The organization would rather abort a questionable initiative than drift into failure.

Organizations without psychological safety don't lack warning signs—they lack the channel for those signs to surface. The Development Director who sees the impossible goal doesn't lack perception; they lack permission to speak. Every person who stayed silent knew the dive was going wrong. They just couldn't thumb it.

Key Insight

If your team cannot call off a bad initiative without career consequences, you will eventually execute a catastrophic one. Psychological safety isn't about comfort—it's about organizational survival.

Applying Human Factors to Your Organization

The human factors framework suggests specific interventions beyond "work harder" or "hire better people."

First, audit for silt before scanning for sharks. When campaigns underperform, before examining external factors, systematically review the small internal issues that may have aligned: Was the database current? Did technology function properly? Were communications timely? Did staff have capacity for the work? The external shock may have been the trigger, but the internal conditions determined vulnerability.

Second, measure task load explicitly. Map every responsibility assigned to each role and estimate cognitive demand honestly. When the sum exceeds reasonable capacity, the organization has created the conditions for tunnel vision. Reducing scope or adding staff isn't admitting defeat—it's acknowledging human cognitive limits.
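What that audit might look like as a first pass: the sketch below uses a hypothetical role, made-up hour estimates, and an assumed 80% safe-utilization threshold. Treat it as a template for the exercise, not a validated workload model.

```python
# Hypothetical weekly hour estimates for a single Development Director.
# The specific numbers are placeholders; the exercise is forcing the sum
# into view instead of assigning each duty in isolation.
responsibilities = {
    "grant writing": 10,
    "gala planning": 8,
    "social media management": 6,
    "major donor cultivation": 10,
    "database maintenance": 5,
    "board reporting": 4,
}

WEEKLY_CAPACITY_HOURS = 40
SAFE_UTILIZATION = 0.80  # leave headroom; sustained overload invites tunnel vision

total_hours = sum(responsibilities.values())
utilization = total_hours / WEEKLY_CAPACITY_HOURS
print(f"Assigned: {total_hours}h/week ({utilization:.0%} of capacity)")

if utilization > SAFE_UTILIZATION:
    # The systems response: shed scope or add staff, don't coach harder.
    excess = total_hours - SAFE_UTILIZATION * WEEKLY_CAPACITY_HOURS
    print(f"Task-loaded: reassign or cut roughly {excess:.0f}h of work")
```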

Third, track deviations from standard practice. When a procedure gets skipped "just this once," document it. When workarounds become routine, flag them. The drift into failure happens precisely because deviations feel insignificant in isolation. Only systematic tracking reveals the pattern before consequences manifest.
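The tracking itself can be trivial. A shared spreadsheet works; so does a short script like this sketch, where the three-occurrence review threshold is an arbitrary assumption:

```python
from collections import Counter
from datetime import date

deviation_counts: Counter = Counter()

# Arbitrary assumption: the third skip of the same procedure means the
# workaround is becoming routine and warrants an explicit review.
REVIEW_THRESHOLD = 3

def record_deviation(procedure: str, when: date | None = None) -> None:
    """Log one 'just this once' departure from standard practice."""
    when = when or date.today()
    deviation_counts[procedure] += 1
    print(f"{when}: skipped '{procedure}' "
          f"(occurrence #{deviation_counts[procedure]})")
    if deviation_counts[procedure] == REVIEW_THRESHOLD:
        print(f"REVIEW: '{procedure}' has been skipped "
              f"{REVIEW_THRESHOLD} times -- this is drift, not a one-off.")

record_deviation("personalized thank-you notes")
record_deviation("personalized thank-you notes")
record_deviation("personalized thank-you notes")  # triggers the review flag
```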

Fourth, establish explicit channels for thumbing the dive. Create regular forums where staff are specifically asked to identify initiatives that seem problematic. Make clear that raising feasibility concerns is valued, not punished. Consider whether your culture actually permits bad news to travel upward.

Summary

Fundraising failures—like diving fatalities and aviation disasters—rarely have single causes. They emerge from the intersection of multiple small problems that individually seem manageable. The human factors lens shifts attention from blaming individuals to examining systems: the task loads that exceed cognitive capacity, the normalized deviations that accumulate risk, the cultural barriers that prevent early warnings from surfacing.

Human Factor         Traditional View                               Systems View
Campaign Failure     Find the external cause or responsible person  Examine how small internal issues aligned
Overwhelmed Staff    Performance problem requiring coaching         Task loading exceeding cognitive capacity
Skipped Procedures   One-time workaround, no immediate harm         Drift into failure accumulating risk
Unreported Concerns  Staff should speak up if they see problems     Culture may punish the messenger

You don't need to get wet to learn these lessons. They describe how human brains function under pressure—in any environment where cognitive load is high and failure has consequences. The diver watching their air gauge drop and the Development Director watching their pipeline shrink face the same fundamental challenge: operating effectively when the system is pushing them toward the incident pit.

References

  1. Lock, G. (2019). Under Pressure: Diving Deeper with Human Factors. The Human Diver.
  2. Reason, J. (1990). Human Error. Cambridge University Press.
  3. Vaughan, D. (1996). The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. University of Chicago Press.
  4. Edmondson, A. (2018). The Fearless Organization: Creating Psychological Safety in the Workplace for Learning, Innovation, and Growth. Wiley.
