THE DEPROLIFERATOR — You’ve probably heard the word redundancy in its engineering sense. To refresh your memory, it refers to duplicating the critical components of a system, such as an airplane, to enhance its reliability. Redundancy’s rationale is obvious: the likelihood that the entire system will fail decreases as its components are duplicated, or even triplicated, in isolation from each other.
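The arithmetic behind that rationale is worth spelling out. Here’s a minimal sketch (the 10-percent failure rate is my own illustrative number, not a figure from Sagan): if each of n backups fails independently with probability p, the whole system fails only when every one of them does, with probability p to the nth power.

```python
# Illustrative sketch of the standard case for redundancy:
# n independent components, each failing with probability p.
# The system fails only if all n fail at once.

def system_failure_prob(p: float, n: int) -> float:
    """Probability that all n independent components fail: p ** n."""
    return p ** n

# Hypothetical component that fails 10% of the time:
print(round(system_failure_prob(0.1, 1), 4))  # a 1-in-10 chance
print(round(system_failure_prob(0.1, 2), 4))  # duplicated: 1-in-100
print(round(system_failure_prob(0.1, 3), 4))  # triplicated: 1-in-1,000
```

The catch, as the rest of the piece shows, is the word “independent”: the math only works if the backups really do fail in isolation from each other.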
But such is redundancy’s aura of near-invincibility that few are willing to entertain the notion that, in some situations, it may actually compound rather than reduce risk.
Back in 2004, the noted nuclear-security expert Scott Sagan (recently engaged in a spirited debate in Survival magazine on first use of nuclear weapons) wrote an award-winning article for the journal Risk Analysis. Its enigmatic title: “The Problem of Redundancy Problem: Why More Nuclear Security Forces May Produce Less Nuclear Security.”
First, let’s get the problem with the problems in the title out of the way. Presumably the duplication of the word “problem” is a humorous attempt (however stiff) to enact the very concept of redundancy. Sagan begins his piece by recalling that post-9/11 fears of attacks on U.S. nuclear facilities induced officials to authorize an increase in security personnel to protect them.
But first he explains how technological redundancy can lead to what’s called “catastrophic common-mode error”: the failure of all of a system’s components at once.
[Many] serious accidents with hazardous technologies . . . are caused by redundant safety devices. . . . The October 1966 near-meltdown accident at the Fermi reactor near Monroe, MI, for example, was caused by an emergency safety device: a piece of zirconium plate [that] broke off and blocked a pipe, stopping the flow of coolant into the reactor core.
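This is where the tidy multiplication above breaks down. A quick sketch of the logic (again with illustrative numbers of my own, not Sagan’s): if a single shared cause, like that zirconium plate, can defeat every backup at once with probability q, then no amount of redundancy can push the system’s failure rate below q.

```python
# Illustrative sketch of common-mode failure: either a shared cause
# (probability q) knocks out all n components together, or all n
# fail independently (probability p ** n).

def failure_with_common_mode(p: float, q: float, n: int) -> float:
    """P(system fails) = q + (1 - q) * p ** n."""
    return q + (1 - q) * p ** n

# Hypothetical rates: each component fails 10% of the time,
# and a common cause strikes 1% of the time.
for n in (1, 2, 5, 10):
    print(n, round(failure_with_common_mode(0.1, 0.01, n), 6))
# As n grows, the result flattens out near q = 0.01 instead of
# approaching zero: extra backups buy nothing against the shared cause.
```

The same term-counting explains why a safety device can be worse than useless: if adding it raises q at all, it can raise total risk even while p ** n shrinks.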
Meanwhile, Sagan maintains, adding manpower to guard critical sites is another potential contributor to catastrophic common-mode failure. Huh? Apparently, despite the supposed success of the “Surge” in Iraq, a reflexive precaution such as adding reinforcements comes complete with a hole big enough to drive a truck through. Make that a truck bomb: the insider threat. Who, Sagan asks, “should guard the guardians?”
He’s skeptical of reassurances by the nuclear power industry, and even its regulators, that security personnel are “thoroughly vetted through intense background checks, random drug and alcohol tests, and security management programs, like the Continuous Behavior Observation Program, which ensures that supervisors and colleagues will report on any suspicious behavior.” Overlooking the obvious reservations about that last course of action (C-BOP?) for the moment, why is Sagan suspicious?
Because “the criteria used to assess suspicious behavior are suspicious.” Here we go again with the redundancy. What Sagan means is that the criteria have failed to filter out what may charitably be called undesirable elements. “For example,” he writes, “security personnel of at least one nuclear weapons facility were known to have ties with members of anti-government right-wing militia groups.”
Just as long as it’s not al Qaeda. Compared to 9/11, Oklahoma City was only a blip on the terrorist-attack radar, right?
Moving on, Sagan writes, “The second way in which redundancy can backfire is when diffusion of responsibility leads to ‘social shirking.'” We may not be familiar with this term, but we know all too well what it means. It’s a “common phenomenon — in which individuals or groups reduce their reliability in the belief that others will take up the slack.” Yet it’s “rarely examined.” Why?
“. . . because of a ‘translation problem’ that exists when transferring redundancy theory from purely mechanical systems to complex organizations. [It seems that in] mechanical engineering, the redundant units are usually inanimate objects, unaware of each other’s existence. In organizations, however, we are usually analyzing redundant individuals, groups, or agencies, backup systems that are aware of one another.” [Emphasis added.]
True, this sounds like something out of a Malcolm Gladwell book, or the genre he spawned. Yet here’s another “outlier” (or something like that) — overcompensation, which occurs when. . .
. . . the addition of extra components encourages individuals or organizations to . . . engage in inherently risky behavior — driving faster, flying higher, producing more nuclear energy, etc.
Think about the old argument: If boxers wear helmets, do they then suffer fewer qualms about inserting their heads into the action and risking cranial injury? In fact, Sagan writes:
Research demonstrates . . . that the increased use of ski helmets has not led to decreases in head injuries in accidents on the slopes because many skiers with helmets just go faster down more treacherous terrain.
Something similar seems to have triggered the January 1986 space shuttle Challenger explosion. Sagan again:
A strong consensus [emerged that] the unprecedented cold temperature at the Kennedy Space Center at the time of launch caused the failure of two critical O-rings [in the shuttle’s] rocket booster [which were] listed as redundant safety components. [The secondary O-ring was intended to] seal even if the first leaked. [The] decision makers falsely believed that redundant safety devices allowed them to operate in more dangerous conditions [the unusual cold — RW] without increasing the risk of a catastrophe.
As for the optimal number of guards at U.S. nuclear facilities, Sagan writes:
. . . overcompensation should remind nuclear security analysts [that] increases in nuclear security forces should not be used as a justification [for maintaining] insecure facilities or increasing the numbers of nuclear power plants, storage sites, or weapons facilities.
It appears, then, that it would be wise to refrain from carrying assumptions about redundancy over from other systems to one as critical as a nuclear power or weapons facility. At the same time, besides helping prevent a terrorist attack, unmasking the problems that redundancy can create in the nuclear realm might help shine a light on the difficulties it presents in other systems.
First posted at the Faster Times.