March 9, 2003
By JOHN SCHWARTZ and MATTHEW L. WALD
At NASA, it really is rocket science, and the decision makers really are rocket scientists. But a body of research that is getting more and more attention points to the ways that smart people working collectively can be dumber than the sum of their brains.
The issue came into sharp focus in Houston last week at the first public hearing of the board investigating the Columbia disaster last month. Henry M. McDonald, a former director of the NASA Ames Research Center, testifying before the board, said that officials at the space agency want to do the right thing, but cannot always get the facts they need.
Referring to the shuttle program manager, Ron D. Dittemore, he said, ''I have no concern at all that people like Ron Dittemore, presented with the facts, will make the right decision.'' But, he said, ''the concern is presenting him with the facts.''
In fact, NASA's databases are out of date. For example, the agency cannot easily assemble its records of damage to the shuttle on previous flights and then search that material for trends and warning signs.
Investigators are also questioning the quick analysis by Boeing engineers that NASA used to decide early in the Columbia mission that falling foam did not endanger the shuttle, though the foam strike is now considered one of the leading candidate causes of the craft's breakup. The analysis satisfied important decision makers, but some engineers continued to discuss situations involving possible problems related to the impact -- a routine process NASA calls ''what-if-ing.''
Because the engineers directly connected to the process were satisfied that the foam was not a risk, they did not pass the results of their discussions up the line, even though those discussions suggested the impact could have caused catastrophic damage. But other engineers who had been consulted became increasingly concerned and frustrated.
''Any more activity today on the tile damage, or are people just relegated to crossing their fingers and hoping for the best?'' asked a landing gear specialist, Robert H. Daugherty, in a Jan. 28 e-mail message to an engineer at the Johnson Space Center, just days before the shuttle disintegrated on Feb. 1.
The shuttle investigation may conclude that NASA did nothing wrong. But if part of the problem turns out to be the culture of decision making at NASA, it could lead to more discussion of group dynamics, and of terms like groupthink, an ungainly word coined in 1972 by Irving L. Janis, a Yale psychologist and a pioneer in the study of social dynamics.
He called groupthink ''a mode of thinking that people engage in when they are deeply involved in a cohesive in-group, when the members' strivings for unanimity override their motivation to realistically appraise alternative courses of action.'' It is the triumph of concurrence over good sense, and authority over expertise.
It would not be the first time the term has been applied to NASA. Professor Janis, who died in 1990, cited the phenomenon after the loss of Challenger and its crew in 1986.
The official inquiry into the Challenger disaster found that the direct cause was the malfunction of an O-ring seal on the right solid-rocket booster, which caused the shuttle to break apart 73 seconds after launch.
But the commission also found ''a serious flaw in the decision-making process leading up to the launch.'' Worries about the O-rings circulated within the agency for months before the accident, but ''NASA appeared to be requiring a contractor to prove that it was not safe to launch, rather than proving it was safe.''
Groupthink, Professor Janis said, was not limited to NASA. He found it in the bungled Bay of Pigs invasion of Cuba and the escalation of the Vietnam War. It can be found, he said, whenever institutions make difficult decisions.
David Lochbaum, a nuclear engineer at the Union of Concerned Scientists, has studied nuclear plants where problems have gone uncorrected because of internal communications failures and poor oversight. His list includes the Davis-Besse plant near Toledo, Ohio, where in March 2002 technicians discovered that rust had eaten a hole the size of a football nearly all the way through the vessel head. Only luck prevented what might have become an American Chernobyl.
''As you go up the chain, you're generally asked harder and harder questions by people who have more and more control over your future,'' Mr. Lochbaum said. The group answering the questions then tends to agree on a single answer, and to be reluctant to admit when it does not have a complete one.
Engineers, he said, can also become complacent in the face of a potential problem that has not gone badly wrong before.
''In the Challenger thing, where they had O-ring problems on previous flights, it got to be an annoyance, but not a symptom of a disaster,'' he said. Nuclear plants suffer from the same false security, he said; six plants had previously suffered minor corrosion, but none had been found in a condition as advanced as Davis-Besse's.
It is only common sense that large institutions should try to make sound decisions, said John Seely Brown, a former researcher at Xerox and a co-author of ''The Social Life of Information.'' But it can be bewilderingly hard to do in practice.
''Often it takes tremendous skill in running a brainstorming session,'' Mr. Brown said. ''Every once in a while, the random way-out idea needs to have more of a voice.''
But giving the dissenting voice or voices greater influence turns out to be tricky. ''You've got to figure out something in a finite amount of time,'' Mr. Brown said, or find yourself, as NASA is now, ''swimming in a sea of hypotheses.''