Space flight is inherently dangerous. They strap you to a rocket that accelerates you to 18,000 miles per hour. But we shouldn’t be stupid about it.
Many of us are familiar with the Space Shuttle Challenger disaster. NASA launched the ill-fated flight in January 1986, and the shuttle broke apart 73 seconds after lift-off, killing its crew in front of everyone watching. The proximate cause was an O-ring seal that failed in the freezing temperatures, letting hot gases escape from a solid rocket booster and destroy the vehicle. But the underlying cause was organizational: under pressure from managers and schedules, the groups inside the various organizations were unable to voice their dissent against the launch.
A similar story played out with the Columbia disaster in 2003. Columbia’s wing was struck by debris during launch, and upon re-entry into Earth’s atmosphere at the end of its mission, the shuttle disintegrated, killing its entire crew.
Both are tragedies, but the second feels even worse because the lessons of the first should have helped prevent it, and they didn’t. The mission team on the ground had the opportunity to fully investigate the damage before re-entry, but passed it up. Dissenting voices were silenced, and the team entertained only the most optimistic views of the mission’s success and the shuttle’s safe return. There was no room for contrary opinions. Until everything went wrong.
Working in Groups
We often think working in groups will help us do better work. It’s why we have teams at work and in school, right? We expect the group to curb individual biases and make better decisions.
Group Polarization
Unfortunately, the opposite can be true. Individual biases can get amplified in groups. We know this as group polarization.
As a group talks and works together, it may move toward a more extreme consensus. This has been shown experimentally. In the study linked above, the authors tested the theory with hot-button political issues, bringing together people from either left-leaning or right-leaning backgrounds. After deliberating with like-minded members, not only did each group’s consensus become more extreme, but individuals reported more extreme personal views as well.
Hidden Profiles
When we bring a group together, discussion will usually surface the information most members already have. The serious problem is that information held by only a few members can stay hidden. If twelve group members all know something, but two of them hold additional information that would be useful, those two are likely to stay silent.
This is closely related to the shared information bias, and the two often occur together: to reach a consensus, we focus on the information we all share rather than the information only a few may hold, which may be more consequential.
Conditions for Groupthink
Irving L. Janis was a pioneer in the study of group dynamics and coined the term “groupthink” in 1971.
He described three fundamental conditions that make groupthink more likely:
A highly cohesive group in which disagreements no longer surface and members are deindividuated
Structural faults such as a leader with a preference for a certain decision, insulation of the group from outside opinions, or homogeneity of members
Situational contexts such as highly stressful external threats, recent failures, or time pressures
Symptoms
Janis also identified eight major symptoms of groupthink, and grouped them into three categories:
Type I: Overestimating the power and morality of the group
Illusions of invulnerability creating excessive optimism and extreme risk-seeking.
Unquestioned belief in the group’s morality, causing members to ignore the consequences of their actions.
Type II: Closed-mindedness
Rationalizing or discounting warnings that might challenge the group's assumptions.
Stereotyping those who oppose the group as weak, evil, biased, spiteful, impotent, or stupid.
Type III: Pressures toward uniformity
Self-censorship of ideas that deviate from the apparent group consensus.
Illusions of unanimity among group members, in which silence is viewed as agreement.
Direct pressure to conform placed on any member who questions the group, couched in terms of "disloyalty".
Mindguards: self-appointed members who shield the group from dissenting information.
The Tragedy
In the tragedy of Columbia, most of the factors listed above were present.
The conditions for groupthink were right for the mission team. It was a cohesive group with structural faults and outside pressures. That seems to be a recurring theme with NASA when problems arise.
But as you dive deeper, you can see the symptoms as well. The group apparently operated under an illusion of invulnerability, evidenced by skipped meetings. There was self-censorship within the mission team, direct pressure to silence criticism, and mindguarding against dissenting information. Rather than asking “what could go wrong”, they looked only at the best scenarios and assumed everything would go right.
It’s Easy
It’s easy to play the blame game in hindsight. But we’re all guilty of the same issues. How often do we look only at the best scenarios? How often do we rationalize away warnings or self-censor?
The next few newsletters and posts will look at groupthink within our organizations and teams, and what we can do about it. It’s not just NASA that has this issue; it’s all of us. But it’s fixable, or at least something we can mitigate, if we understand it better.
Best of the Rest
Building Trust and Effectively Working Across Disciplines (podcast) - We interviewed Ali Maquet, a Principal Portfolio Manager. She has a wide range of experience, having worked as a BA, a scrum master, a PM, and now a portfolio manager. She shares advice on moving into new roles, shaping your role, working with different disciplines, and building trust.