Design Errors and Group Biases
by Rick Brenner
Design errors can cause unwanted outcomes, but they can also lead to welcome surprises. The causes of many design errors are fundamental attributes of the way groups function. Here is Part II of our exploration.
Eugene F. Kranz, flight director, at his console on May 30, 1965, in NASA's Control Room in the Mission Control Center at Houston. The photo was taken during a simulation in preparation for a four-day, 62-orbit flight designated Gemini-Titan IV. After the Apollo 1 accident, in which three astronauts (Virgil I. "Gus" Grissom, Edward H. White II, and Roger B. Chaffee) were killed in a fire, Kranz convened a meeting in Mission Control and delivered a speech to controllers that subsequently became a part of NASA culture. He said, in part: "From this day forward, Flight Control will be known by two words: Tough and Competent. Tough means we are forever accountable for what we do or what we fail to do. We will never again compromise our responsibilities… Competent means we will never take anything for granted… Mission Control will be perfect. … These words are the price of admission to the ranks of Mission Control." No subsequent Apollo mission sustained loss of life.
By overlaying these standards on Mission Control culture, Kranz succeeded in disabling the mechanisms of group bias discussed here. He demanded that group members hold themselves personally accountable for the actions of the group.
Photo courtesy U.S. National Aeronautics and Space Administration.
In Part I of this examination of design errors, we noted that the consequences of design errors are sometimes favorable. We also explored groupthink and considered an example of how groupthink can lead to design errors. Groupthink is an example of a group bias — an attribute of the way groups function that can often lead to results that differ from the group's intentions.
Many group biases have been identified, and to the extent that they produce results at variance with group intentions, they can all lead to design errors that produce unexpected and unintended results. Here are three of them.
- Group polarization
- Group polarization is the tendency of groups to adopt positions more extreme than any of their members would adopt if acting individually. The phenomenon is consistent with a normalization effect that can occur when group members learn that the sense of the group is in general alignment with their own inclinations. Members then feel free to abandon reluctance and doubt with respect to their private judgments, and the result is a "hardening" of those judgments.
- For groups making design decisions, group polarization can suppress interest in alternatives, and any desire to search for or explore rare but important use cases. It can also lead to outright rejection of perfectly workable designs — a form of design error not often noticed, because rejected designs typically are not implemented.
- Pluralistic ignorance
- In pluralistic ignorance, group members privately reject a position, while they simultaneously and incorrectly believe that almost everyone else accepts it. They decline to voice objections because they feel that doing so is pointless, or because they misinterpret the positions of other group members.
- For example, consider a design that forthrightly concedes that it does not address a well-defined need of the customer population. All of the members of the group might have misgivings about failing to address the issue, but the group adopts the design anyway because all members believe (erroneously) that the others favor it.
- Abilene paradox
- Closely related to pluralistic ignorance, the Abilene paradox applies when members of a group agree to go along with a group decision despite their private misgivings, mostly because of unpleasant imaginings of what the group might say or do if the member were to be honest about his or her misgivings.
- For example, a group can reach a design decision that none of its members support, because all of its members imagine that serious conflict — possibly threatening the group's ability to work together — would erupt if they were to express their honest objections to the proposed design.
Although all of these biases (and others) can lead groups to decisions their members do not support, the results can actually be positive. Some groups do well in spite of themselves. It's rare, but it happens.