The Fundamental Attribution Error
by Rick Brenner
When we try to understand the behavior of others, we often make a particularly human mistake. We tend to attribute too much to character and disposition and too little to situation and context. When we seek a better balance, we can adopt a more accepting view of events around us.
The meeting ran over by about five minutes, just long enough to make most of the attendees late for their 11 o'clock meetings. The room emptied quickly, leaving Spencer alone with the meeting chair, Lynne. Lynne asked, "Help me take down the flip charts?"
"Sure," he said, "no problem."
"I wanted to talk to you, too," she said. "I really felt that you weren't being very helpful today."
Spencer felt somewhat shocked at first, but then it came to him — it was probably Metronome. Still, he didn't want to let her see that he knew what it was. "Oh? In what way?"
They now had all the flip chart sheets flat on the table, and Lynne sat down in her chair. Spencer sat down across the table from her.
"When you brought up the Metronome interface," she said.
"Oh, that," he said. "It just seemed to me that the rest of the meeting depended on it."
Lynne felt her frustration building. "But I explained all that in my email yesterday. And you went ahead anyway. That's what bothers me."
Lynne has now dug herself into a neat hole. She is assuming that Spencer saw her message, and she feels that he disregarded it. In fact, he never did receive it, and he was unaware of the change in the agenda.
Lynne's error is perfectly human. It's so common that it even has a name — the Fundamental Attribution Error (FAE). As humans, we have a tendency to explain the behavior of others on the basis of disposition or character, rather than context or the actions of third parties. Probably this happens because we understand the internal motives of others more easily than we understand the complex situations they face. That's reasonable, because we usually have only vague information about how situations look to others.
For example, Lynne was completely unaware that Spencer had been having chronic email problems. Customer reports are routed to a list he has to subscribe to, and his inbox suffers from chronic bloat, which has exposed a bug in the email client they all use. Lynne attributed Spencer's behavior to a deliberate choice, but he might have made another choice if he had been aware of the change in the agenda.
An American Indian proverb captures the idea of the FAE most elegantly: "Don't judge a man until you've walked two moons in his moccasins." To help you remember the Fundamental Attribution Error, get a pair of baby moccasins. Baby shoes will do, too. Put one on your desk or on top of your computer monitor and the other in your car. Only you will know what they mean, because everyone else who tries to figure out their meaning will make the Fundamental Attribution Error.
For more about the Fundamental Attribution Error, see Malcolm Gladwell, The Tipping Point: How Little Things Can Make a Big Difference. Boston: Back Bay Books, 2002, pp. 160-163.
The article you've been reading is an archived issue of Point Lookout, my weekly newsletter. I've been publishing it since January 2001, free to all subscribers, over the Web and via RSS.
- For most of us, making decisions is a large part of what we do at work. And we tend to believe that we make our decisions rationally, except possibly when stressed or hurried. That is a mistaken belief — very few of our decisions are purely rational. In this eye-opening yet entertaining program, Rick Brenner guides you through the fascinating world of cognitive biases, and he'll give concrete tips to help you control the influence of cognitive biases. Read more about this program. Here's an upcoming date for this program: