Nan picked up the last chunk of cookie and ate it. Peter and Trish had long ago finished theirs, but Nan liked making cookies last. "The critical thinking fallacies were my favorites," she said. "I like learning how to think more clearly."
Peter sipped his coffee. "Mmm." He swallowed. "But how do we avoid those fallacies?"
Nan had an idea. "Maybe we should inspect our project plans, like we inspect components."
Trish was intrigued. "Yeah, and I know what I'd put at the top of the checklist."
"OK, I'll bite," said Peter. "What?"
Trish was ready. "The Nine Project Management Fallacies."
Not a bad idea. These last three fallacies (Part IV of a little catalog of the fallacies of project management) are errors of critical thinking. For Part III, see "Nine Project Management Fallacies: III," Point Lookout for December 28, 2005.
- The Normative Fallacy
- This fallacy holds that when we ask some people their opinions, and most of them agree, then they're correct. Usually we select people nonrandomly, choosing those who will give us desirable answers, or those we can trust, or those of high rank.
- Nonrandom polling might provide comfort, but it's hardly scientific, and it almost always leads to biased conclusions.
- To get truly useful polling data, you must poll people randomly.
- The Availability Heuristic
- In risk management, we often estimate the probabilities of certain events. We're using the Availability Heuristic [Tversky 1973] when we estimate these probabilities according to how easily we can imagine or understand the chain of events that leads to the risk.
- For instance, when we ask people whether death resulting from being attacked by a shark is more or less likely than death from being hit by falling airplane parts, they usually answer that death by shark attack is more likely. Actually, death from being hit by falling airplane parts is 30 times more likely, but people are fooled because shark attacks, being more vivid and more widely reported, are easier to imagine.
- Estimating probabilities this way is unlikely to produce reliable results. For this reason, the Availability Heuristic is usually considered an example of a cognitive bias. Use real data, or use huge error bars.
- The Grandiosity Fallacy
- Confronting a problem, we sometimes address a generalization of the problem instead, hoping to solve a host of similar problems, and thereby solving the original problem almost "for free." Rarely does the reality match the wish.
- Grandiosity usually generates two kinds of trouble. First, it's often more expensive and time-consuming than originally estimated. Second, the people of the organization rarely want the general solution. If they did, they probably would have sought it in the first place.
- Sometimes customers don't know the value of the general solution, and telling them about it might produce a better outcome. But usually they want only what they asked for. Work with them on that first.
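The bias behind the Normative Fallacy can be made concrete with a small simulation (the numbers are hypothetical, and Python is used here only for illustration). Polling only people likely to agree with us produces near-unanimous "support," while a uniformly random sample of the same size lands near the true figure:

```python
import random

random.seed(42)

# Hypothetical population: 10,000 people, of whom 30% actually favor the plan.
population = [True] * 3000 + [False] * 7000
random.shuffle(population)

# Nonrandom poll: ask only people we already know will agree,
# e.g., the first 500 supporters we can find.
friendly = [p for p in population if p][:500]
biased_estimate = sum(friendly) / len(friendly)   # always 100%

# Random poll: sample 500 people uniformly at random.
sample = random.sample(population, 500)
random_estimate = sum(sample) / len(sample)       # close to 30%

print(f"Biased poll:  {biased_estimate:.0%} favor the plan")
print(f"Random poll:  {random_estimate:.0%} favor the plan")
print("True support: 30%")
```

The biased poll isn't merely imprecise; it's systematically wrong, and no amount of additional nonrandom polling corrects it.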
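The advice about the Availability Heuristic — "use real data, or use huge error bars" — can be sketched as follows. The counts are hypothetical, and the two-standard-deviation Poisson-style band is just one simple choice of error bar; the point is to report a range rather than a falsely precise single number:

```python
import math

# Hypothetical incident history: 3 occurrences observed in 10 years.
events, years = 3, 10

rate = events / years  # point estimate: 0.3 per year

# Crude Poisson-style error band: +/- 2 standard deviations
# (the square root of the count) on the observed count, floored at zero.
spread = 2.0 * math.sqrt(events)
low = max(0.0, (events - spread) / years)
high = (events + spread) / years

print(f"Estimated rate: {rate:.2f}/yr, plausible range {low:.2f} to {high:.2f}/yr")
```

With only three observed events, the plausible range runs from zero to more than twice the point estimate — a useful reminder of how little a small sample actually tells us.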
Track the incidence of these nine fallacies in your organization. Use them to inspect project plans. Your projects will probably have fewer surprises, or at least you'll be just a little less likely to be hit by falling airplane parts.
Are your projects always (or almost always) late and over budget? Are your project teams plagued by turnover, burnout, and high defect rates? Turn your culture around. Read 52 Tips for Leaders of Project-Oriented Organizations, filled with tips and techniques for organizational leaders. Order Now!
For more on cognitive biases, see "The Focusing Illusion in Organizations," Point Lookout for January 19, 2011.
Footnotes
[Tversky 1973] Amos Tversky and Daniel Kahneman. "Availability: a heuristic for judging frequency and probability," Cognitive Psychology 5:2 (1973), 207-232.
Your comments are welcome
Would you like to see your comments posted here? Send me your comments by email, or by Web form.
About Point Lookout
Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.
This article in its entirety was written by a human being. No machine intelligence was involved in any way.
Point Lookout is a free weekly email newsletter. Browse the archive of past issues. Subscribe for free.
Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.
Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.
Related articles
More articles on Project Management:
- Toxic Projects
- A toxic project is one that harms its organization, its people or its customers. We often think of toxic
projects as projects that fail, but even a "successful" project can hurt people or damage
the organization — sometimes irreparably.
- Nine Project Management Fallacies: III
- Some of what we "know" about managing projects just isn't so. Identifying the fallacies of
project management reduces risk and enhances your ability to complete projects successfully.
- Risk Management Risk: I
- Risk Management Risk is the risk that a particular risk management plan is deficient. It's often overlooked,
and therefore often unmitigated. We can reduce this risk by applying some simple procedures.
- Wishful Interpretation: I
- Wishful thinking comes from more than mere imagination. It can enter when we interpret our own observations
or what others tell us. Here's Part I of a little catalog of ways our wishes affect how we interpret
the world.
- Power Distance and Risk
- Managing or responding to project risks is much easier when team culture encourages people to report
problems and to question any plans they have reason to doubt. Here are five examples that show how such
encouragement helps to manage risk.
See also Project Management and Critical Thinking at Work for more related articles.
Forthcoming issues of Point Lookout
- Coming April 3: Recapping Factioned Meetings
- A factioned meeting is one in which participants identify more closely with their factions, rather than with the meeting as a whole. Agreements reached in such meetings are at risk of instability as participants maneuver for advantage after the meeting. Available here and by RSS on April 3.
- And on April 10: Managing Dunning-Kruger Risk
- A cognitive bias called the Dunning-Kruger Effect can create risk for organizational missions that require expertise beyond the range of knowledge and experience of decision-makers. They might misjudge the organization's capacity to execute the mission successfully. They might even be unaware of the risk of so misjudging. Available here and by RSS on April 10.
Coaching services
I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details by email or at (650) 787-6475, or toll-free in the continental US at (866) 378-5470.
Get the ebook!
Past issues of Point Lookout are available in six ebooks:
- Get 2001-2 in Geese Don't Land on Twigs (PDF)
- Get 2003-4 in Why Dogs Wag (PDF)
- Get 2005-6 in Loopy Things We Do (PDF)
- Get 2007-8 in Things We Believe That Maybe Aren't So True (PDF)
- Get 2009-10 in The Questions Not Asked (PDF)
- Get all of the first twelve years (2001-2012) in The Collected Issues of Point Lookout (PDF)
Are you a writer, editor or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site. More info