In Part I of this series, we suggested that deep thought about difficult subject matter can sometimes cause blindness to related and important ideas — a kind of looking, but not seeing. And when we have preconceptions or we think we know what's happening, we sometimes don't even look.
Let's continue exploring ways of missing the obvious.
- Not knowing your own patterns
- If you don't know your own patterns, repetitions are likely. Recall situations in which you or your team missed the obvious. Whatever caused those oversights might still be in place, waiting to trip you once again.
- Track the patterns you tend to repeat. Data on repetitions is valuable.
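Collecting that data can be as simple as tallying the categories of past oversights. Here's a minimal sketch in Python; the category labels are hypothetical examples, not data from any real retrospective:

```python
from collections import Counter

# Hypothetical log of oversight categories gathered from past retrospectives
oversights = [
    "assumed requirements were final",
    "skipped stakeholder review",
    "assumed requirements were final",
    "relied on stale data",
    "assumed requirements were final",
]

# Tally how often each pattern recurs; the repeats deserve attention first
pattern_counts = Counter(oversights)
for pattern, count in pattern_counts.most_common():
    if count > 1:
        print(f"{pattern}: recurred {count} times")
```

Even a tally this crude makes repetition visible, which is the point: patterns you can see are patterns you can interrupt.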
- Seeking confirmation but not counterexamples
- When we have hunches or conjectures about something, we tend to search for confirmation rather than disconfirmation. It's satisfying to prove guesses correct — especially if they're our own guesses. And it's risky to prove guesses incorrect, especially if they're someone else's guesses.
- Falsifying conjectures can generate new insight. Examine past efforts. An imbalance in favor of seeking confirmation, rather than disconfirmation, could indicate this bias.
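One way to examine past efforts for that imbalance is to classify each investigation as confirmation-seeking or disconfirmation-seeking and compare the counts. A minimal sketch, using invented sample data for illustration:

```python
# Hypothetical classification of past investigations of conjectures:
# did each effort seek confirming or disconfirming evidence?
efforts = ["confirm", "confirm", "confirm", "disconfirm", "confirm"]

confirming = efforts.count("confirm")
disconfirming = efforts.count("disconfirm")

# A large imbalance toward confirmation suggests the bias is at work.
# The threshold of 2 is an arbitrary illustration, not a research result.
ratio = confirming / max(disconfirming, 1)
if ratio > 2:
    print(f"Imbalance: {confirming} confirming vs {disconfirming} disconfirming")
```

The specific threshold matters less than the habit of asking the question at all.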
Sometimes entire groups or teams miss the obvious. Here are two common patterns.
- Media distortion
- The medium a team uses for meetings or other communication can strongly affect outcomes. It can even prevent effective communication, especially when virtual teams rarely or never meet face-to-face. It can conceal the fact that someone is withholding information. It can so distract people in meetings that they forget to mention something important. And the audio quality can be so poor that people miss subtle points — or even the main point — of the discussion.
- If your team or group depends on a virtual workspace, distribute notes and meeting summaries regularly to clarify issues and decisions. It's a poor substitute for co-located meetings, but it does help.
- Information siloing
- Groups convened to resolve issues or solve problems usually include representatives of all functions that have relevant skills, information, or assets. Typically, they assume that everyone shares whatever they know. But when some keep information within their individual delegations, declining to share it, the knowledge that is shared acquires a bias, which can lead to poor decisions and missing the obvious.
- This comes about, in part, because of a cognitive bias known as shared information bias, which causes group members to discuss what all group members know already. They're less inclined to discuss what only a few group members know. The effect is more marked when there's a sense of urgency, or when group members are uncomfortable with ambiguity or lack of consensus. The effect is less marked when the group, as a whole, is concerned with decision quality. Sharing knowledge about the shared information bias is one way of mitigating its effects.
The human mind is endlessly inventive. Of the many more ways to miss the obvious, I'm sure I've missed some that are obvious.
Are your projects always (or almost always) late and over budget? Are your project teams plagued by turnover, burnout, and high defect rates? Turn your culture around. Read 52 Tips for Leaders of Project-Oriented Organizations, filled with tips and techniques for organizational leaders. Order Now!
Your comments are welcome
Would you like to see your comments posted here? Send me your comments by email, or by Web form.
Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.
This article in its entirety was written by a human being. No machine intelligence was involved in any way.
Point Lookout is a free weekly email newsletter. Browse the archive of past issues. Subscribe for free.
Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.
Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.
Related articles
More articles on Project Management:
- Nine Project Management Fallacies: I
- Most of what we know about managing projects is useful and effective, but some of what we "know"
just isn't so. Identifying the fallacies of project management reduces risk and enhances your ability
to complete projects successfully.
- Beyond Our Control
- When bad things happen, despite our plans and our best efforts, we sometimes feel responsible. We failed.
We could have done more. But is that really true? Aren't some things beyond our control?
- Power Distance and Teams
- One of the attributes of team cultures is something called power distance, which is a measure
of the overall comfort people have with inequality in the distribution of power. Power distance can
determine how well a team performs when executing high-risk projects.
- Seven More Planning Pitfalls: II
- Planning teams, like all teams, are susceptible to several patterns of interaction that can lead to
counter-productive results. The three most relevant to planners are False Consensus, Groupthink,
and Shared Information Bias.
- Anticipating Absence: Internal Consulting
- Most consultants are advisors from outside the organization. But when many employees are unavailable
because of the Coronavirus pandemic, we need to find ways to access the knowledge that remains inside
the organization. Internal consulting can help.
See also Project Management and Critical Thinking at Work for more related articles.
Forthcoming issues of Point Lookout
- Coming April 3: Recapping Factioned Meetings
- A factioned meeting is one in which participants identify more closely with their factions than with the meeting as a whole. Agreements reached in such meetings are at risk of instability as participants maneuver for advantage after the meeting. Available here and by RSS on April 3.
- And on April 10: Managing Dunning-Kruger Risk
- A cognitive bias called the Dunning-Kruger Effect can create risk for organizational missions that require expertise beyond the range of knowledge and experience of decision-makers. They might misjudge the organization's capacity to execute the mission successfully. They might even be unaware of the risk of so misjudging. Available here and by RSS on April 10.
Coaching services
I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details at (650) 787-6475, or toll-free in the continental US at (866) 378-5470.
Get the ebook!
Past issues of Point Lookout are available in six ebooks:
- Get 2001-2 in Geese Don't Land on Twigs (PDF)
- Get 2003-4 in Why Dogs Wag (PDF)
- Get 2005-6 in Loopy Things We Do (PDF)
- Get 2007-8 in Things We Believe That Maybe Aren't So True (PDF)
- Get 2009-10 in The Questions Not Asked (PDF)
- Get all of the first twelve years (2001-2012) in The Collected Issues of Point Lookout (PDF)
Are you a writer, editor or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site. More info