Some people who have organizational power use it wisely. Some do not. This post is about the latter. Specifically, it explores one way people with organizational power can go wrong, due to the Dunning-Kruger Effect. I begin by describing the Dunning-Kruger Effect. Next I describe how the effect generates risks for organizational missions. I then offer three recommendations for managing those risks.
The Dunning-Kruger Effect
A cognitive bias is a tendency to make systematic errors of judgment based on mental factors rather than on evidence. For example, the self-serving bias leads us to attribute our successes to our own capabilities, and our failures to situational factors. In 1999, Justin Kruger and David Dunning demonstrated the effects of a cognitive bias that has since become known as the Dunning-Kruger Effect. They found that when we assess our own competence or abilities in a particular field, either in an absolute sense or relative to others, we tend to commit systematic errors. [Kruger 1999] Four of their principal findings are:
- The less competent tend to overestimate their own competence
- The less competent don't recognize the superior competence of the more competent
- The more competent tend to underestimate their own relative competence
- The more competent tend to estimate accurately the incompetence of the less competent
Consequences for people with power
Because of the Dunning-Kruger Effect, people with high levels of organizational power are at risk of demanding that the organization achieve goals that are not in fact achievable. Everyone is subject to the effect. With respect to knowledge domains outside our areas of expertise, any of us can mistakenly regard as achievable an objective that isn't, or one that can be achieved only at such high cost as to be truly impractical. An organizational leader who steps beyond his or her domain of expertise to devise and advocate an organizational mission is at risk of sending the organization on a fool's errand.
When people in organizations receive commands from those with power, there is always the possibility that those with power have assessed themselves and their organizations as more capable than they actually are. Decision-makers are at risk of requiring others to carry out impossible missions if they lack some of the expertise needed to assess those missions accurately. They would therefore be wise to consult experts in all domains relevant to a given mission before charging the organization with achieving it. The Dunning-Kruger Effect implies that relying on organizational leaders alone for these decisions is risky.
Three recommendations
To mitigate these risks, decision-makers can rely on domain experts who can assess the organization and its leaders against three criteria:
- 1. The mission is within the reach of the organization
- Missions require financial resources. They also require people whose skills, knowledge, and experience completely cover the mission's needs. A mission is within the reach of the organization if the necessary resources and people are available, or can be acquired, within the necessary time frames.
- 2. The decision-maker is competent to make future mission-relevant decisions
- During mission execution, the decision-maker must be available and competent to address any issues the mission presents. If issues beyond the decision-maker's range of competence arise, the decision-maker must have access to others who can identify those issues and provide or obtain the needed expertise. Taking the Dunning-Kruger Effect into account, the experts recognize that the decision-maker is not a reliable source for assessing compliance with this criterion.
- 3. A process is in place to maintain compliance with Criterion 1 and Criterion 2
- Ensuring ongoing compliance with Criteria 1 and 2 requires access to consulting capacity equivalent to what was available when the decision to undertake the mission was approved. A correction process takes effect if the consultants find misalignment between the organization and any of these criteria.
Last words
Managing the risk of the Dunning-Kruger Effect requires people with organizational power — decision-makers — to acknowledge limits to that power. That acknowledgment will be difficult for many. But the choice is clear: either acknowledge the limits of organizational power, or accept the risks of the Dunning-Kruger Effect.
Footnotes
[Kruger 1999] Justin Kruger and David Dunning. "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments." Journal of Personality and Social Psychology 77:6 (1999), 1121–1134.
Your comments are welcome
Would you like to see your comments posted here? Send me your comments by email, or by Web form.
About Point Lookout
Thank you for reading this article. I hope you enjoyed it and found it useful, and that you'll consider recommending it to a friend.
This article in its entirety was written by a human being. No machine intelligence was involved in any way.
Point Lookout is a free weekly email newsletter. Browse the archive of past issues. Subscribe for free.
Support Point Lookout by joining the Friends of Point Lookout, as an individual or as an organization.
Do you face a complex interpersonal situation? Send it in, anonymously if you like, and I'll give you my two cents.
Related articles
More articles on Cognitive Biases at Work:
- How Messages Get Mixed
- Although most authors of mixed messages don't intend to be confusing, message mixing does happen. One
of the most fascinating mixing mechanisms occurs in the mind of the recipient of the message.
- The Rhyme-as-Reason Effect
- When we speak or write, the phrases we use have both form and meaning. Although we usually think of
form and meaning as distinct, humans tend to assess as more meaningful and valid those phrases that
are more beautifully formed. The rhyme-as-reason effect causes us to confuse the validity of a phrase
with its aesthetics.
- Motivated Reasoning
- When we prefer a certain outcome of a decision process, we risk falling into a pattern of motivated
reasoning. That can cause us to gather data and construct arguments that erroneously lead to the
outcome we prefer, often outside our awareness. And it can happen even when the outcome we prefer is
known to threaten our safety and security.
- Some Perils of Reverse Scheduling
- Especially when time is tight, project sponsors sometimes ask their project managers to produce "reverse
schedules." They want to know what would have to be done by when to complete their projects "on
time." It's a risky process that produces aggressive schedules.
- Lessons Not Learned: II
- The planning fallacy is a cognitive bias that causes us to underestimate the cost and effort involved
in projects large and small. Efforts to limit its effects are more effective when they're guided by
interactions with other cognitive biases.
See also Cognitive Biases at Work and Managing Your Boss for more related articles.
Forthcoming issues of Point Lookout
- Coming May 15: Should I Write or Should I Call?
- After we recognize the need to contact a colleague or colleagues to work out a way to move forward, we next must decide how to make contact. Phone? Videoconference? Text message? There are some simple criteria that can help with such decisions. Available here and by RSS on May 15.
- And on May 22: Rescheduling Collaborative Work
- Rescheduling is what we do when the schedule we have now is so desperately unachievable that we must let go of it because when we look at it we can no longer decide whether to laugh or cry. The fear is that the new schedule might come to the same end. Available here and by RSS on May 22.
Coaching services
I offer email and telephone coaching at both corporate and individual rates. Contact Rick for details by email, or at (650) 787-6475, or toll-free in the continental US at (866) 378-5470.
Get the ebook!
Past issues of Point Lookout are available in six ebooks:
- Get 2001-2 in Geese Don't Land on Twigs (PDF)
- Get 2003-4 in Why Dogs Wag (PDF)
- Get 2005-6 in Loopy Things We Do (PDF)
- Get 2007-8 in Things We Believe That Maybe Aren't So True (PDF)
- Get 2009-10 in The Questions Not Asked (PDF)
- Get all of the first twelve years (2001-2012) in The Collected Issues of Point Lookout (PDF)
Are you a writer, editor or publisher on deadline? Are you looking for an article that will get people talking and get compliments flying your way? You can have 500-1000 words in your inbox in one hour. License any article from this Web site. More info
Follow Rick
Recommend this issue to a friend
Send an email message to a friend
Send a message to Rick
A Tip A Day feed
Point Lookout weekly feed