Decision-Making for Team Leaders Presented to Northern Utah Chapter of PMI by Richard Brenner
Principal, Chaco Canyon Consulting
on May 13, 2015
Copyright © 2015 Richard Brenner
Chaco Canyon Consulting
www.ChacoCanyon.com 1
Subscribe to Point Lookout: http://www.ChacoCanyon.com/pointlookout
This document is http://goo.gl/fA0kwP
Decision-Making for Team Leaders
presented to
Northern Utah Chapter of PMI
2015 Professional Development Conference
BYU Conference Center, Provo, Utah
May 13, 2015
by
Rick Brenner
Chaco Canyon Consulting
Building State-of-the-Art Teamwork
In Problem-Solving Organizations
www.ChacoCanyon.com
Copyright © 2015 Richard Brenner
A note on format
• Underlined items are live links to:
• Other slides
• Articles on my Web site
• Articles elsewhere on the Web
• To get a copy with working links: http://goo.gl/hvP6UI
• To get a copy of the handout: http://goo.gl/fA0kwP
• To get both and more as a zip archive: http://goo.gl/E2jgYc
Adobe Reader 6.0 or later is required
Please let me know as we go along if you want to ask a question
Core message
• We all make decisions
• We make more decisions than we realize
• We can improve how we make decisions by:
• Consciously choosing how we make decisions
• Ensuring that we have what we need when we decide
• Understanding how decisions go wrong
• Managing the risks (including behavioral risks) of the decision-making process
The menu
• Boyd’s OODA model
• Tools for orienting to the situation
• Dealing with cognitive biases
• Critical thinking and group decision-making dysfunctions
• Tools for deciding
• Closing
• Resources and references
Boyd’s OODA model
Observe
Orient
Decide
Act
COL John Boyd, USAF, as Captain or Major during the Korean War
The OODA model in brief
• Observe
• Sense the environment
• Example: Management learns of a hostile takeover attempt
• Orient
• We synthesize what we’ve observed
• Example: Management researches the offer and available defenses
• Decide
• We select a response
• Example: Management decides to approach a competitor about a merger
• Act
• We execute our decision
• Example: Management agrees to merge
Use OODA to guide your thinking
• For success in rapidly changing situations: cycle through your OODA Loop fast enough
• When your opponent—or the situation—changes too fast, failure is almost certain
• If you have sentient opponents: cycle through your OODA Loop faster than your opponents cycle through theirs
For examples of strategic use of OODA, study national political campaigns
How to use OODA
• Be clear about what stage is happening now
• Understand importance of OODA cycle time
• Do what you can to
• Reduce your own cycle time
• Lengthen your opponent’s cycle time
• Reduce your own cycle time by:
• Enhancing Observation skills
• Enhancing Orienting skills
• Making better Decisions
• Making Decisions faster
Enhancing Observation skills
• Increase data collection capacity to match the situation
• Make data collection more objective, less biased
• Enhance observational accuracy
• Use filters to remove extraneous data
• Ensure that filters don’t introduce distortions
In our discussion today, we address these priorities by understanding cognitive biases
Enhancing Orienting skills
• Orienting requires processing observed data
• Make sense of it
• Form a model of the situation
• For decision-making in our context this means
• Focusing on relevant data
• Developing options and ranking them
• Pre-defined tools for orienting can speed the process
Comprehensive situational awareness requires effective orienting processes
Making better Decisions
Given comprehensive situational awareness, decision quality depends on
• Critical thinking skills, for both individuals and groups
• Familiarity with group decision-making dysfunctions
• Mechanisms for managing group decision-making dysfunction risk
We’ll address these needs in our discussions of critical thinking and group decision-making dysfunctions
Making Decisions faster
• Making decisions faster requires
• Familiarity and facility with a variety of decision-making patterns
• Choosing the right pattern for the situation
• Practice as individuals
• Practice as a group
• We’ll explore the available patterns as tools for deciding
Everything goes faster if you have competent facilitation
Applications of OODA
• Understand “firefighting” mode
• Accept the “horizon of uncertainty”
• Distinguish strategic and tactical decisions
• Effects of situational complexity
• Effects of virtuality on decision-making
Understand “firefighting” mode
• Definition:
• Reactive, short-term problem fixing
• Suppressing symptoms of problems
• Deferring consequences of problems
• Refraining from actually addressing root causes
• By the way:
• This isn’t actually what firefighters do (usually)
• They are much more proactive
A wildland firefighter igniting a controlled grassfire
How we operate in firefighting mode
• In firefighting mode:
• We jump from one fire to the next
• No time to really extinguish any one fire
• What’s actually happening in terms of OODA:
• The situation has gotten “inside” our OODA loop
• We can’t cycle through the loop fast enough to match the situation
• Unless we can shorten our cycle time, we stay in firefighting mode
Firefighting tactics: Shorten OODA cycle time
• Firebreak: strip of terrain that has been cleared of fuel
• Backfire: controlled burn in path of fire that deprives fire of fuel
• Seek organizational analogs of both
Firefighting tactics: Organizational firebreaks
• In organizations: exploit modularity
• Applying modularity
• Share “resources” only across projects that interact weakly
• Minimize resource sharing
• Make cost models of resource sharing to demonstrate full costs to decision-makers
Firefighting tactics: Organizational backfires
• In organizations: invest in paying down technical debt and management debt
• Technical debt (T-debt):
• Incomplete implementations and upgrades
• Near-term solutions at long-term expense
• Management debt (M-debt):
• Incomplete or deferred organizational upgrades
• Near-term solutions at long-term expense
• To reduce debt:
• Make cost models of T-debt and M-debt
• Demonstrate wisdom of debt-retirement programs with numbers
Accept the “horizon of uncertainty”
• When we plan future activity:
• The future is uncertain
• The farther into the future, the greater the uncertainty
• Horizon of uncertainty: uncertainty so high that plans are unreliable
• In fluid environments:
• Horizon of uncertainty is very close to right now
• Investing in long term plans is hard to justify
• Best to plan work in short increments
• This is what agile processes do
Conventional product lifecycles
• Waterfall, iterative, …
• Requirements and analysis: Observe, Orient
• Project planning: Orient, Decide
• Project execution: Act
• For conventional lifecycles OODA loops are relatively long
• Waterfall: entire project is one cycle
• Iterative: one iteration is one cycle
In conventional lifecycles we must plan in detail well beyond our “horizon of uncertainty”
Agile product lifecycles
• Insert the customer into the team to enhance Observation and Orientation
• Deliver frequently
• Agile lifecycles tighten the OODA Loop dramatically
• Days or weeks
• Not months or quarters or years
• One delivery is one OODA cycle
In agile lifecycles we don’t plan in detail beyond our “horizon of uncertainty”
Distinguish strategic and tactical decisions
• Strategy: What we are doing and why
• Tactics: How we’ll accomplish it and with what
• Both strategy and tactics require decisions
• Strategic decisions: Are our goals and motives right?
• Tactical decisions: Will our methods work?
• Common errors
• Thinking tactically about strategy
• Thinking strategically about tactics
• OODA helps with both, but the cycle times are very different
How to distinguish strategy from tactics
• OODA helps with both, but the cycle times are very different
• Strategic decisions have long cycle times
• Tactical decisions can be arbitrarily short
• If a decision is needed urgently:
• It is no longer strategic, if it ever was
• If it was strategic, the strategy is broken
Effects of situational complexity
• “Situation” might not be unitary
• At any one time, several interlocking situations can be unfolding
• The OODA loop for one might tangle with the OODA loop for another
• Example: dealing with a project issue might involve personnel issues
• Do what you can to factorize problems
• Arrange to let them unfold independently
Effects of virtuality on decision-making
• In virtual contexts:
• Simultaneity can get skewed
• One site might have knowledge another doesn’t
• OODA loop can become phase-distorted
• OODA loop cycle times are longer
• How to address this:
• Include phase distortion management in risk plan
• Allow budget and schedule for it
Phases of the decision process
• Awareness that a decision might be necessary
• Agree that a decision is necessary
• Define the problem
• Determine what information is needed
• Develop alternative solutions
• Evaluate alternatives
• Select an alternative
• Implement the decision
• Evaluate the results
Applications of OODA in the team context
We can apply OODA in multiple domains:
• Content of the team effort
• Context of the team:
• Resources
• Suppliers
• Customers
• Regulatory environment
• Competitive environment
• Political environment
• Team lead: team dynamics
Tools for Orienting
• Brainstorming
• Generalized Morphological Analysis
• Nominal Group Technique
• List reduction
• Decisional balance sheet
• Decision matrix
• Weighted voting
• Paired comparisons
Brainstorming
• Structured option-generation method
• Not a random idea generation conversation
• Principles:
• Focus on a single question
• Focus on quantity of ideas
• Withhold evaluation of ideas
• Welcome unusual ideas
• Combine and improve ideas
• There are many variations
Any particular way of looking at things is only one from among many other possible ways. —Edward de Bono
Brainstorming difficulties
• Method is better suited to extraverts than introverts
• Evaluation anxiety
• Evaluation is nominally deprecated
• Tacit (and deferred) evaluation can be a real concern
• Problems correlated with group integration
• People who are less well integrated into the group might hold back
• People better integrated into the group might dominate
• Mechanical hindrances
• Capturing ideas can be a bottleneck
• People might forget their ideas before they can be captured
Generalized Morphological Analysis
• Invented by Fritz Zwicky in 1960s
• Procedure:
• Let L be a list of the attributes of the problem space
• Examine the grids LxL, LxLxL, …
• Ideal for exploring complex problems
• Enables systematic exploration of the problem space
• Excellent for finding non-quantitative solutions
• Ensures that you cover the space
• Exposes difficulties that might otherwise be overlooked
• Limits surprises
Fritz Zwicky
Example of using Generalized Morphological Analysis (GMA)
• Manage coverage of a help desk during an epidemic
• Look for risk/risk interactions
Adams subs for Franklin
Franklin subs for Adams
Franklin subs for Hamilton
Franklin subs for Madison
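The grid-examination step lends itself to a quick script. Below is a minimal sketch of checking an L×L grid for risk/risk interactions; the substitution options echo the slide, but the interaction rule is a hypothetical stand-in (the slide doesn’t specify one):

```python
from itertools import combinations

# Substitution options (the attribute list L). The interaction rule
# below is a hypothetical illustration, not from the presentation.
options = [
    "Adams subs for Franklin",
    "Franklin subs for Adams",
    "Franklin subs for Hamilton",
    "Franklin subs for Madison",
]

def interacts(a, b):
    # Hypothetical rule: two substitutions interact when they depend
    # on the same substitute (the first-named person).
    return a.split()[0] == b.split()[0]

# Examine the upper triangle of the L x L grid (interaction is symmetric)
interactions = [(a, b) for a, b in combinations(options, 2) if interacts(a, b)]
for a, b in interactions:
    print(f"{a}  <->  {b}")
# The three pairs that all rely on Franklin as the substitute interact.
```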
Nominal Group Technique
• Structured method for rank-ordering a list of options
• Each member rank-orders the list
• Sum the individual results
• Examples of conditions that NGT addresses:
• Some group members are much more vocal than others
• Some prefer silence for cogitating
• Some are not participating
• Group is stuck
• Some group members are new
• Conflict is turning toxic
• Group is inhomogeneous in status
• Stakeholders want rank-ordered results
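The NGT tally itself is simple arithmetic; a minimal sketch with illustrative options and member rankings (rank 3 = most preferred):

```python
# Each member rank-orders the options; the group result is the sum of
# individual ranks. Options and rankings here are illustrative.
options = ["Option A", "Option B", "Option C"]

rankings = [
    {"Option A": 3, "Option B": 2, "Option C": 1},
    {"Option A": 2, "Option B": 3, "Option C": 1},
    {"Option A": 3, "Option B": 1, "Option C": 2},
]

# Sum the individual results, then sort by total rank
totals = {opt: sum(r[opt] for r in rankings) for opt in options}
ordered = sorted(options, key=totals.get, reverse=True)
print(totals)   # {'Option A': 8, 'Option B': 6, 'Option C': 4}
print(ordered)  # ['Option A', 'Option B', 'Option C']
```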
List reduction
• Technique for triage of ideas
• Used for processing lists of options (e.g., from a brainstorm)
• Goal: reduce the list by filtering out less-desirable items
• Steps:
• Clarify the options
• Define filters, e.g., Possible?, Too expensive? …
• Apply filters
• Rank-order those that survive the filters
Beware: Some items might be incompatible with others. It might be necessary to combine this method with GMA.
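The filtering step can be sketched as a chain of yes/no predicates; the options, costs, and threshold below are hypothetical:

```python
# List reduction: keep only options that survive every filter.
options = [
    {"name": "Hire contractors", "possible": True,  "cost": 50_000},
    {"name": "Build in-house",   "possible": True,  "cost": 200_000},
    {"name": "Do nothing",       "possible": False, "cost": 0},
]

filters = [
    lambda o: o["possible"],          # Possible?
    lambda o: o["cost"] <= 100_000,   # Too expensive?
]

survivors = [o["name"] for o in options if all(f(o) for f in filters)]
print(survivors)  # ['Hire contractors']
```

The survivors then go on to rank-ordering, e.g. by Nominal Group Technique.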
Decisional balance sheet
• Purpose: clarify differences among options to aid in forming a consensus
• Useful for comparing a limited set of options (2-4)
• Brainstorm to find advantages and disadvantages
           Advantages   Disadvantages
Option 1   •            •
Option 2   •            •
Option 3   •            •
Decision matrix
Useful for comparing several options relative to multiple criteria
Weight Option 1 Option 2 Option 3 Option 4 Option 5
Criterion 1
Criterion 2
Criterion 3
Criterion 4
Total
See the example in the download archive
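The matrix arithmetic is a weighted sum per option. A minimal sketch, with made-up weights and scores:

```python
# Weighted decision matrix: each option's total is the sum over
# criteria of (criterion weight x option score).
weights = {"cost": 3, "risk": 2, "speed": 1}

scores = {
    "Option 1": {"cost": 4, "risk": 2, "speed": 5},
    "Option 2": {"cost": 2, "risk": 5, "speed": 3},
}

totals = {
    option: sum(weights[c] * s for c, s in criterion_scores.items())
    for option, criterion_scores in scores.items()
}
print(totals)  # {'Option 1': 21, 'Option 2': 19}
```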
Weighted voting
• Voters can cast differing numbers of votes
• Analogous to shareholder meetings
• Parameters:
• Quota: number of votes required to pass
• Votes vector: votes for each voter
• Advantage: nicely captures the relative importance of different voters
• Disadvantage: can generate jealousies, envy, toxic conflict
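The quota test is a one-line sum over the votes vector. A minimal sketch with illustrative voters, weights, and quota:

```python
# Weighted voting: a proposal passes when the "yes" votes meet the quota.
votes = {"Chair": 5, "Lead A": 3, "Lead B": 3, "Member": 1}  # votes vector
quota = 7  # number of votes required to pass

def passes(yes_voters):
    return sum(votes[v] for v in yes_voters) >= quota

print(passes({"Chair", "Member"}))  # False: 5 + 1 = 6 < 7
print(passes({"Chair", "Lead A"}))  # True: 5 + 3 = 8 >= 7
```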
Paired comparisons
• Basic idea
• Compare each option to each other option
• Each member tells which one of each pair is preferred
• Total the net preferences for each option
• Use for quantifying team’s preferences among a set of options
• Helpful when no clear choice is emerging
• Can be an alternative to list reduction
• Not for decision-making, rather for getting a sense of the group
Paired comparisons example
A spreadsheet is helpful for tallying results
           Option 1   Option 2   Option 3
Option 1      —          3          2
Option 2      1          —          2
Option 3      3          1          —
Totals:
Option 1: 1
Option 2: 1
Option 3: 0
See the example in the download archive
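A spreadsheet works, but the tally is also easy to script. A minimal sketch with illustrative vote counts (not the figures from the slide); each option scores one point per pairing it wins outright:

```python
from itertools import combinations

options = ["Option 1", "Option 2", "Option 3"]
# prefs[(a, b)] = number of members preferring a over b (illustrative)
prefs = {
    ("Option 1", "Option 2"): 3, ("Option 2", "Option 1"): 1,
    ("Option 1", "Option 3"): 2, ("Option 3", "Option 1"): 2,
    ("Option 2", "Option 3"): 2, ("Option 3", "Option 2"): 1,
}

# Total the net preferences: one point per pairing won outright
wins = {o: 0 for o in options}
for a, b in combinations(options, 2):
    if prefs[(a, b)] > prefs[(b, a)]:
        wins[a] += 1
    elif prefs[(b, a)] > prefs[(a, b)]:
        wins[b] += 1
print(wins)  # {'Option 1': 1, 'Option 2': 1, 'Option 3': 0}
```

Note that ties within a pairing (as between Options 1 and 3 here) score no points for either option.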
About cognitive biases
• Cognitive biases are systematic patterns of thought that:
• Cause deviations from what rational or objective thinking would produce
• Are outside our awareness
• Some definitions of CB imply that they pertain only to social interactions
• They can apply to any realm of endeavor
• They are not personal failings
Cognitive biases affect our ability to make decisions rationally
An example: the planning fallacy
• The tendency to:
• Underestimate the time, costs, and risks of future actions
• Overestimate benefits
• First identified by Kahneman and Tversky in 1979
• Widely observed before that
• Previously considered to be a form of incompetence
• Now we know better
Kahneman and Tversky; Kahneman received the Nobel Prize in Economics in 2002
The bias in cognitive bias research
• Most research focuses on social behavior
• Individuals
• Groups
• Comparatively less research on business decisions
• Recent shift in focus is promising, but much work remains
Caveat: Many of the workplace mechanisms described in what follows are speculative
Some cognitive biases that affect decision-making
• Sunk Cost Effect
• Irrational Escalation
• Endowment Effect
• Focusing Illusion
• Optimism Bias
• Self-Serving Bias
• Halo Effect
• Hindsight Bias
• Asymmetric Insight Illusion
• Hot Hands Fallacy
• Illusory Superiority
• Illusion of Control
• Confirmation Bias
• Assimilation Bias
• Backfire effect
• Ambiguity Effect
• Bias Blind Spot
• Anchoring Effect
• Planning Fallacy
• Pygmalion Effect
• Social Comparison Bias
• Dunning-Kruger Effect
• Fundamental attribution error
• Availability Heuristic
• Observer-Expectancy Effect
Sunk Cost Effect
• “Sunk cost”
• A term from the field of investment management
• Costs already incurred and not recoverable
• Sunk Cost Effect makes us averse to terminating an effort
• Accepting failure is difficult even when continuing will only add to losses
Beware the Sunk Cost Effect when deciding about ending an effort or limiting its scope or goals
Irrational Escalation
• Can cause us to
• Commit still more resources
• Disregard evidence that doing so is ineffective
• Unlike the Sunk Cost Effect
• Can take hold even before effort starts
• Commitment (not expenditure) is all that’s required
• Example: bidding war at an auction
Beware Irrational Escalation in situations that involve competition or contention
Test launch of a Trident missile
Endowment Effect
• Affects how we value what we already have relative to what we don’t
• We tend to value what we have more than we would spend to get it
• This bias might account for overvaluing work already performed
• Enhances both the Sunk Cost Effect and Irrational Escalation
• Devalues alternative opportunities
The Endowment Effect causes problems in strategic budgeting. It leads us to undervalue opportunity costs.
Focusing Illusion
• Attaching too much significance to a single feature
• Example: If I were rich, I’d be happy
• Workplace examples:
• If we hire this superstar, all will be well
• If we get Snidely off the team, we’ll finish on time
• Outsourcing will cut costs 40%
• Leads to:
• Failure to grasp full complexity of a situation
• Silver-bullet thinking
Many management fads are simple exploitations of the Focusing Illusion
Optimism Bias
• Tendency to believe:
• Probability of success is higher than data supports
• Risk of negative event is relatively low
• Example: I smoke, but I won’t get lung cancer
• Workplace examples:
• We’re more biased in favor of proposals that promise greater things
• We’re more susceptible to bias when we feel we’re in control
• Leads to inadequate risk management
Ice on Challenger’s launch pad
Self-Serving Bias
• Tendency to believe:
• Success is due to our own talents
• Failure is due to situational factors
• Example:
• People blame the computer when things don’t work
• They credit themselves when things do work
• Workplace examples:
• Retrospectives: attribute problems to external factors
• Risk plans: Acknowledge external risks, not our shortcomings
• Security: stronger defenses against external attacks
• Leads to:
• Defective, inconsistent control processes
• Inability to learn from errors
• Low productivity of pair programmers who lack a close personal relationship
Gen. William Westmoreland
Halo Effect
• Tendency for one attribute of a person/thing to bias overall assessments of that person/thing
• Example: people judge attractive people as more sociable, more capable, …
• Workplace examples:
• Performance reviews biased by a single incident
• In virtual meetings, people at the chair’s site have more credibility
• Leads to distorted assessments of value of contributions, proposals, concepts, …
Hindsight Bias
• Tendency to:
• See causal connections between outcomes and antecedent conditions
• Ignore conditions that introduce uncertainty
• Example: “I knew it all along”
• Workplace examples:
• Performance review: your impulsiveness makes for conflict
• Retrospective: project was late because of conflict
• Leads to:
• Failure to recognize actual causes
• Memory distortion
Asymmetric Insight Illusion
• Tendency to believe that my knowledge of you exceeds your knowledge of yourself
• Example: “I’ve got your number, pal!”
• Workplace examples:
• We understand our competitors better than they do us
• I know why my political rival does what she does
• Leads to:
• Underestimating rivals and competitors
• Being surprised unpleasantly
George W. Bush and Vladimir Putin at the G8 Summit, 2006
Hot Hands Fallacy
• Belief that recent events are the best predictors of success
• Example: “hot” and “cold” streaks in games
• Workplace examples:
• A recent success makes a project manager a better bet
• Fire a CEO because the company had a few bad quarters
• Leads to:
• Decisions not based on reality
• Scope creep
Fernandez & Cortazzo
Illusory Superiority
• Tendency to believe that one is “better than average”
• Example: all the kids in Lake Wobegon
• Workplace examples:
• We needn’t manage that risk because we’re so good that it can’t happen
• If you work here, you must be one of the best
• Leads to:
• Excessive and unrealistic risk appetite
• Overvaluing in-house expertise
Garrison Keillor
Illusion of Control
• Tendency to overestimate one’s ability to control events
• Example: I’ll win the lottery if I pick the number carefully
• Workplace examples:
• When the project succeeds, the project manager believes it’s a personal success
• When the project fails, we investigate only what the project manager did wrong
• Leads to:
• Erroneous identification of root causes
• Overvaluation of individual capabilities
Confirmation Bias
• Tendency to favor and seek only information that confirms our preconceptions
• Example: homogeneity of news channel audiences
• Workplace examples:
• Biased search for information about competitors
• Biased evaluation of competitors
• Leads to:
• Acquiring inaccurate picture of reality
• Deprecating dissenting views
BlackBerry Curve
Assimilation Bias
• Tendency to distort our observations to conform to our preconceived schema
• Workplace examples:
• Job candidates with impressive resumes are regarded as more capable
• We estimate costs as lower when money is tight
• Leads to:
• Categorizing defects as less severe than they are
• Budget and schedule overruns
• Misunderstanding strengths of competitors
Backfire effect
• Response to evidence contradicting our convictions:
• Grasp those convictions even more tightly
• Reject the evidence
• Sometimes: reject the presenters of the evidence
• Interferes with critical thinking
• In groups:
• Phenomenon contributes to group polarization
• Can lead to ejection of those who hold minority or contradicting views
• Can dangerously degrade decision quality
Ambiguity Effect
• Tendency to:
• Prefer options with known probability of a good outcome
• Avoid options with a less-well-known probability of a good outcome
• Example: investing in government bonds
• Workplace examples:
• Assign tasks to a long-time mediocre performer in preference to a promising rookie
• Use a tried-and-true expensive solution instead of a newer, cheaper solution
• Leads to scope creep, wasted resources
Read: “Reluctance to vaccinate: omission bias and ambiguity”
Bias Blind Spot
• A tendency to believe that we’re immune to cognitive biases
• Related to Illusory Superiority
• Example: We believe that we don’t have a bias blind spot
• Workplace examples:
• We believe that we make decisions objectively
• Risk plans generally don’t address CB risk
• Leads to:
• Failure to compensate for cognitive biases
• Excessive aversion to reviews and red teams
• Accepting output of retrospectives as complete
Supermarket merchandising
Anchoring Effect
• Tendency to give too much weight to early-arriving information
• Examples:
• Initial price offered sets the range
• MSRP
• Workplace examples:
• Project budget/schedule set by early estimates
• In job search, salary set by previous salaries
• Management solicits proposals for X with budget B and schedule S
• Leads to:
• Less-than-optimal solutions because exploration is biased
• Unreliable estimates
Planning Fallacy
• Tendency to:
• Underestimate one’s own budget, schedule, risks
• Overestimate benefits
• Examples:
• Panama Canal 1.0
• Iraq war 2.0
• Workplace examples: just about any project
• Leads to:
• Budget, schedule overruns
• Unmitigated or poorly mitigated risks
• Disappointing results
• The need to “embellish” and “spin” in status reports
Ferdinand de Lesseps
Pygmalion Effect
• Tendency for behavior or performance to align with expectations
• Example: students whom teachers regard as bright perform better
• Workplace examples:
• Employee performance correlates with ratings
• Also applies to processes
• Leads to misevaluation of true capability
Pygmalion et Galatée, Étienne Falconet (1763)
Social Comparison Bias
• Tendency to dislike or feel competitive with someone we see as “better”
• Examples:
• Usually relates to wealth and/or status
• Associated with depression, suicide
• Workplace examples:
• Competition for promotions or assignments
• Competition for workspaces, equipment, privileges
• Leads to distorted resource allocations:
• Rivalries and empire building
• Decisions not based on organizational priorities
Gen. Montgomery and Lt. Gen. Patton, at Palermo
Dunning-Kruger Effect
• The Dunning-Kruger effect:
• Competence and the ability to assess competence are inversely correlated
• Competence and confidence are inversely correlated
• Those who are least competent have greatest confidence
• The competent tend to underestimate their own competence
• Excessive confidence makes us vulnerable politically
• In politics, the more competent can exploit this vulnerability
U.S. Congressman Steve Stockman (R-TX)
Fundamental attribution error
• Tendency to explain behavior of others
• On the basis of disposition or character
• Rather than context or the actions of third parties
• Example: project failure
• We tend to see the project manager as the cause
• We tend to ignore contextual factors
• Leads to:
• Erroneous identification of causes
• Over-estimation of the capabilities of people who participate in success
State Correctional Institution Rockview, Bellefonte, Pennsylvania
Availability Heuristic
• Biases our estimates of probabilities of events
• Confuse higher probability with:
• Ease of imagining
• “Front of mind”
• Example: Compare these two probabilities:
• Getting bitten by a shark
• Being hit by falling airplane parts
• Workplace examples:
• Probability of new version of critical software
• Probability that we’ll keep the same team for the duration
Estimating probabilities rarely produces reliable results. Use real data, or use huge error bars.
White-tipped shark
Observer-Expectancy Effect
• Tendency for experimenters to influence the subjects of an experiment
• Originally found in psychology and medicine
• Related to Hawthorne Effect
• In our context:
• Affects data obtained from interviews and focus groups
• Affects reports from customers, subordinates, task leads
• Consider these effects:
• When evaluating field data
• When constructing questionnaires
• When conducting focus groups
What we need…
There are approximately 200 CBs identified so far. Many are related to others.
Tools for Deciding
• Facilitation
• Decision patterns
• Unanimity
• Consensus
• Consensus minus N
• Time-boxed consensus
• Authority
• Voting
• Multi-voting
• Veto
• Expert subgroup
• Delphi method
Facilitation
• Facilitator owns the group process
• Ensures fairness
• Detects and reports on dysfunction
• Facilitator must be content-neutral
• Participation in content creates a conflict of interest
• “Temporary” surrender of the facilitator role is a dangerous fiction
Separate leadership from facilitation, especially for high-impact decisions
Unanimity
• Everyone agrees with the proposal 100%
• Advantages:
• Good for binary decisions (yes/no)
• Useful for small groups (five or fewer)
• Ensures that everyone is on board
• Disadvantages:
• Difficult for groups of more than 5-8
• Stalemates possible with even numbers of votes
• Members experience pressure to agree
• Even a little controversy creates difficulty
Consensus
• Consensus is not necessarily unanimity
• Good for yes/no decisions
• Each member can honestly say:
• You understand my point of view, and I yours
• Whether or not I prefer this decision, I will support it
• Decision was reached openly and fairly
• Determine consensus in an open manner:
• Thumb Up: I agree with the decision that has been reached.
• Thumb Horizontal: I can “live with” and support the decision
• Thumb Down: I have serious reservations about the decision and cannot support it.
Consensus minus N
• Like Consensus, except proposal is accepted with up to N “no” votes
• Advantages:
• Good for yes/no decisions
• Faster than consensus
• Useful in a polarized atmosphere when a small minority threatens to block decisions
• Disadvantages:
• Minority can feel “trampled” or alienated
• Minority can “check out”
• Minority can be right
81
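The acceptance rule for Consensus minus N is simple enough to sketch in code. This is an illustrative sketch, not part of the presentation; the function name and the boolean vote representation are assumptions:

```python
def consensus_minus_n(votes, n=1):
    """Accept the proposal when at most n members vote 'no'.

    votes: iterable of booleans, True meaning support.
    Plain Consensus corresponds to the n=0 case.
    """
    no_votes = sum(1 for v in votes if not v)
    return no_votes <= n

votes = [True, True, True, False, True]
print(consensus_minus_n(votes, n=1))  # True: one dissenter is tolerated
print(consensus_minus_n(votes, n=0))  # False: plain consensus would block
```

The example shows the pattern's point: with one dissenter, n=1 unblocks a decision that strict consensus would stall.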
Time-boxed consensus
• Set a deadline (the “time box”)
• The group reaches a consensus (or doesn’t) before the deadline
• If it fails, another pattern (usually Authority) kicks in
• Advantages
• Good for yes/no decisions
• Mitigates the efficiency risk of Consensus and the alienation risk of Authority
• Helpful in polarized situations
• Disadvantages
• Some in the group can feel distrusted or pressured
• Rush to decision can degrade decision quality
82
Authority
• The group leader (or a set of co-leaders) decides, optionally with input from others
• Advantages:
• Speed
• Economy: other group members can work on other things
• Good for yes/no decisions
• Disadvantages:
• Risk of bad decisions
• Group leader might lack information or expertise
• Other members can feel excluded
83
Voting
• Options
• Fraction needed to pass: majority, super-majority
• Ballot secrecy
• Number of votes per voter (multi-voting)
• Restrictions on distributing multiple votes
• Advantages
• Allows more dissenters than consensus
• Faster decisions even in a polarized environment
• Disadvantages
• Fosters the development of factions
• Minority can feel alienated
• Decisions by thin majorities can exacerbate polarization
84
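The pass-fraction option reduces to a threshold test. A minimal sketch, not from the presentation; the strict-inequality convention at the threshold is an assumption, since real bylaws vary on ties:

```python
def vote_passes(yes, no, threshold=0.5):
    """True when the 'yes' fraction strictly exceeds threshold.

    threshold=0.5 models a simple majority; pass 2/3 (or another
    fraction) to model a super-majority rule.
    """
    total = yes + no
    return total > 0 and yes / total > threshold

print(vote_passes(5, 4))         # simple majority: True
print(vote_passes(6, 3, 2 / 3))  # exactly 2/3 is not strictly greater: False
```

The second call illustrates why the tie convention matters: a 6-to-3 vote sits exactly at a two-thirds threshold, and the bylaw's choice of strict or non-strict comparison decides the outcome.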
Multi-voting
• Used for reducing a field of N choices to M<N
• Each voter ranks M choices lowest to highest: 1, 2, 3, …, M
• Tally by adding the ranks for each of the N choices
• The top M choices are then known
• Advantages:
• Clear, simple, fast
• Good representation of minority view
• Disadvantage: assumes a linear ranking scale
85
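The tally described above can be sketched in a few lines. An illustrative sketch only; the names and the ballot format (each ballot maps a choice to the rank points the voter assigned, 1 lowest through M highest) are assumptions:

```python
from collections import Counter

def multi_vote(ballots, m):
    """Reduce a field of choices to the m with the highest rank totals."""
    totals = Counter()
    for ballot in ballots:
        totals.update(ballot)  # add each voter's rank points per choice
    return [choice for choice, _ in totals.most_common(m)]

# Three voters narrow a field of options to two:
ballots = [
    {"A": 2, "B": 1},
    {"A": 2, "C": 1},
    {"B": 2, "A": 1},
]
print(multi_vote(ballots, 2))  # ['A', 'B'] (A totals 5, B totals 3)
```

Note the disadvantage from the slide in miniature: summing ranks assumes a rank of 2 expresses exactly twice the preference of a rank of 1, which voters rarely mean.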
Veto
• Group reaches a decision by some other method
• Group leader (or set of co-leaders) can veto
• Advantages
• Enables maintenance of confidentiality for sensitive issues
• Mitigates risk of groupthink in the larger group
• Disadvantages
• High risk of alienation of group members
• Risk is present even if veto isn’t exercised
• “Technical veto” is somewhat safer
86
Expert subgroup
• Delegate decisions to an expert subgroup
• Subgroup uses pattern of its choice to reach decision
• Larger group must adopt the result
• If subgroup’s decision isn’t binding, then decision wasn’t actually delegated
• Advantages
• Addresses risk of slow decisions
• Best for decisions that require expertise or study
• Disadvantages
• Flaws and alternatives might not be fully considered
• Increased chance of blockage by a minority
• Possible difficulty selecting members fairly
87
Delphi method
• Technique for gaining convergence of opinion
• Best suited for forecasting
• Panelists participate in rounds
• Anonymous to each other
• In each round, panelists submit their views
• Facilitator assembles a summary
• Distributes to panel
• Repeat
• Rounds end when:
• Convergence is achieved, or
• Set number of rounds is completed, or
• …any other criteria
88
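For numeric forecasts, the round structure above can be sketched as a loop. This is an illustrative sketch, not a canonical Delphi implementation; the convergence test (relative spread of estimates) and the callable-panelist interface are assumptions:

```python
import statistics

def delphi(panel, max_rounds=5, tol=0.05):
    """Run anonymous rounds until estimates converge or rounds run out.

    panel: callables that take the previous round's summary (None in
    round one) and return a revised numeric estimate.
    """
    summary = None
    for _ in range(max_rounds):
        estimates = [expert(summary) for expert in panel]
        summary = statistics.median(estimates)  # facilitator's summary
        spread = statistics.pstdev(estimates)
        if summary and spread / abs(summary) < tol:
            break  # convergence achieved
    return summary

# Hypothetical panelists who revise halfway toward the summary each round:
def make_expert(initial):
    state = {"est": initial}
    def expert(summary):
        if summary is not None:
            state["est"] = (state["est"] + summary) / 2
        return state["est"]
    return expert

panel = [make_expert(x) for x in (90, 100, 140)]
print(delphi(panel))  # converges to 100
```

The anonymity the slide calls for lives outside the loop: panelists see only the facilitator's summary, never each other's submissions.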
Critical Thinking
• Dictionary definition: disciplined thinking that is clear, rational, open-minded, and informed by evidence
• Skills required:
• Mastery of logic
• Forming cogent arguments
• Gathering evidence
• Evaluating evidence
91
Skills and anti-skills for critical thinking
• Skills:
• Know the Ten Management Fallacies
• Understand how rhetorical fallacies work
• Anti-skills
• Skills related to avoiding traps and errors
• Detecting and avoiding:
• Flawed logic
• Invalid “evidence”
• Malformed arguments
• Dealing with use of these techniques by others
92
Ten management fallacies
• Fallacy of Positivism
• Bad Actor Fallacy
• Naturalistic Fallacy
• Culturalistic Fallacy
• Fungibility Fallacy
• Linearity Fallacy
• Normative Fallacy
• Availability Heuristic
• Grandiosity Fallacy
• Invulnerability Fallacy
93
Ten fallacies trip us up when we deal with common management issues
Fallacy of Positivism
• The fallacy:
• If we believe we can accomplish something, we’re more likely to actually accomplish it
• If we express doubts, we’re less likely to succeed
• Tempting to leaders who want to motivate
• But Truth is more important
• Be positive when it’s appropriate
• Express doubts when they’re real and relevant
94
Either staying positive or expressing doubt inappropriately can lead to catastrophe
The Little Engine That Could, by Watty Piper
Bad Actor Fallacy
• The fallacy: Repeated patterns of dysfunction are always due to a single team member
• This belief is tempting because it suggests a simple solution
• Sometimes it’s valid; usually not
• Team performance is determined by:
• The team
• The organization in which that team is embedded
95
Osama bin Laden
Naturalistic Fallacy
• The fallacy: Professional credentials—experience, education, seniority, or past performance—are equivalent to abilities
• Related to the Fundamental Attribution Error
• Judgments based on credentials are risky:
• They ignore past prevailing context
• Past context might have played a significant role
96
To assess capabilities, consider both credentials and past context
Ferdinand de Lesseps
Culturalistic Fallacy
• The fallacy:
• Organizational leaders create high-performance teams
• People of that team don’t
• Indicators:
• Credit for high performance tends to flow to leaders
• Blame for dysfunction tends to flow to team members
• The most-likely truth:
• Any one person can undermine a team’s performance
• No single person is responsible for creating high performance
• External factors certainly contribute
• A team’s performance is most directly due to the choices of the team
97
Bill Belichick
Fungibility Fallacy
• The fallacy:
• Every person produces one hour of output in one hour
• We can substitute people for one another
• The most-likely truth:
• Often, only a few people can perform certain tasks
• Tools that account for this are difficult to use correctly
• Indicators: Use of terms such as man-month, headcount, FTE.
• Running “lean and mean” worsens the problem
• Counting delays and lost sales, running “fatter and kinder” is usually more profitable
98
An early assembly line
Linearity Fallacy
• The fallacy: Human effort required to execute a project scales in proportion to project size or total budget
• The most-likely truth:
• Complexity grows combinatorially with size of effort
• Operating costs per unit output grow rapidly with size of effort
• Costs decline unexpectedly slowly as effort shrinks
• We have difficulty abandoning control processes as we reduce size
99
Pack ice in the Ross Sea,Jan. 2004. Photo credit:Edmund Stump, NASA
Normative Fallacy
• The fallacy: When we poll people and most of them agree, they’re correct
• The most-likely truth:
• Usually we select people non-randomly
• We choose those who will give us desirable answers, or those we trust, or those of high rank
• Non-random polling almost always leads to biased conclusions
• To get truly useful polling data, you must poll people randomly
100
George Gallup
Availability Heuristic
• The fallacy: We can estimate the probabilities of events intuitively
• The most-likely truth:
• The Availability Heuristic is a cognitive bias
• When we estimate probabilities by sensing the difficulty of imagining the events, we usually guess wrong
• Estimating probabilities intuitively is unlikely to produce reliable results
• Use real data, or use huge error bars
101
Oceanic Whitetip SharkCredit: Michael Aston CC BY-NC 2.0
Grandiosity Fallacy
• The fallacy:
• We can address a generalization of our problem instead of the problem
• That way we solve the original problem almost “for free”
• The most-likely truth:
• The Grand Solution is often more expensive and time-consuming than originally estimated
• People rarely want or need the general solution
• Customers usually want only what they asked for
• Work with customers on what they want first
102
Messerschmitt 262Jet Fighter
Invulnerability Fallacy
• The fallacy:I am personally pure—I never use fallacies
• The most-likely truth:
• You’re human
• You do use fallacies
• The trick: Get good at catching yourself when you slip on a fallacy
103
Rhetorical fallacies
• Any of dozens of misleading debating techniques
• If intentional: deceptive
• If accidental: incompetent
• All fallacies carry risk. They:
• Cloud reasoning
• Lead to bad decisions
• Waste time
• Generate toxic conflict
104
Examples of rhetorical fallacies
• Straw man
• Ad hominem
• Slippery slope
• False dichotomy
• False cause
• Nominal fallacy
• Misleading vividness
• Begging the question
105
Straw man
• Technique: exaggerate the advocate’s premise, then refute the exaggeration
• Warning signs:
• Feeling the need to say “I never said…”
• Someone characterizes a position, then draws inferences
• When you see a position being characterized:
• Call a halt
• Check that everyone is OK with the characterization
106
Ad hominem
• Attacking the advocate, not the position
• Many types of ad hominem attacks
• “Your estimate was wrong before, so this one is probably wrong too”
• Defending yourself against ad hominem is very difficult
• When you see an ad hominem attack on someone else:
• Call a process check
• Get consensus on whether it is ad hominem
• If it is, backtrack
107
Slippery slope
• Technique:
• If we accept your premise, then we’d have to accept my exaggerated form of your premise
• Usually the exaggerated form is scary
• “If we include these fixes, we’ll have to include the whole B-2 list, and we’ll be a year late”
• Check carefully: Is the conclusion correct?
108
The slippery slope is an appeal to fear
False dichotomy
• “Black-and-white” thinking
• The only solution to a problem is an extreme and over-simplified path
• Often stated elegantly—that’s part of the deception
• “You’re either part of the solution, or part of the problem”
• “If we don’t resolve this now, we never will”
109
False cause
• Mistaking proximity for causality
• Correlation isn’t cause
• Time sequence isn’t cause
• Example:
• Every project you’ve managed has been late and every other project has been on time
• Neglected to mention: you get all the high risk projects
110
This one is especially insidious
Example: False cause rhetorical fallacy
111
TylerVigen.com
Nominal fallacy
• Confusing the naming of something with explaining it
• Q: Why are his projects always over budget?
• A: Because he’s a bad project manager
• Indicators
• Risk plans that identify risks, but offer only weak responses
• Project plans that identify tasks, but have gaps in describing who will execute them and how
112
Misleading vividness
• We evaluate arguments, in part, on the ease of imagining their elements
• If the elements are vivid, we’re more likely to use heuristics, rather than logic
• Example: “I wouldn’t take a customer to lunch there. Remember when Grant got sick? I heard it was the sour cream on their baked potatoes.”
113
Begging the question
• Term is often misused—it doesn’t mean “raising the question”
• It refers to using one unproven assertion to “prove” another
• Boss: “Jean, Mark says you’re bullying him. I want it stopped.”
• Jean: “I certainly am not bullying anyone.”
• Boss: “Then why does Mark say so? Stop it, or I’ll have to take action.”
114
The fallacy of composition
• Make statements about some parts of a whole (or even every part of a whole)
• Then conclude something about the whole
• Examples:
• We can make up time and get back on schedule if Tim works weekends. So it’s probably best if everyone works weekends until the deadline.
• We’ve found serious problems in the proposal. The Localization budget is too low, and the schedule for customer extensions is too aggressive. You need to rethink the whole thing.
115
Dealing with rhetorical fallacies
• Don’t try to educate in the midst of debate
• It might seem condescending
• It might feel like an attack
• Educate in advance
• Do it in small doses
• Give each fallacy an easy-to-remember name
• Don’t try to cover too many (there are dozens)
• Address the ones people use
• Examine their costs
• Institute a “logic check” for use in meetings
• If the meeting is hot, take a break first
116
Group decision-making dysfunctions
• Groupthink
• Group polarization
• False consensus
• Abilene paradox
• Group narcissism
• Group-serving bias
• Digging in
• Currying favor
• Online disinhibition effect
• Sabotage
• Retribution
• Two more causes of impasses
• Lock-in
• Shared information bias
• Pluralistic ignorance
117
Groupthink
• Groupthink is a pattern that results in bad decisions
• Can be present in degrees
118
The more amiability and esprit de corps there is among the members of a policy-making ingroup, the greater the danger that independent critical thinking will be replaced by groupthink, which is likely to result in irrational and dehumanizing actions directed against outgroups. —Irving Janis
Eight attributes that indicate Groupthink
• Overestimations of group’s power and morality
1. Invulnerability illusions: excessive optimism, risk taking
2. Unquestioned belief in group’s morality: members ignore the consequences of their actions
• Closed-mindedness
3. Rationalizing warnings that challenge assumptions
4. Stereotyping those opposed to the group as weak, evil, biased, spiteful, impotent, or stupid
• Pressures toward uniformity
5. Censorship of ideas deviating from group consensus
6. Illusions of unanimity—silence is viewed as agreement
7. Direct pressure to conform, couched in terms of “disloyalty”
8. Mindguards—self-appointed members who filter info
119
Tactics that can lead to Groupthink
• Taking dissenters and/or dissent off line
• Ejecting or shunning dissenters
• Excluding new members because they might dissent
• Thought policing
• Expressing contempt for opponents or competitors
• Openly stated
• Accepted without dissent
• Emphasizing being a “team player”
120
Group polarization
• This is not the tendency to divide into opposed factions
• In groups:
• We tend to adopt positions more extreme than our individual positions would be
• Individual members’ views also become more extreme during discussion
• Also known as “risky shift” or “cautious shift”
• Less likely in virtual discussions
• More likely when all members are initially similarly inclined
• Manage group polarization risk by creating Designated Skeptic roles
121
(cc) Roger McLassus
Group polarization in teams
• Most examples of GP relate to public policy, terrorism, or violence
• But GP can be important in teams
• GP is characterized by:
• Increased risk appetite
• Reduced positional nuance
• Reduced positional diversity
• Reduced appreciation for ambiguity
• Reduced appreciation for situational complexity
122
Group Polarization can lead to adoptingpositions that substantially enhance risk
Abilene paradox
• Happens when we all agree to something none of us wants
• Because we believe everyone else wants it
• Because we’re all trying to please each other
• Distinguished from agreeing for other reasons:
• As part of a deal
• Under duress
• …many more
123
If everyone knows about the Abilene Paradox, you can call an “Abilene Check” at any time
False consensus
• A cognitive bias
• Tendency to overestimate alignment between personal views and views of others
• Tendency to regard dissenters as defective
• Consequences for groups
• Leads members to overestimate degree of agreement
• Related to Abilene Paradox
• The effect disappears when people can easily ascertain the views of others
• Open forms of voting limit the effects of false consensus
124
A scene from the 1943 film, The Ox-Bow Incident
Group narcissism
• Members of a group have inflated love of the group
• Group and its members:
• Demand external validation of their superiority
• Demand attention, acknowledgement of status
• Believe that the group doesn’t get all it is due
• Associated with
• Aggression against other groups
• Unrealistic perceptions of threats from other groups
125
Group narcissism can cause groups to makepoor decisions relative to issues of competition
Group-serving bias
• Tendency to:
• Adhere to disparaging explanations for successes and failures of members of out-groups
• Attribute to disposition the failures of members of out-groups
• Attribute to external factors the successes of members of out-groups
• The reverse happens when explaining successes and failures of in-group members
126
Group-serving bias can contribute to underestimating the strengths of competitors’ products
Digging in
• One of several phenomena that cause impasses
• People become publicly committed to a position
• They fear altering their positions because of:
• Humiliation
• Criticism by rivals
• How to avoid this:
• Keep an open mind yourself
• To help others, propose a debate halt
• Resume only after all agree to temporarily adopt a debate opponent’s position
• Position-swapping creates understanding
127
Currying favor
• Another cause of impasses
• Advocates might or might not have made commitments to anyone
• But they advocate positions favored by the powerful (or those they believe are powerful)
• Hope to gain recognition
• No prior quid pro quo necessary
• Persuading these people on the merits isn’t likely to succeed
• To persuade them, address the currying-favor strategy
128
Online disinhibition effect
• Behavior in the virtual environment is unruly
• Likely to lead to toxic conflict and bad decisions
• Reasons:
• Weakened connection between our personhood and our actions
• Blurred distinctions between levels of authority
129
We can gain a measure of control byeducating everyone about this phenomenon
Sabotage
• Some dissenters aren’t seeking issue resolution
• Their goal: prevent the group from reaching any decisions at all
• Examples of possible motives:
• They believe that anything the group might decide would be unacceptable
• Demonstrate fecklessness of the group’s leadership
• Debating the issues with saboteurs is futile with respect to resolving the issues
• Debate can be useful if it reveals saboteurs’ true goals
130
Retribution
• Yet another cause of impasses
• Some dissenters might feel:
• They’ve been badly treated in the past
• A need for revenge by blocking forward progress
• Addressing their objections at face value isn’t likely to work
• Instead, address the past hurt
• Acknowledge it
• Seek symmetrical understanding
• Privacy and discretion are required
131
Two more causes of impasses
• Some group members make confidential agreements in exchange for adopting a position
• Some group members are under pressure from external sources
• Persuading them on the merits won’t work
• What might work:
• Address the content of the confidential agreement
• Address the nature of the external pressure
• Both are difficult
132
Lock-in
• The tendency to adhere to a decision despite the existence of superior alternatives
• Indicators:
• Escalating commitment: increasingly irrational desire not to abandon the decision
• Sunk cost effect
• The prototype (or draft) becomes the product
• Premature rejection of alternatives
• Irrational compulsion to reach closure
• Path dependence: we stay with the decision because of how we reached it
• Compulsion to justify the current course of action with success
133
Tactics for lock-in
• Ask:
• If we were starting over, would we choose this again?
• Are we staying with this approach because we lack the knowledge required for an alternative?
• Are we staying with this approach for personal reasons?
• Justifying the status quo requires comparing it to at least two alternatives
• Costs
• Benefits
134
The Embarcadero Freeway, haltedby the California Freeway Revolt
Shared information bias
• Tendency for groups to allocate:
• Too much time to discussing areas all members already know about
• Not enough time to discussing areas only some members know much about
• Results:
• Important issues remain unaddressed
• Time and resources wasted
• Decision quality put at risk
• Risk of alienation of some group members
135
Pluralistic ignorance
• Contradictory beliefs:
• Group members privately reject a position
• Simultaneously and incorrectly they believe that almost everyone else accepts it
• Members decline to voice objections
• They feel that doing so is pointless
• They misinterpret the positions of other group members
• Closely related to Abilene Paradox
• Decision can be adopted while everyone has misgivings
136
Homework:Classifying group dysfunctions
• One possible categorization:
• Sins of Commission
• Sins of Omission
• Sins of Imprecision
• Homework:
• Classify the group dysfunctions accordingly
• How many of these dysfunctions can appear in multiple categories? Under what conditions?
137
Group Leadership for Decision-Making
Closing
138
Gene Kranz, Apollo flight director
Following the Apollo 1 accident:
139
From this day forward, Flight Control will be known by two words: Tough and Competent. Tough means we are forever accountable for what we do or what we fail to do. We will never again compromise our responsibilities…Competent means we will never take anything for granted… Mission Control will be perfect. … These words are the price of admission to the ranks of Mission Control.
Resources and References
Last words
141
Recent publications
• “Leading in the Time of Data Breaches”
• Cutter IT Journal, August, 2014
• Deals with decision-making in cyber security
• Download a free copy at: http://goo.gl/H9QfrK
• “Creating High-Performance Virtual Teams”
• Cutter IT Journal, May 2013
• Fourteen recommendations for enhancing performance of virtual teams
• Download a free copy at: http://goo.gl/B4PCn
142
Subscribe to my free newsletter:Point Lookout
• Weekly email newsletter
• 500 words per edition
• Topics:
• Communications
• Meetings
• Project management
• Managing your boss
• Change
• Workplace politics
• Conflict
• …and more
• To subscribe, use the form at the end of the handout, or:
143
More info: http://www.ChacoCanyon.com/pointlookout
Resources
• Archive of my newsletter: http://www.ChacoCanyon.com/pointlookout/politics.shtml
• Tips book 303 Secrets of Workplace Politics
• Acrobat: http://goo.gl/7hG5g
• iTunes iBook: http://goo.gl/QaRVgx
• Discussion group at LinkedIn: http://goo.gl/I7zhqi
• Links collection: http://goo.gl/f8Aoea
• Follow Rick on Twitter: @RickBrenner http://www.Twitter.com/RickBrenner
• Connect with me on LinkedIn: http://LinkedIn.com/in/RickBrenner
144
Tell Me About This Presentation
Chaco Canyon Consulting Rick Brenner www.ChacoCanyon.com 866-378-5470 [email protected]
May I please have and use a quote from you about this presentation? Thanks!
OK to use my name OK to use my title OK to use my company name
What did you like best about the presentation?
What ideas will you use first?
Optional: Name:
Position:
Phone:
To receive my free newsletter: Email: (BLOCK CAPITAL LETTERS)
Did you use BLOCK CAPITAL LETTERS?
My major source of business is through referrals. Do you know of a company, business organization, or association that could benefit from state-of-the-art teamwork and better relationships between people? Or could benefit from a presentation or seminar on topics like this one? Thank you! (if you don’t have all the info, fill in what you have and I’ll get the rest somehow)
Referral Name:
Position and company:
Phone:
Email:
(If you would like me to call)