by Greg Hutchins

How Not to Make Bad Risk Decisions


Guest Post by Ed Perkins (first posted on CERM® Risk Insights – reposted here with permission)

Why are there “bad” decisions? No one sets out to deliberately make a bad decision. Yet if you look into the available thought papers and reports, you can find evidence that helps explain how bad decisions get made.

In 2012, COSO commissioned a report, “Enhancing Board Oversight”[1], focusing on challenges and biases in making professional judgments.

More recently, several HBS faculty authored a study on “attribution error”[2], in which decisions are biased by unjustified attributions of ability to applicants whose results owe more to “luck” than to skill.

COSO Risk Report

The COSO report, authored by KPMG, describes decision making as a five-step process:

  1. Define the problem and identify fundamental objectives
  2. Consider alternatives
  3. Gather and evaluate information
  4. Reach a conclusion
  5. Articulate and document the rationale (for the conclusion)

Step 5, documenting the rationale, is a “quality control” step: if the rationale cannot be clearly articulated, the decision process may be suspect.
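To make the five steps concrete, here is a minimal sketch in Python; it is an illustration only, not anything from the COSO report, and the DecisionRecord fields and quality_check method are assumed names. It records each step and flags decisions whose alternatives, information, or rationale are missing.

```python
# A minimal sketch, assuming nothing beyond the five steps above: the class,
# its field names, and quality_check() are illustrative, not from the COSO report.
from dataclasses import dataclass, field

@dataclass
class DecisionRecord:
    problem: str                                            # step 1: problem definition
    objectives: list[str]                                   # step 1: fundamental objectives
    alternatives: list[str] = field(default_factory=list)   # step 2
    evidence: list[str] = field(default_factory=list)       # step 3
    conclusion: str = ""                                     # step 4
    rationale: str = ""                                      # step 5

    def quality_check(self) -> list[str]:
        """Flag gaps suggesting the decision process was short-circuited."""
        issues = []
        if not self.alternatives:
            issues.append("no alternatives considered (step 2)")
        if not self.evidence:
            issues.append("no information gathered and evaluated (step 3)")
        if not self.rationale.strip():
            issues.append("rationale not articulated and documented (step 5)")
        return issues

# Example: a hastily made decision fails the quality-control check.
rushed = DecisionRecord(problem="Line 3 downtime", objectives=["restore output"],
                        conclusion="replace the gearbox")
print(rushed.quality_check())
```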

COSO notes that “the judgment tendencies and shortcuts that human beings often rely on can short-circuit such a process, and as a result, our decisions can be biased.” The authors then enumerate several “traps” and natural tendencies that can bias the decision-making process.

Decision Traps

“Rush to solve” – the pressure to have an immediate solution, without due diligence and care. It offers the ego benefit of appearing decisive, but you can be decisively incorrect as well as decisively correct. It short-circuits step 1 of the process: figuring out what the problem really is so that the optimal solution can be found.

“Judgment trigger” – aka the “incomplete problem” – the decision to be made is presented as a compelling solution, but the real problem to be addressed and the objective are not stated, leading to a lack of diligence in considering alternatives (step 2). Judgment triggers are best addressed by inquiring into the “what and why” of the proposed solution.

“Incorrect framing” – this trap affects the gathering and evaluation of the information needed to reach a conclusion (step 3). Frames are the mental perspectives used to consider and evaluate the information gathered to reach a decision; they shape how that information is understood and interpreted. How an issue is framed can affect, and bias, risk assessment and risk appetite. Risk may lead to reward, but it can also lead to failure.

A recent post on LinkedIn attributes bad decisions to “framing error”[3] and provides a slideshow “course” on framing and its role in decision making.

Natural Tendencies

Overconfidence – decision makers are prone to overestimate their ability to gather information and make accurate assessments of risks or other factors in reaching a conclusion. This is a subconscious tendency, rooted in personal motivation or self-interest, and it persists even when trying to be objective. It shows up as mis-estimating outcomes or likelihoods when evaluating risks.

Confirmation tendency – decision makers do not seek out objective evidence but rather look for confirmation of their initial beliefs or their preferred conclusion. Sort of like “stacking the deck”.

Anchoring – some initial piece of information is used as a starting point and only incremental adjustments are made, regardless of whether the starting point is reasonable or the adjustments relevant. For example, reusing and editing an existing document rather than creating a new one: the new document is constrained by the structure and content of the original.

Availability tendency – decision makers weight easily retrievable information, say from memory or recent subjective experience, as more likely, more relevant, and more important for making a judgment than objective evidence. This can be either a positive or a negative bias, depending on the memory or recent experience.

Attribution Error

In their study, “Inflated Applicants: Attribution Errors in Performance Evaluation by Professionals”[2], the authors note in their abstract:

When explaining others’ behaviors, achievements, and failures, it is common for people to attribute too much influence to disposition and too little influence to structural and situational factors. We examine whether this tendency leads even experienced professionals to make systematic mistakes in their selection decisions, favoring alumni from academic institutions with high grade distributions and employees from forgiving business environments. We find that candidates benefiting from favorable situations are more likely to be admitted and promoted than their equivalently skilled peers. The results suggest that decision-makers take high nominal performance as evidence of high ability and do not discount it by the ease with which it was achieved. These results clarify our understanding of the correspondence bias using evidence from both archival studies and experiments with experienced professionals. We discuss implications for both admissions and personnel selection practices.

A fundamental principle in the study is “correspondence bias”, first noted by psychology researchers in the 1970s. It occurs when decision-makers “forget” the situational factors that contribute to the performance or results of an alternative; the alternative can be a person (candidate), a product, or a theory. As the authors note, organizations and managers often assume “that performance in one domain will predict performance in another domain”. They cite a study finding that CEOs who are good at golf receive higher compensation, as though golfing skill implied executive skill; in reality it does not. An underlying issue is the difficulty of having visibility into both the performance and the situational environment behind it. The authors conducted experiments on admissions, selection, and promotion, reviewed actual decisions, and concluded that correspondence bias and attribution error are a factor in evaluation decisions.

As a result of correspondence bias, organizations may reject well-qualified candidates while systematically selecting lower-performing ones. The consequences can be substantial.
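To see why the discounting matters, here is a hedged numerical sketch in Python; the candidates, grade distributions, and z-score correction are illustrative assumptions, not the authors’ method or data. Ranking on nominal GPA favors the candidate from the forgiving environment, while discounting by the local grade distribution reverses the order.

```python
# Illustrative sketch only: hypothetical candidates and grade distributions,
# not data from the study [2]. It contrasts ranking by raw ("nominal") GPA with
# ranking after discounting by how easy that GPA was to achieve locally.
candidates = [
    # (name, gpa, institution mean gpa, institution std dev)
    ("A", 3.8, 3.6, 0.2),  # high GPA from a lenient grade distribution
    ("B", 3.5, 3.0, 0.3),  # lower GPA from a stricter distribution
]

def nominal(c):
    return c[1]                       # raw GPA, as a naive evaluator would use

def situation_adjusted(c):
    name, gpa, mean, std = c
    return (gpa - mean) / std         # z-score within the candidate's own institution

print(max(candidates, key=nominal)[0])             # "A": wins on nominal performance
print(max(candidates, key=situation_adjusted)[0])  # "B": wins once the situation is discounted
```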

Conclusion

As we have seen, here is the way to make a bad decision: start with an agenda, be so confident that you don’t bother fully defining the issue or setting any objectives, set a tight deadline, look for easily available evidence that supports your preconceived notions or select a “star” solution that obviously will work in your situation, and voilà! To make things worse, don’t consider any risks associated with the decision and ignore any consequences.

References

[1] Enhancing Board Oversight by Avoiding and Challenging Traps and Biases in Professional Judgment. The Committee of Sponsoring Organizations of the Treadway Commission (COSO); March 2012. http://www.coso.org/documents/COSO-EnhancingBoardOversight_r8_Web-ready%20(2).pdf

[2] Inflated Applicants: Attribution Errors in Performance Evaluation by Professionals. Samuel A. Swift, Don A. Moore, Zachariah S. Sharek, Francesca Gino; July 2013. http://www.plosone.org/article/info:doi/10.1371/journal.pone.0069258

[3] Coach’s Guide to Framing (“Inappropriate framing is the root cause of most bad decisions.”). Baker Street Publishing; September 2013. http://www.slideshare.net/barrager/coaching-guide-to-framing


About Greg Hutchins

Greg Hutchins PE CERM is the evangelist of Future of Quality: Risk®. He has been involved in quality since 1985, when he set up the first quality program in North America based on MIL-Q-9858 for the natural gas industry. MIL-Q-9858 became ISO 9001 in 1987.

He is the author of more than 30 books. ISO 31000: ERM is the best-selling and highest-rated ISO risk book on Amazon (4.8 stars). Value Added Auditing (4th edition) is the first ISO risk-based auditing book.
