Accendo Reliability

Your Reliability Engineering Professional Development Site

by Christopher Jackson

Find Me the Statistics that I Like to Believe the Most …

It looks like 2023 will be the hottest year on record, on top of all the cyclones, hurricanes, floods and bushfires we have already had. Those who study climate change and take it seriously overwhelmingly agree that man-made changes to the environment are causing the climate to change so fast that Mother Nature will struggle to keep up.

And for the minority (yes, it is a minority, as surveys and studies across the world have repeatedly confirmed), the arguments against climate change go something like this …

It might not be because of us …

… so it’s definitely not because of us.

That’s right. Whatever Twitter is called now is full of amateur opinions about the ‘natural’ cycle of temperatures and climate over millions of years, and how this is just another part of that. Of course, it is only in the last 200 years or so that the atmosphere has had to deal with ozone-depleting chlorofluorocarbons, carbon dioxide from all our vehicles and power plants, the acid rain that kills the plants that make oxygen, the ruthlessly efficient removal of even more plants and forests, plastics in the ocean as far as the eye can see, and so on. And it is only in the last 200 years that the rate of change has been so pronounced that things that are usually resilient (think corals) simply cannot adapt quickly enough to the rising temperatures in the water.

We often hear of this thing called ‘confirmation bias.’ That is, we tend to believe only the statistics that support our pre-existing beliefs. And we also know that, again thanks to whatever Twitter is called now, there is a never-ending stream of statistics. Some of them are even valid. So we are forced to choose which statistics to take seriously.

But confirmation bias is a normal human behavior. There is nothing inherently wrong with it. For example, let’s say that you have a bird feeder in your backyard. You put bird seed in it to attract beautiful songbirds and make your backyard a more beautiful place to sit. But there is a problem. Your bird seed is going missing. More often than not, all the bird seed you put out every morning is gone by lunchtime. And that cannot be down to the birds alone.

So you borrow a camera – like a ‘trail cam’ that hunters use. The camera is designed to take pictures every time it senses movement. And so you point this camera at your bird feeder. For ten days it takes pictures. And after ten days, you have to return your trail camera to the person you borrowed it from. And now you have a chance to look at the pictures.

Sure enough, there is a pesky squirrel (let’s call him ‘Cyril’) that regularly appears, chases the birds away, and gorges on your bird seed himself. The next day, you fill up your bird feeder in the morning. And lo and behold, the bird seed is gone by lunchtime.

Technically, if you pin this on Cyril the Squirrel, then you are guilty of confirmation bias. The loss of bird seed could plausibly be down to high winds, a broken bird feeder, an unusually high number of birds, and so on. But it is not wrong to blame Cyril for the loss of bird seed on the eleventh day … given you have confirmed that he was behind it on the previous ten.

So confirmation bias serves a useful purpose. True, it can go too far. But it is very human to look back on your experiences (which have this annoying tendency to motivate you to form opinions) and then scrutinize statistics that violate your beliefs more vigorously than statistics that confirm them. We can’t spend every moment of every day examining every idea, opinion or conclusion exhaustively. So we need to choose which claims are worthy of skepticism and which are not.

But today, we have something more sinister at hand. The concept of ‘identity politics’ touches on this, but it doesn’t explain the whole story.

Put simply, humans tend to let emotions (not experience) drive decision making. Let’s say that twenty years ago you had a bad experience with a ‘militant environmentalist’ that motivated you to join an online forum where there appeared to be a valid questioning of the claims of climate change. There was nothing sinister or overwhelming about this choice. It is just that you were yelled at for no apparent reason by some lunatic who claimed that ‘you were part of the problem.’ And this really ruined your birthday. This felt wrong to you, and you wanted to look to others to see if they felt it was wrong too.

But now, you have been an active participant of this forum ever since. You have put so much effort into feeling (and then reinforcing) the sweet sensation of vindication for all these years that it is now borderline impossible for you to leave. Online forum members are even considered to be ‘close electronic friends.’ But the evidence supporting a conclusion of disastrous man-made climate change is orders of magnitude more persuasive now than it was back then. So you and the other members of that forum are finding ever more ‘fringe’ statistics, from whatever source, that keep the forum and the social support structure it brings alive.

In this scenario, the existing belief is not so much based on a detailed study of the facts. It is instead a necessary foundation for personal and professional validation amongst the ‘peers’ of the forum. If your opinion changes, you lose membership of the forum. So your opinion doesn’t change. And you have lots of ‘statistics’ to back this up.

We see emotions dress up as reason everywhere.

Take engineering, for example: a discipline with a reputation for objectivity and thorough investigation. We routinely see seemingly smart men and women make bone-headed decisions that result in unmitigated disaster. Boeing deciding to halve the amount of money it paid its suppliers (and hoping nothing would change or go wrong). TEPCO ignoring its own risk assessment regarding tsunamis before the Fukushima disaster. NASA and the toxic management culture that unambiguously resulted in two Space Shuttle disasters. The Royal Australian Navy cutting back on maintenance so much that it ‘informally’ decided that ships and boats didn’t need redundant systems that worked when they put to sea.

This is not confirmation bias. These disasters came down to the selfish prioritization of personal goals at the expense of other people’s. Some of these selfish motivations are down to culture. NASA, for example, created a ‘space flight’ culture where people who got in the way of launches were professionally ostracized. This included engineers who tried to explain that the Challenger and Columbia launches were dangerously risky. It also included the ‘bulk’ of NASA’s organization that quickly overruled them. So people prioritized promotion and ‘belonging’ over common sense and science.

But it is not always culture.

The then-CEO of Boeing, Dennis Muilenburg, was the unambiguous boss of the organization. But he chased adulation from shareholders by generating one high short-term profit after another. The problem was that this came at the expense of engineering and physics. So pretty soon, aircraft started falling out of the sky.

These scenarios have been studied. And all these studies essentially point out how wrong the individuals (and in some cases the cultures that cultivated them) were in the lead up to these problems.

But these studies also found something else. They found that the decision-makers behind each disaster always clung to some barely verifiable statistic that justified those boneheaded decisions. Even then, they were so concerned about what they were doing that they needed the security blanket of ‘someone else’s’ numbers to give them the comfort they needed as they ploughed through barrier after barrier towards the precipice of disaster.

The antidote to this is, of course, critical thinking. That takes time and effort, and sometimes pushes us outside our comfort zone. There are lots of tools in our toolbox to try and sidestep the need for critical thinking. We might chase group consensus (where, for some reason, the shared opinion of many fools can feel far more enlightened than the reasoned conclusions of a single person who was motivated to ask questions). But gathering statistics from fools is not research. It is not even a literature review.

So remember that 2023 is on track to be the hottest year since records began. And it is true that the climate has naturally cycled over millions of years. But here is another statistic for you …

… it took the dinosaurs no less than 33,000 years after the asteroid hit the Yucatán Peninsula to die off.

And that is considered a blink of the eye in the geological history of the world. On that scale, 200 years is about the time it takes for a bomb to explode or a car to crash. So when you design the next plane, train or automobile, make corporate decisions, or do anything where the impact of being ‘wrong’ is substantial … use critical thinking, and don’t just gather the statistics of fools.

Filed Under: Articles, on Product Reliability, Reliability in Emerging Technology

About Christopher Jackson

Chris is a reliability engineering teacher ... which means that after working with many organizations to make lasting cultural changes, he is now focusing on developing online, avatar-based courses that will hopefully turn the 'complex' art of reliability engineering into a simple, understandable activity that you feel confident doing (and understanding what you are doing).


© 2025 FMS Reliability · Privacy Policy · Terms of Service · Cookies Policy