by Greg Hutchins

How Less Data Can Give You Better Results

Guest Post by Andrew Sheves (first posted on CERM® Risk Insights – reposted here with permission)

“Hi, I’m Andrew, and I have a weakness for data.”

There, I said it.

I love spreadsheets. I love national statistics. I love primary sources.

I could probably have completed my Master’s dissertation without an extension if I had just accepted that cited quotes were valid instead of looking for all the original sources*. And I don’t need to read the last three years of a company’s annual reports before I have a 20-minute call with them.

I worry less about being distracted by social media than I do about some wonky, data-heavy paper getting me off track.

But having too much data isn’t just a distraction. It prevents us from making good decisions.

Seems counterintuitive, right? Surely the more information we have, the better. Well, as it turns out, that’s only true up to a point. Too much data poses two major problems when it comes to decision-making.

Too much data, too little time

The first problem is relatively straightforward. A surplus of data demands an enormous amount of time to process before you have what you need to make an informed decision. However, time is in short supply, so 1) you only get whatever data there was time to process, not necessarily the data you need, and 2) your decision-making time is curtailed.

Luckily, you can put a more efficient set of filters in place to sift the data and speed this up. That should get you the most pertinent data, quickly. Filtering out the noise like this helps you build understanding and, in turn, aids your decision-making.
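
As a rough illustration, a sifting filter can be as simple as scoring each item and discarding anything stale before you sit down to decide. The Python sketch below is only a toy: the field names, relevance scores, and 90-day cutoff are illustrative assumptions, not a method the article prescribes.

    from datetime import date, timedelta

    def sift(items, top_n=5, max_age_days=90, today=None):
        """Return the top_n most relevant data points no older than max_age_days."""
        today = today or date.today()
        cutoff = today - timedelta(days=max_age_days)
        fresh = [i for i in items if i["as_of"] >= cutoff]      # drop stale items
        fresh.sort(key=lambda i: i["relevance"], reverse=True)  # most relevant first
        return fresh[:top_n]

    data = [
        {"name": "competitor pricing", "relevance": 0.9, "as_of": date(2025, 5, 1)},
        {"name": "unit cost",          "relevance": 0.8, "as_of": date(2025, 4, 20)},
        {"name": "2019 market survey", "relevance": 0.4, "as_of": date(2019, 6, 1)},
    ]
    # Keeps the two fresh, high-relevance items; the old survey is filtered out as noise.
    print(sift(data, top_n=2, today=date(2025, 5, 15)))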

Too much data, too much bias

The second problem is just as significant but much more interesting, because it’s completely counterintuitive.

The more data you have, the worse your decision-making becomes.

However, this isn’t the same as the first issue: the amount of data isn’t overwhelming your ability to use it. Instead, the more information you have, the more confidence you have in your decision-making ability. (In games of chance, this gap is called a confidence–accuracy discrepancy, which I know because I ended up getting sucked into a paper about that. I’m serious, this is a real problem!)

Unfortunately, this increased confidence is coupled with an increase in confirmation bias: the tendency to ignore information that doesn’t support your initial assessment and to place greater emphasis on data that does.

The result is that beyond a certain point, the more data you have, the worse your decisions become. So not only does too much data slow down your decision-making, it also leads you to lower-quality decisions.

Finding the Goldilocks Solution

But here’s the really hard part: how do you know how much information is just right?

Too little data and you can’t make an informed decision.

But with too much data, confirmation bias kicks in and skews your thinking.

I wish I could say that five or seven data points are all you need. Unfortunately, I don’t have a magic number to share.

But I do have a couple of suggestions.

Firstly, think about the fundamentals of whatever issue you are considering. Manufacturing costs, product positioning, and competitor pricing might be enough to make some decisions on how to price your new product. You might not need the dozens of other data points.

Secondly, use data that is at the right scale for what you’re doing. So, if you’re working at a local level, you can focus on data related to your immediate geographic area and things measured in days, weeks, and hundreds of dollars. At a corporate level, you will need a global overview, be thinking in timelines of months and years, and will probably be OK with values rounded to the nearest $10K or even $100K.

Strategic data used at the local level won’t be detailed enough, and tactical data in the boardroom immediately pulls the discussion down to the tactical level.
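
To make the scale point concrete, here is a tiny Python sketch of rounding figures to the level a given audience needs; the numbers and scales are illustrative assumptions.

    def to_scale(value, scale):
        """Round a figure to the granularity the audience actually needs."""
        return round(value / scale) * scale

    weekly_spend = 12_437.19
    print(to_scale(weekly_spend, 1))       # local level: 12437 (nearest dollar)
    print(to_scale(weekly_spend, 10_000))  # boardroom: 10000 (nearest $10K)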

The golden rule here is that you want to work with the least amount of the highest-quality data you need to make your decision.

The number of decisions you are dealing with has an impact too, so don’t try to boil the ocean. Limiting the decisions you’re trying to make also limits the amount of data you need to process. Targeting the top three risks your organization is facing will help keep things manageable and allow you to reach definitive conclusions; trying to tackle the complete, 100-item risk register in one go is an impossible goal.
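
Here is a minimal Python sketch of that triage; the register entries and the simple likelihood-times-impact scoring are illustrative assumptions, not a recommended risk model.

    # Rank the register and keep only the top three risks for now.
    register = [
        {"risk": "supplier insolvency", "likelihood": 4, "impact": 5},
        {"risk": "pipeline spill",      "likelihood": 2, "impact": 5},
        {"risk": "cyber attack",        "likelihood": 3, "impact": 4},
        {"risk": "office lease lapse",  "likelihood": 1, "impact": 2},
    ]

    top_three = sorted(register,
                       key=lambda r: r["likelihood"] * r["impact"],
                       reverse=True)[:3]

    for r in top_three:
        print(f'{r["risk"]}: score {r["likelihood"] * r["impact"]}')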

Less is more (for humans anyway)

Don’t get me wrong, there’s definitely a time and place for big data.

We benefit from AI’s ability to number-crunch mind-numbing amounts of data to predict the weather, make medicine safer, and help us find the house that matches our needs. However, this is only possible by removing humans from the loop: we can’t process large amounts of data, and our decision-making can be skewed. So, assuming the underlying algorithm and parameters are sound and ethical, and the computers are fast enough, we can expect good results.

But a ‘more is more’ approach doesn’t work for humans.

It’s too easy for us to become overwhelmed, or to develop false confidence in our decision-making ability, when we try to manage large amounts of data ourselves. We can’t (and shouldn’t) outsource all of our decisions to machines, so we need that ‘Goldilocks’ mix: the least amount of the highest-quality data you need to make your decision.

This is a time when less can definitely be more.

*Managing my time better would also have helped, but you get the idea.

Andrew Sheves Bio

Andrew Sheves is a risk, crisis, and security manager with over 25 years of experience managing risk in the commercial sector and in government. He has provided risk, security, and crisis management support worldwide to clients ranging from Fortune Five oil and gas firms, pharmaceutical majors, and banks to NGOs, schools, and high-net-worth individuals. This has allowed him to work at every stage of the risk management cycle, from the field to the boardroom. During this time, Andrew has been involved in the response to a range of major incidents including offshore blowout, terrorism, civil unrest, pipeline spill, cyber attack, coup d’état, and kidnapping.

About Greg Hutchins

Greg Hutchins PE CERM is the evangelist of Future of Quality: Risk®. He has been involved in quality since 1985, when he set up the first quality program in North America based on MIL-Q-9858 for the natural gas industry. MIL-Q became ISO 9001 in 1987.

He is the author of more than 30 books. ISO 31000: ERM is the best-selling and highest-rated ISO risk book on Amazon (4.8 stars). Value Added Auditing (4th edition) is the first ISO risk-based auditing book.
