Safety Investigations are Prone to Bias

As humans we are all prone to bias, a tendency to prefer one thing over another. It can actually be a useful characteristic because it allows us to make decisions quickly using a minimum of information. The problem is that it often leads us to make poor decisions or choices.

Two commonly observed examples are “confirmation bias”, where we look for information that is consistent with our expectations, and “availability bias”, where we rely on the most vivid or memorable information.

Because bias is an inherent human condition, it cannot be avoided. People can be trained to be aware of it; design can assist by making sure the most important items or sources of information are presented most prominently; and procedures can be written to guide people towards a more objective evaluation of a situation before they make a decision.

How bias affects safety

Bias can have a significant effect on safety, both in the causes of accidents and in the way we investigate them. Prior to the doomed launch of the Challenger space shuttle, a discussion took place about the potential for ‘O’ ring failure allowing a leak of hot gases. However, there was a lot of pressure to launch on the agreed day: evidence suggesting it was safe to launch was believed, while the counter-argument, which would have resulted in a postponement, was ignored. Prior to the Deepwater Horizon disaster, an explanation for the observed results of the negative pressure test was accepted as demonstrating that the hydrocarbons were being controlled effectively, even though the suggested “bladder effect” was not a recognised phenomenon for this type of test.

People investigating incidents should be aware that they may be prone to bias and that this can have a significant impact on the effectiveness of their investigation. Focussing on the actions of the people present at the incident can mean overlooking the actions of the many other people who will have had some involvement in the system through previous design, construction, operations and maintenance activities. And “hindsight bias” means investigators view the circumstances of the incident with information that only became available after the event and was not available to the people involved at the time.

Some examples of bias in investigation are discussed in the latest edition of the Loss Prevention Bulletin (LPB 264, December 2018), including:

  • the tendency to believe explanations about why an accident occurred from people with perceived authority rather than from the person doing the job at the time;
  • having an inflated view of the effectiveness of procedures in managing risks, leading to the conclusion that an accident occurred simply because someone failed to follow a procedure; and
  • assuming everyone involved in an investigation has the same objectives when in reality different stakeholders (eg business owner, corporate HQ, employees, contractors, unions, local authorities, emergency services) may have their own interests at heart.

In the same edition of LPB the “watermelon effect” is described. This is where people are happy to believe all is well from their safety metrics because the information that is immediately available (the outer shell) is showing green, but they need to dig deeper to discover the hidden red indicators (the inner flesh).

Article by Andy Brazier AMIChemE

Consultant, AB Risk
