Drew Rae: Podcaster

Article by Drew Rae

An earlier explosion aboard Piper Alpha proved a dress rehearsal for the disaster to come

In March 1984 there was a gas explosion aboard the Piper Alpha oil and gas production platform in the North Sea. After an unsuccessful attempt to fight the subsequent fire, the rig was evacuated by helicopter. There were no fatalities, and only a few minor injuries. An internal report by the rig operator, Occidental, was closely held within the organisation, and not even shared with the safety committee. The Department of Energy declined to prosecute, on the grounds that the explosion was caused by a design fault, and was not the fault of Occidental.

The onshore safety superintendent prepared a memo, “How it was vs how it could have been”, arguing that the incident was a near-disaster, saved only by fortunate circumstances. He was accused of painting a worst-case picture, and the recommendations in the memo were dismissed. Piper Alpha had wide safety margins and redundant safety systems. The original design was subject to detailed safety analysis. Unlike most oil platforms at the time, there was an integrated safety committee, and an apparently effective system of training and permits governed all work. Industry assessments suggested that a “catastrophic collapse of safety precautions” in North Sea oil would occur around once every 10,000 years.

“Catastrophic collapse of safety precautions” is almost the exact definition of a disaster provided by sociologist Barry Turner. Turner, like many accident theorists before and after, sought to understand why organisations and societies are incredibly skilled at convincing themselves that the precautions they have taken against major accidents are adequate. He suggested that in a world full of potential problems, it is almost impossible to determine what to truly worry about. “The central difficulty”, wrote Turner, “lies in discovering which aspects of the current set of problems facing an organisation are prudent to ignore, and which should be attended to.”

In hindsight, of course, the answer to this difficulty is always obvious. On 6 July 1988, Piper Alpha was destroyed by explosion and fire, killing 167 people. The explosion in 1984 wasn’t just a lucky escape; it was a dress rehearsal for a major disaster. With what we know now, the disaster was preventable. With Lord Cullen’s report into the disaster, we know – or at least appear to know – all of the precautions that should have been taken. Hindsight would give us incredible power to prevent accidents if we had a time machine, but it is curiously incapable of predicting the future with equal accuracy.

It would be easy to say that the world has learnt the right lessons from Piper Alpha. I could talk about the changes that have been made to safety regulation. I could discuss the widespread adoption of safety cases, and of integrated safety management systems. I could point, as the oil and gas industry sometimes does, to the implausibly monotonic downward trend in lost-time injuries as evidence that these practices are working. I could applaud the ongoing attempts at “cultural improvement” in the offshore oil and gas industry.

Unfortunately, we don’t really know if any of these practices are effective. Oil and gas explosions still happen. When they do, we explain them away as aberrations – isolated failures to conform, to keep up, or to learn the lessons that everyone else has apparently taken to heart. We don’t want to admit that maybe they aren’t exceptions at all. Maybe they are just symptoms of the fact that none of the precautions we take are sufficient to keep disaster completely at bay.

I remember the central lesson that Turner drew from the Aberfan tip collapse, the Hixon level crossing collision, and the Summerland fire disaster. It is a lesson reinforced by Piper Alpha – organisations and societies are incredibly skilled at convincing themselves that the precautions they have taken against major accidents are adequate. If we forget that lesson, then every apparent advance in safety is actually a step backwards. Instead of making accidents unlikely, we are merely making them unthinkable. 

Drew Rae hosts the Disastercast podcast, which he launched in January 2013 to “review scary things and how to stop them happening”. For more information and to listen to his podcasts, including his recording on Piper Alpha, visit: http://disastercast.co.uk/wp/


We have added fresh perspectives each day in the run-up to the 30th anniversary of the Piper Alpha tragedy. Read the rest of the series here.

Drew Rae is manager of the Safety Science Innovation Lab at Griffith University, Australia. He also hosts the Disastercast podcast, which he launched in January 2013 to “review scary things and how to stop them happening.”
