55 years ago this week, a man-made landslide of coal mining by-product killed over 100 Welsh schoolchildren. It was a tragedy. It was preventable. And it got me thinking.
Aberfan. Grenfell. Piper Alpha. The Harrow & Wealdstone Crash. The King's Cross Fire. Hillsborough. Harold Shipman. The Herald of Free Enterprise. Mad Cow Disease. Reading about these disasters, so varied in many ways, commonalities emerge.
1. Disaster has many causes, but systems are at the heart
Tragedy comes from a stunning variety of causes, generally complex ones. Rarely, an individual can be blamed, as with Harold Shipman's one-man killing spree. More commonly the failures are spread out: Hillsborough followed a complex chain of errors, both on the day and in the years of crowd-management strategy leading up to it.
Frequently, the failure is not of a person but of a system. Consider the Piper Alpha oil platform, destroyed with the loss of 167 lives in 1988. The crucial problem was the way repair forms were filed, which meant workers in an emergency couldn't tell that the backup gas pump was out of service. Forced to decide in a matter of minutes, they activated it, causing a gas explosion which, after triggering further fires and explosions, destroyed the platform.
It was a terrible event, and a chastening realisation: It’s entirely possible to cause catastrophe without any malicious or even rule-breaking individuals. Only robust systems can reliably prevent problems.
In Aberfan the immediate cause was lax standards around dumping waste into giant ‘tips’ around the mine. But the wider cause was a culture of avoidance within the mining authority, combined with safety inspectors who lacked practical independence.
Even Harold Shipman's murders were a wider failing than his crimes alone. Had medical authorities systematically reviewed GPs' death rates, dozens or even hundreds of his victims would have been saved.
2. Tragedy tends to be preventable, but that doesn’t always make it foreseeable
In hindsight we can almost always see what could have stopped a tragedy, but that doesn’t mean we could reasonably have seen the danger coming in advance.
As society and technology develop, we create situations we simply have no experience of. The King's Cross Fire killed most of its victims far from the source when it suddenly flared up without warning. Later analysis showed this was caused by a previously unknown ‘Trench Effect’ in fires on shafts at certain angles.
A corollary of this is that disaster can arise from risks which don’t manifest for years or even decades after they are created – making it extremely easy to assume the system is operating safely. The Mad Cow Disease outbreak in the 1990s didn’t follow a sudden shift in feed practices: Adding cow bones to cow feed had been widespread without major incident for at least a century. The delay as authorities realised the practice had suddenly turned deadly cost lives, but it was understandable.
Again, this is a chastening realisation: Deadly risks are particularly easy to miss when they clash with the implicit assumption that long-standing practices must be safe. The only way to recognise problems reliably is to consider every new situation from first principles, a nigh-on impossible discipline to maintain.
3. Tragedy repeats unless we react
After the Harrow & Wealdstone crash, in which a train driver ignored (or missed) signals, causing a devastating three-train collision, the Ministry of Transport commissioned a report, published the following year. As it recommended, new signal alerts were added to trains, and a sharp decline in similar incidents followed.
Today this process has broken down at every stage. Formal investigations are no longer common ground; they are pursued or opposed according to MPs' assumptions about their conclusions. Those conclusions are relentlessly politicised in advance (Grenfell is a classic example, COVID another). The number of inquiries has grown as governments since Blair have used them deliberately as tools of media management, but implementation of their recommendations has tailed off.
Shipman was able to kill because the case of Dr John Bodkin Adams, a very similar man operating in the decade after World War II, didn't prompt the profession to monitor GPs' death rates. Hillsborough came 18 years after the Ibrox crush. Grenfell Tower burned eight years after Lakanal House suffered a virtually identical, though fortunately more limited, fire, and the recommendations of the coroner's inquest were ignored.
We were never perfect learners, but our response now commonly slips across the line from seeking understanding and accountability to caring simply about blame. We have become almost exclusively backward-looking, hunting for villains from the past rather than improvements for the future.
I don’t know the solution to this. Its causes, too, are complex. As individuals we can try to take the emotion out of learning from our (and others’) mistakes. But on a societal scale, the more I read about our past the more I worry about our lack of reaction in the future.
We should try hard to learn better. Lives, and lots of them, are at stake.
A maxim I have followed in previous articles of mine is that as much content on this site as possible should be directly betting related. I have departed from that on this occasion, and hope you will forgive me this indulgence. Betting-related content will return with my next contribution.
Pip Moss posts on Political Betting as Quincel. You can follow him on Twitter at @PipsFunFacts