Learning from Success

The limitations of only looking at failure

The limitations of our traditional approach to learning from what goes wrong

The challenge for everyone involved in incident management and investigation is that we fall victim to a mindset of over-simplifying the causes of disasters. Disaster reports are starting to sound the same. They all tell the tale of a company that put profit before safety. They make recommendations for greater safety leadership, for a safety culture and for a more effective regulator. We analyse things in the same way, make the same recommendations and somehow expect the result to be different next time around.

We do this at a smaller scale too. The imperative in the aftermath of an incident is to minimise its impact: reducing the shutdown time associated with damaged equipment, regulatory notices or industrial action. This often leads to reactive and narrowly focused decision making on corrective actions - a new safe working procedure and training course, for example, is the most popular corrective action. The assumption is that if we identify the cause of the incident, we can simply develop a procedure for addressing it, train workers in the procedure and require them to follow it.

That thinking satisfies regulators, who are more eager than most to move on to the next incident to be investigated. It limits liability, in that it is usually accompanied by a third feature - the implicit or explicit blame of the workers involved, either for needing a procedure or for failing to follow it. It also satisfies the conscience of managers, who feel they have addressed the issue as soon as it came onto their radar. Crucially, it is cost effective: a procedure is relatively cheap compared to an engineering solution.

Of course, if that procedure works and will be followed, then the problem is truly solved and all objectives have been met. In a perfect world, that would be the case, since it is in the best interest of the company to ensure that the procedure is comprehensive and effective.
It is in the best interest of the workers to understand the procedure and follow it; after all, it is there for their health and safety. Unfortunately, the world is not that simple. A number of biases prevent us from understanding what goes wrong. For example, investigations suffer from hindsight bias – we know exactly what happened and how it happened, so it seems obvious to us where the operator went wrong. But if it seems so obvious to us, why did they do what they did? Surely no one goes to work with the intention of hurting themselves.