Learning from Success

around "just culture", a finding of an error, even an expected and understandable one which is caused by system deficiencies, is a negative finding for the individual concerned. Surrounding that single error were a number of steps correctly followed and many controls which operated effectively, both prior to and following the incident. Those effective controls are taken for granted. It is the lack of attention given to those effective controls that undermines the resilience of a system. That vulnerability in the traditional approach is the motivation behind exploring a new technique in the next section.

Why find out about what went right?

What went right in an incident can be just as instructive as what went wrong. Identifying effective control features allows them to be replicated across the system. Controls that work at a local level - that are accepted by operators and fit into other complex systems - are rare, and their effectiveness should be celebrated. That is particularly the case in near misses where, had it not been for those controls, an incident would have occurred. Indeed, even if what went right was not a control at all but a "lucky event", analysing it may be instructive as to the type of controls that might work as a final barrier to the incident's causal trajectory.

The reality is that we have been attempting to learn the negative lessons from disasters since the inception of safety science as a discipline. Major disaster report after major disaster report sets out the facts of the incident and the deficiencies in the system, expresses outrage that society could allow such conditions to exist, and makes recommendations on safety leadership and safety culture, with some specific design recommendations for industry consumption. This was the case in the Columbia, Piper Alpha, Exxon Valdez, BP Texas City refinery, Upper Big Branch and Deepwater Horizon reports, to name a few. The problem with that approach is that it is entirely negative.

If it were that simple to learn the lessons from disasters, surely we would have learnt them by now.

The legal and commercial consequences of failing to do so are very significant globally. We have to assume that most leaders are, at worst, agnostic towards safety. Some may be passionate about it, but certainly none display the psychopathic behaviour that would be required to ignore lessons capable of being easily applied. I have never encountered a managing director who wakes up in the morning wanting to hurt their people, yet even in Australia, which prides itself on its safety standards, on average one person is still killed every working day. Globally the figure is much worse.

The reality is that the lessons from disasters, instructive as they may be, are entirely superficial. Traditional linear incident investigations have limited ability to prevent future incidents because lightning does not strike twice. As Dekker (2011) observes: "Reconstructing events in a complex system, then, is nonsensical: the system's characteristics make it impossible. Investigations of past failures thus do not contain much predictive value for
