FROM THE INDUSTRY

This demands more than compliance. It requires testing whether controls actually work, and whether people can and will use them. Cyber, by contrast, sometimes drifts towards a checklist mentality: a policy exists, so we must be secure; a control is deployed, so the risk must be managed. The outcome-focused approach in H&S offers a useful challenge: it is not enough for a rule to exist – it must be effective. This is the change that UK telecoms has been undertaking among regulated providers over the past two decades.

Behavioural parallels between the disciplines are impossible to ignore. Whether climbing a ladder or deciding whether to bother with multi-factor authentication, behaviour is guided by convenience, perceived risk and habit. As humans, we’ll expend a heroic amount of time and creativity to find the easiest possible way of doing something. Designing systems so that the safe or secure way is also the easiest way is therefore essential. The moment friction appears, people revert to workarounds – a pattern consistent with the DeLone and McLean IS success model (shown below and referenced in Des Ward’s dissertation, Changing the Value Perception of Security in the Enterprise).

This is where a more modern view of safety, often called Safety II, adds value. Traditional safety (Safety I) focuses on preventing things from going wrong. Safety II complements it by asking why things usually go right. It recognises that humans keep systems functioning despite their flaws, not because of their perfection. Cyber has only recently begun to adopt this mindset, understanding that controls must reflect real behaviour rather than tidy procedural assumptions. In practice, intuitive,
That question emerged following a panel at the INCA summit in October 2025, initially framed as a one-way question about the ‘younger’ discipline of cyber learning from the more established world of Health and Safety. Yet it would be hubris to pretend that H&S cannot learn from other areas; in reality, both disciplines face emerging threats that simply don’t respect traditional professional boundaries. Like it or not, we are being pushed into learning together.

Background

H&S in the UK has evolved gradually over more than two centuries. From the early factory laws of the 1800s through to the Health and Safety at Work etc. Act 1974 and beyond, the trajectory has been driven by a simple moral imperative: people should not be injured or killed for earning a living (or because of someone else’s work). The system that developed is mature and largely stable. It expects organisations to identify hazards and evaluate risk properly, reducing it as far as reasonably practicable. It relies on a mature understanding of the assets at risk (i.e. humans), shared norms, safe systems of work and a recognition that people’s personal risk tolerance does not always align neatly with organisational expectations.

Cyber, meanwhile, travelled a different road. Its early motivations were mainly practical – securing the perimeter, as the assets to be managed sat in physical premises and were accessed behind a firewall. Few early cyber security pioneers could have predicted just how deeply digital systems would become woven into human life, and how increasingly they would be accessed in environments and services outside the direct control of the organisation. As this has happened, the focus in UK telecoms has increasingly shifted towards cyber resilience over the past two decades – moving beyond security alone towards the availability of the network and hosted services.
This is because of what failure in telecoms services can mean: it can shut down emergency services, disrupt essential utilities or expose vulnerable people to danger. The consequences may be less visible than in H&S, but they are no less real – COVID-19 is generally accepted to have accelerated technology adoption in the enterprise by around seven years, but this tsunami-level event has left many organisations stranded outside the safety of their traditional perimeters.
Parallel vs Linked problems

When you look at how the two fields meet, two types of problem become obvious. Parallel problems arise independently in each discipline. They may have similar root causes (such as human shortcuts, overly rigid procedures or regulation that can’t keep pace with technology), but fixing them in one domain doesn’t necessarily affect the other. This is in part due to the different understanding and perceptions of the impacts of failure – risk to life is easier to understand than risk to service.

Linked problems, however, occur at the interface between physical and digital systems and need joint thinking. A classic example is a secure access control system, designed with cyber in mind, that ends up delaying an evacuation; or a cyber attack that disables environmental controls or monitoring systems, creating immediate physical hazards for people on site. These are not “your problem” or “my problem”, but common problems requiring shared solutions. The difficulty, of course, is that these linked problems compete for finite organisational budgets under differing risk perceptions.

Human factors, shared learning, and outcome focus: What cyber can learn from H&S

At the core of H&S regulation is an outcome-focused mindset:
• Have we reduced the risk as far as reasonably practicable?
• Can this work be done safely in the real world, not just on paper?
Volume 48 No.2 MAY 2026