Just Culture: Why it is essential that we stop judging & blaming and start looking at context if we are to improve diving safety.

By Gareth Lock

“A failure to learn from mistakes has been one of the greatest obstacles to human progress.”  Matthew Syed

“Pilots shut down the wrong engine: 47 dead” might seem an incredible headline, but it does sell newspapers! In the case of the British Midland Boeing 737-400 which crashed just short of East Midlands airport in the UK in 1989, the cause wasn’t ‘stupidity’ on the part of the crew. Rather, multiple issues came together at just the wrong time, and their combined effects were invisible to the crew: technical issues, communication issues, organisational issues, and human factors issues. Each on its own would not have been catastrophic, but the emergent effect (i.e. 2+2 > 4) had a massive and catastrophic impact. Yet rather than castigate the crew for what appears to be stupidity (I mean, how hard can it be to shut down the correct engine in an emergency?), the ensuing investigation showed that a number of contributory factors were at play, and that blaming the last person to touch the controls wouldn’t help fix the systemic issues and cognitive limitations which led to the crash.

Unfortunately, a similar situation happened last year when an ATR-72 twin turboprop crashed in Taiwan after the captain shut down the wrong engine. It turned out that in the way the airline’s emergency simulations were run, an engine failure immediately after take-off was always a No. 1 engine failure, which conditioned captains to shut down the No. 1 engine instantly. Unfortunately, the co-pilot was not in the communications/confirmation loop to check that the lever for the No. 2 engine (the correct engine) had been selected, and the captain shut down the wrong engine at approximately 1,500 feet. No thrust and no time to resolve the matter meant the aircraft crashed into the river, killing 43 of the 58 onboard. (Link to Wiki Report)

Despite these apparently ‘stupid’ mistakes, aviation has an amazing safety record, and I believe this has been developed at the operator level for three key reasons:

  1. Aircrew (and non-aircrew too!) recognise that they are fallible and that errors happen all the time. An error is seen as an opportunity to learn and prevent future adverse events.
  2. The industry (in the main) has a Just Culture in place which recognises human fallibility and uses a series of tests to determine if a similarly qualified and experienced operator in the same circumstances would make the same error and if similar issues had happened in the past. If the answer is yes, it is likely to be a ‘system’ issue and not an individual issue.
  3. There is legislation in place which protects information provided in an accident investigation from being used in a legal case.

Consider these two examples of similar events.

An airline crew taxied out at night onto the runway of an airfield at a remote island destination which neither of the pilots had been to before. At the end of the runway they were expecting a dumb-bell area of concrete on which to turn their large passenger jet around, as there was no taxiway entry at that end. As they taxied down the runway they saw a large area of clear concrete either side of the runway and not much in front of them, so they thought they were at the threshold; there was limited lighting at the airfield and the location appeared to be correct on their taxi chart. They turned around, lined up, gained clearance and rolled down the runway; nothing appeared wrong at this stage. Crews get used to how long the take-off run takes and roughly how long it takes from ‘rotate’ until they pass the approach lights for the opposite runway. In this case it was a much shorter time than expected, safe but shorter. They made a note of this as they recognised that something wasn’t right. When they were able to, they looked at the airfield on Google Earth and noticed that, approximately 1,000 ft short of the threshold where they should have lined up, there was a circular area of concrete which wasn’t marked on the taxi charts; this was where they had taken off from! They filed a near-miss report and the taxi charts were amended to warn crews of this anomaly. No criticism was received from their operations team.

A crew taxied out to the departure threshold having briefed that they had to leave from the threshold due to the weight of the aircraft and the air temperatures. However, there was a miscommunication about the designation of the taxiway entry point and they entered the runway some 4,500 ft short of where they should have. One of the crew questioned this but was told it was okay. As they ran down the runway with 900 m to go, the crew were concerned they wouldn’t get airborne, but also concerned they wouldn’t stop if they aborted the take-off. Shortly after they got airborne, the aircraft struck the runway approach lights, damaging the belly of the aircraft. However, the collision and damage weren’t detected until they landed in Florida some 13 hours later. Rather than accept the reasons why this happened (cultural and cognitive failures) and take it as a learning opportunity, the airline fired the four flight-deck crew, saying that they would not accept this unprofessionalism and that it would be weeded out.

In the first case there is a strong Just Culture within the organisation, with crews and personnel recognising that they are just one cog in a wider system working together – crucially, they may spot something which helps another crew prevent a catastrophe that would only have been obvious in hindsight. In the second case, what do you think the likelihood of any crew reporting a near miss would be? As Nancy Leveson said, “Blame is the enemy of safety”.

Now let’s consider diving.

Recognition of Fallibility

How often do we talk about our own personal failures, as student and/or instructor, in a diving class? How often does the instructor or instructor trainer really delve into why their own incident occurred, rather than look to blame some piece of equipment or the environment? How often do they hold up their hands and say, “I screwed up. I broke the ‘rules’ to get this dive done.” How often do we congratulate the student for being awesome, rather than ask why it made sense to do what they did when it wasn’t right? When (if!) incidents are reported to agencies, how honest are they likely to be when a phrase such as, “This report is being prepared in the event of litigation” is written at the top of the page? In aviation, crews will report when they have broken the rules because they realise that there might be a systemic issue at play that can only be fixed by reporting it. However, as one diving insurance underwriter said recently, “If your paperwork is 100% complete, then you will be safe” – where ‘safe’ implied protection from litigation rather than operational safety. Just because your paperwork is correct doesn’t mean your operation is ‘safe’; we need only look at Deepwater Horizon for that – the rig crews were being given awards for their excellent OSHA record.

Just Culture

How often, when we read about something going wrong which leads to a fatality, do we jump to conclusions about the cause? The recent double fatality at Eagle’s Nest is a classic example of this. Rumours start flying around: they weren’t qualified or experienced, what were they doing there in the first place…and so on. All without an understanding of what actually happened! Once the horse has bolted, it is very hard to regain the ‘truth’.

So, for example, if we read about an out-of-air situation, rather than jump to the conclusion, “How stupid could they be to run out of gas?”, ask the questions: “How did that diver come to run out of gas?”, “Why were they not monitoring their consumption?”, “How often had they relied on their guide/instructor to do that for them?”, “Were they continually distracted (e.g. by video/photo)?” These questions should be asked in a non-judgmental manner. The answers to them are more likely to provide a way to fix future issues than simple statements like, “Don’t run out of gas” or, “Make sure you monitor your gas”, which don’t really help. That’s like saying, “Don’t touch that, it’s hot” – and people still do. We need to understand the context to fix the problems. Unfortunately, many of the contextual issues reside above the individual diver, and maybe that is why incidents don’t get fully investigated.

Litigation and Protection

“Discovery” is a double-edged sword. Whilst it allows both sides to see what went on, the purpose of discovery doesn’t appear to be learning what happened with a view to fixing the problem; it is about trying to pin the problem on the other party and transfer the liability. Transferring risk to the insurance company so you can get a payout if something goes wrong doesn’t help bring a dead body back! This continual pursuit of legal blame is probably the largest barrier to organisational learning from incidents. Unfortunately, given the litigious nature of modern society, especially in the US, we are always going to struggle to get contextual and honest reporting, but it is essential that we do so.

Learning from Near Misses

This appears to be an obvious thing to do. However, there is a problem with the term ‘near miss’, and it is both technical and social in nature.

First, let’s address the technical aspects. Although the majority of sport diving in the US does not fall under OSHA, its definition is a useful one: “a near miss is an incident in which no property was damaged and no personal injury was sustained, but where, given a slight shift in time or position, damage or injury could easily have occurred”. So those involved have to consider the potential for a risk to materialise. Most of diving is about risk management, with many of the potential outcomes causing injury or death and only a slight shift creating those circumstances! Furthermore, if safety is managing an acceptable level of risk, what is considered acceptable?! Both risk perception and risk acceptance are based on previous knowledge and experience, and we are very poor at determining the probabilities of adverse events occurring because of our cognitive biases. Crucially, we don’t see the things we aren’t looking for. Finally, one of the biggest barriers to learning from incidents in diving is that there is no formally accepted definition of an incident, so what do we report?!

In terms of the social aspects, when our professionalism or competency is threatened, we are liable to put up defences. We don’t want to think of ourselves as incompetent or inept. We don’t want our credibility to be undermined in the eyes of our colleagues, and yet, as I have presented at a number of conferences now, we are all subject to the Dunning-Kruger effect to a greater or lesser degree – “We don’t know what we don’t know, and worse still, we don’t know that we don’t know.” So it should come as no surprise that divers, including instructors and instructor trainers, ALL make mistakes, irrespective of their experience. So why can’t they talk about them? Because the same social media community that allows the rapid sharing of useful and educational information, such as this article, also allows (and sometimes encourages) the criticism and castigation of those involved without understanding the context in which their decisions were made. Furthermore, when those involved have died, there is no way of finding out their side of the story to determine the ‘why it made sense’.

Summary

This is all great, but how do we improve safety and knowledge in diving? To start with, accept that we all make mistakes and that malicious intent or sabotage is a rarity. Further, most divers don’t plan to hurt or kill themselves, so the often-used phrase ‘they chose to take the risk’ is massively flawed. After that, use this definition of Just Culture to frame adverse events: “a culture where divers and instructors/supervisors are not punished for actions, omissions, or decisions taken by them that are commensurate with their experience and training; but in which gross negligence, willful violations and destructive acts are not tolerated.”

Next, to facilitate greater learning when you read about an incident, don’t immediately jump to the conclusion that it was ‘obvious’ that those involved should have seen what was coming and done something about it. Rather, put yourself in their shoes (fins!), given their experience, knowledge and context, not yours, and try to work out why it made sense for them to behave in the manner they did, or make the decisions they did. By looking at ‘how’ an incident occurred, not ‘what or who’, we remove much of the emotion from the discussion. At this point, the real reasons why the adverse event happened will come to the fore. However, if your conclusion uses the term ‘human error’, start again and look a little deeper: human error is only the starting point for those who want to improve the system.

Gareth Lock is a retired RAF senior officer and C-130 navigator with considerable experience in the field of human factors, human error and Just Culture in diving. He manages a small training and research consultancy whose prime aim is to improve the performance of teams and individuals, and to accelerate their learning, by taking lessons learned from aviation and applying them to diving, healthcare and other high-risk environments. He achieves this by running online and classroom-based classes on human factors & non-technical skills. These globally unique classes take the theory of human factors and apply it to diving. More details of the classes can be found at www.humanfactors.academy, which also includes a list of dates and locations for the forthcoming classroom-based classes taking place in the US, Australia and Europe. He can be reached by email at gareth@humaninthesystem.co.uk.
