
“Stop making stupid mistakes. If only they’d follow the rules”

By: Gareth Lock

In 2012, two divers entered the water on their rebreathers for a check-out and set-up dive to 115 ft (35 m). Twenty-seven minutes later, one of them was on the surface, having suffered an oxygen toxicity seizure and been rescued by their inexperienced closed circuit rebreather (CCR) buddy. The victim was given emergency life support, put on O2 and evacuated to a hyperbaric chamber. Despite having been unconscious underwater, the victim made a full recovery with no lasting physical or neurological damage. Subsequent analysis showed that the loop pO2 had reached more than 3.0 and was probably closer to 4.0, and that the cells in the rebreather were 33 months old. The accident happened because of a number of converging factors which were not related to undetectable technical failures, but rather to failures of mental processing and interpersonal skills.
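To put those pO2 numbers in perspective, here is a minimal sketch (not part of the original incident analysis) of what a loop pO2 of 3.0–4.0 implies at that depth. It assumes Dalton's law, roughly 1 bar of ambient pressure per 10 m of seawater plus 1 bar at the surface, and the commonly cited 1.6 bar CNS oxygen toxicity ceiling; none of these figures come from the incident report itself.

```python
# Rough illustrative calculation only - not from the original incident analysis.
# Assumes ~1 bar per 10 m of seawater plus 1 bar at the surface (Dalton's law).

depth_m = 35                      # depth from the incident narrative
ambient_bar = depth_m / 10 + 1    # ~4.5 bar absolute at 35 m
cns_limit_bar = 1.6               # widely cited CNS oxygen toxicity ceiling (assumption)

for loop_po2 in (3.0, 4.0):       # the range reported in the analysis
    implied_fo2 = loop_po2 / ambient_bar   # fraction of O2 the loop must have contained
    print(f"Loop pO2 {loop_po2:.1f} bar at {ambient_bar:.1f} bar ambient "
          f"=> ~{implied_fo2:.0%} O2 in the loop, "
          f"about {loop_po2 / cns_limit_bar:.1f}x the 1.6 bar ceiling")
```

The point of the sketch is simply scale: either value is roughly double the ceiling most training materials quote, which is why a seizure underwater was a very real possibility.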

If your immediate mental response was 'How could they be so stupid as to use cells that old?', 'Darwin Award winner' or similar, you have just demonstrated some of the mental biases and shortcuts (heuristics) that are part of the way our brains are wired for efficiency. These mental shortcuts are how we deal with the complex and ambiguous world we live in, and we use them all the time. Most of the time it doesn't matter; sometimes it is critical.

When we focus on the negative aspects during discussions of an incident, we often miss the positive aspects of human performance in emergency or high-stress/high-risk situations. In this case, the rescuer dealt with surprise and uncertainty and followed what she thought was the best plan given the limited information she had. She rescued the diver by sending them to the surface from 70 ft (21 m) under positive buoyancy as she was losing control of the ascent. The surface team had resources in place and had the training, both technical and 'soft skills', to execute an effective rescue. However, during the debrief they realized they could have done much better, recognizing why the accident had happened and putting measures in place to improve the prediction and prevention of another one. The formal investigation, by contrast, was very much focused on blame and did very little to improve learning within the team; if anything, it created an environment in which failures would not be discussed.

This iterative process of learning from failure, through the analysis of context-rich investigation data and the sharing of all the details, including the violations and rule-breaking, is how high-risk industries have made their environments much safer. Aviation is considered the classic example of high reliability and a 'safe' domain, but it was not always like this. Much of the positive change came from recognizing that humans are fallible and designing 'systems' which take this fallibility into account. Interestingly, in the 1970s aviation accident databases did not contain much information on human performance other than bundling it under the heading of 'pilot error'; that has since changed beyond all recognition for the better. Diving incident reporting systems, unfortunately, are in a similar place now to where aviation was at that time.

Not more rules

Counterintuitively, improving safety doesn't start with more 'operationally-focused' rules. A compliance-based mindset ('follow the rules or else') has been shown to have limited effect in high-risk industries. This is even more obvious when the risk of being caught in a transgression is slim, or when the punishment, if you are caught, has limited impact. Therefore, adding more regulations is unlikely to make much difference to safety. Ironically, the investigation into the Deepwater Horizon disaster showed that focusing on OSHA/HSE-type rules (slips, trips and falls) meant that more systemic issues, which created the environment for the disaster, were missed. Furthermore, when the situation started to deteriorate, multiple poor decisions based on flawed assumptions, miscommunications and an inability to challenge authority combined to cause the loss of life and of the platform.

In diving, more rules might help reduce litigation, which is a major concern for the training organizations and professionals, but they don't improve safety or human performance. Indeed, if we look at fatalities in diving, the statistics show that it is pretty safe, but how much 'bad stuff' is missed because near-misses go unreported or decompression illness (DCI) is self-treated?

Understanding the human element

It has been shown that by understanding the human element, e.g. human error, decision making, psychology, communications, teamwork and leadership, the performance and reliability of teams and individuals can be improved. Safety comes as a by-product of this developmental process, because we understand how and why humans make errors or break rules. Human performance can be both positive and negative in its outcome, i.e. humans can do amazing things with limited information in time-critical circumstances and also do some really stupid things which dumbfound onlookers blessed with 20:20 hindsight. This duality has been described by some as "human as hero, human as villain/hazard".

System Safety

'System' in the context of human performance doesn't just mean hardware; it means people operating as part of an environment which includes other people, hardware, software and the physical surroundings. Nancy Leveson, one of the world's leading researchers on system safety, who has worked on topics such as nuclear power plants, military decision making and command and control systems, has said:

“Safety is an emergent property of systems, not a component property.”

What this means is that while you can develop an amazing piece of hardware, unless you take into account the social, physical and technical environments in which it is operated, and the performance limitations of the human in the system, you will not have a safe capability. This is because humans, and the environment in which we operate, are dynamic; you cannot replicate exactly how each situation will develop based on previous experience, so we often make a 'best guess' as to what we should do. Consequently, the humans in the system have to be resilient and have the skills to deal with this complexity, uncertainty and ambiguity.

How do high risk industries address the variability in human performance?

In the 1970s and 1980s, the aviation community recognized that the complex nature of factors in aviation went beyond pilot error alone. In 1977, two Boeing 747s collided on a fog-covered runway in Tenerife and 583 people were killed. Nothing was technically wrong with either aircraft, but there had been a series of miscommunications, flawed assumptions and teamwork issues. These factors led to the KLM jet accelerating down the runway towards the Pan Am 747 taxiing the other way, and to the collision.

The discovery that human error caused many more airline crashes than mechanical malfunctions led to a cockpit resource management (CRM) program being developed by NASA's Ames Research Center in the mid-70s, a program first deployed by United Airlines. These CRM programs have morphed and improved over time, but in essence they are about understanding human error and human performance variability and developing training and systems to improve operations, improvements which include safety outcomes.

Healthcare has slowly started to apply the same concepts, as hundreds of thousands of preventable injuries and deaths occur because the human element is not taken into account. However, culture is one of the major hurdles to overcome there before change can happen and a noticeable improvement in safety can be achieved. The Oil and Gas sector has formalized similar training for oil rig and platform crews following the Deepwater Horizon disaster in 2010, but the downturn in the market has limited its deployment. The Maritime sector has also realized that it is the human element which is the weak link in the metaphorical chain, and now provides training comparable to the other sectors described.

All of these sectors have done this because, after analyzing their incident/fatality/near-miss investigation data, they realized that technical solutions only go so far when it comes to improving safety and performance, as demonstrated by safety figures which plateau.

What do these CRM programs look like?

Ultimately, we want operators (pilots, surgeons or divers) to make the best decisions they can with the limited or uncertain information they have, cognizant of the biases, stresses and emotional/commercial drivers they will be subject to. At the same time, they need to ensure that the rest of their team is aware of what is coming next, why, and how it will be executed.

At its core, CRM is about creating and sharing a mental model within the whole team so that effective decisions are made, and if those decisions turn out to be wrong after the event, they are discussed in a debrief framed around learning and not blame.

To achieve this, CRM programs teach operators and managers about human variability by developing their cognitive and interpersonal skills. In some non-aviation domains, CRM is known as 'non-technical skills'. 'Technical' in this context means the skills the operator, pilot or surgeon uses to execute their role, e.g. pure flying skills or pure surgical skills. In the context of diving, this would be buoyancy, propulsion, operation of the CCR, launching a dSMB, etc. 'Non-technical' refers to decision-making, situational awareness, communications, teamwork, leadership/followership and understanding the impact of performance-shaping factors. These are sometimes called 'soft skills', but CRM goes much further because it includes cognition and an understanding of the impact of stress and fatigue on human performance.

Isn’t this too complicated for diving? Don’t we do this already?

A fair percentage of the time, yes, because we have built up experience over time. But what if you don't have the experience? At its most basic level, you are dealing with uncertainty. When you don't have all the information about the likelihood of an event, you are taking gambles with unknown odds and hoping they pay off. What CRM or non-technical skills programs do is increase the odds that effective decisions will be made by reducing the likelihood of an erroneous one. In 'The Killing Zone', Paul Craig describes the challenge in the general aviation community:

“The toughest problem to solve in all of aviation is how to beat the Killing Zone. How do pilots without experience gain the experience without killing themselves in the process? The answer is to gain airmanship faster than flight hours.”

Why are these skills important?

Think of any diving incident that you have personally had or read about recently. Would you consider the main reason it happened to be an undetected technical failure, or was it more likely a combination of flawed assumptions, complacency, miscommunication, poor leadership, risk-seeking behaviors, breaking the 'rules', instructors not teaching what they were supposed to, or graduates forgetting what they were taught? The simple way would be to reduce everything down to 'human error', but that does not help learning, nor does it lead to improved diving safety, because 'human error' is complex and made up of many factors.

Research from DAN shows that 41% of fatalities have 'insufficient gas' as the initial 'trigger'. What the data doesn't show is the context behind why the divers ran out of gas. From personal discussions I have had with Dr Petar Denoble, one of the authors of that research, they had to draw a pragmatic line somewhere, which is why the analysis stops at these triggers. However, I can guarantee that the majority of those events will not be down to an undetected technical failure of the equipment or a total catastrophic loss of gas, but rather to a combination of human factors. In addition, contrary to what society wants, there is unlikely to be a single 'root cause', due to the interactions of agents and factors in the system.

Doesn’t this training exist already?

Surely something as important as the human element within diving is already covered in materials provided by training agencies? Certain areas are indeed covered, such as describing what situational awareness is, but they don't go into the real detail of what it is, how to maintain it when other demands are being placed on you, or how to recognize when you've lost it. The diver leadership programs (e.g. DM, instructor and IT) don't go into what it really means to be a leader in the context of diving, the power of role-modelling, or the ways in which you can create psychological safety so that errors and failures are used as developmental opportunities. None of them, as far as I know, cover normalization of deviance and why it is a real issue, both at the individual level and at the group, dive center or organizational levels. Finally, none explain the essential concept of a Just Culture, which is key to learning from mistakes.

In terms of published research in this area, the UK HSE published a report in 2011 which looked at rebreather normal and emergency operations and recommended that CCR divers be given human factors training, due to the complex nature of CCR operations when considering the equipment, the diver and the social environment in which they operate. None of the agencies has such material in its CCR programs.

Why are you writing this?

Because my goal is to raise awareness of the importance of human factors in diving, and to show that focusing on 'technical skills' (buoyancy, shutdowns, dSMB launch, air sharing…) without addressing the human element is unlikely to improve safety much further. I know that aviation is not the same as sport, technical or commercial diving, but the people inside the system are the same, behave broadly the same and make the same mistakes.

Given the large number of variables in the system and the very small number of fatalities, the variability in annual national mortality rates is likely to be noise in the system; therefore, in my humble opinion, these rates are of no real use when it comes to improving diving safety, and another approach is needed. (The utility of mortality rates in diving safety is the subject of a paper I am currently writing, as I think the metric is flawed.)

The future

Will things change? I don't know, but I do know I will continue with my work and with delivering my human factors/non-technical skills programs, because I believe there is a need and a want for it. Since my programs were launched in January 2017, I have trained nearly 200 divers in face-to-face classes across the globe, more than 150 online courses have been purchased by individuals, and I have presented at international sport, commercial and military diving conferences on the subject of human factors in diving. You can find out more about what my programs look like at http://www.thehumandiver.com and https://www.facebook.com/groups/184882365201810/
