After a diving accident, it doesn’t take long for the commentators and observers to look for the rules that were broken. “They didn’t monitor their gas.” “They weren’t sufficiently experienced.” “They didn’t do X, Y or Z…” The design of rules, their application and subsequent rule-breaking is a complex topic. Applying a simplistic view to a complex problem won’t help improve performance or safety.
Following an accident or near-miss, the speed at which the story propagates across social media is staggering. Much of the early information is incomplete or false. Partly this is because the facts are simply not yet known; partly it is because the information that is available gets distorted, an internet version of the children's game of 'telephone.'
“A friend told me…”
“I had a friend who told me this in confidence, don’t pass it on. But this is what happened…”
The stories sometimes change beyond all recognition. For those who are involved, it becomes deeply distressing. This is especially true when stories gravitate to ‘rule-breaking,’ violations and just ‘farm-animal stupid.’
Unfortunately, the real and honest truths rarely come out. This can be because society finds it easier to blame someone for breaking a rule (for a number of reasons) rather than looking to see why it made sense to them to do what they did.
If rules are regularly being broken in diving, wouldn't it be a good idea to understand why this is happening? It is, if we want to improve diver performance and diving safety. Or is it that we want the flexibility to manage the risks ourselves and not be constrained by more rules?
This article is the translation of a piece of work examining why anesthetists are likely to commit violations into the domain of diving. It shows there are often rational reasons behind these violations. If we want to really improve diving safety we need to understand what this local rationality is. Then we need to put measures in place to reduce the likelihood of their occurring. What we do not need to do is blame people for being stupid or apply the simplistic, “They should have known better…”
For those reading this who think you make conscious choices in every decision, including violations, research from multiple domains shows this to be wrong. Most of your decision-making is not actively controlled; it is subconsciously influenced. Our behavior is a function of our personality and of how the environment in which we operate shapes it.
What is a violation?
Professor James Reason’s examination of human error produced the well-known concept of the Swiss Cheese Model. This model looks at failures within different levels of an operation or organization that could lead to an accident or incident. It describes three layers of latent failures or conditions (organization, supervisor, and individual) and then a layer of active failures, which includes violations.
A violation may be defined as a deliberate act that deviates from established protocols of practice. However, this simplistic view doesn’t help us improve safety, because we also need to consider the motivation behind the deliberate act and the social and physical environment in which the person was operating.
In other words, was the violation:
Situational: The only way to solve the problem was to break the rules (e.g., rescuing someone below the MOD of their breathing gas).
Routine: It had become the norm for the group to break the rules, making it easier to socially conform than to say no.
For a personal gain: Goal fixation, or to gain something (e.g., time or money) from the rule-breaking.
For an organizational gain: “You have to ‘break these standards’ because if you don’t, I will find another instructor who will” (e.g., class sizes or the definition of mastery).
To create a safer environment, we need to understand why the rules were broken. This need for greater understanding was the basis of the paper I will be referencing: Phipps et al. (2008), “Identifying violation-provoking conditions in a healthcare setting,” Ergonomics, 51(11), 1625–1642, doi:10.1080/00140130802331617 (for those who are able to access academic papers).
The Risk/Benefit Argument
As we will see, breaking rules to achieve goals is often about determining how much benefit the actor (diver) will get by breaking/following the rules compared to the benefit/loss by not following them. This is risk management at its core. The majority of the time these decisions are being made on a fight/flight or emotional level and not on a rational, logical one.
The work by Kahneman and Tversky on behavioral economics earned Kahneman a Nobel Prize (Tversky had died before the award was made). It showed that much of our decision-making is neither rational nor logical. The challenge for diving safety and learning from experience is that, after an adverse event, we are able to apply logic and rationale using information which was not necessarily available to those involved. This is why it is easy to conclude that “those breaking the rules were stupid!” Critical thinking requires mental effort, and humans are efficient (or lazy), depending on your viewpoint.
As part of this risk management process, we use biases and mental shortcuts to speed up the process. One of which is called outcome bias. The premise is that the more severe the outcome, the harsher we judge the (errant) behavior even if the activity itself is almost identical.
New research has also shown that ‘near misses’ are internally rationalized as success stories and not ‘near failures’. Unless we are trained to recognize this fallacy and have a growth mindset (always looking for improvement which we can influence), we will continue to focus on the positive aspect of the outcome (‘we survived’) and not how close to real failure we were. Consequently, we don’t change our behavior. This is the start of the normalization of deviance process.
What did Phipps and the research team find out about anesthetists breaking the rules?
The research team observed 27 anesthetists during their normal work and then interviewed them using a set of standardized questions. From these interviews they developed themes highlighting three key areas, each with a number of subsections, which would need to be addressed if safety and performance were to be improved.
The high-level topics are:
The Rule: This can be summarized as who (person or organization) wrote the rule, how much credibility they have, what punishment would occur if the rule was broken and the violator was caught, and, finally, how clear the rule is.
The Anesthetist: The themes from this subsection are the anesthetist’s risk perception, their experience and expertise, and the professional group’s norms when it comes to violations.
Situational or Organizational factors: Finally, this subsection covered the time pressures the anesthetists were under, the resources available, the design of the equipment, and whether there were concurrent tasks to be managed.
In the context of diving, no published research has been carried out to understand why divers break rules. The majority of published data focuses on outcomes and not failed processes. In immature safety cultures, it is easy to blame individuals rather than look at the system and whether the rules are supporting positive or negative behaviors.
As such, we need to understand:
What the rules consist of
Who wrote them
Their credibility in the context of the diver
What the disciplinary or social consequences would be if divers break them and are caught
How clear the rule was to the diver
In the majority of cases, the rules are not really clear because there are so many varied standards across the industry. Defining what is ‘right’ is often difficult. We only know it was ‘wrong’ after the adverse event. Then we have the benefit of hindsight bias to join the dots and outcome bias to attribute severity.
An example would be ‘Always use a checklist before a rebreather dive.’ What checklist? Who wrote it? Is it operationally relevant and based on effective training? Or is it seen as a liability-limiting exercise to make up for ineffective training and a poor attitude towards safety and performance?
Moving to the individuals themselves we need to consider their own risk perception, experience, and expertise. We also must consider what the social norm is of the group if we want to understand why violations happen.
Risk perception is a funny thing. We can perceive the risk associated with an activity very differently than another diver who might be equally qualified and experienced. We can even perceive the risk differently at different times in our own lives, often becoming more risk-averse as we age.
The real difficulty comes when we have never encountered a situation before and we try to assess the risk. In so doing, we make a ‘best guess’ using emotion rather than logic. This is why real experience, as well as technical skill, is so important.
You cannot be taught everything in a class and therefore you have to learn on the job. Crucially, “…risk is seen as inherently subjective. It does not exist “out there,” independent of our minds and cultures, waiting to be measured. Instead, risk is seen as a concept that human beings have invented to help them understand and cope with the dangers and uncertainties of life.” (Slovic, 1987) and therefore applying your measure of risk to someone else’s situation is likely to end up with a flawed outcome.
As discussed in the human factors in diving micro-class, many of our decisions are not made in slow-time and with logic (System 2), but rather, are emotionally-biased based on mental shortcuts and the cognitive biases we use to navigate our complex and uncertain world (System 1). If we have the wrong information coming into the decision-making process because of a lack of experience, we shouldn’t be surprised if the outcome is flawed. Such flawed outcomes are therefore likely to lead to violations or ‘at risk’ behaviors.
Finally, we need to consider the social norm of the group. Humans are simple beings. We like to be part of a group, a behavior developed thousands of years ago. This is because a group is more likely to be able to survive than a single person on the savannah.
However, to remain part of the tribe/pack, we needed to comply with the social norms. If those norms weren’t complied with, you were ejected and left to fend for yourself. We still see this behavior in troops of primates today.
Despite millennia of development, our brains haven’t moved on much. If the social norm of the group of divers we are part of, or want to be part of, is to take risks, it is much harder to be ‘safe’ and follow the rules. If a newcomer joins the group, he or she will conform too. This is why effective leadership is so important, especially when it comes to instruction.
Situational or Organization Factors
Human behavior is a product of the personality of the person and the environment they are in.
If people are rewarded for a certain activity and punished for something else, don’t be surprised if they conform. This includes social media commentary by the way and not just employment or litigation punishment.
If instructors are rewarded for the number of certs issued by their manager (wages/keeping their job) or their organization and ‘punished’ if they don’t achieve a throughput, don’t be surprised if the instructors put quantity over quality.
If the client expectation is that they can take an Open Water Diver course and become a ‘qualified’ autonomous diver in two days, don’t be surprised if that drives dive center behaviors because ‘everyone else is doing it.’
If reportable incidents are seen as a negative rather than an opportunity to learn and manage risk effectively, don’t be surprised if accidents, incidents and near-misses, especially those involving violations, aren’t reported.
As there is no real quality control in diver training that ensures what is in the standards is taught in every course by instructors across the globe, then drift is likely. Violations are a normal outcome of drift.
This lack of adherence to standards also applies to graduates of training courses. How do you maintain standards and reduce risk-seeking behavior in the real world when effective debriefs and defined standards are missing?
Finally, optimizing behaviors for organizational gain, which might include violations, should be seen as a positive way of improving the system for the local environment. However, it requires an understanding of the factors present.
Drift is normal, but understanding why the drift has happened and modifying processes accordingly in a proactive and informed manner is a good thing. It is normally known as innovation. Enforcing rules for the sake of them, without understanding the unintended consequences, can lead to safety and performance being compromised.
Solving complex problems with simple solutions never works. Divers are part of a complex system of human interactions with other divers, with organizations, with equipment, and with the environment. You can’t write rules for complex environments, especially when you don’t have an effective feedback mechanism so that lessons can be learned without fear of litigation.
So, before you look at violations or at-risk behaviors following an accident, incident or near-miss, consider the rule itself and the person involved and their peer-group. Finally, look at the situational or organizational factors present.
As I have written numerous times, divers don’t get up in the morning and decide “Today is a good day to die.” As such, whatever they were doing at the time will have made sense to them, even if that meant breaking the rules…whatever ‘rules’ mean in the context of a leisure activity with an inherent risk of death and a lack of supervision and quality control.
The Human Diver provides globally-unique training which encourages a change in perspective to look at high performance in divers and dive teams by applying knowledge, skills, and materials from high-risk domains such as military aviation to recreational and technical diving.
These programs are delivered via eLearning, webinar and face-to-face classroom-based sessions. They have gained praise from some of the world’s top divers. You can find out more about how to improve your performance, and safety as a consequence, by following this link www.thehumandiver.com. Apply Human Factors. Master the Dive.