
Why Do Divers Keep Breaking the “Rules”?

By Gareth Lock

After a diving accident, it doesn’t take long for the commentators and observers to look for the rules that were broken. “They didn’t monitor their gas.” “They weren’t sufficiently experienced.” “They didn’t do X, Y or Z…” The design of rules, their application and subsequent rule-breaking is a complex topic. Applying a simplistic view to a complex problem won’t help improve performance or safety.

Following an accident or near-miss, the speed with which the story propagates across social media is staggering. Most of the early information is incomplete or false. Partly this is because the facts are simply not yet known. It can also be that the information which is available gets distorted by the internet version of the game of telephone.

“A friend told me…”

“I had a friend who told me this in confidence, don’t pass it on. But this is what happened…”

The stories sometimes change beyond all recognition. For those who are involved, it becomes deeply distressing. This is especially true when stories gravitate to ‘rule-breaking,’ violations and just ‘farm-animal stupid.’

Unfortunately, the real and honest truths rarely come out. This can be because society finds it easier to blame someone for breaking a rule (for a number of reasons) rather than looking to see why it made sense to them to do what they did.

If rules are regularly being broken in diving, wouldn’t it be a good idea to understand why this is happening? It is, if we want to improve diver performance and diving safety. Or is it because we want the flexibility to manage the risks ourselves and not be constrained by more rules?

This article translates into the domain of diving a piece of work examining why anesthetists are likely to commit violations. It shows there are often rational reasons behind these violations. If we really want to improve diving safety, we need to understand what this local rationality is. Then we need to put measures in place to reduce the likelihood of violations occurring. What we do not need to do is blame people for being stupid or apply the simplistic, “They should have known better…”

For those reading this who think you make conscious choices in all the decisions you make, including violations, research from multiple domains shows this to be wrong. Most of your decision-making is not actively controlled; it is subconsciously influenced. Our behavior is a function of our personality and of how the environment in which we operate impacts it.

What is a violation?

Professor James Reason’s examination of human error created the well-known concept of the Swiss Cheese Model. This concept looks at failures within different levels of an operation or organization that could lead to an accident or incident. In this model, there are three layers of latent failures or conditions (organization, supervisor, and individual) and then a layer of active failures, which includes violations.

A violation may be defined as a deliberate act that deviates from established protocols of practice. However, this simplistic view doesn’t help us improve safety, because we need to take into account the motivation behind the deliberate act, along with the social and physical environment in which the person was operating.

In other words, was the violation:

  • Situational: The only way to solve the problem was to break the rules (i.e., rescuing someone below the MOD of their breathing gas).
  • Routine: It had become the norm for the group to break the rules, making it easier to socially conform than to say no.
  • For personal gain: Goal fixation, or to gain something (i.e., time or money saved by the rule-breaking).
  • For organizational gain: “You have to ‘break these standards’ because if you don’t, I will find another instructor who will” (i.e., class sizes or the definition of mastery).
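For illustration only, the four categories above could be encoded as a simple enumeration (the category names and example scenarios are my own shorthand for the bullets above, not a formal taxonomy):

```python
from enum import Enum, auto

class ViolationType(Enum):
    """The four violation categories described above (a toy encoding)."""
    SITUATIONAL = auto()          # breaking the rule was the only way to solve the problem
    ROUTINE = auto()              # rule-breaking has become the group norm
    PERSONAL_GAIN = auto()        # goal fixation, or saving time or money
    ORGANIZATIONAL_GAIN = auto()  # pressure from the organization or client

# Hypothetical example scenarios mapped to categories, mirroring the bullets above
examples = {
    "rescue below the MOD of the breathing gas": ViolationType.SITUATIONAL,
    "everyone in the club skips the buddy check": ViolationType.ROUTINE,
    "shortening a stop to catch the boat": ViolationType.PERSONAL_GAIN,
    "oversized class to keep the dive shop contract": ViolationType.ORGANIZATIONAL_GAIN,
}
```

The point of the distinction is that each category calls for a different intervention: the same “broken rule” in an accident report can sit in any of the four buckets.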

To create a safer environment, we need to understand why the rules were broken. This need for greater understanding was the basis behind the paper I will be referencing: Phipps et al. (2008). Identifying violation-provoking conditions in a healthcare setting. Ergonomics, 51(11), 1625–1642. doi:10.1080/00140130802331617 (for those who are able to access academic papers).

The Risk/Benefit Argument

As we will see, breaking rules to achieve goals is often about determining how much benefit the actor (diver) will get by breaking/following the rules compared to the benefit/loss by not following them. This is risk management at its core. The majority of the time these decisions are being made on a fight/flight or emotional level and not on a rational, logical one.
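The trade-off above can be sketched as a deliberately simplified expected-utility calculation (all the numbers and the detection probability below are illustrative assumptions, not measured values). The point is that when the perceived chance of being caught, or of things going wrong, is low, breaking the rule looks “worth it”:

```python
def expected_utility(benefit, cost_if_caught, p_detection):
    """Toy expected-utility model of a rule-breaking decision.

    The benefit (e.g., time saved) accrues regardless; the cost is only
    paid if the violation is detected or goes wrong. Illustrative only.
    """
    return benefit - cost_if_caught * p_detection

# Hypothetical: skipping a pre-dive check saves 5 "units" of time/effort,
# a bad outcome costs 100, but the diver perceives only a 1% chance of it.
skip = expected_utility(benefit=5.0, cost_if_caught=100.0, p_detection=0.01)
follow = 0.0  # following the rule is the baseline: no gain, no penalty

# With the (mis)perceived low probability, skipping scores higher than
# following, so the violation feels locally rational.
```

Of course, as the article argues, real divers rarely compute anything like this explicitly; the “calculation” happens emotionally and subconsciously, which is exactly why the perceived probability is so often wrong.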

The work by Kahneman and Tversky on behavioral economics won Kahneman a Nobel Prize. It showed that much of our decision-making is neither rational nor logical. The challenge for diving safety and learning from experience is that, after an adverse event, we are able to apply logic and rationale using information which was not necessarily available to those involved. This is why it is easy to conclude that “those breaking the rules were stupid!” Critical thinking requires mental effort, and humans are efficient (or lazy), depending on your viewpoint.

As part of this risk management process, we use biases and mental shortcuts to speed things up. One of these is called outcome bias: the more severe the outcome, the harsher we judge the (errant) behavior, even if the activity itself is almost identical.

New research has also shown that ‘near misses’ are internally rationalized as success stories and not ‘near failures’. Unless we are trained to recognize this fallacy and have a growth mindset (always looking for improvement which we can influence), we will continue to focus on the positive aspect of the outcome (‘we survived’) and not how close to real failure we were. Consequently, we don’t change our behavior. This is the start of the normalization of deviance process.

What did Phipps and his team find out about anesthetists breaking the rules?

The research team observed 27 anesthetists during their normal work. They also interviewed the anesthetists using a set of standardized questions and then developed themes, highlighting three key areas and a number of subsections which would need to be addressed if safety and performance were to be improved.

The high-level topics are:

  • The Rule: This can be summarized as who (person or organization) wrote the rule, how much credibility they have, what punishment would occur if the rule was broken and the violator was caught, and, finally, how clear the rule is.
  • The Anesthetist: The themes from this subsection are the risk perception of the anesthetist, their experience and expertise, and the professional group norm when it comes to violations.
  • Situational or Organizational factors: Finally, the topics which related to this subsection were the time pressures the anesthetists were under, the amount of resources available, the design of the equipment and whether there were concurrent tasks which needed to be managed.

The Rule

In the context of diving, no published research has been carried out to understand why divers break rules. The majority of published data focuses on outcomes and not failed processes. In immature safety cultures, it is easy to blame individuals rather than look at the system and whether the rules are supporting positive or negative behaviors.

As such, we need to understand:

  • What the rules consist of
  • Who wrote them
  • Their credibility in the context of the diver
  • What the disciplinary or social consequences would be if divers break them and get caught

How clear the rule was to the diver

In the majority of cases, the rules are not really clear because there are so many varied standards across the industry. Defining what is ‘right’ is often difficult. We only know it was ‘wrong’ after the adverse event. Then we have the benefit of hindsight bias to join the dots and outcome bias to attribute severity.

An example would be ‘Always use a checklist before a rebreather dive.’ What checklist? Who wrote it? Is it operationally relevant and based on the application of effective training? Or is it seen as a liability-limiting exercise to make up for ineffective training and a poor attitude towards safety and performance?

The Individual

Moving to the individuals themselves we need to consider their own risk perception, experience, and expertise. We also must consider what the social norm is of the group if we want to understand why violations happen.

Risk perception is a funny thing. We can perceive the risk associated with an activity very differently than another diver who might be equally qualified and experienced. We can even perceive the risk differently at different times in our own lives, often becoming more risk-averse as we age.

The real difficulty comes when we have never encountered a situation before and we try to assess the risk. In doing so, we make a ‘best guess’ using emotion rather than logic. This is why real experience, as well as technical skill, is so important.

You cannot be taught everything in a class, and therefore you have to learn on the job. Crucially, “…risk is seen as inherently subjective. It does not exist “out there,” independent of our minds and cultures, waiting to be measured. Instead, risk is seen as a concept that human beings have invented to help them understand and cope with the dangers and uncertainties of life.” (Slovic, 1987). Applying your measure of risk to someone else’s situation is therefore likely to end in a flawed conclusion.

As discussed in the human factors in diving micro-class, many of our decisions are not made in slow-time and with logic (System 2), but rather, are emotionally-biased based on mental shortcuts and the cognitive biases we use to navigate our complex and uncertain world (System 1). If we have the wrong information coming into the decision-making process because of a lack of experience, we shouldn’t be surprised if the outcome is flawed. Such flawed outcomes are therefore likely to lead to violations or ‘at risk’ behaviors.

Finally, we need to consider the social norm of the group. Humans are simple beings. We like to be part of a group, a behavior developed thousands of years ago. This is because a group is more likely to be able to survive than a single person on the savannah.

However, to remain part of the tribe or pack, we needed to comply with the social norms. Those who didn’t comply were ejected and left to fend for themselves. We still see this behavior in troops of primates today.

Despite millennia of development, our brains haven’t moved on much. If the social norm of the group of divers we are part of, or want to be part of, is to take risks, it is much harder to be ‘safe’ and follow the rules. If a newcomer joins the group, he or she will conform too. This is why effective leadership is so important, especially when it comes to instruction.

Situational or Organizational Factors

Human behavior is a product of the personality of the person and the environment they are in.

  • If people are rewarded for a certain activity and punished for something else, don’t be surprised if they conform. This includes social media commentary by the way and not just employment or litigation punishment.
  • If instructors are rewarded for the number of certs issued by their manager (wages/keeping their job) or their organization and ‘punished’ if they don’t achieve a throughput, don’t be surprised if the instructors put quantity over quality.
  • If the client expectation is that they can take an Open Water Diver course and become a ‘qualified’ autonomous diver in two days, don’t be surprised if that drives dive center behaviors because ‘everyone else is doing it.’
  • If reportable incidents are seen as a negative rather than an opportunity to learn and manage risk effectively, don’t be surprised if accidents, incidents and near-misses, especially those involving violations, aren’t reported.

As there is no real quality control in diver training that ensures what is in the standards is taught in every course by instructors across the globe, then drift is likely. Violations are a normal outcome of drift.

This lack of adherence to standards also applies to graduates of training courses. How do you maintain standards and reduce risk-seeking behavior in the real world when effective debriefs and defined standards are missing?

Finally, optimizing behaviors for organizational gain, which might include violations, should be seen as a positive way of improving the system for the local environment. However, it requires an understanding of the factors present.

Drift is normal, but understanding why the drift has happened and modifying processes accordingly in a proactive and informed manner is a good thing. It is normally known as innovation. Enforcing rules for the sake of them, without understanding the unintended consequences, can lead to safety and performance being compromised.


Solving complex problems with simple solutions never works. Divers are part of a complex system of human interactions with other divers, with organizations, with equipment, and with the environment. You can’t write rules for complex environments, especially when you don’t have an effective feedback mechanism so that lessons can be learned without fear of litigation.

So, before you look at violations or at-risk behaviors following an accident, incident or near-miss, consider the rule itself and the person involved and their peer-group. Finally, look at the situational or organizational factors present.

As I have written numerous times, divers don’t get up in the morning and decide “Today is a good day to die.” As such, whatever they were doing at the time will have made sense to them, even if that meant breaking the rules…whatever ‘rules’ mean in the context of a leisure activity with an inherent risk of death and a lack of supervision and quality control.

What now?

The Human Diver provides globally-unique training which encourages a change in perspective to look at high performance in divers and dive teams by applying knowledge, skills, and materials from high-risk domains such as military aviation to recreational and technical diving.

These programs are delivered via eLearning, webinar and face-to-face classroom-based sessions. They have gained praise from some of the world’s top divers. You can find out more about how to improve your performance, and safety as a consequence, by following this link www.thehumandiver.com. Apply Human Factors. Master the Dive.


5 replies
  1. Richard L Pyle says:

    Gareth — EXCELLENT post! I couldn’t agree more.

    The diving world has elevated armchair quarterbacking to a mixture of art-form and blood-sport, which is exceedingly unhelpful in many ways. What we call “rules” in diving represent the collective wisdom accumulated by thousands of divers over decades of experience, and in the vast majority of cases they serve to reduce risk and improve safety. However, as in any endeavors (perhaps even more so in diving), atypical situations arise that require atypical responses. In my diving career, I have encountered MANY such situations, and my approach to them has always been to assess each one in context, make a quick cost/benefit analysis, and proceed with actions that seem most likely to lead to a happy outcome. Often this involves violating some “rules” because the circumstances are well outside the scope of “norm” for which the rules have been formulated. In many cases, these quick cost/benefit calculations are based on incomplete or incorrect information, and therefore lead to a less-than-desirable outcome. When assessing the wisdom of any particular decision made by another diver, a third party having the benefit of 20/20 hindsight should always keep in mind the information available to the diver at the time the decision was made. I can’t count the number of times I have recognized my own decisions as being deeply flawed only after the incident is over, and having access to the full set of facts. Such is the nature of dealing with atypical and unexpected situations.

    All too often these kinds of situations (particularly the ones that led to unfortunate or near-unfortunate outcomes) serve as fodder for pontification and judgement among the various social gatherings of divers (online and otherwise) by people who were not directly involved, and who do not have access to all the relevant facts. Don’t get me wrong: analysis and discussion of incidents among discussion groups is an excellent way to identify (and learn from) genuine mistakes, and help divers consider alternatives and situations that will help them improve their own diving skills in the future. In many cases, a careful analysis reinforces the basis for the “rule”; but sometimes it can reveal flaws and lead to modification of the rule (After all, this is how the “rules” were developed in the first place). And sometimes, strict adherence to a “rule” is a significant contributing factor to the cause of an unhappy or tragic outcome. In such discussions, many people (usually the wisest and most experienced) are careful to parse the facts from the speculations, and provide appropriate caveats in their interpretations of events and possible causes. But too many others are quick to assert condemnation and pass judgement — usually without a full understanding of the events and circumstances — perhaps in an effort to make them appear smarter, better, or more experienced. But in my eyes, such behavior has precisely the opposite effect (i.e., quite often these are the people who spend more time on the internet talking about diving, than actually diving).

    I’m very happy you took the time to write this article, and I sincerely hope that anyone who reads it takes it to heart. The last thing our community needs is for people directly involved with incidents to be discouraged from sharing their experiences due to fear of being condemned for violating some “rule” or another by people who simply do not understand the full context of the situation.

  2. lindsay Branscombe says:

    Wow what a great piece of writing! I love to read in-depth analysis of human behavior and how it relates to diving. It’s sad to see people flood the internet with “well they *should* have done…..” or “they should have known better”. NO ONE knows how they would react in certain scenarios and condemning someone after the fact benefits no one.

  3. Ross says:

    Spot on article! I’ve investigated dive mishaps involving the death of divers before. Many elements in this article are paraphrased in my investigative reports. Well done. Ross

  4. FL says:

    I really want to leave my comment here and I hope it gets published so that readers can get a different perspective from what is written in this text.

    I have a very hard time understanding the objectiveness of your articles, Gareth.

    You constantly criticize accident analysis by stating things such as “well… a diver ran out of gas and drowned… but saying that the diver did not check how much gas they had and that it led to their death does not solve the problem.”

    You routinely question observance to guidelines by saying “not everyone follows them and we need to know why”, and – in this specific text – you have a section titled How Clear The Rule Was To The Diver, where you question the idea of using a checklist before a rebreather dive by asking the questions “what checklist? Who wrote? Is it operationally relevant based on the application of effective training?” Is it operationally relevant to go through a checklist… based on the application…. of effective training…? If you could explain this last question I would appreciate it. Open circuit divers are taught how to test their equipment as best as possible during OW classes. I cannot see any reason why a properly trained RB diver should NOT check if their life-support equipment is working properly before they start using it. I cannot see as helpful any attempt to dilute the importance of any diver being analytical and highly professional to the point of questioning themselves and their equipment.

    The risks associated with scuba diving are dictated by several non-modifiable natural factors (laws of physics) and on top of that, add behaviour and you can create a composite of how much risk there is involved in diving. People’s tolerances to diving (idiosyncratic factors) vary wildly and can be somewhat modified by changes in cardiovascular fitness, body composition, familiarity with the activity, and other interventions. Physiology is not an exact science because the human body has several mechanisms of adaptations to psychological, chemical and environmental stress, which can be triggered, suppressed and/or modified. Moreover, there are no scuba diving laws and, as I have previously mentioned, only natural laws are the non-modifiable factors that apply to diving and divers.

    You approach sociology and explain group values and, social norms but refer to troops of primates to address ostracism from a group when members no longer comply with the group. I personally think this is an attempt to dehumanize the fact that some individuals are “persona non grata” in some circles. I see as paramount for the existence of groups to stimulate cohesion among members and, as an example, I will mention that I have been reading recently about police officers who immediately turned in one of their fellow officers because he decided to use excessive force (taser to the neck) against a person who was already handcuffed and complying with orders. And on the other side of the spectrum, the mafia kills those who challenge its rules (according to every mafia movie I ever watched). A certain degree of cohesion and observance to rules/guidelines is paramount for the existence of a group. If I ever came across a potential dive buddy who dives in a “whatever, man… I don’t care” style I would leave them by themselves in the water immediately and never dive with them again. I am not going to die because of an idiot. I am not going to engage in extra-risky diving because of an idiot. And I certainly could not care less what their reasons for being an idiot are.

    Having been in the military for 3 years (and returning soon) I have to say that when performing highly risky activities one’s ability and desire to follow guidelines, directions, checklists, and other standard procedures is definitely a major factor separating life from death. Those who are known for acting “by emotion and not rationally” are definitely persona non grata and removed from certain environments when they are known to increase risks to other individuals or to a group.

    Your main ideas are in line with existing research in the field of “risk-taking behavior/personality” and – although I find this a very interesting field – simply talking about the reasons why some people, some times, under some circumstances, while with some friends, don’t check their air consumption is not going to help anyone. People should be taught “the best approach to minimize risk” and that includes (in the specific case I am mentioning in this paragraph) CHECKING AIR CONSUMPTION REGULARLY REGARDLESS OF ANY FACTOR AND IF UNABLE TO CHECK IT FOR ANY REASON END THE DIVE AND ASCEND SAFELY. PERIOD. The more anyone attempts to dilute the importance of checking air consumption by asking pseudo-intellectual questions, the less importance people will start giving to checking their air consumption, the more likely the number of accidents, near-misses, and deaths will increase.

    Having discussions is always great and I enjoy having them but I think you would be a lot more helpful to divers if you pointed out clearly that deviance from “best practice guidelines” are major causes of accidents, deaths, and near-misses. I understand that deviance is ubiquitous but its occurrence cannot be associated with acceptance by any means. Examples: banks are robbed daily across this planet; still, bank robberies are unacceptable and must be combated. Women are battered every day in the USA; still, violence against women is unacceptable and must be combated.

    Listing and explaining several reasons/factors why people engage in behaviours that increase risk while diving is one thing. However, promoting respect for and adherence to guidelines can definitely make a difference in the lives of many people.

    Lastly, I would like to mention that I have been in academia for the last 12 years and I can say from experience that your approach to accident analysis and deviance is a highly academic one: lots of words… lots of questions… lots of meetings… lots of discussions… lots of “this won’t work, that won’t work” and absolutely NO plan of action other than “sign up for my course because I am the solution and we can talk about it”.

  5. Tim Snape says:

    I work in IT security for a global operator and a large part of my day job is risk management. The parallels I can see between this article and the problems I face in my job are very clear. People’s perception of risk is coloured by their experience and understanding of risk. I frequently encounter staff who break security policies and justify their actions by stating “That will never happen” or “That is really unlikely”. They lack the experience to make the risk decision, because their perception of risk is flawed due to their lack of knowledge. They don’t understand that in actual fact the risk is real and there is a high probability there will be an incident if they don’t follow the policy. Herd behaviour prevails, and the feedback mechanism I use is to kick their arses, so they and their peers understand that culturally this behaviour is not tolerated, and also to educate them as to why it is not tolerated.

    A great article with absolute relevance to real risk management in real scenarios.

