A continuum of safety models

Charles Vincent, René Amalberti, 19 Sep 2014

The idea of a single model of safety that applies to everything and aims for zero accidents is naïve. There are many different responses to risk, which give rise to many different authentic models of safety, each with its own approach, advantages and limitations. The differences between these models lie in the trade-off between the benefits of adaptability and the benefits of a higher level of safety. Ultimately, safety is a social construct and it adapts to demand.

We commonly assume that safety is achieved by imposing rules and restricting the autonomy of management and workers. Everyone will agree, however, that writing a safety plan, including compliance with legal requirements, offers no guarantee that the plan will be put into practice. The literature is full of demonstrations of non-compliance with rules, for a number of recurrent reasons (too many rules, rules not understood or not known, rules that do not fit non-standard cases, contradictions among rules, etc.). Moreover, the adaptation (and intelligence) of workers and systems in non-standard conditions is commonly found to be necessary to guarantee efficiency and safety at work.

Concrete safety results are therefore the product of apparently contradictory actions: on the one hand, rules and constraints that guide work; on the other, good and bad reasons for not complying with these rules, including regular reliance on the adaptive capacities of operators when the situation goes beyond the area covered by regulations. The balance between these contradictory dimensions frames a continuum of safety models fitting various economic needs, from those giving priority to adaptation to those giving priority to rules and supervision. It is important to acknowledge that these models vary considerably in their safety solutions, but all share the same ambition of reducing risk.

Approaches to risk and hazard: avoid, manage or embrace

The metaphor of the climber and the rock face may serve as a framework for describing the variety of these safety models. One can think of hazards as rock faces: an inevitable part of nature. In industry, such rock faces may represent sick patients in hospital, the chemical properties of compounds, solar radiation, oil shale, and so on. Risk management depends on the willingness to deal with these rock faces and the way in which this is done. One can refuse to climb them under current conditions and wait for better ones (plan A); one can limit oneself to climbing only known rock faces, following all the required procedures in normal and well-explored abnormal conditions (plan B); or one can attempt rock faces in non-standard situations (without equipment, without training, under poor or changing conditions), or worse still, dare to climb unknown rock faces in an increasingly competitive environment (plan C).

Outside a small number of ultra-safe industries (aviation, public transportation, nuclear power), the majority of human occupational activities rely heavily on plan C. Industries exhibit a range of adaptations, from those that heavily limit but still tolerate plan C risks (the process industry), to those living with plan C every day, to some that consider these adaptations an essential part of their know-how (oncology and emergency care in healthcare, international finance and trading, the fishing industry, most military action in wartime). Strangely enough, however, all the literature on the quality and safety of systems offers prescriptions only for plans A and B.

The fact that those relying on plan C do not follow all the procedures, and therefore rely on improvisation, does not mean that their practices cannot be made safe. The problem is that the solutions that would make these practices safe, while accepting their reality, do not consist in developing procedures (if they did, one would simply switch to a plan B approach). Instead, the response is ad hoc and does not cover all the situations that arise during work whose very economic rationale often demands reliance on plan C. Plan C solutions are found in quite resilient models: becoming more expert, learning to judge the difficulty of the task against one's own skills, learning to learn, drawing on experience, and acquiring generic knowledge schemas that allow adaptation to borderline circumstances.

Systems that have a relatively modest level of safety (lower than 10⁻⁴) have considerable exposure to risk because they literally make a living from that exposure. This is true of fighter pilots [1], sea-fishing skippers and professional mountaineers. In these occupations, accepting exposure to risk, and even seeking out risk, forms the essence of the work. These occupations do, however, still want to improve safety. A number of studies carried out among fighter pilots and sea-fishing skippers [2, 3] show a real desire for safety. Fishing skippers, for example, would like an intelligent anti-collision system (an Automatic Radar Plotting Aid) to offer them better protection on the high seas in poor visibility, while retaining the mobility required for trawling. Fighter pilots would like an electronic safety net to offer them better protection when undertaking manoeuvres that are likely to make them lose consciousness.

In contrast, the high levels of safety in civil aviation are achieved by very different means. Here, the solution most commonly involves not exposing crews to the hazardous conditions or risks that are thought to cause accidents. For example, the eruption of the Eyjafjallajökull volcano in Iceland in 2010 led to all aircraft immediately being grounded, on a simple principle: no exposure to risk. These examples highlight two completely opposite strategies for dealing with risk. One, favoured by small-scale systems involving skilled trades or highly competitive activities, relies on the intelligence of operators and gives them aids to deal with risk; the other relies on organisation and supervision and ensures that operators are not exposed to risks. It is easy to understand that each of these models has its own approach, but it is then also necessary to accept that their safety solutions are not identical.

Three authentic models of safety rather than only one

Taking into account the risk exposure strategies already mentioned, it makes sense to take the view that each one has given rise to an authentic way of organising safety which is original, with its own approach and its own possibilities for improvement [4, 5].

  • The ultra-resilient model involves occupations in which seeking exposure to risk is inherent in the economic model of the occupation. Skilled-trade occupations in particular sell their services on the basis of an expertise which allows them to deal with new risks, or even with the unknown, by innovating, mastering new contexts and coping, thereby winning through and reaping benefits where others fail or are afraid to go. This is the culture of champions, winners and losers (the losers are part of the context; they are perceived not as failures of the system but rather as a reflection of the knowledge and skill of the champions). Sea-fishing skippers, for example, are capable of seeking out the riskiest conditions in order to catch the most profitable fish at the best times (a sales economy); traders constantly have to maximise their profits; and military fighter pilots [6] always have to win. All these occupations have objective accident statistics that are more or less disastrous. They are not, however, insensitive to their occupational risks, and they deal with these through safety and training strategies that are very well thought out, but of course within a different culture.

    In these occupations, the individuals’ autonomy and expertise take precedence over the hierarchical organisation of the group. In many cases the group is very small (two to eight individuals) and works in a highly competitive setting. The boss is recognised for his technical ability, his past performance and his charisma more than for his official status. Every operator is constantly invited to use a very wide margin of initiative. A correct assessment of one’s own skill, courage and accumulated experience are the keys to recognition as “a good professional and a winner”; safety is mostly about winning and surviving, and only winners have the chance to pass on their safety expertise, in the form of champions’ stories. To summarise: a small number of procedures, a very high level of autonomy and a very large number of accidents. It is still possible to make progress in terms of local safety, however, by becoming better trained through contact with the best masters, learning from their experience and adding to one’s own mental capacity to adapt to even the most difficult situations. The differences between the least safe and the safest operators within a single resilient skilled trade are of the order of a factor of ten [7], which proves that it is possible to make progress through safety interventions, even while remaining within the “micro-Gaussian” distribution of professionals engaged in these hazardous types of work.
  • The HRO (High Reliability Organisation) model uses the same idea of resilience [8], since it also promotes adaptation, but a kind of adaptation which is more local and controlled, involving human activities which are clearly better organised. The HRO model is in fact relatively averse to individual exploits that are not controlled by the group. HROs typically apply to occupations in which risk management is a daily affair, though the primary aim is to manage risk and avoid unnecessary exposure to it. Firefighters, the merchant navy and naval armed forces, professionals in the operating theatre, and those operating chemical factories all face hazards and uncertainty on a daily basis, and typically rely on an HRO model.

    HROs rely on the leader and on the professional group, which incorporates several different roles and types of expertise in order to maintain a constant perspective on progress towards the goal (while avoiding the risks of a narrow local focus). All members of the group play a part in detecting abnormalities in their context (sense-making), bringing them to the attention of the group, and adapting procedures to changes in that context, including deviations from procedures when necessary (but only when this makes sense within the group and is communicated to everyone). All members of the group show solidarity in pursuit of this safety objective. Combating adversity is an integral part of the HRO approach, but the high level of collective regulation (not necessarily exercised only by the leader) imposes considerable limitations on isolated individual initiatives and promotes prudent collective decision-making.

    The HRO model analyses its own failures and seeks to understand the reasons behind them. The lessons drawn from these accident analyses, however, are primarily about the ways in which the situation was managed and could be managed better in future. This is therefore a model which relies firstly on improving detection of and recovery from hazardous situations, and secondly on improving prevention, that is, avoiding exposure to these difficult situations in the first place. Training is based on the collective acquisition of experience. Once again, the differences between the best operators and the less good within a single trade are of the order of a factor of ten [9].
  • The ultra-safe systems model no longer makes it a priority to rely on the exceptional expertise of front-line operators to escape from difficult situations; instead it requires operators to be identical and interchangeable within their respective roles, working to a standard level. This model relies on the quality of external supervision, which makes it possible to avoid situations where operators are exposed to the most exceptional risks; by limiting the exposure of operators to a finite list of breakdowns and difficult situations, the model can become completely procedural, under both normal and abnormal conditions. Airlines, the nuclear power industry, medical biology and radiotherapy are all excellent examples of this category. Accidents are analysed to find and eliminate their causes so that exposure to these risky conditions can be reduced or eliminated in the future. This model relies on prevention first. Training of front-line operators focuses on respect for their various roles, the way they work together to implement procedures, and how they respond to abnormal situations in order to initiate the ad hoc procedures. Once again, the best and the least good operators within a single occupation differ by about a factor of ten [10].

Lessons from above

The three models of safety are radically different. They represent responses to different economic conditions; each one has its own approach to optimisation and training, its own advantages and its own limitations. They can be plotted along a curve representing a trade-off between flexibility and adaptability on the one hand, and safety on the other. All three, however, have the same capacity for internal self-improvement: within each model, safety can be improved by a factor of about ten.

It is not possible to impose a completely new model of safety against the will of local actors and contrary to values that are considered essential to the system. These underlying values must be addressed first, before making any claim to move people to a different safety model. The lesson is simple: changing the safety model means changing the system. If the conditions are not met, and sometimes it is necessary to accept this fact, it is no good tilting at windmills or inventing solutions that have no chance of success.

There are two strategies for making a system safer. Either we learn from the leaders (champions) within the same category (the same model), trying to understand what makes the difference between poor and good performers; as shown above, the expected improvement may reach a factor of ten, depending on where you start. Or we may change category, which may yield an even more impressive improvement; first, however, we need to change the working conditions imposed by the activity. If these conditions cannot be changed, safety improvements will likely be more modest, consisting of local improvement within the existing model rather than a bet on a ‘potentially higher performing model’.

It is possible to switch from one model to another, but this requires a changeover event that affects the entire occupation and its economy. The chemical industry, for example (in some cases still based on resilience models dating from the 1960s and 1970s), made a definitive switch to an HRO model after the events at Seveso in Italy in 1976 and the European Directive that followed in 1982. It is often regulatory mechanisms that impose such a transition to a new system. Note that in such cases the system migrates gradually: it loses the benefits of the previous model (a higher level of adaptation, and the inclusion of situations considered manageable within that occupation), but gains the advantages of the new one (mainly in terms of safety).

Some working environments have even more complex problems to solve, since their activities cut across all three models. This is typically the case in hospitals. Some sections of the hospital face very unstable and unpredictable daily situations (oncology, emergencies); some are scheduled, although their activity needs considerable hour-to-hour adaptation to the huge variety of patients, case complexity and unforeseen perturbations (typically elective surgery in the operating theatre); and some are highly stable and ultra-safe, like medical biology or radiotherapy. Worse still, all of these categories of activity may rapidly move down or up from ‘Tuesday morning’ (a metaphor for the best working conditions, when for example the emergency department may take on the characteristics of an HRO system) to ‘Sunday night’ (the worst working conditions, when lab tests and delivery may become a bricolage, temporarily adopting the traits of an ultra-resilient system). In that sense, healthcare is a fantastic setting for studying safety, probably better than any other, because all the complexity is to be found in one place.


We need to acknowledge that there are three separate, authentic safety models with different approaches to optimisation. The different philosophical compromises assumed within each model result in more or less censure of those who press on under adverse or ambiguous conditions. Together, the models provide a progressive trade-off between individual expertise on the one hand and collective censure and supervision on the other.

The imposition of a safety model does not change the task requirements, but changes in the task requirements may justify adopting a different safety model. If we do not change the constraints, it is more reasonable to select the safety model most appropriate to those conditions, and to optimise outcomes along its own dimensions, than to plead for another safety model. A different model may be intrinsically more effective, but we need to acknowledge that it may be inoperative in a particular context, or that it may be impossible to get from here to there. Many aspects of healthcare, for instance, primarily rely on an HRO model but could move towards an ultra-safe model. However, while some change could be effected within healthcare, a more substantial shift would probably require a radically different approach to managing demand, which is currently not politically feasible. Models of safety are ultimately context dependent and will vary by discipline, organisation and jurisdiction.



1. Amalberti R, Deblon F (1992) Cognitive modelling of fighter aircraft’s control process: a step towards intelligent onboard assistance system. International Journal of Man-Machine Studies 36: 639-71.
2. Morel G, Amalberti R, Chauvin C (2008) Articulating the differences between safety and resilience: the decision-making of professional sea fishing skippers. Human Factors 50: 1-16.
3. Morel G, Amalberti R, Chauvin C (2009) How good micro/macro ergonomics may improve resilience, but not necessarily safety. Safety Science 47: 285-94.
4. Amalberti R, Barach P. Improving healthcare: understanding the properties of three contrasting and concurrent safety models. Submitted.
5. Grote G (2012) Safety management in different high-risk domains – All the same? Safety Science, 50, 1983-1992.
6. The case of fighter pilots is a special and interesting case of a dual context: in peacetime, their administration (the Air Force) operates essentially on an ultra-safe model, but once the aircraft are deployed on active service, the operating model suddenly changes and returns to its fundamentals of resilience. These very contrasting contexts generate surprises in terms of safety in both directions: persistence of resilient, deviant behaviour (as compared with the model that would be desired in peacetime) after returning from military campaigns, and important opportunities missed during the first few days of engagement, due to lack of practice in the resilient model, when pilots are suddenly thrust from peacetime into an operational theatre.
7. The rate of fatal accidents in professional deep-sea fishing varies by a factor of 4 between shipowners in France and by a factor of 9 at the global level, source: Morel, Amalberti, Chauvin, 2009, op. cit.
8. Weick KE, Sutcliffe KM (2001) Managing the Unexpected: Assuring High Performance in an Age of Complexity. Jossey-Bass, San Francisco.
9. The rate of fatal industrial accidents in the gas and oil extraction industry varies from 130 deaths per 100,000 workers in some African countries to 12 deaths per 100,000 workers for the best oil wells; the global average is 30.5 deaths per 100,000 workers, source: http://nextbigfuture.com/2011/03/oil-and-gas-extraction-accidents-and.html
10. The rate of aviation accidents ranges from 0.63 per million departures in Western countries to 7.41 per million departures in African countries. These therefore differ by a factor of 12, source: IATA statistics, 23 February 2011, http://www.iata.org/pressroom/pr/pages/2011-02-23-01.aspx



Charles Vincent

Health Foundation, Dept of Psychology, University of Oxford

Charles Vincent trained as a Clinical Psychologist and worked in the British NHS for several years. Since 1985 he has carried out research on the causes of harm to patients, the consequences for patients and staff, and methods of improving the safety of healthcare. He established the Clinical Risk Unit at University College in 1995, where he was Professor of Psychology, before moving to the Department of Surgery and Cancer at Imperial College in 2002. He is the editor of Clinical Risk Management (BMJ Publications, 2nd edition, 2001), author of Patient Safety (2nd edition, 2010) and author of many papers on medical error, risk and patient safety. From 1999 to 2003 he was a Commissioner on the UK Commission for Health Improvement and has advised on patient safety in many inquiries and committees, including the recent Berwick Review. In 2007 he was appointed Director of the National Institute of Health Research Centre for Patient Safety & Service Quality at Imperial College Healthcare Trust. He is a Fellow of the Academy of Social Sciences and was recently reappointed as a National Institute of Health Research Senior Investigator. In 2014 he took up a new post as Health Foundation professorial fellow in the Department of Psychology, University of Oxford, where he will continue his work on safety in healthcare.

René Amalberti

Professor of Medicine, MD, PhD

After a residency in psychiatry, he joined the Air Force in 1977, obtained a permanent military research position in 1982, and became Professor of Medicine in 1995. He retired in 2007 and now shares his time between the HAS (the French accreditation agency, as senior advisor on patient safety), a private medical insurance company (MACSF, head of prevention strategies) and a public foundation (as director of FONCSI). He has published over 100 international papers and has authored or co-authored 10 books on human error and system safety (most recently Navigating Safety, Springer, 2013).
