The culture factor in safety culture
19 Sep 2014
Safety culture as a concept has suffered the same fate as culture itself. Theoreticians, safety professionals, and members of different occupations in different industries have chosen to define it in terms of their particular goals and have therefore produced a lot of confusion about what safety culture is and whether it can usefully be generalized to help understand safety problems in different industries and cultures.
This confusion has been made most clear recently in René Amalberti’s (2013) excellent review of the history of the concept and his pointing out that the safety problem itself differs greatly by type of industry. In this paper I want to build on his analysis and show that there are many cultural factors that need to be taken into account when analyzing safety problems, but this understanding hinges first on understanding the concept of culture itself, because it too has come to mean many different things. I will close with some thoughts on how to get beyond simplistic generalizations to a more grounded view of safety behavior.
Definition of culture
I want to begin by clarifying the concept of culture. The word can, of course, be used in any way that a given person chooses, but for culture to be a useful concept in socio-technical analyses of safety phenomena it is necessary to stick to a definition that anthropologists have evolved and that I have applied to organizational and group phenomena. Culture is best thought of as what a group has learned throughout its own history in solving its problems of external survival and internal integration (Schein 2010). It is best conceptualized at its core as the shared, tacit assumptions that have come to be taken for granted and that determine the members’ daily behavior.
These assumptions are usually not stated explicitly because they have come to be taken for granted. Where did they come from? In the history of the group there will have been founders and leaders whose own values were imposed on the group and, if that group survived and thrived, came to be taken for granted as the right way to think, feel and behave. Sometimes these assumptions are stated as norms of behavior or described as “the way we do things around here.” A quick test of what some of those norms are is to observe how newcomers in the group are socialized and what kind of behavior is immediately punished.
A relevant question is how one deciphers these tacit, taken-for-granted assumptions. I have argued that since culture is a shared phenomenon, the best approach is group interviews of selected members of the group or organization (Schein 2010). In these group interviews the culture model is first presented as a multi-level phenomenon best thought of as a lily pond. At the surface level the culture manifests itself in the kind of climate that exists in the organization and in the behavior that members exhibit. That behavior is usually justified by various espoused values, but the group typically discovers that there are disconnects between what actually goes on and what the values espouse.

For example, most organizations espouse teamwork but realize that all their reward and incentive systems are individually based. If one then asks why everything is individually based, the deep assumption comes out: we assume that only individuals can really accomplish things and be held accountable for whatever happens. The teamwork value is espoused, but the deeper assumption is one of individual accountability. In one company of this sort, for example, groups went through exercises to reach consensus, but decisions did not stick. The effective managers went outside the group after the meetings and made deals with all the others on whom they were dependent. The key to deciphering this was to ask groups in the organization why the group decisions did not stick, to which members usually responded, “We espouse consensus, but that is not how decisions and deals actually get made.” The assumptions are the root system, and the climate is, in a sense, the water quality.
Even the ethnographer who spends a lot of time in the organization will need to ask groups why they do certain things. In the safety arena a common problem is why operators sometimes violate certain clear rules. In working with an organization’s front-line people, the union, a consultant colleague of mine uses an innovative approach: he brings together groups of workers and asks the following series of three questions: 1) What are some of the important rules in doing this work safely? When a number of them have been identified, he asks the second question for a given rule: 2) Is it ever OK to break this rule? When? (Invariably he gets a bunch of examples.) He then picks one example and asks: 3) Why is it OK to break the rule in that situation? It is the third question that reveals the deeper layer of the culture of the operators as a group. What is often discovered is that the operators don’t believe in the rule or break it to get the job done. In one case in New York, operators did not wear their safety glasses on a hot day down in the manholes because the glasses steamed up and they could not see what they were doing.
An interesting example along these lines from medicine concerns Atul Gawande’s (2007) description of the program to get doctors to wash their hands more frequently. After various kinds of persuasion programs and rules had been promulgated, the percentage of hand washing was still not high enough. Finally someone brought groups of doctors together and asked them sincerely: “Given all we know about the importance of hand washing, why don’t you wash your hands at the prescribed times and places?” The doctors then revealed all kinds of reasons about the inconvenience and the loss of time involved, which led, among other things, to the installation of the many convenient hand-washing dispensers that are now mounted all over the place.
The crucial point about this definition is that culture is a property of a group. It is a stable property that serves important functions for the group and is, at the same time, a perpetually emerging set of understandings among the members as they interact with each other and make sense of their current reality. Culture is both, in the sense that as humans we have a skeleton and a set of memories that change very slowly, i.e., the equivalent of tacit assumptions, yet in our daily experiences we are constantly reforming who we are and how we operate, i.e., the surface sense-making that both reinforces and redefines the cultural elements. For purposes of understanding safety, it is my contention that we must look at both the basic assumptions, the skeleton in each group’s culture in terms of deep beliefs about the importance of life and health, and the more surface contingencies that define immediate behavior.
What “cultures” are relevant?
If culture is a property of a group, what kinds of groups need to be analyzed in terms of their safety assumptions? Relevant groups can be a nation, an ethnic group, a religion, an occupation or profession, an industry, an organization, a subunit of an organization, or even a team if the members have enough of a shared history to have evolved shared assumptions about who they are. It would thus make sense to say that an airline, or a nuclear plant, or the oil industry in a given country each has a culture based on its unique history. Within that culture there will be a set of shared assumptions about how to manage the safety issues that may arise and how to feel about death and injury.
That subset of assumptions about safety in that industry or organization or group could be loosely labeled as its “safety culture.” But note that “safety” is not a group that can be the locus of a culture. Safety is a goal that is presumed to be more or less reachable if the culture of that group has within it assumptions about behavior that will make the group more or less safe. Note also that, to the extent that cultures differ in different industries, the subset of assumptions about safety will also differ to an unknown degree (Amalberti 2013).
Furthermore, each industry will contain organizations with different histories and managements, leading to different organizational cultures that affect how safety is handled. Clearly the culture of Tokyo Electric with regard to the Fukushima plant differed from how other nuclear plants in Japan handled safety. But most important of all, it is the key subcultures within an organization that have their own subsets of assumptions about safety, which makes it dubious that one can even attribute a single set of safety-related assumptions to an entire organizational unit.
Every organization has at least three generic subcultures—the “executives,” who are concerned mostly with financial conditions; the “engineers,” the designers and technical staff, who are concerned with process safety and how to minimize the human factor in operations; and the “operators,” the line organization that runs the plant, who are concerned with coping with all the surprises and anomalies that crop up even in the most well designed and standardized of operations (Schein 1996). These subcultures have their roots and origins in occupations and professions. They are connected to occupational reference groups that cut across organizations and larger cultural units, in the sense that some assumptions of the engineering or medical culture supersede even national or ethnic cultural boundaries (or at least are supposed to).
To illustrate, the executive subculture in most organizations defines safety as maintaining an image of caring about the public and the employee, but the measurement of that “caring” is tied to minimizing public scandals and being below the industry average on OSHA statistics of employee injuries. As one executive put it: “I want the world’s best and most cost-effective safety program,” not realizing that cost was what he was really concerned about. Engineers and designers would prefer to build in as many safety defenses as possible, but their requests are typically not granted because budgets are limited. As one pilot who flew both Russian and American planes put it: “I prefer the American planes because they have three backup systems while the Russian ones only have two.” The operator wants good facilities, good training, and, most important, plenty of manpower to get the job done. As one member of an electric company crew working on an outage put it: “When the company decided that a job that used to be done by two people can now be done by one, they may be right, but it can’t be done as safely.” The point is that within an organization tradeoffs and compromises have to be made among the deep assumptions that the subcultures hold about the ultimate safety issues. One executive who had not taken safety programs seriously enough changed his priorities when he could not face yet one more family to explain why a family member had died on the job.
Conclusion about culture
Rather than trying to develop broad criteria or processes and labeling them “safety culture,” I would suggest that a more detailed analysis of how safety issues are viewed in different cultural units in a given industry will be more productive. Thus one would evolve a set of conclusions about the key safety issues in a given industry, taking into account national, ethnic and occupational cultures. Instead of a broad but relatively useless criterion like “there must be trust in the organization,” one could compare the specific issues that differentiate the way Japanese, French, German and U.S. nuclear plants are run. If China and India are going to be big future nuclear countries, one would develop cultural criteria that would enable one to assess how safety will be managed in those countries. As a quick aside, I was once told by an American nuclear engineer that the problem with Iran is not its weapons program but that its domestic nuclear design is based on Russian engineering, which this man thought was quite unsafe.
This kind of stereotyping is dangerous if it is not followed up by serious research on how different countries and occupations do things. It is alleged, for example, that Norwegian offshore oil rigs are safer than those of other countries because of Norwegian attitudes. Does that imply that Norwegians are better at creating trust across hierarchical boundaries than other countries? Or should the more relevant finding be that each culture has different ways of dealing with hierarchy, communication and trust, and that it is in the details of how these are worked out that we will find the secrets to safety? Or, thinking occupationally, is it enough to say that top management must drive the safety process? Or should the more relevant finding be that top managers who are ex-nuclear engineers impose a different kind of safety program than top managers who are financial experts or lawyers? In the medical arena there is a good deal of variation in hospital patient safety programs as a function of whether the top executives are doctors, nurses or hospital administrators.
In other words, if we are to take cultural factors in safety seriously we have to accept that the devil is in the details. Only a more refined look at those details will unravel the principles or processes that “safety culture” is supposed to reveal.
Conclusion about improving safety
The risks and dangers that make us want “safety” do not derive from cultural factors. They derive from the work itself, the actual tasks that have to be performed and that bring various kinds of risks with them. Culture may have influenced the design of those tasks, and cultural factors may influence the kinds and degrees of risk we want to take, but if we want to increase safety itself in a given work situation, the members of the subcultures—the designers, operators and executives—must align their interests and work together to minimize the risks that worry them most. That will produce an effective safety program consisting of many components: sets of rules and regulations, training programs, and systems of monitoring. Such a program will gradually change behavior in ways that make things safer for both operators and the public, and, as those behaviors become habits and standards, they will become embedded in the cultures of those organizations.
When one examines such programs, for example in the nuclear industry, one will find principles such as the ones listed by the Institute of Nuclear Power Operations (INPO 2004):
- Everyone is personally responsible for nuclear safety
- Leaders demonstrate commitment to safety
- Trust permeates the organization
- Decision-making reflects safety first
- Nuclear technology is recognized as special and unique
- A questioning attitude is cultivated
- Organizational learning is embraced
- Nuclear safety undergoes constant examination
Embedded in this list are items, like trust and organizational learning, that characterize any effective organization. And what will often be discovered is that the behavior changes invented by the group working to make things safer also turn out to make the organization more effective overall. The actual behavior changes, standards, rules and regulations that derive from such local problem solving will, of course, vary immensely with the kind of industry, the maturity of the technology and the economic conditions, as Amalberti has convincingly shown us.
Perhaps ultimately we will find some workable generalizations across the varieties of tasks that we engage in to make them all safer. Principles such as those enunciated by INPO may be a convenient way to describe safety culture, but they are so general as to be useless when one asks: “How do I achieve the conditions that the principles describe?” My own answer is that these conditions ultimately reflect the climate that the executive subculture creates through its own behavior. Only if people in the higher-status positions begin to engage in more “humble inquiry” will they elicit enough trust throughout the organization to enable the operators and designers to speak up when they see safety problems (Schein 2013).
Rules, regulations and training are, of course, necessary and appropriate, but for subordinates to speak up when they see a safety problem requires a climate in which they will feel psychologically safe to do so. This becomes especially important because all task performance is subject to operators discovering new things—better ways to do things, shortcuts, unanticipated safety factors, what Snook (2000) has so usefully labeled “practical drift.” My own skepticism about the usefulness of broad principles and generalizations derives primarily from observing the practical drift in my own safety behavior and the behavior of those around me. As I go through my various routines of housework, of taking care of my health, of driving, of fixing things around the house, of taking care of an infant grandchild, I realize that there are rules and standards for all of those activities but, in the end, I make constant choices based on my mood, who else is around observing me, how well trained I am in performing the task, whether I am in a hurry or not, the image I am trying to portray, who is helping me, and so on. The factors, principles or processes that show up in safety culture lists are too general to help me or motivate me. In the end it is the task and the people doing it that create their own standards, rules, and behavior patterns. I believe we have to accept such practical drift as being inevitable in all operations, have to observe it, have to analyze it, and have to decide how best to integrate it into our safety programs.
Amalberti, R. (2013). Navigating Safety: Necessary Compromises and Trade-Offs—Theory and Practice. N.Y.: Springer.
Gawande, A. (2007). Better: A Surgeon’s Notes on Performance. N.Y.: Holt, Metropolitan Books.
Institute of Nuclear Power Operations (2004). Atlanta, Georgia. http://www.inpo.info/
Schein, E.H. (1996). Three Cultures of Management: The Key to Organizational Learning. Sloan Management Review, 38, 1, 9-20.
Schein, E. H. (2010). Organizational Culture and Leadership, 4th Ed. San Francisco: Jossey-Bass.
Schein, E. H. (2013). Humble Inquiry: The Gentle Art of Asking Instead of Telling. San Francisco: Berrett-Koehler.
Snook, S. A. (2000). Friendly Fire. Princeton, NJ: Princeton Univ. Press.