Systemic cyber/in/security – from risk to uncertainty management in the digital realm

Dr. Myriam Dunn Cavelty, 15 Sep 2011

Recent events have given the impression that cyber incidents are becoming more frequent, more organised and more costly in the damage that they inflict. Yet, while low-level micro disturbances are an everyday reality, the world has yet to see a cyber incident of systemic proportions. Establishing the likelihood of such an occurrence, however, is impossible. Systemic cyber risks are unpredictable and incalculable due to the uncertainty surrounding them, and the complexity of the socio-technical environment that they co-create renders traditional linear risk management approaches meaningless. This article argues that rather than trying to establish control over something that we cannot fully grasp, we need to learn how to embrace uncertainty when dealing with the digital realm. This implies a focus on dialogue and information exchange to increase situational awareness; on strengthening technical and social resilience; and on sustained efforts to nurture a fault-tolerant political culture that accepts the possibility of failure and lives with a certain, inevitable degree of insecurity.

Over the last year, cyber threats have been catapulted from the realm of geek experts and military strategists into the public sphere. The discovery of Stuxnet, the industry-sabotaging computer worm that scared politicians and business leaders all over the world; numerous tales of (Chinese) cyber espionage; the growing sophistication of cyber criminals, as evidenced by their impressive scams; WikiLeaks’ Cablegate and the subsequent actions of Anonymous; and the well-publicised activities of the now disbanded hacker group LulzSec have all combined to give the impression that cyber attacks are becoming more frequent, more organised and more costly in the damage that they inflict.

The onrushing cyber-apocalypse …

Despite years of alarmist warnings of impending cyber doom, whether the result of accident or of malicious actors, such scenarios remain fictional. The world has yet to experience an act of “cyber terrorism” or “cyber war” worthy of these labels.[1] Certainly, for over a decade, different forms of cyber(ed) conflict have accompanied political, economic and military conflicts. Furthermore, criminal and espionage activities carried out with the help of computers happen every day, all around the world. It is undisputed that cyber incidents cause minor and occasionally major inconveniences, whether in the form of lost intellectual property or other proprietary data, maintenance and repair costs, lost revenue, or increased security costs. Beyond the direct impact, badly handled cyber attacks have also damaged corporate (and government) reputations and have, theoretically at least, the potential to reduce public confidence in the security of Internet transactions and e-commerce should they become more frequent.

However, in the entire history of computer networks, there have been only very few examples of attacks or other types of incidents that had the potential to reach a systemic level or cause a global shock. There are even fewer examples of cyber attacks that resulted in actual physical violence against persons or property.[2] The vast majority of cyber attacks and incidents, if not all of them, have caused inconveniences or minor losses rather than serious or long-term disruptions. They are micro-level operational risks that can be dealt with by individual entities using standard information security measures.[3]

How likely is this to change in the future? Are we headed towards a point where truly major, systemic cyber-related events are not only highly likely but outright unavoidable? Security practitioners are particularly concerned about so-called critical infrastructures: the physical, virtual and organisational structures needed for the operation of a society, the prolonged unavailability of which would, in all likelihood, result in social instability and major crisis.[4] Today, these critical infrastructures take the form of interconnected, complex, and increasingly virtual systems.[5] Heavy reliance on cyber means for the control and maintenance of these infrastructures has augmented the potential for major disaster (or systemic risk) by vastly increasing the possibility of local micro incidents mutating into systemic macro consequences. Two types of system effects are particularly feared: surprise effects, which arise unexpectedly out of the negative and positive feedback loops produced by interactions between agents in the system; and cascade effects, which produce a chain of events that crosses geography, time, and various types of systems.
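The intuition behind cascade effects can be made concrete with a toy simulation. The sketch below is purely illustrative: the random dependency network, the uniform node capacities and the simple load-shedding rule are assumptions made for the example, not empirical parameters. It shows how a single local micro-incident can propagate in waves once components operate close to capacity.

```python
import random

# Toy cascade model: a failed node sheds its load onto the nodes that
# depend on it; any node pushed over capacity fails in the next "wave".
# Topology, capacities and the shedding rule are illustrative assumptions.

def simulate_cascade(n_nodes=100, n_links=300, capacity=1.2, seed=42):
    rng = random.Random(seed)
    # Random directed dependency links: (u, v) means v depends on u.
    links = [(rng.randrange(n_nodes), rng.randrange(n_nodes))
             for _ in range(n_links)]
    out_degree = [0] * n_nodes
    for u, _ in links:
        out_degree[u] += 1

    load = [1.0] * n_nodes             # every node starts at nominal load
    failed = {rng.randrange(n_nodes)}  # one local micro-incident
    frontier, waves = set(failed), 0

    while frontier:                    # each pass is one cascade wave
        waves += 1
        next_frontier = set()
        for u, v in links:
            if u in frontier and v not in failed:
                load[v] += load[u] / max(1, out_degree[u])
                if load[v] > capacity:  # overload: secondary failure
                    failed.add(v)
                    next_frontier.add(v)
        frontier = next_frontier
    return waves, len(failed)

waves, total = simulate_cascade()
print(f"cascade waves: {waves}, failed nodes: {total} of 100")
```

In models of this kind, small changes to capacity margins or network density can flip the outcome from a contained incident to system-wide collapse; it is exactly this non-linearity that makes the behaviour of real interconnected infrastructures so hard to predict.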

… as a “narrated catastrophe”

Given this picture, what, then, is the likelihood of a systemic macro-level disaster involving (critical) infrastructures, caused by or at least substantially involving the cyber domain?

There is no straight answer to this question and there will never be one.

The first problem is that there is no reliable data for loss or damage estimation within our current pattern of cyber usage, and it is very unlikely that there will ever be a satisfactory solution to this data problem. Attempts to collect such data have failed due to insurmountable difficulties in establishing what to measure, how to measure it, and what to do about incidents that are discovered very late, or not at all.[6] Not surprisingly, therefore, existing publications are very vague about the actual level of risk. There is an excessive use of words like “could”, “would”, and “maybe” when describing cyber incidents, and of musings about “what could have happened if” when discussing anecdotal evidence.

On top of that, existing threat assessment information is biased in more than one sense. First, as combating cyber threats has become a highly politicised issue, governmental statements about the level of threat must also be seen in the context of different bureaucratic entities competing against each other for resources and influence. This is usually done by stating an urgent need for action (action which they themselves should take) and by describing the overall threat as big and rising. As a result, the debate is characterised by rhetorical dramatisation and alarmist warnings, fuelled and sustained by mass media that repeatedly feature sensationalist headlines on the topic.[7] Second, the main sources for current threat assessments are reports provided by the very industry that has emerged to grapple with the threat. Third, psychological research has shown that risk perception is highly dependent on intuition and emotions. Cyber risks, especially in their more extreme form, fit the profile of so-called “dread risks”, which appear uncontrollable, catastrophic, fatal and unknown. There is a propensity to be disproportionately afraid of these risks given their low probability, which translates into pressure for regulatory action of all sorts and a willingness to bear high costs of uncertain benefit.[8]

Given that there is not even agreement on how large the threat is today or was in the past, it is hardly surprising that experts also widely disagree about how likely future cyber-doom scenarios are. The second major problem that makes judgment about future cyber risks difficult is their very nature. While there is at least proof of, and daily experience with, cyber crime, cyber espionage and other lesser forms of cyber incidents, macro-level cyber incidents exist solely in the form of stories or narratives. They are so-called ‘narrated catastrophes’. The difference between these and other anticipated events or disasters that also exhibit a strong narrative component – such as the next pandemic or the next major terrorist attack – is the complete lack of precedents. Attempts to add substance to systemic cyber risks by pinning likelihoods on them or estimating their probable costs must fail, because they exist only in the form of stories, scenarios, and even myths. The way we imagine them influences our judgment of their likelihood; and there is an infinite number of ways in which we could imagine them.

As a consequence of these two problems, systemic cyber risks are unpredictable and incalculable. They are hidden from us by two types of interrelated uncertainty: epistemological uncertainty, which is essentially a measure of ignorance and refers to a lack of knowledge; and ontological uncertainty, which arises from the fact that the world of cyber risks is in permanent flux, its nature and substance changed, intentionally and unintentionally, by the actions of a diverse set of actors.[9] The underlying reason for both forms of uncertainty is complexity, brought about and sustained by the cyber infrastructure and its role in today’s society.

Complexity and the limits of risk assessment and management

It has almost become a truism to say that we live in a complex world. The cyber debate, however, is all about complexity. At its very heart lies “the Internet”, the largest man-made complex system, whose features and components can give rise to unexpected emergent phenomena. The complexity of the world’s information infrastructure is still increasing: through the extension of its geographical reach and the expansion of the services provided; the introduction of new components with richer functionality using diverse technologies; the growing number of networks, nodes, links and interdependencies; and the layering of systems upon systems.

The above-mentioned critical infrastructures, as the crystallisation point of the cyber risk debate, have likewise grown into complex networks of different sizes as a consequence of the growing reliance on the cyber infrastructure, both for internal use and for interaction with external systems. Bridged and interlinked by information pathways, critical infrastructure systems have become interconnected at different layers: organisational, procedural, informational, material. Because all of the interacting parts move relative to each other at varying speeds, future behaviour becomes hard to determine and predict. At the same time, the image of modern critical infrastructures is one in which it becomes futile to try to separate the human from the technological. Technology is not simply a tool that makes life liveable: rather, technologies become constitutive of novel forms of complex subjectivity, characterised by an inseparable ensemble of material and human elements. From this ‘ecological’ understanding of subjectivity, a specific image of society emerges: society becomes inseparable from critical infrastructure networks. In this way, systemic risks, understood as risks to critical infrastructure systems, are risks to the entire system of modern life and being.

System complexity has two immediate consequences. First, a well-known theory holds that technological systems that are interactively complex and tightly coupled will be struck by accidents that cannot be prevented. Because of the inherent complexity, independent failures will interact in ways that can neither be foreseen by designers nor comprehended by operators. If the system is also tightly coupled, the failures will rapidly escalate beyond control before anyone understands what is happening and is able to intervene.[10] The very connectedness of critical infrastructures through cyber means is what poses the danger, because perturbations within them can cascade into major disasters at immense speed and beyond our control.

Second, the dynamic interaction of complex, decentralised, open, unbounded systems overtaxes our ability to articulate and evaluate them. Complex systems behave counter-intuitively due to parallel occurrences happening at different speeds, irregularities and non-linear cause-effect relationships. As a result, the human brain is unable to “read” these systems correctly.[11] Nevertheless, analytical frameworks developed for accidents with hazardous materials in the chemical industry and in nuclear power plants still provide the backdrop against which cyber risks related to critical infrastructures are primarily approached. These traditional risk assessment tools are grounded in strict, measurable assessments, predictive modelling (all of it based on past behaviour and experience) and linear cause-effect thinking.[12] They inevitably fail their purpose when applied to the truly complex and the uncertain.
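The linear calculus underlying those traditional tools can be written down in a few lines. A minimal sketch of the standard annualised loss expectancy calculation follows; the scenario names, frequencies and loss figures are invented purely for illustration.

```python
# Minimal sketch of the linear risk calculus behind traditional tools:
# risk = probability x impact, summed over independent scenarios.
# All frequencies and loss figures below are invented for illustration.

scenarios = {
    # name: (annual rate of occurrence, single loss expectancy in USD)
    "phishing-led data breach": (0.8, 250_000),
    "ransomware outage":        (0.3, 1_200_000),
    "SCADA manipulation":       (0.01, 50_000_000),
}

total = 0.0
for name, (aro, sle) in scenarios.items():
    ale = aro * sle  # annualised loss expectancy for this scenario
    total += ale
    print(f"{name:26s} ALE = {ale:12,.0f} USD")
print(f"{'portfolio total':26s} ALE = {total:12,.0f} USD")
```

For routine micro-level incidents with stable, observable frequencies, this arithmetic is serviceable. For systemic cyber uncertainties there are no frequencies to estimate in the first place, and failures interact rather than occur independently, which is precisely why such tools fail their purpose.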

Embracing the unknowable …

If systemic cyber risks are indeed unpredictable and incalculable, as claimed above, then the term cyber risk is a misnomer. Following the seminal work of Frank Knight, risks are not uncertainties and uncertainties are not risks. Rather, risks are a tamed form of uncertainty: because they are measurable, they are manageable.[13] The very essence of risks understood this way is that we can try to purposefully influence and in some ways control them by actions taken in the here and now. Uncertainty, however, cannot be controlled. Calling something a risk inevitably conjures up images of controllability and will, quite inevitably, lead us to use “old”, well-known tools for its analysis and management. This may mislead us into believing that we can master it.[14] In the field of uncertain systemic cyber risks, it may therefore be advisable to stop calling them risks at all.

The move from systemic cyber risks to systemic cyber uncertainties has significant implications for how we handle them. As soon as we accept their “unknowability”, we must accept that their effective control, let alone manipulation, is beyond our abilities – but we should also establish where the limits of those abilities actually lie and where traditional tools can and should still be applied. In addition, we must find ways to live with systems that can and will always surprise us, and learn how to integrate awareness of systemic uncertainties into our planning processes and into our general way of life.

None of this is revolutionary or particularly new. In fact, many aspects of current critical infrastructure protection (CIP) practice already deal pragmatically with these uncertainties. Rather than trying to know and anticipate specific threats, for example, a key focus of protection efforts often lies on the identification and mitigation of system vulnerabilities, which seem identifiable, at least at lower system levels. In addition, we can observe the combined use of “discursive management strategies” and “resilience-based strategies”, the former suited to conditions of high ambiguity, the latter to conditions of high uncertainty or ignorance.[15]

Discourse-based management is characterised by the involvement of stakeholders, including industry and governmental organisations, and by public participation. Accordingly, a qualitative increase in public-private collaboration to enable a better exchange of information is one of the key goals of every CIP policy. Furthermore, public awareness campaigns, enhanced support for cyber education from elementary school through to university, the training of a capable and technologically advanced workforce, and cyber security research are integral parts of governmental protection plans. It has also long been acknowledged that the efficacy of national efforts remains limited and that expanded and more efficient cooperation is needed, particularly when it comes to international legal cooperation.[16]

Resilience-based management, on the other hand, is characterised by a shift away from the concept of protection towards the concept of resilience.[17] Though the two concepts often overlap, infrastructure protection aims to prevent or reduce the effect of adverse events, while infrastructure resilience reduces the magnitude, impact or duration of a disruption. If resilience is the core concept, security does not refer to the absence of danger but rather to the ability of a system to reorganise quickly and efficiently and to rebound from a potentially catastrophic event. Thus, while protective (and defensive) measures aim to prevent disruptions from happening and remain rooted in a world of risk and linear cause-effect relationships, resilience fully embraces unknowability and accepts that certain disruptions are inevitable. This also shifts the focus of attention away from the preventative phase towards the response and recovery phases of disaster management.[18]
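The distinction can be expressed with a simple metric used in parts of the resilience literature: the total performance lost between disruption and full recovery, sometimes pictured as a ‘resilience triangle’. The sketch below uses invented figures and serves only to illustrate the shift in focus from preventing the drop to shortening and shallowing it.

```python
# Illustrative comparison of two recovery paths after the same incident.
# Quality values (1.0 = full service, measured hourly) are invented.

def performance_lost(quality, dt=1.0):
    """Integrate the service shortfall (1 - quality) over time."""
    return sum((1.0 - q) * dt for q in quality)

slow_recovery = [1.0, 0.4, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
fast_recovery = [1.0, 0.4, 0.7, 0.9, 1.0, 1.0, 1.0, 1.0, 1.0]

print(f"loss, slow recovery: {performance_lost(slow_recovery):.1f} unit-hours")
print(f"loss, fast recovery: {performance_lost(fast_recovery):.1f} unit-hours")
# Both systems suffer the same initial drop, but the resilient one
# recovers faster and therefore loses far less service overall.
```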

… and communicating the possibility of failure

Arguably the most important conclusion to be drawn from all this is that a political discourse of uncertainty is needed in order to generate legitimacy for the possibility of failure. In a world of complex systems, there can be no security in the absolute sense. In fact, the opposite is true: cyber incidents are bound to happen, because they simply cannot be avoided. The myth of control and of the perfect manageability of risks must therefore be laid to rest.[19] This creates challenges for governments, the private sector, and society.

Given the nature and scope of systemic cyber threats, governments must think of ways to communicate the fact that they can no longer provide their societies with what is demanded of them on the basis of the social contract: security and safety. Governments must be ready to admit that their role in cyber security can only be a very limited one, even though they consider cyber threats to be a major national security issue. The ownership, operation, and supply of most critical systems currently lie in the hands of private industry. The challenge facing governments is thus to pinpoint where a stronger role in protecting critical infrastructure (in the form of increased regulation, for example) is truly necessary, given what is at stake, while determining how best to encourage market forces to improve the resilience of companies and sectors.

The challenge for the private sector, on the other hand, stems from the fact that, collectively, it has far more technical resources and operational access to (critical) infrastructures than the government does. (Moral) responsibility for mitigating systemic cyber “risk” and for the resilience of the entire society is thus increasingly falling on companies. This “new” corporate social responsibility demands a shift away from looking solely at micro-level (operational) risks and away from narrow cost-benefit analyses, towards an integrated view of business as serving a basic need of society.

The challenge for society, lastly, is to learn how to live with a considerable amount of insecurity born out of uncertainty. Currently, the public debate on cyber threats is characterised by a high degree of diffuse anxiety lurking just beneath the surface of everyday life. This view is problematic because it reduces systemic (cyber) risks/uncertainties to a distressing limbo of not-safe-but-waiting-for-disaster, a disaster construed as inevitable. However, one of the great lessons of risk sociology is that risks do not simply exist; they are turned into risks by human perception and deliberation. More importantly, as they are not yet manifest but situated in a highly uncertain future, they can be shaped by human choices in the present.

Applied to the realm of the uncertain, very similar arguments can be made. What is needed to escape a potential fear and anxiety trap is a move away from a doomsday automatism rooted in the propensities of system effects, towards a focus on human action and human responsibility. Even though we must expect disturbances in the cyber domain in the future, we should not expect outright disasters. Some cyber disturbances may well turn into crises, but a crisis should be seen as a turning point rather than an end state, at which the aversion of disaster or catastrophe remains possible. If societies become more fault-tolerant psychologically and more resilient overall, the likelihood of catastrophe in general, and of catastrophic system failure in particular, can be substantially reduced.

Myriam Dunn Cavelty was speaking at the Expert Hearing on New dimensions in cyber risk, hosted at the Centre, 7 April 2011.


[1] For definitions of the different types of cyber-attacks see Myriam Dunn Cavelty, “Cyberwar: Concept, Status Quo, and Limitations”, CSS Analysis in Security Policy, Issue 71, April 2010 (Zurich: Center for Security Studies).

[2] The most prominent example of physical impact caused by a cyber incident is the computer worm known as “Stuxnet”, a computer program apparently written specifically to attack Supervisory Control And Data Acquisition (SCADA) systems used to control and monitor industrial processes, in this case centrifuges used in nuclear facilities.

[3] Such as ISO/IEC 15443, ISO/IEC 27002 and ISO/IEC 27001.

[4] The most frequently listed examples of critical infrastructures encompass banking and finance, government services, telecommunication and information and communication technologies, emergency and rescue services, energy and electricity, health services, transportation, logistics and distribution, and water supply.

[5] For a good overview of system-level risks, see: Herman B. “Dutch” Leonard and Arnold M. Howitt, “Understanding and Coping with the Increasing Risk of System-Level Accidents”, in: Integrative Risk Management: Advanced Disaster Recovery, Risk Dialogue Series (Rüschlikon: Swiss Re, 2010), pp. 13-26.

[6] On the data problem, see: Peter Sommer and Ian Brown, Reducing Systemic Cyber Security Risk, Report of the International Futures Project, IFP/WKP/FGS(2011)3 (Paris: OECD, 2011), p. 12; also: Manuel Suter, “Improving Information Security in Companies: How to Meet the Need for Threat Information”, in: Myriam Dunn Cavelty, Victor Mauer and Sai Felicia Krishna-Hensel (eds), Power and Security in the Information Age: Investigating the Role of the State in Cyberspace (Aldershot: Ashgate, 2008), pp. 129-150.

[7] Myriam Dunn Cavelty, Cyber-Security and Threat Politics: US Efforts to Secure the Information Age (London: Routledge, 2008).

[8] Robin Gregory and Robert Mendelsohn, “Perceived Risk, Dread, and Benefits”, Risk Analysis 13(3) (1993): 259–264.

[9] David Dequech, “Uncertainty: Individuals, Institutions and Technology”, Cambridge Journal of Economics 28(3) (2004): 365-378.

[10] Charles Perrow, Normal Accidents: Living With High Risk Technologies (Princeton, NJ: Princeton University Press, 1984).

[11] Jay Forrester, Industrial Dynamics (Cambridge, MA: MIT Press, 1961).

[12] As a prominent example for this, see: US Department of Homeland Security, National Infrastructure Protection Plan: Partnering to Enhance Protection and Resiliency (Washington DC, 2009), available at: http://www.dhs.gov/xlibrary/assets/NIPP_Plan.pdf.

[13] Frank Knight, Risk, Uncertainty and Profit (Chicago: University of Chicago Press, 1921), particularly p. 19. For reasons of space and argument, no further discussion of other and different understandings of risk is provided in this article.

[14] On this problem and how to deal with it, see: Dave Snowden and Mary Boone, "A Leader's Framework for Decision Making", Harvard Business Review, November 2007, pp. 69–76.

[15] Andreas Klinke and Ortwin Renn, “Precautionary Principle and Discursive Strategies: Classifying and Managing Risks”, Journal of Risk Research 4(2) (2001): 159-173.

[16] Elgin Brunner, Anna Michalkova, Manuel Suter and Myriam Dunn Cavelty, Cybersecurity – Recent Strategies and Policies: An Analysis, CRN Focal Report (Zurich: Center for Security Studies, 2009).

[17] Elgin Brunner and Jennifer Giroux, Examining Resilience – A Concept to Improve Societal Security and Technical Safety, CRN Factsheet (Zurich: Center for Security Studies, 2009).

[18] Herman B. “Dutch” Leonard and Arnold M. Howitt, “Advance Recovery and the Development of Resilient Organisations and Societies”, Integrative Risk Management: Advanced Disaster Recovery, Risk Dialogue Series, (Rüschlikon: Swiss Re, 2010), pp. 45-58.

[19] Michael Power, The Risk Management of Everything: Rethinking the Politics of Uncertainty (London: Demos, 2004).


Author

Dr. Myriam Dunn Cavelty

Dr. Myriam Dunn Cavelty is Lecturer for Security Studies and Head of the New Risks Research Group at the Center for Security Studies, ETH Zurich, and Fellow for Cybersecurity at the Stiftung Neue Verantwortung, Berlin.
