of people in the organization who take part in the analysis and is thus highly dependent on the experience
of the particular group. Moreover, it has been found that risk evaluations performed by different people on
the same system often disagree, and that the differences grow larger when more information is provided
for the analysis, or when the participants have more experience [6].
Component failure assumes that cyber-attacks happen because of a specific series of failures in compo-
nents, following the "domino model" view of risk. However, in IT systems the rate of component failures
is very low, and accidents happen instead because a faulty design allows an attack to use the existing
infrastructure: the system functions faithfully to its design, yet the result is a disruption.
Historical information assumes that information about similar failures in the same or other systems will
be available to support an evaluation of the risks. This does not hold true for new systems, for which no
history of failures exists.
Research about the roots of failure has instead led to the understanding that an attack with disruptive
consequences is "more often due to the unfortunate combination of a number of conditions, than to the failure of
a single function or component" [12]. Thus, there is a need to understand the structure of the system where
attacks happen, that is, the elements that constitute the system, their connections, and the rules that govern
these connections.
Additionally, research has shown that safety, understood as the absence of unacceptable losses, is cre-
ated through a combination of proactive processes rather than through reactive defences and barriers. In
this context, "human error" is understood as a symptom of incomplete system design, and as such "the
operator’s role is to make up for holes in the designer’s work" [22]. It is therefore necessary to adopt a risk analysis
process that examines the system surrounding the attack and disruption, beyond merely the specific
interactions that led to an undesirable event.
The Systems-Theoretic Accident Model and Processes (STAMP) method, with its hazard analysis variant
STPA (Systems-Theoretic Process Analysis) [16], has been identified as the most cited model for systemic risk
analysis [29]. Extensive literature has been published describing the STAMP framework for risk analysis
[9], [17], [8], [3], with examples of application in different industries, such as medical [4], environmental [11],
robotics [19], power production [14], software development [27], and defense [7].
The rest of the paper is organized as follows. Related work on systemic risk analysis is described in
Section 2. Section 3 presents the STPA risk analysis method. Section 4 describes the CyberShip framework
and its different components. The risk analysis of the CyberShip framework is presented in Section 5.
Finally, Section 6 concludes the paper.
Electronic copy available at: https://ssrn.com/abstract=3753663