This article addresses the question of to what extent conventional theories of high reliability organizations and normal accident theory are applicable to public bureaucracy. Empirical evidence suggests that public bureaucracies can indeed turn into risk generators with severe consequences for human security. Relevant cases include, for instance, collapsing buildings and bridges due to insufficient supervision of engineering by the relevant authorities, infants dying at the hands of their own parents due to misperceptions and neglect on the part of child protection agencies, uninterrupted serial killings due to a lack of coordination among police services, and improper planning and risk assessment in the preparation of mass events such as soccer games or street parades. The basic argument is that conceptualizing distinct and differentiated causal mechanisms is useful for developing more fine-grained variants of both normal accident theory and high reliability organization theory. Such variants need to take into account the standard pathologies of public bureaucracies and the inevitable trade-offs connected to their political embeddedness in democratic and rule-of-law-based systems, among them the tensions between responsiveness and responsibility and between goal attainment and system maintenance. This, the article argues, makes it possible to identify distinct points of intervention at which permissive conditions with the potential to trigger risk-generating human action can be neutralized, while the threshold that separates risk-generating human action from actual disaster can be raised to a level that makes disastrous outcomes less probable.

Introduction: Public Bureaucracies and Human Security

The relevance of properly working public administration for the overall well-being of states and people became suddenly apparent during the COVID-19 pandemic. Whether or not governmental agencies worked efficiently and effectively turned out to be a crucial factor in more or less successful crisis management and, consequently, in infection rates and death tolls. What was thus revealed is that the realm of critical infrastructure is much larger than usually assumed. It comprises not only electricity grids, telecommunications, water supply, and the entire public health system but also the wide array of public bureaucracy in general, with its resources, professional capacity, and resilience.

Yet it is indicative that it took a crisis to make visible the fact that large parts of public administration are of crucial importance for human security. Whether a bridge or a building is sound and safe depends on construction oversight by incorruptible authorities. Whether particularly vulnerable people (for example, elderly people in retirement homes or children living under precarious social conditions) are protected against carelessness, neglect, and physical violence depends on a properly working local administration equipped with sufficient professional staff. And whether large crowd events such as soccer games or street parades remain peaceful and enjoyable or turn into deadly traps in which panic breaks out and people are trampled to death depends on proper planning and licensing by the local civil administration.

Although the bulk of public administration outside critical infrastructure is usually not defined as a high reliability organization (HRO) zone, public authorities act as de facto HROs as soon as they are concerned, one way or the other, with the protection of health and physical integrity. Conscious carelessness, neglect, sloppiness, and substandard professional performance are rare, unlikely, and, accordingly, unexpected phenomena. Yet these phenomena do occur even under the favorable conditions of rule-of-law-based public bureaucracies in rich democracies with sound accountability structures. Bridges and buildings collapse, claiming the lives of many people and leaving many more injured, although public licensing and oversight authorities are in place and sufficiently equipped with professional staff. Vulnerable groups are exposed to structural and personal violence despite the supervision of care facilities and of critical conditions in families. Mass events get out of control despite licensed crowd management plans and sufficient security staff on the ground.

Table 1: Public Bureaucracies and Human Security: Sensitive Fields and Exemplary Cases
Field of Regulation and Implementation | Human Security Issue | Administrative Failure: Example and Source
Construction Supervision and Licensing | Collapsing Buildings and Bridges | Collapse of the Canterbury TV Building, Christchurch, New Zealand, 2011. Canterbury Earthquakes Royal Commission, Volume 6: Canterbury Television Building (CTV), http://canterbury.royalcommission.govt.nz/Final-Report-Volume-Six-Contents
Child Protection | Child Abuse | North Wales Child Abuse Scandal. Department of Health, Lost in Care: Report of the Tribunal of Inquiry into the Abuse of Children in Care in the Former County Council Areas of Gwynedd and Clwyd since 1974 (“Waterhouse Report”), The Stationery Office, February 2000, webarchive.nationalarchives.gov.uk
Fire Protection | Fire & Wildfire | Fire at the Detention Centre Schiphol-Oost, 2005. Report of the Dutch Safety Board, https://www.onderzoeksraad.nl/en/page/392/brand-cellencomplex-schiphol-oost-nacht-van-26-op-27-oktober
Licensing and Managing Mass Events | Panic | The Hillsborough Disaster, 1989. The Report of the Hillsborough Independent Panel, ordered by the House of Commons to be printed on 12 September 2012, HC 581, London: The Stationery Office, https://www.gov.uk/government/publications/hillsborough-the-report-of-the-hillsborough-independent-panel
Law Enforcement, Domestic Intelligence | Improper Criminal Investigation | Serial Killings of Immigrants, Germany, 2000–2007. Deutscher Bundestag, 17. Wahlperiode, Beschlussempfehlung und Bericht des 2. Untersuchungsausschusses nach Artikel 44 des Grundgesetzes (“NSU-Untersuchungsausschuss”), Drucksache 17/14600, 22.08.2013
Disaster Preparedness & Relief | Insufficient Preparedness & Response | Hurricane Katrina, 2005. A Failure of Initiative: Final Report of the Select Bipartisan Committee to Investigate the Preparation for and Response to Hurricane Katrina, 109th US Congress, Report 109-377, https://www.gpo.gov/fdsys/pkg/CRPT-109hrpt377/pdf/CRPT-109hrpt377.pdf

Source: Author’s compilation

So it would be a reasonable normative statement to declare public bureaucracies HROs as soon as their action or inaction impacts human security. In what follows, I will argue that while this normative impulse is entirely justified, it misses the point as far as the nature of public bureaucracy itself is concerned. Outside the immediate realm of critical infrastructure, democratic rule-of-law-based government with its deeply echeloned administrative substructure has to do justice to a variety of partly convergent and overlapping, partly divergent and rival requirements. As Boin and Schulman (2008) pointed out, it is this very multiplicity of purposes and of both functional and political necessities that may relativize the imperative of human security. Elected and appointed officials may find themselves exposed to counterincentives that make them neglect “safety first” principles even when those principles are laid down in legal prescriptions and clear-cut stipulations.

More specifically, the argument is that an ontologically more realistic concept of public bureaucracies as HROs has to be more nuanced and differentiated on the epistemological end as well. For that purpose, conventional high reliability theory is undoubtedly helpful but only to a limited degree. Standard HRO theory strongly emphasizes the necessity of general awareness and mindfulness (Hopkins 2007; Weick and Sutcliffe 2007) in acknowledgment of the actual relativity of safety as an organizational goal and the resulting necessity to renegotiate related standards and practices (Schulman 1993; Roe and Schulman 2008). Yet that strand of literature remains focused on two main areas and challenges: on the one hand, critical infrastructure, and on the other hand, crises, emergencies, and high-risk technologies. Accordingly, the fact and the systemic nature of risks to human security emerging out of regular public bureaucracy under regular circumstances remain unaddressed.

A similar though complementary weakness characterizes the main “competitor” of HRO theory, normal accident theory (NAT) (Perrow 1984; 1986, 146-154). Much of what goes wrong in public administration at the expense of safety and human security does indeed have its roots in the normal structural risk zones of bureaucratic organization, such as a lack of flexibility due to hierarchical governance and a horizontally fragmented division of labor, the proverbial red tape, and rigid standard operating procedures. By the same token, however, those risk zones cannot be eliminated in the way Charles Perrow recommended in his classic Normal Accidents of 1984, namely through the elimination of an entire organizational field. After all, bureaucracy epitomizes division-of-labor-based professionalism and incorruptible rule-of-law standards in accordance with Max Weber’s classic characterization (cf. Kettl 2006; Olsen 2008).

Rather, control and containment of risks to life and limb under the condition of public bureaucracy is about the conscious control of standard organizational pathologies that themselves are inevitable and irremovable but, for this very reason, require focused attention very much in the vein of HRO theory and the quest for mindfulness. That mindfulness, however, needs to take into account the nature of public bureaucracy and its political embeddedness in a democratic polity in an attempt to define as accurately as possible the very points of intervention at which human action can neutralize risks and avert their disastrous consequences.

Hence the structure of the article. In the subsequent section, I will briefly summarize what HRO theory and NAT do and don’t explain when it comes to standard risk zones of public bureaucracies with potential consequences for human security. That entails a brief discussion of strengths and weaknesses of those theories and how the weaknesses could be overcome while the strengths could be combined to the benefit of the analysis of public bureaucracies as potential HROs. From that, I will derive my own argument that the risk zones of public bureaucracies as the locus of failure with severe consequences for human security appear in two distinct shapes: avoidable standard pathologies and inevitable trade-offs. Both will be illustrated through a case of organizational and managerial failure in a German municipal agency that claimed human lives and left hundreds of people injured.

The main theoretical statement of this article is that HRO theory and NAT can be fruitfully combined in process tracing analyses of disasters based on the diagnosis of general causal mechanisms and their configuration. It is through a differentiated mechanism-based perspective that general points of intervention may be identified at which, to the benefit of human security, bureaucratic standard pathologies and inherent trade-offs can be neutralized as risk generators.

1. Standard Theories for Standard Risks: Approaches to High Reliability Organizations and Normal Accidents

The very notion of high reliability organizations (HROs) gained prominence in the early 1990s in the wake of major disasters affecting critical infrastructure, primarily major blackouts such as those in New York City in July 1977 and in San Joaquin County, California, in 1982 and, above all, the Three Mile Island nuclear power plant disaster of 1979. These incidents made the actual reliability of power plants and grids, and the consequences of their malfunction, an issue of both public policy and scholarly literature (cf. the seminal article by LaPorte and Consolini 1991).

That literature soon discovered and discussed a fundamental dilemma of risk management and, consequently, of the management of organizations dealing with high-risk technology: On the one hand, improving performance for the sake of security requires learning and, accordingly, an intra-organizational culture of admitting, acknowledging, and thus, within a certain bandwidth, tolerating errors for the sake of learning and the piecemeal improvement of performance. On the other hand, accepting errors for the sake of learning implies trial-and-error tactics that are simply too costly and ethically unacceptable when “error” translates into immediate threats to human life and limb. Accordingly, similar to what in urban policing is known as “broken windows” theory, even minor errors have to be avoided from the very outset in accordance with zero-tolerance principles.

From this explicitly or implicitly acknowledged dilemma, two lines of thought emerged. One was HRO theory proper; the other was normal accident theory, based on Charles Perrow’s trailblazing analysis of the Three Mile Island nuclear plant disaster of 1979 (Perrow 1984; cf. also Shrivastava, Sonpar, and Pazzaglia 2009). Perrow’s analysis was radical by virtue of both its exclusive focus on organizational structure and its normative conclusions. According to Perrow, organizational fields with particularly risk-generating structural properties should be defined as a kind of no-go area beyond acceptable structural choices. Not surprisingly, nuclear power plants served as a plausible example. However, what made Perrow’s argument compelling was its theoretical justification. It was based on the distinction between two basic dimensions of structural properties: the degree of coupling of organizational subunits and the mode of interaction between them. Loosely coupled organizational systems allow for complex interaction since errors in one subunit do not instantly impact other subunits. Tightly coupled organizational systems, by contrast, are both indispensable and manageable when it comes to linear interaction without particular necessities of time-consuming mutual adjustment. But the combination of tight coupling and complex interaction, Perrow argued, represents an unsolvable dilemma: Tight coupling requires centralized governance for the sake of effective coordination. Complex interaction, however, requires decentralization for the sake of cooperative governance and mutual adjustment. Since centralization and decentralization are mutually exclusive, organizations combining tight coupling and complex interaction are, according to Perrow, high-risk organizations from the very outset (Perrow 1986, 146-154). To tolerate them anyway creates a permissive environment for what he termed “normal accidents”—accidents that are just as normal as the structural risk zones they emerge from.

HRO theory, by contrast, directly addresses the dilemma of simultaneous indispensability and unaffordability of trial-and-error learning techniques for the improvement of organizational performance in a high-risk environment. There are two variants of this rather optimistic school of thought when it comes to organizational risk management. One is the modified error-tolerance school, the other the mindfulness school.

Roe and Schulman (2008) reject what they call the “misguided … hard-and-fast distinction between error-tolerant and error-intolerant organizations” and suggest instead adopting some elements of what in a different strand of literature had been termed the “error-tolerant organization under failure-tolerant leadership” (Roe and Schulman 2008, 109; with reference to Farson and Keyes 2002). A core element, here, is to admit one’s own mistakes rather than covering up or shifting the blame, fostered and facilitated by a culture of leadership that “routinely reinforce(s) the company’s mistake-tolerant atmosphere by freely admitting their [the management’s] own goofs.” And they add: “The same can be said, almost word by word, for HROs” (Roe and Schulman 2008, 109). Nonetheless, these authors state, rather than treating error tolerance as an end in itself, it must be tied to the ultimate and indispensable goal of avoiding error altogether: “Error tolerance must be vital to eventual error intolerance, that is, ‘learning to avoid failure’” (Roe and Schulman 2008, 110). By the same token, according to these authors, organizational reliability is not as nonnegotiable as it may appear from the perspective of conventional wisdom. The constant balancing of tolerance and intolerance when it comes to errors implies constant mutual adjustment of rigidity and partial pragmatism on a learning-by-doing and case-by-case basis. Again, the point here is that strict inflexibility and rigidity even for the sake of human security may be ultimately as detrimental as sloppiness and neglect in real organizational life. In their latest book, Roe and Schulman (2016) stress the importance of multiple reliability standards and the indispensable role of accomplished leaders with expertise and an incorruptible professional ethos when it comes to walking the tightrope between error tolerance and error intolerance.

It is here that the writings of the pioneers in the field overlap with a second wave of HRO literature that can be denoted as the “mindfulness” school. Unlike classic HRO literature, which necessarily focuses on a particular organizational field and, in a normative vein, on the establishment of learning patterns devoted to reliability, the “mindfulness” school focuses on high-risk environments (cf. Hopkins 2007, 2009) and/or crisis and emergency situations (Weick and Sutcliffe 2007). While Hopkins (2007, 103-125) emphasizes the necessity of standardizing risk assessment through what he terms the TARP philosophy (Triggers and Corresponding Action Response Plans), Weick and Sutcliffe suggest a particular technique of coping with the dilemma; as they put it, “you can’t just shut your organization down while you figure out ways to make it more reliable and resilient” (Weick and Sutcliffe 2007, 139). To that extent, their approach resembles Roe and Schulman’s emphasis on learning and the productive balancing of error tolerance and error intolerance. Weick and Sutcliffe’s key notion is the “small wins strategy,” a plea for small and even “opportunistic” (Weick and Sutcliffe 2007, 139) steps in the “right direction” toward less risk-prone organizational structures and managerial attitudes. It is also reminiscent of the “muddling through” philosophy developed in political science in the late 1950s (Lindblom 1959), but, unlike Lindblom, Weick and Sutcliffe maintain that the “small wins” have to have a distinct and nonnegotiable objective, which is reliability for the sake of security. They end up with recommendations for mindful managerial attitudes that center on the preoccupation with failure rather than positive (and positivist) portrayals of high reliability principles and practices.
It is, according to Weick and Sutcliffe, about “mistakes that must not occur,” “creating awareness for vulnerability,” “creating an error-friendly learning culture,” and explicitly addressing the question of whether something that almost went wrong (a “near miss”) “is a sign that your system’s safeguards are working or as a sign that the system is vulnerable” (Weick and Sutcliffe 2007, 151–52).

When it comes to a comparative evaluation of the strengths and weaknesses of normal accident theory and high reliability organization theory, it is complementarity rather than rivalry or mutual exclusion that prevails. The strength of NAT is its focus on structural risk zones, emphasizing that particular organizational fields entail particular risks of failure and even disaster. After all, some organizational structures are more permissive than others when it comes to lack of coordination, information asymmetry, organized irresponsibility, et cetera. Even though Charles Perrow’s original diagnosis of particular types of structural complexity might be criticized as insufficiently nuanced and differentiated, the perspective on organizational structure as a risk factor in its own right remains analytically productive. NAT’s weak flank is obvious too, though: the neglect of human agency and coping capacity. As Scott Sagan pointed out in his account of The Limits of Safety (1993), human agency is a critical and indispensable factor in keeping even extreme risks to human safety under control, an argument specifically directed against Perrow’s somber portrayal of human inability to handle the risks of nuclear energy in both the civil and the military context. Sagan’s account also covers, however, the near miss or close call phenomenon, the absence of disasters that almost happened: undetected near misses may create the illusion of safety in systems that are in fact vulnerable (cf. Sagan 1993, 250-279).

Conversely, the emphasis on human agency in the form of learning and mindfulness is what characterizes the strength of HRO theory. Both the traditional school of thought connected to the work of LaPorte, Roe, Schulman, and others and the “mindfulness” school represented by Hopkins or Weick and Sutcliffe emphasize the indispensability of awareness, anticipation, and preparedness and thus the critical cognitive and actor-related factors of human efforts to enhance organizational reliability and to reduce structural risk zones as potential threats to human security. By the same token, however, a characteristic weakness of the HRO literature is its emphasis on reason and pedagogy and its neglect of organizational structures, their wider institutional embeddedness, and the related incentive structures. After all, mindfulness might not be enough. It might not even be helpful as an actual strategy when it comes to risks that are inherently structural in nature and to counterincentives that prevent real-world actors from learning and drawing consequences from the proverbial “lessons learned” (cf. March and Olsen 1975; Argyris 1980 for skeptical assessments of organizational learning capacities).

Moreover, neither NAT nor HRO theory contributes very much to the understanding of public bureaucracies as risk generators in their own right when it comes to threats to life and limb. The next section outlines the peculiar risk-generating components of public bureaucracies and both the deficiencies and the potential of NAT and HRO theory for related analyses.

2. Public Bureaucracies as Risk Generators: Avoidable Standard Pathologies and Inevitable Trade-Offs

Public bureaucracies are formal organizations, just like private businesses (cf. Puranam 2018), from which they differ substantially, however, in terms of resource mobilization, production, and output (cf. Kettl 2006; Olsen 2008). The resources come from individual and corporate taxpayers, the “production” of public goods and services is regulated by legislation and protocol, and the output has to be indiscriminate and evenly accessible to everybody according to standardized eligibility criteria. So on the one hand, public bureaucracies are exposed to the standard pathologies of any formal organization, such as principal-agent problems (bottleneck issues at the top of hierarchies), departmentalization and, consequently, selective perception and negative coordination, groupthink phenomena, learning constraints, et cetera.

On the other hand, however, there are trade-offs peculiar to public bureaucracies embedded in a democratic polity. One is connected to the necessity of maintaining overall legitimacy and comes to bear as a tension between responsiveness and responsibility (Peters 2014): public agencies need to respond to what citizens would like them to do, especially when, at the local level, the action or inaction of authorities impacts directly on the conditions of everyday life. Hence the notions of the “listening bureaucrat” (Stivers 1994) and of “street-level bureaucracy” (Lipsky 1980): public agencies need to be responsive. By the same token, however, public agencies and their representatives cannot be held accountable by individuals or social groups in their immediate societal or political environment alone. They are bound by legislation and are to be held accountable in accordance with legal and professional standards. Just as the rigidity of rule-boundedness and protocol entails the risk of inadequate reaction to individual and local requirements, unrestrained responsiveness entails the risk of arbitrariness, clientelism, and corruption. Hence a trade-off between responsiveness and responsibility, which is inevitable and irremovable since it is part and parcel of public bureaucracy and of a democratic political order. It is only in authoritarian regimes or outright dictatorships that bureaucrats can afford to ignore the legitimate demands of citizens and legal and professional standards at the same time.

Yet another inevitable trade-off is the one between goal attainment and system maintenance (cf. Seibel 2019 with reference to Deutsch 1963, 182-199 and Talcott Parsons’s 1951 distinction of adaptation, goal attainment, integration, and latency/pattern maintenance). It is not necessarily limited to public bureaucracies and a democratic polity, but, again, the peculiarities of the political system impact decisively on the way the trade-off can possibly be handled by appointed and elected public officials.

Goal attainment is an easy-to-understand category since it is directly connected with the effectiveness of public bureaucracy. Whether or not and to what degree public policy is being implemented through its administrative substructure is necessarily an important issue of both political reality and scholarly analysis. As a matter of fact, policy effectiveness is the underlying normative yardstick of the entire subdiscipline of policy analysis. System maintenance refers to the complex machinery of governmental apparatuses at all levels. It is about internal coordination, day-to-day cooperation, adjustment of internal protocol, recruitment and education of staff, interface management, and “bureaucratic politics” in an attempt to mobilize resources and to defend or enhance one’s bureau’s standing and influence. Public bureaucracies are, to a large extent, occupied just with themselves, but this ostensibly unproductive activity is an indispensable prerequisite for maintaining the internal cohesion and stability of the administrative system as a whole.

System maintenance activity may collide with goal attainment efforts. Goal attainment may necessitate coordination, and coordination is a resource-absorbing and time-consuming process. Accelerating coordination processes for the sake of goal attainment may put strain on interpersonal and interorganizational relations whose stability has to be kept intact for the sake of system effectiveness in general. As far as ultimate effectiveness is concerned, overemphasizing goal attainment may be just as detrimental as overemphasizing system maintenance.

Both standard pathologies and principal trade-offs are risk factors of organizational failure in general and of failure at the expense of human security in particular. A decisive difference between the standard pathologies of formal organizations and the specific trade-offs of public bureaucracies in a democratic polity is, however, that the former can be influenced, and thus mitigated, by organizational design, while the latter are an integral and necessary component of the democratic polity and its complex bureaucratic substructure. This has consequences for the analytical relevance of HRO theory and NAT. Both are helpful for an appropriate understanding of organizational standard pathologies, but they cannot contribute very much to the understanding of the responsiveness versus responsibility or the goal attainment versus system maintenance trade-offs.

The emphasis on structural fault lines that characterizes NAT is helpful for identifying principal-agent problems connected with organizational hierarchy, notorious coordination problems resulting from a fragmented division of labor, the risk of diluted responsibilities in collaborative governance structures, et cetera. Conversely, building awareness of such structural fault lines and the pitfalls attached to them is what the HRO literature would suggest. Admittedly, NAT entails more limitations as far as viable strategies for coping with standard organizational pathologies are concerned. After all, the radical plea for the removal of excessively risk-prone organizational structures is not applicable to the necessarily bureaucratic substructure of government itself. Still, in support of a structure-conscious strategy of risk assessment, NAT may be instrumental in identifying risk-generating institutional arrangements of whatever type, which is compatible with what HRO theory suggests in terms of controlled error-friendliness for the sake of learning and systematic “mindfulness” for the sake of step-by-step risk reduction.

What neither NAT nor HRO theory has systematically addressed, though, are the peculiarities of both the standard pathologies and the inevitable trade-offs that characterize public bureaucracies. Neither of these influential theories was specifically designed for public sector organizations. Accordingly, both strands of literature treat organizations as changeable and their weaknesses and fault lines as restrictable. Public bureaucracies, however, are changeable only to a limited degree within the limits of the constitutional and otherwise legal order, while the very core of their bureaucratic structure is, by definition, not restrictable or changeable at all. Hierarchy as well as departmentalization, protocol as well as red tape, interagency cooperation as well as rivalry and competition for funds and competences remain part and parcel of public administration and, in that sense, standard pathologies that elected and appointed officials have to cope with rather than overcome, let alone eliminate. Even less eliminable are the trade-offs between responsiveness and responsibility and between goal attainment and system maintenance, although they too may qualify as risk factors in their own right when it comes to human security.

So the argument here is that precisely because of the robustness of standard pathologies and potentially detrimental trade-offs between performance standards of public bureaucracies, mindful coping in the sense of HRO theory matters. In its mainstream version, however, HRO theory remains unnecessarily vague as far as the peculiarities of public bureaucracies are concerned. Once the standard pathologies and the characteristic trade-offs of performance requirements of public sector institutions have been identified, the complementary strengths of HRO theory and NAT can be brought to bear in the form of more fine-grained analyses and for the sake of targeted interventions that may neutralize security risks altogether. Prior to the development of the theoretical argument itself, an empirical case is presented to illustrate its practical relevance.

3. An Illustrative Case: Standard Pathologies and Mismanaged Trade-Offs

On July 24, 2010, a techno-music street parade known as “Loveparade” in the German city of Duisburg ended in a crowd panic that claimed the lives of 21 people and left 652 injured (for a full account, see Seibel and Wenzel 2017). The responsible organizer was an event management firm, Lopavent GmbH (Gesellschaft mit beschränkter Haftung, the equivalent of a British or US limited liability company). The event had several predecessors, most of them in Berlin and two in the cities of Dortmund and Essen. As a street parade, it required the permission of public authorities, in this case the municipal administration of the city of Duisburg. Under the aegis of the head of the division of security and law (Dezernat für Sicherheit und Recht, or Dezernat II), the municipal administration convened a task force in charge of planning and preparing the Loveparade in September 2009. It initially consisted of representatives of the Duisburg city administration, Lopavent GmbH, the owner of the compound envisaged for the concluding segment of the Loveparade, and a public marketing firm, Wirtschaftsförderung metropoleruhr GmbH (cf. Document no. 2; see list of cited documents).

It soon became apparent that the envisaged compound was the critical factor as far as the security of the one million or more estimated visitors was concerned. Moreover, not only the compound itself but also the routes of access and evacuation turned out to be especially delicate since they led through a tunnel twenty-four meters wide with a single ramp branching off to the compound itself. That eighteen-meter-wide ramp had to serve as way in and way out, creating the obvious risk of congestion given the expected size of the crowd that would have to use it. The related security risks were clearly articulated by the task force. According to the records (cf. Document no. 4), this happened as early as October 2009.

It was also clear from the very outset, however, that the event enjoyed strong political support since it was an integral part of a publicity and marketing campaign not only for the city of Duisburg but for the entire Ruhr area. The Ruhrgebiet, the heartland of what used to be Germany’s coal mining and iron ore industry, is a disadvantaged region that nonetheless had been designated as a “European capital of culture,” a prestigious initiative of the European Union. The very Wirtschaftsförderung metropoleruhr GmbH was in charge of an entire program under the headline “Ruhr. 2010 Kulturhauptstadt Europas” (Ruhr. 2010 European cultural capital), of which the Loveparade was an integral part.

Political pressure to organize the Duisburg Loveparade 2010 under any circumstances increased when the city of Bochum, also located in the Ruhr area and thus participating in the very same program, canceled its own Loveparade scheduled for the summer of 2009. What was at stake in the perception of regional politicians was the prestige of the Ruhr area as a whole as far as the capability of planning and organizing a spectacular event with a particular appeal to young people was concerned. When concerns about security issues connected to the Loveparade were voiced by the head of the Duisburg police department, public criticism was so harsh that even the police chief’s resignation was demanded (Document no. 1). When the task force met on October 2, 2009, the head of division of Dezernat II, who chaired the meeting, explicitly reminded the participants that after the cancellation of the Loveparade in Bochum at short notice, the Duisburg Loveparade was definitely “politically desired” (cf. Document no. 3).

However, it was only in early March 2010, more than four months into the planning process, that the municipal administration of Duisburg realized that the design of the Loveparade (i.e., closed compound with limited access and evacuation routes) implied a transfer of jurisdiction for risk assessment and public permission to the office of regulation and supervision of construction (Bauordnungsamt). The Bauordnungsamt (or Amt 62, according to the organizational chart of the Duisburg city administration) clearly stated that permission could not be given for the envisaged event site; its officials also made it clear that violation of the relevant legal provision (the Sonderbauverordnung Nordrhein-Westfalen, or decree for special construction of the state of North Rhine-Westphalia) would make any official involved liable under criminal law (Document no. 5).

From this point on, four and a half months before the event in question, both the substantial security risks and the incompatibility of the conditions at the event site and the related legal stipulations were known to the officials in charge and documented. In the files of the Duisburg city administration, it was also clearly stated that an indispensable prerequisite for any permission was a formal application to be submitted by Lopavent GmbH with substantiated documentation of the relevant security measures (ibid.).

Notwithstanding, what followed was a protracted planning and preparation process in which part of the Duisburg city administration sided with the event management firm in a blunt attempt to manipulate the facts and figures and to obstruct the clear and binding stipulation of the relevant security regulation, while the responsible Amt 62 remained determined to enforce the law.

Ironically, the leading figure among those determined to ignore the law and to issue permission to organize the Loveparade under any circumstances was the head of the city’s division for security and law (Dezernat II). This man was a close associate of the mayor, who had unmistakably expressed his will to have the Loveparade take place in his city. His opposite number was the head of the division for urban development (Dezernat für Stadtentwicklung, or Dezernat V), to which Amt 62 belonged. He, however, kept a low profile and did nothing to buttress the position and action strategy of the administration belonging to his own jurisdiction. Hence a power asymmetry emerged in favor of those determined to push through permission for the event under any circumstances and to the disadvantage of those compliant with the existing safety regulation.

Under these conditions, Lopavent GmbH managed to outmaneuver the Bauordnungsamt. The head of Dezernat II, in transgression of his own jurisdiction, commissioned several separate expert reports that focused on crowd management issues. This was clearly intended to circumvent the unmistakable security stipulations of the law whose enforcement was, in turn, the task of Amt 62. None of the expert reports, submitted just a couple of weeks or even days (June and July 2010) before the Loveparade itself, referred to the relevant legal provisions (cf. Documents nos. 9, 10, 11). Moreover, they were vague and peppered with hedging clauses. Nonetheless, they served as justification for permitting the Loveparade, which was ultimately pushed through by the head of Dezernat II. Instead of backing the responsible unit of his own administration vis-à-vis an applicant (Lopavent GmbH), he even formed an alliance with that very private firm against the relevant security regulation and the administrative unit tasked with enforcing it.

Figure 1: Public-Private Coalition Building and Interagency Rivalry within the Duisburg Municipality in the Run-Up to the Loveparade Event of July 24, 2010

Source: Author’s compilation.

The borderline between a law-abiding public administration and private interests was not only blurred; the roles and competences of public and private actors were virtually inverted. In a meeting on June 18, 2010, the representatives of Lopavent GmbH admitted once more that they were able to guarantee no more than one-third of the overall width of evacuation routes on the event site that the legal security provisions required (cf. Document no. 7). While this was astonishing and yet another unmistakable warning that authorization of the Loveparade was simply not possible, the head of Dezernat II instructed, again in transgression of his own jurisdiction, the office of regulation and supervision of construction (Bauordnungsamt) to “cooperate” with Lopavent GmbH and to support the latter in the development of a security concept for the Loveparade scheduled for July 24, 2010. This meant not only provoking a conflict of interest—after all, the Bauordnungsamt as a public authority was now tasked with drafting a security concept that it subsequently would have to evaluate and certify—but also tasking Amt 62 with a job it was not competent to do; namely, the development of an accurate evacuation plan on the basis of crowd management data and simulation models it could not possibly have at its disposal.

The representatives of Amt 62 participating in the meeting of June 18, 2010, articulated precisely this, but to no avail. Their superior, the head of the division of urban development (Dezernat V), endorsed this stance with a handwritten remark in the margin of the report written by the head of Amt 62, stating that the envisaged procedure did not conform to proper administrative practice and that Dezernat II—instead of his own division—would have to make all the relevant decisions.1 This statement was right and wrong at the same time. While it was entirely correct that the envisaged procedure violated basic principles of proper administrative practice, the head of Dezernat V was wrong in assuming that it was at his personal discretion to concede the jurisdiction for the relevant decision to Dezernat II. On the contrary, it was his personal obligation to make sure that the Bauordnungsamt evaluated Lopavent’s security concept in accordance with the legal requirements and, if necessary, to deny authorization of the Loveparade as long as those requirements were not met. Instead, the head of Dezernat V bluntly refused to be involved in the relevant decision-making—a classic case of blame-shifting that was itself improper and an act of irresponsibility.

It was, ironically, the private event management firm Lopavent instead of the public authority actually in charge that filled the vacuum created by the failure of the head of Dezernat V to insist on a proper evaluation and certification of the security concept. Representatives of Lopavent GmbH participated in yet another meeting, held on June 25, 2010, and devoted to the unresolved security issues. The prime purpose of this meeting was not the development of a security concept itself but an agreement on how to evaluate such a concept. That agreement entailed the intent to commission another expert report on the evacuation plan to be developed, according to the previous instructions, by the head of the division of security and law (Dezernat II) in cooperation with Lopavent GmbH and the very Bauordnungsamt that ultimately would have to certify it. This agreement was, according to the minutes of the meeting of June 25, 2010, “approved” by the participating representatives of Lopavent GmbH (cf. Document no. 8). Thus, the private firm that, according to the legal provisions, was obliged to submit a security and evacuation plan and to have it evaluated and certified by the relevant public agency participated in deliberations whose subject was the very procedure of evaluation and certification whose result directly affected the private firm applying for permission.

Another telling detail of the replacement of what should have been an independent and unfettered examination by an irregular procedure was that the group of consulting engineers authorized to evaluate the security concept was, according to information provided on their website, a spin-off of the chair of the very professor of physics at the University of Duisburg who was himself authorized to evaluate and to certify the report of the engineers who were his former students and PhD candidates. Not only were these conflicts of interest not corrected or terminated; they were, instead, literally designed and organized by the Duisburg city administration with the obvious intent to suspend the regular procedure of an independent assessment of the security and evacuation concept for an event involving approximately one million visitors.

Exhausted by what may be called a war of attrition against an alliance of high-ranking public officials and the private event management firm, the Bauordnungsamt finally gave in and issued permission to hold the Loveparade as planned. That happened on July 23, 2010—twenty-four hours before the event. In revealing clarity, the wording of the authorizing permission made apparent that the security requirements of the relevant legal provision, the Sonderbauverordnung, were not met by the security concept submitted by Lopavent GmbH (Document no. 12). In issuing the permission anyway, the Bauordnungsamt, under the relentless pressure of the head of Dezernat II, invoked the right of administrative discretion whose existence it had explicitly denied so far (cf. Document no. 6, p. 5).

On the afternoon of July 24, 2010, panic broke out in the totally overcrowded tunnel leading to the event site and on the ramp that branched off to the actual compound where the final segment of the Loveparade was taking place. Most of the twenty-one casualties were caused by thorax contusion. The ramp, serving as access and exit at the same time, turned out to be a fatal trap without escape routes—a fact that was known to the private organizer and the relevant authorities from the very outset but that had not prevented the relevant authority from issuing a permission that should never have been given.

4. Case Analysis: Neither Normal Accident nor Insufficient Reliability Management

The Loveparade catastrophe was a man-made disaster in the classic sense. It did not happen by accident. Nor did it happen because of carelessness or a lack of skill among those involved in planning and preparation. Neither was it the consequence of structural complexity of governance, bureaucratic inflexibility, “wicked problems,” or unsolvable managerial issues. It happened because key actors within the relevant municipality purposefully undermined and successfully circumvented the relevant security regulation. This they did with great skill and, yes, with mindfulness and agility in a quasi-entrepreneurial style. What the prehistory of the disaster reveals is, rather, the mutual reinforcement of inevitable trade-offs among legitimate performance parameters, on the one hand, and standard pathologies of any bureaucratic organization, on the other. What were the actual conditions of this vicious circle, why was it not neutralized by the officials in charge, and what are the relevant theoretical implications for NAT and HRO theory?

The standard pathologies and, thus, “normal” risk structure in the Perrow sense were connected to the duality of intra-administrative departmentalization, on the one hand, and the hybrid arrangement of municipal authorities and a private event management firm—a classic public-private partnership (PPP)—on the other hand. Frictions and, step by step, open conflict broke out between the office (Amt 62) directly in charge of licensing the Loveparade event according to the relevant security regulation and a division of the Duisburg city administration at the interface between the intra-administrative security screening, the political leadership of the municipality, and the private event management firm. None of this was unusual or unmanageable. It was just a matter of priorities. And the priority should have been crystal clear: safety first. According to the relevant safety regulation, this would have implied denying permission to hold the Loveparade event right away.

So the obvious question is why high-ranking municipal decision makers did not do what they were undoubtedly supposed to do. One simple answer is that they did not realize that, under the given circumstances, they had to act as representatives of a high reliability organization. Not only was it entirely obvious that at a mass event of the anticipated scale, on a compound of limited size and with narrow access and exit routes, the protection of physical integrity was of prime importance from the very outset; it was also beyond reasonable doubt that the event could not be organized in accordance with the unmistakable legal safety regulation.

Under these circumstances, the standard argument of mainstream HRO theory according to which the officials in charge should have been more “mindful” would be misleading and therefore useless. After all, there were rank-and-file officials within the Duisburg municipal administration who, as civil servants obedient to the law, were entirely “mindful” and determined to insist on compliance with the legal safety stipulations and to deny permission for the Loveparade event. The problem was that their superiors were not willing to listen to them. Instead, these higher-ranking officials mobilized all the political and organizational resources available in the attempt to obstruct the regular vetting and verification procedures for the planned mass event. It is not that they were not smart enough to understand what was at stake. Rather, they were acting under the influence of strong counterincentives that prevented them from doing what they, according to all likelihood, would have usually done, which is just to enforce the law and to follow bureaucratic protocol.

One could argue, in accordance with Perrow’s normal accident theory, that one of those counterincentives was the hybrid arrangement itself, in the form of a public-private partnership tasked with the planning of the event. After all, this partnership meant sharing not just tasks and jobs but also jurisdiction and responsibilities and, accordingly, it implied particular requirements of coordination and reintegration. Undeniably, public-private partnerships generate particular coordination problems due to fragmented jurisdiction, diluted responsibilities, and opportunity structures for blame avoidance (Hood 2011). However, during the run-up to the Duisburg Loveparade, those coordination and reintegration problems were neither new nor particularly complex. Accordingly, they could have been easily solved, especially since the decisive ingredient of unmanageable complexity, in the Perrow sense, was missing—namely, tight coupling. Decision-making stretched over a long period of time and was subject to lengthy deliberation. The decision to ignore and thus violate unmistakable security regulation in order to pave the way for the Loveparade event under any circumstances was made consciously and deliberately. What could be easily anticipated, though, was that denying permission to hold the Loveparade would trigger massive negative reactions from the general public. This must have been a much stronger impulse shaping the motivation of the higher-ranking public officials than the objections of their own subordinates and the “bureaucratic” logic of the legal restrictions that made permitting the Loveparade virtually impossible.

So, quite ironically, the relevant key actors were “mindful” in their own way. Certainly, these municipal officials mismanaged the structural fault lines of a fragmented intra-administrative jurisdiction and an indispensable cooperation with a private firm, but that was due to their mishandling of the trade-off between responsiveness and responsibility, on the one hand, and goal attainment and system maintenance, on the other hand. The higher-ranking officials were determined to respond to the expectations of political stakeholders to make the high-profile Loveparade event possible, literally against all odds. They were determined to demonstrate their ability to overcome “bureaucratic” obstacles even if those obstacles consisted of legal constraints designed to guarantee human security. In other words, in all their mindfulness, they lost their sense of responsibility. And, consequently, they made goal attainment the absolute priority at the expense of system maintenance since system maintenance would have implied protecting the integrity of the “bureaucratic” vetting and licensing process. Instead of fending off the political pressure to dilute the licensing criteria, the pressure was amplified and focused at the same time, targeting the licensing office (Amt 62) as the weakest link in the chain of hierarchy. Moreover, a quasi-entrepreneurial style of goal-oriented decision-making was much easier to sell on the political market than sober-minded bureaucratic decision-making according to some legal prescriptions.

The question of why public officials set wrong priorities when balancing the trade-off between responsiveness and responsibility and/or between goal attainment and system maintenance is beyond the grasp of NAT and conventional HRO theory. Normal accident theory is good at identifying risk-generating organizational structures as permissive conditions of public mismanagement, but it does not address the ability of human actors to realize precisely this and to develop successful coping strategies for the sake of risk reduction. By contrast, the ability to be mindful and to learn is precisely what HRO theory emphasizes. However, HRO theory cannot explain why even mindfulness may be based on priorities that ultimately undermine rather than strengthen organizational reliability.

Yet combining the relative strengths of both theories is helpful for a better understanding of organizational failure with severe consequences and, accordingly, generalization for the sake of prevention. While NAT is helpful in identifying risk-prone structures, HRO theory is helpful in identifying both counterstrategies and counterincentives to develop mindsets designed to neutralize risks. What the present article suggests is that combining these complementary strengths requires a particular analytical perspective that focuses on actual causal processes rather than on structural variables or abstract normative statements. It is only through a process analytical perspective that both the permissive conditions and their actual impact on risk-generating human action and the relevant points of intervention or critical junctures can be identified. In what follows, a related explanatory model and its application to the Duisburg Loveparade disaster will be described.

5. An Alternative Perspective: Causal Mechanisms and Points of Intervention

In their respective conventional versions, NAT and HRO theory focus on either structural properties (NAT) or behavioral attitudes (HRO theory) as risk factors and frames of reference for risk reduction. The common blind spots of these theories are (1) the actual risk- and failure-generating “machinery” of organizations and (2) the points of intervention at which human actors can actually stop that machinery from generating undesirable results, especially when it comes to risks to human security. The analytical challenge, then, is to identify predictable elements of risk-generating organizational machineries—in other words, causal mechanisms—and, likewise, typical points of intervention. One way to address that challenge is to systematize the linkages between structural permissive conditions, related risk-increasing patterns of human action, and the actual materialization of risks in the form of disastrous outcomes. These linkages or joints mark the points of intervention at which disastrous causal processes can, in principle, be neutralized.

A model that responds to these analytical requirements is the distinction of layered causal mechanisms developed by Hedström and Swedberg (1998) and Hedström and Ylikoski (2010), in combination with Mario Bunge’s (1997, 2004) characterization of mechanisms as being systemic in nature. Hedström and Ylikoski distinguish three types of mechanisms.

Figure 2: Typology of causal mechanisms

Adapted from Peter Hedström and Petri Ylikoski, “Causal Mechanisms in the Social Sciences,” Annual Review of Sociology 36 (2010): 49–67 (59), with author’s addenda.

This analytical scheme is derived from what in the jargon of social science methodology (methodological individualism, in particular) is known as Coleman’s bathtub (or Coleman’s boat): the quest for disaggregated causal analytic steps in an attempt to identify at the micro level of individual human agency the origins of causal effects at the aggregate (or macro) level (Coleman 1990, 10). What Hedström, Swedberg, and Ylikoski add to Coleman’s original “bathtub” is, first, the language of “mechanisms” previously introduced by Jon Elster (1989) and, second, an explicit account of the disaggregation and reaggregation of causality. Hence three types of mechanisms: action formation mechanisms as core mechanisms shaping individual human agency embedded in, and shaped by, situational mechanisms and ultimately turned into an aggregated result through transformational mechanisms.2

Moreover, Mario Bunge’s (1997, 2004) particular contribution to the conceptualization of causal mechanisms is the reference to the system specificity3 of mechanisms in the first place. We do not expect biochemical mechanisms to occur in a social system, nor do we expect social mechanisms to occur in a mechanical system. The ontology of systems of any sort implies specific mechanisms that, under particular circumstances, may turn out to be causal mechanisms for various outcomes. Formal organizations, by their very ontological nature, entail hierarchy, and hierarchy implies principal-agent problems (Akerlof 1970; Grossman and Hart 1983). Moreover, organizations operate on the basis of division of labor, which implies requirements of coordination and related risk zones of coordination deficiencies (Thompson 1967). Decision-making in organizations, as long as it is not strictly rule bound, usually takes place in groups, which implies the risk of groupthink in the sense of deviant opinions being restrained or suspended by social pressure or implicit hierarchy (Janis 1972).

What Bunge also emphasized—unlike the mainstream literature on causal process tracing and causal mechanisms (cf. Beach and Pedersen 2019; Bennett and Checkel 2015)—is that mechanisms in social systems are not visible, so that they can only be conjectured. It is only on the basis of appropriate theorizing that we are able to identify mechanisms in social systems, regardless of whether the analysis is descriptive or causal analytic in nature. Principal-agent problems cannot be observed, deficiencies of coordination or the logic of bargaining processes are not visible either, and neither is groupthink or a prisoner’s dilemma. On the basis of an appropriate diagnosis of the actual ontology, however, visibility is reached through conjecturing. Disputes may remain, such as the debate on whether “authority” actually exists or can be redefined as a quasi-contractual relation (cf. Alchian and Demsetz 1972 for the latter position and Arrow 1974 for the former), but this is, after all, what theoretical reasoning is about.

What makes, again, the analytical differentiation of situational, action formation, and transformational mechanisms in the Hedström, Swedberg, and Ylikoski sense particularly helpful for understanding the failure of risk assessment and risk control is the exposure of points of intervention and their implicit ambivalence. Situational mechanisms that either represent or aggravate standard pathologies of organizations may be tolerated as long as they are neutralized at the level of the action formation mechanisms. Similarly, risk-increasing action formation mechanisms may be tolerated as long as they are neutralized at the threshold separating them from “transformational” mechanisms.

Accordingly, there are two points of “natural” intervention, one at the interface between situational mechanisms and action formation mechanisms and one at the threshold between action formation mechanisms and the actual occurrence of the outcome. Situational mechanisms, due to necessary division of labor or collaborative governance, may split up jurisdiction and dilute responsibility, but these potentially harmful forces can be checked by existing protocol or mindful leaders, or both. And even if that is not the case and risk-increasing behavior does take place, fateful consequences can be averted when no transformational mechanisms are available or mobilized. This is actually the logic of the “near miss” phenomenon, characterized by risk-increasing mechanisms whose impact is finally neutralized, even if sometimes at the very last moment (cf. Hayes 2009, 124-125).

Hence a productive complementarity of the NAT and HRO theories: it might be precisely “mindfulness” in the Hopkins (2007) and Weick and Sutcliffe (2007) sense, through which harmful effects at the interface between situational mechanisms and action formation mechanisms and at the threshold between action formation mechanisms and transformational mechanisms are neutralized while the situational mechanisms themselves might represent exactly the type of structural fault lines whose inevitability is emphasized in Perrow’s classic version of NAT. In the real world of an ongoing process of decision-making, actors may or may not realize that the point of intervention exists, but even when they do realize its existence, it remains uncertain if and how they make use of related cognitions and recognitions. This ambivalence is exactly what skeptical theories of organizational learning have underlined (Argyris 1999; Arrow 1974; Janis 1972; March and Olsen 1975).

6. A Case Reanalyzed

Applied to the run-up to the Duisburg Loveparade disaster, the differentiation of three different types of causal mechanisms combined with the productive complementarity of the NAT and HRO theories reveals not only distinct categories of mechanisms but also the very points of intervention at which the path to disaster could have been interrupted—which, however, did not happen.

There were clearly permissive conditions in terms of organizational structure and, particularly, a political climate that paved the way to the neglect of security issues and the final issuance of a permission that, according to the law and the expertise of the relevant officials in charge, should never have been given. What theory-based conjecturing discovers is what Crozier (1963) has described as the use of remaining zones of uncertainty in bureaucracy for the mobilization of power (cf. also Mintzberg 1983), thus politicization, and what is known in behavioral psychology and economics as the negative impact of a multitude of actors on the readiness to assume responsibility (Darley and Latané 1968; Wallach, Kogan, and Bem 1963) and the temptation of blame-shifting (Bartling and Fischbacher 2012; Hood 2011). The politicization affected the process of administrative licensing, while the diffusion of responsibility resulted from the public-private partnership between the Duisburg municipality and the private event management firm. Both factors interacted: without the politicization of what was basically a banal process of administrative verification and approval, the event management firm would have been just one applicant among many others asking for some sort of authorization or permission. In that case, no implicit coalition would have emerged between a private business and one part of the city administration against a different part of the same administration. In reality, however, this was precisely what happened. What should have been clear and distinct patterns of division of labor and related responsibilities was blurred and diluted at the expense of the actual integrity of the city administration as the guardian of the public interest in human security. So politicization and diffusion of responsibility turned out to be the very situational mechanisms that shaped the patterns of action at the actual decision-making level.

What happened at the level of decision-making was that two opposing camps within the Duisburg city administration were competing for influence, a fight in the course of which the micropolitical entrepreneurship of a head of division who did not even have jurisdictional competence relentlessly pressed for permitting the Loveparade under any circumstances. What one recognizes here as action formation mechanisms is what Downs (1967, 109–110) has denoted as a zealot-style attitude, Finer (1941) as overfeasance, and Kingdon (1984) as policy entrepreneurship. In addition, signs of groupthink (Janis 1972) were unmistakable. It was the personal zeal of the head of Division II, the lord mayor’s right-hand man, that shaped the climate of internal meetings in which the jurisdictionally competent officials soon found themselves on the defensive. In the final stage of what may be called a war of attrition, nobody dared to raise objections anymore against a clearly illegal strategy to make possible a public event whose security status was more than dubious. The combination of zeal and groupthink made the reversal of an ultimately disastrous process of decision-making increasingly unlikely.

And yet, even at that stage, the process itself was still not unstoppable. It could have been stopped by the immediate superior of the officials in the licensing authority, the head of Division V. This, however, would have required overcoming strong path dependencies (David 1985) through determined leadership in defense of institutional integrity (Selznick 1957, 118–133) as well as a clear and incorruptible sense of responsibility (Friedrich 1940). The fact that this did not happen turned out to be the sufficient condition for a permission that cleared the way to disaster. The actual transformational mechanisms through which the relevant series of decision-making passed the proverbial point of no return were path dependency and a blunt lack of leadership. A decision to stop the run-up to a popular mass event at the eleventh hour would have implied having wasted a considerable investment of both financial and political capital. All the political, organizational, and monetary efforts to make the Loveparade possible would have been in vain. Anticipating precisely this, it would have taken not just sober judgment but, above all, a tremendous amount of courage and resolve to cancel the event (Bruttel and Fischbacher 2013; ’t Hart and Tummers 2019, esp. 50–51). The official in charge, the head of Division V, had none of these personal traits.

The definition of different types of causal mechanisms as depicted in Figure 3 also makes discernible the relevant points of intervention and the consequences of nonintervention. The opportunity to neutralize the harmful effect of the situational mechanisms was not just missed. Rather, the impact was reinforced through purposeful action. Still, the resulting action formation mechanisms did not make a disastrous outcome inevitable. It took a second missed opportunity at the threshold between risk-taking behavioral attitudes and actual decisions to activate the transformational mechanisms triggering the actual occurrence of the disaster.

Figure 3: Causal Mechanisms Duisburg Loveparade with Points of Intervention/Missed Opportunities

Source: Author’s compilation.


Not only can we discern the points of intervention; we also know the individual actors who missed the opportunity to make use of them. This, in turn, implies not only pinpointing personal responsibility but also building assumptions about motivational forces. It would be overly simplistic to classify the apparent neglect as recklessness and blundering. It was not just personal failure that made high-ranking and accomplished public officials purposefully violate unmistakable security regulations. Those officials had to respond to divergent requirements of justification and legitimacy shaped by the trade-offs between responsiveness and responsibility and between goal attainment and system maintenance. Within that framework of legitimization, the immediately responsible officials made entirely plausible choices. They prioritized responsiveness over responsibility and goal attainment over system maintenance. The head of Division II of the Duisburg city administration, when pressing for issuing permission for the event under any circumstances, responded to the political expectations of the overall public. The head of Division V, in denial of his own skepticism, ultimately did the same thing. We do not know anything about their cognitive dissonances or whether they felt any. If so, they could have mitigated them by referring to the necessity to do justice to what was politically requested and to avoid what would have frustrated hundreds of thousands of people—namely, canceling the popular mass event named “Loveparade” altogether. Assuming responsibility for actual law enforcement and protecting the institutional integrity of the licensing office in charge would have been costly in terms of reputation and personal standing. Normative quests to mindfully resist those incentives are entirely appropriate, but a more promising approach is to make their occurrence less likely.

Conclusion

While some authors have generally challenged the applicability of the high reliability concept to public sector organizations (cf. esp. Boin and Schulman 2008), this article maintains that public bureaucracies should and can be treated as high reliability organizations wherever risks to life and limb are involved. It lies in the very nature of bureaucracies that their operations are standardized and regulated, which implies a limited range of both standard procedures and standard pathologies and, consequently, typical patterns of risk-generating mechanisms. Conceptualizing and differentiating those mechanisms is useful for developing a more fine-grained variant of both normal accident theory (NAT) and high reliability organization (HRO) theory, taking into account the very standard pathologies of public bureaucracies and inevitable trade-offs connected to their political embeddedness in democratic and rule-of-law-based systems. This, the article argues, makes it possible to identify distinct points of intervention at which permissive conditions with the potential to trigger risk-generating human action can be neutralized while the threshold that separates risk-generating human action from actual disaster can be raised to a level that makes disastrous outcomes impossible or, at least, less probable.

It is quite in the vein of Normal Accident Theory (NAT) that permissive conditions in the form of situational mechanisms and standard pathologies of a given organizational arrangement can be diagnosed as risk zones connected to particular institutional structures and their societal and political environments. Such diagnoses require a distinct ontological definition of the organizational system at hand and, hence, the basic mechanisms that “make the system work” (Bunge 2004) and, by the same token, may also indicate standard pathologies and trade-offs. In the present article, this is illustrated for public bureaucracies whose ontological nature is characterized not only by hierarchy, division of labor, legal prescriptions, and related standard operating procedures but also, for the sake of effectiveness and legitimacy, by trade-offs between responsiveness and responsibility and between goal attainment and system maintenance—institutional integrity, in particular.

In the reality of public bureaucracies, those trade-offs have to be balanced by human decision makers in an attempt to neutralize the undesirable effects of standard pathologies. As standard pathologies and typical trade-offs, they are, in principle, predictable as risk-generating factors. It is here that the emphasis on “mindfulness” as stressed in the HRO literature is helpful and analytically relevant. While, in real organizational life, that mindfulness cannot be taken for granted, accomplished officials in public bureaucracies can be assumed to be mindful enough when it comes to the recognition and acknowledgment of bureaucratic standard pathologies like insufficient vertical communication, selective perceptions and negative coordination, bureaucratic politics, red tape, narrow-mindedness, defensive routines, et cetera. They are probably also mindful enough to develop coping strategies designed to mitigate or even to avoid the undesirable effects of those pathologies. The same is true for the basic trade-offs between responsiveness and responsibility, on the one hand, and goal attainment and system maintenance, on the other hand. Mindfulness is what, in a nutshell, characterizes the average “responsible administrator” (cf. Cooper 2012).

Yet mindfulness as such is an insufficient normative requirement as long as it remains unspecified where, at the operational level of public bureaucracies, the “mind” can actually come to bear in shaping decision-making so as to neutralize standard pathologies and to balance inherent trade-offs. The argument of the present article is that a mechanism-based perspective on organizational standard pathologies and systemic trade-offs makes a valuable contribution to the required specification. The differentiation between situational, action formation, and transformational mechanisms according to Hedström and Swedberg (1998) and Hedström and Ylikoski (2010) allows for the identification of critical points of intervention at which, according to NAT, structural and situational risk zones can be recognized and, according to HRO theory, mindful actors should be able to neutralize their undesirable effects.

Figure 4: Causal mechanisms and points of intervention

Adapted from Peter Hedström and Petri Ylikoski, “Causal Mechanisms in the Social Sciences,” Annual Review of Sociology 36 (2010): 49–67 (59), with author’s addenda.


The joint between situational and action formation mechanisms can be characterized as the interface between permissive conditions and actual agency. It is here that a first point of intervention is located, since mindful individual or collective actors should be able to make sure that the negative effects of situational mechanisms, that is, of the permissive conditions, do not materialize. The second and decisive point of intervention is located at the joint between action formation mechanisms and transformational mechanisms. That joint can therefore be characterized as the threshold between near miss and actual occurrence. Risk-generating action formation mechanisms may be kept under control by key decision makers who combine mindfulness with determination and resolve.

One relevant empirical question emerging from the suggested combination of NAT and HRO theory in a mechanism-based perspective is why and under what circumstances the relevant points of intervention remain unrecognized and/or unused. The contention of the present article is that identifying structural risk zones of public bureaucracies and issuing normative appeals to mindfulness remain insufficient as long as predictable counterincentives to recognizing risk zones and to taking measures to contain their harmful effects are not systematically addressed. It further argues that predictions about structural risk zones and applied mindfulness require an analysis of how basic trade-offs of democracy impact on discretionary decision-making in public bureaucracies.

The run-up to the Duisburg Loveparade disaster is indicative also in this particular respect. Massive political pressure on the licensing authority in charge of vetting and verifying planning and preparation ultimately caused professional and institutional integrity to collapse. That pressure did not originate at random; rather, it was the consequence of choices of upper-rank public officials who had to balance conflicting priorities of democratic decision-making. They had to be responsive in terms of goal attainment and responsible in terms of system maintenance at the same time. Canceling a mass event with an anticipated one million visitors, while doing justice to both legal requirements and the professional responsibility for safeguarding human security, would have violated the legitimate expectations of the public at large and thus the requirements of responsiveness. Issuing permission for the event, while doing justice to effective goal attainment in terms of public management and agility, inevitably resulted in violating the professional and institutional integrity of the decision-making process itself.

This article is intended to demonstrate, first, the necessity to specify general conditions under which structural risk zones of public bureaucracies as high reliability organizations can possibly be recognized and kept under mindful control and, second, the possibility to identify points of intervention at which existing risks can be neutralized. Whether or not actors in public bureaucracies make use of those points of intervention is not only a matter of knowledge and mindfulness but also, if not decisively so, a matter of personal resolve and public support. While resolve is, in turn, a matter of personality, and picking the right personalities is ultimately a matter of recruitment and training, support is a matter of public values. Insisting on personal responsibility for the sake of maintaining professional and institutional integrity is, from that normative perspective, a crucial ingredient of making public bureaucracies high reliability organizations in the real world of a democratic polity.

Author Biography

Wolfgang Seibel is a professor of politics and public administration at the University of Konstanz and an adjunct professor of public administration at the Hertie School.

He held guest professorships at the University of California, Berkeley (1994), Stanford University (2014), the Central European University (2014, 2016), and the University of Pretoria (2017) and was a guest scholar at the University of Utrecht (2019–20). He was a temporary member of the Institute for Advanced Study, Princeton (1989–90, 2003), a fellow of the Wissenschaftskolleg zu Berlin (2004–5), and a fellow of the Stellenbosch Institute for Advanced Study (STIAS) (2019). In 2009 he was elected a member of the Heidelberg Academy of Science. Among his latest publications are Persecution and Rescue: The Politics of the “Final Solution” in France, 1940–1944 (Ann Arbor: University of Michigan Press, 2016); Verwaltungsdesaster [Public Administration Disasters], Campus Publ. 2017 (with Kevin Klamann and Hannah Treis); The Management of UN Peacekeeping: Coordination, Learning and Leadership in Peace Operations (Boulder, CO: Lynne Rienner Publ., 2017) (coedited with Julian Junk, Francesco Mancini, and Till Blume); “Pragmatism in Organizations: Ambivalence and Limits,” Research in the Sociology of Organizations 59 (2019): 43–58; and “Autonomy, Integrity, and Values in Public Administration: A Dilemma and a Case,” Perspectives on Public Management and Governance 3 (2020): 155–166.

Author Note

The present article is based on the research project “Black Swans in Public Administration: Rare Organizational Failure with Severe Consequences” (https://issuu.com/euresearcher/docs/black_swans_in_public_administration_eur20_h_res), funded by the German Research Foundation (DFG) in the framework of the DFG Reinhart Koselleck program. I would like to express my gratitude to the Utrecht University School of Governance (USBO) for its hospitality during an extended stay as a guest scholar when the present article was in the making. I am also indebted to Stavros Zouridis and the Onderzoeksraad voor Veiligheid (Dutch Safety Board) for the opportunity to discuss the analytical concept of this article on the occasion of a talk given in The Hague in January 2020. Finally, I am indebted to Annette Flowe and Paulina Ulbrich for technical assistance in completing the manuscript.—WS

Cited Documents

[1] Letter Thomas Mahlberg MdB, 9 February 2009, Brief an den Innenminister Dr. Ingo Wolf von Thomas Mahlberg MdB, Internet: http://www.cduduisburg.de/index.jsp?index=presse&mid=20&content=ja&id=147 [downloaded: 11 March 2015].

[2] Minutes, meeting of 25 September 2009, Niederschrift über ein Gespräch zum Thema Loveparade 2010 in Duisburg, Internet: http://file.wikileaks.org/file/loveparade2010/loveparade-2010-anlage-03-protokoll-25-09-09.pdf [last access: 27 June 2020].

[3] Minutes, meeting of 2 October 2009, Ergebnisniederschrift zur Besprechung Loveparade 2010, Internet: http://file.wikileaks.org/file/loveparade2010/loveparade-2010-anlage-04-protokoll-02-10-09.pdf [last access: 27 June 2020].

[4] Presentation Lopavent, 29 October 2009, Loveparade 2010 in Duisburg – Präsentation Lopavent, Internet: https://www.duisburg.de/ratsinformationssystem/bi/getfile.php?id=1458557&type=do [downloaded: 11 March 2015].

[5] Minutes, meeting of 2 March 2010, Niederschrift über ein verwaltungsinternes Gespräch, Internet: http://file.wikileaks.org/file/loveparade2010/loveparade-2010-anlage-20-protokoll-02-03-10.pdf [last access: 27 June 2020].

[6] Letter Bauordnungsamt, City of Duisburg, to Lopavent, 14 June 2010, Eingangsbestätigung – Nachforderung fehlender Unterlagen von der Unteren Baubehörde, Internet: http://file.wikileaks.org/file/loveparade2010/loveparade-2010-anlage-24-nachforderung-fehlende-unterlagen-14-06-10.pdf [last access: 27 June 2020].

[7] Minutes, meeting of 18 June 2010, Protokoll eines Gesprächs bei Lopavent, Internet: http://file.wikileaks.org/file/loveparade2010/loveparade-2010-anlage-25-aktenvermerk-und-ablehnung-dressler-18-06-10.pdf, [last access: 27 June 2020].

[8] Note for the files, meeting of 25 June 2010, Aktenvermerk über ein Gespräch bei Lopavent des Bauamtes, Internet: http://file.wikileaks.org/file/loveparade2010/loveparade-2010-anlage-26-aktenvermerk-abnahme-sv-25-06-10.pdf [last access: 27 June 2020].

[9] Note for the files, 12 July 2010, Aktenvermerk, Bewertung des Zu- und Abwegekonzepts durch Prof. Schreckenberg, Internet: https://www.duisburg.de/ratsinformationssystem/bi/getfile.php?id=1458579&type=do [last access: 27 June 2020].

[10] Expert report, 13 July 2010, Entfluchtungsanalyse zur Loveparade 2010 der Firma TraffGo HT GmbH, Internet: https://www.duisburg.de/ratsinformationssystem/bi/getfile.php?id=1458601&type=do [last access: 27 June 2020].

[11] Expert report, 16 July 2010, Stellungnahme zur Entfluchtungsanalyse durch Prof. Schreckenberg, Internet: https://www.duisburg.de/ratsinformationssystem/bi/getfile.php?id=1458602&type=do [last access: 27 June 2020].

[12] Permission, dated 21 July 2010, issued 23 July 2010, Loveparade 2010 Anlage 34 Genehmigung der Bauaufsicht mit Abweichungsgenehmigungen, Internet: https://file.wikileaks.org/file/loveparade2010/loveparade-2010-anlage-34-genehmigung-bauaufsicht-21-07-10.pdf [last access: 27 August 2015].

Footnotes

1.

The German original (Document no. 7) reads: “Ich lehne aufgrund dieser Problemstellung eine Zuständigkeit und Verantwortung von [Amt] 62 ab. Dieses entspricht in keinerlei Hinsicht einem ordentlichen Verwaltungshandeln und einer sachgerechten Projektsteuerung. Die Entscheidung in allen Belangen obliegt [Dezernat] II.” (Due to the nature of the problem, I reject jurisdiction and responsibility of [office] 62. It is in no way in accordance with proper administrative procedure and an appropriate control of the project. In every respect, the decision lies with [division] II.)

2.

What Hedström, Swedberg, and Ylikoski denote as “situational mechanisms” is an equivalent to permissive conditions that make particular behavioral patterns more likely but do not trigger them themselves. The term “permissive conditions” in a context of causal analytic methodology is explicitly used by Soifer (2012) to categorize the antecedent conditions of a critical juncture at which a particular causal process takes a decisive turn as soon as additional factors transform the very permissive conditions into what Soifer denotes as “productive conditions” without, however, referring explicitly to causal mechanisms.

3.

Author’s wording. Despite the parallel, both in language and in substance, Bunge himself makes no reference to the microeconomic concept of asset specificity developed by Oliver E. Williamson (1985), that is, the idea that the productivity of assets (e.g., skills, investments) is not universal but context dependent.

References

Akerlof, George A. 1970. “The Market for ‘Lemons’: Quality Uncertainty and the Market Mechanism.” Quarterly Journal of Economics 84 (3): 488–500. https://doi.org/10.2307/1879431.
Alchian, Armen A., and Harold Demsetz. 1972. “Production, Information Costs, and Economic Organization.” The American Economic Review 62: 777–95.
Argyris, Chris. 1999. On Organizational Learning. 2nd ed. Oxford: Wiley-Blackwell.
Arrow, Kenneth J. 1974. The Limits of Organization. New York: Norton & Company.
Bartling, Björn, and Urs Fischbacher. 2012. “Shifting the Blame: On Delegation and Responsibility.” The Review of Economic Studies 79 (1): 67–87. https://doi.org/10.1093/restud/rdr023.
Beach, Derek, and Rasmus Brun Pedersen. 2019. Process-Tracing Methods: Foundations and Guidelines. 2nd ed. Ann Arbor: University of Michigan Press.
Bennett, Andrew, and Jeffrey T. Checkel. 2015. “Process Tracing: From Philosophical Roots to Best Practices.” In Process Tracing. From Metaphor to Analytic Tool, edited by Andrew Bennett and Jeffrey T. Checkel, 3–38. Cambridge: Cambridge University Press. https://doi.org/10.1017/cbo9781139858472.003.
Boin, Arjen, and Paul Schulman. 2008. “Assessing NASA’s Safety Culture: The Limits and Possibilities of High-Reliability Theory.” Public Administration Review 68 (6): 1050–62. https://doi.org/10.1111/j.1540-6210.2008.00954.x.
Bruttel, Lisa, and Urs Fischbacher. 2013. “Taking the Initiative. What Characterizes Leaders?” European Economic Review 64 (November): 147–68. https://doi.org/10.1016/j.euroecorev.2013.08.008.
Bunge, Mario. 1997. “Mechanism and Explanation.” Philosophy of the Social Sciences 27 (4): 410–65. https://doi.org/10.1177/004839319702700402.
———. 2004. “How Does It Work? The Search for Explanatory Mechanisms.” Philosophy of the Social Sciences 34 (2): 182–210. https://doi.org/10.1177/0048393103262550.
Coleman, James S. 1990. Foundations of Social Theory. Cambridge (Mass.): Harvard University Press.
Cooper, Terry L. 2012. The Responsible Administrator: An Approach to Ethics for the Administrative Role. 5th ed. San Francisco: Jossey-Bass.
Crozier, Michel. 1963. Le phénomène bureaucratique. Essai sur les tendances bureaucratiques des systèmes d’organisation modernes et sur leurs relations en France avec le système social et culturel. Paris: Éditions du Seuil.
Darley, John M., and Bibb Latane. 1968. “Bystander Intervention in Emergencies: Diffusion of Responsibility.” Journal of Personality and Social Psychology 8 (4): 377–83. https://doi.org/10.1037/h0025589.
David, Paul A. 1985. “Clio and the Economics of QWERTY.” American Economic Review 75: 332–37.
Deutsch, Karl W. 1963. The Nerves of Government: Models of Political Communication and Control. New York: The Free Press.
Downs, Anthony. 1967. Inside Bureaucracy. Boston: Little, Brown and Company. https://doi.org/10.7249/cb156.
Elster, Jon. 1989. Nuts and Bolts for the Social Sciences. Cambridge: Cambridge University Press.
Farson, Richard, and Ralph Keyes. 2002. “The Failure-Tolerant Leader.” Harvard Business Review 80: 64–71.
Finer, Herman. 1941. “Administrative Responsibility in Democratic Government.” Public Administration Review 1 (4): 335–50. https://doi.org/10.2307/972907.
Friedrich, Carl J. 1940. “Public Policy and the Nature of Administrative Responsibility.” In Public Policy: A Yearbook of the Graduate School of Public Administration, 3–24. Cambridge, MA: Harvard University Press.
Grossman, Sanford J., and Oliver D. Hart. 1983. “An Analysis of the Principal-Agent Problem.” Econometrica 51 (1): 7–45. https://doi.org/10.2307/1912246.
Hayes, Jan. 2009. “Incident Reporting: A Nuclear Industry Case Study.” In Learning from High Reliability Organisations, edited by Andrew Hopkins, 117–34. Canberra: CCH Australia.
Hedström, Peter, and Richard Swedberg. 1998. “Social Mechanisms: An Introductory Essay.” In Social Mechanisms. An Analytical Approach to Social Theory, edited by Hedström and Swedberg, 1–31. Cambridge UK: Cambridge Univ. Press.
Hedström, Peter, and Petri Ylikoski. 2010. “Causal Mechanisms in the Social Sciences.” Annual Review of Sociology 36 (1): 49–67. https://doi.org/10.1146/annurev.soc.012809.102632.
Hood, Christopher. 2011. The Blame Game: Spin, Bureaucracy, and Self-Preservation in Government. Princeton, NJ: Princeton University Press. https://doi.org/10.1515/9781400836819.
Hopkins, Andrew. 2007. Lessons from Gretley. Mindful Leadership and the Law. Canberra: CCH Australia.
———, ed. 2009. Learning from High Reliability Organisations. Canberra: CCH Australia.
Janis, Irving L. 1972. Victims of Groupthink: A Psychological Study of Foreign-Policy Decisions and Fiascoes. Boston: Houghton Mifflin Company.
Kettl, Donald F. 2006. “Public Bureaucracies.” In The Oxford Handbook of Political Institutions, edited by R.A.W. Rhodes, Sarah A. Binder, and Bert A. Rockman, 366–84. Oxford, UK: Oxford University Press.
Kingdon, John W. 1984. Agendas, Alternatives, and Public Policies. Boston: Little, Brown & Co.
LaPorte, Todd R., and Paula M. Consolini. 1991. “Working in Practice but Not in Theory: Theoretical Challenges of ‘High-Reliability Organizations.’” Journal of Public Administration Research and Theory 1: 19–48.
Lindblom, Charles E. 1959. “The Science of ‘Muddling Through.’” Public Administration Review 19 (2): 79–88. https://doi.org/10.2307/973677.
Lipsky, Michael. 1980. Street-Level Bureaucracy: Dilemmas of the Individual in Public Services. New York, NY: Russell Sage Foundation.
March, James G., and Johan P. Olsen. 1975. “The Uncertainty of the Past: Organizational Learning under Ambiguity.” European Journal of Political Research 3 (2): 147–71. https://doi.org/10.1111/j.1475-6765.1975.tb00521.x.
Mintzberg, Henry. 1983. Power in and around Organizations. Englewood Cliffs, N.J: Prentice-Hall.
Olsen, Johan P. 2008. “The Ups and Downs of Bureaucratic Organization.” Annual Review of Political Science 11 (1): 13–37. https://doi.org/10.1146/annurev.polisci.11.060106.101806.
Parsons, Talcott. 1951. The Social System. New York: The Free Press.
Perrow, Charles. 1984. Normal Accidents: Living with High-Risk Technologies. New York: Basic Books.
———. 1986. Complex Organizations: A Critical Essay. 3rd ed. New York, NY: McGraw-Hill.
Peters, B. Guy. 2014. “Accountability in Public Administration.” In The Oxford Handbook of Accountability, edited by Mark Bovens, Robert E. Goodin, and Thomas Schillemanns, 211–25. Oxford: Oxford University Press.
Puranam, Phanish. 2018. The Microstructure of Organizations. Oxford Scholarship Online. Oxford: Oxford University Press. https://doi.org/10.1093/oso/9780199672363.001.0001.
Roe, Emery, and Paul R. Schulman. 2008. High Reliability Management: Operating on the Edge. Stanford, CA: Stanford Business Books.
———. 2016. Reliability and Risk: The Challenge of Managing Interconnected Infrastructures. Stanford University Press. https://doi.org/10.11126/stanford/9780804793933.001.0001.
Sagan, Scott D. 1993. The Limits of Safety: Organizations, Accidents, and Nuclear Weapons. Princeton, N.J.: Princeton University Press. https://doi.org/10.1515/9780691213064.
Schulman, Paul R. 1993. “The Negotiated Order of Organizational Reliability.” Administration & Society 25 (3): 353–72. https://doi.org/10.1177/009539979302500305.
Seibel, Wolfgang. 2019. “Professional Integrity and Leadership in Public Administration.” In The Blind Spots of Public Bureaucracy and the Politics of Non‐Coordination, Executive Politics and Governance, edited by Tobias Bach and Kai Wegrich, 71–86. Basingstoke: Palgrave Macmillan. https://doi.org/10.1007/978-3-319-76672-0_4.
Selznick, Philip. 1957. Leadership in Administration. A Sociological Interpretation. New York, Evanston and London: Harper & Row.
Shrivastava, Samir, Karan Sonpar, and Federica Pazzaglia. 2009. “Normal Accident Theory versus High Reliability Theory: A Resolution and Call for an Open Systems View of Accidents.” Human Relations 62 (9): 1357–90. https://doi.org/10.1177/0018726709339117.
Stivers, Camilla. 1994. “The Listening Bureaucrat: Responsiveness in Public Administration.” Public Administration Review 54 (4): 364–69. https://doi.org/10.2307/977384.
’t Hart, Paul, and Lars Tummers. 2019. Understanding Public Leadership. 2nd ed. London: Red Globe Press.
Thompson, James D. 1967. Organizations in Action: Social Science Bases of Administrative Theory. New York: McGraw-Hill.
Wallach, Michael A., Nathan Kogan, and Daryl J. Bem. 1963. “Diffusion of Responsibility and Level of Risk Taking in Groups.” The Journal of Abnormal and Social Psychology 68 (3): 263–74. https://doi.org/10.1037/h0042190.
Weick, Karl E., and Kathleen M. Sutcliffe. 2007. Managing the Unexpected: Resilient Performance in an Age of Uncertainty. 2nd ed. San Francisco, CA: Jossey-Bass.
Williamson, Oliver E. 1985. The Economic Institutions of Capitalism. New York: Free Press.