
Community, Technology, and Risk: Collective Well-Being in the Aviation Industry

GARY BROWN

ABSTRACT

Highly hazardous activities are currently overwhelmingly researched within the context of the organizations that produce or exploit them. Analysts have varying degrees of confidence in the capacity of these organizations to manage complex risks and hazards. Yet we often overlook the fact that the actual management of these technologies is entrusted to a very significant degree by organizations to occupational communities, whether they be engineers, pilots, air traffic controllers, or scientists. These communities usually straddle organizational boundaries, and state and public expectations invest them with substantial countervailing power [15] with respect to the organizations that actually employ them. Although tensions arise between the professional and the commercial/bureaucratic agenda, the "cosmopolitan" perspective of the occupational community very frequently prevails over the "local" perspective of the organizational bureaucracy. An overemphasis in research on the dysfunctions (real or asserted) of organizations can conceal a record of successful management of hazard by occupational communities.

This paper arises from research carried out among aviation professionals: air traffic controllers, pilots, engineers, and cabin crew. It argues that modern civil aviation should not always be viewed as a series of "accidents waiting to happen," but rather as a celebration of long-term collective well-being in a complex sociotechnical environment. The wider lesson for other technologies may be that we are already better at handling significant hazards than we are frequently willing to admit, although this modesty may itself be functional to our search for improved management of risk and hazard.

The research was carried out whilst the author was 1993 Lloyd's Tercentenary Fellow, Templeton College, University of Oxford.

Introduction

In all, the aviation safety record has been one of the remarkable achievements of the twentieth century. The industry's success is a principal reason for so much attention being focused on those very rare times an airplane crashes or something goes wrong in flight, while so relatively little attention is paid to other phenomena that cause far more death and injury.

GARY BROWN is Lecturer in Organizational Behavior in the Department of Management Studies, Brunel University, U.K. Following a career in technical management in theater and opera, he received a Master's and Doctorate in Management from Oxford University, and held the 1993 Lloyd's Fellowship at Templeton College, Oxford. Address reprint requests to Gary Brown, Department of Management Studies, Brunel University, Uxbridge, Middlesex UB8 3PH, United Kingdom. A version of this paper was presented to the 38th International Society for the Systems Sciences Conference, 1994.

Technological Forecasting and Social Change 48, 259-267 (1995) © 1995 Elsevier Science Inc.


So wrote Oster, Strong, and Zorn in 1992 [33], analyzing over a decade of worldwide safety data in the postderegulation era. The contents of the book are detailed, measured, and (to this writer and passenger at least) persuasive. But let us consider for a moment the presentation of the book. It is called Why Airplanes Crash: Aviation Safety in a Changing World. Certainly an eye-catching title, yet nothing on the dust jacket of this admirable volume (title, publisher's abstract, back-cover review quotes) gives any real indication that the message of the book is one of success. Every publisher knows that crashes, disasters, and conspiracies sell best. Just looking at my own bookshelves, I can see a pattern in titles: Brookes' Disaster in the Air [4]; Cook's An Accident Waiting to Happen [7]; Forman's Flying into Danger: The Hidden Facts about Air Safety [14]; Golich's The Political Economy of International Air Safety: Design for Disaster? [16]; Grayson's Terror in the Skies: The Inside Story of the World's Worst Air Crashes [18]; Ogilvy's UK Airspace: Is It Safe? (...yes!) [32]; Taylor's Air Travel: How Safe Is It? (...very!) [48]; Labich's Why Air Traffic Control Is a Mess (...yet almost never causes an accident!) [23]... I could go on! One of the very few books with a sober title is Hardy's Callback: NASA's Aviation Safety Reporting System [19]: one wonders how many more copies it might have sold as "You Don't Have to Kill People": NASA's Safety Reporting System (after one of its chapter titles)? Most, though not all, of these books in fact stress how exceptionally safe air travel is, and how extraordinarily cautious aviation professionals are in using their technologies. It is no part of my argument in some way to legitimize complacency, nor to discount the changing nature of the technological risks we face (for example, fewer errors, but more sudden, unprefigured, catastrophic errors: see [33]), but I do increasingly wonder whether we have too high a stake in identifying and prophesying failure and too low a stake in celebrating, albeit with due modesty, success. This paper stems from that unease.

Risk, Reliability, and Institutional Design

There is no doubt that we live in an age with at least one characteristic wholly new to human existence. We have developed technologies that deliver massive social and economic benefits, but that can and do suffer catastrophic failure, causing sudden and massive loss of life, property, and confidence. A recent report in the UK by a very distinguished study group [47] into this whole area of "risk" stressed the value (and, by implication, the current neglect) of the study of the institutional frameworks within which humans handle risky decisions. At a deep level "institution" refers to the social bases of shared classifications and ways of thought that underlie both collective and individual decisions [8, 9]. I contend that all too often when debating the pros and cons of the management of hazardous technologies we focus primarily on a subset of institutions: formal organizations and their bureaucratic "management structures." Consequently, I hope to show in this paper that important institutional attitudes to risk are structured not by organizational but rather by community allegiance.

A simple, if crude, way of illustrating my concerns is once again to look at the titles of just a few leading (and excellent) books in the field: Roberts' New Challenges to Understanding Organizations [39] and the rest of the output of the Berkeley High Reliability Organizations group [24-27, 40-46, 52]; Von Glinow and Mohrman's Managing Complexity in High Technology Organizations [51]; Pauchant and Mitroff's Transforming the Crisis-Prone Organization [34]. There is no doubt that we live, at least in the West, in societies dominated by formal organizations, their internal sense of priorities, their control structures, and our employment relationships with them. That this domination is all too real means that an organizational focus on analysis is scarcely unexpected: organizations do matter. Yet at the same time, there is a widespread unease in the literature and in our communities at large that organizations themselves may well be a significant part of the problem with risk management. Pauchant and Mitroff [34] put this most forcefully: to them all formal organizations are based on "machine age" thinking, and are thus totally inappropriate (downright dangerous) for what they call "systems age" problems and concerns. Hence the redesign of organizations, particularly to match the complexity of a "chaotic" world full of potentially catastrophic, interrelated technologies, should be, must be, the new agenda [34, 36, 52].

It is exceptionally easy, and indeed very tempting, simply to conflate institutions and organizations! In a penetrating analysis of this problem, Van Maanen and Barley [50] note that a persistent theme in sociology has been the dichotomy between rational or administrative and communal or collegial forms of work organization. They argue that most work in organization behavior (and consequently in Schools of Management) adopts the former, organizational perspective to the virtual exclusion of the latter, occupational perspective. Woodward [53] similarly argues that "management" itself has become defined as the preserve of the body of people who run the administrative/rational system, and that we have neglected its functional sense of being an integral part of all social activity. Taking these perspectives into account, I argue that approaches by practitioners to the "management of risk and hazard" are in fact configured by institutional allegiance, of which organizational membership is only one, albeit important, subset. The implication of this is that when we account for either success or failure in managing risks we need to beware of simple attribution to the health or sickness of formal organizations alone.

But before taking this line of reasoning further, I need to argue that there are some risks and hazards that we appear to be very good at managing.

The Pathology of Organizations

In their impassioned call for a new agenda, Pauchant and Mitroff [34] not only assert that contemporary organizational technique is dysfunctional in the "Systems Age," but that all actual organizations are so: "At present there is no organization that can serve as a model for a Systems Age organization, especially a model for exemplary crisis management" (p. 2). This sentiment is largely echoed elsewhere in the relevant literature, for example by scholars as distinguished in their fields as Perrow [35] and Reason [37, 38]. Both of these authors admit, however, that there may in fact exist a very few exceptionally successful technological activities that appear to stand in some contrast to this received wisdom. Furthermore, they accept that these few are insufficiently studied or understood.

Why these types of systems are understudied is in part explained by Vaill's [49] observation that "the number of social scientists who are trying to understand excellence in human systems is very small. Pathology is more accessible and, for some, more fun" (p. 39). By "pathology" we here mean the post hoc analysis of failure, disaster, or grossly suboptimal achievement; in the hazardous technologies world post hoc is of course very frequently post mortem, which gives this type of analysis extra zest. Reason, a "human error" psychologist, makes the similar point that, just as in medicine, it is far easier to characterize sick systems than healthy ones [37, 38]. He makes a plea for concurrent work, in the field of hazardous technologies (especially those in which the incidence of error is low, but the results of error have catastrophic potential), into both types of systems, where both types can be seen to exist.

Here Reason draws on work carried out through the 1980s by a multidisciplinary team from Berkeley into what they termed "High Reliability Organizations" (HROs) [39].


Their direct concern was the future management of such new, exceptionally complex hazards as nuclear waste management. The work started with a manifesto that the lessons we learn about the preconditions for success from studying either past failure or small-scale simulations of future operations are not necessarily the same lessons that we would learn from studying actual real-life success itself, and drawing appropriate analogies from it.

The team claimed that it was possible to identify a small population of extremely hazardous operations that operated very nearly error free. They further observed that these operations, perhaps indeed because of their success, were neither well-known nor much studied. In consequence, there was no well-developed body of social science literature that either described or explained such success. In the search for guidance from analogy, the organizations the Berkeley group have studied over the years include the Diablo Canyon nuclear power plants, the Federal Aviation Administration's Air Traffic Control System, and US Navy nuclear-powered aircraft carriers. The HRO literature is problematic, not least in the eyes of its own originators [6a, 39]. This is not the occasion for a detailed review, but the essence of the problem lies in conceptualizing "reliability" itself. We are trying to find compelling cases of valuable, difficult, dangerous technologies that are perfectly executed over long periods. Each of those words is slippery and contestable!

It is both a curiosity and an attraction of the Berkeley work that it silently mirrors a similar theme in business and organization studies during the 1980s, which re-directed investigation to success factors that might have predictive power in revitalizing the Western economies [28]. Of course any such approach faces, and stumbles at, the "necessary and sufficient" conditions test. Suffice it to say here that I follow Hans Eysenck [quoted in 10] in believing that the purpose of in-depth analysis of a compelling case is primarily to learn something rather than to prove anything.

"ORGANIZATION CULTURE" AND RELIABILITY I intend to concentrate in the remainder of this paper on one influential theme

that has emerged from (or rather been re-discovered in) what has become known as the "organizational culture" work: that organizations with a sense of mission that is deeply engrained in the everyday working culture are outstandingly effective at meeting their objectives. These two elements of effectiveness-"mission" and "engraining"-neatly re- flect the two different ways in which "culture" has been considered in the literature [2, 28]. The first is the "top down" approach, which regards culture as something that an organization has, and that is in consequence amenable to being directed, managed, and changed. The second is the "bottom up" approach, which regards culture as being what an organization is, and being in consequence institutionalized over the long-term and much less easily subject to either replication elsewhere or "quick fixes" from above. This second approach usually recognizes the subcultures exist, and that the meaning that work has for organizational members is by no means wholly constructed by the organization itself [17, 50]. To my mind it is most odd that the strength of subcultures and their allegiances to institutions that cross organizational boundaries has not been considered in the HRO work.

THE "CULTURE OF HIGH RELIABILITY" IN UK AIR TRAFFIC CONTROL The HRO literature does make organizational culture a particular focus of its atten-

tion and seems broadly (if seldom explicitly) to conclude that an institutionalized mission is a key component of high reliability (see for example the contributions to Roberts' recent edited volume [39]). This proposition is most strongly set out in Weick's influential 1987 paper Organizational Culture as a Source o f High Reliability [52]. This paper, rather

COMMUNITY, TECHNOLOGY, AND RISK 263

than the then largely unpublished HRO work itself, was the departure point for my own studies in the UK. The original research [5, 6] studied the cultural origins of the remarkable safety record of Britain's civil Air Traffic Control (ATC) system: since its inception it

has never caused, or contributed to the cause of, an aviation accident. Work-in-progress is a study of UK airline operations: the safety record here, though not ATC-perfect, is among the best in the world [6b; see also 32, 33, 48].

In simple terms the ATC system manages the routing of aircraft from starting point to destination. This involves carving up the airports and the sky into published routes and allocating space in those routes to individual airplanes. As those airplanes often cannot see each other, instructions regarding height, speed, and direction are given to pilots by ground controllers working from either a "mental map" of the total traffic picture based on who has been given permission to do what, or from a real, radar picture of the traffic in a certain area. The details of the rules and procedures used are extraordinarily complex, but the simplified essence is that passenger air traffic is routed through "protected" airspace under the continuous direction of an air traffic controller; the aircraft crew is responsible for the integrity of aircraft equipment and procedures and for the precise execution of necessary maneuvers.
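For readers who think in models, the division of labor just described can be sketched as a toy data structure: the controller owns the traffic picture for a block of airspace and issues clearances, while the crew is assumed to execute them precisely. The sketch below is purely illustrative; the names (SectorController, Clearance) and the one-aircraft-per-flight-level rule are my own simplifications and correspond to no real ATC system.

```python
# A toy sketch of the division of labor described above. All names and the
# separation rule are invented for illustration; real ATC procedures are
# vastly more elaborate.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Clearance:
    callsign: str
    flight_level: int  # hundreds of feet, e.g., 350 = 35,000 ft
    heading: int       # degrees
    speed_kts: int

@dataclass
class SectorController:
    """Holds the 'mental map': who has been cleared to do what in this sector."""
    sector: str
    picture: dict[str, Clearance] = field(default_factory=dict)

    def issue(self, c: Clearance) -> None:
        # Crude separation rule: one aircraft per flight level per sector.
        for other in self.picture.values():
            if other.callsign != c.callsign and other.flight_level == c.flight_level:
                raise ValueError(f"{c.callsign} conflicts with {other.callsign} "
                                 f"at FL{c.flight_level} in {self.sector}")
        self.picture[c.callsign] = c  # permission granted: the map is updated

ctrl = SectorController("LONDON-NE")
ctrl.issue(Clearance("BAW123", flight_level=350, heading=90, speed_kts=460))
ctrl.issue(Clearance("EZY456", flight_level=360, heading=90, speed_kts=440))
# A third clearance at FL350 would raise, forcing the controller to resolve
# the conflict before the instruction ever reaches a pilot.
```

The point of the sketch is the asymmetry it encodes: conflicts are detected and resolved in the controller's picture, not in the cockpit, which is precisely the "continuous direction" arrangement the paragraph above describes.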

In most of the world, the ATC system is a central government function frequently split between regulatory and operational arms. In the UK, however, most unusually, the operation of ATC is carried out by a mixture of national, local, and purely commercial organizations, all heavily government regulated. The regulatory system governs the technical and operational integrity of the entire aviation system and allows only licensed individuals to operate any part of it.

The regulatory regime centers on the prime purpose of having an ATC system at all: to prevent collisions between aircraft in the air or on the ground. The secondary purpose (and it is always explicitly secondary) is to facilitate the orderly and expeditious passage of air transport. Throughout the aviation world there has been, from the very earliest days of commercial aviation, a consensus among operators, users, governments, and the public that aviation infrastructure has the nature of a public good [6a, 16, 22, 24]. This consensus arises from: concerns over military and strategic issues; fears of an "unnatural" form of transportation; insurance costs based in part on an aggregate view of industry risk; and the need to derive maximum social benefits from a very rapidly expanding, wholly new industry. For all these reasons, the words aviation and safety are seldom de-coupled. As Miller and Rice observe in their "socio-technical system" classic [29], the commonly agreed objective of civil aviation seems to be to run as safe a system as is consistent with flying at all. "Mission Statements" in aviation are in consequence not merely aspirational, but are intended as an explicit re-affirmation of long-term, clear purpose in respect to safety standards [6a].

The clarity of the mission results in air traffic control operations being strongly task focused, rather than being driven by technological, political, or efficiency concerns. The technology deployed is remarkably simple (often still pen-and-paper based, with radar monitoring) and wonderfully robust; it is also defended in both depth and width. All decisions regarding aircraft movements are made by highly trained individuals, who have had to demonstrate their competence to handle "live" traffic during a lengthy, rigorous period of peer review. This competence has to be re-validated by further peer review, usually annually. Nobody without a current licence may control air traffic, and such licences are restricted to particular types of airspace, to particular locations, and to particular equipment. The licence gives controllers the personal, legal responsibility to execute their duties according to the many standard rules and procedures, as well as authorizing them to use their initiative and discretion at any point where safety appears to be prejudiced for whatever reason.
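The licensing regime described above reduces, in caricature, to a checkable rule: a controller may work a position only with a current licence whose ratings cover that airspace type, location, and equipment, and currency lapses unless renewed by peer review. The sketch below is a minimal illustration under those assumptions; the names (Rating, Licence, may_control) are hypothetical and do not reflect actual UK licensing categories.

```python
# Minimal sketch of the licensing rule: a current licence plus a rating that
# matches the position exactly. All names are hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class Rating:
    airspace_type: str  # e.g., "approach", "area"
    location: str       # e.g., "Heathrow"
    equipment: str      # e.g., "radar"

@dataclass
class Licence:
    holder: str
    ratings: frozenset[Rating]
    valid_until: date  # extended only by a fresh round of peer review

def may_control(lic: Licence, position: Rating, today: date) -> bool:
    """True only if the licence is current and covers this exact position."""
    return today <= lic.valid_until and position in lic.ratings

lic = Licence("A. Controller",
              frozenset({Rating("approach", "Heathrow", "radar")}),
              valid_until=date(1995, 6, 30))
assert may_control(lic, Rating("approach", "Heathrow", "radar"), date(1995, 1, 3))
# The same controller may not work a different airspace type or location:
assert not may_control(lic, Rating("area", "London", "radar"), date(1995, 1, 3))
```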

The unambiguous mission, its acceptance by all stakeholders, the hazards of the work, the intensive training, and the continual peer review all combine to produce high levels of moral adhesion [11, 12] to their work among air traffic controllers. The crucial point here is that air traffic controllers draw a substantial part of their identity not from being employees of various organizations that provide ATC services, but from their membership of the community of controllers. Furthermore it is apparent that the highly specialized nature of the work and the institutionalization of the mission by the regulatory regime invest the community with very substantial expert power. This power is deployed not only to serve the interests of "the safety of the flying public" but also to advance the economic and social interests of the community itself [6a, 50]. This latter point, common in the sociology of the professions [e.g., 1, 3, 21], gives controllers a very large stake in maintaining their power by actively deploying it.

The targets of this power are of course the employing organizations, which are the immediate sources of community members' revenue. So not only does the whole structure of the community's work give them substantial countervailing power with regard to employers, but the very nature of the work ("public safety") enables controllers to claim a "cosmopolitan" perspective, institutionally opposed to any inherently "local" concerns of the employing organizations [17]. These cosmopolitan concerns are the maintenance of standards, staffing levels, task focus, public service, lengthy professional accreditation, and so on; the local concerns, so it is asserted by controllers, are such things as cost reductions, efficiency concerns, increases in system capacity, and new technologies that are not conceived primarily in terms of assisting in task accomplishment.

In the UK (and the UK is in fact no special exception here) this has resulted in the curious combination of extraordinarily safe services, delivered during decades of appalling industrial relations and almost every dysfunction of bureaucracy that can be imagined [20, 30, 31]! Put simply, organizational managers could do very little (for example, progress efficiency concerns, plan for future systems) when substantial parts of their arena of interests were in the power of an exceptionally cohesive occupational group, who idealize [13] their task focus and privilege it above all other concerns.

One surprising feature of the fieldwork among air traffic controllers was the difficulty I had throughout in persuading them that their safety record was both perfect, and most unusual in being so! Although they were proud of their professionalism, they were very reluctant (often to the point of extreme unease) to concede that the system was anything other than permanently teetering on the edge of disaster, and many "near miss" stories are constantly circulated in support of this proposition. Most are wildly exaggerated [6a]. Now as micro-sociologists in particular will point out, occupational communities engaged in dangerous work will routinely use the presence of great hazard as a forensic resource in their constant battle for autonomy and market power: similarly "stress" [9, 50]. The more pertinent point here is that this objectively excessive modesty regarding the success record is also an extremely functional cultural heuristic, which serves to keep the awfulness of failure uppermost in the minds of a community that has in fact never experienced it [6]. That this modesty also serves partly to obscure the fact that institutions can manage contemporary, complex industrial hazards with extreme reliability is paradoxically unfortunate and comforting.

Summary

Civil aviation is one of the great success stories of the twentieth century, though one would scarcely guess this from a surface glance at the bulk of the literature. When discussing both the successful and the pathological management of major risk and hazard, I have argued that it is necessary to attend to the institutional bases of the phenomena and not to make the assumption that these bases are merely organizational. Even when the literature actually does focus on admitted success, as a corrective to excessive pathology, I have shown that its analysis remains, to its disadvantage, firmly at an organizational level. My own work has attempted to account for success in aspects of UK aviation. Here were found some exceptionally reliable operations that were, at one and the same time, conflict-ridden and bureaucratically dysfunctional. Although there existed a consensus among all stakeholders that the safety mission was the "prime directive," the nature of the regulatory regime gave the front-line power to execute the mission to licensed individuals. These individuals obtained their licence by rigorous training and peer review, and as a consequence owed their cultural allegiance to an occupational community rather than to individual employing organizations: a deeply engrained, and well-protected, feature of this community was the idealization of passenger safety and near-contempt for local managerial concerns at the organizational level. A surprising feature of the community was their reluctance to celebrate their remarkable record in public or private; this was argued to be functional in maintaining the clarity of the safety mission and in banishing complacency born of success.

In conclusion, I am very grateful to my colleague Professor Barry Turner [personal communication, September 1993] for summarizing my own argument to me: in examining the management of significant hazards we may well find no healthy organizations, but we can find, and empower, healthy occupational communities within them.

References

1. Abbott, A., The System of Professions, University of Chicago Press, Chicago; 1988.
2. Ackroyd, S., and Crowdy, P. A., Can Culture be Managed? Working with "Raw" Material: The Case of the English Slaughtermen, Personnel Review, 19(5), 3-13 (1990).
3. Blain, A. N. J., Pilots and Management: Industrial Relations in the UK Airlines, Allen and Unwin, London; 1977.
4. Brookes, A., Disaster in the Air, Ian Allan, Shepperton, Surrey; 1992.
5. Brown, G., The Recruitment and Selection of Air Traffic Controllers in the UK: Three Exploratory Case Studies, M.Phil. Thesis, Oxford University, unpublished (1990).
6a. Brown, G., Organizational Culture as a Source of High Reliability: The Case of UK Air Traffic Control, D.Phil. Thesis, Oxford University, unpublished (1992).
6b. Brown, G., Sociological Aspects of the Management of Safety in Civil Aviation, Report to the Lloyd's of London Tercentenary Foundation, unpublished (1994).
7. Cook, J., An Accident Waiting to Happen, Unwin, London; 1989.
8. Douglas, M., How Institutions Think, Routledge, London; 1987.
9. Douglas, M., Risk and Blame, Routledge, London; 1993.
10. Dunkerley, D., Historical Methods and Organizational Analysis: The Case of a Naval Dockyard, in Doing Research in Organizations, A. Bryman, ed., Routledge, London; 1988.
11. Etzioni, A., Organizational Control Structure, in Handbook of Organizations, J. G. March, ed., Rand McNally, New York; 1965.
12. Etzioni, A., A Comparative Analysis of Complex Organizations, Free Press, New York; 1961.
13. Feldman, S., The Idealization of Technology: Power Relations in an Engineering Department, Human Relations, 42(7), 575-592 (1989).
14. Forman, P., Flying Into Danger: The Hidden Facts about Air Safety, Heinemann, London; 1990.
15. Galbraith, J. K., American Capitalism: The Concept of Countervailing Power, Houghton Mifflin, Boston; 1956.
16. Golich, V. L., The Political Economy of International Air Safety: Design for Disaster? Macmillan, London; 1989.
17. Gouldner, A., Cosmopolitans and Locals: Towards an Analysis of Latent Social Roles, Administrative Science Quarterly, 2(3), 281-306 (1957).
18. Grayson, D., Terror in the Skies: The Inside Story of the World's Worst Air Crashes, Star, London; 1988.
19. Hardy, R., Callback: NASA's Aviation Safety Reporting System, Airlife, Shrewsbury, England; 1990.
20. House of Commons of the United Kingdom, House of Commons Transport Committee: Air Traffic Control Safety, Vols. I, II, III, HMSO, London; 1989.
21. Johnson, T., Professions and Power, Macmillan, London; 1972.
22. Komons, N. A., Bonfires to Beacons: Federal Aviation Policy under the Air Commerce Act, 1926-1938, Smithsonian Press, Washington; 1977.
23. Labich, K., Why Air Traffic Is a Mess, Fortune, August 17, 54-58 (1987).
24. LaPorte, T., The US ATC System: Increasing Reliability in the Midst of Rapid Growth, in The Development of Large Technical Systems, R. Mayntz and T. Hughes, eds., Westview, Boulder, CO; 1988.
25. LaPorte, T., and Consolini, P., Working in Practice but not in Theory: Theoretical Challenges of "High-Reliability Organizations", Journal of Public Administration Research and Theory, 1(1), 19-47 (1991).
26. LaPorte, T., ed., Organized Social Complexity: Challenge to Politics and Policy, Princeton University Press, Princeton, NJ; 1975.
27. LaPorte, T., On the Design and Management of Nearly Error-Free Organizational Control Systems, in Accident at Three Mile Island: The Human Dimensions, D. L. Sills et al., eds., Westview, Boulder, CO; 1982.
28. Meek, V. L., Organization Culture: Its Origins and Weaknesses, Organization Studies, 9(4), 453-473 (1988).
29. Miller, E. J., and Rice, A. K., Systems of Organization: The Control of Task and Sentient Boundaries, Tavistock, London; 1967.
30. MMC, Cm 9068, Civil Aviation Authority: A Report on the Supplying by the Authority of Navigation and Air Traffic Control Services to Civil Aircraft, HMSO for the Monopolies and Mergers Commission, London; 1983.
31. MMC, Cm 1122, Civil Aviation Authority: A Report into the Supply of Navigation and Air Traffic Control Services to Civil Aircraft, HMSO for the Monopolies and Mergers Commission, London; 1990.
32. Ogilvy, D., UK Airspace: Is It Safe?, Haynes, Yeovil, Somerset, U.K.; 1988.
33. Oster, C. V., Strong, J. S., and Zorn, C. K., Why Airplanes Crash: Aviation Safety in a Changing World, OUP, New York; 1992.
34. Pauchant, T. C., and Mitroff, I. M., Transforming the Crisis-Prone Organization, Jossey-Bass, San Francisco; 1992.
35. Perrow, C., Normal Accidents: Living with High Risk Technologies, Basic Books, New York; 1984.
36. Peters, T., Liberation Management: Necessary Disorganization for the Nanosecond Nineties, Macmillan, London; 1992.
37. Reason, J. T., The Contribution of Latent Human Failures to the Breakdown of Complex Systems, in Human Factors in Hazardous Situations, D. Broadbent, A. Baddeley, and J. Reason, eds., OUP, Oxford; 1990.
38. Reason, J. T., Human Error, Cambridge University Press, Cambridge; 1990.
39. Roberts, K. H., New Challenges to Understanding Organizations, Macmillan, New York; 1993.
40. Roberts, K. H., Some Characteristics of One Type of High Reliability Organization, Organization Science, 1(2), 160-176 (1990).
41. Roberts, K. H., Managing High Reliability Organizations, California Management Review, 32(3), 101-113 (1990).
42. Roberts, K. H., New Challenges in Organizational Research: High Reliability Organizations, Industrial Crisis Quarterly, 3(2), 111-125 (1989).
43. Roberts, K. H., and Gargano, G., Managing a High-Reliability Organization: A Case for Interdependence, in Managing Complexity in High Technology Organizations, M. A. Von Glinow and S. A. Mohrman, eds., OUP, Oxford; 1990.
44. Roberts, K. H., and Rousseau, D. M., Research in Nearly Failure-Free, High-Reliability Organizations: Having the Bubble, IEEE Transactions on Engineering Management, 36(2), 132-139 (1989).
45. Rochlin, G. I., Informal Organizational Networking as a Crisis Avoidance Strategy: US Naval Flight Operations as a Case Study, Industrial Crisis Quarterly, 3(2), 159-176 (1989).
46. Rochlin, G. I., LaPorte, T. R., and Roberts, K. H., The Self-Designing High-Reliability Organization: Aircraft Carrier Flight Operations at Sea, Naval War College Review, Autumn, 76-90 (1987).
47. Royal Society, Risk: Analysis, Perception and Management. Report of a Royal Society Study Group, The Royal Society, London; 1992.
48. Taylor, L., Air Travel: How Safe Is It?, Blackwell Scientific, Oxford; 1989.
49. Vaill, P. B., The Purposing of High Performing Systems, Organizational Dynamics, 26(3), 39-51 (1982).
50. Van Maanen, J., and Barley, S. R., Occupational Communities: Culture and Control in Organizations, in Research in Organizational Behavior, Vol. 6, B. Staw and L. L. Cummings, eds., JAI, Greenwich, CT; 1984.
51. Von Glinow, M. A., and Mohrman, S. A., Managing Complexity in High Technology Organizations, OUP, New York; 1990.
52. Weick, K. E., Organizational Culture as a Source of High Reliability, California Management Review, 29(2), 112-127 (1987).
53. Woodward, S. N., Management: The Ghost in the Machine?, London Business School Journal, Autumn, 24-31 (1985).

Received 18 June 1994; accepted 15 October 1994

