Living with High-Risk Systems

Michael S. Tashbook

Department of Computer Science

University of Virginia

September 23, 2002

9/23/02 Living with High-Risk Systems 2

Categories of Risk

• Not all high-risk systems are created equal

• We can partition the set of high-risk systems into three classes:
  - Hopeless Cases
  - Salvageable Systems
  - Self-Correcting Systems


Hopeless Cases

• This category is composed of systems where the (inevitable) risks far outweigh any reasonable benefits

• These systems should just be abandoned — at least in Perrow’s view

• Examples:
  - Nuclear weapons
  - Nuclear power


Salvageable Systems

• Salvageable systems are systems that we can't do without, but that either:
  - can be made less risky with considerable effort, or
  - offer expected benefits so great that some risks should be run

• Examples:
  - Some marine transport
  - DNA research


Self-Correcting Systems

• This category contains systems that are not completely self-correcting, but are self-correcting to some degree

• Only modest efforts are needed to improve these systems further

• Examples:
  - Chemical plants
  - Airplanes/Air Traffic Control


Is Abandonment the Answer?

• Should systems in the “Hopeless Cases” category be abandoned summarily?

• Should drastic modifications be made for other high-risk systems (namely, those in the “Salvageable” category)?

• Not necessarily; Perrow’s argument makes several assumptions that may not be true


Perrow’s Assumptions

1. Current risk assessment theory is flawed

2. The public is adequately equipped to make rational decisions, and its opinions should be respected by policy experts

3. Organizational changes will have little effect in increasing system safety


1. Risk Assessment

• Analysis of the risks and benefits offered by new systems — examination of the tradeoffs (if any)

• Modern risk assessors work to:
  - inform and advise on the risks and benefits of new systems
  - legitimize risks and reassure the public
  - second-guess regulatory agencies' actions


How Safe is Safe Enough?

• More accurately, how do we model risk?

• Mathematical models are generally used to model risk

• The problem with this kind of analysis is that it only measures things that can be quantified
  - How much is your life worth?
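A minimal sketch of what such a mathematical model typically does: reduce each risk to an expected loss (probability times consequence). All numbers below are invented for illustration; the point is that this single figure erases the difference between many small accidents and one rare catastrophe.

```python
def expected_loss(probability: float, consequence: float) -> float:
    """Expected annual loss for a single failure mode: probability x consequence."""
    return probability * consequence

# Two hypothetical systems that the model treats as equivalent:
frequent_minor = expected_loss(probability=1e-2, consequence=1e4)    # many small accidents
rare_catastrophe = expected_loss(probability=1e-8, consequence=1e10) # one catastrophe

# Both yield the same expected loss: "equivalent" on paper,
# though the public judges them very differently.
print(frequent_minor, rare_catastrophe)
```

This is exactly the quantification problem the slide raises: anything that resists a number (dread, trust, a life's worth) falls out of the model.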


Biased Interpretations

• Problem of systematic biases and public opinion
  - Does every death have the same impact?
  - Is a death from diabetes or cancer as bad as a murder? The public doesn't seem to think so.
  - Are fifty thousand annual highway deaths really equivalent to a single nuclear catastrophe?


Systematic Biases

• Risk assessment differentiates between voluntary risks and involuntary risks

• However, the system doesn’t discriminate between the imposition of risks and the acceptance of risks

• This dispassionate cost-benefit approach often leads to “the tyranny of the bean-counters”


Cost-Benefit Analysis (CBA)

• CBA ignores the distribution of wealth in society
  - Risk assessments ignore the social class distribution of risks

• CBA relies heavily on current market prices
  - Thus, low-paid employees are worth less when risks are considered


More CBA Assumptions

• New risks should not be higher than others we have already accepted
  - If other systems become riskier, we can lower safety levels on new systems

• Competitive markets require risky endeavors


More RA/CBA Criticisms

• RA/CBA does not distinguish between:
  - Addiction and free choice
  - Active risks and passive risks
  This isn't just a matter of in/voluntary risk; it's a question of control

• Risk assessors would prefer to exclude the public from decisions that affect their interests


2. Decision-Making

• Risk assessors assert that the public is ill-equipped to make decisions on its own behalf, and cognitive psychologists agree

• Humans don’t reason well:
  - We maximize some dangers while minimizing others
  - We don’t calculate odds “properly”


Three Types of Rationality

• Absolute rationality
  - Risks and benefits are calculated exactly, offering a clear view of what to do

• Bounded rationality
  - Employs heuristics to make decisions

• Social and cultural rationality
  - Limited rationality has social benefits


Bounded Rationality

• People don’t make absolutely rational decisions, possibly due to:
  - neurological limitations
  - memory/attention limits
  - lack of education
  - lack of training in statistics and probability

• Instead, we tend to use hunches, rules of thumb, estimates, and guesses


More on Bounded Rationality

“There are two reasons for perfect or deductive rationality to break down under complication. The obvious one is that beyond a certain complicatedness, our logical apparatus ceases to cope—our rationality is bounded. The other is that in interactive situations of complication, agents can not rely upon the other agents they are dealing with to behave under perfect rationality, and so they are forced to guess their behavior. This lands them in a world of subjective beliefs, and subjective beliefs about subjective beliefs. Objective, well-defined, shared assumptions then cease to apply. In turn, rational, deductive reasoning—deriving a conclusion by perfect logical processes from well-defined premises—itself cannot apply. The problem becomes ill-defined.”

— W. Brian Arthur, “Inductive Reasoning and Bounded Rationality” (1994)


The Efficiency of Heuristics

• Heuristics are useful; they save time, even if they are wrong on occasion

• Heuristics:
  - prevent decision-making “paralysis”
  - drastically reduce search costs
  - improve (are refined) over time
  - facilitate social life
  - work best in loosely-coupled (slack, buffered) environments


Pitfalls of Heuristics

• Heuristics rely on the problem context; if this is wrong, then the resulting action will be inappropriate

• Context definition is subtle and difficult

• Heuristics are related to intuitions
  - Intuitions are a form of heuristic
  - Intuitions may be held even in the face of contrary evidence


Rationality and TMI

• The Three Mile Island (TMI) accident occurred shortly after the plant was put into service

• Absolute rationality acknowledges that a problem was bound to happen eventually; it just happened sooner rather than later

• Is this comparable to the “1×10⁻⁹ standard”?


Rationality and TMI (cont’d)

• This may be true, but is it the point?

• TMI was a new type of system, and no heuristics existed for it at the time

• Even though problems may be rare, they can be very serious

• Experts predicted that an accident like TMI was unlikely, yet it occurred; could they have been wrong?


Bounded Rationality vs. TMI

• The logic of the public response to TMI was technically faulty; even so, it was efficient and understandable

• Experts have been wrong before; it’s efficient to question them

• Bounded rationality is efficient because it avoids extensive effort
  - Can John Q. Public make a truly informed decision about nuclear power?


Social and Cultural Rationality

• Our cognitive limits are a blessing rather than a curse

• There are two reasons for this:
  - Individuals vary in their relative cognitive abilities (multiple intelligences theory); these differences encourage social bonding
  - Individual limitations or abilities lead to different perspectives on (and solutions to) a given problem


Risk Assessment Studies

• Clark University study of experts and the lay public
  - The two groups disagreed on how to judge the risk of some activities
  - Disaster potential seemed to explain the discrepancy between perceived and actual risk
  - For the public, dread/lethality ratings were accurate predictors of risk assessments

• Subsequent study identified three “factors” (clusters of interrelated judgments)


Dread Risk

• Associated with:
  - lack of control over activity
  - fatal consequences
  - high catastrophic potential
  - reactions of dread
  - inequitable risk-benefit distribution
  - belief that risks are not reducible

• Correlation with interactively complex, tightly-coupled systems


Unknown Risk

• This factor includes risks that are:
  - unknown
  - unobservable
  - new
  - delayed in their manifestation

• This factor is not as conceptually related to interaction and coupling as dread risk is


Societal/Personal Exposure

• This factor measures risks based on:
  - the number of people exposed
  - the rater’s personal exposure to the risk in question

• Of all three factors, dread risk was the best predictor of perceived risk


Thick vs. Thin Descriptions

• A “thin description” is quantitative, precise, logically consistent, economical, and value-free

• A “thick description” recognizes subjective dimensions and cultural values, and expresses a skepticism about human-made systems


3. Organizational Solutions

• In general, risky enterprises are organizational enterprises

• Tightly controlled, highly centralized, authoritarian organizations should be put into place to run risky systems and eliminate “operator error”

• But does this really help things?


Suggested Organization Types

• Tight coupling + linear interactions: centralization

• Tight coupling + complex interactions: centralization (for tight coupling) and decentralization (for complex interactions); these demands are incompatible!

• Loose coupling + linear interactions: centralization and decentralization are both feasible

• Loose coupling + complex interactions: decentralization
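The incompatibility in the tight-coupling/complex-interaction quadrant can be made concrete with a small decision function. This is a sketch: the quadrants follow Perrow’s coupling/interaction matrix, but the function name and string labels are my own.

```python
def recommended_structure(coupling: str, interaction: str) -> str:
    """Organizational prescription for one quadrant of the coupling/interaction
    matrix. Labels are illustrative, not Perrow's own wording."""
    if coupling == "tight" and interaction == "linear":
        return "centralize"
    if coupling == "loose" and interaction == "complex":
        return "decentralize"
    if coupling == "loose" and interaction == "linear":
        return "either works"  # both centralization and decentralization feasible
    # Tight coupling demands centralization, while complex interaction demands
    # decentralization: no single structure satisfies both.
    return "incompatible demands"

for coupling in ("tight", "loose"):
    for interaction in ("linear", "complex"):
        print(coupling, interaction, "->", recommended_structure(coupling, interaction))
```

The fourth branch is the crux of the slide: for tightly coupled, interactively complex systems, no organizational form resolves the conflict.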


Where Does the Problem Lie?

• Technology?
  - “[W]e are in the grip of a technological imperative that threatens to wipe out cultural values….”

• Capitalism?
  - Private profits lead to short-run concerns
  - Social costs are borne by everyone

• Greed?
  - Private gain versus the public good


The Problem of Externalities

• Externalities are the social costs of an activity (pollution, injuries, anxieties) that are not reflected in its price

• Social costs are often borne by those who receive no benefit from the activity, or who are even unaware of it

• Systems with identifiable/predictable victims are more likely to consider externalities
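A toy illustration of the point (all figures invented): the same activity looks profitable to its operator but harmful to society once the uncounted external costs are included.

```python
def net_benefit(revenue: float, private_cost: float, external_cost: float = 0.0) -> float:
    """Net benefit of an activity. external_cost defaults to 0, mirroring a
    price-only analysis that ignores externalities."""
    return revenue - private_cost - external_cost

# Hypothetical activity: privately profitable, socially harmful.
revenue, private_cost, pollution_cost = 100.0, 80.0, 50.0

private_view = net_benefit(revenue, private_cost)                 # pollution not priced in
social_view = net_benefit(revenue, private_cost, pollution_cost)  # pollution counted

print(f"private: {private_view:+.0f}, social: {social_view:+.0f}")  # private: +20, social: -30
```

The sign flip between the two views is the whole problem: the operator sees +20 and proceeds, while the bystanders who bear the pollution cost see -30.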


A New Cost-Benefit Analysis

• How risky are the systems we have been considering, judged solely in terms of catastrophic potential?

• How costly are the alternative ways (if any) of producing the same outputs?


The Final Analysis

• Systems are human constructs, whether carefully designed or unplanned emergences

• These systems are resistant to change

• System catastrophes are warning signals, but not the ones we think
  - These signals come not from individual errors, but from the systems themselves
