
How can Human-Systems Integration Support a Safety Culture?

An Overview of Human-Systems Integration: Implications for Quality and Safety in Healthcare

5th Annual Middle Eastern Forum on Quality and Safety in Healthcare

Hamad Medical Corporation (HMC) & Institute for Healthcare Improvement

Doha, Qatar

Saturday May 6, 2017

Dr. Najmedin Meshkati
Professor, Department of Civil/Environmental Engineering

Professor, Department of Industrial & Systems Engineering

Professor, School of International Relations

University of Southern California

&

Commissioner, The Joint Commission
Email: meshkati@usc.edu

Outline

• Introduction/My story – Cross-cutting/common human factors and safety culture issues

• Human-Machine System (HMS) and Human-Machine Interactions

• My premises regarding the major subsystems of a complex technological system, e.g., healthcare – the “HOT” Model

• An example of “design-induced error”

• The role of cultural factors

• Why safety culture is important/vital, what it is, and what its roles are in patient safety

• High Reliability Organization (HRO) and healthcare

• Conclusion – Closing Remarks


My story… the last 30 years of direct working experience with:

• Aviation

• Nuclear power

• Offshore Drilling

• Petrochemical

• Refining

• Oil & Gas Pipeline

• Railroad

• Maritime

• Coal Mining

And recently (the last 15+ years), with the health care industry

My life story…

• Three Mile Island – March 28, 1979
• Bhopal – December 3, 1984
• Chernobyl – April 26, 1986
• BP Texas City Refinery – March 23, 2005
• BP Deepwater Horizon – April 20, 2010
• Fukushima – March 11, 2011

Medical systems: USC Keck Hospital, 2014

An Example of a Human-Machine System: Human-Machine Interactions

[Diagram: a human-machine system. Operators A and B interact with the machine through the human-machine interface (inputs and outputs); subsystems X, Y, and Z; primary and secondary interactions; situational awareness.]

A Fundamental Issue (My Premise): Safety and Reliability of Complex Systems

The “HOT” Model: Major Subsystems of a Complex Technological System (e.g., a nuclear power plant, refinery, hospital)

[Diagram: the “HOT” model. Three overlapping subsystems, Human, Organization, and Technology; their interactive effect determines the volume of output.]

Human Error

A Fundamental Issue

“Human error can be considered as either human-machine or human-task mismatches.”

Professor Jens Rasmussen

Los Angeles, 1992

System-induced or design-induced human errors can be considered as (or caused by) the following types of mismatches:

- Human-machine;

- Human-task; and/or

- Human-organization


Wrong-site Surgery


Source: The New York Times, “So, the Tumor Is on the Left, Right? Seeking Ways to Reduce Operating Room Errors,” Sunday, April 1, 2001, p. 23.

An Example of Design-Induced Error and System Failure

[Diagram: four alternative stove-top designs (I–IV), each with a different spatial linkage between controls A–D and their burners.]

Design | Errors out of 1200 trials
I      | 0
II     | 76
III    | 116
IV     | 129

Source: Chapanis & Lindenbaum, 1959.
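To make the effect concrete, here is a minimal Python sketch (mine, not from the slides) that turns the error counts above into observed per-trial error probabilities:

```python
# Error counts per stove-top control-burner linkage design,
# out of 1200 trials each (Chapanis & Lindenbaum, 1959).
TRIALS = 1200
errors = {"I": 0, "II": 76, "III": 116, "IV": 129}

for design, count in errors.items():
    hep = count / TRIALS  # observed human error probability for this design
    print(f"Design {design:>3}: {count:>3} errors -> HEP = {hep:.3f}")
```

Design I, whose controls spatially correspond to their burners, produced no errors in 1200 trials, while design IV produced an error on roughly one trial in nine: the same task and the same users, with only the interface layout changed.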

Human Factors, Workplace Design, & System Safety/Reliability

“The Human Error Probability (HEP) will be reduced by factors of 2 to 10 if the workstation (display and controls) are improved by the incorporation of standard human engineering concepts” (Swain and Guttmann, 1983, p. 11-5).

From: Swain, A.D. and Guttmann, H.E. (1983, June). Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications. Final Report (NUREG/CR-1278). Washington, D.C.: U.S. Nuclear Regulatory Commission.
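As a hedged numerical illustration (the baseline figure below is an assumption of mine, not a value from the handbook): if a task's nominal HEP were $10^{-2}$, a workstation improved along these lines would yield

$$\mathrm{HEP}_{\text{improved}} = \frac{\mathrm{HEP}_{\text{baseline}}}{k}, \quad k \in [2, 10], \qquad \text{so}\quad \mathrm{HEP}_{\text{improved}} \in [10^{-3},\ 5\times 10^{-3}].$$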

Personal Observations on the Role of Cultural Factors in Human-Machine/Technology Interactions & Safety

Culture, Facts and Theories

Facts are not pure and unsullied bits of information; culture also influences what we see and how we see it. Theories, moreover, are not inexorable inductions from facts. The most creative theories are often imaginative visions imposed upon facts; the source of imagination is also strongly cultural.

(The late) Professor Stephen Jay Gould, renowned Harvard University professor of geology, biology, and the history of science (The Mismeasure of Man, 1981, p. 22).

My life story + aviation accidents (with cultural issues)

[Timeline repeated: Three Mile Island (March 28, 1979), Bhopal (December 3, 1984), Chernobyl (April 26, 1986), BP Texas City Refinery (March 23, 2005), BP Deepwater Horizon (April 20, 2010), Fukushima (March 11, 2011)]

Aviation accidents: Tenerife (1977), Avianca (1990), Korean Air 801 (1997), Überlingen (2002), Asiana 214 (2013)

National Culture Implicated as a Contributing Factor to 5 Severe Accidents

• Tenerife runway collision – Canary Islands, Spain – 1977 (583 fatalities)
• Avianca 052 crash – New York – 1990 (73 fatalities)
• Korean Air 801 crash – Guam – 1997 (228 fatalities)
• The Überlingen mid-air collision – Switzerland – 2002 (71 fatalities)
• Asiana 214 crash – San Francisco – 2013 (3 fatalities)

International Civil Aviation Organization (ICAO) Journal (Oct. 1996)

Rivista Tecnica del ANPAC (2000) – Associazione Nazionale Piloti Aviazione Commerciale (the Italian National Commercial Pilots Association)

The Cultural Context of Nuclear Safety Culture: A Conceptual Model and Field Study (1999)

Australian Aviation, March 2014. Writer: Geoffrey Thomas

“Asiana crash shows continued need for vigilance against CRM & cultural issues”

“The number stands at 42. To be more precise, SIA currently employ pilots from 42 different countries” (Email from Capt …, September 22, 2003)

National, Corporate, & Safety Culture(s)

[Diagram: the relationship among national culture, corporate culture, and safety culture.]

What is safety culture, and why is it so critical/vital?

Safety Culture as a Root Cause of a System’s Common-Mode Failure

• Because of their diversity and redundancies, the defenses-in-depth will be widely distributed throughout the system.

• As such, they are only collectively vulnerable to something that is equally widespread. The most likely candidate is safety culture.

• It can affect all elements in a system for good or ill.

Professor James Reason, A Life in Error, 2013, Page 81

What is Safety Culture?

The US Nuclear Regulatory Commission’s (US NRC) Definition of Safety Culture

“The core values and behaviors resulting from a collective commitment by leaders and individuals to emphasize safety over competing goals to ensure protection of people and the environment.” (SECY-11-0005, January 5, 2011)

The USNRC’s Policy Statement on Safety Culture (SECY-11-0005, January 5, 2011)

Nine “traits of positive safety culture”:

1) Leadership Safety Values and Actions - Leaders demonstrate a commitment to safety in their decisions and behaviors;

2) Problem Identification and Resolution - Issues potentially impacting safety are promptly identified, fully evaluated, and promptly addressed and corrected commensurate with their significance;

3) Personal Accountability - All individuals take personal responsibility for safety;

Nine “traits of positive safety culture” (cont.):

4) Work Processes - The process of planning and controlling work activities is implemented so that safety is maintained;

5) Continuous Learning - Opportunities to learn about ways to ensure safety are sought out and implemented;

6) Environment for Raising Concerns - A safety conscious work environment (SCWE) is maintained, where personnel feel free to raise safety concerns without fear of retaliation, intimidation, harassment, or discrimination;

Nine “traits of positive safety culture” (cont.):

7) Effective Safety Communication - Communications maintain a focus on safety;

8) Respectful Work Environment - Trust and respect permeate the organization; and

9) Questioning Attitude - Individuals avoid complacency and continuously challenge existing conditions and activities in order to identify discrepancies that might result in error or inappropriate action.

Leadership and Safety Culture

A few words about the US Nuclear Regulatory Commission’s (US NRC) and the Institute of Nuclear Power Operations’ (INPO) similar definitions of safety culture

• “The core values and behaviors resulting from a collective commitment by leaders and individuals to emphasize safety over competing goals to ensure protection of people and the environment.” (Safety Culture Policy Statement, Federal Register, June 14, 2011)

• “For the commercial nuclear power industry, nuclear safety remains the overriding priority” (INPO 12-012, Traits of a Healthy Nuclear Safety Culture, April 2013)

INPO’s Traits of a Healthy Nuclear Safety Culture

INPO (p. 6):

“Nuclear safety culture is a leadership responsibility. Experience has shown that leaders in organizations with a healthy safety culture foster safety culture through activities such as the following:

• Leaders reinforce safety culture at every opportunity. The health of safety culture is not taken for granted.

INPO (pp. 6-7):

• Leaders frequently measure the health of safety culture, with a focus on trends rather than absolute values.

• Leaders communicate what constitutes a healthy safety culture and ensure everyone understands his or her role in its promotion.

• Leaders recognize that safety culture is not all or nothing but is, rather, constantly moving along a continuum. As a result, there is a comfort in discussing safety culture within the organization as well as with outside groups, such as regulatory agencies.

BP Refinery Accident, March 23, 2005

[Photo © Financial Times]

Presentation to the Safety Conference of the Florida Minerals and Chemistry Council: “Examining Organizational and Safety Culture Causes of the BP Texas City Refinery Explosion.” CSB Investigation Supervisor Don Holmstrom, May 1, 2008, Tampa, Florida

Incident summary

• March 23, 2005
• 15 deaths and 180 injuries
• During startup, the tower and blowdown drum overfilled
• Liquid hydrocarbon was released; a vapor cloud formed and ignited
• Explosion and fire

Safety Culture

Companies with a Positive Safety Culture:

• Learn from previous incidents, near misses, and safety deficiencies
• Encourage reporting of safety concerns, issues, and problems by all levels of staff, and provide a mechanism for reporting
• Focus on controlling the risks of major hazards

Companies with a Positive Safety Culture:

• Ensure there is leadership and corporate oversight over the management of organizational safety
• Recognize and assess the safety impact of major organizational change

BP Texas City Did Not Have a Positive Safety Culture

• Organizational causes were embedded in the refinery’s history and culture
• Causes extended beyond the ISOM unit to the actions of people at all levels of the corporation
• Multiple safety system deficiencies were found

Accident Causation

• Human error is a symptom, not a cause, of safety problems
• The technical causes of catastrophic incidents vary significantly
• But the organizational failures behind incidents are remarkably similar
• The greatest preventative impact comes from assessment and improvement of organizational deficiencies

Safety culture – Organizational causes

The March 2005 ISOM disaster was an organizational accident:

• Causes extended beyond the ISOM unit to the actions of people at all levels of the corporation
• Multiple safety system deficiencies were found
• Causes were embedded in the refinery’s history and culture – a plant history of fatality incidents

History of Accidents and Safety Problems

• In the previous 30 years, the Texas City site experienced multiple major accidents and 23 fatalities
• Audits and investigations revealed recurring safety problems at Texas City

BP Texas City was an incident with organizational causes

• Lack of a reporting and learning culture

• Lack of focus on controlling risks of major hazards

• Ineffective leadership and corporate oversight

• Insufficient assessment of the safety impact of organizational change


BP Texas City Lacked a Reporting and Learning Culture

• There were 8 serious ISOM blowdown system incidents prior to March 23, 2005, yet:

– 3 weren’t reported in any database

– 5 were reported as environmental releases

– Only 2 were investigated as safety incidents

• More than 3/4 of the splitter tower startups experienced deviations from operating parameters yet the deviations went uninvestigated by management


• Logbooks, incident databases, and fire and environmental reports provided little detail or analysis of the events
• The work order system was primarily for accounting purposes, with little information on equipment history, failure causes, or repair success

Bad news was not reported:

• Prior to the March incident, the 2005 refinery business plan stated that the “site [was] not reporting all incidents in fear of consequences”
• A 2004 safety culture assessment of the refinery found that “investigations were too quick to stop at operator error as the root cause” of incidents

Corporate-level reporting was ineffective:

• The 3 major accidents at the Texas City refinery in 2004 were not mentioned in the reports sent up to corporate executives and the Board of Directors
• A 2002 BP corporate analysis found that the “quality of investigation and reporting varies considerably and quality of evidence gathering is sometimes questionable”

Najm Meshkati’s published op-ed in the Houston Chronicle, February 23, 2007 (Page B9)

From Meshkati’s Op-Ed

“In the long run, because of the common root human factors causes of accidents among many industries such as refining, chemical processing, nuclear power, transportation, and patient care, the far-reaching and wide-spread implications of this Baker Panel’s findings will touch the lives of almost every American. Mr. Baker’s legacy will be his recommendations that will affect all of us, in one way or another, and as such, they will have much more impact than any other panel in which he has participated in the past.”

From Meshkati’s Op-Ed

“Both the performance and the inherent accident potential of complex, large-scale technological systems, such as refineries, nuclear and chemical processing plants, are functions of the way their parts -- engineered and human -- fit together and interact. My research of the last quarter century has shown that on many occasions, the error and the resultant failures are both the attribute and the effect of a multitude of factors such as poor workstation and workplace designs, complicated operational processes, unbalanced mental and/or physical workload and inadequate staffing, unsafe working conditions, cumulative fatigue, faulty maintenance, disproportionate attention to production, ineffective training, lack of motivation and experiential knowledge, non-responsive managerial systems, poor planning, dysfunctional organizational structures, rigid job-based pay systems, haphazard response systems, and sudden environmental disturbances, such as earthquakes.”

The Effects of Unbalanced or High Mental Workload

Unbalanced workload = a lack of balance between task demands and the operator’s capabilities
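One way to formalize this balance is as a ratio of task demands to operator capacity. The Python sketch below is purely illustrative; the ratio form and the ±20% tolerance band are my assumptions, not values from the talk:

```python
def workload_state(task_demands: float, operator_capacity: float,
                   tolerance: float = 0.2) -> str:
    """Classify workload balance as overload, under-load, or equilibrium.

    The ratio and the +/-20% tolerance band are illustrative assumptions,
    not parameters from the presentation.
    """
    ratio = task_demands / operator_capacity
    if ratio > 1 + tolerance:
        return "overload"      # the job outweighs the operator
    if ratio < 1 - tolerance:
        return "under-load"    # the operator outweighs the job
    return "equilibrium"       # demands and capabilities in balance

print(workload_state(9.0, 5.0))  # overload
print(workload_state(2.0, 5.0))  # under-load
print(workload_state(5.0, 5.0))  # equilibrium
```

The two out-of-band states correspond to the overload and under-load conditions sketched on the next slides.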

Unbalanced Workload (Overload)

[Diagram: the job’s demands outweigh the operator’s capabilities, tipping the equilibrium.]

Operators describe such jobs as “too demanding,” “difficult,” “stressful,” “terrorizing,” “inhumane,” “killer,” “between a rock and a hard place,” “mission impossible,” “rat race,” etc.

Unbalanced or High Mental Workload Causes:

• Narrowing span of attention;

• Inadequate distribution and switching of attention;

• Forgetting the proper sequence of actions;

• Incorrect evaluation of solutions;

• Slowness in arriving at decisions

Source: Tikhomirov (1969), Gaume (1978), Meshkati (1983)


Unbalanced Workload (Under-load)

[Diagram: the operator’s capabilities outweigh the job’s demands, tipping the equilibrium.]

Operators describe such jobs as “boring,” “not challenging enough,” “weeks of sheer boredom…”

An Unbalanced Human-Machine System

[Diagram: equilibrium disturbed at three levels: (1) the work organization level, (2) the job/tasks level, and (3) the workstation level; e.g., work allocation (interface), work coordination (communication), and work organization (manpower) problems.]

To Compensate for an Unbalanced Human-Machine System

[Diagram: restoring equilibrium between the machine/task and the operators at the same three levels (work organization, job/tasks, workstation) calls for higher knowledge, skills, and abilities, or for changes in the level of manpower.]

Human (Machine-Task-Organization) Mismatches Can Be Caused by:

• Inappropriate work conditions;

• lack of familiarity;

• improper or poor workstation and workplace designs;

• complicated operational processes;

• unbalanced workload, unsafe conditions;

• faulty maintenance;

• disproportionate attention to production;

• ineffective training;

• lack of motivation and experiential knowledge;

• non-responsive managerial systems;

• poor planning;

• non-adaptive organizational structures;

• rigid job-based pay systems;

• haphazard response systems; and

• sudden environmental disturbances.

Work As Imagined Vs. Work As Done

Richard S. Hartley (2011). High Reliability Organizations and Practical Approach. CCRM HRO Conference, UCDC. http://ccrm.berkeley.edu/conferencesandevents.shtml

Work As Imagined Vs. Work As Done

Source: US Department of Energy (DOE) (2012). Accident and Operational Safety Analysis. Volume I: Accident Analysis Techniques. US DOE, p. 1-32.

There will always be a performance gap (ΔWg) between “work-as-planned” and “work-as-done” because of the variability in the execution of every human activity.
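In symbols (my formalization of the slide’s ΔWg notation, not an equation from the DOE source):

$$\Delta W_g = W_{\text{as-done}} - W_{\text{as-planned}} \neq 0$$

Better planning and feedback can shrink the gap, but the variability of human performance keeps it from ever reaching zero.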

Fatigue and Human Performance


NRC Sources

Fatigue and Human Error Probability (HEP)

• Across a broad range of industries, studies concerning extended work hours suggest that fatigue-induced personnel impairment can increase human error probabilities by a factor of more than 2 to 3 times

• Source: Hanecke, et al., 1998; Colquhoun, et al., 1996; Akerstedt, 1995; U.S. DOT, 49 CFR Parts 350, et al., Proposed Rule, May 2, 2000, 65 FR 25544.


…fatigue-induced personnel impairment can increase human error probabilities by a factor of more than 2 to 3 times

…“The Human Error Probability (HEP) will be reduced by factors of 2 to 10 if the workstation (display and controls) are improved by the incorporation of standard human engineering concepts”

Guess what will be the error probability of a fatigued operator/aviator/mariner working with a badly designed workstation?
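Treating the two findings as independent multipliers (an assumption of mine; the slide only poses the question), a rough bound is

$$\mathrm{HEP}_{\text{fatigued, poor design}} \approx \underbrace{(2\text{–}3)}_{\text{fatigue}} \times \underbrace{(2\text{–}10)}_{\text{unimproved workstation}} \times \mathrm{HEP}_{\text{rested, improved}} \approx (4\text{–}30)\times \mathrm{HEP}_{\text{rested, improved}}.$$

That is, a fatigued operator at a badly designed workstation could plausibly err an order of magnitude more often than a rested operator at a well-engineered one.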

Fatigue’s Effects on HOT


[Diagram: the “HOT” model with fatigue effects acting on the Human, Organization, and Technology subsystems and reducing the volume of output.]


An Example of a Human-Machine System: Human-Task Interactions

Balanced Human-Machine System (Human-Task Interactions)

[Diagram: equilibrium among the operator, the machine, and the job (task demands), linked by their interactions. From Meshkati (1983).]

An Example: The Balanced Workload Equilibrium – Operator (Characteristics) vs. Job (Characteristics)

Examples of individual differences-related factors:
• Skill, knowledge, attributes
• Complexity orientation
• Tolerance for uncertainty and incongruity
• Decision styles (IBP)
• Personality variables

Examples of job-related factors:
• Task demands
• Amount and complexity of information
• Time pressure and pace
• Importance of job’s (performance) consequences
• Structure, autonomy & decision latitude
• Social needs and interactions
• Organizational variables (culture)