

Performance and outcomes measurement

Andrew Auerbach MD MPH

Associate Professor of Medicine

UCSF Division of Hospital Medicine

ada@medicine.ucsf.edu

Introduction

• Housekeeping
• Course overview
• Foundational concepts in performance and outcome assessment
• Structural and management measures
• Process measures
• Outcome measurement

Introduction

• Ways in which outcomes and performance measurements are used:

Research:
• ‘Outcomes research’ and ‘health services research’
• Comparative effectiveness research
• Implementation research

Operational/Policy:
• Quality improvement
• Public reporting

Housekeeping

• Andy Auerbach ada@medicine.ucsf.edu 415-502-1412 (office) 415-443-6670 (pager)

• Ashok Krishnaswami outcomeskrish@gmail.com
• Homework due before class. Leave at Olivia’s desk.

Housekeeping

• Each session:
  Will have 1-2 readings
  A homework (not today)

• Grades:
  Based on homework + final exam + participation in the final class (quality debate)

Curriculum

The method (model) behind the madness

Donabedian A. Evaluating the quality of medical care. Milbank Mem Fund Q. 1966;44(3):Suppl:166-206.

More detailed model

• Donabedian A. JAMA 1988;260:1743-8

[Figure: expanded Donabedian model — Structure → Process → Outcomes]

• Structure: community characteristics, delivery system characteristics, provider characteristics, population characteristics
• Process: health care providers (technical, care, and interpersonal processes); public & patients (access, equity, adherence)
• Outcomes: health status, functional status, satisfaction, mortality, cost

Because in the end your analyses will look like this….

Measure = Predictor + confounders + error term
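A minimal sketch of that model on synthetic data, fit by ordinary least squares. All variable names and effect sizes here are illustrative, not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a binary predictor of interest (e.g., exposure to some
# care process), a confounder (e.g., case severity), and an outcome
# generated from both plus noise.
n = 500
predictor = rng.integers(0, 2, n)
confounder = rng.normal(0, 1, n)
error = rng.normal(0, 0.5, n)
outcome = 1.0 + 0.3 * predictor + 0.8 * confounder + error

# Measure = Predictor + confounders + error term, fit by ordinary
# least squares; the fitted coefficients recover the true effects.
X = np.column_stack([np.ones(n), predictor, confounder])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
intercept, predictor_effect, confounder_effect = coef
```

Omitting the confounder column would push part of its effect into the predictor’s coefficient, which is the confounding problem the rest of the course returns to.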

Evaluative models can be divided similarly

• Structurally focused models:
  Research: Compare different systems of care in terms of clinical outcomes
  • AKA health services research
  Quality improvement/policy: Incent specific characteristics
  • E.g., Leapfrog measures

Evaluative models can be divided similarly

• Process-focused models:
  Research: Compare treatments in terms of outcomes
  • Adherence research, variations in care delivery, educational research
  • Use of ‘surrogate outcomes’ with a proven or a priori connection to outcomes
  • Comparative effectiveness research
  Quality improvement/policy: Hospitalcompare.org

Evaluative models can be divided similarly

• Outcomes-focused models:
  Research: Dartmouth Atlas
  Policy/QI: National Surgical Quality Improvement Program

What are some key features of outcomes/performance research?

• What sorts of data are used?

• What sorts of study designs does it encompass?

• How do we differentiate it from cost-effectiveness?

Features of outcomes/performance research

• Generally not experimental in nature
• Generally uses preexisting data sets
• Prone to numerous kinds of confounders and biases
• ‘Effectiveness’, not causality

Outcome variations based on structure

• Beds
• Availability of testing on site
• Teaching hospital
• Volume of cases seen
• Closed ICU
• Number of primary care providers
• Proximity to hospital or ED

Other structural factors

• Four important domains in management:
  Strategic
  Cultural
  Technical
  Structural

Shortell SM, Bennett CL, Byck GR. Assessing the impact of continuous quality improvement on clinical practice: what it will take to accelerate progress. Milbank Q. 1998;76(4):593-624, 510.

Management factors required for quality improvement

• Strategic: Is there understanding about the processes and conditions that are most important to the organization?

• Cultural:
  Beliefs, values, norms, and behaviors in the organization that inhibit or support QI work
  Increasing interest in ‘accountability’, in the context of individual report cards and incentives

Management factors required for quality improvement

• Technical:
  Training: Do people have the skills necessary to carry out QI efforts?
  Information support: Do people have the information required to manage quality?

• Structural:
  Are mechanisms that facilitate learning and dissemination of ‘learning’ available?

DDx of quality problems

Strategic | Cultural | Technical | Structural | Results
NO        | YES      | YES       | YES        | No results on anything important
YES       | NO       | YES       | YES        | Small, temporary impact with backsliding; ‘not how things are done’; no one notices that change is happening, good or bad
YES       | YES      | NO        | YES        | Frustration/false starts
YES       | YES      | YES       | NO         | Inability to capture learning and spread it throughout the organization
YES       | YES      | YES       | YES        | Desired state

Shortell SM, Bennett CL, Byck GR. Assessing the impact of continuous quality improvement on clinical practice: what it will take to accelerate progress. Milbank Q. 1998;76(4):593-624, 510.

Structure of care initiatives

• Leapfrog Group:
  Purchaser-catalyzed organization founded in 1998
  Collected and reported hospital data in 2001
  Encourages change through public reports and purchasers
  4 Leaps:
  • Computerized Physician Order Entry (CPOE)
  • Evidence-Based Hospital Referral (EHR)
  • ICU Physician Staffing (IPS): staffing ICUs with doctors who have special training in critical care medicine
  • Leapfrog Safe Practices Score: based on NQF measures

2008 Leapfrog survey

• Leap 1: Computerized order entry
• Leap 2: Evidence-based referral volumes
  Report: volumes of cases, some estimates of case-mix, geometric mean length of stay, mortality (if publicly reported)
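A quick sketch of why reporting programs favor the geometric mean for length of stay: it damps the skew from a few very long stays. The numbers below are invented for illustration:

```python
import math

# Length of stay (days) for a hypothetical set of cases; note one outlier.
los = [4, 5, 5, 6, 7, 9, 14, 30]

# Arithmetic mean is pulled up by the 30-day stay.
arith_mean = sum(los) / len(los)                                # 10.0

# Geometric mean = exp(mean of logs); less sensitive to the outlier.
geo_mean = math.exp(sum(math.log(x) for x in los) / len(los))   # ~7.9
```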

Leapfrog 2008

Safe Practices:

• Creating and Sustaining a Culture of Safety
  Element 1: Leadership Structures and Systems
  Element 2: Culture Measurement for Performance
  Element 3: Teamwork Training and Skill Building
  Element 4: Identification and Mitigation of Risks and Hazards
• Informed Consent
• Life Sustaining Treatment
• Nursing Workforce

Leapfrog 2008

Safe Practices (continued):

• Communication of Critical Information
• Labeling of Diagnostic Studies
• Discharge Systems
• Medication Reconciliation
• Prevention of Aspiration and Ventilator-Associated Pneumonia
• Central Venous Catheter-Related Bloodstream Infection Prevention
• Hand Hygiene
• DVT/VTE Prevention
• Anticoagulation Therapy

Leapfrog 2001 and 2008

• Transition from primarily structural measures to a more process-focused approach. Why?
  1. Some measures are hard to implement due to resource limitations
     • Closed ICU: national intensivist shortage
     • CPOE
  2. Some goals are hard to achieve because of secular trends
     • In CABG, national rates of the procedure are falling
     • Effects of volume are being subsumed by other national initiatives

Leapfrog and outcomes

Big problem: How do you measure structure?

• Site visits
• Voluntary surveys
  Of which people? Patients? Physicians on the front line? Executives? Mystery shoppers?

Process-focused performance measurement

• General criteria for an optimal process measure:
  Must target a common condition
  Measurable
  Definable ‘optimal’ patient population
  Requires evidence linking the process to improved outcome

• Advantages:
  Generally, processes are more common than outcomes → statistical advantages
  Provide a clear performance target for clinicians
  Maybe: no risk adjustment needed once optimal patients are defined

Process-focused performance measurement

• Disadvantages:
  Medications and devices are generally the preferred process targets; documented practices or interpersonal/contextual factors less so
  ‘Tyranny of the RCT’
  Focus on only a few diseases and patient populations

Process focused performance comparisons – the major caveat

• Biases

In observational data, people who get a treatment are fundamentally different from those who don’t

Process of care initiatives

• Surgical Care Improvement Project:
  Grew out of the Oklahoma Medicare PRO in the late ’90s, focused initially on surgical infection prevention
  Voluntary public reporting 2002-3
  Medicare 2% withhold for non-participation in 2004

SCIP

• Measures used in surgery:
  Appropriate drug choice
  Antibiotics within 1 hour of incision
  Discontinuation within 24 hours
  VTE prevention practices
  • Appropriate orders written
  • Appropriate therapy received in a timely fashion
  In cardiac surgery: glucose < 200 at 6 a.m. on the first postop day

Shortcomings of process measures

• Cautionary tale about SCIP:
  Few data, as yet, prove that adherence to individual measures is associated with improved outcomes in any surgery
  Growing thinking that even process measures need risk adjustment

Measuring processes

• Where do you collect process measures from?

Shortcomings of process measures

• Ceiling effects: What do we do when everyone is at 100%?
  • Weighting schemes for measures thought to be more important
  • Incent continued excellence (e.g., number of months at 100%)
  • ‘All or none’ measurement: only ‘adherent’ if all quality measures are met
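The difference between per-measure scoring and ‘all or none’ scoring can be sketched with a toy example (the measures and data below are hypothetical):

```python
# Toy adherence data: each row is a patient, each column a quality
# measure (1 = met, 0 = missed).
measures = [
    # (aspirin, beta_blocker, statin)
    (1, 1, 1),
    (1, 1, 0),
    (1, 0, 1),
    (1, 1, 1),
]

n = len(measures)

# Per-measure adherence: fraction of patients meeting each measure.
per_measure = [sum(col) / n for col in zip(*measures)]   # [1.0, 0.75, 0.75]

# ‘All or none’: a patient counts as adherent only if every measure was met.
all_or_none = sum(all(row) for row in measures) / n      # 0.5
```

Each individual measure looks respectable (75-100%), but only half the patients received complete care, which is exactly the gap all-or-none scoring is meant to expose.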

Should quality be an all or none proposition?

Auerbach A. Ann Intern Med. 2009;151.

Public reporting : Does it improve care?

• 45 papers reviewed, 27 since 1999
• Few (or no) studies examine the impact of reporting at the patient or provider level:
  Receipt of care processes
  Outcomes
• When data are reported, moderate effect at the system level (care plan, hospital):
  In the case of NY CABG reporting, a shift toward fewer high-risk cases being performed
  No data about how public reporting influences the quality structure per se

Ann Intern Med. 2008;148:111-123.

Does paying for performance improve care?

• Natural experiment: 255 hospitals in P4P matched to 406 control hospitals

• Hospitals in the top decile on a composite measure of quality for a given year received a 2% bonus in addition to the usual reimbursement rate. Hospitals in the second decile received a 1% bonus.

• Bonuses averaged $71,960 per year and ranged from $914 to $847,227.

N Engl J Med 2007;356:486-96.

Between 2.6% and 4.1% improvement in quality of care with P4P. But everyone improved.

Do process measures need risk adjustment?

• Accounting for factors outside the site’s (or provider’s) control may be important:
  Proportion of patients whose first HbA1c is elevated
  Proportion of patients with morbid obesity
  This is somewhat controversial

• Important caveat: risk adjustment should not adjust away gaps related to disparities in care (e.g., gender, age, race)

Outcomes

• Most performance measurement will include a clinical outcome as the dependent variable

What are the major issues when you start comparing peoples’ (or systems’) performance in terms of their patients’ mortality, satisfaction, etc?

Outcomes focused initiatives

• Northern New England Cardiovascular Cooperative
• Society of Thoracic Surgeons
• Vermont Oxford Neonatal Networks
• ACC PCI registries
• VA NSQIP
• Private Sector NSQIP

Common elements

• Collection of clinical risk adjustment and outcomes data via chart abstraction (NSQIP: via nurses)
• Robust risk adjustment and benchmarking
• Some collaborative function via a central steering group and regional or site directors
• Implicit or explicit agreements to undertake site visits or audits of outliers
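One common benchmarking metric in registries of this kind is the observed-to-expected (O/E) event ratio. A minimal sketch, with invented numbers (in practice the expected risks come from a risk-adjustment model fit on the whole registry):

```python
# Hypothetical patient-level data for one hospital: (died, expected_risk).
patients = [
    (0, 0.01), (0, 0.02), (1, 0.10), (0, 0.05),
    (1, 0.30), (0, 0.02), (0, 0.08), (0, 0.04),
]

observed = sum(died for died, _ in patients)     # 2 observed deaths
expected = sum(risk for _, risk in patients)     # 0.62 expected deaths

# O/E > 1 suggests worse-than-expected performance after case-mix
# adjustment; registries flag persistent outliers on metrics like this.
oe_ratio = observed / expected                   # ~3.2
```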

VA NSQIP

• Marked reduction in mortality for commonly performed procedures in the VA:
  Overall mortality fell from 3% in 1995 to 1% in 2005
  Consistent themes of high performers:
  • Protocols, pathways, and guidelines with an emphasis on standardization
  • Interdisciplinary focus on systems improvement

VA NSQIP

• Not just data:
  Annual reports from all participants, attendance at all group activities
  PRN reports from sites in the upper 20% of mortality (worse mortality)
  Mandatory site visits to the worst performers (upper 5%)
  Aggressive efforts to identify poorly performing physicians

Caveats of outcomes models

• Risk adjustment mandatory
• Costly to collect data thought to be ‘better’:
  NSQIP (private sector): about $40-50/case
  $50,000/year subscription fee, plus costs of a data collection nurse (1-2 FTE) = $150-250K/year investment
• Low event rates limit power:
  2005 data suggest that <60% of hospitals did enough CABG cases to provide a good sample size for comparing mortality
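The low-event-rate problem can be made concrete with the standard normal-approximation sample-size formula for comparing two proportions (the 2% vs. 4% mortality rates below are illustrative, not from the lecture):

```python
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.8):
    """Approximate per-group sample size for a two-sided comparison of
    two proportions (normal-approximation formula)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    root = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
            + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5)
    return root ** 2 / (p1 - p2) ** 2

# Even detecting a doubling of mortality (2% vs. 4%) takes on the order
# of 1,100+ cases per group -- more CABG volume than many hospitals see.
needed = n_per_group(0.02, 0.04)
```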

Advantages

• High face validity
• Truly the ‘end product’
• Culturally concordant and most impactful
• Often contain specialty-specific complications:
  Tailoring to the needs of participating clinicians further maximizes buy-in

Conclusions

• Performance measurement can include structures, processes, or outcomes as comparators

Structural factors have strong connections to management theory, sociology, industrial design/IT

Conclusions

• Performance measurement can include structures, processes, or outcomes as comparators

Process related factors generally represent treatments physicians administer

Comparing these treatments in terms of outcomes = Comparative effectiveness research

Conclusions

• Outcomes:
  The clearest ‘performance measure’, though not the entire picture
  Risk adjustment and statistical methods are paramount