Oncology Roundtable Practice Brief

© 2002 The Advisory Board Company

Oncology Dashboards: Tools for Monitoring Program Performance


Creative Services: Jenn Wallace Garner • Maureen Giese • Karen Jenkins • David Lawson • Fabiola Micheloni

Legal Caveat

The Advisory Board Company has worked to ensure the accuracy of the information it provides to members. This report relies on data obtained from many sources, however, and The Advisory Board Company cannot guarantee the accuracy of the information provided or any analysis based thereon, in all cases. Further, neither The Advisory Board Company nor any of its programs are in the business of giving legal, clinical, accounting, or other professional advice, and its reports should not be construed as professional advice on any particular set of facts or circumstances. In particular, members should not rely on any legal commentary in this report as a basis for action, or assume that any tactics described herein would be permitted by applicable law. Members are advised to consult with their medical staff with respect to matters that involve clinical practice and patient treatment, and with other appropriate professionals concerning legal, tax, or accounting issues, before implementing any of these tactics. Neither The Advisory Board Company nor any of its programs shall be liable for any claims or losses that may arise from (a) any errors or omissions in their work product, whether caused by The Advisory Board Company or any of its programs or sources, or (b) reliance on any graded ranking or recommendation by The Advisory Board Company.

Note to Members

This document has been prepared by the Oncology Roundtable for the exclusive use of its members. It contains valuable proprietary information belonging to The Advisory Board Company, and each member should make it available only to those employees and agents who require such access in order to learn from the profiles described herein, and who undertake not to disclose it to third parties. In the event that you are unwilling to assume this confidentiality obligation, please return this document and all copies in your possession promptly to The Advisory Board Company.

Oncology Roundtable Staff

Managing Editor: Cynthia Johnson

Analyst: Shay Pratt

Lead Designer: Margaret Blair

Unlimited Copies

Oncology Dashboards (ONC-000-021) is available to members in unlimited quantity without charge. To obtain additional copies of this or other Oncology Roundtable publications, please call 202-266-5920 and ask to speak with a delivery services associate. Studies may also be ordered via our website at http://advisory.com.

ii Oncology Dashboards


Special Thanks

The Oncology Roundtable expresses sincere appreciation to the following individuals and organizations for providing comments, assistance, data, and advice during the research for this publication:

Heather Davidson, Children’s Healthcare of Atlanta, Atlanta, GA

Tom McAfee, Cleveland Clinic Health System—Eastern Region, Cleveland, OH

Carolyn Caulfield Carpenter, Duke University Health System, Durham, NC

Raymond Demers, MD, Henry Ford Health System, Detroit, MI

Debbie Murphy, Maureen Eliseo, Carol Hutchinson, Licking Memorial Hospital, Newark, OH

Sally Welsh, Memorial Health University Medical Center, Savannah, GA

Frank Veltri, St. John Health System, Detroit, MI

Ed Nawrocki, St. Luke’s Health System, Bethlehem, PA

Dawna Menke, SwedishAmerican Health System, Rockford, IL

Brian Cassell, VCU Health System, Richmond, VA

Brian McCagh, Washington Hospital Center, Washington, DC


Oncology Dashboards

Oncology Dashboards in Brief .................................................................2

I. Oncology Dashboard Metric Selection...........................................4

II. Best Practices in Oncology Dashboard Utilization ................... 11

III. Eight Larger Lessons on Oncology Dashboards .......................... 22

Dashboards of Leading Cancer Centers............................................... 25



Oncology Dashboards in Brief

Dashboards Provide Valuable Snapshots of Program Performance

Hospitals and health systems have increasingly adopted an out-of-industry practice—the performance dashboard—to provide executives with an overall picture of organizational performance. Hospital dashboards (or balanced scorecards) are valued for their ability to synthesize management information and distill large amounts of performance data into a limited number of key indicators. Based on the concept of the automobile dashboard, these tools enable executives to quickly note trends in critical performance areas.

Four Elements of an Effective Dashboard

As illustrated on the facing page, four essential elements characterize hospital dashboards. First, dashboards include indicators or metrics that typically fall into four basic categories—finance, operations, clinical quality, and customer satisfaction. Such a cross-section ensures a balanced view, as performance problems are not always readily detectable by tracking a single area. Second, most hospitals refine their dashboards to include between 15 and 30 indicators, a simplicity that provides a targeted snapshot without overwhelming executives with detailed data. Third, each indicator is paired with a benchmark or target, allowing executives to compare performance against standards and providing thresholds beyond which corrective action may be required. Finally, dashboards often are presented in multiple graphical formats to facilitate data interpretation.
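For programs that maintain such a dashboard in software, the pairing of each metric with a benchmark reduces to a small data structure. The sketch below is illustrative only; the indicator names, values, and targets are invented rather than drawn from any profiled institution.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """One dashboard metric paired with its benchmark or target."""
    name: str
    category: str            # finance, operations, clinical quality, or satisfaction
    value: float             # current performance
    target: float            # benchmark or target
    higher_is_better: bool = True

    def target_met(self) -> bool:
        # Compare current performance against the threshold; a miss
        # marks the metric for possible corrective action.
        if self.higher_is_better:
            return self.value >= self.target
        return self.value <= self.target

# Invented two-metric excerpt of a balanced dashboard
dashboard = [
    Indicator("Operating Margin (%)", "finance", 3.1, 4.0),
    Indicator("ALOS (days)", "operations", 5.2, 5.5, higher_is_better=False),
]

# Metrics missing their targets are the natural starting point
# for further investigation
flagged = [m.name for m in dashboard if not m.target_met()]
print(flagged)
```

A fuller implementation would also retain the monthly history each metric needs for the trended, multi-format displays described above.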

Cancer Programs Now Tailoring Dashboards for the Oncology Service Line

Hoping to improve performance monitoring at the service line level, an increasing number of cancer programs have begun developing oncology dashboards. Dashboards encourage data-driven decision-making, crucial in a time of constant changes in APCs and rapid development of new drugs and technologies. By tracking key indicators for oncology, dashboards allow cancer program administrators to monitor performance against both budgetary concerns and strategic goals.

A Guide to Oncology Dashboard Implementation

This practice brief is intended to help administrators apply the dashboard concept to the specific needs of their cancer programs. Based on a review of oncology dashboards currently used by leading cancer centers, this brief seeks to provide a foundation upon which cancer programs can select performance indicators. In addition, this brief highlights five practices in dashboard utilization, such as investigating “red flags,” effectively using benchmarks, and linking dashboards to employee performance reviews—each complementing the dashboard’s ability to monitor program performance. Finally, this brief provides actual examples of dashboards currently used at selected cancer centers.


Four Elements of an Effective Dashboard

[Sample dashboard, trended monthly from September 2002 through January 2003, displaying financial, operational, clinical quality, and satisfaction metrics—such as operating margin, days cash on hand, days in AR by payer, average daily census, ALOS for acute patients, number of admissions by top admitters, patient falls, and total unadjusted mortality—each plotted against a target.]

Element #1: Metric Balance
Dashboard leavens key financial and operational indicators with “leading indicators” such as patient satisfaction, which often presage financial decline

Element #2: Metric Austerity
Performance distilled to between 15 and 30 metrics, with limited redundancy, to facilitate “big picture” awareness

Element #3: Benchmarks
Inclusion of specific targets or benchmarks provides context for understanding performance data

Element #4: Multiple Presentation Formats
Trends and data manipulated to allow better, more rapid pattern recognition


Duke’s Hospital-Level Balanced Scorecard Provides Template

Financial Perspective

Patient Satisfaction Likelihood to Recommend

Positive/Negative Comment Ratio

Medication Safety

MRSA Acquisition Rate

Nosocomial Pressure Ulcer Rate

Preventable ADEs

Cost per Discharge

Operating Income

ALOS

Employee Turnover

Patient Flow

FTE/Adjusted Occupied Bed

1.4

1.6

1.4

1.2

1.00.80.6

0.91.1

0.40.6

0.97 0.980.95

0.80.96

0.95

1.4

1.0

1.4

Overall Patient Satisfaction (Inpatient)

Duke University HospitalPerformance Improvement Coordinating CommitteeOrganizational Balanced Scorecard—Current Performance

Clinical Quality/Internal Business Perspective

Work Culture

Customer Perspective

I. Oncology Dashboard Metric SelectionHospital-Level Dashboards Lay Groundwork for Oncology

When developing oncology dashboards, service line administrators may turn to existing hospital-level dashboards for direction on appropriate indicators. As shown below and on the facing page, for example, Duke University Hospital’s balanced scorecard acts as a template for its oncology dashboard. Using identical metrics to the hospital ensures that the cancer center tracks areas of importance to the hospital and provides “apples-to-apples” data on cancer program contributions to hospital performance.

Note: The data contained in these dashboards are for demonstration purposes only and do not refl ect actual performance—the Advisory Board has modifi ed all data on the dashboards presented to protect the competitive position of the institutions profi led.

Source: Duke University Hospital, Durham, North Carolina.

Duke University HospitalPerformance Improvement Coordinating CommitteeOrganizational Balanced Scorecard—Current Performance

Fina

ncia

l Per

spec

tive

Measure Target Current Performance Target Met?Cost/Adjusted Discharge% Operating Profi t MarginOperating IncomeALOSSupply $/Adjusted DischargeFTE/Adjusted Occupied BedSalaries and Benefi ts as% of Net Revenue


Oncology-Specific Metrics Tailor Dashboard to Service Line

Not all hospital metrics, however, apply to the service line level; furthermore, the broad nature of hospital-level indicators may not capture key areas of cancer center performance. Thus, many cancer programs add oncology-specific metrics to fill voids in the dashboard. Oncology administrators at Duke added nine indicators to the program scorecard, including financial metrics for cost centers such as pharmacy and patient satisfaction metrics for outpatient areas, which experience a high volume of cancer patients.

Oncology Dashboard Combines Hospital, Cancer Center Measures

Duke University Hospital
Performance Improvement Coordinating Committee
Organizational Balanced Scorecard—Current Performance

Each measure is listed with the DUH target, the oncology target, current performance, and whether the oncology target was met. Not all hospital indicators are applicable to oncology.

Hospital indicators reproduced on the oncology dashboard (Financial Perspective): Cost/Adjusted Discharge; % Operating Profit Margin; Operating Income; ALOS; Supply $/Adjusted Discharge; FTE/Adjusted Occupied Bed (Before Costs); Salaries and Benefits as % of Net Revenue

Indicators unique to the oncology dashboard: Pharmacy Avg. Variable Direct Costs/Case Inpatient; Pharmacy Avg. Variable Direct Costs/Case Outpatient; Total Actual FTE/Flex Budget FTEs; Patient Satisfaction: Overall Outpatient; Patient Satisfaction: Pain Management

Note: One hospital measure, “patient satisfaction: ED,” is excluded from the oncology scorecard because oncology-specific statistics cannot be generated.

Source: Duke University Hospital, Durham, North Carolina.


Meaningful Metrics Reflect General Performance, Specific Priorities

Regardless of whether institutions maintain a hospital-level dashboard, oncology administrators should choose quantifiable dashboard metrics that not only provide a balanced overview of program performance but also directly correlate to specific aspects of the cancer center budget and strategic plan. In this manner, indicators reflect program priorities and preserve meaning and accountability.

The table below and on the following page represents a collection of indicators prevalent among several cancer programs’ dashboards reviewed during the course of research. While this list provides a (hardly definitive) menu from which to select general oncology dashboard metrics, programs should choose additional indicators—not represented here—that are measurable within their institutions and reflect their specific priorities, goals, and concerns. Furthermore, programs may select certain metrics that intentionally “overlap,” as performance problems may not be readily detectable from a single vantage point.

Common Oncology Dashboard Indicators

FINANCIAL

Revenue
• Charges: Records total billing for oncology services; often used as proxy for net revenue should actual oncology revenues be difficult to track
• Net revenue: Tracks total incoming dollars; more indicative of financial health than charges
• Ancillary revenue: Tracks extra-departmental revenue generated by oncology patients

Cost
• Direct cost: Measures costs of cancer care delivery
• Pharmacy costs: Drugs a major, ever-changing cost center for cancer programs

Margin
• Total (contribution) margin: Measures profitability of service line

Other
• Percentage payer denials: Indicates foregone revenue opportunities
• Collection rates: Surfaces discrepancy between expected and actual net revenue
• Days in accounts receivable: Indicates number of days of operating revenue due from patient billings after deductibles for doubtful accounts; measure of quality of financial management
• Salaries/wages/benefits per adjusted discharge: Measures the proportion of operating costs attributable to employee labor

OPERATIONAL

Volume
• Outpatient encounters: Tracks trends in outpatient utilization, the mainstay of cancer programs
• Inpatient discharges: Tracks trends in medical and surgical oncology volume
• Radiation oncology consults: Tracks utilization of radiation oncology services, typically main profitability driver
• New patients: Indicator of market share increase

ALOS
• By inpatient unit: Proxy for resources used per admission; ALOS significantly longer than median values indicates operational inefficiencies


Common Oncology Dashboard Indicators, Continued

OPERATIONAL, Continued

FTEs
• FTE per adjusted occupied bed; actual versus budgeted FTEs; vacancy rates; overtime hours: Indicators of staffing adequacy

Patient Flow/Access
• Time from diagnosis to procedure; patient discharges before noon; days to clinic appointment; clinic wait times: Key indicators of operational efficiency; delays and increased wait times also may negatively affect patient satisfaction

SATISFACTION

Patient
• Patient satisfaction scores: Key indicator of overall service quality
• “Likelihood to recommend” score: Indicates likelihood of choosing cancer program over competitors
• Pain management score: Controlling pain a central driver of patient satisfaction

Physician
• Internal physician satisfaction scores: Tracks satisfaction among medical staff
• Referring physician satisfaction scores; referrals by physician: Number of physician referrals directly affects number of admissions and subsequent revenue generation

Staff
• Staff/nurse satisfaction survey results, including “likelihood to recommend”; turnover rates: Indicates hospital’s ability to attract and retain quality staff, critical to high-level care

CLINICAL

General
• Mortality; stage at presentation: Traditional indicators for oncology clinical quality

Adherence to Care Guidelines
• Procedural rates (e.g., breast conservation surgery, sentinel node biopsy); documentation guidelines; percentage offered multidisciplinary care: Tracks utilization of newly adopted standards or emerging clinical practices

Complication Rates
• Readmission rates; emergent ED visits: Prevention of unnecessary admissions a proxy for patient management

Medical Errors
• Adverse drug events (ADEs): Essential to monitor given the potential severity and high profile of chemotherapy errors

New Technology/Drugs
• Volume for newly instituted technologies, e.g., Mammosite, brachytherapy: Tracks utilization of new services; can also assist in break-even analyses
• Utilization of new drugs, e.g., Neupogen/Neulasta: Monitors use of emerging (and often expensive) therapies

Source: Oncology Roundtable interviews and analysis.


The Challenge of Clinical Quality Metrics

The selection of clinical quality metrics often presents the greatest challenge during dashboard development. Indicators typically included on hospital-level dashboards—caesarean section rates, nosocomial infection rates, and patient falls—are often not top drivers of clinical quality in oncology. Furthermore, with limited space available on the dashboard, administrators and clinicians must agree on a few key indicators most reflective of oncology clinical quality.

In response to these challenges, program leaders at Duke University Hospital’s cancer center include a single “meta measure” of oncology-specific clinical indicators. The cancer center director asked managers of five oncology business units—infusion, recreation therapy, radiation oncology, inpatient oncology, and bone marrow transplant—to identify the two or three indicators deemed essential for monitoring the status of their respective departments. The resultant eleven metrics each represent a target; the meta measure then reflects the percentage of targets met. The use of the summary meta measure, while still undergoing refinement, helps to retain the dashboard’s metric austerity while improving its specificity for oncology.
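The arithmetic behind such a meta measure is simply the share of unit-level targets met. A minimal sketch, using the five business units named above but invented pass/fail values:

```python
# Each business unit reports pass/fail for its two or three unit-level
# targets; the meta measure is the percentage of all targets met.
# (The True/False values here are invented for illustration.)
unit_results = {
    "Infusion": [True, True],
    "Recreation Therapy": [True, False],
    "Radiation Oncology": [True, True],
    "Inpatient Oncology": [True, True],
    "Bone Marrow Transplant": [False, True, True],
}

# Flatten the eleven unit-level results into one list
all_targets = [met for targets in unit_results.values() for met in targets]
meta_measure = 100 * sum(all_targets) / len(all_targets)
print(f"{meta_measure:.0f}% of {len(all_targets)} targets met")
```

The single percentage that results is what appears on the scorecard, preserving metric austerity while the eleven underlying targets remain available for drill-down.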

Oncology Clinical Quality Targets at Duke
Indicators Rolled into the Meta Measure

Inpatient Oncology
• Preventable ADE S.I. > 2 Chemotherapy = 0
• VRE Acquisition Rate < 2.5

Bone Marrow Transplant
• % Autologous Readmit < 30 days ≤ 20%
• % Allogeneic Readmit < 100 days ≤ 50%
• % Autologous Survival > 30 days ≥ 95%

Infusion
• Less than 2% extravasation rate
• 100% new patients receive chemo education

Recreation Therapy
• % of total time in direct patient contact ≥ 50%
• % of total referrals responded to within 48 hours ≥ 80%

Radiation Oncology
• Nutrition screening
• Skin study

Duke University Hospital Oncology Balanced Scorecard

Clinical Quality/Internal Business Measures (each listed with current performance and whether the target was met): Medication Safety (Preventable ADE with S.I. > 2/All ADE); Patient Flow (Discharged between 7 a.m. and 11 a.m.); MRSA Acquisition Rate; Nosocomial Pressure Ulcer Rate; Unplanned Return to OR < 14 days; Preventable ADE S.I. > 2 = Heparin; Preventable ADE S.I. > 2 = Insulin; Preventable ADE S.I. > 2 = Opiates; Adherence to Standard Drip Concentration Policy; Effectiveness of Selected Recommendations from RCATFs; Meta Measure

The “meta measure” tracks aggregate performance of eleven oncology-specific quality indicators; some clinical quality metrics taken from the hospital-level dashboard lack specificity for oncology.

Source: Duke University Hospital, Durham, North Carolina.


Tracking Indicators by Tumor Site

Program directors at Josephine Ford Cancer Center (Henry Ford Health System) took an alternative approach to monitoring clinical quality—stratifying outcomes by tumor site rather than by business unit. Citing the difficulties in identifying clinical quality metrics that apply equally to all cancers, they decided to focus on one tumor site at a time. To date, multidisciplinary teams have prioritized the clinical outcomes to track for the top four tumor sites by volume, as shown in the outcomes grid below.

Outcomes Grid at Josephine Ford Cancer Center

Source: Henry Ford Health System, Detroit, Michigan.

Tumor Site | Outcomes | Results | Planned Intervention | Goal

Breast
• Percentage diagnosed at stage 0–II | JFCC: 87%; National: 85% | None | Stay at or above national benchmark
• Percentage accrued to clinical trial | 2000: 22%; 2001 YTD: 19% | None | > 20% accrual
• Timeliness of diagnostic process | Abnormal mammogram or mass to treatment = 75 days | Physicians addressing; assess biannually; 2nd biopsy table | < 30 days
• Tamoxifen counseling | 85% counseled | Tamoxifen guidelines accepted; implement work tools in clinic | 100% counseled

Colorectal
• Percentage diagnosed at stage 0–II | JFCC: 48%; National: 55% | Work with HAP to increase screening | Meet or exceed national benchmark
• Percentage accrued to clinical trial | 2000: 3%; 2001: 7% | Strong studies recently opened | > 10% accrual
• Percentage discussed at tumor board | 2001: 52% | Leaders developing proper intervention | > 90% per leader preference
• Timeliness of care | a) Symptom or heme+ stool to CRS appt = 56–74 days; b) CRS to surgery = 35 days | Leaders developing proper intervention | < 30 days

Prostate
• Percentage staged in MIMS | 2001: < 80% | Chair discussion with staff | 100% for ACOS
• Percentage of staff attending tumor board | Urol—100%; RadOnc—50%; MedOnc—37% | Chairs discuss with staff/reassign staff | 100% for all services
• Percentage enrolled in clinical trials | 2000: 7%; 2001: 9% | Physicians assessing prioritization of protocols | > 10%
• Reduce redundant follow-up care | 2000/01—14 visits annually; 64% have 2–3 PSAs but only 30% get 2–3 DREs | Implement work tools in clinic | 3 visits per year; assess financial impact
• Expand PCOP | 2001—only 8% offered multidisciplinary clinic visit | Opened main campus clinic; plan to open east side clinic in 2002 | 100% of patients offered clinic

Lung
• Percentage diagnosed at stage 0–II | JFCC: 30%; National: 29% | None | Meet or exceed national benchmark
• Percentage accrued to clinical trials | 2000: 19%; 2001: 24% | None | > 20%
• Timeliness of treatment | Time chest x-ray to treatment—66 days | Lead physicians addressing | < 30 days
• Treatment of stage IIIA disease | 9% of patients had surgery | Lead physicians addressing | 80% have surgery (under review)


II. Best Practices in Oncology Dashboard Utilization

While the primary use of dashboards involves timely identification of service line trends, dashboards have also been successfully adapted to monitor progress toward strategic goals and measure employee performance during formal reviews. Moreover, while many cancer centers use dashboards to monitor the service line as a whole, some programs have also applied dashboards at more granular levels, such as individual tumor sites or specific technologies. This section outlines five practices in oncology dashboard utilization employed by leading programs.

Five Practices in Oncology Dashboard Utilization

Practice #1: “Red Flag” Investigation
Practice #2: Strategic Report Cards
Practice #3: Dashboard-Linked Employee Performance Reviews
Practice #4: Benchmark-Driven Action Plans
Practice #5: Cascading Dashboards


Practice #1: “Red Flag” Investigation

While dashboards can speed problem recognition and trend identification, they are not meant to determine root causes or pinpoint answers. Rather, program leaders must use the dashboard’s raw data to initiate further investigation. For example, the oncology dashboard used at Cleveland Clinic Health System—Eastern Region revealed an atypical change in prostate cancer patient volumes at two of the four system hospitals in the region; volumes increased significantly at Hillcrest Hospital while they simultaneously decreased at South Pointe Hospital. Seeing this variance in volume, the director of oncology noted a probable link to the introduction of a new prostate brachytherapy service at Hillcrest. The director then charged the cancer registry staff with tracking physician referrals for prostate cancer across the system, which confirmed that patients were migrating from South Pointe to Hillcrest, and not leaving South Pointe for a competing program.

Case #1: Dashboard Flags Volume Shift

1. Data Divergence: Dashboard records increase in prostate cancer volume at Hillcrest Hospital with simultaneous decrease at South Pointe Hospital

2. Root Cause Analysis: Oncology director weighs probable causes of the change in volume (patient outmigration, improved marketing campaign, new brachytherapy program), notes that the new prostate brachytherapy program at Hillcrest Hospital coincided with the volume fluctuation

3. Supportive Investigation: Cancer registry tracks physician referrals, confirms a shift in prostate cancer volumes from South Pointe to Hillcrest, rules out loss of market share to competitors

Source: Cleveland Clinic Health System—Eastern Region, Cleveland, Ohio.
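The first step in such a case, detecting the atypical swing before anyone asks why, lends itself to automation. A minimal sketch, assuming a simple trailing-average variance check and using invented volume figures:

```python
# Flag an indicator when its latest value deviates from its trailing
# average by more than a chosen tolerance; the dashboard only raises
# the flag -- root-cause investigation remains a human task.
def red_flag(history: list[float], tolerance: float = 0.20) -> bool:
    *prior, latest = history            # split off the newest reading
    baseline = sum(prior) / len(prior)  # trailing average of earlier periods
    return abs(latest - baseline) / baseline > tolerance

# Invented quarterly prostate cancer volumes for illustration
hillcrest = [40, 42, 41, 55]      # atypical increase
south_pointe = [38, 37, 39, 28]   # simultaneous decrease

print(red_flag(hillcrest), red_flag(south_pointe))
```

A real program might instead compare against budgeted volumes or seasonally adjusted baselines; either way the division of labor holds: the dashboard flags, people investigate.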


Downstream Inquiry Unearths Root Causes, Solution

In a second example, the oncology dashboard employed by administrators at Ryland Medical Center¹ revealed a steady increase in clinic waiting times beyond the established standard of 20 minutes. An investigation of the issue found that the schedule of one physician in particular frequently showed double and triple bookings and inordinate wait times. Administrators met with the physician, who then reengineered his clinic scheduling process. When administrators resurveyed the data three months later, the physician’s wait time had dropped by half.

Case #2: Dashboard Prompts Change in Scheduling Process

1. Dashboard Surfaces Increased Waiting Times: Patient wait time dashboard shows wait times well beyond 20-minute standard established as benchmark

2. Ad Hoc Investigation Identifies Obstacle: Administrators launch focused study of wait times, revealing that one physician consistently overbooked his clinic

3. Administrators Present Issue to Physician: After administrators present evidence, physician conducts review of own scheduling guidelines, identifies same issue

4. Physician Adopts New Scheduling Guidelines: Physician sets new scheduling parameters, including stratified priorities for follow-up patients, gives new guidelines to schedulers and secretaries

5. Dashboard Shows Wait Time Stabilizes: Resurvey of physician’s wait times shows drop by half after three months

¹ Pseudonymed institution.

Source: Oncology Roundtable interviews.


Practice #2: Strategic Report Cards

In addition to raising potential red flags, dashboards can also be used to measure institutional progress toward longer-term strategic goals. At Children’s Healthcare of Atlanta (CHOA)/Emory University, cancer program leaders created a dashboard, called a strategic report card, to evaluate the progress of the AFLAC Cancer Center toward becoming a preeminent pediatric oncology center of excellence. The leadership team first outlined several goals commensurate with a center of excellence, including expanding the research program, growing volume and service scope, and increasing philanthropic support. Once these goals were established, the oncology leadership team selected indicators, defined methods of data collection, and created the dashboard format.

Dashboard Construction Mirrors Strategic Goals

1. Strategic Goals Defined
• Cancer center leaders outline long-term goals for developing program into pediatric oncology center of excellence
• In particular, program leaders target expansion of clinical services, basic research, teaching program, and philanthropic support

2. Indicators Selected
• Selection process includes chief medical officer, business and clinical directors, and key physician leaders
• Indicators selected for ability to measure progress toward center of excellence vision
• Data collection process defined
• Baseline data gathered for benchmarks

3. Strategic Report Card Implemented
• Performance measured against targets
• Report card updated and distributed semi-annually
• Report card sections include volume, clinical and research, teaching, and marketing/fundraising

Source: Children’s Healthcare of Atlanta, Atlanta, Georgia.


Selection of Indicators Necessitates Strategic Focus

During the development process, program leaders dedicated particular attention to selecting indicators. As shown by the report card excerpted below, program leaders ensured that each indicator, when measured, informs administrators and cancer program staff of their progress toward becoming a preeminent pediatric oncology center of excellence. For example, volume indicators devote specific attention to growth in blood and marrow transplant (BMT), a crucial service for the AFLAC Cancer Center. Moreover, the report card includes several indicators that measure the growth of clinical and basic research and the national positioning of faculty.

Strategic Report Card at Children's Healthcare of Atlanta

Strategic Report Card
AFLAC Cancer Center and Blood Disorders Service
Children's Healthcare of Atlanta/Emory University
(Each indicator is tracked with columns for 2002, 2003, and the 2003 target.)

Volume
• Total new cancer cases
• Active sickle cell patients
• Active hemophilia patients
• BMT cases, in-state
• BMT cases, out-of-state
• BMT cases, external referral

Clinical and Research Excellence
• Allogeneic BMT mortality rate, day 100
• Autologous BMT mortality rate, day 100
• Clinical trial patient accruals: phase I/II
• Clinical trial patient accruals: phase III
• Clinical trial patient accruals: other categories
• Children's Oncology Group (COG) protocol participation—percent eligible
• COG protocol participation—percent enrolled
• Grant dollars awarded—nationally peer-reviewed
• Grant dollars awarded—other
• Publications by CHOA/Emory physicians and staff—nationally peer-reviewed
• Publications by CHOA/Emory physicians and staff—other
• Number of faculty in COG leadership positions
• Basic science research FTEs
• Basic science space allocation (square feet)

Teaching Excellence
• Fellowship success in match
• Total fellowship positions

Marketing/Fundraising Excellence
• Total restricted dollars raised
• Total endowment
• MD outreach visits

Annotations from the report card:
• BMT volumes and referral streams included to monitor program growth
• Growth of fellowship program an indicator of center's national reputation
• Basic science measures indicative of the level of basic science research taking place at Emory
• Individual leadership in national childhood cancer research organization reflects CHOA as preeminent center

Source: Children's Healthcare of Atlanta, Atlanta, Georgia.


Practice #3: Dashboard-Linked Employee Performance Reviews

Regular monitoring of departmental results through dashboards can enfranchise staff in program improvement as well as promote staff development. As administrators share the results of the dashboard with employees, staff members can assume ownership over indicators that reflect directly on the performance of their department as well as their own job roles. Licking Memorial Hospital has hardwired this concept by linking dashboard indicators to employee performance review results and development goals.

As part of the review process, every six months each employee sets two personal development goals tied to specific indicators on their department's dashboard. In addition, Licking Memorial requires that employees work toward two dashboard-linked team or departmental goals. A sample of a dashboard-oriented review for a chemotherapy technician is reproduced below.

Individual, Departmental Goals Tied to Dashboard Indicators

Employee Goals and Performance Review
Name: M. Flanders     Date: 1/1/03
Dept: Oncology     Title: Chemotherapy Technician

Individual Goals (each goal includes detail regarding measurement, a target date, a self rating, a manager rating, and comments)
• Bring patient consent documentation in EMR to 100% — target date: 6/30/03

Team/Department Goals (team goals not taken directly from the departmental dashboard require VP approval)
• Reduce chemotherapy patient turnaround from 1.5 hours to 30 minutes — target date: 6/30/03
• Exceed patient satisfaction scores of 90% in infusion therapy — target date: 6/30/03
• 100% documentation of chemotherapy patient education — target date: 6/30/03

Rating options (0 and 1 ratings must include specific examples):
0 – Does not meet goal. Requires improvement.
1 – Meets 100% of the goal.

Source: Licking Memorial Hospital, Newark, Ohio.
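A dashboard-linked goal of the kind shown on the form above can be modeled as a small data structure. The sketch below is illustrative only — the class name and rating logic are assumptions, not Licking Memorial's actual system — but it mirrors the form's 0/1 rating scale and the chemo tech's turnaround goal described later in this section.

```python
from dataclasses import dataclass

@dataclass
class DashboardGoal:
    """An employee development goal tied to a departmental dashboard indicator."""
    indicator: str             # name of the dashboard metric the goal is tied to
    target: float              # value the employee commits to reach
    higher_is_better: bool = True

    def rating(self, current: float) -> int:
        """Per the sample form's scale: 1 = meets 100% of the goal, 0 = does not."""
        met = current >= self.target if self.higher_is_better else current <= self.target
        return 1 if met else 0

# The chemo tech's goal: reduce order turnaround to 30 minutes
goal = DashboardGoal("Chemotherapy ordering turnaround (minutes)",
                     target=30, higher_is_better=False)
print(goal.rating(20))  # turnaround brought to 20 minutes -> 1 (goal met)
print(goal.rating(45))  # still above target -> 0 (requires improvement)
```

Because each goal names the dashboard indicator it is tied to, the same record can drive both the semi-annual review and the monthly dashboard update.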


Dashboards Measure Progress Toward Performance Goals

The dashboard is then used to monitor progress toward these goals during the six-month period between reviews—a suitable timeframe for employees to make distinct behavioral changes that surface as results on the dashboard. The graphic below follows the progress of the Licking Memorial chemotherapy technician on one individual development goal. By single-handedly redesigning the chemotherapy ordering process, the tech exceeded the performance target and produced improvement in that metric on the departmental dashboard.

Exceeding Performance Goals at Licking Memorial

1. Problem Identified
• Chemotherapy patients waiting on average 1.5 hours, often up to 4 hours, for chemo orders to be completed

2. Developmental Goal Set
• Chemo tech sets goal to lower order turnaround time to 30 minutes

3. Process Reengineered
• Tech designs new ordering, drug preparation processes
• Chemotherapy ordering turnaround brought to 20 minutes, exceeding goal
• Chemo tech fulfills performance goal, receives raise

4. Six-Month Progress Checked by Dashboard
• Chemotherapy ordering turnaround falls from 90 minutes to 20 minutes, beating the 30-minute goal

Source: Licking Memorial Hospital, Newark, Ohio.


Practice #4: Benchmark-Driven Action Plans

Benchmarks for dashboard indicators allow cancer programs to compare actual performance against internal and external standards and provide thresholds beyond which corrective action may be initiated. While some benchmarks may represent absolute targets that warrant immediate action should they not be met, others may serve as barometers of progress toward long-term goals that are better monitored over time. To help directors act on benchmark comparisons, some programs have instituted easily actionable coding systems to guide the appropriate level of response to potential problems surfaced by the dashboard.

Every month, the oncology program manager at Licking Memorial Hospital documents all indicators and delivers the dashboard to the director of process reengineering, who directs the flow of approximately 70 different dashboards from all hospital departments. The director of process reengineering compares indicators to benchmarks, noting variances and performance trends. The director then determines the appropriate level of response required, flagging each measured indicator with a status of red (action warranted), yellow (vigilance suggested), or green (no action required). This simple, easily recognizable color-coding system assists department managers with translating results into action plans.

Monthly Dashboard Review Grades Against Benchmarks

1. Monthly Measurement
• Oncology program manager completes dashboard and submits it to the director of process reengineering each month

2. Benchmark Comparison
• Director of process reengineering reviews each performance indicator, compares it to benchmark

3. Performance Reporting
• Director of process reengineering flags each indicator as red, yellow, or green based on the status of the indicator, sends copy upward to CEO and back to oncology manager

4. Departmental Response
• Oncology manager prepares action plan for indicators marked red, continues to monitor metrics flagged yellow

Source: Licking Memorial Hospital, Newark, Ohio.


Color Coding Facilitates Appropriate Response to Indicators

Provided below is an excerpt of Licking Memorial's oncology dashboard with color codes assigned to the status of each indicator. While most measurements are marked green, indicating that performance has equaled or exceeded benchmarks, pain assessment documentation has received a status of yellow, which signals the oncology manager to monitor the indicator closely over the next few reporting periods. Medical necessity denials have been flagged red, prompting the oncology manager to develop an action plan to realign the indicator with the benchmark. With this system of prioritizing responses to metric variance, Licking Memorial administrators can differentiate potential problems from true downward trends that warrant immediate action.

Triaging Action Plans at Licking Memorial

Licking Memorial Hospital—Process Improvement Dashboard—Oncology
(Columns: Indicator, Measurement Benchmark, YTD Average, Status)

• Chemo consent form completed—new patients: benchmark 100%, YTD 100%, Green
• Pre-hydration per chemo guideline: benchmark 100%, YTD 97%, Green
• Pain assessment documented: benchmark 100%, YTD 95%, Yellow
• Allergy documentation: benchmark 100%, YTD 100%, Green
• Discharge education documented: benchmark 100%, YTD 100%, Green
• Medical necessity denials: benchmark 75, YTD 120, Red
• Medication errors—chemo: benchmark 0, YTD 0, Green

Green status shows that the indicator has satisfactorily met the benchmark; yellow status prompts the administrator to monitor the indicator closely; red status signals program leaders to investigate causes and develop a plan for corrective action.

Source: Licking Memorial Hospital, Newark, Ohio.
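The flagging step can be sketched as a small function. This is an illustrative sketch only — the single yellow-band threshold is an assumption, not Licking Memorial's published rule — but it reproduces the statuses of several rows on the excerpted dashboard.

```python
def flag_indicator(value, benchmark, higher_is_better=True, yellow_band=0.10):
    """Return 'green', 'yellow', or 'red' by comparing a measurement to its
    benchmark. `yellow_band` (hypothetical) is the relative shortfall
    tolerated before a red flag is raised."""
    if higher_is_better:
        shortfall = (benchmark - value) / benchmark
    else:
        shortfall = (value - benchmark) / benchmark
    if shortfall <= 0:
        return "green"      # met or beat the benchmark
    if shortfall <= yellow_band:
        return "yellow"     # close to benchmark: watch closely
    return "red"            # well off benchmark: action plan needed

print(flag_indicator(100, 100))                         # green (allergy documentation)
print(flag_indicator(95, 100))                          # yellow (pain assessment)
print(flag_indicator(120, 75, higher_is_better=False))  # red (medical necessity denials)
```

In practice the band would likely vary by indicator — pre-hydration at 97% is marked green on the actual dashboard — so per-indicator thresholds would replace the single default used here.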


Practice #5 : Cascading Dashboards

Some cancer centers have developed an integrated system of dashboards to monitor their programs at various levels, from an individual department such as radiation oncology, to the overall service line, to the aggregate performance in oncology across multiple programs in a health system.

The cancer institute at Cleveland Clinic Health System—Eastern Region uses a multi-level oncology dashboard, maintained on a shared network drive, to track key indicators across four system hospitals in the region. The cascading, online nature of the dashboard allows real-time access to indicators at an appropriate level of detail for managers, the cancer center director, and system executives. Provided below are examples of three different levels of the cancer center's dashboard.

At each hospital, a spreadsheet-oriented dashboard is maintained for each oncology business unit, such as medical oncology or clinical research, as well as for services of strategic importance such as the breast center and gynecologic oncology. The first spreadsheet below illustrates the most granular level of the dashboard: statistics from the outpatient oncology business unit at Euclid Hospital, one of four institutions within the health system.

Each of these departmental spreadsheets automatically feeds into a master summary maintained by the cancer center director. This master dashboard, shown second below, is a series of page-length spreadsheets exhibiting the cumulative performance of the four system hospitals in each department. The master summary provides a thorough snapshot of the cancer center's budget and operational performance, and allows the director to compare trends in oncology in aggregate or by business unit across multiple sites in a health system.

From the cancer center director's master summary, dashboard indicators are rolled up into customized reports for hospital executives. Shown last is the report for Hillcrest Hospital's chief operating officer, an eight-indicator statistical summary that includes year-to-date performance, budget variance, and the percentage change from the previous fiscal year.
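The cascading roll-up can be sketched in a few lines of code. The hospital and indicator names below follow the exhibits, but the code itself is a hypothetical illustration, not Cleveland Clinic's spreadsheet logic: departmental figures aggregate into a master summary, and a filtered view of that summary becomes an executive report.

```python
# Departmental dashboards: {hospital: {business_unit: {indicator: value}}}
departmental = {
    "Euclid":    {"medical_oncology": {"visits": 2725}},
    "Hillcrest": {"medical_oncology": {"visits": 30007}},
    "Huron":     {"medical_oncology": {"visits": 3007}},
}

def master_summary(departmental):
    """Sum each business unit's indicators across all hospitals."""
    master = {}
    for units in departmental.values():
        for unit, metrics in units.items():
            unit_totals = master.setdefault(unit, {})
            for name, value in metrics.items():
                unit_totals[name] = unit_totals.get(name, 0) + value
    return master

def executive_report(master, indicators):
    """Roll selected indicators up into a customized executive view."""
    return {unit: {k: v for k, v in metrics.items() if k in indicators}
            for unit, metrics in master.items()}

summary = master_summary(departmental)
print(summary["medical_oncology"]["visits"])  # 35739
```

The same `master_summary` output serves both the director (full detail by business unit) and executives (a short list of indicators), which is the essence of the cascading design.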

Note: The data contained in these dashboards are for demonstration purposes only and do not reflect actual performance—the Advisory Board has modified all data on the dashboards presented to protect the competitive position of the institutions profiled.


Euclid Hospital Ambulatory Oncology Summary 2002
(Monthly counts, January–October; November, December, and Total columns pending at time of capture)

# Chemo: 29, 27, 22, 36, 28, 29, 31, 32, 27, 34
# Blood: 22, 27, 20, 23, 22, 22, 23, 23, 24, 26
# Platelets: 21, 23, 21, 25, 23, 24, 21, 23, 23, 23
# Injections: 60, 63, 79, 82, 91, 71, 85, 101, 89, 199
# Procedures: 28, 27, 20, 23, 23, 25, 26, 24, 27, 25
# Lab Draw: 55, 46, 50, 52, 56, 55, 56, 61, 61, 61
# Other: 26, 26, 24, 27, 37, 40, 35, 39, 43, 43
Research Accruals: 0, 20, 20, 20, 21, 21, 21, 30, 20, 20
Total # of visits: 198, 120, 299, 226, 231, 124, 231, 248, 241, 255

Hillcrest Hospital
Cleveland Clinic Health System Oncology Operating Dashboard—October 2002
(Columns: Indicator, YTD 2002, 01/02 Change, Budget, Budget Variance)

Newly Diagnosed Cases: 2,000 | 65 | 1,925 | (75)
Radiation Oncology Treatments: 30,000 | 2,000 | 28,000 | 2,000
Medical Oncology Visits: 30,007 | 2,496
Chemo Gross Revenue: 60,000,000 | 15,000,000 | 30,000,000 | 10,000,000
GYN Surgical Volume (IP Surgeries): 250 | 104 | 180 | 20
GYN Surgical Revenue: 2,000,000
Service Line Productivity: 511%
Patient Satisfaction: 5.00 | 4.99
Clinical Research Accruals: 80 | n/a | n/a | n/a

Master summary tabs: Radonc, Medonc, Gynonc, Registry, Breast, Productivity, Research

Medical Oncology Volume—Ambulatory
Cleveland Clinic Health System—Eastern Region
(Columns: Euclid 2001/2002, Hillcrest 2001/2002, Huron 2001/2002)

January: 240, 127 | 2,799, 1,712 | 377, 371
February: 104, 132 | 2,689, 2,624 | 211, 271
March: 96, 128 | 2,554, 3,143 | 207, 255
April: 96, 243 | 2,460, 2,240 | 300, 213
May: 129, 241 | 2,301, 3,061 | 247, 319
June: 93, 199 | 3,258, 3,173 | 218, 246
July: 103, 131 | 2,460, 3,575 | 313, 292
August: 104, 158 | 2,197, 3,076 | 297, 217
September: 106, 131 | 2,092, 1,950 | 125, 007
October: 179, 255 | 3,423, 3,007 | 176, 103
November: 154, 0 | 2,100, 0 | 253, 0
December: 96, 0 | 5,139, 0 | 345, 0
Total: 2,007, 2,650 | 39,942, 12,345 | 2,211, 2,700
Year-to-date: 1,050, 2,725 | 28,007, 30,007 | 2,199, 3,007

Cascading Dashboards at Cleveland Clinic Health System—Eastern Region

Spreadsheets for different oncology business units upload into the cancer center director's master summary, which in turn rolls up to customized reports for system executives.

Source: Cleveland Clinic Health System—Eastern Region, Cleveland, Ohio.


III. Eight Larger Lessons on Oncology Dashboards

#1 Dashboards Provide Meaningful "Pulse Check" of Overall Performance
Simultaneous tracking of key financial, operational, and clinical drivers provides a constant overview of the service line and facilitates timely intervention in the event of unexpected change.

#2 Primary Role of Dashboard to Surface Problems, Not Diagnose Root Causes
Integral to a dashboard's summary purpose is reducing a complex program's overall performance to a small, manageable number of indicators; data needed for in-depth analysis should be available at arm's length, but not in the dashboard itself.

#3 Meaningful Dashboards Reinforce Strategic Intent
Effective dashboards do more than just monitor performance to avoid flashpoints; reaching performance targets for key metrics should advance the program toward "big picture" strategic goals.

#4 Use of Dashboards a Dynamic, Evolutionary Process
Annual revision of indicators and benchmarks is necessary to "weed out" ineffective metrics and incorporate new priorities, service offerings, and technologies.


#5 Comparative Benchmarks Key to Data Interpretation
Targets and thresholds provide context for understanding performance data; while most programs benchmark current performance against internal trend data, progressive centers actively seek and develop external standards.

#6 Not All Red Flags Created Equal
While most dashboards include "action triggers"—specific thresholds below which immediate action must be taken—progressive programs are instituting triage mechanisms to guide the appropriate level of response to each red flag.

#7 Inclusion of Action Plan Maximizes Dashboard Utility
Incorporating action steps within the dashboard itself helps to hardwire the improvement process and ensure continued progress toward "righting the ship."

#8 Dashboards Fuel Accountability on Departmental and Individual Level
Measuring and monitoring performance in a quantitative, data-driven manner encourages goal-setting and provides an educational avenue for communication with staff.


Dashboards of Leading Cancer Centers

A Note to the Reader
The data contained in these dashboards are for demonstration purposes only and do not reflect actual performance—the Oncology Roundtable has modified data on the dashboards presented to protect the competitive position of the institutions profiled.

• Cleveland Clinic Health System—Eastern Region

• Duke University Hospital

• Gannet Medical Center1

• Licking Memorial Hospital

• Perkins Hospital1

• Providence Hospital and Medical Center

• SwedishAmerican Health System

• VCU Health System

• Watkins Medical Center1

1 Pseudonymed institution.


Dashboard in Brief

Institution: Cleveland Clinic Health System—Eastern Region, Cleveland, Ohio
Length: Multi-page spreadsheet
Update Frequency: Every month
Highlights: Dashboards for individual oncology business units maintained at four system hospitals in the region; data rolls up to a master summary spreadsheet monitored by the executive director of the cancer institute, as well as to customized reports for the COO and other executives.

Dashboard Indicators

Radiation Oncology
• Volume: Actual treatments
• Volume: Consults
• Volume: New starts
• Volume: Old starts
• Volume: Prostate brachytherapy

Medical Oncology
• Total volume
• New patients

Gynecologic Oncology
• Visits
• New patients
• Surgeries
• Stand-by

Cancer Registry
• Newly diagnosed cases by tumor site (bladder, breast, colorectal, lung, prostate, other)

Breast Center
• Number of diagnosed cases
• Number of diagnostic procedures
• Number of definitive procedures
• Number of sentinel node procedures
• Number of reconstructions

Oncology Productivity
• FTEs earned versus FTEs actual for administration and oncology programs at four system hospitals

Infusion Therapy
• Total gross revenue

Clinical Research
• Patient accruals
• Patients followed

Financial
• Budget variance
• Oncology pharmacy gross revenue
• Ambulatory oncology technical gross revenue
• APC trending

Patient Satisfaction
• Satisfaction with oncologist
• Satisfaction with staff
• Overall satisfaction
• Number of surveys


Dashboard Presentation (Hillcrest Hospital excerpts)

Radiation Oncology: a monthly spreadsheet (January–December, plus Total, YTD, and Annualized rows) with columns for Actual Treatments (2001, 2002, and 2002 budget), Consults, New Starts, Old Starts, and Prostate Brachytherapy (each 2001 vs. 2002).

Breast Center (YTD 2002, January–October): Total Number of Breast Cases; Palpable Procedures and cancers diagnosed palpably; ABBI Procedures and cancers diagnosed with ABBI; Needle Localizations and cancers diagnosed with needle localization; Number of Procedures; Number of Cancers; Definitive Procedures—Lumpectomy/Partial with Axillary Dissection, Lumpectomy (breast conservation), Modified Mastectomy, Simple Mastectomy; Total Definitive Procedures; Sentinel Node Procedure; Reconstruction—Tissue Transfer (TRAM flap), Implant.


Dashboard in Brief

Institution: Duke University Hospital, Durham, North Carolina
Length: Two pages
Update Frequency: Every two months
Highlights: Oncology dashboard includes all indicators from Duke University Hospital's dashboard, as well as an additional nine oncology-specific measures. Dashboard revised annually.

Dashboard Indicators

Financial Perspective
• Cost/adjusted discharge
• Percentage operating profit margin
• Operating income
• ALOS
• Supply dollars per adjusted discharge
• FTE/adjusted occupied bed (before labs)
• Salaries and benefits as a percentage of net revenue
• Pharmacy average variable direct cost/case (inpatient)
• Pharmacy average variable direct cost/case (outpatient)
• Total actual FTE/flex budget FTEs

Work Culture Perspective
• Overall employee turnover
• Nurse turnover
• Employee satisfaction
• Vacancy rate—for approved positions
• Vacancy rate—days to hire
• Mandatory training—HEICS training
• Mandatory training—HIPAA
• Participation in continuing education
• Number of publications/presentations

Customer Perspective
• Patient satisfaction—overall inpatient
• Patient satisfaction—likelihood to recommend
• Patient satisfaction—ED
• Positive/negative comment ratio
• Patient satisfaction—teamwork
• Patient satisfaction—overall outpatient
• Patient satisfaction—pain management

Clinical Quality/Internal Business Perspective
• Medication safety (preventable ADE with S.I.>2/all ADE)
• Patient flow (discharged between 7 AM and 11 AM)
• MRSA acquisition rate
• Nosocomial pressure ulcer rate
• Effectiveness of selected recommendations from Root Cause Analysis Task Forms
• Preventable ADE S.I.>2—heparin
• Preventable ADE S.I.>2—insulin
• Preventable ADE S.I.>2—opiates
• Adherence to standard drip concentration policy
• Unplanned return to OR <14 days
• Clinical quality meta measure


Dashboard Presentation

Duke University Hospital
Oncology CSU Balanced Scorecard: Current Performance vs. Target

Each measure is presented in a two-page grid, grouped by perspective (Financial, Work Culture, Customer, and Clinical Quality/Internal Business), with columns for the DUH target, the oncology target, current performance, and whether the oncology target was met. The measures match the indicator list above.


Dashboard in Brief

Institution: Gannet Medical Center1
Length: One page
Update Frequency: Every two months
Highlights: Data-driven dashboard predominantly includes volume and financial measures.

Dashboard Indicators

Volume Indicators

Tumor registry volume for three system hospitals

Radiation oncology new consults

Radiation oncology units of service

Infusion center total volume

Cancer Care Associates (surgical practice) new patients

Cancer Care Associates surgery volume

Oncology floor average daily census

Oncology floor LOS

Oncology floor paid hours per patient day

Inpatient discharges—cancer medicine

Inpatient discharges—cancer surgery

Cancer mortality—total

Cancer mortality—oncology floor

Hospice average daily census—(inpatient and outpatient)

Hospice admissions—(inpatient and outpatient)

Hospice swing beds—inpatient

Clinical trial accrual—treatment

Clinical trial accrual—prevention

Financial Indicators

Charges, net revenue, expenses, contribution margin, YTD comparison of charges versus expenses for the following business units:

• Oncology floor

• Radiation oncology

• Infusion center

• Cancer Care Associates

• Cancer support services

• Tumor registry

1 Pseudonymed institution.


Dashboard Presentation

I. Volume Indicators
(Columns: Actual 02 and Actual 01 for the current period; Actual 02 and Actual 01 year-to-date; Goal/Benchmark)

A. Cases (all three hospitals)
B. Radiation Oncology: 1. Consults; 2. Units of Service (productivity)
C. Infusion Center: 1. Total Volume
D. Cancer Care Associates: 1. New Patients; 2. Surgeries
E. Inpatient: 1. Average Daily Census; 2. LOS; 3. Paid Hours per Patient Day
F. Inpatient Discharges (DRG): 1. Cancer Medicine; 2. Cancer Surgery
G. Mortality: 1. Overall Cancer Mortality Rate; 2. Inpatient Cancer Mortality Rate
H. Hospice: 1. Inpatient Admissions; 2. Average Daily Census; 3. Swing Beds; 4. Hospice Outpatient Admissions
I. Clinical Trials: 1. Treatment; 2. Prevention

II. Financial Indicators
(Columns: Charges, Net Revenue, Expenses, Contribution Margin; plus a YTD comparison of charges versus expenses)

A. Inpatient
B. Radiation Oncology
C. Infusion Center
D. Cancer Care Associates
E. Cancer Center Support Services
F. Tumor Registry
Total


Dashboard in Brief

Institution: Licking Memorial Hospital, Newark, Ohio
Length: One page
Update Frequency: Monthly
Highlights: Dashboard indicators are color-coded according to the status of each measure. Dashboard revised annually.

Dashboard Indicators

Cost

Number of office visits

Number of chemo unit visits

Cost/office visit

Cost/chemo unit visit

Worked hours/unit in office

Worked hours/unit in chemo unit

Overtime hours

Days in A/R for office

Quality/Efficiency

Collections +/– number of take actions

Number of outstanding receipts

Outstanding receipts ($)

Percentage cash collections

Percentage late postings

Number on reconciliation report

Patient education documented in electronic medical record

Pain assessment documented

Documentation of pain level within one hour after medicated

Chemo consent form complete—new patients

Pre-hydration per chemo guideline

Allergy documentation

Consent to bill signed

Directives are documented

Medication errors—chemo

Medical necessity denials

Discharge education documented

Number of mislabeled/unlabeled lab specimens

Customer Satisfaction

Percentage of white-glove survey indicators accepted

Patient satisfaction from surveys (chemo unit)

Patient satisfaction (1st and 3rd quarters)

Physician satisfaction (2nd and 4th quarters)

Internal customer satisfaction (bi-annual)


Indicator Measurement Benchmark Jan Feb Mar YTD Avg Status

Cost

Number of office visits 466 525 395 445 480

Number of chemo unit visits 275 340 304 328 325

Cost/office visit $161.00 $150.00 $150.00 $150.00 $151.25

Cost/chemo unit visit $119.33 $130.00 $100.00 $100.00 $105.00

Worked hours/unit in office 2.8 2.5 3 2.5 3

Worked hours/unit in chemo unit 2.7 2 2.5 2 19

Overtime hours 0 0 5 19 15

Days in A/R for office 45 37 37 37 31

Quality/Efficiency

Collections plus/minus # of take actions 15 13 18 16 9

# Outstanding receipts 0 6 1 14 4

Outstanding receipts ($) 0 $65 $15 $200 $46

% Cash collections 100% 89% 133% 115% 102% Yellow

% Late postings 20% 8% 1% 5% 3% Green

# on Reconciliation report 0 0 0 0 0

Patient education documented in EMR 100% 80% 100% 38% 76% Yellow

Pain assessment documented 100% 80% 100% 100% 95% Yellow

Documentation of pain level within one hour after medicated 100% None None 80%

Chemo consent form complete—new patients 100% 100% 100% 100% 100% Green

Pre-hydration per chemo guideline 100% 75% 100% 100% 97% Green

Allergy documentation 100% 100% 100% 100% 100% Green

Consent to bill signed 100% 90% 95% 100% 82% Yellow

Directives are documented 100% 95% 90% 100% 98% Yellow

Medication errors—chemo 0 0 0 0 0 Green

Medical necessity denials 75 142 64 28 42 Green

Discharge education documented 100% 100% 100% 98% Yellow

# Mislabeled/unlabeled lab specimens 0 1 0 1 0 Yellow

% White-glove survey indicators acceptable 100%

Customer Satisfaction

Patient satisfaction from satisfaction surveys (chemo unit) 90% 100% 98% 98% 98% Green

Patient Satisfaction (1st and 3rd quarters) 90% 99% 99% Green

Physician Satisfaction (2nd and 4th quarters) 90% 97% Green

Internal customer satisfaction (Bi-annual) 90% 89% Yellow

Dashboard Presentation

Licking Memorial Hospital
Process Improvement Dashboard
FY 2002


Dashboard in Brief

Institution: Perkins Hospital1
Length: One page
Update Frequency: Every quarter
Highlights: Top of dashboard summarizes current service line issues and recent accomplishments; the middle section holds volume and financial indicators for each of the major oncology business units; the bottom section, the focus of which changes each quarter, presents trends in areas such as profitability and patient satisfaction.

1 Pseudonymed institution.

Dashboard Indicators

Volume Indicators

Total inpatient discharges

• Cancer medicine discharges

• Cancer surgery discharges

Total outpatient visits

• Ambulatory treatment center visits

• Infusion center visits

• Medical oncology/hematology center visits

• Radiation oncology visits

Total home care visits

Total hospice visits

YTD variance for all indicators

Financial Indicators

Inpatient charges

Outpatient charges

Home care charges

YTD variance for all indicators


Perkins Hospital
Oncology Product Line Report: August 2002

Current Program Highlights and Issues

• Biannual strategic plan completed
• Expansion of the ambulatory treatment center completed
• The 2001 Annual Cancer Program Report distributed to 2,400 physicians, administrators, and other constituents
• Overall patient satisfaction was 96.9 percent for inpatients and 97.4 percent for outpatients in the most recent quarter

By the Numbers

Volume Indicators | August Actual 2002 | August Actual 2001 | Fiscal YTD Actual 2002 | YTD Budget 2002 | YTD Variance
Inpatient Discharges—Total | 197 | 200 | 409 | 422 | (3.1%)
  Cancer Medicine | 84 | 80 | 174 | 167 | 4.2%
  Cancer Surgery | 113 | 120 | 235 | 255 | (7.8%)
Outpatient Visits—Total | 3,719 | 3,410 | 7,626 | 7,246 | 5.2%
  Ambulatory Treatment Ctr. | 804 | 792 | 1,695 | 1,448 | 17.1%
  Infusion Center | 317 | 393 | 669 | 760 | (12.0%)
  Med. Onc./Hem. Ctr. | 504 | 447 | 988 | 938 | 5.3%
  Radiation Oncology | 2,094 | 1,778 | 4,274 | 4,100 | 4.2%
Home Care Visits—Total | 1,241 | 1,655 | 2,319 | N/A | N/A
Hospice Visits—Total | 81 | 35 | 157 | N/A | N/A

Financial Indicators | August Actual 2002 | August Actual 2001 | Fiscal YTD Actual 2002 | YTD Budget 2002 | YTD Variance
Inpatient Charges | $6,230,428 | $8,114,541 | $12,875,254 | $13,151,646 | (2.1%)
Outpatient Charges | $1,654,955 | $1,466,300 | $3,416,448 | $3,202,732 | 6.7%
Home Care Charges | $134,065 | $181,822 | $250,367 | $362,703 | (31.0%)
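The YTD variance figures in the Perkins report follow the usual (actual minus budget) divided by budget convention, with unfavorable variances shown in parentheses. A short sketch:

```python
# YTD variance as reported on the Perkins product line report:
# (YTD actual - YTD budget) / YTD budget, negatives in parentheses.

def ytd_variance(actual: float, budget: float) -> str:
    pct = (actual - budget) / budget * 100
    return f"({abs(pct):.1f}%)" if pct < 0 else f"{pct:.1f}%"

# Reproduces the reported figures: 409 YTD discharges against a budget
# of 422 yields (3.1%); $3,416,448 YTD outpatient charges against
# $3,202,732 budgeted yields 6.7%.
print(ytd_variance(409, 422))              # (3.1%)
print(ytd_variance(3_416_448, 3_202_732))  # 6.7%
```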

Trends and Updates

Inpatient Profitability / Outpatient Profitability

[Paired bar charts, FY 2001 vs. FY 2002, in millions of dollars. Labeled FY 2002 values:]
Inpatient: Gr. Rev. 71.8; Net Pmt. 34.8; Dir. Cost 18.5; Dir. Mgn. 16.3; OH 10.1; Inc. 6.2
Outpatient: Gr. Rev. 18.2; Net Pmt. 9.3; Dir. Cost 6.3; Dir. Mgn. 3.0; OH 4.5; Inc. (1.5)

Inpatient profitability increased from $5.5 million in FY 2001 to $6.2 million in FY 2002 on 159 more discharges.

Outpatient services lost $1.5 million in FY 2002 versus $1.1 million in FY 2001 on 5,182 more visits.

Both inpatient and outpatient services contribute positive profit margin prior to allocation of overhead.

Dashboard Presentation


Dashboard in Brief
Institution Providence Hospital and Medical Center, Southfield, Michigan

An affiliate of St. John Health System

Length One page

Update Frequency Every quarter

Highlights Oncology dashboard presented as radar graph and in spreadsheet format. Each selected indicator is tied to the oncology strategic plan. Dashboard revised annually.

Dashboard Indicators

Fiscal Year 2002 Dashboard Indicators

Increase in analytic cases

Increase in radiation oncology treatments

Quality: Increase clinical trials

ALOS

Satisfaction: Pain

Satisfaction: Overall inpatient

Satisfaction: Overall radiation oncology

Cost per case

Margin per case

Fiscal Year 2001 Dashboard Indicators

Growth: Admissions

Growth: Radiation therapy

Quality: Overall LOS

Satisfaction: Inpatient

Satisfaction: Radiation oncology

Expense per case


Dashboard Presentation

Oncology Balanced Scorecard: FY02
[Radar graph, 1st and 2nd quarters, scale 0–250. Axes: Increase dx Cases; Increase Rad Onc Treatments; Quality: Increase Trials; ALOS; Satisfaction: Pain; Satisfaction: Overall IP; Satisfaction: Overall Rad Onc; Cost/Case; Margin/Case]

Oncology Balanced Scorecard: FY01
[Radar graph, 1st and 2nd quarters, scale 0–120. Axes: Growth: Admissions; Growth: Rad Ther; Quality: Overall LOS; Satisfaction—IP; Satisfaction—Rad Onc; Expense/Case]

Providence Hospital and Medical Centers
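Plotting indicators with different units (ALOS in days, satisfaction in percent, margin in dollars) on a single radar graph generally requires indexing each one to a common scale. The brief does not state Providence's scaling formula; a baseline-of-100 index is one common approach, sketched here purely as an assumption.

```python
# Hypothetical indexing scheme for a radar-graph scorecard: each indicator
# is expressed relative to a baseline (e.g., goal or prior-year value),
# with 100 meaning "at baseline." Not taken from the report.

def index_to_baseline(actual: float, baseline: float,
                      higher_is_better: bool = True) -> float:
    """Index an indicator so that 100 = baseline performance."""
    if higher_is_better:
        return round(actual / baseline * 100, 1)
    # For "lower is better" measures such as ALOS or cost per case,
    # invert so that beating the baseline still plots above 100.
    return round(baseline / actual * 100, 1)

print(index_to_baseline(98, 90))                            # satisfaction above goal
print(index_to_baseline(5.2, 6.0, higher_is_better=False))  # shorter ALOS scores >100
```

Indexing this way lets a quarter-over-quarter radar show all nine FY02 indicators on one 0–250 scale.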


Dashboard in Brief
Institution SwedishAmerican Health System, Rockford, Illinois

Length Multiple dashboards

Update Frequency Quarterly

Highlights Dashboards maintained for each cancer center department or business unit. The leadership indicator dashboard is weighted most heavily and provides the most thorough snapshot of cancer program performance.

Dashboard Indicators

Leadership Indicators

Discharges
Patient days by DRG
ALOS
Outpatient hematology/oncology visits
Radiation oncology visits
Infusion room visits
Inpatient market share
Patients referred outside system for other oncology procedures (top five sites for 2002)
Utilization of home health care services for volumes of oncology referrals
Inpatient profitability by DRG
Outpatient contribution margin
Outpatient profitability by business unit
Radiology and laboratory revenue generated by the oncology service line

Radiation Oncology Indicators

Accuracy of patient's final radiation oncology billing with APC
One-year post-lumpectomy patient satisfaction
Prostate seed implant patient overall satisfaction
PSA score one-month post-seed implant
Radiation oncology pain management scores
Radiation oncology patient satisfaction with pain management
Patient willingness to recommend
Fatigue scores

Greater Rockford Hematology Oncology Center Indicators

Complete reporting of test results
Percentage standardization of antiemetic usage
Patient satisfaction
Willingness to recommend
Hematology oncology pain management scores
Hematology oncology patient satisfaction with pain management
Outreach oncology clinic activity
Communication back to referring physician
Communication to primary care physician
Medication errors

Cancer Registry Indicators

Cancer registry abstract completion
Stage II breast cancer patients receiving chemotherapy
Rituximab use among low-grade NHL patients
Hormone receptor positive patients receiving tamoxifen

Complementary Medicine Indicators

Patient visits to Midwest Center for Health and Healing
Radiation oncology patient needs for complementary therapy


Dashboard Indicators, Continued

Clinical Research Indicators

Enrollment in breast cancer prevention STAR trial
Adult cancer cases enrolled in clinical trials
Enrollment in prostate cancer prevention SELECT trial
Compliance with required protocol parameters

Professional and Public Education Indicators

Attendance at SwedishAmerican Health System Oncology Conference
Conference evaluation ratings
Percentage of physician conference attendees reporting resultant change in patient care patterns
Graduating classes of School of Radiation Therapy passing the ARRT certification with scores above the Illinois and national average
Time from screening to diagnostic mammogram
Time from diagnostic to stereotactic biopsy
Time from diagnostic mammogram to surgical biopsy

Breast Health Care Strategic Clinical Benchmarks

Volume of newly diagnosed breast cancer patients using CBCT
Patient satisfaction with CBCT
Time from stereotactic biopsy to surgery
Time from surgical biopsy to surgery
Time from screening to surgery
Patient stage at diagnosis compared to ability to pay
Use of RT following breast conservation surgery
Survival rate for breast cancer stages I–IV
Patients diagnosed with breast cancer at SwedishAmerican but not treated within system
Stage II patients receiving chemotherapy
Hormone receptor positive patients receiving tamoxifen
Number of partial mastectomies
Number of simple mastectomies
Number of radical mastectomies

Dashboard Presentation

SwedishAmerican Health System Breast Health Care Strategic Clinical Benchmarks FY2003

Indicator | Goal | Measurement Tool | Results | Action Plan
Time from screening mammogram to diagnostic mammogram | 1–7 days | BHC data | |
Time from diagnostic mammogram to stereotactic biopsy | 1–7 days | BHC data | |
Time from diagnostic mammogram to surgical biopsy | 7–14 days | BHC data | |
Volume of newly diagnosed breast cancer patients using CBCT | 30% of newly diagnosed | BHC data | |
Patient satisfaction with CBCT | 4 or higher on all survey questions | Patient satisfaction survey | |
Patient stage at diagnosis compared to ability to pay | Benchmark to 2001 registry data | Cancer registry data | |
Use of RT following breast conservation surgery | 100% | Cancer registry data | |
Survival rate for breast cancer stages I–IV | Meet or exceed NCDB | Cancer registry data | |


Dashboard in Brief
Institution VCU Health System, Massey Cancer Center, Richmond, Virginia

Length Two pages

Update Frequency Every two months

Highlights Dashboard modeled after VCU Health System balanced scorecard.

Dashboard Indicators

Financial Indicators

Inpatient contribution margin
Outpatient contribution margin
Inpatient direct cost per day
Inpatient total cost per day
Outpatient direct cost per encounter
Outpatient total cost per encounter

Learning and Growing

Percentage of employee evaluations submitted on time
Overall employee turnover rate
Exit interview results
Vacancy rate
Number of HR offers declined/number offered
Medical staff turnover
Nominations for employee of the month
Oncology nursing classes
Publications
Research grants (total costs)

Customer

Inpatient market share, primary service area
Ambulatory clinic visits
Inpatient discharges
Inpatient days
ALOS
Patient satisfaction (inpatient, outpatient, BMT, radiation)
Number of patients on clinical trials
Community education activities (number of events)

Internal Business

• Cost Observed:Expected ratio
– Inpatient total mastectomies
– Inpatient colorectal cancer surgeries

• LOS Observed:Expected ratio
– Inpatient total mastectomies
– Inpatient colorectal cancer surgeries

Percentage of patients discharged before noon
Days to clinic appointment—HemOnc
Days to clinic appointment—Breast clinic
Days to clinic appointment—Chest
Days to clinic appointment—GI
Days to clinic appointment—Urological tumor


Dashboard Presentation

Medical College of Virginia Hospitals and Physicians of VCU Health System
Massey Cancer Center
Balanced Scorecard Oncology Service Line

Customer

Measure | Benchmark | FY01 | FY02 | Total | Definition, notes

Inpatient Market Share, Primary Service Area

Solucient state database. 52 DRGs for all areas of oncology.

Ambulatory Clinic Visits

• MCVH—Dalton MCVH. 1 count per patient per date for MAS code = 326. All Docs/Divisions.

• MCVH—BMT MCVH. 1 count per patient per date for MAS code = 302.

• Multi-Disciplinary All clinics/divisions/departments/specialties.

• MCVP—Nelson HemOnc

• MCVP—Stony Point HemOnc

• Amb Clinics—Total

Inpatient Discharges – from service line financial report; HemOnc and SurgOnc only.

Inpatient Days – from service line financial report; HemOnc and SurgOnc only.

Average LOS – from service line financial report; HemOnc and SurgOnc only.

Patient Satisfaction— Overall Quality of Care

• Inpatients—North 6 Benchmark: PRC national mean. 100 = “Excellent,” 80 = “Very Good.” Not limited to svcline patients.

• Inpatients—BMT Benchmark: PRC national mean. 100 = “Excellent,” 80 = “Very Good.” Not limited to svcline patients.

• Ambulatory—RadOnc Benchmark: PRC national mean. 100 = “Excellent,” 80 = “Very Good.” Not limited to svcline patients.

• Ambulatory—Dalton Benchmark: PRC national mean. 100 = “Excellent,” 80 = “Very Good.” Not limited to svcline patients.

Number of Patients on Clinical Trials

• Participants on Follow-Up on-trial and on-follow-up; some undercount probable.

• New Accruals on-trial and on-follow-up; some undercount probable.

Community Educational Activities (no. of events)


Dashboard Presentation, Continued

Learning and Growing

Measure | Benchmark | FY01 | FY02 | Total | Definition, notes
% of Employee Evaluations Submitted On Time
Overall Employee Turnover Rate – New source for information as of 5/1/02
Exit Interview Results
Vacancy Rate
Number of HR Offers Declined/Number Offered
Medical Staff Turnover
Nominations for Employee of the Month – Source: Sandra Gellings, Vicki Olson, Alison Hopson
Oncology Nursing Classes
Publications
Research Grants (total costs)

Financial

Measure | Benchmark | FY01 | FY02 | Total | Definition, notes
Inpatient Contribution Margin – Service Line Only (exclud BMT). FY02 costs estimated. Costs fully loaded. Reimb = blended.
Outpatient Contribution Margin – Service Line Only (exclud BMT); includ PedsHemOnc in FY02. FY02 costs est. Costs fully loaded. Reimb = blended.
Inpatient Direct Cost per Day – Service Line Only (exclud BMT). FY02 costs estimated. Costs fully loaded. Reimb = blended.
Inpatient Total Cost per Day – Service Line Only (exclud BMT). FY02 costs estimated. Costs fully loaded. Reimb = blended.
Outpatient Direct Cost per Encounter – Service Line Only (exclud BMT); includ PedsHemOnc in FY02. FY02 costs est. Costs fully loaded. Reimb = blended.
Outpatient Total Cost per Encounter – Service Line Only (exclud BMT); includ PedsHemOnc in FY02. FY02 costs est. Costs fully loaded. Reimb = blended.

Internal Business

Measure | Benchmark | FY01 | FY02 | Total | Definition, notes

Cost Observed:Expected Ratio – benchmark = median ratio for 10 UHC institutions chosen for comparison.
Inpatient Total Mastectomies – DRG 257 & 258
Inpatient Colo-Rectal Cancer Surgeries – DRG 148 & 149 with cancer principal dx
LOS Observed:Expected Ratio – benchmark = median ratio for 10 UHC institutions chosen for comparison.
Inpatient Total Mastectomies – DRG 257 & 258
Inpatient Colo-Rectal Cancer Surgeries – DRG 148 & 149 with cancer principal dx
% Patients Discharged before Noon – includes inpatient, observation, and ambulatory.
Days to Clinic Appointment—HemOnc – average # days for first 15 patients requesting appt.
Days to Clinic Appointment—Breast Clinic – average # days for first 15 patients requesting appt.
Days to Clinic Appointment—Chest – average # days for first 15 patients requesting appt.
Days to Clinic Appointment—GI – average # days for first 15 patients requesting appt.
Days to Clinic Appointment—Urological Tumor – average # days for first 15 patients requesting appt.
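The Observed:Expected measures above divide actual cost (or LOS) by a risk-adjusted expected value, then compare the result to the median ratio of the ten UHC comparison institutions. A minimal sketch with illustrative numbers, none of which come from the report:

```python
# Observed:Expected ratio and a peer-median benchmark, as used in the
# VCU internal-business measures. All values below are illustrative.
from statistics import median

def oe_ratio(observed: float, expected: float) -> float:
    """O:E ratio; below 1.0 means cost or LOS ran lower than expected."""
    return round(observed / expected, 2)

# Hypothetical ratios for ten comparison institutions; the benchmark is
# their median.
peer_ratios = [0.91, 0.97, 1.02, 1.05, 1.10, 0.88, 1.00, 0.95, 1.08, 0.99]
benchmark = median(peer_ratios)

print(oe_ratio(observed=4.6, expected=5.0))  # 0.92
print(benchmark)
```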


SRT HEAD FRAME REPRODUCIBILITY

PERFORMANCE MEASURE: To measure the number of SRT head frames repeated due to observed poor fit or measured tolerance changes (maximum allowable = 1.5 mm).

COMMITTEE/OWNER: Chief Therapist, Department of Radiation Oncology

FINDINGS/DATA:

[Bar chart: % repeats of SRT head frames per period; y-axis 0.0%–50.0%. Periods: Nov–Jan 01, Feb–May 01, Jun–Oct 01, Nov–Jan 02, Jan–Apr 02, May–Aug 02]

DEFINITION: The purpose is to decrease the number of repeat head frames.
DATA SOURCE: Impac® system
CONCLUSIONS/RECOMMENDATIONS
ACTION PLAN/FOLLOW-UP
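The repeat-rate percentage this dashboard charts is simply head frames remade (for poor fit or a tolerance change beyond 1.5 mm) divided by frames made in the period. The counts below are illustrative; the report charts only percentages.

```python
# Repeat-rate metric for SRT head frames, as a percentage of frames
# made in a reporting period. Counts are hypothetical examples.

def repeat_rate(repeated: int, total_frames: int) -> float:
    """Percent of SRT head frames that had to be remade in a period."""
    return round(repeated / total_frames * 100, 1)

print(repeat_rate(3, 40))  # 7.5 (% repeats for a hypothetical period)
```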

Dashboard Presentation

Dashboard in Brief
Institution Watkins Medical Center1

Length Approximately 80 different one-page dashboards

Update Frequency Every quarter

Highlights Institution maintains approximately 80 dashboards, each examining a discrete indicator such as patient wait times, medication errors, patient satisfaction, linear accelerator downtime, and stereotactic radiotherapy head frame reproducibility. Process improvement committee reviews between 8 and 10 dashboards each month. Dashboards examine an indicator’s current performance levels, trends over time, and action plans to continue or improve performance.

1 Pseudonymized institution.

