
Post on 11-Aug-2020


Automating front-end queries for physicians to drive quality documentation

David Rich, MD, FAAP

CMIO, Associate CMO, Excela Health, Greensburg, PA

Ehab Hanna, MD FHM

CMIO, Universal Health Services, Inc., King of Prussia, PA


Objectives

• 1. Understand the rationale for using front-end documentation queries as a complement to existing CDI efforts.

• 2. Know workflow and system considerations for implementing DQR with auto-trigger across multiple specialties and multiple documentation paradigms.

• 3. Appreciate the potential impact associated with use of this tool.


Who remembers this?


Then, this was invented


What’s the Healthcare Equivalent?

Proofreaders = Clinical Documentation Specialists

Spellcheck & Grammar Check = ?


Excela Health


Excela Health

• 3-hospital system in Western Pennsylvania

• 612 beds, 20,000+ discharges/year

• Mixed medical staff, 200+ employed providers, 700+ total

• Family practice residency, rotating surgical residents

• Cerner inpatient client since 2012, “other” ambulatory and emergency department systems

• Remote hosted environment

• Blended use of PowerNote and Dynamic Documentation for progress notes

• Transcription/eScription for H&Ps, Consultations, Operative Notes, and Discharge Summaries

• Limited use of front end voice recognition


Documentation Quality Timeline

• 2003 – Compliant Documentation Management Program (CDMP)
  • Nuance (formerly JA Thomas) Clinical Documentation Improvement Program
• 2012 – EMR transition
  • CDI team moved off campus
  • Response rates declined
• 2015 – Effort to improve response rates, re-educate, etc.
  • Onsite documentation specialist hired to address outstanding queries and other key documentation challenges (core measures, 2-day rule, etc.)
• 2016 – Document Quality Review (DQR)
  • Somewhat concurrent with physician optimization efforts, which included workflow views and dynamic documentation for 6 specialties


Current CDI Activity/Results

• 9 FTEs for documentation review
• 17,000+ charts reviewed/year
  • All payors, 88% of DRG cases
• 4,500+ queries sent/year
  • 24% of reviewed cases
• 86% response rate
• 90% agreement rate for responses
• $2.9 million “bump” last year (pre-DQR) related to improved case mix index (CMI) after successful query/clarification


DQR Implementation Plan

• We were asked by Cerner/Nuance to be early adoption partners, specifically with respect to the auto-trigger functionality

• While hospitalists and residents were the primary targets, we would also include specialists, particularly those who would be receiving the specialty optimization

• 3-month lead time for development, mapping, configuration, etc.
• Training
  • Lunch sessions hosted by Nuance, highlighting documentation principles
  • Pocket reference guide
• Activation
  • Manual addition of the functionality for individual providers
  • Inclusion by note type

[Screenshots: clarification examples in PowerNote, Dynamic Documentation, and a free-text note]

Immediate Realizations

Usability - General

• Users can disable the auto-trigger on their own

• Integration/output not consistent across documentation paradigms

• No easy way to run on transcribed documents in message center

• Diminishing returns on re-presenting “skipped” queries

Usability - Specialists

• Auto-refresh for assessment/plan component in Dyn Doc creates challenges for those who like to narrow the list of relevant diagnoses for their specialty

• Clarifications not always applicable to consultants

• No way to narrow the target audience by clarification type

• No way to mark as not applicable for me or my specialty


Secondary Realizations

Engine Logic

• Initially interpreting content in note headers

• NOT reviewing lab results, only those results included in documentation

• SNOMED/IMO/ICD10 correlation issues

• Cannot be run for documentation on observation patients

Technical

• Challenging to identify those who have opted out and when

• No easy way to systematically opt users back in

Reporting/Analysis

• No “out of the box” operational reporting capability comparable to that with the JA Thomas product

• Difficult to assess the financial impact of an accepted clarification


User Feedback

• Coding logic and clinical practice do not always agree

• Just because I am treating as though it may be MRSA pneumonia does not mean I want to call it MRSA pneumonia

• If my resident has already reviewed the clarification, I do not need to be presented with the same clarification

• This adds time to my workflow

• Would be nice for this to run automatically at discharge

• CDI criteria for assigning a DRG, MCC, CC are more stringent

• Our CDI folks typically want to see something substantiated more than once during the hospitalization


Data Analysis

• 6 months of data (April – September 2016)
  • All users
  • Employed hospitalists as a subgroup
• CDMP data compared with DQR data
  • Total clarifications
  • Response rates
  • Agreement rate for responses
• Nuance analysis of financial impact


DQR (Excela)

Month        Presented   Agree   Does Not Apply   Skip    Response Rate   Agreement Rate for Responses
April 2016       10524     162              300   10062              4%                            35%
May 2016         11891     227              473   11191              6%                            32%
June 2016        15114     336              583   14195              6%                            37%
July 2016        15168     231              524   14413              5%                            31%
Aug 2016          7632     270              503    6859             10%                            35%
Sept 2016         6648     260              440    5948             11%                            37%

Total Agree: 1486

CDMP (Excela)

Month        Sent   Agree   Decline   No Response   Response Rate   Agreement Rate for Responses
April 2016    362     288        25            49             86%                            92%
May 2016      345     268        34            43             88%                            89%
June 2016     340     252        36            52             85%                            88%
July 2016     327     256        25            46             86%                            91%
Aug 2016      336     269        24            43             87%                            92%
Sept 2016     356     276        29            51             86%                            90%

Total Agree: 1609
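The rate columns in both tables follow the same arithmetic: a clarification counts as answered when the provider picks Agree or Does Not Apply (Decline in the CDMP table), skipped or unanswered queries count against the response rate, and the agreement rate is Agree divided by answered. A minimal sketch of that calculation (function name is ours, not part of either product):

```python
def clarification_rates(presented: int, agree: int, does_not_apply: int) -> tuple[float, float]:
    """Return (response_rate, agreement_rate) for one month of clarifications.

    A clarification is "answered" when the provider chooses Agree or
    Does Not Apply; skipped clarifications lower the response rate.
    """
    answered = agree + does_not_apply
    response_rate = answered / presented
    agreement_rate = agree / answered if answered else 0.0
    return response_rate, agreement_rate

# April 2016 DQR row: 10,524 presented, 162 Agree, 300 Does Not Apply
rr, ar = clarification_rates(10524, 162, 300)
```

Rounded to whole percentages, this reproduces the April 2016 DQR row (4% response rate, 35% agreement rate).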


6-Month Data for Employed Hospitalists

Clarification Family                    Presented   Accepted   Does Not Apply   Response Rate   Accept Rate for Responses
ACUTE RESPIRATORY FAILURE                     298         22                7             10%                         76%
ANEMIA                                       1290         83              141             17%                         37%
ASTHMA                                        271         12               32             16%                         27%
CHRONIC OBSTRUCTIVE PULMONARY DISEASE         617         25               75             16%                         25%
CHRONIC RESPIRATORY FAILURE                   830         86               83             20%                         51%
ENCEPHALOPATHY                                882         62              106             19%                         37%
HEART FAILURE                                1937        149              217             19%                         41%
MALNUTRITION                                  883         85               79             19%                         52%
PNEUMONIA                                    1227         36              129             13%                         22%
RENAL FAILURE                                 532         23               40             12%                         37%
SHOCK                                         673         16               83             15%                         16%
SIRS                                         4270        192              611             19%                         24%
Overall                                     13710        791             1603             17%                         33%


[Charts: CMI for Excela and CMI for Hospitalists, 2015 vs. 2016]

Excela ROI-1 (April) and ROI-2 (August +)

[Chart: DQR shift, before (gray) vs. after (green)]

Impact changes                 ROI-1         ROI-2
PDX                            5  (17%)      15 (11%)
MCC                            4  (13%)      20 (15%)
CC                             1  (3%)        4 (3%)
Severity                       20 (66.7%)    93 (70%)

Summary for CMI impact cases             ROI-1       ROI-2
CMI impact (PDX, MCC, CC)                10 cases    39 cases
Assuming MS-DRG Medicare                 $58,714     $140,735

Clarification status for “agreed” cases     ROI-1      ROI-2
Agreed, documented, impact                  30         132
Agreed, but not coded                       46         41
Agreed, coded, but no impact                7          5
Measurement period                          4 weeks    5 weeks

[Charts: Severity of Illness and Risk of Mortality distributions (Minor / Moderate / Major / Extreme), before vs. after, for ROI-1 and ROI-2]


Findings

• The number of accepted clarifications is comparable to that for our pre-existing CDMP process

• Even our most incentivized/compliant providers had lower response rates than expected

• There seems to be a correlation between clarification family and likelihood to respond

• It is difficult to quantify the direct impact on CMI independent of CDMP activity, but:

• Estimated annualized impact of $1.4 million or $70/inpatient encounter
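As a rough check, the two figures in this estimate are consistent with the volume quoted earlier in the Excela profile (20,000+ discharges/year), assuming roughly 20,000 inpatient encounters:

```python
# Assumed annual inpatient volume, from the Excela profile earlier in the deck
encounters_per_year = 20_000
impact_per_encounter = 70  # dollars, per the estimate above

# 20,000 encounters x $70/encounter = $1,400,000 annualized impact
annualized_impact = encounters_per_year * impact_per_encounter
```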


Lessons Learned

• Early adoption partnership can be a valuable experience

• Invest in training

• There is such a thing as too much change at one time

• Collaboration with finance and physician leadership is key

• Not a replacement for CDMP on all levels, but the 2 processes can be complementary

• Onsite presence of documentation specialists does help

• If we can improve documentation, we can improve the patient story…


What’s Next for Us?

• Determine how to better integrate with existing CDI resources and workflows
• Reduce overlap where possible
  • What was picked up by DQR that did not need to be addressed by CDMP?
  • What was picked up by CDMP that is not in the logic for DQR?
• Enable DQR for all providers, but be selective with auto-trigger
• Provide feedback to Cerner/Nuance for software enhancements
  • See wish list


Universal Health Services, Inc.


UHS Overview

$9B REVENUE


Cerner Project Background

• UHS has 25 Acute Care facilities across the U.S.
• 3 production “domains” (East, Central, West)
• Timeline
  • Design and configure: January 2010 – April 2011
  • Installation @ 25 hospitals: May 2011 – July 2013
  • Phase 1 – Completed, core clinical modules
  • Phase 2 – Completed, inpatient CPOE + med rec
  • Phase 3 – Completed, inpatient MD documentation
• Ambulatory
  • 250 MDs on Cerner-hosted “ASP” environment (no direct UHS control)
  • Migrating them to our 3 (hospital) “domains” (direct UHS control)
  • Activating Cerner Registration and Patient Accounting as we go


Dynamic Doc Adoption


[Chart: 2014–2016 monthly transcription (TSP) spend for UHS, January–December]


Pilot setup at UHS

• Pilot started July 2016
• SWHC – Inland Valley, Rancho Springs (~120-bed facility)
• Physicians enrolled:
  • Primarily hospitalists
  • 30 MDs currently
• One week of on-site support from UHS/Cerner/Nuance
• Engaging the CDI team and CDI physician was invaluable
• Web-based training for MDs and CDI specialists


Pilot setup at UHS

• Measures
  • Quality
    • Severity of Illness
    • Observed / Expected Mortality
    • Length of Stay
    • Readmission Rate
    • Simple / Severe Sepsis Progression to Shock
  • Adoption
  • System Performance


Issues and resolution

• 1 - HIGH: Clarification statement not being anchored to the Assessment and Plan section, causing a CAC issue (DynDoc hook for all notes).
  Anticipated release: tentative PROD install, week of 10/17.
• 1 - HIGH: Clean up the clarification statement inserted into the note. Right now it lists the diagnosis 3 different times using different vocabularies (SNOMED term, Nuance preferred term, and an ICD-10 term); we would like it listed once, with the ICD-10 term.
  Anticipated release: packages are in CERT and we have begun testing with positive results; we ran into an issue with some of our Smart Templates and are looking into it with John D (trying to hit the Oct. 17 PROD week).
• 1 - HIGH: Engine updates.
  Anticipated release: CERT domain 9/20, PROD domain 9/21.
• 1 - HIGH: Clarifications not being presented on the front end, but being pulled into the report and the Summary component the CDI team is using (on the Clinical Coding Summary).
  Status: completed.


Pilot results

[Chart: average of both transactions (seconds) by pilot week, Week 2 (7/11) through Week 11 (9/12)]

Pilot results

Month       Unique Clarifications   Total Responses   Agree Responses   Response Rate   Agree Rate   Clarifications Presented Rate
July                          600               432               168             72%          39%                             48%
August                        501               342               125             68%          37%                             41%
September                     402               359               150             89%          42%                             33%


Next steps

• Roll out to 6 Vegas hospitals in 2 weeks

• “Soft roll out” to remaining hospitals over the next month.

• More formal on-site support model with Med Exec meetings over the next year


Combined early adopter wish list

• Extinction behavior for queries (i.e. after x days or x presentations for the same diagnosis)

• Ability to target queries by specialty

• Or add a “not my specialty” response option

• Prevent users from disabling the auto-trigger functionality

• Automated review on discharge

• Better “out of the box” reporting to determine impact

• Ability to easily trigger for transcribed documents in message center
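The first wish-list item can be made concrete. A minimal sketch of the requested extinction behavior, with hypothetical names and thresholds (this is not part of any Cerner/Nuance API):

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical thresholds; the wish list asks for "x days or x presentations"
MAX_PRESENTATIONS = 3
MAX_AGE_DAYS = 7

@dataclass
class Clarification:
    diagnosis: str
    first_presented: date
    presentations: int = 0

def should_present(c: Clarification, today: date) -> bool:
    """Suppress ("extinguish") a clarification for the same diagnosis once it
    has been shown too many times, or has been open too long unanswered."""
    if c.presentations >= MAX_PRESENTATIONS:
        return False
    if (today - c.first_presented).days >= MAX_AGE_DAYS:
        return False
    return True
```

With thresholds like these, a SIRS clarification skipped three times would stop reappearing, which also addresses the “diminishing returns on re-presenting skipped queries” observation from the Excela experience.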


General reminders

• Education sessions: complete a post-session survey for each session in the Cerner Events app
• Continuing education: complete the attestation survey, available Nov. 16 – Dec. 16