©2015 HCPro, a division of BLR. All rights reserved. These materials may not be duplicated without express written permission.
Using Metrics and Data Analytics to Support CDI Current and Future States
Julia Moreau, RN, MBA
VP Mid‐Revenue Cycle
Andrea Eastwood, RHIT, BAS
Director Clinical Encounter and Documentation Excellence
Trinity Health
Livonia, Michigan
Learning Objectives
• At the completion of this educational activity, the learner will be able to:
– Demonstrate how consolidated metrics and benchmarks can be used to monitor, evaluate, and improve the effectiveness of CDI programs
– Describe how to leverage technology to improve CDI productivity and effectiveness
– Explain the utilization of data to better align CDI with the changing healthcare delivery model by moving from focusing on CC/MCC capture to focusing on achieving documentation that accurately reflects the severity of illness, risk of mortality, and resources consumed
– Associate clinical best practices to ensure medical necessity for the services rendered
– Illustrate how to re‐engage and educate physicians on documentation basics
CHE and Trinity Health: Who Are We
• On May 1, 2013, Catholic Health East (CHE) and Trinity Health consolidated to form the second-largest Catholic healthcare system in the nation and one of the largest not-for-profit healthcare and home health care organizations in the country
Trinity Health: Who We Are
175 clinical documentation specialists
CDI Programs at the Time of Consolidation
• CDI programs were in place at all hospitals in both systems at the time of consolidation
– CHE had a decentralized program structure, while Trinity Health had corporate office CDI support with a system-level CDI manager in place since 2009
– Both systems calculated and reported CDI financial benefit, but used different methodologies
– CHE programs were staffed for Medicare reviews only while Trinity Health programs primarily staffed for all‐payer reviews
– Both systems had multiple CDI software vendors in place
– Both systems had a monthly CDI forum and call in place to share best practices
– Trinity Health had in place a CDI dashboard that tracked 17 elements of CDI program performance
Trinity Health Current CDI Program Profile
• 48 acute care hospitals with CDI programs in place
• Consolidated to one monthly CDI council call to share best practices
• Selected one methodology for calculating CDI financial benefit and standardized the reports provided to the hospitals
• Standardized reports to assist programs in monitoring CDI effectiveness
• Consolidated all CDI programs into the systemwide CDI dashboard with key performance indicators and program standards
• Selected one CDI software vendor and standardized system setup and implementation whenever possible
Trinity Health CDI Staff Profile
• There are approximately 175 CDS FTEs across the system with a 13% vacancy rate
• Systemwide corporate CDI director and corporate CDI manager in place to provide support to the sites
Available CDI Tools and Data
• Standardized suite of CDI program tools and data include:
– CDI financial benefit reports, CC/MCC capture, alternate principal diagnosis, nonspecific DRG
– CDI dashboard with key metrics and program standards
– CDI software reports
• Learned that we were data rich and information poor:
– CDI leads did not know how to use the reports to drive performance
– CMOs and CFOs were not aware of the reports or how their sites were performing
– Frontline CDSs were unaware of the data being collected and reported
Closing the Data/Information Gap
• Identified the need for education of CDI leads, CMOs, CFOs, and frontline CDSs on the information available to manage their program
– Provided education to the CDI leads and CDSs on the following:
• Data elements in reports and calculations
• How to use the data to identify opportunities and areas of focus
• Use of action plans to drive performance
– Raised awareness and presented reports and available data at multiple forums
• Monthly CMO meeting
• Documentation excellence forum
• CDI council
• Joint CMO/CFO/CNO call
– Added CDI dashboard to Trinity’s analytics portal so information is readily available to anyone who wants or needs it
CDI Value Measurement – Executive View
• The table below represents only a few select fields in the CDI financial benefit report
• Hospitals and executives should focus on their month‐to‐date and year‐to‐date performance
• For long‐term programs, the financial benefit will level off or decrease every time the baseline year is reset, but it should never be a negative number
Facility | Monthly Change in Reimbursement (Med / Surg / Total) | FYTD Adj Cases/Accounts (Med / Surg / Total) | FYTD Change in Adj CMI (Med / Surg / Total) | FYTD Change in Reimbursement (Med / Surg / Total)
Hospital 1 | $37,645 / $33,304 / $70,949 | 5,030 / 1,878 / 6,908 | (0.0156) / (0.0193) / (0.0819) | ($490,163) / ($275,344) / ($765,506)
Hospital 2 | $4,650 / $8,496 / $13,146 | 193 / 41 / 234 | 0.1599 / 0.0702 / 0.0613 | $84,362 / $11,581 / $95,942
Hospital 3 | ($47,668) / ($56,316) / ($103,984) | 1,246 / 1,058 / 2,304 | (0.0259) / (0.0178) / (0.0106) | ($197,373) / ($111,811) / ($309,184)
Hospital 4 | $19,594 / ($8,580) / $11,013 | 404 / 254 / 658 | 0.0520 / 0.1232 / 0.1189 | $132,897 / $167,912 / $300,809
Hospital 5 | $27,939 / $49,460 / $77,399 | 1,056 / 179 / 1,235 | (0.0057) / (0.0353) / (0.0278) | ($39,215) / ($45,408) / ($84,623)
Hospital 6 | $29,827 / $198,827 / $228,654 | 1,670 / 737 / 2,407 | 0.0116 / 0.1852 / 0.0413 | $97,853 / $755,098 / $852,951
Hospital 7 | $477,283 / $311,116 / $788,399 | 3,519 / 1,325 / 4,844 | 0.0825 / 0.3032 / 0.1389 | $2,289,476 / $3,078,573 / $5,368,049
Hospital 8 | $188,818 / $11,013 / $199,831 | 1,854 / 776 / 2,630 | 0.0240 / 0.0736 / 0.0579 | $318,903 / $382,177 / $701,080
Hospital 9 | $92,133 / ($532) / $91,601 | 746 / 80 / 826 | 0.0757 / (0.1197) / 0.0850 | $398,604 / ($49,847) / $348,756
Hospital 10 | ($25,558) / $95,494 / $69,936 | 3,343 / 833 / 4,176 | (0.0176) / 0.0161 / 0.0077 | ($386,773) / $80,957 / ($305,816)
Hospital 11 | ($22,912) / ($168,744) / ($191,657) | 1,857 / 961 / 2,818 | 0.0558 / (0.0125) / 0.0794 | $668,168 / ($67,444) / $600,724
Hospital 12 | $79,375 / $20,044 / $99,419 | 816 / 293 / 1,109 | 0.0457 / 0.2152 / 0.1333 | $299,578 / $413,513 / $713,091
Hospital 13 | ($3,261) / ($5,325) / ($8,586) | 105 / 7 / 112 | (0.1025) / (0.2722) / (0.1299) | ($29,603) / ($7,634) / ($37,238)
Hospital 14 | $270,881 / ($44,169) / $226,713 | 1,484 / 866 / 2,350 | 0.0786 / (0.0282) / 0.0446 | $768,685 / ($158,982) / $609,703
Hospital 15 | $487,726 / ($8,807) / $478,919 | 2,236 / 884 / 3,120 | 0.1173 / 0.0791 / 0.0501 | $1,596,920 / $505,647 / $2,102,567
Hospital 16 | $699 / ($73,326) / ($72,627) | 11 / 879 / 890 | (0.1161) / 0.0464 / 0.0449 | ($7,445) / $230,887 / $223,442
Using Data to Drive Performance
• CDI financial benefit reports:
– Hospitals focus on their month-to-date and year-to-date medical/surgical and overall CMI, and medical/surgical and overall reimbursement
– Long‐term program performance should level off or decrease only when the baseline year is reset
– Chief medical officers and CDI program leads are expected to discuss this report monthly, with the CMO sharing with executive leaders no less than quarterly
• Discussion of data at joint CMO/CFO call has elevated the importance of CDI across the health system
– Negative trends prompt focus and evaluation
• Sites have been able to justify additional CDI staff based on negative reimbursement trend
CDI Value Measurement Report Specifics
• Key components of the financial benefit report are:
– Adjusted medical and surgical cases (number of cases in each category)
– Adjusted mix of medical and surgical (percentage of cases in each category)
– Adjusted case‐mix index
– Reimbursement changes for defined period
• Periods reported are as follows:
– Baseline – 12 months of data prior to monitoring period (currently July 2012 thru June 2013)
– Current period – rolling 12 months
– Monthly
– Year to date
• Key assumptions:
– CMIs are adjusted for all high-weighted tracheostomy and transplant exclusions
– The adjusted current period medical/surgical mix is reset to equal the adjusted baseline medical/surgical mix
– For comparison purposes, the baseline CMI is calculated by regrouping all pre‐10/1/2007 ICD‐9 data to MS‐DRGs
– Medicare only
CDI Value Measure Report Calculation Sample
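The calculation sample on this slide was a graphic that did not survive transcription. As a rough illustration of the kind of calculation the report specifics above describe (a CMI change applied to adjusted case volume), a minimal sketch follows; the formula shape, the base rate, and all figures are assumptions for illustration only, not Trinity Health's actual methodology:

```python
# Illustrative sketch only: estimate the reimbursement change
# attributable to CMI movement. The blended base rate and all figures
# are assumed values, not Trinity Health's actual numbers or formula.

ASSUMED_BASE_RATE = 6_000.00  # assumed blended base rate per case, USD

def financial_benefit(baseline_cmi: float, current_cmi: float,
                      adjusted_cases: int) -> float:
    """Reimbursement change = CMI delta x base rate x adjusted case volume."""
    return (current_cmi - baseline_cmi) * ASSUMED_BASE_RATE * adjusted_cases

# Example: a 0.05 CMI lift across 1,000 adjusted medical cases
print(round(financial_benefit(1.35, 1.40, 1_000), 2))  # → 300000.0
```

A negative result would correspond to the negative reimbursement trends flagged in the executive view above.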
Using Data to Drive Performance
• CC/MCC capture reports:
– Compare volume of cases and % to previous months and to baseline to identify variations
– Compares hospital capture rate percentage to the MedPAR 80th percentile as a guide to identify high/low outliers for further focus
– Detail case listing is also provided with the summary report
– Focus chart reviews identified documentation opportunities for short‐stay surgical cases (template H&Ps, little to no CDI review due to LOS, missed query opportunity on part of CDS and coder)
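The benchmark comparison described above can be sketched as a simple ratio-and-flag routine; the helper names and the tolerance band are illustrative assumptions, not the actual report logic:

```python
# Hypothetical sketch of the CC/MCC capture-rate vs. MedPAR 80th
# percentile comparison described above. The 5-point tolerance band
# is an assumed threshold for flagging outliers for focused review.

def capture_rate(cases_with_cc_mcc: int, total_cases: int) -> float:
    """Capture rate as a percentage of total cases."""
    return 100 * cases_with_cc_mcc / total_cases

def flag_outlier(rate_pct: float, medpar_80th_pct: float,
                 band_pts: float = 5.0) -> str:
    """Label a hospital's rate relative to the benchmark."""
    if rate_pct < medpar_80th_pct - band_pts:
        return "low outlier"
    if rate_pct > medpar_80th_pct + band_pts:
        return "high outlier"
    return "within range"

rate = capture_rate(66, 535)  # illustrative surgical MCC case counts
print(round(rate, 1), flag_outlier(rate, 20.5))  # → 12.3 low outlier
```

A "low outlier" flag is what would route a category (such as the surgical MCC capture in the sample report below) to focused chart review.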
CC/MCC Capture Report Sample
This is an example of one of our hospital’s reports; as you can see, their MCC capture of surgical cases is low as compared to benchmark. We have been working with them to increase their reviews of surgical cases and focus on template history and physical reports.
Sample Hospital (St. Mary Mercy Hospital) – Monthly Capture Rate (Refresh Date: 12/16/2014)
Columns: Baseline FY 2014, then Jul-2014 through Nov-2014 (volume and %), then MedPAR 80th percentile

Medical Capture Rate:
MCC Pairs Subgroup (MCC vs Without MCC): 879 (31.8%) | 81 (33.8%) | 71 (27.6%) | 85 (31.7%) | 80 (34.8%) | 53 (24.2%) | 80th: 35.5%
CC/MCC Pairs Subgroup (CC/MCC vs Without CC/MCC): 72 (74.2%) | 3 (75.0%) | 2 (66.7%) | 4 (66.7%) | 5 (71.4%) | 4 (80.0%) | 80th: 81.6%
Triplet Subgroup (MCC+CC vs Without CC/MCC): 3,105 (82.4%) | 276 (88.5%) | 257 (82.9%) | 234 (82.1%) | 248 (82.4%) | 247 (83.7%) | 80th: 81.8%
Triplet Subgroup (MCC vs CC): 1,341 (43.2%) | 104 (37.7%) | 106 (41.2%) | 98 (41.9%) | 90 (36.3%) | 113 (45.7%) | 80th: 45.8%

Surgical Capture Rate:
MCC Pairs Subgroup (MCC vs Without MCC): 66 (12.3%) | 1 (2.9%) | 4 (12.5%) | 7 (13.5%) | 6 (13.6%) | 2 (5.0%) | 80th: 20.5%
CC/MCC Pairs Subgroup (CC/MCC vs Without CC/MCC): 65 (52.4%) | 8 (66.7%) | 6 (50.0%) | 8 (61.5%) | 5 (45.5%) | 7 (70.0%) | 80th: 44.4%
Triplet Subgroup (MCC+CC vs Without CC/MCC): 1,058 (79.5%) | 79 (69.9%) | 91 (79.8%) | 74 (77.9%) | 102 (82.9%) | 80 (80.0%) | 80th: 75.1%
Triplet Subgroup (MCC vs CC): 454 (42.9%) | 41 (51.9%) | 44 (48.4%) | 35 (47.3%) | 51 (50.0%) | 37 (46.3%) | 80th: 45.5%
Using Data to Drive Performance
• Alternate principal diagnosis:
– Identify volume of cases in DRG, pair, triplet distribution for problematic DRGs
– Focus reviews identify need for CDS/coding team education or physician education on specific DRGs
Site Name Here – Alternate Principal Diagnoses of Focus
Columns: FY 2014, Current FYTD, then Jul-2014 through Jan-2015 monthly

Complex (177): 115 | 52 | 6 | 9 | 2 | 5 | 12 | 7 | 11
Simple Pneumonia (193) w/ MCC: 242 | 124 | 10 | 10 | 11 | 13 | 14 | 26 | 40
Total: 357 | 176 | 16 | 19 | 13 | 18 | 26 | 33 | 51
Complex (177) vs Simple Pneumonia (193) w/ MCC: 32.2% | 29.5% | 37.5% | 47.4% | 15.4% | 27.8% | 46.2% | 21.2% | 21.6%
80th %: 43.2% (all periods)

Complex (178): 98 | 43 | 4 | 9 | 7 | 6 | 4 | 9 | 4
Simple Pneumonia (194) w/ CC: 388 | 191 | 22 | 17 | 21 | 13 | 24 | 41 | 53
Total: 486 | 234 | 26 | 26 | 28 | 19 | 28 | 50 | 57
Complex (178) vs Simple Pneumonia (194) w/ CC: 20.2% | 18.4% | 15.4% | 34.6% | 25.0% | 31.6% | 14.3% | 18.0% | 7.0%
80th %: 31.0% (all periods)
Using Data to Drive Performance
• Non-specific DRG report:
– Assists in identifying high-volume cases that fall into a nonspecific DRG for further documentation improvement review
• For one site, secondary reviews of chest pain cases early during admission improved documentation and reduced number of cases falling into unspecified DRG of chest pain
• Cases were referred to physician champion (i.e., cardiologist) for additional review that may result in physician education and/or query opportunity or simply case validation of the correct diagnosis
• CDSs and coders discuss findings and have education as appropriate
Non‐Specific DRG – Chest Pain 313
[Chart: Final DRG 313 Medicare vs. non-Medicare percentage of discharged patients, plotting MC % of total, non-MC % of total, and a linear trend for MC % (y-axis 0.00%-3.50%); the CDI chest pain program began 6/26/2012]
Using Data to Drive Performance
• CDI dashboard focus on key metrics:
– Assists sites in determining individual performance levels and in benchmarking against other programs
– Enables action planning, process improvement, and/or education
– Provides insight into how daily operations impact program performance (e.g., staffing and % of reviews)
– Triggers site visits from the system office and recommendations for action plans to close program gaps (staffing, program structure, medical staff support)
– Identifies key areas for change at the enterprise level
• Added/revised metrics:
– Medicare total patient population reviewed now reported separately
– Revised threshold for retrospective coding query rate
– 3 new program standards added as a result of systemwide documentation excellence forum recommendations
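The dashboard's green/yellow/red banding can be sketched as follows; this is an illustrative reading of the published review-rate thresholds (green > 70%, yellow 60-70%, red < 60%), with assumed function names, not vendor code:

```python
# Illustrative sketch of the green/yellow/red banding used on the CDI
# dashboard for the CDS review-rate metric (Benchmark > 70%; Yellow
# 60-70%; Red < 60%). Function and variable names are assumptions.

def review_rate_status(reviews: int, discharges: int) -> str:
    """Band the percent of inpatient discharges reviewed by a CDS."""
    rate = 100 * reviews / discharges
    if rate > 70:
        return "green"
    if rate >= 60:
        return "yellow"
    return "red"

# Jul-13 sample month from the dashboard example: 522 of 642 discharges
print(review_rate_status(522, 642))  # → green
```

The same banding pattern applies to the query-rate and response-rate metrics, each with its own thresholds.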
CDI Dashboard Example
Site 5 CDI Dashboard
Columns: FY09-FY13 totals; Jul-13 through Mar-14 monthly (Apr-14 through Jun-14 not yet reported); FY14 total

Quality Metrics: CONCURRENT CDS REVIEWS
Total Inpatient Discharges: 6,958 | 8,994 | 8,841 | 8,356 | 7,653 || 642, 665, 608, 645, 647, 590, 615, 589, 645 || FY14: 5,646
Total # of Reviews Performed: 3,935 | 7,690 | 6,053 | 5,414 | 6,087 || 522, 472, 497, 516, 533, 531, 574, 568, 613 || FY14: 4,826
% of Total Inpatient Population Reviewed by CDS (Benchmark > 70%; Green > 70%; Yellow 60-70%; Red < 60%): 57% | 86% | 68% | 65% | 80% || 81%, 71%, 82%, 80%, 82%, 90%, 93%, 96%, 95% || FY14: 85%
Total Medicare Discharges (NEW; Jan-Mar 14): 409, 373, 447
Total # of Medicare Reviews Performed (NEW; Jan-Mar 14): 369, 366, 426
% of Medicare Inpatient Reviewed by CDS (NEW; Benchmark > 70%; Green > 70%; Yellow 60-70%; Red < 60%; Jan-Mar 14): 90%, 98%, 95%

PHYSICIAN CDS ACTIVITY
Total # of CDS Queries Initiated: 1,021 | 1,257 | 1,281 | 1,197 | 1,774 || 182, 137, 142, 222, 183, 143, 164, 196, 213 || FY14: 1,582
Physician CDS Query Rate (Benchmark > 15%; Green > 15%; Yellow 10-15%; Red < 10%): 26% | 16% | 21% | 22% | 29% || 35%, 29%, 29%, 43%, 34%, 27%, 29%, 35%, 35% || FY14: 33%
Physician Response to CDS (# of Cases): 832 | 1,153 | 1,210 | 1,192 | 1,772 || 182, 137, 142, 199, 181, 143, 158, 186, 212 || FY14: 1,540
Physician Response to CDS (Benchmark > 90%; Green > 90%; Yellow 80-90%; Red < 80%): 81% | 92% | 94% | 100% | 100% || 100%, 100%, 100%, 90%, 99%, 100%, 96%, 95%, 100% || FY14: 97%
Physician Agreement with CDS (# of Cases): 774 | 1,099 | 1,189 | 1,172 | 1,734 || 174, 133, 132, 195, 179, 143, 158, 186, 211 || FY14: 1,511
Physician Agreement with CDS (Benchmark > 85%; Green > 85%; Yellow 75-85%; Red < 75%): 93% | 95% | 98% | 98% | 98% || 96%, 97%, 93%, 98%, 99%, 100%, 100%, 100%, 100% || FY14: 98%

PHYSICIAN-CODER ACTIVITY
Total # of Coder Queries Initiated: N/A | 185 | 311 | 280 | 220 || 14, 21, 24, 17, 16, 16, 17, 12, 17 || FY14: 154
Physician-Coder Query Rate (Benchmark < 5%; Green < 5%; Yellow 5-10%; Red > 10%): N/A | 2% | 5% | 5% | 3.6% || 2.7%, 4.4%, 4.8%, 3.3%, 3.0%, 3.0%, 3.0%, 2.1%, 2.8% || FY14: 3.2%
Physician Response to Coder (# of Cases): N/A | 181 | 293 | 280 | 219 || 14, 21, 24, 17, 16, 15, 15, 12, 17 || FY14: 151
Physician Response to Coder (Benchmark > 90%; Green > 90%; Yellow 80-90%; Red < 80%): N/A | 98% | 94% | 100% | 100% || 100%, 100%, 100%, 100%, 100%, 94%, 88%, 100%, 100% || FY14: 98%
Physician Agreement with Coder (# of Cases): N/A | 166 | 271 | 266 | 208 || 14, 19, 22, 16, 15, 15, 14, 11, 17 || FY14: 143
Physician Agreement with Coder (Benchmark > 85%; Green > 85%; Yellow 75-85%; Red < 75%): N/A | 92% | 92% | 95% | 95% || 100%, 90%, 92%, 94%, 94%, 100%, 93%, 92%, 100% || FY14: 95%

CRITICAL SUCCESS FACTORS
CDI Program Manager (name & title provided): Yes
CDI Steering Team/Administrative Support: Yes
CDI Task Force Team: Yes
HIM Support and Participation: Yes
Physician Champion (name provided): Yes
CDI Program Process Map (all three required for "Yes": CDI program process flow, clarification process flow, noncompliant physician process flow): No
CDI Program Communication & Education Plan: Yes
Dedicated CDS staff: Yes

SEVERITY-ADJUSTED MORTALITY RATIO (each month reflects a rolling six-month period, e.g., Jul-13 covers Oct 2012-Mar 2013; Benchmark < 0.70; Green < 0.70; Yellow 0.71-1.00; Red > 1.00)
FY09: N/A | FY10: 0.58 | FY11: 0.58 | FY12: 0.49 | FY13: 0.57 || Jul-13: 0.69, Aug-13: 0.67, Sep-13: 0.64, Oct-13: 0.69, Nov-13: 0.68, Dec-13: 0.71, Jan-14: 0.74, Feb-14: 0.74 || FY14 avg: 0.70
CDI Program Standards
CDI program manager/lead in place (yes or no)
CDI administrative oversight team/administrative support (yes or no)
CDI and coder team in place (yes or no)
HIM support and participation (yes or no)
Physician champion/sponsor in place (yes or no)
CDI program process flow (yes or no) – all required for "yes":
• CDI program process flow documented
• CDI clarification/query process flow documented
• Noncompliant physician process flow documented
CDI program communication and education plan (yes or no)
Dedicated CDS staff (yes or no)
CMO discusses/meets with CDI leadership monthly to review dashboard and financial benefits
CMO presents CDI performance and issues to senior leadership quarterly
CMO meets with physician champion/ICD‐10 physician champion monthly to review documentation and payer‐specific issues
Benchmark: Green = Yes; Red = No
CDI Dashboard – Executive View
The dashboard metrics that warrant the closest attention from executives are the percentage of the population reviewed and the physician query response rate.
Leveraging Technology to Improve Coding and CDI
• Decision made to leverage computer‐assisted coding and clinical documentation improvement software to offset productivity losses associated with ICD‐10
• Pursued an aggressive timeline to get all sites live prior to the conversion to ICD‐10 on October 1, 2015
• Return on investment for each site will be tracked and monitored for 12–24 months post implementation
• Rollout timeline: 38 sites live over 18 months (Oct 2013 - Jun 2015)
Return on Investment Benefit Tracking
• Began tracking return on investment 4 months after implementation to allow for a stabilization period after introduction of new software
• Multiple metrics are reviewed as a part of the benefit assessment; however, the increase in Medicare CMI is used as the basis for actual benefit calculation
Benefit metrics:
• Case-mix index
• Inpatient records coded
• DNFB as % of gross revenue
• Days in DNFB
• Secondary diagnosis capture rate
• Procedure code capture rate
• CC capture rate
• MCC capture rate

Expense items:
• Implementation fees
• Training fees
• Hardware
• Consulting fees
• Licensing fees
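Since the slide above states that the increase in Medicare CMI is the basis of the actual benefit calculation, the resulting return-on-investment comparison might be sketched as follows; every figure and name here is an illustrative assumption, not Trinity Health data:

```python
# Hypothetical ROI sketch: benefit from a Medicare CMI lift weighed
# against implementation, training, hardware, consulting, and licensing
# expenses. All figures are illustrative, not Trinity Health data.

def roi(cmi_lift: float, medicare_cases: int, base_rate: float,
        total_expense: float) -> float:
    """Benefit (CMI lift x cases x base rate) divided by total expense."""
    benefit = cmi_lift * medicare_cases * base_rate
    return benefit / total_expense

# e.g. a 0.02 CMI lift on 10,000 Medicare cases at an assumed $6,000
# base rate, against $900,000 in total software-related expenses
print(round(roi(0.02, 10_000, 6_000.0, 900_000.0), 2))  # → 1.33
```

An ROI below 1.0 would correspond to the "tracking below target" situation described on the next slide.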
Return on Investment Tracking Below Target
• Preliminary return on investment data show that we are tracking below the projections provided by the vendor for the first 15 hospitals that went live: currently $1.2M in benefit, as compared to the vendor's year-1 projection of $9.9M
• We are seeing an improvement in the number of secondary diagnosis codes assigned for inpatient and outpatient:
– Inpatient
• Overall 15.7% increase in secondary codes assigned
• Individual sites range from a 12.1% to a 20.3% increase in secondary codes assigned
– Outpatient
• Overall increase of 5.4% in secondary codes assigned
• Individual sites range from -20.3% to 21.0%
• One large site in the group has the majority of outpatient codes assigned outside of HIM coding, so it does not see the benefit of CAC/auto-suggested coding
Technology Improved CDI Productivity
• Use of the tool has led to an increase in the number of all‐payer and Medicare reviews without adding additional clinical documentation specialist staff
Site 1 – All-Payer and Medicare Reviews (Mar-14 through Dec-14)
Total Inpatient Discharges: 2,001, 1,991, 2,108, 2,090, 2,014, 2,049, 2,018, 2,208, 1,862, 2,244
Total # of Reviews Performed: 1,032, 986, 1,126, 1,222, 802, 1,288, 1,370, 1,375, 1,377, 1,744
% of Total Inpatient Population Reviewed by CDS (Benchmark > 70%; Green > 70%; Yellow 60-70%; Red < 60%): 52%, 50%, 53%, 58%, 40%, 63%, 68%, 62%, 74%, 78%
Total Medicare Discharges: 1,211, 1,245, 1,318, 1,292, 958, 997, 987, 1,042, 882, 1,067
Total # of Medicare Reviews Performed: 735, 732, 817, 878, 379, 658, 692, 661, 656, 842
% of Medicare Inpatient Reviewed by CDS (Benchmark > 70%; Green > 70%; Yellow 60-70%; Red < 60%): 61%, 59%, 62%, 68%, 40%, 66%, 70%, 63%, 74%, 79%

Site 2 – Medicare Reviews (Mar-14 through Dec-14)
Total Medicare Discharges: 618, 684, 663, 606, 634, 634, 576, 563, 630, 540
Total # of Medicare Reviews Performed: 178, 284, 281, 314, 428, 428, 450, 449, 447, 393
% of Medicare Inpatient Reviewed by CDS (Benchmark > 70%; Green > 70%; Yellow 60-70%; Red < 60%): 29%, 42%, 42%, 52%, 68%, 68%, 78%, 80%, 71%, 73%
Poor ROI Performance Is Driver to Improve
• Conducted site visits with the vendor to identify opportunities for improvement. A key activity during the site visits was to job shadow the CDSs and coders. Key observations included:
– Need to reinforce that CDSs and coders must delete or reject codes that are not relevant to the case in order to fine-tune the engine to appropriately code their records.
– Adoption of viewing documents in CAC by CDSs and coders is slow and difficult to determine without over-the-shoulder monitoring.
– CDSs adapt to code assignment from annotations and auto-suggested codes more quickly than coders. Coders tend to enter codes directly or use an encoder because they believe the engine slows them down.
• The vendor provided reports showing code agreement rates, method of code acceptance, and productivity rates, which will help us monitor adoption.
Optimization Plan
• Determined a need for post-implementation optimization of the technology/software and processes, focusing on:
– Scheduling follow-up touch-base calls with live sites to provide continued support and education
– Using CAC rather than the EMR to review documents
– Tips-and-tricks training for CDSs and coders
– Manager report training
– Revisiting document scoping (what key documents are missing in the engine, as well as formatting issues)
– Working toward reducing/eliminating paper documentation (progress notes, history and physicals, consultations)
– Establishing an ongoing mechanism to report and resolve issues
Management and Oversight of Process Key to Success
• Use of the tool requires a change in how both CDSs and coders perform and think about their work. CDS and coding leadership must be diligent in monitoring the following areas to assess adoption of the technology:
– Adoption of auto-suggested codes
• Monitor on an individual CDS and coder level
• Some individuals will need additional coaching to trust auto‐suggested codes
– Monitoring how reviews are taking place by direct observation of CDS and coding record reviews
• Reviews taking place in EMR vs. CAC software
– Monitoring code entry methodology
• Use of CAC engine vs. encoder vs. direct code entry
Auto‐Suggested Code Acceptance
• Monitor progress of adoption of auto‐suggested codes
– Precision is the percent of accepted CAC codes out of the total generated
– Recall is the percent of accepted CAC codes out of the total final codes
[Chart: auto-suggested code acceptance, Jul 14 - Jan 15 vs. Jan 15, with reported precision and recall values of 54.48%, 60.66%, 61.58%, and 72.57%; target peer group: precision 60%, recall 56%]
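The precision and recall definitions above reduce to two simple ratios, sketched below; the example counts are made-up illustrations, not Trinity Health figures:

```python
# Sketch of the precision/recall definitions given above for CAC
# auto-suggested codes. The example counts are illustrative assumptions.

def cac_precision(accepted: int, generated: int) -> float:
    """Percent of CAC-generated codes that were accepted."""
    return 100 * accepted / generated

def cac_recall(accepted: int, final_codes: int) -> float:
    """Percent of the final code set that came from accepted CAC codes."""
    return 100 * accepted / final_codes

# Example: 580 accepted codes out of 950 generated, with 800 codes on
# the final record
print(round(cac_precision(580, 950), 2))  # → 61.05
print(round(cac_recall(580, 800), 2))     # → 72.5
```

Rejecting irrelevant suggestions (as the site-visit findings urge) raises precision over time by tuning the engine, while recall reflects how much of the final coding the engine is actually contributing.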
Code Entry Method Monitoring
Coder code entry methods (Jan 15 / Jul 14 - Jan 15, as charted):
• Annotation: 1.94% / 0.93%
• Auto-suggested list: 38.89% / 42.77%
• Encoder: 34.38% / 19.05%
• Direct code entry: 11.10% / 12.43%
• Document with evidence: 29.76% / 44.80%
• Recode pathway: 0.80% / 1.16%

CDS code entry methods (Jan 15 / Jul 14 - Jan 15, as charted):
• Annotation: 2.29% / 1.89%
• Auto-suggested list: 46.30% / 47.81%
• Encoder: 19.05% / 8.30%
• Direct code entry: 1.60% / 3.21%
• Document with evidence: 33.57% / 42.35%
• Recode pathway: 0.58% / 0.80%
Where Do We Go From Here?
• We have gone after the low-hanging fruit by standardizing the dashboard and implementing technology
• The challenge for us now is to identify additional opportunities for improving our CDI programs across the system as well as meet the demands for accurate documentation across the continuum of care
• We have been working with a multidisciplinary documentation improvement steering team, which includes physicians, CDI, coding, and compliance, to develop strategies for improving documentation
• Improving documentation across Trinity Health is a priority for senior leadership
The Changing Healthcare Environment
• There are many forces converging on the traditional fee-for-service models in healthcare that will ultimately change how we are reimbursed and how we exchange data about the patient
– Population health management
– Meaningful use
– Value‐based purchasing
– Quality and core measures
– Hospital and physician profiles
• Traditional fee‐for‐service payment models will gradually be phased out, and healthcare organizations will be paid based on the quality of care provided and how well they manage select patient populations and the associated costs of providing the care
• Trinity Health is rapidly moving toward a people‐centered health system that is focused on delivering better health, better care, and lower costs
Documentation Is More Than CC/MCC Capture
• With many different factors converging on healthcare and our organization’s migration to a people‐centered health system, we have started to take a new look at how to better address the organization’s documentation deficiencies and to look beyond CC/MCC capture, the traditional focus of CDI programs
• Our director of population health has invited us to have a seat at the table as they strategize about how we will ensure our documentation in the ambulatory setting supports this new payment model
• The CDS role has never been more critical, but it must evolve and be agile to support the changing business model
Poor Documentation Is the Root of All Evil
• Contributes to medical necessity and RAC denials, which we define as:
– Downgrades from inpatient to outpatient
– No advance beneficiary notice obtained
– Denial of inpatient admissions
– Denial of inpatient days
• Risks loss of incentive payment for Hospital Value‐Based Purchasing Program and results in penalties for hospital‐acquired conditions
• Impacts severity of illness and risk of mortality scores as we migrate to a population health model of payment
Clinical Documentation Challenges
• Decreased length of stay reduces the time frame the CDS has to conduct reviews (some cases may have only 1-2 reviews before discharge)
• Physician electronic templates do not always contain enough information to tell the patient's clinical story:
– Copy/paste
– Exception documentation only
– Canned statements without detail
– Excerpts from diagnostics without correlation to the treatment and diagnoses rendered
• A significant number of handwritten physician notes still exist
• Diagnosis and problem lists are cumbersome for physicians to use and often result in nonspecific diagnosis selection
Trinity Health CDI Vision and Goals
Vision
• Establish and create standards of documentation that support a complete and accurate record of patient care rendered, severity of illness, risk of mortality, and resources consumed, so as to enable management of individual patients and populations through the continuum of care

Goals
• Transition the CDI program to meet the changing needs of the healthcare landscape
• Develop effective partnerships with revenue excellence and clinical teams to address documentation needs that support People Centered 2020
• Ensure clinical data capture keeps pace with industry data usage needs (population health management, value-based purchasing)
• Continue engagement with key stakeholders to ensure the vision of People Centered 2020 is realized (CFO, CMO, CEO, etc.)

Plan
• Migrate the clinical documentation specialist role from the traditional CC/MCC capture model to a comprehensive role that meets expanding documentation needs
• Develop evidence-based clinical guidelines in coordination with the Trinity Health Clinical Division for critical diagnoses, to minimize denials as well as support documentation of clinical evidence in the medical record
• Partner with clinical stakeholders to develop clinical documentation improvement plans for the emergency department, ambulatory services, and physician practices
• Leverage technology to enable clinicians to document more accurately and effectively, and optimize the natural language processing function
39
How Will We Achieve Our Vision and Goals?
• To achieve our objectives, we had to understand where our documentation pain points and vulnerabilities exist today by performing the following assessments:
• Reviewed data results and trends from third-party payer audits and found increasing emphasis on having the appropriate clinical indicators/evidence in the chart to support the diagnoses reported on the claim. Specific opportunities for improvement across Trinity Health include:
– 313 Chest pain
– 312 Syncope and collapse
– 392 Esophagitis, gastroenteritis
– 69 Transient ischemia
• The overarching theme of our vision is ultimately improving documentation, but how do we do that without overwhelming our physicians?
Physician Education Marketing Campaign
• Needed a fresh approach for educating physicians about documentation, so we decided to use marketing concepts to deliver the message
• Developed a marketing plan that focused on the following themes:
– WHY should I document in more detail?
• What's in it for me (the physician)? Why is documentation so important to me?
– WHAT should I document?
• Document cause and effect and link results/treatments to condition
• Document the medical necessity/reason for the visit, test, hospitalization
• Document possible, probable, or suspected illnesses for inpatients
– WHERE (and how) should I document?
• Provide guidance on where critical documentation/information is most frequently missed and how to ensure it is documented
Physician Education Marketing Campaign
• Created a tagline and logo for the educational campaign
– RIDE: Rapidly Improving Documentation Excellence
• Utilizing various content style options to deliver themes:
– Humor, music, acronyms, and slogans to help remember key points
– Printed communications such as memos, newsletters, posters, signage
– Video(s): Use video to communicate a variety of messages, including the actual training content and messages from CEO/CMO/physician testimonials; est. 1–3 minutes per video
• Creating educational toolkits for each educational concept and providing materials to the sites (buttons, pens, pocket guides, flyers, hard copy, and electronic files)
• Leveraging the new physician continuing education system to provide information on documentation requirements for disease processes
42
Hello CHE Trinity physicians … we are calling out to you. We need your help to improve documentation, starting with the phrase "Due To":
– Pathological fracture of T1 "Due To" age‐related osteoporosis
– Anemia "Due To" acute blood loss
– Conjunctivitis "Due To" adenovirus
– Infertility "Due To" tubal occlusion
– Sepsis "Due To" group A streptococcus
– Retinopathy "Due To" Type 1 diabetes
– E. coli sepsis "Due To" urinary tract infection
– Urethritis "Due To" chlamydia
– Blood loss anemia "Due To" duodenal ulcer
– Acute bronchitis "Due To" rhinovirus
– Peripheral neuropathy "Due To" Type 2 diabetes
– Hypertension "Due To" renal disease
– Cerebral infarction "Due To" occlusion of vertebral artery
– Pneumonia "Due To" Klebsiella pneumoniae
©2015 HCPro, a division of BLR. All rights reserved. These materials may not be duplicated without express written permission.
43
Documentation Excellence Forum
• In late 2013, our CMO created a documentation excellence forum that focused on ICD‐10‐related documentation issues. The format of the call includes educational sessions as well as information sharing amongst the hospitals.
• Recently the decision was made to focus the forums on disease conditions, so we will be presenting the following concepts for the three conditions resulting in the most denials and queries:
– Clinical documentation needs for the disease process
– Documentation changes for ICD‐10
– Soliciting hospitals to share their approach or experience
• The focus on disease conditions will include multidisciplinary presentations from quality, CDI, coding, and physician leadership.
44
Challenges With Physician Education Plan
• Some physicians felt our "Due To" message was irreverent and did not take the matter seriously, and they refused to let the CDSs use the material to educate physicians
• A few CMOs believe that only physicians can educate physicians
• Due to system office IT constraints, it is difficult to get messages posted on our intranet, so we are exploring the CME software as a place to post content
• Struggling with how to establish metrics for measuring the success of the program
45
Preparing CDSs for Role Change
• Introduced the following concepts to CDS staff:
– Overview of the changing landscape of healthcare
– Importance of complete and accurate documentation beyond CC/MCC capture and tying documentation to the changing landscape of healthcare
– What clinical evidence is, its importance, and the need to query for clinical evidence
– Querying for clinical evidence is not questioning the physician’s judgment or medical decision‐making
– Inconsistencies in documentation during the patient stay are problematic
• Adapting current workflows to be more in line with the changing business environment (e.g., creating handoffs from ED case manager to CDS)
©2015 HCPro, a division of BLR. All rights reserved. These materials may not be duplicated without express written permission.
46
Clinical Standards and Guidelines
• Working toward developing clinical guidelines and standards for diagnosing certain disease processes:
– Malnutrition
– Sepsis
– Acute respiratory failure
– Acute kidney injury vs. acute tubular necrosis vs. acute kidney failure
– Encephalopathy
• There are two critical reasons why this is important:
– Provides a basis for addressing payer take‐backs for lack of medical necessity
– Enables us to educate CDSs on how to assess for the presence/absence of clinical evidence for key diagnoses
• Several challenges associated with the use of standardized clinical indicators include:
– Physician adoption
– Acceptance of standards by payers
– Monitoring compliance with the standard
47
Next Steps
• Continue to develop short educational, multimedia messages for physician education marketing campaign
• Develop documentation education content for continuing medical education software program
• Collaborate with clinical colleagues to create clinical indicator/evidence guidelines for problematic diagnoses
• Establish metrics for monitoring adherence/adoption rates of clinical guidelines
• Collaborate with colleagues in managed care to have third‐party payers recognize and reimburse for key diagnoses when guidelines are documented in medical record
48
Thank you. Questions?
In order to receive your continuing education certificate(s) for this program, you must complete the online evaluation. The link can be found in the continuing education section at the front of the program guide.
©2015 HCPro, a division of BLR. All rights reserved. These materials may not be duplicated without express written permission.
49
Appendix:
CDI dashboard metrics and measures detail
50
CDI Dashboard Metrics
% of total inpatient population reviewed by CDS
# of discharges defined: Total number of discharges excluding the following populations: psych, inpatient rehab, nursery, NICU, OB, peds, SNF/long‐term care, critical access, and select specialty. This metric should include all payers, not just the payer types your MO reviews.
# of reviews defined: The number of patients reviewed by CDS; for example, Mr. Smith is counted as one review, not the number of times the CDS looked at Mr. Smith's record.
Benchmark: > 70%; Green > 70%; Yellow 60%–70%; Red < 60%
% of total Medicare (traditional) inpatient population reviewed by CDS NEW
# of discharges defined: Total number of traditional Medicare discharges excluding the following populations: psych, inpatient rehab, nursery, NICU, OB, peds, SNF/long‐term care, critical access, and select specialty.
# of reviews defined: The number of traditional Medicare patients reviewed by CDS; for example, Mr. Smith is counted as one review, not the number of times the CDS looked at Mr. Smith's record.
Benchmark: > 70%; Green > 70%; Yellow 60%–70%; Red < 60%
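The coverage metric above is a simple ratio mapped onto threshold bands. A minimal sketch in Python (function names and sample counts are hypothetical; the thresholds come from the benchmark line above):

```python
def review_coverage(reviews: int, discharges: int) -> float:
    """Percent of eligible inpatient discharges reviewed by a CDS.

    `discharges` should already exclude the populations named above
    (psych, inpatient rehab, nursery, NICU, OB, peds, long-term care,
    critical access, select specialty); `reviews` counts each patient
    once, not the number of times the CDS looked at the record.
    """
    if discharges == 0:
        return 0.0
    return 100.0 * reviews / discharges


def coverage_status(pct: float) -> str:
    """Map a coverage percentage onto the dashboard's color bands:
    Green > 70%, Yellow 60%-70%, Red < 60%."""
    if pct > 70:
        return "Green"
    if pct >= 60:
        return "Yellow"
    return "Red"


pct = review_coverage(reviews=650, discharges=900)
print(f"{pct:.1f}% -> {coverage_status(pct)}")  # 72.2% -> Green
```

The Medicare-only version of the metric uses the same bands; only the populations counted in the numerator and denominator change.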
51
CDI Dashboard Metrics
Physician‐CDS query rate: Enter the total # of CDS queries initiated
Total # of CDS queries initiated ÷ Total # of inpatient reviews performed
Benchmark: > 15%; Green > 15%; Yellow 10%–15%; Red < 10%
Physician response to CDS: Enter the total # of CDS queries responded to by physicians
Physician response to CDS (# of cases) ÷ Total # of CDS queries initiated
Benchmark: > 90%; Green > 90%; Yellow 80%–90%; Red < 80%
Physician agreement with CDS: Enter the total # of cases where physicians agreed to the query
Physician agreement with CDS (# of cases) ÷ Physician response to CDS (# of cases)
Benchmark: > 85%; Green > 85%; Yellow 75%–85%; Red < 75%
©2015 HCPro, a division of BLR. All rights reserved. These materials may not be duplicated without express written permission.
52
CDI Dashboard Metrics
Physician‐coder query rate: Enter the total # of coder queries initiated
Total # of coder queries initiated ÷ Total # of inpatient reviews performed
Benchmark: < 15%; Green < 15%; Yellow 15%–20%; Red > 20% CHANGED
Physician response to coder: Enter the total # of coder queries responded to by physicians
Physician response to coder (# of cases) ÷ Total # of coder queries initiated
Benchmark: > 90%; Green > 90%; Yellow 80%–90%; Red < 80%
Physician agreement with coder:
Physician agreement with coder (# of cases) ÷ Physician response to coder (# of cases)
Benchmark: > 85%; Green > 85%; Yellow 75%–85%; Red < 75%
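The query metrics above form a chain of ratios: queries per review, responses per query, and agreements per response. A minimal sketch, assuming hypothetical monthly counts and using the thresholds from the benchmark lines (function names are mine, not part of the dashboard):

```python
def rate(numerator: int, denominator: int) -> float:
    """Percentage ratio, guarding against an empty denominator."""
    return 0.0 if denominator == 0 else 100.0 * numerator / denominator


def band(pct: float, green: float, red: float) -> str:
    """Higher-is-better bands: Green above `green`, Red below `red`,
    Yellow in between. (The coder query rate inverts this -- lower is
    better -- so its bands would need to be flipped.)"""
    if pct > green:
        return "Green"
    if pct < red:
        return "Red"
    return "Yellow"


# Hypothetical monthly counts for one hospital:
reviews, queries, responses, agreements = 400, 80, 76, 68

query_rate = rate(queries, reviews)           # 20.0
response_rate = rate(responses, queries)      # 95.0
agreement_rate = rate(agreements, responses)  # ~89.5

print(band(query_rate, 15, 10))      # Green (benchmark > 15%)
print(band(response_rate, 90, 80))   # Green (benchmark > 90%)
print(band(agreement_rate, 85, 75))  # Green (benchmark > 85%)
```

Because each ratio's denominator is the previous ratio's numerator, a drop anywhere in the chain (few queries, unanswered queries, disagreements) surfaces in exactly one metric, which is what makes the dashboard actionable.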
53
Appendix:
Evolving role of CDS educational slides
54
Tools to Query for Clinical Evidence
• Physicians can document clinical evidence by doing the following:
– Documenting the significance of diagnostic findings instead of repeating lab and diagnostic values in the documentation
– Documenting what they suspect is wrong with the patient
– Documenting the concerns about what could go wrong and what are the predictable risks
– Documenting why they ordered a test and why they are keeping the patient another day
– Documenting what has been ruled out upon discharge
• Options for when face‐to‐face discussions are not effective:
– Engage medical staff leadership to share the importance of clinical evidence documentation
– Track facility‐specific areas of concern and develop educational sessions on the topics (sepsis, AKI, postop respiratory failure)
– Obtain a second opinion from a clinician (CDI physician champion, CMO) before querying
– Follow the facility’s escalation process for noncompliant physicians
©2015 HCPro, a division of BLR. All rights reserved. These materials may not be duplicated without express written permission.
55
Suggestions for Approaching the Physician
• The best approach is to discuss the issue with the physician in person and to cite standard clinical guidelines as the basis for your questions. This reduces the misinterpretation and confusion that can result from written queries.
• Explain to the physician that you noticed he or she documented a diagnosis and that you are not seeing the clinical indicators in the record; ask for his or her help in understanding the diagnosis documented.
• Explain that clinical evidence in the record is essential to support the patient care rendered and the resources consumed.
56
Evolving Role of the CDS Beyond CC/MCC/DRG
• CDS reviews need to expand beyond CC/MCC capture and DRG assignment to confirm that:
– Documentation supports the medical necessity for the hospital stay, reason for diagnostic tests, and medical decision‐making process.
– Test results are interpreted; for example, patient is hyponatremic with a serum sodium of 120. A plan to address the condition should also be documented in the medical record.
– Cause‐and‐effect relationships, such as peripheral vascular disease due to diabetes, are documented in the medical record.
– Etiology of conditions are documented; for example, chest pain most likely due to reflux esophagitis.
– Chief complaint/reason for visit is clearly specified in medical record.
– Documentation is specific enough to support accurate severity of illness and risk of mortality (SOI/ROM).
• CAC/CDIS software will provide CDSs with concurrent SOI/ROM level information and a tool to address unresolved clinical edits