Successfully Implementing and Reporting the PQRS Group Practice Reporting Option: Lessons From the 2010 UHC-AAMC Academic GPRO Benchmarking Project November 2012
Table of Contents

Introduction
Acknowledgements
Executive Summary
Background
    PQRS and the Value Modifier
    Mechanics of Group Reporting
2010 UHC-AAMC Academic GPRO Benchmarking Project
    Methods
    GPRO Academic Cohort
    Reasons for Choosing GPRO
    Operationalizing GPRO
    Performance on Quality Measures
    Better Performance
    Poor Performance
    Performance on Cost of Care Measures
2011 GPRO Reporting Cycle
How Can AMCs Prepare for the Future of GPRO and Quality Reporting?
Appendix
    UHC-AAMC PQRS GPRO Measure Rates Interview Guide
        Overview of Your GPRO Participation
        Electronic Health Record
        Operational Processes
        Measure Performance
        Overall Experience With GPRO
Contributors

Introduction

This issue brief was developed by UHC and the Association of American Medical Colleges (AAMC) and is part of a series put together by the UHC-AAMC Faculty Practice Solutions Center® (FPSC) that focuses on the Physician Quality Reporting System, or PQRS, Medicare’s pay-for-reporting program. This document is intended for faculty practices that wish to better understand the PQRS group practice reporting option (GPRO) and how to successfully report in that format. The issue brief summarizes the experiences of 8 faculty practices that participated in the first year of the PQRS GPRO.

Acknowledgements

UHC and AAMC would like to thank the following organizations for participating in the 2010 Academic GPRO Benchmarking Project:

The Cleveland Clinic Foundation

Fletcher Allen Health Care

Loyola University Health System

Oregon Health & Science University Medical Group

Southern Illinois University Healthcare

University of Texas Health Science Center, Houston

University of Texas Medical Branch at Galveston Faculty Group Practice

UNM Health System

Executive Summary

In 2010, the Centers for Medicare & Medicaid Services (CMS) implemented PQRS GPRO. The approach and considerations for this first form of group reporting were significantly different from previous PQRS reporting methods, which focused on the reporting performance of the individual eligible professional (EP). In individual PQRS reporting, the EP selects the measure(s) to report on, identifies the correct patient(s), and reports the identified quality information on those patients. Group reporting, in contrast, requires group practices to report on a single set of defined measures for a sample of patients who are assigned at the end of the reporting period.

Since 2010, the role of group reporting in PQRS has been greatly expanded. Group reporting is embedded in the alternative payment models, such as the Pioneer and Medicare Shared Savings Program accountable care organizations. The 2012 PQRS GPRO data will be the first performance data publicly reported, and CMS has stated that physician groups with at least 100 eligible professionals must report as a group in 2013 or face a penalty in 2015 associated with the physician value modifier, which is the Medicare physician pay-for-performance program. Coinciding with this value modifier proposal, CMS is expanding the number of methods (or reporting mechanisms) for group reporting. The group reporting mechanism described in this issue brief is now called GPRO Web Interface.1

This issue brief is intended for faculty practices that want to understand how to implement and perform well on this relatively new form of reporting. UHC and AAMC conducted interviews with and analyzed performance data from 8 academic practices that participated in the 2010 GPRO to understand why they participated, how they operationalized their reporting infrastructure, and what successes and challenges they had with reporting measures and measure performance.

The GPRO implementation strategies varied among the 8 practices. Most groups had a core team that attended the mandatory monthly calls with CMS and prepared for the 5-week reporting period. The size of the core implementation team ranged from 3 to more than 2 dozen staff members. During the 5-week data submission window, some groups assembled additional staff for data collection. The approach to implementation depended on the organization’s vision of and purpose for reporting. For example, one practice noted that because the 2010 data were not publicly reported, it treated GPRO as a pay-for-reporting program and focused on submitting data efficiently with minimal resources. Other organizations were concerned about the accuracy and thoroughness of the data and committed additional resources to complete extensive chart reviews.

1 Value modifier and GPRO expansion options were described in the 2013 Medicare Physician Fee Schedule Final Rule, available at http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/PhysicianFeeSched/index.html.

Performance on both quality and cost measures varied widely across organizations. When discussing reasons for poor performance, 3 trends emerged:

• Attribution. CMS retrospectively assigned patients to each group’s tax identification number based on the plurality of evaluation and management visits. As a result, some patients that received only specialty care would be assigned to unexpected modules. For example, one practice identified a large number of cancer patients in the reporting sample for the diabetes module.

• Missing data or incomplete documentation in the electronic health record (EHR). Groups noted that services delivered outside the organization often were not listed in the EHR and/or that certain measure specifications were not captured as discrete elements in their EHRs.

• Unclear/incomplete measure specifications. Some measures did not allow the practice to report patient refusal as an acceptable reason for exclusion when calculating measure performance. In other cases, specifications seemed vague and clinicians felt that certain specification requirements were too narrow. Finally, some groups reported that the documentation in the record did not meet the necessary specification for the services.

The groups were able to overcome these challenges for specific measures. Better performance on measures was associated with established workflows that had been in place prior to GPRO reporting. Often these workflows were connected with larger quality improvement initiatives that had established processes to help ensure that the correct data were entered in the EHR. One group that did better than the academic cohort on 22 of 26 measures used extensive chart abstraction to collect information that was not in discrete EHR fields but was located in detailed notes.

Through the annual feedback report, GPRO participants were able to compare their cost performance with that of the other 35 participants. As with the quality measures, cost performance ranged widely, but the median per-capita cost for the 8 academic group practices was higher than that of the full GPRO cohort.

With the value modifier requirement to report as a group or face a penalty, it is important for faculty practices to understand the GPRO options and how to perform well on the cost and quality measures. The 8 groups suggested the following for faculty practice organizations that are considering GPRO Web Interface reporting:

• Assign the team early

• Understand the measure specifications, determine how to retrieve the necessary information, and identify gaps in your current systems

• Understand variations in EHR documentation

Background

In 2010, CMS implemented the first GPRO for PQRS. Only 35 groups participated in the first cycle of reporting, but participation has grown since then. GPRO Web Interface is the basis for alternative payment programs, such as the Pioneer and Medicare Shared Savings Program accountable care organizations. GPRO data will also be the first set of performance data publicly reported on the Physician Compare Web site. Finally, in the 2013 Physician Fee Schedule Final Rule, CMS finalized that group reporting will be an important component of the upcoming physician value modifier.

PQRS and the Value Modifier

The PQRS is a voluntary pay-for-reporting program that began in 2007. Under the program, EPs can receive an incentive for submitting quality information on Medicare patients. In 2010, the incentive was 2% of total Medicare allowed charges; however, in recent years, PQRS incentives have decreased. Starting in 2015, EPs will be penalized for not reporting: a 1.5% reduction in fees for 2015 and a 2% reduction for 2016 and beyond. The 2015 penalty will be based on PQRS data reported in 2013.
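The incentive and penalty schedule above reduces to a simple lookup. The following sketch is illustrative only; the function name is hypothetical, and the post-2010 incentive values are simplified to zero rather than modeling the actual declining percentages:

```python
def pqrs_adjustment(year: int, reported: bool) -> float:
    """Illustrative PQRS payment adjustment, as a fraction of total
    Medicare allowed charges, per the schedule described in this brief."""
    if reported:
        # 2% pay-for-reporting incentive in 2010; incentives shrank in
        # later years (modeled here, simplistically, as zero)
        return 0.02 if year == 2010 else 0.0
    if year == 2015:
        return -0.015   # 1.5% fee reduction for non-reporting in 2015
    if year >= 2016:
        return -0.02    # 2% reduction for 2016 and beyond
    return 0.0          # no penalty before 2015
```

For example, a group that failed to report 2013 data would see `pqrs_adjustment(2015, False)` return -0.015, the 1.5% fee reduction.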

In addition to the upcoming PQRS penalties, starting in 2015 CMS is required to implement a physician value modifier that pays some physicians and physician groups differentially based on how they perform on cost and quality measures. In the 2013 Physician Fee Schedule Final Rule, CMS stated that all physician group practices with at least 100 EPs would have to submit 2013 data to PQRS using a GPRO methodology or face an automatic 1% penalty in 2015 (in addition to any PQRS penalty).2

Mechanics of Group Reporting

The 2010 GPRO was significantly different from previous PQRS reporting methods. In individual reporting, the EP is responsible for selecting measures (from more than 200 measures), identifying the patient population, and reporting quality information on those patients. In contrast, GPRO practices report on a single predetermined set of measures for a sample of patients, which is assigned retrospectively based on where the patient sought care during the year.

For 2010, all GPRO participants were large groups with at least 200 EPs.3 Groups had to report on 26 measures that fell into 4 disease modules—coronary artery disease, heart failure, hypertension, and diabetes—and a set of preventive care measures. CMS assigned a patient sample for each of the preventive care measures and for each of the 4 disease modules. The practices then had to report on the first 411 assigned patients.4

2 Groups that are participating in a Pioneer or a Medicare Shared Savings Program accountable care organization are exempt from the 2015 value modifier.

3 Groups were defined by taxpayer identification number and the individual EPs were defined by their National Provider Identifier numbers.

4 If 411 patients could not be assigned, then the practices had to report on all the patients. Each sample could have up to 616 patients. Skipping records was allowed only in certain limited circumstances.
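The sampling rule described above and in the footnote can be sketched as follows. This is a simplified reconstruction; the function and parameter names are hypothetical, and the detailed skip rules are omitted:

```python
def reporting_sample(assigned_patients: list, target: int = 411,
                     max_sample: int = 616) -> list:
    """Simplified sketch of the 2010 GPRO reporting requirement:
    CMS assigned a sample of up to `max_sample` patients per module,
    and the practice reported on the first `target` assigned patients,
    or on all of them if fewer than `target` were assigned."""
    sample = assigned_patients[:max_sample]  # the CMS-assigned sample
    return sample[:target] if len(sample) > target else sample
```

A module with 700 candidate patients would yield a 616-patient assigned sample and a 411-patient reporting obligation; a module with 300 candidates would require reporting on all 300.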

The 2010 GPRO data collection tool was a Microsoft Access database. CMS assigned patients who had at least 2 visits to the group practice and received the plurality of their evaluation and management services at the group practice. Practices received the data tool in February 2011 and had 5 weeks to complete the data submission.

While group practices received their performance results immediately after submitting the data, they did not receive any comparison data until fall 2011 when they received their Quality and Resource Use Reports.5 These feedback reports provided quality benchmark scores as well as cost benchmarks for the patients that were assigned to the practice so that practices could understand how they compared with other GPRO participants on each of the quality measures as well as on the overall per-capita costs of their assigned patient population.

Since 2010, CMS has made several changes to group reporting. In 2011, it transitioned from a Microsoft Access database to a Web-based tool. In 2012, CMS allowed groups as small as 25 EPs (although the reporting requirements differ for groups with fewer than 100 EPs). In 2013, CMS established several new GPRO reporting mechanisms and relabeled this specific type of group reporting the “GPRO Web Interface.” Finally, there have been several changes to the measures that groups must report.

2010 UHC-AAMC Academic GPRO Benchmarking Project

In the fall of 2011, UHC and AAMC initiated a benchmarking project among faculty practice plans that participated in the 2010 GPRO. The objectives of this engagement were threefold: (1) to compare performance on specific physician practice–based quality measures among academic organizations, (2) to create an internal learning network, and (3) to share findings with other academic practices considering the GPRO in the future. The following sections focus solely on the 2010 reporting cycle.

Methods

Eight academic practices agreed to share their performance measure rates with each other in a blinded fashion. To better understand the factors affecting performance, UHC and AAMC conducted semi-structured exploratory interviews with each of the 8 practices to identify factors that affect GPRO reporting. (The practices identified the GPRO team members to participate in the interview. A copy of the interview guide can be found in the appendix.) The interviews focused on the groups’ reasons for participation and their operational processes to prepare for and implement GPRO. Rather than reviewing the performance data for all 26 measures, groups were asked to share details on the following:

• General experiences of the 2010 reporting period.

• The experience for a measure in which the group had relatively better performance and the factors related to better performance.

5 CMS plans to calculate the value modifier based on the Quality and Resource Use Reports.

• The experience for a measure in which the group had relatively poor performance and the factors related to worse performance.

• Specific experience on the following 3 measures: DM-2 (diabetes HbA1c control), DM-8 (diabetes foot exam), and HTN-3 (hypertension plan of care). These measures were selected because there was wide variation in performance across the 8 organizations in the study.

GPRO Academic Cohort

The 8 practices had a wide range of practice patterns, infrastructure, and implementation styles. Table 1 summarizes key characteristics of each of the groups that participated in the benchmarking project. The table is divided into 3 categories:

• Organizational characteristics, which includes information about the payer mix, the approximate percentage of primary care patients, the practice’s EHR, and the number of EPs in the group.

• GPRO implementation, which describes the core team that managed the preparation for GPRO reporting; the reporting team that submitted data during the 5-week data submission process; the amount of time needed to submit data; and whether the data were extracted using EHRs, chart review, or both.

• Data from CMS Quality and Resource Use Report, including the size of the population assigned to the practice as well as the average percentage of EPs that the assigned patients saw outside of their organization. (The last variable is a measure of how much of the attributed patients’ care occurs outside of the practice.)

Table 1. Characteristics of Practices Participating in the 2010 UHC-AAMC GPRO Benchmarking Project

| Organization | A | B | C | D | E | F | G | H |
|---|---|---|---|---|---|---|---|---|
| Medicare payer mix (%) | 35 | 19 | 18.7 | 27 | Combined: 40 | 31 | 35 | 23 |
| Medicaid payer mix (%) | 18 | 29.5 | 29.7 | 17 | (combined) | 14 | 8 | 22 |
| Percentage of primary care physicians | 17 | 22 | 20 | 12 | 15 | 10 | 15-20 | 10 |
| EHR vendor | Centricity | Epic | Cerner | Epic | Epic | Epic | Epic | AllScriptsa |
| Length of time using current EHR system, years | 5 | 7 | 10+ | 7 | 3 | 7 | 9 | 6 |
| No. of EPs in the practice | 220 | 563 | 976 | 988 | 618 | 479 | 2,085 | 673 |
| Time needed to submit data | 3 wk | 3 wk | 3 wk | 4 wk | 17 d | 3 wk | 4 wk | 3 wk |
| Reporting mechanism | EHR/chart | EHR/chart | EHR/chart | EHR/chart | Chart only | EHR/chart | EHR only | EHR/chart |
| No. of attributed patients | 4,801 | 7,579 | 4,155 | 4,813 | 10,592 | 12,364 | 31,006 | 4,388 |
| Percentage of EPs outside of group | 72 | 31 | 29 | 47 | 33 | 35 | 43 | 52 |

GPRO implementation: core team

• A: COO, QI specialist (primary lead), EHR project coordinator, and IT reporting database specialist

• B: One primary lead who coordinated with others

• C: 8 staff members

• D: Stakeholder group including the practice plan CEO and team members from IT and clinical informatics

• E: See members listed under reporting team

• F: Project lead, managerial support, IT team, and team of VPs

• G: Senior director of quality reporting & reform

• H: See members listed under reporting team

GPRO implementation: reporting team

• A: 3 FTE coders for 2 wk and 5-6 nurses for chart abstractions

• B: 12 coders from the billing office and 1 Epic programmer dedicated 50% time for ~2 mo

• C: 1 IT person and 2 data analysts for chart review

• D: Hired a FT analyst to review measures and used additional analysts to help with chart review

• E: 6 FTE abstractors (400 h), 1 part-time abstractor (30 h), and 1 project support staff member (308 h)

• F: 1 FTE IT analyst at 80% for 5 mo for pulling data; hired and trained 50 part-time nurses and used 25 for chart abstraction

• G: 2.5 SQL analysts at 75% time during submission and 1 clinical analyst

• H: 1 coder, 1 manager, and 1 clerical staff person

a AllScripts is the EHR vendor for the practice plan. The hospital EHR vendor is Epic. CEO = chief executive officer; COO = chief operating officer; EHR = electronic health record; EP = eligible professional; FT(E) = full-time (equivalent); GPRO = group practice reporting option; IT = information technology; QI = quality improvement; SQL = structured query language; VP = vice president.

Reasons for Choosing GPRO

While each practice approached GPRO reporting differently, UHC and AAMC identified several common themes from the experiences of the practice plans. Most practices participated in the 2010 GPRO because they believed that reporting as a group would be more cost-effective and less burdensome than reporting for individual clinicians. (In 2010, the incentive for PQRS reporting was 2% of the practice’s Medicare Part B allowed charges for services covered under the physician fee schedule. In addition, GPRO practices had the option to participate as a group in the e-prescribing incentive program, which provided another 2% incentive for successful reporting.) The average practice size was 800 providers, and most practices had previously attempted PQRS individual reporting with mixed success.

Some practices chose to treat GPRO as a pay-for-reporting project to receive the incentive dollars and were less focused on making reporting changes to improve performance, in part because first-year performance was kept confidential. Other groups opted to use GPRO as an opportunity to understand performance on key quality measures and employed considerable resources to ensure that they had a complete and accurate picture of their performance. Their strategies helped determine who would be on the planning team and how they would invest in the reporting.

Operationalizing GPRO

Most groups had a core team that attended the mandatory monthly calls with CMS and prepared for the 5-week reporting period. Some groups assembled additional staff for the data collection during that 5-week period. The size of the core implementation team ranged from 3 to more than 2 dozen staff members. The day-to-day activities of the team were most often led by a staff member from the clinical informatics or quality arm of the organization. Other common roles on the teams included information technology analysts who extracted data and certified coders or nurses who conducted chart abstractions. For example, one organization hired and trained 50 registered nurses to perform chart abstractions; approximately 25 of these nurses participated in the project on a part-time basis.

A common theme among interviewees was the importance of a deep familiarity with both the GPRO measure rate specifications and the capabilities of the organization’s EHR. One organization noted that forming a “stakeholders group” to conduct a gap analysis and speaking with other GPRO participants that use the same EHR were important steps. Most groups developed queries in advance, validated them, and pulled the necessary data from their EHRs before receiving access to the GPRO tool. This lessened the time to collect and submit the final data, as much of the data could be automatically populated.

Data collection methods included a combination of electronic data extraction and chart reviews for the majority of practices. Groups noted that they were able to easily extract certain measures, such as DM-2 (diabetes HbA1c control); however, the data for other measures, such as HTN-3 (hypertension plan of care), did not exist in discrete fields or required additional information, resulting in supplemental chart reviews. Organization G chose to report only what could be extracted electronically, while Organization E performed only chart reviews.

The 8 participants used 4 different EHRs; the majority used Epic as their primary ambulatory EHR. Although all practices were in the process of implementing their EHRs, not all had their ambulatory modules operational when GPRO started. None of the practices received help from their EHR vendors; individual organizations and their staff determined how to extract data from their systems.

Performance on Quality Measures

UHC and AAMC compiled each practice plan’s measure rates into a performance dashboard to allow comparisons between organizations and measures (Figure 1). The dashboard also showed participants how they compared with the academic cohort, as opposed to the national mean reported by CMS.

Figure 1. Measure Rate Dashboard for Academic Groups

This comparison revealed a large degree of variation in measure performance across the academic groups. For example, Organization F performed at least 5% better than the academic mean on 22 of the 26 measures. Organizations B, C, and G performed above the academic mean on roughly half of the measures.

The comparison also demonstrated that while organizations performed similarly on some measures, performance on other measures was much more varied. For example, academic organizations performed within 8 percentage points of each other (range 87.8%-95.2%) on measure DM-2 (diabetes HbA1c control). Conversely, organizations’ performance on measure DM-8 (diabetes foot exam) spanned 83 percentage points across all 8 groups.

Compared with the national GPRO benchmark, the academic cohort’s performance was mixed. The academic groups averaged slightly higher than the national group on the coronary artery disease module, scored lower on the diabetes and prevention modules, and had mixed performance by measure within the heart failure and hypertension modules.

Better Performance

As shown in Figure 1, performance varied across both organizations and measures. During the interviews, groups were asked to identify measures on which they performed well and those on which they performed poorly, to understand whether underlying factors were affecting performance. Among the 8 practices interviewed, 3 identified coronary artery disease measures and 3 identified diabetes measures as those on which they performed better. For the selected better-performance measures, virtually all organizations had some process in place, other than GPRO reporting, to help ensure that proper data collection took place. Other factors that contributed to better performance included:

• Established workflows to document the test or service; typically this process existed for another reporting program or collaborative. In one faculty practice, physicians’ annual compensation was affected by their performance data on select measures.

• Participation in a quality collaborative that focused on performance on certain chronic disease or preventive measures (e.g., National Committee for Quality Assurance primary care medical home recognition, state quality improvement initiatives).

• Data elements for the measure that were captured in discrete fields within the EHR, making it easy to electronically extract the data for reporting.

• Extensive chart reviews in addition to electronic extraction to find documentation supporting measure criteria. This was particularly important in cases in which the supporting data were embedded in notes and not captured directly in the EHR.

Ideas to Improve GPRO Performance

The following are some innovative approaches practices used to achieve better performance:

• Develop a training guide and educational materials for abstractors to ensure consistency in the abstraction process.

• Incorporate residents into the program to ensure that appropriate services are received and documented, even if the services occur outside the organization.

• Hire or dedicate staff to review the measure specifications, develop programs to extract the data, and identify gaps.

Poor Performance

Themes for poorer measure rate performance also emerged from the interviews. The 3 most common reasons cited were the CMS attribution methodology, incomplete or absent physician documentation for tests or services, and limited CMS exclusion criteria.

Attribution

Six of the 8 groups interviewed indicated that the patient attribution methodology affected performance. In 2010, CMS attributed Medicare fee-for-service patients to a group practice if the EPs in the group practice billed for at least 2 new or established office visits and the group practice had the plurality of evaluation and management charges for that patient. The attribution methodology did not distinguish between the physician specialties, so patients seeking specialty care at an academic center could be assigned to one of the disease modules or to a preventive care measure.
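As a rough sketch, the two-part attribution test described above (at least 2 E&M visits billed by the group, plus the plurality of the patient’s E&M charges) might look like the following. The data shapes and names are hypothetical, and CMS’s actual methodology included additional rules not modeled here:

```python
from collections import defaultdict

def attribute_patients(em_claims, min_visits=2):
    """Illustrative 2010 GPRO-style attribution. `em_claims` is a list of
    (patient_id, group_tin, allowed_charges) tuples, one per evaluation
    and management (E&M) visit."""
    visits = defaultdict(int)     # (patient, group) -> E&M visit count
    charges = defaultdict(float)  # (patient, group) -> total E&M charges
    for pid, tin, amount in em_claims:
        visits[(pid, tin)] += 1
        charges[(pid, tin)] += amount

    attribution = {}
    for pid in {p for p, _ in visits}:
        groups = [tin for p, tin in visits if p == pid]
        # Group holding the plurality of this patient's E&M charges...
        top = max(groups, key=lambda tin: charges[(pid, tin)])
        # ...which must also have billed at least `min_visits` visits
        if visits[(pid, top)] >= min_visits:
            attribution[pid] = top
    return attribution
```

Note that nothing in this rule looks at physician specialty or diagnosis, which is why a patient seen only by an oncologist could still land in a group’s diabetes module sample.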

To varying degrees, all organizations found that patients were attributed to their group on the basis of specialty visits unrelated to the GPRO module to which the patient was assigned. For example, Organization D noted that a large percentage of the patients attributed to the diabetes module had come to the organization for oncology services. When this organization reran the metrics for its primary care patients only, the performance rate increased. Across all organizations, the diabetes and prevention modules were most commonly cited as having the largest attribution-related issues.

Missing Information

Another reason cited for poorer performance was limited or missing documentation for a service or test. In some cases, this was because the patient received the test or service outside the system, so the results were either embedded in a note or not entered in the EHR at all. At least one practice associated missing data with the patients assigned to the disease module. As noted in the “Better Performance” section above, some practices were able to overcome the missing data issues by establishing workflows to ensure that the data were properly captured and documented.

Limited Exclusion Data

Some organizations noted that not all preventive care measures had exclusions for patient choice. For example, patient refusal was not a valid exclusion criterion for colorectal cancer screening. Even if the physician recommended that a patient receive this screening, if the patient refused, it counted against the organization’s performance on that measure.

In other cases, the documentation was not sufficient to meet the measure specification. This was particularly noted for the DM-8 (diabetes foot exam) measure, for which several practices noted that a physician had documented “lower extremity exam” but could not receive credit because that language did not meet the specification criteria.
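Why spec-specific language matters can be sketched as a simple phrase check. The accepted phrases below are invented for this illustration, not the actual DM-8 specification wording.

```python
# Illustrative only: the accepted phrases below are assumptions for this
# sketch, not the actual DM-8 measure specification wording.
ACCEPTED_FOOT_EXAM_TERMS = ("foot exam", "monofilament", "pedal pulses")

def meets_dm8_spec(documentation):
    """Credit the exam only when the note uses language the measure
    specification recognizes; a generic phrase does not count."""
    text = documentation.lower()
    return any(term in text for term in ACCEPTED_FOOT_EXAM_TERMS)

meets_dm8_spec("Comprehensive foot exam with monofilament testing")  # True
meets_dm8_spec("Lower extremity exam normal")                        # False
```

A documentation template or note macro that prompts physicians for the recognized wording avoids the second case.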


For the HTN-3 (hypertension plan of care) measure, which counted the percentage of patients whose blood pressure was not at goal (systolic ≥ 140 mm Hg or diastolic > 90 mm Hg) and who had a documented plan of care, in some cases the clinician did not agree that the patient had poor blood pressure control when the patient’s individual characteristics were considered.6
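The HTN-3 rate calculation can be sketched roughly as follows; the field names and the exact threshold logic are assumptions for illustration, not the official measure specification.

```python
# Rough sketch of an HTN-3-style rate; field names and the exact
# threshold logic are assumptions for illustration.

def htn3_rate(patients):
    """Fraction of patients with blood pressure above goal who have a
    documented plan of care. Each patient is a dict with 'systolic',
    'diastolic', and 'has_plan_of_care'."""
    above_goal = [p for p in patients
                  if p["systolic"] >= 140 or p["diastolic"] > 90]
    if not above_goal:
        return None  # empty denominator
    with_plan = [p for p in above_goal if p["has_plan_of_care"]]
    return len(with_plan) / len(above_goal)
```

Note that the denominator is determined purely by the threshold, which is exactly where clinicians disagreed: a patient can fall into the denominator even when the clinician judges the blood pressure acceptable for that individual.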

Performance on Cost of Care Measures

In addition to measuring quality performance, UHC and the AAMC collected cost of care information from groups as reported in the Quality and Resource Use Reports. These reports were distributed by CMS to the 2010 GPRO participants and were intended to provide information about the quality and cost of care provided to fee-for-service Medicare patients. The upcoming value modifier will be based on these feedback reports.

Figure 2 provides a summary of the cost of care data received from the participating academic groups. CMS calculated the Medicare per capita costs from all Medicare Part A and B claims submitted by all providers who treated Medicare fee-for-service patients attributed to each organization. The patient costs were standardized and risk adjusted. The costs include those for providers that were not affiliated with the organization. The academic median was approximately $1,000 higher than the median for all 2010 GPRO groups. Two of the practice plans in the benchmarking study fell below the national median, while the other 6 fell above the median.

Figure 2. Medicare Patients’ Per Capita Cost Comparison Among Academic GPRO Participants

6 The HTN-3 measure was removed from the 2012 GPRO Web interface.


2011 GPRO Reporting Cycle

Although the 2011 GPRO protocol for large groups was virtually the same as in 2010, the conversion to a Web-based protocol caused some setbacks in the reporting process, including a much-delayed reporting cycle. Summaries of the 2011 experiences were provided in the AAMC and UHC physician fee schedule comment letters (www.facultypractice.org/117.htm). Overall, the 8 groups that participated in GPRO for the second time performed better than the groups that were brand new to GPRO.

How Can AMCs Prepare for the Future of GPRO and Quality Reporting?

In the next few years, academic faculty practices will be held increasingly accountable for the quality and cost of the care they provide and must therefore identify strategies for meeting the new demands for information. Organizations currently reporting PQRS individually need to decide whether to continue individual reporting or transition to a group reporting method and, if the latter, which group reporting option to use. Groups should consider the impact of the reporting selection on the value modifier, PQRS, and other federal programs such as the Medicare and Medicaid EHR Incentive programs.

For groups that are considering the GPRO Web interface option, current GPRO participants recommended several steps:

• Assign the team. Determine early who at your organization should be on the planning team and who should be responsible for reporting.

• Understand measure specifications. All of the groups in the benchmarking study emphasized that understanding the measure specifications, knowing how to extract the appropriate data from your system, and ensuring complete data capture are critical to successful implementation.

• Validate EHR data capture. In addition to ensuring that the data can be captured via an EHR when possible, groups should also compare EHR results with chart abstraction, identify gaps, and institute processes to improve documentation.

• Focus on education. Educate physicians and staff on why the organization is moving towards the PQRS GPRO. Educate physicians and coders on how to achieve the best results (e.g., what the measures are, what needs to be done so measure data can be easily extracted, any specific measure criteria that should be highlighted).
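The "validate EHR data capture" step above can be sketched as a simple comparison between electronic extraction and chart abstraction. The boolean patient-level results here are an assumed simplification for illustration.

```python
# Minimal sketch of the validation step above: compare electronically
# extracted measure results against manual chart abstraction for the same
# patients. Boolean patient-level results are an assumed simplification.

def find_gaps(ehr_results, abstracted_results):
    """Both arguments map patient_id -> whether the measure was met.
    Returns the patient ids where the two sources disagree, i.e., the
    charts to investigate for documentation or extraction gaps."""
    shared = ehr_results.keys() & abstracted_results.keys()
    return sorted(pid for pid in shared
                  if ehr_results[pid] != abstracted_results[pid])

ehr = {1: True, 2: False, 3: True}   # what the automated query found
chart = {1: True, 2: True, 3: True}  # what abstractors found
find_gaps(ehr, chart)  # [2]: the EHR query missed a documented service
```

Each disagreement points either to documentation that lives only in free text or to a flaw in the extraction logic, which is exactly what the recommended gap review is meant to surface.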

Regardless of whether they choose GPRO Web reporting, groups should prepare for cost reporting by evaluating their internal data and identifying variations and opportunities to reduce waste through standard clinical protocols. Groups can also anticipate new measures, such as the patient experience survey, the Consumer Assessment of Healthcare Providers and Systems Clinician and Group Survey (CG-CAHPS). Finally, groups should understand their performance on quality and cost measures, and what drives that performance, so they are prepared as results are posted on Physician Compare and incorporated into the value modifier.


Appendix UHC-AAMC PQRS GPRO Measure Rates Interview Guide

Overview of Your GPRO Participation

1. Who was involved in the decision-making process to participate in GPRO?
2. What were the deciding factors for participating in the group vs. individual option for PQRS?
3. Approximately what percentage of your physicians is primary care-based?
4. What is the payer mix of your practice? What percentage of your patients are dual eligibles?

Electronic Health Record

1. Which EHR system does your organization use?
2. How long have you been using this system?
3. Did you receive external support from the EHR vendor for the GPRO reporting process?

Operational Processes

1. Describe your organization’s preparation for data collection prior to receiving the GPRO data tool.
   a. What pre-work did you find most useful?
   b. How quickly were you able to begin the data collection process after receiving the GPRO data tool?
2. How long did it take to complete and submit the GPRO data tool?
3. Which roles from your organization were key players in successfully completing GPRO?
   a. Did you form any internal groups to focus on performance improvement for specific measures?
   b. What staff resources were used to complete the GPRO data tool?
4. Did your organization have a standardized process for collecting GPRO quality data across all departments? If so, describe this process.
   a. What steps did you take after realizing there was a gap in a process?
5. How did you share feedback with your providers?

Measure Performance

1. To what degree (high, medium, low) do you feel the following influenced your performance:
   a. Patient attribution. If high, which measures were most affected?
   b. Patient complexity/severity. If high, which measures were most affected?


2. Better-performance measure:
   a. Please identify 1 of the better-performing measures (i.e., one with a green cell) to discuss. Which measure did you select?
   b. What contributed to your performance on that measure?
3. Worse-performance measure:
   a. Please identify 1 of the measures with poorer performance (i.e., one with a red cell) to discuss. Which measure did you select?
   b. What contributed to the challenges with that measure?
4. Measure: DM-2 (diabetes HbA1c control)
   a. How was this data element collected?
   b. What successes/challenges did you have with reporting this measure?
5. Measure: DM-8 (diabetes foot exam)
   a. How was this data element collected?
   b. What successes/challenges did you have with reporting this measure?
6. Measure: HTN-3 (hypertension plan of care)
   a. How was this data element collected?
   b. What successes/challenges did you have with reporting this measure?

Overall Experience With GPRO

1. Based on your GPRO experience in 2010, what changes (if any) have you made for the 2011 reporting period?

2. What recommendations would you offer groups that are participating in the GPRO for the first time?

3. How do you think your experience with the GPRO has affected the quality of care at your organization?


For each measure rate below, indicate with an “x” whether:

• Measure collection was automated

• Chart review was required

• Data were collected before your participation in GPRO

• A workflow was established to collect measure rate data for reporting

• A large number of nonprimary care patients were attributed to the measure

Measure Description (columns for each measure: Automated Measure Collection | Chart Review | Collected Data Previously | Established Workflow | Large No. of Nonprimary Care Patients Attributed)

CAD: Coronary Artery Disease
• CAD-1: Oral antiplatelet therapy prescribed for patients with CAD
• CAD-2: Drug therapy for lowering LDL cholesterol
• CAD-3: Beta blocker therapy for CAD patients with prior myocardial infarction
• CAD-7: ACE inhibitor or ARB therapy for patients with CAD and diabetes and/or LVSD

DM: Diabetes Mellitus
• DM-1: Hemoglobin A1c testing
• DM-2: Hemoglobin A1c poor control in diabetes mellitus
• DM-3: High blood pressure control in diabetes mellitus
• DM-5: LDL-C control in diabetes mellitus
• DM-6: Urine screening for microalbumin or medical attention for nephropathy in diabetic patients
• DM-7: Dilated eye exam in diabetic patient
• DM-8: Foot exam
• DM-9: Lipid profile


HF: Heart Failure
• HF-1: Left ventricular function assessment
• HF-2: Left ventricular function testing
• HF-3: Weight measurement
• HF-5: Patient education
• HF-6: Beta blocker therapy for LVSD
• HF-7: ACE inhibitor or ARB therapy for LVSD
• HF-8: Warfarin therapy for patients with atrial fibrillation

HTN: Hypertension
• HTN-1: Blood pressure measurement
• HTN-2: Blood pressure control
• HTN-3: Plan of care

PREV: Prevention
• PREV-5: Screening mammography
• PREV-6: Colorectal cancer screening
• PREV-7: Influenza immunization for patients ≥ 50 years old
• PREV-8: Pneumonia vaccination for patients ≥ 65 years old

ACE = angiotensin-converting enzyme; ARB = angiotensin receptor blocker; LDL = low-density lipoprotein; LVSD = left ventricular systolic dysfunction.


Contributors

Carmen Barc, RN, Loyola University Health System
Nancy Bertolino, CPC, University of Texas Medical Branch at Galveston
Jim Besjak, Southern Illinois University Healthcare
Elizabeth J. Fingado, BSMT, MBA, UNM Health System
Cindy Geiger, Oregon Health and Science University
Lorraine Hewitt, UNM Health System
Jessie Johnson, Oregon Health and Science University
Christina Koster, MHSA, UHC
Peggy Lesage, Fletcher Allen Health Care
Jacqueline Matthews, RN, MS, The Cleveland Clinic Foundation
Rex McCallum, MD, University of Texas Medical Branch at Galveston
Karen B. McKnight, MA, ASQ CMQ/OE, Fletcher Allen Health Care
Matt Navigato, Oregon Health and Science University
Shaifali Ray, MHA, UHC
Johanna Sheehey-Jones, RN, BSN, Fletcher Allen Health Care
Kelli Turner, CPC, University of Texas Health Science Center – Houston
Mary Wheatley, MS, AAMC


© 2012 UHC and Association of American Medical Colleges. All rights reserved. The reproduction or use of this document in any form or in any information storage and retrieval system is forbidden without the express, written permission of UHC and the AAMC. This document is not meant to take the place of an institution’s own professional judgment or assessment of the topic covered herein.

