Onsite Quarterly Meeting SIPP PIPs

Transcript
Page 1: Onsite  Quarterly Meeting SIPP PIPs

Onsite Quarterly Meeting SIPP PIPs

June 13, 2012

Presenter:

Christy Hormann, LMSW, CPHQ
Project Leader-PIP Team

Page 2: Onsite  Quarterly Meeting SIPP PIPs

Overview of Presentation

Progression of SIPP PIPs

SFY 2012 validation results

Areas for improvement

Page 3: Onsite  Quarterly Meeting SIPP PIPs

SFY 2011 SIPP PIPs

First year in which the SIPPs completed the PIP process

Each SIPP required to submit one PIP

Total of 14 PIPs were submitted

Page 4: Onsite  Quarterly Meeting SIPP PIPs

SFY 2011 SIPP PIPs cont.

One PIP was completed through Activity VI

13 were completed through Activity VIII

Initial scores were lower due to a lack of proper documentation

Page 5: Onsite  Quarterly Meeting SIPP PIPs

SFY 2012 SIPP PIPs

For SFY 2012, the SIPPs were required to submit the collaborative PIP and a PIP topic of their choosing

Some of the individual SIPP topics were increasing family participation in treatment, minimizing weight gain during treatment, and reducing readmissions

Page 6: Onsite  Quarterly Meeting SIPP PIPs

SFY 2012 SIPP PIPs cont.

There were a total of 27 PIPs submitted for validation

Six PIPs were assessed through Activity VI

Four PIPs were assessed through Activity VII

Five PIPs were assessed through Activity VIII

Page 7: Onsite  Quarterly Meeting SIPP PIPs

SFY 2012 SIPP PIPs cont.

Twelve PIPs were assessed through Activity IX

None of the PIPs were assessed for Activity X-Sustained Improvement

Page 8: Onsite  Quarterly Meeting SIPP PIPs

PIP Stages

I. DESIGN

II. IMPLEMENTATION

III. OUTCOMES

Page 9: Onsite  Quarterly Meeting SIPP PIPs

Study Design Stage

Establishes the methodological framework for the PIP

Includes development of the study topic, question, indicators, and population (Activities I through IV)

A strong study design is necessary for the successful progression of a PIP

Page 10: Onsite  Quarterly Meeting SIPP PIPs

Study Design Stage Evaluation Elements

Activity I: Study Topic

Reflects high-volume or high-risk conditions

Is selected following collection and analysis of data

Page 11: Onsite  Quarterly Meeting SIPP PIPs

Study Design Stage Evaluation Elements

Activity I: Study Topic

Addresses a broad spectrum of care and services

Includes all eligible populations that meet the study criteria

Does not exclude members with special health care needs

Has the potential to affect member health, functional status, or satisfaction

Page 12: Onsite  Quarterly Meeting SIPP PIPs

Study Design Stage Evaluation Elements

Activity II: Study Question

States the problem to be studied in simple terms

Is answerable

Page 13: Onsite  Quarterly Meeting SIPP PIPs

Study Design Stage Evaluation Elements

Activity III: Study Indicators

Are well-defined, objective, and measurable

Are based on current, evidence-based practice guidelines, pertinent peer-reviewed literature, or consensus expert panels

Allow for the study question to be answered

Page 14: Onsite  Quarterly Meeting SIPP PIPs

Study Design Stage Evaluation Elements

Activity III: Study Indicators

Measure changes (outcomes) in health or functional status, member satisfaction, or valid process alternatives

Have available data that can be collected on each indicator

Page 15: Onsite  Quarterly Meeting SIPP PIPs

Study Design Stage Evaluation Elements

Activity III: Study Indicators

Are nationally recognized measures, such as HEDIS technical specifications, when appropriate

Include the basis on which the indicator(s) was adopted, if internally developed

Page 16: Onsite  Quarterly Meeting SIPP PIPs

Study Design Stage Evaluation Elements

Activity IV: Study Population

Is accurately and completely defined

Includes requirements for the length of a member’s enrollment in the MCO

Captures all members to whom the study question applies

Page 17: Onsite  Quarterly Meeting SIPP PIPs

SIPP Design Stage Results

Study Stage: Design

Activity | Met | Partially Met | Not Met
I. Appropriate Study Topic* | 87% (139/160) | 3% (4/160) | 11% (17/160)
II. Clearly Defined, Answerable Study Question(s) | 74% (40/54) | 11% (6/54) | 15% (8/54)
III. Clearly Defined Study Indicator(s)* | 68% (86/127) | 9% (11/127) | 24% (30/127)
IV. Correctly Identified Study Population | 75% (48/64) | 14% (9/64) | 11% (7/64)
Design Total* | 77% (313/405) | 7% (30/405) | 15% (62/405)

* The activity or stage total may not equal 100 percent due to rounding.
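
The starred rows reflect a purely arithmetic effect: each cell is a count of scored evaluation elements divided by the activity denominator and rounded to a whole percent, so a row can add to 99 or 101 percent. A minimal sketch of that arithmetic for Activity I, assuming conventional half-up rounding (the presentation does not state its rounding rule):

```python
# Activity I counts from the table above: 139 Met, 4 Partially Met,
# 17 Not Met out of 160 scored evaluation elements.
import math

counts = {"Met": 139, "Partially Met": 4, "Not Met": 17}
denominator = sum(counts.values())  # 160

def percent(n, d):
    """Whole-percent value, rounding halves up (assumed, not stated in the report)."""
    return math.floor(100 * n / d + 0.5)

rounded = {k: percent(v, denominator) for k, v in counts.items()}
print(rounded)                # {'Met': 87, 'Partially Met': 3, 'Not Met': 11}
print(sum(rounded.values()))  # 101 -- hence the rounding footnote
```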

Page 18: Onsite  Quarterly Meeting SIPP PIPs

Study Implementation Stage

Includes sampling, data collection, and interventions (Activities V through VII)

During this stage, MCOs collect data, evaluate and identify barriers to performance, and develop interventions targeted to improve outcomes

The implementation of effective improvement strategies is necessary to improve PIP outcomes

Page 19: Onsite  Quarterly Meeting SIPP PIPs

Study Implementation Stage Evaluation Elements

Activity V: Sampling

Consider and specify the true or estimated frequency of occurrence

Identify the sample size

Specify the confidence level

Specify the acceptable margin of error

Page 20: Onsite  Quarterly Meeting SIPP PIPs

Study Implementation Stage Evaluation Elements

Activity V: Sampling

Ensure a representative sample of the eligible population

Are in accordance with generally accepted principles of research design and statistical analysis (see the sketch below)
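
The sampling elements above map onto a standard sample-size calculation. A minimal sketch, assuming the usual normal-approximation formula with a finite-population correction; the population size, estimated frequency of occurrence, confidence level, and margin of error below are hypothetical, not values from any SIPP PIP:

```python
import math

# z-scores for common confidence levels
Z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}

def sample_size(population, frequency, confidence, margin):
    """Minimum n for estimating a proportion, with finite-population correction."""
    z = Z[confidence]
    n0 = (z ** 2) * frequency * (1 - frequency) / margin ** 2  # infinite-population estimate
    return math.ceil(n0 / (1 + (n0 - 1) / population))         # correct for a finite eligible population

# Hypothetical example: 600 eligible members, 50% estimated frequency of
# occurrence (the most conservative choice), 95% confidence, +/-5% margin.
print(sample_size(population=600, frequency=0.5, confidence=0.95, margin=0.05))  # 235
```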

Page 21: Onsite  Quarterly Meeting SIPP PIPs

Study Implementation Stage Evaluation Elements

Activity VI: Data Collection

The identification of data elements to be collected

The identification of specified sources of data

A defined and systematic process for collecting baseline and remeasurement data

A timeline for the collection of baseline and remeasurement data

Page 22: Onsite  Quarterly Meeting SIPP PIPs

Study Implementation Stage Evaluation Elements

Activity VI: Data Collection

Qualified staff and personnel to abstract manual data

A manual data collection tool that ensures consistent and accurate collection of data according to indicator specifications

A manual data collection tool that supports interrater reliability (see the sketch below)
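
The slides do not prescribe a statistic for checking interrater reliability; Cohen's kappa is one common choice, shown here only as a hedged illustration. The agreement counts are hypothetical: two abstractors independently score the same records against an indicator.

```python
def cohens_kappa(both_yes, both_no, only_a, only_b):
    """Cohen's kappa from a 2x2 agreement table for two abstractors (A and B)."""
    total = both_yes + both_no + only_a + only_b
    observed = (both_yes + both_no) / total                       # raw agreement rate
    p_yes_a = (both_yes + only_a) / total                         # A's "meets indicator" rate
    p_yes_b = (both_yes + only_b) / total                         # B's "meets indicator" rate
    expected = p_yes_a * p_yes_b + (1 - p_yes_a) * (1 - p_yes_b)  # agreement expected by chance
    return (observed - expected) / (1 - expected)

# Hypothetical re-abstraction of 100 records: 85 agreements, 15 disagreements.
print(round(cohens_kappa(both_yes=40, both_no=45, only_a=8, only_b=7), 2))  # 0.7
```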

Page 23: Onsite  Quarterly Meeting SIPP PIPs

Study Implementation Stage Evaluation Elements

Activity VI: Data Collection

Clear and concise written instructions for completing the manual data collection tool

An overview of the study in written instructions

Page 24: Onsite  Quarterly Meeting SIPP PIPs

Study Implementation Stage Evaluation Elements

Activity VI: Data Collection

Administrative data collection algorithms/flow charts that show activities in the production of indicators

An estimated degree of administrative data completeness

Page 25: Onsite  Quarterly Meeting SIPP PIPs

Study Implementation Stage Evaluation Elements

Activity VII: Interventions

Related to causes/barriers identified through data analysis and quality improvement processes

System changes that are likely to induce permanent change

Revised if the original interventions are not successful

Standardized and monitored if interventions are successful

Page 26: Onsite  Quarterly Meeting SIPP PIPs

SIPP Implementation Stage Results

Study Stage: Implementation

Activity | Met | Partially Met | Not Met
V. Valid Sampling Techniques (if sampling was used) | 86% (6/7) | 0% (0/7) | 14% (1/7)
VI. Accurate/Complete Data Collection* | 58% (159/272) | 13% (36/272) | 28% (77/272)
VII. Appropriate Improvement Strategies | 70% (45/64) | 11% (7/64) | 19% (12/64)
Implementation Total | 61% (210/343) | 13% (43/343) | 26% (90/343)

* The activity or stage total may not equal 100 percent due to rounding.

Page 27: Onsite  Quarterly Meeting SIPP PIPs

Outcomes Stage

The final stage of the PIP process (Activities VIII through X)

Involves data analysis and the evaluation of improvement based on the reported results and statistical testing

Sustained improvement is achieved when outcomes exhibit improvement over multiple measurements

Page 28: Onsite  Quarterly Meeting SIPP PIPs

Outcomes Stage Evaluation Elements

Activity VIII: Data Analysis

Are conducted according to the data analysis plan in the study design

Allow for the generalization of results to the study population if a sample was selected

Identify factors that threaten the internal or external validity of findings

Include an interpretation of findings

Page 29: Onsite  Quarterly Meeting SIPP PIPs

Outcomes Stage Evaluation Elements

Activity VIII: Data Analysis

Are presented in a way that provides accurate, clear, and easily understood information

Identify the initial measurement and the remeasurement of the study indicators

Identify statistical differences between the initial measurement and the remeasurement

Page 30: Onsite  Quarterly Meeting SIPP PIPs

Outcomes Stage Evaluation Elements

Activity VIII: Data Analysis

Identify factors that affect the ability to compare the initial measurement with the remeasurement

Include an interpretation of the extent to which the study was successful

Page 31: Onsite  Quarterly Meeting SIPP PIPs

Outcomes Stage Evaluation Elements

Activity IX: Real Improvement

The remeasurement methodology is the same as the baseline methodology

There is documented improvement in processes or outcomes of care

The improvement appears to be the result of planned intervention(s)

There is statistical evidence that observed improvement is true improvement (see the sketch below)
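
The presentation does not specify which statistical test is applied; for a rate-based indicator, a two-proportion z-test between baseline and remeasurement is a common way to supply that evidence. A minimal sketch with hypothetical counts:

```python
import math

def two_proportion_z(baseline_hits, baseline_n, remeasure_hits, remeasure_n):
    """z statistic for the change in an indicator rate from baseline to remeasurement."""
    p1 = baseline_hits / baseline_n
    p2 = remeasure_hits / remeasure_n
    pooled = (baseline_hits + remeasure_hits) / (baseline_n + remeasure_n)
    se = math.sqrt(pooled * (1 - pooled) * (1 / baseline_n + 1 / remeasure_n))
    return (p2 - p1) / se

# Hypothetical indicator: 120/300 (40.0%) at baseline, 160/310 (51.6%) at remeasurement.
z = two_proportion_z(120, 300, 160, 310)
print(round(z, 2))  # about 2.88; |z| > 1.96 indicates significance at the 0.05 level
```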

Page 32: Onsite  Quarterly Meeting SIPP PIPs

Outcomes Stage Evaluation Elements

Activity X: Sustained Improvement

Repeated measurements over comparable time periods demonstrate sustained improvement or that a decline in improvement is not statistically significant

Page 33: Onsite  Quarterly Meeting SIPP PIPs

SIPPs Outcomes Stage Results

Study Stage: Outcomes

Activity | Met | Partially Met | Not Met
VIII. Sufficient Data Analysis and Interpretation* | 54% (63/116) | 17% (20/116) | 28% (33/116)
IX. Real Improvement Achieved | 54% (26/48) | 21% (10/48) | 25% (12/48)
X. Sustained Improvement Achieved | ‡ | ‡ | ‡
Outcomes Total* | 54% (89/164) | 18% (30/164) | 27% (45/164)

* The activity or stage total may not equal 100 percent due to rounding.
‡ The PIPs did not progress to this phase during the review period and could not be assessed for real or sustained improvement.

Page 34: Onsite  Quarterly Meeting SIPP PIPs

SIPP Indicator Results

There were a total of 44 study indicators

22 were not assessed for improvement

15 demonstrated improvement

Of those that demonstrated improvement, 11 demonstrated statistically significant improvement

Page 35: Onsite  Quarterly Meeting SIPP PIPs

SIPP Indicator Results

SFY 2012 Performance Improvement Project Outcomes for the SIPPs (N=27 PIPs)

The Declined through Not Assessed columns compare study indicator results to those from the prior measurement period.

SIPPs | Total Number of Study Indicators | Declined | Statistically Significant Decline | Improved | Statistically Significant Improvement | Not Assessed | Sustained Improvement¹
Plan A | 3 | 0 | 0 | 0 | 2 | 1 | ‡
Plan B | 3 | 0 | 0 | 0 | 0 | 3 | ‡
Plan C | 3 | 0 | 1 | 0 | 1 | 1 | ‡
Plan D | 3 | 0 | 0 | 2 | 0 | 1 | ‡
Plan E | 3 | 1 | 1 | 0 | 0 | 1 | ‡
Plan F | 2 | 0 | 0 | 0 | 0 | 2 | ‡
Plan G | 3 | 0 | 0 | 0 | 2 | 1 | ‡
Plan H | 3 | 0 | 0 | 0 | 0 | 3 | ‡
Plan I | 3 | 0 | 0 | 0 | 2 | 1 | ‡
Plan J | 3 | 0 | 1 | 0 | 1 | 1 | ‡
Plan K | 3 | 0 | 0 | 1 | 1 | 1 | ‡
Plan L | 3 | 1 | 0 | 1 | 0 | 1 | ‡
Plan M | 4 | 0 | 2 | 0 | 0 | 2 | ‡
Plan N | 5 | 0 | 0 | 0 | 2 | 3 | ‡
Overall Totals | 44 | 2 | 5 | 4 | 11 | 22 | ‡

¹ One or more study indicators demonstrated sustained improvement.
‡ The PIP(s) did not progress to this phase during the review period and/or required an additional measurement period; therefore, sustained improvement could not be assessed.
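
A quick arithmetic check, not part of the report, confirming that the Overall Totals row above matches the indicator summary on the previous slide (44 indicators; 22 not assessed; 15 improved, 11 of them with statistically significant improvement):

```python
# Overall Totals row: Declined, Statistically Significant Decline, Improved,
# Statistically Significant Improvement, Not Assessed
declined, sig_decline, improved, sig_improvement, not_assessed = 2, 5, 4, 11, 22

assert declined + sig_decline + improved + sig_improvement + not_assessed == 44
assert improved + sig_improvement == 15   # "15 demonstrated improvement"
assert sig_improvement == 11              # "11 demonstrated statistically significant improvement"
```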

Page 36: Onsite  Quarterly Meeting SIPP PIPs

Common Areas for Improvement

Activity I: Study Topic

No historical plan-specific data provided to support the selection of the study topic (Evaluation Element #2)

Page 37: Onsite  Quarterly Meeting SIPP PIPs

Common Areas for Improvement

Activity IV: Study Population

Required length of enrollment not specified (Evaluation Element #2)

Page 38: Onsite  Quarterly Meeting SIPP PIPs

Common Areas for Improvement

Activity VI: Data Collection

Timeline for data collection not provided (Evaluation Element #4)

Not all information regarding manual data collection was provided (Evaluation Elements #5-9)

Page 39: Onsite  Quarterly Meeting SIPP PIPs

Common Areas for Improvement

Activity VIII: Data Analysis

Baseline data and data analysis not reported in this year’s submission (Evaluation Elements #1-5)

Page 40: Onsite  Quarterly Meeting SIPP PIPs

Recommendations

Use the PIP Summary Form Completion Instructions when documenting the PIP Summary Form

If you have questions, contact HSAG for technical assistance

Page 41: Onsite  Quarterly Meeting SIPP PIPs

HSAG Contacts

For any PIP questions or to request PIP technical assistance, contact:

Christy [email protected]

Jenny [email protected]

Page 42: Onsite  Quarterly Meeting SIPP PIPs

Questions

