Page 1: Applying the Performance Framework Data Model June 10, 2013.

Applying the Performance Framework Data Model

June 10, 2013

Page 2: Applying the Performance Framework Data Model June 10, 2013.

Performance Framework

• Metadata: lom, EffectiveDate, RetiredDate, Replaces, IsReplacedBy, SupportingInformation
• PerformanceScale: id, LeastCompetentScore, MostCompetentScore
• Component (1+): id, Title, Abbreviation, Competency (reference to a competency object), Author, Reviewer, Background, Resources, References
  – either ComponentReference (0+): id of a nested Component
  – or PerformanceLevelSet (0+), containing:
    • PerformanceScale (reference to a LevelScale id) (1)
    • PerformanceLevel (2+): DisplayOrder, ScoreValue, Label, Background, Note
      – Indicator (1+): id, Description, Background, Note
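Read as a containment diagram, the elements above nest roughly as in this minimal sketch. The class and field names mirror the slide, but the snake_case spelling, the types, and the Python representation itself are illustrative assumptions, not the specification's normative binding.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PerformanceScale:
    id: str
    least_competent_score: int
    most_competent_score: int

@dataclass
class Indicator:                          # 1+ per PerformanceLevel
    id: str
    description: str
    background: Optional[str] = None
    note: Optional[str] = None

@dataclass
class PerformanceLevel:                   # 2+ per PerformanceLevelSet
    display_order: int
    score_value: int
    label: Optional[str] = None
    background: Optional[str] = None
    note: Optional[str] = None
    indicators: List[Indicator] = field(default_factory=list)

@dataclass
class PerformanceLevelSet:                # 0+ per Component
    performance_scale_ref: str            # reference to one PerformanceScale id
    levels: List[PerformanceLevel] = field(default_factory=list)

@dataclass
class Component:                          # 1+ per framework
    id: str
    title: str
    competency: Optional[str] = None      # URI of a competency object
    abbreviation: Optional[str] = None
    author: Optional[str] = None
    reviewer: Optional[str] = None
    background: Optional[str] = None
    # Per the slide, a Component carries either references to nested
    # Components or PerformanceLevelSets.
    component_refs: List[str] = field(default_factory=list)   # ids of nested Components
    level_sets: List[PerformanceLevelSet] = field(default_factory=list)

@dataclass
class PerformanceFramework:
    metadata: dict                        # lom, EffectiveDate, RetiredDate, Replaces, IsReplacedBy, SupportingInformation
    scales: List[PerformanceScale] = field(default_factory=list)
    components: List[Component] = field(default_factory=list)
```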

Page 3: Applying the Performance Framework Data Model June 10, 2013.

Internal Medicine Example

• Title: The Internal Medicine Milestone Project
• Identifier:
  – Catalog: URI
  – Entry: http://www.acgme-nas.org/assets/InternalMedicineMilestones
• Description: The Milestones are designed only for use in evaluation of resident physicians in the context of their participation in ACGME-accredited residency or fellowship programs…
• Contributions:
  – Role: Author; Entity: William Iobst, M.D.
  – Role: Author; Entity: Eva Aagaard, M.D.
• Effective date: 2013-01-30

Page 4: Applying the Performance Framework Data Model June 10, 2013.

Internal Medicine Continued

• Supporting Information
  – This document presents milestones designed for programs to use …
• Performance Scale
  – 1 to 5 (1 least competent, 5 most competent)
  – 1 to 3 (1 least competent, 3 most competent)

Page 5: Applying the Performance Framework Data Model June 10, 2013.

Internal Medicine Component

• ID: 12345
• Competency: http://www.example.org/im_milestones/PC1.xml
• Title: Gathers and synthesizes essential and accurate information to define each patient’s clinical problem(s).

Page 6: Applying the Performance Framework Data Model June 10, 2013.

Performance level set
• Performance Scale Reference: 1 to 5

Position: 1  Score Value: 1  Label: Critical Deficiencies

Indicator (id = i1_1): Does not collect accurate historical data
Indicator (id = i1_2): Does not use physical exam to confirm history
Indicator (id = i1_3): Relies exclusively on documentation of others to generate own database or differential diagnosis
Indicator (id = i1_4): Fails to recognize patient’s central clinical problems
Indicator (id = i1_5): Fails to recognize potentially life-threatening problems

This would allow each indicator to be tracked as on/off.
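As a hypothetical illustration of that on/off tracking, an observation could simply flag each indicator id as met or not met. The observation record below is an assumption made for illustration; only the indicator ids come from the slide.

```python
# Illustrative only: one observation marking each indicator id as observed (True) or not (False).
observation = {
    "component_id": "12345",
    "indicators": {
        "i1_1": True,     # Does not collect accurate historical data
        "i1_2": False,
        "i1_3": False,
        "i1_4": True,
        "i1_5": False,
    },
}

# Indicators flagged "on" for this observation
met = [ind for ind, on in observation["indicators"].items() if on]
print(met)  # ['i1_1', 'i1_4']
```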

Page 7: Applying the Performance Framework Data Model June 10, 2013.

Position: 2  Score Value: 2  Label:

Indicator (id = i2_1): Inconsistently able to acquire accurate historical information in an organized fashion
Indicator (id = i2_2): Does not perform an appropriately thorough physical exam or misses key physical exam findings
Indicator (id = i2_3): Does not seek or is overly reliant on secondary data
Indicator (id = i2_4): Inconsistently recognizes patients’ central clinical problem or develops limited differential diagnoses

Position: 3  Score Value: 3  Label:

Indicator (id = i3_1): Consistently acquires accurate and relevant histories from patients
Indicator (id = i3_2): Seeks and obtains data from secondary sources when needed
Indicator (id = i3_3): Consistently performs accurate and appropriately thorough physical exams
Indicator (id = i3_4): Uses collected data to define a patient’s central clinical problem(s)

Page 8: Applying the Performance Framework Data Model June 10, 2013.

Position: 4  Score Value: 4  Label: Ready for unsupervised practice

Indicator (id = i4_1): Acquires accurate histories from patients in an efficient, prioritized, and hypothesis-driven fashion
Indicator (id = i4_2): Performs accurate physical exams that are targeted to the patient’s complaints
Indicator (id = i4_3): Synthesizes data to generate a prioritized differential diagnosis and problem list
Indicator (id = i4_4): Effectively uses history and physical examination skills to minimize the need for further diagnostic testing

Position: 5  Score Value: 5  Label: Aspirational

Indicator (id = i5_1): Obtains relevant historical subtleties, including sensitive information that informs the differential diagnosis
Indicator (id = i5_2): Identifies subtle or unusual physical exam findings
Indicator (id = i5_3): Efficiently utilizes all sources of secondary data to inform differential diagnosis
Indicator (id = i5_4): Role models and teaches the effective use of history and physical examination skills to minimize the need for further diagnostic testing
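Pulling Pages 5 through 8 together, the PC1 component and its level set could be written down as one nested record. This is a sketch using plain Python dicts with illustrative key names (mirroring the sketch after Page 2), not the specification's exchange format; only two of the five levels are spelled out.

```python
# Illustrative assembly of the Internal Medicine PC1 component from the slides above.
pc1_component = {
    "id": "12345",
    "title": ("Gathers and synthesizes essential and accurate information "
              "to define each patient's clinical problem(s)."),
    "competency": "http://www.example.org/im_milestones/PC1.xml",
    "performance_level_sets": [
        {
            "performance_scale_ref": "1 to 5",
            "levels": [
                {
                    "display_order": 1,
                    "score_value": 1,
                    "label": "Critical Deficiencies",
                    "indicators": [
                        {"id": "i1_1", "description": "Does not collect accurate historical data"},
                        {"id": "i1_2", "description": "Does not use physical exam to confirm history"},
                        # ... i1_3 through i1_5 as listed on Page 6
                    ],
                },
                {
                    "display_order": 4,
                    "score_value": 4,
                    "label": "Ready for unsupervised practice",
                    "indicators": [
                        {"id": "i4_1", "description": ("Acquires accurate histories from patients in an "
                                                       "efficient, prioritized, and hypothesis-driven fashion")},
                        # ... i4_2 through i4_4 as listed on Page 8
                    ],
                },
                # ... levels 2, 3, and 5 as listed on Pages 7 and 8
            ],
        }
    ],
}
```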

Page 9: Applying the Performance Framework Data Model June 10, 2013.

Internal Medicine Component

• Competency: http://www.example.org/im_milestones/PC.xml
• ID: 23456
• Title: Patient Care
• Background: The resident is demonstrating satisfactory development of the knowledge, skill…

Page 10: Applying the Performance Framework Data Model June 10, 2013.

Performance levels
• Performance Scale Reference: 1 to 3

Position: 1 Score Value: 3 Label: Yes

Indicator (id = pc1): The resident is demonstrating satisfactory development of the knowledge, skill, and attitudes/behaviors needed to advance in training. He/she is demonstrating a learning trajectory that anticipates the achievement of competency for unsupervised practice that includes the delivery of safe, timely, equitable, effective and patient-centered care.

Position: 2 Score Value: 1 Label: No

Indicator (id = pc2): The resident is demonstrating unsatisfactory development of the knowledge, skill, and attitudes/behaviors needed to advance in training. He/she is not demonstrating a learning trajectory that anticipates the achievement of competency for unsupervised practice that includes the delivery of safe, timely, equitable, effective and patient-centered care.

Page 11: Applying the Performance Framework Data Model June 10, 2013.

Performance levels

Position: 3 Score Value: 2 Label: Marginal

Indicator (id = pc3): The resident is demonstrating marginal development of the knowledge, skill, and attitudes/behaviors needed to advance in training. He/she is marginally demonstrating a learning trajectory that anticipates the achievement of competency for unsupervised practice that includes the delivery of safe, timely, equitable, effective and patient-centered care.
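Note that in this level set the display position and the score value intentionally differ (Yes scores 3, No scores 1, Marginal scores 2). The sketch below shows one way a consumer might treat the two fields, assuming DisplayOrder drives presentation order and ScoreValue drives interpretation against the 1 to 3 scale; that split is an assumption, not stated in the slides.

```python
# Illustrative: the Patient Care levels from Pages 10-11, keyed by both fields.
levels = [
    {"display_order": 1, "score_value": 3, "label": "Yes"},
    {"display_order": 2, "score_value": 1, "label": "No"},
    {"display_order": 3, "score_value": 2, "label": "Marginal"},
]

# Render levels in the order the author intended...
form_order = [lvl["label"] for lvl in sorted(levels, key=lambda l: l["display_order"])]
print(form_order)   # ['Yes', 'No', 'Marginal']

# ...but interpret a recorded score against the 1 to 3 scale (3 = most competent).
recorded_score = 2
label = next(l["label"] for l in levels if l["score_value"] == recorded_score)
print(label)        # 'Marginal'
```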

Page 12: Applying the Performance Framework Data Model June 10, 2013.

Pediatrics Example

• Title: The Pediatrics Milestone Project
• Identifier:
  – Catalog: URI
  – Entry: http://www.acgme-nas.org/assets/PediatricsMilestones
• Contributions:
  – Role: Author; Entity: Carol Carraccio, M.D.
  – Role: Reviewer; Entity: Richard Antonelli, MD, MS
• Effective date: 2013-01-30

Page 13: Applying the Performance Framework Data Model June 10, 2013.

Pediatrics Continued

• Supporting Information
  – http://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramResources/320_PedsMilestonesProject.pdf
• Performance Scale
  – 1 to 4 (1 least competent, 4 most competent)
  – 1 to 5 (1 least competent, 5 most competent)
  – 1 to 6 (1 least competent, 6 most competent)

Page 14: Applying the Performance Framework Data Model June 10, 2013.

Pediatrics Component

• ID: 12345
• Title: Gather essential and accurate information about the patient
• Competency: http://www.example.org/peds_milestones/PC1.xml
• Author: Daniel Schumacher, MD
• Background: Early Development of Information-Gathering Skills. In the early stages of clinical reasoning, learners must rely upon their knowledge of basic pathophysiology and …

Page 15: Applying the Performance Framework Data Model June 10, 2013.

Pediatrics Component Continued

• References
1. Schmidt HG, Norman GR, Boshuizen HPA. A cognitive perspective on medical expertise: theory and implications. Academic Medicine. 1990;65:611-621.
2. Carraccio CL, Benson BJ, Nixon LJ, Derstine PL. From the educational bench to the clinical bedside: translating the Dreyfus Developmental Model to the learning of clinical skills. Academic Medicine. 2008;83:761-767.
3. Eva K. What every teacher needs to know about clinical reasoning. Medical Education. 2004;39:98-106.
4. Schmidt HG, Boshuizen HPA. On acquiring expertise in medicine. Educational Psychology Review. 1993;5:205-221.
5. Schmidt HG, Rikers RMJP. How expertise develops in medicine: knowledge encapsulation and illness script formation. Medical Education. 2007;41:1133-1139.
6. Charlin B, Boshuizen HPA, Custers EJ, Feltovich PJ. Scripts and clinical reasoning. Medical Education. 2007;41:1178-1184.
7. Patel VL, Groen GJ, Patel YC. Cognitive aspects of clinical performance during patient workup: the role of medical expertise. Advances in Health Sciences Education. 1997;2:95-114.
8. Elstein AS, Kagan N, Shulman LS, et al. Methods and theory in the study of medical inquiry. Journal of Medical Education. 1972;47:85-92.

Page 16: Applying the Performance Framework Data Model June 10, 2013.

Performance level set
• Performance Scale: 1 to 5

Position: 1 Score Value: 1 Label:

Indicator (id = pc_i1): Either gathers too little information or exhaustively gathers information following a template regardless of the patient’s chief complaint, with each piece of information gathered seeming as important as the next. Recalls clinical information in the order elicited,7 with the ability to gather, filter, prioritize, and connect pieces of information being limited by and dependent upon analytic reasoning through basic pathophysiology alone.

Position: 2 Score Value: 2 Label:

Indicator (id = pc_i2): Clinical experience allows linkage of signs and symptoms of a current patient to those encountered in previous patients. Still relies primarily on analytic reasoning through basic pathophysiology to gather information, but the ability to link current findings to prior clinical encounters allows information to be filtered, prioritized, and synthesized into pertinent positives and negatives as well as broad diagnostic categories.

(Slide callouts note that the citation embedded in the indicator text is a free-text reference, with no formatting.)

Page 17: Applying the Performance Framework Data Model June 10, 2013.

Position: 3 Score Value: 3 Label:

Indicator (id = pc_i3): Advanced development of pattern recognition leads to the creation of illness scripts, which allow information to be gathered while it is simultaneously filtered, prioritized, and synthesized into specific diagnostic considerations. Data gathering is driven by real-time development of a differential diagnosis early in the information-gathering process.8

Position: 4 Score Value: 4 Label:

Indicator (id = pc_i4): Well-developed illness scripts allow essential and accurate information to be gathered and precise diagnoses to be reached with ease and efficiency when presented with most pediatric problems, but still relies on analytic reasoning through basic pathophysiology to gather information when presented with complex or uncommon problems.

Position: 5 Score Value: 5 Label:

Indicator (id = pc_i5): Robust illness scripts and instance scripts (where the specific features of individual patients are remembered and used in future clinical reasoning) lead to unconscious gathering of essential and accurate information in a targeted and efficient manner when presented with all but the most complex or rare clinical problems. These illness and instance scripts are robust enough to enable discrimination among diagnoses with subtle distinguishing features.

Page 18: Applying the Performance Framework Data Model June 10, 2013.

Pediatrics Component

• ID: 12345
• Title: Prescribe and perform all medical procedures
• Competency: http://www.example.org/peds_milestones/PC8.xml
• Author: Patricia Hicks, MD

Page 19: Applying the Performance Framework Data Model June 10, 2013.

Background

• All of the competencies are involved in prescribing and performing medical procedures. In an integrated …

Page 20: Applying the Performance Framework Data Model June 10, 2013.

Background Continued

The component KSAs of each procedure are numerous and complex. They include:
• Anatomy and Physiology
• Indications and Benefits
• Contra-indications and Risks
• Informed Consent
• Pain Management, Patient Psychological Preparation
• Specimen Handling
• Interpretation of Results or Outcomes
• Procedural Technique (multiple elements unique to each procedure; common elements to all [e.g., sterile technique, situational awareness, course correction])
• Post-procedure Management

This approach to assessment makes some assumptions:
• Performance level is specific to each procedure based on the relevant components and level of responsibility of the physician.
• Given the variability of required components, measures of competence are based on all of the relevant components for that procedure.
• Performance level for a given procedure, therefore, requires reaching the desired performance level for each of the individual components.

Page 21: Applying the Performance Framework Data Model June 10, 2013.

References
1. Wigton R, Nicolas J, Blank L. Procedural skills of the general internist: a survey of 2500 physicians. Annals of Internal Medicine. 1989;111:1023-1034.
2. Wigton R, Blank L, Nicolas J, Tape T. Procedural skills training in internal medicine residencies. Annals of Internal Medicine. 1989;111:932-938.
3. Wigton R. Training internists in procedural skills. Annals of Internal Medicine. 1992;116:1091-1093.
4. Hicks C, Gonzales R, Morton M, et al. Procedural experience and comfort level in internal medicine trainees. General Internal Medicine. 2000;15:716-722.
5. Kirkpatrick DL. Evaluating Training Programs: The Four Levels. San Francisco, CA: Berrett-Koehler Publishers; 1998.
6. Kirkpatrick L, Kirkpatrick JD. The four levels: an overview. In: Kirkpatrick DL, ed. Evaluating Training Programs: The Four Levels. San Francisco, CA: Berrett-Koehler Publishers; 2006:21-26.
7. Davis D. Accuracy of physician self-assessment compared with observed measures of competence. Journal of the American Medical Association. 2006;296:1094-1102.
8. Carbine D, Finer N, Knodel E, Rich W. Video recording as a means of evaluating neonatal resuscitation performance. Pediatrics. 2000;106:654-658.
9. Adams K, Scott R, Perkin R, Langga L. Comparison of intubation skills between interfacility transport team members. Pediatric Emergency Care. 2000;16:5-8.
10. Falck A, Escobedo M, Baillargeon J, et al. Proficiency of pediatric residents in performing neonatal endotracheal intubation. Pediatrics. 2003;112:1242-1247.
11. Colliver J, Vu N, Barrows H. Screening test length for sequential testing with a standardized-patient examination: a receiver operating characteristic (ROC) analysis. Academic Medicine. 1992;67:592-595.
12. Jones D, McGuinness G. The future for pediatric residency education: the prescription for more flexibility. Journal of Pediatrics. 2009;154:157-158.
13. American Board of Internal Medicine. Internal Medicine Policies. http://www.abim.org/certification/policies/imss/im.aspx?print#procedures. Accessed December 12, 2011.
14. Kovacs G, Bullock G, Ackroyd-Stolarz S, et al. A randomized controlled trial on the effect of educational interventions in promoting airway management skill maintenance. Annals of Emergency Medicine. 2000;36:301-309.
15. Wickstrom GC, Kelley DK, Keyserling TC, et al. Confidence of academic generalist internists and family physicians to teach ambulatory procedures. Journal of General Internal Medicine. 2000;15:353-360.
16. Beauchamp TL, Childress JF. Principles of Biomedical Ethics. 5th ed. New York, NY: Oxford University Press; 2001.
17. Sarker S, Chang A, Albrani T, Vincent C. Constructing hierarchical task analysis in surgery. Surgical Endoscopy. 2008;22:107-111.
18. Shepherd A. HTA as a framework for task analysis. Ergonomics. 1998;41:1537-1552.

Page 22: Applying the Performance Framework Data Model June 10, 2013.

Nested Component

• ID: 98765
• Competency: http://www.example.org/peds_milestones/anatomy_physiology.xml
• Title: Anatomy and Physiology

Page 23: Applying the Performance Framework Data Model June 10, 2013.

Performance levels
• Performance Scale Reference: 1 to 4

Position: 1 Score Value: 1 Label: Beginning of Spectrum

Indicator (id = pcap_i1): 2 SD below mean on knowledge test

Position: 2 Score Value: 2 Label:

Indicator (id = pcap_i2): 1 SD below mean on knowledge test

Position: 3 Score Value: 3 Label:

Indicator (id = pcap_i3): 1 SD above mean on knowledge test

Position: 4 Score Value: 4 Label:

Indicator (id = pcap_i4): 2 SD above mean on knowledge test

Page 24: Applying the Performance Framework Data Model June 10, 2013.

Nursing (DNP Eval) Example

• Title: University of San Diego Hahn School of Nursing and Health Science DNPC 630 Residency DNP NP Student Evaluation

• Identifier:
  – Catalog: URI
  – Entry: http://www.sandiego.edu/nursing/DNPC_630_Eval
• Performance Scale
  – 1 to 3 (1 least competent, 3 most competent)

Page 25: Applying the Performance Framework Data Model June 10, 2013.

DNP Component

• Competency: http://www.example.org/dnp630_3
• ID: 5678
• Title: Prepared to practice independently managing previously diagnosed and undiagnosed patients.

Page 26: Applying the Performance Framework Data Model June 10, 2013.

Performance Levels
• Performance Scale Reference: 1 to 3

Position: 1 Score Value: 3 Label:

Indicator (id = pi_i1): Met

Position: 2 Score Value: 2 Label:

Indicator (id = pi_i2): In progress

Position: 3 Score Value: 1 Label:

Indicator (id = pi_i3): Not Met

Page 27: Applying the Performance Framework Data Model June 10, 2013.

National University of Singapore Example

• Title: Standards of Achievement
• Identifier:
  – Catalog: URI
  – Entry: http://www.nus.edu.sg/standards2013
• Performance Scale
  – 1 to 5 (1 least competent, 5 most competent)

Page 28: Applying the Performance Framework Data Model June 10, 2013.

NUS Component

• ID: 2013_1
• Title: Standards of Achievement

Page 29: Applying the Performance Framework Data Model June 10, 2013.

Performance Levels
• Performance Scale Reference: 1 to 5

Position: 1 Score Value: 1 Label:

Indicator (id = i1): Unable to achieve outcome.

Position: 2 Score Value: 2 Label:

Indicator (id = i2): Requires a lot of guidance to achieve outcome.

Position: 3 Score Value: 3 Label:

Indicator (id = i3): Requires moderate amount of guidance to achieve outcome.

Page 30: Applying the Performance Framework Data Model June 10, 2013.

Performance Levels

Position: 4 Score Value: 4 Label:

Indicator (id = i4): Able to achieve outcome with little or no guidance (entrustment)

Position: 5 Score Value: 5 Label:

Indicator (id = i5): Has the ability to guide/teach others.

Page 31: Applying the Performance Framework Data Model June 10, 2013.

How are use cases addressed?

• Map performance frameworks to competencies, including Entrustable Professional Activities
  – YES (Competency Reference)
• Publish a performance framework for use in curriculum planning and assessment
  – YES (requires IDs for Components so that assessments may be tied to performance levels)
• Import descriptions of performance levels for use in assessment
  – YES
• Reference a Performance Framework for use in assessment
  – YES (requires IDs for Components so that assessments may be tied to performance levels)
• Describe an individual’s current level of performance for purposes of formative or summative assessment
  – YES (a score and a reference to a Component should do the trick; see the sketch after this list)
• Describe an individual’s level of performance over a longitudinal period for purposes of formative or summative assessment
  – YES (provided the framework is authored to support that)
• Associate assessment evidence with a particular level of performance in a portfolio
  – Do we want to associate it with a level of performance, or do we want assessment evidence to have a score that is interpreted using a particular group of performance levels defined for the relevant competency?
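For the "score and a reference to a Component" answer above, a minimal record might look like the following sketch. All field names here are hypothetical illustrations, not taken from the specification.

```python
# Illustrative only: an individual's current performance on one framework Component.
performance_record = {
    "learner_id": "resident-042",       # hypothetical identifier
    "component_id": "12345",            # the framework Component being assessed
    "performance_scale_ref": "1 to 5",
    "score_value": 3,                   # interpreted against the referenced level set
    "observed_on": "2013-06-10",
}
```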

Page 32: Applying the Performance Framework Data Model June 10, 2013.

How are use cases addressed?

• Define where in a curriculum students are expected to achieve certain levels of performance
  – YES (will require updates to curriculum inventory, may require id on performance level)
• Define what level of performance is required to progress to the next phase or block within a curriculum
  – YES (will require updates to curriculum inventory, may require id on performance level)
• Use in a system that is capable of showing a learner changes in performance over time
  – YES
• View data regarding the performance levels of learners in a program for purposes of program evaluation (external)
  – YES (would require a spec that allows for the exchange of aggregate data)
• View data regarding the performance levels of learners in a program for purposes of program evaluation (internal)
  – YES
• Publish program data for public viewing
  – YES (would require a spec that allows for the exchange of aggregate data)

