Training the OSCE Examiners
Katharine Boursicot, Trudie Roberts
Programme
• Principles of OSCEs for examiners
• Video marking
• Marking live stations
• Strategies for enhancing examiner participation in training
Academic principles of OSCEs
• The basics: What is an OSCE?
• More academic detail: Why use OSCEs?
• The role of examiners: Examiners in OSCEs
The basics
• For examiners who don’t know about OSCEs
• A brief reminder for those who are familiar with OSCEs
What is an OSCE?
• Objective
• Structured
• Clinical
• Examination
OSCE test design
[Diagram: candidates rotate around a circuit of stations]
OSCEs - Objective
• All the candidates are presented with the same test
OSCEs - Structured
• The marking scheme for each station is structured
• Specific skill modalities are tested at each station:
  • History taking
  • Explanation
  • Clinical examination
  • Procedures
OSCEs – Clinical Examination
• Test of performance of clinical skills, not a test of knowledge
• The candidates have to demonstrate their skills
More academic detail
• Why use OSCEs in clinical assessment?
• Improved reliability
• Fairer test of candidates' clinical abilities
Why use OSCEs in clinical assessment?
• Careful specification of content
• Observation of a wide sample of activities
• Structured interaction between examiner and student
• Structured marking schedule
• Each student has to perform the same tasks
Characteristics of assessment instruments
• Utility = reliability × validity × educational impact × acceptability × feasibility

Reference: Van der Vleuten C. The assessment of professional competence: developments, research and practical implications. Advances in Health Sciences Education 1996; 1: 41-67.
Test characteristics
• Reliability of a test/measure:
  • reproducibility of scores across raters, questions, cases, occasions
  • capability of differentiating consistently between good and poor students
Sampling
[Diagram: multiple test samples drawn from the domain of interest]
Reliability
• Competencies are highly domain-specific
• Broad sampling is required to obtain adequate reliability:
  • across content, i.e. a range of cases/situations
  • across other potential sources of error variance, e.g. testing time, examiners, patients, settings, facilities
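The link between broad sampling and reliability can be illustrated with the Spearman–Brown prophecy formula, which predicts the reliability of a test lengthened to n parallel stations from the reliability of a single station. This is a standard psychometric formula, not from the slides; the numbers below are illustrative only.

```python
def spearman_brown(single_station_reliability: float, n_stations: int) -> float:
    """Predicted reliability of a test made up of n parallel stations.

    Spearman-Brown: R_n = n*r / (1 + (n-1)*r), where r is the
    reliability of a single station.
    """
    r = single_station_reliability
    return n_stations * r / (1 + (n_stations - 1) * r)

# With an (illustrative) single-station reliability of 0.25,
# reliability climbs from 0.25 towards ~0.87 as stations are added.
for n in (1, 5, 10, 20):
    print(f"{n:2d} stations: reliability {spearman_brown(0.25, n):.2f}")
```

This is why a long OSCE circuit with many short stations is preferred over a few long cases: adding stations is the main lever for reducing case-specific error variance.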
OSCE: blueprint

System    History         Explanation               Examination      Procedure
CVS       Chest pain      Disch drugs               Cardiac          BP
RS        Haemoptysis     Smoking                   Resp             Peak flow
GIS       Abdo pain       Gastroscopy               Abdo             PR
Repro     Amenorrhoea     Abnormal smear                             Cx smear
NS        Headache                                  Eyes             Ophthalmoscopy
MS        Backache                                  Hip
Generic   Pre-op assess   Consent for post mortem                    IV cannulation, Blood trans reaction
Test characteristics
• Validity of a test/measure:
  • the test measures the characteristic (e.g. knowledge, skills) that it is intended to measure
Model of competence (Miller's pyramid)
[Diagram: pyramid from base to apex: Knows, Knows how, Shows how, Does; the lower levels represent cognition (knowledge), the upper levels behaviour (skills/attitudes); professional authenticity increases towards the apex]

Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63-S67.
Validity of testing formats
• Does: professional practice assessment
• Shows how: performance assessment (OSCEs, long/short cases, OSLERs, etc.)
• Knows how: problem-solving assessment (EMQs, SEQs)
• Knows: knowledge assessment (MCQs)
Test characteristics: Educational impact
[Diagram: relationship between assessment and learning, linking teacher, curriculum, student and assessment]
Test characteristics
• Feasibility:
  • cost
  • human resources
  • physical resources
Test characteristics
• Acceptability:
  • tolerable effort
  • reasonable cost
• Acceptability to:
  • doctors
  • licensing bodies
  • employers
  • patients/consumer groups
  • students
  • faculty
The role of examiners in OSCEs
• General
• Types of stations
• Standard setting
• Practice at marking
The role of examiners in OSCEs
• To observe the performance of the student at a particular task
• To score according to the marking schedule
• To contribute to the good conduct of the examination
The role of examiners in OSCEs
• It is NOT to:
  • conduct a viva voce
  • re-write the station
  • interfere with the simulated patient's role
  • design their own marking scheme
  • teach
Types of OSCE stations
• History taking
• Explanation
• Clinical examination
• Procedures
Communication skills
• Stations involving patients, simulated patients or volunteers
• Content vs process, i.e. what the candidate says vs how the candidate says it
Clinical skills
• Stations with people: professional behaviour is expected
• Stations with manikins: candidates describe their actions to the examiner
The examiner’s role in standard setting
• Use your clinical expertise to judge the candidate’s performance
• Allocate a global judgement on the candidate’s performance at that station
• Remember the level of the examination
Global scoring
Clear fail
Borderline
Clear pass
Very good pass
Excellent pass
Borderline method
[Diagram: station checklist (items 1-7, TOTAL) scored alongside a global rating of Pass / Borderline / Fail (P/B/F); the passing score is derived from the score distribution of the borderline group within the overall test score distribution]
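The borderline method can be sketched in a few lines: the examiner records a checklist score and a separate P/B/F global rating for each candidate, and the station pass mark is taken as the mean checklist score of the candidates rated Borderline. The data and function name below are illustrative, not from the slides.

```python
from statistics import mean

# Each candidate: (checklist_score, global_rating) -- illustrative data.
results = [
    (12, "Pass"), (8, "Borderline"), (15, "Pass"),
    (6, "Fail"), (9, "Borderline"), (10, "Borderline"),
    (14, "Pass"), (5, "Fail"),
]

def borderline_group_pass_mark(results):
    """Pass mark = mean checklist score of the Borderline-rated group."""
    borderline_scores = [score for score, rating in results if rating == "Borderline"]
    return mean(borderline_scores)

print(borderline_group_pass_mark(results))  # → 9
```

In practice the borderline scores are pooled across all examiners and circuits for the station before the mean is taken, so the standard reflects the whole borderline group rather than one examiner's judgements.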
Regression based standard
[Diagram: station checklist (items 1-7, TOTAL) scored alongside an overall rating of 1-5, where 1 = Clear fail, 2 = Borderline, 3 = Clear pass, 4 = Very good pass, 5 = Excellent pass; checklist score is plotted against overall rating, and the passing score X is read off the regression line at the Borderline rating]
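The regression-based standard can be sketched as an ordinary least-squares fit of checklist score on the 1-5 global rating, with the pass mark read off the fitted line at the Borderline rating of 2. This is a minimal hand-rolled sketch with illustrative data, not the authors' implementation.

```python
# Illustrative data: each candidate's global rating (1=Clear fail ... 5=Excellent
# pass) and corresponding checklist score at one station.
ratings = [1, 2, 2, 3, 3, 4, 4, 5]
scores = [4, 7, 8, 11, 12, 14, 15, 18]

# Ordinary least-squares line: score = intercept + slope * rating.
n = len(ratings)
mean_x = sum(ratings) / n
mean_y = sum(scores) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(ratings, scores))
         / sum((x - mean_x) ** 2 for x in ratings))
intercept = mean_y - slope * mean_x

BORDERLINE = 2  # the rating at which the pass mark is read off
pass_mark = intercept + slope * BORDERLINE
print(pass_mark)  # → 7.625
```

Compared with the borderline method, the regression approach uses every candidate's rating (not just the borderline group), which makes the standard more stable when few candidates are rated Borderline.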
Practice at marking
• Videos
• Live stations
• Mini-OSCE
Strategies for enhancing examiner participation
1. CME
2. Job plan / part of contract
3. Specific allocation of SIFT
4. Experience for post-graduate examinations
5. Payment