How Do We Measure Student Achievement During Placements?
Margaret Fisher, Ceppl Activity Lead / Senior Lecturer in Midwifery / Academic Lead, Placement Development Team,
University of Plymouth
www.placementlearning.org
Email: mfisher@plymouth.ac.uk
2nd DIETS Conference, 26/9/08
Introduction and overview
• The importance of placements and practice assessment
• Evidence from the literature
• Ceppl project: Assessment of Practice
• Application in Midwifery – an electronic portfolio
• Summary and questions
The importance of placements and
practice assessment
• Real-life exposure – environment and role models
• Enables assessment of practice skills
• Variety of placements
• Methods of assessment need to be valid, reliable and appropriate
Evidence from the literature
• Assessment of practice is crucial in determining whether or not a student meets the criteria required of their profession, thus ensuring safety of the public
UKCC 1999, Watkins 2000, Cowburn et al 2000
• Defining competence has long been a challenge
Cowan et al 2005
• Efforts to ‘measure’ competence and professional abilities have resulted in a wide variety of methods of assessment
Baume and Yorke 2002, McMullan et al 2003
• Unless outcomes are clear, the result may be that the student focuses too heavily on completing the portfolio [or other tool] rather than learning from the experience itself
Scholes et al 2004
• Reflections on practice may form part of portfolio assessments, and this process may also contribute to the student’s learning
Mountford and Rogers 1996
So:
• clear purpose and outcomes
• effective and objective measurement of competence
• contribution of the assessment process to students’ learning
are important factors to consider
Ceppl project: Assessment of Practice
• Longitudinal case studies
• Staff focus groups
• Literature search
• Trawl of websites
• Conference networking
Aim
To establish an evidence-based set of key principles and
resources to guide
Assessment of Practice,
relevant across professional boundaries.
Research Questions
1. What are perceptions of validity and reliability of the practice assessment methods used?
2. What are perceptions of the impact of the practice assessment process on the student learning experience?
Methodology
• 14 participants from Midwifery, Social Work and Emergency Care (nurses and paramedics) programmes
• Semi-structured interviews at the end of each year
• Longitudinal case study approach
• Single-case and cross-case analysis and synthesis of findings – “Framework technique”
Ritchie and Spencer 1984
Key themes
[Diagram of key themes: PURPOSE, PROCESS and GUIDANCE, linked to People, Consistency, Timing, Clarity, Placements, Paperwork, “Becoming a professional” and “Doing the job”]
Methods used
1. Portfolios
2. Reflections
3. Tripartites/ 3-way meetings
4. Criterion referenced assessments
5. Conversations
6. Observations
7. OSCEs
1. Portfolios
Provide focus
Evidence of capability/achievement
Encourage students as they see their progress
Self-directed
Motivate learning
× Prescriptive/restrictive (“tick boxes”)
× Weighting of marks unbalanced/difficult to assess
× Potential to “cheat the system”
× Bulk (paper format)
× Heavy workload
2. Reflections
Aid and extend learning
Enable development and growth
May be reliable
× Do not always reflect the reality of practice
× Potential to “blur the edges”
× Don’t necessarily gain from “ticking the boxes”
× May be unreliable
3. Tripartites/3-way meetings
Useful checkpoint
Opportunity to reflect on progress and learning
Opportunity to get feedback from mentor and tutor
Enable clarification of issues
Student-centred
Reliable if student and mentor have worked closely together
× Difficult to arrange
× May be challenging to express conflicting opinions
× Likened to a “parent’s evening”
× Some students thought mentor and tutor should also have a private discussion
4. Criterion referenced assessment
Focused learning
Best if continuous assessment
Mostly valid, reliable and achievable
× Criteria not always relevant to placement
× Some criteria ambiguous/ overly complex/ unclear
× Dependent on professional judgement and experience of mentor
5. Conversations
Useful feedback
Demonstrate communication skills
× Difficult to organise
× Caused anxiety
6. Observations
Benefit from feedback from different people
Assess attitudes to service-users
Valid and reliable
× Did not always reflect real practice
× Difficult to arrange/ heavy workload
× Restrictive
× Inconsistency of assessors
× Would prefer to be shadowed for a day
7. OSCEs (Objective Structured Clinical Examinations)
Reflect real practice
Provide focus
Consistent
Enjoyable
Well prepared
Put students’ knowledge to use
Huge impact on learning
Useful/best way of assessing practice
× Pressurised/ stressful
× False environment
× Not holistic
Application in Midwifery – an electronic portfolio
• Portfolio work-party
• Decision to develop a part-paper (summative) and part-electronic (formative/evidence of learning) portfolio
• E-portfolio developed using wikis
• Pilot study
• Demonstration
Key findings from the pilot
• Guidelines: very positive evaluation by all, but face-to-face explanation recommended in addition
• Hyperlinks: logical system; tricky to begin with but became easier with use; particularly useful when making external links (eg: to national guidelines)
“Hyperlinks are good as it shows evidence of learning” (S)
• Uncertain how readily accessible in clinical area
• Students liked the fact that the personal tutor would have access and provide formative feedback
• Variety of learning styles and IT skills amongst student respondents but this did not appear to affect whether or not students were able to cope with the new format
Summary
• Clear understanding of PURPOSE
• Clear, consistent and timely PREPARATION
• Optimise the PROCESS – people, placement, paperwork
→ Safe, competent PRACTITIONERS who have achieved PERSONAL and PROFESSIONAL GROWTH