MIXED METHODS AND THE CREDIBILITY OF EVIDENCE IN EVALUATION: LEARNINGS FROM PHYSICIAN ASSISTANT DEMONSTRATIONS
Synergia Ltd, 6th June 2015
Dr Sarah Appleton
Dr Adrian Field
This presentation
• Identifies the context of the evaluation
• Summarises our approach to evaluating the physician assistant demonstrations
• Presents learnings from applying a mixed methods approach
• Identifies key considerations for evaluation practice and theory
EVALUATING HEALTH WORKFORCE DEMONSTRATIONS
Use of demonstrations in the health sector
• Demonstration – a practical exhibition and explanation of how something works or is performed
• Not an RCT or pilot study, but an "assessment of workforce changes in specific settings" (Health Workforce New Zealand, 2012)
• Time limited
• Smaller samples
Challenges of demonstrations
• Generalisability
• Attribution
• Adjustment and learning
• Diversity of contexts and approaches
Context of evidence in health sector
• Power of positivist constructs – hierarchy of methods
THE PHYSICIAN ASSISTANT DEMONSTRATIONS
Background
• Extension of the physician role, working under supervision
• Four demonstration sites trialling the role of Physician Assistant in primary health care and rural ED settings
• Established role in the US, emerging elsewhere (UK, Australia, Netherlands)
• Assess the value and contribution of the PA role to the health workforce, in the sites and the implications more widely
• Mixed methods approach
Rationale and key drivers for demonstrations
Site and PA selection
• Mix of settings: urban, rural, Māori and Pacific populations, SES, primary care, ED, IFHC
• Intended for PAs to have 2–3 years' experience, with leadership qualities and a good fit for the settings

For demonstration sites
• Improving patient throughput
• Cost-effective solutions
• Improved continuity (vs locums)
• Trialling new models of working
• Addressing workforce shortages

For HWNZ
• Medical workforce supply, particularly distributional (e.g. rural, high-deprivation areas)
• Fiscally sustainable solutions
Site settings and roles
Radius Group (Hamilton)
• 3 PAs at 3 sites
• Mixture of high-needs populations (Davies Corner and K'Aute) and more affluent (Rototuna)
• Mix of patient types, some focus on acute
• Mix of fee-paying/non-fee-paying (K'Aute)

Tokoroa Medical Centre
• 2 PAs
• Acute and women's health; fewer long-term conditions patients
• Transition in 2014 to an integrated model with 2 other practices and an NP

Gore Hospital
• 1 PA
• Key point of contact in ED
• Limited ward work (follow-up)

Commonalities
• Largely drop-in clinics; relatively few appointments (exception: Tokoroa Medical Centre)
• Extension of the physician role
• Tend to focus on acute rather than long-term conditions
A contested terrain
• PA as a disruptive innovation
• Established professional practices and boundaries
• Cultural fit
• Regulatory and educational frameworks

Paolo Uccello, The Battle of San Romano, 15th century
Contested terrain
“The physician assistant trial is probably the best example, creating a minor storm when two US-trained physician assistants were employed at Middlemore Hospital. Nurses were upset they weren't considered for the role; junior doctors worried they would be sidelined somehow. Some groups wanted the trial halted, others had reservations about introducing yet another entity into the health workforce. And, while the trial has so far been a success, scepticism remains”
(New Zealand Doctor, June 2011)
OUR EVALUATION APPROACH
Role introduction theory
Rummler G, Brache A. 1995. Improving Performance: How to Manage the White Space in the Organization Chart. San Francisco: Jossey Bass.
[Diagram: role introduction model for frontline staff, after Rummler and Brache – input, output, and consequences, shaped by feasibility, clarity, knowledge/skills, and feedback]
Realistic evaluation
Mixed methods: The idea
“Doing our work better, generating understandings that are broader, deeper, more inclusive and that more centrally honour the complexity and contingency of human phenomenon”
(Greene, 2007, p. 98).
Evaluation questions and data domains
1. How have PAs integrated with practice activities and service models?
2. What was the impact and contributory value of the PA role for patient outcomes, service quality and business models at the demonstration sites; within this, have the PAs extended or changed the practice model?
3. What factors supported or challenged the integration of the PA role into local practices and with specific professional groups?
4. What are the implications and/or risks for the fit and applicability of the PA role within New Zealand, arising from the evaluation findings?
5. What issues arise from the demonstrations for the potential establishment, transferability and sustainability of the PA role in New Zealand?
Data domains
• Patient experience and impact
• Clinical contribution
• Workforce impact
• PA integration and development
• Financial and business impact
• Contextual contributors
KEY LEARNINGS FROM OUR APPROACH
Data collection opportunities and challenges
Data domains
• Patient experience and impact
• Clinical contribution
• Workforce impact
• PA integration and development
• Financial and business impact
• Contextual contributors

Data sources
• Patient management system
• Clinical notes
• Administration data
• Stakeholder interviews
• Staff/patient surveys
Credibility of quantitative data
• Perceived limitations:
– Proxy indicators of change
– Small sample size
• Credibility enhanced by:
– Support from qualitative data
– Depth of insight
Credibility of qualitative data
Moving beyond methods:

Stakeholder: "It's only anecdotal…"

Evaluator: "But these findings are also reflected in the quantitative data."
VALUE OF A MIXED METHODS APPROACH
Rapid reflections:

WHAT VALUE DO YOU SEE IN A MIXED METHODS APPROACH?
Value of mixed methods in the PA evaluation
Strengths
• Comprehensive
• Multiple sources
• High engagement
• Integration with service data
• Insights for other settings

Limitations
• Coverage
• Some source data
• Linking databases
• Pre- and post
• Contexts
Our thoughts on value
• Interpreting outcomes in context
• Understanding the typical and the unique case
• Comprehensive insight
Sharing more and hearing more
• Multiple perspectives of credible evidence
• Inclusive and respectful of different ways of knowing and valuing
Key considerations for practice
• What do you value?
• What do you know?
• When and how will you deploy methods?
• Time and resources
• Limited guidance on write-up and analysis
• Maintaining the focus of the evaluation
Fit for purpose
“Premise is that using multiple and diverse methods is a good idea, but is not automatically good science. Rather, just as survey research, quasi-experimentation, panel studies, and case studies require careful planning and thoughtful decisions, so do mixed method studies.”
Mertens (2013). Mixed Methods and Credibility of Evidence in Evaluation. New Directions for Evaluation, No. 138.