
MIXED METHODS AND THE CREDIBILITY OF EVIDENCE IN EVALUATION:
LEARNINGS FROM PHYSICIAN ASSISTANT DEMONSTRATIONS
Dr Sarah Appleton
Dr Adrian Field
Synergia Ltd
6th June 2015
This presentation
• Identifies the context of the evaluation
• Summarises our approach to evaluating the physician assistant
demonstrations
• Presents learnings from applying a mixed methods approach
• Identifies key considerations for evaluation practice and theory
EVALUATING HEALTH WORKFORCE DEMONSTRATIONS
Use of demonstrations in the health sector
• Demonstration
– A practical exhibition and explanation of how something works or is
performed
• Not an RCT or pilot study but an:
– “assessment of workforce changes in specific settings” (Health Workforce New
Zealand, 2012)
• Time limited
• Smaller samples
Challenges of demonstrations
• Generalisability
• Attribution
• Adjustment and learning
• Diversity of contexts and approaches
Context of evidence in the health sector
• Power of positivist constructs – hierarchy of methods
THE PHYSICIAN ASSISTANT DEMONSTRATIONS
Background
• Extension of physician role, working under supervision
• Four demonstration sites trialling role of Physician Assistant in primary
health care and rural ED settings
• Established role in US, emerging elsewhere (UK, Australia, Netherlands)
• Assess the value and contribution of the PA role to the health workforce, both within the sites and for its wider implications
• Mixed methods approach
Rationale and key drivers for demonstrations

For HWNZ:
• Medical workforce supply, particularly distributional (e.g. rural, high-deprivation areas)
• Fiscally sustainable solutions

For demonstration sites:
• Improving patient throughput
• Cost-effective solutions
• Improving continuity (vs locums)
• Trialling new models of working
• Addressing workforce shortages

Site and PA selection
• Mix of settings: urban, rural, Māori and Pacific populations, SES, primary care, ED, IFHC
• PAs intended to have 2–3 years' experience, leadership qualities and a good fit for their settings
Site settings and roles
Radius Group (Hamilton)
• 3 PAs at 3 sites
• Mixture of high needs populations (Davies Corner and K’Aute) and more affluent (Rototuna)
• Mix of patient types, some focus on acute
• Mix of fee paying/non-fee paying (K’Aute)
Tokoroa Medical Centre
• 2 PAs
• Acute and women’s health; fewer long-term conditions patients
• Transition in 2014 to integrated model with 2 other practices and NP
Gore Hospital
• 1 PA
• Key point of contact in ED
• Limited ward work (follow-up)
Commonalities
• Largely drop-in clinics; relatively few appointments (exception Tokoroa Medical Centre)
• Extension of physician role
• Tend to focus on acute rather than long-term conditions
A contested terrain
• PA as a disruptive innovation
• Established professional practices and boundaries
• Cultural fit
• Regulatory and educational frameworks
[Image: Paolo Uccello, The Battle of San Romano, 15th century]
Contested terrain
“The physician assistant trial is probably the best example,
creating a minor storm when two US-trained physician assistants
were employed at Middlemore Hospital. Nurses were upset they
weren't considered for the role; junior doctors worried they
would be sidelined somehow. Some groups wanted the trial
halted, others had reservations about introducing yet another
entity into the health workforce. And, while the trial has so far
been a success, scepticism remains”
(New Zealand Doctor, June 2011)
OUR EVALUATION APPROACH
Role introduction theory
[Diagram: role introduction theory, showing inputs, knowledge/skills, feasibility, clarity, outputs, consequences and feedback loops for frontline staff]
Rummler G, Brache A. 1995. Improving Performance: How to Manage the White Space in the Organization Chart. San Francisco: Jossey-Bass.
Realistic evaluation
Mixed methods: The idea
“Doing our work better, generating understandings that are broader, deeper, more inclusive and that more centrally honour the complexity and contingency of human phenomena”
(Greene, 2007, p. 98).
Evaluation questions and data domains

Evaluation questions
1. How have PAs integrated with practice activities and service models?
2. What was the impact and contributory value of the PA role for patient outcomes, service quality and business models at the demonstration sites; within this, have the PAs extended or changed the practice model?
3. What factors supported or challenged the integration of the PA role into local practices and with specific professional groups?
4. What are the implications and/or risks for the fit and applicability of the PA role within New Zealand, arising from the evaluation findings?
5. What issues arise from the demonstrations for the potential establishment, transferability and sustainability of the PA role in New Zealand?

Data domains
• Patient experience and impact
• Clinical contribution
• Workforce impact
• PA integration and development
• Financial and business impact
• Contextual contributors
KEY LEARNINGS FROM OUR APPROACH
Data collection opportunities and challenges
Data sources
• Patient management system
• Clinical notes
• Administration data
• Stakeholder interviews
• Staff/patient surveys

Data domains
• Patient experience and impact
• Clinical contribution
• Workforce impact
• PA integration and development
• Financial and business impact
• Contextual contributors
Credibility of quantitative data
• Perceived limitations:
– Proxy indicators of change
– Small sample size
• Credibility enhanced by:
– Support from qualitative data
– Depth of insight
Credibility of qualitative data
Stakeholder: “It’s only anecdotal…”
Evaluator: “But these findings are also reflected in the quantitative data.”
Moving beyond methods
VALUE OF A MIXED METHODS APPROACH
Rapid reflections:
WHAT VALUE DO YOU SEE IN A MIXED METHODS APPROACH?
Value of mixed methods in the PA evaluation
Strengths
• Comprehensive
• Multiple sources
• High engagement
• Integration with service data
• Insights for other settings

Limitations
• Coverage
• Some source data
• Linking databases
• Pre- and post
• Contexts
Our thoughts on value
• Interpreting outcomes in context
• Understanding the typical and the unique case
• Comprehensive insight
Sharing more and hearing more
• Multiple perspectives of credible evidence
• Inclusive and respectful of different ways of knowing and valuing
Key considerations for practice
• What do you value?
• What do you know?
• When and how will you deploy methods?
• Time and resources
• Limited guidance on write-up and analysis
• Maintaining focus of the evaluation
Fit for purpose
“Premise is that using multiple and diverse methods is a good idea, but is not automatically good science. Rather, just as survey research, quasi-experimentation, panel studies, and case studies require careful planning and thoughtful decisions, so do mixed method studies.”
Mertens (2013). Mixed Methods and Credibility of Evidence in Evaluation. New Directions for Evaluation, No. 138.