Evidence Based and Outcome Informed Practice

What does it mean to be a good scientific practitioner?
Developing evidence based and outcomes informed CAMHS
Dr Miranda Wolpert
Director CAMHS Evidence based Practice Unit (EBPU)
Director CAMHS Outcomes Research Consortium (CORC)
6th September 2010
Plan of talk
• Definitions
• Intro to Erinsborough case example
Four steps:
1) appraise the evidence
2) apply the evidence
3) reflect at practitioner level
4) reflect at service level
Evidence based Practice (EBP)
Using the best available external evidence from systematic
research in order to reach decisions for one’s own specific
circumstances
• Turning to the “evidence” first
• Applying a critical view
• Making decisions based on the evidence but also in the light of locally agreed values and priorities and the wishes of those you are working with
Does NOT mean
• Not reviewing local impact just because the literature says it works
Outcomes informed practice
• Being focused from the outset on what you are trying to
achieve and how you will measure success.
• Routinely evaluating any initiative whether it is based on
existing “evidence” or whether it is a new approach or a
modification of an approach
• Weighing up costs as well as benefits
Does NOT mean
• Collecting lots of data with no clear plans for
interpretation
Evidence Based Practice….
Evidence
(largely but not exclusively from published research)
Reflection
(largely but not exclusively from routine outcome evaluation)
Values
(largely but not exclusively from policy)
Evidence based and outcomes informed practice
Why important:
Natural biases in reasoning mean that people tend to make decisions and draw conclusions based mainly on prior assumptions, traditions or the influence of charismatic leaders, and will continue to do things that “feel” right rather than introduce things that have been shown to be effective.
Debilitating Dichotomies
• Quantitative vs Qualitative
• Measurement vs Intuition
• Manualisation vs Creativity
• Top down vs bottom up
• Importance of different approaches vs importance of non-specific factors
• Medical vs social models
• Arts vs sciences
Questions can be asked in a focused way (e.g. trying to defend an approach/intervention against cuts) or an open-minded way (e.g. exploratory research to understand something better):
• Impact – Focused: How do we demonstrate that the approach/intervention has a positive impact? Open-minded: Does the intervention have an impact? In what ways? What makes a difference?
• Cost effectiveness – Focused: How do we show that the work is cost effective? Open-minded: Is the work cost effective?
• Specific groups – Focused: How do we demonstrate that the intervention works for certain groups of children? Open-minded: Does the efficacy of the intervention depend on which groups are being worked with?
The practice–research continuum:
• Activity: individual practitioner reflection → supervision, regular monitoring/evaluation of individual practice and progress → monitoring/evaluation of team practice and progress → local evaluation, monitoring/evaluating a service’s outcomes → local research aimed at a national audience → national/international research aimed at a national/international audience
• Generalisability: individual case → individual practitioner → individual team (possibly other teams matched on a number of features) → Local Authority or PCT (possibly other LAs/PCTs matched on a number of features) → Local Authority or PCT, possibly other areas nationally while acknowledging caveats
• Timescales: ongoing, iterative process at the practice end; can be either in the middle; discrete research project at the research end
• Requires ethical approval? Unlikely at the practice end; sometimes in the middle; almost always at the research end
• Uses for the data: inform practice at the practice end; both in the middle; add to the general evidence base nationally/internationally at the research end
Erinsborough Case-study
• Anne McCarthy – commissioner for this service
• Ravi Sharma – service manager
• Richard Smith – primary mental health worker
• Helen Morgan – client’s mother
• Lexie Morgan – client, 11 years old
Anne McCarthy’s questions
• What is the best way to invest public money to get the best outcomes?
• How can services be encouraged (and monitored) to allow practitioners to practise safely and effectively and also be genuine learning organisations?
Ravi Sharma’s questions
• How can I make sure the service we provide is the best it can be?
• How can I help the team practise safely and effectively but also learn?
Richard Smith’s questions
1. How do I decide what is the best intervention for Lexie?
2. How can I help Lexie and Helen weigh these up and choose what’s right for them?
3. How will I know if I could be more effective in helping Lexie?
Helen’s questions
• Will they be able to help Lexie?
• Will I be blamed?
Lexie’s questions
• Will they be able to help me?
• Will they be kind and approachable?
Step 1
Appraise the published evidence
Peter Fonagy: University College London & The Anna Freud Centre
[email protected]
Sharing the Evidence with practitioners
Drawing on the
Evidence: advice for
child mental health
professionals
Wolpert, Fuggle, Cottrell,
Fonagy, Phillips, Target
and Stein 2006
Sharing the evidence
Knowing Where to Look
Questions of the evidence
(adapted from Kazdin 2004)
1. What are the costs, risks and benefits of this intervention
relative to no intervention?
2. What are the costs, risks and benefits of this intervention
relative to other interventions?
3. What are the key components that appear to contribute to
positive outcomes?
4. What parameters can be varied to improve outcomes (e.g. including addition of other interventions, non-specific clinical skills etc.)?
5. To what extent are effects of interventions generalizable across a) problem areas, b) settings, c) populations of children and d) other relevant domains?
Which of these can we answer now?
How do we get answers?
Hierarchy of Evidence?
• Ia Evidence from meta-analysis of randomised controlled trials
• Ib Evidence from at least one randomised controlled trial
• IIa Evidence from at least one controlled study without randomisation
• IIb Evidence from at least one other type of quasi-experimental study
• III Evidence from descriptive studies such as comparative studies, correlation studies and case-control studies
• IV Evidence from expert committee reports or opinions, or from clinical experience of a respected authority, or both.
Alternative Hierarchies?
Some evidence based treatments
• Anxiety and related conditions
– Modelling, reinforced exposure, CBT
• Depressive symptoms and disorders
– CBT, interpersonal therapy, activation therapy
• ADHD and related problems
– CBT, relaxation and biofeedback training, behavioural parent and teacher training
• Conduct-related problems and disorders
– Youth-focused operant treatment, CBT (problem-solving skills), behavioural parent training, multisystemic therapy
Dissemination of Evidence Based Therapies
• Most EBTs are CBT or behavioural
– Most everyday clinical practice with youths is non-behavioural (eclectic, systemic and psychodynamic) (Ho et al., 2007; Martin et al., 2007)
• Clinical trainings of psychologists and psychiatrists
– Evidence based treatments taught less than 10 years ago (Woody et al., 2005): 1993, 11/22 EBTs; 2003, 5/22 EBTs
• UK ACAMH survey (2006): CBT is the dominant approach of only 20% of respondents
Treatment Process Variables Predicting Outcome
and/or Dropout from Treatments
• Perception of therapist as not invested in the child
and/or parent (Shirk & Karver, 2003)
• Perception of therapist as not competent (Garcia &
Weisz, 2002)
• Therapeutic alliance with child and/or parent (Hawley &
Weisz, 2005)
• Creating sense of hopefulness about the treatment
(Karver et al., 2005)
• Behavioural participation outside therapy sessions
(McCarty & Weisz, 2007)
Limitations of the evidence
• Paucity of research
• Skew in researched areas
• Skew in researched populations
• Generalisability to range of groups and settings questionable
• Design flaws in studies
• Lack of consensus on appropriate outcomes and perspectives
• Lack of model for economic costings
• Lack of focus on possible harm
• Publication bias
Selection of Patients: The Cinderella Groups
• Gaps in coverage of problems
– Few RCTs of anorexia (none of bulimia)
• Annual mortality is 12× that of the general population aged 15–24
• Bulik et al. (2007) 32 studies of AN (13 too poor in design, 8
medication, 7 family therapy, 3 CBT, 1 CAT, 1 psychoanalytic, 1
supportive but mostly for adults)
– Substance abuse in youths
• Particularly harder drugs
– ADHD in adolescence
• 150 DSM diagnoses that can be applied to youths
– EBTs cover only a small selection of these
Secular trends in ESs for EBTs:
Effect size of CBT in 27 trials for youth depression
[Chart: effect size (log relative risk) plotted against year of publication, 1985–2005. Effect sizes decline over time (R = .69), falling from the LARGE band through MEDIUM and SMALL towards the equal-to-control line.]
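The effect sizes in the chart are standardised mean differences. A minimal sketch of computing one (Cohen's d) from hypothetical trial data; the scores below are illustrative only, not data from the 27 trials:

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Standardised mean difference (group_a minus group_b) using the pooled SD."""
    n_a, n_b = len(group_a), len(group_b)
    s_a, s_b = stdev(group_a), stdev(group_b)
    # Pooled standard deviation across both groups
    pooled_sd = (((n_a - 1) * s_a**2 + (n_b - 1) * s_b**2) / (n_a + n_b - 2)) ** 0.5
    return (mean(group_a) - mean(group_b)) / pooled_sd

# Hypothetical post-treatment depression scores (lower = less depressed)
control_group = [14, 12, 15, 11, 13, 16, 12]
treatment_group = [8, 10, 7, 9, 11, 6, 8]

d = cohens_d(control_group, treatment_group)  # positive d favours treatment here
print(round(d, 2))
```

By convention d of roughly 0.2 counts as small, 0.5 as medium and 0.8 as large, which is what the SMALL/MEDIUM/LARGE bands on the chart refer to.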
Step 2
Apply the evidence
Share the evidence:
Miranda Wolpert, Robert Goodman, Carl Raby, David Cottrell, Paul Lavis, Jonathan Bureau, Steve Kingsbury, David Trickey, Samuel Stein, Nisha Dogra, Jeanette Phillips, Barbara Herts, Dinah Morley, Jude Sellen, Kathryn Pugh, Cathy Street, Peter Fuggle, David Goodban, Ann York, Dawn Rees
Step 3
Reflect and evaluate at individual level
Review and reflect: CORC approach
Mental health outcome measures:
• CORE MEASURES (routinely analysed): SDQ, CGAS, CHI-ESQ, HoNOSCA, GOALS, Consultation Questionnaire
• Learning disability measures being piloted (analysing on pilot basis): SLDOM, NISONGER
• Adolescent measures being piloted (analysing on pilot basis): YP-CORE
• Early infant measures (looking to analyse on a pilot basis): parent/child interaction measures (Emotional Availability Scale) and video tape analysis (CARE Index, KIPS); measures based on child (ASQ-SE); measures based on parent (EPDS, BPRS, Kessler 10, EFQ)
• Session by session (approach being piloted by several CORC member groupings; analysing on pilot basis at present): RMQ, YP-CORE, Therapeutic Alliance
• Pilot measures being analysed on a one-off basis: DBC (LD measure), CBCL (adolescent measure)
Individual practitioner feedback
From: Duncan Law, Hertfordshire Partnership NHS Trust
[Table: for closed individual cases, CGAS scores at Time 1 and Time 2 and SDQ scores (child- and parent-reported) at Time 1 and Time 2, shown alongside qualitative feedback at closure (A = 6 months; B = end of contact).]
Qualitative feedback from parents: “No, everything was fine”; “Professional approach, clarified problem”; “Being able to speak openly about problems or concerns and having someone to get feedback from”; “Appointments after school would have been good”.
Qualitative feedback from children: “He took me seriously, he never laughed. Always had suggestions and decent questions I could answer”; “no”; “I was taken seriously and I always had a chance to talk”; “nope”.
Outcomes measured “session by session”
From John Weisz, Harvard, 2010
Individual Child Dashboard (Internalizing):
• Are results on track?
• Do the practices fit the problem?
• Is family engagement OK?
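One simple way a practitioner can answer “are results on track?” from routine pre/post scores is a reliable change check (the Jacobson–Truax reliable change index). A minimal sketch; the baseline SD and reliability values below are illustrative assumptions, not CORC’s published parameters:

```python
import math

def reliable_change_index(pre, post, sd_baseline, reliability):
    """Jacobson-Truax RCI: observed change divided by the standard error of the difference."""
    se_measurement = sd_baseline * math.sqrt(1 - reliability)
    se_difference = math.sqrt(2) * se_measurement
    # Positive RCI = improvement when lower scores mean fewer difficulties
    return (pre - post) / se_difference

# Illustrative pre/post total difficulties scores (not official SDQ parameters)
rci = reliable_change_index(pre=24, post=15, sd_baseline=5.8, reliability=0.80)
if rci > 1.96:
    print("reliable improvement")   # change unlikely to be measurement error alone
elif rci < -1.96:
    print("reliable deterioration")
else:
    print("no reliable change")
```

With these illustrative numbers the nine-point change comfortably exceeds the 1.96 threshold, so the sketch prints “reliable improvement”.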
Step 4
Reflect and evaluate at service level
Review and reflect: service level
Making Evidence Based and outcome
informed practice a reality
1. Finding ways to help us challenge ourselves and our assumptions
2. Finding ways to explicitly share learning, including with children and families
3. Finding ways to introduce feedback loops for practitioners
4. Finding ways to introduce feedback loops for services
http://www.annafreud.org/ebpu/
http://www.corc.uk.net/