Designing an evaluation for physiotherapy clinical education

[slide] Introduction
- Changing policies, changing practices
- Perspectives of quality: HPCSA, HEQC, employer, student, academic
Clinical education is the cornerstone that socializes undergraduate healthcare
professionals into their communities of healthcare practice. It is a vital component
of the physiotherapy undergraduate curriculum. One’s clinical education
experience is largely shaped by the context of practice, the quality of teaching
and learning, and the opportunities presented for learning. Changes in practice
settings therefore drive corresponding shifts in the knowledge and skills required.
Within South Africa, the policy for a transforming health care system has
reshaped the contexts for health care delivery. Responding to these policy
imperatives, it has become necessary to include hospital-, institution- and
community-based care within clinical education programmes to prepare health
professionals for practice in both low-resourced, developing settings and the
private health care environment.
The increasing emphasis on quality and evidence-based practice in healthcare
has stimulated the need for enquiry into the processes that underscore clinical
education. The education and training of physiotherapists is located within both
the higher education and health care environments. The quality of
physiotherapy education and training programmes in this country is therefore
overseen by the Professional Board for Physiotherapy, a constituent of the Health
Professions Council of SA, and by the Higher Education Quality Committee. Whilst
these quality assurance bodies may hold similar or differing views of quality
for programmes, there are, in addition, different perceptions among students,
physiotherapy academics and physiotherapy employers of what constitutes
‘quality’ in clinical education.
According to Cross (1995), students’ views on quality derive largely from studies
of behaviours of clinical educators that students consider important. Academic
staff, on the other hand, may define quality in terms of a ’process’ view i.e. the
processes involved in learning and the ‘value added’ as a result of these, whilst
employers may view quality in terms of a ‘product’, i.e. as a measure of the
education system’s ability to produce a quality practitioner who has ‘utility value’
in the workplace.
In this presentation I share with you how changing policies and practice contexts
motivated the design of an evaluation of the physiotherapy clinical education
programme, involving multiple participants, towards an assessment of the
programme’s quality. Further, I present levels of data analysis and their potential
uses.
[slide] Motivation for Evaluation
- Absence of an evaluation tool
- Response to change
- Explicit processes influencing practice
The design of the evaluation was motivated by the following factors.
First, there was no existing tool to evaluate the quality of physiotherapy
clinical education. Second, there was a need to assess how this clinical
programme was responding to the changes in higher education and health care.
The outcome of the
evaluation process would generate an assessment of how physiotherapy clinical
education was performing against the inter-relation of changing theory, practice
and policy. The third motivating factor related to interrogating how quality of
practice was influenced by environmental issues and the actual processes of
teaching and learning that are a function of clinical education. Underlying this
relation was the notion of determining the processes that occur at the interface of
the student, the clinical educator and the site for clinical practice, and how this
relation influences student learning and the quality of clinical education.
[slide] Constructs
- Teaching and learning
- Assessment
- Environment
- Clinical educator
The evaluation tool for students was designed around four main constructs:
teaching and learning, assessment, the environment, and the clinical educator.
[slide] Ideology underlying Design
- Merging perspectives of ‘quality’
- Theories of learning
- Merging goals of higher education and health care policy
The ideology that underscored the design of the tool related to:
1. merging the different perspectives of ‘quality’ held by the student, the
academic and the employer;
2. experiential learning theory and adult learning theory, which are
appropriate for higher education and clinical education contexts; and
3. merging the broad goals of higher education and health care policies.
At a more pragmatic level, the instrument produced and assessed an alternate
perspective of quality that intersected the process, product and behaviour views,
whilst simultaneously assessing explicit aspects that translated the aims of the
higher education and health care policies.
[slide] Evaluating Multiple Perspectives
[Diagram: clinical education evaluation drawing on students, clinical educators
and physiotherapy clinical managers]
The tool designed for evaluating the clinical education experience from the
student’s perspective was supplemented by instruments that evaluated the
perspectives of the clinical educators and the physiotherapy managers of clinical
sites.
[slide] Value of Multiple Perspectives
- Intended vs actual outcome
- Practice preparation vs actual requirements
- Holistic assessment
- Plurality for transformation
- Validity
The value of evaluating multiple perspectives lies in five points. First, it
establishes evidence for the level of congruency between the intended and the
actual outcomes of the programme. Second, it provides an understanding of how
preparation for practice in one community (the academic institution) interacts
with the actual requirements for practice in the other (the workplace). Third,
triangulating the perspectives of multiple stakeholders yields a more holistic
and accurate assessment of practice and a more complete picture of the
programme. Fourth, engaging multiple perspectives provides a space for the
recognition of plurality and voice, resonating with the principles of democratic
transformation. Fifth, the introduction of appropriate external criteria
strengthens the validity of the data.
[slide] Designing the Instruments: Evaluating Similar Items
- Student: The range of patients at this clinical site was suitable for
achieving the aims and objectives of the clinical education block.
- Clinical educator: The clinical site provides a suitable range of patients for
clinical education to achieve the aims and objectives of the clinical education
block.
- Physiotherapy manager: The clinical site provides a suitable range of patients
for clinical education to achieve the aims and objectives of the clinical
education block.
Clinical educators and physiotherapy clinical managers were asked to respond to
a questionnaire whose items roughly corresponded to those forming the four
constructs on the student questionnaire: in effect, they rated themselves as
clinical educators and managers using the same or similar criteria as the
students. The instruments were constructed using similar questions, rated on
a Likert scale. For example, an item that assessed the appropriateness of the
clinical site with respect to its range of patients was evaluated across the
three constituencies under the construct of environment, as shown on the slide.
[slide] Evaluating Particular Items
- Student (clinical educator): The clinical educator was approachable.
- Clinical educator (teaching & learning): My approach to clinical education is
patient-centred.
- Physiotherapy manager (environment): The clinical site benefits from student
physiotherapist placements.
Particular items that were most meaningful for a given constituency were also
included in the relevant instruments. For example, students were required to respond
to questions related to the behaviour of the educator, whilst clinical educators
were required to reflect on their approach to clinical education, against the
broader goals of health care. The instruments also provided opportunities for
qualitative responses.
How could the emerging data be analysed and what interpretations could
one make of the findings?
[slide] Analysing the Data & Interpreting the Analysis
- Macro: aggregate data for each constituency
- Micro: constructs within and across levels of study
- Unit: per site = student + educator + manager
The emerging data could be analysed at three levels: macro, micro and unit
levels of analysis.
At a macro or first level, the data could be analysed and aggregated for each
constituency, providing an overall summary of the performance of the clinical
programme. One would pay particular attention to items with strongly positive
scores or glaringly negative ratings. However, crucial results may be
masked if only summary data are reported.
At a micro level or second level of analysis, the data for each construct could
be disaggregated and analysed per level of study for each constituency. Mean
scores for items within each construct could be compared with the identical item
or construct across levels of study.
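These first two levels of aggregation can be sketched in a few lines of code. The sketch below is purely illustrative: the response records, constituency names and scores are hypothetical, not the study’s data. It shows how Likert-scale responses could be pooled per constituency (macro) and then disaggregated per level of study (micro).

```python
from statistics import mean
from collections import defaultdict

# Hypothetical Likert responses (1-5): each record is
# (constituency, level_of_study, item, score). All values are illustrative.
responses = [
    ("student", 2, "classroom_link", 3), ("student", 2, "classroom_link", 4),
    ("student", 3, "classroom_link", 3), ("student", 4, "classroom_link", 4),
    ("clin_ed", 3, "classroom_link", 3), ("clin_ed", 4, "classroom_link", 4),
    ("manager", 3, "classroom_link", 3), ("manager", 4, "classroom_link", 2),
]

def macro_summary(records):
    """Macro level: pool all scores per constituency."""
    by_constituency = defaultdict(list)
    for constituency, _level, _item, score in records:
        by_constituency[constituency].append(score)
    return {c: mean(scores) for c, scores in by_constituency.items()}

def micro_summary(records):
    """Micro level: disaggregate per (constituency, level of study)."""
    by_key = defaultdict(list)
    for constituency, level, _item, score in records:
        by_key[(constituency, level)].append(score)
    return {k: mean(scores) for k, scores in by_key.items()}

print(macro_summary(responses))  # overall picture per constituency
print(micro_summary(responses))  # means comparable across levels of study
```

The micro means are the quantities one would compare for the same item across levels of study, as described above.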
[slide] Teaching and Learning Across Levels
- Student: I am able to make the link between classroom-based lectures and
practicals, and the clinical education experience.
- Clinical educator: The clinical experience is linked with the material covered
in lectures and practical sessions in the classroom.
- Physio manager (19): The students have appropriate and relevant theoretical
knowledge for practice at this site.
- Physio manager (20): The students display appropriate and relevant practical
skills for practice at this site.
This slide illustrates how an item within the teaching and learning construct
assessed the link between classroom-based teaching and clinical practice across
the three constituencies.
[slide] Analysis Across Levels
[Chart: Teaching & Learning Across Levels — mean item scores (0–5) per
constituency (Student, Clin Ed, Clin Manager (19), Clin Manager (20)), with
series Mean L2, Mean L3 and Mean L4]
This slide depicts the mean scores for the item for each constituency across the
levels of study. All responses related to level 2 are indicated in blue, level 3 in
pink and level 4 in green. The graph illustrates the mean score for this item from
all students and clinical educators within a level of study, and the view of clinical
managers for that particular level.
First, the analysis could reveal the degree of agreement on an item within a
particular construct among all the students, clinical educators and clinical
managers at that level of study. For example, as the slide suggests, at level 3
there appears to be fairly good agreement among the constituencies that the link
between classroom learning and knowledge for practice is neutral. At level 4,
the students and clinical educators appear to view the relation between
classroom learning and clinical practice positively, whereas the clinical
managers are not in full agreement with the academic perspective about the
knowledge and skills that the students display during practice. This suggests
incongruence between the academic preparation for practice and the actual
requirements of the current practice context.
Second, the mean scores of items could be analysed across levels of study for a
particular constituency. For example, the clinical managers consistently give
lower ratings, across all levels, to the relevance and appropriateness of the
practical skills that students display during clinical practice across multiple
clinical sites.
[slide] Unit Analysis per Site Within Level
[Chart: Site Analysis Within Level 3 — mean item scores (0–4.5) per constituency
(Student, Clin Ed, Physio Manager (19), Physio Manager (20)), with series
Site A, Site B and Site C]
The third level of analysis involves the analysis of a unit. At this level,
the mean scores of the student, the clinical educator and the clinical
manager are triangulated on similar items for each site within a level of study.
The slide illustrates a unit analysis per site for level 3 constituents with
regard to the link between classroom preparation and the practice requirements
at each site. At site A, for example, the students, clinical educator and
clinical manager are somewhat in agreement that there is a link between the
preparation for practice and the knowledge required for practice at that site,
but the clinical manager does not agree that the students’ practical skills are
relevant for this site. At site C, the students and the clinical managers
reflect a neutral perspective about the link between classroom preparation and
practice requirements; the clinical educator, however, is positive that a link
does exist between the academic and practice settings. Another observation
relates to the similarity in response between the students and the clinical
educators, with the exception that the clinical educators rate themselves
higher. The advantage of this unit analysis lies in recognizing explicit
differences among sites and their actual practice requirements. Further, it
provides evidence of the actual processes that contribute to the strengths and
deficiencies of the programme at individual sites. These could guide targeted
intervention strategies for improving quality rather than broad-based programme
changes.
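The unit-level triangulation, and the idea of flagging individual sites for targeted intervention, can likewise be sketched. Again, the site names, constituency labels and scores below are hypothetical, not the study’s data.

```python
from collections import defaultdict

# Hypothetical per-item mean scores (1-5 Likert) for one level of study;
# the sites and values are illustrative only.
site_scores = [
    ("A", "student", 3.0), ("A", "clin_ed", 3.5), ("A", "manager", 2.0),
    ("C", "student", 2.5), ("C", "clin_ed", 3.0), ("C", "manager", 2.5),
]

def unit_analysis(records):
    """Unit level: triangulate the constituency scores per site."""
    per_site = defaultdict(dict)
    for site, constituency, score in records:
        per_site[site][constituency] = score
    return dict(per_site)

def divergent_sites(per_site, threshold=1.0):
    """Flag sites where constituency views diverge widely -- candidates
    for targeted intervention rather than broad programme change."""
    return [site for site, views in per_site.items()
            if max(views.values()) - min(views.values()) >= threshold]

units = unit_analysis(site_scores)
print(divergent_sites(units))  # here only site A: the manager rates far below the others
```

The threshold is an arbitrary illustrative choice; in practice one would inspect the triangulated scores site by site, as the narration does.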
[slide] Conclusion
- Research
- Theory
- Policy
- Practice
This evaluation was designed to assess the quality of the clinical education
experience in relation to the broader processes that link policy, theory and
practice, through engagement with stakeholders other than the end-users of the
clinical programme. The outcome of this evaluation could provide the basis for
concrete professional development activities with potential to improve the
congruency between clinical education and the requirements for practice within a
transformed health care context.
Thank you