Curriculum and Assessment Models: In Search of a Panacea?
Susan M Rhind, BVMS, PhD, FRCPath, FHEA, MRCVS
Veterinary Medical Education Division, Royal (Dick) School of Veterinary Studies,
University of Edinburgh, Scotland, UK ([email protected])
Introduction
This talk will give a brief historical overview of curriculum models with a focus on current
debates, trends and the implications for disciplines such as pathology. Assessment
developments and the factors that need to be considered when designing an overall
programme of assessment will also be presented.
A Brief Historical Overview of Curriculum Models and Teaching Strategies
Since the 1950s there have been moves in medical education towards better integration between traditional disciplines (e.g. anatomy and pathology), with a greater focus on systems-based approaches and the development of Problem Based Learning (PBL) curricula.
The four main organisational models of curricula are: modular; integrated; core and options; and spiral (where students revisit material throughout the curriculum with increasing complexity as the curriculum progresses).1,2 Integrated curricula are now common in
veterinary education with the integration either being ‘horizontal’ (across content within
years) or ‘vertical’ (across years e.g. between basic and clinical sciences). Many curricula
now contain examples of both horizontal and vertical integration.
The Problem with Curriculum-Level Evaluation of Anything…
As scientists, we are hard-wired to expect 'evidence' before we are convinced that a new model or approach is an improvement on previous approaches. When it comes to curriculum design and evaluation, however, this can be fraught with difficulty. The issue arises because of evidence that 'within programme' variations in student achievement are greater than any impact of the overall curriculum.3 Thus any attempt to 'prove' that one curriculum model or strategy is better (or worse) than another is hampered by the myriad factors that contribute to a student's overall journey through, and experience of, a given curriculum.
All is not lost: Embracing Evidence Where We Can
Accepting the difficulties of high-level curriculum evaluation, there is no doubt that, as with everything we do, we should where possible adopt an evidence-based approach to our educational practice. So whilst conceding that curriculum-level evaluation is problematic, evaluation of more discrete elements within the curriculum is certainly achievable – for example, the use of teaching models, e-learning and different assessment strategies.
An issue for many in veterinary education is that, due to the relatively small size of schools and often limited resources, it may not always be possible to hire faculty who have a specific background in social sciences research (and thus in the appropriate research methodology). So although veterinary medical education units, or groups of people with an interest in education, are becoming more common in veterinary schools, we are still some way off the situation in medical education, where such units are now commonplace.4 What we do have
instead as a small profession is a relatively tight-knit community with strong national and
international links between educators and great potential for national and international
collaborations.
Some Topical Learning and Teaching Strategies
Problem Based Learning
PBL is a teaching strategy rather than a curriculum per se. It is becoming increasingly common in veterinary education; however, the term is open to interpretation and is often used to describe any educational intervention where problems (usually clinical) are used to focus learning. Since originating at McMaster University in 1969, many developments and modifications have been presented.5 More recently, and perhaps unsurprisingly, the focus has often been on more hybridised approaches to PBL, in which varying levels of lectures are embedded into the programme design to support or 'scaffold' the PBL cases.6
To date, there is no consistent evidence that PBL actually produces 'better' outcomes. The potential reasons speculated on in the literature are rather intriguing and seem to relate to the issue of 'transfer' of knowledge from one context to another, i.e. using knowledge acquired in one context to solve an unrelated problem in a different context.7 Some studies have actually shown poorer performance in PBL curricula, with a suggested interpretation that learning a problem in one context makes it more difficult to 'disentangle' the knowledge from that context and apply it to a new and different one.8 In contrast, other studies have shown similar or slightly improved outcomes in PBL curricula; however, the resource costs associated with such models are not insignificant. To quote Norman and Schmidt (2000):9
‘We believe that PBL has been oversold by its advocates, promising enormous benefits and
largely ignoring the associated resource costs.’
E-learning
Whilst it is outwith the scope of this session to go into e-learning in detail, two specific aspects
will be discussed:
1. The use of multimedia as one strategy to help improve knowledge transfer
2. The concept of the ‘flipped classroom’
There has been a large amount of research in the field of e-learning, but only recently has empirical evidence been published in the context of medical education. This evidence has indicated improvements in both long-term retention and long-term transfer when multimedia design principles are applied to lecture slides.10 A related topical area is the 'flipped classroom'. In this strategy, teachers record lectures which students view in advance of an in-class discussion session. Potential advantages of this approach are: (1) students can view material at their own pace; (2) interaction between students and faculty is improved; (3) opportunities for feedback are increased; and (4) face-to-face discussion can focus on the topics students find most challenging.11
Implications for Pathology
What are the implications of the above developments for pathology as we aspire to cultivate
within our students a sound understanding of key pathological processes and enable them to
apply these principles to new scenarios?
Learning from the work around 'transfer' mentioned earlier, presenting key principles in the context of multiple different cases is a major recommendation, and comes as no surprise.7
Identification of the common conceptual difficulties that students have and building a flipped
classroom approach to allow more time to discuss the challenging areas would be another
area worthy of exploration.
Teachers of pathology (whether medical or veterinary), and indeed of other discrete disciplines within the curriculum, have often expressed concerns about integrated curricula due to their perception that a loss of discipline identity may be a risk of the integration process.12,13,14 Where technology can help in this respect is with the facilitation of 'curriculum mapping'.15 Curriculum maps can provide a visual representation of the key components of any curriculum, e.g. the content that is taught, how and when it is taught, and how and when it is assessed. A curriculum map can also demonstrate the relationships and connections between these key components.15,16 An increasing array of technological solutions to facilitate curriculum mapping is now available.
Whilst expensive, these tools not only help demonstrate learning outcomes across the curriculum but can also allow staff, students or other stakeholders (e.g. accrediting bodies) to more readily visualise the distinctive disciplines underpinning any integrated curriculum.15 A practical note here is that embarking on a process of curriculum mapping should not be undertaken lightly, and clear processes for keeping the map up to date should be considered at an early stage.
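To make the idea concrete, a minimal sketch of how a curriculum map might be represented follows; the structure, field names and example entries are hypothetical and purely illustrative, not the format of any particular mapping tool.

# Minimal sketch of a curriculum map as linked records (Python).
# All names and example entries are hypothetical, for illustration only.
from dataclasses import dataclass, field

@dataclass
class MappedOutcome:
    outcome: str        # learning outcome statement
    discipline: str     # parent discipline, e.g. "Pathology"
    taught_in: list = field(default_factory=list)    # teaching events
    assessed_in: list = field(default_factory=list)  # assessments

curriculum_map = [
    MappedOutcome(
        outcome="Explain the key processes of inflammation",
        discipline="Pathology",
        taught_in=["Year 1: principles of disease lectures"],
        assessed_in=["Year 1 written paper"],
    ),
    MappedOutcome(
        outcome="Interpret gross post-mortem findings",
        discipline="Pathology",
        taught_in=["Year 3: systems pathology practicals"],
        assessed_in=["Final-year necropsy assessment"],
    ),
]

# Filtering by discipline shows where a subject is taught and assessed
# across an integrated curriculum, keeping the discipline visible.
for entry in (e for e in curriculum_map if e.discipline == "Pathology"):
    print(entry.outcome, "->", entry.taught_in, "/", entry.assessed_in)

Even a simple structure like this makes the 'where taught, where assessed' relationships queryable, which is the core of what the commercial mapping tools provide.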
So is there an Optimum Curriculum?
For the reasons presented earlier, even if a perfect curriculum did exist, it is unlikely that we could gather robust empirical evidence to prove it. Nevertheless, even supposing the existence of such a 'Holy Grail', the point has been made elsewhere that 'An imported curriculum has similar chances of survival as a palm tree implanted at the North Pole'!17 I will argue that a focus on the individual student experience may well reap as many benefits as the wholesale implementation of a new and 'fashionable' curriculum. Key to this is student engagement with enthusiastic and contented staff.
Developments in Assessment and Feedback
Whilst curriculum-level evaluation can be fraught with difficulties, as described above, in the area of assessment and feedback we can more comfortably position ourselves in a culture of 'best-evidence' informed practice. Although there is currently little empirical evidence available in our own discipline,18 medical education can provide us with a rich and intriguing source of data and 'trial and error' stories.
Miller’s Pyramid
No discussion on assessment in medical or veterinary education takes place without a mention of Miller's pyramid.19 The model describes four levels of competence, from 'knows' at the base, through 'knows how' and 'shows how', to 'does' at the apex, and examples of appropriate assessment methods can be aligned with each level.20
[Figure: Miller's pyramid with examples of aligned assessment methods]
There is a large amount of published data across disciplines dealing with written forms of assessment (i.e. the lower levels of Miller's pyramid) and it is outwith the scope of this paper to explore this in great detail.
Much of the focus in recent years has been on how we assess at the higher levels of Miller's pyramid, i.e. 'shows' and 'does'. At the 'shows' level, the Objective Structured Clinical Examination (OSCE) was first described by Harden in 1979,21 and hundreds of papers have since been written on the subject. In recent years, veterinary educators have shown much interest in learning from the experiences of their medical counterparts, particularly in the realm of competency assessment and in the development of OSCEs to help assess practical competencies.22,23 It is in areas such as this that we can really take advantage of learning from medicine, perhaps avoid repeating the same mistakes, and may even be able to 'skip' unnecessary or redundant steps as a result. Originally developed with a focus on 'objectivity', OSCE mark sheets are typically highly detailed checklists and, as such, tasks can tend to become fragmented. Research has shown that 'global rating scales' based on overall professional judgements can be as reliable as detailed checklists in this type of assessment, provided assessors are well trained.24,25 A practical point here is that, whilst appropriate for technical-level assessments earlier in the curriculum (e.g. basic microscope set-up, suturing, gowning, gloving), OSCEs with detailed checklists may not be the best way to assess competency at the final-year stage.
A helpful 'formula' to consider when selecting assessments or designing a programme of assessment has been presented by Van der Vleuten (1996), where the utility of an assessment method can be considered to include elements of reliability, validity, educational impact, acceptability and cost.26
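Van der Vleuten's model is commonly summarised as a weighted, multiplicative relationship; the rendering below is one common paraphrase rather than a direct quotation from the paper:

utility = reliability × validity × educational impact × acceptability × cost-efficiency

Because the relationship is multiplicative, an assessment that scores very poorly on any one element has low overall utility however strong the others are, and the weight attached to each element will vary with the context and purpose of the assessment.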
Reliability: How reproducible are the results? Reliability is measured and expressed as a coefficient ranging from 0 (zero reliability) to 1 (complete reliability). Reliability can be affected by, and expressed in terms of, the consistency of results across examiners (inter-rater reliability), for the same examiner at different times (intra-rater reliability), between examinees, and between test items.
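As a simple illustration of a reliability coefficient in practice, the sketch below computes inter-rater agreement for two examiners using Cohen's kappa; the ratings are invented and scikit-learn is assumed to be available.

# Illustrative sketch: inter-rater agreement for two examiners scoring
# the same ten OSCE candidates as pass (1) or fail (0).
# Ratings are invented; assumes scikit-learn is installed.
from sklearn.metrics import cohen_kappa_score

examiner_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
examiner_b = [1, 1, 0, 1, 1, 1, 0, 0, 1, 1]

# Cohen's kappa corrects raw agreement for agreement expected by chance;
# values near 1 indicate strong inter-rater reliability (kappa can fall
# below 0 when agreement is worse than chance).
kappa = cohen_kappa_score(examiner_a, examiner_b)
print(f"Inter-rater agreement (Cohen's kappa): {kappa:.2f}")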
Validity: Does the assessment method measure what it is supposed to measure? For example, if the assessment is being used to test a particular skill, is that skill actually being assessed? Various aspects contribute to validity, and the topic is well reviewed by Hecker and Violato.27
Educational Impact: This relates to the 'reaction' by learners that results from implementing an assessment; in some instances this may be unpredictable, so it should be monitored carefully.26
Acceptability: To the constituency. This relates back to the earlier point about the importance of considering local factors, such as staff culture and teaching philosophy, facilities, local politics and, as presented by Van der Vleuten (1996), 'opinions, sentiments and traditions'.26
Costs: Put simply, building a sound and all-encompassing assessment programme can be expensive, and costs need to be factored into any curriculum design from the start.
Feedback
An area of intense discussion and activity currently in UK Higher Education is feedback to learners on their progress. It is increasingly recognised that, rather than aiming only for assessment OF learning, we should move to practices and course structures that promote and facilitate assessment FOR learning. In practice this means ensuring that assessment interventions have feedback opportunities built into them, such that students have an opportunity to learn from the feedback and apply it to a future assessment or their future practice.28 In addition, much of the current research emphasises the importance of considering feedback as a dialogue rather than a one-way monologue from faculty to student.29
Assessment OF learning: Summative assessment, e.g. end-of-course assessment, final assessment, NAVLE.
Assessment FOR learning: Formative assessment; helps learners gauge where their strengths and weaknesses are and allows them to address these in advance of subsequent assessments.
Assessment AS learning: Self-assessment and reflection on one's own abilities; requires appropriate supporting activities to help students learn to self-assess, and builds their understanding of the examiner's perspective and of what good quality work/performance is.30
Implications for Pathology
Turning to practical points from the above discussion specific to pathology, it would be worth considering which 'technical' competencies to assess at the relevant earlier stages of the curriculum (assessment in vitro), followed by more holistic assessment in the later stages. These later assessments can incorporate global professional judgements on a student's overall level of competence, e.g. in performing and interpreting necropsies on the PM room floor (assessment 'in vivo').
Whether associated with written or practical assessments, we should be aiming to build
assessment programmes that consider from the earliest stages how feedback will ultimately
be given to students and how each assessment will ‘feed-forward’ information to the next
stage of the curriculum. An important part of this is to incorporate opportunities for students
to evaluate their own performance, as well as that of their peers.
Conclusion
We should learn as much as we can from prior research in medical education and
educational research in general, as well as looking for opportunities to generate our own
evidence base within veterinary medical education. As a small discipline, we need to
optimise our resources by collaborating where we can nationally and internationally.
Workshop: Designing a Programme of Assessment and Feedback: Key
Considerations
Overview
Building on the issues raised in the second half of the plenary presentation, this session will
explore in more detail the link between assessment and feedback and in particular the
current focus on assessment for learning rather than assessment of learning. Participants
will consider how they might develop the optimal balance between these sometimes
conflicting types of assessment in their own contexts.
References
1. Harden RM, Stamper N. What is a spiral curriculum? Medical Teacher. 1999;21(2):141-143.
2. Grant J. Principles of Curriculum Design. In: Swanwick T, ed. Understanding Medical Education. London: Wiley-Blackwell; 2012.
3. Norman GR. RCT = results confounded and trivial: the perils of grand educational experiments. Medical Education. 2003;37:582-584.
4. Davis MH, Karunathilake I, Harden RM. AMEE Education Guide no. 28: The development and role of departments of medical education. Medical Teacher. 2005;27(8):665-675.
5. Neville AJ, Norman GR. PBL in the undergraduate MD program at McMaster University: three iterations in three decades. Academic Medicine. 2007;82(4):370-374.
6. Swanwick T, ed. Understanding Medical Education: Evidence, Theory and Practice. London: Wiley-Blackwell; 2012.
7. Norman GR. Teaching basic science to optimize transfer. Medical Teacher. 2009;31(9):807-811.
8. Ross B. This is like that: the use of earlier problems and the separation of similarity effects. Journal of Experimental Psychology: Learning, Memory and Cognition. 1987;15:456-468.
9. Norman GR, Schmidt HG. Effectiveness of problem-based learning curricula: theory, practice and paper darts. Medical Education. 2000;34(9):721-728.
10. Issa N, Mayer R, Schuller M, Wang E, Shapiro M, DaRosa D. Teaching for understanding in medical classrooms using multimedia design principles. Medical Education. 2013;47:388-396.
11. Goodwin B, Miller K. Evidence on flipped classrooms is still coming in. Educational Leadership. 2013;70(6):78-80.
12. Bolender D, Ettarh R, Jerrett D, Laherty R. Curriculum integration = course disintegration: what does this mean for anatomy? Anatomical Sciences Education. 2013;6:205-208.
13. Marshall R, Cartwright N, Mattick K. Teaching and learning pathology: a critical review of the English literature. Medical Education. 2004;38:302-313.
14. Cavalieri J. Curriculum integration within the context of veterinary education. Journal of Veterinary Medical Education. 2009;36(4):388-396.
15. Harden RM. AMEE Guide No. 21: Curriculum mapping: a tool for transparent and authentic teaching and learning. Medical Teacher. 2001;23(2):123-137.
16. Bell C, Ellaway R, Rhind S. Getting started with curriculum mapping in a veterinary degree program. Journal of Veterinary Medical Education. 2009;36(1):100-106.
17. Al-Eraky MM. Curriculum navigator: aspiring towards a comprehensive package for curriculum planning. Medical Teacher. 2012;34(9):724-732.
18. Rhind SM, Baillie S, Brown F, Hammick M, Dozier M. Assessing competence in veterinary medical education: where's the evidence? Journal of Veterinary Medical Education. 2008;35(3):407-411.
19. Miller G. The assessment of clinical skills/competence/performance. Academic Medicine. 1990;65(9):S63-S67.
20. Baillie S, Rhind S. A Guide to Assessment Methods in Veterinary Medical Education; 2008.
21. Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Medical Education. 1979;13(1):41-54.
22. Hecker K, Read EK, Vallevand A, et al. Assessment of first-year veterinary students' clinical skills using objective structured clinical examinations. Journal of Veterinary Medical Education. 2010;37(4):395-402.
23. Bark H, Shahar R. The use of the Objective Structured Clinical Examination (OSCE) in small-animal internal medicine and surgery. Journal of Veterinary Medical Education. 2006;33(4):588-592.
24. Regehr G, MacRae H, Reznick R, Szalay D. Comparing the psychometric properties of checklists and global rating scales for assessing performance on an OSCE-format examination. Academic Medicine. 1998;73(9):993-997.
25. Cunnington J, Neville A, Norman GR. The risks of thoroughness: reliability and validity of global ratings and checklists in an OSCE. Advances in Health Sciences Education. 1996;1(3):227-233.
26. Van der Vleuten CPM. The assessment of professional competence: developments, research and practical implications. Advances in Health Sciences Education. 1996;1(1):41-67.
27. Hecker K, Violato C. Validity, reliability, and defensibility of assessments in veterinary education. Journal of Veterinary Medical Education. 2009;36(3):271-275.
28. Boud D, Molloy E. Rethinking models of feedback for learning: the challenge of design. Assessment & Evaluation in Higher Education. 2012;38(6):698-712.
29. Nicol D. From monologue to dialogue: improving written feedback processes in mass higher education. Assessment & Evaluation in Higher Education. 2010;35(5):501-517.
30. Sadler R. Ah!... So that's 'quality'. In: Assessment: Case Studies, Experience and Practice from Higher Education. London: Kogan Page; 2002:30-136.