From Learning Objectives to Outcomes
Marie Gilbert, MA, RN

Introduction
Marie Gilbert, MA, RN
Simulation Coordinator
Fresno State University, California
Learning Objectives
• Discuss core components of a healthcare simulation.
• Discuss the importance of outcomes evaluation and challenges to traditional assessments.
• Discuss the importance of validity, reliability, and feasibility as they relate to assessment.
• Discuss types of assessments and their application in healthcare education.
Simulation
Simulation is the imitation or representation of one act or system by another.
Healthcare simulations can be said to have four main purposes:
• education
• assessment
• research
• health system integration in facilitating patient safety
(Society for Simulation in Healthcare)
Core Components of Healthcare Simulation
The simulation setting with its different, connected parts. (Adapted from Dieckmann, 2009)
Core Components of Healthcare Simulation
• Prebrief
• Simulation
• Debrief
Objective-Based Scenario Design
• Learner
• Objectives and Outcomes
• Assessment
• Environment
• Patient
• Flow
• Debrief
Learner Population
Healthcare professionals:
• Have complex and multidimensional roles
• Work independently
• Work in teams
• Require specific cognitive, technical, and behavioral skills
Learner Population
For healthcare professionals, domains of measurable skills range from history taking, physical examination, and patient management to areas such as teamwork, cultural competence, and professionalism.
(Boulet et al., 2011)
Importance of Objectives and Outcomes
Objectives and outcomes give direction to the designer, the facilitator, and the learner:
• They identify what the learner will be able to do at the end of the simulation/course/program
• They identify the knowledge, skills, and attitudes needed to be able to do this
• They guide the designer and/or facilitator
Outcomes & Objectives
A Learning Outcome relates to the final product or end result: what the student can DO.
A Learning Objective relates to the process and content: the "nuts & bolts."
Learning Objectives should map to Learning Outcomes.
Outcomes for Simulation-Based Learning
• Are essential
• Express higher-level thinking skills that integrate the content and activities
• Can be observed as a behavior, skill, or discrete usable knowledge upon completing the simulation/course/program
• Primary sources of outcomes are core competencies designated by professional bodies and academic or clinical institutions
Outcomes for Simulation-Based Learning
Example:
At the end of the nursing program the learner will provide patient-centered care based on a comprehensive and focused health assessment.
Objectives for Simulation-Based Learning
• Are essential
• Must reflect the intended outcome of the experience
• Specify expected learner behavior
• Include sufficient detail to allow learners to participate in the simulation effectively
Objectives for Simulation-Based Learning
Examples:
• The learner will identify the most relevant data
• The learner demonstrates effective communication
• The learner recognizes subtle changes in the patient's condition
Objective-Based Scenario Design
Outcome:
• At the end of the nursing program the learner will provide patient-centered care based on a comprehensive and focused health assessment.
Objectives:
• The learner will identify the most relevant data
• The learner demonstrates effective communication
• The learner recognizes subtle changes in the patient's condition
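As an illustration of how objectives can be mapped to the outcome they evidence, here is a minimal Python sketch (all names are hypothetical, not from any published tool) that records the mapping and flags any objective left unmapped:

```python
# Minimal sketch (hypothetical names): map each scenario objective
# to the program-level outcome it supports, then check coverage.

PROGRAM_OUTCOME = (
    "Provide patient-centered care based on a comprehensive "
    "and focused health assessment."
)

# Objective -> the outcome it evidences.
objective_to_outcome = {
    "Identify the most relevant data": PROGRAM_OUTCOME,
    "Demonstrate effective communication": PROGRAM_OUTCOME,
    "Recognize subtle changes in the patient's condition": PROGRAM_OUTCOME,
}

def unmapped(objectives, mapping):
    """Return objectives that do not map to any outcome."""
    return [o for o in objectives if o not in mapping]

scenario_objectives = list(objective_to_outcome)
print("Unmapped objectives:", unmapped(scenario_objectives, objective_to_outcome))
# -> Unmapped objectives: []
```

A coverage check like this is one simple way to confirm that every scenario objective traces back to a program-level outcome.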
Guidance on writing Outcomes & Objectives
http://be-know-do.com/introduction-to-smart-objectives-and-smart-goals/
Objective-Based Scenario Design
How do we know we've achieved our learning objectives and learning outcomes?
ASSESSMENT
Types of Assessment
Summative:
• Higher stakes
• Used for performance evaluation
• Occurs upon completion of the work; the focus is on the final product
• Certification and licensure
Formative:
• Lower stakes
• Used to modify teaching and learning
• Occurs over the time of a course or program
• Training/education activities
Types of Assessment
• Practice Review: audits, video
• Performance Tests: task demonstration, simulation, OSCE
• Clinical-Based Tests: patient management questions, essay, oral
• Knowledge Tests: MCQ, essay, oral exams
Assessment
Miller's model of clinical skills, competence, and performance (cf. Bloom's Taxonomy) maps each level to an assessment type:
• Does (Action): Practice Review
• Shows How (Performance): Performance-Based Tests
• Knows How (Competence): Clinical-Based Tests
• Knows (Knowledge): Knowledge-Based Tests
(Miller, 1990)
Kirkpatrick's Evaluation Model
• Level 1: Reaction
• Level 2: Learning
• Level 3: Behavior
• Level 4: Results
(Kirkpatrick, 1989)
Choosing appropriate assessment methods/tools
Is it valid?
Is it reliable?
Is it feasible?
Assessment - Validity
Are we measuring what we are supposed to be measuring?
Is it an appropriate instrument for the knowledge, skill, or attitude you are testing, within the context in which it is being used?
Assessment - Validity
"Emerging paradigms replace prior distinctions of face, content, and criterion validity with the unitary concept 'construct validity', the degree to which a score can be interpreted as representing the intended underlying construct."
(Cook & Beckman, 2006)
Assessment - Reliability
Does the test consistently measure what it is supposed to be measuring?
Types of reliability:
• Inter-rater (consistency over raters)
• Test-retest (consistency over time)
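Inter-rater reliability for a pass/fail checklist is often summarized with Cohen's kappa, which corrects raw agreement for agreement expected by chance. Here is a minimal Python sketch using scikit-learn's cohen_kappa_score; the ratings are made-up examples:

```python
# Minimal sketch: inter-rater reliability for a pass/fail checklist
# using Cohen's kappa (the ratings below are made-up examples).
from sklearn.metrics import cohen_kappa_score

# Two raters scoring the same eight learners (1 = met, 0 = not met).
rater_a = [1, 1, 0, 1, 0, 1, 1, 0]
rater_b = [1, 1, 0, 1, 1, 1, 0, 0]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa: {kappa:.2f}")  # about 0.47 here
```

By the commonly used Landis and Koch benchmarks, values around 0.4 to 0.6 are read as moderate agreement and values above 0.8 as near-perfect.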
Assessment - Feasibility
Is the administration of the assessment instrument feasible in terms of time and resources?
Assessment - Feasibility
Considerations:
• Time to construct and score
• Ease of interpreting the score/producing results
• Practicality given staffing/organization
• Number of students to be assessed
• Time available for the assessment
• Number of staff available
• Resources/equipment available
Where should you start?
Remember: the purpose of the tool is to assess that learning has occurred.
It needs to map to your OUTCOMES & OBJECTIVES.
Examples of Tools
Kardong-Edgren, S., Adamson, K.A., & Fitzgerald, C. (2010). A review of currently published evaluation instruments for human patient simulation. Clinical Simulation in Nursing, 6(1), e25-e35. doi:10.1016/j.ecns.2009.08.004
Use of validated tools
It is rarely appropriate to justify the use of a particular scoring tool based solely on previous validation studies (Boulet et al., 2011).
Consider:
• The purpose of the assessment
• The administration conditions
• The evidence supporting the tool
Where I started
• Self-report satisfaction survey
• Outcome measures per semester group
• Specific Learning Objective checklist per group
Student Satisfaction
Evaluation of Learning Outcomes
Learning Outcome | S1 | S2 | S4 | S5
Apply clinical decision-making skills in analyzing and interpreting complex data | Met | Met | Met | Met
Apply clinical decision-making skills to plan care | Met | Met | Met | Met
Provide care to patients utilizing principles of safety | Met | Met | Met | Met
Effectively communicates | Met | Met | Met | Met
Reflect on performance 'in action' and 'after action' | Met | Met | Met | Met
Comment (all outcomes): Demonstrated either directly during the simulation or through guided reflection during the debriefing session, appropriate to the level of experience and expected knowledge base.
Semester-Specific Learning Objectives
Semester 2 Simulation Learning Objectives
1. Evaluate patient assessment information including vital signs
2. Prioritize and implement physician orders appropriately
3. Recall indications, contraindications, and potential adverse effects of prescribed medication
4. Implement the "5 rights" of medication administration
5. Communicate effectively with patient
6. Document nursing intervention and/or clinical skill performed appropriately
Evaluation of Learning Objectives
Semester 2 Assessment/Evaluation
1. Demonstrated. Initial assessments were generally performed well. However, most groups were slow to complete a full respiratory assessment following nebulizer treatment.
2. Demonstrated. Some groups were challenged with prioritizing which meds to administer first. Some groups did not recognize an order change to increase oxygen from NC to FM. Students were able to reflect on this during debrief and identified why they thought they were challenged and what they could do in the future to prioritize more effectively.
3. Demonstrated at an appropriate level for semester 2 students.
4. Demonstrated well.
5. Demonstrated at a higher level than expected for semester 2 students.
6. In most groups documentation was poor. It is unclear why students performed poorly in this area. It may be because the instructions during the pre-brief did not explain adequately that documentation was an expectation during the simulation.
Assessment Improvement
• Define low, medium, and high performance levels
• Use a rubric or rating metric (a sketch follows below)
• Employ a quality assurance/improvement system (closing the loop)
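Here is a minimal Python sketch of such a rating metric; the criteria and cut-offs are hypothetical and would need to be set against your own objectives:

```python
# Minimal sketch (hypothetical criteria and cut-offs): a rubric that
# bands per-criterion scores (0-3) into low / medium / high performance.

RUBRIC_MAX = {
    "Patient assessment": 3,
    "Prioritization of orders": 3,
    "Medication safety (5 rights)": 3,
    "Communication": 3,
    "Documentation": 3,
}

def performance_band(scores: dict) -> str:
    """Classify overall performance from per-criterion scores."""
    pct = sum(scores.values()) / sum(RUBRIC_MAX.values())
    if pct >= 0.8:
        return "high"
    if pct >= 0.5:
        return "medium"
    return "low"

print(performance_band({
    "Patient assessment": 3,
    "Prioritization of orders": 2,
    "Medication safety (5 rights)": 3,
    "Communication": 3,
    "Documentation": 0,
}))  # 11/15 is about 0.73 -> "medium"
```

Publishing the cut-offs alongside the rubric gives raters a shared definition of low, medium, and high performance, which also supports the inter-rater reliability checks described earlier.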
Assessment Tools
Published examples include:
• Brett-Fleegler et al. (2008)
• Malec et al. (2007)
• Gaba et al. (1998)
• Guise et al. (2008)
• Lasater (2007)
Objective-Based Scenario Design
• Learner
• Objectives and Outcomes
• Assessment
• Environment
• Patient
• Flow
• Debrief
Environment
The environment should be prepared to enhance learning:
• Maximize fidelity
• Choose appropriate simulation methodology/equipment
• Room design, props, equipment, and patient documentation available to support the learning environment
• Actors
• Scripts for patients/actors
Patient
The patient should be created to enhance learning and must be appropriate to meet the learning outcomes and objectives.
Use a template; an example is available from the California Simulation Alliance:
https://www.californiasimulationalliance.org/
Flow
Use a storyboard (an example designed by Marjorie Miller is used by the CSA):
• Initial parameters
• Identify planned events
• Identify triggers
• Identify scenario end point
https://www.californiasimulationalliance.org/
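A storyboard of this kind can be thought of as a small state machine. The following Python sketch (states, triggers, and vital-sign values are hypothetical, not taken from the CSA template) shows initial parameters, planned events fired by triggers, and a defined end point:

```python
# Minimal sketch (hypothetical states, triggers, and vital signs):
# a storyboard as a small state machine with initial parameters,
# planned events fired by triggers, and a defined end point.

scenario = {
    "initial": {"state": "baseline", "SpO2": 94, "RR": 22},
    "events": [
        # (trigger, next state, parameter changes)
        ("O2 applied via nasal cannula", "improving", {"SpO2": 97}),
        ("10 min elapse without O2", "deteriorating", {"SpO2": 88, "RR": 28}),
    ],
    "end_point": "handoff report given or 20 minutes elapsed",
}

def fire(patient: dict, trigger: str, events: list) -> dict:
    """Advance the patient when a planned trigger occurs."""
    for event_trigger, next_state, changes in events:
        if event_trigger == trigger:
            patient.update(changes, state=next_state)
    return patient

patient = dict(scenario["initial"])
print(fire(patient, "O2 applied via nasal cannula", scenario["events"]))
# -> {'state': 'improving', 'SpO2': 97, 'RR': 22}
```

Writing the triggers down explicitly, even informally, helps the facilitator run the scenario consistently across groups.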
Scenario Template
https://www.californiasimulationalliance.org/
Debrief
Debriefing represents facilitated or guided reflection in the cycle of experiential learning (Fanning & Gaba, 2007).
Debrief Process
A 3-phase model:
1. Reactions Phase
2. Understanding Phase
3. Summary
Examples of questions:
1. How do you feel about it?
2. Can you explain that further?
3. Was it effective/appropriate?
4. What would you change, if anything, in the future?
5. What do you plan to incorporate into your practice next time?
6. What, if any, obstacles did you encounter?
Summary
• Learning outcomes and objectives are the foundation for your simulation
• Learning outcomes and objectives drive the assessment method, the environment setting, the patient scenario, and the flow of the simulation
• The debrief is driven by the learning outcomes and objectives
• Learning outcomes and objectives should be assessed using valid and reliable data
Take Home Message
OUTCOMES & OBJECTIVES
ARE ESSENTIAL
FOR AN EFFECTIVE
HEALTHCARE SIMULATION
References
Bloom, B.S., Engelhart, M.D., Furst, E.J., Hill, W.H., & Krathwohl, D.R. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York: Longmans, Green.
Boulet, J.R., Jeffries, P.R., Hatala, R.A., Korndorffer, J.R., Feinstein, D.M., & Roche, J.P. (2011). Research regarding methods of assessing learning outcomes. Simulation in Healthcare, 6(Suppl), S48-S51.
Brett-Fleegler, M., Vinci, R., Weiner, D., Harris, S., Shih, M., & Kleinman, M. (2008). A simulator-based tool that assesses pediatric resident resuscitation competency. Pediatrics, 121(3), e597-e603.
Cook, D.A., & Beckman, T.J. (2006). Current concepts in validity and reliability for psychometric instruments: Theory and application. The American Journal of Medicine, 119, 166.e7-166.e16.
Dieckmann, P. (2009). Using simulations for education, training and research. Lengerich: Pabst. Cited in "Simulation is more than technology: The simulation setting."
Fanning, R.M., & Gaba, D.M. (2007). The role of debriefing in simulation-based learning. Simulation in Healthcare, 2(2), 115-125.
Gaba, D.M., et al. (1998). Assessment of clinical performance during simulated crises using both technical and behavioral ratings. Anesthesiology, 89(1), 8-18.
Guise, J.M., Deering, S.H., Kanki, B.G., Osterweil, P., Li, H., Mori, M., & Lowe, K. (2008). Validation of a tool to measure and promote clinical teamwork. Simulation in Healthcare, 3(4), 217-223.
Kirkpatrick, D.L. (1989). Evaluating training programs: The four levels (2nd ed.). San Francisco, CA: Berrett-Koehler.
Lasater, K. (2007). Clinical judgment development: Using simulation to create an assessment rubric. Journal of Nursing Education, 46(11), 496-503.
Malec, J.F., Torsher, L.C., Dunn, W.F., Wiegmann, D.A., Arnold, J.J., Brown, D.A., et al. (2007). The Mayo High Performance Teamwork Scale: Reliability and validity for evaluating key crew resource management skills. Simulation in Healthcare, 2(1), 4-10.
Miller, G.E. (1990). The assessment of clinical skills/competence/performance. Academic Medicine, 65(9), S63-S67.
Useful Resources
• Society for Simulation in Healthcare http://www.ssih.org/SSIH/ssih/Home/
• International Nursing Association for Clinical Simulation and Learning www.inacsl.org
• Simulation Innovation Resource Center (SIRC) http://sirc.nln.org
• CINHC www.cinhc.org
• California Simulation Alliance (CSA) http://www.cinhc.org/programs/simulation/
• Bay Area Simulation Collaborative (BASC) www.bayareanrc.org/rsc
• Simulation User Network http://simulation.laerdal.com/
• www.HealthySimulation.com
• http://www.behindthesimcurtain.com
Thank You
[email protected]