Standardizing Learner Surveys Across the Enterprise
Francis Kwakwa, MA, Radiological Society of North America
Valerie Smothers, MA, MedBiquitous
Disclosure
We have no financial relationships to disclose.
Objectives
At the completion of this session, you will be able to:
• Adopt strategies to improve the collection of consistent evaluation data from learners
• Adopt strategies to improve the analysis of evaluation data across the CME enterprise
Overview
1. Challenges in analyzing learner surveys
2. MedBiquitous and MEMS
3. RSNA's implementation of a standardized survey
4. Results of RSNA course evaluation
5. Challenges faced by RSNA
6. Key strategies for improving data collection and analysis
Challenges in Analyzing Learner Surveys
• Most of us use surveys
• Surveys often differ based on activity
• Survey data may be in different systems or formats
• The result: it's hard to analyze results across activities
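The "different systems or formats" problem can be made concrete with a small sketch: two hypothetical survey exports use different field names and response scales, so cross-activity analysis first requires mapping them onto one comparable scale. All system names, fields, and data below are invented for illustration.

```python
# Hypothetical exports from two survey systems. System A stores numeric
# ratings 1-5 under "rating"; System B stores text labels under "response".
system_a = [{"activity": "Course 1", "rating": 5},
            {"activity": "Course 1", "rating": 4}]
system_b = [{"course": "Course 2", "response": "Agree"},
            {"course": "Course 2", "response": "Neutral"}]

# Map text labels onto the same 1-5 scale System A already uses.
LABEL_TO_SCORE = {"Strongly Disagree": 1, "Disagree": 2, "Neutral": 3,
                  "Agree": 4, "Strongly Agree": 5}

def normalize(records, activity_key, value_key, mapping=None):
    """Convert one system's export into (activity, 1-5 score) pairs."""
    out = []
    for rec in records:
        value = rec[value_key]
        score = mapping[value] if mapping else value
        out.append((rec[activity_key], score))
    return out

# After normalization, scores are comparable across both activities.
combined = (normalize(system_a, "activity", "rating")
            + normalize(system_b, "course", "response", LABEL_TO_SCORE))
```

Every new activity or system multiplies this hand-written mapping work, which is the motivation for a shared standard like MEMS.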
RSNA
• Radiological Society of North America: "to promote and develop the highest standards of radiology and related sciences through education and research"
• Over 40,000 members
• Online and in-person CME activities
• Member of MedBiquitous
• Francis Kwakwa, Chair of the MedBiquitous Metrics Working Group

MedBiquitous
• Technology standards developer for healthcare education
• ANSI accredited
• Develops open XML standards
• 60 members (societies, universities, government, industry)
• 7 working groups

The Focus on Metrics
Medical Education Metrics – MEMS
"Without the creation of a standard data set for reporting CME program outcomes … it is difficult to obtain consistent metrics of those outcomes. And if you can't measure it, you can't improve it."
--Ross Martin, MD, Director, Healthcare Informatics Group, Pfizer
Another Perspective
"I need this to better understand how my program as a whole is doing."
--Nancy Davis, American Academy of Family Physicians
The MedBiquitous Metrics Working Group
• Mission: to develop XML standards … for the exchange of aggregate evaluation data and other key metrics for health professions education
• Originally a subcommittee of the Education Working Group
• Became a working group in April 2005
"We're all using the same measuring stick…"
--Francis
Who is Involved?
• Francis Kwakwa, RSNA, Chair
• Linda Casebeer, Outcomes Inc.
• Nancy Davis, AAFP
• Michael Fordis, Baylor College of Medicine
• Stuart Gilman, Department of Veterans Affairs
• Edward Kennedy, ACCME *
• Jack Kues, University of Cincinnati
• Tao Le, Johns Hopkins University
• Ross Martin, Pfizer
• Jackie Mayhew, Pfizer
• Mellie Pouwels, RSNA
• Andy Rabin, CE City
• Donna Schoonover, Department of Veterans Affairs
* Invited experts
What’s in MEMS

Participation Metrics


Learner Demographics


name, type
Participant Activity Evaluation


profession, specialty
Activity Description


how many participants
survey results
Other types of evaluation metrics planned for future
versions
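As a rough illustration of how the four MEMS data categories above might be assembled into a single XML document, the sketch below uses Python's standard library. The element names here are illustrative only, not the actual MEMS schema; consult the MedBiquitous specification for the real element names and structure.

```python
import xml.etree.ElementTree as ET

# Illustrative MEMS-style document: one element per MEMS data category.
# Element names are invented for this sketch, not taken from the spec.
metrics = ET.Element("MedicalEducationMetrics")

# Activity Description: name, type
activity = ET.SubElement(metrics, "ActivityDescription")
ET.SubElement(activity, "Name").text = "Refresher Course RSP-2904"
ET.SubElement(activity, "Type").text = "online"

# Participation Metrics: how many participants
participation = ET.SubElement(metrics, "ParticipationMetrics")
ET.SubElement(participation, "ParticipantCount").text = "43"

# Learner Demographics: profession, specialty
demographics = ET.SubElement(metrics, "LearnerDemographics")
ET.SubElement(demographics, "Profession").text = "Physician"
ET.SubElement(demographics, "Specialty").text = "Radiology"

# Participant Activity Evaluation: survey results
evaluation = ET.SubElement(metrics, "ParticipantActivityEvaluation")
item = ET.SubElement(evaluation, "SurveyItem",
                     question="The course achieved its learning objectives")
ET.SubElement(item, "Response", label="Strongly Agree").text = "12"

xml_text = ET.tostring(metrics, encoding="unicode")
```

Because every provider emits the same structure, aggregate data from many activities or organizations can be parsed and compared with one tool.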
For more information:
• Metrics Working Group page: http://www.medbiq.org/working_groups/metrics/index.html
• MedBiquitous website: http://www.medbiq.org
Discussion
Describe the learner surveys that you are using and how they differ from or are similar to the survey described. What are the benefits or drawbacks of using a standardized survey?
RSNA's Project…
• Adoption of the MEMS survey instrument coincided with implementation of a new Learning Management System
• Currently MEMS is used to evaluate over 300 online courses
RSNA's Project…
Types of online courses using MEMS:
• Cases of the Day (COD)
• RadioGraphics CME Tests/Education Exhibits (EE)
• Refresher Courses (RSP)
Results…
Evaluation results were charted for three online courses: COD-45 (N = 24), EE-355 (N = 32), and RSP-2904 (N = 43).
[Bar charts: percentage of responses, Strongly Agree through Strongly Disagree, per course, for each of the following statements]
• The course achieved its learning objectives
• The course was relevant to my clinical learning needs
• The course was relevant to my personal learning needs
• The online method of instruction was conducive to learning
• The course validated my current practice
• I plan to change my practice based on what I learned in the course
• The faculty provided sufficient evidence to support the content presented
Was the course free of commercial bias towards a particular product or company?
• COD-45 (N = 24): Yes 100%
• EE-355 (N = 32): Yes 100%
• RSP-2904 (N = 43): Yes 98%, No 2%

Did the course present a balanced view of clinical options?
• COD-45 (N = 24): Yes 96%, No 4%
• EE-355 (N = 32): Yes 100%
• RSP-2904 (N = 43): Yes 98%, No 2%
Group Discussion
What challenges to survey data collection and analysis have you faced?
Challenges Faced by RSNA
• Survey is optional; little data available for some courses
• Little variation in the data
• Some disconnect with educators on how the data is used
• Difficult to get data out of the LMS
• Surveys for live events are not included
Key Strategies
• Data Collection
  - Common core set of survey questions
  - Common format for evaluation data
• Data Analysis
  - Compare within type and modality
  - Compare across type and modality
  - Look for trends and variation
  - Look for red flags
An Added Benefit
• Assists with the program analysis and improvement required by the ACCME:
"The provider gathers data or information and conducts a program-based analysis on the degree to which the CME mission of the provider has been met through the conduct of CME activities/educational interventions."
--ACCME Updated Accreditation Criteria, September 2006