Basics of Survey & Scale Design
Chan Kulatunga-Moruzi, PhD
Department of Family Medicine
McMaster University
Agenda
Presentation:
 Overview of scales and surveys
Survey and scale – similarities
Survey and scale – differences
 Guidelines to scale and survey construction
Group work:
 Identify common mistakes in survey questions
Surveys & Scales: Similarities
 Tools of research
 Usually ask a series of questions
 Gather data pertaining to central construct
 May contain sub-constructs
Survey: Construct – Perception of PAs
Scale: Construct – Professional Burnout
Surveys & Scales: Similarities
 Often use rating scales
Likert-type, semantic differential
 Often self-administered
 Often based on self-report
 Similar issues/problems
Social desirability bias – can jeopardize validity
Scale: Description
 Also known as an index or inventory
 Responses: categorical, but more often a rating scale
 Combine an individual’s responses into one meaningful
number (interval level) – see the scoring sketch below
– Eating Disorders Inventory
– Quality of Life Index
– Minnesota Multiphasic Personality Inventory
– Suicidal Ideation Scale
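
To make "combine to one meaningful number" concrete, here is a minimal sketch (not from the presentation; the item names and the reverse-keyed item are hypothetical) of how a Likert-type scale sums one respondent's ratings into a single interval-level score:

    # Minimal sketch: scoring a 5-item Likert-type scale (ratings 1-5).
    # Item names and the reverse-keyed item are hypothetical examples.
    MAX_RATING = 5  # 1 = strongly disagree ... 5 = strongly agree

    responses = {"q1": 4, "q2": 2, "q3": 5, "q4": 1, "q5": 3}
    reverse_keyed = {"q4"}  # negatively worded item, scored in reverse

    def score_scale(responses, reverse_keyed):
        """Sum item ratings into one interval-level total score."""
        total = 0
        for item, rating in responses.items():
            if item in reverse_keyed:
                rating = (MAX_RATING + 1) - rating  # 1<->5, 2<->4, 3<->3
            total += rating
        return total

    print(score_scale(responses, reverse_keyed))  # 19 for this respondent

Reverse-keying negatively worded items before summing is what keeps the total interpretable in a single direction.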
Scale: Function
 Used to describe a population/construct
 Overall scores or sub-scores used to:
– Make inferences
– Identify, describe, and compare
– Make decisions (e.g. treatment)
– Guide further research
Scale: Construction
 Knowledge of construct
Depression: DSM/ICD-9 symptom criteria, differentials
 Knowledge of psychometrics
Reliability: test-retest, internal consistency,
item discrimination
Validity: construct, criterion (concurrent/predictive)
 Reliability sets the upper limit of validity
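
As an illustration of internal consistency, below is a small sketch (toy numbers, not data from the presentation) that computes Cronbach's alpha from an item-by-respondent matrix. The last point above follows from classical test theory: a validity coefficient cannot exceed the square root of the reliability, so a low alpha caps how valid the scale can be.

    # Sketch: Cronbach's alpha (internal consistency) for a k-item scale.
    # alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    # Toy data: 3 items x 4 respondents, ratings on a 1-5 scale.
    from statistics import pvariance

    items = [
        [4, 5, 3, 4],  # item 1, one rating per respondent
        [3, 5, 2, 4],  # item 2
        [4, 4, 3, 5],  # item 3
    ]

    k = len(items)
    totals = [sum(col) for col in zip(*items)]        # per-respondent totals
    sum_item_var = sum(pvariance(item) for item in items)
    alpha = (k / (k - 1)) * (1 - sum_item_var / pvariance(totals))
    print(round(alpha, 2))  # 0.86 for these toy ratings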
Scale: Construction
 Research to find existing measurement scale(s)
 Use previously validated scale
 Amend previously validated scale to suit your
needs
Survey: Description
 Response format: a mixture is preferred (see the sketch below)
Rating scale – Likert/semantic differential
Multiple choice – categorical
Rank order
Open-ended
 Do not combine an individual’s data to produce one
meaningful number
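
As a hypothetical illustration of the mixed formats above, the four response types could be represented like this when assembling an instrument (all stems and options are invented for the example):

    # Sketch: the four response formats above as hypothetical survey items.
    survey = [
        {"format": "rating_scale",  # Likert-type, 1-5
         "stem": "Rate your overall satisfaction with the clinic.",
         "options": [1, 2, 3, 4, 5]},
        {"format": "multiple_choice",  # categorical
         "stem": "What is your current role?",
         "options": ["Student", "Resident", "Faculty"]},
        {"format": "rank_order",
         "stem": "Rank these services by importance.",
         "options": ["Triage", "Follow-up", "Referral"]},
        {"format": "open_ended",
         "stem": "What else would you like us to know?"},
    ]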
Survey: Function
 Often used simply to describe a population
 Used to inform policy/administration
 Used for program evaluation
 Individual questions may be used to make
inferences and compare cohorts/populations
Survey: Construction
 Requires some knowledge of construct
 May be exploratory to learn about the construct
 Reliability & Validity assumed:
– by securing a representative sample
– by asking well-written questions
– by using well-constructed response options
– by sound analyses
Survey & Scale Development
 Broad general topic
 Narrow down focus
– Identify research question(s)
– Operationalize/define concepts
 Objective: What is it that you want to know?
 Can you state your objective clearly and succinctly?
 What information is necessary to meet objective?
 Start with the end in mind
Survey & Scale Development
 Each question addresses research question
 Each question relevant to objectives
 Limited time/Survey fatigue
 Anticipate results you might receive
 Think about how you might analyze the data
 Will help you construct better questions
 Will help you use the best question formats
Survey & Scale Development
 Keep your respondents in mind
 Who will complete your survey?
representative sample
 Are respondents able to understand the questions?
 Are respondents able to answer the questions?
 How can you make it easy to complete?
 Are questions relevant to all respondents?
Question Design: “BOSS”
Be BRIEF
 Keep questions short and to the point
 Avoid long lists of response alternatives to choose
from or to rank order
 Take time to edit for
– meaning
– visual clutter
Question Design: “BOSS”
Be OBJECTIVE
 Ensure questions are neutral
Avoid leading questions
Avoid built in assumptions
Avoid loaded questions
 Be cognizant of the possible impact of words
chosen and question phrasing/framing
Question Design: “BOSS”
Be SIMPLE
 Use simple language
 Avoid jargon and technical terminology
 Avoid double-barreled questions
Question Design: “BOSS”
Be SPECIFIC
 Avoid broad questions
 May be interpreted differently by respondents
 May need to define/specify what you mean
Group Work: 4 Cases
 Identify any problems you see with the item
 Rewrite the items to address the problems
 Pay attention to the stem & response options
 Is there a better way to ask the question to meet
the objectives of the research?
Case 1: Age
Stem provides no context for the question
Two alternative wordings:
a. To which age category do you belong? (nominal level)
b. How old are you? / What is your date of birth?
(interval level)
Trade-offs:
a. Easy to fill in, may increase response rate for a
personal question
b. Enables better analysis, with the option to group ages later
Case 1: Age
Problems with response options:
 inconsistent formatting – words vs. hyphens
 not exhaustive – no option for older/younger students
 not mutually exclusive – 16 appears in two options
 intervals not equal – 3 vs. 4 years
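
For illustration, a repaired set of options would be consistently formatted, mutually exclusive, exhaustive, and equal-width; the sketch below (boundaries are hypothetical, not the slide's actual categories) shows one way to check that every age lands in exactly one option:

    # Sketch: well-formed age categories -- mutually exclusive, exhaustive,
    # equal-width (5 years), consistently formatted. Boundaries hypothetical.
    AGE_BINS = [(10, 14), (15, 19), (20, 24), (25, 29)]  # inclusive ranges

    def age_category(age):
        """Return the single category an age falls into, or a catch-all."""
        for low, high in AGE_BINS:
            if low <= age <= high:
                return f"{low}-{high}"
        # catch-alls keep the options exhaustive for all respondents
        return "30 or older" if age > AGE_BINS[-1][1] else "9 or younger"

    assert age_category(16) == "15-19"  # 16 belongs to exactly one option
    print(age_category(22))             # "20-24"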
Case 2: Communication Skills
 Language used in the question
Vague, wordy; uses jargon/too-advanced vocabulary
 Leading question
 Researcher assumptions
“metamorphosized over the duration of…”
 Expects students to be able to remember and
accurately report back from the beginning
Case 2: Communication Skills
 How might the researcher better meet his
objectives?
 Students rate their own communication skills after
each patient encounter throughout the year
 SPs rate students’ communication skills after each
patient encounter throughout the year
 Videotape students throughout the year and ask a
blinded expert to rate their communication skills
Case 3: Engagement & Learning Outcomes
 Vague stem
Which of these activities do you engage in?
What do we mean by “engage in”?
 Dichotomous response options (yes/no)
reduce variability, and thus reliability and validity
 Scaled responses (5–7 points)
increase variability, reliability, and validity (see the
sketch below)
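
To see why dichotomizing costs variability, here is a toy sketch (illustrative numbers, not study data) that collapses 5-point ratings into yes/no and compares the spread:

    # Sketch: collapsing 5-point ratings to yes/no discards variability.
    from statistics import pvariance

    ratings = [1, 2, 2, 3, 4, 4, 5, 5]               # 5-point responses
    yes_no = [1 if r >= 4 else 0 for r in ratings]   # dichotomized: "agree?"

    print(pvariance(ratings))  # 1.9375 -- spread across five points
    print(pvariance(yes_no))   # 0.25 -- the maximum a binary item can show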
Case 3: Engagement & Learning Outcomes
 Inconsistent pronouns (you/I)
 Double-barreled questions (class & office hours)
 Improper punctuation (?)
Case 4: Diversity & Barriers to Higher Ed
 Loaded question
 Researcher’s assumptions
 Leading question
 Double barreled question
 Response options (odd vs. even number of points)