
Web Design Issues in a Business Establishment Panel Survey
Third International Conference on Establishment Surveys
(ICES-III)
June 18-21, 2007
Montréal, Québec, Canada
Kerry Levin & Jennifer O’Brien, Westat
Overview of Presentation
• A brief review of the web design
system and its origins
• Design issues we encountered
• Opportunities for experimental
investigation
Background
• The Advanced Technology Program (ATP) at the National
Institute of Standards and Technology (NIST) is a
partnership between government and private industry
to conduct high-risk research
• Since 1990, ATP’s Economic Assessment Office (EAO) has
performed rigorous and multifaceted evaluations to
assess the impact of the program and estimate the
returns to the taxpayer. One key feature of ATP’s
evaluation program is the Business Reporting System
(BRS).
General Description of the BRS
• Unique series of online reports that gather
regular data on indicators of business progress
and future economic impact of ATP projects
• ATP awardees must complete four BRS reports
per calendar year: three short quarterly
reports and one long annual report
General Description of the BRS
• There are several different types of instruments (each with a
profit and nonprofit version):
1. Baseline
2. Annual
3. Closeout
4. Quarterly
• The BRS instruments are hybrid survey/progress reports that
ask respondents attitudinal questions as well as items designed
to gather information on project progress.
• The Baseline, Annual, and Closeout reports are between 70 and
100 pages in length. Due to this length and complexity, web
administration is the most logical data collection mode.
Design issues: Online logic checks vs. back-end logic checks
1. Examples of online logic checks (i.e., hard edits)
• Sum checking
• Range checks
2. Examples of back-end logic checks
• Frequency reviews
• Evaluation of outliers
Online sum checking: Example
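The original slide is a screenshot that did not carry over. As a rough, hypothetical sketch of the idea: a sum-checking hard edit compares the component amounts a respondent enters against the total they report, and blocks progress on a mismatch. The field names and message wording below are invented for illustration, not the BRS's actual instrument.

```typescript
// Hypothetical hard edit: component amounts must sum to the reported total.
interface FundingEntry {
  label: string;   // e.g., "ATP funds", "Cost share"
  amount: number;  // dollars entered by the respondent
}

function checkSum(entries: FundingEntry[], reportedTotal: number): string | null {
  const computed = entries.reduce((sum, e) => sum + e.amount, 0);
  if (computed !== reportedTotal) {
    return `The amounts entered sum to ${computed}, but the total reported is ` +
           `${reportedTotal}. Please correct the entries before continuing.`;
  }
  return null; // no error: the respondent may proceed
}

// Example: a mismatch triggers the edit message.
const sumError = checkSum(
  [{ label: "ATP funds", amount: 600_000 }, { label: "Cost share", amount: 300_000 }],
  1_000_000,
);
console.log(sumError);
```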
Online range checking: Example
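Likewise for the range-checking example (also a screenshot): a range-checking hard edit is simply a bounds test on a numeric entry. The BRS's actual ranges and wording are not shown here; this is a minimal sketch.

```typescript
// Hypothetical hard edit: a numeric answer must fall within a preset range.
function checkRange(value: number, min: number, max: number): string | null {
  if (value < min || value > max) {
    return `Please enter a value between ${min} and ${max}.`;
  }
  return null; // value is acceptable
}

// Example: number of project employees, with an intentionally generous range.
console.log(checkRange(25, 0, 10_000));  // null (passes)
console.log(checkRange(-3, 0, 10_000));  // edit message
```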
Back-end checking: Frequency reviews and outlier evaluations
• At the close of each cycle of data collection, the
data for each instrument are carefully reviewed for
anomalies
• Frequency reviews are conducted to ensure that
there were no errors in skips in the online
instrument
• Although the BRS includes range checks for certain
variables, the ranges are sometimes quite large, so
an evaluation of outliers is a regular part of our
data review procedures; a sketch of both back-end
checks appears below
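A minimal sketch of what these back-end reviews could look like in code, assuming the collected data are available as plain key-value rows. The 1.5-IQR outlier rule is a common convention used here for illustration; it is not necessarily the BRS's actual criterion.

```typescript
type DataRow = Record<string, string | number | null>;

// Frequency review: tabulate a variable's values; unexpected non-null answers
// (or unexpected blanks) can reveal errors in the online skip logic.
function frequencies(rows: DataRow[], variable: string): Map<string, number> {
  const counts = new Map<string, number>();
  for (const row of rows) {
    const key = String(row[variable]);
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  return counts;
}

// Outlier evaluation: flag values far outside the interquartile range.
function flagOutliers(values: number[], k = 1.5): number[] {
  const sorted = [...values].sort((a, b) => a - b);
  const q = (p: number) => sorted[Math.floor(p * (sorted.length - 1))];
  const iqr = q(0.75) - q(0.25);
  return sorted.filter((v) => v < q(0.25) - k * iqr || v > q(0.75) + k * iqr);
}

console.log(flagOutliers([12, 15, 14, 13, 900])); // [900] flagged for review
```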
Use of pre-filled information in the BRS
The BRS instruments make use of two types of pre-filled
information:
1. Pre-filled information from sources external to the
instrument (i.e., information gathered in previous
instruments or information provided by ATP such as
issued patents)
2. Pre-filled information from sources internal to the
instrument (i.e., information provided by the
respondent in earlier sections of the current report)
Pre-filled information: External source example
Pre-filled information: Internal source example
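Both example slides are screenshots that did not carry over. As a hypothetical sketch of the mechanism, external pre-fills (prior instruments, ATP records such as issued patents) and internal pre-fills (answers from earlier sections of the same report) can be merged into a page's initial field values; every name below is invented.

```typescript
// Hypothetical data sources for pre-filling a BRS page.
const externalData = { companyName: "Acme Corp", patentsIssued: 3 };  // from ATP / prior instruments
const internalAnswers = { projectTitle: "Thin-film coatings" };       // from an earlier section

// Merge external and internal sources into the page's initial field values;
// the respondent reviews them and, where permitted, corrects them.
function prefillPage(fields: string[]): Record<string, unknown> {
  const sources = { ...externalData, ...internalAnswers };
  const initial: Record<string, unknown> = {};
  for (const f of fields) {
    if (f in sources) initial[f] = sources[f as keyof typeof sources];
  }
  return initial;
}

console.log(prefillPage(["companyName", "projectTitle", "otherField"]));
// { companyName: "Acme Corp", projectTitle: "Thin-film coatings" }
```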
Required items
While most items in the BRS instruments are not
required, the few that are fall into two categories:
1. Items required for accurate skips later in the
instrument
2. Items deemed critical by ATP staff
Required items: Example item important for skip pattern
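The example slide is a screenshot. To illustrate why such an item must be required: when an answer drives the skip pattern, the instrument cannot route the respondent until the item is answered. The question and routing below are hypothetical, not the BRS's actual items.

```typescript
// Hypothetical branching item: the answer determines which section follows,
// so the instrument cannot proceed while it is unanswered.
type YesNo = "yes" | "no" | null;

function nextSection(reportedNewProducts: YesNo): string {
  if (reportedNewProducts === null) {
    throw new Error("This item is required: it controls the skip pattern.");
  }
  // Skip the product-detail section entirely when the answer is "no".
  return reportedNewProducts === "yes" ? "product-details" : "project-status";
}

console.log(nextSection("yes")); // "product-details"
console.log(nextSection("no"));  // "project-status"
```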
Required items: Example item critical to ATP
Unique Design: Financial items
Administration issues in the BRS: Multiple respondents
• Each ATP-funded project has multiple contacts
associated with it
• It is rarely the case that a single respondent can
answer all items in the survey. Because Westat
provides only one access ID per report, however,
respondents are responsible for managing who at
their organizations is given access to the BRS
online system
Experimental investigations using the BRS
Reducing item nonresponse: The Applicant Survey
• The ATP’s Applicant Survey is not one of the BRS
instruments, but is regularly administered via the web
to companies and organizations that applied for ATP
funding
• In 2006, Westat embedded an experiment within the
Applicant Survey to test which of two different types of
nonresponse prompting would result in reduced item
nonresponse
Reducing item nonresponse: The Applicant Survey
904 respondents were randomly assigned to one of three
conditions:
1) Prompt for item nonresponse appeared
(if applicable) at the end of the survey;
2) Prompt for item nonresponse appeared
(if applicable) after each section;
3) No prompt (control group).
Reducing item nonresponse: The Applicant Survey
(Screenshots: the prompt as shown at the end of the survey, and as shown after each section.)
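A minimal sketch of how both prompt conditions might share one implementation, assuming answers are stored per section: scanning the whole survey yields the end-of-survey prompt, while scanning a single section yields the per-section prompt. The data layout and item names are assumptions.

```typescript
// Hypothetical store: section name -> item id -> answer (null = unanswered).
type Answers = Map<string, Map<string, string | null>>;

// List unanswered items, either survey-wide or within a single section.
function missingItems(answers: Answers, section?: string): string[] {
  const sections = section ? [section] : [...answers.keys()];
  return sections.flatMap((s) =>
    [...(answers.get(s) ?? new Map()).entries()]
      .filter(([, value]) => value === null)
      .map(([id]) => `${s}.${id}`)
  );
}

const answers: Answers = new Map([
  ["Funding", new Map([["q1", "yes"], ["q2", null]])],
  ["Staffing", new Map([["q3", null]])],
]);
console.log(missingItems(answers));            // end-of-survey prompt: ["Funding.q2", "Staffing.q3"]
console.log(missingItems(answers, "Funding")); // per-section prompt:   ["Funding.q2"]
```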
Reducing item nonresponse: The Applicant Survey
Both prompts for item nonresponse appeared effective, and
to an equal degree.
Group                       Completed surveys with   Mean missing items per
                            missing data (%)         completed survey
Prompt at end of survey           23.0                     0.9
Prompt after each section         23.3                     1.1
No prompt (control)               39.8                     2.2
                                  p < .02                  p < .008
Boosting response rates: The days of the week experiment
• Literature suggests that there are optimal call
times for telephone surveys. But are there also
optimal days of the week to email survey
communications?
• Optimal day to email was measured by:
• The overall response rate
• The time it takes to respond
Boosting response rates: The days of the week experiment
• Three different experimental conditions:
1) Monday cohort
2) Wednesday cohort
3) Friday cohort
• The invitation email and up to 3
reminders were all sent on the same day,
either Monday, Wednesday, or Friday.
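The scheduling itself is simple to implement. As a sketch under the design above (not the study's actual code), each cohort's invitation and reminders are queued for the next occurrence of its assigned weekday:

```typescript
// Next occurrence of a given weekday (0 = Sunday ... 6 = Saturday), always at
// least one day ahead, so every mailing lands on the cohort's assigned day.
function nextSendDate(from: Date, weekday: number): Date {
  const d = new Date(from);
  const daysAhead = (weekday - d.getDay() + 7) % 7 || 7;
  d.setDate(d.getDate() + daysAhead);
  return d;
}

const FRIDAY = 5;
// Example: queue the Friday cohort's next reminder from Monday, June 18, 2007.
console.log(nextSendDate(new Date(2007, 5, 18), FRIDAY).toDateString()); // Fri Jun 22 2007
```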
Boosting response rates: The days of the week experiment
Experimental group   Total eligible   Total completes   Response rate (%)
Monday                    161              141               87.6
Wednesday                 159              140               88.1
Friday                    152              141               92.8
Time to Complete the Survey
Cumulative response rates (%) by mailing:

Group        Email 1   Email 2   Email 3   Email 4
Monday        14.9      32.3      65.2      87.6
Wednesday     15.7      37.7      67.3      88.1
Friday        18.4      38.8      67.1      92.8
Boosting response rates: The days of the week experiment
• The Friday cohort trends toward a higher response
rate, but all cohorts require the same amount of
effort to achieve their respective response rates
• Overall, there is some evidence that the day of
the week does matter
Conclusion
• The BRS has presented us with various design
and administration challenges
• We have had the chance to fine-tune and
address a variety of issues that have come
to our attention
• As researchers encounter new issues in the
administration of web surveys, the BRS offers a
place to study them