Evaluating the Effectiveness of Two Strategies to Improve
Telephone Survey Response Rates of Employers

Jeremy Pickreign¹ and Heidi Whitmore¹

¹ NORC at the University of Chicago, 4350 East-West Hwy, Bethesda, MD 20814
Abstract
The response rate of the California Employer Health Benefits Survey, sponsored by the
California HealthCare Foundation, has hovered between 35 percent and 40 percent since 2004,
with a 2010 response rate of 40 percent. The 2010 response rate was 24 percent for non-panel firms and
27 percent for firms with 3-49 workers. This study examines two strategies for improving the
response rate among these smallest non-panel firms via a telephone survey: 1) mailing a
personalized advance letter and 2) offering financial incentives. We pre-called 1,024 non-panel
firms with 3-49 workers for the 2011 survey and sent a personalized advance letter to 513 firms
successfully contacted. Simultaneously, we randomly assigned these 1,024 firms to three
incentive groups: firms sent a $20 incentive with the initial mailing; firms promised $20 upon
completion of the survey; and a control group receiving no incentive. Results find that firms sent
a personalized advance letter have a significantly higher response rate than those sent a generic
advance letter (31.0 percent vs. 18.3 percent, p<0.001). Firms sent a financial incentive with the
initial mailing (22.0 percent vs. 28.1 percent, p=0.209) or promised $20 upon completion of
the survey (30.0 percent vs. 28.1 percent, p=0.707) did not have significantly different response
rates compared to firms receiving no incentive. This lack of significance is further supported
via logistic regression analysis. Sending a personalized advance letter has a significant impact
on improving the overall response rate while offering incentives does not.
Key Words: Incentives, personalized advance letter, response rate, telephone survey
1. Introduction
Low response rates in establishment surveys are an issue commonly encountered by researchers.
Dillman (2000) references an unpublished study by Paxson (1992) that calculated an average
establishment response rate among 183 business surveys of 21 percent. Compounding this issue
is that little clear guidance exists in the literature to guide researchers towards useful techniques
for improving response rates in establishment surveys. A meta-analysis by Roth and BeVier
(1998) highlights common approaches for researchers that are associated with higher response
rates, including: 1) advance notice to prospective participants; 2) follow-up reminders; 3)
personalized communication with prospective participants; and 4) saliency of the topic. For their
study, they note the shortage of literature pertaining to establishment surveys but suggest that
each technique may be a useful tool for improving response rates in establishment surveys as
well.
Many surveys also rely on incentives as a method to improve response rates. Research literature
on the use of incentives in household surveys is broad and deep (Church, 1993; Roth & BeVier,
1998; Ryu, Couper, & Marans, 2005; Singer, 2002; Willimack, Schuman, Pennell, & Lepkowski,
1995). More recent literature regarding the use of incentives in establishment surveys shows
mixed results. There is a notable lack of studies of the impact of monetary incentives on
response rates in establishment surveys (Moore & Ollinger, 2007).
Cycyota and Harrison (2002) focused on four techniques (advance notice; follow-up reminders;
personalized communication; and incentive use) and conducted a randomized experiment on a
survey of business executives to determine the effectiveness of each technique. They concluded
that none of these techniques were effective in improving response rates. Likewise, Biemer,
Ellis, Pitts, and Robbins (2007) conducted a randomized experiment on a large survey of
employers testing the effect of a $20 money order at the point of contact. They, too, found no
effect on response rates and also found no net savings in the cost of conducting the survey
related to the implementation of an incentive. In a randomized experiment on farm
establishments, Ott and Beckler (2007) found that prepaid incentives performed better than
promised incentives, and that non-monetary incentives (such as a gift) were ineffective. Finally,
Moore and Ollinger (2007) in a randomized experiment on an establishment survey concluded
that a mixed mode survey approach combined with financial incentives was effective at
improving response rates.
Cook, LeBaron, Flicker, and Flanigan (2009), in their review of published literature on the use of
incentives in establishment surveys, highlight four key questions regarding the use of incentives
in establishment surveys:
• Should monetary or non-monetary incentives be used?
• Should the incentive be prepaid or promised?
• Should the incentive be aimed toward the individual respondent or the establishment?
• What is the impact of the incentives on survey cost?
They concluded that the research found conflicting results on the effects of non-monetary
incentives on response rates, and that prepaid incentives perform better than promised incentives
at improving response rates. As for whether the incentive should target the individual or the
establishment, the decision should be made with respect to characteristics of the establishment
such as the size of the establishment, the presence of gatekeepers (e.g., receptionists) within the
company, the policies of the establishment, and the motivation of the key respondent. The use of
incentives does present murky ethical issues in that some companies might have rules regarding
the receipt of money or gifts while others might not. Regardless, Dillman, Smyth, and Christian
(2009) suggest that the smaller the business, the more likely it is that an incentive would be
accepted. Finally, Cook et al. (2009) conclude that the size of the incentive relative to the cost of
follow-up could produce a net savings to the survey.
This paper presents results from a randomized experiment of firms on the impact of prepaid and
promised monetary incentives on survey response rates, as well as the impact of mailing a
personalized advance letter. We hypothesize that:
1) Respondents receiving a financial incentive will have a higher response rate than
respondents not receiving a financial incentive;
2) Respondents receiving a prepaid financial incentive will have a higher response rate than
respondents receiving a promise of a financial reward; and
3) Respondents sent a personalized advance letter will have a higher response rate than
those sent a generic advance letter.
Finally, we explore what effect the personalized advance letter has when combined with prepaid
or promised financial incentives.
2. Methods
The California Employer Health Benefits Survey (CEHBS), sponsored by the California
HealthCare Foundation, has been conducted annually since 1999. The sample of firms¹ with
three or more employees working within the State of California was systematically selected
following a stratified design. The strata were defined by five size categories and eight industry
groups.² Firms with ten or more employees that responded to the survey in either of the past two
years remained in the sample and were supplemented by newly selected firms with three or
more workers so that the sample design was satisfied. The sample was provided by Survey
Sampling International from a listing of firms created by Dun and Bradstreet. The survey was
conducted from July through October 2011 via Computer Assisted Telephone Interview (CATI)
by National Research, LLC. Since 2004, the final response rate for this survey has hovered
between 35 percent and 40 percent. The final response rate in 2010 was 40 percent. The
response rate for non-panel firms and firms with 3-49 workers was 24 percent and 27 percent,
respectively. The final response rate in 2011 was 36 percent.
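The stratified systematic selection described above can be sketched as follows. The firm names, strata, and per-stratum allocation below are hypothetical, and the random-start interval arithmetic is one common way to implement systematic sampling within strata, not necessarily the exact procedure used by Survey Sampling International.

```python
import random

def systematic_sample(frame, k):
    """Systematic selection: random start, then every (N/k)-th unit
    of the ordered frame."""
    interval = len(frame) / k
    start = random.uniform(0, interval)
    return [frame[int(start + i * interval)] for i in range(k)]

def stratified_systematic(frames_by_stratum, allocation):
    """Draw a systematic sample independently within each stratum
    (here, strata are size-category x industry-group cells)."""
    selected = []
    for stratum, k in allocation.items():
        selected.extend(systematic_sample(frames_by_stratum[stratum], k))
    return selected

# Hypothetical frame: two of the 40 size-by-industry strata
random.seed(2011)
frames = {("3-9", "Retail"): [f"firm{i}" for i in range(100)],
          ("10-24", "Retail"): [f"firm{i}" for i in range(100, 160)]}
sample = stratified_systematic(frames, {("3-9", "Retail"): 10,
                                        ("10-24", "Retail"): 6})
print(len(sample))  # 16 firms selected across the two strata
```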
We have been employing multiple strategies in an effort to improve or maintain response rates.
As described above, the survey sample has a panel component which tends to improve response
rates. The panel uses a minimum threshold of ten employees as our experience has shown that
firms with fewer than ten employees have lower completion rates than larger firms and thus have
a greater chance of bias. Newly sampled firms with 3-49 workers are “pre-called” to obtain
accurate contact information prior to fielding the survey and to increase the likelihood of contact
with the person most knowledgeable about the survey topic (personalized communication and
saliency). An advance letter is sent to each respondent prior to call attempts (advance notice).
Letters to firms that are neither part of the panel nor pre-called are addressed to “employee
benefits manager.” Numerous call attempts are made to each firm during the field period
(follow-up). Responding firms are offered a copy of the results to use as a benchmark against
their health benefits (non-monetary promised incentive). Financial incentives have not been
offered in the past.
¹ An establishment has one physical location while a firm is comprised of multiple
establishments. That is, the firm is the business concern created by a group of establishments.
² Firm sizes are: 3-9 workers; 10-24 workers; 25-199 workers; 200-999 workers; and 1,000+
workers. Industry groups are: Agriculture/Mining/Construction; Manufacturing;
Communications/Transportation/Utilities; Wholesale; Retail; Finance; Services (excluding
Health and Government); and Health Services.
Following the sample selection process of the 2011 CEHBS, we identified 1,024 newly sampled
firms with 3-49 workers. These firms were randomly assigned to one of three analysis groups:
1) receiving a monetary incentive with the initial mailing; 2) receiving a promise of a monetary
reward upon completion of the survey; and 3) the control group with no monetary reward. We
randomly assigned 250 firms to Group 1 and 250 firms to Group 2. The remaining 524 were
assigned to Group 3. The monetary incentive for Groups 1 and 2 was $20 (paid in cash for the
prepaid group). Next, we pre-called all 1,024 newly sampled firms and successfully
contacted 513.³ From these successful contacts, we identified the best point of contact and
verified the firm’s address and telephone number, enabling us to send a personalized advance
letter.
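The 250/250/524 random assignment can be sketched as below; the function name and seed are ours, chosen only to make the example reproducible, and a simple shuffle-and-slice is one standard way to produce such an allocation.

```python
import random

def assign_incentive_groups(firm_ids, seed=2011):
    """Randomly allocate firms: 250 prepaid, 250 promised,
    and the remainder to the no-incentive control group."""
    rng = random.Random(seed)
    shuffled = list(firm_ids)
    rng.shuffle(shuffled)
    return {"prepaid": shuffled[:250],
            "promised": shuffled[250:500],
            "control": shuffled[500:]}

firms = list(range(1024))
groups = assign_incentive_groups(firms)
print(len(groups["prepaid"]), len(groups["promised"]), len(groups["control"]))
# 250 250 524
```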
Advance letters were sent to all prospective survey participants. The letter was modified as
needed for those receiving the prepaid monetary incentive and for those receiving the promise of
a monetary reward upon completion of the survey. During the fielding period 27 advance letters
with a prepaid monetary incentive and 22 advance letters with the promise of a monetary reward
were returned, with the vast majority of returns being undeliverable. All firms with returned
advance letters were reassigned to the control group. Table 1 shows the final sample counts for
each of the six analysis groups created by the two assignments.
Table 1: Analysis Group Sample Size (Firms with 3-49 Workers)

                                  Sent Personalized   Sent Generic
                                  Advance Letter      Advance Letter   Total
Prepaid Monetary Incentive               109                114          223
Promised Monetary Reward                 129                 99          228
No Incentive / Control Group             275                298          573
Total                                    513                511        1,024
3. Results
Table 2 highlights the marginal response rates for the two incentive approaches and the
personalized advance letter. The marginal response rate for firms receiving a prepaid monetary
incentive was 22.0% while the marginal response rate for firms receiving the promise of a
monetary reward was 30.0%. The marginal response rate for the control group was 28.1%.
Neither method was significantly different from the control group, nor were the two methods
significantly different from one another. The marginal response rate for firms receiving a
personalized advance letter, however, was significantly greater than those receiving a generic
advance letter (31.0% versus 18.3%).
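These significance tests can be approximated from the published rates and standard errors alone. The sketch below assumes the two estimates are independent and approximately normal; it is not necessarily the authors' exact test procedure.

```python
from math import sqrt, erf

def two_rate_z_test(p1, se1, p2, se2):
    """Two-sided z-test for a difference of two estimated rates,
    using each estimate's reported standard error."""
    z = (p1 - p2) / sqrt(se1**2 + se2**2)
    # Normal CDF via erf: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Personalized vs. generic advance letter (rates and SEs from Table 2)
z, p = two_rate_z_test(0.310, 0.024, 0.183, 0.031)
print(round(z, 2), round(p, 4))  # z near 3.2, p near 0.001, consistent with the reported p = 0.0013
```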
We next examined response rates by incentive group separately for firms that did and did not
receive a personalized advance letter. The response rates for the three monetary incentive groups were nearly identical
among those firms receiving a personalized advance letter. However, for firms not receiving a
personalized advance letter, the response rate for firms that received the prepaid monetary
incentive (6.4%) was significantly smaller than both the control group (19.9%) and the group
receiving a promise of a monetary reward (26.9%).
³ The firms effectively self-selected into one of the two pre-call groups.
Table 2: Response Rate by Incentive Group and by Advance Letter Group

                                     Sample Size   Response Rate (SE)
All Firms                                1,024       27.3% (2.0%)
  Prepaid Monetary Incentive               223       22.0% (4.1%)
  Promised Monetary Reward                 228       30.0% (4.2%)
  No Incentive / Control Group             573       28.1% (2.6%)
Sent Personalized Advance Letter           513       31.0% (2.4%)
  Prepaid Monetary Incentive               109       29.1% (5.3%)
  Promised Monetary Reward                 129       30.8% (4.8%)
  No Incentive / Control Group             275       31.8% (3.3%)
Sent Generic Advance Letter                511       18.3% (3.1%)
  Prepaid Monetary Incentive               114        6.4% (4.4%)
  Promised Monetary Reward                  99       26.9% (8.7%)
  No Incentive / Control Group             298       19.9% (4.1%)
Neither incentive had an impact on the completion counts (Table 3). Assuming a response
rate comparable to the control group, we estimate that the prepaid monetary incentive produced
28 percent fewer completes (23 responses vs. 32 expected) than if no incentive had been
offered. In contrast, the promised monetary reward produced nine percent more completes (36
responses vs. 33 expected). The personalized advance letter, on the other hand, had a large
impact on the completion counts: firms sent a personalized letter produced three times more
completes (114 responses vs. 28 expected) than if they had been sent a generic advance letter.
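Table 3's "expected completion count" column appears to scale the baseline group's observed completes by relative sample size; the sketch below reproduces the published counts under that assumption (our inference of the computation, not a documented formula from the paper).

```python
def expected_completes(baseline_completes, baseline_n, group_n):
    """Expected completes for a group had it performed like the baseline:
    scale the baseline group's completes by relative sample size."""
    return round(baseline_completes * group_n / baseline_n)

# Incentive groups vs. the no-incentive control (83 completes of 573)
print(expected_completes(83, 573, 223))  # prepaid: 32 (23 observed)
print(expected_completes(83, 573, 228))  # promised: 33 (36 observed)
# Letter groups vs. the generic-letter baseline (28 completes of 511)
print(expected_completes(28, 511, 513))  # personalized: 28 (114 observed)
```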
Table 3: Impact of Incentives and Advance Letter on Completion Counts

                                                              Expected        Percentage
                                      Sample  Response  Completion  Count with  Improvement
                                       Size     Rate      Count    No Incentives  Due to Incentives
All Firms                             1,024    27.3%       142         148          -4.1%
  Prepaid Monetary Incentive            223    22.0%        23          32         -28.1%
  Promised Monetary Reward              228    30.0%        36          33           9.1%
  No Incentive / Control Group          573    28.1%        83          83           0.0%
All Firms                             1,024    27.3%       142          56         153.6%
  Sent Personalized Advance Letter      513    31.0%       114          28         307.1%
  Sent Generic Advance Letter           511    18.3%        28          28           0.0%
Finally, we conducted a logistic regression analysis to calculate standardized odds ratios for the
likelihood of obtaining a complete given the use of incentives and a personalized advance letter
(Table 4). Our base model contains five sets of variables: 1) an advance letter indicator (Sent
Generic Advance Letter is referent); 2) an incentive indicator (No Incentive is referent); 3) an advance
letter / incentive interaction (Sent Generic Advance Letter / No Incentive is referent); 4) Firm
Size (25-49 Workers is referent); and 5) Principal Industry (Healthcare is referent). The model
fits fairly well based on prediction accuracy⁴ (62%) and the Hosmer and Lemeshow goodness-of-fit
chi-square statistic (5.94). A measure of fit (λ = 0.1720) suggested by Cramer (1999) is
appropriate for prediction models with unbalanced dependent variables.
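The two fit measures reported above can be computed from an outcome vector and fitted probabilities as follows. The data here are made-up toy values; the accuracy rule uses the observed completion probability as the cutoff (as described in the accompanying footnote), and Cramer's λ is taken as the difference in mean fitted probability between completes and non-completes, our reading of Cramer (1999).

```python
def fit_metrics(y, p_hat):
    """Prediction accuracy with the observed outcome rate as the cutoff
    (rather than 0.50), plus Cramer's lambda: mean fitted probability
    for actual completes minus that for non-completes."""
    cutoff = sum(y) / len(y)  # observed probability of a complete
    correct = sum((p >= cutoff) == bool(t) for t, p in zip(y, p_hat))
    accuracy = correct / len(y)
    p1 = [p for t, p in zip(y, p_hat) if t]
    p0 = [p for t, p in zip(y, p_hat) if not t]
    cramer_lambda = sum(p1) / len(p1) - sum(p0) / len(p0)
    return accuracy, cramer_lambda

# Toy outcomes and hypothetical fitted probabilities
y     = [1, 1, 0, 0, 0, 0, 0, 1]
p_hat = [0.6, 0.4, 0.2, 0.3, 0.1, 0.5, 0.2, 0.7]
acc, lam = fit_metrics(y, p_hat)
print(acc, round(lam, 3))
```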
Table 4: Logistic Regression: Odds of a Completion

                                                                             Odds Ratio with
Dependent Variable                                             Odds Ratio    2010 Meta Data
Intercept                                                       0.1257 *        0.4249 *
Advance Letter
  Sent Personalized Advance Letter                              2.4685 *        3.0514 *
  Sent Generic Advance Letter                                     REF             REF
Incentive
  Prepaid Monetary Incentive                                    0.5851 *        0.6780
  Promised Monetary Reward                                      1.3210          1.2387
  No Incentive / Control Group                                    REF             REF
Advance Letter / Incentive Interaction
  Sent Personalized Advance Letter / Prepaid Monetary Incentive 1.5025          1.3263
  Sent Personalized Advance Letter / Promised Monetary Reward   0.7790          0.7800
  Sent Generic Advance Letter / No Incentive                      REF             REF
Firm Size
  3-9 Workers                                                   0.8517          0.4890 *
  10-24 Workers                                                 1.8395 *        1.0809
  25-49 Workers                                                   REF             REF
Industry
  Agriculture / Mining                                          1.5644          1.8179
  Construction                                                  0.7094          0.6170
  Manufacturing                                                 1.8658          1.8696
  Transportation / Utility / Communication                      0.8710          0.9021
  Wholesale                                                     0.6576          0.7002
  Retail                                                        0.6635          0.6195
  Finance                                                       0.9706          0.8637
  Service                                                       1.1085          1.0666
  Healthcare                                                      REF             REF
Meta Variables
  Dial Attempts                                                   —             0.9843 *
  Callback Conversion Eligibility                                 —             0.2049 *
Prediction Accuracy                                               62%             74%
Hosmer and Lemeshow Fit Statistic                               5.9388          8.1597
Cramer's Fit Measure                                            0.1720          0.2525

* Significant at p ≤ 0.05
Firms sent a personalized advance letter are almost 2.5 times more likely to complete the survey
compared to firms sent a generic advance letter. Firms sent a prepaid monetary incentive were
almost half as likely to complete the survey compared to firms receiving no incentive. In
contrast to the descriptive analysis (Table 2), there was no significant difference for the
interaction between the incentives and the personalized mailing.

⁴ In unbalanced samples, the less frequent outcome may be predicted very poorly when using the standard
procedure (prediction at 0.50). A common alternative, as used here, is to use the observed probability of the
dependent variable as the prediction value (0.1387).
However, missing from the base model are variables describing the effort expended by the survey
firm to elicit a completed survey. Meta-data such as Number of Total Dial Attempts and
Eligibility for Refusal Conversion Callback influence completion beyond the use of
incentives and advance letters and should be controlled for in the model. Unfortunately, meta-data
for 2011 were unavailable. Meta-data for 2010, however, were available and were used
to impute the missing 2011 data. The 2011 database and the 2010 meta-database were first
stratified into 20 unique groups based on firm size, advance letter indicator, and general response
category (complete, refusal, ineligible, unknown). For each observation in the 2011 database, an
observation from the 2010 meta-database in the corresponding stratum was randomly selected
with replacement.
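The stratified hot-deck imputation described above can be sketched as follows. The field names (`dial_attempts`, `callback_eligible`) and the stratum key are hypothetical stand-ins for the actual 2010 meta-data fields and 20-group stratification.

```python
import random
from collections import defaultdict

def hot_deck_impute(targets, donors, key, seed=0):
    """Stratified hot-deck: for each 2011 record, copy the meta-data
    fields from a 2010 donor drawn at random (with replacement)
    from the same stratum."""
    rng = random.Random(seed)
    pool = defaultdict(list)
    for d in donors:
        pool[key(d)].append(d)
    imputed = []
    for t in targets:
        donor = rng.choice(pool[key(t)])
        imputed.append({**t,
                        "dial_attempts": donor["dial_attempts"],
                        "callback_eligible": donor["callback_eligible"]})
    return imputed

# Hypothetical strata: (firm size, letter indicator, response category)
key = lambda r: (r["size"], r["letter"], r["response"])
donors_2010 = [{"size": "3-9", "letter": 1, "response": "complete",
                "dial_attempts": 4, "callback_eligible": 0}]
targets_2011 = [{"size": "3-9", "letter": 1, "response": "complete"}]
result = hot_deck_impute(targets_2011, donors_2010, key)
print(result[0]["dial_attempts"])  # 4, copied from the lone donor
```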
Like the base model, the model with the meta-data fits fairly well based on prediction accuracy
(74%), the Hosmer and Lemeshow goodness-of-fit chi-square statistic (8.16), and Cramer's
measure of fit (0.2525). Similar to the base model, firms sent a personalized advance letter are
over three times more likely to complete the survey compared to firms sent a generic
advance letter. This model, however, suggests that financial incentives are no more effective
than offering no financial incentive. The direction and effect of the added meta-variables are as
expected: the odds of completing the survey decline slightly with each additional dial attempt.
4. Discussion
In this study, we explored three hypotheses that have implications for survey field methods in
establishment surveys. Consistent with other research (Biemer et al., 2007; Cycyota & Harrison,
2002), response rates did not significantly improve over the control group (28.1%) with the offer
of an incentive, whether prepaid (22.0%) or promised (30.0%). Contrary to other research
(Church, 1993; Cook et al., 2009; Ott & Beckler, 2007), however, the comparison of a promised
monetary reward versus a prepaid monetary incentive produced mixed results. Descriptively, promising a
monetary reward among the group receiving a generic advance mailing appears to be effective (p
= 0.0344) for improving response rates despite not being significant in general (p = 0.1736). The
logistic regression analysis, on the other hand, suggests that promising a monetary reward has no
effect, while providing a prepaid monetary incentive in fact reduces the likelihood of completing
the survey, thereby reducing response rates. This counter-intuitive finding is eliminated
altogether when meta-data characterizing the effort expended by the survey firm are included in
the model.
There are several possible reasons for the ineffectiveness of the incentives in this study. First,
the $20 cash offer might have been insufficient to encourage participation. Incentives are often
seen as an inducement to compensate respondents for their time or expertise. The average time
to completion for the survey was approximately 30 minutes, and the incentive amount might
have been viewed as inadequate compensation.
Second, the cash offer might have been inappropriate for some firms. Some organizations might
have rules against employees receiving financial compensation (Dillman et al., 2009), and the use of
incentives might have had the opposite of the desired effect on response rates. In this
survey, the desired point of contact is the person most knowledgeable about the health benefits
offered by the company. The smaller the firm, the more likely the point of contact is the
owner of the business, the one person at an establishment for whom a financial incentive might
not be inappropriate.
Third, a cash-equivalent monetary incentive such as an ATM card has performed well in some
studies and produced overall cost savings to the survey effort (Dillman et al., 2009; Ott & Beckler,
2007). For those firms promised a monetary reward, not offering a choice of how to
receive that reward may have dampened the impact on the response rate. Respondents not
motivated by cash might have been motivated by, for example, a charitable donation.
Fourth, the lack of significance might be due to insufficient power for the analysis. Doubling the
sample size for each incentive group while maintaining identical response rates makes a strong
argument for the use of promising a monetary reward over offering a prepaid monetary incentive
(p = 0.0543). However, even this doubling of the sample for each incentive is not enough to
produce a significant difference from the control group. When examining the effect of the
advance letters on each incentive group, limiting the incentive offerings to firms receiving a
generic advance mailing does produce significantly improved response rates for those promised a
monetary reward (26.9%) compared to those sent a prepaid incentive (6.4%; p = 0.0344).
However, the improved response rate for those promised a monetary reward is not significantly
different from the control group (19.9%; p = 0.4631).
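The doubling argument can be checked from the reported rates and standard errors alone: inflating each group's sample size by a factor of two shrinks each standard error by the square root of two. The sketch below assumes independent, approximately normal estimates and holds the observed rates fixed.

```python
from math import sqrt, erf

def z_test_scaled_n(p1, se1, p2, se2, factor=2):
    """Two-sided p-value after inflating each group's sample size by
    `factor` (SEs shrink by sqrt(factor)), rates held fixed."""
    se1, se2 = se1 / sqrt(factor), se2 / sqrt(factor)
    z = (p1 - p2) / sqrt(se1**2 + se2**2)
    # Normal CDF via erf: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Promised (30.0%, SE 4.2%) vs. prepaid (22.0%, SE 4.1%) from Table 2
p = z_test_scaled_n(0.300, 0.042, 0.220, 0.041)
print(round(p, 3))  # near 0.054, close to the reported p = 0.0543
```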
In contrast to monetary incentives, we can confirm that, consistent with previous research
(Martin, Duncan, Powers, & Sawyer, 1989; Roth & BeVier, 1998; Yammarino, 1991), sending a
personalized advance letter produces a significant improvement in response rate (31.0% vs.
18.3%, p = 0.0013). Logistic regression suggests that firms receiving a personalized advance
letter are two and a half to three times more likely to complete the survey.
5. Conclusion
The present work adds to the current literature regarding the use of monetary incentives and
personalized advance letters for improving response rates in surveys of establishments. While
most of the literature to date pertains primarily to household surveys, more research is needed to
determine if the successful research techniques used in household surveys can be applied to
establishment surveys. We find that, like in household survey research, the use of a personalized
advance letter is effective in improving and maintaining response rates, while the use of
monetary incentives has no effect. We find that the effectiveness of a personalized advance
letter far exceeds the effectiveness of either a prepaid monetary incentive or the promise of a
monetary reward. If monetary incentives are to be used, more work is needed to determine if the
promise of a monetary reward is the better choice over a prepaid monetary reward for improving
response rates, particularly among businesses receiving a generic advance letter.
Much work remains to provide clear guidance to researchers conducting establishment surveys.
Future studies implementing controlled experiments that assess the impact of incentives on
survey modes, or provide guidance towards determining the most effective incentive level and
type for improving response rates, would be extremely valuable to researchers conducting
establishment surveys. These future studies should provide insight for improving the quality of
establishment surveys and towards reducing costs associated with data collection.
References
Biemer, P., Ellis, C., Pitts, A., & Robbins, K. 2007. Do Monetary Incentives Increase Business
Survey Response Rates? Results from a Large Scale Experiment. Paper presented at the
Third International Conference on Establishment Surveys, June 18-21, Montreal,
Quebec, Canada.
Church, A. H. 1993. Estimating the Effect of Incentives on Mail Survey Response Rates: A
Meta-Analysis. Public Opinion Quarterly, 57: 62-79.
Cook, S., LeBaron, P., Flicker, L., & Flanigan, T. S. 2009. Applying Incentives to Establishment
Surveys: A Review of the Literature. Paper presented at the American Association for
Public Opinion Research 64th Annual Conference, Hollywood, Florida.
Cramer, J. 1999. Predictive Performance of the Binary Logit Model in Unbalanced Samples.
Journal of the Royal Statistical Society, Series D (The Statistician), 48: 85-94.
Cycyota, C. S., & Harrison, D. A. 2002. Enhancing Survey Response Rates at the Executive
Level: Are Employee- or Consumer-Level Techniques Effective? Journal of
Management, 28: 151-176.
Dillman, D. A. 2007. Mail and internet surveys: The tailored design method. New York: John
Wiley & Sons.
Dillman, D. A., Smyth, J. D., & Christian, L. M. 2009. Internet, mail, and mixed-mode surveys:
The tailored design method. Hoboken, NJ: John Wiley & Sons.
Martin, W. S., Duncan, W. J., Powers, T. L., & Sawyer, J. C. 1989. Costs and Benefits of
Selected Response Inducement Techniques in Mail Survey Research. Journal of Business
Research, 19: 67-79.
Moore, D., & Ollinger, M. 2007. Effectiveness of Monetary Incentives and Other Stimuli Across
Establishment Survey Populations. Paper presented at the Third International Conference
on Establishment Surveys, June 18-21, Montreal, Quebec, Canada.
Ott, K. & Beckler, D. 2007. Incentives in Surveys with Farmers. Paper presented at the Third
International Conference on Establishment Surveys, June 18-21, Montreal, Quebec,
Canada.
Paxson, M. C. 1992. Unpublished data: Response rates for 183 studies. Pullman, WA:
Department of Hotel and Restaurant Administration, Washington State University.
Roth, P. & BeVier, C. 1998. Response Rates in HRM/OB Survey Research: Norms and
Correlates, 1990-1994. Journal of Management, 24: 97-117.
Ryu, E., Couper, M. P., & Marans, R. W. 2005. Survey Incentives: Cash vs. In-Kind; Face-to-Face
vs. Mail; Response Rate vs. Nonresponse Error. International Journal of Public
Opinion Research, 18: 89-106.
Singer, E. 2002. The use of incentives to reduce nonresponse in household surveys. In Groves,
R.M., Dillman, D.A., Eltinge, J.L., & Little, R.J.A. (Eds.), Survey Nonresponse: 163-178;
New York: John Wiley & Sons.
Willimack, D. K., Schuman, H., Pennell, B., & Lepkowski, J.M. 1995. Effects of a Prepaid
Nonmonetary Incentive on Response Rates and Response Quality in a Face-to-Face
Survey. Public Opinion Quarterly, 59: 78-92.
Yammarino, F. J., Skinner, S. J., & Childers, T. L. 1991. Understanding Mail Survey Response
Behavior. Public Opinion Quarterly, 55: 613-629.