Did any of Dillman’s (the author of the attached article) guidelines for reducing people’s reluctance to respond to surveys strike you as especially useful (or less useful)?
dillman_chapter_2_reducing_people_s_reluctance_to_respond_to_surveys__.pdf

CHAPTER 2
Reducing People’s Reluctance to Respond to Surveys
Survey sponsors and the people they ask to respond to their surveys often have
contrasting views of the situation. Designing quality surveys requires understanding those differences and how to reconcile them.
For many recipients of survey requests, the invitations come as annoying
intrusions into their lives, such as unwanted phone calls, postal letters, or junk
e-mails. “Why me?” and “How do I make this go away?” are common quick and
decisive reactions from sample members, resulting in a hang-up, a toss into the
wastebasket, or a deletion.
If the recipient should begin to study the invitation, these feelings may be
amplified by thoughts such as disinterest in the topic, uncertainty about who is
making the request, or concern about opening an electronic link from an unknown
source that could infect his computer. If a survey request survives these initial perils, other considerations are likely to arise, with individuals wondering, how long is
this survey going to take to complete, will the results be useful, do the questions—
especially the first ones—make sense, is this request legitimate, and will my name
be placed on a mailing list that produces even more annoyances?
The survey sponsor, on the other hand, often sees herself as facing a huge task
of contacting hundreds or thousands of individuals and getting them to answer
burdensome questions. She also wants to do it quickly, efficiently, and at minimal
cost. The surveyor’s thinking is often focused on what kind of communications can
be written that cover all possible information that someone in the sample might
like to know and how all the contacts can be produced in the least costly way.
This thinking often leads to practices such as sending only two or three requests by
e-mail, only using bulk rate postal mail, or repeating word-for-word in follow-ups
the same information that was provided earlier. The content of these communications often focuses on the survey problem as the survey sponsor sees it, even to the
point of becoming defensively prepared messages such as “My agency is required
to find out what the health improvement needs of people are, and therefore I must
ask you to tell us your concerns.”
The questionnaire may include dozens of questions, with the list continuing to
grow as new possibilities are created. The most critical questions for the planned
analyses may be asked first, especially in web surveys, in case people decide to quit
after answering only a few questions. This kind of reasoning sometimes results in
starting with open-ended questions, such as “How much was your total household
income last year?” The sponsor asks for the exact amount, to the last dollar, instead
of offering broad categories, because it is deemed essential to the survey’s purpose
that measurement be as precise as possible. When only a few people respond to
these requests, surveyors are often disappointed, concluding, “People just aren’t
interested in helping with important surveys.” At times, the sponsor’s perspective
on surveys appears to be, “It’s all about me.”
It is sometimes hard to know who is most annoyed with follow-up phone calls
that are made one after another, over a period of days and weeks: the recipient of
the call, who has learned to avoid them, or the surveyor, who cannot understand
why those calls are not getting answered. Figure 2.1 provides a few examples of
what surveyors sometimes do, and common respondent reactions to what is read or heard.
FIGURE 2.1
Why respondents may not complete surveys.

What surveyors sometimes do … → … and what the respondent may think or do

· Send a brief e-mail from an unknown organization; it gets to the point quickly by asking recipients to click on a link to complete a survey about crime in their community.
→ “How do I know this is legitimate? There is no address or telephone number, and I wonder if this link will connect me to some malware that will infect my computer.”

· Send a letter emblazoned with “Survey enclosed. Respond immediately.”
→ “This is advertising. I’m not interested.”

· “This is Jane calling for the Smithfield Polling Company. I am not selling anything and I only need to ask you a few questions.”
→ “Uh, oh. She hasn’t said why she is calling, and I think I need to be really careful here. The easiest thing for me to do is hang up … click!”

· Include a lengthy consent form at the beginning of a web survey that requires an x to indicate that the respondent has agreed to complete the survey.
→ “I have not yet seen the questions. I don’t know if I am willing to complete all of the questions. What is so worrisome about this survey that this kind of consent is needed?”

· Write in the invitation to respond: “I have included $5 to pay for your time in completing this brief survey.”
→ “My time is worth more than this. This is a paltry amount to be paid.”

· Start the survey request with “My agency is required to report types of individuals we serve, so please answer the demographic questions so we can fulfill that requirement.”
→ “Just because an agency is required to do something does not mean that I am required.”

· Include “To unsubscribe click here” at the end of an e-mail request.
→ “Oh, this is spam and I can just unsubscribe so I do not get the same e-mail tomorrow and the next day.”

· Program the web survey to require an answer to every question.
→ “None of these answer categories fit me; I don’t know what to do. Should I quit or just make something up?”
These negative reactions are often in response to quite specific aspects
of the survey invitation materials or questionnaire.
These behaviors on the part of the surveyor may individually or collectively
produce incomplete answers or no response at all. In addition, if responses come
only from those especially interested in talking about a particular topic—for
example, views about abortion, a particular election outcome, or climate change—
the survey cannot accomplish its intended purpose. When we examine surveys
in this way, it is easy to understand why survey response rates are frequently
quite low—sometimes in the single digits—with considerable nonresponse error
regardless of survey mode.
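As a toy numerical illustration of this nonresponse error (our own example with invented numbers, not one from the chapter), suppose 60% of a population holds a particular view, but those holding it are twice as likely to respond as everyone else; the estimate computed from respondents alone then overstates the true figure.

```python
# Toy illustration of nonresponse bias: when people interested in the topic
# respond at a higher rate, the respondent-based estimate is biased upward.
true_share = 0.60       # hypothetical population share holding the view
rr_holders = 0.30       # hypothetical response rate among those holding it
rr_others = 0.15        # hypothetical response rate among everyone else

responding_holders = true_share * rr_holders        # 0.18 of the population
responding_others = (1 - true_share) * rr_others    # 0.06 of the population
estimate = responding_holders / (responding_holders + responding_others)

print(f"True share: {true_share:.0%}; survey estimate: {estimate:.0%}")
# True share: 60%; survey estimate: 75%
```

Note that collecting more responses at the same differential rates does not close this gap; only reducing the difference in response propensities between the groups does.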
How to obtain acceptable response rates and response quality from a sample
that will allow the precise estimation of characteristics in the population of interest
is the focus of this chapter. We describe specific steps that can and should be taken
by survey designers to develop respondent-friendly questionnaires and implementation procedures that accommodate the concerns and interests of potential
respondents to help them find reasons for responding. To do this we develop a
perspective that considers what happens when an organization or individual asks
a randomly sampled stranger to complete a survey and how multiple communication attempts can be utilized to encourage a positive response when the first
request falls short.
We are guided in our design efforts by a sociological perspective on what
causes humans to behave as they do in normal daily life, known as social exchange
theory. The basic idea is that surveyors need to consider potential benefits and costs
that accrue as a result of responding (or not responding), and work to create trust
that these benefits will be realized by the respondent during the response process
and afterward. Although this perspective has been utilized in previous editions of
this book, the specific recommendations for survey design presented here go well
beyond those introduced earlier, taking into consideration the massive changes
in technology and how people communicate with others that are occurring
all around us.
In light of these changes, mixed-mode surveys are increasingly needed and
are emphasized here. The use of multiple modes to make contact provides surveyors with additional opportunities to present the survey request and reasons
for responding to it. Offering alternative modes for providing the response also
becomes possible. Together these possibilities increase the opportunities for multiple efforts at communication that are comfortably within societal norms for interaction, and that allow a surveyor to improve the balance of rewards and costs as
well as enhance feelings of trust. To begin introducing this framework, we consider
results from a recently completed mixed-mode survey.
EXAMPLE OF A SURVEY WITH A HIGH RESPONSE RATE
A recent survey was conducted to obtain responses from nearly 600 doctoral students at Washington State University (WSU) about their dissertation work and
graduate training. The study was targeted toward students who had successfully
completed their required preliminary examinations and had only to finish the dissertation in order to meet their degree requirements. Data collection needed to
be completed within about a month. After learning that we could obtain both
e-mail and postal contact information for the sampled individuals, we proposed
the following implementation design:
· Day 1: Send a postal letter asking students to respond over the web. Enclose a $2 incentive with this request.
· Day 4: Send an e-mail that builds upon the information contained in the invitation letter, while emphasizing that the sender is following up by e-mail to provide an electronic link to the survey with the hope that this will make responding easier.
· Day 10: Send a second e-mail request.
· Day 18: Send a postal letter offering the option of responding via mail. Include a paper questionnaire and an addressed and stamped return envelope.
· Day 22: Send a final e-mail follow-up.
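For surveyors who manage contact logistics in software, a schedule like this can be expressed as data and mapped onto calendar dates. The sketch below is a minimal illustration of that idea: the contact days and descriptions come from the design above, while the Contact structure, the function name, and the March 29 start date (chosen to match the first date on Figure 2.2’s axis) are our own illustrative assumptions, not part of the original study.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Contact:
    day: int          # day of the field period (Day 1 = initial mail-out)
    mode: str         # "postal" or "e-mail"
    description: str

# The five-contact design described above.
SCHEDULE = [
    Contact(1,  "postal", "Invitation letter with web URL and $2 incentive"),
    Contact(4,  "e-mail", "Follow-up building on the letter, with electronic link"),
    Contact(10, "e-mail", "Second e-mail request"),
    Contact(18, "postal", "Paper questionnaire with stamped return envelope"),
    Contact(22, "e-mail", "Final e-mail follow-up"),
]

def send_dates(start: date) -> list[tuple[date, Contact]]:
    """Map each contact's field-period day onto a calendar date."""
    return [(start + timedelta(days=c.day - 1), c) for c in SCHEDULE]

for when, contact in send_dates(date(2013, 3, 29)):
    print(f"{when:%d-%b}  [{contact.mode:>6}]  {contact.description}")
```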
Our initial proposal elicited some hesitation. A faculty member reminded us,
“These are graduate students. They are all highly skilled with computers, have
e-mail and check it all or most days. Why would you even consider starting with
a postal contact and offering a paper questionnaire as a follow-up?” He suggested
that it would be just as effective to use only e-mails. Also, $2 sounded like a waste of
money; it would barely buy a cup of coffee. Why not save money by giving $5 only
to those who responded? He also argued that if we insisted on using web as well
as mail, we should give people a choice of response modes in the first contact
to improve the initial response. After considering these objections, we decided to
proceed as planned.
Figure 2.2 shows the cumulative response rate over the course of the study.
The figure highlights the increase in response achieved after each of the five contacts.

FIGURE 2.2 Cumulative response rate by day and mode for the 2013 WSU Doctoral Student Experience Survey, showing contribution of each contact to final response rate. [Line chart: cumulative response rate (0% to 100%) by date (29 March to 26 May). Annotations mark each contact: the postal invitation with $2 and URL, e-mails sent at 2:00 p.m., 9:30 a.m., and 1:00 p.m., and the paper questionnaire mailing; web and mail returns appear as separate series.]

Source: Adapted from Determining Whether Research Is Interdisciplinary: An Analysis of New Indicators (Technical Report 13-049), by M. M. Millar, 2013, Pullman: Washington State University, Social and Economic Sciences Research Center.
Importantly, each contact produced an additional increment of
response. Most striking perhaps is the effect of the quick e-mail follow-up sent a
few days after mailing the initial postal contact. The response rate from the initial
postal mail-out had reached 8% by 2 p.m. on Day 4 when the first e-mail contact
was sent. From then until midnight, a period of only 10 hours, the response rate
jumped to nearly 30% and continued to climb over the next few days so that at
the time of the next contact on Day 10 it had already reached 49%. The e-mail
sent on Day 10 produced a smaller increase, with total response reaching 63%
the day the paper questionnaire was mailed.
The combined effect of the second postal contact and the final e-mail follow-up pushed the overall response up to 77%. About half of the additional responses
came in the form of paper questionnaires, and the rest were additional Internet returns stimulated primarily by the quick follow-up by e-mail. To put this
final rise of 14 percentage points in context, nearly a third of the approximately
200 students who were sent the follow-up contacts containing the paper questionnaire responded by paper or the web.
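Differencing the cumulative rates quoted above makes each contact’s marginal contribution explicit. The sketch below is a back-of-the-envelope illustration using the approximate figures from the text, not data taken from the underlying technical report.

```python
# Approximate cumulative response rates quoted in the text, in contact order.
cumulative = [
    ("Day 1 postal invite (rate by Day 4, 2 p.m.)", 8),
    ("Day 4 e-mail (rate by Day 10)", 49),
    ("Day 10 e-mail (rate by Day 18)", 63),
    ("Day 18 paper questionnaire + Day 22 e-mail (final)", 77),
]

previous = 0
for contact, rate in cumulative:
    print(f"{contact}: cumulative {rate}%, gain +{rate - previous} points")
    previous = rate
# The last line reproduces the 14-percentage-point final rise noted above.
```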
We expected this approach to be effective as earlier research suggested
that combining these implementation elements would produce a good response
(Millar & Dillman, 2011). This previous research showed that offering people
an initial choice of survey modes (which we had been encouraged to do but did
not) tends to decrease final response rates, perhaps because it makes the response
decision more complex, and leads to individuals delaying response.
We return to this example at the end of this chapter with a full discussion of
why and how social exchange concepts, as described in the next section, were systematically applied to all aspects of the survey design to achieve such high response,
which was well beyond the 20% to 30% range that student surveys conducted at
this university typically achieve.
USING SOCIAL EXCHANGE CONCEPTS TO MOTIVATE POTENTIAL RESPONDENTS
There is no shortage of theory and research suggesting how respondents might be
motivated to respond to surveys. Among those efforts are these:
Cognitive Dissonance Theory: Approach people in a way that encourages
cognitive consonance with a previous behavior, expecting that people who
responded to other surveys will feel the need to respond to your survey
(Festinger, 1957).
Reasoned Action Theory: Appeal to people’s positive attitudes toward surveys and the existence of subjective norms that favor responding, both of
which produce behavioral intentions that are likely to encourage a positive
response (Ajzen & Fishbein, 1980).
Adult-to-Adult Communication Style: Use an adult-to-adult style of communication rather than an adult-to-child interaction style that potential
respondents will find demeaning, for example, “You must respond to this
request today!” (Comley, 2006).
Influence Theory: Communicate scarcity of opportunity to respond,
emphasize consistency with previous behavior, facilitate reciprocation for
a favor already performed, focus on enjoyment of task and social proof,
and describe what other people have done or are perceived as doing in
the face of similar opportunities (Cialdini, 1984).
Leverage-Saliency Theory: Be attentive to the fact that survey features can
have a positive effect on the response decision for some sample members
and a negative effect for others (leverage). Make positive features more
salient and negative features less salient in follow-ups to increase the likelihood of obtaining a response (Groves, Singer, & Corning, 2000).
Cost–Benefit Theory: Focus explicitly on respondent costs and benefits.
Reducing costs of responding is desirable but insufficient; people choose
to act when, in their subjective calculus, the benefits of doing so outweigh
the costs (Singer, 2011).
Gamification Theory: Make responding to surveys fun by making them
appear like games with awards like badges or points that can be earned by
engaging in certain behaviors (Lai, Bristol & Link, 2012), graphics that
make questions more visual, and other elements that appeal to people’s
emotions and desire to have an enjoyable experience (Puleston, 2012a,
2012b).
Each of these theories provides us with different concepts and tools to use in
thinking about survey response. One thing they have in common is that they all
place an emphasis on what appears to be going on in the potential respondent’s
head as she weighs whether or not to respond. That is, they are psychological in
nature. There is less emphasis on how well the survey materials fit with the general culture in a way that affects response behavior, something that cannot easily
be articulated by the recipient of the survey request. Because of this, these theories do not provide guidance about how multiple features of each survey such as
the mode(s) of contact and response, content of communications, questions asked,
question order and presentation, and the timing of contacts should be designed
to create a holistic data collection protocol that will improve response rates and
data quality. That is, these theories do not tell us much about how to connect
response-encouraging elements together into a comprehensive design. In particular, they have not addressed how surveyors might use multiple modes of contact
and response modes to increase the likelihood that recipients of survey requests
will attend to those requests and respond.
The first edition of this book (Dillman, 1978) introduced social exchange as
a means of connecting multiple design issues, some of which are dealt with in
isolation by the aforementioned theories, in order to obtain high survey response
rates, but did so primarily in a single-mode context. In this edition, we apply social
exchange theory to the mixed-mode context that characterizes much of survey
research today.
The concept of social exchange is quite simple. It is that people are more
likely to comply with a request from someone else if they believe and trust that
the rewards for complying with that request will eventually exceed the costs of
complying. Social exchange was developed by Blau (1964), Homans (1961), and
Thibaut and Kelley (1959) as a general model for understanding how people
behave in their interactions with one another, and to understand how social norms
develop to guide those interactions. This framework was used to explain how people realize their self-interests as well as achieve effective interaction with others in
social groups, from communities to the society in which they live. Social exchange
concepts provide a means of reconciling philosophical views of the human desire
to find meaning through interactions with others and the human desire to achieve
self-interests (e.g., Roloff, 1981) from which they also draw satisfaction.
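The core decision rule just described can also be written down as a toy model. The following sketch is our own formalization for illustration, not an equation from the chapter: it treats a sample member as likely to comply when trust-discounted expected rewards exceed perceived costs, with all quantities as hypothetical subjective scores.

```python
from dataclasses import dataclass

@dataclass
class SurveyRequest:
    """Hypothetical subjective scores a sample member assigns to a request."""
    perceived_rewards: float  # topic interest, token of appreciation, feeling consulted
    perceived_costs: float    # time, effort, privacy concerns
    trust: float              # 0..1 belief that promised rewards will materialize

def likely_to_respond(req: SurveyRequest) -> bool:
    # Social exchange decision rule: comply when trust-weighted
    # expected rewards exceed the costs of complying.
    return req.trust * req.perceived_rewards > req.perceived_costs

# Raising trust alone (e.g., a letter identifying a known sponsor) can flip
# the decision even when rewards and costs are unchanged.
cold_contact = SurveyRequest(perceived_rewards=5.0, perceived_costs=3.0, trust=0.4)
trusted_contact = SurveyRequest(perceived_rewards=5.0, perceived_costs=3.0, trust=0.8)
print(likely_to_respond(cold_contact))    # False: 0.4 * 5.0 = 2.0 <= 3.0
print(likely_to_respond(trusted_contact)) # True:  0.8 * 5.0 = 4.0 > 3.0
```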
Social exchange is not the same as economic exchange. Social exchanges are
different from economic ones because there is only a general expectation of a positive return. The exact nature of benefits and why they will be provided are often
not specified in advance; instead, they are left open, based upon trust that they
will be delivered. As noted by Stafford (2008), social exchanges involve trust in a
likely outcome, rather than relying on explicit bargaining, and are more flexible.
In addition, social exchanges involve various types of as-yet undelivered benefits (e.g., social, psychological, etc., in addition to economic benefits) as well as
any immediate ones. In contrast, economic transactions rely only on assigning a
monetary value to the service or product to be transferred in the exchange.
Social exchange is also not a rational behavior model. It does not assume tha …