Compensating for Low Topic Interest and Long Surveys: A Field Experiment on Nonresponse in Web Surveys
Related papers
Response Rate and Response Quality of Internet-Based Surveys: An Experimental Study
Marketing Letters, 2004
Survey Design Features Influencing Response Rates in Web Surveys
2002
In this paper we present an overview of several Web surveys. The aim of this research is to study Web survey design characteristics that may influence participation in Web surveys (Lozar Manfreda, 2001; Vehovar et al., 2002). Except for a few studies (whose results we present below), previous research has mainly examined the effect of a single factor, or a group of related factors, on the response rate while attempting to hold all other potential factors constant. However, there may be interactions among the factors. In addition, authors rarely specify which stage of the Web survey process (Lozar Manfreda, 2001; Vehovar et al., 2002) they refer to, even though different factors may have an impact at different stages.
Assessing Response Rates and Nonresponse Bias In Web and Paper Surveys
Research in Higher Education, 2003
Using data collected as part of the second pilot administration of Your First College Year (YFCY), a national survey of first-year college students, this study was designed to examine both response rates and nonresponse bias across four survey administration groups: paper-only, paper with web option, web-only with response incentive, and web-only without response incentive. Findings indicate that response rates vary by mode of administration. Moreover, predictors of response differed by administration group. Results are discussed in light of the recent surge of interest in online survey research.
Online Survey Research: Can Response Factors Be Improved?
Journal of Internet Commerce, 2008
The use of the Internet and online methods for data collection brings about new challenges for academic and managerial researchers. Our objective is to advance our level of knowledge regarding online data collection through an experiment. Using a local organization's e-mail list, we evaluate survey introductory elements such as invitation from a known leader and introductory length. We evaluate resulting levels of response rate, response quality, and respondent satisfaction to assess the effects of modified survey introductory designs. Findings reveal that an invitation from a known leader leads to improved levels of response quality while shorter introductions result in a quicker response from survey participants.
Improving response to web and mixed-mode surveys
Public Opinion Quarterly, 2011
We conducted two experiments designed to evaluate several strategies for improving response to Web and Web/mail mixed-mode surveys. Our goal was to determine the best ways to maximize Web response rates in a highly Internet-literate population with full Internet access. We find that providing a simultaneous choice of response modes does not improve response rates (compared to only providing a mail response option). However, offering the different response modes sequentially, in which Web is offered first and a mail follow-up option is used in the final contact, improves Web response rates and is overall equivalent to using only mail. We also show that utilizing a combination of both postal and email contacts and delivering a token cash incentive in advance are both useful methods for improving Web response rates. These experiments illustrate that although different implementation strategies are viable, the most effective strategy is the combined use of multiple response-inducing techniques.
2010
Most analyses of the effect of incentives on response rates have attempted to extrapolate to the Internet the broad existing knowledge base on telephone and postal survey methodology. However, such knowledge is not directly applicable to Internet-based interviews. This study therefore examines how different combinations of incentives affect response to a survey administered over the Internet. For this purpose, incentives took the form of prize draws administered either as pre-incentives (all invitees could enter the draw, whether or not they completed the questionnaire) or as post-incentives (only those who completed the questionnaire could enter the draw). A surprising result is that the joint use of pre-incentives and post-incentives yields only a slight improvement in response rates, and produces rates considerably lower than when either type of incentive is used on its own.
Factors Affecting Response Rates of The Web Survey with Teachers
Computers, 2022
Although the web survey has become a popular method of data collection in the academic community, it yields meagre response rates, which affect both the validity and the reliability of the results. No surveys studying the response rates of teachers specifically were found in the relevant literature. In this survey, with a sample of 263 Greek teachers, we investigate possible factors that explain teachers' intention to participate in web surveys conducted via online questionnaires, thereby indicating the factors that likely influence the response rate of web surveys. Our findings suggest that factors such as (a) authority, (b) incentives, (c) survey structure/form, (d) ethical issues, (e) reminders and pre-notifications, and (f) the time the survey is received appear to explain teachers' intention to participate in web surveys. Based on the findings, methodological implications and limitations for researchers are discussed.
Survey Practice
Web surveys provide survey researchers an opportunity to gather data in an inclusive manner and from a global audience. However, Web surveys typically suffer from low response rates. To combat this issue, numerous studies have investigated how invitation messaging can improve responses. There is only one study, to our knowledge, that has examined the impact on response rates of embedding the first survey item in the email invitation message. Furthermore, no studies, to our knowledge, have investigated whether such an email invitation would augment response rates during time periods that are less salient to the population under investigation. Our study tested a 2 (invitation message) x 6 (day the invitation was sent) experimental design. The results show that individuals who received the email invitation with the embedded survey item were more than twice as likely to start the survey, and almost one-and-a-half times as likely to complete it, as those who received only the traditional email invitation. The embedded invitation improved response rates on test days more so than on days that were less salient to the respondents. Practical implications and suggestions for future research are discussed.