Effect of survey instrument on participation in a follow-up study: a randomization study of a mailed questionnaire versus a computer-assisted telephone interview
Related papers
Surveys: Results from an Experiment
2016
Interest in a multimode approach to surveys has grown substantially in recent years, in part due to increased costs of face-to-face (FtF) interviewing and the emergence of the Internet as a survey mode. Yet, there is little systematic evidence of the impact of a multimode approach on survey costs and errors. This article reports the results of an experiment designed to evaluate whether a mixed-mode approach to a large screening survey would produce comparable response rates at a lower cost than an FtF screening effort. The experiment was carried out in the Health and Retirement Study (HRS), an ongoing panel study of Americans over age 50. In 2010, HRS conducted a household screening survey to recruit new sample members to supplement the existing sample. The experiment varied the sequence of modes with which the screening interview was delivered. One treatment offered mail first, followed by FtF interviewing; the other started with FtF and then mail. A control group was offered only ...
BMC Medical Research Methodology, 2010
Background: Evidence suggests that survey response rates are decreasing and that the level of survey response can be influenced by questionnaire length and the use of pre-notification. The goal of the present investigation was to determine the effect of questionnaire length and pre-notification type (letter vs. postcard) on measures of survey quality, including response rates, response times (days to return the survey), and item nonresponse.
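As a rough illustration of how the three quality measures named above might be computed for one experimental arm, here is a minimal sketch; the record structure and field names are assumptions for illustration, not the study's actual data or code.

```python
# Minimal sketch (not from the study) of the three survey-quality measures named
# in the abstract: response rate, response time in days, and item nonresponse.
# The record layout and field names are hypothetical.
from dataclasses import dataclass
from datetime import date
from statistics import median
from typing import Optional, List, Tuple

@dataclass
class MailedQuestionnaire:
    mailed: date               # date the questionnaire was sent out
    returned: Optional[date]   # date it came back, or None if never returned
    items_missing: int         # unanswered items on a returned questionnaire
    items_total: int           # total items on the questionnaire

def quality_measures(arm: List[MailedQuestionnaire]) -> Tuple[float, float, float]:
    """Response rate, median days to return, and item-nonresponse proportion."""
    completed = [q for q in arm if q.returned is not None]
    response_rate = len(completed) / len(arm)
    response_time = median((q.returned - q.mailed).days for q in completed)
    item_nonresponse = sum(q.items_missing for q in completed) / sum(
        q.items_total for q in completed
    )
    return response_rate, response_time, item_nonresponse
```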
BMC Medical Research Methodology, 2013
Background: Although in health services survey research we strive for a high response rate, this must be balanced against the need to recruit participants ethically and considerately, particularly in surveys of a sensitive nature. In survey research there are no established recommendations to guide the recruitment approach, and an 'opt-in' system that requires potential participants to request a copy of the questionnaire by returning a reply slip is frequently adopted. However, in observational research the risk to participants is lower than in clinical research, and so some surveys have used an 'opt-out' system. The effect of this approach on response and distress is unknown. We sought to investigate this in a survey of end-of-life care completed by bereaved relatives.
mHealth
Background: Decision-makers need up-to-date information on risk factors for effective prevention and control of non-communicable diseases (NCDs). Currently available surveys are infrequent and costly to implement. The objective of the study was to explore perceptions of using an interactive voice response (IVR) survey for data collection on NCD risk factors. Methods: Five focus group discussions (FGDs), including rural and urban, elderly and young adult, male and female groups, and eleven key informant interviews (KIIs) with researchers and NCD policy makers were conducted. Respondents were audio-recorded and data were transcribed into text. Data were entered into QDA Miner software for analysis. Meaningful units were generated and then merged into codes and categories. Quotes are presented to highlight the findings. Results: At the individual level, age, gender, disability, past experience and technology literacy were perceived as key determinants of whether respondents would accept an IVR survey. Receiving the IVR call at a time when people are usually available to take calls was seen as increasing participation. However, technological factors such as the presence of a mobile network signal and possession of a mobile phone were critical for use of IVR. Participants recommended that community sensitization activities be provided, that IVR be conducted at an appropriate time and frequency, and that incentives might improve survey participation. Conclusions: IVR has the potential to collect data quickly across a wide geographic area. However, caution needs to be taken to ensure that certain categories of people are not excluded because of their location, ability, age or gender. Sensitization prior to the survey, proper timing and structured incentives could increase participation.
A telephone survey of factors affecting willingness to participate in health research surveys
BMC Public Health, 2015
Background: In recent years, reduced participation has been encountered across all epidemiological study designs, both in terms of non-response and refusal. A low response rate may reduce statistical power but, more importantly, results may not be generalizable to the wider community. Methods: In a telephone survey of 1413 randomly selected members of the Australian general population and of 690 participants sourced from previous studies, we examined factors affecting people's stated willingness to participate in health research. Results: The majority of participants (61%) expressed willingness to participate in health research in general, but the percentage increased when they were provided with more specific information about the research. People were more willing if they had personal experience of the disease under study, and if the study was funded by government or a charity rather than by pharmaceutical companies. Participants from previous studies, older people and women were the groups most willing to participate. Younger men preferred online surveys, older people a written questionnaire, and few participants in any age and sex group preferred a telephone questionnaire. Conclusion: Despite a trend toward reduced participation rates, most participants expressed willingness to participate in health research. However, when seeking participants, researchers should be concrete and specific about the nature of the research they want to carry out. The preferred method of contact varies with demographic characteristics.
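As a sketch of how a sampling margin of error could be attached to a headline proportion such as the 61% willingness figure, the following assumes the percentage was taken over all 2103 respondents (1413 + 690), which may not match the paper's actual denominator.

```python
# Minimal sketch (not from the paper): Wilson score interval for a reported
# proportion. The denominator of 2103 respondents is an assumption.
from math import sqrt
from statistics import NormalDist

def wilson_interval(successes: int, n: int, alpha: float = 0.05):
    """Wilson score confidence interval (95% by default) for a proportion."""
    z = NormalDist().inv_cdf(1 - alpha / 2)
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half

n = 1413 + 690                               # assumed denominator
print(wilson_interval(round(0.61 * n), n))   # roughly (0.59, 0.63)
```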
BMC Medical Research Methodology, 2011
Background: Randomised controlled trials have investigated aspects of postal survey design yet cannot elaborate on reasons behind participants' decision making and survey behaviour. This paper reports participants' perspectives of the design of, and participation in, a longitudinal postal cohort survey. It describes strengths and weaknesses in study design from the perspectives of study participants and aims to contribute to the: 1) design of future cohort surveys and questionnaires generally and, 2) design of cohort surveys for people with musculoskeletal disorders (MSDs) specifically. Methods: In-depth interviews explored the design of postal surveys previously completed by participants. Interviews used open ended questioning with a topic guide for prompts if areas of interest were not covered spontaneously. Thematic data analysis was undertaken based on the framework method. A second researcher verified all coding. Results: Data from fourteen interviews were analysed within three main themes; participation, survey design and survey content. One of the main findings was the importance of clear communication aimed at the correct audience both when inviting potential participants to take part and within the survey itself. Providing enough information about the study, having a topic of interest and an explanation of likely benefits of the study were important when inviting people to participate. The neutrality of the survey and origination from a reputable source were both important; as was an explanation about why information was being collected within the survey itself. Study findings included participants' impressions when invited to take part, why they participated, the acceptability of follow-up of non-responders and why participants completed the follow-up postal survey. Also discussed were participants' first impression of the survey, its length, presentation and participants' views about specific questions within the survey. Conclusions: Ideas generated in this study provide an insight into participants' decision making and survey behaviour and may enhance the acceptability of future surveys to potential participants. As well as clear communication, participants valued incentives and survey questions that were relevant to them. However, opinions varied as to the preferred format for responses with some advising more opportunity for open-ended feedback. We also found that some standard format questions can raise quandaries for individual participants.
Improving Survey Response Rates from Parents in School-Based Research Using a Multi-Level Approach
PLOS ONE, 2015
While schools can provide a comprehensive sampling frame for community-based studies of children and their families, recruitment is challenging. Multi-level approaches that engage multiple school stakeholders have been recommended, but few studies have documented their effects. This paper compares the impact of a standard versus an enhanced engagement approach on multiple indicators of recruitment: parent response rates, response times, reminders required and sample characteristics.
Recruitment methods for survey research: Findings from the Mid-South Clinical Data Research Network
Contemporary Clinical Trials, 2017
The objective of this study was to report survey response rates and demographic characteristics for eight recruitment approaches in order to determine the acceptability and effectiveness of large-scale patient recruitment among various populations. Methods: We conducted a cross-sectional analysis of survey data from two large cohorts. Patients were recruited from the Mid-South Clinical Data Research Network using clinic-based recruitment, research registries, and mail, phone, and email approaches. Response rates are reported as the number of patients who consented to the survey divided by the number of eligible patients approached. Results: We contacted more than 90,000 patients, and 13,197 patients completed surveys. Median age was 56.3 years (IQR 40.9, 67.4). Racial/ethnic distribution was 84.1% White, non-Hispanic; 9.9% Black, non-Hispanic; 1.8% Hispanic; and 4.0% other, non-Hispanic. Face-to-face recruitment had the highest response rate (94.3%), followed by participants who "opted in" to a registry (76%). The lowest response rate was for unsolicited emails from the clinic (6.1%). Face-to-face recruitment enrolled a higher percentage of participants who self-identified as Black, non-Hispanic, compared to other approaches (18.6% face-to-face vs. 8.4% for email). Conclusions: Technology-enabled recruitment approaches such as registries and emails are effective for recruiting but may yield less racial/ethnic diversity compared to traditional, more time-intensive approaches.
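The response-rate definition above (consented divided by eligible approached) is simple enough to show as a sketch; the counts below are hypothetical placeholders chosen only so that the printed rates match the percentages quoted in the abstract.

```python
# Minimal sketch of per-approach response rates as defined in the abstract:
# consented / eligible approached. Counts are hypothetical placeholders chosen
# to reproduce the quoted percentages (94.3%, 76%, 6.1%); the study's true
# denominators are not given here.
def response_rate(consented: int, approached: int) -> float:
    return consented / approached

arms = {
    "face-to-face": (330, 350),                  # ~94.3%
    "registry opt-in": (760, 1000),              # 76%
    "unsolicited clinic email": (610, 10_000),   # 6.1%
}
for name, (consented, approached) in arms.items():
    print(f"{name}: {response_rate(consented, approached):.1%}")
```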
Internet Versus Mailed Questionnaires: A Randomized Comparison (2)
Journal of Medical Internet Research, 2004
Background: Low response rates among surgeons can threaten the validity of surveys. Internet technologies may reduce the time, effort, and financial resources needed to conduct surveys. Objective: We investigated whether using Web-based technology could increase the response rates to an international survey. Methods: We solicited opinions from the 442 surgeon-members of the Orthopaedic Trauma Association regarding the treatment of femoral neck fractures. We developed a self-administered questionnaire after conducting a literature review, focus groups, and key informant interviews, for which we used sampling to redundancy techniques. We administered an Internet version of the questionnaire on a Web site, as well as a paper version, which looked similar to the Internet version and which had identical content. Only those in our sample could access the Web site. We alternately assigned the participants to receive the survey by mail (n=221) or an email invitation to participate on the Internet (n=221). Non-respondents in the mail arm received up to three additional copies of the survey, while non-respondents in the Internet arm received up to three additional requests, including a final mailed copy. All participants in the Internet arm had an opportunity to request an emailed Portable Document Format (PDF) version. Results: The Internet arm demonstrated a lower response rate (99/221, 45%) than the mail questionnaire arm (129/221, 58%) (absolute difference 13%, 95% confidence interval 4%-22%, P<0.01). Conclusions: Our Internet-based survey to surgeons resulted in a significantly lower response rate than a traditional mailed survey. Researchers should not assume that the widespread availability and potential ease of Internet-based surveys will translate into higher response rates.
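The reported comparison (99/221 vs. 129/221, absolute difference 13%, 95% CI 4%-22%, P<0.01) can be reproduced approximately with a standard two-proportion calculation; the sketch below is illustrative and is not the authors' analysis code.

```python
# Minimal sketch of a two-proportion comparison like the one reported above:
# unpooled normal-approximation CI for the difference, pooled z-test for the p-value.
from math import sqrt
from statistics import NormalDist

def two_proportion_comparison(x1, n1, x2, n2, alpha=0.05):
    p1, p2 = x1 / n1, x2 / n2
    diff = p2 - p1
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)    # unpooled SE for the CI
    z = NormalDist().inv_cdf(1 - alpha / 2)
    ci = (diff - z * se, diff + z * se)
    p_pool = (x1 + x2) / (n1 + n2)                         # pooled SE for the test
    se_pool = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    p_value = 2 * (1 - NormalDist().cdf(abs(diff) / se_pool))
    return diff, ci, p_value

# Internet arm 99/221 (~45%) vs. mail arm 129/221 (~58%)
print(two_proportion_comparison(99, 221, 129, 221))
# difference ~0.136, CI roughly (0.04, 0.23), p ~0.004 (< 0.01)
```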