
Self-Administered Surveys

 
Julie de Jong, 2016

Introduction

Fully self-administered questionnaires (SAQs) are not as common as interviewer-administered surveys in the context of multinational, multiregional, and multicultural ('3MC') surveys. However, as surveys become more costly to administer using interviewers, whether face-to-face or by telephone, more researchers are considering SAQ modes, including mail surveys, Web surveys, and interactive voice response (IVR) surveys conducted via telephone.

An important element of the self-administered mode is that there is, by definition, no interviewer involved. As discussed in detail in Survey Quality, interviewer error can contribute significantly to total survey error. Removing the interviewer from the equation can therefore improve survey quality. This may be particularly true if the survey topic is sensitive. Self-administered modes can also be effective when privacy during the survey interview is difficult to obtain.

However, the absence of an interviewer also demands a carefully designed survey instrument that is easy for the respondent to complete. Because there is no interviewer present, there is no one to assist the respondent in understanding instructions or to provide encouragement to complete the questionnaire. Differences in literacy levels among countries should also be considered in the questionnaire design phase of self-administered instruments (see Questionnaire Design for further details).

In addition, because of the lack of interviewer-respondent interaction, nonresponse is more difficult to assess, and it is a challenge to disentangle the effects of noncontact, refusal, and a poor sampling frame. For example, nonresponse to a mail survey may result from misdirected mail that never arrived at the sampled respondent’s house, misplaced mail within the respondent’s house, initial willingness to complete the survey followed by forgetfulness, unwillingness to complete the questionnaire (i.e., a refusal), or any number of other issues. And, in a multi-person household, it may be impossible to identify who the actual respondent was. Therefore, when designing an SAQ, it is crucial to implement strategies to maximize survey quality.

For further discussion of the advantages and disadvantages of self-administered surveys, see Study Design and Organizational Structure. For additional information on sample design and related challenges for self-administered modes, see Sample Design.


Guidelines

Goal: To achieve an optimal cross-cultural data collection design by maximizing the amount of information obtained per monetary unit spent within the allotted time, while meeting the specified level of precision and producing comparable results, within the context of a self-administered survey.


1. When a mail survey using a paper-based instrument will be sent to respondents, develop the questionnaire and protocols with consideration that the survey must be straightforward for respondents to self-administer.

Rationale

Concerns about response rates, survey length, and data quality have all resulted in a reduction in the use of mail surveys in recent years. However, Dillman and others argue that high-quality mail surveys with close attention to detail can result in accurate data [zotpressInText item="{2265844:6EYTAX5Y},{2265844:YN8MAABV}"]. The mail survey is becoming more widespread as the cost of interviewer-administered surveys increases. If a mail survey is the chosen mode of data collection, consider the following steps when developing the instrument.

Procedural steps

1.1     Assess the postal system in the study country and use it to develop a timeline for data collection that is realistic given the local context. In a 3MC survey, there are often differences in postal reliability, cost, possible carriers, and timeliness.

1.2     When designing materials (letters, questionnaires, etc.) that will be mailed to the respondent, assess the following:

1.2.1    Literacy levels among the target population.

1.2.2    Use of languages and/or regional dialects other than the country’s official language(s), and any implications for the feasibility of a self-completed questionnaire. Indeed, there are some languages and dialects that do not have a written form.

1.3     Determine how the data entry of returned mail questionnaires will occur. Data entry can occur manually, but it is more efficient to use optical or intelligent character recognition software, in which software reads and codes responses from scanned paper questionnaires.
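To illustrate the automated data-entry option in 1.3, the following minimal Python sketch reads one answer region from a scanned questionnaire page. It assumes the third-party Pillow and pytesseract libraries are installed; the file name and the pixel coordinates of the answer box are hypothetical. A production system would add validation and route ambiguous scans to human data-entry operators.

```python
# Minimal sketch of automated data entry from a scanned questionnaire,
# assuming the third-party pytesseract and Pillow libraries are installed
# and that the coordinates of each answer region are known from the
# questionnaire layout (both are illustrative assumptions).
from PIL import Image
import pytesseract

# Hypothetical pixel coordinates (left, upper, right, lower) of the box
# on page 1 where the respondent writes his or her age.
AGE_BOX = (120, 340, 360, 400)

def read_age_field(scan_path: str) -> str:
    """OCR a single answer region cropped from a scanned page."""
    page = Image.open(scan_path)
    region = page.crop(AGE_BOX)
    # image_to_string returns the recognized text; real projects would
    # validate it and hand ambiguous cases to a human operator.
    return pytesseract.image_to_string(region).strip()

if __name__ == "__main__":
    print(read_age_field("questionnaire_page1.png"))  # hypothetical file
```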

1.4     Before mailing out the paper questionnaire, consider sending a well-written advance letter to legitimize the survey and reassure and motivate potential respondents. Most effective is a carefully drafted, simple, short letter [zotpressInText item="{2265844:8EFBXS7Q},{2265844:G4I87XIU},{2265844:H2QCLNXW}"].

1.5     Develop a cover letter to include with the paper questionnaire, introducing the research study, explaining the purpose of the survey, and providing instructions on how to complete the instrument and organization contact information for any questions the respondent might have.

1.6     Develop an instrument appropriate for the mode and target population, keeping in mind that there will be no interviewer present to assist with the survey administration.

1.6.1    Assess the literacy of the target population, and adjust the text for comprehension if necessary.

1.6.2    Place instructions clearly next to the survey questions to which they correspond.

1.6.3    Make the layout of the instrument visually appealing and the question order easy to follow. Use visual elements (e.g., brightness, color, shape, position on page) in a consistent way to define the desired path through the questionnaire [zotpressInText item="{2265844:HTV52MIL},{2265844:DDYLNLET}"].

1.6.4    Use skip patterns only when absolutely necessary. Include clear instructions for skip patterns, and reinforce with visual and graphical cues such as boldfacing and arrows.

1.6.5    Limit the number of open-ended questions.

1.6.6    Ask only one question at a time. Combining multiple items into one question places a heavy cognitive burden on respondents and can impact data quality.

1.7     Provide clear instructions for returning the completed survey to the research organization or other point of collection. Adequate postage should be provided on the envelope so as not to incur cost to the respondent.

1.8     Develop a sample management system (and procedures for its execution) to process completed paper questionnaires.

1.9     Institute protocols to protect respondent confidentiality. It is common for research organizations to assign a unique identification number to each sampled household’s questionnaire for sample management purposes as questionnaires are mailed back to the office. This ensures that if a paper questionnaire is lost or intercepted in the mail, the respondent’s answers cannot be linked to his or her identity by a third party.
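The protocol in 1.9 can be made concrete with a short sketch. Everything here is an illustrative assumption rather than a prescribed implementation: IDs are random hex strings, and the ID-to-address linkage is written to a single restricted file kept separate from the answer data.

```python
# Minimal sketch of the unique-ID protocol in 1.9: each sampled household
# receives a random questionnaire ID, and the file linking IDs to addresses
# is stored apart from returned answer data. File names and the CSV layout
# are illustrative assumptions.
import csv
import secrets

def assign_ids(addresses, link_file="id_link_RESTRICTED.csv"):
    """Assign a non-identifying random ID to each sampled address."""
    links = {}
    for address in addresses:
        qid = secrets.token_hex(4)  # e.g., 'a3f09c1b'; carries no identity
        links[qid] = address
    # The linkage file is the only place IDs and addresses meet; it should
    # be access-restricted and never shipped with the questionnaire data.
    with open(link_file, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["questionnaire_id", "address"])
        for qid, address in links.items():
            writer.writerow([qid, address])
    return links

ids = assign_ids(["12 Elm St.", "8 Oak Ave."])  # hypothetical sample
```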

1.10   Develop a protocol for addressing nonresponse, including how many attempts to reach respondents by mail and/or other possible methods will be made.

Lessons learned

1.1     Because a mail survey is self-administered without an interviewer present, it is crucial that the layout and design of the questionnaire elements are clear and easy to follow and that instructions are visibly marked. Often, the first page of a mail survey contains a lengthy set of instructions, which [zotpressInText item="{2265844:HTV52MIL}" format="%a% (%d%)"] argue respondents generally skip or do not retain when completing the questionnaire. They advise placing relevant instructions directly at the point where they are needed.

1.2     A recent mail survey in Siberia, which varied experimental factors across random subgroups of respondents, achieved the highest response rates when official university letterhead was used in correspondence, when an incentive was offered, and when a larger number (vs. a smaller number) of contacts with the respondent were attempted [zotpressInText item="{2265844:L9ICIYPJ}"].

1.3     Expected response rates for mail surveys will differ by country. For a limited set of studies examining cross-national differences in response rates, see [zotpressInText item="{2265844:8EFBXS7Q}" format="%a% (%d%)"], [zotpressInText item="{2265844:UR2PC7V2}" format="%a% (%d%)"], [zotpressInText item="{2265844:BUGMUMVC}" format="%a% (%d%)"], and [zotpressInText item="{2265844:VITCJ2LX}" format="%a% (%d%)"].


2. When administering a survey via the Web, develop the questionnaire and protocols with consideration that the survey must be straightforward for respondents to self-administer.

Rationale

Internet penetration has been steadily increasing worldwide in recent years. Given the increased costs of interviewer-administered surveys, many researchers are turning to the use of Web-based surveys to reach respondents when an adequate sample frame is available. Web surveys should be designed so respondents can easily access and complete the survey.

Procedural steps

2.1     Assess each study country’s technological infrastructure, and select software for the development, distribution, and completion of the Web survey that is appropriate for the devices prevalent in the study country.

2.1.1    Assess Internet speed and reliability in the study country and the potential impact on ease of Web survey use by respondents, and design the survey to fit the country’s bandwidth limitations.

2.1.2    Determine which Web browser(s) fully support the Web-based survey instrument, and communicate this to the respondent. Consider including a link to download a specific browser to facilitate the respondent’s participation in the Web survey.

2.1.3    Consider that respondents will likely use different devices to access the survey, including desktop computers, laptop computers, tablets, smartphones, and other electronic devices. The Web survey should be able to be completed on a Web browser, regardless of the type of device. See Instrument Technical Design for additional information on preparing style sheets appropriate for multiple devices.

2.1.4    Plan for adequate programming and testing time on multiple devices. For example, software that is compatible with Android devices may have glitches in iOS (Apple) devices.

2.2     Determine how respondents will be invited to participate in the Web survey.

2.2.1    Before disseminating the link to the Web-based survey instrument, consider sending a well-written advance letter to legitimize the survey and reassure and motivate potential respondents. Most effective is a carefully drafted, simple, short letter [zotpressInText item="{2265844:8EFBXS7Q},{2265844:G4I87XIU},{2265844:H2QCLNXW}"].

2.2.2    The mode of invitation will be limited by the respondent contact information available from the sample frame. For example, a Web survey using a sampling frame consisting solely of email addresses will not be able to send an invitation via postal mail because of the lack of a mailing address.

2.3     Determine how respondents will gain access to the survey. [zotpressInText item="{2265844:H2QCLNXW}" format="%a% (%d%)"] proposes providing a PIN to limit access to people in the sample. Another option is to provide each respondent with a unique URL linking to the survey, which is tied to the respondent’s sample ID.
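As one way to implement the unique-URL option in 2.3, the sketch below generates an unguessable token per sample ID. The base URL and token length are illustrative assumptions.

```python
# Minimal sketch of the unique-URL access option in 2.3: each sample ID is
# mapped to an unguessable token embedded in the survey link. The base URL
# and token length are illustrative assumptions.
import secrets

BASE_URL = "https://survey.example.org/start"  # hypothetical survey host

def build_invitation_links(sample_ids):
    """Return {sample_id: personal URL}; tokens are stored server-side so
    a visit can be tied back to the sampled case without exposing the ID."""
    return {sid: f"{BASE_URL}?token={secrets.token_urlsafe(16)}"
            for sid in sample_ids}

links = build_invitation_links(["S0001", "S0002"])
```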

2.4     Develop a concise introduction to be presented at the start of the Web survey, introducing the research study, explaining the purpose of the survey, and providing instructions on how to complete the survey and organization contact information for any questions the respondent might have.

2.5     Develop and test the Web survey, keeping in mind that there will be no interviewer present to assist with the survey administration.

2.5.1    Assess the literacy of the target population, and adjust the text for comprehension if necessary.

2.5.2    The first question should be an item that is likely to be interesting to most respondents and easy to answer.

2.5.3    Place instructions alongside the survey questions to which they correspond.

2.5.4    Make the layout of the instrument visually appealing.

2.5.5    Program any skip patterns used directly into the instrument, relieving the respondent from navigational decisions.
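The following minimal sketch shows what the programmed skip logic in 2.5.5 amounts to; the question IDs, answer codes, and routing rules are illustrative assumptions rather than any survey package’s actual API.

```python
# Minimal sketch of a programmed skip pattern (2.5.5): the respondent is
# routed automatically and never sees an inapplicable item. Question IDs
# and answer codes are illustrative assumptions.
def next_question(current_id: str, answer: str) -> str:
    """Return the ID of the next item to display, given the last answer."""
    if current_id == "Q1_employed":
        # Only employed respondents see the occupation follow-up.
        return "Q2_occupation" if answer == "yes" else "Q3_household_size"
    if current_id == "Q2_occupation":
        return "Q3_household_size"
    return "END"

assert next_question("Q1_employed", "no") == "Q3_household_size"
```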

2.5.6    Keep the survey as brief and engaging as possible. The longer the questionnaire and the greater the number of screens, the more likely the respondent will not finish the questionnaire [zotpressInText item="{2265844:3RYH9HTP}"].

2.5.7    Limit the number of open-ended questions.

2.5.8    Ask only one question at a time. Combining multiple items into one question places a heavy cognitive burden on respondents and can impact data quality.

2.5.9    Make prompts, particularly those asking the respondent to correct an answer, helpful, polite, and encouraging.

2.5.10   Decide whether respondents can navigate backwards to revisit and/or revise previous survey items and responses.

2.5.11   See Instrument Technical Design for additional guidance on the layout and technical design of the Web survey.

2.6     Decide whether respondents will be permitted to complete the questionnaire in more than one session, allowing for the data to be saved in the interim, and program the instrument accordingly.
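If multi-session completion (2.6) is permitted, the instrument needs interim storage. The sketch below assumes answers are keyed by the respondent’s access token and saved as one JSON file per token; the storage layout is purely illustrative.

```python
# Minimal sketch of interim saving for multi-session completion (2.6),
# assuming one JSON file per respondent token (an illustrative layout).
import json
import os

def save_progress(token: str, answers: dict, last_item: str,
                  store_dir: str = "partials") -> None:
    """Persist answers so far, plus the item to resume at."""
    os.makedirs(store_dir, exist_ok=True)
    with open(os.path.join(store_dir, f"{token}.json"), "w") as f:
        json.dump({"answers": answers, "resume_at": last_item}, f)

def load_progress(token: str, store_dir: str = "partials"):
    """Return saved state, or None if the respondent is starting fresh."""
    path = os.path.join(store_dir, f"{token}.json")
    if not os.path.exists(path):
        return None
    with open(path) as f:
        return json.load(f)
```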

2.7     Institute protocols to protect respondent confidentiality.

2.7.1    Ensure that electronic transmission of the data from the respondent’s computer to the survey firm collecting the data is secure.
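One narrow, automatable piece of 2.7.1, under the assumption that "secure transmission" is implemented as HTTPS/TLS, is verifying that no survey link would send data unencrypted. The sketch below is illustrative only and does not substitute for a full security review.

```python
# Minimal pre-fielding check, assuming secure transmission means HTTPS/TLS:
# flag any survey link that is not served over HTTPS before invitations go
# out. Illustrative only; not a substitute for a security review.
from urllib.parse import urlparse

def insecure_links(urls):
    """Return the subset of links that would send data unencrypted."""
    return [u for u in urls if urlparse(u).scheme != "https"]

bad = insecure_links(["https://survey.example.org/start",
                      "http://survey.example.org/start"])
print(bad)  # ['http://survey.example.org/start']
```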

2.8     Select an appropriate electronic sample management system and develop procedures for its execution. If an electronic sample management system is used, coordinating centers may play a role in monitoring fieldwork. See Study Design and Organizational Structure for details.
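An electronic sample management system ultimately reduces to tracking a disposition per sampled case. The minimal sketch below uses hypothetical disposition codes to show the kind of fieldwork summary a coordinating center might monitor.

```python
# Minimal sketch of an electronic sample management record (2.8): each
# sampled case carries a disposition code that fieldwork monitors can
# aggregate. The disposition categories shown are illustrative assumptions.
from collections import Counter

cases = {
    "S0001": "completed",
    "S0002": "invited",
    "S0003": "partial",
    "S0004": "refused",
}

def fieldwork_summary(case_status: dict) -> Counter:
    """Counts by disposition, e.g., for a coordinating center's report."""
    return Counter(case_status.values())

print(fieldwork_summary(cases))
```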

2.9     Determine which paradata will be collected. Paradata from Web surveys can be used to enhance respondents’ experience or to understand more about the respondents and how they interact with the Web survey [zotpressInText item="{2265844:QHSVB7BL}"]. See Paradata and Other Auxiliary Data for more information and examples.
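The sketch below shows one minimal way to capture item-level paradata (2.9), logging per-question display and answer events with timestamps; the event schema is an illustrative assumption.

```python
# Minimal sketch of item-level paradata capture (2.9): per-question
# timestamps and actions, which support later analysis of response
# latencies and breakoff points. Field names are illustrative assumptions.
import time

class ParadataLog:
    def __init__(self):
        self.events = []  # one dict per respondent action

    def record(self, token: str, item: str, action: str, value=None):
        # action might be 'display', 'answer', 'change', or 'breakoff'
        self.events.append({"token": token, "item": item,
                            "action": action, "value": value,
                            "timestamp": time.time()})

log = ParadataLog()
log.record("tok123", "Q1_employed", "display")
log.record("tok123", "Q1_employed", "answer", "yes")
```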

2.10   Develop a protocol for addressing nonresponse, including how many attempts to reach respondents by email and/or other possible methods will be made.
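A nonresponse protocol (2.10) typically fixes the number and spacing of follow-up attempts in advance. The sketch below computes reminder dates under assumed values (three reminders, one week apart) that the actual study design would determine.

```python
# Minimal sketch of a nonresponse follow-up schedule (2.10). The attempt
# cap and spacing are illustrative assumptions set by the study design.
from datetime import date, timedelta

def reminder_dates(invite_date: date, max_attempts: int = 3,
                   days_between: int = 7):
    """Dates on which reminder emails would be sent after the invitation."""
    return [invite_date + timedelta(days=days_between * i)
            for i in range(1, max_attempts + 1)]

print(reminder_dates(date(2016, 5, 2)))  # three weekly reminders
```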

Lessons learned

2.1     Web surveys are often used in subsequent waves of panel surveys following an interviewer-administered baseline study, and can be a practical and cost-effective mode choice. In such cases, the respondent is already familiar with the study, and strategies to minimize nonresponse can be executed via phone, mail, and even in-person visits because complete contact information is generally available.

2.2     With adequate design, Web surveys can achieve response rates comparable to non-Web surveys.

2.2.1    A randomized telephone-/Web-mode experiment in a Swiss election study found that the use of an incentive in a Web survey produced response rates comparable to those from the telephone survey (which also included incentives). The Web survey was much less costly, even accounting for the cost of incentives, than the telephone survey [zotpressInText item="{2265844:U6CXVPS8}"].

2.2.2    However, like 3MC surveys conducted in other modes, Web surveys can produce different response rates across countries. A comparison of data collected through a Web survey in Italy, France, Turkey, and the U.S. showed that France had the highest overall refusal rate but low item nonresponse among those who did participate. Italy and the U.S. had comparatively high response rates and low item nonresponse. Respondents in Turkey had the lowest contact and response rates, and the highest item nonresponse for sensitive questions [zotpressInText item="{2265844:DLPTFD4B}"].

2.3     Internet censorship occurs at the national level in several non-Western countries, such as China and Iran. If planning a survey in a country where censorship occurs, consider the survey topic and technical programming and determine whether the Web is an acceptable form of data collection for the particular study country.

2.3.1    Censorship by certain governments can impact the types of questions that are permitted on a Web survey questionnaire.

2.3.2    Censorship can impact response rates due to confidentiality and security concerns among respondents.

2.3.3    If the study country engages in censorship, consider the location of the server hosting the survey, and whether the study respondents will be able to access the server in its host country; that is, whether the server website IP address is accessible from the study country.

2.4     Software and website vendors can restrict access by users in other countries. Regardless of any government censorship, verify that respondents in the study country can access the survey.

2.5     Smartphone apps are currently being used for time use surveys. For example, a research study in the Netherlands is using a smartphone app to collect time use data in combination with auxiliary data. By requiring respondents to install an app, rather than access a website to complete the survey, researchers can ensure that respondents see the instrument exactly as the researchers intended. The app does not need permanent Internet access, as completed survey data are stored and transmitted as Internet access permits [zotpressInText item="{2265844:6MMUBUF6}"].


3. When administering a survey using IVR, develop the questionnaire and protocols with consideration that the survey must be straightforward for respondents to self-administer.

Rationale

IVR can be an effective mode for administering a survey to a population where telephone accessibility is adequate, and particularly when the survey topic is sensitive. However, as with mail and Web surveys, the absence of an interviewer necessitates careful instrument design and field execution.

Procedural steps

3.1     Determine which IVR software will be used to carry out the survey, including whether the IVR system will accept incoming telephone calls from respondents and/or initiate outgoing calls to respondents to complete the survey.

3.2     Determine how respondents will be invited to participate in the IVR survey. The mode of invitation will be limited by the respondent contact information available from the sample frame.

3.2.1    If postal addresses are available, respondents can receive an invitation with a telephone number to call to participate.

3.2.2    If email addresses are available, respondents can receive an invitation and telephone number via email.

3.2.3    If only telephone numbers are available, the invitation to complete the IVR survey will occur by telephone.

3.3     If an automated dialing system will be used to initiate contact with the respondent, assess any legal restrictions in place that apply to the use of such systems in the study country.

3.4     Develop a concise introduction to be presented at the start of the IVR survey, introducing the research study, explaining the purpose of the survey, and providing instructions on how to complete the survey and organization contact information for any questions the respondent might have.

3.5     Decide whether to program the IVR system as touchtone, voice input, or a combination of the two.

3.5.1    When deciding on the programming, consider the target population. Studies in rural India and Botswana found that respondents with less education and lower literacy do better with touchtone, and respondents also cited privacy as a reason for preferring touchtone [zotpressInText item="{2265844:PKGIXZ8N},{2265844:I3L3PZDH}"].

3.5.2    A study in Pakistan found that a well-designed speech interface was more effective than a touchtone system for respondents regardless of literacy level [zotpressInText item="{2265844:796M9DDZ}"].
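To make the touchtone/voice decision in 3.5 concrete, the sketch below shows a single handler that accepts either a keypad digit or a recognized spoken keyword. The prompts, answer codes, and keywords are illustrative assumptions, not any IVR platform’s API.

```python
# Minimal sketch of a combined touchtone/voice question handler (3.5):
# one function maps a DTMF digit or a recognized spoken keyword to an
# answer code. All codes and keywords are illustrative assumptions.
YES_INPUTS = {"1", "yes"}
NO_INPUTS = {"2", "no"}

def interpret_response(raw_input: str):
    """Map a keypad digit or recognized keyword to a survey answer code."""
    token = raw_input.strip().lower()
    if token in YES_INPUTS:
        return "yes"
    if token in NO_INPUTS:
        return "no"
    return None  # triggers a polite re-prompt rather than an error

assert interpret_response("1") == "yes"
assert interpret_response(" NO ") == "no"
```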

3.6     Devote sufficient time to the development of a high-quality IVR system to maintain respondent interest and continued cooperation.

3.6.1    The IVR system must have a high-quality recording, as the respondent is likely to break off the survey if quality is poor.

3.6.2    See [zotpressInText item="{2265844:HBKSVNAK}" format="%a% (%d%)"] for a guide to the development of an IVR system and the associated speech characteristics which need consideration.

3.7     Select an appropriate sample management system, and develop procedures for its execution.

3.7.1    If an electronic sample management system is used, coordinating centers may play a role in monitoring fieldwork. See Study Design and Organizational Structure for details.

3.8     Develop a protocol for addressing nonresponse, including how many attempts to reach respondents by telephone and/or other possible methods will be made.

Lessons learned

3.1     Consider the voice used for recording.

3.1.1    In a health helpline project in Botswana, researchers employed a well-known local actress for the IVR recording, and users reacted very positively [zotpressInText item="{2265844:PKGIXZ8N}"].

3.1.2    Depending on the social context, using an IVR recording of a man for male respondents and a woman for female respondents may elicit more accurate reporting, particularly of sensitive information.

3.2     [zotpressInText item="{2265844:DFMSKAYM}" format="%a% (%d%)"] developed an innovative approach, applicable to other settings as well, to the challenge that dialectal variation and multilingualism pose to speech-driven IVR interfaces in India. In their approach, people from specific villages are recorded during interactions, and their speech is semi-automatically integrated into the acoustic models for that village, thus generating the linguistic resources needed for automatic recognition of their speech.

3.3     Consider an alternate mode of first contact, such as SMS or a mailing, to inform respondents of an impending IVR survey. In a study in rural Uganda, the IVR survey call was preceded 24 hours in advance by an SMS message about the upcoming call. In a pretest, respondents who did not receive the text were unable to make sense of the later survey call [zotpressInText item="{2265844:CHXGGNYG}"].

3.4     A survey of teachers in Uganda resulted in a number of useful considerations when designing an IVR system to improve response rates and data quality [zotpressInText item="{2265844:CHXGGNYG}" etal="yes"].

3.4.1    The IVR call began with the immediate information that “This is a recorded call from Project X. You are not talking to a real person.”

3.4.2    The IVR call provided very specific instructions about whether to use keypad or to speak.

3.4.3    Respondents were initially confused by the automation of the IVR system. Researchers had better results when using a chime to get respondents’ attention before the automated voice gave instructions.

3.4.4    Leveraging the conversational and turn-taking conventions of normal conversation in the IVR system led to more success in eliciting desired user behavior than detailed instructions.

3.4.5    An IVR system that projected a loud voice, with prompts recorded as if the speaker were using a poor cell connection, resulted in a survey that was easier for respondents to follow.

3.4.6    When producing the IVR recording, use slow speech to elicit slow speech: respondents will emulate the recorded voice, and the resulting data will be easier to understand.

3.4.7    The IVR recording included 3 seconds of silence before the recorded speaker says “thank you” and moves on to the next question; respondents reported that this pause was well received.


References

[zotpressInTextBib style="apa" sortby="author"]
