Julie de Jong, 2016
Many cross-cultural projects attempt to keep the mode of administration constant by choosing face-to-face data collection, where the survey questionnaire is administered, at least in part, by a survey interviewer. Generally, the face-to-face mode has the best sample coverage properties, highest response rates (and, therefore, possibly lower nonresponse bias), and does not require respondents to be literate. For a discussion of the advantages and disadvantages of the face-to-face mode of interviewing, see Study Design and Organizational Structure.
Before the advent of personal computing, face-to-face surveys were administered using a paper-and-pencil instrument (PAPI). However, laptops and other electronic instruments (e.g., tablets and smartphones) are now widely used in lieu of PAPI.
In order to collect comparable data, multinational, multiregional, and multicultural (‘3MC’) surveys must establish a standard data collection protocol. At the same time, the protocol will sometimes need to allow for modifications required by local norms, conditions, or customs.
The implementation of face-to-face surveys presents a number of logistical challenges not faced in other modes. This chapter first addresses issues pertaining to the face-to-face mode, regardless of the instrument used to collect the data (i.e., paper-and-pencil questionnaire, computerized instrument, etc.), and then presents considerations particular to each type of instrument.
Goal: To achieve an optimal 3MC data collection design by maximizing the amount of information obtained per monetary unit spent within the allotted time, while meeting the specified level of precision and producing comparable results, within the context of a face-to-face survey.
1. Consider the following when conducting survey interviews using a face-to-face mode. Surveys conducted by interviewers face-to-face share a number of common procedural steps.
There are a number of important considerations when interviewers are contacting respondents in a face-to-face survey, whether the instrument is paper-based (PAPI) or computer-based (CAPI).
1.1 Contact local authorities for clearance for the interviewers to collect data at the sample site(s); if necessary, negotiate with local authorities (or, in some cases, military authorities) to gain access to sample areas.
1.2 Allow adequate time for interviewer recruitment and training.
1.2.1 Match interviewer and respondent characteristics (e.g., race, ethnicity, or gender) when cultural norms so dictate, and/or if there is reason to think that interviewer effects may occur depending on the social conditions (see Interviewer Recruitment, Selection, and Training).
1.2.2 While interviewers might be relatively easy to recruit in some countries, researchers in other places, such as the Gulf States and Middle East region, might face some challenges in recruiting qualified field interviewers.
1.3 Take measures to ensure interviewer safety.
1.3.1 Inquire about potential safety problems, such as civil unrest and high crime areas.
1.3.2 Decide whether interviewers should travel in groups and be accompanied by security personnel.
1.3.3 Have interviewers visit their work areas during the daytime before the first day of data collection. They should check for potential hazards and safe havens during this visit.
1.3.4 Have interviewers tell their supervisors and family members when they plan to leave for the field, the location of the area, and when to expect them back.
1.4 Have interviewers carry the following items in the field to establish their legitimacy:
1.4.1 Official identification from the survey organization.
1.4.2 Official letters to local authorities describing the study, if appropriate.
1.4.3 Other letters of permission or support from local authorities, if appropriate and/or necessary given the local social context and governmental regulations.
1.5 Provide adequate transportation and accommodation for staff and supplies.
1.5.1 If maps are unavailable or unreliable, consider the use of local guides or GPS instruments.
1.5.2 Arrange to secure fuel and oil and to maintain any vehicles used by the field staff; this may present logistical problems in some countries where there are breakdowns in infrastructure.
1.5.3 Arrange for emergency transportation in the event that a field team member becomes ill or injured and needs immediate medical attention, or it becomes unsafe to stay in an area.
1.5.4 Arrange for backup transportation.
1.5.5 Secure housing accommodations in more remote areas prior to fieldwork or have the team carry their own (e.g., tents or mobile homes).
1.6 If physical measurements are taken as part of the survey, check the cultural acceptance of taking such measurements.
1.7 Provide all members of the field staff with access to a reliable line of communication with their supervisor. This will allow them to report progress and problems.
1.7.1 Majority-world countries may have weak communication capacities, especially in rural areas.
1.7.2 Cellular or satellite phones may be a worthwhile investment for teams in remote areas.
1.8 Aim to conduct the interview in a setting which affords visual, physical, and auditory privacy.
1.8.1 Privacy is critical for keeping respondents’ answers to the survey questions confidential.
1.8.2 Although complete privacy is ideal, it is impossible to achieve in some cultures. Interviewers should attempt to keep the interview as private as possible while still respecting cultural norms. This may involve self-administration of more sensitive questions. See Guideline 4 below regarding self-administration in the context of a face-to-face interview. An alternative may be to keep any others present occupied while the targeted respondent completes the survey.
1.8.3 In some countries, it may be unacceptable to have an interviewer come to the respondent’s home, or for an interviewer of the opposite sex to interview or enter the home of the selected respondent or informant. As noted above, this may necessitate interviewer-respondent gender matching.
1.8.4 Privacy increases the likelihood that respondents will answer honestly about sensitive behaviors, such as sexual practices or drug use, or about sensitive attitudes such as religion in some contexts. What is considered sensitive may vary among countries or cultures; administration practices may need to differ accordingly.
1.9 In order to reduce nonresponse in the face-to-face mode of interviewing, train the interviewers to make observations of the housing unit to assess likely at-home patterns.
1.9.1 Note that in some countries, interviewers are not allowed to ask neighbors about targeted, but not yet contacted, respondents.
1.1 Because responses to some survey questions can be affected by other individuals present during data collection, it is optimal—but not always possible—to conduct face-to-face surveys in private. In a face-to-face fertility survey of women in what is now Bangladesh, privacy was difficult to establish; most interviews took place in the presence of the respondent’s mother- or sister-in-law. This may have affected responses to sensitive questions.
1.2 Similarly, men in some parts of Africa were found to object to confidential interviews of their wives or children. The interviewers were instructed to conduct interviews in a place that was visible to the male heads of household but out of earshot.
1.3 In some rural places, it might not always be feasible to conduct an interview inside a home, and it may have to take place outside and in a more public setting.
1.4 In other rural places, the survey interview is still a novel concept, making interview privacy difficult to attain. In the Chitwan Valley Family Surveys in Nepal, a survey interview often draws family members and even interested neighbors, who sit with the respondent and interviewer to listen in.
1.5 Analyses using data from nine countries participating in the World Mental Health Survey Initiative provided evidence that the presence of a third party during the survey interview process affected the reporting of sensitive information, but that the effect is moderated by differences in social conformity and the cultural setting from country to country. Analyses using data from the Saudi Arabia Mental Health Survey found that reporting of sensitive attitudes and behaviors can also vary depending on the relationship between the respondent and the other party (Mneimneh, de Jong, & Altwaijri, 2020).
2. Consider the following steps when using a paper-and-pencil instrument (PAPI).
While the use of computerized technology has increased in survey administration, the paper-and-pencil instrument continues to be used by survey projects that lack the infrastructural capacity to adopt and maintain the necessary technology.
2.1 The paper instrument should be designed so that it is visually easy for the interviewer to administer. See Instrument Technical Design for further detail.
2.2 Develop a sample management protocol for use in the field by data collection supervisors. The protocol should include instructions for passing sampling units from one interviewer to another if the need arises, as well as the corresponding documentation of such transfers (see Data Collection: General Considerations 3.3 and sample management system).
2.2.1 Use a coversheet to track each sample element during the study (see Appendix A for an example of a paper coversheet).
2.2.2 Interviewers using paper coversheets have found that they work most efficiently if they sort the coversheets by (1) appointment time and (2) geographical location.
2.2.3 Consider efficient methods that allow interviewers to fill in coversheets and do household contacting at the same time. Filling in coversheet forms after making the contact has been shown to be error-prone.
2.3 Train interviewers to complete household enumeration and randomly select eligible members within the household unit (see Appendix B for household enumeration and Appendix C for an example of a Kish table).
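The within-household selection that a paper Kish table (Appendix C) supports can also be expressed electronically. The sketch below is a minimal illustration, not the published Kish tables themselves: it enumerates a household in a fixed order and draws one eligible member with equal probability, which is the outcome the pre-printed selection tables are designed to approximate on paper. The field names and eligibility rule are hypothetical.

```python
import random

def select_respondent(household_members, is_eligible, rng=None):
    """Enumerate a household and randomly select one eligible member.

    household_members: list of dicts describing each person,
        e.g. {"name": "A", "age": 34}. Listing members in a fixed
        order (as on a paper enumeration grid) keeps the procedure
        auditable by supervisors.
    is_eligible: predicate applied to each member (e.g., age >= 18).
    """
    rng = rng or random.Random()
    eligible = [m for m in household_members if is_eligible(m)]
    if not eligible:
        return None  # no eligible respondent in this household
    # Equal-probability draw; a pre-printed Kish table approximates
    # the same outcome on paper without a random-number generator.
    return rng.choice(eligible)
```

Seeding the generator per sample line (rather than using the default) makes a selection reproducible for verification during quality control.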
2.4 Develop a distribution procedure for supplies to interviewers in the field, including a surplus of paper questionnaires to ensure a continual supply.
2.5 Develop a protocol for transferring completed paper questionnaires from interviewers to field supervisors, and from field supervisors to the head office or other location where data entry will occur.
2.6 Develop a protocol for maintaining completed questionnaires and coversheets in a secure location to ensure protection of respondent confidentiality.
2.1 Paper questionnaires and other survey materials can be misplaced, stolen, or otherwise lost in the field. Document any such circumstances and develop a protocol to determine whether affected respondents will be recontacted for a repeat interview.
2.2 Plan for an adequate storage, security, and filing system so that completed interviews can be retrieved efficiently.
2.3 In certain countries, like Ghana, weather conditions such as high humidity can destroy paper questionnaires in storage.
2.4 Researchers administering a PAPI survey of business and social entrepreneurship in the Kingdom of Tonga with complex skip patterns used a detailed skip pattern map to simplify training and questionnaire preparation. They also developed a notation system on the actual questionnaire page to assist the interviewer.
2.5 If there are multiple components to the questionnaire, consider using paper of different colors for each component (e.g., the coversheet in yellow, interviewer-completed survey in green, self-administered section in orange, etc.).
2.6 Alternately, if the questionnaire will be administered in several different dialects or languages within a country, consider printing each dialect/language on differently-colored paper.
2.7 Consider using heavy card stock or lamination for show cards and other paper-based instruments that will be used for multiple respondents.
2.8 If using an event history calendar or other unusually-sized instrument, allow for adequate printing time, particularly in countries where printing of odd-sized documents may be challenging. Researchers in Nepal report having difficulties in locating printing businesses with the capacity to print the large life-history calendars designed for administration.
2.9 Researchers administering a PAPI survey in the Kingdom of Tonga faced limitations in printing in the country itself, including the lack of paper, printing cartridges, and water-resistant paper that could withstand moisture and travel. Researchers emailed the questionnaire to a specialty printer in New Zealand, and the printed questionnaires were sent via airmail back to Tonga for use.
2.10 Paper can be very expensive in some countries, and if the survey instrument contains many skip patterns, there can be a large amount of waste. For example, the PAPI version of the World Mental Health Composite International Diagnostic Interview (CIDI) 3.0 was about 400 pages in length, but contained numerous sections which began with a screener question and resulted in the respondent skipping the entirety of any section(s) for which the questions were not applicable.
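The screener-gated structure just described (a screener question that routes the respondent past an entire inapplicable section) is exactly what a CAPI program automates and a PAPI skip map documents on paper. A minimal sketch, with hypothetical question IDs:

```python
def route_sections(sections, answers):
    """Return the questions to administer, skipping every section
    whose screener was answered 'no'.

    sections: list of {"screener": question_id, "questions": [...]}
    answers:  dict mapping screener question_id -> True/False
    """
    to_ask = []
    for section in sections:
        to_ask.append(section["screener"])
        if answers.get(section["screener"]):
            # Screener passed: administer the whole section.
            to_ask.extend(section["questions"])
        # Otherwise the entire section is skipped, as in the
        # screener-gated CIDI 3.0 sections described above.
    return to_ask

# Hypothetical two-section instrument:
instrument = [
    {"screener": "SC1", "questions": ["D1", "D2", "D3"]},
    {"screener": "SC2", "questions": ["P1", "P2"]},
]
```

On paper, every skipped section is printed anyway; in CAPI, the routing costs nothing, which is one source of the cost difference noted above.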
3. If an electronic instrument will be used instead of a paper-based instrument, consider the following procedural steps.
As technology becomes more accessible and affordable, with use increasing worldwide, computer-assisted personal interviewing (CAPI) is a popular mode choice and is frequently used in lieu of PAPI. Laptop computers have generally been the instrument of choice for CAPI, but tablets, smartphones, and other handheld devices are increasing in popularity.
3.1 If CAPI use is new to the study site, develop an introduction strategy for both local collaborators and study respondents.
3.1.1 If possible, involve local collaborators in the study design to facilitate its adoption. The clinical and administrative staff in a rural Kenyan health center aided in the identification of appropriate data and in the formatting of the paper and electronic data recording interfaces. This helped reduce fears and distrust of computers, and engaged the clinical staff in the clinical research project.
3.1.2 In settings with limited technology, computerization can stimulate survey respondent interest and add legitimacy to the interviewers. Interviewers might also be more motivated to use technology in such settings. However, at the same time, in certain cultures, the use of technology can raise suspicion among respondents.
3.2 Assess technical experience at the data collection firm.
3.2.1 Critical staff should have adequate language competency. Because most technology specifications are available only in widely used languages such as English, programs that incorporate technology into their activities need to hire bilingual staff as trainers and programmers, both to improve understanding of how to use the chosen technology and to facilitate design and analysis activities.
3.2.2 The data collection organization needs to have the technical expertise to create the questionnaire, provide technical support for interviewers, manage the in-flow and out-flow of data, manage databases, and run quality control checks. While some of these tasks could be outsourced, building local capacity is always recommended for continuity and long-term goals.
3.3 Assess available infrastructure in the study country.
3.3.1 If the data collection organization requires data to be transmitted regularly for quality control, reliable Internet connectivity needs to be in place; evaluate WiFi and other Internet connection capabilities across the geographic areas covered by the sample. Even though a country’s major cities may have adequate Internet capabilities allowing for regular data transfer, rural areas may present more challenges.
3.3.2 Interviewers and field office staff need to have access to reliable electrical power sources for the interviewing, as well as for communication devices (e.g., mobile phones). Interviewers might need to carry multiple batteries for their devices if they are visiting areas with limited power supply. Interviewers could also be instructed to use other methods for charging batteries, including in-car chargers (such as cigarette lighter adapters) or portable generators.
3.3.3 If the need to revise the questionnaire during data collection arises, computerization and connectivity allows for easy transmission of updated questionnaires to interviewers or respondents without the need for reprinting, mailing, or personal pick-up of material. Moreover, avoiding printing any material at all before or during production is environmentally friendly.
3.4 Choose and procure the necessary primary and auxiliary equipment.
3.4.1 Primary equipment:
- There must be a good fit between the project and the technological tool. Handheld devices such as smartphones may be more appropriate for smaller or simpler questionnaires, and less suitable for collecting open-ended responses due to their size.
- Purchasing equipment and accessories locally can facilitate more efficient servicing of equipment than if equipment is purchased internationally.
- Although new technology may be more expensive if purchased locally within less-developed countries, the cost saved in shipping, delays, and in-country technical support can more than compensate for that difference.
- If equipment is not available locally, however, most hardware is available through collaborators in industrialized countries or can be ordered directly via the Internet.
- Consider ordering an excess supply of batteries and extra equipment (e.g., several extra laptops) in case of equipment malfunction.
3.4.2 Auxiliary equipment:
- Decide on a backup and uploading process (SD cards, flash drives, automatic upload to a central system, etc.).
- Data synchronization between a mobile device and a central computer can be very time-consuming in a rural, remote setting. In a survey in Zanzibar, interviewers used mobile devices to collect and store data. At the end of each day, supervisors met up with each interviewer in order to capture the day’s data from each device using an SD card. The supervisors then manually transferred the data from the SD cards to the central computer, rather than relying on Internet service and data synchronization.
- The backup system must be carefully developed to handle possible transitions or losses. In a root-cause analysis from a survey using PDAs in Bolivia, poor backup protocol due to programmer error precluded researchers from interpreting the data.
- If possible, at least two separate central backup systems should be developed, in addition to having backup on the unit itself (e.g., memory cards) and a communal archiving system.
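One way to implement the redundancy described above (a copy on the unit's own memory card plus at least two central systems) is to verify every copy by checksum at the time it is made, so a silently corrupted transfer is caught in the field rather than at analysis time. A minimal sketch; the file and directory names are hypothetical.

```python
import hashlib
import shutil
from pathlib import Path

def back_up_interview(data_file, backup_dirs):
    """Copy a completed-interview file to each backup location and
    verify every copy against the source's SHA-256 checksum.

    backup_dirs: e.g., [memory card path, central system A, central
    system B]. Raises if any copy fails verification, so the failed
    destination can be retried before the interviewer leaves the area.
    """
    source = Path(data_file)
    digest = hashlib.sha256(source.read_bytes()).hexdigest()
    for directory in backup_dirs:
        target = Path(directory) / source.name
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(source, target)  # preserves file timestamps
        if hashlib.sha256(target.read_bytes()).hexdigest() != digest:
            raise IOError(f"backup to {target} failed verification")
    return digest
```

Logging the returned checksum alongside the case ID also gives supervisors a simple way to confirm later that the central copy matches what was collected in the field.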
3.5 Select appropriate data collection software.
3.5.1 Additional attention should be given to non-Latin languages (e.g., Chinese, Arabic, Russian, etc.) when selecting technology and programming software, as not all software packages can support non-Latin script.
3.6 Select an appropriate electronic sample management system.
3.6.1 If an electronic sample management system is used, coordinating centers may play a role in monitoring fieldwork. See Study Design and Organizational Structure for details.
3.6.2 The electronic sample management system should permit interviewers to sort the sample respondents by both appointment time and geographical location.
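The sort order in 3.6.2 (appointments first, then geography) maps directly onto a compound sort key. A sketch of how a sample management system might order an interviewer's daily workload; the field names are hypothetical.

```python
from datetime import datetime

def sort_workload(cases):
    """Order assigned cases by (1) appointment time and (2) geographical
    area. Cases without an appointment sort after all scheduled ones,
    grouped by area so unscheduled attempts cluster for efficient travel."""
    no_appointment = datetime.max  # sentinel: unscheduled cases go last
    return sorted(
        cases,
        key=lambda c: (c.get("appointment") or no_appointment, c["area"]),
    )

# Hypothetical sample lines:
workload = [
    {"case_id": "HH-07", "area": "Ward 2", "appointment": None},
    {"case_id": "HH-03", "area": "Ward 5", "appointment": datetime(2016, 5, 2, 14, 0)},
    {"case_id": "HH-11", "area": "Ward 1", "appointment": datetime(2016, 5, 2, 9, 30)},
]
```

The same ordering is what interviewers using paper coversheets achieve by hand-sorting, as noted in Guideline 2.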
3.7 Develop and test the CAPI instrument.
3.7.1 Allow for sufficient time and budget for computerized specifications in the preproduction phase.
3.7.2 Consider using paper documents for certain aspects of the survey. For example, interviewers in China using handheld computers reported that it was overly time-consuming to read the full consent form on a small screen.
3.8 Develop a distribution system for supplies to the field.
3.8.1 Develop procedures for storage and transport of equipment.
3.8.2 Interviewers who are traveling long distances or through difficult terrain or weather conditions find it easier to carry their laptops, or even smaller devices (e.g., tablets), to conduct their interviews than to carry cumbersome paper questionnaires.
3.9 Develop procedures for use and maintenance of technology in the field.
3.9.1 Charge batteries daily to mitigate data loss due to battery discharge. Instruct interviewers to verify daily that batteries are charged.
3.9.2 Provide interviewers with a reliable electrical source to charge both CAPI instrument and mobile phone batteries so that interviewers can contact supervisors in the event of equipment malfunction. Communication is necessary for possible instrument troubleshooting and monitoring team progress. Most technical issues are simple user errors that can be resolved with a short discussion with the supervisor.
3.9.3 Backup plans need to be designed and implemented in case of power outages, especially in resource-constrained environments.
3.9.4 It is possible for data to be lost because of hardware or software malfunction, and for equipment to be lost or stolen during fieldwork. Researchers need to establish protocols for preventing and handling such situations.
3.9.5 Decide whether interviewers should be provided with paper copies of the questionnaire, or some material with which to take notes, in case of equipment failure. Some studies choose not to provide paper versions because they do not want to encourage the use of alternate paper instruments by interviewers.
3.9.6 It is crucial to have local informatics experts on hand for development and custom integration of databases, continued support, and adaptation to new applications. Specifically, a programmer with experience in database and systems design, implementation, and maintenance is recommended, and this resident expertise is available in most if not all countries.
3.9.7 Equip interviewers with accessories that are needed for protecting and maintaining their equipment, such as laptop bags, screen covers, sleeves, rain shields, etc.
3.9.8 Have interviewers sign terms-of-use agreements detailing ownership and responsibilities relating to any equipment they will be using.
3.10 Manage data files during the field period.
3.10.1 The electronic data audit trail provides important paradata, and should be uploaded and backed up as well. Determine what will happen to paradata in case of equipment failure during an interview.
3.10.2 Lack of electricity and/or Internet connection can lead to delays in the backup and upload process. For example, a survey in Kenya experienced delays in immediate transfer of data collected due to electrical instability; data often could not be backed up in the field, and so was only backed up once a week at the study office.
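For the audit trail described in 3.10.1, an append-only format with one complete event per line limits paradata loss under equipment failure: everything written before a crash or battery failure remains readable line by line. A minimal sketch; the event vocabulary and field names are hypothetical.

```python
import json
import time

class AuditTrail:
    """Append-only keystroke-level audit trail: one JSON object per line.

    Because each event is written as a complete line, a file cut short
    mid-interview still yields every event recorded up to that point."""

    def __init__(self, path):
        self.path = path

    def record(self, question_id, event, value=None):
        entry = {"t": time.time(), "q": question_id,
                 "event": event, "value": value}
        # Open-append-close per event so nothing sits in an
        # unflushed buffer when power is lost.
        with open(self.path, "a", encoding="utf-8") as f:
            f.write(json.dumps(entry) + "\n")

    def events(self):
        with open(self.path, encoding="utf-8") as f:
            return [json.loads(line) for line in f if line.strip()]
```

The per-event open and close trades a little speed for durability, a reasonable exchange in settings with unreliable power.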
3.11 Develop strategies to increase privacy.
3.11.1 Though interviewer-administered computerization can, in general, increase the level of respondent privacy, the novelty of it in some cultures might attract bystanders, and interviewers may need additional training on how to request and achieve privacy in such situations. DHS interviewers in Nepal found they often had to make extra effort to maintain privacy, which usually demanded more time to administer the questionnaire.
3.11.2 Reading computer screens under direct sunlight can lead to difficulty in administering an interview and limit the options for confidential interview space. This can be a particular concern when asking sensitive questions related to sexual behavior and domestic violence.
3.12 Devote adequate time to interviewer training for CAPI-specific issues. When using CAPI, interviewer training is a two-step process, requiring technical training focused specifically on the survey instrument itself (e.g., the introduction of the instrument to respondents, how to use and care for the instrument, etc.) as well as study-specific and general interviewer training.
3.12.1 Instruct interviewers on how to introduce technology to the survey population, especially in settings where exposure to technology is more limited. This could be done by collaborating with community leaders, who could act as liaisons announcing the survey and the use of technology to their community members.
3.12.2 Instruct interviewers on how to explain the use of technology to respondents during the consent process (e.g. that recording will or will not be disabled).
3.12.3 Provide training on how to handle, label, care, transport, and store equipment properly. This is especially important in contexts where technology is more novel.
3.12.4 Instruct interviewers on steps to take in case of equipment failure and theft.
3.12.5 Instruct interviewers on password use, stylus (if needed), and how to access the questionnaire, enter responses, and insert and remove any memory cards used.
3.12.6 Operational instructions should be in the study site language, not only in English.
3.12.7 If paper questionnaires will be available in the event of equipment malfunction, training on the PAPI instrument is essential as well.
3.12.8 When using technology, there can be a tendency for interviewers to focus on the technology itself rather than the respondent, which should be addressed during interviewer training.
3.12.9 Allow interviewers ample time to practice administering the questionnaire to increase comfort with the flow of questions. Interviewers using CAPI instruments are more likely to lose track of where they are in the sequences of questions because they can see only one screen at a time, and familiarity with the instrument can decrease difficulty.
3.12.10 If using an electronic sample management system, train interviewers to complete household enumeration and randomly select eligible members within the household unit.
3.12.11 Although interviewers must be trained in the use of the specific computer program being utilized, it is crucial to devote adequate time to training on other important interpersonal aspects of survey implementation.
3.13 Develop interviewer management procedures for use in the field.
3.13.1 Interviewers must have fast and regular communication with field staff team leaders and technical support staff. This is necessary for troubleshooting and monitoring team progress.
3.13.2 Information technologies allow the implementation of a system of work ownership: if all personnel are assigned a code for database entry, supervision, and analysis, logs can be maintained that control data management and information flow.
3.1 Technology can be adopted even in resource-poor countries, leading to improvements in efficiency and data collection capabilities.
3.1.1 Researchers successfully conducted a Demographic and Health Survey (DHS) in Nepal using tablet PCs. The connection to the central network took one minute on average, and data transfer to the server in Kathmandu took approximately 5 to 7 minutes. In contrast, in the past, paper-based surveys had to be sent to Kathmandu via pouch mail or hand-carried, which took days or even weeks. The use of CAPI reduced data collection time by one month compared to the previous survey completed by PAPI (from 6.5 months total to 5.5 months). However, there were some security concerns with carrying and storing the tablets, especially in remote areas, because some interviewers had to stay in community members’ homes. Enforcing joint responsibility for theft of, or damage to, the tablet PCs among the interviewer teams helped to ensure security of the tablets during transport and storage. For example, interviewers were trained to lock and be aware of their tablet PCs at all times, even during meal and rest times. And, in a Peruvian survey, handheld computers were inserted into a wood-and-Styrofoam clipboard to conceal them and shield them from possible damage.
3.1.2 In cross-cultural surveys such as the World Mental Health (WMH) Initiative, some participating countries have been unable to implement technology-based survey instruments due to infrastructural constraints. However, the WMH Coordination Centre made the decision that those countries which can use technology (in this case, CAPI) should, as the advantages outweigh the methodological concerns of noncomparability. Other experimental studies have found few significant differences in survey estimates. The WMH Organization’s current recommendation is to challenge assumptions about where CAPI can and cannot be used. For example, in 2003, Colombia was able to implement the WMH survey with great success using CAPI. Countries that used PAPI in the most recent WMH surveys expressed the wish that they had more strongly pursued CAPI, especially given quality control demands and the complexity of the survey instrument.
3.1.3 Researchers should be aware that mode differences can occur in unanticipated ways. In a meta-analysis of studies from the United States, Canada, the United Kingdom, and Italy comparing data from PDAs to PAPI data, the results favor handheld computers over paper-and-pencil for data collection among study participants, but the data are not uniform for the different outcomes. Handheld computers appear superior in timeliness of receipt and data handling (four out of four studies) and are preferred by most subjects (three out of four studies). On the other hand, only one of the trials adequately compared adherence to instructions for recording and submission of data (handheld computers were superior), and comparisons of accuracy were inconsistent between five studies.
3.1.4 The availability of information and communication technologies for direct data transfer has the potential to improve the conduct of research (especially public health research) in resource-poor settings. The reduced data entry time in a CAPI vaccination survey in Zanzibar shortened the transition time to vaccination and, subsequently, to disease surveillance. As technology continues to evolve, research on its impact on survey data collection should continue.
3.1.5 In one study, a PDA-based survey in Tanzania resulted in an estimated 25% reduction in cost compared to a paper-based survey. Elimination of questionnaire printing costs is even more significant if multiple languages/versions are needed in a country, because multiple versions can be programmed into the same platform. In another effort to reduce costs, researchers found that sending an excess supply of batteries to study sites helped decrease use of PAPI and its associated additional costs.
3.1.6 The use of technology can greatly increase the efficiency with which data from multiple data collection modes can be linked. Current smartphone capabilities allow for scanning barcodes on respondent records, which can effectively link data from multiple sources, such as completed surveys, signed letters of consent, medical charts, biomarker records, etc.
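Once each record carries the scanned barcode ID, linking the sources reduces to a keyed merge. A sketch of the idea; the source names and barcode IDs are hypothetical.

```python
def link_records(sources):
    """Merge per-respondent records from multiple sources, keyed by the
    barcode ID scanned on each document (survey, consent form, biomarker
    record, etc.). Returns {barcode: {source_name: record}}, which makes
    gaps easy to audit (e.g., a survey without a matching consent form)."""
    linked = {}
    for source_name, records in sources.items():
        for barcode, record in records.items():
            linked.setdefault(barcode, {})[source_name] = record
    return linked

# Hypothetical inputs keyed by scanned barcode:
sources = {
    "survey":    {"R-0412": {"complete": True}, "R-0413": {"complete": True}},
    "consent":   {"R-0412": {"signed": True}},
    "biomarker": {"R-0412": {"sample": "BL-88"}},
}
```

Keeping the merge keyed solely on the scanned ID, rather than on names or addresses, avoids the matching errors that manual linkage introduces.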
3.1.7 The use of CAPI platforms can extend potential working hours. Because CAPI can be used in low-light situations, interviewers can work during evening hours, which is otherwise a challenge with paper questionnaires in settings with frequent power outages.
3.2 In non-Western settings, interviewers have generally reported a preference for CAPI instruments. Examples include the following from across the world:
3.2.1 CAPI was successfully implemented in a survey of malaria morbidity in Gambia, where interviewers reported a preference for CAPI over PAPI in terms of amount of work, number of errors, length of interviews, and ease of transport.
3.2.2 Handheld computers were used for a tobacco use survey in a hard-to-reach population in China, where most interviewers stated a preference for handheld computers for future surveys.
3.2.3 In a survey in Zanzibar, acceptability of PDA use was high among staff not familiar with computers or PDAs, and after an initial training period, none of the users were interested in returning to paper-based data entry.
3.2.4 In a survey in Bolivia, interviewers reported that using PDAs to administer interviews stimulated their own interest in working on the survey.
3.3 Use of technology has been well-received by respondents.
3.3.1 In a survey in Tanzania using PDAs, most respondents who expressed their opinions about the use of PDAs had something positive to say. For example, a 30-year-old man with primary-level education said, “I was very happy to see a computer, as it was my first time to see it. It simplified recording of our responses.” An elderly man expressed his appreciation of having learnt what day of the week he had been born.
3.3.2 In a Demographic and Health Survey in Nepal, respondents were curious about being interviewed using the tablet PCs. The interviewers perceived a high level of respect and enthusiasm from respondents, and they felt that respondents viewed them as technical employees with higher education. This was an unanticipated but encouraging finding, especially because of respondents’ limited exposure to computers. At the same time, in certain cultures, the use of technology can raise suspicion among respondents, and although acceptability of the tablet PCs was high, there were a few cases of skepticism. As part of the informed consent process, respondents were informed that the interview would not be video- or audio-recorded and that the recording feature had been disabled on the tablet PCs. Even so, a few respondents were still concerned.
3.3.3 Analyses from a survey in rural southern Kenya using PDAs found a reduction in refusals, attributed to the perception by respondents that the PDA was more secure .
3.3.4 Due to the increasing use of mobile phones and other similar technologies in day-to-day life, operating a computerized questionnaire on a handheld device might be more familiar to respondents with little or no experience in the use of computers .
3.4 Allow for adequate project preparation before beginning fieldwork.
3.4.1 Do not underestimate the additional time needed for preparation for both initial adoption and continued use of technology. In a survey in Burkina Faso, researchers reported underestimating the amount of work required to program questionnaires and, as a result, failed to maximize the use of some of the available options for input checking and other real-time quality control procedures. Village names, for example, were implemented as a text-entry field, but would have been better as a drop-down list to avoid ambiguities of spelling, etc. Combinations of input checks, plus quality control measures at the stage where data were downloaded to portable computers in the field, should have picked up concerns at an earlier and remediable stage .
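The drop-down-style constraint described above can be sketched as a simple lookup against a known list instead of a free-text field. This is a hypothetical illustration, not the study's actual program; the village names and function name are invented for the example.

```python
# Hypothetical sketch: constrain a village-name field to a known list,
# rather than accepting free text, to avoid spelling ambiguities.
KNOWN_VILLAGES = ["Nouna", "Koro", "Toni"]  # illustrative names only


def validate_village(entry, known):
    """Return the canonical village name, or None if the entry is unrecognized."""
    normalized = entry.strip().casefold()
    for village in known:
        if village.casefold() == normalized:
            return village  # canonical spelling, regardless of case/whitespace
    return None  # unrecognized entry: prompt the interviewer to re-select
```

In a real instrument this check would back a drop-down or autocomplete widget, so misspellings cannot enter the data file at all.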
3.4.2 Having local trained personnel is essential. Using a ‘train the trainers’ model, technical and supervisory staff in a public health survey in China were able to develop the questionnaire and complete the programming with minimal assistance from technical experts from the coordinating center. When problems occurred, the Chinese technical experts could then provide immediate technical guidance and troubleshooting to interviewers and other staff.
3.4.3 It can be difficult to repair equipment in-country. Researchers therefore recommend the implementation of a preventative maintenance program.
3.4.4 Consent letters mentioning the use of technology can be helpful in reducing nonresponse. In a survey in Tanzania using PDAs, most respondents said that they had noted the PDA after its mention in the consent letter. Several interviewees appreciated the interviewer having introduced them to the technology during the consent procedure .
3.5 When using CAPI, particularly with interviewers previously unfamiliar with computerized instruments, consider the following with regard to interviewer recruitment, training, and management.
3.5.1 Experience suggests that interviewers with little education and no experience in the use of a computer are easily able to use handheld devices for survey administration . With increasing use of mobile phones and other similar technologies, operating handheld devices, downloading data, and recharging batteries are becoming increasingly familiar concepts.
3.5.2 Although use of such technology is increasing, plan for adequate interviewer training time. In a survey in Bolivia using PDAs, interviewers wanted additional practice time because of previously limited experience with the technology; in particular, they wanted more instruction on the use of a stylus, as keyboards on handheld devices can be cumbersome.
3.5.3 Analyses of inter-observer accuracy and performance revealed a considerable range in a survey in Burkina Faso. Some interviewers clearly worked faster with the PDAs than others, though they were not necessarily those who covered the greatest number of households per day worked. However, those who carried out interviews relatively quickly were generally also those who made the fewest input errors. In surveys of this kind, where competence in local languages is an important factor, there are often not many options in terms of who can be recruited as interviewers .
3.5.4 Training on proper handling and care of the equipment is also very important, particularly in a rural context, where the equipment has to be transported through rough terrain, the power supply is not stable, and unexpected rain is a concern. In the DHS survey in Nepal, teams were provided with generators, rain shields, umbrellas, and several other items to manage these challenges. Enforcing joint responsibility for theft of, or damage to, the tablet PCs among the interviewer teams helped to ensure security of the tablets during transport and storage. With proper care and maintenance, tablet PCs (and portable generators) can be reused in future surveys, resulting in additional cost savings over the long term .
3.6 The use of CAPI is not without its technology-related challenges.
3.6.1 Project staff should be aware of the possibility of corrupted dates and timestamps due to equipment malfunction. In a survey using PDAs in Kenya, researchers found that if a PDA lost power, its clock would automatically reset, which affected the pregnancy data being collected. Particular caution should be used if data are time-sensitive, as they were in this case.
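One way to catch reset clocks of this kind is a plausibility check that rejects interview dates outside the fieldwork window. The sketch below is illustrative; the window dates are assumptions, not values from the study.

```python
# Hypothetical sketch: flag implausible interview dates, as happens when a
# device clock resets to its factory default after losing power.
from datetime import date

FIELDWORK_START = date(2016, 1, 1)  # assumed fieldwork window for illustration
FIELDWORK_END = date(2016, 6, 30)


def timestamp_plausible(recorded):
    """True if the recorded interview date lies within the fieldwork period."""
    return FIELDWORK_START <= recorded <= FIELDWORK_END
```

Running such a check when data are downloaded from devices lets staff catch clock resets while the affected interviews can still be dated from other records.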
3.6.2 If using CAPI, the concurrent use of paper files for portions of the survey can lead to logistical challenges. In one study, researchers used PDAs to follow patients who visited a rural Kenyan health center. The data entry program did not allow the research assistants to enter text field notes, so they had to use a paper notebook for such notes. This can cause a disconnect between the text notes and the patient data to which they refer.
3.7 Researchers have used several methods for maintaining respondent confidentiality and ensuring data security when using CAPI.
3.7.1 Data can be copied and automatically saved to an SD card, after which interviewers are unable to retrieve or change an entry, and no record of the entry is retained on the PDA. In case of equipment loss, the data on the SD card cannot be accessed without a password and the requisite software. In a survey in Kenya, when one PDA was stolen during political violence, the two interviews stored on its SD card were lost, but respondent confidentiality was maintained because of the security protocols in place.
3.7.2 In a survey in Tanzania, data were downloaded to the laptop computers and daily summary reports produced to evaluate the completeness of data collection. Data were backed up at three levels: (i) at the end of every module, data were backed up onto storage cards in the PDA; (ii) at the end of every day, data were downloaded to laptop computers; and (iii) a compact disc (CD) was made of each team’s data each day .
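The first two backup levels above can be sketched as small file operations. This is a minimal, hypothetical illustration of the workflow, not the study's actual software; the file layout and function names are assumptions.

```python
# Hypothetical sketch of the layered backup protocol described above:
# (i) back up to the device's storage card after each module, and
# (ii) download the card's contents to a laptop at the end of the day.
import shutil
from pathlib import Path


def backup_module(record, sd_card):
    """Level (i): append a completed module's data to the storage card."""
    sd_card.mkdir(parents=True, exist_ok=True)
    dest = sd_card / "module_data.txt"
    with dest.open("a") as f:
        f.write(record + "\n")
    return dest


def end_of_day_download(sd_card, laptop):
    """Level (ii): copy the day's file from the storage card to the laptop."""
    laptop.mkdir(parents=True, exist_ok=True)
    return Path(shutil.copy(sd_card / "module_data.txt", laptop))
```

A third level (the daily CD per team) would simply archive the laptop directory onto removable media; the point of the layering is that no single equipment failure loses more than a few hours of work.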
3.8 As smartphones become more ubiquitous in daily life, their use in survey research is expected to increase. Findings from a recent survey examining the usability of smartphones vs. tablets in Kenya generally favor tablets over smartphones . Highlights from the study include:
3.8.1 Interviewers’ confidence and comfort in typing depended on past experience with the device and touchscreens.
3.8.2 Interviewers felt more likely to accidentally select the wrong options on phones.
3.8.3 Interviewers admitted to not scrolling completely through questions/responses on phones.
3.8.4 Interviewers felt more professional with tablets.
3.8.5 Interviewers felt safer with phones because of the smaller size compared to the tablets, which attracted unwanted attention.
3.8.6 Smartphones were associated with more typing errors.
3.8.7 Long open-ended questions and long numeric strings were difficult to enter on phones.
⇡ Back to top
4. If the questionnaire includes items of a sensitive nature, consider administering these questions in a self-administered module during the face-to-face interview.
Evidence suggests that increasing privacy during an interview can improve the accuracy of reporting on such topics in surveys, but the ability to achieve privacy in non-Western settings varies considerably between countries. For a face-to-face interview, consider administering the sensitive sections in a self-administered questionnaire (SAQ). Research indicates that respondents in an interviewer-administered, non-private setting tend to misreport information perceived to be sensitive. For example, respondents might underreport undesirable or private information such as drug use or illegal status, and they might overreport desirable information such as voting.
Many surveys include potentially sensitive questions about both respondent behavior and attitudes concerning such topics as sexual behavior and contraceptive use, substance abuse, violence, and politics. These delicate topics are particularly susceptible to social desirability bias in non-Western settings. However, asking sensitive questions in a self-administered format has the potential to decrease bias and achieve more accurate reporting.
4.1 Assess the literacy of the target population and choose the most appropriate instrument for the SAQ.
4.1.1 The SAQ can be a paper questionnaire given to the respondent to self-complete. The paper-based SAQ should not have complex skip patterns, and the target population should have adequate literacy levels.
4.1.2 The SAQ can take the form of computer-assisted self-interviewing (CASI), where respondents use a technology platform (i.e., a laptop, tablet, smartphone, etc.) and complete either the entire questionnaire or a specific section of it independently. The technology therefore facilitates the administration of a complex instrument, much like CAPI facilitates administration for the interviewer.
4.1.3 Audio computer-assisted self-interviewing (A-CASI) has the advantages of CASI, but can be particularly helpful in low-literacy settings. In A-CASI, respondents listen to an audio recording of each survey question through a headset and move through the survey at their own pace. Illiterate respondents can be instructed to push color-coded buttons on a touchscreen or mini-keyboard, or can be shown graphical representations of answer categories, to indicate their response to each question (see Instrument Technical Design, Appendix F for an example).
4.1.4 If using A-CASI, assess whether the setting would benefit from gender-matching the audio voice: that is, whether female respondents should hear a recorded female voice and male respondents a recorded male voice.
4.2 When designing an SAQ instrument, consider the following:
4.2.1 Be mindful of survey length. Longer surveys administered using an SAQ mode may have more missing data, both because of the lack of interviewer probing and because respondents feel less pressure to cooperate with the interviewer.
4.2.2 Develop interviewer instructions for explaining the SAQ to the respondent.
- The level of detail of the instructions will differ by mode, with CASI and A-CASI necessitating more explanation than a paper-based SAQ, particularly in low-literacy settings.
- If an SAQ is utilized for reasons involving increased respondent confidentiality, then this rationale should be explained to respondents.
- Develop a protocol for interviewer behavior during the interview, particularly concerning the extent to which interviewers should be encouraged to help or otherwise interact with the respondent. All interactions should be documented.
- Consider adding questions at the end of the interview to assess respondents’ perceived ease of use, privacy, and truthfulness.
4.2.3 When using CASI and A-CASI modes, attention to details that facilitate the respondent experience can lead to increased data quality.
- Consider disabling the screen saver and monitor power-saver settings on the device so that screens do not go blank if a participant takes additional time to answer a question .
- Graphical and/or audio representations of the response process can help guide the respondent through the interview. In a survey in India using A-CASI, the entry of a response was marked by the change in the color of the corresponding response bar on the screen to grey along with a ‘beep’ sound. A ‘thank you’ screen indicated the end of the survey .
- If a participant does not answer a question after approximately 60 seconds, consider repeating the question and/or displaying additional programmed text encouraging the participant to answer the item(s) truthfully.
- If used, the keyboard should be user-friendly. Keyboard options can be limited to responses (e.g. YES, NO, and numbers), and larger color-coded keyboard keys could be used. Additional keyboard shortcuts to replay questions can also be marked.
- Text on the computer screen should be large enough to be easily legible for respondents.
- In an A-CASI survey in India, neither the questions nor the responses were displayed on the screen to ensure privacy and confidentiality for the respondents (Bhatnagar et al., 2013).
- Touchscreens on A-CASI instruments can be particularly helpful for less-educated populations .
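The 60-second re-prompt rule in the list above can be sketched as a small decision function that the instrument calls on an idle timer. The threshold and encouragement text below are illustrative assumptions, not values from any cited study.

```python
# Hypothetical sketch of an idle re-prompt rule: after a fixed idle
# threshold, repeat the question together with encouraging text.
IDLE_THRESHOLD_SECONDS = 60  # assumed threshold, per the guideline above


def idle_action(seconds_idle, question):
    """Return the text to display for an idle respondent, or None to wait."""
    if seconds_idle < IDLE_THRESHOLD_SECONDS:
        return None  # respondent is still within the allowed thinking time
    return question + " Please answer as truthfully as you can."
```

In practice the instrument would also log how often the re-prompt fired, which is itself a useful paradata indicator of items respondents find difficult.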
4.3 Additional technologies for SAQ modes in a face-to-face interview continue to emerge, including video-computer-administered self-interview (V-CASI). If planning to use an SAQ, investigate the most recent literature available for further guidance.
4.1 The use of novel technology, particularly in non-Western settings, can motivate respondents to participate.
4.1.1 In a comparison of paper SAQ vs. self-administered PDA questionnaires on sexual behavior, South African adolescents reported more favorable attitudes toward the PDA mode.
4.1.2 In a South African survey about sexual behavior, end-of-questionnaire items measured high respondent-perceived truthfulness and a greater preference for A-CASI over other modes, primarily because of perceived increases in confidentiality and privacy, as well as the novelty of the technology.
4.1.3 However, if technology is unfamiliar to the population, it may cause concern about the project activities. In a study using A-CASI in rural Kenya, interviewers and supervisors reported that the presence of computers heightened the animosity and opposition the community had toward the project activities. Rumors spread that the survey was the work of devil worshipers, and that interviewers were collecting the names of adolescents who would later be abducted. Many respondents believed that the computers collected information for the government. Also, respondents were angry that expensive equipment was brought into resource-starved communities during a time of drought. Misinformation spread throughout the region before interviewers even entered some sampling units. Some residents thought that the computer was having a ‘conversation’ with the respondents, despite insistence that the computer voice was taped. And an initial A-CASI protocol in which respondents’ answers would be read back to them after each question for verification needed to be discontinued, because some respondents perceived the computer to be “talking to them,” resulting in decreased perceptions of confidentiality .
4.2 Use of an SAQ mode can affect the length of time needed for administration, depending on the setting and respondent demographics. The HIV/STD Prevention Study found that surveys using A-CASI generally took longer to administer than CAPI in China, Peru, India, and Zimbabwe. However, A-CASI took less time in Russia, where participants were younger and had more exposure to technology.
4.3 There is evidence that using A-CASI is feasible in non-Western settings.
4.3.1 The NIMH Collaborative HIV/STD Prevention Trial Group conducted a feasibility study comparing results from surveys using CAPI and A-CASI in China, India, Peru, Russia, and Zimbabwe . Despite the varying levels of literacy and exposure to computers by country, most study participants reported that it was easy to enter their answers into the computer, that they felt comfortable doing so, and that they either preferred the computer over an interviewer for answering questions about topics such as sexual behavior and drug and alcohol use or had no preference. Most participants gave the same responses on both their A-CASI and CAPI interviews.
4.3.2 While A-CASI has generally been feasible in non-Western settings, ease of use can vary by socio-demographic characteristics.
- Older and unemployed respondents report increased difficulty with A-CASI, as do less educated respondents.
- Women with little education (primary school or less) had considerably more problems using the computer keyboard, reading the computer screen, and correcting mistakes than women in higher educational groups.
4.4 In regions where there are multiple languages and dialects, use of A-CASI can facilitate the interview process. A completely self-administered questionnaire can ease the logistical challenges in the field of matching a respondent with an interviewer who has the necessary language capabilities.
4.5 Use of A-CASI can lead to improvements in data quality.
4.5.1 A-CASI is a more standardized method of assessment than CAPI. Using CAPI, interviewers may use probes beyond the standard set, even though they are instructed not to do so.
4.5.2 Unlike a paper-based SAQ, use of A-CASI leads to fewer data entry errors and less missing data, because the skip patterns are programmed into the computer and executed as the interview is administered.
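Programmed skip logic of this kind can be sketched as a routing table that computes the next question from the current answer, so invalid paths cannot occur. The question identifiers and skip rule below are hypothetical, invented purely for illustration.

```python
# Hypothetical sketch of programmed skip logic: the next question is
# derived from the current answer, so the respondent cannot take an
# invalid path through the instrument.
SKIP_RULES = {("Q1", "no"): "Q3"}  # assumed rule: 'no' to Q1 skips Q2
QUESTION_ORDER = ["Q1", "Q2", "Q3"]


def next_question(current, answer):
    """Return the next question ID, or None when the interview is finished."""
    if (current, answer) in SKIP_RULES:
        return SKIP_RULES[(current, answer)]
    idx = QUESTION_ORDER.index(current)
    return QUESTION_ORDER[idx + 1] if idx + 1 < len(QUESTION_ORDER) else None
```

Because the routing is computed rather than followed by eye, the skip errors and stray missing data that plague paper SAQs cannot arise.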
4.5.3 In a comparison study on topics related to HIV/AIDS in three cities in Vietnam, respondents assigned to A-CASI had lower item refusal rates than those assigned to a face-to-face interview or a paper-based SAQ.
4.5.4 There is evidence that using A-CASI has the potential to improve data quality through the reduction of missing data. Studies of mode differences in South Africa and Thailand have found that those respondents assigned A-CASI had less missing data than those assigned to a paper-based SAQ.
4.5.5 A short respondent training session prior to the administration of A-CASI can improve data quality. A survey of young women in Malawi utilized headphones and an external color-coded mini keypad, with a red key to replay the question, a green key to go on to the next question, and a yellow key to skip a question. For dichotomous questions, respondents were instructed to press 1 for ‘yes’ and 2 for ‘no.’ Prior to the A-CASI main survey, each respondent completed three ‘practice’ questions to evaluate her understanding of the interview process; for example, “Are you a male or a female?”. For each practice question, the correct answers were previously entered by the interviewer to serve as a check against the respondent’s entry. Respondents were not able to proceed to the main interview until they were able to answer all three practice questions correctly .
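The practice-question gate described in this Malawi example can be sketched as a comparison between the respondent's entries and answers the interviewer keyed in beforehand. The question keys below are hypothetical stand-ins, not the study's actual items.

```python
# Hypothetical sketch of a practice-question gate: the respondent may not
# proceed to the main interview until every pre-keyed practice answer
# matches the respondent's own entry.
def may_proceed(keyed_answers, respondent_entries):
    """True only when all practice questions were answered correctly."""
    return all(
        respondent_entries.get(question) == correct
        for question, correct in keyed_answers.items()
    )
```

In the fielded instrument, a False result would route the respondent back through the practice module (with interviewer help) rather than into the main survey.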
4.6 On the other hand, use of A-CASI can bring challenges to data quality as a result of decreased interviewer interaction.
4.6.1 Respondents may not understand skip patterns or other aspects of the survey, but feel reluctant to ask the interviewer for direct assistance given the hands-off nature of A-CASI .
4.6.2 An SAQ on sensitive topics may also make respondents reluctant to ask the interviewer related questions about completing the survey because, with A-CASI, there can be an underlying perception that the topic is too delicate to discuss outright.
4.6.3 A study using an SAQ in Tanzania found that about 7% of respondents selected only the first or the last response categories in a section where such a response pattern would be inconsistent. This pattern was more common among women, less educated respondents, and those in more geographically remote areas.
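A straight-lining check of the kind implied by this finding might be sketched as follows. This is a minimal illustration under assumed coding (categories numbered 1 through n), not the study's actual method.

```python
# Hypothetical sketch: flag respondents who chose only the first or only
# the last response category across a section where varied answers would
# be expected.
def is_straight_liner(responses, n_categories):
    """Flag a response vector consisting solely of the first or last category."""
    if not responses:
        return False  # no answers in the section: nothing to flag
    return all(r == 1 for r in responses) or all(
        r == n_categories for r in responses
    )
```

Running such a flag during fieldwork, rather than at analysis, lets supervisors follow up with affected interviewers or subgroups while re-interviewing is still possible.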
4.6.4 Mode can affect data quality because of inconsistency in editing. A comparative survey of young women in Malawi found more consistent reporting in the face-to-face mode than in the A-CASI mode. Researchers speculated that, against protocol, the face-to-face interviewers may have edited responses for consistency post hoc, whereas such editing by respondents was not possible in the A-CASI mode.
4.7 A-CASI is often used with the a priori expectation that privacy for the respondent will result in increased reporting of sensitive behaviors, with the related implication that this reporting is, indeed, accurate. However, meta-analyses using data from non-Western settings are inconclusive as to whether SAQ modes increase the accuracy of reporting of sensitive behaviors.
4.8 Results from a meta-analysis of face-to-face and A-CASI modes in studies on sexually transmitted infections and associated behaviors in Brazil, Vietnam, Thailand, Kenya, India, Russia, Zimbabwe, Malawi, China, and Tanzania demonstrate that overall, A-CASI methods are not consistently associated with a significant increase in reporting of sensitive behaviors, but trends can be seen in certain contexts. In general, increased reporting in A-CASI has been associated with region (Asia), setting (urban), and education (secondary education) .
4.9 In contrast, a meta-analysis of 26 studies in developing countries on sexual behavior demonstrated that, in general, A-CASI can significantly reduce reporting bias. The results of this review, as well as findings from other researchers, show that the relationship with and success of novel interviewing methods has proved complex in low- and middle-income country contexts, and researchers should be aware of the mode differences that can result depending on the study topic and social context.
⇡ Back to top