Study Design and Operational Structure
Inputs
- Study goals and objectives
- Country-specific legislation on conducting survey research
- Leadership roles and responsibilities
- Timeline
- Deliverables
- Quality standards
- Budget
Activities
- Create framework and structure of responsibilities and tasks
- Arrange regular meetings of working group and team leaders
- Develop communication flowchart
- Determine the study’s quality standards, then implement them throughout the research process
- Develop quality management plan and identify quality profile elements
- Assess quality indicators (i.e., paradata-derived indicators) at each stage and make appropriate changes, repeating the Plan-Do-Check-Act (PDCA) cycle (see the sketch after this list)
- Implement a certification process to check study design and quality standards
- Consider site visits to all countries to monitor or support the implementation of quality standards
- Monitor costs in order to avoid overruns
- If and where possible, incorporate methodological research
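A minimal sketch of the "Check" step of the PDCA cycle mentioned above, assuming a hypothetical case-level paradata extract with `interview_date` and `disposition` fields and an illustrative response-rate target; days on which the cumulative response rate falls below the target would prompt corrective action (the "Act" step).

```python
import pandas as pd

# Hypothetical paradata extract: one row per sampled case.
cases = pd.DataFrame({
    "interview_date": pd.to_datetime(
        ["2024-03-01", "2024-03-01", "2024-03-02", "2024-03-02", "2024-03-03"]),
    "disposition": ["complete", "refusal", "complete", "noncontact", "complete"],
})

TARGET_RESPONSE_RATE = 0.60  # illustrative target taken from the quality standards

daily = (cases.assign(complete=cases["disposition"].eq("complete"))
              .groupby("interview_date")["complete"]
              .agg(["sum", "count"])
              .cumsum())
daily["cum_response_rate"] = daily["sum"] / daily["count"]
daily["below_target"] = daily["cum_response_rate"] < TARGET_RESPONSE_RATE

print(daily)  # review flagged days and decide on corrective action
```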
Quality Monitoring
- Monitor paradata-derived quality indicators
- Monitor budget, costs, and timeline for each country
Documentation
- Study goals and objectives
- Documentation and formatting requirements
- All study implementation procedures
- Documentation of modifications to study protocol
- Summary of each organization’s performance
Ethical Considerations
Inputs
- Standards for ethical and scientific conduct
- Local and national human subject regulations and legislation
- Ethical guidelines in project management and human resource management
- Voluntary informed consent protocol and procedures
- Procedures for ethics training of project staff
- Comprehensive plan for protection of confidentiality
Activities
- Review and apply ethical standards, best practices, and relevant regulations and legislation in designing study and collecting and disseminating survey data
- Develop and apply knowledge of local customs and norms relevant to designing culturally sensitive survey protocols
- Pretest consent protocol and forms to ensure comprehension
- Translate and adapt consent protocols and forms according to best practices for translation
- Assess respondent burden (overall and by subgroup, if appropriate)
- Train project staff on ethics
- Have project staff sign pledge of confidentiality
- Complete ethics review submission and maintain documentation of submission materials
- Review recorded interviews and monitoring data to ensure adherence to informed consent procedures
- Monitor implementation of confidentiality protocols and procedures
- Perform audits to determine adherence to confidentiality protocols and procedures
- Securely store signed pledges of confidentiality and consent forms
- Maintain records of all ethics review committee correspondence
- Recontact a sample of cases for each interviewer to verify that screening and interview procedures were appropriately followed
- Conduct verification to detect possible interview falsification
- Use analyses of paradata to identify unusual variable distributions for one or more interviewers compared to the overall distribution (see the sketch after this list)
- Conduct disclosure analysis
- Investigate any deviation from ethical protocols and take appropriate action to address the situation
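A minimal sketch of the paradata-based falsification screen above, assuming a hypothetical interview-level file with `interviewer_id`, an `interview_minutes` paradata field, and one key item `q1`; thresholds are illustrative, and flagged interviewers would be reviewed, not automatically judged.

```python
import pandas as pd

# Hypothetical interview-level records: paradata plus one key survey item.
df = pd.DataFrame({
    "interviewer_id": ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
    "interview_minutes": [42, 38, 45, 12, 10, 14, 40, 44, 39],
    "q1": [3, 4, 2, 5, 5, 5, 3, 2, 4],
})

overall_mean = df["interview_minutes"].mean()
overall_sd = df["interview_minutes"].std()
modal_q1 = df["q1"].mode()[0]  # most common answer overall

per_iwer = df.groupby("interviewer_id").agg(
    mean_minutes=("interview_minutes", "mean"),
    share_modal_q1=("q1", lambda s: (s == modal_q1).mean()),
)
per_iwer["z_minutes"] = (per_iwer["mean_minutes"] - overall_mean) / overall_sd

# Flag interviewers whose interview length or answer pattern deviates strongly
# from the overall distribution, for follow-up verification.
flagged = per_iwer[(per_iwer["z_minutes"].abs() > 1) | (per_iwer["share_modal_q1"] > 0.9)]
print(flagged)
```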
Quality Monitoring
- Report on staff completion of ethics training
- Review the implementation of informed consent procedures (percent of cases reviewed, percent of cases failing to follow procedures, actions taken, etc.)
- Report on interview falsification (percent of cases reviewed, percent of reviewed cases falsified, subsequent actions taken, etc.)
- Report on any actual or potential breaches of confidentiality, security, or other adverse event, including any resulting changes to study protocol
- Report on any failures of statistical disclosure control
Documentation
- Description of voluntary consent and confidentiality procedures
- Copies of materials provided to respondents as part of informed consent process
- Summary of respondent burden assessment
- Description of ethics training for project staff
- Summary of ethics committee review
- Summary of review of recorded interviews regarding the implementation of informed consent procedures
- Summary of falsification findings
- Summary of any reported actual or potential breaches of confidentiality
- Description of disclosure analysis methods and summary of findings
Tenders, Bids, and Contracts
Inputs
- Type of contract offered
- Study specifications
- Minimum quality requirements and evaluation criteria for bids
Activities
- Prepare tender based on study specifications
- Conduct competitive bidding process within each country
- Evaluate bids and select a survey organization in each country (see the scoring sketch after this list)
- Consider reissuing the tender if no bidding survey organization can meet the requested quality standards
- Define progress approval points throughout the research process
- Develop a quality management plan
- Develop quality control and quality assurance procedures
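A minimal sketch of one way to combine bid evaluation criteria into a single score, assuming hypothetical criteria, weights, and a minimum-requirements flag; the actual criteria and weights would come from the study specifications.

```python
# Hypothetical bids: criterion scores on a 0-10 scale plus a minimum-requirements check.
bids = {
    "Org 1": {"technical": 8, "experience": 7, "cost": 6, "meets_minimum": True},
    "Org 2": {"technical": 9, "experience": 8, "cost": 4, "meets_minimum": True},
    "Org 3": {"technical": 6, "experience": 9, "cost": 9, "meets_minimum": False},
}
weights = {"technical": 0.5, "experience": 0.3, "cost": 0.2}  # illustrative weights

scores = {
    org: sum(weights[c] * bid[c] for c in weights)
    for org, bid in bids.items()
    if bid["meets_minimum"]  # bids failing the minimum quality requirements are excluded
}
best = max(scores, key=scores.get)
print(scores, "-> selected:", best)
```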
Quality Monitoring
- Report on evaluation scores of bidding organizations
Documentation
- Summary of process of evaluating and selecting bidding organizations
Sample Design
Inputs
- Target and survey population descriptions
- Sampling frame(s), definitions (including definitions of strata and sampling units), and any updating of the frame that was needed
- Desired level of precision, both overall and for specific subgroups
- Sample size based on specified levels of precision
- Selection procedure(s) and estimates of probabilities of selection at each stage
- Standard field listing procedures and minimum requirements for field listers
- Unique sample identification codes for each selected sampling unit
- Data dictionary
Activities
- Produce, update, and/or clean sample frame(s), as needed
- Calculate sample size (see the worked sketch after this list)
- Implement selection procedure(s)
- Create a unique sample identification code for each selected element or unit
- Arrange regular meetings of working group, project manager, and sampling statistician
- Implement responsive design plans to minimize survey costs and errors
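A worked sketch of a sample size calculation for a proportion, assuming illustrative values for the desired precision, an anticipated design effect, and an expected response rate; the actual inputs come from the precision requirements listed under Inputs.

```python
import math

# Illustrative inputs (assumptions, not study values).
p = 0.5            # anticipated proportion (0.5 is the conservative choice)
e = 0.03           # desired margin of error (half-width of the 95% confidence interval)
z = 1.96           # z-value for 95% confidence
deff = 1.5         # anticipated design effect from clustering and weighting
response_rate = 0.70

n_srs = (z ** 2) * p * (1 - p) / e ** 2          # simple random sampling size
n_design = n_srs * deff                          # inflated for the complex design
n_release = math.ceil(n_design / response_rate)  # inflated for expected nonresponse

print(n_srs, n_design, n_release)  # about 1067 -> 1601 -> 2287 cases to release
```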
Quality Monitoring
- Estimate c
- Report on percentage of duplicate and ineligible sampling units on the sampling frame(s) (see the sketch after this list)
- Produce tables/charts of paradata indicators that serve as proxies of survey costs and errors
- Alter the survey design during data collection to minimize costs and errors in a responsive design framework
- Produce frequency tables for key variables from the frame of sampling units
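A minimal sketch of the frame checks above, assuming a hypothetical frame file with an `address_id` used to detect duplicates, an `eligible` flag, and a key frame variable `region`; real frames would need study-specific deduplication keys.

```python
import pandas as pd

# Hypothetical sampling frame extract.
frame = pd.DataFrame({
    "address_id": [101, 102, 102, 103, 104, 105],
    "eligible":   [True, True, True, False, True, True],
    "region":     ["N", "N", "N", "S", "S", "S"],
})

pct_duplicate = frame["address_id"].duplicated().mean() * 100
pct_ineligible = (~frame["eligible"]).mean() * 100
print(f"duplicates: {pct_duplicate:.1f}%  ineligible: {pct_ineligible:.1f}%")

# Frequency table for a key frame variable, as a quick plausibility check.
print(frame["region"].value_counts())
```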
Documentation
- Time dimension of design (e.g., one-time cross-sectional, fixed, or rotating panel)
- Target and survey population definitions, including inclusion/exclusion criteria
- Sampling frame(s) descriptions
- Examples of maps and protocol used in field listing
- Description of all stages of selection, including sample sizes, stratification, clustering, oversampling, and number of replicates fielded at each stage
- Documentation of procedures to determine probabilities of selection and weights for each stage of selection
- Tables of the precision of the estimates of key survey statistics
- (If necessary) descriptions of substitution procedures
Questionnaire Design
Inputs
- Research question(s)
- Review of literature and any relevant studies to identify useful material
- Documentation templates
- Documentation of origins of any existing questions or materials to be considered for re-use
Activities
- Create cross-cultural and cross-competence development team, providing briefing, training, and tools as relevant
- Determine design approach
- Create analysis plan relating constructs, indicators, and question topics
- Implement design steps
- Determine appropriate methods to assess the quality of questions
- When possible, use wording experiments to decide between different candidate question wordings (see the sketch after this list)
- Apply version control procedures whenever a source questionnaire is modified over time
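A minimal sketch of how a split-ballot wording experiment might be evaluated, assuming respondents are randomly assigned to wording A or B and give a categorical answer; a chi-square test (here from scipy) compares the two response distributions. Data and labels are illustrative.

```python
from collections import Counter
from scipy.stats import chi2_contingency

# Hypothetical split-ballot results: answers under wording A vs. wording B.
answers_a = ["agree"] * 60 + ["neutral"] * 25 + ["disagree"] * 15
answers_b = ["agree"] * 45 + ["neutral"] * 30 + ["disagree"] * 25

categories = ["agree", "neutral", "disagree"]
table = [
    [Counter(answers_a)[c] for c in categories],
    [Counter(answers_b)[c] for c in categories],
]

chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p_value:.3f}")  # a small p suggests wording affects responses
```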
Quality Monitoring
- Description of the questionnaire design procedures
Documentation
- Report on modifications made to questions at different stages
- Document different versions of questionnaires if applicable
Adaptation
Inputs
- Source questionnaires and any materials which might be adapted
- Translated questionnaires and any materials which might be adapted
- Documentation templates as relevant
- Guidelines on adaptation goals and more common forms
- Briefing and training of team, as necessary
- Delivery schedule and required outputs
Activities
- Determine stage(s) at which adaptation is possible
- Create adaptation team with skills suited for whichever stage(s) are envisaged
- Make adaptation proposals with documented justifications
- Conduct external review of adaptation proposals and their documentation
- Test adaptations for targeted population(s) and revise as relevant
- Adjudicate/sign off on adaptation decisions and finalize documentation
Translation
Inputs
- Source questionnaire and any material to be translated
- Guidelines and stipulations on procedures to be followed and on outputs required (e.g., need for documentation on decisions)
- Templates for translation development, as relevant
- Delivery schedule, including any further refinements proposed that relate to translation (procedures such as language harmonization, adaptation, pretesting, and any required adjudication steps)
- Procedure to monitor performance as appropriate
Activities
- Create translation team, briefing, training, and monitoring as relevant
- Produce draft translations, checking translator output at an early stage of production
- Maintain documentation at each stage
- Review and adjudicate translations
- Pretest translations
- Repeat any translation refinement step as needed
Quality Monitoring
- Draft translation review report
Documentation
- Documentation of translation review process
Instrument Technical Design
Inputs
- Instrument specification guidelines
- Comprehensive design evaluation plan, including goals, evaluation techniques, and timeline
- Quality assurance metrics (e.g., questionnaire and item timings, review of computer-assisted application audit trails, behavior/event codes)
Activities
- Provide clear instrument specifications and/or data dictionary
- Provide culture- or language-specific adaptations of design specifications
- Develop instrument evaluation procedures
- Perform and report on design assessments
- Review quality assurance metrics reports
- Make recommendations for improvement
Quality Monitoring
- Collect and report on quality metrics or measures, including:
- Questionnaire length and section and item timings (see the sketch after this list)
- Audit trails for computerized applications
- Behavior codes or event codes based on audio or video recordings of pretests or usability tests
- Qualitative analysis of cognitive interviews and usability testing (see Pretesting)
- Heuristic evaluation or expert review
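A minimal sketch of deriving item timings from a computer-assisted instrument's audit trail, assuming a hypothetical keystroke file with one timestamped row per item submission; real audit-trail layouts vary by software.

```python
import pandas as pd

# Hypothetical audit trail: one timestamped row each time an item screen is submitted.
trail = pd.DataFrame({
    "case_id": [1, 1, 1, 2, 2, 2],
    "item":    ["q1", "q2", "q3", "q1", "q2", "q3"],
    "timestamp": pd.to_datetime([
        "2024-03-01 10:00:00", "2024-03-01 10:00:40", "2024-03-01 10:01:30",
        "2024-03-01 11:00:00", "2024-03-01 11:02:10", "2024-03-01 11:02:55",
    ]),
})

trail = trail.sort_values(["case_id", "timestamp"])
# Seconds spent on an item = gap since the previous submission within the same case
# (the first item per case has no preceding timestamp and stays missing).
trail["seconds_on_item"] = trail.groupby("case_id")["timestamp"].diff().dt.total_seconds()

print(trail.groupby("item")["seconds_on_item"].median())        # item timings
print(trail.groupby("case_id")["timestamp"]
           .agg(lambda t: (t.max() - t.min()).total_seconds())) # questionnaire length
```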
Documentation
- Instrument specification guidelines
- Procedures for design evaluation
- Results of design evaluations
- Documentation and results of quality assurance and quality monitoring and control
Interviewer Recruitment and Training
Inputs
- Recruitment and training timeline
- Minimum standards for employment
- Study-specific requirements (e.g., gender, language, etc.)
- Assessment tests
- Minimum interviewer requirements checklist
- Criteria for dismissal or followup training
- Standard certification procedures
Activities
- Establish a checklist of minimum interviewer candidate requirements
- Train trainers before they train interviewers
- Complete checklist during candidate screening
- Take attendance during training
- At the end of basic interviewer training, evaluate the knowledge of the interviewer candidates
- Certify candidates
- Dismiss or retrain candidates who fail certification
- Maintain written records of candidates’ certification test results
- Track the cost and success rates of different recruitment avenues (see the sketch after this list)
- Survey interviewer candidates to determine what improvements could be made to the recruitment process
- Debrief interviewer trainees to determine how training could be improved
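A minimal sketch for tracking recruitment avenues, assuming a hypothetical candidate roster with the recruitment source, the recruiting cost attributed to each candidate, and the certification outcome.

```python
import pandas as pd

# Hypothetical candidate roster.
candidates = pd.DataFrame({
    "source":    ["job board", "job board", "referral", "referral", "newspaper"],
    "cost":      [120, 120, 40, 40, 200],
    "certified": [True, False, True, True, False],
})

report = candidates.groupby("source").agg(
    candidates=("certified", "size"),
    certified=("certified", "sum"),
    total_cost=("cost", "sum"),
)
# Cost per certified interviewer; sources with no certified candidates stay missing.
report["cost_per_certified"] = (
    report["total_cost"] / report["certified"].where(report["certified"] > 0)
)
print(report)
```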
Quality Monitoring
- Report on training attendance
- Report on candidate training certification (including rates)
- Report on followup training certification (including rates)
Documentation
- Employment criteria
- General and study-specific training documentation
- Certification procedures
- Certification rates for training and followup training
Pretesting
Inputs
- Pretesting plan, including pretest goals, evaluation techniques, timeline, and budget
- Standard procedures for staff training
Activities
- Provide staff training and certification
- Examine the findings of each pretesting technique used and identify the causes of any problems discovered
- Review results from a pilot study, if conducted
- Review recordings of focus groups and cognitive interviews for staff errors
- Provide retraining as necessary
- Test for inter-coder reliability, if appropriate
- Coordinate the documentation of the pretest across participating countries
Quality Monitoring
- Monitor costs and timeline
- Monitor staff error rates
- Test inter-coder reliability (see the sketch after this list)
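A minimal sketch of an inter-coder reliability check using Cohen's kappa for two coders, computed directly from the observed and chance-expected agreement; the codes and cases are illustrative.

```python
from collections import Counter

# Hypothetical codes assigned by two coders to the same set of pretest cases.
coder1 = ["A", "A", "B", "B", "C", "A", "B", "C", "C", "A"]
coder2 = ["A", "A", "B", "C", "C", "A", "B", "C", "B", "A"]

n = len(coder1)
observed = sum(c1 == c2 for c1, c2 in zip(coder1, coder2)) / n

# Expected agreement by chance, from each coder's marginal distribution.
m1, m2 = Counter(coder1), Counter(coder2)
expected = sum(m1[k] * m2[k] for k in set(m1) | set(m2)) / n ** 2

kappa = (observed - expected) / (1 - expected)
print(f"observed={observed:.2f}, expected={expected:.2f}, kappa={kappa:.2f}")
```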
Documentation
- Pretest procedures documentation
- Pretest training documentation
- Pretest findings, change recommendations, and changes made
- Staff error rates
Data Collection
Inputs
- Target outcome rates (e.g., response, refusal, and completion rates)
- Target hours per interview
- Recontact procedures for verification
- Percentage of interviewer cases to be verified
- Verification questions
- Verification of case disposition codes and selected responses
- Interviewer performance checklist
- Criteria for interviewer dismissal or supplementary training
Activities
- Establish a sample management system
- Review paper coversheets and/or questionnaires
- Dismiss or retrain interviewers with substandard performance
- Collect paradata needed for statistical adjustment
Quality Monitoring
- Overall, by key respondent groups, and by interviewer, report on the following (see the sketch after this list):
- Screening rates
- Eligibility rates
- Response rates
- Refusal rates
- Noncontact rates
- Completion rates
- Hours per interview
- Number of completed interviews
- Report on interviewer performance outcomes
- Develop a responsive design based on cost/error tradeoffs
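A minimal sketch of computing outcome rates from final disposition codes, assuming a hypothetical case file with one disposition per released case; this simplified response rate treats all non-ineligible cases as eligible, whereas production reports would follow a standard set of outcome-rate definitions (e.g., AAPOR's).

```python
import pandas as pd

# Hypothetical final dispositions, one row per released case.
cases = pd.DataFrame({
    "interviewer_id": ["A", "A", "A", "B", "B", "B", "B"],
    "disposition": ["complete", "refusal", "noncontact",
                    "complete", "complete", "refusal", "ineligible"],
})

def rates(group: pd.Series) -> pd.Series:
    eligible = group[group != "ineligible"]
    return pd.Series({
        "response_rate": (eligible == "complete").mean(),
        "refusal_rate": (eligible == "refusal").mean(),
        "noncontact_rate": (eligible == "noncontact").mean(),
        "n_completed": int((eligible == "complete").sum()),
    })

print(rates(cases["disposition"]))                                  # overall
print(cases.groupby("interviewer_id")["disposition"].apply(rates))  # by interviewer
```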
Documentation
- Documentation of mode(s) of data collection and the protocol for determining mode(s) to use
- Documentation of the sample management system
- Study materials
- Screening/respondent selection procedures
- Number of completed interviews, overall and by mode
- Documentation of proxy interview protocol
- Documentation of respondent incentives and interviewer incentive protocol
- Documentation of techniques to maximize response (e.g., prenotification, recontact, and refusal conversion protocol)
- Outcome rates, overall and by key respondent groups
- Dates of data collection
- Interviewer monitoring procedures and outcomes
- Verification form(s) and outcomes
- Description and outcomes of any validation study (e.g., administrative record check against survey data)
Data Harmonization
Inputs
- Standard codebook specifications
- Standard procedures for collecting and producing national data files
- Comprehensive plan for harmonization of cross-cultural data files
- Procedures to judge the quality of the harmonized outputs
- Procedures for testing harmonized files with knowledgeable users
- Procedures to modify and update harmonized datasets after public release, if applicable
Activities
- Create cross-cultural monitoring team
- Periodically review analytic results to allow for changes in harmonization rules (see the recoding sketch after this list)
- Review end-user test results
- Make recommendations for harmonization process improvement
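A minimal sketch of a harmonization rule applied to country-specific data, assuming hypothetical national education codes mapped onto a common target scheme; the mapping tables would come from the harmonization plan, and unmapped codes are surfaced for a decision rather than silently dropped.

```python
import pandas as pd

# Hypothetical country-specific education codes and the agreed harmonized categories.
mapping = {
    "country_a": {1: "low", 2: "low", 3: "medium", 4: "high"},
    "country_b": {10: "low", 20: "medium", 30: "medium", 40: "high"},
}

data = pd.DataFrame({
    "country": ["country_a", "country_a", "country_b", "country_b"],
    "educ_national": [2, 4, 30, 99],  # 99 is an unmapped national code
})

data["educ_harmonized"] = [
    mapping[c].get(v) for c, v in zip(data["country"], data["educ_national"])
]

unmapped = data[data["educ_harmonized"].isna()]
if not unmapped.empty:
    print("codes needing a harmonization decision:\n", unmapped)
```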
Quality Monitoring
- Report on analytic results
- Report on user tests
Documentation
- Documentation of specification and procedures standards
- Documentation of conversion and harmonization decisions
- Results of user tests
Data Processing and Statistical Adjustment
Inputs
- Percent of manually entered questionnaires to be verified
- Criteria for data entry staff dismissal or supplementary training
- Items to be coded
- Coding protocol (manual or automatic)
- Percent of manually coded cases to be check coded
- Minimum acceptable inter-coder reliability
- Data editing protocol
- Appropriate statistical software
- Appropriate statistical adjustments (e.g., imputation, weights)
- Appropriate standard error estimation
- Quality control procedures for calculation of statistical adjustments and variance estimation
Activities
- Train data entry and data coding staff
- Verify data accuracy
- Develop coding scheme(s)
- Assess inter-coder reliability
- Use data entry tools to perform keying quality checks
- Check code a sample of manually coded cases
- Edit data
- Continually monitor coding activities
- Monitor editing using some key process statistics
- Remove any identifying information from the production data
- When possible, use paradata for post-survey adjustments
- Assign a second sampling statistician to check the post-survey adjustment methodology and the statistical software syntax of the survey’s primary sampling statistician
Quality Monitoring
- Report on data entry accuracy rate
- Test inter-coder reliability
- Key process statistics for editing (see the sketch after this list):
- Edit failure rate
- Recontact rate
- Correction rate
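A minimal sketch of the editing statistics above, assuming hypothetical edit rules (a range check and a consistency check) applied to a small data file; the edit failure rate is the share of records failing any rule, and the correction rate is the share of failures resolved after review or recontact.

```python
import pandas as pd

# Hypothetical survey records.
df = pd.DataFrame({
    "age":          [34, 17, 210, 45],
    "employed":     [True, True, True, False],
    "hours_worked": [40, 20, 35, 10],  # inconsistent for the last record (not employed)
})

# Edit rules: implausible age; hours reported while not employed.
fails_range = ~df["age"].between(15, 105)
fails_consistency = (~df["employed"]) & (df["hours_worked"] > 0)
fails_any = fails_range | fails_consistency

edit_failure_rate = fails_any.mean()

# After recontact or clerical review, suppose one of the two failures was corrected.
corrected = pd.Series([False, False, True, False])
correction_rate = (corrected & fails_any).sum() / fails_any.sum()

print(f"edit failure rate: {edit_failure_rate:.0%}, correction rate: {correction_rate:.0%}")
```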
Documentation
Data processing:
- Data coding and data entry training documentation
- Evaluation protocol for data coding and data entry staff and outcomes
- Items that were coded or re-coded
- Coding reliability
- Data entry verification protocol and outcomes
- Data editing protocol
Statistical adjustment:
- Rationale for assigning sample identification numbers
- Calculation of outcome rates (e.g., response, refusal, noncontact), weighted and unweighted
- Standard error estimates
- Percent item-missing data
Where applicable:
- Imputation method(s)
- Generation of weight(s) (see the weighting sketch after this list)
- Trimming of weight(s)
- Scaling of weight(s)
- Adjustment(s) for differential nonresponse
- Poststratification adjustment
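A minimal sketch of the weighting steps, assuming hypothetical selection probabilities, a simple cell-based nonresponse adjustment, and poststratification of the final weights to illustrative population counts by sex; field names and values are assumptions, and the actual adjustments follow the study's weighting plan.

```python
import pandas as pd

# Hypothetical respondent file with selection probabilities and adjustment cells.
resp = pd.DataFrame({
    "prob_selection": [0.001, 0.001, 0.002, 0.002, 0.002],
    "nr_cell": ["urban", "urban", "rural", "rural", "rural"],
    "sex":     ["f", "m", "f", "m", "m"],
})
# Response rates per nonresponse-adjustment cell (assumed known from the sample file).
cell_response_rate = {"urban": 0.8, "rural": 0.5}
# Illustrative population counts for poststratification.
population = {"f": 52000, "m": 48000}

resp["w_base"] = 1 / resp["prob_selection"]                       # inverse selection probability
resp["w_nr"] = resp["w_base"] / resp["nr_cell"].map(cell_response_rate)

post_factor = {s: population[s] / resp.loc[resp["sex"] == s, "w_nr"].sum()
               for s in population}
resp["w_final"] = resp["w_nr"] * resp["sex"].map(post_factor)

# Weighted totals by sex now match the population counts.
print(resp.groupby("sex")["w_final"].sum())
```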
Data Dissemination
Inputs
- A quality compliance protocol
- Procedures for testing accessibility of archives with knowledgeable users
- Procedures for digitized preservation of files
- Procedures for assessing disclosure risk to respondents
- Procedures for distributing restricted-use files, if applicable
- Procedures for testing files with major statistical packages
Activities
- Create electronic versions of all files
- Provide data files in formats usable by all major statistical software packages (see the export sketch after this list)
- Designate resources to provide user support and training for secondary researchers
- Review results of user tests
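A minimal sketch of producing files for common statistical packages with pandas; `to_csv` and `to_stata` are standard pandas methods, while SPSS export typically relies on an additional package such as pyreadstat, so it is only noted in a comment. The dataset and file names are illustrative.

```python
import pandas as pd

# Hypothetical public-use extract.
public_use = pd.DataFrame({
    "case_id": [1, 2, 3],
    "weight":  [1.2, 0.9, 1.0],
    "q1":      [3, 5, 2],
})

public_use.to_csv("survey_public_use.csv", index=False)           # plain text, readable everywhere
public_use.to_stata("survey_public_use.dta", write_index=False)   # Stata format
# For SPSS (.sav), a separate package such as pyreadstat is typically used.
```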
Quality Monitoring
- Data archive test reports
Documentation
- Description and classification of target users and their needs
- Results of user satisfaction assessments
- Summary of conditions of access to data, accompanying documentation, and user feedback
- Distribution reports (dataset requests, Web hits, downloads, etc.)
Paradata
Inputs
- Study goals and objectives
- Survey mode and available paradata
- Protocols to collect different types of paradata
- Instructions on how to construct paradata-derived indicators
- Procedures to conduct responsive design using paradata
- Plans on how to use paradata for analyzing different types of errors
Activities
- For computer-generated paradata, develop procedures to make sure all programming works as designed
- For interviewer-generated paradata, develop clear protocols for interviewers about how to record paradata and for coders on how to code interviewer-generated paradata in the dataset
- Develop quality examination procedures for different types of paradata
- Monitor the process of using paradata for analysis
- Monitor the process of using paradata in responsive designs
Quality Monitoring
- Paradata collection report
- Documentation on the construction of paradata-derived indicators (see the sketch after this list)
- Documentation on any coding procedure for interviewer-generated paradata
- Clearly document how paradata are used to monitor and intervene in the data collection process
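A minimal sketch of constructing paradata-derived indicators from call records, assuming a hypothetical file with one row per contact attempt; it derives the number of attempts, an any-contact flag, and the share of evening attempts per case. Field names are illustrative.

```python
import pandas as pd

# Hypothetical call-record paradata: one row per contact attempt.
calls = pd.DataFrame({
    "case_id": [1, 1, 1, 2, 2],
    "attempt_time": pd.to_datetime([
        "2024-03-01 10:15", "2024-03-02 18:40", "2024-03-03 19:05",
        "2024-03-01 09:30", "2024-03-01 11:00",
    ]),
    "outcome": ["noncontact", "noncontact", "interview", "noncontact", "refusal"],
})

calls["evening"] = calls["attempt_time"].dt.hour >= 18

indicators = calls.groupby("case_id").agg(
    n_attempts=("outcome", "size"),
    any_contact=("outcome", lambda s: s.isin(["interview", "refusal"]).any()),
    share_evening=("evening", "mean"),
)
print(indicators)  # case-level indicators that can be linked to the main survey data
```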
Documentation
- Documentation on how paradata can be linked to main survey data
- Documentation on the use of paradata in a responsive design
- Documentation on the use of paradata in studying different types of errors