Methodology of Longitudinal Surveys II


Innovations in survey management: incorporating experimental designs in the 2014 Survey of Income and Program Participation

Type: Contributed Paper
Jul 26, 13:45
  • Holly Fee - U.S. Census Bureau
  • Stephanie Coffey - U.S. Census Bureau
  • Jason Fields - U.S. Census Bureau
  • Shelley Irving - U.S. Census Bureau
  • Matthew Marlay - U.S. Census Bureau
  • Benjamin Reist - U.S. Census Bureau
  • Mahdi Sundukchi - U.S. Census Bureau
  • Kevin Tolliver - U.S. Census Bureau
  • Gina Walejko - U.S. Census Bureau
  • Ashley Westra - U.S. Census Bureau
  • Allison Zotti - U.S. Census Bureau

The 2014 Survey of Income and Program Participation (SIPP) Panel debuted extensive changes in the design of the survey.  These changes, coupled with increasingly restrictive budgets and more difficult interviewing conditions, make this an opportune time to focus on innovations in survey management that leverage developments in paradata and adaptive-design methodology.

We conducted two overlapping experiments during the 2014 SIPP panel.  First, we used adaptive-design models to test how field representatives' (FRs') workloads were distributed at the beginning of the four-month interviewing period.  In the 2014 SIPP, FRs receive all of their cases at once, rather than receiving a quarter of their cases each month and interviewing them in a staggered fashion.  Managing work over a single four-month period allowed adaptive and other data-driven procedures to play a more critical role.

Using Contact History Instrument (CHI) information from Waves 1 and 2 (2014 and 2015), we worked during Wave 3 (2016) data collection to develop a model for prioritizing work.  In addition to CHI input, we used respondent characteristics, current-wave interview appointments, and case geography to refine models for prioritizing FRs' workloads.  The models focused on balancing priorities to redirect effort to cases that could improve sample balance, relative to Wave 1, on key characteristics.  Based on the adaptive workload models developed during the Wave 3 data collection experiments, we implemented refined adaptive workload experiments for Wave 4 (2017) data collection.
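The core idea of redirecting effort toward cases that improve sample balance can be sketched as follows.  This is a minimal illustration, not the Census Bureau's actual model: the subgroups, shares, and scoring rule are hypothetical, and the real models also drew on CHI paradata, appointments, and geography.

```python
# Illustrative sketch (hypothetical data, not the SIPP production model):
# prioritize open cases whose completion would move the responding sample's
# composition back toward the Wave 1 benchmark on a key characteristic.

def priority_scores(open_cases, respondents, wave1_shares):
    """Score each open case by how underrepresented its subgroup currently is.

    open_cases   -- list of (case_id, subgroup) for not-yet-interviewed cases
    respondents  -- list of subgroups for cases already interviewed this wave
    wave1_shares -- dict: subgroup -> share of that subgroup in Wave 1
    """
    n = len(respondents)
    if n:
        current = {g: respondents.count(g) / n for g in wave1_shares}
    else:
        current = {g: 0.0 for g in wave1_shares}
    # Positive deficit => subgroup is underrepresented relative to Wave 1,
    # so its open cases get higher priority.
    deficit = {g: wave1_shares[g] - current[g] for g in wave1_shares}
    return {case_id: deficit[g] for case_id, g in open_cases}

# Hypothetical example: low-income households are lagging this wave.
wave1 = {"low_income": 0.40, "mid_income": 0.35, "high_income": 0.25}
done = ["mid_income"] * 5 + ["high_income"] * 4 + ["low_income"] * 1
open_cases = [("A", "low_income"), ("B", "mid_income"), ("C", "high_income")]
scores = priority_scores(open_cases, done, wave1)
# Case "A" scores highest: low_income is 10% of respondents so far
# versus a 40% Wave 1 benchmark.
```

In production, a score like this would be one input among several (alongside response propensity and cost), but it captures the sample-balancing objective described above.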

Second, an incentive experiment was included in the 2014 SIPP Panel.  Incentives are a widely used tool for reducing nonresponse bias, improving respondent cooperation, boosting response rates, and improving data quality.  However, they are increasingly under scrutiny as budgets constrict and response rates continue to decline.  In the 2014 SIPP panel experiment, we evaluated conditional post-paid incentives and the development of model-based incentives.  Based on Wave 1 characteristics and response in Waves 1 and 2, we used a logistic regression model to predict a household's likelihood of responding in Wave 2 with and without incentives.  The experimental design in Wave 2 facilitated evaluation of different model specifications, leading to model-based incentive assignment tests that focused on households where incentives were a critical factor in retention for Waves 3 and 4.  The implementation, and eventual integration, of adaptive design and model-based incentives in SIPP allows limited resources to be focused, through data-driven decisions, where they are most impactful.
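The model-based incentive-targeting logic can be sketched in a few lines.  This is an illustrative example, not the fitted SIPP model: the coefficients, the single prior-wave predictor, and the gain threshold are all hypothetical, standing in for the richer set of Wave 1 characteristics the abstract describes.

```python
# Illustrative sketch (hypothetical coefficients, not the fitted SIPP model):
# score households with a logistic model that includes an incentive indicator,
# then offer incentives only where they materially raise the predicted
# probability of responding -- the "critical factor in retention" idea.
import math

# Hypothetical coefficients: intercept, prior-wave response, incentive term.
COEF = {"intercept": -0.5, "responded_prior": 2.5, "incentive": 0.8}

def response_prob(responded_prior, incentive):
    """Predicted response probability from the logistic model."""
    z = (COEF["intercept"]
         + COEF["responded_prior"] * responded_prior
         + COEF["incentive"] * incentive)
    return 1.0 / (1.0 + math.exp(-z))

def assign_incentive(responded_prior, min_gain=0.10):
    """Offer an incentive only if it raises predicted response by >= min_gain."""
    gain = response_prob(responded_prior, 1) - response_prob(responded_prior, 0)
    return gain >= min_gain

# Under these hypothetical coefficients, a prior-wave respondent is already
# near certain to respond, so the incentive's predicted gain is small and
# the incentive is targeted at prior-wave nonrespondents instead.
```

The threshold `min_gain` is where the budget constraint enters: raising it concentrates a fixed incentive budget on the households where incentives matter most for retention.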
