Prevalence and magnitude of question order effects in panel surveys
Jul 25, 09:55
Question order effects refer to the phenomenon that the order in which questions (or response options) are presented may systematically influence respondents' answers. Past empirical research documents considerable order effects, casting doubt on the general validity of survey reports. Question order effects are also a concern in panel studies when the wording of survey items remains the same but their position within the questionnaire changes.
Yet experimental studies conducted in psychological laboratories are usually designed and tuned to elicit and maximize such effects. This practice can be problematic and is seen as one reason for the rather low replicability of psychological studies (Open Science Collaboration, 2015). Consequently, there is increasing skepticism about whether comparable effects, in prevalence and magnitude, emerge when data are collected in real-world settings.
For our analyses, we draw on three large panel surveys: the German Socio-Economic Panel Study (SOEP; 28,000 respondents in 16,000 households; waves 2005-2015) and its related studies, the SOEP Innovation Sample (SOEP-IS; 5,500 respondents in nearly 3,500 households; waves 2011-2015) and the Programme for the International Assessment of Adult Competencies (PIAAC-L; 6,200 respondents surveyed in 2014). In these surveys, the order of questions in the questionnaire often changes in an essentially random fashion over time, for example through the implementation of new survey items, rotating questionnaire modules, or changes in filter questions. Our analysis focuses on survey questions on attitudes, beliefs, and opinions, which have been shown to be particularly plagued by order effects.
A first analysis of the psychological concepts surveyed in these studies suggests that distributions, means, and standard deviations of the responses to the various questions are highly comparable and almost identical across the different panels and across different survey years. Question order effects were non-existent or trivial; even variations that could be expected to yield strong effects produced only small ones.
Based on these findings, we tentatively conclude that data collection in household surveys is robust with regard to question order effects and that the existing literature reporting strong effects in (social-psychological) experiments should be reexamined. In the next step of the analysis, we will expand the set of items to political attitudes, expectations, and reports on sensitive issues. We also plan to study the extent to which question order effects in panel surveys depend on characteristics of interviewers and respondents.