Citation Suggestion
Please use the following Persistent Identifier (PID) to cite this document:
https://nbn-resolving.org/urn:nbn:de:0168-ssoar-449736
Asking probing questions in web surveys: which factors have an impact on the quality of responses?
[journal article]
Abstract
Cognitive interviewing is a well-established method for evaluating and improving a questionnaire prior to fielding. However, its present implementation brings with it some challenges, notably in terms of small sample sizes or the possibility of interviewer effects. In this study, the authors test web surveys through nonprobability online panels as a supplemental means to implement cognitive interviewing techniques. The overall goal is to tackle the above-mentioned challenges. The focus in this article is on methodological features that pave the way for an eventual successful implementation of category-selection probing in web surveys. The study reports on the results of 1,023 respondents from Germany. In order to identify implementation features that lead to a high number of meaningful answers, the authors explore the effects of (1) different panels, (2) different probing variants, and (3) different numbers of preceding probes on answer quality. The overall results suggest that category-selection probing can indeed be implemented in web surveys. Using data from two panels - a community panel where members can actively get involved, for example, by creating their own polls, and a "conventional" panel where answering surveys is the members' only activity - the authors find that high community involvement does not increase the likelihood to answer probes or produce longer statements. Testing three probing variants that differ in wording and provided context, the authors find that presenting the context of the probe (i.e., the probed item and the respondent's answer) produces a higher number of meaningful answers. Finally, the likelihood to answer a probe decreases with the number of preceding probes. However, the word count of those who eventually answer the probes slightly increases with an increasing number of probes. (author's abstract)
Keywords
measurement instrument; scaling; online survey; panel; questionnaire; development; data quality; response behavior; reactivity effect; survey research
Classification
Methods and Techniques of Data Collection and Data Analysis, Statistical Methods, Computer Methods
Method
empirical; development of methods; quantitative empirical; basic research
Document language
English
Publication Year
2012
Page/Pages
p. 487-498
Journal
Social Science Computer Review, 30 (2012) 4
DOI
https://doi.org/10.1177/0894439311435305
ISSN
0894-4393
Status
Published Version; peer reviewed
Licence
Deposit Licence - No Redistribution, No Modifications
With the permission of the rights owner, this publication is available in open access under a national or Alliance licence funded by the DFG (German Research Foundation).