GOR 16 General Online Research Conference, 02-04 March 2016, HTW Dresden University of Applied Sciences, Dresden, Germany. March 4th, 2016.
Daniele Toninelli, University of Bergamo (Italy) - daniele.toninelli@unibg.it
Melanie Revilla, RECSM - Universitat Pompeu Fabra (Spain) - melanie.revilla@upf.edu
Full working paper available upon request. Suggested citation for the slides: Toninelli, Daniele, Revilla, Melanie. 2016. Does the Use of Smartphones to Participate in Web Surveys Affect the Survey Experience when Sensitive Questions are Proposed?. General Online Research (GOR) Conference, Dresden.
This work is licensed under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/)
Does the Use of Smartphones to Participate in Web Surveys Affect the Survey Experience when Sensitive Questions are Proposed?
Outline:
1. Introduction
2. Literature review
3. Goal and hypotheses
4. Methodology
5. Data collection
6. Results
7. Conclusions
8. Limits and further ideas
9. References
1. Introduction: Sensitive questions
Definition: questions that "trigger social desirability concerns [and] are seen as intrusive by the respondents; [...] respondents have concerns about disclosing such kind of information to third parties" (Tourangeau & Yan, 2007, p. 859; Tourangeau et al., 2009)
o Social desirability bias: tendency of respondents to present themselves in positive ways
o E.g.: alcohol consumption, drug use, deviant behaviors, ...
1. Introduction: Web surveys
Sensitive questions in web surveys
Survey design or setting can cause the misreporting of sensitive information (Tourangeau, Groves and Redline, 2010)
Web surveys are self-administered (Kreuter et al., 2008), but what happens if mobile devices are used?
o Different device characteristics
o Potentially different survey contexts
o Device effect on survey responses (Peytchev & Hill, 2010)
2. Literature review: PCs / mobile devices
Web survey participation
Worldwide mobile internet usage: from 8.5% (Jan. 2012) to 41% (Jan. 2016) (source: StatCounter Global Stats, 2016)
Unintended mobile participation (Peterson, 2012)
Effects of mobile participation on survey responses:
o Different population (Antoun, 2015; Revilla et al., 2015)
o Coverage error (Mohorko et al., 2013; Fuchs & Busse, 2009)
o Survey experience (de Bruijne & Wijnant, 2013)
o Quality of data (Mavletova, 2013; Wells et al., 2013)
2. Literature review: PCs / mobile devices
Mobile participation: higher social desirability bias?
Potential effects (Mavletova & Couper, 2013):
o Survey experience:
  Perceived privacy and bystander effects vary with the content of the questions;
  No differential satisficing;
  No significant differences in the survey context.
o Reporting of sensitive information:
  Significant differences for 2 out of 5 indices
3. Goal and hypotheses: Goal
Replication of Mavletova & Couper (2013)
Different country: Spain
o Robustness of previous conclusions (testing similar hypotheses); more complete view
More recent data: 2015
o Updated view of a very quickly changing phenomenon
A few differences:
o Focus on measurement error
o Focus on smartphone users (Revilla et al., 2015)
o Additionally: questionnaire optimization effect (McClain et al., 2012)
3. Goal and hypotheses: Hypotheses (a)
Mobile web survey context
o H1: smartphone participation takes place outside the home more often than PC participation
o H2: bystanders are more likely to be present during smartphone participation
Survey experience
o H3: smartphone respondents feel less comfortable, due to a perceived lack of privacy / lower trust in confidentiality
3. Goal and hypotheses: Hypotheses (b)
Reporting sensitive information
o H4: smartphone respondents are more likely to underreport sensitive information
o H5: questionnaire optimization does not significantly affect the reporting of sensitive information
4. Methodology: the experiment
Settings similar to Mavletova & Couper's (2013):
o Questionnaire, topics, questions, response scales, ...
Two-wave cross-over experiment:
o Same questionnaire proposed twice to the same respondents
o Wave 1: random assignment to one of three survey conditions:
  PC (participation through PC): 600 respondents
  SNO (smartphone, non-optimized questionnaire): 600 respondents
  SO (smartphone, optimized questionnaire): 600 respondents
o Wave 2: new random assignment of the survey condition
o 6 experimental groups / 3 control groups
5. Data collection
Opt-in online panel Netquest (www.netquest.com)
o Country: Spain
Two waves:
Wave 1: from 23rd of February to 2nd of March, 2015
o 1,800 respondents completed the survey (200 units for each experimental group)
Wave 2: from 9th to 18th of March, 2015
o 1,608 respondents (89.3%) completed the survey
6. Results: Outline
Comparison of PCs vs smartphones:
Survey context
o Place; presence of bystanders
Survey experience
o Trust in confidentiality; sensitivity of questions; feeling uneasy during the survey; multitasking
Measurement error
o 5 sensitive indices: attitude towards deviant practices; rate of deviant behavior; alcohol consumption; alcohol-related behavior; monthly household income
o Relative bias in reporting sensitive information (comparing wave 1 and wave 2)
o LMM (Linear Mixed Models) applied to the five indices
6. Results: Survey context (a)
H1 - Place
TABLE 1: % distribution by device (PC vs smartphone, S) and by wave; independence chi-square test p-values

Place of participation |  Wave 1: PC (%)   S (%)  p-value |  Wave 2: PC (%)   S (%)  p-value
  Outside home         |          26.7     22.9    .077   |          21.1     18.3    .186
  Home                 |          73.3     77.1           |          78.9     81.7
  TOTAL                |         100.0    100.0           |         100.0    100.0

Higher % of questionnaires filled in at home for smartphones
o Difference with Mavletova & Couper (2013), but consistent with other works (Revilla et al., 2016; de Bruijne & Wijnant, 2013)
o Bias possibly due to the experiment structure
No device effect for place of participation
6. Results: Survey context (b)
H2 - Bystanders
TABLE 2: % distribution by device (PC vs smartphone, S) and by wave; independence chi-square test p-values

Presence of bystanders |  Wave 1: PC (%)   S (%)  p-value |  Wave 2: PC (%)   S (%)  p-value
  No                   |          80.2     73.0    .001   |          83.2     70.6    .000
  Yes                  |          19.8     27.0           |          16.8     29.4
  TOTAL                |         100.0    100.0           |         100.0    100.0

Higher percentages of bystanders if smartphones are used
Results similar to Mavletova & Couper (2013):
o Bystanders: 16% with PCs; 29% with smartphones
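The device-by-bystanders comparison above relies on a chi-square test of independence. A minimal sketch of that test in Python, using hypothetical cell counts reconstructed from the wave 1 percentages (assuming roughly 600 PC and 1,200 smartphone completes; the actual counts are not reported in these slides):

```python
# Chi-square test of independence: device (rows) x bystander presence (cols).
# Counts are illustrative reconstructions from the reported percentages.
from scipy.stats import chi2_contingency

#          No bystanders  Bystanders
table = [[481, 119],   # PC: ~80.2% / 19.8% of 600
         [876, 324]]   # S:  ~73.0% / 27.0% of 1,200
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```

With these reconstructed counts the p-value lands in the neighborhood of the reported .001 (SciPy applies Yates' continuity correction by default for 2x2 tables).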
6. Results: Survey experience
H3 - Respondent feeling / Multitasking
TABLE 3: % distribution by device (PC vs smartphone, S) and by wave; independence chi-square test p-values

Variable / Category        |  Wave 1: PC (%)   S (%)  p-value |  Wave 2: PC (%)   S (%)  p-value
Trust in confidentiality   |
  Trust                    |          99.5     98.7    .129   |          98.6     98.0    .428
  Do not trust             |           0.5      1.3           |           1.4      2.0
Sensitivity of questions   |
  Sensitive                |          93.7     94.7    .368   |          93.0     94.0    .426
  Not sensitive            |           6.3      5.3           |           7.0      6.0
Uneasy feeling             |
  Feel uneasy              |          26.8     27.9    .625   |          31.4     27.2    .079
  Do not feel uneasy       |          73.2     72.1           |          68.6     72.8
Multitasking               |
  Other activities         |          70.9     75.2    .052   |          71.5     74.8    .156
  No other activities      |          29.1     24.8           |          28.5     25.2

Plus: no optimization effect for any of the survey experience aspects
6. Results: Measurement error
H4/H5 - Sensitive information / Optimization
TABLE 4: sensitive indices: means / standard deviations by survey condition (averages of the two waves)

Sensitive indices                               |  PC: Mean  St.D. |  SO: Mean  St.D. | SNO: Mean  St.D.
1) Positive attitude towards deviant practices  |      22.3   14.3 |      21.1   13.7 |      21.4   13.5
2) Rate of deviant behaviour                    |      20.1   13.0 |      19.8   12.9 |      20.5   13.4
3) Monthly alcohol consumption (times)          |      17.7   25.1 |      17.4   25.5 |      17.3   26.1
4) Rate of alcoholic behaviour                  |      22.4   22.9 |      22.4   23.4 |      22.6   23.6
5) Monthly household income (class)             |  Median/Mode = 1501-2500 in all three conditions

No systematic differences among the three groups (relative differences from 0.1% to 5.6%)
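The "0.1% to 5.6%" range can be approximately reproduced as relative differences between the smartphone conditions and the PC condition on the Table 4 means. A sketch, assuming the differences are taken relative to the PC means (the paper's exact computation may differ, e.g. it may compare waves rather than conditions):

```python
# Relative differences (%) of the SO and SNO condition means vs the PC
# condition, for the four numeric sensitive indices of Table 4.
means = {
    "deviant_attitude":    {"PC": 22.3, "SO": 21.1, "SNO": 21.4},
    "deviant_behaviour":   {"PC": 20.1, "SO": 19.8, "SNO": 20.5},
    "alcohol_consumption": {"PC": 17.7, "SO": 17.4, "SNO": 17.3},
    "alcohol_behaviour":   {"PC": 22.4, "SO": 22.4, "SNO": 22.6},
}
for name, m in means.items():
    for cond in ("SO", "SNO"):
        rel = (m[cond] - m["PC"]) / m["PC"] * 100
        print(f"{name:20s} {cond}: {rel:+.1f}%")
```

All relative differences stay below roughly 6% in absolute value, consistent with the "no systematic differences" conclusion.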
6. Results: Measurement error
Linear Mixed Models (LMM) (West et al., 2007)
Linear relationship between factors/covariates and a dependent variable (special case of the general linear model)
Taking into account:
o a random effect linked to each respondent
o within-subject correlation and non-constant variability
Two-level structure: Wave (Level 1) nested within Respondent (Level 2)
o Fixed effects: survey setting (PC, SNO, SO), gender, wave
o Covariate: age

Y_ti = β_00 + β_M Sett_ti + β_G Gend_i + β_W Wave_ti + β_A Age_i + u_0i + ε_ti
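A random-intercept model of this form can be fitted with statsmodels' MixedLM. The sketch below is a minimal illustration on simulated data: all variable names, sample sizes, and data-generating values are assumptions for the example, not the authors' data.

```python
# Sketch: respondent-level random intercept, with survey setting, gender,
# wave, and age as fixed effects, as in the LMM specification above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_resp, n_waves = 300, 2

# Two observations (waves) per respondent, as in the cross-over design
df = pd.DataFrame({
    "resp_id": np.repeat(np.arange(n_resp), n_waves),
    "wave":    np.tile([1, 2], n_resp),
    "setting": rng.choice(["PC", "SNO", "SO"], size=n_resp * n_waves),
    "gender":  np.repeat(rng.choice(["M", "F"], size=n_resp), n_waves),
    "age":     np.repeat(rng.integers(18, 70, size=n_resp), n_waves),
})

# Simulated sensitive index: respondent random intercept u_0i plus an
# age effect and residual noise (illustrative values only)
u = np.repeat(rng.normal(0.0, 5.0, size=n_resp), n_waves)
df["score"] = 25.0 - 0.2 * df["age"] + u + rng.normal(0.0, 3.0, size=len(df))

# Random intercept per respondent; setting, gender, wave, age fixed
model = smf.mixedlm("score ~ C(setting) + C(gender) + wave + age",
                    df, groups=df["resp_id"])
result = model.fit()
print(result.summary())
```

Here `C(setting)` produces contrasts against a reference condition, analogous to the "Condition: SO" / "Condition: PC" rows reported in Table 5.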
6. Results: Measurement error (LMM)
H4/H5 - Sensitive information / Optimization
TABLE 5: linear mixed model coefficients (Est.) and standard errors (Std.E.) by sensitive index

Parameter      | Attitude dev. pract. | Deviant behaviour | Alcohol consumption | Alcohol behaviour | Income
               | Est.        Std.E.   | Est.      Std.E.  | Est.      Std.E.    | Est.      Std.E.  | Est.       Std.E.
Intercept      | 32.89***     .967    | 22.45***  1.043   | -7.29***  1.958     | 36.50***  1.843   | 1349***     90.64
Condition: SO  |  -.17        .342    |  -.38      .286   |  -.07      .683     |   .15      .476   |  -52.99*    24.60
Condition: PC  |   .42        .341    |  -.32      .286   |   .36      .681     |   .17      .476   |   12.66     24.48
Gender: M      |  1.96***     .545    |  4.35***   .593   |  7.33***  1.103     |  5.80***  1.051   |  106.04*    51.70
Wave: First    |   .11        .211    |   .13      .172   |   .20      .420     |  1.14***   .284   |    9.76     14.65
Age            |  -.21***     .025    |  -.12***   .027   |   .58***   .050     |  -.48***   .048   |   18.07***   2.35

Note: significance levels: * = p < .05; ** = p < .01; *** = p ≤ .001
7. Conclusions: Main findings
Survey context
o Place of participation: home is preferred
o Bystanders: significantly more often present with smartphones
Survey experience
o No device effect (trust in the survey's confidentiality; questions' sensitivity; feeling uncomfortable; multitasking)
Reporting sensitive information
o No significant effect of the device used
o Other variables matter more (age, gender)
o Almost no effect of questionnaire optimization (significant for only 1 out of 5 indices)
7. Conclusions: Comparison with Mavletova & Couper (2013)
Most results are confirmed (robust to contexts):
o no device effect on perceived privacy;
o comparability of data not affected (sensitive information).
Some results are not confirmed:
o no link between device and place of participation;
o no higher trust in survey confidentiality with PCs.
Main possible reasons:
o quick spread/changes and recent evolution of the phenomenon;
o differences in culture/tendencies in using mobile devices.
8. Limits and further ideas
Limits of this research
o Panellists of an opt-in panel
o Some conclusions could be limited to Spain
o Findings limited to smartphone users only
o Results could be linked to the topics surveyed
Suggestions for further research
o Use probability samples (interest in the general population)
o Study more countries
o Include other devices (tablets)
o Test a wider range of topics
9. References (a)
Antoun, C. (2015). Who Are the Internet Users, Mobile Internet Users, and Mobile-Mostly Internet Users?: Demographic Differences across Internet-Use Subgroups in the U.S. In: Toninelli, D., Pinter, R., de Pedraza, P. (eds.) Mobile Research Methods: Opportunities and Challenges of Mobile Research Methodologies, pp. 99-117. London: Ubiquity Press. DOI: http://dx.doi.org/10.5334/bar.
de Bruijne, M., Wijnant, A. (2013). Comparing survey results obtained via mobile devices and computers: An experiment with a mobile web survey on a heterogeneous group of mobile devices versus a computer assisted web survey. Social Science Computer Review, 31(4), 482-504.
Fuchs, M., Busse, B. (2009). The coverage bias of mobile web surveys across European countries. International Journal of Internet Science, 4, 21-33.
Kreuter, F., Presser, S., Tourangeau, R. (2008). Social Desirability Bias in CATI, IVR, and Web Surveys: The Effects of Mode and Question Sensitivity. Public Opinion Quarterly, 72(5), 847-865.
Mavletova, A. (2013). Data quality in PC and mobile web surveys. Social Science Computer Review, 31(4), 725-743.
Mavletova, A., Couper, M.P. (2013). Sensitive Topics in PC Web and Mobile Web Surveys: Is There a Difference?. Survey Research Methods, 7(3), 191-205.
9. References (b)
McClain, C., Crawford, S. D., Dugan, J. P. (2012). Use of Mobile Devices to Access Computer-Optimized Web Instruments: Implications for Respondent Behavior and Data Quality. Paper presented at the AAPOR Annual Conference, May 17-20, 2012, Orlando, USA.
Mohorko, A., de Leeuw, E., Hox, J. (2013). Internet Coverage and Coverage Bias in Europe: Developments Across Countries and Over Time. Journal of Official Statistics, 29(4), 609-622. DOI: http://dx.doi.org/10.2478/jos-2013-0042.
Peterson, G. (2012). Unintended mobile respondents. Paper presented at the CASRO Technology Conference, 31 May, New York, NY. Available at: http://c.ymcdn.com/sites/www.casro.org/resource/collection/d0686718-163a-4af4-A0BB-8F599F573714/Gregg_Peterson_-_Market_Strategies.pdf
Peytchev, A., Hill, C.A. (2010). Experiments in mobile web survey design: Similarities to other modes and unique considerations. Social Science Computer Review, 28(3), 319-335.
Revilla, M., Toninelli, D., Ochoa, C., Loewe, G. (2015). Who Has Access to Mobile Devices in an Opt-in Commercial Panel? An Analysis of Potential Respondents for Mobile Surveys. In: Toninelli, D., Pinter, R., de Pedraza, P. (eds.) Mobile Research Methods: Opportunities and Challenges of Mobile Research Methodologies, pp. 119-139. London: Ubiquity Press. DOI: http://dx.doi.org/10.5334/bar.
9. References (c)
Revilla, M., Toninelli, D., Ochoa, C., Loewe, G. (2016, forthcoming). Do online access panels need to adapt surveys for mobile devices?. Internet Research (accepted).
StatCounter Global Stats (2016), link: http://gs.statcounter.com/#desktop+mobilecomparison-ww-monthly-201202-201602 (accessed Feb. 22nd, 2016)
Tourangeau, R., Groves, R., Kennedy, C., Yan, T. (2009). The Presentation of the Survey, Nonresponse, and Measurement Error. Journal of Official Statistics, 25, 299-321.
Tourangeau, R., Groves, R. M., Redline, C. D. (2010). Sensitive Topics and Reluctant Respondents: Demonstrating a Link between Nonresponse Bias and Measurement Error. Public Opinion Quarterly, 74(3), 413-432.
Tourangeau, R., Yan, T. (2007). Sensitive Questions in Surveys. Psychological Bulletin, 133(5), 859-883.
Wells, T., Bailey, J.T., Link, M.W. (2013). Filling the void: Gaining a better understanding of tablet-based surveys. Survey Practice, 6(1), 1-9.
West, B.T., Welch, K.B., Gałecki, A.T. (2007). Linear Mixed Models: A Practical Guide Using Statistical Software. Boca Raton: Chapman & Hall/CRC.
Does the Use of Smartphones to Participate in Web Surveys Affect the Survey Experience when Sensitive Questions are Proposed?
For further information:
Daniele Toninelli - daniele.toninelli@unibg.it
Melanie Revilla - melanie.revilla@upf.edu
Full working paper available upon request (forthcoming, 2016).
Suggested citation: Toninelli, Daniele, Revilla, Melanie. 2016. Does the Use of Smartphones to Participate in Web Surveys Affect the Survey Experience when Sensitive Questions are Proposed?. General Online Research (GOR) Conference, Dresden.