GOR 16 General Online Research Conference March 2016, HTW-Dresden University of Applied Sciences, Dresden, Germany


March 4th, 2016
Daniele Toninelli, University of Bergamo (Italy) - daniele.toninelli@unibg.it
Melanie Revilla, RECSM - Universitat Pompeu Fabra (Spain) - melanie.revilla@upf.edu
Full working paper available upon request.
Slides suggested citation: Toninelli, Daniele, Revilla, Melanie. 2016. Does the Use of Smartphones to Participate in Web Surveys Affect the Survey Experience when Sensitive Questions are Proposed?. General Online Research (GOR) Conference, Dresden.
This work is licensed under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/)

Does the Use of Smartphones to Participate in Web Surveys Affect the Survey Experience when Sensitive Questions are Proposed?

Outline:
1. Introduction
2. Literature review
3. Goal and hypotheses
4. Methodology
5. Data collection
6. Results
7. Conclusions
8. Limits and further ideas
9. References

1. Introduction: Sensitive questions
Definition: questions that "trigger social desirability concerns [and] are seen as intrusive by the respondents; [...] respondents have concerns about disclosing such kind of information to third parties" (Tourangeau & Yan, 2007, p. 859; Tourangeau et al., 2009)
o Social desirability bias: the tendency of respondents to present themselves in positive ways
o E.g.: alcohol consumption, drug use, deviant behaviors

1. Introduction: Web surveys
Sensitive questions in web surveys
Survey design or setting can cause the misreporting of sensitive information (Tourangeau, Groves and Redline, 2010)
Web surveys are self-administered (Kreuter et al., 2008), but what happens if mobile devices are used?
o Different device characteristics
o Potentially different survey contexts
o Device effect on survey responses (Peytchev & Hill, 2010)

2. Literature review: PCs / mobile devices
Web survey participation
Worldwide mobile internet usage: from 8.5% (Jan. 2012) to 41% (Jan. 2016) (source: StatCounter Global Stats, 2016)
Unintended mobile participation (Peterson, 2012)
Effects of mobile participation on survey responses:
o Different population (Antoun, 2015; Revilla et al., 2015)
o Coverage error (Mohorko et al., 2013; Fuchs & Busse, 2009)
o Survey experience (de Bruijne & Wijnant, 2013)
o Quality of data (Mavletova, 2013; Wells et al., 2013)

2. Literature review: PCs / mobile devices
Mobile participation: higher social desirability bias?
Potential effects (Mavletova & Couper, 2013):
o Survey experience
  Perceived privacy and the bystanders effect vary with the content of the questions;
  No differential satisficing;
  No significant differences for the survey context.
o Reporting of sensitive information
  Significant differences for 2 out of 5 indices

3. Goal and hypotheses: Goal
Replication of Mavletova & Couper (2013)
Different country: Spain
o Tests the robustness of previous conclusions (similar hypotheses); more complete view
More recent data: 2015
o Updated view of a very quickly changing phenomenon
A few differences:
o Focus on measurement error
o Focus on smartphone users (Revilla et al., 2015)
o Additionally: questionnaire optimization effect (McClain et al., 2012)

3. Goal and hypotheses: Hypotheses (a)
Mobile web survey context
o H1: with smartphones, participation in places other than home is more common
o H2: with smartphones, bystanders are more likely to be present during participation
Survey experience
o H3: smartphone respondents feel less comfortable, due to a perceived lack of privacy/trust in confidentiality

3. Goal and hypotheses: Hypotheses (b)
Reporting sensitive information
o H4: smartphone respondents are more likely to underreport sensitive information
o H5: questionnaire optimization does not significantly affect the reporting of sensitive information

4. Methodology: the Experiment
Settings similar to those of Mavletova & Couper (2013)
o Questionnaire, topics, questions, response scales
Two-wave cross-over experiment
o The same questionnaire proposed twice to the same respondents
o Wave 1: random assignment to one of three survey conditions:
  PC (participation through PC): 600 respondents
  SNO (smartphone, non-optimized questionnaire): 600 respondents
  SO (smartphone, optimized questionnaire): 600 respondents
o Wave 2: new random assignment of the survey condition
o 6 experimental groups / 3 control groups
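The 3x3 cross-over structure above yields nine ordered (wave-1, wave-2) condition pairs; a minimal sketch (condition labels taken from the slide) of how the six experimental and three control groups arise:

```python
from itertools import product

# The three survey conditions named on the slide.
CONDITIONS = ("PC", "SNO", "SO")

# Every ordered (wave-1 condition, wave-2 condition) pair is a distinct group.
groups = list(product(CONDITIONS, repeat=2))

# Control groups keep the same condition in both waves;
# experimental groups switch condition between waves.
control = [g for g in groups if g[0] == g[1]]
experimental = [g for g in groups if g[0] != g[1]]

print(len(groups), len(experimental), len(control))  # 9 6 3
```

The control groups (e.g. PC in both waves) make it possible to separate device effects from pure wave effects.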

5. Data collection
Opt-in online panel Netquest (www.netquest.com)
o Country: Spain
Two waves
Wave 1: 23 February - 2 March 2015
o 1,800 respondents completed the survey (200 per experimental group)
Wave 2: 9 - 18 March 2015
o 1,608 respondents (89.3%) completed the survey

6. Results: Outline
Comparison of PCs vs smartphones
Survey context
o Place; presence of bystanders
Survey experience
o Trust in confidentiality; sensitivity of questions; feeling uneasy during the survey; multitasking
Measurement error
o 5 sensitive indices: attitude towards deviant practices; rate of deviant behaviour; alcohol consumption; alcohol-related behaviour; monthly household income
o Relative bias in reporting sensitive information (comparing wave 1 and wave 2)
o LMM (Linear Mixed Models) applied to the five indices

6. Results: Survey context (a) - H1: Place
TABLE 1: % distribution by device and by wave, independence chi-square test p-values

                          Wave 1                   Wave 2
Place of participation    PC (%)  S (%)  p-value  PC (%)  S (%)  p-value
Outside home              26.7    22.9   .077     21.1    18.3   .186
Home                      73.3    77.1            78.9    81.7
TOTAL                     100.0   100.0           100.0   100.0

Higher % of questionnaires filled in at home for smartphones
o Differs from Mavletova & Couper (2013), but consistent with other works (Revilla et al., 2016; de Bruijne & Wijnant, 2013)
o Bias due to the experiment structure
No device effect for place of participation

6. Results: Survey context (b) - H2: Bystanders
TABLE 2: % distribution by device and by wave, independence chi-square test p-values

                          Wave 1                   Wave 2
Presence of bystanders    PC (%)  S (%)  p-value  PC (%)  S (%)  p-value
No                        80.2    73.0   .001     83.2    70.6   .000
Yes                       19.8    27.0            16.8    29.4
TOTAL                     100.0   100.0           100.0   100.0

Higher percentages if smartphones are used
Results similar to Mavletova & Couper (2013):
o Bystanders: 16% with PCs; 29% with smartphones
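The p-values in Table 2 come from chi-square tests of independence; a sketch reproducing the wave-1 test, where the cell counts are an assumption reconstructed from the table's percentages and the design group sizes (600 PC, 1,200 smartphone completes):

```python
from scipy.stats import chi2_contingency

# Wave-1 bystander counts reconstructed from Table 2's percentages,
# assuming 600 PC and 1,200 smartphone (SNO + SO) wave-1 completes.
#         no bystanders  bystanders
table = [[481, 119],   # PC: ~80.2% / 19.8% of 600
         [876, 324]]   # S:  ~73.0% / 27.0% of 1,200

# 2x2 chi-square test of independence (Yates continuity correction applied).
chi2, p, dof, expected = chi2_contingency(table)
print(dof, round(p, 3))  # df = 1; p on the order of .001, as in the table
```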

6. Results: Survey experience - H3: Respondent feeling / Multitasking
TABLE 3: % distribution by device and by wave, independence chi-square test p-values

                                            Wave 1                   Wave 2
Variable            Categories              PC (%)  S (%)  p-value  PC (%)  S (%)  p-value
Trust in confid.    Trust                   99.5    98.7   .129     98.6    98.0   .428
                    Do not trust            0.5     1.3            1.4     2.0
Sensitivity of q.   Sensitive               93.7    94.7   .368     93.0    94.0   .426
                    Not sensitive           6.3     5.3            7.0     6.0
Uneasy feeling      Feel uneasy             26.8    27.9   .625     31.4    27.2   .079
                    Do not feel uneasy      73.2    72.1           68.6    72.8
Multitasking        Other activities        70.9    75.2   .052     71.5    74.8   .156
                    No other activities     29.1    24.8           28.5    25.2

Plus: no optimization effect for any survey experience aspect

6. Results: Measurement error - H4/H5: Sensitive information / Optimization
TABLE 4: Sensitive indices: means/standard deviations by survey condition (averages of waves)

                                                  PC             SO             SNO
Sensitive indices                                 Mean   St.D.   Mean   St.D.   Mean   St.D.
1) Positive attitude towards deviant practices    22.3   14.3    21.1   13.7    21.4   13.5
2) Rate of deviant behaviour                      20.1   13.0    19.8   12.9    20.5   13.4
3) Monthly alcohol consumption (times)            17.7   25.1    17.4   25.5    17.3   26.1
4) Rate of alcoholic behaviour                    22.4   22.9    22.4   23.4    22.6   23.6
5) Monthly household income (class, Median/Mode)  1501-2500      1501-2500      1501-2500

No systematic differences for the three groups (relative differences from 0.1% to 5.6%)
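The 0.1%-5.6% range above compares group means in relative terms; a hypothetical sketch of such a comparison (the exact formula used in the paper is not shown on the slide, so taking the PC mean as the base is an assumption), applied to the largest gap in Table 4:

```python
def relative_difference(mean_a: float, mean_b: float) -> float:
    """Absolute difference between two group means, as a % of the first mean."""
    return abs(mean_a - mean_b) / mean_a * 100.0

# Index 1 (positive attitude towards deviant practices), PC vs SO means:
gap = relative_difference(22.3, 21.1)
print(round(gap, 1))  # ~5.4, on the order of the slide's 5.6% upper bound
```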

6. Results: Measurement error
Linear Mixed Models (LMM) (West et al., 2007)
Linear relationship between factors/covariates and a dependent variable (special case of the general linear model)
Taking into account:
o a random effect linked to each respondent
o within-subject correlation and non-constant variability
Fixed effects: Wave (Level 1) / Respondent (Level 2)
o Other fixed effects: survey settings (PC, SNO, SO), gender
o Covariate: age

Y_ti = β_00 + β_M·Sett_ti + β_G·Gend_i + β_W·Wave_t + β_A·Age_i + u_0i + ε_ti
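The respondent-level random effect is what captures the within-subject correlation across the two waves; a toy numpy simulation of a random-intercept model of this kind (all parameter values synthetic, with age as the only fixed effect for brevity) illustrates the mechanism:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000                                  # respondents
age = rng.uniform(18, 70, n)
u0 = rng.normal(0.0, 2.0, n)              # respondent random intercept, sd = 2
b00, bA = 30.0, -0.2                      # synthetic fixed effects

def simulate_wave():
    e = rng.normal(0.0, 1.0, n)           # wave-specific residual, sd = 1
    return b00 + bA * age + u0 + e

y1, y2 = simulate_wave(), simulate_wave()

# Residuals of the two waves share u0, so they correlate within subject;
# the expected intraclass correlation here is 2^2 / (2^2 + 1^2) = 0.8.
r1 = y1 - (b00 + bA * age)
r2 = y2 - (b00 + bA * age)
icc = np.corrcoef(r1, r2)[0, 1]
print(round(icc, 2))
```

Ignoring this correlation (e.g. with plain OLS on pooled waves) would understate the standard errors of the fixed effects, which is why the analysis uses LMM.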

6. Results: Measurement error (LMM) - H4/H5: Sensitive information / Optimization
TABLE 5: Linear mixed model coefficients by sensitive index

                Positive attit.     Deviant           Alcohol           Alcohol           Income
                deviant pract.      behaviour         consumption       behaviour
Parameter       Est.      Std.E.    Est.      Std.E.  Est.      Std.E.  Est.      Std.E.  Est.       Std.E.
Intercept       32.89***  .967      22.45***  1.043   -7.29***  1.958   36.5***   1.843   1349***    90.64
Condition: SO   -.17      .342      -.38      .286    -.07      .683    .15       .476    -52.99*    24.60
Condition: PC   .42       .341      -.32      .286    .36       .681    .17       .476    12.66      24.48
Gender: M       1.96***   .545      4.35***   .593    7.33***   1.103   5.80***   1.051   106.04*    51.70
Wave: First     .11       .211      .13       .172    .20       .420    1.14***   .284    9.76       14.65
Age             -.21***   .025      -.12***   .027    .58***    .050    -.48***   .048    18.07***   2.35

Note: Significance levels: * = p<.05; ** = p<.01; *** = p<.001

7. Conclusions: Main findings
Survey context
o Place of participation: home preferred
o Bystanders: significantly more frequent with smartphones
Survey experience
o No device effect (trust in the survey's confidentiality; sensitivity of questions; feeling uncomfortable; multitasking)
Reporting sensitive information
o No significant effect of the device used
o Other variables matter more (age, gender)
o No effect of the questionnaire's optimization (significant for only 1 of the 5 indices)

7. Conclusions: Comparison with Mavletova & Couper (2013) results
Most confirmed (robust to contexts):
o no device effect on perceived privacy;
o comparability of data not affected (sensitive information).
Some not confirmed:
o no link between device and place of participation;
o no higher trust in survey confidentiality with PCs.
Main possible reasons:
o quick spread/changes and recent evolution of the phenomenon;
o differences in culture/tendencies in using mobile devices.

8. Limits and further ideas
Limits of this research
o Panellists of an opt-in panel
o Some conclusions could be limited to Spain
o Findings limited to smartphone users only
o Results could be linked to the topics surveyed
Suggestions for further research
o Use probability samples (interest in the general population)
o Study more countries
o Include other devices (tablets)
o Test a wider range of topics

9. References (a)
Antoun, C. (2015). Who Are the Internet Users, Mobile Internet Users, and Mobile-Mostly Internet Users?: Demographic Differences across Internet-Use Subgroups in the U.S. In: Toninelli, D., Pinter, R. & de Pedraza, P. (eds.) Mobile Research Methods: Opportunities and Challenges of Mobile Research Methodologies, pp. 99-117. London: Ubiquity Press. DOI: http://dx.doi.org/10.5334/bar.
de Bruijne, M. and Wijnant, A. (2013). Comparing survey results obtained via mobile devices and computers: An experiment with a mobile web survey on a heterogeneous group of mobile devices versus a computer assisted web survey. Social Science Computer Review, Vol. 31 No. 4, pp. 482-504.
Fuchs, M., & Busse, B. (2009). The coverage bias of mobile web surveys across European countries. International Journal of Internet Science, 4, 21-33.
Kreuter, F., Presser, S., Tourangeau, R. (2008). Social Desirability Bias in CATI, IVR, and Web Surveys: The Effects of Mode and Question Sensitivity. Public Opinion Quarterly, 72(5):847-865.
Mavletova, A. (2013). Data quality in PC and mobile web surveys. Social Science Computer Review, Vol. 31 No. 4, pp. 725-743.
Mavletova, A. and Couper, M.P. (2013). Sensitive Topics in PC Web and Mobile Web Surveys: Is There a Difference?. Survey Research Methods, Vol. 7 No. 3, pp. 191-205.

9. References (b)
McClain, C., Crawford, S. D., Dugan, J. P. (2012). Use of Mobile Devices to Access Computer-Optimized Web Instruments: Implications for Respondent Behavior and Data Quality. Paper presented at AAPOR Annual Conference. May 17-20, 2012. Orlando, USA.
Mohorko, A., de Leeuw, E., Hox, J. (2013). Internet Coverage and Coverage Bias in Europe: Developments Across Countries and Over Time. Journal of Official Statistics, 29(4), 609-622. DOI: http://dx.doi.org/10.2478/jos-2013-0042.
Peterson, G. (2012). Unintended mobile respondents. Paper presented at CASRO Technology Conference, 31 May, New York, NY. Available at: http://c.ymcdn.com/sites/www.casro.org/resource/collection/D0686718-163A-4AF4-A0BB-8F599F573714/Gregg_Peterson_-_Market_Strategies.pdf
Peytchev, A. and Hill, C.A. (2010). Experiments in mobile web survey design: Similarities to other modes and unique considerations. Social Science Computer Review, Vol. 28 No. 3, pp. 319-335.
Revilla, M., Toninelli, D., Ochoa, C., Loewe, G. (2015). Who Has Access to Mobile Devices in an Opt-in Commercial Panel? An Analysis of Potential Respondents for Mobile Surveys. In: Toninelli, D., Pinter, R., de Pedraza, P. (eds.) Mobile Research Methods: Opportunities and Challenges of Mobile Research Methodologies, pp. 119-139. London: Ubiquity Press. DOI: http://dx.doi.org/10.5334/bar.

9. References (c)
Revilla, M., Toninelli, D., Ochoa, C., Loewe, G. (2016, forthcoming). Do online access panels need to adapt surveys for mobile devices?. Internet Research (accepted).
StatCounter Global Stats (2015), link: http://gs.statcounter.com/#desktop+mobilecomparison-ww-monthly-201202-201602 (accessed Feb. 22, 2016).
Tourangeau, R., Groves, R., Kennedy, C., Yan, T. (2009). The Presentation of the Survey, Nonresponse, and Measurement Error. Journal of Official Statistics, 25, 299-321.
Tourangeau, R., Groves, R. M., Redline, C. D. (2010). Sensitive Topics and Reluctant Respondents: Demonstrating a Link between Nonresponse Bias and Measurement Error. Public Opinion Quarterly, 74(3), 413-432.
Tourangeau, R., Yan, T. (2007). Sensitive Questions in Surveys. Psychological Bulletin, 133(5), 859-883.
Wells, T., Bailey, J.T. and Link, M.W. (2013). Filling the void: Gaining a better understanding of tablet-based surveys. Survey Practice, Vol. 6 No. 1, pp. 1-9.
West, B.T., Welch, K.B., Gałecki, A.T. (2007). Linear Mixed Models: A Practical Guide Using Statistical Software. Boca Raton: Chapman & Hall/CRC.

Does the Use of Smartphones to Participate in Web Surveys Affect the Survey Experience when Sensitive Questions are Proposed?
For further information: Daniele Toninelli - daniele.toninelli@unibg.it; Melanie Revilla - melanie.revilla@upf.edu
Full working paper available upon request (forthcoming, 2016).
Suggested citation: Toninelli, Daniele, Revilla, Melanie. 2016. Does the Use of Smartphones to Participate in Web Surveys Affect the Survey Experience when Sensitive Questions are Proposed?. General Online Research (GOR) Conference, Dresden.