Community Life Survey Summary of web experiment findings November 2013


BMRB Crown copyright 2013

Contents

1. Introduction
2. What did we want to find out from the experiment?
3. What response rate can be achieved with this method of recruitment and what difference do incentives make?
4. What is the profile of a web sample compared to a face-to-face interview sample?
5. What is the quality of the data using web designs?
6. Do the two samples give similar estimates?
7. Conclusions
8. Next steps

1. Introduction

TNS BMRB was commissioned by the Cabinet Office to carry out the Community Life Survey for 2012-13, tracking the latest trends and developments across areas that are key to encouraging social action and empowering communities. The survey tracks measures including:

- Volunteering and charitable giving;
- Views about the local area;
- Community cohesion and belonging;
- Community empowerment and participation;
- Influencing local decisions and affairs; and
- Subjective well-being.

In 2012-13, survey data was collected through face-to-face interviews with a representative sample of 6,600 individuals in England. Because of constraints in the commissioning timetable, fieldwork ran from August 2012 through to April 2013, covering three-quarters of a year. In parallel, TNS BMRB was commissioned to carry out development work to explore cost-effective methods for future survey years, should the study be recommissioned. In particular, TNS BMRB explored options for incorporating online methods of data collection, which cost significantly less than the face-to-face interview design employed for the 2012-13 survey. These options included:

- A stand-alone web-based survey, based on a probability sample of addresses and of sufficient quality to produce Official Statistics;
- A face-to-face interview survey with a reduced sample size, alongside a large web-based survey that would either formally contribute to Official Statistics or provide supplementary information.

To provide evidence for or against these two options, TNS BMRB carried out a large-scale test of a probability-sample web survey, incorporating a paper questionnaire option for those not used to the web or who preferred not to provide data through that channel. The design of this test was informed by comments from a Technical Advisory Group of senior survey methodologists. This report summarises the main findings from the testing work.

Subsequently, the Cabinet Office commissioned TNS BMRB to carry out further developmental work over the course of 2013-14. This activity is currently being carried out and the results will inform future survey years, if commissioned.

2. What did we want to find out from the experiment?

TNS BMRB designed an experiment to answer the following questions:

- What response rate can be achieved with random probability recruitment for a predominantly web-based survey?
- What difference do incentives make to the response rate, and what value is added by the inclusion of a postal questionnaire?
- What is the profile of a web sample compared to a face-to-face interview sample?
- What is the data quality like?
- Do the two samples (face-to-face and web) give similar estimates?

The experiment is one of the largest tests ever conducted of web survey methodology in which random sampling has been employed. Commercial access panels have been employing web data collection for many years, but any inferences about the general population drawn from these panels rely upon strong and unverifiable assumptions. They are therefore not considered appropriate tools for the collection of Official Statistics.

TNS BMRB drew a random sample of approximately 6,700 addresses. Each address was sent an invitation, by post, to complete an online survey, plus up to two reminders for non-responders. A random subset of non-responders received a postal questionnaire with their second reminder. At each address, TNS BMRB requested that the adult with the most recent birthday complete the questionnaire. The intention was to avoid respondent self-selection, which can introduce systematic bias to some survey estimates. TNS BMRB also tested a mix of incentives: £5 and £10 incentives conditional on completing the questionnaire, and a £5 high street shopping voucher included in the first letter as an unconditional incentive. There was also a control group which was offered no incentive.
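The last-birthday selection rule described above can be sketched as a simple function. This is an illustrative sketch, not the survey's actual scripts; the household members and dates are hypothetical:

```python
# Illustrative sketch of "last birthday" within-household selection: one adult
# per household is chosen quasi-randomly, without the inviter picking who responds.
from datetime import date

def last_birthday_adult(adults, today=date(2013, 11, 1)):
    """Return the adult whose birthday occurred most recently before `today`.

    `adults` is a list of (name, birth_month, birth_day) tuples (hypothetical
    data); only month and day matter, not year of birth.
    """
    def days_since(month, day):
        this_year = date(today.year, month, day)
        # If this year's birthday hasn't happened yet, use last year's.
        last = this_year if this_year <= today else date(today.year - 1, month, day)
        return (today - last).days

    return min(adults, key=lambda a: days_since(a[1], a[2]))

household = [("A", 3, 14), ("B", 10, 30), ("C", 6, 2)]
print(last_birthday_adult(household)[0])  # "B": 30 October is the most recent birthday
```

As the report notes, about a quarter of web respondents turned out not to be the adult this rule would have selected, which is why an all-adults design was later tested.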

3. What response rate can be achieved with this method of recruitment and what difference do incentives make?

Figure 1 shows the achieved response rates for the web-only designs (i.e. no postal questionnaire offered with the second reminder) compared to the standard face-to-face interview design. The highest web-only response rate was 25%, achieved with a £5 unconditional incentive. This is considerably below the face-to-face interview response rate of 60%, but comparable with telephone interview surveys that use random digit dialling to generate a sample.

Offering an incentive increased the response rate from 16% to a maximum of 25%. The cost of the unconditional £5 incentive (equal to a conditional incentive of around £20 per response) makes it unattractive compared to the £10 conditional incentive, despite achieving a slightly higher response rate (25% compared to 22%).

Figure 1: Incentives were found to increase the response rate, but not to the level achieved with face-to-face interviews (95% confidence intervals shown):

- FTF, £5 conditional shopping voucher: 60%
- Web only, no incentive: 16%
- Web only, £5 conditional: 19%
- Web only, £10 conditional: 22%
- Web only, £5 shopping voucher in first letter: 25%
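The "equal to a conditional incentive of around £20" claim follows from simple arithmetic: an unconditional voucher is mailed to every sampled address, while a conditional incentive is paid only to the roughly one in four who respond. A sketch, using the response rates reported above (the cost model is our simplification and ignores postage and fulfilment costs):

```python
def incentive_cost_per_response(cost, response_rate, conditional):
    """Incentive spend per achieved response.

    Conditional incentives are paid only to responders; unconditional
    incentives go to every sampled address regardless of response.
    """
    return cost if conditional else cost / response_rate

# £5 unconditional voucher, 25% response vs £10 conditional, 22% response
unconditional = incentive_cost_per_response(5, 0.25, conditional=False)
conditional = incentive_cost_per_response(10, 0.22, conditional=True)
print(unconditional, conditional)  # 20.0 10 -- the £5 unconditional voucher
# costs the equivalent of a £20-per-response conditional incentive
```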

Adding a postal questionnaire with the second reminder increased the response rate substantially. Figure 2 shows the achieved response rates for the web/postal designs compared to the standard face-to-face interview design. Even if no incentive was offered, the response rate for a web/postal design was higher than that of a web-only design with a £10 incentive (27% compared to 22%). Nevertheless, the highest response rate (39%) was still lower than the face-to-face interview response rate.

The postal questionnaire had to be an edited, shorter version of the full questionnaire to avoid appearing too onerous a task to potential respondents. A postal questionnaire which included all of the questions would have contained approximately seventy pages. As a consequence, the quoted response rate is relevant only to the questions common to both forms of the questionnaire.

Figure 2: Adding a postal questionnaire to the second reminder substantially increased the response rate, but required a shorter questionnaire, so less data was collected (95% confidence intervals shown):

- FTF, £5 conditional shopping voucher: 60%
- Web & postal, no incentive: 27%
- Web & postal, £5 conditional: 31%
- Web & postal, £10 conditional: 35%
- Web & postal, £5 shopping voucher in first letter: 39%

4. What is the profile of a web sample compared to a face-to-face interview sample?

A number of profile differences were observed between the web sample and the face-to-face interview sample. In particular, the web sample contained a higher concentration of individuals who are: everyday internet users; high earners; degree educated; native English speakers; middle-aged (45-64); living as a couple; home-owning; at least partly responsible for the care of another person. All of these differences are statistically significant at the 5% level, but there are wide margins of error around the point estimates themselves. The additional tests carried out in 2013 will provide firmer data, but preliminary analysis suggests the differences listed above are the most distinctive.

Figure 3: Despite an increase in the response rate when adding a postal questionnaire, the age profile of the achieved sample was not improved

                        16-24  25-34  35-49  50-64  65-74  75+
Population               15%    17%    26%    22%    11%   10%
F2F final                11%    15%    26%    25%    13%    9%
Web only (£10 cond.)      8%    15%    28%    31%    13%    6%
Web & postal (£10 cond.)  5%    12%    24%    28%    16%   14%

As an example, Figure 3 shows how the age profile of the sample differed by test design. The top row shows the profile of the population as a whole, which can then be compared to the profiles achieved through the different survey modes. The second row shows the face-to-face interview sample profile. Even with a response rate of 60%, there is some bias towards the older age groups. This is a typical interview survey finding. The third row shows the web-only age profile where a £10 incentive was offered. It has a slightly older age profile than the face-to-face interview design, albeit with fewer people aged 75+. The last row shows the age profile when a postal questionnaire was included with the second reminder. Adding the postal questionnaire brought in more middle-aged and older people, but not more young people: the age profile got worse despite the increase in response rate. In some other respects (e.g. housing tenure and working status), the addition of a postal questionnaire option improved the sample profile, but not by as much as might be expected given the significant increase in response rate.

Demographic biases can be eliminated by weighting the sample so that it matches the population. However, sample bias can remain even after weighting, because sample bias is more than just demographic bias.
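The kind of demographic weighting referred to here can be sketched as cell-based post-stratification, using the age-band shares from Figure 3. This is a minimal one-variable illustration; real calibration would balance several demographic variables jointly:

```python
# Minimal post-stratification sketch: a respondent's weight is the population
# share of their demographic cell divided by the sample share of that cell.
# Shares are the rounded age-band percentages from Figure 3 (web & postal design).
population = {"16-24": 0.15, "25-34": 0.17, "35-49": 0.26,
              "50-64": 0.22, "65-74": 0.11, "75+": 0.10}
sample     = {"16-24": 0.05, "25-34": 0.12, "35-49": 0.24,
              "50-64": 0.28, "65-74": 0.16, "75+": 0.14}

weights = {band: population[band] / sample[band] for band in population}
# Under-represented young respondents are up-weighted; over-represented
# older groups are down-weighted.
print(round(weights["16-24"], 2), round(weights["50-64"], 2))  # 3.0 0.79
```

As the report stresses, weights like these correct the demographic skew but cannot remove attitudinal biases that are uncorrelated with the weighting variables.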

5. What is the quality of the data using web designs?

Beyond sample quality, data quality is an important consideration when choosing between survey designs. Access panel data quality can be less robust, with evidence of some respondents completing questionnaires as fast as possible in order to maximise earnings. Ideally, respondents should take as much care answering the questions as they would in an interview. With no observer present, data quality can only ever be measured obliquely, by considering multiple metrics in combination.

There were some positive findings about the quality of the data:

- The time stamp data showed that web respondents took the same length of time to complete the questionnaire (and sections within it) as interview respondents.
- The same level of care was taken to complete the questions, as assessed by several metrics:
  - A similar number of items were selected when multiple responses could be given;
  - There was a similar level of differentiation between questions that used the same response scale;
  - There was no (additional) bias towards items at the top of long response lists.

There were also some less positive findings:

- One quarter of respondents appeared to be the wrong individual (i.e. their birthday was not the most recent in the household);
- There was higher dropout during the web survey than with face-to-face interviews (10% compared to 1%), especially at the first few screens;
- There were more "don't know" and refusal answers (a finding that is typical of self-completion questionnaires compared to interviews).

The most concerning of these findings is the number of wrong respondents. The respondent data from the experiment is not extensive enough to assess the impact of this, but the development work in 2013-14 should provide a fuller understanding before a commissioning decision is made for the 2014-15 period.
TNS BMRB has also designed, and is testing, an alternative in which all adults in a household are invited to take part (so that no one is the wrong respondent). Technical developments mean that the number of "don't know" and refusal answers should be reduced when the web survey is repeated. Some minor changes to questionnaire order may also reduce drop-outs (impact unknown at the time of writing).
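The "level of differentiation between questions that used the same response scale" can be quantified in a simple way. The definition below is our illustrative sketch, not necessarily the metric TNS BMRB used: the share of distinct answers a respondent gives across a battery of items sharing one response scale, where a value near 1/number-of-items suggests straightlining.

```python
# Sketch of a differentiation metric for a grid of same-scale items.
# Straightliners pick the same scale point throughout, so they use few
# distinct values; careful respondents spread across the scale.
def differentiation(answers):
    """Fraction of distinct responses across a battery of same-scale items."""
    return len(set(answers)) / len(answers)

careful = [2, 4, 1, 5, 3, 2]        # varied answers across six items
straightliner = [3, 3, 3, 3, 3, 3]  # same point picked every time
print(differentiation(careful), differentiation(straightliner))  # roughly 0.83 vs 0.17
```

Comparing the distribution of such a score between web and interview respondents is one oblique way to check whether the same level of care is being taken.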

6. Do the two samples give similar estimates?

Comparing the web and interview findings is difficult because the small sample sizes for each variant of the web survey design mean there are quite large margins of error around each estimate. The additional work in 2013 should provide stronger data in this respect. Nevertheless:

- It is expected that approximately 40-50% of web survey estimates will be significantly different from the equivalent face-to-face interview estimate. Many of these differences will be small, but some will be large;
- Use of an incentive payment leads to a smaller difference between web-only and face-to-face interview estimates;
- Including a postal questionnaire with the second reminder does not make the estimates more aligned with the face-to-face interview estimates, despite the much higher response rate;
- Standard demographic weighting of the data makes very little difference to any key estimates.

The last finding suggests a weak relationship between response rate and key measures. Remaining differences may be due to the different modes of data collection rather than (uncorrected) differences in sample composition.

The next six charts illustrate these findings and provide a sense of what might change if the survey design changes. Each chart shows one key population estimate from among the 22 headline variables. The six have been selected to represent different topic areas in the questionnaire. For simplicity, analysis is restricted to the web-only and web/postal designs in which a £10 incentive was offered. The results are shown alongside the face-to-face interview results from the same period (August to September 2012). The data have sampling weights applied but no demographic weights. Demographic weighting would change the estimates but, for the most part, only by marginal amounts. The black arrows show the margins of error (95% confidence intervals) 1 around each estimate. These are quite wide, so these findings should be treated as indicative only.
1 The 95% confidence interval is a range of plausible population values. Given the survey design, there is only a 5% probability of generating a point estimate outside of this range.
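Such intervals can be approximated with the standard formula for a proportion. The sketch below assumes simple random sampling; the survey's clustered, weighted design would widen the intervals via a design effect, and the 800-respondent base is a hypothetical figure for illustration:

```python
import math

def proportion_ci(p, n, z=1.96):
    """Approximate 95% confidence interval for a proportion.

    Assumes simple random sampling; a design effect from clustering and
    weighting would widen this interval in practice.
    """
    half_width = z * math.sqrt(p * (1 - p) / n)
    return (p - half_width, p + half_width)

# Hypothetical: a web-design estimate of 44% from a base of 800 respondents
low, high = proportion_ci(0.44, 800)
print(round(low, 3), round(high, 3))  # 0.406 0.474
```

An interval of roughly plus or minus 3.5 points on each web estimate explains why many of the chart differences below should be treated as indicative only.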

Figure 4: Any civic participation in the last 12 months?

- Web only: 51%
- Web & postal: 44%
- FTF (£5 conditional shopping voucher): 43%

Figure 4 shows that a web-only design appears to lead to a higher estimate of the level of civic participation, possibly because of the slightly older age profile and other demographic biases (more highly educated, more likely to be home owners, etc.). Inclusion of the postal questionnaire in the second reminder seems to lower the civic participation rate despite the extra influx of older people. It could be that those responding only at a third request are less civic-minded than those who respond early.

Figure 5: Any formal volunteering in the last 12 months?

- Web only: 45%
- Web & postal: 45%
- FTF (£5 conditional shopping voucher): 44%

Figure 5 shows the proportion reporting any formal volunteering activity in the last 12 months. The results are closely aligned here, regardless of whether a web-only or web/postal design is adopted. None of the differences are statistically significant.

Figure 6: Any informal volunteering in the last 12 months?

- Web only: 67%
- Web & postal: 62%
- FTF (£5 conditional shopping voucher): 61%

Figure 6 shows the proportion reporting activity that has been categorised as informal volunteering. There is broad alignment here, although the web-only estimate is a little higher than the other two. Given that the web survey is a self-completion survey, it is reasonable to assume that its respondents are generally more willing to participate in surveys than face-to-face respondents, whom the interviewer can encourage to take part. As with civic participation, there may well be a moderate association between willingness to help others and willingness to complete a self-completion survey, which could help explain why the web-only estimate is slightly higher. Such attitudinal biases are hard to eradicate because they are not strongly correlated with the demographic factors that can be dealt with by weighting the data.

Figure 7: Given to charity in the last 4 weeks?

- Web only: 82%
- Web & postal: 82%
- FTF (£5 conditional shopping voucher): 76%

Figure 7 shows the proportion reporting giving to charity at least once in the last four weeks. Both the web-only and web/postal designs give a slightly higher estimate than the face-to-face interview design. As before, there may be a relationship between a willingness to help others (by giving to charity) and a willingness to complete a self-completion survey with no interviewer providing encouragement to take part.

Figure 8: Agree that "I can influence decisions made about the local area"?

- Web only: 26%
- Web & postal: 27%
- FTF (£5 conditional shopping voucher): 37%

Figure 8 shows the proportion agreeing that "I can influence decisions made about the local area". The experiment suggests that a web-only or web/postal design would produce a lower result for this measure. This is an example of a question where some respondents may slightly overstate their influence in an interview situation but not when completing a questionnaire by themselves. The observed magnitude of the difference is in line with examples from the methodological literature in which sample composition has been controlled more fully than was possible here. Consequently, in the view of TNS BMRB, this difference is largely due to data collection mode rather than uncorrected sample composition differences.

Figure 9: Agreement that people in the neighbourhood pull together?

- Web only: 62%
- Web & postal: 63%
- FTF (£5 conditional shopping voucher): 62%

This chart shows the proportion agreeing that people in the neighbourhood "pull together". The phrase is open to some interpretation but yields a similar result with all three survey designs.

7. Conclusions

In summary, the experimental study provided some key information:

- What response rate can be achieved with web-only data collection?
  - A little over 15%, unless an incentive is used.
- What difference do incentives make to the response rate?
  - A conditional incentive of £10 raises the response rate to 22%.
  - An unconditional incentive can achieve a slightly higher rate, but the cost is prohibitive.
- What is the data quality like?
  - Generally good, although there are cases of respondent self-selection within the household.
- What is the profile of a web sample compared to a face-to-face interview sample?
  - The web sample is more likely to comprise everyday internet users, high earners, native English speakers and people with HE qualifications. It will also include more middle-aged people living with a partner and owning their own home. This demographic skew can be corrected through sample calibration, but non-demographic skews may remain.
- Do the two samples give similar estimates?
  - TNS BMRB would expect approximately 40-50% of web sample estimates to be significantly different from interview sample estimates, with some very substantial differences. Demographic weighting is unlikely to strengthen the alignment of the estimates. It is likely that data collection mode has more of an influence on the results than sample composition.
- What statistical value is added by the inclusion of a postal questionnaire?
  - Very little; the response rate is increased but the sample profile is not greatly improved.

8. Next steps

As mentioned in the introduction to this report, TNS BMRB has been commissioned to continue testing the web survey alternative in 2013-14. This testing will comprise a reduced-sample face-to-face survey (1,250 per quarter; 5,000 in the 2013-14 survey year) alongside a web survey (2,000 per quarter; 8,000 in the 2013-14 survey year). The web sample will be given the option to request a postal questionnaire if preferred. Given the feedback received in previous consultations, the survey results will be published on an annual, rather than quarterly, basis.

In addition, a version will be tested in which all household adults can complete the questionnaire, avoiding the problem of respondent self-selection. This test will be carried out alongside a tweak to the last-birthday method: in half of the sampled households the respondent should be the adult with the last birthday, and in the other half the adult with the next birthday.

Obtaining a much larger web-based sample over the course of 2013-14 will allow greater refinement in (i) analysis of sample composition, and (ii) analysis of the substantive data. TNS BMRB expects to use this data to develop a weighting strategy, should a web version of the Community Life Survey be commissioned. Finally, it is hoped that a survey design communication strategy can be developed to ensure that data users are fully informed about the change of design and how to interpret and analyse the web and face-to-face data.