Laurie Dillon. Architecting good research: think with the end in mind




Online Surveys: Lessons Learned

Online surveys are a fast and cost-effective means of quantitative research. We could never have executed a survey the size of the CMA's second annual Internet Marketing survey within our timelines without online technology. From start to finish, the project was a mere eight weeks long, with the online survey conducted the week before the last long weekend of the summer. Fortunately, we yielded 452 responses, exceeding our target response rate by 29%. Below are our insights into leading a successful research project.

Laurie Dillon
Co-chair, National Practice: Interactive Marketing and Brand Strategy
Centres for IBM e-business Innovation: Toronto

Architecting good research: think with the end in mind

Develop a comprehensive research proposal with clear objectives to guide research efforts. As with the beginning of any research effort, be clear about what the research objectives are and the intended use of the research results. A comprehensive proposal will focus activities on delivering against core objectives. Moreover, a well-thought-out research proposal can save time with the researcher; it will be clear to both parties how the research results will be used.

For our effort, the research objectives were:
- To understand how Internet technologies were being leveraged to support and evaluate online marketing initiatives.
- To understand the intended usage of Internet marketing for future brand efforts.
- To outline key business, marketing or technology challenges facing Canadian Internet marketers.

Our intended uses of the research results were:
- To develop an overall Canadian perspective on current and future Internet marketing efforts.
- To provide the Canadian marketing community with a guide on how the marketplace currently leverages technology in support of branding efforts.

Leverage subject matter expertise, whether in the brand, the area of focus or the target market, to assist in formulating the survey objectives and questions, to provide input to the research methodology, and to interpret survey results. The key to any research is asking the right questions to yield the answers you are seeking. This is not as easy as it may sound: each survey question needs to develop, confirm or refute the hypotheses that are driving your need for research.

For the CMA's second annual Internet Marketing survey, we leveraged our in-house marketing expertise at IBM to direct and organize the research effort. For example, expertise was used to:
- Recommend new research directions and focus areas that covered key branding and marketing hot topics.
- Rewrite the questionnaire.
- Source the research partner, SurveySite.
- Condense the 2000 survey from 37 questions to 20 in 2001, all the while ensuring survey results would generate enough meaningful data to create a comprehensive report.
- Eliminate questions that would deliver data already widely supported by other recent Canadian Internet research.
- Rewrite questions that were perhaps appropriate for last year but too generic for 2001. For instance, the 2000 survey asked respondents if they had a web site; this year, we expanded that question to understand what kind of web presence companies had (business-to-consumer, business-to-business or business-to-employee).
- Analyze and identify key findings, and add the business implications to the research conclusions found in the research summary report.

Carefully evaluate how the research will be cross tabulated. For the 2001 survey, we cross tabulated results by respondent title and corporate size so that we could view any significant differences between executives and management, as well as between small, medium and large businesses. We worked with our research partner, SurveySite Inc., to define the cross tabulations. For example, corporate size was defined by employee count, with small equating to 1 to 99 employees, medium to 100 to 999 employees, and large to more than 1,000 employees.

Watch out: if not done carefully, cross tabulations can result in views that are too narrow to report on. For instance, our intention was to report on Internet presence by industry, but we provided too many industry fields (in an effort to be consistent with the CMA 2000 survey). This resulted in sample sizes by industry that were statistically too small to report on.
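The size-band cross tabulation described above, along with the "too small to report" pitfall, can be sketched in a few lines of Python. The 30-respondent reporting threshold and the record layout are illustrative assumptions, not figures from the survey; only the size bands come from the report.

```python
from collections import Counter

def size_band(employees):
    """Size bands as defined in the report: small 1-99, medium 100-999, large 1000+."""
    if employees < 100:
        return "small"
    if employees < 1000:
        return "medium"
    return "large"

def cross_tab(records, min_cell=30):
    """Count respondents per (title_group, size_band) cell and flag
    whether each cell is large enough to report on (threshold assumed)."""
    cells = Counter((title, size_band(n)) for title, n in records)
    return {cell: (count, count >= min_cell) for cell, count in cells.items()}

# Hypothetical respondent records: (title group, employee count).
records = ([("executive", 50)] * 40
           + [("management", 2500)] * 35
           + [("executive", 500)] * 5)
table = cross_tab(records)
# ("executive", "small") has 40 respondents and is reportable;
# ("executive", "medium") has only 5 and would be flagged.
```

Flagging thin cells before fielding the survey, rather than after, is the cheap way to catch the over-segmentation problem the industry question ran into.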

Leverage the strengths of a professional research expert and statistician in any quantitative effort. Our partner, SurveySite Inc., added tremendous value beyond the survey execution by:
- Ensuring survey questions did not lead the respondent, by designing an unbiased questionnaire.
- Organizing questions in a fashion that encouraged greater response; for example, comfortable, generic questions were listed first in the survey, with more sensitive questions, such as customer profile questions, listed last.
- Ensuring that best practices in online research were implemented; the online survey was programmed so that the number of clicks was reduced to a minimum for a better user experience.
- Ensuring that the survey used dropdown boxes effectively for long response lists, making the survey look shorter.
- Ensuring that efforts to increase responses did not push participants to respond, which would affect the reliability of the survey results.
- Ensuring that the execution of the survey controlled for multiple responses through the use of an individual password per respondent. If a survey offers an incentive to respond, this guards against ballot-stuffing and prevents one respondent from skewing the data one way or another.

Increasing your online survey response rate

Success! We increased responses from 220 in 2000 to 452 in 2001. Here are a few of the ways we managed this:

Cast a wider net. In 2000, the CMA invited 2,200 respondents to the survey. This year, using an expanded, opt-in CMA customer database, we increased our e-mail invitations to 5,560.

Keep the survey to a minimum. Last year, the survey was 37 questions. For 2001, we condensed the survey to 20-24 questions and greatly reduced the number of mouse clicks required to complete it.

Be honest about how long the survey will really take. Our first draft survey would have taken much longer than our promised 10 minutes to complete. We did not want to annoy respondents (and so affect our credibility and response rates), so we shortened the survey to ensure it would take no more than the 10 minutes we promised.

Provide value, value, value. In SurveySite's experience, online survey response rates increase dramatically when the participant gains value from responding. For the 2001 survey, we identified multiple, relevant kinds of value for responding: we offered a copy of our final results, additional learnings on execution, and a contest component as a further incentive.

Increasing your online survey response rate, continued

Send the survey mid-week, during mid-afternoon. Most e-mail users start their Monday mornings cleansing their mailboxes of non-corporate or personal e-mails. The likelihood of your e-mail being read is increased by sending out invitations mid-week, after 12 p.m. Other e-mail marketing tactics (e.g. sender and subject line testing) can also contribute to higher response rates.

Use one reminder e-mail to follow the survey invitation. Standard to an online survey execution is sending out one reminder e-mail to the survey invitees. Our reminder e-mail generated an additional 64 responses, or 15% more responses. Note: this reminder e-mail is not to be confused with push tactics, which would involve multiple reminders and/or follow-up phone calls to encourage survey participation.

For our target market, fax invitations were not found to be effective. Out of 1,811 fax invitations, we received only 32 responses, a 1.8% response rate.

Allow for some open-ended questions. Give respondents the opportunity to provide open-ended answers instead of answering just "other." It can be disappointing at the close of a survey to discover a very high "other" response; it indicates an insight that was not captured by the options in the closed-question format. To counter this, we added open-ended questions whereby the respondent could articulate what the "other" answer meant. For report writing, these open-ended responses can be used to confirm or articulate a finding.

Research challenges: getting to a clean customer database

Cleansing the customer database is crucial to maintaining data integrity. With our survey, we ran across a common issue with databases: approximately five percent of our opt-in customer list contained duplicate entries.

May Jim, an e-business strategy consultant with the Centre for IBM e-business Innovation in Toronto, has extensive experience in amalgamating and centralizing separate databases into one. For our work, May recommended three key steps.

First, clarify the cleansing process. What do you intend to do with a duplicate entry? How do you cleanse it? How do you determine the genuine source?

Second, define what a duplicate entry is. Is it a name? A fax number? For the purposes of this survey, we defined a duplicate as an entry that has the same name and company but two fax numbers. (This means that if an entry has the same name but a different company, we do not filter out that name.) For example:
- Steve Smith at IBM, fax 555-5555 is a duplicate of Steve Smith at IBM, fax 545-5454.
- Steve Smith at IBM, fax 555-5555 is NOT a duplicate of Steve Smith at ABC Corp, fax 464-6666.
- Steve Smith at IBM, fax 555-5555 is NOT a duplicate of Stephen Smith at IBM, fax 555-5555.
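The duplicate rule defined above amounts to a short filtering script. This sketch keeps the first occurrence of each (name, company) pair and diverts later matches to a separate duplicates list, mirroring the "filter duplicates to another file" approach; the field names and the first-entry-wins policy are illustrative assumptions.

```python
def split_duplicates(entries):
    """Keep the first entry per (name, company) pair; divert later entries
    with the same name and company, regardless of fax number, to a
    separate duplicates list for review."""
    keep, duplicates, seen = [], [], set()
    for entry in entries:
        key = (entry["name"].lower(), entry["company"].lower())
        if key in seen:
            duplicates.append(entry)
        else:
            seen.add(key)
            keep.append(entry)
    return keep, duplicates

# The report's own worked examples:
entries = [
    {"name": "Steve Smith", "company": "IBM", "fax": "555-5555"},
    {"name": "Steve Smith", "company": "IBM", "fax": "545-5454"},      # duplicate
    {"name": "Steve Smith", "company": "ABC Corp", "fax": "464-6666"}, # not a duplicate
    {"name": "Stephen Smith", "company": "IBM", "fax": "555-5555"},    # not a duplicate
]
keep, duplicates = split_duplicates(entries)
```

Note that, as in the report's rules, a spelling variant such as "Stephen" versus "Steve" survives the filter; catching those would require the more comprehensive name-matching exercise the report sets aside.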

For our purposes, it would have taken too long to check name spellings; this could be done as a more comprehensive exercise to determine a genuine entry. We also recognized that many people share the same fax number within an office, so a duplicate fax number alone was not sufficient to define a duplicate entry.

Third, confirm what will be done with the entries identified as duplicates. For the 2001 survey, we filtered duplicate entries into a separate file.

Once these three key steps were defined, filtering duplicates out of the existing database was relatively easy. We were fortunate to get the customer list in an MS Excel file format. Michal Jordan-Rozwadowski, Advisory IT Professional at the Centre for IBM e-business Innovation, wrote a simple MS Access query that extracted duplicate entries from a customer list. Yvonne Lam, another IT professional from the IBM Centre, wrote an MS Excel macro that did the same. We gave these to both SurveySite and the CMA to cleanse the database of duplicates.

In terms of privacy, we guarded the customer database and the survey responses with the utmost confidentiality. Respondents were not asked for their names or corporation names on the actual survey, as a method of ensuring respondent privacy. An unsubscribe option accompanied our e-mail invitation; out of 5,560 e-mail invitations, 4 respondents opted out of the customer database.

How much?

Online surveys offer both a time and a cost advantage over more traditional methods of quantitative research. With traditional survey methods, costs are typically associated with sample size. When a survey is delivered electronically, sample size is no longer a cost determinant. Online survey costs depend on the length of the questionnaire (which drives programming and handling costs) and the amount of detail needed in the final report (cross-tabulated raw results are cheaper than survey report summaries). For budgeting purposes, a 20-25 question survey with basic analysis can cost roughly $12,000 to $15,000 CAD.

We strongly recommend getting, at minimum, a research summary and presentation from the researcher. Too often, companies rely solely on the raw cross-tabulated report as findings. The research summary, costing only a fraction more, will identify any significant findings on your behalf. Armed with the research summary, the subject matter expert can turn research findings into conclusions and recommendations, identifying the business implications, opportunities and key customer insights. It is this step that turns good research into usable knowledge.

How long?

From start to finish, the 2001 CMA Internet Marketing research project took eight weeks. These timelines were extremely tight, as we wanted to release our results at the CMA's seventh annual Internet Marketing conference. (Although our project timelines were eight weeks, we worked weekends; realistically, our project would have been at least 10 weeks long.) Twelve to fifteen weeks would be a more manageable project length. Timelines are a factor of the number of resources available to a project. On this project we had two full-time resources, with six other contributors at various points throughout the project. This does not include the contribution of our research partner, SurveySite.

Our project schedule is summarized below:

Week of       Activity                                                        Length of time
July 30       Research planning, partner sourcing, project funding            2-3 weeks
August 6      Development of survey questionnaire, e-mail invitations         48 hours
August 13     Survey programming, editing, testing and revisions              2 weeks
              Translation                                                     72 hours
August 20-27  Sending e-mail and fax invitations, response period,
              and reminder e-mail                                             1-2 weeks
September 3   Cross tabulation of results                                     2 weeks
              One-on-one interviews                                           2 days
September 10  Reviewing and interpreting results, writing the report          2 weeks
September 17  Creation of graphics and charts                                 1 week
September 24  Final copy revisions, graphical layout, publishing,
              translation of final report                                     1 week
October 1     Presentation of results at the CMA 7th Annual Internet
              Conference, October 2, 2001

Full results:

              Outbound     Bounce-  Net       Responses  Response  Response
              invitations  backs    outbound             rate      period
2000 results  2200         N/A      N/A       220        10%       4 weeks
2001 results  7371         1555     5816      452        8%        1.5 weeks
  E-mail      5560         1555     4005      420        10%
  Fax         1811         N/A      N/A       32         2%

Final thoughts

When we sat down with the Canadian Marketing Association to prepare for this year's second annual CMA Internet Marketing survey, we wanted to achieve two goals: to deliver a state-of-the-art, statistically solid online survey that resulted in compelling research findings, and to improve upon our own success metrics in terms of responses and research accuracy.
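The figures in the results table above can be sanity-checked with a short script. The margin-of-error calculation below uses the standard worst-case simple-random-sampling approximation for a proportion at 95% confidence; this is an assumption for illustration, as the report does not detail SurveySite's exact statistical design.

```python
import math

def response_rate(responses, net_outbound):
    """Responses received as a fraction of net (delivered) invitations."""
    return responses / net_outbound

def margin_of_error(n, z=1.96, p=0.5):
    """Worst-case margin of error for a proportion at 95% confidence
    (p = 0.5 maximizes the variance term)."""
    return z * math.sqrt(p * (1 - p) / n)

email_rate = response_rate(420, 4005)    # e-mail channel, 2001
overall_rate = response_rate(452, 5816)  # all channels, 2001
moe = margin_of_error(452)               # for the 452 completed responses
```

Rounded, these reproduce the table's 10% e-mail and 8% overall response rates, and a margin of error of roughly 4.6 percentage points for 452 responses, consistent with the +/- 5 points the project targeted.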

Final thoughts, continued

We are very pleased with our results. The Internet Marketing questionnaire was completely revised to reflect the need to investigate the Canadian online marketing landscape at a more detailed level. In addition to establishing survey benchmark questions, we added focus areas to gain greater insight into how Internet technologies were being used to support and evaluate online marketing initiatives.

In terms of accuracy, our goal was to reach a higher confidence level than the 2000 survey, which had 220 respondents. To reach a 95 percent confidence level, we needed a minimum of 350 responses. With careful planning, we surpassed that target with 452 responses, achieving the desired 95 percent confidence level with a margin of error of +/- 5 percentage points. This was very encouraging, given that our response period was shortened from four weeks in 2000 to just one and a half weeks in 2001.

Indeed, the response to our research efforts was very positive. We even received personal e-mails from some participants congratulating us on our efforts. One respondent wrote, "I just wanted to mention that yours was the most professional survey I've ever been invited to participate in." Another asked, "How soon could I get a copy [of the survey results]?"

For IBM, these strong results were very satisfying. Originally, when the CMA looked to IBM for support on this survey, they saw us as a potential coder for the online portion. Although we can and do provide this type of service, we proposed a broader consulting relationship to guide this year's Internet Marketing research. By working closely with the CMA executive team, we were able to develop research that was relevant, meaningful and new to the membership. We helped architect the research; identify new research objectives and success targets; find partners and funding; create an implementation plan; turn initial survey results into meaningful business implications; design, author and publish the report; and create a PR and marketing launch plan for the report itself. Clearly, a larger role than code development!

I am confident in the value of the lessons learned from conducting the 2001 CMA Internet Marketing survey, and I hope they contribute to your own effectiveness with online survey methods. Thanks!

Laurie Dillon
Co-chair, National Practice: Interactive Marketing and Brand Strategy
Centres for IBM e-business Innovation: Toronto
120 Bloor St. E., Toronto, Ontario M4W 1B7
Office: 416.355.2141
E-mail: ldillon@ca.ibm.com

About

This lessons-learned report was written by Laurie Dillon, co-chair of IBM's Interactive Marketing and Brand Strategy National Practice. This consulting practice helps clients understand what their customers want and value from the digital world, and how to use this knowledge to build market share, increase profits and build sustainable competitive advantage.

SurveySite is located in Toronto, Canada. The staff at SurveySite provide complete qualitative and quantitative research capabilities across North America, Europe and Asia, and are fully experienced in a wide variety of research and statistical methodologies. Methodologies implemented online include e-mail surveys, visitor profiles, online conjoint analysis, Web surveys, customer and employee satisfaction studies, and professional Web site evaluations.

Copies of the 2001 CMA Internet Marketing survey are available through the membership section of www.the-cma.org, or visit the IBM website at www.ibm.com/ca/services.