COMPARISON OF POSTAL AND ONLINE SURVEYS: COST, SPEED, RESPONSE RATES AND RELIABILITY

Research Conducted By: Education Market Research
Robert M. Resnick, Ph.D.
February 2012

With the support of MCH Strategic Data

© 2012 Education Market Research and MCH Strategic Data. Copying prohibited.
EXECUTIVE SUMMARY

The intent of this study was to put a number of hypotheses about postal vs. online surveys to the test. Conventional wisdom has it that online surveys are superior to postal surveys in terms of both speed and cost-efficiency. It is also generally assumed that the reliability of data is essentially equivalent when postal and online survey data are compared. The other issue investigated here was the effect of questionnaire length on response rates and data reliability.

The empirical evidence used to answer these questions came from a parallel administration of the same EMR survey, at the same time, to separate but similar samples of educators. One set of surveys was printed and mailed, with responses collected via return mail, while the second set was deployed via e-mail, with the surveys filled out and the responses collected online. The data obtained through this simple experiment sheds light on the relative advantages and possible limitations of both postal and online surveys in terms of speed, cost-efficiency, and data reliability.

Comparing the actual cost, speed, response rates, and data reliability of the two survey modes yielded the following profile of pros and cons.

Relative Merits Of Postal vs. Online Surveys

CRITERION                     POSTAL                                             ONLINE
Cost                          $23.14 per usable response (9% less than online)   $25.22 per usable response
Speed                         85% of responses within 2 weeks                    85% of responses in less than 1 day
Response rate                 4.4% (6 times higher than online)                  0.75%
Optimum number of questions   30 questions or more                               15 questions is too many
Reliability                   High split-half and test-retest reliability        Uncertain; needs more research

In this specific test, the postal mode outpaced the online mode on four of the five key criteria, with the online mode superior only in terms of speed. If more than 10 questions are needed to get the job done, the postal mode would be the only prudent option.
Given that the online mode is faster, but not cheaper, than the postal mode, it must be emphasized that there is evidently a significant trade-off to weigh when the online mode is utilized: higher speed in exchange for lower response rates and reliability.
INTRODUCTION

A market or customer survey project can be divided into four parts:

Four Phases Of A Survey Project

PHASE   DESCRIPTION
1       Conceptual Phase
2       Operational Phase
3       Analysis Phase
4       Implementation Phase

1. The conceptual phase, during which it is decided to whom, to how many, and when and where the surveys will be sent, what questions will be asked, what form the questions will take, how many there will be, and in what order they will be displayed on the questionnaire.

2. The operational phase, during which the surveys are prepared and sent out to the target population, completed questionnaires are retrieved, and the data is tabulated.

3. The analysis phase, during which the tabulated data is carefully examined to determine whether the responses confirm what is already known or believed to be the case, or whether there are surprises and the responses suggest an alternative to what was previously believed to be true. There is also an opportunity to pinpoint significant differences, if any, in opinions and needs by grade level, job title, geographic region, etc., which could cause product developers and marketers to adopt different strategies for different sub-groups within the overall population of educators.

4. The implementation phase, during which those who receive the survey results and analyses decide, based at least in part on the data, what next steps to take.

Surveys are sometimes printed and mailed, filled out with pens and pencils by the respondents, and then mailed back. Sometimes they are deployed via e-mail, with the responses collected online. Education Market Research (EMR) has, for seventeen years, primarily (although certainly not exclusively) used the printing and mailing approach for educator surveys. Recent advances in technology, and the ready availability of e-mail addresses for educators, have made the online approach a plausible alternative to the postal mode.
The purpose of this paper is to compare the two approaches in the Operational Phase (as defined above) of a survey project, which is where they differ in a number of important ways. It is assumed that the Conceptual, Analysis, and Implementation phases would be much the same regardless of which type of survey was used to obtain the data.
Some might wonder why it is even necessary to consider the relative merits of the two approaches. They might suggest that, on the face of it, the printing/mailing approach is the slow, expensive, old-fashioned or "analog" way, while the online approach is the fast, inexpensive, modern or "digital" way to conduct survey research. It is hard to argue the issue of speed, but a careful cost comparison is certainly warranted. And what about response rates, and the closely related issue of reliability of the data? Is there a trade-off to be concerned about with regard to speed vs. response rates and reliability?

If it were an open and shut case in favor of online surveys, one would expect that all of the major survey companies would have already switched to the online mode. But that has not happened. Putting EMR aside, J.D. Power and Associates (a division of The McGraw-Hill Companies) is a good example of the continued reliance on postal surveys. The most recent J.D. Power 2012 Vehicle Reliability and Service Survey was mailed in October. The survey instrument was eight pages long with 68 numbered questions (since many questions required multiple responses, there are actually more than 68 questions). An inquiry made to J.D. Power about the survey revealed that this flagship research project is only done by mail (although there are some other surveys that J.D. Power conducts online). Is it possible that J.D. Power has tested and discovered that surveys of this extensive length do not lend themselves to online delivery?

So the intent of this study was to put a number of hypotheses about postal vs. online surveys to the test. Conventional wisdom has it that online surveys are superior to postal surveys in terms of both speed and cost-efficiency. It is also generally assumed (or possibly not considered at all) that the reliability of data is essentially equivalent when postal and online survey data are compared.
The other issue to be investigated here was the effect of questionnaire length on response rates and data reliability. Do longer surveys with relatively complex questions work as well in the online mode as shorter, simpler ones?

The empirical evidence used to answer these questions came from a parallel administration of the same survey, at about the same time, to separate but similar samples of educators. One set of surveys was printed and mailed, with responses collected via return mail, while the second set was deployed via e-mail, with the surveys filled out and the responses collected online. The data obtained through this simple experiment sheds light on the relative advantages and possible limitations of both postal and online surveys, in terms of speed, cost-efficiency, and data reliability.
METHOD

In order to provide information on current trends in the elementary Reading market segment, EMR designed a detailed survey, and then mailed it on October 17, 2011 to 18,000 randomly selected educators segmented as follows.

Sampling Matrix

JOB TITLE               ELEMENTARY   MIDDLE/JHS   TOTAL
Classroom teacher       6,000
Reading teacher         3,000        3,000        6,000
Curriculum supervisor
TOTAL                   9,000        3,000        18,000

The survey consisted of four printed pages with a total of 30 numbered questions, of which 26 were multiple choice and 4 required a write-in response. Of the 26 multiple choice questions, 6 offered the option of writing in an "other" response.

Three weeks later, on November 8, 2011, the same 30-question Reading Market survey was replicated using the online method. This survey contained identical questions, but due to the formatting necessary for on-screen viewing, the length expanded to 16 pages or screens, compared to 4 pages for the postal version. The invitation to participate in the survey was e-mailed to 90,000 randomly selected elementary teachers (excluding the 18,000 chosen for the postal survey).

In terms of a response deadline, the postal survey was mailed (via first class mail) on October 17, 2011 with a response deadline of November 11, 2011, a 4-week window. The online survey was deployed on November 8, 2011 with a November 20, 2011 response deadline, a 12-day window.

MCH Strategic Data

Both for the postal and for the online survey, the educator names were provided by MCH Strategic Data (www.mchdata.com), a leading compiler of education market data. MCH also handled the configuration and deployment of the online survey, and the collection and tabulation of responses. MCH's database features comprehensive coverage of public, private, and parochial schools, 15,000 school districts, and nearly 5 million educators. In 2010, MCH Strategic Data acquired a fully updated version of the QED Education Database and merged it with its own comprehensive K-12 database.
MCH Strategic Data has an 83-year commitment to accurate, complete, and timely marketing databases. The company compiles the foremost databases available for business-to-institution marketing, including education, health care, government, and religion, with the education database making up its largest, most comprehensive segment. The compiling operation is centered in the MCH Research Department in Sweet Springs, Missouri. The compiling team is staffed with full-time, year-round professionals who use a wide variety of techniques to update, verify, and enhance the database. Many of the Research staff members have first-hand subject area expertise, in addition to decades of MCH compiling experience. MCH employs a wide variety of compilation methods and sources to develop the foremost education marketing data available.

* Annual telephone surveys verify school and district telephone numbers, names and addresses, fundamental attributes like enrollment and grade span, and key personnel names, and identify new and closed schools. MCH telephone-verifies information on every school district and 99% of schools.
* School rosters and class schedules are used to add, delete, or verify millions of educator names and job functions.
* Published and web-based sources are used to validate information, rectify discrepancies, and verify teacher names and job functions.
* State directories and licensing files assure completeness of the database and add to comprehensive feature attributes.
* Federal Department of Education statistics provide ethnicity, funding, and other attributes, and verify links between schools and districts.
* Postal address verification ensures that mailing addresses are always deliverable and receive appropriate postal discounts.
* E-mail address verification assures deliverability of e-mail addresses and maintains CAN-SPAM compliance.
* Geospatial processing identifies and rectifies questions and issues arising from geo-location analysis.
* Customer-provided information from the GetThere™ Guarantee program quickly identifies questionable addresses, which allows MCH to investigate and update information in its database.
* Internal Quality Control Audits include dozens of automated and manual checks to identify and correct errors, preventing them from being added to MCH's all-inclusive database.

The MCH compiling professionals schedule these and other compiling processes to ensure that the MCH Strategic Data education database is as comprehensive, complete, and accurate as possible.
RESULTS

Survey Costs

In most cases, cost is the primary concern of the client who requests a market survey. If alternate methods are available to achieve the same goal, it is almost inevitable that the less expensive option will be chosen. The conventional wisdom seems to be that online surveys, which avoid all printing and postage costs, are less expensive to conduct than postal surveys. In the case of EMR's two test surveys, it was just the opposite. The postal mode was actually less expensive than the online mode.

The cost for obtaining a mailing list, printing and mailing the surveys (first class mail), return postage, and data tabulation was just over $14,000. The cost for obtaining an e-mail list, formatting the survey for online viewing, deploying the survey, collecting the online responses, and tabulating the data was just over $17,000. Of course, the postal survey was mailed to a list of 18,000 names while the online survey invitation was e-mailed to 90,000 names. It is no surprise, then, that the online survey cost was higher. If the lists were of equal size (18,000 names in both cases), that would have reduced the online survey cost to something in the neighborhood of $5,000.

However, the goal was to produce a similar number of responses from each survey mode. Since postal surveys generally yield higher response rates than online surveys, it was necessary to increase the size of the online sample in order to ensure a number of responses roughly equal to that which the mail survey was expected to deliver. With a target number of completed surveys in mind, it was estimated going in that a mailing of 18,000 (with a 3%-5% response rate) would be sufficient to produce that result. On the other side of the coin, since online survey response rates are typically between 0.5% and 1.5%, it was determined that a deployment of 90,000 would be necessary to match the postal survey result. As it turned out, that forecast was right on target for both modes.
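The sample-size reasoning above can be sketched as a simple calculation. The target of roughly 600 completed surveys is an assumption inferred from the actual returns (605 postal, 674 online); the response-rate ranges are those stated in the text.

```python
import math

def required_sample(target_responses, expected_rate):
    """Invitations needed to yield target_responses at expected_rate."""
    return math.ceil(target_responses / expected_rate)

# Postal mode, with a 3%-5% expected response rate:
postal_best = required_sample(600, 0.05)    # 12,000 invitations
postal_worst = required_sample(600, 0.03)   # 20,000 invitations
# -> a mailing of 18,000 sits inside this range

# Online mode, with a 0.5%-1.5% expected response rate:
online_best = required_sample(600, 0.015)   # 40,000 invitations
online_worst = required_sample(600, 0.005)  # 120,000 invitations
# -> a deployment of 90,000 sits inside this range
```

The same arithmetic explains why the online list had to be five times larger than the postal list to deliver a comparable number of completes.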
Given that the postal survey actually produced 605 usable returns, and the online survey produced 674, it is fair to consider them equivalent in terms of effectiveness despite the wide difference in initial sample size. That being the case, it is also fair to say that the postal survey was less expensive compared to the online survey, both in terms of total cost, and in terms of cost per usable response. The cost per usable response computes to $23.14 for the postal survey ($14,000/605 responses), and $25.22 for the online survey ($17,000/674 responses). On a percentage basis, the online survey total cost was 21% higher, and on a per response basis it was 9% higher compared to the postal survey. So if cost is of the essence, the postal survey gets the check mark on this score.
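The cost comparison above can be verified directly from the report's rounded totals ($14,000 and $17,000) and the usable-response counts:

```python
postal_cost, postal_responses = 14_000, 605
online_cost, online_responses = 17_000, 674

cost_per_postal = postal_cost / postal_responses  # ~$23.14 per usable response
cost_per_online = online_cost / online_responses  # ~$25.22 per usable response

# Online premium over postal, in total and per usable response:
total_premium = (online_cost - postal_cost) / postal_cost                     # ~21%
per_response_premium = (cost_per_online - cost_per_postal) / cost_per_postal  # ~9%
```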
Elapsed Time Required To Collect Survey Data

The postal version of EMR's survey allowed a 4-week window from mail date to response deadline. The response pattern for this survey, as is typical for such projects, was relatively few responses received through the first week, the vast majority coming in toward the end of the first week and into the second week, and then responses gradually trailing off in the third and fourth weeks. Around 85% were received within the first two weeks, and 15% in the last two weeks. Additional responses, one or two at a time, continued to come in after the response deadline.

The online version of EMR's identical survey allowed a 12-day window for responses. As it turned out, virtually all of the responses were received within the first 24 hours. None were received after the second day. So the same 85% completion level which took two weeks to be reached with the postal survey took less than one day to be reached with the online survey. Obviously, the online survey proved to be far superior in terms of speed of data collection. If speed is of the essence, clearly the online mode is the only choice.

Response Rates

EMR's postal survey sample included elementary classroom teachers, Reading teachers, Principals, and curriculum supervisors, while the online version of that same survey included only elementary classroom teachers. Comparing apples to apples, the response rate for elementary classroom teachers was 4.4% in the postal survey, and 0.75% in the online survey. In other words, the response rate was six times higher in the postal survey than in the online survey. Again, the two survey modes can be considered equally effective in terms of total usable responses produced, but the postal survey was five times more efficient, producing a roughly equal number of usable responses from an initial sample which was one-fifth the size of the online survey sample. Are those response rates typical of postal and online surveys?
Based on EMR's experience, a 4.4% response rate is actually on the low side of the continuum for a postal survey. Such job titles as classroom teacher, school librarian, and department chairperson have often produced response rates as high as 8% to 10% on prior EMR surveys of educators. In terms of online surveys, MCH Strategic Data's most recent project, a survey of Principals, obtained a 1.3% response rate, which is considerably better than EMR's Reading survey result (0.75%). Part of the reason for that difference can be attributed to the length (number of questions) of the EMR and MCH surveys. The interaction of survey length and response rate is discussed in a separate section of this report.
All of the anecdotal evidence points to an expected response rate range for postal surveys of between 3% and 10%. Online surveys should be expected to yield response rates between 0.5% and 1.5%. At either extreme, comparing equivalent surveys, one would anticipate that the postal response rate should be around six times higher than the online response rate. If a high response rate is essential, the postal mode is the best choice.

Interaction Of Survey Length And Response Rates

As a general rule of thumb, the longer the survey (more questions), the lower the response rate. It is possible to get people to respond to just a handful of quick questions with little or no monetary incentive. On the other hand, as the survey gets longer, it takes a stronger incentive to ensure that the response rate will be within an acceptable range. The question here is, what is the effect of survey length on response rates for the postal and the online modes? This test of the same 30-question survey via both the postal and online modes revealed some interesting answers.

When people receive a printed survey via regular first class mail, it makes sense to think that many of them toss it in the trash without even opening it, just as most people delete an e-mail invitation to participate in an online survey without even opening it. Those who do open the postal survey are able to look it over, scan the number and complexity of the questions, and decide if they are willing to take the time and effort to fill it out. If they decide to go forward, they also have the option of answering only some of the questions and sending back an incomplete questionnaire. Similarly, those who open the e-mail invitation can decide whether or not to click the link to the survey. If they opt to go to the survey, the process then becomes more opaque for them, because they cannot readily see the number and complexity of the questions, which are presented one screen at a time.
[A "survey completed" percentage bar was included at the bottom of the survey, but it did not state how many questions remained to be answered.] If they start answering and, at some point, decide that this is taking too much time, or that there are too many questions, they can elect to stop, and their questionnaire will be incomplete.

There is no way to know how many people actually start filling out a postal survey and then fail to put it in the return mail. We only know how many are returned, complete or incomplete. One of the advantages of online survey technology is that we do know how many people answered some, but not all, of the questions. In fact, with EMR's online survey there were 674 completed questionnaires, and another 555 partials. However, when those partials were reviewed to see how many of them had answered at least half (15) of the 30 questions, none had. In other words, of those who were motivated enough to start answering the online survey, but not motivated enough to finish, none was willing to answer as many as 15 multiple choice questions.
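The completion figures above reduce to simple rate arithmetic; the only inputs are the 90,000 invitations and the 674 completes and 555 partials just reported.

```python
invitations = 90_000
completes, partials = 674, 555

actual_rate = completes / invitations                  # ~0.75%: completed questionnaires only
potential_rate = (completes + partials) / invitations  # ~1.37%: if every starter had finished
# (the report rounds the potential rate to 1.3%)
```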
This gives us a simple rule of thumb for online surveys we did not have before: evidently 15 questions is too many if one is concerned about response rate. That is based on the fact that almost as many people started EMR's 30-question online survey and quit before they were halfway through as completed it. If all who started had finished, the response rate would have been 1.3%, almost double the actual 0.75% rate. This hypothesis is also supported by the fact that MCH's recent Principal survey achieved a 1.3% response rate utilizing a 10-question survey. So it seems that if you increase the length of your online survey from 10 to 15 questions, you risk losing half of your likely respondents. Conversely, if you limit your online survey to 10 simple questions or fewer, you are likely to get a response rate closer to the top of the online response rate range of 0.5% to 1.5%. While more research should be done to confirm this finding, it seems prudent to operate as if online surveys are most effective when limited to around 10 questions. If more than 10 questions are required, the postal mode is a better choice.

Reliability Of Survey Data

Most people in the survey business would agree that you don't know anything from a survey unless you know how reliable the survey is. A questionnaire will always produce numerical results, even if those results are meaningless. With unreliable data you run the risk of making business decisions based on survey results that don't actually mean anything. Only a test of reliability can tell you whether you should trust the results. In simple terms, a reliable questionnaire is one that would give the same results if it were used repeatedly with the same group. In the case at hand, that would mean comparing random geographic samples of elementary school teachers and getting the same results each time.
Instead of investing the time and money to do the identical survey twice, there are tests of reliability for questionnaires which indicate whether the results are meaningful. For the purpose of this investigation of postal and online surveys, we looked at split-half measures of reliability. That involved comparing the results obtained from the first half of the sample (early responders) to those of the second half (late responders). If, for example, the results from the first 300 respondents turn out to be remarkably similar to the results from the second 300, it increases our confidence that if we polled 300 more, or even 3,000 more, the results would still be the same. That is the operational definition of reliability.

EMR's Reading Market survey, in its printed/mailed form, was replicated seven times between 1999 and its latest administration in the Fall of 2011. Based on a review of core questions embedded in each of those surveys, it is clear that the test-retest reliability of this survey is extremely high. That is, results from year to year are strikingly similar in those areas where change would not be expected, such as educators' average years of teaching experience, the importance of Reading program alignment to standards, and criteria driving buying decisions.
To the point of split-half reliability or consistency, results obtained from the first half of the educators responding to EMR's 2011 survey were compared to results obtained from the second half of those responding to the same postal survey. The following are some sample comparisons.

How Many Years Of Teaching Experience Do You Have? (Postal Survey)

RESPONSE           FIRST HALF   SECOND HALF   DIFFERENCE
Over 20 years      40.1%        41.1%         -1.0%
11-20 years        36.7%        37.2%         -0.5%
6-10 years         16.2%        15.5%         0.7%
3-5 years          5.4%         5.6%          -0.2%
1-2 years          1.0%         0.3%          0.7%
Less than 1 year   0.7%         0.3%          0.4%
AVERAGE            17.0 YEARS   17.3 YEARS    -0.3 YEARS

The differences range from a low of 0.2% to a high of 1.0%. How much of a difference should be flagged as meaningful? With a survey of this sample size we would usually consider a difference of 3% to 5% or more to be potentially meaningful. Using that criterion, none of the differences above is meaningful. In other words, you can operate as if the average tenure of teachers in the field is 17.0 years or 17.3 years; there is no statistical difference between the two.

Additional examples of highly consistent results on EMR's Reading survey are as follows.

If You Had It To Do Over, Would You Choose The Same Core Program Again? (Postal Survey)

RESPONSE   FIRST HALF   SECOND HALF   DIFFERENCE
Yes        70.2%        69.7%         0.5%
No         29.8%        30.3%         -0.5%
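The split-half check on the teaching-experience question can be expressed as a small script: compute the first-half minus second-half gap for each response option and flag anything at or above the 3-percentage-point floor the report treats as potentially meaningful. (The "11-20 years" label for the middle band is an assumption; it is the band the listed categories leave unaccounted for.)

```python
# Percentages from the "years of teaching experience" split-half table.
first_half = {"Over 20 years": 40.1, "11-20 years": 36.7, "6-10 years": 16.2,
              "3-5 years": 5.4, "1-2 years": 1.0, "Less than 1 year": 0.7}
second_half = {"Over 20 years": 41.1, "11-20 years": 37.2, "6-10 years": 15.5,
               "3-5 years": 5.6, "1-2 years": 0.3, "Less than 1 year": 0.3}

THRESHOLD = 3.0  # percentage points; the report's lower bound for a meaningful gap

flagged = {k: round(first_half[k] - second_half[k], 1)
           for k in first_half
           if abs(first_half[k] - second_half[k]) >= THRESHOLD}
# flagged stays empty: no split-half difference reaches the threshold
```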
How Important Is It That Your Reading Program Is Aligned To Common Core Standards? (Postal Survey)

RESPONSE                      FIRST HALF   SECOND HALF   DIFFERENCE
Very important                82.7%        83.9%         -1.2%
Somewhat important            14.3%        15.1%         -0.8%
Not important                 1.0%         0.7%          0.3%
Not familiar with standards   2.0%         0.3%          1.7%

Apart From Fit With Students' Needs, What Are The Most Important Criteria Driving Your Buying Decision? (Postal Survey)

RESPONSE                           FIRST HALF   SECOND HALF   DIFFERENCE
Proof it works                     85.1%        84.2%         0.9%
Price                              42.0%        43.5%         -1.5%
Peer recommendations               25.3%        22.9%         2.4%
Reputation of brand or publisher   16.3%        16.1%         0.2%
Positive reviews/awards            11.1%        14.7%         -3.6%
Online/digital delivery            10.1%        7.2%          2.9%
Other                              6.6%         7.2%          -0.6%

Based on these question-by-question comparisons, we would say that the postal survey is highly reliable, which means the results can and should be trusted.

On the online survey side of the fence, because of the way the survey data was originally coded, it was not possible to re-run the question-by-question results comparing the first half to the second half, and thus to obtain a direct measure of split-half reliability. However, since the postal survey data proved to be highly reliable, it was possible to compare some of the key data from the postal survey to the same data collected with the online survey. Presumably, if the postal results are reliable, and the online results match closely with the postal results, then the online results should also be considered reliable. A comparison of the years of teaching experience question, postal survey vs. online survey, follows.
How Many Years Of Teaching Experience Do You Have? (Postal vs. Online)

RESPONSE        POSTAL   ONLINE   DIFFERENCE
Over 20 years   40.6%    40.8%    -0.2%
11-20 years     36.9%    35.5%    1.4%
6-10 years      15.8%    17.4%    -1.6%
3-5 years       5.5%     5.6%     -0.1%
1-2 years       0.7%     0.7%     0.0%

On this standard question the postal and online results match up almost perfectly. Thus if the postal results for this question are reliable, the online results are similarly reliable. Additional question-by-question comparisons follow.

If You Had It To Do Over, Would You Choose The Same Core Program Again? (Postal vs. Online)

RESPONSE   POSTAL   ONLINE   DIFFERENCE
Yes        69.9%    59.1%    10.8%
No         30.1%    40.9%    -10.8%

How Important Is It That Your Reading Program Is Aligned To Common Core Standards? (Postal vs. Online)

RESPONSE                      POSTAL   ONLINE   DIFFERENCE
Very important                83.3%    87.4%    -4.1%
Somewhat important            14.7%    10.2%    4.5%
Not important                 0.8%     0.3%     0.5%
Not familiar with standards   1.2%     2.1%     -0.9%
Apart From Fit With Students' Needs, What Are The Most Important Criteria Driving Your Buying Decision? (Postal vs. Online)

RESPONSE                           POSTAL   ONLINE   DIFFERENCE
Proof it works                     84.7%    80.7%    4.0%
Price                              42.8%    43.5%    -0.7%
Peer recommendations               24.1%    24.3%    -0.2%
Reputation of brand or publisher   16.2%    16.9%    -0.7%
Positive reviews/awards            12.9%    20.8%    -7.9%
Online/digital delivery            8.6%     6.8%     1.8%
Other                              6.9%     8.5%     -1.6%

While the first comparison yielded evidence of equal reliability for the two survey modes, the second, third, and fourth comparisons showed significant differences. How satisfied are educators with their currently adopted Reading programs? The postal survey (69.9%) indicates they are, on average, very satisfied, but the online survey (59.1%) says not so much. Another example is the motivating power of positive reviews/awards when it comes to influencing purchasing decisions. The postal survey (12.9%) indicates that, on average, positive reviews/awards are not very influential, but the online survey (20.8%) says they are significantly more influential.

Of course, the two surveys reflect the opinions of two separate sets of respondents, so shouldn't we expect and tolerate differences between the two? The answer is a resounding no! In both cases the sample is intended to fairly and accurately represent the universe of elementary classroom teachers, so we have every right to expect the results to be, effectively, the same. Does it make a difference which number we trust? It certainly could, which is why we strive for reliable results. The answer to the question "How reliable does a survey need to be?" is: as reliable as possible. More research should be done in this area, but the preliminary evidence at hand suggests that the postal survey produces more reliable results than the online version of the same survey. How much this has to do with the typically higher response rates on postal surveys is not known, but it appears to contribute to the contrast.
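The 69.9% vs. 59.1% gap is judged informally against the 3%-5% rule of thumb. A standard two-proportion z-test, sketched here as a complementary check (it is not part of the report's own method; the usable-response counts of 605 and 674 are used as the sample sizes), points to the same conclusion:

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """z statistic for the difference between two independent sample proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# "Would you choose the same core program again?" - Yes, postal vs. online:
z = two_prop_z(0.699, 605, 0.591, 674)
# z is about 4.0; anything beyond +/-1.96 is significant at the 5% level,
# so the postal/online gap on this question is very unlikely to be chance.
```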
So the postal survey gets the check mark on the reliability score.
DISCUSSION

Need empirical data from the field to confirm (or disconfirm) what you believe to be true? Why not do a quick, cost-effective online survey? The lesson here is: look before you leap. How you do that survey should depend on the importance to you of cost, speed, response rate, number of questions and, ultimately, reliability of survey results. Postal and online surveys have different profiles with respect to each of those critical variables.

The purpose of the experiment described herein was to compare the relative merits of the postal and online survey approaches. Some might suggest that, on the face of it, the printing/mailing approach is the slow, expensive, old-fashioned way, while the online approach is the fast, inexpensive, modern way to conduct survey research. And it is hard to argue the issue of speed, but what about a cost comparison? And what about response rates, and the closely related issue of reliability of the data? Is there a trade-off to be concerned about with regard to speed vs. response rates and reliability?

In order to provide information on current trends in the elementary Reading market segment, EMR designed a detailed survey, and then mailed it on October 17, 2011 to 18,000 randomly selected educators segmented as follows.

Sampling Matrix

JOB TITLE               ELEMENTARY   MIDDLE/JHS   TOTAL
Classroom teacher       6,000
Reading teacher         3,000        3,000        6,000
Curriculum supervisor
TOTAL                   9,000        3,000        18,000

The survey consisted of four printed pages with a total of 30 numbered questions, of which 26 were multiple choice and 4 required a write-in response. Of the 26 multiple choice questions, 6 offered the option of writing in an "other" response. Three weeks later, on November 8, 2011, the same 30-question Reading Market survey was replicated using the online method.
This survey contained identical questions, but due to the formatting necessary for on-screen viewing, the length expanded to 16 pages or screens, compared to 4 pages for the postal version. The invitation to participate in the survey was e-mailed to 90,000 randomly selected elementary teachers (excluding the 18,000 chosen for the postal survey).
Comparing the actual cost, speed, response rates, and data reliability of the two survey modes yielded the following profile of pros and cons.

Relative Merits Of Postal vs. Online Surveys

CRITERION                     POSTAL                                             ONLINE
Cost                          $23.14 per usable response (9% less than online)   $25.22 per usable response
Speed                         85% of responses within 2 weeks                    85% of responses in less than 1 day
Response rate                 4.4% (6 times higher than online)                  0.75%
Optimum number of questions   30 questions or more                               15 questions is too many
Reliability                   High split-half and test-retest reliability        Uncertain; needs more research

In this specific test, the postal mode outpaced the online mode on four of the five key criteria, with the online mode superior only in terms of speed. If more than 10 questions are needed to get the job done, the postal mode would be the only prudent option. Given that the online mode is faster, but not cheaper, than the postal mode, it must be emphasized that there is evidently a significant trade-off to weigh when the online mode is used: higher speed in exchange for lower response rates and reliability.

Since reliability is the ultimate criterion of success for any survey, an online survey may not always be the right way to go, particularly now that the process of designing and implementing online surveys has become a do-it-yourself activity. If we suppose that technology will continue to evolve, and that as a consequence the cost of online surveys will inevitably drop below the cost of postal surveys, making online superior both in terms of cost and speed, will that rule out the use of postal surveys in the future? Putting the all-important issue of reliability aside for the moment, the answer should still be no, because postal surveys, when properly done, allow for a richness of data analysis that is not possible when the limit is ten questions.
To get the sharpest possible portrait of the target market, it is necessary to look both at total responses and at many cross-tabulations of the data, elucidating important differences in responses by job title, grade level, years of experience, size of district, type of district, state, geographic region, and many other key demographic variables. To accomplish that, questions covering those demographics must be added to the survey. If that higher level of analysis is the goal, many more than ten questions will be required.
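The cross-tabulation described above is simple to compute once the demographic questions are on the survey. A minimal Python sketch follows; the respondent records and field names here are invented purely for illustration and are not taken from the actual EMR questionnaire.

```python
# Hypothetical respondent records; field names and answers are invented
# for illustration, not taken from the EMR survey instrument.
from collections import Counter

responses = [
    {"job_title": "Teacher",   "grade_level": "3-5", "uses_product": "yes"},
    {"job_title": "Teacher",   "grade_level": "6-8", "uses_product": "no"},
    {"job_title": "Principal", "grade_level": "3-5", "uses_product": "yes"},
    {"job_title": "Teacher",   "grade_level": "3-5", "uses_product": "yes"},
    {"job_title": "Principal", "grade_level": "6-8", "uses_product": "no"},
]

# Cross-tabulate answers by job title: the kind of breakdown that requires
# the demographic question ("job_title") to be on the survey in the first place.
crosstab = Counter((r["job_title"], r["uses_product"]) for r in responses)

print(crosstab[("Teacher", "yes")])    # teachers answering "yes"
print(crosstab[("Principal", "no")])   # Principals answering "no"
```

The same counting pattern extends to any pair (or triple) of demographic variables, which is why each additional cut of the data tends to add a question to the instrument.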
Special Limitation Of Online Surveys in the K-12 Market

Some of EMR's most interesting and useful survey results have come from comparing the responses to the same questions made by classroom teachers, Principals, and various district-level personnel. Teachers tend to be realists about what is or is not happening in their classrooms, while Principals tend to be cheerleaders who may exaggerate the positives and minimize the negatives. Knowing those differences should lead product developers and marketers to adopt different strategies for different sub-groups within the population of educators. To make those types of comparisons, EMR's rule of thumb is to have between 100 and 200 responses to analyze within each sub-group, such as teachers or Principals. For example, if the plan is to look at teachers in grades Pre-K - 2, grades 3-5, and grades 6-8, as well as elementary and middle/junior high school Principals, a minimum of 100 responses (ideally 150 responses) in each of those cross-sections of job title and grade level would be required to perform a meaningful analysis. Assuming a 5% response rate for a typical postal survey of educators, 3,000 of each sub-group in the initial sample would likely ensure the desired number of responses. So EMR's sampling matrix would be as follows.

Hypothetical Sampling Matrix - Postal Survey

JOB TITLE                      GRADES PRE-K - 2   GRADES 3-5   GRADES 6-8   TOTAL
Classroom or subject teacher   3,000              3,000        3,000        9,000
Elementary Principal                                                        3,000
Middle/Junior High Principal                                                3,000
TOTAL                                                                       15,000

Those starting numbers are likely to yield the desired 150 responses per cell within this matrix when the postal mode is employed. On the online survey side, we could assume a 1% response rate (the mid-point of the expected range of online survey response rates), so the following starting numbers would be needed to achieve the same results.
Hypothetical Sampling Matrix - Online Survey

JOB TITLE                      GRADES PRE-K - 2   GRADES 3-5   GRADES 6-8   TOTAL
Classroom or subject teacher   15,000             15,000       15,000       45,000
Elementary Principal                                                        15,000
Middle/Junior High Principal                                                15,000
TOTAL                                                                       75,000

As previously discussed, since the typical postal survey delivers five or six times the response rate of the typical online survey, it takes five or six times as many names in the initial online survey sample to ensure the same number of total responses as the postal survey obtains. Putting aside the increased cost of a survey starting with 75,000 names rather than 15,000, there is another problem: there are barely enough middle/junior high Principals in the universe of K-12 educators from which to pull an initial sample of 15,000. [There are only around 14,000 middle and junior high schools in the entire U.S. public school market.] The situation is especially problematic at the district level, where such key job titles as federal program directors, Bilingual/ELL directors, technology coordinators, special education directors, and K-12 curriculum/instruction directors all number far fewer than 15,000. To illustrate, there are approximately 7,500 Bilingual/ELL directors in total. Thus an online survey of such directors, even one including every single director in the initial sample, would at a 1% response rate be likely to yield only 75 total responses (7,500 x 1%). Is that number of responses high enough to produce reliable data? Should critical business decisions be made based on such limited results? Probably not. That being the case, if 150 or more responses from Bilingual/ELL directors is the goal, a postal survey would be the better choice. In summary, reliability is always the most important consideration when thinking about a market survey. An unreliable set of results is far worse than no empirical data at all.
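The sampling arithmetic above reduces to two steps: divide the desired number of responses by the expected response rate to get the required starting sample, then cap that sample at the size of the universe of people holding the job title, which in turn caps the expected yield. A minimal Python sketch using the report's own figures (function names are invented for illustration):

```python
import math

def starting_sample(desired_responses: int, response_rate: float) -> int:
    """Names needed in the initial sample to expect a given number of responses."""
    return math.ceil(desired_responses / response_rate)

def expected_yield(universe: int, sample: int, response_rate: float) -> int:
    """Expected responses once the sample is capped at the universe size."""
    return round(min(sample, universe) * response_rate)

# Postal survey: 150 responses per cell at a 5% response rate needs 3,000 names.
print(starting_sample(150, 0.05))            # 3000

# Online survey: the same target at a 1% rate needs 15,000 names per cell.
print(starting_sample(150, 0.01))            # 15000

# Bilingual/ELL directors: a universe of ~7,500 caps the online yield at ~75,
# even if every single director is included in the initial sample.
print(expected_yield(7_500, 15_000, 0.01))   # 75
```

The cap in `expected_yield` is the crux of the report's argument: for small job-title universes, no amount of additional sampling can push an online survey past the reliability threshold of 100-200 responses.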
Trusting false positives or false negatives could lead to terrible financial consequences. If you are talking to a survey provider, always ask about sample size and the minimum number of responses needed for sound results. Ask about average response rates and about the related issue of reliability, including how it will be measured. That is how you will know whether your results can be trusted. Save the questions about speed and cost for last, where they really belong.