A COMPARISON OF ELECTRONIC SURVEYING BY E-MAIL AND WEB




Catherine E. Brawner, Richard M. Felder, Rodney H. Allen, Rebecca Brent, and Thomas K. Miller
Research Triangle Educational Consultants / COMP-AID / North Carolina State University

Proceedings of the 2001 Annual ASEE Conference, ASEE, June 2001.

Abstract

In recent years the scholarship of teaching has gained increasing recognition in engineering education as a legitimate and valuable faculty activity. Growing numbers of faculty members engaged in educational research have been using surveys as principal components of their assessment programs. These researchers quickly discover that using individual interviews or paper forms to get responses is extremely time-consuming and often prohibitively expensive, and they turn instead to electronic surveys. The two main vehicles for such surveys are e-mail and World Wide Web-based forms. Web surveys are attractive since they allow for automatic tabulation and analysis of responses, but there is a concern that the additional effort they require of respondents could lead to a severe reduction in response rate. The study reported here was designed to examine the legitimacy of that concern. Engineering professors at two SUCCEED (Southeastern University and College Coalition for Engineering EDucation) campuses were surveyed regarding their use of various teaching techniques and their perceptions of the importance of teaching quality and innovation to their colleagues and administrators. The 361 faculty members surveyed were randomly assigned to fill out identical surveys using either the Web or e-mail. Those who were asked to respond via e-mail were much more likely to return the survey (29% vs. 16%), and full professors in particular were extremely unlikely to use the Web. There were few significant differences in the responses based on the survey method. Possible explanations of these results are proposed and their implications for survey research are explored.

I. Introduction

With the advent of the widespread use of the Internet has come the ability to field surveys to many people at relatively low cost compared with the cost of fielding paper versions of the same survey to the same population. Electronic surveys can be sent to many people for little marginal cost, and data entry can be automated to save time and eliminate errors. The two methods of using the Internet as a survey mechanism are electronic mail (e-mail) and the World Wide Web (the Web). With e-mail, researchers can send surveys to an e-mail address as text messages, which the recipient can then read, save, respond to, or throw away, much like a paper survey. Surveys can also be posted on the Web and may include text, pictures, and forms to be filled in by the respondent.

According to Galin,1 the primary difference between these two response modes is that e-mail is a "push" technology while the Web is a "pull" technology. That is, with e-mail, sent messages are automatically received in the potential respondent's mailbox, whereas respondents must be attracted in some way to a Web page. Because of this difference, one might expect a higher response rate to an e-mail survey than to a Web survey. The experiment described in this paper was designed to test this hypothesis and also to determine whether there were significant differences in the responses submitted by each mode.

The literature on surveys suggests that one of the biggest drawbacks of using the Internet is that the population with e-mail and Web access is limited to certain demographic groups2,3 and that "the validity of [Web] survey research is likely to be strongest for research domains that target specific populations."4 The population in our study was the engineering faculty at the eight colleges that make up SUCCEED (Southeastern University and College Coalition for Engineering EDucation), a National Science Foundation-sponsored consortium. The faculty members were known to have e-mail and Web facilities available to them and were assumed to be technically competent in their use.

The subject matter of the survey was faculty development. Survey questions related to the respondents' use of various teaching techniques and their perceptions of the importance of teaching quality and innovation to their colleagues and administrators. The results of the survey have been reported elsewhere.5-7

II. Experimental Methodology

Engineering faculty members at two of the eight colleges in SUCCEED in the Fall of 1997 constituted the experimental population. They were divided evenly into a Web group, which was asked to respond to our survey via the Web, and an e-mail group, which was asked to respond via e-mail. All 361 engineering faculty members at the two schools who had e-mail addresses were randomly assigned to either the Web or e-mail group, so that half of the faculty members from each school were in each group. One school was one of the largest in the coalition and the other was one of the smallest. Faculty members in both groups received an e-mail message from a respected person on campus (in one case it was the dean; in the other it was the faculty development coordinator) explaining the purpose of the survey and requesting that they reply. The survey itself was part of this message for the e-mail group, while instructions for accessing the Web survey were part of the message for the Web group. Faculty members were assured that their responses would go to an independent researcher and that no one on their campus would have access to individual responses. Non-responders in both groups were sent the e-mail survey as a follow-up three months later as part of the general follow-up to all eight campuses.

The e-mail group was given instructions to reply to the message by putting Xs in the brackets next to their selected responses to each question (or in some cases, to insert open-ended responses to questions). Their completed surveys were returned to the independent researcher. The instructions for the Web group were somewhat more complex. In the e-mail message from the respected person, they were given a Personal Identification Number (PIN) and a Web site URL to make note of.
The use of the PIN allowed the research team to determine who had responded and to avoid some of the pitfalls described by Schmidt,8 including multiple responses from one individual, malicious users, and uninvited respondents. The Web address was hyperlinked so that if their e-mail program supported it, users could click on the link and go directly to the Web site; otherwise they would need to make a note of the URL, open their browser (e.g., Netscape or Internet Explorer), and go to the Web site to fill out the questionnaire. Respondents were given the option of requesting an e-mail version of the survey from the researcher. (One person did this but did not return the e-mail survey.)

The questions on the Web-based survey were identical to those in the e-mail version, but the method for replying was slightly different. To answer a question with many possible responses (e.g., what is your department?), respondents clicked on a pull-down menu and made their selection, and for questions with only a few possible responses (e.g., what is your sex?) they clicked on a radio button to the left of their desired answer. Space was provided for short-answer responses (e.g., "other, specify"). Aside from the drop-down boxes and radio buttons, the Web survey layout was relatively simple: there was no color, no JavaScript, nor anything else that might distract the user or increase the time required for the survey to download to the respondent's computer, all features of surveys that have been shown to decrease response rates.9 After filling in the survey, respondents were directed to enter their PIN and to click "submit."

III. Results

Table 1 displays the responses by survey type and institution. In all, 81 people responded to the surveys in this experiment ("1st E-mail" and "Web") and an additional 47 responded to the follow-up survey ("2nd E-mail"). The responses shown are percentages of the e-mail addresses to which each type of survey was sent. Those who responded to the first survey were not sent the second except for a few individuals with multiple e-mail addresses.

Table 1: Survey responses by survey type and institution

                          1st E-mail      Web        2nd E-mail      Total
Institution   Total N      n    %        n    %        n    %        n    %
A               289       38   26       23   16       38   17       99   34
B                72       14   39        6   17        9   17       29   40
Total           361       52   29       29   16       47   17      128   35
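The 29% vs. 16% first-round gap in Table 1 can be sanity-checked with a standard contingency-table test. The sketch below is illustrative only: the per-group denominators are not reported in the paper and are inferred here from the even split of the 361 faculty members (roughly 180 per group), and scipy is assumed to be available.

```python
# Illustrative check of the first-round response-rate gap in Table 1.
# The per-group denominators are NOT reported in the paper; they are
# inferred here from the even split of the 361 faculty members
# (~180 in the e-mail group, ~181 in the Web group).
from scipy.stats import chi2_contingency

email_sent, email_returned = 180, 52   # assumed denominator, reported count
web_sent, web_returned = 181, 29       # assumed denominator, reported count

table = [
    [email_returned, email_sent - email_returned],  # e-mail: responded / did not
    [web_returned, web_sent - web_returned],        # Web: responded / did not
]
chi2, p, dof, _ = chi2_contingency(table)
print(f"e-mail rate = {email_returned / email_sent:.0%}, "
      f"Web rate = {web_returned / web_sent:.0%}")
print(f"chi2({dof}) = {chi2:.1f}, p = {p:.3f}")
# Under these assumed denominators, the 29% vs. 16% gap is statistically
# significant (p < .01), consistent with the comparison reported below.
```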

In addition to the different response rates between the first e-mail group and the Web group, there were two significant differences between the response demographics of the two groups. As shown in Table 2, full professors were extremely unlikely to respond via the Web compared with assistant and associate professors (χ²(2, N = 73) = 12.2, p = .002). Table 3 shows that those who responded via the Web were also much less likely to have heard about SUCCEED or have been involved in it in some way than were those who responded by e-mail (χ²(4, N = 79) = 25.7, p < .001). However, there were no significant differences between the groups based on primary job responsibility (teaching, teaching/research, or administration), length of service, or department.

Table 2: Survey responses by survey type and rank

                          1st E-mail        Web
Rank                        N     %        N     %
Assistant Professor        13    26        6    27
Associate Professor        16    31       15    68
Professor                  22    43        1     5
Total                      51   100       22   100

Table 3: Involvement in SUCCEED programs by survey type

                                                                  1st E-mail        Web
                                                                    n     %        n     %
Don't know anything about the SUCCEED coalition                     3     6       15    54
Heard of the Coalition but haven't been involved with it           24    47        9    32
Attended a Coalition program but have not actively participated     9    18        0     0
Been actively involved in a Coalition project                      11    22        3    11
Been a Coalition project leader                                     4     8        1     4
Total                                                               51   100       28   100

The survey contained 32 questions, broken down as follows: two questions about the respondent's prior attendance and involvement in teaching improvement programs; eighteen questions about the respondent's use of various teaching techniques (e.g., lecture for all of a class period, use active learning exercises in class, assign a major team project, use the Web to communicate with students, solicit feedback from students); five questions about the availability of teaching improvement resources on campus and the respondent's use of those resources; and seven questions about the respondent's perception of the importance of teaching quality and innovation to himself/herself and others at his/her university.

Considering the differences between the respondents in the two groups and the lower response rate to the Web survey, one might expect that the responses of the two groups would differ; however, only two of the 32 questions showed significant differences between the Web and e-mail response groups. The Web group gave a higher rating to the importance of teaching quality to their colleagues (M = 7.89, SD = 1.41 on a scale of 0-10) than did the e-mail group (M = 7.06, SD = 2.16) (t(78) = 2.02, p = .047), but the ratings were the same for all other groups (self, department chair, dean, university president) and for the importance of teaching quality and innovation in the institutional rewards system. The Web group was also more likely to solicit feedback from their students more than once a semester (i.e., at some time other than the traditional end-of-course survey) than was the e-mail group (χ²(4, N = 80) = 36.9, p < .001) but was otherwise statistically indistinguishable in the frequency of use of all other teaching techniques (e.g., lecture, demonstrations, use of the computer, writing assignments).
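For readers who want to verify the reported statistics, the rank-by-mode chi-square in the text can be reproduced directly from the Table 2 counts. This is a minimal sketch assuming scipy is available; the authors' original analysis software is not identified in the paper.

```python
# Minimal sketch: reproducing the rank-by-mode chi-square test from the
# Table 2 counts (scipy is assumed to be available).
from scipy.stats import chi2_contingency

# Rows: Assistant, Associate, Full Professor; columns: 1st E-mail, Web
counts = [
    [13, 6],
    [16, 15],
    [22, 1],
]
chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2({dof}, N = {sum(map(sum, counts))}) = {chi2:.1f}, p = {p:.3f}")
# Output: chi2(2, N = 73) = 12.2, p = 0.002, matching the value reported above.
```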

IV. Discussion

Our research questions were (1) whether people would be more or less likely to respond to a survey and (2) whether their responses would differ, based on whether they were asked to respond by e-mail or by the Web. Obviously the answers to these questions will depend to a great extent on the nature of the survey, but for this study the short answer to the first question is yes and that to the second question is a qualified no. Significantly more people were willing to respond by e-mail than by the Web. This is the primary reason that we chose to do our follow-up survey (and the main survey to the other six SUCCEED campuses) exclusively by e-mail.

Researchers who want to know which method is better to use for their own data collection may be guided by some of the underlying factors that may have affected this study. First, when considering whether to use a particular mode (e-mail or the Web) to conduct an electronic survey, researchers should consider the target population's access to that mode and its normal frequency of use. In our study, the population at large was known to have access to both e-mail and the Web, and we had e-mail addresses available for most of the population, which made us confident that they could be reached in this manner. E-mail is a common mode of communication among engineering professors within their own campuses, so its use was a part of the expected everyday activities of our population. However, even with the explosive growth in the accessibility of the World Wide Web, many people are still not regular or comfortable users of it. We surmise that the latter factor accounts in large part for the greater response rate to the e-mail survey.

Second, we assumed that since the survey was mainly about a topic unrelated to technology, those who chose to respond would be as representative of the population at large as respondents to a paper questionnaire. Although we have no way of verifying it, we believe that this assumption is warranted for most of the items on our survey; it is highly questionable, however, for the items that related to frequency of use of technology in classroom instruction.

Familiarity (or lack of it) with a particular mode may account for the relative disinclination of full professors to reply via the Web. While e-mail has been in place for use by university faculty for at least three decades, the Web has only been easily accessible since the mid-1990s and may not yet be a part of daily activities, even among engineering professors. Therefore, researchers who need to be able to distinguish respondents by faculty rank may not want to use the Web as the survey tool. More research may be warranted to clarify this matter.

Other areas that researchers might need to consider are costs and data processing considerations. Because we worked in a research university environment, we had the facilities available for publishing the survey on the Web and for sending the e-mails out to our survey population at little marginal cost. We also wrote programs for both the Web and e-mail surveys that eliminated the need to enter the survey responses manually. Although the time spent writing these programs for one survey was relatively costly, the programs can be used again and again. Software for creating Web forms is available to the public at a reasonable cost, and Internet service providers will host Web sites, also at a reasonable cost.
However, the cost of developing or purchasing the technical expertise to develop a good Web site should not be discounted.
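The data-entry programs mentioned above are not included in the paper. As a rough illustration of what automatic tabulation of the e-mail replies might involve, the sketch below parses a returned message in which the respondent has placed Xs in brackets; the question layout, labels, and the truncation of non-integer ratings (discussed in the Conclusion) are simplified assumptions, not the authors' actual code.

```python
# Minimal sketch (not the authors' program) of auto-tabulating a returned
# e-mail survey in which respondents mark "[X]" next to their choices.
# The question layout and labels below are hypothetical.
import re

SAMPLE_REPLY = """
Q1. How often do you lecture for all of a class period?
[ ] Never   [X] Sometimes   [ ] Usually   [ ] Always

Q2. How important is teaching quality to you (0-10)?
Rating: 8.2
"""

def parse_reply(text: str) -> dict:
    """Collect the checked option for each question and any numeric ratings."""
    answers = {}
    current_q = None
    for line in text.splitlines():
        q = re.match(r"(Q\d+)\.", line)
        if q:
            current_q = q.group(1)
            continue
        checked = re.findall(r"\[[Xx]\]\s*(\w+)", line)
        if checked and current_q:
            answers[current_q] = checked[0]
        rating = re.match(r"Rating:\s*([\d.]+)", line.strip())
        if rating and current_q:
            # E-mail cannot prevent out-of-range or non-integer replies
            # (e.g., 8.2), so they are truncated here, as noted in the Conclusion.
            answers[current_q] = int(float(rating.group(1)))
    return answers

print(parse_reply(SAMPLE_REPLY))   # {'Q1': 'Sometimes', 'Q2': 8}
```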

V. Conclusion

The Web offers a number of advantages for survey administration. The survey site can be programmed to provide real-time updates on response frequency and other statistics and can make those statistics available to the survey administrator and, if desired, the respondent. In addition, respondents can be prevented from giving answers that are out of range or otherwise undesirable. For instance, when asked to rate the importance of teaching quality on a scale from 0 to 10, more than one respondent to the e-mail survey gave a non-integer response such as 8.2. (We decided to truncate.) On the other hand, richer information can potentially be received by e-mail, and survey designers can get better feedback about the quality of their questions because respondents can and do use blank space in the survey to explain their answers or give other feedback.

In the end, the complexity of preparation of and access to the Web survey and the superior response rate to the e-mail survey made the latter the method of choice for us, and we were pleased to find that there was little difference in the answers to our questions between the e-mail and Web groups. Others who have different cost structures, a target audience that is more easily pulled to a Web site, or a need to keep people informed in real time about the progress of the survey may well determine that a Web survey is better for them. Technology improves, and people's access to and use of technology improves. Today we choose e-mail surveys. A decade ago we may have chosen to mail a paper survey. In a few years we may choose Web surveys.

References

1. Galin, M. "Collecting data through electronic means: A new chapter in the evolution of survey methodology?" Paper presented at the American Evaluation Association Annual Conference, Chicago, November 1998.
2. Thatch, L. "Using electronic mail to conduct survey research." Educational Technology (March-April 1995): 27-31.
3. Selwyn, N., and Robson, K. "Using e-mail as a research tool." Social Research Update 21 (Summer 1998). URL: http://soc.surrey.ac.uk/sru/sru21.html
4. Schmidt, W. "World-Wide Web survey research: Benefits, potential problems, and solutions." Behavior Research Methods, Instruments and Computers 29(2) (1997): 274-279.
5. Felder, R., Brent, R., Brawner, C., Allen, R., and Miller, T. 1997-1998 Faculty Survey of Teaching Practices and Perceptions of Institutional Attitudes Toward Teaching. Raleigh, NC: SUCCEED, 1999. Available: ERIC, ED 428 607.
6. Brent, R., Felder, R., Brawner, C., Miller, T., and Allen, R. "Faculty teaching practices and perceptions of institutional support for teaching at eight engineering schools." Proceedings of the 1998 Frontiers in Education Annual Conference, Tucson, AZ, November 1998.
7. Brawner, C., Felder, R., Brent, R., Miller, T., and Allen, R. "Faculty teaching practices in an engineering education coalition." Proceedings of the 1999 Frontiers in Education Annual Conference, San Juan, PR, November 1999.
8. Schmidt, W., Ref. 4, p. 277.
9. Dillman, D., Tortora, R., Conradt, J., and Bowker, D. "Influence of plain vs. fancy design on response rates for Web surveys." Proceedings of the Survey Methods Section, 1998 American Statistical Association Annual Meetings, Dallas, TX, August 1998.

CATHERINE E. BRAWNER
Catherine E. Brawner is president of Research Triangle Educational Consultants. She specializes in evaluation of distance education, educational innovation, and technology use in the classroom. She is the principal evaluator of the SUCCEED Coalition.

RICHARD FELDER
Richard Felder is Hoechst Celanese Professor Emeritus of Chemical Engineering at North Carolina State University and Faculty Development Codirector of the NSF-sponsored SUCCEED Coalition. He is a Fellow Member of the ASEE and codirector of the National Effective Teaching Institute.

RODNEY H. ALLEN
Rod Allen is a research scientist and independent computer consultant. His company, COMP-AID, specializes in innovative applications, crisis consulting, e-commerce, Internet, computer-aided design, computer graphics, efficient computing, and teaching computer courses, short courses, and seminars.

REBECCA BRENT
Rebecca Brent is an educational consultant on the staff of the College of Engineering at North Carolina State University, Faculty Development Codirector of the SUCCEED Coalition, Adjunct Professor of Education at East Carolina University, and codirector of the National Effective Teaching Institute.

THOMAS K. MILLER
Thomas K. Miller, III, is the Interim Vice Provost for Distance Education and Learning Technologies, Associate Dean in the College of Engineering, and Professor of Electrical and Computer Engineering at North Carolina State University. He is a member of the Academy of Outstanding Teachers at North Carolina State University and a recipient of the 1995 Joseph M. Biedenbach Outstanding Engineering Educator award from the IEEE.