Research in Higher Education, Vol. 44, No. 1, February 2003 (© 2003)

COLLEGE STUDENT RESPONSES TO WEB AND PAPER SURVEYS: Does Mode Matter?+

Robert M. Carini,*,** John C. Hayek,* George D. Kuh,* John M. Kennedy, and Judith A. Ouimet#

We examined the responses of 58,288 college students to 8 scales involving 53 items from the National Survey of Student Engagement (NSSE) to gauge whether individuals respond differently to surveys administered via the Web and paper. Multivariate regression analyses indicated that mode effects were generally small. However, students who completed the Web-based survey responded more favorably than paper respondents on all 8 scales. These patterns generally held for both women and men, and younger and older students. Interestingly, the largest effect was found for a scale of items involving computing and information technology.

KEY WORDS: survey mode; student engagement; Web survey; NSSE.

+The research reported in this article was conducted with support from The Pew Charitable Trusts. However, the views expressed in this paper are solely those of the authors, not The Pew Trusts.

INTRODUCTION

The dramatic surge in online computing over the past decade promises to have far-reaching social and educational consequences. Electronic technology, especially the Web, is no longer the province of the highly educated elite. In 2000, some 55 million Americans went online each day (Pew Internet and American Life, 2000). Various forms of computing and information technology are practically universal on college and university campuses. For instance, 86% of college students have gone online, as compared with 59% of the population overall (Jones, 2002). In 2000, 59% of all college courses were using electronic mail, up from 44% in 1998 and 20% in 1995 (Green, 2001). In addition, the
*National Survey of Student Engagement, Indiana University Bloomington. Center for Survey Research, Indiana University Bloomington. #Community College Survey of Student Engagement, University of Texas at Austin. **Address correspondence to: Robert Carini, National Survey of Student Engagement, Indiana University, Center for Postsecondary Research and Planning, Ashton Aley Hall Suite 102, 1913 East Seventh Street, Bloomington, IN.
percentage of college courses that rely on Web-based resources has increased almost fourfold, from 11% in 1995 to 43% in 2000 (Green, 2001). In addition to becoming a widely used educational tool, the Web is gaining popularity as a vehicle to collect survey information. In part, this is due to the allure the Web holds as a cost-effective method to enhance response rates, especially among computer-savvy respondents such as college students, the vast majority of whom have Internet access either at home or through a college or university account. Moreover, some research suggests that individuals have different mode preferences (Groves and Kahn, 1979), and a mixed-mode approach (offering the option of completing a survey on paper or via the Web) might induce more people to respond than any single-mode approach. Finally, Web technology can overcome some of the privacy protection barriers that make it difficult to contact potential respondents using traditional survey administration methods. For example, caller identification and other technology allow people to screen out unwanted telephone interviews (Dillman, 2000). Taken together, these factors may well prompt increased use of Web or mixed-mode survey approaches. However, little is known about how people perceive and respond to Web-based surveys (Couper, 2000; Dillman, 2000). For example, do students respond differently to Web surveys than they do to traditional paper surveys? If so, are their responses to Web surveys more or less favorable than responses to paper surveys? It is important to know the answers to these and related questions. What may appear from survey results to be changing student behaviors or attitudes that suggest modifications in policies and practices might be a function of mode effects, not actual changes in the student experience. The few research studies comparing Web vs.
paper modes on student surveys report somewhat mixed findings. Several single-campus studies found few substantial differences between the responses of students who completed the same survey via the Web and on paper (Layne, DeCristoforo, and McGinty, 1999; Olsen, Wygant, and Brown, 1999; Tomsic, Hendel, and Matross, 2000). Other research suggests that the particular survey administration mode shapes how people respond (Burr, Levin, and Becher, 2001; Dillman, 2000; Dillman, Sangster, Tarnai, and Rockwood, 1996; Schwarz, Hippler, and Noelle-Neumann, 1992; Turner et al., 1998). For example, Burr, Levin, and Becher report that customers of a federal government agency report more satisfaction with products and services when they respond via the Web instead of paper. Interestingly, computer-mediated surveys may yield more honest responses on items of a sensitive nature (Turner et al., 1998). Others have found that people respond more favorably to similar items when interviewed by telephone as compared with completing a paper survey (Dillman et al., 1996). Dillman (2000) suggests that some normative and cognitive mechanisms might contribute to mode differences. Normative mechanisms include social
desirability (responding in socially acceptable ways), acquiescence (the tendency to agree rather than disagree), question order effects (answering later questions to attain consistency with answers to previous questions), and primacy or recency effects (selecting the first or last offered response). Cognitive mechanisms refer to whether or not survey respondents receive identical stimuli via different modes of administration. These include differences in visual and aural communication and neurological processing preferences, such as respondents being disposed to answer in different ways depending on whether they or the interviewer controls the question flow (Jenkins and Dillman, 1997). These explanations are helpful for understanding what factors might influence mode differences. At the same time, the bulk of the research in this area is based on comparisons of telephone and paper survey approaches. The literature is nearly silent with regard to Web vs. paper differences, especially involving student respondents from multiple institutions. To accurately use and interpret Web survey results, it is important for practical and theoretical reasons to determine whether people respond in the same fashion to Web and other modes of survey administration. If it turns out that people answer similarly via the Web and other modes, researchers and the public will have reason to be confident about the validity of the results. However, if people answer differently because the Web mode shapes their responses, adjustments might be made to standardize responses across modes. In addition, new theories or analytical criteria may be needed to properly evaluate and interpret Web results (Dillman, 2000).

PURPOSE OF STUDY

The purpose of this study is to determine whether mode effects are associated with the responses of undergraduate students to a national survey administered via a paper questionnaire and via the Web.
Three questions guided this study:

1. Do responses of students who have the option of using either a paper or Web questionnaire differ from those who can only complete a survey on the Web?
2. Do students who use a paper or Web version of an instrument respond differently to questions about their college experiences?
3. Do any observed mode effects differ by certain student background characteristics, notably sex and age?

DATA SOURCE

The National Survey of Student Engagement (NSSE) annually collects data about the nature of the college experience from tens of thousands of first-year
and senior students at several hundred 4-year colleges and universities (Kuh, 2001). The project routinely uses both paper and Web survey modes. As such, NSSE data are a rich source of information for examining possible response differences between Web and paper administration modes. The target sample included 151,910 students from the 276 4-year colleges and universities that registered for the NSSE survey in 2000. Each institution provided student information in an electronic data file. The students were equally divided between first-year and senior students randomly sampled from student populations in each class. The participating colleges and universities generally mirrored the national profile with respect to institutional type (as defined by the 2000 Carnegie Classification), sector (public or private), region, and urbanicity (National Survey of Student Engagement, 2000). The NSSE survey was administered from March through June of 2000 in a mixed-mode format, that is, both traditional paper and Web-based modes were employed. The overall response rate was 42%. No incentives were provided by the NSSE for survey completion. The results reported in this paper are based on 58,288 undergraduates (27,121 first-year students, 31,167 seniors) who provided information for all of the control variables used in our analyses. Of this group, 37,682 students completed the paper version (71% women, 29% men), and 20,606 completed the Web version (58% women, 42% men). At 223 institutions, each student was sent a paper survey, but had the option of completing it on the Web. In contrast, 53 other colleges and universities elected to use a Web-exclusive format, wherein all contact with students was electronic, and students completed the survey via the Web. Thus, there were two mutually exclusive scenarios for completion: (a) paper or Web option and (b) Web only.
Among Web completers, we analyzed the responses of 10,254 students from Web-only institutions and an additional 10,352 students who exercised the Web option.

INSTRUMENT

The NSSE survey instrument, The College Student Report (Kuh, 2000a), measures the degree to which students participate in effective educational practices (Kuh, 2001). Many of its 67 items have been employed in other collegiate surveys such as the College Student Experiences Questionnaire (CSEQ) and UCLA's Student Information Form (the CIRP survey of first-year students). The College Student Report (hereafter The Report) taps student experiences on several dimensions: (a) involvement in different types of in-class and out-of-class activities, (b) taking part in educationally enriching programs such as study abroad, internships, and senior capstone courses, (c) perceptions of collegiate contributions to educational, personal, and social development, (d) perceptions of the campus environment, such as institutional emphases and quality of relationships on campus, and (e) satisfaction with their overall institutional experience. In addition, students give background information, such as their sex, age, race/ethnicity, enrollment status, living arrangements, and major field. Like all college student surveys, the NSSE instrument solicits self-reported information. The Report was designed to satisfy five general criteria that promote valid self-reports (Kuh, 2000b): (a) respondents possess the information asked of them, (b) the items are phrased clearly to avoid confusion (Laing, Sawyer, and Noble, 1989), (c) the questions ask about recent experiences (Converse and Presser, 1989), (d) the respondents believe the items warrant thoughtful answers (Pace, 1985), and (e) responding honestly does not threaten, embarrass, or compromise privacy (Bradburn and Sudman, 1988). Kuh et al. (2001) provide details about the psychometric properties of The Report. Considerable effort was made to make the structure and format of the Web survey match the paper survey. However, there were some minor inevitable differences in graphic and visual appearance. At the same time, the items and response options were identical in terms of wording and placement. Further, the number of response options was kept small enough to fit on a single page on any computer screen. Examples of the Web and paper surveys are available at and nsse/acrobat/tcsr-2000.pdf, respectively.

VARIABLE SPECIFICATION

We tested for mode effects on eight scales of student engagement involving 53 survey items. Table 1 summarizes the scales used in this article and provides descriptive statistics for each scale by mode of administration. In addition, the appendix lists the specific items contributing to each scale. All scales but computing and information technology stem from established scales described and used elsewhere (Kuh et al., 2001).
Active and collaborative learning, student-faculty interaction, general education gains, personal-social gains, and computing and information technology were created by summing the individual items contributing to each scale. As academic challenge and supportive campus environment each contained items with different response sets, we created Z scores for each item and then summed all Z scores. Enriching educational experiences also contained items with different response sets, including six dichotomous items on whether students had done/plan to engage in learning opportunities outside the classroom; we equalized the minimum and maximum for each contributing item and then summed all items.1 Mode of completion was initially constructed as three dummy-coded variables: (a) Web-only or not, (b) Web option or not, and (c) paper or not. Later, Web only and Web option were collapsed, yielding a mode variable equal to 1 if Web, 0 if paper. To more accurately estimate possible mode effects, we controlled for a number of potentially confounding variables at both the student and institutional levels.
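The scale-construction and coding steps described above can be sketched in a few lines of code. This is an illustrative sketch only, not NSSE's actual scoring routine; the item names and toy responses below are hypothetical.

```python
from statistics import mean, pstdev

# Hypothetical 1-4 responses from three students to three items that are
# assumed to share a common response set (item names are illustrative).
responses = {
    "worked_with_peers": [1, 3, 4],
    "asked_questions":   [2, 2, 4],
    "gave_presentation": [1, 4, 3],
}
n_students = 3

# Scales whose items share one response set: sum the raw items per student.
summed_scale = [sum(responses[item][s] for item in responses)
                for s in range(n_students)]

# Scales mixing items with different response sets (academic challenge,
# supportive campus environment): Z-score each item, then sum the Z scores.
def z_scores(values):
    m, sd = mean(values), pstdev(values)
    return [(v - m) / sd for v in values]

z_by_item = {item: z_scores(vals) for item, vals in responses.items()}
z_scale = [sum(z_by_item[item][s] for item in z_by_item)
           for s in range(n_students)]

# Mode of completion, dummy coded as in the study: 1 = Web, 0 = paper.
web_mode = [1, 0, 1]

print(summed_scale)  # [4, 9, 11]
```

Because each item is standardized before summing, the Z-scored scale weights items with different response ranges comparably, which is the rationale given above for treating academic challenge and supportive campus environment differently from the simple-sum scales.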
TABLE 1. Means, Standard Deviations, and Descriptions of Student Engagement Scales

Scale | Description | Metric | Web vs. Paper Mean Difference (a)
Academic challenge | Nature and amount of academic work performed | Sum of 10 Z-scored items | .60***
Active and collaborative learning | Frequency of class participation and collaborative learning | Sum of 7 items | .36***
Enriching educational experiences | Participation in learning opportunities outside the classroom | Sum of 10 items (b) | .44***
Student-faculty interaction | Frequency of student interactions with faculty | Sum of 6 items | .34***
Supportive campus climate | Degree to which the institution is perceived to be supportive | Sum of 6 Z-scored items | .57***
General education gains | Self-reported gains in writing, speaking, and thinking critically | Sum of 4 items |
Personal and social gains | Self-reported gains related to personal and community issues | Sum of 7 items | .52***
Computing and information technology | Degree to which students use computing and information technology | Sum of 3 items | .86***

(a) t test.
(b) A large proportion of first-year students responded as undecided to one or more of six items that contributed to this scale.
***p < .001 (2-tailed test).
Student-Level Controls

Class (1 = senior, 0 = first-year)
Enrollment Status (1 = less than 2 courses, 2 = 2 courses, 3 = 3 to 4 courses, 4 = full-time)
Residence (1 = living in residence hall or Greek housing, 0 = living off campus)
Sex (1 = female, 0 = male)
Age
Race/Ethnicity (dummy-coded): African American/Black, Asian/Pacific Islander, Latino/a, White, Native American, or other
Major Field (dummy-coded)2:
  Humanities (humanities, cultural studies, foreign languages/literature, liberal/general studies, and/or visual and performing arts)
  Math and sciences (biological/life sciences, computer and information sciences, engineering, mathematics, and/or physical sciences)
  Pre-professional (agriculture, business, communication, education, health-related fields, and/or parks/recreation/sports management)
  Social sciences (social sciences, multi/interdisciplinary studies, and/or public administration)
  Multiple fields

Institution-Level Controls

2000 Carnegie Classification (dummy-coded): Doctoral Extensive, Doctoral Intensive, Master's Institutions, Baccalaureate General, Baccalaureate Liberal Arts, or Specialized Institutions
Undergraduate enrollment from the Integrated Postsecondary Education Data System (IPEDS)
Admissions Selectivity (1 to 6, with 1 being noncompetitive and 6 being most competitive, from Barron's Profiles of American Colleges, 23rd Edition). Selectivity was unavailable for 13 institutions; however, using the criteria detailed in Barron's, we estimated selectivity for these institutions.
Sector (1 = private, 0 = public)
Urbanicity from IPEDS (dummy-coded):
  Large urban (large city or urban fringe of large city)
  Mid-urban (mid-size city or urban fringe of mid-size city)
  Nonurban (large town, small town, or rural)
For computing and information technology only: Academic support expenditures per student from IPEDS (library expenses, museums, galleries, audio/video services, academic computing, academic administration, personnel development, and course/curriculum development)

ANALYTIC STRATEGY

First, we predicted which students elected the Web option with a multivariate logistic regression. Although the Web option is ultimately a student's decision, institutional characteristics might well mediate such use, chief among them the nature, quality, and availability of computing and information technology on campus. Overall, nearly 22% of eligible respondents selected the Web option. The following variables were statistically significant (p < .001) and were found to increase the odds of selecting the Web option by at least 25% over their range: living on campus, being younger, male, White or Latino/a instead of African American, majoring in math and science fields or having multiple major fields, and attending a more selective institution or one that invests more in academic support. Given that the instrument itself was identical in Web-only and Web-option scenarios, any observed response differences should be due to selection processes at the student or institutional level. For example, Web-only responses are initially shaped by an institutional decision to administer exclusively via the Web. At the same time, this decision does not neutralize student agency. That is, students vary in their enthusiasm for using computing technology, and this could influence whether a particular student completes the Web-only survey at all. And accessibility to the Web may not be ubiquitous, at least not yet for all students. For example, some have argued that computing and information technology may be less accessible to students of color compared with White students (Malveaux, 2000), although Kuh and Hu (2001) found otherwise.
Likewise, although the Web option is ultimately a student's decision, it is filtered through institutional characteristics, such as the availability and quality of on-campus computers and electronic mail. If selection processes are accounted for by the control variables, any response differences between these two administration types should disappear. Indeed, after controlling for student and institutional variables, the Web-only and Web-option responses on seven of the eight scales did not exhibit substantive differences (see Kuh et al., 2001 for an item-by-item comparison for all items on The Report). Thus, in subsequent analysis, we collapsed the two types of Web responses into a single Web mode so that paper and Web could be compared. We discuss the one scale that exhibited a difference between Web only and Web option later in this article. We used ordinary least squares (OLS) regression to examine whether the two survey modes (paper or Web) affected average responses to each scale. To examine
whether these patterns are generalizable to different subpopulations of college students, we created multiplicative interaction terms involving (a) sex and mode, and (b) age and mode. We considered, but ultimately rejected, using hierarchical models to analyze these data. First, hierarchical models cannot be used because there are no paper respondents at Web-only institutions; thus, there is no within-institution variance on the mode variable. Second, where only effect magnitudes are of interest and intraclass correlations are small, analysis with OLS regression and hierarchical linear modeling yields very similar conclusions (Ethington, 1997). Indeed, the intraclass correlations for the scales used in this article are small for institutions that offered the Web option; seven of the eight scales had less than 13% of their variance between institutions (available from the authors on request). These intraclass correlations are somewhat lower than, but generally consistent with, those reported by other higher education researchers in multiple-institution studies (Ethington, 1997; Pascarella and Terenzini, 1991). Although OLS regression will likely produce biased standard errors for these data, the statistical power ensured by the large sample size makes typical statistical tests of significance less instructive (Cohen, 1988). For this reason, we computed effect sizes to judge whether the magnitudes of the coefficients were large enough to warrant attention. The effect sizes were calculated by dividing the unstandardized coefficient by the standard deviation of the item for paper-mode respondents (Greenwald, Hedges, and Laine, 1996; Light and Pillemer, 1982; Pascarella, Flowers, and Whitt, 2001).

RESULTS

Means for each scale are presented separately for the Web and paper modes in Table 1. As shown in the rightmost column, Web responses yielded significantly (p < .001) greater means than paper for all scales except general education gains.
Yet these mean differences are raw; that is, they do not control for potential respondent and institutional differences associated with the survey mode. In contrast, Table 2 shows the results from regressing each scale on mode of completion and all control variables. We coded mode = 1 if by Web and mode = 0 if by paper. By using paper as the referent category, we do not intend to suggest that the paper mode yields more valid responses, only that it remains the more commonly used mode. The second column in Table 2 displays positive and statistically significant (p < .001) unstandardized coefficients for all eight scales, suggesting that students who completed the survey online tended to report more favorable responses than those who completed it via paper.3 In no instance did responses to a scale favor paper over Web. While average responses to all eight scales favor the Web mode, the effect sizes are modest, generally .15 or less (Table 2, third column). The exception is
TABLE 2. OLS Regressions of Engagement Scales on Mode of Administration and Selected Student and Institutional Controls (a,b)

Scale | Web vs. Paper Unstandardized Coefficient (c) | Effect Size (d) | Adjusted R2 | Net R2 Change (e)
Academic challenge | .330*** (.048) | | |
Active and collaborative learning | .498*** (.029) | | |
Enriching educational experiences | .079*** (.022) | | |
Student-faculty interaction | .238*** (.029) | | |
Supportive campus climate | .224*** (.038) | | |
General education gains | .164*** (.024) | | |
Personal and social gains | .642*** (.044) | | |
Computing and information technology | .593*** (.019) | .27 | | .014

(a) Student-level controls include class, enrollment status, residence, sex, age, race/ethnicity, and major field; institution-level controls include Carnegie Classification, undergraduate enrollment, Barron's admissions selectivity, sector, urbanicity, and, for computing and information technology only, academic support expenditures.
(b) Ns for Web range from 10,828 to 20,594; Ns for paper range from 18,825 to 37,391.
(c) Standard errors in parentheses.
(d) y-standardized coefficient.
(e) Difference in adjusted R2 between the model containing mode and controls and the model containing only controls.
***p < .001 (two-tailed).

computing and information technology, which shows a .27 standard deviation increase if one responded via the Web instead of paper.4 This scale is noteworthy for another reason: it is the only scale wherein Web-option and Web-only respondents differed in their responses after all controls were included, although both were more favorable than paper. Indeed, the effect size for Web only vs. paper (.34) was larger than that for Web option vs. paper (.22; not shown here in tabular format). We find little evidence for interactions by sex or age in multivariate analyses on any of these eight scales. That is, these patterns hold for both women and men, and younger and older students.
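As a concrete illustration of the effect-size calculation used in this study (the unstandardized mode coefficient divided by the paper-mode standard deviation of the scale), the sketch below reproduces the .27 figure for computing and information technology. The coefficient (.593) is taken from Table 2; the paper-mode standard deviation of roughly 2.2 is an assumed value back-solved for this illustration, not a figure quoted from the article.

```python
def mode_effect_size(unstd_coef: float, sd_paper: float) -> float:
    """Web-vs.-paper mode effect expressed in paper-mode SD units."""
    return unstd_coef / sd_paper

# Computing and information technology: coefficient .593 (Table 2);
# an SD of 2.2 for paper respondents is assumed here for illustration.
es = mode_effect_size(0.593, 2.2)
print(round(es, 2))  # 0.27
```

Standardizing by the paper-mode SD, rather than the pooled SD, expresses the Web effect relative to the variability of the more commonly used (referent) mode, matching the convention the authors cite from Greenwald, Hedges, and Laine (1996).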
In addition to computing effect sizes, we also considered the impact of mode relative to the controls in the models, and the net improvement in variance explained by mode. To gauge the strength of the mode variable relative to the controls, we compared the standardized coefficients for mode with those of each of the controls. Interestingly, mode showed larger effects on computing and information technology and active and collaborative learning than several variables purported to shape student engagement, such as residence, age, race, and undergraduate enrollment. Only a student's class wielded a stronger effect than mode
for computing and information technology; only class, enrollment status, and sector shaped active and collaborative learning more than mode. The mode coefficient was considerably weaker against the controls for each of the other six scales. Finally, we examined the net improvement in variance explained (R2) from introducing mode into the regression models after all controls were already included (Table 2, rightmost column). In other words, we examined the improvement in adjusted R2 between the model containing mode and controls and the model containing only controls. Only one scale yielded a net improvement in R2 larger than .01: computing and information technology (.014, or 1.4%).

DISCUSSION

After controlling for student and institution characteristics, it appears that the responses of college students to paper surveys and Web surveys generally show only small differences. This conclusion is consistent with other single-campus studies in postsecondary settings (Layne et al., 1999; Olsen et al., 1999; Tomsic et al., 2000). That said, our findings do not allow us to conclude that Web-based responses are essentially identical to those generated from paper. In particular, responses via the Web were more favorable for all eight scales examined. Interestingly, computing and information technology yielded the largest effect favoring Web over paper. Several other recent studies also suggest that responses to Web surveys may be more favorable than for paper on computing-related items (Daly, Cross, and Thomson, 2001; Tomsic et al., 2000), although these studies do not appear to statistically control for possible confounding differences in respondent pools between modes. Although we find moderate effects for computing and information technology that favor the Web mode, selection processes inherent in completing the Web survey may have led to more favorable responses on this particular scale.
Given that students were not randomly assigned to one mode or another, those who completed the Web survey after receiving an invitation via traditional mail (Web option) were likely among the most competent and frequent users of computing technologies; that is, actual engagement may shape mode choice rather than mode choice in itself shaping reported engagement. As reported earlier, Web-only respondents scored somewhat higher than Web-option respondents on computing and information technology. Perhaps there are factors associated with being a Web-only school that disproportionately prevent many of the least computer-savvy students from responding to the Web survey. In particular, students who seldom used their electronic mail accounts may have been unaware they were invited to participate. If many of the students least engaged in computing and information technology were effectively removed from the Web-only respondents, this group should outscore their Web-option and paper counterparts. What does this mean for institutional researchers and others who wish to use
Web surveys? This study and others underscore the need to carefully evaluate issues of sampling, nonrespondent bias, and measurement error when interpreting the findings from Web and paper surveys. Couper (2000) cautions that we presently know very little about the nonresponse problem in Web surveys and must rely on electronic mail surveys to try to accurately account for potential respondent bias and differences in the characteristics of responders and nonresponders. This is a nontrivial problem inasmuch as response rates to Web surveys tend to be lower than for paper surveys, although response rates were similar for the data we analyzed here (40% for Web-only institutions vs. 43% for Web-option institutions). While college students are far less likely to be affected by the digital divide (National Telecommunications and Information Administration, 2000), it is nonetheless likely that many potential respondents prefer to complete a paper survey rather than a Web survey, which can dampen response rates as well as affect the responses of those who do complete a survey online. Further, it is possible that some of the same normative and cognitive factors that shape differences between other mixed-mode surveys (although they are mostly based on differences in paper vs. telephone results) may have produced the patterns we observe in this article (Dillman, 2000). For instance, because the Web is still a relatively new mode of survey administration, its novelty might invoke more positive responses if completing a survey via the Web is perceived as socially desirable given the growth of information technology and the Internet. In fact, a new set of mode considerations may be needed to properly evaluate Web vs. paper responses (Dillman, 2000). For example, during focus groups and cognitive testing of The Report, students suggested that the Web version was easier to complete than the paper version.
We could speculate that this ease of use might lead to slightly more favorable responses. Clearly, additional research is needed in this area to confidently interpret, and perhaps ultimately obviate, any differences in Web and paper responses. Finally, it would be profitable to explore the meanings and implications of computer usage, such as whether extensive use of computers by college students has positive or negative implications depending on the nature of the application: surfing the Web for pleasure, playing games, and developing personal Web pages, as contrasted with seeking additional relevant sources for class papers and projects. In particular, to the extent that computing applications are substituted for face-to-face social interaction, they might be viewed as less positive outcomes (Gatz and Hirt, 2000; Kuh and Vesper, 2001). For instance, Gatz and Hirt suggest that electronic mail may not play an academic or socially integrative role on campus, but may instead signal a more passive and indirect form of communication. One exception to this more passive student voice via electronic mail occurs when the content involves a confrontational message.
Further, their research suggests that many students consider electronic mail to be an impediment or distraction to completing coursework.

LIMITATIONS

This study has several limitations. First, although our analysis controlled for a number of student and institutional characteristics, no direct measures of precollege characteristics, such as grade point average, ACT and SAT scores, or parental socioeconomic status, were included in the models. In addition, the results for computing and information technology might differ if a more direct measure of computing technology at particular campuses were available. That is, our findings might be due to a preponderance of Web respondents from highly wired campuses who are, in fact, exposed to a greater array of computing and information technology. For example, Hu and Kuh (2001) found that undergraduates attending more wired campuses, as determined by Yahoo's "America's Most Wired Colleges" surveys, more frequently used computing and information technology than their counterparts at less wired campuses. While the academic support measure used here included computing support and was a significant predictor of whether students responded via the Web option, it likely does not fully capture the state of computing technology at particular campuses.5 Second, the Web and paper versions of the survey had some differences in graphic and visual appearance, such as color, which might affect responses (Couper, 2000). Ideally, the two surveys would appear identical to respondents in all respects, although this is difficult to implement in practice. Third, the NSSE targets only first-year and senior students. Perhaps survey mode effects differ for other students. Notably, we did not observe differences in mode effects by sex or age of student. However, because this study examined college students, 89% of the respondents were less than 30 years old.
Given the positively skewed distribution of age, we had limited ability to detect differences among the oldest college students. Further, because many college students are relatively computer savvy, the effects of mode may differ for noncollege populations. A final limitation concerns the nature of the survey items themselves. Specifically, the response options for items consisted of relatively few choices, most often only four. Such constrained response possibilities may have dampened the magnitudes of observed effects. Also, the survey used in this study did not contain sensitive questions that posed potential embarrassment or risk. College student surveys that request such information as alcohol and illegal drug use may induce more socially desirable responses or affect student responses in other ways that cannot be easily understood in the absence of other information.
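The dampening effect of coarse response options can be illustrated with a small simulation. The sketch below is purely hypothetical: it assumes normally distributed latent responses and arbitrary cutpoints, and is not the NSSE data or the authors' analysis.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Hypothetical latent (continuous) responses with a true mode effect of 0.10 SD.
web = rng.normal(0.10, 1.0, n)
paper = rng.normal(0.00, 1.0, n)

def to_four_point(x):
    """Collapse a continuous response onto a 1-4 scale at fixed cutpoints."""
    return np.digitize(x, (-1.0, 0.0, 1.0)) + 1

w4, p4 = to_four_point(web), to_four_point(paper)
pooled_sd = np.sqrt((w4.var(ddof=1) + p4.var(ddof=1)) / 2)
d_observed = (w4.mean() - p4.mean()) / pooled_sd
# d_observed comes out slightly below the latent 0.10 SD difference,
# showing how few response options can shrink an observed effect size.
```

The attenuation with four well-placed categories is mild; with fewer categories, or cutpoints far from the bulk of the distribution, it would be larger.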
CONCLUSION

Our findings suggest that mode effects for first-year and senior college students generally tend to be small. A notable exception involves items related to computing and information technology, which elicited more favorable responses when answered via the Web. However, our data do not allow us to discern whether this is a true mode effect or whether those most engaged in computing and information technology are also those who gravitate toward the Web-based mode. Given the small effect sizes for most scales, this study should help allay concerns that data gathered via the Web may differ substantially from those collected on paper. On the other hand, our findings demonstrate the need for further research into possible mode differences involving the Web, both in terms of whether differences exist within the population at large and the specific cognitive mechanisms that might explain any observed differences.

APPENDIX: SURVEY ITEMS CONTRIBUTING TO STUDENT ENGAGEMENT SCALES

Academic Challenge (Cronbach's α = .72)

Number of hours in a typical week preparing for class (studying, reading, writing, rehearsing, and other activities related to your academic program) during the current school year
Number of assigned textbooks, books, or book-length packs of course readings during the current school year
Number of written papers or reports of 20 pages or more during the current school year
Number of written papers of fewer than 20 pages during the current school year
Extent to which coursework emphasized this school year: Analyzing the basic elements of an idea, experience, or theory, such as examining a particular case or situation in depth and considering its components
Extent to which coursework emphasized this school year: Synthesizing and organizing ideas, information, or experiences into new, more complex interpretations and relationships
Extent to which coursework emphasized this school year: Making judgments about the value of information, arguments, or methods, such as examining how others gathered and interpreted data and assessing the soundness of their conclusions
Extent to which coursework emphasized this school year: Applying theories or concepts to practical problems or in new situations
How often worked harder than you thought you could to meet an instructor's standards or expectations during the current school year?
Extent to which your college emphasized this school year: Spending significant amounts of time studying and on academic work
Active and Collaborative Learning (Cronbach's α = .65)

How often asked questions in class or contributed to class discussions during the current school year?
How often made a class presentation during the current school year?
How often worked with other students on projects during class during the current school year?
How often worked with classmates outside of class to prepare class assignments during the current school year?
How often tutored or taught other students during the current school year?
How often participated in a community-based project as part of a regular course during the current school year?
How often discussed ideas from your reading or classes with others outside of class (students, family members, co-workers, etc.) during the current school year?

Enriching Educational Experiences (Cronbach's α = .60)

Number of hours in a typical week participating in cocurricular activities (organizations, campus publications, student government, social fraternity or sorority, intercollegiate or intramural sports, etc.) during the current school year
Done or plan to do a practicum, internship, field experience, co-op experience, or clinical assignment before you graduate
Done or plan to do community service or volunteer work before you graduate
Done or plan to do foreign language coursework before you graduate
Done or plan to study abroad before you graduate
Done or plan to do an independent study or self-designed major before you graduate
Done or plan to do a culminating senior experience (comprehensive exam, capstone course, thesis, project, etc.) before you graduate
How often had serious conversations during the current school year with students whose religious beliefs, political opinions, or values were very different from yours?
How often during the current school year had serious conversations with students of a different race or ethnicity than your own?
Extent to which your college emphasized this school year: Contact among students from different economic, social, and racial or ethnic backgrounds

Student-Faculty Interaction (Cronbach's α = .75)

How often discussed grades or assignments with an instructor during the current school year?
How often talked about career plans with a faculty member or advisor during the current school year?
How often discussed ideas from your reading or classes with faculty members outside of class during the current school year?
How often worked with faculty members on activities other than coursework (committees, orientation, student-life activities, etc.) during the current school year?
How often received prompt feedback from faculty on your academic performance during the current school year?
How often worked with a faculty member on a research project during the current school year?
Supportive Campus Environment (Cronbach's α = .79)

Extent to which your college emphasized this school year: Providing the support you need to succeed academically
Extent to which your college emphasized this school year: Helping you cope with your nonacademic responsibilities (work, family, etc.)
Extent to which your college emphasized this school year: Providing the support you need to thrive socially
In your experience this year, rate the typical quality of relationships among people at this college: With other students
In your experience this year, rate the typical quality of relationships among people at this college: With faculty members
In your experience this year, rate the typical quality of relationships among people at this college: With administrative personnel and offices

General Education Gains (Cronbach's α = .79)

Extent to which your college education contributed to your knowledge, skills, and personal development in:
Acquiring a broad general education
Writing clearly and effectively
Speaking clearly and effectively
Thinking critically and analytically

Personal and Social Gains (Cronbach's α = .84)

Extent to which your college education contributed to your knowledge, skills, and personal development in:
Working effectively with others
Voting in elections
Learning effectively on your own
Understanding yourself
Understanding people of other racial and ethnic backgrounds
Being honest and truthful
Contributing to the welfare of your community

Computing and Information Technology (Cronbach's α = .57)

How often used e-mail to communicate with an instructor or other students during the current school year?
How often used an electronic medium (e-mail, list-serve, chat group, etc.) to discuss or complete an assignment during the current school year?
Extent to which your college education contributed to your knowledge, skills, and personal development in: Using computing and information technology?

ENDNOTES

1.
Coding undecided responses as missing for the six items that asked whether students had done or planned to participate in activities resulted in a substantial loss of cases for this scale (Table 1), especially for first-year students. In supplementary analyses, we examined the 22,718 seniors separately for this scale and found the same patterns that we report here. Alternatively, coding undecided responses as no before scale creation, or using multinomial logit models with three outcomes (undecided, no, yes) on these six items, did not alter our findings.
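The two codings compared in this note can be sketched as follows. The response labels and numeric codes here are hypothetical illustrations, not the NSSE codebook:

```python
import pandas as pd

# Hypothetical answers to one "done or plan to do" item.
resp = pd.Series(["yes", "no", "undecided", "yes", "undecided"])

# Primary coding: "undecided" becomes missing, so those cases drop from the scale.
as_missing = resp.map({"yes": 1, "no": 0})  # unmapped labels -> NaN

# Alternative coding: "undecided" treated as "no" before scale creation.
as_no = resp.map({"yes": 1, "no": 0, "undecided": 0})

print(int(as_missing.isna().sum()), int(as_no.sum()))  # 2 2
```

Under the first coding, scale means are computed over fewer cases; under the second, every respondent contributes, at the cost of treating uncertainty as nonparticipation.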
2. Using dummy variables instead for each of the 19 majors listed on the survey does not alter our findings.
3. In supplementary analyses including an additional 5,125 cases with missing data on one or more student-level controls, we performed mean substitutions and included dummy variables as indicators of substitution for each control in the regressions. This analysis produced the same patterns we report here.
4. This pattern holds for each of the three survey items contributing to the computing and information technology scale.
5. We further explored this possibility in supplementary analyses with data from America's Most Wired Colleges 2000 (Yahoo Internet Life, 2001). Unfortunately, raw scores were publicly available only for the top-scoring 300 colleges and universities that participated in the Yahoo survey. We added the wired measure as an additional control using only the 77 NSSE institutions with publicized wired scores. With the range on the wired variable restricted to only the highest-scoring institutions, the coefficient for computing and information technology did not change appreciably after introduction of the wired variable. The implication for our analysis is not clear; we cannot judge whether we would find such stability in these mode effects across less wired institutions.

REFERENCES

Barron's Educational Series, Inc. (1998). Barron's Profiles of American Colleges (23rd ed.), Barron's Educational Series, Inc., Hauppauge, NY.
Bradburn, N. M., and Sudman, S. (1988). Polls and Surveys: Understanding What They Tell Us, Jossey-Bass, San Francisco.
Burr, M. A., Levin, K. Y., and Becher, A. (2001). Examining Web vs. paper mode effects in a federal government customer satisfaction study. Paper presented at the Annual Conference of the American Association for Public Opinion Research, Montreal, Canada.
Cohen, J. (1988).
Statistical Power Analysis for the Behavioral Sciences (2nd ed.), Lawrence Erlbaum Associates, Hillsdale, NJ.
Converse, J. M., and Presser, S. (1989). Survey Questions: Handcrafting the Standardized Questionnaire, Sage, Newbury Park, CA.
Couper, M. P. (2000). Web surveys: A review of issues and approaches. Public Opin. Q. 64:
Daly, R. F., Cross, J., and Thomson, G. E. (2001). Web versus paper surveys: Results of a large-scale direct comparison of the two methods. Paper presented at the Forum of the Association for Institutional Research, Long Beach, CA.
Dillman, D. A. (2000). Mail and Internet Surveys: The Tailored Design Method, John Wiley & Sons, New York.
Dillman, D. A., Sangster, R. L., Tarnai, J., and Rockwood, T. H. (1996). Understanding differences in people's answers to telephone and mail surveys. In: Braverman, M. T., and Slater, J. K. (eds.), Advances in Survey Research, New Directions for Evaluation Series (Vol. 70), Jossey-Bass, San Francisco, pp.
Ethington, C. A. (1997). A hierarchical linear modeling approach to studying college effects. In: Smart, J. C. (ed.), Higher Education: Handbook of Theory and Research (Vol. 12), Agathon, New York, pp.
Gatz, L. G., and Hirt, J. B. (2000). Academic and social integration in cyberspace: Students and e-mail. Rev. Higher Educ. 23:
Green, K. C. (2001). Campus Computing, 2000: The 11th National Survey of Computing and Information Technology in American Higher Education, Campus Computing, Encino, CA.
Greenwald, R., Hedges, L. V., and Laine, R. D. (1996). The effect of school resources on student achievement. Rev. Educ. Res. 66:
Groves, R. M., and Kahn, R. L. (1979). Surveys by Telephone: A National Comparison with Personal Interviews, Academic, New York.
Hu, S., and Kuh, G. D. (2001). Computing experience and good practices in undergraduate education: Does the degree of campus wiredness matter? Paper presented at the meeting of the American Educational Research Association, Seattle, WA.
Jenkins, C. R., and Dillman, D. A. (1997). Towards a theory of self-administered questionnaire design. In: Lyberg, L., Biemer, P., Collins, M., Decker, L., de Leeuw, E., Dippo, C., et al. (eds.), Survey Measurement and Process Quality, Wiley-Interscience, New York, pp.
Jones, S. (2002). The Internet Goes to College: How Students Are Living in the Future with Today's Technology, The Pew Internet and American Life Project, Washington, DC.
Kuh, G. D. (2000a). The College Student Report, National Survey of Student Engagement, Center for Postsecondary Research and Planning, Indiana University, Bloomington, IN.
Kuh, G. D. (2000b). The National Survey of Student Engagement: Conceptual Framework and Overview of Psychometric Properties. Indiana Postsecondary Research and Planning [On-line]. Available: nsse
Kuh, G. D. (2001). Assessing what really matters to student learning: Inside the National Survey of Student Engagement. Change 33(3): 10-17, 66.
Kuh, G. D., Hayek, J. C., Carini, R. M., Ouimet, J. A., Gonyea, R. M., and Kennedy, J. (2001). NSSE Technical and Norms Report, Indiana University Center for Postsecondary Research and Planning, Bloomington, IN.
Kuh, G. D., and Hu, S. (2001).
The relationships between computer and information technology use, student learning, and other college experiences. J. Coll. Stud. Dev. 42:
Kuh, G. D., and Vesper, N. (2001). Do computers enhance or detract from student learning? Res. Higher Educ. 42:
Laing, J., Sawyer, R., and Noble, J. (1989). Accuracy of self-reported activities and accomplishments of college-bound seniors. J. Coll. Stud. Dev. 29:
Layne, B. H., DeCristoforo, J. R., and McGinty, D. (1999). Electronic versus traditional student ratings of instruction. Res. Higher Educ. 40:
Light, R., and Pillemer, D. (1982). Numbers and narrative: Combining their strengths in research reviews. Harv. Educ. Rev. 52:
Malveaux, J. (2000). Technology, learning, and the future of education. Black Issues Higher Educ. 17: 38.
National Survey of Student Engagement (2000). NSSE 2000 Overview, Indiana University Center for Postsecondary Research and Planning, Bloomington, IN.
National Telecommunications and Information Administration (2000). Falling Through the Net: Toward Digital Inclusion, U.S. Department of Commerce, Washington, DC.
Olsen, D. R., Wygant, S. A., and Brown, B. L. (1999). Entering the next millennium with Web-based assessment: Considerations of efficiency and reliability. Paper presented at the Conference of the Rocky Mountain Association for Institutional Research, Las Vegas, NV.
Pace, C. R. (1985). The Credibility of Student Self-Reports, University of California, Center for the Study of Evaluation, Graduate School of Education, Los Angeles.
Pascarella, E. T., and Terenzini, P. T. (1991). How College Affects Students: Findings and Insights from Twenty Years of Research, Jossey-Bass, San Francisco.
Pascarella, E. T., Flowers, L., and Whitt, E. J. (2001). Cognitive effects of Greek affiliation in college: Additional evidence. NASPA J. 38:
Pew Internet and American Life (2000). Tracking Online Life: How Women Use the Internet to Cultivate Relationships with Family and Friends, The Pew Internet and American Life Project, Washington, DC.
Schwarz, N., Hippler, H. J., and Noelle-Neumann, E. (1992). A cognitive model of response-order effects in survey measurement. In: Schwarz, N., and Sudman, S. (eds.), Context Effects in Social and Psychological Research, Springer-Verlag, New York, pp.
Tomsic, M. L., Hendel, D. D., and Matross, R. P. (2000). A World Wide Web response to student satisfaction surveys: Comparisons using paper and Internet formats. Paper presented at the Annual Meeting of the Association for Institutional Research, Cincinnati, OH.
Turner, C. F., Ku, L., Rogers, S. M., Lindberg, L. D., Pleck, J. H., and Sonenstein, F. L. (1998). Adolescent sexual behavior, drug use, and violence: Increased reporting with computer survey technology. Science 280:
Yahoo Internet Life (2001). America's Most Wired Colleges. Available:

Received June 19, 2001.