Running head: Learning-Style Instrument

Development of a Learning-Style Instrument

E. Lea Witta
Educational Research, Technology and Leadership
University of Central Florida

Cheng-Yuan Corey Lee
Teaching, Learning, and Technology Center
The University of Findlay

Paper presented at the annual meeting of the American Evaluation Association, Reno, November 2003.
Abstract

The purpose of this study was to develop and assess a new learning-preference instrument. Although based on prior instruments, questions were added to specifically address sociability/isolation. Questions were posed concerning Sociability (face-to-face contact with other students), Student Organization (studying on a set schedule), Authority Dependence (need for instructor feedback), Avoidance (finding class boring), Communication (good written communication), Reading (preference for reading), Concrete (liking concrete examples), Recognition (teacher recognition of work), Action (participating in the course), and Instructor Organization (need for clear instructions). Education professionals reviewed and suggested revisions to the initial questionnaire. Following the initial revision and the removal of approximately 30 questions, education professionals again evaluated the questionnaire. After the second revision, the 49-item questionnaire was placed online at a southern university. Graduate students in master's-level online classes were asked to visit the site and complete the questionnaire. The instrument was also administered to three traditional master's-level research classes, and the initial pilot yielded 141 responses. Results from the pilot, using exploratory factor analysis and internal-consistency reliability, are discussed.
With the explosive growth of the World Wide Web (WWW) in the early 1990s, an increasing number of courses have migrated to the Web, a practice known as Web-based or online education. International WHERE + HOW (2001) listed more than 55,000 online courses provided by higher education institutions and training corporations. According to Peterson's (2001) distance learning guide, about 3,600 degree and certificate programs are available from universities all over the world. Many higher education institutions offer a wide variety of online courses and provide the opportunity for students to enroll in certain online courses as part of a degree. Other institutions offer complete undergraduate and graduate degrees through an online education program. As online education gains popularity, increasing attention has been focused on learners' adaptation to the new learning environment. Parson (1998) and Alexander (1995) suggested that, to inform curriculum and instructional design, educators and researchers should evaluate how students learn in the new environment. Identifying a student's learning style and determining which learning styles perform better in the online environment helps instructors understand how their students perceive and process information in different ways.

Learning Style

Learning style refers to the way a learner perceives, organizes, processes, and understands information. During the last few decades, various classifications have been introduced from different theoretical approaches. Currently the most well-known classifications are: field-dependent versus field-independent (Ramirez & Castaneda, 1974); concrete-abstract and random-sequential (Gregorc, 1979); conditions for learning, area of interest, mode of learning, and expectation of course grade (Canfield & Knight, 1983); and concrete-abstract and reflective-active (Kolb, 1984).
According to Matthews (1994), the existence of individual learning styles is not debated; the term "learning style," however, is defined differently. Matthews (1994) states that Gregorc envisioned distinctive behaviors, while Kolb emphasized heredity, environment, and experience. Canfield, on the other hand, was concerned with instructional preferences. Although these models appear to differ (Dunn, DeBello, Brennan, Krimsky, & Murrain, 1981), they overlap in many areas. All of these frameworks, however, are intended to measure how an individual learns (Gregorc, 1979) or an individual's predisposition in a learning situation (Kolb, 1984). In addition, with the advent of Web-based instruction (WBI), the traditional educational setting has changed. Although Verduin and Clark (1991) indicate that distance education is as effective as traditional instruction, there are conditions for this to occur. One of these is student interaction, yet in Web-based classes students may only interact online. Does a "chat" room adequately replace face-to-face classroom interaction? Can we adequately determine who should or should not take Web-based classes based on their preferred "learning style"? Or could we provide suggestions to help them adapt their preferred style to an online environment? And can we accurately measure "learning style"? Further, because we are using new technological methods in teaching, will instruments created in the 1980s include items that measure the new environmental conditions?
The purpose of this study was to develop and assess a new instrument measuring students' learning styles in the online learning environment. Although the instrument was based in part on previous learning-style instruments, we were particularly concerned about the socialization/isolation aspect of online classes, as well as the self-discipline to organize activities to meet course requirements and the ability to communicate needs to others. Consequently, we added questions to specifically address those issues.

Method

The instrument was developed to examine students' learning preferences in several areas: Sociability (face-to-face contact with other students), Student Organization (studying on a set schedule), Authority Dependence (need for instructor feedback), Avoidance (finding class boring), Communication (good written communication), Reading (preference for reading), Concrete (liking concrete examples), Recognition (teacher recognition of work), Action (participating in the course), and Instructor Organization (need for clear instructions). For each item, respondents indicated their preference on a 5-point Likert scale ranging from Strongly Disagree to Strongly Agree. Education professionals reviewed and suggested revisions to the initial questionnaire. Following the initial revision and the removal of approximately 30 questions, education professionals again evaluated the questionnaire. After the second revision, the 49-item questionnaire was placed online at a southern university and linked to the homepage of two online graduate classes. Students enrolled in these classes were asked to complete the questionnaire. In addition, a pencil-and-paper version of the questionnaire was presented to students taking the same graduate classes in a traditional (face-to-face) environment. A total of 141 students participated in this study. Of these, 96 (68%) were enrolled in the online classes.
The remaining 45 (32%) students completed the pencil-and-paper survey. The survey data were collected by the researcher and entered into SPSS. Principal components analysis with varimax rotation was used to examine the interrelationships among the items in this new learning-styles inventory. Factor analysis was also used to explain these items in terms of their common underlying dimensions, or factors (Nunnally, 1978). Reliabilities estimate the extent to which each factor is free from measurement error and were determined by calculating internal-consistency coefficients for each factor (Nunnally, 1978). Correlation coefficients show the direction and strength of the relationships among the factors of students' learning styles.

Results and Discussion

Of the 49 items in the questionnaire, eight were removed because they produced a factor defined by only one large loading or because they failed to load at least 0.45 on any factor. Two additional questions (Q48, Q8) were removed to improve the reliability of their factors. In addition, all questions negatively related to the other questions on their factor, or containing negative phrasing, were reverse coded; these are indicated in Table 1 by an R preceding the question number. The remaining 39 questions formed eleven factors, which accounted for 63% of the variance in the instrument.

The first factor, Sociability, was characterized by questions indicating a preference for group activities (Q6), group assignments (Q4), and studying for tests with others (Q35). Clearly this factor delineated social classroom behaviors and indicated social interaction. Reliability (coefficient alpha) for this factor was 0.83, as shown in Table 1. The second factor, Student Organization, was defined by questions concerning sticking to a schedule (Q10) and scheduling time for study (Q9). Again, the questions defining the factor provided the anticipated grouping. The reliability estimate for scores produced by this factor was 0.77. The third factor, Avoidance (or nonavoidance), was determined by "don't want to attend class" (recoded Q39) and "class is boring" (recoded Q46). These questions also formed the anticipated cluster. Reliability for this factor was 0.73. Factor 4, Authority Dependence, was characterized by "need instructor feedback" (Q14), "no idea how well I am doing on assignments" (Q13), and "can't assess my own progress" (recoded Q27). Reliability for this factor was 0.72. The remaining factors, with their loadings and reliabilities, are displayed in Tables 1 and 2.

Conclusions

Although the final factors produced an acceptable fit for our proposed model, because some questions were removed, this instrument needs to be tested again. In addition, factors in the model should be contrasted by class type, and factors should be correlated with student grades.
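The analysis steps described above (reverse coding of negatively worded 5-point Likert items, internal-consistency reliability via coefficient alpha, and principal components analysis with varimax rotation) can be sketched as follows. The item data, the negatively worded item index, and the number of retained components are hypothetical; the study itself used SPSS, so this is an illustrative sketch of the techniques, not the authors' actual procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical responses: 141 respondents x 6 items on a 1..5 Likert scale.
# Items 0-2 tap one latent trait, items 3-5 another; item 2 is negatively worded.
latent = rng.normal(size=(141, 2))
signs = np.array([1, 1, -1, 1, 1, 1])
raw = latent[:, [0, 0, 0, 1, 1, 1]] * signs + rng.normal(scale=0.5, size=(141, 6))
items = np.clip(np.round(3 + raw), 1, 5)

# Reverse code the negatively worded item: on a 5-point scale, recoded = 6 - original.
items[:, 2] = 6 - items[:, 2]

def cronbach_alpha(x):
    """Coefficient alpha: k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = x.shape[1]
    return k / (k - 1) * (1 - x.var(axis=0, ddof=1).sum() / x.sum(axis=1).var(ddof=1))

def varimax(loadings, max_iter=100, tol=1e-6):
    """Kaiser varimax rotation of a loading matrix (standard SVD-based algorithm)."""
    n, k = loadings.shape
    rot = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        lr = loadings @ rot
        u, s, vt = np.linalg.svd(
            loadings.T @ (lr ** 3 - lr @ np.diag((lr ** 2).sum(axis=0)) / n))
        rot = u @ vt
        if s.sum() < d * (1 + tol):
            break
        d = s.sum()
    return loadings @ rot

# Principal components of the item correlation matrix; retain two components
# and rotate their loadings toward simple structure.
corr = np.corrcoef(items, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
top = np.argsort(eigvals)[::-1][:2]
loadings = eigvecs[:, top] * np.sqrt(eigvals[top])
rotated = varimax(loadings)

# Reliability of the first item cluster (items 0-2, after reverse coding).
alpha = cronbach_alpha(items[:, :3])
```

In practice, the item-retention rule described above (dropping items with no loading of at least 0.45 and factors defined by a single large loading) would be applied to the rotated loading matrix before computing a coefficient alpha for each surviving factor.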
References

Alexander, S. (1995). Teaching and learning on the World Wide Web. Retrieved November 5, 2001, from http://www.scu.edu.au/ausweb95/papers/education2/alexander/

Canfield, A. A., & Knight, W. (1983). Learning styles inventory. Los Angeles, CA: Western Psychological Services.

Dunn, R., DeBello, T., Brennan, P., Krimsky, J., & Murrain, P. (1981). Learning style researchers define differences differently. Educational Leadership, 38(5), 372-375.

Gregorc, A. F. (1979). Learning/teaching styles: Potent forces behind them. Educational Leadership, 36(4), 234-236.

International WHERE + HOW (2001). The international distance learning course finder. Retrieved November 5, 2001, from http://www.dlcoursefinder.com/us/index.htm

Kolb, D. A. (1984). Experiential learning. Englewood Cliffs, NJ: Prentice-Hall.

Matthews, D. B. (1994). An investigation of student learning styles in various disciplines in colleges and universities. Journal of Humanistic Counseling, Education, & Development, 33(2), 65-75.

Nunnally, J. C. (1978). Psychometric theory (2nd ed.). New York: McGraw-Hill.

Parson, R. (1998). An investigation into instruction available on the World Wide Web. Retrieved September 7, 2001, from http://www.oise.utoronto.ca/~rparson/abstract.html

Peterson's (2001). Distance learning. Retrieved November 25, 2001, from http://iiswinprd03.petersons.com/distancelearning/default.asp

Ramirez, M., & Castaneda, A. (1974). Cultural democracy, bicognitive development, and education. New York: Academic Press.

Tucker, S. Y. (2000). Assessing the effectiveness of distance education versus traditional on-campus education. (ERIC Document Reproduction Service No. ED443378)

Verduin, J. R., & Clark, T. A. (1991). Distance education: The foundations of effective practice. San Francisco: Jossey-Bass.