On-Line Tool for Exercising and Assessing Knowledge on Engineering Courses
Julieta Noguez, Víctor Robledo-Rella, Luis Neri, Enrique Espinosa
Tecnológico de Monterrey, Campus Ciudad de México. Calle del Puente 222, Col. Ejidos de Huipulco, CP 14380, México D.F., México
{jnoguez, vrobledo, neri, enrique.espinosa}@itesm.mx

Abstract. We have developed a set of generic on-line assessment tools that allow students to exercise and evaluate their previous knowledge before they apply for admission to engineering majors. The tools are generic and flexible, may be applied to different disciplines such as math, physics and computing, and may be used in a Web environment. We present and discuss the software tool itself, as well as preliminary results obtained with engineering students taking physics courses.

Index Terms: On-line assessment, e-learning, flexible education, self-testing.

INTRODUCTION
When prospective engineering students apply for an engineering program at the Tecnológico de Monterrey, they must pass a classification exam on college physics and college math. We have found that about four out of five of these potential students fail such exams. Among the reasons for this are: i) the students have not acquired an adequate level in these fields in their previous courses; ii) the students did not prepare properly for the exam. Nowadays our students prefer to use interactive on-line tools rather than printed tests to strengthen their problem-solving skills [1]. One of the big advantages of using IT in education is that it makes teaching materials and interactions more widely accessible to a greater audience [2]. This includes on-line assessment tools for delivery to external students or to rural and remote locations. It also increases accessibility for students with physical impairments. Among the channels used at the Tecnológico de Monterrey are video lectures, lecture notes available on-line, block teaching and on-line assessment [3].
Flexible education, in its broadest sense, recognizes that all students have different learning needs, including restrictions such as time, place and pace of study, as well as learning styles, previous knowledge, experience, and cultural background [4], [5]. Students can use self-testing techniques to gain confidence before taking formal assessments. Self-testing also allows the student to practice over a wide question bank without time restrictions. The e-learning Research Group of the Tecnológico de Monterrey, Campus Ciudad de México, has been developing interactive tools to promote and evaluate the students' ability to state and solve specific problems [6]. In this paper we present a generic on-line assessment tool, called hereafter On-lineQuiz. This tool allows students to exercise and evaluate their knowledge and their ability to solve specific problems. The tool is generic and flexible, may be applied to diverse disciplines such as math, physics and computing, and can be used by different teachers and students in a Web environment.

ON-LINE TOOLS FOR EXERCISING AND ASSESSING KNOWLEDGE
The On-line tool has a model mainly focused on self-testing techniques and on-line exercising in a Web environment. This model is not exclusive to prospective engineering students; it can also be applied to other disciplines with more flexibility in time, place and pace of study, even for local students. The On-line tool model is intended to support communication with students when they need high interaction. The students can choose the content area to be self-assessed, and the system displays the statements, including images and figures, so the student can take his/her time to solve and answer a given problem before continuing with the following problems. The system requires that the assessment take place on a single site and that the addition of a new student does not affect the communication cycle between the server and the student.
The main services are divided into three scenes, which are described below.
1.1 Management services
In this scene, a manager module was designed to provide access to several types of users: professors or students. Each type of user has different entry permissions that define the functionality offered by the system. The manager can also adjust the availability of the tests and obtain time-control reports. Figure 1 shows the main functions of the system using a UML use case diagram [7].
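The entry-permission scheme described above can be sketched as a simple role-to-permissions mapping. The role names and permission sets below are illustrative assumptions of ours, not taken from the actual system:

```python
# Illustrative role-based permission check; roles and actions are assumed,
# following the three user types described in the management scene.
PERMISSIONS = {
    "manager":   {"adjust_availability", "time_control_reports", "manage_users"},
    "professor": {"review_problems", "design_assessments", "view_statistics"},
    "student":   {"take_assessment", "view_feedback"},
}

def can(role: str, action: str) -> bool:
    """Return True if the given user role is allowed to perform the action."""
    return action in PERMISSIONS.get(role, set())

print(can("manager", "time_control_reports"))   # True
print(can("student", "design_assessments"))     # False
```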
FIGURE 1 SYSTEM SERVICES OF MANAGEMENT SCENE

1.2 Professor services
The system can serve different areas at the same time and distinguishes between two types of professor users: head of department and staff professor. The head of department grants professors access to the system, so that they can review the assessment problems and consult statistics and reports about student performance. The professor can register student data, design the assessments, and obtain statistics and reports about his/her groups. This functionality is shown in figure 2.

FIGURE 2 SYSTEM SERVICES OF PROFESSOR SCENE

1.3 Student services
The students access the system with login permission. The learner can choose the area to exercise and assess, as well as the difficulty level of the assignments. The system randomly displays the assessments of the chosen area, which were previously provided by the professor. The student can attempt to solve his/her assessment several times, and each time the system gives feedback and suggestions according to the wrong answers. Once an assessment has been finished, the student is able to start a new assignment and choose a different difficulty level. This functionality is shown in figure 3.

FIGURE 3 SYSTEM SERVICES OF STUDENT SCENE

ON-LINEQUIZ COMMUNICATION MODEL
The On-lineQuiz model is based on a client-server architecture [8], [9], shown in figure 4. The communication model components are:
Server: This component processes the relevant data in the system. It makes the necessary calculations once the assessments have been designed by the professors, and serves as a producer. Before starting the assignment, the student and the server must communicate so that the server stores the student information needed for data processing.
User/subscribers: This component directly represents the users (manager, professor and student). It is also the interface that delivers assessments to the students. The component, connected to the internet media, must subscribe itself to the server that produces the relevant data. This component does not perform any data processing for the communication cycle; any assessment processing made by this component does not affect the system communication. This helps keep the server process unaltered even when other students subscribe to the system or rescind their connections.
Internet media: The server and the subscriber components communicate through this media. It is an intermediate component between server and users, and it administers the connections and the assessment publication processes. See figure 4.
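The producer/subscriber cycle between the server and the user components can be sketched as follows. This is a minimal in-memory sketch under our own naming assumptions; the class and method names are illustrative, not taken from the On-lineQuiz code:

```python
class Server:
    """Produces assessments; all data processing happens on this side."""
    def __init__(self):
        self.subscribers = {}   # student name -> Subscriber
        self.assessments = []

    def subscribe(self, student):
        # Adding a student does not alter the production cycle.
        self.subscribers[student.name] = student

    def unsubscribe(self, student):
        # Neither does rescinding a connection.
        self.subscribers.pop(student.name, None)

    def publish(self, assessment):
        self.assessments.append(assessment)
        for student in self.subscribers.values():
            student.receive(assessment)

class Subscriber:
    """Represents a user; it only receives and displays assessments,
    performing no data processing for the communication cycle."""
    def __init__(self, name):
        self.name = name
        self.inbox = []

    def receive(self, assessment):
        self.inbox.append(assessment)

server = Server()
alice, bob = Subscriber("alice"), Subscriber("bob")
server.subscribe(alice)
server.subscribe(bob)
server.publish("Circular dynamics, problem 1")
server.unsubscribe(bob)                        # bob rescinds his connection...
server.publish("Work and energy, problem 1")   # ...the cycle is unaffected
print(alice.inbox)   # receives both assessments
print(bob.inbox)     # receives only the first
```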
FIGURE 4 COMMUNICATION MODEL FOR INTERACTION

We chose two themes to run our On-line tool: circular dynamics of a particle, and work and energy. These are current themes of the first physics course. We designed 15 problems for each theme.

User interface
We defined a standard human-computer interface for all the assessments. It is composed of four windows, shown in figures 5 and 6, which are described next:
1. Difficulty level and timing. The difficulty level chosen by the student is displayed, as well as the time remaining to answer the assignment.
2. Scripts for answers. The assessments of a given content area are randomly displayed. The student can choose which assessment to display in order to solve it, and then choose the next assessment until he/she is done.
3. Problem display. Each problem is displayed in PDF format. It can contain text, figures, and images related to the statement.
4. Feedback. Once the assignment is complete, the system gives the student a grade. For any given wrong answer, the system displays a suggestion to the student.

FIGURE 5 USER INTERFACE: AN EXAMPLE OF AN ASSESSMENT IN THE STUDENT INTERFACE FOR AN EXERCISE IN BASIC PHYSICS

This interface allows the student to practice and exercise different kinds of problems with different difficulty levels until he/she feels satisfied with his/her knowledge.

CASE STUDY IN PHYSICS
Our On-line tool (OLT) for exercising and assessing knowledge has wide applications to engineering courses, particularly those that require extensive problem solving to understand a given concept. We followed a pre-test, practice and post-test approach, as applied by L. McDermott, P. Shaffer and the Physics Education Group at the University of Washington [10]. In the following, we describe the initial approach for the case of the Physics I course (Classical Mechanics) at the Tecnológico de Monterrey.
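The exercise-and-feedback cycle described in the user interface section (random display of problems, a grade at the end, and a suggestion per wrong answer) can be sketched as follows. The question bank entries, field names and hint texts are hypothetical, not taken from the actual tool:

```python
import random

# Hypothetical question bank entry: statement, correct option key,
# and a study hint for each anticipated wrong option.
BANK = [
    {"statement": "Find the minimum curvature radius so the kart does not slide.",
     "correct": "a",
     "hints": {"b": "Check the unit conversion from km/h to m/s.",
               "c": "Use the static, not the kinetic, friction coefficient."}},
]

def run_assessment(bank, answers):
    """Grade a set of answers; return (grade, suggestions for wrong answers)."""
    random.shuffle(bank)            # window 2: problems shown in random order
    correct, feedback = 0, []
    for problem in bank:
        choice = answers.get(problem["statement"])
        if choice == problem["correct"]:
            correct += 1
        else:                        # window 4: a suggestion per wrong answer
            feedback.append(problem["hints"].get(choice, "Review this topic."))
    return 100 * correct / len(bank), feedback

grade, hints = run_assessment(list(BANK), {BANK[0]["statement"]: "a"})
print(grade, hints)   # -> 100.0 []
```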
FIGURE 6 THE SYSTEM GIVES THE STUDENT FEEDBACK AND SUGGESTIONS TO STUDY, BASED ON THE WRONG ANSWERS.

Some examples of these statements are:
Example of a circular dynamics problem: In a go-kart race, a 70.0 kg driver in his 120 kg kart is traveling at a constant speed of 50.4 km/h on a flat horizontal circular road (see figure). The static and kinetic friction coefficients between the tires and the pavement are μs = 0.850 and μk = 0.620, respectively. Find the minimum curvature radius of the road so that the kart will not slide.
a) 23.5 m, b) 1.68 m, c) 32.2 m, d) 231 m, e) 2.82 km
Example of a work and energy problem: A block of mass M = 5.00 kg, sliding on a smooth horizontal surface with a speed of 3.00 m/s, is directed against a light spring of force constant k = 500 N/m, as shown in the figure. Find the maximum spring compression.
a) 30.0 cm, b) 4.50 cm, c) 24.5 cm, d) 9.00 cm, e) 21.2 cm
Each statement is displayed by the system in window 3, once the student chooses it. In the following, we describe the evaluation process.

EVALUATION AND RESULTS
We applied our OLT (On-line tool) in a regular physics class for engineers during the first semester of 2007.
The initial phase. We designed 30 problems related to the two themes mentioned above. Of these, 10 were applied as a written pre-test during class time, 10 were fed into the On-line tool, and the last 10 were reserved for the post-test.
The sample. We invited 31 students to participate in our study; however, only 22 of them completed the evaluation process. They were students between 1st and 2nd semester of multidisciplinary majors of the engineering school. We divided them randomly into a test group, which worked with the OLT, and a control group, which did not.
Experiment design. Of the 22 students, only 8 practiced using the OLT. These 8 students used the system for practicing problem solving for more than one session (30 to 60 minutes each). In addition, the system produced log files that register sessions and student grades. In contrast, the control group did not have any access to the OLT.
The final phase. After some of the students used the OLT, we applied a post-test to all students (both control and test groups).
Study results. We defined the student grade gain as:

G = (Post-test grade − Pre-test grade)/100    (1)

Figure 7 shows the grade gains for our students, sorted by increasing gain.

FIGURE 7 A COMPARISON OF THE TEST GROUP (USING THE ON-LINE TOOL) VS. THE CONTROL GROUP (WITHOUT THE ON-LINE TOOL), SORTED BY INCREASING GAIN.

We can see that 75 percent of the students using the OLT improved their grades, while only 46 percent of the students who did not use the on-line tool showed a positive gain. The average gain was greater for the test group (0.18) than for the control group (0.06). Figure 8 shows the grades for both groups. We also include the OLT grades for the 8 students who worked with it.

FIGURE 8 A COMPARISON OF THE GRADES OF THE TEST GROUP (USING THE ON-LINE TOOL) VS. THE CONTROL GROUP (WITHOUT THE ON-LINE TOOL), SORTED BY INCREASING POST-TEST GRADE.
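As a numerical sanity check on the two sample problems and on equation (1), the answers can be reproduced as follows. The function names are ours, not part of the OLT; for the circular dynamics problem, the friction force provides the centripetal force, μs·m·g = m·v²/r, so r = v²/(μs·g) and the mass cancels; for the spring problem, energy conservation gives ½Mv² = ½kx², so x = v·√(M/k):

```python
import math

G_ACC = 9.81  # gravitational acceleration, m/s^2

def min_radius(v_kmh, mu_s):
    """Minimum curvature radius so the kart does not slide: r = v^2/(mu_s*g)."""
    v = v_kmh / 3.6                 # convert km/h to m/s
    return v**2 / (mu_s * G_ACC)

def max_compression(m, v, k):
    """Maximum spring compression from energy conservation: x = v*sqrt(m/k)."""
    return v * math.sqrt(m / k)

def gain(pre, post):
    """Grade gain G as defined in equation (1), grades on a 0-100 scale."""
    return (post - pre) / 100.0

print(round(min_radius(50.4, 0.850), 1))                  # -> 23.5 (m, answer a)
print(round(max_compression(5.00, 3.00, 500.0) * 100, 1)) # -> 30.0 (cm, answer a)
```

Note that only the static coefficient μs enters the first problem, and the driver-plus-kart mass is irrelevant in both multiple-choice answers.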
The post-test grade average of the students using the OLT was 53, while the control group obtained a grade average of 38.

CONCLUSIONS AND FUTURE WORK
Problem-solving practice in different courses is a key factor in engineering education. Our On-line tool gives students the opportunity to gain confidence before performing formal assessment tasks. We have developed a new On-line tool that allows students to enhance their skills in solving physics problems. The system displays the assessments, including images and figures, so the students have enough room to practice and solve the problems, and it maintains the communication cycle established with each student. The On-line tool was applied in a Physics I course at the Tecnológico de Monterrey, Campus Ciudad de México. We presented preliminary results which are consistent with the fact that the students who worked with our OLT obtained better grades than those who did not use it. We need to apply more evaluation processes to a larger set of students before drawing a stronger conclusion. We will continue evaluating the whole system with new physics and mathematics groups, and the results will be published elsewhere. Additionally, we plan to include a new intelligent component in the system to give the students more accurate feedback.

REFERENCES
[1] Espinosa E. & Noguez J. "POLizied e-learning using contract management". Computers & Education, Volume 45, Issue 1, pages 75-103, August 2005. ISSN: 0360-1315.
[2] ANUIES. La educación del siglo XXI. Memorias de la 22 Conferencia del Consorcio Círculo del Pacífico. 2000.
[3] Martin M. El modelo educativo del Tecnológico de Monterrey. Ed. Tec de Monterrey. 2002.
[4] http://www.flinders.edu.au/flexed/whatis.htm. Consulted January 2007.
[5] Chi M. T. H., Feltovich P. J., & Glaser R. (1981). "Categorization and representation of physics problems by experts and novices". Cognitive Science, 5, 121-152.
[6] Espinosa E., Robledo-Rella V., Neri L. & Noguez J. "Towards an adaptive delivery of evaluation tools". FIE 2007 (this meeting).
[7] Booch G., Rumbaugh J., & Jacobson I. The Unified Modeling Language User Guide. Addison-Wesley. 1999.
[8] Sinha A. "Client-server computing". Communications of the ACM, Volume 35, Issue 7 (July 1992), pages 77-98.
[9] Noguez J., Huesca G., & Sucar L.E. "Shared learning experiences in a contest environment within a mobile robotics virtual laboratory". FIE 2007 (this meeting).
[10] McDermott L.C., Shaffer P.S. and the Physics Education Group. Tutorials in Introductory Physics. Department of Physics, University of Washington, Prentice-Hall, 2002.