Improving learning outcomes for first year introductory programming students

Sven Venema, School of Information and Communication Technology, Griffith University, Brisbane, Australia
Andrew Rock, School of Information and Communication Technology, Griffith University, Brisbane, Australia

Abstract

The first year transition to university can be particularly difficult for students in technical degree programmes such as Information Technology (IT). These programmes are generally acknowledged to have high attrition rates, and it has been suggested that poor understanding of threshold concepts in introductory computer programming courses may be partly responsible for this attrition. The current project attempts to improve student learning outcomes for the computer assignment threshold concept. To achieve this outcome, an innovative visualisation application was developed and incorporated into the curriculum of a first semester, first year introductory programming course. Substantially improved student learning outcomes were observed after its deployment. Subsequent tracking indicates that the innovation continues to successfully scaffold student learning of the computer assignment threshold concept.

Introduction

The first year transition to university can be particularly difficult for students in technical degree programmes such as Information Technology (IT), which are generally acknowledged to have high attrition rates of around 30-50% (Denning & McGettrick, 2005). Ma, Ferguson, Roper, and Wood (2007) have suggested that these attrition rates are most likely due to poor performance in first year introductory programming courses. The technical nature of these introductory courses can lead to distinct difficulties in understanding. One established way of considering these difficulties is Perkins' (1999) notion of troublesome knowledge. Meyer and Land (2005) argue that the defining characteristics of this knowledge are that it is counter-intuitive, conceptually difficult, and alien. They also suggest that this type of troublesome knowledge forms a gateway through which a more advanced way of thinking about a topic area can be developed. These gateway concepts are now widely referred to as threshold concepts (Meyer & Land, 2005).

Since the early 2000s, educators have become increasingly concerned about the level of competence of students undertaking introductory computer programming courses, which are typically taken in the first year of an IT degree programme. In 2001, the McCracken group (McCracken et al., 2001) reported on a multi-national, multi-institutional study of the assessment of the computer programming skills of first year computer science students.

The study was carried out at eight higher education institutions in the United States of America, Australia, Israel, Wales, England, and Poland. The project aimed to assess the computer programming skills of undergraduate students after they had completed one or two courses in computer programming. The outcomes of this investigation indicated that most students performed well below expectations on problems that "students in any type of Computer Science programme should be able to solve" (McCracken et al., 2001, p. 128). One striking result was an average score of 22.89 on a test that was scored out of 110.

Lister et al. (2004) led another group of educators who examined first year introductory programming courses at 12 educational institutions in countries including Australia, England, the United States of America, Finland, New Zealand, Sweden, Denmark, and Wales. They found that many of the students in their study were weak at analysing and predicting the outcome of short Java code fragments. They suggested that rather than being weak at problem solving, these students "have a fragile grasp of both basic programming principles and the ability to systematically carry out routine programming tasks, such as tracing (or 'desk checking') through code" (Lister et al., 2004).

In 2006, Dehnadi suggested that another possible explanation for poor performance in first year programming courses could be a lack of facility with the computer assignment threshold concept (Dehnadi, 2006). An example of a computer assignment statement written in the Java programming language, as used by Dehnadi, is given in (1).

(1)    int a = 10;
       int b = 20;
       a = b;

Given (1), a = b (read as "a is assigned the value of b"), so a now has the value 20, and b retains its value of 20. Dehnadi subsequently developed and made available a set of multiple choice test questions based on computer assignment, together with more complex questions involving assignment and sequencing of operations, as in (2).

(2)    int a = 5;
       int b = 3;
       int c = 7;
       a = c;
       b = a;
       c = b;

Given (2), the sequence of assignments is important: a = c (a is now 7, c remains 7); b = a (a remains 7, b is now 7); c = b (c is now 7, b remains 7).

In 2008, Bornat et al. (Bornat, Dehnadi, & Simon, 2008) investigated whether this threshold concept could be used as a predictor of success in first year introductory programming courses. After six experiments involving more than 500 students at six institutions across three countries, they determined that the Dehnadi test questions were not predictive of successful performance in first year introductory programming courses. However, the types of questions on the test do provide a simple way of determining whether students understand assignment and sequencing concepts.
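As a further, purely illustrative example of why the sequencing of assignments matters (this fragment is not one of the Dehnadi test items, and the class name SwapTrace is invented here), consider the classic swap of two variables using a temporary variable. The comments trace the value of every variable after each statement; carrying out the assignments in a different order would not produce a swap.

    // Illustrative only: a swap whose correctness depends on the order of the assignments.
    public class SwapTrace {
        public static void main(String[] args) {
            int a = 1;            // a = 1
            int b = 2;            // a = 1, b = 2
            int temp = a;         // a = 1, b = 2, temp = 1
            a = b;                // a = 2, b = 2, temp = 1
            b = temp;             // a = 2, b = 1, temp = 1
            System.out.println("a = " + a + ", b = " + b);   // prints "a = 2, b = 1"
        }
    }

Determining the final output of even a short fragment such as this requires exactly the kind of step-by-step tracing that Lister et al. (2004) identified as a weakness of novice programmers.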

Ford and Venema (2010) used the Dehnadi test questions at an Australian metropolitan university to determine whether students who had passed a first year introductory programming course (P1) had achieved an understanding of the computer assignment threshold concept. The Dehnadi test proved a useful tool in identifying that many students who had passed P1 could not consistently answer questions related to computer assignment correctly. In fact, despite passing P1, only 49.55% of the students in that study correctly answered 10 or more of the 12 questions on the test. This poor performance prompted an intervention in subsequent offerings of the P1 course, leading to improved learning outcomes for students.

The remainder of this paper is organised as follows. First, the details of the programming course (P1) are discussed and a threshold concept is identified. Next, an intervention is described that successfully targets the threshold concept and improves the associated learning outcomes. This is followed by an outline of the research methodology and a discussion of the findings. Final remarks conclude the paper and indicate how this approach could be used in a wider context.

The Programming Course (P1)

The first year, first semester introductory programming course (P1) has undergone many changes over the years. Currently, the course uses LEGO MINDSTORMS NXT robots to allow students to solve physical, real-world problems. In addition, only a small subset of the Java programming language is introduced to students in this first introductory programming course. The subset language uses only valid Java syntax and is called MaSH (Making Stuff Happen). MaSH includes everything except user-defined classes and allows unnecessary complexity to be set aside until it is required. This reduces cognitive load by removing irrelevant information. MaSH allows the beginning student to introduce complexity gradually, as needed, and to focus only on the requirements of the problem without becoming bogged down in the complexity of the language.

The course materials include a special compiler, mashc, that has been developed in-house. The compiler converts MaSH source code into equivalent legal Java source code so that it can be executed by the user on any system that can run Java. MaSH also presents limited versions of other Java Application Programming Interfaces (APIs), making it easier to find methods to use in the various programming environments, such as a PC or a robot. This further reduces the amount of detail that a student faces at any given time, consequently reducing cognitive load. Environments are defined by substitution rules that may be developed by any instructor, which allows further technologies targeted at improving learning to be added to the student workflow as required.

Students taking the course are required to attend a 2 hour lecture, a 1 hour workshop, and a 2 hour laboratory session per week. The assessment of the course has moved away from the traditional assignment and final examination model and focuses instead on ten weekly 5% laboratory exercises totalling 40%, five in-lecture 5% multiple choice quizzes totalling 25%, and a final 35% capstone project incorporating the key learning objectives of the course. P1 covers programming concepts, problem solving and design, implementation using a subset of the Java programming language, and the use of programming tools and environments (including MaSH).
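The paper does not show the Java code that mashc generates. Purely as a sketch of the idea, and assuming nothing about the real MaSH libraries, a MaSH-style console read could correspond to something like the following plain Java, where the Scanner-based helper readInt() is a hypothetical stand-in for the library method that MaSH provides.

    // Illustrative approximation only: the actual output of mashc is not shown in the paper.
    import java.util.Scanner;

    public class ConsoleEquivalent {
        private static final Scanner CONSOLE = new Scanner(System.in);

        // Hypothetical helper standing in for the MaSH console read.
        static int readInt() {
            return CONSOLE.nextInt();
        }

        public static void main(String[] args) {
            System.out.print("How many months? ");    // MaSH: print("How many months? ");
            int months = readInt();                   // MaSH: int months = readInt();
            System.out.println("months = " + months);
        }
    }

The point of the translation is that the student writes only the short MaSH statements; the surrounding detail is supplied by the compiler and its environment definitions.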
Programming examples are discussed and worked through in lectures, and students apply the concepts in the weekly laboratory sessions as well as in a weekly one hour group workshop.

An example of the type of program worked through in the lectures and workshops (this example is from week 4 of the 13 week course) is given in (3).

(3)    import console;

       print("How much is the loan for ($)? ");
       double amount = readDouble();
       print("What is the annual interest rate (%)? ");
       double annualRate = readDouble();
       print("How many months? ");
       int months = readInt();
       double rate = annualRate/(12*100);
       double repayment = amount*rate/(1.0-pow((1.0+rate),-months));
       double totalInterest = months*repayment-amount;
       println("monthly repayment = $"+repayment);
       println("total interest = $"+totalInterest);

In (3), note the MaSH subset of standard Java statements. For example, print("How much is the loan for ($)? "); simplifies the standard print statement System.out.print("How much is the loan for ($)? ");. Reading an integer from the keyboard, as in int months = readInt();, reduces the standard way of reading integers in Java from multiple lines down to a single statement. By week 7 of the course, programs begin to require encapsulation with methods, but not classes.

The intervention

One of the difficulties that programmers face, particularly those just beginning, is that it is not possible to see inside the machine. When a sequence of statements needs to be traced through, as in the Dehnadi questions (Dehnadi, 2006), the student has to step through the sequence of assignments and keep track of the values associated with each variable in order to determine the outcome. This is usually done in the student's working memory and may incur significant cognitive load even for simple sequences of assignments. Recall that Lister et al. (2004) identified tracing through a program as a key weakness of beginning programming students. We posit that this weakness is in part due to the cognitive load requirements of this activity, and that student learning outcomes may improve if assistance targeted at reducing this cognitive load is provided.

Enabling the beginning student to visualise the internal state of the machine, that is, to keep track of the variables and their values as well as the statement to be executed next, would reduce the demand on working memory and thereby reduce the overall cognitive load. By seeing the statements and their impact on the internal state of the machine, the learner can focus on the problem at hand rather than on the newly learnt rules for solving the problem. Sweller (1999) refers to this source of cognitive load as problem solving load. A second source of cognitive load identified by Sweller (1999) is split attention. When tracing through a program, this type of cognitive load arises from having to remember the current values of all the variables whilst focusing on the current statement, which is changing those values.

To reduce both the problem solving and split attention cognitive load whilst tracing through a computer program, the PC programming environment was extended in-house to provide a visual trace and debugging window.
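Before the tool itself is described, the following minimal sketch, written in ordinary Java and not part of the P1 materials (the class TraceSketch and its assign helper are purely illustrative), conveys the underlying idea: replay the assignment sequence from (2) and display the complete variable table after every statement, which is precisely the bookkeeping a student would otherwise have to perform in working memory.

    // Minimal sketch only (not the in-house visualisation tool): after each
    // assignment the full variable table is printed, so none of the values
    // need to be held in working memory.
    import java.util.LinkedHashMap;
    import java.util.Map;

    public class TraceSketch {
        private static final Map<String, Integer> vars = new LinkedHashMap<>();

        // Record an assignment and show the variable table afterwards.
        static void assign(String name, int value, String statement) {
            vars.put(name, value);
            System.out.println(statement + "   ->   " + vars);
        }

        public static void main(String[] args) {
            // The sequence of assignments from example (2).
            assign("a", 5, "int a = 5;");
            assign("b", 3, "int b = 3;");
            assign("c", 7, "int c = 7;");
            assign("a", vars.get("c"), "a = c;");   // a becomes 7
            assign("b", vars.get("a"), "b = a;");   // b becomes 7
            assign("c", vars.get("b"), "c = b;");   // final table: {a=7, b=7, c=7}
        }
    }

The in-house application described next presents the same information graphically and in real time, rather than as printed text.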

The application displays, in real time, both the contents of the variables and how those contents change for each statement in a program as it is executed by the student. As part of the intervention, the visualisation application was incorporated into the P1 curriculum as a tool for students to gain a better sense of the internal workings of their programs, particularly as they relate to sequences of assignments. Students can use the visualisation software as they execute their program via a single alternative command in place of the standard Java compile command. Figure 1 illustrates a simple program in this environment.

Figure 1: Visualisation of a sequence of assignment statements.

In Figure 1, the window on the left shows the program source code; the blue arrow indicates the line of code that is about to be executed. The source code consists of an import statement to gather all the resources required by the program to compile and execute, two variable assignments, and a single statement assigning the contents of variable b to variable a. The window on the right shows the variables currently residing in memory and their contents. Programs with methods, and therefore different scopes, show local variables and parameters being added to and removed from the stack, another threshold concept. Note that the blue arrow is positioned before the a = b assignment statement, indicating that this statement has yet to execute; hence the values of 3 and 5 shown for variables a and b, respectively. Students can step through the program a single statement at a time, at speeds ranging from slow to fast, or run the program normally. The right window is updated at each step and always reflects the current internal state of the machine. This allows the learner to access all relevant information at the same time, reducing cognitive load.

In addition to the development of the visualisation software, in Semester 2, 2010 all course material related to assignment was reviewed and examples using the visualisation software were integrated into the content. Walkthroughs and additional exercises were also developed for use in workshops, and students were trained in the use of the software. However, use of the software was not made mandatory in Semester 2, 2010 because it was being used for the first time that semester. After the successful trial in Semester 2, 2010, use of the visualisation software became mandatory in Semester 1, 2011.
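As an illustration of the scope behaviour mentioned above, a program along the lines of the following hypothetical example (it is not taken from the course materials) would, when stepped through in the tool, show the parameter and local variable of square appearing on the stack during the call and disappearing again when the method returns.

    // Hypothetical example only: a trace of this program would show a parameter
    // and a local variable entering and leaving the stack.
    public class ScopeExample {
        // While square executes, its parameter n and local variable result are
        // on the stack; both disappear when the method returns.
        static int square(int n) {
            int result = n * n;
            return result;
        }

        public static void main(String[] args) {
            int x = 4;
            int y = square(x);                 // stepping in shows n = 4, then result = 16
            System.out.println("y = " + y);    // prints "y = 16"
        }
    }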

In order to verify whether the intervention led to an improvement in student learning outcomes related to computer assignment, two questions on this concept (detailed below) were embedded in the assessment items from Semester 2, 2010 onwards. The next section details the methodology used to verify the effectiveness of the innovation.

Methodology

Participants

Students enrolled at one of the metropolitan campuses of the university where P1 was taught were given two questions related to assignment (see the section detailing the questions below) during the in-lecture quizzes that formed part of the normal assessment regime for the course. The experiment was conducted over five semesters, from Semester 2, 2010 through Semester 2, 2012. The number of participants per question per semester of the experiment is shown in Table 1.

                          Q1     Q2
    Semester 2, 2010      43     46
    Semester 1, 2011     190    150
    Semester 2, 2011      61     46
    Semester 1, 2012     155    111
    Semester 2, 2012      55     42

Table 1: The number of participants per question per semester of the experiment.

Procedure

Each quiz was given under examination conditions in a 30 minute time slot. Question 1 was given in Quiz 2, held in week 5 of semester, immediately after the concept had been introduced. Question 2 was given towards the end of semester, in week 12, as part of the final quiz (Quiz 5). The second author taught the P1 course over all five semesters and administered the quizzes. Marking of the quizzes was carried out electronically. Each of the two questions is outlined in the next section.

The Questions

The following sections show the two questions given as part of the experiment.

Question 1

Figure 2 shows Question 1. This question was given immediately after the concept of assignment was introduced in week 5.

Figure 2: Question 1.

Question 2

Figure 3 shows Question 2. This question was given in week 12 as part of the final quiz. It is essentially the same as Question 1 but adds the concept of variable scope.

Figure 3: Question 2.
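The quiz items themselves appear only in Figures 2 and 3 and are not reproduced in the text. Purely as a hypothetical illustration of the style of item that Question 2 describes, combining assignment with variable scope, such a question might ask what a program like the following prints; the class ScopeQuizStyle and its contents are invented for this sketch and are not the actual quiz question.

    // Hypothetical illustration only; not the actual quiz item shown in Figure 3.
    public class ScopeQuizStyle {
        static int a = 10;              // a class-level variable

        static void change(int a) {
            a = 99;                     // assigns to the parameter, not the class-level a
        }

        public static void main(String[] args) {
            int b = 20;
            a = b;                      // the class-level a is now 20
            change(a);                  // has no effect outside the method
            System.out.println(a + " " + b);   // prints "20 20"
        }
    }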

Results

Students in each of the five offerings of P1 were marked either correct or incorrect for each of the two questions. The percentage of students who correctly answered each question per semester is shown in Table 2.

                          Q1 % correct    Q2 % correct
    Semester 2, 2010           21              35
    Semester 1, 2011           28              71
    Semester 2, 2011           48              74
    Semester 1, 2012           45              77
    Semester 2, 2012           40              67

Table 2: Percentage of students who correctly answered each question per semester.

Discussion

Recall that, prior to the intervention, only 49.55% of P1 students could consistently answer questions of this kind correctly. Table 2 shows that in Semester 2, 2010, when the visualisation software had been introduced in a non-mandatory trial capacity, initially only 21% of students could answer Question 1 correctly. Recall also that Question 1 was given immediately after the concept of assignment had been introduced. The proportion of correct responses improved to 35% for Question 2 by week 12. Whilst this was a small improvement, it was noted that uptake of the new visualisation software was low and that this could be reducing the effectiveness of the intervention.

To improve uptake, use of the visualisation software was made mandatory and further integrated into the laboratory exercises, workshop activities, and lectures in Semester 1, 2011. After this change, initially only 28% of students in Semester 1, 2011 could answer Question 1 correctly. This is a slight improvement over the previous offering and could be due to early adoption of the visualisation software. However, the proportion of students who could answer Question 2 correctly in week 12 jumped markedly to 71%. This is a strong indicator that the curriculum enhancements, in particular the mandatory use of the visualisation software, had a positive effect on student learning outcomes.

From Semester 2, 2011 onwards, almost half the students could answer Question 1 correctly in week 5. It is likely that this improvement for Question 1 is due to the early use of the visualisation software in lectures, workshops, and laboratories when the concept is first presented. Since Semester 1, 2011, the percentage of correct responses for Question 2 in week 12 has stayed at around 70%, a further indication that the intervention has improved the learning outcomes related to computer assignment for the majority of P1 students.

In this case, the problem faced by students is directly related to the underlying technology with which they interact. It is therefore legitimate to consider adjusting the technology as part of the educational design.

Conclusion

Today's Information Technology educators are under increasing pressure to improve learning outcomes for first year students in introductory programming courses. We present an innovative visualisation tool that helps students reduce the cognitive load associated with tracing through a computer program. Use of this tool has been demonstrated to improve learning outcomes for a threshold concept related to computer assignment for students undertaking an introductory programming course. We believe that this tool, and its associated reduction of student cognitive load, could also be used successfully to improve student learning outcomes in other programming related activities.

The general principle espoused in this paper is the reduction of cognitive load using an appropriate technological intervention. Despite the success of the intervention discussed in this paper, cognitive load is still an issue in many areas of technology education, and further work in adjusting the technology as part of the educational design is recommended.

References

Bornat, R., Dehnadi, S., & Simon. (2008). Mental models, consistency and programming aptitude. In Proceedings of the 10th Australasian Computing Education Conference (ACE2008) (pp. 53-62). Wollongong, Australia.

Dehnadi, S. (2006). Testing programming aptitude. In Proceedings of the 18th Annual Workshop of the Psychology of Programming Interest Group (pp. 22-37). Brighton, England.

Denning, P. J., & McGettrick, A. (2005). Recentering computer science. Communications of the ACM, 48, 15-19.

Ford, M., & Venema, S. (2010). Assessing the success of an introductory programming course. Journal of Information Technology Education, 9, 133-145.

Lister, R., Adams, E. S., Fitzgerald, S., Fone, W., Hamer, J., Lindholm, M., ... Thomas, L. (2004). A multi-national study of reading and tracing skills in novice programmers. ACM SIGCSE Bulletin, 36(4), 119-150.

Ma, L., Ferguson, J., Roper, M., & Wood, M. (2007). Investigating the viability of mental models held by novice programmers. In Proceedings of the 38th SIGCSE Technical Symposium on Computer Science Education. Covington.

McCracken, M., Kolikant, Y., Almstrum, V., Laxer, C., Diaz, D., Thomas, L., ... Wilusz, T. (2001). A multi-national, multi-institutional study of assessment of programming skills of first-year CS students. ACM SIGCSE Bulletin, 33(4), 125-140.

Meyer, J. H. F., & Land, R. (2005). Threshold concepts and troublesome knowledge (2): Epistemological considerations and a conceptual framework for teaching and learning. Higher Education, 49, 373-388.

Perkins, D. (1999). The many faces of constructivism. Educational Leadership, 57(3), 6-11.

Sweller, J. (1999). Instructional design in technical areas. Camberwell, Australia: ACER Press.