Effectiveness of online teaching of Accounting at University level

Abstract: In recent years, online education has opened new educational environments and brought new opportunities and significant challenges for students, lecturers and educational institutions (Duncan, Kenworthy & McNamara 2012). It brings the capacity to overcome the geographic and time constraints typical of traditional classroom teaching (Neuhauser 2010). The purpose of this paper is to examine how increased use of technology by both teachers and learners is associated with student achievement in accounting courses. Johnson et al. (2013) conducted a comparative study of motivation between face-to-face and online courses using data on university students in the south-eastern USA, and recommended further research into what would encourage students to study online. Calafiore and Damianov (2011) employed a quantitative approach, using an online tracking tool that records the actual time each student spent online in Economics and Finance courses. In Australia, Duncan, Kenworthy and McNamara (2012) used qualitative and quantitative approaches in postgraduate Accounting courses and found links between participation and success. This action research study (Omura 2015) explores both teacher and student use of technological tools by examining the effect of the frequency of engagement with online learning resources on the academic achievement of undergraduate accounting students. The paper employs an OLS model and a panel regression based on the Random Effect Model to analyse how the frequency of student access to the online discussion forums affects final results and/or the overall pass rate. The data are collected from one regional university in Australia which provides accounting courses for undergraduate and postgraduate students. An important limitation is that the time students spend learning through offline activities cannot be assessed.
Tracking data needs to continue to be collected to establish definite trends, but the early signs of combined teacher and student use of technology are very encouraging. Participation on the discussion forum, in particular, appears to have a statistically significant effect on the pass rate.
1. Introduction/Background

In recent years, online education has opened new educational environments and brought new opportunities and significant challenges for students, lecturers and educational institutions (Duncan, Kenworthy & McNamara 2012). It brings the capacity to overcome the geographic and time constraints typical of traditional classroom teaching (Neuhauser 2010). In 2013, 39 Australian universities developed A Smart Australia: An agenda for higher education 2013-2016 (Australia 2013) in response to the digital economy and new technology, and in order to survive a new era in Australian higher education. According to this report, in 2013 the education industry was worth nearly $15 billion a year in export income, and international student activity is estimated to support around 127,000 jobs in Australia (Higher Education Finance Report 2011). Online teaching allows Australian universities to expand their student base beyond their campuses, for distance education both within Australia and overseas. This trend is evident all over the world. Allen and Seaman (2011) reported that 65 percent of all reporting institutions indicated online education is critical to their long-term strategy, with over 6.1 million students taking at least one online course during the fall term of 2010 in the USA. The benefits of emerging technology tools for online teaching have been discussed by Beldarrain (2006), who stated that a modification of pedagogical perspectives and theoretical frameworks in the field of distance education would create more opportunities to cultivate new learning communities, and that the demand for online education would only continue to grow. At the same time, distance/online educators require continuing improvement in the quality of online courses for students who are willing to participate and are more mobile and technologically knowledgeable than any previous generation.
Calafiore and Damianov (2011) investigated the correlation between final grades and three factors (prior grade point average (GPA), demographic characteristics of students, and time spent online) using a Generalised Ordered Logistic Regression model. The data were collected from 438 students who enrolled in one of 10 online courses (5 Economics and 5 Finance) during two semesters in 2008 at a large public university in Texas, USA. They found that both online time and GPA were significant determinants of the final grade, but that time spent online only had an effect on performance for students whose GPA was above 3.0 on a 4-point scale. The rapid adoption of advanced technologies in educational institutions has also provided new opportunities and significant challenges for students and universities (Duncan, Kenworthy & McNamara 2012). Duncan, Kenworthy and McNamara (2012) researched the effect of synchronous and asynchronous participation on students' performance in online accounting courses in Australia. They focused on two online environments: a chat room as a synchronous forum and a discussion board as an asynchronous forum. They found a positive correlation between the quality and quantity of students' participation and both final examination results and overall course performance.
Johnson et al. (2013) conducted a comparative study of the motivations for enrolling in, and for teaching, face-to-face versus online courses, examining intrinsic and extrinsic motivation in both modes using data from students (235) and faculty (104) at large, public urban universities in the south-eastern USA. They found that online students who displayed online extrinsic motivation completed a larger number of online courses. However, Johnson et al. emphasised the need for further research into what would enable students to remain actively engaged in online education. This paper hypothesises that increased monitoring of progress, combined with student participation online, would improve results. We suggest that there are two ways to exploit technology to improve student outcomes. One is to use technology to enhance the student learning experience through features such as discussion forums and online lectures. The other is for staff to take advantage of technology to improve the monitoring of student development, because recording progress and communicating feedback becomes a more efficient and less onerous task for teachers. This paper therefore reports on two aspects of action research that has continued since 2011, and is divided into two sections. Section A explains the increased monitoring techniques developed over time and includes a brief description of the study desk tool. Section B probes the effectiveness of online participation on student outcomes and investigates whether the number of active participations in online activities and discussion forums is associated with higher grades and pass rates.

2.
Motivation (for this study)

In Semester 2, 2011, an intermediate Company Accounting course (undergraduate) and a Corporate Accounting course (postgraduate) yielded disappointing results (an overall 40% failure rate) for on-campus students and distance education students studying online, even though both groups of online students could access an on-campus tutorial. Poor tutorial attendance on campus suggested low student motivation. The following semester, we decided to introduce specific changes to increase student engagement, with no changes to course content, assessment or anything else. However, that semester was conducted entirely online, except for two groups who were offered on-campus tutorials. We developed a pilot project to explore teaching techniques that exploit technology to monitor progress and increase student participation.

Participants
Data about online activity for the 2011 class that prompted this research is no longer accessible, so this research uses online activity data from Semester 3, 2011 onwards, where records are still available. After 2012, only Semester 1 offers on-campus classes; the data therefore reflect a strong online-education focus. USQ Accounting classes include overseas-based NESB (non-English speaking background) students, whose English level is considered adequate for online interaction and discussion, and this may contribute to the range of unknown variables. The research reports on eight semesters taught by the researcher, alone or co-operatively with a colleague, plus three classes in one semester taught by other staff. All lecture recordings and other online resources, including management of discussion forums and email support, were provided by the researcher, which allowed some consistency of input.

SECTION A: INCREASED MONITORING TECHNIQUES

A3 Methodology

Although USQ has always tried to monitor student progress, after 2011 the teachers of this combined undergraduate and postgraduate course intentionally increased the degree of monitoring in the following ways:

Check on completion of assessable tasks - contacting students who do not complete the first item of assessment, the CMA (computer management assessment), which is usually due in week 4 of the semester. At least two messages are posted on the study desk during the semester (and automatically sent to students' email addresses), usually one month apart, reminding students of where we are up to in the study schedule, asking how they are progressing in their studies, and reminding them that they do not have to study alone and that we are here to help.
Increase monitoring of student participation

Check on completion of non-assessable tasks

Check the number of times the study desk is accessed - sending email directly to those students who have not accessed the course study desk within the first week or two of the semester. Students are encouraged to participate and are reminded that we are there to help them.

Personal email contact with students not engaging - emailing individually those students who do not attend the first two weeks of tutorials. This email reminds students of the benefits of attending the on-campus classes and tells them we are looking forward to seeing them in our next class.

Personal email contact with students working well
All students who are progressing well in the subject are emailed individually, congratulating them on their achievement to date and encouraging continued effort.

Study Desk

The study desk is a work in progress. We are making it as student-friendly and useful as possible, and we collaborate with colleagues in IT design on interactive webpages for students.

A4 Results

Explanation of terms and calculation of raw data:

Result: Average final result [total marks, using the midpoint of each score band, divided by the number of students];
Pass: Pass rate [number of passes divided by the number of students in each class, expressed as a percentage];
APDF: Average participation per student on the discussion forum [total activity (number of hits) on the discussion forum divided by the number of students];
APAA: Average participation in all activity [total activity online (number of hits) divided by the number of students].

Table 1: Summary of Course Results and Student Online Activity

A. Undergraduate Course - Company Accounting

                 2011    2012    2012    2012    2013    2013    2013    2014
                 Sem 3   Sem 1   Sem 2   Sem 3   Sem 1   Sem 2   Sem 3   Sem 1
Result           62.2    63.3    56.1    60.3    56.8    56.1    56.8    61.4
Pass             72.3    69.1    50.4    64.1    68.5    62.3    83.8    78.3
APDF             44.2    61.5    64.2    104.5   64.1    78.6    86.1    80.6
APAA             98.1    131.5   99.5    118.2   106.3   81.3    95.2    129.2
Student number   135     207     137     98      170     106     119     125

B. Postgraduate Course - Corporate Accounting

                 2011    2012    2012    2012    2013    2013    2013    2014
                 Sem 3   Sem 1   Sem 2   Sem 3   Sem 1   Sem 2   Sem 3   Sem 1
Result           59.8    63.1    47.2    65.4    57.9    63.3    60.2    66.4
Pass             80.3    81.3    54.7    86.4    73.8    92.5    82.9    89.2
APDF             95.0    N/A     88.2    121.0   124.5   164.4   169.3   157.8
APAA             N/A     N/A     N/A     83.1    75.1    111.4   172.9   113.6
Student number   66      71      64      44      84      53      41      65
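The raw measures defined above could be computed from a per-student activity export along the following lines. This is only a minimal sketch: the record layout, field names and the 50-mark pass threshold are assumptions for illustration, not the actual USQ export format or marking scheme.

```python
# Hypothetical sketch: computing the summary measures defined above
# (Result, Pass rate, APDF, APAA) from a per-student activity export.
# Record keys and the pass threshold are invented for illustration.

def summarise_class(records, pass_mark=50):
    """records: list of dicts with keys 'final_mark', 'forum_hits', 'total_hits'."""
    n = len(records)
    passes = sum(1 for r in records if r["final_mark"] >= pass_mark)
    return {
        "result": sum(r["final_mark"] for r in records) / n,  # average final result
        "pass": 100.0 * passes / n,                           # pass rate (%)
        "apdf": sum(r["forum_hits"] for r in records) / n,    # avg forum hits/student
        "apaa": sum(r["total_hits"] for r in records) / n,    # avg total hits/student
    }

# Invented demo records, not real student data.
demo = [
    {"final_mark": 72, "forum_hits": 40, "total_hits": 110},
    {"final_mark": 48, "forum_hits": 12, "total_hits": 60},
    {"final_mark": 65, "forum_hits": 55, "total_hits": 130},
]
print(summarise_class(demo))
```

Each measure is a simple per-student average, so a class summary of this kind can be recomputed each semester directly from the activity log.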
Figure 1: Comparison of Results and Online Activities (average final result, pass rate, average discussion-forum participation and average overall activity, plotted by semester from 2011 Semester 3 to 2014 Semester 1).

A5 Discussion of Results

The results from the third semester of 2011 were better than previously, and improvement has generally continued, together with an increase in overall online activity. Despite year-by-year fluctuation, the improved results have been maintained. One very interesting observation can be made. As seen in Figure 1, there is a significant drop in discussion activity between S1 2013 and S3 2013-S1 2014. However, there seems to be an explanation. Students were using the discussion forum to ask very basic questions, so in S3 2013 I introduced prepared answers to frequently asked questions, based on the problems students had had in past years. Since then, while discussion forum activity has decreased, students seem to be asking more complex questions, and the time the teacher spends answering basic issues has been reduced. Students are still maintaining a high level of general participation online. It is difficult to isolate the monitoring factors which have made the improved participation and results possible. However, some anecdotal evidence may help to explain our methods and why we believe they are working. On the discussion forum, one student started off asking many questions, three times a week. She was not given complete answers but guidance on where to find the information in the lecture slides or the textbook. Although she asked so often, she was encouraged and thanked for her questions. Other students then also started to say thanks because they had the same problems, and they gradually suggested answers to each other's questions, so that more active discussions resulted. Another example of the success of these methods is that during a short absence of the lecturer, students discussed the answers to each other's questions and the system
operated independently for a short time. In fact, over time less teacher input is needed, but oversight remains very important. The use of technology helps not only students but also teachers. For example, we have found that in each module an increase in activity usually indicates an area where students are struggling, and this information is easily available to teachers without much delay, which is particularly valuable when working with off-campus students. It therefore informs our teaching and helps focus more attention on some topics. Semester 1 includes on-campus students, but semesters 2 and 3 are now exclusively online. However, the results of the two delivery methods now seem very similar, which also suggests that these ways of using technology are enhancing distance education. This has been the observation of staff, and it is supported by the econometric analysis of the data presented in Section B.
SECTION B: STATISTICAL ANALYSIS OF DATA

B3 Methodology

Data for 8 semesters from 2011 to 2014 are collated from the course study desks of the undergraduate course, Company Accounting (Table 1 A), and the postgraduate course, Corporate Accounting (Table 1 B), both delivered online. Table 1 presents a summary of the collected data: the average final result (Result), the pass rate as a percentage (Pass), the average student participation on the online discussion forums (APDF), and the average student participation across all study activities (APAA). The number of enrolled participants in each course varies every semester, from 98 to 207 for the undergraduate course and from 41 to 84 for the postgraduate course. Tables 1 A and B show the students' results and the record of online activities. Table 2 presents the analysis of the data using Pearson Correlation and Spearman Rank Correlation. Further analysis to test the significance of the correlations found was conducted using Ordinary Least Squares (OLS) Regression and Panel Regression with a Random Effect econometric model, as shown in Table 3.

B4 Results

Table 2: Pearson Correlation and Spearman Rank Correlation

A. Pearson Correlation

        Result  Pass   APDF   APAA
Result  1       -      -      -
Pass    0.63    1      -      -
APDF    0.42    0.69   1      -
APAA    0.23    0.19   0.31   1

B. Spearman Rank Correlation

        Result  Pass   APDF   APAA
Result  1       -      -      -
Pass    0.64    1      -      -
APDF    0.25    0.66   1      -
APAA    0.37    0.10   0.12   1

The results of the Pearson Correlation analysis are presented in Table 2 A. They show evidence of some correlation between the pass rate and average participation on the discussion forum (0.69). The Spearman Rank Correlation was also calculated; its results are consistent with the Pearson Correlation analysis, again showing some correlation between the pass rate and average participation on the discussion forums.
However, it seems that total online activity has little correlation with either the pass rate or final results.
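The two correlation measures reported in Table 2 can be sketched directly from their definitions, as below. Purely for illustration, the sketch is applied to the undergraduate Pass and APDF series from Table 1 A; Table 2 pools more data, so the computed values will not match the 0.69 reported there.

```python
# Sketch of the two correlation measures used in Table 2, implemented
# from their textbook definitions (not the software actually used).
from math import sqrt

def pearson(x, y):
    """Pearson product-moment correlation of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def ranks(x):
    """1-based ranks, averaging ranks over tied values."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    r = [0.0] * len(x)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and x[order[j + 1]] == x[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation = Pearson correlation of the ranks."""
    return pearson(ranks(x), ranks(y))

# Undergraduate Pass and APDF series from Table 1 A.
pass_rate = [72.3, 69.1, 50.4, 64.1, 68.5, 62.3, 83.8, 78.3]
apdf = [44.2, 61.5, 64.2, 104.5, 64.1, 78.6, 86.1, 80.6]
print(round(pearson(pass_rate, apdf), 2), round(spearman(pass_rate, apdf), 2))
```

Pearson measures linear association of the raw values, while Spearman measures monotonic association via ranks; reporting both, as Table 2 does, guards against the small sample being dominated by one or two extreme semesters.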
Table 3: OLS Regression Analysis and Random Effect Model

The two econometric models, OLS and Random Effect (RE), are formulated as follows:

Result = α + Pass + APDF + APAA + error
Pass = α + Result + APDF + APAA + error

Explained variable:       Result      Pass        Result      Pass
Model:                    OLS         OLS         RE          RE
Constant   Coef.          44.558***   -25.504***  44.558***   -25.504***
           (p-value)      (0.00)      (0.58)      (0.00)      (0.00)
Pass       Coef.          0.195*      -           0.195*      -
           (p-value)      (0.09)      -           (0.15)      -
Result     Coef.          -           1.470*      -           1.470***
           (p-value)      -           (0.06)      -           (0.00)
APDF       Coef.          -0.007      0.153***    -0.007      0.153***
           (p-value)      (0.82)      (0.00)      (0.58)      (0.00)
APAA       Coef.          0.018       -0.037      0.018       -0.037
           (p-value)      (0.62)      (0.41)      (0.32)      (0.14)

*** statistically significant at the 1% level; * statistically significant at the 10% level.

Standard errors are estimated using White robust standard errors. For the panel regression, the Random Effect Model (RE) is preferred over the Fixed Effect Model on the basis of the Hausman Test. Using both the OLS and RE models provides more reliable results.

B5 Discussion of Results

Most researchers have difficulty obtaining statistically significant results on the effect of single factors on educational success (Duncan 2004). Duncan, Kenworthy and McNamara (2012) found a positive trend but did not achieve many statistically significant results; statistically significant correlation is, again, difficult to prove. The Pearson and Spearman Rank Correlations have shown some correlation between the pass rate and participation on the online discussion forums, but only at the 10% significance level. Table 3 presents the results of the empirical analysis using the OLS and Panel Regression (Random Effect) models. In both models there is a positive and statistically significant effect of the number of participations on the discussion forums on the pass rate, at the 1% level. In other words, when the number of participations on the discussion forum has increased, the
pass rate has also increased. The high significance of the constant demonstrates the reliability and suitability of the analytical models, and the significance shown in both models strengthens the validity of the results.

6. CONCLUSION

This study has been unusual in that it has been able to follow trends over eight semesters of coursework in one subject area using a high proportion of data from online education. Previous studies have suggested a correlation between student online participation in discussion forums and improved performance; this may be the first time that serious econometric analysis has provided statistical evidence of this relationship. This is an important step forward in educational research, given the growing development of online education worldwide. It is important to recognise that online participation needs to be accompanied by careful staff involvement and monitoring. Overall, results have improved and student participation online has increased. Staff efforts to engage students online became more successful over time. We have now transferred this teaching process to the Accounting Theory courses, which have always been very difficult subjects for students - again with improved student outcomes. Increased monitoring can be time-consuming but, even if only some of the methods are used, they are worthwhile. As time goes on, the amount of monitoring usually decreases as student motivation and involvement increase. Sophisticated software allows not only real-time tracking of student activities but also tracking of how long students spend on the course study desk, what they review on the study desk and how many times they review specific information. It also allows students to upload their questions much more easily. New technology also makes it easy to record and upload short explanations of specific segments of topics (e.g. 15 to 20 min.
on one aspect of a topic), so that relevant sections of material are easy for students to find when they need them. From a teacher's point of view, the technology gives more immediate feedback, and useful data are collected with minimal teacher time and effort. If teachers are to offer a real educational experience, to distance students particularly, they need to be able to monitor student progress more effectively, and technology can assist by reducing the time the teacher needs to spend. I will continue to use these tools to analyse the relationship between teaching method and student outcomes in more depth. In the meantime, as teachers, we have to use our data to guide us towards improvement. How much success is due to monitoring and how much to participation is not completely clear, but we are using both aspects to yield better results. We cannot quantify which contributes most, but we can suggest that both are important factors. Teaching is always a multiple-strategy endeavour, so we need to use all the tools possible to improve student learning.
References:

Allen, IE & Seaman, J 2011, Going the distance: Online education in the United States, 2011, The Babson Survey Research Group, <http://files.eric.ed.gov/fulltext/ed529948.pdf>.

Australia, U 2013, An agenda for Australian higher education 2013-2016, Canberra, Australia.

Beldarrain, Y 2006, 'Distance education trends: Integrating new technologies to foster student interaction and collaboration', Distance Education, vol. 27, no. 2, pp. 139-53.

Calafiore, P & Damianov, DS 2011, 'The effect of time spent online on student achievement in online Economics and Finance courses', The Journal of Economic Education, vol. 42, no. 3, pp. 209-23.

Duncan, B 2004, 'A theory of impact philanthropy', Journal of Public Economics, vol. 88, pp. 2159-80.

Duncan, K, Kenworthy, A & McNamara, R 2012, 'The effect of synchronous and asynchronous participation on students' performance in online accounting courses', Accounting Education: an international journal, vol. 21, no. 4, pp. 431-49.

Higher Education Finance Report 2011, Report.

Johnson, R, Stewart, C & Bachman, C 2013, 'What drives students to complete online courses? What drives faculty to teach online? Validating a measure of motivation orientation in university students and faculty', Interactive Learning Environments.

Neuhauser, C 2010, 'Learning style and effectiveness of online and face-to-face instruction', The American Journal of Distance Education, vol. 16, no. 2, pp. 99-113.