Exploring students' interpretation of feedback delivered through learning analytics dashboards
Linda Corrin, Paula de Barba
Centre for the Study of Higher Education, University of Melbourne

The delivery of feedback to students through learning analytics dashboards is becoming more common in higher education. However, it is not clear what ability students have to interpret this feedback in ways that will benefit their learning. This paper presents the preliminary results of a mixed methods study into students' interpretation of feedback delivered through learning analytics dashboards and the influence this feedback has on students' self-regulated learning. The findings from a preliminary analysis of the data from the first two phases are discussed and the future phases of the research outlined. The outcomes of this research provide new insights into how dashboards can be designed to provide effective feedback in blended learning environments.

Keywords: learning analytics, feedback, self-regulated learning, motivation

Introduction

The emerging field of learning analytics involves the analysis of data about learners and their activities to inform the enhancement of teaching and learning practices and environments (Long & Siemens, 2011). Initial learning analytics research has primarily focused on providing data to academic staff on student engagement and performance, most commonly for the purpose of student retention (Bichsel, 2012; Campbell & Oblinger, 2007). Recently this focus has expanded to include ways to deliver feedback directly to students (Verbert, Duval, Klerkx, Govaerts & Santos, 2013); however, there is still a gap in our understanding of the usefulness of learning analytics feedback from the student perspective (Wilson, 2012). The value of feedback in student learning has long been acknowledged (Black & Wiliam, 1998; Hattie & Timperley, 2007).
Feedback is a key element in students' self-regulation of learning as it enables students to monitor their progress towards their learning goals and to adjust their strategies to attain these goals (Butler & Winne, 1995). One way that feedback can be delivered to students using learning analytics is in the form of a dashboard, through which multiple sources of data can be visualised in a consolidated view. Feedback delivered through learning analytics dashboards can provide students with data on their performance as well as their engagement with learning activities and assessment. The exact content and design of these dashboards will vary depending on the learning design of the subject. Currently there is very limited research into the effectiveness of providing feedback to students in this format. A study of the Course Signals system at Purdue University found that students who had access to a learning analytics dashboard saw an improvement in grades and demonstrated better retention behaviour (Arnold & Pistilli, 2012). However, this study did not consider in detail how students interpreted and responded to the feedback, or its impact on their learning strategies and motivation. Some researchers have questioned the ability of feedback delivered via dashboards to provide useful information to learners (Elias, 2011). Such representations of student activity are often incomplete because not all aspects of the learning process can be captured by these means. There is also concern about the heavy reliance on quantitative representations of student activity in dashboards. Research on effective feedback to support self-regulated learning suggests that feedback needs to deliver high quality information to students that encourages dialogue with teachers and peers around learning (Nicol & Macfarlane-Dick, 2006). The extent to which students are able to use dashboards to facilitate such dialogue is an area that requires further exploration.
Method

The aim of this project is to develop a greater understanding of how students interpret and act upon feedback delivered via learning analytics dashboards. The research was guided by the following research questions:

1. How do students interpret feedback delivered through learning analytics dashboards?
2. What actions do students take in response to dashboard feedback?
3. How does access to dashboard feedback influence students' motivation and performance in their course?
The study was conducted in the first semester of the 2014 academic year. A multi-phase mixed methods design was used, incorporating survey and interview methods. Participants were recruited from three large undergraduate subjects at the University of Melbourne: a first-year Biology subject, a first-year Environments subject, and a second-year Japanese subject. Each of these subjects used a blended learning approach with a combination of online activities and face-to-face lectures, tutorials, labs and/or workshops. The curriculum designs of these subjects were analysed in order to inform the design of the content and layout of the feedback dashboards (see Figure 1), which were slightly different for each subject. The data used to populate the dashboards was extracted from the learning management system (LMS) and represented students' assessment data (both formative and summative) as well as their engagement with the LMS site in terms of frequency of access. Where possible, students were presented with their own data as well as a class average.

Figure 1: Example of the student dashboard for a Biology student

At the beginning of the semester, participants were asked to complete a survey to establish their motivations relating to the subject as well as their personal learning goals. The design of the survey was influenced by the Motivated Strategies for Learning Questionnaire (Pintrich, Smith, Garcia & McKeachie, 1991). In week seven of the semester the participants took part in an interview during which they were presented with a dashboard of their engagement and performance data. Using the think-aloud interview method, the participants were asked to talk about how they interpreted this feedback and to articulate the actions that they would take in response. The next section of this paper presents a discussion of the broad findings from an initial analysis of the survey and interview data.
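The dashboard-population step described above can be sketched in a few lines of code. This is a minimal illustration of the general idea (each student's own figures displayed alongside a class average), not the authors' implementation; the field names (student_id, quiz_score, logins) and the sample values are assumptions for the sake of the example.

```python
# Minimal sketch of assembling per-student dashboard records from a
# hypothetical LMS export. Field names and values are illustrative only.
from statistics import mean

lms_export = [
    {"student_id": "s1", "quiz_score": 7.5, "logins": 24},
    {"student_id": "s2", "quiz_score": 6.0, "logins": 11},
    {"student_id": "s3", "quiz_score": 8.5, "logins": 30},
]

# Class-level aggregates computed once over the whole export.
avg_score = mean(r["quiz_score"] for r in lms_export)
avg_logins = mean(r["logins"] for r in lms_export)

def dashboard_record(student_id):
    """Return one student's own data paired with the class average."""
    own = next(r for r in lms_export if r["student_id"] == student_id)
    return {
        "own_score": own["quiz_score"],
        "class_avg_score": round(avg_score, 2),
        "own_logins": own["logins"],
        "class_avg_logins": round(avg_logins, 2),
    }
```

A real pipeline would read the LMS gradebook and activity-log exports rather than an in-memory list, but the pairing of individual and aggregate values is the essential structure behind the dashboards described here.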
Results and discussion

The sample for this study included a total of 28 participants recruited from the Biology (n=14), Environments (n=4), and Japanese (n=10) subjects. As the University's degree structure allows students to undertake a variety of core and breadth subjects as part of their degree, the sample included students from across six undergraduate degrees: Science (57.1%), Environments (17.9%), Biomedicine (10.7%), Arts (7.1%), Commerce (3.6%) and Music (3.6%). The majority of participants (71.4%) were in their first year of study. Five major themes were identified from a preliminary analysis of the first interview. The first theme was the impact of the dashboard on participants' reflection on their learning. This relates to the reflection phase of self-regulated learning, which involves cognitive judgments, affective reactions, choice behaviour, and evaluation of tasks and the context itself (Pintrich, 2004). When reflecting on their dashboard feedback, one Biology student demonstrated both cognitive judgment and choice behaviour:
I don't work well reading things and memorizing them, I work better with a lecturer telling me, and writing it down, that connection works better for me... I don't personally find the [online quizzes] super helpful, I find the lectures and my tutorials more helpful. I kind of breeze through the [online quizzes].

A Japanese student said of their evaluation of tasks:

I don't see a lot into the lecture quizzes, cause I don't feel they are that important. Often they seem a bit too simple... But with the supplementary quizzes, they were pretty good.

Contrary to previous research that reported academics' concerns that students would not be able to interpret the feedback received (Corrin, Kennedy & Mulder, 2013), it was found that the majority of students were able to interpret the data in a way that promoted reflection on their performance and engagement. The second theme related to students' ability to plan new or amended study strategies for the subject. This relates to the self-regulated learning phase of planning (Pintrich, 2004), which involves setting goals, time management, effort regulation, and activating motivational constructs such as value beliefs and interest. The level of detail students gave about their intended actions varied from general statements, such as "I will work harder in general" (Japanese student), to very specific actions, such as "I'm gonna revise [practicals] at least half an hour every night" (Biology student). Other common actions were to prepare for assessments earlier, complete activities they had missed, and revisit assessments with unsatisfactory performance. Several participants said they intended to use the LMS more. However, when asked why they would do this, they were not able to explain how this action would improve their learning.
For example, one Environments student who was able to explain why they didn't access the LMS a lot: "I write down notes, like assessments and deadlines, so I don't really refer back to [the LMS] as often", still felt that accessing the LMS more often would help improve their performance in the subject. One Biology participant mentioned they did not know what actions to take in response to the feedback, so wanted to seek help from the subject tutors. A small number of participants said they would not implement any actions, as they were satisfied with their performance. The third theme related to how the dashboard affected participants' motivation towards the subject. The majority of participants reported an increase in their motivation towards the subject after seeing the dashboard feedback. These statements were usually also related to other parts of self-regulation. For example, when asked how the feedback affected their motivation, students mentioned motivation associated with effort regulation, such as "I'll definitely try harder" (Japanese student) and "It makes me want to do more" (Biology student), or awareness of progress: "It improves motivation, cause I can see where I am at and I know how I'm going so far" (Biology student). However, a small proportion of students reported that the dashboard feedback did not affect their motivation. These students tended to be above the class average, or were currently meeting their own performance expectations for the subject. As one Japanese student stated: "If I were doing worse I would have more motivation." The fourth theme related to the inclusion of a class average for assessments, online quizzes and access to the LMS site. Throughout the interviews it was clear that this information had a significant impact on students' view of their performance and engagement.
When shown the dashboard for the first time, the ability to compare themselves with their peers was the first thing many students commented upon, with statements like "So the first thing I notice is I'm below the class average" (Japanese student) and "First off, the class average is helpful" (Biology student). However, it was also apparent that, for many students, the comparison of their data to that of their peers was obscuring their view of progress towards their overall goal in the subject. For example, although several students indicated that they were aiming for an H1 (the highest grade possible), they were satisfied when they saw that they were performing slightly above the class average. This was despite the fact that performing just above the class average might not result in a total mark for the subject that would qualify for an H1 grade. The fifth theme that emerged from the interviews was the benefit students identified in being able to see all the assessment and online learning activities in one consolidated view. Although students had access to all the elements of the feedback in other forms (with the exception of their LMS access statistics), students commented on the usefulness of being able to view different tasks in comparison to each other. Currently students have access to a My Grades tool within the LMS (Blackboard), which presents a summary of marks for quizzes and assessments, but this representation is text based, displaying the grades only as numbers. All the participants indicated that they liked the visual representation of the feedback. Referring to the use of graphs throughout the dashboard, one Biology student commented: "if you gave me a comparison of a [numeric] mark versus class average then I wouldn't see the difference as vividly as I do here." The design of the dashboards incorporated a space for all activities across the semester.
This meant that, in addition to the results and frequencies for the activities they had already completed, students could also see a blank space representing what was ahead and/or what they had missed. This layout helped several students to identify assessment items
that they didn't know existed or had forgotten to attempt within the set deadlines. For example, some Biology students said they were unaware of the existence of the supplementary activities designed to provide an extra introduction to general science principles. Once they discovered these formative activities, a number of students expressed an intention to complete them as part of their revision for the final exam. These findings suggest that the design of dashboards can act as a form of study guide for students in planning their engagement with tasks. The challenge for teachers in providing such a guide is to design a dashboard that incorporates as many activities and assessments as possible without compromising clarity and design. This approach also requires that the data to support these items is available in the LMS, which is not always possible for the complete range of learning activities.

Conclusion and future research

The initial findings of this work-in-progress study have provided important insights into how students interpret and plan actions in response to feedback delivered through learning analytics dashboards. Contrary to concerns expressed in the literature about whether students have the ability to interpret this kind of feedback (Corrin, Kennedy & Mulder, 2013; Elias, 2011), the majority of students in this study demonstrated a strong ability to reflect and plan. The ability to view their feedback in this format was found to have an impact on students' motivation towards the subject and helped to guide them in their progress and performance in learning activities and assessments. However, the extent to which the actions they identified from the initial viewing of the dashboard feedback will translate into impact on their engagement and performance is yet to be seen. This aspect of impact is what Verbert et al. (2013) observe is often missing from studies of learning analytics dashboards.
This paper presents the preliminary findings of the first two stages of the study. In addition to the initial survey and first interview, the students will take part in a second interview and a final survey at the end of the semester. The second interview is designed to follow up on the actions that students said they would take in response to the feedback given in the first interview. An updated dashboard will also be presented to the students so they can see the impact these activities had on their progress, and so they can identify further actions to take in the lead-up to final exams. At the end of the semester, after students have received their final results for the subject, they will be asked to fill in a final survey asking them to reflect on the role of the dashboard feedback in their approach to study throughout the whole subject. The outcomes of the full study will contribute to a greater understanding of students' interpretation of feedback and the impact that this has on their self-regulated learning, motivation and goals. In practical terms, the outcomes of this research will inform more effective design of dashboards to provide feedback in a format that can be most beneficial to student learning.

References

Arnold, K. E. & Pistilli, M. D. (2012). Course Signals at Purdue: Using learning analytics to increase student success. Proceedings of the 2nd International Conference on Learning Analytics & Knowledge. New York: ACM.

Bichsel, J. (2012). Analytics in Higher Education: Benefits, Barriers, Progress, and Recommendations (Research Report). Louisville, CO: EDUCAUSE Center for Applied Research.

Black, P. & Wiliam, D. (1998). Assessment and Classroom Learning. Assessment in Education, 5(1).

Butler, D. L. & Winne, P. H. (1995). Feedback and Self-Regulated Learning: A Theoretical Synthesis. Review of Educational Research, 65(3).

Campbell, J. P. & Oblinger, D. G. (2007). Academic analytics. Washington, DC: EDUCAUSE Center for Applied Research.
Corrin, L., Kennedy, G., & Mulder, R. (2013). Enhancing learning analytics by understanding the needs of teachers. In H. Carter, M. Gosper & J. Hedberg (Eds.), Electric Dreams. Proceedings ascilite 2013 Sydney.

Elias, T. (2011). Learning Analytics: Definitions, Processes, and Potential.

Hattie, J. & Timperley, H. (2007). The Power of Feedback. Review of Educational Research, 77(1).

Long, P., & Siemens, G. (2011). Penetrating the fog: analytics in learning and education. EDUCAUSE Review, 46(5).

Nicol, D. J. & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Studies in Higher Education, 31(2).

Pintrich, P. R., Smith, D., Garcia, T., & McKeachie, W. (1991). A manual for the use of the Motivated Strategies for Learning Questionnaire (MSLQ). Ann Arbor, MI: University of Michigan, National Center for Research to Improve Postsecondary Teaching and Learning.
Pintrich, P. R. (2004). A conceptual framework for assessing motivation and self-regulated learning in college students. Educational Psychology Review, 16(4).

Verbert, K., Duval, E., Klerkx, J., Govaerts, S., & Santos, J. L. (2013). Learning Analytics Dashboard Applications. American Behavioral Scientist. Advance online publication.

Wilson, A. (2012). Student engagement and the role of feedback in learning. Journal of Pedagogic Development, 2(1).

Contact author: Linda Corrin, [email protected]

Please cite as: Corrin, L., & de Barba, P. (2014). Exploring students' interpretation of feedback delivered through learning analytics dashboards. In B. Hegarty, J. McDonald, & S.-K. Loke (Eds.), Rhetoric and Reality: Critical perspectives on educational technology. Proceedings ascilite Dunedin 2014.

Note: All published papers are refereed, having undergone a double-blind peer-review process. The author(s) assign a Creative Commons by attribution 3.0 licence enabling others to distribute, remix, tweak, and build upon their work, even commercially, as long as credit is given to the author(s) for the original creation.
Programme Specification and Curriculum Map for MA TESOL 1. Programme title MA TESOL 2. Awarding institution Middlesex University 3. Teaching institution Middlesex University 4. Programme accredited by
Converting a Face-to-Face Course to a Hybrid Course
Converting a Face-to-Face Course to a Hybrid Course A research paper by Stephanie Delaney For College Teaching EDAD 966 Introduction Hybrid courses blend the best of face-to-face classes with online learning.
The Impact of Online Quizzes on Student Engagement and Learning
The Impact of Online Quizzes on Student Engagement and Learning Dr. Jennifer Hillman Fall 2011-Spring 2012 Introduction PSYCH 270: Introduction to Abnormal Psychology is typically a large lecture class
Using Learning Analytics to Predict (and Improve) Student Success: A Faculty Perspective. Beth Dietz-Uhler & Janet E. Hurn Miami University
www.ncolr.org/jiol Volume 12, Number 1, Spring 2013 ISSN: 1541-4914 Using Learning Analytics to Predict (and Improve) Student Success: A Faculty Perspective Beth Dietz-Uhler & Janet E. Hurn Miami University
Auditing education courses using the TPACK framework as a preliminary step to enhancing ICTs
Auditing education courses using the TPACK framework as a preliminary step to enhancing ICTs Chris Campbell The University of Queensland Aspa Baroutsis The University of Queensland The Teaching Teachers
Demonstrating Understanding Rubrics and Scoring Guides
Demonstrating Understanding Rubrics and Scoring Guides Project-based learning demands a more progressive means of assessment where students can view learning as a process and use problem-solving strategies
Response Rates in Online Teaching Evaluation Systems
Response Rates in Online Teaching Evaluation Systems James A. Kulik Office of Evaluations and Examinations The University of Michigan July 30, 2009 (Revised October 6, 2009) 10/6/2009 1 How do you get
3. How can you improve your ability to understand and record information presented in lectures?
GETTING THE MOST OUT OF LECTURES Use this sheet to help you: prepare for and learn from lectures manage note-taking during lectures 5 minute self test 1. What are some of the purposes of lectures? 2. How
Formative Feedback for Teaching Assistants (TAs) at UVic
Formative Feedback for Teaching Assistants (TAs) at UVic Suggestions Regarding Implementing a Variety of Feedback Approaches that Support the Professional Development of TAs University-wide 1. Introduction
EXPLORING ATTITUDES AND ACHIEVEMENT OF WEB-BASED HOMEWORK IN DEVELOPMENTAL ALGEBRA
EXPLORING ATTITUDES AND ACHIEVEMENT OF WEB-BASED HOMEWORK IN DEVELOPMENTAL ALGEBRA Kwan Eu Leong Faculty of Education, University of Malaya, Malaysia [email protected] Nathan Alexander Columbia University
Niagara College of Applied Arts and Technology. Program Quality Assurance Process Audit. 18 Month Follow-up Report. Submitted by: Niagara College
Niagara College of Applied Arts and Technology Program Quality Assurance Process Audit 18 Month Follow-up Report Submitted by: Niagara College Report Submitted September 2013 INTRODUCTION Niagara College
Final Exam Performance. 50 OLI Accel Trad Control Trad All. Figure 1. Final exam performance of accelerated OLI-Statistics compared to traditional
IN SEARCH OF THE PERFECT BLEND BETWEEN AN INSTRUCTOR AND AN ONLINE COURSE FOR TEACHING INTRODUCTORY STATISTICS Marsha Lovett, Oded Meyer and Candace Thille Carnegie Mellon University, United States of
Bloom s Taxonomy. List the main characteristics of one of the main characters in a WANTED poster.
Bloom s Taxonomy Bloom s Taxonomy provides an important framework for teachers to use to focus on higher order thinking. By providing a hierarchy of levels, this taxonomy can assist teachers in designing
KENNEBEC VALLEY COMMUNITY COLLEGE FAIRFIELD, MAINE. Department of Social Sciences
KENNEBEC VALLEY COMMUNITY COLLEGE FAIRFIELD, MAINE Department of Social Sciences COURSE NUMBER: PSY101-OLA (Summer 2013) CREDIT HOURS: 3 COURSE TITLE: Introduction to Psychology CLOCK HOURS: 45 INSTRUCTOR:
Students Track Their Own Learning with SAS Data Notebook
Students Track Their Own Learning with SAS Data Notebook Lucy Kosturko SAS Institute, Inc. [email protected] Jennifer Sabourin SAS Institute, Inc. [email protected] Scott McQuiggan SAS Institute,
