Loop: A learning analytics tool to provide teachers with useful data visualisations
Linda Corrin, Aneesha Bakharia, David Williams, Gregor Kennedy, Lori Lockyer (Macquarie University), Shane Dawson, Paula de Barba, Dragan Gasevic (University of Edinburgh), Scott Copeland

One of the great promises of learning analytics is the ability of digital systems to generate meaningful data about students' learning interactions that can be returned to teachers. If provided in appropriate and timely ways, such data could be used by teachers to inform their current and future teaching practice. In this paper we showcase the learning analytics tool, Loop, which has been developed as part of an Australian Government Office for Learning and Teaching project. The project aimed to develop ways to deliver learning analytics data to academics in a meaningful way to support the enhancement of teaching and learning practice. In this paper elements of the tool will be described. The paper concludes with an outline of the next steps for the project, including the evaluation of the effectiveness of the tool.

Keywords: Learning Analytics, Higher Education, Learning Design, Data Visualisation.

Introduction

Learning analytics offers the potential to deliver data to teachers in ways that can inform the design and facilitation of learning activities. In order to realise this potential it is necessary to develop tools that allow student data to be analysed and visualised in ways that take into consideration the pedagogical design of learning tasks. This paper reports on the current progress of a project, funded by the Australian Office for Learning and Teaching (OLT), which explores what analytics would be useful for teachers in higher education to inform the development of a generic learning analytics tool. In previous papers we have outlined the overarching approach to the project and the preliminary findings from the first stage of the research (Kennedy et al., 2014).
In this paper we will showcase the design of the tool and outline the next steps that will be taken to evaluate its effectiveness within real teaching scenarios. Advances in learning analytics approaches and methods have resulted in calls for tools that can help academics access and interpret data about their students' engagement in online learning environments (Drachsler & Greller, 2012). Previous tools have provided different forms of visualisations of student data from learning management systems, including access to resources (Di Bitonto, Pesare, Roselli & Rossano, 2015), assessment (Petropoulou, Kasimatis, Dimopoulos & Retalis, 2014), and social interaction analysis (Schreurs, Teplovs, Ferguson, De Laat & Buckingham Shum, 2013). While these tools provide different ways of viewing data about students' engagement with online resources and activities, they do not facilitate an explicit connection with learning outcomes and pedagogical design. The Loop tool provides teachers with data on students' learning interactions and processes to inform teaching and feedback dialogues with students. A fundamental premise of the Loop tool is that the teacher's pedagogical intent, which underpins the design of a learning activity, drives the framing of the analytics that are presented for each technology-based tool. This means that the data associated with students' interactions with learning material can be used to determine the extent to which the pedagogical intent of the task has been reflected in students' interactions. These premises are underpinned by the field of learning design and Laurillard's (2002) conversational framework. The learning design field provides a context through which more meaningful interpretation of student
data can be made (Lockyer, Heathcote, & Dawson, 2013) by articulating the pedagogical intent of the learning activities to which student engagement data can be compared. The conversational framework highlights the importance of interaction, dialogue and feedback between teachers and students, which should take place in an iterative cycle. Analysis of student data provides an evidence base to inform such feedback and remediation processes between teachers and students.

The Loop Tool

The Loop tool is an open source analytics application, developed as part of the OLT project, that allows users to visualise student data from learning management systems in a meaningful way. Loop is made up of two components: a log processing component that includes a data warehouse, and a web application that includes dashboards and reports. The Loop tool is programmed in Python using the Django web application framework. The tool is able to process logs from both the Moodle and Blackboard learning management systems to produce a number of different data representations and visualisations. Importantly, the tool is designed to integrate the course structure (i.e., course hierarchy) and course schedule in its associated visualisations to allow teachers to evaluate the effectiveness of learning activities scheduled throughout the course. The word 'course' has been used within the tool to represent a single subject or unit of study. Loop allows teachers to define key learning tasks or events (e.g., exams or non-instruction weeks), which are incorporated into the data representations in the tool to support the interpretation of patterns of student activity. These critical events can be defined as either isolated (e.g., a mid-term exam) or recurrent (e.g., weekly lectures) across the course.
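As an illustration of the distinction between isolated and recurrent events, a critical event might be modelled as follows. This is a hypothetical sketch in the tool's own language (Python), not Loop's actual data model; all names and dates are assumptions:

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import List

@dataclass
class CriticalEvent:
    """A teacher-defined key learning event (hypothetical model)."""
    name: str
    start: date              # date of the event, or of its first occurrence
    recurrent: bool = False  # True for e.g. a weekly lecture
    weeks: int = 1           # number of weekly occurrences if recurrent

    def occurrences(self) -> List[date]:
        """Isolated events yield one date; recurrent events yield one per week."""
        if not self.recurrent:
            return [self.start]
        return [self.start + timedelta(weeks=i) for i in range(self.weeks)]

# An isolated event and a recurrent one (illustrative dates).
exam = CriticalEvent("Mid-term exam", date(2015, 9, 14))
lectures = CriticalEvent("Weekly lecture", date(2015, 8, 3),
                         recurrent=True, weeks=12)
```

Each occurrence date can then be drawn as a vertical line on a time-series plot, which is how the dashboard described below overlays events on activity data.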
After its trial and before its formal release, the tool will be supplemented with a pedagogical helper tool that will assist teachers in disaggregating the learning objectives and learning design of their course. It will be this element of the Loop tool that will assist teachers to articulate their pedagogic intent and identify metrics that relate to their design intentions. When used together, the two tools will help teachers see the degree to which students' interactions with learning tasks and assessments, as manifest in LMS data, correspond with the pedagogical intent envisaged by teachers. The three main sections of the Loop tool are: dashboard, course access, and students.

Dashboard

The course dashboard provides a quick overview of activity within the course. Figure 1 provides a screenshot of the dashboard with data from one course with which Loop is being piloted. This dashboard summary of the data can be viewed for a particular week or for the entire course. The first representation on the dashboard is Daily Page Views (Figure 1-A). This displays simple metrics of students' levels of activity over time for three key areas of a course: content, communication, and assessment. The vertical lines on the daily page views graph represent the dates of teacher-identified critical learning events in the course (Figure 1-B). This visual representation allows teachers to easily see how students' interactions with critical course materials are influenced (or unaffected) by critical events or key milestones within the course. A representation of week metrics (Figure 1-C) presents a digestible summary of key weekly activities in the course, including the number of unique page views, the number of unique students who were active, the number of student sessions conducted, the average number of sessions, and the average number of page views per session.
The dashboard also presents a list of the course material most visited by students over the time period (e.g., 'Top Accessed Content'; Figure 1-D).
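Week metrics of this kind can, in principle, be derived from a raw page-view log. The sketch below is illustrative only: Loop's actual computation lives in its log-processing component and data warehouse, and the 30-minute session timeout and all names here are assumptions, not details from the paper:

```python
from collections import defaultdict
from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)  # assumed inactivity cut-off

def week_metrics(views):
    """Summarise one week of (student_id, page_id, timestamp) page views."""
    students = {s for s, _, _ in views}
    # Group timestamps per student, then sessionise: a new session
    # starts whenever the gap between consecutive views exceeds the timeout.
    by_student = defaultdict(list)
    for s, _, t in views:
        by_student[s].append(t)
    sessions = 0
    for times in by_student.values():
        times.sort()
        sessions += 1 + sum(
            1 for a, b in zip(times, times[1:]) if b - a > SESSION_TIMEOUT
        )
    return {
        "page_views": len(views),
        "active_students": len(students),
        "sessions": sessions,
        "avg_sessions_per_student": sessions / max(len(students), 1),
        "avg_views_per_session": len(views) / max(sessions, 1),
    }

# Illustrative log: student s1 has two sessions (a long gap splits them).
views = [
    ("s1", "p1", datetime(2015, 8, 3, 9, 0)),
    ("s1", "p2", datetime(2015, 8, 3, 9, 10)),
    ("s1", "p1", datetime(2015, 8, 3, 11, 0)),
    ("s2", "p1", datetime(2015, 8, 4, 9, 0)),
]
metrics = week_metrics(views)
```

Here a session is taken to be a run of views by one student with no gap longer than the timeout; other sessionisation rules would, of course, yield different counts.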
Figure 1: The Loop tool dashboard for week 1 of a course. A = Daily page views. B = Critical learning events. C = Week metrics. D = Top accessed content.

Course Access

The course access section provides teachers with the opportunity to explore, in more detail, students' access and interaction with the course. The three-fold classification of content, communication, and assessment is again used to structure teachers' exploration and investigation of the learning analytics data. Each of these sub-sections is described below.

Content

The content section displays all students' interactions with the content material in the course. Teachers can elect to see the total number of student views for each item of content material within the course structure, per week and aggregated across the whole course. Teachers can also elect to see unique student views, which shows how many unique students viewed each item of content material (either per week or in total aggregate). From either of these two sets of data teachers can access further details on any specific item of content. Figure 2 shows this more detailed view for a course content item, and shows how the level of student access can be visualised in relation to key learning events over a three-month period. Teachers also have the option with this detailed view to see a histogram of access frequencies, and a list of students who have not yet accessed the particular item of content.

Figure 2: A three-month visualisation of access to a course content item aligned with key learning events

Communication

The communication section contains visualisations of students' interactions with discussion forums across the course. In this section it is possible to see the number of views and posts for each of the forums available in the discussion board, both per week and in total. The number of unique students
viewing each forum is also available. The numbers of views for each forum included in the discussion board, in relation to a selected learning event date, are presented visually (see Figure 3). Figure 3 shows students' access to discussion forums relative to the specific event of the weekly lecture. Teachers may, for example, be interested in whether students are actively engaging in or viewing discussion posts on a specific topic before or after a particular lecture or class. The visualisation shows how many discussion views have occurred (indicated by the size of the circle), as well as the proportion of discussion forum views that occurred before the learning event (blue circle segments) or after the learning event (red circle segments).

Figure 3: A visualisation of student access to discussion forums in relation to course events

Assessment

The assessment section displays data related to students' interaction with each assessment available across the course. The number of attempts for each assessment, per week and in total, is presented in a primary table. Teachers also have the ability to view the same data filtered by access by unique students. A grade table presents students' grades for each assessment within the course. An event table presents students' interaction with each piece of assessment in relation to a defined learning event, using the same visual representation techniques (i.e., size and colour) as those used in the content and communication sections.

Students

The students section of the tool provides the ability to monitor the activity, communication and progress of individual students. An aggregated table presents an overall count of activity for each student. When an individual student is selected, specific data related to the student's interaction with content, communication and assessment can be viewed.
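The before/after proportions visualised in Figure 3, and reused in the assessment event table, amount to partitioning timestamped views around an event date. A minimal sketch, with hypothetical names rather than Loop's actual code:

```python
from datetime import datetime

def split_around_event(view_times, event_time):
    """Partition view timestamps into before/after an event.

    Returns (total, share_before, share_after): the total drives the
    circle size; the two shares drive the blue and red segments.
    """
    total = len(view_times)
    if total == 0:
        return 0, 0.0, 0.0
    before = sum(1 for t in view_times if t < event_time)
    return total, before / total, (total - before) / total

# Illustrative data: one view before a lecture slot, two after.
lecture = datetime(2015, 8, 10, 14, 0)
views = [datetime(2015, 8, 10, 9, 0), datetime(2015, 8, 10, 16, 0),
         datetime(2015, 8, 11, 10, 0)]
total, frac_before, frac_after = split_around_event(views, lecture)
```

Views falling exactly on the event time are counted as "after" here; that boundary choice is an assumption.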
This section can be used by teachers who are interested in investigating the engagement patterns of students, potentially to identify those who are struggling with the course. For example, the teacher can check whether a student accessed the content materials that relate to a poor assessment attempt by that student, or whether a student is engaging with peers on discussion forums after they have been set a topic or problem to investigate.

Next steps

The Loop tool will now be piloted in four courses across the three institutions participating in this project: a biomedical course and a pharmacology course at the, an education course at Macquarie University, and an accounting course at the. These pilot studies will take place in the second semester of the 2015 academic year. Teaching staff involved in each pilot will take part in an interview at the beginning of semester, during which they will be introduced to the functions of the tool and asked how they expect to use it to support their teaching practice. Throughout the semester, participants will be asked to complete a short diary entry each time they access the tool, to record the purpose of their use and any outcomes and/or actions that result from viewing the data. At the end of semester they will be interviewed again and asked to reflect on their use of the tool, the results of any educational interventions made, and the usability of the tool. The findings of this data collection process will inform any changes to the tool
before it is released as an open source tool for other institutions to use. In addition, the pedagogical helper tool will be developed, which will assist teachers in articulating the pedagogic intent of their course and its activities, based on its learning objectives and learning design.

Conclusion

The Loop tool represents a step forward in providing learning analytics that are able to support and inform learning design. The tool is unique in that it has been designed to provide dashboards, reports and visualisations that incorporate course hierarchy (i.e., content tree structure), course schedule (i.e., reporting for each week in the semester) and key course events (i.e., recurring events such as lectures, and single events such as assessment submission dates). This structure, manifest in Loop's design and interface, is essential to give teachers an understanding of how students' learning behaviour is temporally related to the pedagogical structure of their course, and the specific learning activities within it.

References

Kennedy, G., Corrin, L., Lockyer, L., Dawson, S., Williams, D., Mulder, R., Khamis, S., & Copeland, S. (2014). Completing the loop: Returning learning analytics to teachers. In B. Hegarty, J. McDonald, & S.-K. Loke (Eds.), Rhetoric and Reality: Critical perspectives on educational technology. Proceedings ascilite Dunedin 2014.

Di Bitonto, P., Pesare, E., Roselli, T., & Rossano, V. (2015). Digitally enhanced assessment in virtual learning environments. In Proceedings of the 21st International Conference on Distributed Multimedia Systems, Vancouver, Canada.

Drachsler, H., & Greller, W. (2012). Confidence in learning analytics. In Proceedings of the Second International Conference on Learning Analytics and Knowledge. New York, NY: ACM.

Laurillard, D. (2002). Rethinking University Teaching: A conversational framework for the effective use of learning technologies. London: Routledge.

Lockyer, L., Heathcote, E., & Dawson, S. (2013).
Informing pedagogical action: Aligning learning analytics with learning design. American Behavioral Scientist, 57(10).

Petropoulou, O., Kasimatis, K., Dimopoulos, I., & Retalis, S. (2014). LAe-R: A new learning analytics tool in Moodle for assessing students' performance. Bulletin of the IEEE Technical Committee on Learning Technology, 16(1).

Schreurs, B., Teplovs, C., Ferguson, R., De Laat, M., & Buckingham Shum, S. (2013). Visualizing social learning ties by type and topic: Rationale and concept demonstrator. In Proceedings of the Third International Conference on Learning Analytics and Knowledge. ACM.

Corrin, L., Kennedy, G., de Barba, P., Bakharia, A., Lockyer, L., Gasevic, D., Williams, D., Dawson, S., & Copeland, S. (2015). Loop: A learning analytics tool to provide teachers with useful data visualisations. In T. Reiners, B.R. von Konsky, D. Gibson, V. Chang, L. Irving, & K. Clarke (Eds.), Globally connected, digitally enabled. Proceedings ascilite 2015 in Perth (pp. CP:57-CP:61).

Note: All published papers are refereed, having undergone a double-blind peer-review process. The author(s) assign a Creative Commons by attribution licence enabling others to distribute, remix, tweak, and build upon their work, even commercially, as long as credit is given to the author(s) for the original creation.