Learning Analytics Interoperability: The Big Picture in Brief
Learning Analytics Community Exchange

Learning Analytics Interoperability: The Big Picture in Brief
An introductory briefing

By: Adam Cooper, Cetis, University of Bolton, UK
Published: 26 March 2014
Keywords: learning analytics, interoperability, benefits

Learning Analytics is now moving from being a research interest to a wider community who seek to apply it in practice. As this happens, the challenge of efficiently and reliably moving data between systems becomes of vital practical importance. System interoperability reduces this challenge. This introductory briefing, which is aimed at non-technical readers who are concerned with developing plans for sustainable practical learning analytics, describes some of the motivations for better interoperability and outlines the range of situations in which standards or other technical specifications can help to realise these benefits.
Contents

Introduction
What is Interoperability?
What are the potential benefits?
The Scope of Learning Analytics Interoperability
Conclusion
Further Reading: Technical Details
About
Introduction

Learning Analytics is now moving from being a research interest to a wider community who seek to apply it in practice. As this happens, the challenge of efficiently and reliably moving data between systems becomes of vital practical importance. Whereas traditional forms of analytical processing rely on existing management data, such as student demographics, grades, and recruitment figures, more recent approaches to analytics rely on data that has greater variety and arises from traces left as people use IT systems. This is a central concern for learning analytics, where the data arises from normal use of multiple pieces of software designed for accessing learning resources, social interaction, content creation, etc.

In many cases, therefore, practical learning analytics requires that data moves from operational to analytical systems and is put to a different use than originally intended. For example, the data structures in a VLE or LMS are likely to have been designed not for analytics, but to realise teaching and learning use cases - e.g. accessing video content, or participating in forums - in a way that is technically scalable and maintainable. When statistical processing or data mining is undertaken, for example to support analysis of learner engagement, the data has to be re-interpreted. This situation is further complicated by the need to combine data from various sources, or to use cloud-based data mining engines, to build, test, and apply useful statistical and predictive models.

This introductory briefing, which is aimed at non-technical readers who are concerned with developing plans for sustainable practical learning analytics, describes some of the motivations for better interoperability and outlines the scope for interoperability between learning analytics systems.

What is Interoperability?
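The re-interpretation described above can be sketched in a few lines of Python. This is a minimal, hypothetical example - the event fields and action names are invented for illustration, not taken from any particular VLE or LMS - showing how an operational event log might be reduced to a crude per-learner engagement count.

```python
from collections import Counter

# Hypothetical raw event-log rows, as they might be exported from a VLE/LMS.
# The field names and action labels here are illustrative assumptions.
events = [
    {"user": "s001", "action": "view_resource", "course": "BIO101"},
    {"user": "s001", "action": "post_forum",    "course": "BIO101"},
    {"user": "s002", "action": "view_resource", "course": "BIO101"},
    {"user": "s001", "action": "view_resource", "course": "BIO101"},
]

# Re-interpret operational data for an analytical purpose: a simple
# per-learner count of recorded actions, ignoring the teaching-and-learning
# use cases the log was originally structured around.
engagement = Counter(e["user"] for e in events)

print(engagement)  # Counter({'s001': 3, 's002': 1})
```

Even this toy transformation illustrates the point: the analytical output (an engagement measure) is a different artefact, with different structure and meaning, from the operational log it was derived from.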
The Institute of Electrical and Electronics Engineers defines interoperability to be: "the ability of two or more systems or components to exchange information and to use the information that has been exchanged." A broad interpretation of systems - one that includes people and the activities they undertake using these digital technologies - captures the true essence of interoperability as a means to achieve human aims and objectives.

When we have interoperability, things just work. When we do not, digital technologies are frustrating or arduous to use: we see errors, warning messages, and crashes; we are faced with data we do not understand, that is corrupted, or missing; we have to know tricks to work around the technology; we have to use certain software for a given task when we would prefer an alternative.

The most difficult challenges in achieving interoperability are typically found in establishing common meanings for the data. Sometimes this is a matter of technical precision, but culture - regional, sector-specific, and institutional - and habitual practices also affect meaning. Meaning is found in what people do and think, and it is often not formally expressed but instead captured in the way software is written or used. An important consequence of the difficulty posed by common meaning is that, except in very constrained settings, useful levels of interoperability are often only found after periods of investigation, negotiation, experience, and refinement.
What are the potential benefits?

In addition to our experience, as consumers, of IT "just working", interoperability affords a number of other kinds of benefit with much wider impact in an institutional or business setting. In any given real-world setting, we observe that one or two of these benefit types tend to be more prominent than the others over the lifetime of a system or data. Some of them are general benefits, for example arising from enabling modular or service-oriented approaches, while others have particular relevance to learning analytics. Analytics interoperability benefits include:

- Efficiency and timeliness are the most obvious benefits when considering business use of IT and IT systems operation. Interoperability means that systems work without the need for a person to intervene to re-enter, re-format, or transform data. Information flows become more efficient, and the delivery of information to the point of use enables more timely action. Data quality may also be improved in a suitably designed and managed system, by reducing the risk of mistakes and unpredictable degradation of data quality.

- Independence from consequential disruption: resilience. Although it may be possible to access data directly from the databases that underpin most software, the supplier is unlikely to guarantee that there will be no changes as they release new versions. An interface that is designed to be interoperable - conforming to specified data structures and access methods, and de-coupled from the operational database schema - should be a much more dependable data source.

- Adaptability, both of IT architecture and educational practices, can arise when interoperability is combined with a modular approach; it is faster, cheaper, and less disruptive to change things as needs change. This is particularly relevant for applications, such as learning analytics, where the methods are not yet mature, the institutions that are adopting learning analytics are undergoing rapid change, and the people who are the ultimate users do not yet fully understand the potential. The antithesis of adaptability is lock-in to a black-box integrated system, where the cost and disruption of change inhibit transition to a more fit-for-purpose system. In the absence of change, teachers and learners must endure unsatisfactory software, and the potential of learning analytics remains unrealised.

- Innovation and market growth is closely related to adaptability, as its supply-side counterpart. Interoperability combined with modularity makes it easier to build IT systems that are better matched to local culture without needing to create and maintain numerous whole systems. The market becomes more attractive to new entrants - hence innovation is increased - by reducing the cost of development and because consumer organisations are better able to undertake incremental change. Interoperability enables the development and viability of a category of learning analytics tools that can work across diverse local system choices.

- Durability of data has generally been the concern of archive managers and neglected by others. Historical data is valuable for learning analytics, but simply preserving it as it is created in operational systems is often unsatisfactory because structures and formats change over time, and the changes are rarely properly documented. This is likely to apply to the raw data generated by operational systems, and also to the results of analysis, whether these live in spreadsheets or more structured analytical systems. Archiving data using a stable, documented, and well-defined information model and technical representation improves the chances that historical data and analytical results will be decipherable in the future.

- Aggregation of data from multiple sources, which may be both internal and external to an organisation, is a key driver of value in analytics. Interoperability eases aggregation most immediately through easing access to the data, but the real advantages arise from merging or joining datasets, which implies the use of common definitions across several IT systems, even if they do not interact in normal operations. Data joining might be supported by a common set of definitions around course structure, combined with a unified identification scheme. Merging data from blogs, forums, Twitter, wikis, etc. to analyse patterns of contribution and consumption would require a common way of expressing social media contributions that is independent of the kind of platform.

- Sharing data, especially when there are multiple parties involved, is more tractable if a common language that does not favour any single organisation can be agreed upon. For many organisations, the previously-mentioned benefits are likely to be far more important than the possibility of sharing, especially of original data; legal and ethical concerns are likely to be serious obstacles, except maybe between an educational technology platform provider and their clients, or within the regulated confines of educational research. The sharing of derived or aggregated data is, however, more tractable, and could be useful in benchmarking exercises, or in the collective development of predictive models that require the largest possible sample size.
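As a minimal sketch of why a unified identification scheme eases data joining, the following Python example merges two hypothetical datasets from separate systems on a shared student identifier. All field names and values are invented for illustration; they are not drawn from any real system or specification.

```python
# Two hypothetical datasets from separate IT systems that happen to share
# a common student identifier scheme. Field names are illustrative only.
vle_activity = [
    {"student_id": "s001", "forum_posts": 12},
    {"student_id": "s002", "forum_posts": 3},
]
student_records = [
    {"student_id": "s001", "programme": "Biology"},
    {"student_id": "s002", "programme": "History"},
]

# With a unified identification scheme, joining is a simple lookup.
# Without one, error-prone fuzzy matching and manual cleaning are needed.
records_by_id = {r["student_id"]: r for r in student_records}
joined = [
    {**activity, **records_by_id[activity["student_id"]]}
    for activity in vle_activity
]

print(joined[0])
# {'student_id': 's001', 'forum_posts': 12, 'programme': 'Biology'}
```

The technical join itself is trivial; the interoperability work lies in the prior agreement that both systems use the same identifiers with the same meaning.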
The Scope of Learning Analytics Interoperability

The following diagram illustrates some of the different roles for learning analytics interoperability, broken down into three categories. Rather than focussing only on data about learners, it considers a bigger picture that includes interoperability support for the analytical process, in addition to the spectrum of data that might be useful. The implication of the big picture is that the opportunity to realise benefits may lie in many places, not that realising benefits requires implementation of the whole.
We will now look at some examples where interoperability standards or other specifications exist, to illustrate the above diagram. The intention here is not to provide a systematic or value-based summary, but to show the variety of approaches to interoperability. Web links to most of the technologies mentioned are given in the section "Further Reading: Technical Details" towards the end of this document. Particular attention is given to interoperability that is based on open, rather than proprietary, standards and specifications. This is because open standards [1] maximise the benefits mentioned previously.

Interoperability of models and methods is the aim of PMML, the Predictive Model Markup Language, a mature XML-based specification from the Data Mining Group. Although its emphasis is on predictive methods such as decision trees and logistic regression, it can also be used to convey the results of more common statistical tests. It is supported by a range of software, both open source (e.g. R and KNIME, which offer programming and visual environments respectively) and closed source, as well as by providers of cloud computing data mining services (e.g. Zementis).

While PMML does deal with analytical results, a number of international agencies concerned with statistics collection and publication - including the European Central Bank, Eurostat, and UNESCO - have developed SDMX, the Statistical Data and Metadata eXchange standard, and a number of freely-available software tools. SDMX is typically used for large-scale demographic, social, economic, and environmental statistics and is capable of capturing a lot of information about the data (i.e. metadata), such as the origin, reporting periods, etc. SDMX is an ISO standard, and the W3C Data Cube Vocabulary is a closely-related specification. Both SDMX and Data Cube are very complicated, and are not likely to be useful in their entirety for learning analytics applications any time soon.
On the other hand, they do provide a resource for the definition of simplified solutions to similar problems. Basing a simple model on SDMX concepts, rather than inventing a new approach, allows for future growth in scope and for serendipitous interoperability as what were initially distinct cases grow and merge.

A rather different strategy for applying interoperability to the transfer of numerical results is to move the problem from the numbers to their presentation. Widget and portlet approaches allow visualisations and other elements to be composed into a dashboard, provided as cross-platform apps, or embedded in teaching and learning software. This is the realm of web standards such as OAuth, HTML5, and the W3C Packaged Web Apps (Widgets) specification. As such, they are not specific to analytics interoperability, but they may be a practical, and possibly interim, means of improving on closed-environment analytics software while avoiding the complexity of SDMX, for example. These techniques are well understood in the educational technology domain. For example, IMS LTI (Learning Tools Interoperability) builds on top of web standards and represents an easy, and quite widely-supported, way to include context-sensitive analytics tools in LMS/VLE systems.

When it comes to data for analysis, there are numerous existing specifications and standards that contribute elements of interoperability. Although learning analytics will often rely on data that is specific to the education and training domain, for example capturing the idea of person roles, it will also often be possible to use generic data such as the web page access and interaction events used in web analytics. The scale of use of these generic techniques makes them candidates for adoption, or adaptation, for learning analytics because of the good supply of tools and expertise.

In the world of web analytics, Google Analytics is dominant. It is, of course, an entirely proprietary system but, since it relies on the interoperability of JavaScript in particular, its users realise a number of the benefits outlined in the previous section. Although few, if any, educational establishments would consider tracking learner-level activity with such a service, because of a duty of care with respect to personal data, it may be useful for analysis focussed on web sites and content use. The Open Source Software alternative, Piwik, gives its users more control over the captured data, but it is currently less fine-grained in what it can track and it is still essentially a proprietary approach to data collection and access.

Web analytics tools do not, however, cover all of the information that it would be useful to exchange: about people, their activities, the various contexts in which these take place, learning resources, and the relationships between them all. In addition, whereas specifications such as SDMX and PMML have arisen from many years of application, the same is not true for data for learning analytics. Even in those cases where an interoperability specification has been used for several years, the use has not usually been for analytics. There is still, in early 2014, a need to experiment, to gather evidence, and to undertake selective refinement and work on new prototype standards driven by analytical need.

[1] The term "open standards" is used with various meanings, ranging from access to the specification document and rights to use the technology on reasonable and non-discriminatory terms, through to a requirement that the standard should have been created in a transparent process open to all interested parties. The Open Stand Principles provide a good benchmark, since they are supported by the most important IT standards organisations, judged by impact.
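To make the gap between generic web analytics and domain-meaningful data concrete, the following Python sketch re-expresses a hypothetical page-view event as a "someone does an action to/with something" activity statement. The field names and the verb vocabulary are assumptions invented for illustration; they are not taken from any published specification.

```python
import json

# A hypothetical generic web-analytics event, roughly as a tracking script
# might record it. Field names are illustrative assumptions.
raw_event = {
    "url": "/courses/bio101/week3/video2",
    "session": "abc123",
    "timestamp": "2014-03-26T10:15:00Z",
}

# The same event re-expressed in an actor-verb-object pattern, so that its
# educational meaning (a learner viewed a learning resource) is explicit
# rather than implicit in a URL scheme. The vocabulary ("viewed",
# "LearningResource") is invented here, which is exactly the kind of
# definition communities of practice would need to agree on.
statement = {
    "actor": {"account": raw_event["session"]},
    "verb": "viewed",
    "object": {"type": "LearningResource", "id": raw_event["url"]},
    "timestamp": raw_event["timestamp"],
}

print(json.dumps(statement, indent=2))
```

The transformation is mechanically simple; the interoperability question is whether two systems would agree on the verb and object vocabulary, which is the consensus-building problem discussed in this briefing.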
Two existing interoperability specifications have been chosen to illustrate different strategies within the learning, education, and training domain; they take different approaches to the split between domain-specific and high-level concepts shown in the diagram. Both specifications relate closely to the centrally-important data about learner activity, and both have been conceived with learning analytics use cases in mind. The two specifications are the Experience API and IMS QTI.

The Experience API (usually abbreviated xAPI, and also called the Tin Can API by its creators) has had a wave of interest and implementation since its conception. The xAPI is based on the idea of tracking activity, and it builds on Activity Streams [2], which was developed to provide a better way of expressing social media activity than existed before. It provides a framework for making statements of the kind "someone does an action to/with something", for example "Geraldine posted a photo to her album". The xAPI has been used for social learning platforms, and increasingly it is exciting interest as a means of capturing data for learning analytics, although this is not yet well established in practice. The xAPI only defines the statement pattern and how statements are stored and retrieved; it does not specify, for example, what all the verbs are that might be used in practice. This is both an advantage and a difficulty. The advantage lies in being open to a wide range of uses and, since learning analytics is relatively new outside the research context, we should expect the range of uses to increase considerably over the next few years. The disadvantage is that adopters need to define these missing parts. This applies even within a single institutional adoption, since data from different systems will generally be combined, but for wider visions of interoperability it indicates that communities of shared practice and educational culture must get together to reach consensus definitions.

IMS QTI (Question and Test Interoperability) has a more than ten-year history of development and implementation. Most of the attention has been on QTI for the exchange and rendering of assessments and questions, but the creators of QTI included a number of established high-stakes assessment organisations that have been undertaking well-developed statistical analyses for many years. These analyses are usually referred to as psychometrics rather than learning analytics, but maybe the current interest in the topic will spur wider interest in these mature techniques, and bring more attention to the results reporting capabilities of IMS QTI. QTI is a lot more complete in what it defines than xAPI is. This is for good reason; the nature of questions, tests, and their results is sufficiently complicated that more ideas must be defined before a specification becomes useful. This makes it less flexible, but affords greater clarity about how QTI works. So, while the scores that learners might get on sections of an assessment, or even the order in which they attempt questions, could be sensibly captured with xAPI, possibly using terms defined in QTI, a more detailed analysis of interactions is likely to require access to data about the question item structure that would best be expressed in the purpose-built form of IMS QTI.

[2] The IMS Caliper project, which is currently (March 2014) in the chartering stage, adopts a similar approach to tracking activity, with a view to using the collected data for analytics.

Conclusion

A lot of the attention given to interoperability in the context of learning analytics has, so far, been given to the capture of learner activity. Among the drivers for this are wider trends in analytics and the diversity of IT used for education and training.
Some of the best-known examples of business analytics make particular use of the traces captured during consumer interactions with web sites and with social media. These motivate interest in the almost-magical acquisition of knowledge from apparently inconsequential data. The diversity in IT presents an old problem: it is difficult to analyse data that is spread about. Hence, people are attracted to using a single method of capturing data that originates in many places, for storage in a single location. Frequently, progress may be made by adopting common approaches to generic concepts, but useful learning analytics will usually require common approaches to domain-specific concepts, which require definition by communities of practice.

This short document has illustrated a wider range of places in which the benefits of interoperability may be found, beyond the capture of learner activity. In order to explore these matters more deeply, the Learning Analytics Community Exchange project will produce a range of white papers and briefings. These will communicate a systematic analysis of what is currently available and capture practical experience. The papers will be complemented by a number of webinars in which implementers - in educational establishments and in the software industry - will share their experiences, and innovators will present new developments. The LACE project will also be working with this community of implementers, researchers, and interoperability experts to help build consensus on common data definitions, and to identify important missing pieces in the learning analytics interoperability jigsaw.
Further Reading: Technical Details

- Activity Streams
- Google Analytics
- LTI: IMS Learning Tools Interoperability
- OAuth
- Piwik
- PMML: Predictive Model Markup Language
- QTI: IMS Question and Test Interoperability
- SDMX: Statistical Data and Metadata eXchange
- W3C Data Cube
- W3C Widgets
- xAPI: the Experience API (also known as Tin Can API)
About

Acknowledgements

The author would like to thank Tore Hoel (Oslo and Akershus University College) and Wilbert Kraan (Cetis, University of Bolton) for reviewing drafts of this document and providing suggestions that have improved the quality of the final version. The remaining weaknesses are wholly my own. This document was produced with funding from the European Commission Seventh Framework Programme as part of the LACE Project.

About the Author

Adam is co-director of the Centre for Educational Technology and Interoperability Standards, based at the University of Bolton. He has participated in many of the groups involved in educational technology standards since the year 2000 and is a member of the UK Government Open Standards Board and the Information Standards Board for Education, Skills and Children's Services. He has written a number of briefings and white papers on learning analytics, interoperability, and standards development, and blogged on the same topics.

About this document

(c) 2014, Adam Cooper, Cetis, University of Bolton, UK. Licensed for use under the terms of the Creative Commons Attribution v4.0 licence. Attribution should be "by Adam Cooper, Cetis, University of Bolton, UK, for the LACE Project". For more information, see the LACE Publication Policy.

About LACE

The LACE project brings together existing key European players in the field of learning analytics and educational data mining who are committed to building communities of practice and sharing emerging best practice in order to make progress towards four objectives.

- Objective 1: Promote knowledge creation and exchange
- Objective 2: Increase the evidence base
- Objective 3: Contribute to the definition of future directions
- Objective 4: Build consensus on interoperability and data sharing
PROGRAMME SPECIFICATION
PROGRAMME SPECIFICATION Course record information Name and level of final award: Name and level of intermediate awards: The BSc (Hons) Computer Science is a B.Sc. BSc (Hons) Computer Science with Industrial
IPL Service Definition - Data Governance
IPL Proposal Project: Date: 6th October 2015 Issue Number: Issue 1 Customer: Crown Commercial Service Page 1 of 8 Copyright notice This document has been prepared by IPL Information Processing Limited
Cloud Computing-upcoming E-Learning tool
Cloud Computing-upcoming E-Learning tool Suvarna Dharmadhikari Computer science Department [email protected] Monali Reosekar Computer science Department [email protected] Vidya Gage Computer
An Introduction to Managing Research Data
An Introduction to Managing Research Data Author University of Bristol Research Data Service Date 1 August 2013 Version 3 Notes URI IPR data.bris.ac.uk Copyright 2013 University of Bristol Within the Research
TEC Capital Asset Management Standard January 2011
TEC Capital Asset Management Standard January 2011 TEC Capital Asset Management Standard Tertiary Education Commission January 2011 0 Table of contents Introduction 2 Capital Asset Management 3 Defining
Conference on Data Quality for International Organizations
Committee for the Coordination of Statistical Activities Conference on Data Quality for International Organizations Newport, Wales, United Kingdom, 27-28 April 2006 Session 5: Tools and practices for collecting
INTEGRATING RECORDS SYSTEMS WITH DIGITAL ARCHIVES CURRENT STATUS AND WAY FORWARD
INTEGRATING RECORDS SYSTEMS WITH DIGITAL ARCHIVES CURRENT STATUS AND WAY FORWARD National Archives of Estonia Kuldar As National Archives of Sweden Karin Bredenberg University of Portsmouth Janet Delve
Enterprise Application Integration (EAI) Techniques
Enterprise Application Integration (EAI) Techniques The development of technology over the years has led to most systems within an organisation existing in heterogeneous environments. That is to say, different
perspective Microservices A New Application Paradigm Abstract
perspective Microservices A New Application Paradigm Abstract Microservices Architecture is introducing the concept of developing functionality as a number of small self-contained services. This paper
Conversion Rate Optimisation Guide
Conversion Rate Optimisation Guide Improve the lead generation performance of your website - Conversion Rate Optimisation in a B2B environment Why read this guide? Work out how much revenue CRO could increase
Get to Grips with SEO. Find out what really matters, what to do yourself and where you need professional help
Get to Grips with SEO Find out what really matters, what to do yourself and where you need professional help 1. Introduction SEO can be daunting. This set of guidelines is intended to provide you with
Quick Guide: Meeting ISO 55001 Requirements for Asset Management
Supplement to the IIMM 2011 Quick Guide: Meeting ISO 55001 Requirements for Asset Management Using the International Infrastructure Management Manual (IIMM) ISO 55001: What is required IIMM: How to get
Digital Asset Management. Delivering greater value from your assets by using better asset information to improve investment decisions
Digital Asset the way we see it Digital Asset Delivering greater value from your assets by using better asset information to improve investment decisions In its recent survey on the UK economy, the OECD
Web Archiving and Scholarly Use of Web Archives
Web Archiving and Scholarly Use of Web Archives Helen Hockx-Yu Head of Web Archiving British Library 15 April 2013 Overview 1. Introduction 2. Access and usage: UK Web Archive 3. Scholarly feedback on
How To Complete An Msc Logistics Management Degree Programme At The University Of Lincoln
Programme Specification Title: Logistics Management Final Award: Master of Science (MSc) With Exit Awards at: Postgraduate Certificate (PG Cert) Postgraduate Diploma (PG Dip) Master of Science (MSc) To
Lightweight Data Integration using the WebComposition Data Grid Service
Lightweight Data Integration using the WebComposition Data Grid Service Ralph Sommermeier 1, Andreas Heil 2, Martin Gaedke 1 1 Chemnitz University of Technology, Faculty of Computer Science, Distributed
Open Source, Open Standards and Re Use: Government Action Plan
Open Source, Open Standards and Re Use: Government Action Plan Foreword When Sir Tim Berners Lee invented the World Wide Web in 1989, he fought to keep it free for everyone. Since then, not everyone in
White paper: Information Management. Information is a strategic business asset are you managing it as such?
White paper: Management Tieto 2013 is a strategic business asset are you managing it as such? White paper: Management Tieto 2013 Management the right decisions and actions at the right time based on correct
Free and Open Source Document Management Systems
Free and Open Source Document Management Systems Anas Tawileh School of Computer Science, Cardiff University 5 The Parade, Cardiff CF24 3AA, UK [email protected] Abstract Document Management Systems captured
Transforming Information Silos into Shareable Assets through Automated Content Conversion
Transforming Information Silos into Shareable Assets through Automated Content Conversion AUTOMATED DOCUMENT CONVERSION FOR ECM SYSTEMS WHITE PAPER Executive Summary Massive volumes of business data much
Using Social Networking Sites as a Platform for E-Learning
Using Social Networking Sites as a Platform for E-Learning Mohammed Al-Zoube and Samir Abou El-Seoud Princess Sumaya University for Technology Key words: Social networks, Web-based learning, OpenSocial,
OpenText Information Hub (ihub) 3.1 and 3.1.1
OpenText Information Hub (ihub) 3.1 and 3.1.1 OpenText Information Hub (ihub) 3.1.1 meets the growing demand for analytics-powered applications that deliver data and empower employees and customers to
Programme Specification
Programme Specification Title: Logistics Management Final Award: Master of Science (MSc) With Exit Awards at: Postgraduate Certificate (PG Cert) Postgraduate Diploma (PG Dip) Master of Science (MSc) To
Introduction to Service Oriented Architectures (SOA)
Introduction to Service Oriented Architectures (SOA) Responsible Institutions: ETHZ (Concept) ETHZ (Overall) ETHZ (Revision) http://www.eu-orchestra.org - Version from: 26.10.2007 1 Content 1. Introduction
Fogbeam Vision Series - The Modern Intranet
Fogbeam Labs Cut Through The Information Fog http://www.fogbeam.com Fogbeam Vision Series - The Modern Intranet Where It All Started Intranets began to appear as a venue for collaboration and knowledge
Effecting Data Quality Improvement through Data Virtualization
Effecting Data Quality Improvement through Data Virtualization Prepared for Composite Software by: David Loshin Knowledge Integrity, Inc. June, 2010 2010 Knowledge Integrity, Inc. Page 1 Introduction The
Making Content Easy to Find. DC2010 Pittsburgh, PA Betsy Fanning AIIM
Making Content Easy to Find DC2010 Pittsburgh, PA Betsy Fanning AIIM Who is AIIM? The leading industry association representing professionals working in Enterprise Content Management (ECM). We offer a
Engagement and motivation in games development processes
Engagement and motivation in games development processes Engagement and motivation in games development processes Summary... 1 Key findings... 1 1. Background... 2 Why do we need to engage with games developers?...
Statistical Data Quality in the UNECE
Statistical Data Quality in the UNECE 2010 Version Steven Vale, Quality Manager, Statistical Division Contents: 1. The UNECE Data Quality Model page 2 2. Quality Framework page 3 3. Quality Improvement
Inspection of the London Borough of Barking and Dagenham s arrangements for supporting school improvement
Tribal Kings Orchard 1 Queen Street Bristol, BS2 0HQ T 0300 123 1231 Textphone 0161 618 8524 [email protected] www.ofsted.gov.uk Direct T 0117 3115407 Direct email [email protected] 1
How Good is Our Council?
A guide to evaluating Council Services using quality indicators Securing the future... l Improving services l Enhancing quality of life l Making the best use of public resources Foreword Perth & Kinross
Digital Preservation Strategy
Digital Preservation Strategy Joint Operations Group Policy June 2011 Table of Contents 1. Introduction... 3 2. The Need for this Strategy... 4 3. Mandate... 5 4. Purpose of the Strategy... 6 5. Scope
03. Leveraging the Relationship between BIM and Asset Management
03. Leveraging the Relationship between BIM and Asset Management Current Situation Purpose What is Asset Management? What is BIM? The purpose of this paper is to explain the mutually-supportive relationship
PGCert/PGDip/MA Education PGDip/Masters in Teaching and Learning (MTL) Programme Specifications
PGCert/PGDip/MA Education PGDip/Masters in Teaching and Learning (MTL) Programme Specifications Faculty of Education, Law and Social Sciences School of Education December 2011 Programme Specification PG
Flinders University. Telehealth in the Home. Video Strategy Discussion Paper. 2 October 2013
Flinders University Telehealth in the Home Video Strategy Discussion Paper 2 October 2013 Further information: Alan Taylor, Project Manager [email protected] Page 1 of 14 A AUTHORS A.1 Project
Data Models in Learning Analytics
Data Models in Learning Analytics Vlatko Lukarov, Dr. Mohamed Amine Chatti, Hendrik Thüs, Fatemeh Salehian Kia, Arham Muslim, Christoph Greven, Ulrik Schroeder Lehr- und Forschungsgebiet Informatik 9 RWTH
How to research and develop signatures for file format identification
How to research and develop signatures for file format identification November 2012 Crown copyright 2012 You may re-use this information (excluding logos) free of charge in any format or medium, under
European Security Standards Reference Implementation Initiative (ESSRII)
European Security Standards Reference Implementation Initiative (ESSRII) A Proposal for Action in Europe on International Information Security Standards Brian Gladman, European Technical Director, Trusted
Making cities smarter
bsigroup.com Making cities smarter Guide for city leaders: Summary of PD 8100 There is no better way to improve the lives of billions of people around the world than to improve the way cities work. Michael
Engineering Process Software Qualities Software Architectural Design
Engineering Process We need to understand the steps that take us from an idea to a product. What do we do? In what order do we do it? How do we know when we re finished each step? Production process Typical
GenericServ, a Generic Server for Web Application Development
EurAsia-ICT 2002, Shiraz-Iran, 29-31 Oct. GenericServ, a Generic Server for Web Application Development Samar TAWBI PHD student [email protected] Bilal CHEBARO Assistant professor [email protected] Abstract
Mitra Innovation Leverages WSO2's Open Source Middleware to Build BIM Exchange Platform
Mitra Innovation Leverages WSO2's Open Source Middleware to Build BIM Exchange Platform May 2015 Contents 1. Introduction... 3 2. What is BIM... 3 2.1. History of BIM... 3 2.2. Why Implement BIM... 4 2.3.
Bringing the Cloud into Focus:
Bringing the Cloud into Focus: Things to Consider When Selecting Cloud-Based Museum Collections Management Software Presented by www.historyassociates.com Thinking About a Cloud-Based CMS? In addition
Urban Big Data Centre. Data services: Guide for researchers. December 2014 Version 2.0 Authors: Nick Bailey
Urban Big Data Centre Data services: Guide for researchers December 2014 Version 2.0 Authors: Nick Bailey 1 Introduction... 3 UBDC Data Services... 3 Open Data and the UBDC Open Data portal... 4 Safeguarded
IT3203 Fundamentals of Software Engineering (Compulsory) BIT 2 nd YEAR SEMESTER 3
Fundamentals of Software Engineering (Compulsory) BIT 2 nd YEAR SEMESTER 3 INTRODUCTION This course is designed to provide the students with the basic competencies required to identify requirements, document
New pedagogies and delivery models for developing open online courses for international students
Authors Li Yuan [email protected] Stephen Powell [email protected] CETIS (Centre for Educational Technology & Interoperability Standards), University of Bolton UK Hongliang Ma [email protected]
Instructional Design Framework CSE: Unit 1 Lesson 1
Instructional Design Framework Stage 1 Stage 2 Stage 3 If the desired end result is for learners to then you need evidence of the learners ability to then the learning events need to. Stage 1 Desired Results
Adobe Insight, powered by Omniture
Adobe Insight, powered by Omniture Accelerating government intelligence to the speed of thought 1 Challenges that analysts face 2 Analysis tools and functionality 3 Adobe Insight 4 Summary Never before
