METIS - Meeting teachers' co-design needs by means of Integrated Learning Environments

D5.2: Report on first formative evaluation round

WP5: Evaluation
WP Leader: ITD-CNR
Author(s): Pozzi F. (ITD-CNR), Persico D. (ITD-CNR), Sarti L. (ITD-CNR), Brasher A. (OU), Chacón J. (UPF), Dimitriadis Y. (UVA), Malatesta L. (KEK), Rudman P. (ULEIC), Serrano M. A. (Agora).
Project information

Project acronym: METIS
Project title: Meeting teachers' co-design needs by means of Integrated Learning Environments
Project number: 531262-LLP-1-2012-1-ES-KA3-KA3MP
Sub-programme or KA: KA3 Multilateral projects
Project website: http://www.metis-project.org
Reporting period: From 01/04/2013 to 28/02/2014
Report version: v.1.3
Date of preparation: 15/05/2014
Beneficiary organisation: University of Valladolid (UVa), Spain
Project coordinator: Prof. Yannis Dimitriadis
Project coordinator organisation: University of Valladolid (UVa), Spain
Project coordinator telephone number: +34 983 423696
Project coordinator email address: info@metis-project.org
WP Leader: Francesca Pozzi (P5)
WP Leader email address: pozzi@itd.cnr.it

Document history

| Date | Version | Author(s) | Description |
|---|---|---|---|
| 13/03/2014 | v.1.1 | Pozzi, Persico, Sarti | Preliminary version |
| 22/04/2014 | v.1.2 | Pozzi, Persico, Sarti | First version (revised according to reviewers' indications) |
| 15/05/2014 | v.1.3 | Pozzi, Persico, Sarti | Second version (revised according to reviewers' indications) |

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

This project has been funded with support from the European Commission. This publication reflects the views only of the author(s), and the Commission cannot be held responsible for any use which may be made of the information contained therein.
Executive Summary

The METIS project intends to contribute to the Technology Enhanced Learning (TEL) research field by providing educators with an Integrated Learning Design Environment (ILDE) and a workshop package for training them in using the ILDE, with the ultimate aim of fostering and supporting effective learning design practices (METIS Consortium, 2012).

Work Package 5 (led by ITD-CNR) helps METIS reach its objectives by carrying out an internal, ongoing evaluation of the two main outcomes (the ILDE and the workshop package), in order to fine-tune and improve them. The WP consists of three main tasks, namely:

Task 5.1 Development of assessment plan - this task identifies the requirements for workshop and learning design enactment realization, as well as the evaluation methodologies to be adopted.

Task 5.2 Evaluation of ILDE - in this task the evaluation of the ILDE is carried out, taking into account the perceived ease of use and perceived usefulness reported by the system users.

Task 5.3 Evaluation of the workshop packs - this task adopts a similar approach to T5.2, addressing the evaluation of the workshop packages and the enactment of the learning designs with the involvement of the same set of users.

While the previous WP5 deliverable (D5.1, submitted at month 5, i.e. March 2013) provided the overall evaluation strategy and plan (output of T5.1), the present document (D5.2) reports on the first formative evaluation round (output of T5.2 + T5.3), consisting of the development of the evaluation tools, carried out between month 5 and month 10 (August 2013), their use to gather data between month 11 (September 2013) and month 17 (March 2014), and the analysis of the data collected.

The main outcomes of this deliverable are thus the results of this evaluation process, which show that the workshop package was very well accepted by participants, even if some tuning of the format is still needed. The enactment phase needs to be explored further, possibly with more data in the second round, and the differences among the three sectors studied (i.e. higher education, vocational training, adult education) also need further attention. The ILDE was perceived very positively in all the contexts, even if opinions differ concerning a real uptake of this innovative technology by the institutions. These results are used in the concluding section to derive a list of implications for the second round of workshops, as well as a set of indications for the project work packages on the next actions to be undertaken.
Table of contents

1. Introduction
2. Evaluation in METIS
3. Evaluation methods
   Online questionnaire
   Follow up interview
   Structure of types A and B Interviews
   Structure of Type C Interviews
   Data tracked by the system
4. First evaluation in METIS - data analysis
   Data from the workshops
      Workshop at OU
      Workshop at KEK
      Workshop at Agora
   Data from the enactments
      Enactment at OU
      Enactment at KEK
      Enactment at Agora
5. Discussion and lessons learnt
   Feedback on the workshops + enactments
   Feedback on the ILDE
6. Conclusions and future work
7. References
8. APPENDIX I - Questionnaire
9. APPENDIX II - Follow up interviews
1. Introduction

The METIS project has three main objectives (METIS Consortium, 2012):

1. To develop an Integrated Learning Design Environment (ILDE). The ILDE will integrate existing free and open source solutions that include: co-design support for communities of practitioners; learning design authoring tools following different pedagogical approaches and authoring experiences; an interface for deployment of learning designs on mainstream Virtual Learning Environments (VLEs).

2. To run a series of workshops for teachers at partner institutions using the ILDE. The workshops will be aimed at fostering the adoption of learning design methods among teachers and advancing their skills in the orchestration of ICT-based learning environments according to innovative pedagogical approaches. The ILDE will play a central role in the workshops, because one of the workshop goals will be to support teachers' familiarization with the ILDE and to promote the usage of the tools integrated in it. The workshops will also envisage an enactment phase, i.e. a stage where a sub-set of workshop participants will have the chance to go back to their classes and deliver to students a design fully conceived, authored and implemented using the ILDE.

3. To disseminate the project's outcomes and maintain a community of teachers engaged with learning design and its tools.

As described in D5.1 (Pozzi et al., 2013), METIS adopts a user-centred design approach: the development of both the ILDE and the workshops is cyclic, with two evaluation phases informed by practice (see Figure 1). These two evaluation phases (the former between the 2nd and the 3rd cycle, the latter between the 3rd and the 4th cycle) will incrementally incorporate the needs expressed by end users for both the ILDE and the workshops.
Figure 1 - The METIS four cycles (1st: analysis of existing tools and proposal of the workshop pack; 2nd: release of the ILDE prototype and delivery of the pilot workshop packs; 3rd: second release of the ILDE and second delivery of the workshop packs; 4th: final release of the ILDE and refinement of the workshop packs; with an evaluation phase between the 2nd and 3rd cycles and between the 3rd and 4th cycles)

The workshops are the basis for the formative evaluation of the different versions of the ILDE and of the workshop packs themselves. Thus evaluation plays a crucial role in the project, and occurs in an iterative and formative way, i.e. with the aim of informing the following stages of re-design and development.

Work package 5 (WP5) in the METIS project is explicitly devoted to evaluation. In particular, the main objective of WP5 is to carry out an overall evaluation of both the ILDE and the workshop package. This implies that WP5 in METIS is primarily focused on the evaluation of the two main outcomes of the project, and only indirectly addresses other aspects of the project evaluation. The WP consists of three tasks, namely:

Task 5.1 Development of assessment plan - this task identifies the requirements for workshop and learning design enactment realization, and the evaluation methodologies to be adopted.

Task 5.2 Evaluation of ILDE - taking into account the perceived ease of use and perceived usefulness of the ILDE, a set of indicators and instruments to gather data are designed and developed in this task; various categories of system users (end-users involved in vocational training, school teachers, higher education teachers, workshop conductors and organizers, etc.) are involved in the usage of the ILDE system.

Task 5.3 Evaluation of the workshop packs - this task adopts a similar approach to Task 5.2, addressing the evaluation of the workshop packages and the enactment of the learning designs with the involvement of the same set of user categories.
D5.1 - Assessment plan, which was delivered at month 5, contained the overall evaluation strategy (Task 5.1), by providing:

- an overview of the literature in the fields of:
  - evaluation of the impact of technology, and
  - evaluation of training events;
- the specific theoretical frameworks adopted to evaluate the ILDE and the workshops in METIS;
- the definition of the main dimensions that are considered to evaluate the ILDE and the training events;
- a preliminary set of indicators to evaluate the ILDE and the training events;
- the overall evaluation plan.

Given that D5.1 was planned so early in the project, at a time when both the ILDE and the workshop packages were still under development, it could not contain the evaluation tools, but only high-level indicators. Later on, when the project was able to provide details about the ILDE and the workshop packages, it was possible to define finer-grained indicators and the actual evaluation tools (such as questionnaires, interview rubrics and tracking tool specifications).

D5.2 - Report on first formative evaluation round (Task 5.2 + Task 5.3), which is the present document, represents the continuation of the previous deliverable: it documents the first evaluation round (between the 2nd and the 3rd cycle, see Figure 2), providing the description of the evaluation methods and tools designed and developed by the project, the analysis of the gathered data, and a discussion of the main lessons learnt, which will inform the 3rd cycle.
Figure 2 - Focus of D5.2 (the four METIS cycles of Figure 1, annotated with the WP5 deliverables D5.1 to D5.4; D5.2 covers the evaluation between the 2nd and 3rd cycles)

Figure 2 also shows D5.3 and D5.4, which are the following deliverables to be produced under WP5. In particular, while the former deliverable will contain the report on the second formative evaluation round, the latter document will provide a comprehensive view of the evaluation carried out during the project lifespan.

2. Evaluation in METIS

In this section we recall the main choices made by the project consortium concerning the evaluation and provide a synthesis of the overall evaluation model adopted in METIS (described more extensively in D5.1).

As already mentioned, the ultimate goal of METIS is the widespread adoption of innovative and effective learning design approaches and tools, by developing the ILDE and by designing and running workshops for teachers on learning design using the ILDE. Consequently, specific METIS evaluation needs include:

- to measure acceptance of the proposed technology (ILDE) by the users (that means evaluating the extent to which users regard the ILDE as easy to use and useful);
- to assess the adequacy and effectiveness of the workshops (and subsequent enactment) in real contexts, by taking into account not only participants' perceptions and reactions, but also the impact that the adoption of these innovations has on their institutions, as well as the contextual factors that may enhance or hinder their impact.
The latter point is particularly important to METIS, where three different contexts are being explored: vocational training, adult education and higher education (respectively represented, in the project, by the three partners KEK, Agora and UKOU), since what proves adequate in one context may turn out to be inadequate in another.

Thus, among the various existing models described in D5.1 to evaluate the technology, the TAM and its subsequent evolutions (in particular TAM2) have been chosen as the theoretical framework on which to build the core of the evaluation approach for the ILDE. Furthermore, the information provided by the application of this model is complemented with data gathered from other sources, such as tracking data provided by the system itself. Thus, in METIS, user acceptance of the ILDE is measured mainly in terms of the TAM indicators "Perceived usefulness" and "Perceived ease of use". Additionally, according to TAM2, social factors that influence ICT acceptance are also taken into account. In order to tailor the TAM to the METIS needs, and in particular to the ILDE evaluation, the approach takes into consideration, on the one hand, the ILDE functionalities defined in terms of use cases in Hernández-Leo et al. (2013), and on the other hand, the system components, as defined by the specification of the ILDE architecture (Pozzi et al., 2013).

As will be further explained below, we chose the questionnaire as the main tool to collect evaluation information on both the workshop and the ILDE. As the evaluation of the ILDE intertwines with the evaluation of the workshops, data are collected in an integrated way through the same online tool, to minimize the effort of the participants/users. In addition, the use of questionnaires, complemented by appropriate guidelines for the workshop moderators, ensures that data collection follows the same procedure even when the evaluators are not present. The evaluation of the enactment activities further relies on interviews with participants. In general, we strive to use appropriate triangulation processes to combine data that come from different sources and are collected using different techniques, such as log analysis, third-party observations, etc.

As mentioned before, the TAM and TAM2 provide information based on the users' perceptions and opinions, that is, subjective data that need to be complemented with more objective data about what actually happens when users engage with the ILDE. This information is obtained thanks to tracking mechanisms in the ILDE, which provide, among other things, information on the trustworthiness of the users' opinions. If, for example, a user says that a given functionality was easy to use, but the tracked data show that s/he never used it, that opinion is less trustworthy than that of a user who claims the functionality was difficult to use after having engaged with it for a certain amount of time.
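To make this cross-check concrete, the following minimal sketch (in Python) flags ratings that are not backed by any tracked use of the rated functionality. The record formats are invented for illustration only; the actual ILDE logs and questionnaire exports are structured differently.

```python
# Minimal sketch of the trustworthiness cross-check described above.
# Data formats are hypothetical, not the actual ILDE/questionnaire schema.

# Questionnaire answers: user -> {function: ease-of-use rating (0-4)}
ratings = {
    "user_a": {"WebCollage": 3, "Persona Card": 4},
    "user_b": {"WebCollage": 2},
}

# Tracking data: user -> {function: number of recorded uses}
usage = {
    "user_a": {"WebCollage": 0, "Persona Card": 5},
    "user_b": {"WebCollage": 7},
}

def flag_low_trust(ratings, usage, min_uses=1):
    """Return (user, function, rating, uses) tuples for ratings
    not backed by any tracked engagement with that function."""
    flags = []
    for user, items in ratings.items():
        for function, rating in items.items():
            uses = usage.get(user, {}).get(function, 0)
            if uses < min_uses:
                flags.append((user, function, rating, uses))
    return flags

for user, function, rating, uses in flag_low_trust(ratings, usage):
    print(f"{user} rated '{function}' {rating}/4 but used it {uses} time(s)")
```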
As far as the evaluation of the METIS workshops is concerned, Guskey's model has been chosen as the main source of inspiration for the evaluation (Guskey, 2000; 2002), as it seems to have the closest fit with the project's evaluation needs (see D5.1). This is because in his model Guskey takes into account not only the participants' perceptions of and reactions to the training event, but also the impact on the system (down to the level of students' learning), as well as the contextual factors that may enhance or hinder the impact of the initiative. Both these aspects (user perceptions and impact on the system) seem particularly relevant in METIS, where the uptake and consequent diffusion of innovative learning design practices may meet obstacles and barriers once transferred into real educational contexts, whose permeability to innovation is unknown.

According to Guskey (2002), effective evaluation of teachers' professional development requires the collection and analysis of five critical levels of information, namely:

1) Participants' Reactions (looking at participants' reactions to the workshops);
2) Participants' Learning (focused on measuring the knowledge and skills that participants gained during the workshop);
3) Organization Support and Change;
4) Participants' Use of New Knowledge and Skills;
5) Student Learning Outcomes.

Level 3 focuses on the organization: lack of institutional support can sabotage any professional development effort, even when all the individual aspects of professional development are done right (Guskey, 2002). This means that the innovative practices that are the object of the workshops (e.g., in METIS, the use of the ILDE to support effective learning design) can fail to be widely adopted in real contexts because the contexts/systems where the workshop participants usually operate may not be ready to take up the needed changes. Level 3 is important in METIS, given that we propose innovative practices to three different target user groups, operating in three contexts (Brasher and Mor, 2013), so we may discover that what fits the needs of one context is not in line with the policies of another.

At Level 4 we focus on whether and to what extent participants are able to take up the innovative practices proposed during the workshops, and on their ability to apply them within their daily practice. Unlike Levels 1 and 2, this information cannot be gathered at the end of the workshops, as enough time must pass to allow participants to adapt the new ideas and practices to their settings and enact them. In METIS a certain amount of time is devoted to allowing a subset of workshop participants to enact their innovative designs in their respective contexts (the enactment phase; see D4.1 for an extensive description of this phase), and thus evaluation of Level 4 occurs after this phase (Rudman & Conole, 2014).

According to Guskey (2002), Level 5 addresses the "bottom line": how did the workshop affect students/learners? Did it benefit them in any way? This means that after some of the workshop participants have implemented their designs, they are in charge of reporting what the impact of the innovation was on the student learning outcomes. By evaluating this level, which we address indirectly through the mediation of teachers, unintended (or unexpected) outcomes can be found.
As mentioned before, evaluation data collection takes place at two different stages: data related to Levels 1 to 3 are gathered at the end of the workshops (together with the data related to the ILDE evaluation) through the questionnaire and the analysis of logs, while data related to Levels 4 and 5 need more time and are evaluated after the enactment phase through interviews.

Furthermore, in analogy with what has been proposed for the ILDE evaluation, in the workshop evaluation too the data coming from participants are complemented with other, more objective data, coming from direct observation of what actually occurred during the workshop sessions themselves. This happens thanks to an observer, who takes notes during the workshops, supported by a rubric. Observers also monitor and gather data during the enactment phases. All the data coming from observation, though, have been considered part of the workshop enactment (WP4) and are thus extensively reported in D4.1 (Rudman & Conole, 2014). This deliverable takes these data as inputs, and the final reflections about the results of the evaluation will also draw on them (even if they are not reported here). For this reason, readers who want to get a complete picture of the work done are recommended to read the two deliverables one after the other (D5.2 after D4.1).

3. Evaluation methods

This section provides a summary of the evaluation tools designed, developed and delivered by the project under WP5 (see Figure 3).

Figure 3 - METIS events and evaluation tools

The two yellow circles in the figure represent the two main events that occurred during this first round in METIS, namely the workshop and the subsequent follow up phase, where the enactment took place. All the other symbols in the figure represent the evaluation tools developed, namely:
- A questionnaire (blue circle), delivered through an online form to all the workshop participants at the end of each workshop; the questionnaire gathers data concerning:
  - profile of the workshop participant;
  - participant's opinions about the workshop (Levels 1, 2 and 3);
  - participant's opinions about the ILDE;
  - free comments.
- A follow up bespoke interview (green circle), to be delivered after the enactment phase to the workshop participants who agreed to enact; the interview gathers data concerning:
  - profile of the participant;
  - participant's opinions about the enactment (Levels 4 and 5);
  - participant's opinions about the ILDE;
  - participant's opinions about the sustainability of the learning design innovations proposed;
  - free comments.
  Another, different bespoke interview is delivered to the workshop participants who did not agree to enact; this interview gathers data about the reasons behind their decision.
- Data tracked by the ILDE (red line) - these encompass the data related to the actions performed on learning designs in the platform by participants during the workshops and the enactment.
- Data observed by observers (grey line) - reported in D4.1 (Rudman & Conole, 2014).

Both the questionnaire and the interview included a consent form, where people could express their wishes about the use of their data (see Appendix I).

The design of the questionnaire, the interview rubrics and the data to be tracked are the result of an iterative process, coordinated by ITD-CNR with the contribution of all the partners, which started at month 3 of the project. Collaboration and discussion about these instruments occurred through emails and regular virtual meetings; besides, during the Barcelona meeting (July 2013) a dedicated session was organized for the partners to test the questionnaire and for ITD-CNR to gather direct input and feedback about it and the overall evaluation strategy.

One of the main issues discussed with the partners was the degree to which it was advisable to customize the evaluation instruments for the three different contexts addressed by the project (i.e. vocational training, adult education, higher education). The METIS project had developed a generic workshop package that was then adapted to the different contexts (see D3.3, McAndrew, Brasher, Prieto and Rudman, 2013). However, the differences introduced by the customization of the generic workshop package did not hinder the possibility of developing a common set of evaluation instruments capable of collecting the same categories of data from the different contexts, thus allowing for comparison of results across contexts.
There were, however, two kinds of customizations that were planned for and implemented where the three partners representing the different contexts deemed it appropriate. The first was the linguistic customization of the questionnaire: an equivalence table of the question items was produced to make it easy for partners to produce different versions of the questionnaire itself. This way, the questionnaire was produced in English by ITD-CNR and then translated only by those partners that deemed it appropriate (Agora translated it into Spanish, while KEK did not translate it into Greek, since their end-users were comfortable with the English version). As for the interviews, these were carried out in the local language based on the rubrics (in English) produced by ITD-CNR. This method also left the interviewers free to adapt the questions to the context, where appropriate.

The second type of customization was related to the different ways learning design takes place in the three contexts: at KEK and Agora the teachers are also the designers of the training events, while the OU has a much more complex system featuring different actors in the design and teaching process and a longer time scale for the design-to-enactment process. It was therefore necessary to customize the tools for assessing the enactment phase for the different actors, according to the broad range of situations that emerged in the various contexts.

A further element of evaluation can be offered by the analysis of the artefacts produced by the participants during their activities. Rather than considering the explorative activities carried out during the workshops, when participants are concerned more with understanding the ILDE functionality than with delivering an authentic product, we mainly focus on the learning designs created during the enactment phase, which are to be delivered to actual students and must therefore feature completeness and self-consistency. These artefacts have already been reported in D4.1 (Rudman & Conole, 2014), so here we will take the elements that emerged from D4.1 as inputs and as a source of further information.

The following sections contain an extensive description of each evaluation tool.

Online questionnaire

As mentioned above, two versions of the online questionnaire were produced: one in English (for KEK and OU) and one in Spanish (for Agora).

At the beginning of each workshop, the METIS project is introduced and participants are made aware of the fact that they are taking part in a testing phase of the project (see D3.3, McAndrew et al., 2013). Thus participants are informed that, at the end of the workshop, they will be required to fill in a questionnaire and that their actions on the ILDE will be tracked during the workshop. Given that filling in the questionnaire has been estimated to require about 20 minutes, the final 25-30 minute slot of each workshop is devoted to this activity.
In particular, through the questionnaire, participants are asked to provide the following information.

Data about the workshop participant. These include:

- personal data (name, age, qualification, institution, position, etc.);
- foremost background (mostly educational sciences; mostly computer science; educational technology; other);
- work sector (academic: research/university teaching; school teaching; industry; policy making; professional development; other);
- previous knowledge about learning design (beginner, intermediate, expert, other);
- previous knowledge (if any) about any of the learning design tools integrated in the ILDE.

Participant's opinions about the workshop (Levels 1, 2 and 3 of Guskey's model). These include:

- Motivation - participants rate statements concerning their motivation to attend the workshop (Likert scale, from 0=low to 4=high);
- Reactions - participants rate statements about the quality of contents, quality of presentations, quality of discussion, effectiveness of hands-on activities, adequacy of the time schedule, adequacy of rooms and facilities, etc. (Likert scale, from 0=low to 4=high);
- Learning - participants rate statements about what they think they have learnt during the workshop (Likert scale, from 0=low to 4=high) and provide a self-evaluation of the outputs they have produced (namely the quality of the learning designs created during the workshop);
- Organization support & change - participants rate statements about the possibility that their institutions will really adopt (some of) the innovations proposed at the workshop (Likert scale, from 0=low to 4=high). These data represent a preliminary investigation of the degree of sustainability of the innovation proposed by METIS.

Free comments about the workshop (what did participants like most about the workshop? What improvements should be made? etc.). Besides the quantitative ratings, participants can provide additional information in an optional free-text field, in case they want to justify their ratings.
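Each rated item is later summarized, in the tables of Section 4, by its mean and standard deviation across respondents. For clarity, the short sketch below shows the kind of computation involved; the answer data are invented for illustration, not actual METIS responses.

```python
# Illustrative computation of the per-item summary statistics (mean and
# standard deviation of 0-4 Likert ratings) reported in Section 4.
# The answers below are invented for illustration only.
from statistics import mean, stdev

answers = {
    "I am interested in learning design": [4, 3, 4, 3, 3, 4],
    "I was forced to enrol in this workshop": [0, 0, 0, 2, 0, 4],
}

for item, scores in answers.items():
    print(f"{item}: mean={mean(scores):.2f}, st.dev={stdev(scores):.2f}")
```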
Participant's opinions about the ILDE.

The structure of this section of the questionnaire reflects the structure of the ILDE (for a complete view of the ILDE functions, please refer to D2.1) and thus collects users' opinions about the following:

- Functions to create new learning designs - participants are asked to state whether they understood what the ILDE offers and to rate the ease and usefulness of such understanding (Likert scale, from 0=low to 4=high). This includes 4 items.
- Conceptualizing functions - participants declare whether they used the 9 conceptualizing functions and rate their ease of use and usefulness (Likert scale, from 0=low to 4=high). This includes 9 items.
- Authoring functions - participants declare whether they used the 3 authoring functions and rate their ease of use and usefulness (Likert scale, from 0=low to 4=high). This includes 3 items.
- Implementing functions - participants declare whether they used the 3 implementing functions and rate their ease of use and usefulness (Likert scale, from 0=low to 4=high). This includes 3 items.
- Browsing functions - participants declare whether they used the 6 browsing functions and rate their ease of use and usefulness (Likert scale, from 0=low to 4=high). This includes 6 items.
- Sharing functions - participants declare whether they used the 7 sharing functions and rate their ease of use and usefulness (Likert scale, from 0=low to 4=high). This includes 7 items.

Free comments about the ILDE (what did participants like most about the ILDE? etc.). Besides the quantitative ratings, participants can provide additional information in an optional free-text field, in case they want to justify their ratings.

The complete questionnaires (in English and Spanish) are provided in Appendix I.

Follow up interview

As already mentioned, the different contexts in which the workshops took place determined the need to customize the rubrics of the follow up interviews in order to take into account the different typologies of actors involved in the workshops (see D4.1, Rudman & Conole 2014). In particular, three different rubrics were developed, namely:

A. An interview rubric for people who had attended the workshop and carried out the enactment within the timeframe required by the METIS project (2 trainers at Agora + 1 teacher at KEK).

B. An interview rubric for people who hadn't attended the workshop, but carried out the enactment within the timeframe required by the METIS project (1 trainer at
Agora + 1 trainer at KEK). As will be explained later on (see the "Enactment at Agora" and "Enactment at KEK" sections), these are particular situations the consortium agreed to take on board, even if not originally planned, because they can contribute to the evaluation of this round.

C. An interview rubric for people who had attended the workshop but were not interested in carrying out the enactment (only at OU, sent to 15 workshop participants).

The first two interviews (types A and B) are carried out face-to-face by the partners responsible for the respective contexts; ITD-CNR has provided the rubric and guidelines for the interviewers, so as to guarantee homogeneity in data collection. At Agora and KEK the interviews are conducted in the respective languages and recorded; each responsible partner then translates the answers and provides them to ITD-CNR. The third interview (type C) takes the form of a short written survey, consisting of a set of open-ended questions that OU circulated among its workshop participants via email.

Structure of types A and B Interviews

The first two types of interviews (A and B) have the same structure, but B contains additional questions to compensate for the fact that participants who hadn't attended the workshop hadn't filled in the questionnaire.

In the interviews, participants are asked to provide the following information.

Data about the enactment participant. These include:

- personal data (name, age, qualification, institution, position, etc.);
- (for B only) foremost background (mostly educational sciences; mostly computer science; educational technology; other);
- (for B only) work sector (academic: research/university teaching; school teaching; industry; policy making; professional development; other);
- (for B only) previous knowledge about learning design (beginner, intermediate, expert, other);
- (for B only) previous knowledge (if any) about any of the learning design tools integrated in the ILDE.

Participant's opinions about the enactment (Levels 4 and 5 of Guskey's model). These include:

- (for B only) Motivation - participants rate statements concerning their motivation to be involved in this experience (Likert scale, from 0=low to 4=high);
- (for B only) Learning - participants rate statements about what they think they have learnt during the experience (Likert scale, from 0=low to 4=high);
- Use of knowledge and skills during the enactment - participants are asked to give details and then provide free opinions about the following enactment phases:
  - creation of new learning designs during the enactment;
  - completion/refinement of learning designs created at the workshop;
  - implementation of their learning designs in a VLE;
  - delivery of learning designs to students;
  - evaluation of their learning designs;
- Student learning outcomes - participants are asked to give details and provide opinions about the delivery to students and the impact the learning designs might have on them and their learning. In addition, participants rate statements concerning the impact on students (Likert scale, from 0=low to 4=high).

Participant's opinions about the ILDE. Opinions are asked for each of the following sets of ILDE functions:

- functions to create new learning designs;
- conceptualizing functions;
- authoring functions;
- implementing functions;
- browsing functions;
- sharing functions.

This section is treated differently in cases A and B: participants who had attended the workshop (A) are reminded of the answers they provided for this section in the online questionnaire, and are asked to express and comment on their current opinions in the light of the enactment experience, in case their opinions have changed. Participants who hadn't attended the workshop (B) are asked for opinions on the same functions, but in the same form as in the questionnaire (see the "Online questionnaire" section above).

Participant's opinions about sustainability. This section is explicitly devoted to sustainability, and the data gathered here complement those already gathered with the initial questionnaire (section "Organizational support & change"). Participants are asked to rate statements concerning the prospective adoption at their institutions of both the ILDE and the workshop and, more generally, the degree to which they think these innovations fit the needs of their work sector (vocational training, adult education, university). These data will support reflections about the possibility of further exporting these innovations to other, similar contexts.
Free comments. Participants are free to express any other comment/opinion (about the ILDE, the workshop, the enactment, or even the evaluation instruments and procedures).

Structure of Type C Interviews

Coming to the third interview, proposed to those who attended the workshop but did not enact, the survey includes the following open text questions:

- changes brought by the workshop to the participant's everyday practice (if any);
- reasons for changes or no changes;
- plans for the future (in terms of changes).

The three complete interview rubrics are provided in Appendix II.

Data tracked by the system

The definition of the data to be tracked by the ILDE was the result of a negotiation between WP5 (Evaluation, led by ITD-CNR) and WP2 (ILDE development, led by UPF). During the negotiation, it was agreed that the data would be defined following the use cases conceived during the design of the ILDE and described in D2.1 (Hernández-Leo, Chacón, Prieto, Asensio, and Derntl, 2013). Thus, drawing from the use cases, the following data have been defined as essential for evaluation purposes (a sketch of how such indicators can be derived from raw logs is given after the lists below):

Choose a Tool
- Number of designs created with each tool by each user
- List of designs created with each tool by each user

Produce a learning design
- Number of designs produced by each user
- Number of designs created/modified by days/weeks/months
- List of designs modified by each user by days/weeks/months

Co-produce a learning design
- Number of reviews (edits) to a design (overall or by user)
- Number of users editing a design
- List of users editing a design

Share a learning design
- Number of users with whom a design has been shared with editing rights
- Number of users with whom a design has been shared with view rights
- Users with whom a design has been shared with editing rights
- Users with whom a design has been shared with view rights
- Number of designs shared with others with editing rights
- List of designs shared with others with editing rights
- Number of private designs (not shared, no view/editing rights for others)
- List of private designs

Instantiate (Implement) a learning design
- Number of designs associated with at least one VLE (Virtual Learning Environment)
- List of designs associated with at least one VLE, by implementer
- Number of implementations (design associated with VLEs) for a design

Deploy an instantiated learning design
- Number of times a deployment package is created for a design
- Number of feedback items provided on a learning design
- Number of comments associated with each design, by commenter

Explore learning designs
- Number of times a user viewed a design (by user and by designer)
- Number of times a design has been viewed (by user and by designer)

Additional data were collected regarding the ILDE functions more related to community building, namely:

- Number of members in the community
- Number of designs produced by a member
- Number of documents associated with each design
- Number of comments associated with each design
- Number of reviews to a design
- Number of users editing a design
- Number of tags associated with a design
- Number of designs created/modified by days/weeks/months
- Number of groups
- Number of members in each group
- Number of designs published (accessible outside LdShake)
- Number of visits to the site/pages
- Number of designs created with each tool
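As an illustration of how indicators of this kind can be computed, the following sketch derives two of them from a flat event log. The tuple-based log format is an assumption made for the example, not the actual ILDE tracking schema.

```python
# Illustrative derivation of two tracked indicators from a flat event log.
# The (user, action, design_id, date) format is hypothetical; the actual
# ILDE tracking schema may differ.
from collections import Counter
from datetime import date

events = [
    ("user_a", "create", "lds-01", date(2013, 11, 9)),
    ("user_a", "edit",   "lds-01", date(2013, 11, 9)),
    ("user_b", "edit",   "lds-01", date(2013, 11, 10)),
    ("user_b", "create", "lds-02", date(2013, 11, 10)),
]

# "Number of designs produced by each user"
designs_per_user = Counter(user for user, action, _, _ in events
                           if action == "create")

# "Number of users editing a design"
editors_per_design = {}
for user, action, design, _ in events:
    if action == "edit":
        editors_per_design.setdefault(design, set()).add(user)

print(designs_per_user)
print({design: len(users) for design, users in editors_per_design.items()})
```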
4. First evaluation in METIS - data analysis

The first round of the METIS workshops occurred in the three contexts chosen by the project (namely at OU, KEK, and Agora) in autumn 2013, with different time schedules and modalities. The design of the three workshops is extensively described in D3.3 (McAndrew et al., 2013), while their running is described in D4.1 (Rudman & Conole, 2014). The enactment phase occurred in the early months of 2014 (again, details about the enactment can be found in D4.1).

In this section (Section 4), the data gathered during the running of the workshops and enactments through the evaluation tools developed under WP5 (described in the previous section) are reported. Section 5 will discuss these data.
Data from the workshops

This section summarizes the information about the workshops that is needed to make this document self-contained. The section has three subsections, each devoted to one of the three workshops. The data reported about the workshops mostly derive from the questionnaire analysis, complemented by the tracking data, usually reported in the last column of the relevant tables.

Workshop at OU

Data about the workshop participants

The workshop at OU took place on 24 October 2013 and involved 17 participants, 13 of whom (76%) filled in the questionnaire.

According to the data gathered through the questionnaire, all of the respondents work at the Open University, but they cover different roles: 2 librarians, 1 staff tutor, 4 lecturers, 2 media developers, 2 media project managers, 1 information literacy specialist, 1 assistant director. Their backgrounds are diverse: 2 persons declare their foremost background is in educational science, 1 in computer science, 1 in sociology, 1 in learning media design, 1 in media production/publishing, 2 in publishing and project management, 1 in management learning, 1 in learning development, 1 in English, 1 in library and information systems, 1 in health care. As for their previous experience in learning design, 3 persons (23% of the respondents) declare they are beginners in this field, while the others declare they are intermediate. Almost all the tools proposed at the workshop were previously unknown to participants (except for Course Map, which was known by 4 persons), and 2 persons declare they know, respectively, Compendium and the Open University Learning Design Initiative (OULDI, http://www.open.ac.uk/blogs/ouldi/) tools and resources.

When asked about the motivations that led them to attend the workshop, participants were required to rate a set of statements (the column headings of Table 1) from 0 to 4. Looking at the results obtained (see Table 1), it emerges that intrinsic motivation played a strong role in the decision to attend ("I am interested in learning design", "I need to learn about learning design"); however, it seems that social norms ("People I work with believe learning design is important") influenced the decision even more. All the respondents declared they were not at all forced to enrol in the workshop (rating=0), except for two persons who rated this option 2 and 4 respectively.
Table 1: Motivation (OU participants)

| Motivation (from 0=inessential to 4=very important) | I am interested in learning design | I need to learn about learning design | People I work with believe learning design is important | I was forced to enrol in this workshop |
|---|---|---|---|---|
| mean | 3,38 | 3,42 | 3,45 | 0,55 |
| st. dev. | 0,65 | 0,79 | 0,69 | 1,29 |

Participant's opinions about the workshop (Levels 1, 2 and 3 of Guskey's model)

Looking at the evaluation data provided about the workshop, and in particular at participants' reactions (Level 1), it seems that the most appreciated elements of the workshop at the OU were the hands-on activities and the quality of discussion (see Table 2); the relevance of the workshop and its suitability with regard to participants' prior competence were rated fairly high, and the quality of contents was also positively evaluated. In contrast, the adequacy of the time schedule was less appreciated, and even the rooms and facilities were not completely satisfying. In particular, most participants claimed that the activities would have needed more time, and a suggestion was made to anticipate some of the activities (for example familiarization with CSCL, thinking of a possible design, etc.) before the workshop. One participant claimed that either the room was not big enough or not enough space had been allocated per subject desktop.

Table 2: Reactions (OU participants)

| Reactions (from 0=very low to 4=very high) | Suitability with regard to your prior competence | Relevance of the workshop contents | Quality of presentations | Quality of the discussion | Effectiveness of hands-on activities | Adequacy of time schedule | Adequacy of rooms and facilities |
|---|---|---|---|---|---|---|---|
| mean | 3,00 | 3,00 | 3,00 | 3,23 | 3,42 | 2,25 | 2,92 |
| st. dev. | 0,85 | 0,91 | 0,82 | 1,01 | 0,67 | 0,75 | 1,04 |

As far as the evaluation provided by participants regarding their learning at the workshop (Level 2) is concerned, while they think they have learnt a lot, there is some uncertainty related to its application in daily practice; in particular, their ratings of the possibility that what they learnt will speed up their learning design activities are quite low (Table 3).
Table 3: Learning (OU participants)

| Learning (from 0=very low to 4=very high) | I have learnt a lot from this workshop | I look forward to putting into practice what I have just learnt about learning design | I intend to use the ILDE in my future learning design activities | Applying what I have learnt during this workshop will improve the quality of my courses | What I have learnt in this workshop will speed up my learning design activities |
|---|---|---|---|---|---|
| mean | 3,00 | 2,92 | 2,23 | 2,77 | 2,00 |
| st. dev. | 1,00 | 0,95 | 1,09 | 0,83 | 1,00 |

Participants declared they had created (i.e. conceptualized or authored) a mean of 1,15 new learning designs each using the ILDE, while only two persons declared they had implemented 1 learning design each; this is because, due to time restrictions, the hands-on activity originally planned for the implementation phase was turned into a demonstration, so it is reasonable that not many people experimented with it (see D4.1, Rudman & Conole 2014). In any case, we should also keep in mind that at the OU people worked in teams, so these figures reflect group work rather than individual effort.

Participants were then asked to evaluate their most complete learning designs, and the ratings they provided are contained in Table 4. Obviously, we should consider that these first creations are necessarily explorative in nature, given also the limited time available during the workshop.

Table 4: Learning: ratings of the designs (OU participants)

| Learning (from 0=very low to 4=very high) | Completeness | Originality | Complexity | Reusability |
|---|---|---|---|---|
| mean | 2,00 | 1,83 | 2,00 | 2,58 |
| st. dev. | 0,60 | 0,72 | 0,85 | 1,08 |

As to Level 3 (Organization support and change), there is a certain confidence that the ILDE, and more generally the practices learnt at the workshop, could be adopted by colleagues, but at an informal level rather than an institutional one: as a matter of fact, participants claim that institutions are hard to change and that usually very few tools achieve widespread/full adoption at their institution (see Table 5). As will be further discussed in this document, this perception may derive from the peculiar context of the OU, as this organization already has a production process in place for the generation of e-learning material and activities.
Table 5: Organizational support and change (OU participants)

| Org. support & change (from 0=very low to 4=very high) | What I have just learnt about learning design could alter the existing practices in my institution | The adoption of the ILDE as institutional tool to support learning design is likely to be fostered by my institution | What I have just learnt about learning design could be transferred to my colleagues | ILDE could be adopted by my colleagues |
|---|---|---|---|---|
| mean | 2,38 | 1,73 | 2,75 | 2,55 |
| st. dev. | 1,19 | 0,90 | 0,75 | 0,93 |

Lastly, when asked to list three words to describe the workshop, participants expressed the words shown in Figure 4.

Figure 4 - Wordle created with the words expressed by participants to describe the workshop (OU). The word cloud gives greater prominence to words that were used more frequently (http://www.wordle.net/).

As one may see from the Wordle, "interesting", "stimulating" and "informative" are recurrent terms, but "rushed" is also a keyword that we need to take into due consideration. The most appreciated activities were the paper prototypes and the storyboarding; the most appreciated side-effect was considered to be the possibility of meeting and discussing with colleagues. Suggestions for possible improvements concerned mainly better timing, possibly anticipating some activities and/or providing background readings before the workshop itself, or even spreading it over another session. Besides, better preparation of the guidance materials and better targeting of participants' needs were also advocated.
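The word clouds in this report were generated with the Wordle service; the underlying weighting is a plain frequency count over the participants' free-text answers, along the lines of the sketch below (the answers shown are invented for illustration).

```python
# Minimal sketch of the frequency weighting behind a word cloud such as
# Figure 4: words mentioned more often get greater prominence.
# The three-word answers below are invented for illustration.
from collections import Counter

three_word_answers = [
    "interesting stimulating rushed",
    "interesting informative collaborative",
    "stimulating interesting useful",
]

frequencies = Counter(
    word for answer in three_word_answers for word in answer.lower().split()
)
# A word-cloud renderer would scale each word's font size by its count.
print(frequencies.most_common(3))
```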
Participant's opinions about the ILDE

As for the participants' evaluation of the ILDE, Table 6 shows that OU participants were not so positive concerning the ease of use and usefulness of accessing the various tools within the platform and understanding the differences among them. Usefulness in the table is always rated higher than ease of use.

Table 6: Functions to create new designs (OU participants)

| Functions to create new LdS (from 0=min. to 4=max.) | how many say they did it | ease of use (mean / st. dev.) | usefulness (mean / st. dev.) |
|---|---|---|---|
| understanding that the ILDE provides access to a variety of tools | 10 | 2,50 / 0,85 | 2,67 / 0,87 |
| understanding the differences between these tools | 9 | 2,11 / 0,60 | 2,75 / 0,89 |
| finding these tools in the ILDE | 8 | 2,00 / 0,76 | 2,57 / 0,79 |
| access/enter these tools from the ILDE interface | 8 | 2,13 / 0,83 | 2,43 / 0,98 |

Note: in the ILDE terminology, LdS stands for learning designs (see D2.1).

According to the tracked data, participants' exploration of the tools for conceptualizing the design was mainly focused on the Heuristic Evaluation tool (which was encouraged by the facilitator), with some exploration of the Persona Card, the Factors and Concerns template and the Image upload (see the "n. of designs (tracked data)" column of Table 7). The ease of use of the Heuristic Evaluation was rated on average 2,30 (st. dev. 0,67) and its usefulness 2,80 (st. dev. 0,79). Higher ratings were attributed to the ease of use and usefulness of the Persona Card (see Table 7). Unfortunately, no ratings are available for the Factors and Concerns function.
Table 7: Conceptualize functions (OU participants; a description of the Conceptualize functions can be found in D2.1)

| Conceptualize functions (from 0=min. to 4=max.) | how many say they used it | ease of use (mean / st. dev.) | usefulness (mean / st. dev.) | n. of designs (tracked data) |
|---|---|---|---|---|
| Course Map | 2 | 2,50 / 0,71 | 2,50 / 0,71 | 0 |
| Design Pattern | 3 | 2,33 / 0,58 | 2,33 / 0,58 | 0 |
| Design Narrative | 1 | 2,00 | 2,00 | 0 |
| Persona Card | 4 | 3,25 / 0,96 | 3,25 / 0,50 | 1 |
| Factors and Concerns | 0 | -- | -- | 2 |
| Heuristic Evaluation | 10 | 2,30 / 0,67 | 2,80 / 0,79 | 10 |
| CompendiumLD (upload) | 0 | -- | -- | 0 |
| Image (upload) | 3 | 2,00 / 1,00 | 3,00 / 1,00 | 3 |
| For other conceptualizations | 0 | -- | -- | N.A. |

Note: N.A. = not available; this means that the ILDE didn't track this function.

Note: the table shows that 4 persons expressed opinions about the Persona Card (second column), while the data tracked by the ILDE say that only 1 LdS was created with this function (last column). This should not be regarded as contradictory (similar situations will be common in the following tables too); it may be caused by one of the following: a) a group of participants jointly explored one function (using one group account) and then all its members expressed their opinions about the function in the questionnaire; b) a number of people entered the function and rated it, but only a subset of them saved their LdS; c) a function was demonstrated at the workshop, so even if participants didn't experience it directly, they rated it on the basis of what they had been shown.

As to the Authoring functions, the OU participants experimented extensively with WebCollage, given that 17 designs were created (to be noted that the 17 designs authored might include different versions of a design aimed at the same outcomes; see the last column in Table 8); they rated its usefulness positively, but they did not find it easy to use (Table 8).

Table 8: Author functions (OU participants; a description of the Author functions can be found in D2.1)

| Author functions (from 0=min. to 4=max.) | how many say they used it | ease of use (mean / st. dev.) | usefulness (mean / st. dev.) | n. of designs (tracked data) |
|---|---|---|---|---|
| WebCollage | 8 | 1,88 / 0,64 | 3,00 / 0,58 | 17 |
| OpenGLM | 0 | -- | -- | N.A. |
| CADMOS | 0 | -- | -- | N.A. |

Evaluation data about the Implementing functions are not very meaningful (see Table 9), given that no designs were implemented and only one person rated them (implementation was demonstrated at the OU; see D4.1, Rudman & Conole 2014).
Table 9: Implement functions (OU participants; a description of the Implement functions can be found in D2.1)

| Implement functions (from 0=min. to 4=max.) | how many say they used it | ease of use (mean / st. dev.) | usefulness (mean / st. dev.) | n. of designs (tracked data) |
|---|---|---|---|---|
| Implement in a VLE through Glue!-PS | 0 | -- | -- | 0 |
| See your VLE | 1 | 2,00 | 3,00 | N.A. |
| Register your VLE | 0 | -- | -- | 0 |

More informative are the ratings attributed to the Browsing functions, the most experimented with and appreciated being the free search and tag browsing (see Table 10). Although not rated by many people, the data show very good results, especially as far as usefulness is concerned, but also for the ease of use of the Browsing functions.

Table 10: Browsing functions (OU participants; a description of the Browsing functions can be found in D2.1)

| Browsing functions (from 0=min. to 4=max.) | how many say they used it | ease of use (mean / st. dev.) | usefulness (mean / st. dev.) |
|---|---|---|---|
| Free search | 5 | 3,40 / 0,55 | 3,60 / 0,55 |
| Browse by tools | 1 | 3,00 | 4,00 |
| Search patterns | 1 | 3,00 | 0,31 |
| Browse by tags | 4 | 3,50 / 0,58 | 3,75 / 0,50 |
| Browse by discipline | 1 | 3,00 | 4,00 |
| Browse by pedagogical approach | 1 | 3,00 | 4,00 |

As far as the Sharing functions are concerned, the few people who experienced them judged them positively. Very likely, the reason why few people managed to use these functions has to do with the lack of time devoted to them during the workshops rather than with difficulties of use (Table 11). It is reasonable to assume that the Sharing functions will play a more relevant role in the follow up of the workshop, when people may feel the need to share their real designs with others for mutual enrichment and exchange. Interestingly, in one case ("Share a LdS with others with view rights"), even though it seems that many people explored this function, only one of them rated the corresponding item in the questionnaire. This may raise doubts about the clarity of the question.
Table 11: Sharing functions (OU participants; a description of the Sharing functions can be found in D2.1)

| Sharing functions (from 0=min. to 4=max.) | how many say they used it | ease of use (mean / st. dev.) | usefulness (mean / st. dev.) | n. of actions (tracked data) |
|---|---|---|---|---|
| Create a LDshakers' group | 1 | 3,00 | 2,00 | N.A. |
| Share a LdS with others with view rights | 1 | 4,00 | 4,00 | 13 |
| Share a LdS with others with edit rights | 0 | -- | -- | 3 |
| Add a comment to a LdS | 0 | -- | -- | 0 |
| Exchange messages with other LDshakers | 4 | 2,50 / 1,29 | 2,75 / 1,50 | N.A. |
| View someone else's LdS | 1 | 3,00 | 2,00 | N.A. |
| Edit someone else's LdS | 0 | -- | -- | N.A. |

Even if only five participants provided words to describe the ILDE, Figure 5 shows that, while the platform is considered (potentially) useful, at this stage of development it is still considered unintuitive.

Figure 5 - Wordle created with the words expressed by participants to describe the ILDE (OU)

According to participants, the most appreciated aspect of the ILDE is its ability to make people reflect on the design process, thanks to the rigorousness of the approach and the possibility of sharing designs with others. As far as possible improvements are concerned, even if many participants declared it is too early to give an opinion on that, four of them declared the interface should be improved, so as to make it
easier to use and navigate, and two others suggested that the terminology/labelling should be improved.

Workshop at KEK

Data about the workshop participants

As extensively described in D4.1 (Rudman & Conole 2014), the workshop at KEK took place on 9 November 2013 and involved 18 participants, 16 of whom (89%) filled in the questionnaire.

According to the data gathered through the questionnaire, they are either school teachers (in the fields of chemistry, physics and biology) or teacher trainers, except for 1 school counselor. It should be noted that, although KEK was chosen as representative of the vocational training sector, in this first round the enrolled participants did not belong to the expected category; this means that the data emerging from this workshop shouldn't be considered as belonging to the vocational training sector, although they still remain of interest for the project, as they provide useful feedback about the way the workshop itself and the ILDE can be perceived by school teachers. As foremost background, 14 declare educational science and 2 computer science. As for their previous experience in learning design, 12 persons declare they have been working in this field for a few years, while 3 declare they are experts (1 missing answer). Despite such prior experience, all the tools addressed at the workshop were previously unknown to participants.

When asked about the motivations that led them to attend the workshop, they answered by rating from 0 to 4 the statements reported as column headings of Table 12. The results indicate that intrinsic (personal interest/need) and extrinsic (social norms) motivation played an equal role. Only one person rated the option "I was forced to enrol in this workshop" with 1, the remaining ratings being 0.

Table 12: Motivation (KEK participants)

| Motivation (from 0=inessential to 4=very important) | I am interested in learning design | I need to learn about learning design | People I work with believe learning design is important | I was forced to enrol in this workshop |
|---|---|---|---|---|
| mean | 3,73 | 3,27 | 3,67 | 0,33 |
| st. dev. | 0,46 | 0,88 | 0,62 | 1,05 |

Participant's opinions about the workshop (Levels 1, 2 and 3 of Guskey's model)

Looking at the evaluation provided about the workshop, and in particular at participants' reactions (Level 1), it seems that the most appreciated elements of the workshop at KEK were the relevance of the contents and the quality of presentations. Similarly to what happened at the OU, the hands-on activities and the quality of discussion were also positively
evaluated. In contrast, the suitability of the workshop with regard to prior competence was judged just adequate, and timing was rated very badly. In the free comments, most of the respondents claim that the time devoted to the presentation of, and familiarization with, the ILDE was definitely insufficient (Table 13).

Table 13: Reactions (KEK participants)

Reactions (from 0=very low to 4=very high) | mean | st. dev.
Suitability with regard to your prior competence | 2,50 | 0,71
Relevance of the workshop contents | 3,33 | 0,49
Quality of presentations | 3,33 | 0,62
Quality of the discussion | 3,20 | 0,56
Effectiveness of hands-on activities | 3,20 | 0,86
Adequacy of time schedule | 1,47 | 1,19
Adequacy of rooms and facilities | 3,20 | 0,56

As far as the evaluation data provided by participants regarding their learning at the workshop (Level 2) are concerned, ratings are quite high, and participants are positive about the possibility to use the ILDE and apply what they have just learnt in their daily design practice.

Table 14: Learning (KEK participants)

Learning (from 0=very low to 4=very high) | mean | st. dev.
I have learnt a lot from this workshop | 2,93 | 0,80
I look forward to putting into practice what I have just learnt about learning design | 2,87 | 0,92
I intend to use the ILDE in my future learning design activities | 3,00 | 0,76
Applying what I have learnt during this workshop will improve the quality of my courses | 3,00 | 0,76
What I have learnt in this workshop will speed up my learning design activities | 2,93 | 0,80

Participants declare they have created (i.e. conceptualized or authored) a mean of 1 new learning design each using the ILDE, while no learning designs were implemented, because the implementation phase at KEK was only demonstrated (and not directly carried out by participants, see D4.1, Rudman & Conole 2014). Participants were then asked to evaluate their most complete learning designs, and the ratings they provided are contained in Table 15. These ratings are not particularly high, probably because these designs represent the participants' first attempts at creating a design in the ILDE.
Table 15: Learning: ratings of the designs (KEK participants)

Learning (from 0=very low to 4=very high) | mean | st. dev.
Completeness | 2,73 | 0,47
Originality | 2,50 | 0,85
Complexity | 2,36 | 0,50
Reusability | 3,27 | 0,79

As to Level 3 (Organization support and change), people are pretty confident that what they have learnt (in terms of methods) could be transferred to their colleagues, but much less so that these colleagues would adopt the ILDE. This may derive from the fact that in Greece infrastructure and teachers' technological skills are on average insufficient, as the teachers explain in the open question.

Table 16: Organizational support and change (KEK participants)

Org. support & change (from 0=very low to 4=very high) | mean | st. dev.
What I have just learnt about learning design could alter the existing practices in my institution | 2,79 | 0,70
The adoption of the ILDE as institutional tool to support learning design is likely to be fostered by my institution | 2,43 | 1,02
What I have just learnt about learning design could be transferred to my colleagues | 3,21 | 0,70
ILDE could be adopted by my colleagues | 1,93 | 1,07

Lastly, when asked to list three words to describe the workshop, participants expressed the terms shown in Figure 6. As one can see from the Wordle, "interesting", "collaboration/collaborative" and "experiential" are the most recurrent terms but, similarly to what was declared at the OU, here too teachers said the workshop was short.

Figure 6 - Wordle™ created with the words expressed by participants to describe the workshop (KEK)
The most appreciated activities were those carried out in groups (because of the possibility to exchange views with others) and the practical and experiential ones; besides, the presentation of collaborative strategies and techniques was also appreciated. All the suggestions for possible improvements concerned a longer duration of the event and a better timing.

Participants' opinions about the ILDE

As far as the evaluation of the ILDE is concerned, all the participants positively rated both the ease of use and the usefulness of accessing the various tools within the platform, and found it pretty easy to understand the differences among them.

Table 17: Functions to create new designs (KEK participants)

Functions to create new LdS (from 0=min. to 4=max.) | how many say they did it | ease of use (mean / st. dev.) | usefulness (mean / st. dev.)
understanding that the ILDE provides access to a variety of tools | 16 | 3,13 / 0,52 | 3,19 / 0,54
understanding the differences between these tools | 16 | 3,00 / 0,76 | 3,06 / 0,77
finding these tools in the ILDE | 0 | -- | --
access/enter these tools from the ILDE interface | 16 | 3,00 / 0,53 | 3,06 / 0,57

As a matter of fact, though, according to the data tracked by the ILDE itself, only a subset of the Conceptualize functions were explored during the workshop, namely Course Map, Persona Card and the Image upload function (see the "tracked data" column in Table 18). The evaluation of the ease of use and usefulness of Course Map and Persona Card is positive but not outstanding. Besides, at the KEK workshop a customized, paper-based version of the Design Narrative tool was used, and this is what the participants rated under "For other conceptualizations". Given that this paper-based tool was judged so positively in this context, as far as both its ease of use and its usefulness are concerned, it was agreed that this tool will be integrated into the ILDE for the next round of workshops.

Table 18: Conceptualize functions (KEK participants)

Conceptualize functions (from 0=min. to 4=max.) | how many say they used it | ease of use (mean / st. dev.) | usefulness (mean / st. dev.) | n. of designs (tracked data)
Course Map | 16 | 2,75 / 0,68 | 2,75 / 0,68 | 3
Design Pattern | 0 | -- | -- | 0
Design Narrative | 0 | -- | -- | 0
Persona Card | 16 | 2,53 / 0,83 | 2,63 / 0,89 | 7
Factors and Concerns | 0 | -- | -- | 0
Heuristic Evaluation | 0 | -- | -- | 0
CompendiumLD (upload) | 0 | -- | -- | 0
Image (upload) | 0 | -- | -- | 7
For other conceptualizations | 16 | 3,25 / 0,68 | 3,27 / 0,70 | N.A.

Among the Author functions, participants used WebCollage (11 designs were created with WebCollage at the workshop, see the last column of Table 19), and opinions about it are positive, as far as both ease of use and usefulness are concerned.

Table 19: Author functions (KEK participants)

Author functions (from 0=min. to 4=max.) | how many say they used it | ease of use (mean / st. dev.) | usefulness (mean / st. dev.) | n. of designs (tracked data)
WebCollage | 16 | 3,31 / 0,70 | 3,31 / 0,70 | 11
OpenGLM | 0 | -- | -- | N.A.
CADMOS | 0 | -- | -- | N.A.

Unfortunately, no answers were provided about the Implementation functions of the ILDE, even if some activities were registered by the system (see the "tracked data" column in Table 20).

Table 20: Implement functions (KEK participants)

Implement functions (from 0=min. to 4=max.) | how many say they used it | ease of use (mean / st. dev.) | usefulness (mean / st. dev.) | n. of designs (tracked data)
Implement in a VLE through Glue!-PS | 0 | -- | -- | 3
See your VLE | 0 | -- | -- | N.A.
Register your VLE | 0 | -- | -- | 3
As far as the Browsing functions are concerned, the Free search and Browse by tags functions were evaluated as both relatively easy to use and useful.

Table 21: Browsing functions (KEK participants)

Browsing functions (from 0=min. to 4=max.) | how many say they used it | ease of use (mean / st. dev.) | usefulness (mean / st. dev.)
Free search | 16 | 2,87 / 0,74 | 2,87 / 0,74
Browse by tools | 0 | -- | --
Search patterns | 0 | -- | --
Browse by tags | 16 | 2,79 / 0,89 | 2,79 / 0,89
Browse by discipline | 0 | -- | --
Browse by pedagogical approach | 0 | -- | --

The sharing functionalities of the platform were appreciated, but here the data tracked by the system show that only the "Share a LdS with others with view rights" function was actually tried out by the workshop participants; the other functions positively rated by respondents ("Exchange messages with other LDshakers", "View someone else's LdS" and "Edit someone else's LdS") have probably not been experienced directly. It should be noted that here again, as happened at the OU, although the system tracked a number of actions for some of the sharing functions, not all the corresponding items of the questionnaire were rated by the workshop participants. This may confirm the preliminary hypothesis that the question is somehow misleading and not clear enough, and that this prevents people from answering; a simple cross-check of the two data sources, sketched below, makes such mismatches explicit.

Table 22: Sharing functions (KEK participants)

Sharing functions (from 0=min. to 4=max.) | how many say they used it | ease of use (mean / st. dev.) | usefulness (mean / st. dev.) | n. of actions (tracked data)
Create a LDshakers' group | 0 | -- | -- | N.A.
Share a LdS with others with view rights | 16 | 3,40 / 0,74 | 3,40 / 0,74 | 15
Share a LdS with others with edit rights | 0 | -- | -- | 14
Add a comment to a LdS | 0 | -- | -- | 2
Exchange messages with other LDshakers | 16 | 3,38 / 0,81 | 3,38 / 0,81 | N.A.
View someone else's LdS | 16 | 3,40 / 0,74 | 3,40 / 0,74 | N.A.
Edit someone else's LdS | 16 | 3,31 / 0,95 | 3,31 / 0,95 | N.A.
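As a minimal illustration of such a cross-check (this is not project code; the figures are copied from Table 22, while the column and variable names are invented for the example), the following Python sketch flags the sharing functions for which self-reported use and tracked actions disagree:

```python
# Sketch: juxtapose self-reported use of the sharing functions (questionnaire)
# with the actions tracked by the ILDE, flagging mismatches such as
# "Share a LdS with others with edit rights" (0 self-reports, 14 tracked actions).
import pandas as pd

rows = [
    # (function, n. saying they used it, n. of tracked actions; NaN = not tracked)
    ("Create a LDshakers' group",                  0, float("nan")),
    ("Share a LdS with others with view rights",  16, 15),
    ("Share a LdS with others with edit rights",   0, 14),
    ("Add a comment to a LdS",                     0, 2),
]
df = pd.DataFrame(rows, columns=["function", "self_reported", "tracked"])

# A mismatch - tracked activity without self-reported use, or vice versa -
# suggests the corresponding questionnaire item may be unclear.
mismatch = df["tracked"].notna() & (
    ((df["self_reported"] == 0) & (df["tracked"] > 0))
    | ((df["self_reported"] > 0) & (df["tracked"] == 0))
)
print(df[mismatch])
```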
Unfortunately, none of the respondents at KEK provided information in the open questions as to the most appreciated aspects of the ILDE, nor did they offer suggestions for possible improvements of the platform.

Workshop at Agora

Data about the workshop participants

As extensively described in D4.1 (Rudman & Conole 2014), the workshop at Agora was organized into three sessions, which took place on 16, 23 and 30 November 2013, and involved a total of 13 participants, 12 of whom filled in the questionnaire (92%). As far as their profiles are concerned, at least 6 of them are involved in the Agora activities (some are volunteers); as foremost background, 9 declare educational science and 3 educational technology. As far as their previous experience in the learning design field is concerned, all except one declare they are beginners. All the tools proposed at the workshop were previously unknown to participants.

When asked about the motivations which led them to attend the workshop, they answered by rating a set of statements (from 0 to 4). From the data contained in Table 23, it emerges that in this context intrinsic motivation (personal interest/need) played a strong role in the decision making, probably a little stronger than extrinsic motivation (social norms). No one in this context declared they had been forced to enrol.

Table 23: Motivation (Agora participants)

Motivation (from 0=inessential to 4=very important) | mean | st. dev.
I am interested in learning design | 3,80 | 0,42
I need to learn about learning design | 3,40 | 1,26
People I work with believe learning design is important | 3,33 | 1,00
I was forced to enrol in this workshop | 0,00 | 0,00

Participants' opinions about the workshop (Levels 1, 2 and 3 of Guskey's model)

Looking at the evaluation provided about the workshop, and in particular at participants' reactions (Level 1), it seems that the most appreciated elements were the quality of presentations and the quality of the discussion. The hands-on activities were also positively evaluated, and the contents were considered quite relevant. Not surprisingly, timing was not a particular issue here; this is different from what happened at the OU and KEK, probably because at Agora the workshop lasted much longer than in the other two contexts.
Table 24: Reactions (Agora participants)

Reactions (from 0=very low to 4=very high) | mean | st. dev.
Suitability with regard to your prior competence | 3,00 | 0,71
Relevance of the workshop contents | 3,27 | 0,65
Quality of presentations | 3,70 | 0,48
Quality of the discussion | 3,50 | 0,71
Effectiveness of hands-on activities | 3,30 | 0,95
Adequacy of time schedule | 2,90 | 0,74
Adequacy of rooms and facilities | 3,00 | 0,87

As far as the evaluation provided by participants regarding their learning at the workshop (Level 2) is concerned, their opinions are quite positive: not only do they think they learnt a lot, but their impressions concerning the possibility to use the ILDE and adopt what was proposed in the workshop in their daily practice also sound promising.

Table 25: Learning (Agora participants)

Learning (from 0=very low to 4=very high) | mean | st. dev.
I have learnt a lot from this workshop | 3,55 | 0,82
I look forward to putting into practice what I have just learnt about learning design | 3,20 | 0,79
I intend to use the ILDE in my future learning design activities | 3,20 | 0,79
Applying what I have learnt during this workshop will improve the quality of my courses | 3,20 | 0,79
What I have learnt in this workshop will speed up my learning design activities | 3,00 | 0,67

Participants declare they have created (i.e. conceptualized or authored) a mean of 1,58 new learning designs each using the ILDE, and have implemented 1,42 learning designs; this last figure does not entirely reflect what happened in reality, as implementation at Agora was carried out as a demonstration activity (see D4.1, Rudman & Conole 2014). Participants were then asked to evaluate their most complete learning designs, and the ratings they provided are contained in Table 26. The ratings given to the completeness and originality of the created learning designs are higher than those at the OU and KEK. This could be explained by the fact that, at Agora, more time was devoted to the use of the platform, which resulted in higher participant satisfaction with their first attempts at creating new learning designs.

Table 26: Learning: ratings of the designs (Agora participants)

Learning (from 0=very low to 4=very high) | mean | st. dev.
Completeness | 3,20 | 0,79
Originality | 3,20 | 0,92
Complexity | 2,36 | 0,81
Reusability | 2,60 | 0,97
As to Level 3 (Organization support and change), people seem positive about the possibility that both the ILDE and the proposed learning design practices can be transferred to their colleagues and even be adopted/fostered at institutional level.

Table 27: Organizational support and change (Agora participants)

Org. support & change (from 0=very low to 4=very high) | mean | st. dev.
What I have just learnt about learning design could alter the existing practices in my institution | 3,00 | 0,53
The adoption of the ILDE as institutional tool to support learning design is likely to be fostered by my institution | 3,22 | 0,67
What I have just learnt about learning design could be transferred to my colleagues | 3,10 | 1,45
ILDE could be adopted by my colleagues | 3,78 | 0,44

Lastly, when asked to list three words to describe the workshop, participants expressed the terms shown in Figure 7. Terms such as "interesting", "practical" and "learning" are the most frequent ones.

Figure 7 - Wordle™ created with the words expressed by participants to describe the workshop (Agora, in Spanish [13])

When asked to identify the most appreciated activities, most participants answered that they had enjoyed the experience as a whole, and not many suggestions for improvements were provided.

Participants' opinions about the ILDE

[13] The Wordle is in Spanish because it was created using as input the actual answers provided by the participants, which were given in Spanish.
Looking at the participants' impressions about the ILDE, from Table 28 it seems that, while they have been able to appreciate the usefulness of the various tools offered by the platform, their evaluation of its ease of use is less positive.

Table 28: Functions to create new designs (Agora participants)

Functions to create new LdS (from 0=min. to 4=max.) | how many say they did it | ease of use (mean / st. dev.) | usefulness (mean / st. dev.)
understanding that the ILDE provides access to a variety of tools | 10 | 2,89 / 0,60 | 3,75 / 0,46
understanding the differences between these tools | 10 | 2,56 / 0,53 | 3,25 / 0,46
finding these tools in the ILDE | 10 | 2,67 / 0,87 | 3,63 / 0,52
access/enter these tools from the ILDE interface | 9 | 2,67 / 0,50 | 3,50 / 0,53

According to the data tracked by the ILDE, at the Agora workshop people explored the following Conceptualize tools: Design Narrative, Persona Card and Image. The ratings given by participants to the ease of use and usefulness of the former two tools are quite high; the Design Patterns were also positively rated. Given that the system didn't track any activity on these, though, we should assume either that the participants only accessed them without saving, or that the Design Patterns were demonstrated by the facilitator.
Table 29: Conceptualize functions (Agora participants)

Conceptualize functions (from 0=min. to 4=max.) | how many say they used it | ease of use (mean / st. dev.) | usefulness (mean / st. dev.) | n. of designs (tracked data)
Course Map | 0 | -- | -- | 0
Design Pattern | 7 | 3,14 / 0,38 | 3,71 / 0,49 | 0
Design Narrative | 6 | 3,17 / 0,41 | 3,67 / 0,52 | 4
Persona Card | 6 | 3,17 / 0,75 | 3,50 / 0,55 | 8
Factors and Concerns | 1 | 2,00 / -- | 3,00 / -- | 0
Heuristic Evaluation | 1 | 3,00 / -- | 3,00 / -- | 0
CompendiumLD (upload) | 2 | 4,00 / 0,00 | 4,00 / 0,00 | 0
Image (upload) | 1 | 4,00 / -- | 4,00 / -- | 4
For other conceptualizations | 1 | 3,00 / -- | 3,00 / -- | N.A.

In the case of the authoring functions (WebCollage was the only one tried out), usefulness was more appreciated than ease of use.

Table 30: Author functions (Agora participants)

Author functions (from 0=min. to 4=max.) | how many say they used it | ease of use (mean / st. dev.) | usefulness (mean / st. dev.) | n. of designs (tracked data)
WebCollage | 9 | 2,75 / 0,71 | 3,71 / 0,49 | 13
OpenGLM | 0 | -- | -- | N.A.
CADMOS | 0 | -- | -- | N.A.

Data on Implementation confirm that only a sub-set of participants implemented and deployed their designs; the impressions gathered about the ease of use and usefulness of the implementation functionalities are again quite promising.
Table 31: Implement functions (Agora participants)

Implement functions (from 0=min. to 4=max.) | how many say they used it | ease of use (mean / st. dev.) | usefulness (mean / st. dev.) | n. of designs (tracked data)
Implement in a VLE through Glue!-PS | 6 | 3,33 / 0,82 | 3,83 / 0,41 | 3
See your VLE | 4 | 2,15 / 0,58 | 3,67 / 0,58 | N.A.
Register your VLE | 4 | 3,75 / 0,50 | 3,50 / 0,58 | 3

In the case of the browsing functions, the data here are more exhaustive than those obtained at KEK and the OU, and the ratings obtained are very high (especially for usefulness).

Table 32: Browsing functions (Agora participants)

Browsing functions (from 0=min. to 4=max.) | how many say they used it | ease of use (mean / st. dev.) | usefulness (mean / st. dev.)
Free search | 2 | 3,50 / 0,71 | 4,00 / 0,00
Browse by tools | 5 | 3,25 / 0,50 | 3,50 / 0,58
Search patterns | 4 | 3,50 / 0,58 | 3,50 / 0,58
Browse by tags | 5 | 2,60 / 0,55 | 3,00 / 0,71
Browse by discipline | 2 | 3,00 / 0,00 | 3,50 / 0,71
Browse by pedagogical approach | 0 | -- | --

The questions related to the sharing functionalities were also extensively answered, and the related functions were judged positively. Only a couple of low ratings regard the ease of use of the functions "Share a LdS with others with edit rights" and "Edit someone else's LdS" (but both these questions were answered by two persons only).
Table 33: Sharing functions (Agora participants)

Sharing functions (from 0=min. to 4=max.) | how many say they used it | ease of use (mean / st. dev.) | usefulness (mean / st. dev.) | n. of actions (tracked data)
Create a LDshakers' group | 7 | 3,14 / 0,38 | 3,67 / 0,52 | N.A.
Share a LdS with others with view rights | 5 | 3,60 / 0,55 | 4,00 / 0,00 | 12
Share a LdS with others with edit rights | 2 | 2,50 / 0,71 | 3,00 / 0,00 | 6
Add a comment to a LdS | 2 | 3,50 / 0,71 | 4,00 / 0,00 | 0
Exchange messages with other LDshakers | 4 | 3,25 / 0,50 | 3,75 / 0,50 | N.A.
View someone else's LdS | 4 | 3,25 / 0,50 | 3,75 / 0,50 | N.A.
Edit someone else's LdS | 2 | 2,50 / 0,71 | 2,50 / 0,71 | N.A.

Among the terms most used by participants to describe the ILDE, we can find: good (buena), precision (precisión), sharing (compartir) and innovation (innovación).

Figure 8 - Wordle™ created with the words expressed by participants to describe the ILDE (Agora, in Spanish)

Among the aspects of the ILDE they most appreciated, Agora participants mention clarity (which is somewhat in contradiction with the fact that the ease of use of some of the ILDE functions was not always positively rated) and sharing.
When asked to provide suggestions for possible improvements, they mention the need to increase the ILDE's ease of use and make it more intuitive; at the same time, they suggest devoting more time to practising with the ILDE.

Data from the enactments

As already mentioned, after the workshops a sub-set of participants in each context were involved in the enactment phase: 1-2 people from each context agreed to complete a learning design and deliver it in a real learning situation. This happened in the first months of 2014, with different modes and timing in the various situations. An account of the enactments is contained in D4.1 (Rudman & Conole 2014). In the following, we summarize the results of the analysis of the data collected through the interviews (types A, B and C) administered at the end of each enactment. As already explained in Section 3 of this document, there were three interview types, according to the various enactment situations that originated within the three contexts (see Section 3 for more details).

Enactment at OU

At the OU, the enactment in the first round followed a different organizational path from those carried out at KEK and Agora (presented in the following sections); to account for this difference, it is necessary to provide a brief description of the current state of e-learning production and delivery at the OU.

As briefly mentioned in D4.1 (Rudman & Conole 2014), the OU has a central VLE that is used by all its faculties to deliver learning activities and materials to students. Within the OU, every course module is produced by a team of people with a variety of skills, including academic subject specialists, media designers, project managers and VLE specialists. The OU has a standard process for the production of online, print and other media, which features the use of an in-house XML schema to store content; this XML content facilitates the production of material in a variety of media. Producing a 30-ECTS (European Credit Transfer System)-point module typically takes around 2 years, though this varies from module to module. After production is completed, each module is available to students for a life span of up to around 10 years (though it may be modified during its lifespan in response to feedback from students). Every module is taught by a number of Associate Lecturers (ALs), individuals who are appointed to the role on the basis of their subject knowledge and ability to teach at a distance. There is typically one AL per fifteen students, and student populations range from several thousand for a level one undergraduate module to around a hundred for postgraduate modules. Usually most, if not all, of the ALs employed to teach a particular module will not have been involved in its development.

In the workshop at the OU, participants worked in 4 groups, each related to a particular faculty. The aim was to simulate the OU's module team experience by having people with a variety of skills in each group, e.g. academic, media development and project management skills (see Rudman & Conole 2014 for more details about the make-up of the groups
during the workshop). After the workshop, two participants (belonging to distinct groups) expressed interest in a follow-up workshop with a view to enacting collaborative learning designs. These two individuals were a Regional Manager from the OU's Business School within the Faculty of Business and Law, and a Lecturer from the Faculty of Health & Social Care. Unfortunately, circumstances then changed with respect to the planned OU follow-up with the Faculty of Business and Law, and at the moment this Faculty is no longer able to guarantee the enactment. In the following section we describe in more detail the situation at the Faculty of Health & Social Care.

Follow up at the Faculty of Health & Social Care

One of the participants at the OU pilot workshop in October 2013 was a 'Module Chair' for a new module that is currently under development within the Faculty of Health and Social Care. In OU module development, the role of the 'Module Chair' is to act as chair-person of the team that will design the module and create the materials. The module being developed, 'Perspectives in health and social care', will run for 30 weeks and be worth 30 ECTS (European Credit Transfer System) credits.

Following the OU workshop, the Module Chair of 'Perspectives in health and social care' invited the OU METIS team to support the consideration of, and if deemed appropriate, the design and development of collaborative activities within the module. In February 2014 the OU METIS team ran a 90-minute workshop for 3 academics from the module team. 'Perspectives in health and social care' is to consist of 3 blocks, each on a different theme, and each of the 3 academics who attended was the lead author of one block of the module (one of them being the Module Chair). In the first 20 minutes of the workshop, the METIS project and the METIS tools relevant to the design of collaborative learning were described by the facilitator, who was then briefed about the current status of the module. During the remaining 70 minutes, two workshop activities drawn from the METIS workshop structure were run: 'How to ruin a collaborative learning activity' and 'Conceptualize learning outcomes' (see D3.3, McAndrew et al., 2013). Following the workshop, the METIS project was thanked for its input, and the module team were confident that they could undertake the remainder of the design and implementation using the OU's normal production processes. The Module Chair has since reflected on the impact of METIS on the module's development:

"I had worked on module teams in the past where there were what appeared to me to be very well designed and thought-through collaborative learning activities, avoiding any of the obvious traps and difficulties of collaborative work. However, in presentation these had still been hated by students and sometimes ended up being removed from later presentations of the module. I was aware that team working is a key employability skill that can be hard to evidence in distance learning. I also knew I was expected to use Forums within the module and thought that collaborative learning was a good use of forums. I was therefore keen to attend the METIS one-day workshop in order to learn how to design collaborative learning
activities better. During the workshop I was able to work on a possible collaborative activity within K118 [K118 is the OU's internal code for the 'Perspectives in health and social care' module]. In the end, this didn't prove to be suitable for use, but it gave me a very useful practice run at the design process. One of the key points I took away was Mary Thorpe's [14]: that if you are asking students to give up the many benefits of asynchronous independent learning, there needs to be a damn good reason for them to do so. The mini version of the workshop at a module team meeting helped to convey a little of what I had learned to the other Block leads within the team, although only a small proportion, I think. We tried out one idea for a collaborative learning activity, but ended up rejecting it. We then came up with another idea, which we are still working on. This session also enabled us to make a strategic decision about the use of forums within the whole module which reversed our previous decision: we went from 'little and often and not-assessed' to 'infrequent and chunky and building towards assessment'" (Module Chair, 'Perspectives in health and social care', 12.03.2014).

The first run of this module will be from October 2014 to May 2015, with the enactment of the collaborative learning components scheduled to begin during December 2014. The target number of students for this first run is several hundred.

Interviews with workshop participants who are not enacting

A written interview was mailed to the 15 participants who attended the OU workshop but who had not expressed interest in proceeding to enactment (it is included as Appendix II c). The first mailing was made on 14/2/2014, and a reminder was sent on 28/2/2014. 5 responses were received. The faculties and roles of the 5 respondents were as follows: 2 were from the Mathematics, Computing and Technology faculty (a Staff Tutor and Library Services), 1 from Science (a media developer), 1 from Health and Social Care (Library Services) and 1 from the Business and Law faculty (a media developer).

All 5 reported that the workshop had not led to a change in practice. The reasons put forward shared the theme of lack of opportunity, e.g. "There has not been the opportunity, unfortunately", "I haven't had opportunity to put it into practice yet as no-one in my work area has been developing collaborative activities" and

[14] Professor Mary Thorpe presented the "Evidence and examples of collaborative learning" during the OU pilot workshop.
"Lack of opportunity to work sufficiently far upstream to influence design of collaborative activities".

In response to the question about plans for the future, none of the 5 had any specific plans to make use of what they had learnt in the workshop, with 2 indicating that any plans would be subject to the nature of the opportunities arising in the future. When asked if they had any other comments, one participant stated that "It would be helpful to have some sort of guidance to the ILDE tool - cribsheet or similar" and another mentioned that "Occasional email reminders of the main concepts, perhaps by short news update would prompt me to re-visit the workshop ideas".

As will be discussed later on, these responses seem to confirm the difficulty of disrupting existing practices in big institutions such as the OU, where existing methods and tools are so consolidated that people tend to keep using them. One consideration we may derive is that, given that the ILDE structure does not reflect the way the OU workflow is organized, the system as it stands seems to fit the needs of such an institution only partially.

Enactment at KEK

As already reported in D4.1 (Rudman & Conole 2014), at KEK there were two people enacting: one is a teacher who had attended the METIS workshop, the other is a trainer (more representative of the target population represented by KEK) who, however, hadn't attended the workshop. The latter enactor was included even though she lacked the necessary requirement (i.e. workshop attendance) because, given that the former enactor was not representative of the category for which KEK was initially selected by the project (i.e. vocational training), the consortium agreed that it would be important to have at least one enacting person from this sector; the compromise reached was thus to accept a newcomer for this specific purpose. In order to evaluate their experiences, the follow-up interviews were administered (Interview A and Interview B respectively).

In the following, the main results of the two interviews are provided separately; this is because the two enactors' backgrounds were different (with or without the workshop), and also because they represent two different working sectors, so we think it is important to keep their impressions separate.

Enactor #1 - without prior workshop attendance (vocational trainer)

The enacting person who hadn't attended the workshop (Enactor #1 in D4.1) is a 36-year-old woman who works at KEK as a training and coaching mentor in adult education; she holds a Master's Degree in International Education, but declares herself a beginner in the field of learning design. As explained in D4.1 (Rudman & Conole, 2014), prior to the enactment she was introduced to the topic of learning design and to the ILDE during a specific session with the workshop facilitator.

She designed some collaborative activities, first of all using the Design Narrative conceptualization tool, and then authored them in WebCollage. She then proposed the
designed activity to a class of 15 students, high school graduates who had not continued their education and are currently employed. The topic was "Introduction to Marketing Principles".

When asked about the motivation which led her to undertake this experience, she answered that she is very interested in learning design and thinks she needs to know something about it. When asked about her impressions of what she had learnt during the experience, she was moderately positive about the possibility of putting what she had learnt into her daily practice, and not so confident about using the ILDE in the future. In particular she said:

"It was an interesting experience, I used the ILDE for a course on Introduction to marketing principles for a group of students who have recently graduated from high school. Most of them have not continued their studies as yet and are already active in the job market. The group was not familiar with using Moodle so we never got to the phase of implementing the design. I got an insight of the implement process and what steps are involved only on a theoretical level. Although the time it would take was explained to me beforehand, I did not expect the enactment process would take so long. Although I was familiar with activities such as brainstorming and jigsaw, I did not expect that so many details would be needed when defining them. Overall I liked having to think through the activities and reach a deeper level of detail - usually this is only done in my head and on the fly - no documentation of the process before or after the class. I also like the sharing option although I was not able to find relevant material in the LdShake web site that could be useful for my course."

When asked to give her impressions about the student learning outcomes, she said:

"The particular student group was not very familiar with collaborative learning activities and was a bit sceptic at the beginning. They ended up enjoying both the brainstorming and the simulation and got to come up with ideas that helped them better understand the concepts at hand. They became more familiar with ways companies communicate with potential customers, ways of developing and presenting their ideas as well as with basic concept of enterprise resource management."

Even if she hadn't collected any evidence in the field, she thinks what she had designed improved her students' achievements and that the designs produced through the ILDE benefited her students a lot.

When asked to evaluate the ILDE more specifically, she said:

"I was not familiar with the existence of all these tools and I had the chance to familiarise myself with their basic characteristics. It took me some time to understand the differences between conceptualization, authoring and implementation."
As far as the Conceptualization tools offered by the ILDE are concerned, she only used the Design Narrative (mostly owing to time constraints) and her impression is quite positive, regarding both its ease of use and its usefulness. As for Authoring, WebCollage was judged not very easy to use:

"I found out I needed almost constant support when using WebCollage. Although I was familiar with the concepts and activities, when the tool was explained to me I found it hard to put it into my practice and quite time consuming when not familiar with the tool. It was interesting to use and I appreciated the help/pop-ups when choosing a phase/design pattern."

Unfortunately, as explained in D4.1 (Rudman & Conole 2014), the implementation phase within the ILDE was not reached. This was mostly due to the fact that the students were not at all familiar with the use of Moodle and there was little time for getting them to subscribe and use the VLE. The trainer got a preview of the ILDE implementation process on a theoretical level, while examples of similar implementations were shown to her by the facilitator. Thus the learning design was trialled, but not using Moodle. As a consequence, the trainer didn't provide answers for this section.

Her impressions about the browsing functions (Free search and Browse by discipline) are positive as far as usefulness is concerned, but not entirely positive as far as ease of use:

"I found the annotation of the different items in the ILDE a bit confusing. It took me a while to get used to the fact that items from WebCollage had a different thumbnail than items created in the conceptualize menu. I ran searches both in the KEK ILDE as well as in the LdShake environment. Sometimes the interface was quite slow in responding to my search."

Her impressions about the sharing functions, instead, are more encouraging:

"The sharing functions were easy to use, I did not get the chance to use the comments utility but I think it is useful for peer feedback."

Coming to the sustainability issues, the trainer declared that she thinks a workshop on learning design could be proposed on a larger scale at her institution and, more generally, at other institutions of the vocational training sector, while she is less positive about the possibility that the ILDE could be adopted on a larger scale at her institution or at similar institutions. Despite this, she thinks the ILDE is "Useful, collaborative, prototype" and, when asked what she had most appreciated of the ILDE, she said:

"The fact that it allows sharing of LDs and promotes collaboration between teachers, as well as the fact that it supports various tools and various levels of the learning design process."

When asked to suggest possible improvements, she answered:
"More examples on more domains to use as templates; a faster interface; a clearer distinction of the LD types/classification."

Overall, we must acknowledge that the trainer, probably owing to the lack of an adequate introduction to the topics and the system, was not able to exploit and evaluate all the ILDE functionalities, and the lack of the implementation phase is a limitation of this case study. In the future, therefore, the project should avoid involving people without adequate support and preparation in the use of the ILDE. Nonetheless, her answers, especially those regarding sustainability, are promising and further stress the need to involve other representatives of the vocational training sector in the next round of workshops.

Enactor #2 - with prior workshop attendance (primary school teacher)

The person who attended the workshop at KEK and decided to enact is a 29-year-old primary school teacher with a background in educational science, who considers her own experience in the learning design field to be intermediate. As already mentioned in D4.1 (Rudman & Conole 2014), during the enactment this teacher decided to create a completely new design from scratch, using the paper-based Conceptualization tool used during the workshop (a customization of the Design Narrative) and then WebCollage. She then delivered the design to a class of 23 students.

The teacher declares that during the enactment she put into practice, using the ILDE, what she had learnt at the workshop, and that doing so significantly improved the quality of her course; at the same time, she doesn't think that what she learnt has sped up her activities, nor that it is impacting the practices of her institution. In particular she said:

"I had a very useful experience in the METIS workshop as well as in the enactment phase. I think that there is a gap between theory and practice that I will take into consideration at my following learning designs. The learners characteristics in technology enhanced learning have very significant role that I wasn't able to imagine from the beginning of my learning design. In addition, there are many things that you have to consider better when you teach children. In my opinion the enactment of my learning scenario wasn't particularly successful but I learned many things from this procedure. To my mind, the enactment of learning scenarios should be basic part of the seminar. Participants would have the proper experience to discuss about the procedure of the enactment, the upcoming problems and other issues."

She thinks the design(s) she produced through the ILDE benefited her students:

"My students enjoyed the new style of teaching and particularly the collaborative and game-based activities. However, there were many drawbacks that I didn't take into account at the learning design, such as the mismatch of the duration of the activities (we spend more time than I had expected), the internet connection problems (the biggest part of the learning scenario was conducted offline) and the problem with the e-mails and passwords (students logged in as free users). Despite the difficulties, I think that both students and I gained a lot from this lesson."
When asked to reconsider her impressions about the ILDE after the enactment phase, she said:

"I would rate the ease of use and usefulness of For other conceptualizations (i.e. customized version of the Design Narrative) with 4, because I am more familiar with this tool and I think that it enhances very well the learning design. Also I would rate the usefulness of Course Map with 2, because I think that this process repeated at the WebCollage."

As far as the Authoring is concerned, she said:

"I would rate the ease of use of WebCollage with 3, because it's more complicate than I think for the first view. In addition I would rate the usefulness with 1 (in case that I don't want to implement my learning scenario with Moodle) and with 3 (in case that I want to implement it to Moodle), because I think I spend a lot of time for this representation."

When asked to evaluate the Implementation functions, she answered:

"I would rate the ease of use and usefulness of implement functions with 3, because I think these functions are only procedural. However I have again problems in registration of VLE (something with credentials)!"

Her ratings of the browsing and sharing functions are very positive. When asked to provide some general and final comments, she said:

"There are some issues/problems that were very disturbing and made me spend more time in the process of learning design such as: the slow loading of the ILDE page when I use its tool; the crashing of ILDE (many times); the VLE registration problem; in WebCollage the Phases/Activities without resource are not represented at all in Moodle; the erase of WebCollage file when you implemented it with GLUE!PS. I think it is very important, in case you have something wrong in your Moodle implementation and you want to change it. I think that it's a very good experience to design through ILDE because there are many tools for all tastes. I would like to continue to use ILDE but I think that new users need support (several hours of seminars) in order to understand the philosophy and the usefulness of authoring tools and implementation tools."

Overall, it seems that in the case of the primary school teacher, who was able to run the whole design cycle, the most appreciated tools are those that cover high-level design (the Conceptualization tools), together with the browsing and sharing functions. Less positive are her impressions of the authoring and implementation phases, but this may also derive from the technical issues she suffered. In any case, this is not surprising: for a primary school teacher, the advantage of setting up a Moodle course, Moodle certainly not being the most adequate Virtual Learning Environment for children of that age, is very moderate, especially if the operation is not completely smooth. In contrast, and again not surprisingly, the teacher greatly appreciates the Conceptualization tools, because they support a crucial phase of the design process that is often very critical and undocumented for school teachers.
Enactment at Agora

As already presented in D4.1 (Rudman & Conole 2014), three persons enacted at Agora, two of whom had attended the workshop, whereas the third didn't attend but usually works in close collaboration with one of the former participants. In this case, the latter enactor was accepted despite lacking the minimum requirement (i.e. workshop attendance) because it was judged that a specific introductory session on the topic, plus close and continuous collaboration with her colleague who had attended the workshop, could represent a sufficient background. This additional experience will also offer the project the possibility to consider alternative (or parallel) approaches for introducing the ILDE into a new context (besides the METIS workshop). The three enactors are all volunteers at Agora, teaching in digital literacy classes. In the following, the main results coming from the data collected through their interviews are synthesized.

The two workshop participants are both very positive about the enactment experience, as they both recognize that the enactment phase was a crucial step in consolidating what they had learnt at the workshop. In particular one of them states:

"Although being involved on the workshop organised by Àgora on November, I had many doubts about the ILDE (what were the main steps - authoring, conceptualising, implementation -, what to do in each step, which programs were associated to, etc.). The enactment supported me to clarify some of these doubts. After finishing the workshop, I thought it was really very difficult to put the ILDE into practice, but being involved in the enactment showed me that it is not as difficult as it seems."

During the enactment phase they declare they put into practice what they learnt at the workshop and used the ILDE; in their view this has improved the quality of their courses a lot, has sped up their activities and is impacting the practices of Agora. The classes involved in the enactment were composed of 20 and 10-12 students respectively.

When asked about the student learning outcomes, all three trainers are very positive. One of them states:

"The organization of classes using methodologies developed using the ILDE tools makes students more motivated and involved on the classes."

Besides, when asked to rate whether and to what extent they think using the ILDE benefited their students, they were all very positive, and they think the way they designed their activities improved their students' achievements a lot.

When asked to evaluate the ILDE, especially with respect to the first impressions gained at the workshop, they underlined the role of the follow-up phase as a key moment for better understanding the ILDE and its functionalities.
Their impressions about the ease of use and usefulness of the Design Narrative remain very good, but now they are more aware of the potential of the tool. In particular, one of them states:

"The perceptions are the same. What it has changed is that during the enactment I understood better why they are useful for and when they have to be implemented."

About the Authoring (WebCollage), instead, they still have some difficulties. In particular one of them states:

"Webcollage is still difficult to use. There are many menus to complete which are placed in different but very-concrete position. The zoom helps to enlarge the area where you have to click on, but sometimes you do not know exactly where you are on the global view."

As already reported in D4.1 (Rudman & Conole 2014), unfortunately during the enactment at Agora it was not possible to use Glue!-PS to transfer the WebCollage details into Moodle, due to a technical problem (Agora had upgraded to a version of Moodle that Glue!-PS didn't support at that time). Therefore, the Moodle content was generated manually. Thus the Implementation functions were only partially explored by the trainers, who were not able to provide an evaluation of these aspects.

Apparently, during the enactment the other ILDE functions (browsing and sharing) were only partially used by the enactors, and in any case it seems their original opinions have not changed substantially.

Coming to the sustainability issues, it seems that all three trainers think that both the ILDE and the workshop package fit the needs of their institution and could be adopted on a larger scale, even in other, similar institutions of the same sector (adult education). Nonetheless, one of them points out the following:

"Related to the third question of this last indicator (sustainability), I think that it is important to take care of the monitoring of people who decide to introduce/use ILDE in their classes. It is not a easy-to-use tool, so explanations and support has to be conceived to help teachers/volunteers in their tasks. It is necessary a extensive monitoring of people, to give as information as necessary and to explain it easily (to adapt to teachers/volunteers background!)."

Overall, it seems that in this context the ILDE and the proposed methodologies are appreciated, and they seem to fit not only the trainers' individual needs, but also institutional ones. One of the respondents underlines the importance of adequately and continuously supporting people when introducing such an innovation into a new context.

5. Discussion and lessons learnt

As we have seen, the first round of workshops provided very useful feedback that should be taken into account as input to the second round. In particular, there are three types of
feedback: concerning the evaluation means, concerning the workshop format, and concerning the ILDE. In the following we address first of all the evaluation means and then, in separate sections, we reflect on the workshop and the ILDE.

Both the theoretical models chosen as a basis for the METIS evaluation can be considered fitting. In particular, the adoption of the Guskey model turned out to fit the METIS evaluation needs: besides evaluating the workshop itself, it allows the evaluation of the enactment as well, all within the same theoretical framework. As a consequence, we can rely on an overall, clear picture of the whole process and of its impact. Nonetheless, we have to acknowledge that the evaluation of Level 5 of the Guskey model, devoted to assessing the impact of the innovation on students, was not completely addressed in this phase, due to various constraints in the different contexts (already discussed in the section devoted to enactments), so for the next round we must ensure this Level is properly covered with sufficient data. Besides, Level 4 and Level 5 would also allow an evaluation of the impact in the medium-long term, so the project is considering going back to the enactors some months after the enactment has finished, to assess whether and to what extent their practices have actually changed.

Coming to the TAM model used to evaluate the ILDE, the adoption of this model and its adaptations turned out to be very effective in the context of METIS and in line with the evaluation needs of the project.

As far as the questionnaire is concerned, overall its structure seems to fit the needs of the METIS evaluation, but some of the questions may require slight reformulation based on suggestions coming from the respondents, so as to make them clearer [15]. At the same time, given that some of the participants claimed the questionnaire was a bit long, we might consider shortening it for the second release.

It should be noted that in the case of KEK, it was not considered mandatory to have the questionnaire translated into Greek, as participants were deemed familiar with English. Unfortunately, in this case the free questions were often skipped, and this is a problem, because the qualitative information coming from open answers is often very useful and helpful for understanding the quantitative data. So, for the future, we might reconsider this choice: one of the factors that may have hindered people from responding could be that, while they are confident reading English, they may be less fluent in writing it; if this is the case, we must make sure they feel free to answer in their own language.

The interviews turned out to be effective and no particular issue was raised about their structure and formulation. One aspect to underline is that Interview C, administered by OU to the workshop participants who are not enacting, provides very useful information, so for the

[15] The need for reformulation arose due to doubts about the interpretation of the questions on the side of respondents.
next round we should make it possible to collect these data in the other contexts too, and try to understand the reasons why people disregard the follow-up phase.

As far as the data tracked by the system are concerned, some work is still required, mainly because (not surprisingly) the ILDE tracks users' actions at a very low level (single clicks), while, in order to juxtapose these data with the questionnaire data, we need higher-level data. WP2 provided ITD with a set of tracked data, but some additions are still required so as to make the juxtaposition even more effective.

Furthermore, for the next round, we might consider the possibility of analysing in more depth the designs produced by participants (especially those produced during the enactment). This will entail an effort of translation and preliminary analysis by OU, KEK and Agora, so as to allow WP5 people to have these data in a coherent, homogeneous way.

Feedback on the workshops + enactments

After analyzing the data coming from the evaluation context by context, in this section we compare what happened across the three contexts, focusing on individual indicators. As far as the workshop and enactment evaluation is concerned, we should come back to the five levels of Guskey's model.

Regarding the motivation of people to attend the workshop, it seems that they were not forced to attend, but chose it either because of a personal interest in the topic, or because of a need to learn about it, or even because colleagues believe this is an important issue. A MANOVA test conducted to verify whether there are differences in the motivations of the three groups showed no significant differences (F=.261, p=.410).

In the case of the Reactions to the workshop (Level 1 of Guskey's model), it was not possible to apply the MANOVA test (necessary conditions not met), thus single ANOVAs were calculated, which are reported in the table below:

Table 34: Reactions to the workshop

Reactions | Mean | st. dev. | F | p
Suitability with regard to your prior competence | 2.95 | 0.76 | .365 | .699
Relevance of the workshop contents | 3.20 | 0.69 | .867 | .429
Quality of presentations | 3.31 | 0.70 | 3.149 | .055
Quality of the discussion | 3.28 | 0.76 | .502 | .610
Effectiveness of hands-on activities | 3.29 | 0.81 | .227 | .798
Adequacy of time schedule | 2.10 | 1.10 | 6.892 | .003 (*)
Adequacy of rooms and facilities | 3.05 | 0.81 | .415 | .664
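For transparency, the sketch below illustrates how the per-item comparisons in Table 34, and the follow-up tests discussed next (Levene's check for homogeneity of variances, then either a classic one-way ANOVA with Tukey post hoc tests, or a Welch ANOVA with Games-Howell post hoc tests), could be computed. This is an illustration, not the project's actual analysis script: the deliverable does not name its statistics software, and the file and column names here are invented.

```python
# Sketch of the per-item analysis pipeline behind Table 34, using SciPy and
# pingouin (assumed packages; the deliverable does not name its tools).
import pandas as pd
import pingouin as pg
from scipy import stats

# Hypothetical data: one row per respondent, 'context' in {OU, KEK, Agora},
# one 0-4 rating per questionnaire item.
ratings = pd.read_csv("workshop_reactions.csv")
item = "adequacy_of_time_schedule"  # invented column name

groups = [g[item].dropna() for _, g in ratings.groupby("context")]

# 1. Levene's test for homogeneity of variances across the three contexts.
_, levene_p = stats.levene(*groups)

if levene_p >= 0.05:
    # 2a. Homogeneous variances: classic one-way ANOVA + Tukey post hoc.
    omnibus = pg.anova(data=ratings, dv=item, between="context")
    posthoc = pg.pairwise_tukey(data=ratings, dv=item, between="context")
else:
    # 2b. Non-homogeneous variances (as found for "Adequacy of time schedule",
    # p=.024): Welch ANOVA + Games-Howell post hoc.
    omnibus = pg.welch_anova(data=ratings, dv=item, between="context")
    posthoc = pg.pairwise_gameshowell(data=ratings, dv=item, between="context")

print(omnibus, posthoc, sep="\n")
```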
As one can see, all the aspects of the workshop listed in Table 34 were very much appreciated, and no significant differences in the opinions of the three groups were recorded for most of them. The most appreciated phases of the workshop were the hands-on activities, as well as the discussion and the possibility to share points of view with colleagues; the contents and the presentations were also positively rated. In contrast, the Adequacy of the time schedule got lower ratings; after calculating the related ANOVA (see * in the table), it emerged that there was a difference among the groups and, given that the variances turned out not to be homogeneous (p=.024), we applied a Welch test (p=.005) and then Games-Howell post hoc tests, through which we discovered the difference lay between the impressions gathered at KEK, which were particularly negative, and those at Agora, which were far more positive (p=.003). This is not surprising, given that the workshop at Agora was much longer and distributed across three sessions, while the workshop at KEK was much shorter and more intense. This should lead us to reconsider timing and possibly to design longer and/or multi-session workshops for the second round.

The balance of the activities was also a problem in all three contexts, even if this was more an issue for the workshop facilitators (rather than for participants), who had to manage and change plans on-the-fly (see D4.1, Rudman & Conole 2014): in all cases the time devoted to implementation was far less than that devoted to conceptualization, and this resulted in a poorer exploration of the related functions of the ILDE. The time spent familiarizing with the ILDE was also considered too short. This should lead us to reconsider not only the overall timing, but also the duration of the single activities. Suggestions for possible improvements coming from the participants concern the possibility to anticipate some activities and/or to provide background readings before the workshop itself, or even to spread it over another session (where this doesn't already happen). Besides, given that people need time to work on the platform, explore it, and experiment with the various functions, a possible improvement might come from spending more time on the ILDE, encouraging the use of the various functionalities and allowing participants to appreciate the differences among them. It should be noted that at the moment the ILDE integrates only WebCollage as an authoring tool, but the upcoming release of the platform envisages the integration of OpenGLM as well, and possibly that of CADMOS; this will increase the accuracy of the authoring tools evaluation in the next round.

As far as Learning is concerned (Level 2), it seems that all the groups think they have learnt a lot from the workshops (mean=3.12, st. dev.=0.89); besides, they are all fairly positive concerning the intention to put into practice what they have learnt at the workshop in the future (mean=2.97, st. dev.=0.88) and to adopt the ILDE (mean=2.78, st. dev.=0.96). Besides, they all think what they have learnt will improve the quality of their courses (mean=2.97, st. dev.=0.78). The MANOVA test calculated for the Learning dimension (F=2.171, p=.031) revealed a difference in the opinions related to the statement "What I have learnt in this workshop will speed up my learning design activities" (ANOVA test F=5.551, p=.008), between OU and KEK (Tukey post hoc test p=.016) and between OU and
Agora (p=.021), with OU turning out to be consistently more negative about it. Another difference emerged for the statement "I intend to use the ILDE in my future learning design activities" (ANOVA test F=4.018; p=.027), between OU and Agora (Tukey post hoc test p=.037). These results should be regarded as positive and give the project a clear indication that the workshop is very well accepted in all the contexts and seems to fit the needs of the three sectors. This also confirms the choice made under WP3 to have a common design for all the trial partners, with minor customizations for the single contexts. These customizations have so far mostly concerned timing, language of delivery, and the choice of the ILDE functions on which to draw participants' attention (where this is possible, e.g. among the Conceptualization functions); the results of the evaluation confirm that such a level of customization satisfies the needs of the various contexts (provided the necessary adjustments related to timing, already discussed above, are made). The MANOVA test conducted on the ratings given by participants to the designs created at the workshops revealed differences of opinion (F=2.54; p=.020). In particular, OU was stricter about the completeness and the originality of its own designs. Nonetheless, given the exploratory nature of the activities conducted at the workshops, we don't think we should take these ratings as real feedback about the quality of what has been produced. As far as the Organization support & change indicator (Level 3) is concerned, the MANOVA test was not possible, so single ANOVAs were conducted.

Table 35: Organization support and change (mean; st. dev.; F; p)
What I have just learnt about learning design could alter the existing practices in my institution: 2.68; 0.90; F=1.327; p=.280
The adoption of the ILDE as institutional tool to support learning design is likely to be fostered by my institution: 2.41; 1.04; F=6.919; p=.004 (*)
What I have just learnt about learning design could be transferred to my colleagues: 3.02; 0.97; F=.767; p=.472
ILDE could be adopted by my colleagues: 2.61; 1.15; F=11.56; p=.000 (*)
Looking at Table 35, it emerges that all the groups are equally positive about the possibility that what they have learnt about learning design could be transferred to their colleagues and - to a lesser extent - that this may even alter the existing institutional practices. The situation is a bit different regarding the adoption of the ILDE at the institutional level (see * in the table), where there are differences of opinion: given that variances are homogeneous (p=.348), a Tukey post hoc test was applied, which revealed that OU is less positive than Agora about this (p=.002). Similarly, about the possibility that colleagues adopt the ILDE (not at the institutional level, but rather on an individual basis) (*), again there is a difference: given that variances are homogeneous (p=.103), a Tukey post hoc test was applied, which showed that Agora is absolutely positive about it, differently from what people at KEK and OU think (p=.013 and p=.000 respectively). The follow-up phase carried out by a subset of people after the workshops gives further information about the impact of the ILDE and the METIS activities, and in particular provides useful feedback about Level 4 and Level 5 of Guskey's model. In particular, Level 4 focuses on the ability of workshop participants to put into practice what they have learnt and to take up innovation in their daily practices. In METIS this stage applies only to the two trainers at Agora and the primary school teacher at KEK, i.e. those people who enacted after having attended the workshop, because for the non-attendants this stage evidently cannot be evaluated. Despite the paucity of data, the first feedback on this aspect is very positive, because here participants confirm they applied what they learnt at the workshop and think this has improved the quality of their courses; at Agora, especially, they were very enthusiastic about the role of the enactment as a consolidation stage. To be noted, though, is a different opinion regarding the impression that this is impacting on institutional practices: while at Agora they are positive about this, at KEK the teacher hasn't got this impression. Once again, we should consider that while the Agora trainers operate in a small context, the primary school teacher is probably thinking of her school as a whole, so it is reasonable that she is far less confident about this. About Level 5 of Guskey's model, unfortunately we have data coming from Agora only, as in the other contexts there was no way to collect data on this within the available time span. Trainers at Agora have reported very positive impressions, and their evaluation of the student learning outcomes is absolutely enthusiastic. Furthermore, as already mentioned, we should consider that Level 4 and Level 5, as proposed by Guskey, would also allow an evaluation of the medium-long term impact; this would mean coming back to the enactors some time (months) after they have enacted. At the moment, the project is still considering when and how to address this further phase of the evaluation process.
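To make the statistical procedure applied throughout this section easier to replicate in the second evaluation round, the sketch below illustrates the test-selection chain used above: Levene's test for homogeneity of variances, followed either by a classic one-way ANOVA with Tukey post hoc tests (homogeneous variances) or by a Welch test with Games-Howell post hoc tests (heterogeneous variances; Games-Howell is also preferred when group sizes differ widely, as in the three METIS contexts). When the MANOVA conditions are met, the multivariate test across several items is run first, falling back to single ANOVAs otherwise, as done for Tables 34 and 35. This is a minimal illustration in Python (scipy/pingouin) on invented ratings; it is not the software or the data actually used for the analyses reported here.

# A minimal sketch, on invented data, of the test-selection chain described
# above. Group and column names are illustrative, not the METIS dataset.
import pandas as pd
from scipy import stats
import pingouin as pg

# Hypothetical per-participant ratings (0-4) for one questionnaire item.
df = pd.DataFrame({
    "context": ["OU"] * 8 + ["KEK"] * 12 + ["Agora"] * 6,
    "rating": [2, 3, 2, 1, 2, 3, 2, 2,
               1, 2, 1, 1, 2, 0, 1, 2, 1, 1, 2, 1,
               3, 4, 3, 3, 4, 3],
})
groups = [g["rating"].to_numpy() for _, g in df.groupby("context")]

# Step 1: Levene's test for homogeneity of variances.
_, p_levene = stats.levene(*groups)

if p_levene > 0.05:
    # Homogeneous variances: classic one-way ANOVA + Tukey HSD post hoc.
    f_stat, p_anova = stats.f_oneway(*groups)
    posthoc = pg.pairwise_tukey(data=df, dv="rating", between="context")
else:
    # Heterogeneous variances: Welch's ANOVA + Games-Howell post hoc,
    # which is also robust to the unequal sizes of the three groups.
    welch = pg.welch_anova(data=df, dv="rating", between="context")
    posthoc = pg.pairwise_gameshowell(data=df, dv="rating", between="context")

print(posthoc)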
Overall, even if the METIS workshop per se was appreciated in the higher education context, its impact seems to be perceived as limited to changes in individual and team design practices, and adoption of the ILDE won't reach the institutional level; also in the case of school teachers, they think the ILDE will hardly be adopted at the institutional level, mostly due to a lack of infrastructure and digital competences in schools; in adult education, instead, people seem positive about the possibility that both the ILDE and the proposed learning design practices can be transferred to their colleagues and even be adopted at the institutional level. At the same time, though, we should not disregard other variables at play in our experiment, such as the differences among the countries and the size of the institutions involved: in METIS the three contexts are located in three different countries, and some of the results may be affected by cultural, economic or political factors depending on the country. In addition, the OU is a big institution with a long tradition in online education and already has an existing process for the production of e-learning content and activities, using in-house software tools and a team-based approach. For this reason, it is also likely to be more resistant to the adoption of authoring and implementation tools than smaller institutions, like Agora and KEK, which might be quicker in their uptake because they perceive a good opportunity for improvement. Furthermore, as we have seen, the OU is a very peculiar context, probably not really representative of the whole higher education sector, and its complex structure does not necessarily reflect the typical structure of a traditional university; this is another variable that would call for further experiments in METIS, to be carried out in more traditional higher education institutions. As a last point, we should note that Agora is mainly populated by volunteers, i.e. people whose intrinsic motivation might play an important role in their openness and flexibility towards accepting innovation. Again, in this case the presence of such a variable might imply that the trial partner is not completely representative of the sector. Summing up, as a more general consideration, the duplication of trials in institutions other than OU, KEK and Agora would be desirable, as this would guarantee a better representation of each sector; unfortunately this was not envisaged in the METIS DoW, so, even if desirable, the recommendation will be considered by the consortium in the light of the available possibilities and resources. In any case, the project must guarantee higher numbers of participants for the second round of workshops, in such a way as to have a higher number of enactors. This could be achieved through a stronger dissemination effort, but also by making it clearer to potential workshop participants that enactment is part of the METIS offer and should not be considered optional.
Feedback on the ILDE

In this section the analysis of the data regarding the ILDE is provided as a whole and with a view on differences across the three contexts, if any. To give the overall view of the ILDE, we come back to the main indicators of the TAM, namely the ease of use and usefulness of the various functions of the system. As a preliminary consideration regarding the ILDE, we should note that the more flexible the system is and the more it offers different tools for the same function (e.g. the Conceptualization function, which offers a number of tools among which the user can choose 16), the more the data are scattered across the contexts. So, very often, one tool was explored mostly in one context and not in the others. This is usually a consequence of the workshop facilitator's inputs/preferences, which, in turn, may derive from her/his attempt to customize the workshop, judging one tool to be more fitting for the specific target audience. In the following, no further analysis is conducted for the functions/tools that have been explored mostly in one context (the reader will find the data already reported in the previous sections); in contrast, whenever a function/tool has been explored in two or three contexts, we provide further analysis, aimed at checking whether there are differences among the sectors. Looking at the Conceptualization functions offered by the ILDE, one can see that the data are quite scattered: the Course Map and For other conceptualizations 17, even though they obtained 18 and 17 answers respectively, were both explored mostly at KEK. Similarly, the Heuristic Evaluation was explored only at the OU. For an analysis of these functions, see the respective sections. All the other tools were far less explored (fewer than 11 persons declared they had used each of them), probably as the result of the curiosity of individual participants. An additional analysis of the Persona Card, instead, is required, given that this was the only function actually explored in all three contexts (in total, 26 persons declared they had used it). Overall, the Persona Card was moderately appreciated as far as both its ease of use (mean=2.80; st. dev.=0.86) and its usefulness (mean=2.88; st. dev.=0.83) are concerned, with no significant differences among the groups at the MANOVA test (F=2.144; p=.091). As far as the Authoring functions are concerned, given that only WebCollage was available, all the data converge on this tool. WebCollage was moderately appreciated as far as its ease of use is concerned (mean=2.86; st. dev.=0.89), while its usefulness was strongly appreciated (mean=3.33; st. dev.=0.66). There are differences, though, which emerged at the MANOVA test (F=6.195;

16 Course Map, Design Patterns, Design Narrative, Persona Card, Factors and Concerns, Heuristic Evaluation, CompendiumLD (upload), Image (upload), For other conceptualizations.
17 i.e. the customized, paper-based Design Narrative used at KEK.
p=.000) on the way ease of use was judged by OU and KEK (ANOVA on ease of use: F=10.588; p=.000; Games-Howell post hoc test p=.002 18), the OU being more severe than KEK. Unfortunately, in this round of workshops the Implementation functions were not very much explored, so no further analysis is conducted here. It should be noted that the technical issues reported for these functions during the workshops have been forwarded to WP2 and have already been solved (see also D4.1). Coming to the Browsing functions, again, due to the variety of available functions (Browse by tags, Browse by discipline, Free search, Search patterns, etc.), the data are quite scattered, but generally these functions are considered quite positively. People say the most used functions are Free search (23 answers) and Browse by tags (25 answers); all the other functions have fewer than 6 answers each. The ease of use of Free search was positively evaluated (mean=3.03; st. dev.=0.72), with no significant differences at the ANOVA test (F=1.53; p=.242). Similarly, its usefulness was very much appreciated (mean=3.13; st. dev.=0.77), but here there is a difference (F=3.89; p=.038) between KEK and Agora, the former being less positive than the latter (Games-Howell post hoc test p=.000 19). The ease of use of Browse by tags was moderately appreciated (mean=2.86; st. dev.=0.81) and its usefulness was positively judged (mean=3.00; st. dev.=0.85); at the MANOVA test there are no significant differences among the groups (F=2.498; p=.058). In addition, the ILDE sharing functions have obtained good results. Among these functions, very few people declared they Created a LDshakers' group, Shared a LdS with others with edit rights, or Added a comment to a LdS (fewer than 8 people), while the function Edit someone else's LdS was mostly used at KEK. Therefore, for all these functions, the reader will find the data in the previous sections. On the contrary, 22 people (scattered between KEK and Agora) declared they Shared a LdS with others with view rights, and this was very much appreciated by both groups (ease of use: mean=3.45; st. dev.=0.68, t-test F=1.149; p=.298; usefulness: mean=3.55; st. dev.=0.68, t-test F=18.99; p=.000). There are two other functions under this category that have been extensively explored, by three and two groups respectively, and both reported very good results in terms of ease of use and usefulness:

Exchange messages with other LDshakers
  o Ease of use: mean=3.20; st. dev.=0.88 (ANOVA F=1.665; p=.213; no significant differences reported)

18 Games-Howell post hoc tests have been chosen because the number of subjects is very different in the three groups.
19 See note 18.
  o Usefulness: mean=3.33; st. dev.=0.91 (given that equal variances could not be assumed, we applied a Welch test, p=.491; no significant differences reported)

View someone else's LdS
  o Ease of use: mean=3.36; st. dev.=0.68 (t-test between KEK and Agora F=2.294; p=.148; no significant differences reported)
  o Usefulness: mean=3.47; st. dev.=0.69 (t-test between KEK and Agora F=2.294; p=.148; no significant differences reported).

Overall, we can conclude that the sharing functions are certainly among the most appreciated aspects of the ILDE; this confirms the existing need to share and exchange ideas with others and, more generally, people's willingness to be part of a community in which to find support and inspiration. This might lead the consortium to consider proposing other activities to the enactors in an additional follow-up phase, aimed at fostering community nurturing and sustainment; these kinds of activities were not explicitly envisaged in the DoW, but they might be worth a reflection. To conclude the ILDE evaluation, we reiterate some of the qualitative feedback received at the workshops related to possible improvements, especially concerning the interface and a more coherent and immediate use of terminology within the system. The integration of other Authoring tools within the ILDE will increase the possibility of meeting a greater variety of users' needs. As to the Implement functions, we should make sure for the next round that any technical problem related to the integration of GLUE!-PS and the Virtual Learning Environments is removed. Lastly, as already mentioned in the previous section, we should be aware that, despite the variety of tools offered by the ILDE, the platform is more likely to satisfy the needs of small and not very complex institutions (like Agora) than the needs of big and complex institutions (like the OU), whose workflow needs to be reflected in the structure of the adopted tools.

6. Conclusions and future work

In this deliverable we have reported and analyzed the data coming from the evaluation of the first round of workshops carried out within METIS, and of the ILDE. According to the WP5 objectives and tasks (see Section 1 of this document), the results of this evaluation should impact the design and delivery of the second round of workshops and the next release of the ILDE (see Figure 2). To make this operative, the following list contains the main implications that this evaluation should have on the following round:
Implications for the next round of workshops and enactments:

For the next round we should overall involve more people in the workshops (at least 25 participants per workshop). We should make sure a significant number of them (at least 3-5 people per workshop) will be willing to enact and will be able to provide the data about enactment with students. This can be done thanks to a stronger dissemination effort (within and outside the project institutions) and by making it clearer to potential participants that enactment is not optional, but part of the METIS offer.

We should also make sure we recruit the right target for the right context, as we must cover three sectors (vocational training, adult education and higher education). The duplication of trials in institutions other than OU, KEK and Agora, even if not originally foreseen, is also desirable.

About the workshop format:
  o timing should be tuned (longer workshops, probably multi-session)
  o the balance and duration of activities should be tuned, so that the ILDE is better explored and all its functions are sufficiently addressed (special attention should be devoted to implementation)
  o to achieve this, some (introductory) activities could be anticipated, or some preparatory materials or background readings could be provided prior to the workshop event.

Even if overall the workshop was very well accepted, from the experience gained during this round we might consider other formats (for example: multiple shorter sessions + scaffolding in the use of the ILDE, as happened in the case of Agora). This would probably allow a stronger dissemination power within one context.

We should consider proposing to enactors an additional follow-up phase during which they are encouraged to share and exchange ideas with the other ILDE users about the design process as well as the enactment experience, so as to foster community nurturing and development.

Implications for the next release of the ILDE:

We should improve the overall ease of use of the platform (see specific comments in the analysis sections).

We should make the terminology of the platform more coherent (see specific comments in the analysis sections).
We should resolve all the technical problems, especially those related to the implementation and deployment phases 20 (already done).

There is no particular need to customize the ILDE on the basis of the three different contexts; nonetheless, we should be ready for any request of integration of new tools/functions, whenever a new need emerges (as in the case of KEK, who felt the need to adopt another Conceptualization tool, which is now being integrated into the ILDE).

Implications for the next round of evaluation:

The questionnaire has to be refined (some questions need to be rephrased and/or improved) and it should also be translated into Greek.

The interview for non-enactors (type C) should be adopted and delivered in all the contexts.

Both the questionnaire and the interviews will need to be modified according to the changes that will be implemented in both the workshop format and the ILDE prior to the second round.

We should also consider how and when to collect data about the impact of the workshop + enactment in the medium-long term; this would imply coming back to the enactors in some months' time for an additional interview, aimed at understanding whether and to what extent their practices have changed after the METIS experience.

The data coming from the tracking functions of the ILDE should be better aligned with the structure of the questionnaire.

A deeper analysis of the learning designs produced and used during the enactment should be considered as a further input to the evaluation.

To conclude this deliverable, we look at these implications from the perspective of the work packages of the project, so as to derive operative indications about how to carry on the work and what future actions the consortium should undertake. In particular, the results of the evaluation highlight that work package 2 (which is devoted to the ILDE design and development) should concentrate its future effort on improving the overall ease of use of the system, especially in terms of interface and terminology, which should be easier and more consistent. More authoring tools should be integrated, to allow users to choose among different options and approaches, in analogy to what already happens for the conceptualization function. Besides, the implementation function should be improved, so as to avoid any technical problem during the deployment into the Virtual Learning

20 In the ILDE terminology, the term "Implementation" means instantiation, i.e. particularization to a specific context, while "Deployment" refers to the migration to the Virtual Learning Environment.
Environments. Lastly, the tracking functionalities of the ILDE should be better aligned with the other evaluation tools (in cooperation with WP5). As far as work package 3 is concerned (which is devoted to the design of the workshop package), we acknowledge that the evaluation has provided positive results, provided that the timing dimension is tuned and a better balance among activities is planned. As far as work package 4 is concerned (workshop and enactment realization), some actions should be undertaken to make sure we meet the project requirements, namely: a) the overall number of people involved in the workshops should be increased; b) WP4 should make sure the enactment is carried out by more people, in all the institutions and within the timeframe of the project; c) WP4 should make sure that in each trial institution the target population for which the trial partner was initially involved in the project is recruited (OU = higher education; KEK = vocational training; Agora = adult education); d) the project should also consider the possibility to duplicate the workshop + enactment in institutions other than the ones originally envisaged in the DoW, as the trial partners have proved to be only partially representative of their own sectors, so other trials would ensure a better coverage of the various sectors. WP5 needs to revise the evaluation tools to make them clearer, and should also revise them to reflect the new functionalities offered by the second release of the ILDE and the changes to the workshop and enactment format (where applicable). WP6, which is devoted to the project dissemination, should contribute to a wider spread of the calls for workshop participants (both within and outside the institutions involved in the project), so as to guarantee that a higher number of potential users is reached. Furthermore, by disseminating the main project results and events, WP6 should encourage other institutions (other than Agora, KEK and OU) to run the METIS workshop. Besides, it should be up to WP6 to plan further activities/actions for nurturing and sustaining the communities of ILDE users created during the workshops, by encouraging them to share ideas, opinions and experiences about the ILDE usage, as well as the enactment phases. Lastly, WP1, which is devoted to the project management, besides continuously coordinating the partners to help them keep the focus on the project objectives, should also make an effort to consider the new possibilities that emerged from this evaluation but were not initially foreseen in the DoW (for example the possibility to run additional trials of the METIS workshop outside or within the project; the possibility to offer alternative paths other than the METIS workshop to have a stronger impact on institutions, for example shorter training sessions + scaffolding; the possibility to have a follow-up phase for the enactors, etc.). WP1 should produce feasibility plans for these actions, in such a way as to support the consortium in the decision-making process.
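As an illustration of the alignment between the ILDE tracking data and the questionnaire structure advocated above, the following sketch shows one way to aggregate click-level logs into the higher-level, per-user and per-function usage indicators the questionnaire items refer to. It is a minimal sketch under assumed formats: the file names and column names ("userid", "function", "timestamp", "did_you_use_it") are hypothetical placeholders, as the actual log schema is defined by WP2.

# A minimal sketch, under assumed log/questionnaire formats, of aligning
# ILDE click-level tracking data with the questionnaire structure. All
# file and column names here are hypothetical placeholders.
import pandas as pd

# Click-level events exported from the ILDE: one row per user action.
clicks = pd.read_csv("ilde_click_log.csv", parse_dates=["timestamp"])

# Aggregate low-level clicks into the higher-level units the questionnaire
# asks about: whether and how intensively each function was used.
usage = (clicks.groupby(["userid", "function"])
               .agg(n_actions=("timestamp", "size"),
                    first_use=("timestamp", "min"),
                    last_use=("timestamp", "max"))
               .reset_index())

# Questionnaire answers: one row per user and function, with the
# self-reported "Did you use it?" tick (assumed coded as 0/1) and the
# 0-4 ease-of-use / usefulness ratings.
answers = pd.read_csv("questionnaire_ilde_section.csv")

# Juxtapose self-reports with tracked behaviour for each user/function.
merged = answers.merge(usage, on=["userid", "function"], how="left")
merged["n_actions"] = merged["n_actions"].fillna(0).astype(int)

# Flag discrepancies, e.g. functions reported as used but never tracked.
merged["self_report_only"] = (merged["did_you_use_it"].astype(bool)
                              & (merged["n_actions"] == 0))
print(merged[["userid", "function", "did_you_use_it",
              "n_actions", "self_report_only"]].head())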
7. References

Guskey, T. R. (2000). Evaluating professional development. Thousand Oaks, CA: Corwin Press.
Guskey, T. R. (2002). Professional development and teacher change. Teachers and Teaching: Theory and Practice, 8(3), 381-391.
Hernández-Leo, D., Chacón, J. P., Prieto, L. P., Asensio, J. I., & Derntl, M. (2013). METIS Deliverable D2.1: Report 1 on meeting with stakeholders: early feedback on ILDE requirements. http://metis-project.org/resources/deliverables/metis_d2-1.pdf
McAndrew, P., Brasher, A., Prieto, L. P., & Rudman, P. (2013). METIS Deliverable D3.3: Pilot workshops: workshops for different educational levels. http://metis-project.org/resources/deliverables/metis_d3-3.pdf
Pozzi, F., Persico, D., & Sarti, L. (2013). METIS Deliverable D5.1: Assessment Plan. http://metis-project.org/resources/deliverables/metis_d5-1.pdf
Rudman, P., & Conole, G. (2014). METIS Deliverable D4.1: Report on the pilot workshops and LD enactment. http://metis-project.org/resources/deliverables/metis_d4-1.pdf
8. APPENDIX I Questionnaire

End-of-workshop questionnaire for METIS workshop participants (in English)
Cuestionario final para los participantes del seminario METIS (in Spanish)
End-of-workshop questionnaire for METIS workshop participants

Dear METIS workshop participant,
This questionnaire is aimed at collecting information about your workshop experience and about the ILDE tool you have been using during the workshop. We are very grateful for the time and effort you will devote to its compilation, which will hopefully allow us to improve both the workshop format and the ILDE tool. Thank you for your collaboration!
The METIS partners

Please state your position on the following items:
I have had the questionnaire aims explained to me.
I agree to my responses and comments in this survey being used for research purposes only.
I agree to my words being reported in an anonymous form in any publications.
I have been informed that all data will be collected and stored safely, in accordance with the European Directive on Data Protection (Directive 95/46/EC).

Structure of the questionnaire
Filling in the questionnaire should take about 15-20 minutes. The questionnaire consists of the following sections:
1. Data about you (that will only be used for the purposes of the evaluation of the METIS workshop and ILDE)
2. Your opinions about the workshop
3. Your opinions about the ILDE
4. Final free comments.
Feel free to answer only the questions that make sense to you and that you are able to answer.
0: Workshop identification *
Workshop date:
Workshop place:

1: Your data
Name:
Surname:
Age:
Qualifications:
Institution:
Position:
Name of the group during the workshop *:
Your personal ILDE userid *:
The ILDE userid adopted by your group:
* identifies a mandatory field.

1.1: What is your foremost background?
Mostly educational science
Mostly computer science
Educational technology
Other (specify)

1.2: What is your work sector?
Academic: research, university teaching
School teaching
Industry
Policy making
Professional development
Other (specify)

1.3: How much do you know about learning design?
I am a beginner in this field
I have been working in the field for a few years
I consider myself an expert in the field
Other (specify)

1.4: Before this workshop, had you ever used any of the following learning design tools?
Course Map
WEBCollage
OpenGLM
CADMOS
Glue!-PS
LdShake
Other (specify)
1.5: Did you attend the METIS workshop from the beginning to the end?
Yes
No
If you answered no: what part/percentage of the workshop did you miss? What was the reason?

2: Your opinions about the workshop

2.1: Motivation
Please rate the following reasons for enrolling in this workshop (0=inessential - 4=very important):
I am interested in learning design
I need to learn about learning design
People I work with believe learning design is important
I was forced to enrol in this workshop
Other (specify)
2.2: Reactions
Please rate the following aspects concerning the workshop (0=very low - 4=very high):
Suitability with regard to your prior competence
Relevance of the workshop contents
Quality of presentations
Quality of the discussion
Effectiveness of hands-on activities
Adequacy of time schedule
Adequacy of rooms and facilities
For ratings below 2, please specify what could be improved:
2.3: Learning
Please rate your agreement with the following statements (0=very low - 4=very high):
I have learnt a lot from this workshop
I look forward to putting into practice what I have just learnt about learning design
I intend to use the ILDE in my future learning design activities
Applying what I have learnt during this workshop will improve the quality of my courses
What I have learnt in this workshop will speed up my learning design activities

How many learning designs did you manage to create (either individually or collectively)?
Please rate (0=very low - 4=very high) your most complete design in terms of:
Completeness
Originality
Complexity
Reusability

How many learning designs did you manage to implement?
Were there unexpected learning outcomes of the workshop?
2.4: Organization support & change
Please rate your agreement with the following statements (0=very low - 4=very high):
What I have just learnt about learning design could alter the existing practices in my institution
The adoption of the ILDE as institutional tool to support learning design is likely to be fostered by my institution
What I have just learnt about learning design could be transferred to my colleagues
ILDE could be adopted by my colleagues
For ratings below 2, please describe:

2.5: Free comments about the workshop
Please list here three words you think describe the workshop accurately:
Which activity did you like best?
Beyond the activities, what did you like most about this workshop?
What do you think should be improved about this workshop?
3: ILDE evaluation
In the following section, most of the questions are three-fold, i.e., for each aspect you are asked to evaluate:
Whether you were able to perform a certain action
To what extent performing such action was easy
To what extent performing such action turned out to be useful.

3.1: Functions to create new LdS
Please give your opinion about the ease of use and usefulness of the ILDE with respect to the following users' tasks (tick the box if you did it; rate ease and usefulness from 0=min to 4=max):
understanding that the ILDE provides access to a variety of tools
understanding the differences between these tools
finding these tools in the ILDE
accessing/entering these tools from the ILDE interface
You may want to explain the reasons for the above ratings:

3.1.1: Conceptualize functions
Please give your opinion about the Conceptualize ILDE functions that you have been using in the workshop (tick the box if you used it; rate ease of use and usefulness from 0=min to 4=max):
Course Map
Design Pattern
Design Narrative
Persona Card
Factors and Concerns
Heuristic Evaluation
CompendiumLD (upload)
Image (upload)
For other conceptualizations
You may want to explain the reasons for the above ratings:

3.1.2: Author functions
Please give your opinion about the Author ILDE functions that you have been using in the workshop (tick the box if you used it; rate ease of use and usefulness from 0=min to 4=max):
WebCollage
OpenGLM
CADMOS
You may want to explain the reasons for the above ratings:
3.1.3: Implement functions
Please give your opinion about the Implement ILDE functions that you have been using in the workshop (tick the box if you used it; rate ease of use and usefulness from 0=min to 4=max):
Implement in a VLE through Glue!-PS
See your VLE
Register your VLE
You may want to explain the reasons for the above ratings:
3.2: Browsing functions
Please give your opinion about the ILDE browsing functions that you have been using (tick the box if you used it; rate ease of use and usefulness from 0=min to 4=max):
Free search
Browse by tools
Search patterns
Browse by tags
Browse by discipline
Browse by pedagogical approach
You may want to explain the reasons for the above ratings:
3.3: Sharing functions
Please give your opinion about the ILDE sharing functions that you have been using (tick the box if you used it; rate ease of use and usefulness from 0=min to 4=max):
Create a LDshakers' group
Share a LdS with others with view rights
Share a LdS with others with edit rights
Add a comment to a LdS
Exchange messages with other LDshakers
View someone else's LdS
Edit someone else's LdS
You may want to explain the reasons for the above ratings:
4: Free comments about the ILDE
Please list here three words you think describe the ILDE accurately:
What do you like most about the ILDE?
What do you think should be improved about the ILDE?
Cuestionario final para los participantes del seminario METIS

Participante del seminario METIS,
Este cuestionario tiene como finalidad recoger tu valoración sobre tu participación en este seminario así como de la herramienta ILDE que hemos utilizado a lo largo del mismo. Agradecemos tu esfuerzo y tiempo en completar el cuestionario, que nos será de ayuda para mejorar tanto el formato del seminario como la herramienta ILDE. ¡Muchas gracias por tu colaboración!
Entidades socias del METIS

Por favor, indícanos tu opinión en los siguientes temas:
Se me han explicado los objetivos del cuestionario.
Estoy de acuerdo en que mis respuestas y comentarios a este cuestionario sean utilizados con el propósito de esta investigación únicamente.
Estoy de acuerdo con que mis opiniones puedan ser utilizadas de forma anónima en publicaciones.
Estoy informado/a de que mis datos serán archivados y almacenados de forma segura, siguiendo las directivas europeas de protección de datos (Directiva 95/46/EC).

Estructura del cuestionario
Completar el cuestionario te llevará 15-20 minutos. El cuestionario tiene las siguientes secciones:
1. Información sobre ti (que únicamente será utilizada para la evaluación del seminario del METIS y del ILDE)
2. Tu opinión sobre el seminario
3. Tu opinión sobre el ILDE
4. Comentarios finales.
Siéntete libre de responder sólo a aquellas preguntas que tengan sentido para ti y que puedas responder.

1. Tus datos:
Nombre:
Apellidos:
Edad:
Nivel de estudios:
Institución:
Cargo:
Nombre del grupo durante el seminario *:
Tu usuario personal ILDE *:
El usuario ILDE utilizado en tu grupo:
* campo obligatorio.
1.1. ¿Cuál es tu bagaje educativo más importante?
Principalmente educativo
Principalmente técnico/tecnológico
Tecnológico-educativo
Otro (especificar):

1.2. ¿Cuál es tu ámbito de trabajo?
Académico: investigación, docencia
Educación no universitaria
Industrial
Político
Desarrollo profesional
Otro (especificar):

1.3. ¿Cuánto sabes de diseño de aprendizaje (learning design)?
Soy principiante
He trabajado con él por un tiempo
Me considero un experto
Otro (especificar):

1.4. Antes de este seminario, ¿habías utilizado alguna de las siguientes herramientas de diseño de aprendizaje (learning design)?
Course Map
WEBCollage
OpenGLM
CADMOS
Glue!-PS
LdShake
Otro (especificar):

1.5. ¿Has participado en el seminario METIS desde su inicio hasta su finalización?
Sí
No
Si has respondido que no, ¿qué parte/porcentaje del seminario te has perdido? ¿Por qué razón?
2. Tu opinión sobre el seminario

2.1. Motivación
Por favor, puntúa las siguientes razones para involucrarte en este seminario (0=nada importante, 4=muy importante):
Estoy interesado/a en el diseño de aprendizaje
Necesito aprender sobre el diseño de aprendizaje
Las personas con las que trabajo piensan que el diseño de aprendizaje es importante
Me obligaron a participar en este seminario
Otro (especificar):
2.2. Reacciones
Por favor, puntúa los siguientes aspectos relacionados con el seminario (0=muy bajo, 4=muy alto):
Adecuación a tus competencias
Relevancia de los contenidos del seminario
Calidad de las presentaciones
Calidad del debate
Efectividad de las actividades prácticas
Adecuación de los tiempos
Adecuación de la sala y de los materiales
En puntuaciones inferiores a 2, por favor especifica qué podría mejorarse:
2.3. Aprendizaje
Por favor, puntúa la adecuación de las siguientes afirmaciones (0=en desacuerdo, 4=totalmente de acuerdo):
He aprendido mucho de este seminario
Estoy deseando poner en práctica lo que he aprendido de diseño de aprendizaje
Intentaré utilizar el ILDE en mis próximas actividades basadas en diseño de aprendizaje
Aplicar lo que he aprendido a lo largo del seminario mejorará la calidad de mis cursos
Lo que he aprendido en este seminario acelerará mis actividades basadas en diseño de aprendizaje

¿Cuántos diseños de aprendizaje creaste (tanto individual como colectivamente)?
Por favor, puntúa (0=poco, 4=mucho) tu diseño de aprendizaje más completo en términos de:
Finalización
Originalidad
Complejidad
Reutilización
¿Cuántos diseños de aprendizaje implementaste?
¿Ha habido resultados de aprendizaje no esperados en el seminario?

2.4. Apoyo y cambio dentro de la organización
Por favor, puntúa la adecuación de las siguientes afirmaciones (0=en desacuerdo, 4=totalmente de acuerdo):
Lo que he aprendido de diseño de aprendizaje puede cambiar las prácticas existentes en mi institución
La adopción del ILDE como herramienta institucional que apoye el diseño de aprendizaje tiene que ser promocionada desde mi institución
Lo que he aprendido de diseño de aprendizaje puede ser transferido al resto de mis compañeros/as
ILDE puede ser adoptado por mis compañeros/as
En puntuaciones inferiores a 2, por favor describe:
2.5. Comentarios abiertos sobre el seminario
Por favor, indica tres palabras que pienses que describen este seminario con precisión:
¿Cuál es la actividad que más te ha gustado?
Más allá de las actividades, ¿qué es lo que más te ha gustado del seminario?
¿Qué piensas que puede ser mejorado en el seminario?

3. Evaluación del ILDE
En la siguiente sección, la mayoría de preguntas piden evaluar varios ítems; para cada aspecto se te pide que evalúes:
Si fuiste capaz de llevar a cabo una determinada acción
Hasta qué punto llevar a cabo esa acción fue fácil
Hasta qué punto llevar a cabo esa acción fue útil.
3.1. Funciones para crear nuevos LdS
Por favor, danos tu opinión sobre lo fácil y útil del ILDE con relación a las siguientes tareas (marca la casilla si lo lograste; puntúa la facilidad y la utilidad de 0=mínima a 4=máxima):
Entender que el ILDE facilita el acceso a una variedad de herramientas
Entender las diferencias entre las diferentes herramientas
Encontrar las herramientas en el ILDE
Acceder a las herramientas desde la interfaz del ILDE
Quizás quieras explicar las razones de tus puntuaciones anteriores:

3.1.1. Funciones de conceptualización
Por favor, danos tu opinión sobre las funciones de conceptualización del ILDE que has utilizado en el seminario (marca la casilla si la usaste; puntúa la facilidad y la utilidad de 0=mínima a 4=máxima):
Course Map
Patrón de Diseño
Narrativa de Diseño
Plantilla de Persona
Factores y Preocupaciones
Evaluación Heurística
CompendiumLD (subir fichero)
Image (subir imagen)
Para otras conceptualizaciones
Quizás quieras explicar las razones de tus puntuaciones anteriores:

3.1.2. Funciones de autoría
Por favor, danos tu opinión sobre las funciones de autoría del ILDE que has utilizado en el seminario (marca la casilla si la usaste; puntúa la facilidad y la utilidad de 0=mínima a 4=máxima):
WebCollage
OpenGLM
CADMOS
Quizás quieras explicar las razones de tus puntuaciones anteriores:

3.1.3. Funciones de implementación
Por favor, danos tu opinión sobre las funciones de implementación del ILDE que has utilizado en el seminario (marca la casilla si la usaste; puntúa la facilidad y la utilidad de 0=mínima a 4=máxima):
Implementar en un VLE a través de Glue!-PS
Ver tu VLE
Registrar tu VLE
Quizás quieras explicar las razones de tus puntuaciones anteriores:
3.2. Funciones de navegación
Por favor, danos tu opinión sobre las funciones de navegación del ILDE que has utilizado en el seminario (marca la casilla si la usaste; puntúa la facilidad y la utilidad de 0=mínima a 4=máxima):
Búsqueda libre
Buscar por herramientas
Buscar por patrón
Buscar por etiquetas
Buscar por disciplina
Buscar por aproximación pedagógica
Quizás quieras explicar las razones de tus puntuaciones anteriores:
3.3. Funciones de compartición
Por favor, danos tu opinión sobre las funciones de compartición (LdShakers) del ILDE que has utilizado en el seminario (marca la casilla si la usaste; puntúa la facilidad y la utilidad de 0=mínima a 4=máxima):
Crear un grupo LdShaker
Compartir un LdS con otras personas con permisos de visionado
Compartir un LdS con otras personas con permisos de edición
Añadir un comentario a un LdS
Intercambiar mensajes con otros LdShakers
Ver el LdS de otra persona
Editar el LdS de otra persona
Quizás quieras explicar las razones de tus puntuaciones anteriores:

4. Más comentarios sobre el ILDE
Por favor, indica tres palabras que pienses que describen el ILDE con precisión:
¿Qué es lo que más te ha gustado del ILDE?
¿Qué es lo que se puede mejorar del ILDE?
9. APPENDIX II Follow-up interviews

Interview A for workshop participants

This interview structure is meant to gather information related to the enactment phase from those people who were former participants in the METIS workshop.

Structure of the interview
The interview consists of the following sections:
1. Data about the interviewee
2. Opinions about the enactment
3. Opinions about the ILDE
4. Free comments.

Notes for the interviewer: The interviewer asks the questions listed below. The interview is recorded, in such a way that in a following phase the interviewer is able to listen to the recordings, provide a synthesis of the interviewee's answers and write the synthesis in this template (in English, so the interviewer will need to translate it in case the interview occurs in any other language).

Consent form
At the beginning of the interview, the interviewer will ask the interviewee if s/he agrees with the following statements and complete the form accordingly:
Please state your position on the following items:
I have had the interview aims explained to me.
I agree to my responses and comments in this survey being used for research purposes only.
I agree to my words being reported in an anonymous form in any publications.
I have been informed that all data will be collected and stored safely, in accordance with the European Directive on Data Protection (Directive 95/46/EC).

1: your data
Name:
Surname:
Age:
Qualifications:
Institution:
Position:
Name of the group during the workshop *:
Your personal ILDE userid *:
The ILDE userid adopted by your group:
* identifies a mandatory field.

2: your opinions about the ENACTMENT

2.1: USE OF KNOWLEDGE AND SKILLS DURING THE ENACTMENT
How many new learning designs did you create in the ILDE during the enactment phase (if any)? Please list them below:
How many learning designs did you complete/refine in the ILDE during the enactment phase (starting from the ones you created during the workshop)? Please list them below:
How many learning designs did you manage to implement in a VLE during the enactment phase (including the creation of tools, resources, etc.)? Please list them below:
How many learning designs did you manage to deliver to your students during the enactment phase? Please list them below:
How many learning designs did you manage to evaluate during the enactment phase? Please list them below:
Please provide a short report of your experience:

Now, rate your agreement with the following statements (0=very low - 4=very high):
I have put into practice what I have learnt during the workshop about learning design
I have used the ILDE in my learning design activities
Applying what I have learnt during the workshop has improved the quality of my courses
What I have learnt in the workshop has speeded up my learning design activities
What I have learnt in the workshop is having a positive impact on the practices in my institution

2.2 STUDENT LEARNING OUTCOMES
Here you should focus on the delivery of your designs to your students.
How many students did the delivery involve?
Please describe the experience, possibly taking into account also the students' point of view:
In case you have collected any evidence of the impact of the LD on your students, please briefly report it here (please attach any artefact/product they may have produced testifying to their learning outcomes):

Now, rate your agreement with the following statements (0=very low - 4=very high):
I think the way I had designed my activities improved my students' achievements
I think the design(s) I produced through the ILDE benefited my students
I have collected evidence showing that the design(s) I produced through the ILDE benefited my students

3: ILDE Evaluation
At the end of the workshop, you were asked to provide ratings concerning your perceptions of the ease of use and usefulness of the various ILDE functions. Here we would like you to reconsider your previous answers and say whether your perceptions have changed.
Notes for the interviewer: Here the interviewer will help the interviewee by reminding her/him of the previous answers. Data will be provided by ITD prior to the interview.

3.1: FUNCTIONS TO CREATE NEW LDS
In case your perceptions have changed, please explain why:

3.1.1: CONCEPTUALIZE FUNCTIONS
In case your perceptions have changed, please explain why:
3.1.2: AUTHOR FUNCTIONS
In case your perceptions have changed, please explain why:

3.1.3: IMPLEMENT FUNCTIONS
In case your perceptions have changed, please explain why:

3.2: BROWSING FUNCTIONS
In case your perceptions have changed, please explain why:

3.3: SHARING FUNCTIONS
In case your perceptions have changed, please explain why:

4: SUSTAINABILITY
Please rate your agreement with the following statements (0=very low - 4=very high):
I think what I have learnt in this experience can be applied to/used in my work in the future
I think the ILDE could be adopted on a larger scale at my institution in the future
I think the ILDE fits the needs of my work sector 21 at large
I think the METIS workshop package could be proposed on a larger scale at my institution in the future

21 Here we refer to the three main sectors addressed by the project, namely: adult education, vocational training, and university education.
I think the METIS workshop package fits the needs of my work sector 21 at large

5: FREE COMMENT
Feel free to add any other comment concerning the workshop, the ILDE or the tools that have been used to collect your opinions (questionnaire and interview):
Interview B for enacting people who did not attend the workshop

This interview structure is meant to gather information related to the enactment phase from those people who did not participate in the METIS workshop. The interview takes for granted that the person involved in the enactment had received some kind of training on the ILDE (as an alternative to participation in the workshop).

Structure of the interview
The interview consists of the following sections:
1. Data about the interviewee
2. Opinions about the enactment
3. Opinions about the ILDE
4. Free comments.

Notes for the interviewer: The interviewer asks the questions listed below. The interview is recorded, in such a way that in a following phase the interviewer is able to listen to the recordings, provide a synthesis of the interviewee's answers and write the synthesis in this template (in English, so the interviewer will need to translate it in case the interview occurs in any other language).

Consent form
At the beginning of the interview, the interviewer will ask the interviewee if s/he agrees with the following statements and complete the form accordingly:
Please state your position on the following items:
I have had the interview aims explained to me.
I agree to my responses and comments in this survey being used for research purposes only.
I agree to my words being reported in an anonymous form in any publications.
I have been informed that all data will be collected and stored safely, in accordance with the European Directive on Data Protection (Directive 95/46/EC).

1: your data
Name:
Surname:
Age:
Qualifications:
Institution:
Position:
Your personal ILDE userid *:

Your foremost background:
Mostly educational science
Mostly computer science
Educational technology
Other (specify)

Your work sector:
Academic: research, university teaching
School teaching
Industry
Policy making
Professional development
Other (specify)

How much do you know about learning design?
I am a beginner in this field
I have been working in the field for a few years
I consider myself an expert in the field
Other (specify)
Before this experience, had you ever used any of the following learning design tools?
Course Map
WEBCollage
OpenGLM
CADMOS
Glue!-PS
LdShake
Other (specify)

How and why did you decide to be involved in this experience?

2: your opinions about the ENACTMENT

2.1: MOTIVATION
Please rate the following reasons for taking part in this experience (0=inessential - 4=very important):
I am interested in learning design
I need to learn about learning design
People I work with believe learning design is important
I was forced to participate in this experience

2.2: LEARNING
Please rate your agreement with the following statements (0=very low - 4=very high):
I have learnt a lot from this experience
I look forward to putting into practice what I have just learnt about learning design
I intend to use the ILDE in my future learning design activities
Applying what I have learnt during this experience will improve the quality of my courses
What I have learnt in this experience will speed up my learning design activities

2.3: USE OF KNOWLEDGE AND SKILLS DURING THE ENACTMENT
How many new learning designs did you create in the ILDE during the enactment phase (if any)? Please list them below:
How many learning designs did you manage to implement in a VLE during the enactment phase (including the creation of tools, resources, etc.)? Please list them below:
How many learning designs did you manage to deliver to your students during the enactment phase? Please list them below:
How many learning designs did you manage to evaluate during the enactment phase? Please list them below:
Please provide a short report of your experience:

2.4 STUDENT LEARNING OUTCOMES
Here you should focus on the delivery of your designs to your students.
How many students did the delivery involve?
Please describe the experience, possibly taking into account also the students' point of view:
In case you have collected any evidence of the impact of the LD on your students, please briefly report it here (please attach any artefact/product they may have produced testifying to their learning outcomes):

Now, rate your agreement with the following statements (0=very low - 4=very high):
I think the way I had designed my activities improved my students' achievements
I think the design(s) I produced through the ILDE benefited my students
I have collected evidence showing that the design(s) I produced through the ILDE benefited my students

3: ILDE Evaluation
In the following section, most of the questions are three-fold, i.e., for each aspect you are asked to evaluate:
Whether you were able to perform a certain action
To what extent performing such action was easy
To what extent performing such action turned out to be useful.

3.1: FUNCTIONS TO CREATE NEW LDS
Please give your opinion about the ease of use and usefulness of the ILDE with respect to the following users' tasks (Y/N; rate ease and usefulness from 0=min to 4=max):
understanding that the ILDE provides access to a variety of tools
understanding the differences between these tools
finding these tools in the ILDE
accessing/entering these tools from the ILDE interface
Please explain the reasons for the above ratings:
3.1.1: CONCEPTUALIZE FUNCTIONS
Please give your opinion about the Conceptualize ILDE functions that you have been using in the workshop. For each function, indicate whether you used it (Y/N) and rate its ease (0=min - 4=max) and usefulness (0=min - 4=max):
- CourseMap
- Design Patterns
- Design Narrative
- Persona Card
- Factors and Concerns
- Heuristic Evaluation
- CompendiumLD (upload)
- Image (upload)
- For other conceptualizations
Please explain the reasons for the above ratings:

3.1.2: AUTHOR FUNCTIONS
Please give your opinion about the Author ILDE functions that you have been using in the workshop. For each function, indicate whether you used it (Y/N) and rate its ease (0=min - 4=max) and usefulness (0=min - 4=max):
- WebCollage
- OpenGLM
- CADMOS
Please explain the reasons for the above ratings:

3.1.3: IMPLEMENT FUNCTIONS
Please give your opinion about the Implement ILDE functions that you have been using in the workshop. For each function, indicate whether you used it (Y/N) and rate its ease (0=min - 4=max) and usefulness (0=min - 4=max):
- Implement in a VLE through Glue!-PS
- See your VLE
- Register your VLE
Please explain the reasons for the above ratings:

3.2: BROWSING FUNCTIONS
Please give your opinion about the ILDE browsing functions that you have been using. For each function, indicate whether you used it (Y/N) and rate its ease (0=min - 4=max) and usefulness (0=min - 4=max):
- Free search
- Browse by tools
- Search patterns
- Browse by tags
- Browse by discipline
- Browse by pedagogical approach
Please explain the reasons for the above ratings:

3.3: SHARING FUNCTIONS
Please give your opinion about the ILDE sharing functions that you have been using. For each function, indicate whether you used it (Y/N) and rate its ease (0=min - 4=max) and usefulness (0=min - 4=max):
- Create a LDshakers' group
- Share a LdS with others with view rights
- Share a LdS with others with edit rights
- Add a comment to a LdS
- Exchange messages with other LDshakers
- View someone else's LdS
- Edit someone else's LdS
Please explain the reasons for the above ratings:

4: SUSTAINABILITY
Please rate your agreement with the following statements (0=very low - 4=very high):
- I think what I have learnt in this experience can be applied to/used in my work in the future 0 1 2 3 4
- I think the ILDE could be adopted on a larger scale at my institution in the future 0 1 2 3 4
- I think the ILDE fits the needs of my work sector* at large 0 1 2 3 4
- I think a workshop introducing learning design issues could be proposed on a larger scale at my institution 0 1 2 3 4
- I think a workshop introducing learning design issues would fit the needs of my work sector* at large 0 1 2 3 4

* Here we refer to the three main sectors addressed by the project, namely adult education, vocational training, and university education.

5: FREE COMMENT
Please list here three words you think describe the ILDE accurately:
What do you like most about the ILDE?
What do you think should be improved about the ILDE?
Feel free to add any other comments concerning the experience, the ILDE, or the tools that have been used to collect your opinions (questionnaire and interview):
Interview C - for non-enactors who attended the workshop at the OU

What's happened since the 'How to design collaborative learning activities' workshop?

On 24th October 2013 you attended a workshop on How to design collaborative learning activities run by the METIS project (http://ilde.upf.edu/ou/v/b37), during which you were introduced to the METIS Integrated Learning Design Environment. We would like to find out whether the workshop has had an impact on you and, if so, what kind of impact. We hope you will answer the following few questions to help us find out. (You will be asked up to 7 questions, and short answers are all that is required. The questionnaire should take up to 15 minutes to complete.) Thank you!

*Required

Has the workshop led to you changing your practices? * For example, it might have changed the way you approach designing learning activities, or the tools you use to design learning activities.
Yes/No (if yes, go to 1; if no, go to 2)

1. Effects of the workshop
In what ways have you changed your practices? E.g. it might have changed your approach, or the tools you use.
Do you think that attending the workshop improved your competence as a learning designer? Yes/No

2. No changes to practices
If you have not changed your practices in response to the workshop, please could you explain why?

Plans for the future, other comments, etc.
Do you have any plans to make use of what you learnt at the workshop in the future?
Do you have any other comments?