D5.2 Lean engineering collaboration system diagnostics: optimization handbook & workbench (Task 5.2 and Task 5.3)


Linked Knowledge in Manufacturing, Engineering and Design for Next-Generation Production
Acronym: LinkedDesign
Project No:
Large-scale Integrating Project FoF-ICT
Duration:

D5.2 Lean engineering collaboration system diagnostics: optimization handbook & workbench (Task 5.2 and Task 5.3)

Abstract: A lean engineering approach identifies ways of reducing waste in collaboration processes. An evaluation of knowledge sources is given, together with ways of supporting knowledge creation in a Virtual Obeya. The report presents functional requirements for collaborative planning, order management and change management based on the literature. A study of selected ERP and PLM systems describes the current state-of-the-art and gaps for future developments. A collaborative planning workbench is presented and visualized.

Type: Deliverable
Document ID: D5.2
Work package: WP5
Leading partner: NTNU
Author(s): Ottar Bakås (SINTEF), John Krogstie (NTNU), Børge Sjøbakk (SINTEF), Sobah Abbas Petersen (SINTEF), Fredrik Stokke (SINTEF), Pavan Sriram (NTNU), Kjetil Kristensen (NTNU), Geir Iversen (Aker), Joerg Cloebes (VW), Jon Atle Gulla (NTNU), Stefano Racca (Comau), Mozhgan Tavakolifard (NTNU)
Dissemination level: PU
Status: Final
Date: 28 February 2013
Version: 2.0
Copyright LinkedDesign Consortium

Versioning and contribution history

Version | Description | Contributors
0.1 | Draft | Fredrik Stokke
0.2 | Key words of contents | Ottar Bakås
0.3 | Updated interpretation of scope | Ottar Bakås
0.4 | Restructured documents | Børge Sjøbakk
0.5 | Input on specific chapters | Ottar Bakås
0.6 | Updated chapter 3 structure, details in chapter 4 after task meeting | Ottar Bakås
0.7 | Changes to use case information | Ottar Bakås et al.
0.8 | Update on literature review | Ottar Bakås et al.
0.9 | Updated chapter 3 | John Krogstie et al.
1.0 | First full version with full document structure | Ottar Bakås et al.
1.1 | Minor adjustments | Ottar Bakås
1.2 | Added research approach and scenario description and analysis | Sobah Abbas Petersen
1.3 | Updated chapters 3 and 4 | John Krogstie, Ottar Bakås et al.
1.4 | Updated interview data, mock-ups, appendices | Børge Sjøbakk, Sobah Petersen, Kjetil Kristensen, John Krogstie et al.
1.5 | Updated literature review | Ottar Bakås, Pavan Sriram et al.
1.6 | Updated study of ERP and PLM systems | Ottar Bakås, Pavan Sriram et al.
1.7 | Version sent to internal review | Ottar Bakås
1.8 | Review comments from UBITECH | Dimitrios Alexandrou
1.9 | Adapted parts based on review comments; added table with full overview of all requirements and executive summary | Ottar Bakås, John Krogstie et al.
2.0 | Final version, ready for submission | Ottar Bakås

Reviewers: Dimitrios Alexandrou (UBITECH)

Table of Contents

1 INTRODUCTION AND OVERVIEW
1.1 BACKGROUND AND WP5 OBJECTIVES
1.2 SCOPE OF TASK 5.2
1.3 SCOPE OF TASK 5.3
1.4 STRUCTURE OF THE REPORT
1.5 EXECUTIVE SUMMARY
2 RESEARCH METHODOLOGY
2.1 RESEARCH APPROACH TASK 5.2
2.2 RESEARCH APPROACH FOR TASK 5.3
Information analysis
RESEARCH PHASES AND ACTIVITIES
3 LEAN ENGINEERING COLLABORATION (T5.2)
3.1 LEAN ENGINEERING: REDUCING WASTE THROUGH THE VIRTUAL OBEYA
3.1.1 Workshop results
3.2 EVALUATION OF KNOWLEDGE SOURCES
Quality of information sources
Concrete evaluation of tool and tool-types
Important learning points for LinkedDesign
KNOWLEDGE CREATION IN LEAN ENGINEERING ENVIRONMENTS
Supporting knowledge creation in a Virtual Obeya
Important learning points for LinkedDesign
DIAGNOSTICS FRAMEWORKS AND PROCESS IMPROVEMENTS IN LEAN ENGINEERING ENVIRONMENTS
Collaborative diagnostics: Tools for lean engineering collaboration system analysis and optimisation
Important learning points for LinkedDesign
4 COLLABORATIVE PLANNING THROUGH EFFECTIVE ORDER MANAGEMENT (T5.3)
LITERATURE REVIEW - PROJECT SUPPLY CHAIN COLLABORATION
Collaborative planning
Order management
Engineering change
Summary of literature review
ASSESSMENT OF EXISTING PLM AND ERP SYSTEMS
Teamcenter

4.2.2 Microsoft Dynamics AX
SAP Business Suite
Summary
WORKBENCH: SYSTEM/FUNCTIONALITY ANALYSIS AND DESIGN
User stories and scenarios
Collaborative platform: Motivation and structure
Order acquisition: functionality and mock-ups
Engineering changes: functionality and mock-ups
Errors and exceptions: functionality and mock-ups
Order fulfillment: functionality and mock-ups
SUMMARY OF COLLABORATIVE PLANNING REQUIREMENTS
LEAP AND THE COLLABORATIVE PLANNING WORKBENCH
COLLABORATIVE PLANNING FRAMEWORK
5 CONCLUSIONS AND FURTHER WORK
CONTRIBUTION
LIMITATIONS
FURTHER WORK AND RESEARCH
6 SCIENTIFIC REFERENCES
7 APPENDICES
APPENDIX 1: TOOLS USED IN USE CASES
APPENDIX 2: IDEAS FROM TURIN FRONT-END WORKSHOP AND TAKE-UP
APPENDIX 3: QUALITY OF DATA AND DATA REPRESENTATIONS
Overview of SEQUAL
Data Quality
APPENDIX 4: USING ACTIVE KNOWLEDGE MODELS (AKM) TO STRUCTURE AND CAPTURE USER KNOWLEDGE
APPENDIX 5: INTERVIEW GUIDE FOR ERP AND PLM SYSTEMS
APPENDIX 6: INTERVIEW GUIDE FOR INDUSTRIAL PARTNERS
APPENDIX 7: COLLABORATIVE DIAGNOSTICS TOOLBOX FOR CONSIDERATION

Figures

Figure 1: Research Methodology
Figure 2: Design Approach and Process
Figure 3: Approach to knowledge access and knowledge creation in a Virtual Obeya
Figure 4: Knowledge access in the LinkedDesign architecture
Figure 5: Knowledge spiral for knowledge growth
Figure 6: Spiral of organisational knowledge creation
Figure 7: Framework for knowledge maturing (from Kump et al., 2011)
Figure 8: Modes of knowledge reuse
Figure 9: Model of computer-supported reflective learning (from Krogstie et al., 2012)
Figure 10: Production situations and the order penetration point (Olhager, 2003, p. 320)
Figure 11: Integrating business processes across the supply chain (adapted from Lambert & Cooper, 2000)
Figure 12: A model of a generic change process from Jarratt et al. (2004a)
Figure 13: Magic Quadrant for Manufacturing Product Life Cycle Management systems (Gartner, 2008)
Figure 14: Magic Quadrant for ERP for Product-Centric Midmarket Companies (Gartner, 2012)
Figure 15: PLM functionalities in Teamcenter
Figure 16: Different change request types in Teamcenter
Figure 17: Workflow of engineering change management in Teamcenter
Figure 18: Microsoft Dynamics AX 2012 sales orders (Lerberg, 2012)
Figure 19: Microsoft Dynamics AX 2012 production orders (Lerberg, 2012)
Figure 20: Microsoft Dynamics AX 2012 production orders (Lerberg, 2012)
Figure 21: Microsoft Dynamics AX 2012: Create alert rule (Lerberg, 2012)
Figure 22: ECM user interface in Dynamics AX (version 4.0)
Figure 23: Structure of SAP Business Suite and SAP HANA (SAP, 2013)
Figure 24: Merged user stories in LinkedDesign
Figure 25: Collaboration in error handling: Manufacturer and Knowledge Engineer
Figure 26: Collaboration in error handling: Manufacturer and Expert
Figure 27: Login page for the Collaborative Platform
Figure 28: Personalised front page
Figure 29: Rationale for the design of the functionalities in the Collaborative Platform
Figure 30: Order acquisition mock-up #
Figure 31: Order acquisition mock-up #
Figure 32: Order acquisition mock-up #
Figure 33: Order acquisition mock-up #

Figure 34: Engineering change management mock-up #
Figure 35: Error handling during the manufacturing phase
Figure 36: Error detection in quality checks
Figure 37: Order fulfilment mock-up
Figure 38: Connection between LEAP and the Collaborative Planning workbench
Figure 39: Proposal for collaborative planning, information and decision support system (COPIDSS)
Figure 40: Collaborative Planning Framework
Figure 41: SEQUAL framework for discussing quality of models
Figure 42: Dimensions in modelling
Figure 43: Dimensions in enterprise knowledge spaces
Figure 44: Link between common AKA and workplaces
Figure 45: The model-based workplaces of the engineering project pilot
Figure 46: Illustrating the current work logic with the material specification document
Figure 47: Model-configured workplaces driven by Active Knowledge Architectures
Figure 48: Core EKA elements

Tables

Table 1: Main research phases M7-M18
Table 2: Data gathering and research activities M7-M18
Table 3: Sources of waste in collaboration
Table 4: Selected IT tools used by the use case companies (full list in Appendix 1)
Table 5: Process template for improving user collaboration
Table 6: Types of collaborative planning (adapted from Kilger & Reuter, 2005)
Table 7: Functionality for change management tasks and functionality in AX
Table 8: SAP collaborative planning functionality (source: )
Table 9: Mitigating waste in collaboration
Table 10: Order acquisition related information from interviews of use case companies
Table 11: Engineering change related information from interviews of use case companies
Table 12: Error handling related information from interviews of use case companies
Table 13: Order fulfilment related information from interviews of use case companies
Table 14: Summary of collaborative planning requirements
Table 15: Dimensions of data quality

Glossary

APS: Advanced Planning and Scheduling
CAD: Computer Aided Design
CAE: Computer Aided Engineering
CPFR: Collaborative Planning, Forecasting and Replenishment
ERP: Enterprise Resource Planning
ETO: Engineer-to-Order
ICT: Information and Communication Technology
IPR: Intellectual Property Rights
KBE: Knowledge Based Engineering
LEAP: Linked Engineering and manufacturing Platform
MES: Manufacturing Execution Systems
MRP-II: Manufacturing Resource Planning
MRP: Materials Requirement Planning
PLM: Product Lifecycle Management
ROP: Reorder point
SoA: Service-oriented Architecture
T: LinkedDesign Task
WP: LinkedDesign Work Package

1 Introduction and overview

In this document, the research carried out in LinkedDesign Task 5.2, Lean engineering collaboration system diagnostics & optimisation, and Task 5.3, Collaborative planning through effective order management, is presented. This introductory chapter first provides the background of Task 5.2 and Task 5.3 by presenting WP5 and by outlining the scope of the two tasks. Whereas Task 5.2 addresses support for the collaborative process itself, Task 5.3 focuses on how to support the planning and management of the collaborative work.

1.1 Background and WP5 objectives

This report is written as part of WP5 in the LinkedDesign project. The main objective of WP5 is to provide the front end of LinkedDesign: the user interface and navigation that enables and supports close contact and process-oriented knowledge exchange between experts from different working domains. WP5 will develop a collaborative, context-driven user interface, a Virtual Obeya, which enables user-friendly access to all required engineering data and information sources. The work package builds upon previous work in LinkedDesign from WP1, WP2, WP3, WP4 and WP6, and takes important input from the industrial use cases in the project into account. WP5 comprises five separate tasks. This report is a joint deliverable between two of these tasks:

Task 5.2: Lean engineering collaboration system diagnostics and optimization
Task 5.3: Collaborative planning through effective order management

The scope of each task is described below.

1.2 Scope of Task 5.2

The Description of Work outlines the scope of Task 5.2 as described below. The goal of this task is to establish a solution for documentation and communication of best practice, securing that recommended solutions are available and used. This includes an evaluation of engineering tools, such as CAD, CAE, ERP etc., as knowledge sources, and an analysis of innovation tools and external innovation interfaces.
New concepts and principles for lean engineering collaboration, utilizing among other things model-driven workplaces from AKM, will be explored, as well as new diagnostics frameworks for lean engineering collaboration system analysis and optimisation.

The scope of the task has been interpreted to have three main aspects:

1) Evaluation of the appropriateness of a selected number of existing knowledge sources. The selected knowledge sources are of the types found particularly relevant in the use cases of the project, although the assessment is written to be more generally applicable.
2) Ways of both manually and semi-automatically linking up relevant knowledge developed as part of a project/task (a main principle of model-driven workplaces using AKM).
3) Capturing best practice for later use, performing process improvement in lean engineering environments in support of innovation.

1.3 Scope of Task 5.3

The Description of Work outlines the scope of Task 5.3 as described below. The numbering has been inserted by the authors and is commented on below.

The objective of this task is to develop a (1) workbench for collaborative planning and management of (2) engineering and supply orders in project supply chains, as an integrated part of a comprehensive engineering platform. Engineering projects involve uncertainty and change requests that create delays. Efficient change processes and innovative planning methods can allow companies to resolve issues more quickly and reduce the impact a change has on the product launch date in manufacturing. The task includes an (3) evaluation of existing change control and planning functionality in existing PLM and ERP systems, and of innovative planning methods that enable project managers to track status, progress and projected delivery of components. This task will deliver a workbench that supports project supply chain planning and order management, where decentralised planning decisions are made with regard to individual company engineering operations but with a network perspective.

The scope of the task has been interpreted in the following way:

1) The workbench will be described with functionality, requirements and illustrations of the interface through a series of mock-ups.
A fully testable pilot of the workbench is outside the scope of this task.

2) The workbench will be targeted at companies that produce customer-specific products. Relevant directions in the literature for such companies are mass customisation, one-off production and Engineer-To-Order companies. However, many of the underlying

principles in our proposed solution for collaborative planning will be equally relevant for Make-To-Stock, Assemble-To-Order and Make-To-Order companies.

3) We do not seek to evaluate all existing ERP and PLM systems, as this would require resources beyond the frame of this task. We rather focus on studying a selection of ERP and PLM systems, based on criteria such as market size and usage by the industrial case companies in LinkedDesign.

1.4 Structure of the report

The deliverable report is structured in seven main sections:

1. Introduction: background, objectives, scope and executive summary
2. Research approach and methodology
3. Lean engineering collaboration (Task 5.2)
4. Collaborative planning (Task 5.3)
5. Conclusions
6. References
7. Appendices

1.5 Executive summary

The key challenges for many enterprises today are the complexity of products and the uncertainty of processes. Customer requirements are subject to frequent changes, calling for enterprise agility and flexibility. At the same time, companies must uphold high efficiency to keep up with increasing global competition. This requires companies to increase the value creation of their engineering, innovation and collaboration processes.

Task 5.2: Lean engineering collaboration system diagnostics and optimization

As part of T5.2, an assessment of knowledge sources specifically relevant for use in a Virtual Obeya has been carried out. This has indicated opportunities, but also challenges, when trying to integrate data from different knowledge sources (typically used by people in different roles in an organization) in a common user interface supporting collaboration. In particular, it highlights that different tools have varying degrees of explicit meta-models (data models); in many export formats, for example, some of the important product data information is lost. Even when different tools support, for instance, process data, it is often process data at different levels of granularity. Each tool on its own faces challenges relative to waste in lean engineering. In a Virtual Obeya environment, one would explicitly want to combine data from different sources to address these sources of waste in a context-driven manner. Depending on the concrete knowledge sources to be combined, preparing for such matching is often a partly manual job. The differing levels of agreement on data from different sources (social quality) can also influence the use of schema and object matching techniques in practice. A type of intermediate storage following the structure of a core ontology, as illustrated in the AKM approach described in Appendix 4, is regarded as beneficial.
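To illustrate why preparing such matching is often a partly manual job, the sketch below applies simple name-based schema matching to two hypothetical tool export schemas against a simplified core ontology. All field and term names are invented for illustration; real matching in a Virtual Obeya would operate on the tools' actual export formats and a richer ontology.

```python
from difflib import SequenceMatcher

# Hypothetical export schemas from two engineering tools (invented names).
cad_fields = ["PartNo", "MaterialSpec", "Mass_kg", "CreatedBy"]
erp_fields = ["part_number", "material", "weight", "created_by", "order_id"]

# Simplified terms a core-ontology-based intermediate store might use.
core_terms = ["part_number", "material", "mass", "author"]

def normalise(name: str) -> str:
    """Lower-case and strip separators so 'PartNo' resembles 'part_number'."""
    return name.lower().replace("_", "").replace("-", "")

def best_match(field: str, candidates: list[str], threshold: float = 0.6):
    """Return the candidate most similar to `field`, or None below threshold."""
    scored = [(SequenceMatcher(None, normalise(field), normalise(c)).ratio(), c)
              for c in candidates]
    score, term = max(scored)
    return term if score >= threshold else None

# Propose a mapping from each tool schema into the core ontology;
# unmatched fields (None) are flagged for manual review.
for schema in (cad_fields, erp_fields):
    for field in schema:
        print(field, "->", best_match(field, core_terms))
```

Note that purely name-based matching finds "PartNo" and "MaterialSpec" but misses semantic matches such as "weight" vs. "mass" or "CreatedBy" vs. "author", which is exactly where the manual preparation effort noted above comes in.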
A framework for understanding different approaches, based on different levels of ambition for knowledge creation in a Virtual Obeya, is presented. This can guide the particular support provided in the Virtual Obeya: in particular, one must keep in mind the knowledge maturity one wants to achieve, and choose appropriate levels of visibility and formality for new knowledge developed when working in the Virtual Obeya, based on the current and desired levels of maturity.

An overview of modes of knowledge reuse and process improvement in collaborative environments is presented. Particularly important is the link to potential tool support for achieving learning and improvement through reflection, which provides ideas for functionality to include in a Virtual Obeya supporting collaborative reflection and lean engineering.

Task 5.3: Collaborative planning through effective order management

Engineering projects involve uncertainty and change requests that create delays. Efficient change processes and innovative planning methods can allow companies to resolve issues more quickly and reduce the impact a change has on the product launch date in manufacturing.

The task started with a literature review on collaborative planning, order management and engineering changes. Based on this literature, 18 key functional requirements for a collaborative planning workbench were derived. In order to understand the current state-of-the-art within Enterprise Resource Planning (ERP) and Product Lifecycle Management (PLM), three systems were studied as cases to highlight best practices and current challenges. Siemens PLM Teamcenter, Microsoft Dynamics AX and SAP were selected due to their leading positions in the market. Current ERP and PLM solutions offer extensive functionality and many exciting features for collaborative planning. There are important trends towards improved user interfaces, big data analytics, cloud services and mobile accessibility. However, some of the drawbacks and limitations in existing solutions were found to be a lack of proper support for:

- Engineering change propagation analysis
- Knowledge management and reuse
- Interoperability to allow free interaction between different systems
- Addressing errors, operational issues, uncertainties, engineering changes, variations and risks explicitly

Empirical data about the current status and future needs of three use case partners in LinkedDesign has been gathered. These interviews complemented the list of functional requirements identified in the literature. A main contribution from this research is a proposed workbench that supports supply chain planning, order management and engineering change management. The workbench is presented through a series of mock-ups, which show how the functional requirements can be addressed in graphical user interfaces. Key generic capabilities of the proposed workbench include:

- Personalised and role-based support and capabilities
- Easy access to all information relevant to users' tasks, including historical data
- Real-time access to domain knowledge and expertise
- Support for internal knowledge and experience transfer
- Life cycle assessment capabilities

The key specific capabilities of the collaborative planning workbench include support for:

- Order acquisition business processes
- Engineering change management processes
- Exceptions and error handling processes
- Order fulfillment business processes

The mock-ups can serve as a starting point for further development within the LEAP platform, the Virtual Obeya and for European commercial system providers. The work concludes with a collaborative planning framework.
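One of the gaps identified in existing ERP and PLM solutions is engineering change propagation analysis. A minimal sketch of what such an analysis could compute is a traversal of a component dependency graph to find all parts potentially affected by a change; the graph below is entirely hypothetical and stands in for a real product structure.

```python
from collections import deque

# Hypothetical component-dependency graph: an edge A -> B means a
# change to A may propagate to B (e.g. via a shared interface).
depends_on = {
    "chassis": ["mounting_bracket", "cable_harness"],
    "mounting_bracket": ["robot_arm"],
    "cable_harness": ["control_unit"],
    "robot_arm": [],
    "control_unit": ["software_config"],
    "software_config": [],
}

def propagation_set(changed: str, graph: dict[str, list[str]]) -> set[str]:
    """Breadth-first traversal returning all components potentially
    affected by a change to `changed` (excluding the component itself)."""
    affected, queue = set(), deque([changed])
    while queue:
        for nxt in graph.get(queue.popleft(), []):
            if nxt not in affected:
                affected.add(nxt)
                queue.append(nxt)
    return affected

# A change to the chassis may affect every downstream component.
print(sorted(propagation_set("chassis", depends_on)))
```

A real propagation analysis would of course weight edges by change likelihood and cost, but even this simple reachability view supports the kind of explicit change impact assessment the review found lacking.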

2 Research methodology

The research approach has been somewhat different in the two tasks, as their goals have been different. Task 5.2 sets out to evaluate existing knowledge sources, describe ways of linking knowledge and capture best practices for lean engineering. Task 5.3, on the other hand, had the objective of describing functionality for a collaborative planning workbench, based on user requirements, findings in the literature and the state-of-the-art within existing ERP and PLM systems. Therefore, the methodological approach for each task is described separately.

2.1 Research approach Task 5.2

The research in Task 5.2 follows an analytical approach, with validation of results using identified information within and outside the project on the different areas assessed. We have:

o Asked the project partners to reassess relevant input from previous deliverables (D3.1 in particular, in addition to D7.1, D8.1 and D9.1) to identify the important tool types versus the overall requirements from the project and the use cases in particular
o Identified the main tools of each type to evaluate, based on what is used in the use cases
o Adapted a widely cited generic framework for the assessment of data and model quality to our needs
o Evaluated the characteristics of each knowledge source with respect to its applicability for being included in and linked to the Virtual Obeya, using the framework through a combination of literature search and interviews
o Validated the results in the different areas with experts on the different tools and tool types, both inside and outside the LinkedDesign project

2.2 Research approach for Task 5.3

In order to develop the description of a collaborative planning workbench, the research team conducted five main steps within the study:

1. Literature review, focusing on three areas: collaborative planning, order management and change management
2. Study of the state-of-the-art within ERP and PLM systems, including vendor interviews
3.
Information analysis: building scenarios from user stories
4. Workbench description through mock-ups

5. Interviews of use case partners to gather empirical data on current status and requirements

In addition, documentation of results and reporting has been an ongoing process throughout the entire study. The methodological approach is illustrated in Figure 1.

[Figure 1: Research Methodology. The five steps above with their main contributions: collaborative planning requirements from the literature review and the ERP/PLM evaluation (desk study, vendor interviews, identification of gaps in best practice); processes and roles from the information analysis of user stories and scenarios; first and final versions of the mock-ups (drawn in Pencil) from the workbench design and the three use case interviews; and, as overall results, visual illustrations of the Collaborative Planning Workbench and a framework for collaborative planning.]

Information analysis

The information analysis phase consists of a systematic analysis and evaluation of user stories to obtain requirements for the workbench, as stated in step three of the overall research approach (see Figure 1). The starting point for this work has been the user stories provided by the users. User stories are a means of involving stakeholders directly in the software engineering process and obtaining requirements from them, as in agile software engineering or extreme programming. One limitation of user stories is that they are inadequate as software requirements; in particular, software developed from user stories often lacks documentation of the requirements (Nawrocki et al., 2002).
Although domain experts from industry have provided the user stories, they may not have conveyed enough domain knowledge, since domain experts hold tacit knowledge (Nawrocki et al., 2002), as

discussed by Suchman (1995). The purpose of our work is not only to design and implement technological support, but also to understand work practices and support best practices for collaboration and collaborative planning in engineering design and manufacturing. There is a great need to understand human needs, individual perspectives and how people work. In situations that involve human collaboration and humans interacting with technology, complementary methods have often been used to obtain a deeper understanding of human needs and ways of working. For example, Maiden et al. used human activity modelling, system goal modelling (i* modelling; Yu, 1997) and use case modelling to obtain different perspectives and understand air traffic management as a complex socio-technical system (Maiden, Jones et al., 2004). Goal-oriented requirements modelling has been used to identify business objectives, and use case maps, a scenario-based approach, have been used to describe the business process and the social context (Liu & Yu, 2002).

The above examples highlight the role of scenarios as a complementary approach to understanding users' needs and requirements to support better system design (Carroll, 1995), where abstract situations may be analysed to identify more concrete requirements. This approach is particularly suitable for designing systems where there are interactions among humans as well as between humans and technology, as in collaborative planning and design processes. Scenarios bring in a more comprehensive context of how people work, or put a user's needs into context, allowing better design. Scenarios can evoke reflection in the context of design and help visualise situations, thus facilitating users in being more explicit about how they work and what they need (Carroll, 1999). The approach that we have taken is illustrated in Figure 2.
In addition to the user stories, we have reviewed relevant literature to identify challenges in collaborative processes. We have described scenarios ("stories about people and their activities"; Carroll, 1999) to obtain deeper insight into the users' needs and to identify the requirements for the design, the collaborative processes their work involves and whom they collaborate with. These were used to design a solution.

[Figure 2: Design Approach and Process. User stories and literature feed into scenarios, from which requirements, collaborative processes and additional roles are derived as input to the design.]
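Since one noted limitation of user stories is the missing documentation of requirements, the traceability from user stories to derived requirements can be captured in a small data structure. The sketch below is a minimal illustration; all identifiers and texts are invented, not taken from the project's actual user stories.

```python
from dataclasses import dataclass

# Minimal traceability structure linking derived requirements back to
# the user stories they came from (all content invented for illustration).
@dataclass
class UserStory:
    id: str
    text: str

@dataclass
class Requirement:
    id: str
    text: str
    sources: list  # ids of the user stories the requirement is derived from

stories = [
    UserStory("US1", "As a planner I want to see order status across partners."),
    UserStory("US2", "As an engineer I want to be notified of change requests."),
]

requirements = [
    Requirement("R1", "Provide a shared view of order status.", ["US1"]),
    Requirement("R2", "Notify affected roles of engineering changes.", ["US2"]),
]

def trace(req_id: str) -> list:
    """Return the texts of the user stories a requirement was derived from."""
    req = next(r for r in requirements if r.id == req_id)
    return [s.text for s in stories if s.id in req.sources]

print(trace("R2"))
```

Such a mapping makes it possible to revisit the originating story (and its scenario) whenever a requirement is questioned during validation interviews.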

Figure 1 and Figure 2 show the process that we have followed in this work. Users were interviewed at the beginning of the project, during a workshop. We then described scenarios based on the user stories from the interviews and analysed them to identify the requirements and collaborative processes, in order to propose the design of a solution. The design and the experiences from the scenario analyses were used to validate the collaborative processes by interviewing the users. These interviews were more focused and revolved around the design and the collaborative processes. Based on the input from the interviews, we then refined the design. The interview guide provided in Appendix 6 was used to support the interviews. The refined design is illustrated as mock-ups in later chapters.

Research Phases and Activities

The table below contains an excerpt of the main phases in the development of deliverable D5.2.

Activities / phases | Month(s) | Results
Work planning, conceptual development, project integration | March-July '12 | Work plan
Literature review | May-December '12 | Conference paper, deliverable chapters
Requirement and information analysis | November '12-January '13 | Functional requirements, scenarios, conference papers
Design of workbench | November '12-January '13 | User interface mock-ups
Interviews with system vendors and use case partners | November '12-January '13 | Empirical interview data
D5.2 finalisation | January-February '13 | D5.2 finalized

Table 1: Main research phases M7-M18

The table below contains an excerpt of the key logged research activities that have been initiated to gather data for D5.2 and carry out specific research activities.

Activity | Date(s) | Results
Plenary meeting in Dresden | | Initial user stories, project vision
Plenary meeting in Kassel | | Guiding principles, WP dependencies
Plenary meeting in Turin | | Presentation of approach for T5.2; investigation of the most important reasons for waste in lean engineering among the use case partners
Meeting with Aker Solutions | | Capturing tool usage and aspects of data quality in Aker Solutions
Meeting with Aker Solutions | | Meeting minutes, status on practices and information sources
1st interview with Teamcenter | | Interview data
2nd interview with Teamcenter | | Interview data
3rd interview with Teamcenter | | Interview data
Interview with Aker Solutions | | Interview data
Interview with Comau | | Interview data
Interview with Volkswagen | | Interview data
WP5 synchronisation video calls | Bi-weekly from June '12 to Feb '13 | Work package coordination

Table 2: Data gathering and research activities M7-M18

3 Lean engineering collaboration (T5.2)

3.1 Lean Engineering: Reducing Waste through the Virtual Obeya

The following section builds on the work described in D5.1 (section Lean Engineering: Reducing Waste through the Virtual Obeya) and reports results from a plenary workshop where the LEAP front end / graphical user interface (GUI) was discussed, specifically requirements and GUI ideas addressing commonly occurring waste in collaboration. The objective of this workshop was to provide insight and decision support material useful for the partners in achieving the overall LinkedDesign project objectives. More specifically, the workshop generated input on how application / end user UI requirements can be productively matched with the UI concepts and ideas provided by the RTD partners. In line with lean engineering principles, functions or features that add distinct value in specific work situations are of high interest, specifically how the UI can help reduce commonly observed sources of waste in collaborative engineering situations. Concrete ideas to address this are found in Appendix 2.

3.1.1 Workshop results

Waste form (1) | Description | Performance dimension(s) | How waste can be reduced | End users | Priorities (2), all partners | Ideas (3), all partners
Divergence | Wasted efforts due to politics, mismatch of goals | Effectiveness | Unification (Hansen, 2009) | | - | -
Misunderstanding | Disconnect in understanding | Effectiveness | In-context collaboration; semantic GUIs supporting aggregated knowledge representations | Aker Solutions | 9 | 9
Undercommunicating | Excess or not enough time spent in collaboration | Effectiveness, Efficiency | Piloting aggregated knowledge representations beyond data and information | Aker Solutions | 7 | 8
Interpreting | Time spent interpreting communication or artefacts | Effectiveness, Efficiency | Activity-centric GUIs improving (collaborative) task identification and task execution; semantic GUIs supporting aggregated knowledge representations; interactive access to expertise that can transfer knowledge | VW, Comau | 3 | 3

(1) Waste forms identified for further exploration in D5.1 are marked in green.
(2) Priority indicated by the number of votes cast in the front-end workshop described above, during the Turin consortium meeting.
(3) Number of ideas provided to address each specific waste form, as discussed in the front-end workshop.

Searching
  Description: Time spent searching for information, relationships
  Performance dimension(s): Effectiveness, Efficiency
  How waste can be reduced: New search capabilities; broad knowledge discovery functionalities searching both knowledge and information, and the people behind the knowledge / information
  End users: VW, Comau, Aker Solutions

Motion
  Description: Handover of artefacts or communications
  Performance dimension(s): Efficiency
  How waste can be reduced: Make decisions as soon as possible; use notification mechanisms to flag decision items to relevant stakeholders
  End users / priority / ideas: - / - / -

Extra processing
  Description: Excess creation of artefacts or information
  Performance dimension(s): Effectiveness, Efficiency
  How waste can be reduced: Knowledge briefs, A3, aggregated knowledge and information views reducing
  Priority / ideas: 7 / 7

Translation
  Description: Time spent conforming objects to new inputs
  Performance dimension(s): Efficiency
  How waste can be reduced: Semantic, activity-centric GUIs improving (collaborative) task identification and task execution
  End users / priority / ideas: - / - / -

Waiting
  Description: Delays due to reviews, approvals, and bottlenecks
  Performance dimension(s): Efficiency
  How waste can be reduced: All relevant stakeholders directly involved in decisions; transparent processes highlighting items that have reached a definition-of-ready state for further processing
  End users / priority / ideas: VW / 4 / 5

Misapplication
  Description: Incorrect use of methods and technologies
  Performance dimension(s): Effectiveness
  How waste can be reduced: Collaborative approaches imply rapid feedback loops that to some extent prevent incorrect use, or at minimum incorrect sustained use
  Priority / ideas: 4 / 3

Table 3: Sources of waste in collaboration. Waste forms and descriptions adapted from McKinsey (2009); elaborated from the D5.1 report submitted in M9.

3.2 Evaluation of knowledge sources

The first part of this section outlines the general approach to assessing the quality of information sources, and together with Appendix 3 and Appendix 4 constitutes the core part of the handbook of this deliverable. This generic approach is then exemplified by applying the SEQUAL framework to the concrete needs of the LinkedDesign project. This acts both as an instantiation of the generic approach and as concrete support for other parts of LinkedDesign, in particular the use cases, and provides further background for the design and implementation of the Virtual Obeya.

Figure 3: Approach to knowledge access and knowledge creation in a Virtual Obeya

3.2.1 Quality of information sources

When we look at the quality of an information source (e.g. a CAD tool), we look at the tool, the structure of the stored data (the data model, including meta-data) and the characteristics of the data itself, in light of the LinkedDesign goal of reuse and re-visualization of data in a Virtual Obeya, in a way that might be annotated and/or updated through the user interface of LEAP for knowledge enhancement. The above figure (read from the right) describes the overall thinking. The users perform collaborative work using the Virtual Obeya. The Obeya presents context-specific information based on the persons involved and other relevant

information such as relevant products, projects, tasks, tools, rules and guidelines etc. The relevant context dimensions are related to the generic ontology (as developed in WP3). The data is mediated from existing work tools, and the data has to be transformed in some way depending on the relevant context. This can be implemented in several ways, but we do not go into detail on this here. The data presented and worked on in the Virtual Obeya can be annotated with other context-oriented information that is potentially stored in the original work tools, or in separate tools for knowledge management. Detailed work on context will be reported in D5.3 (based on the work in T5.4 Semantic UI principles and context based approach for Virtual Obeya). The generic approach for evaluation of data quality, based on the SEQUAL framework on quality of models, is described in Appendix 3. Looking at the sets of SEQUAL in the LinkedDesign context, we have the following:

G: There are goals on two levels: the goal to be achieved when using the base tool, and the goal of supporting collaborative work with data from this tool as one of (typically several) sources of knowledge for the Virtual Obeya. Our focus is on this second goal, although the two might be related.

L: The language is the way data is encoded (e.g. using some standard), and the language for describing the data model/meta-model (if this is explicit).

M: Model externalization, again on two levels: the data themselves and the data model controlling the way data is structured.

A: Actors, i.e. the people in different roles using the models, with a focus on the collaborators in the LinkedDesign use cases.
K: The knowledge of the actors (A) in these roles.

T: Technical actor interpretation relates to the possibilities of the languages used to provide tool support in handling the data (in the base tools, and also in a Virtual Obeya).

I: Social actor interpretation relates to how easy it is for the different actors to interpret the data as it can be presented (in the base tool, and also in a Virtual Obeya).

D: The domain can on a general level be looked upon relative to the concepts of an upper-level ontology (cf. WP3). Since the general LinkedDesign Ontology is developed in parallel with this work, we will align to it as necessary as the work progresses. For the moment, we focus on perspectives captured in the generic EKA (Enterprise Knowledge Architecture) of AKM, since this has been shown to be useful for context-based user interface development in other projects (Krogstie and Lillehagen, 2008; Lillehagen and Krogstie, 2008) (described in Appendix 4). Thus we consider information on:

1. Products
2. Tasks (including (business) processes and projects)

3. Goals and Rules (from standards to design rules)
4. Roles (including organizational structure and persons, and capabilities of persons and organizations)
5. Tools and technology

Note that in particular cases (e.g. in the LinkedDesign use cases), the domain is limited to what is relevant for manufacturing. The above perspectives are interdependent, but are also typically organized hierarchically within each perspective: e.g. products have sub-products/parts (aggregation), and products are of a certain product class, which itself can be of a more generic product class (classification/generalization). Rules can be part of a rule-set (e.g. the rule-set of a standard). A rule-set can be an instance of a class of rule-sets (a product standard), which again is a member of a more generic class (standard), etc. It is a hypothesis that the ability to traverse such aggregation and generalization hierarchies can be very important for providing a context-sensitive user interface. Based on this we can describe the quality of data more precisely in the following way:

Physical quality relates to whether the data is:
- Available in a physical format in a timely manner so that it can be reused in the Virtual Obeya. The availability of data to be used in other types of tools is also relevant here.
- Available in different versions when relevant.
- Able to carry relevant meta-data (e.g. the user that has made a CAD model, a time-stamp to judge currency, etc.).
- Available for update or annotation/extension in the user interface (by authorized users if this is an issue).
- Availability of data from other tools (in particular relevant to discuss if it can be made available in tools of the other types that are assessed).
- Only available to those that should have access, if there are security issues.

Empirical quality is not directly relevant when evaluating the data sources per se.
Guidelines for this are relevant when we consider how data can be presented in tools (and in the Virtual Obeya itself), e.g. as a visualized CAD drawing that can be manipulated in a 3D interface.

Syntactic quality. Is the data represented in a way that follows the defined syntax (e.g. by following a strictly defined standard), and respects this syntax in a foreseeable way?

Semantic quality. Does the data source contain the expected type of data (and only this type; alternatively, can one extract the necessary information only)? This is looked upon relative to the domain identified on the upper level of the

ontology/context-model as discussed above, although a single source will seldom have data of all relevant types. Note that we here look at the possibility of representing the relevant type of data (and not data from other domains); obviously the level of actual completeness and validity of the data depends on what is modelled in the concrete case. Tools might also have mechanisms for supporting the rapid development of complete models (e.g. catalogues of standard items in CAD tools).

Pragmatic quality. Is the data of such a category that it can be easily understood (or visualized in a way that can be easily understood) by the stakeholders? Since the context of work changes in the Virtual Obeya, it is important that the data can be flexibly represented, including access to relevant meta-data. The possibility of tool interpretation can also be important, given that one needs to identify the relevant aspects to show in the Virtual Obeya based on the relevant context. Since there is limited focus on process automation in LinkedDesign, the need for representations with an executional semantics is limited.

Social quality. Is there agreement on the quality of the data among the stakeholders? Since different data comes from different tools, and often needs to be integrated in the Virtual Obeya, agreement on the interpretation of data and on the quality of this data across the involved stakeholders can be important. Concrete techniques for integrating data from different sources are handled in WP2 of the project and are not focused on here.

Deontic quality. Will data from the data source enable us to achieve the goals of LEAP? These goals can be related to the discussion on reducing waste in lean engineering. In D5.1 the different areas of waste in lean engineering were described.
As described in the previous section, the use case partners and other project partners have prioritized the waste areas, and we have used this input to come up with the following list of the most important ones:

- Searching: time spent searching for information
- Under-communication: excessive or not enough time spent in communication
- Misunderstanding: disconnect in understanding
- Interpreting: time spent on interpreting communication or artefacts
- Waiting: delays due to reviews, approvals etc.
- Extra processing: excess creation of artefacts or information
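The aggregation and generalization hierarchies discussed earlier in this section (products and sub-products, rule-sets and standards) lend themselves to simple programmatic traversal. The following is a minimal, hypothetical Python sketch of how a context-sensitive user interface could walk such hierarchies; the classes, relation names and example instances are invented for this handbook and are not part of the LinkedDesign ontology.

```python
# Illustrative sketch only: the classes, relation names and example
# instances below are invented, not taken from the LinkedDesign ontology.

class Node:
    """A knowledge element, e.g. a product, a rule or a rule-set."""
    def __init__(self, name, parent=None, kind=None):
        self.name = name
        self.parent = parent  # aggregation: part-of
        self.kind = kind      # classification/generalization: instance-of

def ancestors(node, relation):
    """Walk upwards along one hierarchy ('parent' or 'kind')."""
    current = getattr(node, relation)
    while current is not None:
        yield current
        current = getattr(current, relation)

# Mirroring the example in the text: a rule-set that is an instance of
# "product standard", which again is a member of the more generic
# class "standard".
standard = Node("standard")
product_standard = Node("product standard", kind=standard)
rule_set = Node("rule-set X", kind=product_standard)
rule = Node("design rule R1", parent=rule_set)

print([n.name for n in ancestors(rule_set, "kind")])
# ['product standard', 'standard']
print([n.name for n in ancestors(rule, "parent")])
# ['rule-set X']
```

A context-sensitive user interface could use such traversals to decide, for a given element, which more general or more aggregate elements are relevant to show.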

3.2.2 Concrete evaluation of tools and tool-types

Relating to the LinkedDesign Architecture (developed in WP1), only the parts depicted in Figure 4 (a subset of the total architecture) are relevant here (relating specifically to getting data from existing data sources for reuse in the Virtual Obeya).

Figure 4: Knowledge access in the LinkedDesign architecture

Note that we treat KBE sources as any other data source. Relevant knowledge sources for LinkedDesign were previously identified in D2.1 and D3.1. The descriptions in user stories from the different cases as reported in D7.1, D8.1 and D9.1, and concrete feedback from the use cases, have provided more detailed input to this list, in addition to requirements for support of collaborative planning identified in Task 5.3 (described in more detail in section 4 of this deliverable). The result of this process is shown in boldface in the table below. The tools shown in normal font are tools of these tool-types used in the use case organizations, but which do not have a central role in the specific use cases.

Tool | Aker Solutions | Comau | VW | T5.3
Office Automation (Spreadsheet, etc.) | Sharepoint, Excel | - | - | -

3D Computer Aided Design (CAD 3D) | PDMS (Aveva), Autocad | Catia v5, Solidworks | Catia V5 | -
Knowledge Based Engineering (KBE) | KBEDesign (AML, AML sketcher) | - | - | -
Product Data Management / Product Lifecycle Management (PDM/PLM) | Enovia | Internal software | Teamcenter | -
Enterprise Resource Planning (ERP) | Kesys, Mips, COMOS, SAP R/3 | SAP R/3 | SAP R/3 | MS Dynamics
Lifecycle Analysis (LCA) software | No specific tool yet - use Excel | In other dep. | - | -

Table 4: Selected IT tools used by the use case companies (full list in Appendix 1)

Based on this analysis, we have focused on the following concrete tools and tool types in the assessment (a more complete list of all the potentially useful tools in the case organizations is found in Appendix 1):

- Office automation: Excel (could also include Sharepoint, Word, Powerpoint etc.)
- Computer-Aided Design (CAD): PDMS, Autocad, Catia V5
- Knowledge-Based Engineering (KBE): KBEDesign
- Product Lifecycle Management / Product Data Management (PLM/PDM): Teamcenter, Enovia
- Enterprise Resource Planning (ERP): SAP ERP (R/3), MS Dynamics

In the following, the tools are evaluated, primarily relative to the general tool category, with specific aspects of the individual tools listed where relevant for illustration purposes.

Quality of Excel data

Much data and information relevant for engineers is developed and resides in office automation tools like Excel. We look in particular at Excel below, since the data here is potentially more structured than what is found in e.g. Word and Powerpoint, although this structure is mostly implicit. We also find Excel as a concrete knowledge source in at least one of the LinkedDesign use cases.

Features supporting physical quality of Excel data

Data in tools like Excel can be saved both in the native format (.xls, .xlsx), in open standards such as .html, .xps, .dif and .csv files, and in open document formats (e.g. .ods). Excel data can thus be made available in well-established forms following de jure and de facto standards, and can easily be made available for visualization and further use. One can also export e.g. PDF versions of spreadsheets to make the information available without any possibility for interaction, although this is often less relevant. Secure access to the data can only be enforced manually. Since the format is known, it is possible to save (updated) data from e.g. a Virtual Obeya, feeding it back to the original spreadsheet.

Features supporting empirical quality of Excel data

Excel includes several mechanisms for visualizing data in graphs and diagrams in ways that ensure good-looking visualizations, and these visualizations can be made available externally to other tools. The underlying rules and macros in the spreadsheets are typically not visualized, though.

Features supporting syntactic quality of Excel data

Although the syntax of the storage formats for Excel is well-defined, and standard data types can be specified on fields, there is no explicit information on the type of data (e.g. whether the data represents product information, process information etc.).
(Calculation) rules can be defined, but these lack a formal definition, and in many export formats (such as .csv) the rules are not included.

Features supporting semantic quality of Excel data

One can represent knowledge of all the listed categories in a spreadsheet, but since the data model is implicit, it is not possible to know what kind of data is available without support from the human developer of the data, or by having this represented in some other way.

Features supporting pragmatic quality of Excel data

As indicated under empirical quality, data in spreadsheets can be presented visually and shared (and the visualization can potentially be updated directly), but as discussed under semantic quality, one does not have explicit knowledge of the category of data represented (although this might be implicitly and informally represented in labels of columns etc.).

Features supporting social quality of Excel data

Since Excel (and other office automation tools) typically are personal tools (adapted to personal needs, even in cases where a company-wide template has been the starting point), there is a large risk of inconsistencies between the data (and underlying data models) in different spreadsheets, and between data found in spreadsheets and in other tools.
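The implicit typing discussed under syntactic and semantic quality can be illustrated with a small, self-contained Python sketch (the spreadsheet content and column names are invented for the example): once spreadsheet data has been exported to .csv, every field is just a string, and the formula behind a derived column is no longer present.

```python
# Illustrative sketch only: the spreadsheet content and column names
# are invented. It demonstrates that a .csv export of Excel data
# carries values as plain strings, with no data types and no formulas.
import csv
import io

# Hypothetical export: in the original sheet, "total" was computed by
# a formula (e.g. =B2*C2); only its value survives the export.
exported = io.StringIO()
writer = csv.writer(exported)
writer.writerow(["part", "unit_price", "qty", "total"])
writer.writerow(["flange", 12.5, 4, 50.0])

exported.seek(0)
header, data = list(csv.reader(exported))

# Every field comes back as a string; any typing has to be re-imposed
# by the reader based on implicit knowledge (e.g. column labels) only.
print([type(value).__name__ for value in data])
# ['str', 'str', 'str', 'str']
```

This is the practical consequence of the missing explicit data model: a consumer such as the Virtual Obeya must reconstruct both types and rules from context.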

Features supporting deontic quality of Excel data

Where much engineering knowledge is found in spreadsheets, it can be important to be able to include this in aggregated views in a Virtual Obeya. On the other hand, an explicit meta-model for the data matching the common ontology must typically be made in each case; thus it can be costly to ensure that all relevant data is available. As long as one keeps to the same (implicit) meta-model for the data in the spreadsheet, one can update the data in the Virtual Obeya and have it transferred to the original data source. On the other hand, if one needs to annotate the data with new categories (e.g. other categories found in the ontology), it is not easy to update the spreadsheet without also updating the meta-model, which requires manual intervention.

Looking at the waste forms, we can say the following:

- Searching: When Excel is used, there is often data in a number of different Excel sheets, used and developed by a number of different people, and it is hard to know that one has the right data/version available.
- Under-communication: There is no explicit data model, so the interpretation of data might be based on labels only, which can be interpreted differently by different persons. A number of (calculation) rules are typically captured in Excel sheets without being explicitly stated.
- Misunderstanding: Due to potentially different interpretations of terms, misunderstandings are likely.
- Interpreting: Since the meaning of data is under-communicated, interpretation might take quite long.
- Waiting: If data must be manually transformed to another format to be usable, this might be an issue.
- Extra processing: Due to the versatility of tools like Excel, it is very easy to represent additional data and rules, even if they are not deemed useful.

Quality of data in CAD tools

CAD (Computer Aided Design) tools are tools used to assist in the creation, modification, analysis or optimization of the design of a product.
CAD software uses either vector-based graphics to depict the objects of traditional drafting, or may also produce raster graphics showing the overall appearance of designed objects. Whereas CAD traditionally was used by product designers, functionality to support different engineering professions is often included, and the term CAE (Computer Aided Engineering) is also often used for this tool type. A large number of CAD tools exist (e.g. Autocad, Autodesk, PDMS, SolidWorks, Unigraphics, I-deas NX, Solid Edge, Catia V5, Pro Engineer). Some tools, e.g. in the Catia family, have also been extended to full-fledged PLM systems, which we discuss in more detail in the section on PDM/PLM tools below. We will exemplify with PDMS from Aveva, described also in WP9 (D9.1, section 5), and also mention aspects found in tools such as Siemens NX and Catia V5 as described in WP6 (D6.1). PDMS (Aveva, 2012), developed by Aveva, is a

customizable, multi-discipline, multi-user, engineer-controlled 3D CAD environment commonly used in the offshore industry. It is typically used by engineers but, as will be discussed below, it is usual to integrate it with other tool types such as KBE (in Aker Solutions) and PLM tools.

Features supporting physical quality of CAD data

Data is stored in a local database and can be (partly) exchanged based on standard representations (see syntactic quality). Tools such as PDMS support simultaneous work by multiple users (from multiple disciplines), with support for versioning and access control. In many tools, not only the product information is stored, but also the history tree of operations done to produce the model. The detailed database structure of, for instance, PDMS is described in D9.1. Exchange with KBE and PLM tools is described below (for integration with KBE, see also D9.1). It is also possible to export data in generic formats (for visualization) like PDF. In these formats the CAD data cannot be changed directly.

Features supporting empirical quality of CAD data

CAD tools typically have good functionality for visualizing the product data in 3D. Because of its economic importance, CAD has been a major driving force for research in computational geometry and computer graphics, and thus for the visualization algorithms that are typically the means considered under empirical quality.

Features supporting syntactic quality of CAD data

The international CAD data exchange standards, including IGES and ISO 10303 (informally known as STEP, the STandard for the Exchange of Product model data), have so far been limited to the transfer of geometry. Note that when exporting a model in STEP or IGES, the history tree is not retained. These standards have been incapable of handling parameters, constraints, design features and other design intent data generated by modern CAD systems (Kim, Pratt, Iyer, & Sriram, 2007).
In the ProSTEP iViP project [4], the new STEP AP 242 standard (see also the KBE section below) was implemented and tested in a consortium including the vendors of the key CAD systems CATIA, Pro/Engineer and Siemens NX. These new STEP standard interfaces are so far not announced as standard functionality of these CAD systems.

[4] Final Project Report, Parametrical 3D Data Exchange via STEP: Parametrics_1.0.pdf

Features supporting semantic quality of CAD data

CAD systems typically focus on the (geometric) representation of products at the instance level (compared to KBE below, where the representation is generative). One might also

represent some rules relative to the product in CAD systems (relative to e.g. the materials, processes, dimensions and tolerances involved), but one does not capture knowledge on organizational structure, tools and underlying business processes in these tools. Another limitation is that most tools offer only a limited representation of the function (e.g. the overall goal) of the different parts of the design, although some tools support simultaneous representation of (functional and non-functional) requirements with the product information. Nor is the design rationale behind design decisions normally captured. CAD tools typically support the development of catalogues of elements. A catalogue contains standard reference data for each available type, e.g. of pipe elbows: material, bend angle and so on. This can support the rapid development of new structures, i.e. support rapid achievement of completeness of the product model. The support of default values in the tool user interface can also be useful in this respect. CAD tools are often integrated with a number of different analysis tools (e.g. for finite element analysis), which can support the development of valid design models.

Features supporting pragmatic quality of CAD data

CAD tools support numerous ways of visualizing the knowledge on the design in an integrated manner, e.g. 3D views, product model trees etc. Viewing mechanisms showing only parts of the overall structure through layering (e.g. the parts relevant for one discipline), while relating them to the common overall structure, are a very important means of handling the complexity of CAD drawings. Another interesting approach for ensuring comprehension is the export of the CAD data to prototyping tools such as 3D printers, which have recently become much cheaper. While the goal of automated CAD systems is to increase efficiency, they are not necessarily the best way to allow newcomers to understand the geometrical principles of solid modeling.
For this, scripting languages such as PLaSM (Programming Language of Solid Modeling) have been developed. The scripting approach is very different from working with an interactive GUI, but is preferred by some CAD instructors, as scripts reveal all details of the design procedure (not only the final design).

Features supporting social quality of CAD data

As stated above, CAD is typically used primarily by designers and some engineers, often at an early stage in product development. Thus the agreement on these data might be lower than e.g. for data developed in more organizationally integrating tools such as PLM and ERP tools. On the other hand, for the part of the data that is stored using established standards (e.g. STEP), it is probably easier to ensure identical interpretation across user groups and also across organizations, with the limitation that not all the relevant information is captured in these formats. STEP is developed and maintained by the ISO technical committee TC 184, Automation systems and integration, sub-committee SC 4, Industrial data. Like other ISO and IEC standards, STEP is copyrighted by ISO and is not freely available. However, the EXPRESS schemas are freely available, as are the recommended

practices for implementers. Also, since important data relates to the representation of physical products, it is probably easier to agree on than more conceptual data (e.g. through the development of physical prototypes).

Features supporting deontic quality of CAD data

Looking at the waste forms, we can conclude the following:

- Searching: Finding the relevant CAD data can be made easier by linking it to enterprise tools such as PLM tools.
- Under-communication: In CAD tools, there might be limitations in the representation of underlying design rules and process information (e.g. relevant for manufacturing) that could be useful at a later stage.
- Misunderstanding: Due to the number of assumptions included in a product design without necessarily being kept explicit, there might be misunderstandings at a later stage.
- Interpreting: A number of tools exist to do different types of analysis from different angles, which might make it easier to support interpretation of the product model.
- Waiting: If others than the designers and engineers need information from CAD tools, or need to have changes done at a later stage, they might be dependent on the availability of an engineer to do the changes.
- Extra processing: Since CAD tools store the geometry at the instance level, reuse for e.g. variants of products might take extra time.

Quality of data in KBE tools

KBE (Knowledge Based Engineering) has its roots in applying AI techniques (especially LISP-based ones) to engineering problems. In (La Rocca, 2012), four approaches/programming languages are described: IDL, GDL, AML and Intent!, all being extensions of LISP. In LinkedDesign, one particular KBE tool is used in Aker Solutions: KBEDesign. KBEDesign is an Aker Solutions engineering automation tool developed for Oil & Gas offshore platform engineering design and construction, built on top of a commercial Knowledge Based Engineering (KBE) application (Technosoft's AML), being similar to the AML sketcher.
AML is an object-oriented extension of LISP. AML and KBE are of particular importance for the Aker Solutions use case in WP9. This text builds upon and extends work reported in D6.1. Based on the use case, we note that there are two important data sources here: the representation of the engineering artifacts themselves, and the way the engineering rules are represented (in AML) as part of the overall code.
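To make the distinction between instance-level CAD geometry and the generative, rule-based representation in KBE more concrete, here is a small illustrative sketch in Python (not in AML; the part, its parameters and the design rule are invented for the example): the model stores parameters and rules, and dependent values are re-derived whenever an input changes.

```python
# Illustrative sketch only, written in Python rather than AML; the
# part, the parameters and the design rule are invented. It shows the
# generative style of KBE: geometry is not stored at the instance
# level, but derived from parameters and engineering rules on demand.

class DeckBeam:
    """Hypothetical parametric part."""
    MIN_HEIGHT_RATIO = 1 / 20  # assumed rule of thumb: height >= span/20

    def __init__(self, span_mm, load_kn):
        self.span_mm = span_mm
        self.load_kn = load_kn

    @property
    def height_mm(self):
        # Derived value: recomputed whenever it is read, so a change to
        # an input propagates automatically (cf. the dependency
        # tracking offered by the AML framework, discussed below).
        return self.span_mm * self.MIN_HEIGHT_RATIO

beam = DeckBeam(span_mm=6000, load_kn=40)
print(beam.height_mm)  # 300.0
beam.span_mm = 8000    # change an input parameter...
print(beam.height_mm)  # ...and the derived value follows: 400.0
```

A real KBE rule base would of course consider load, material and applicable standards; the point here is only the generative style of representation.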

Features supporting physical quality of KBE data

Knowledge and data is hard-coded in the AML framework. There exist classes for exporting the AML code into XML (the abxml format) or similar; however, some knowledge might be lost. There are also classes for interrogating the AML code for the information you want, along with classes for automatic report creation that can be read in programs such as Microsoft Word. It is in principle possible for KBEDesign to interact with most systems. What is so far implemented are import/export routines for files for analysis software like GeniE and STAAD.Pro; drawings are exported to DWG (the AutoCAD format). When the model is held within the tool, access rights can be controlled, but this is hard to enforce when the model is exchanged with other tools. There is limited support for controlling versions, both of the rule-set and of the models developed based on the rule-set. As for the rules, these are part of the overall code, which can be versioned using e.g. SVN. Some rules related to model hierarchy and metadata (not geometry) for export to CAD systems are stored in a database and can be set up per project. Some capability to import data contained in the CAD system PDMS via akxml is implemented.

Features supporting empirical quality of KBE data

Geometric data can be visualized as one instantiation of a model with certain input parameters. There are also multiple classes for different kinds of finite element analysis of the model. Whereas the engineering artefact worked on is visualized in the work tool, the AML rules are not available to the engineer in a visual format, nor to those developing the rule base; they are represented in code format (i.e. structured text).

Features supporting syntactic quality of KBE data

In AML, datatypes are not defined. Programs tend to run even with syntax errors in formulas, as there are both default values and mechanisms in place to ensure that programs can run with blank values.
The data is stored in a proprietary XML format, although, as indicated, it is also possible to make it available using CAD standards (but then only the information necessary for visualization is available). Options are available within AML for import and export to industry-standard file formats, including IGES, STEP, STL and DXF. New STEP standards going beyond the standards mentioned under CAD tools that are interesting in connection with KBE codification are:

- The standard for construction history, used to transfer the procedure used to construct the shape, referred to as ISO
- Standards for parameterisation and constraints for explicit geometric product models, providing an indication of what is permissible to change; refer to ISO for single parts and ISO for assemblies
- The standard for what is known as design features; refer to ISO

The STEP standards mentioned above are part of STEP AP Edition that will also become part of the upcoming AP 242 (the AP 203 and AP 214 merger). The XML representation of AP 242, as an open standard, could be a tool for KBE modeling as a basis for manual or automatic code generation, for example for the KBE programming language AML. For KBE modeling, lightweight representations of geometry could be enough for graphic representation, making the upcoming JT standard (ISO 14306) a good candidate. The valuable knowledge could be gathered in the backbone, which could be the STEP AP 242 XML representation.

Features supporting semantic quality of KBE data

The focus in KBEDesign is the representation of product data. AML is used to represent engineering rules. There are also possibilities in the core technology to represent process information related to the products. Note that an OO framework has some well-known limitations in representing rules, e.g. rules spanning many classes (Høydalsvik and Sindre, 1993). The AML framework also supports dependency tracking, so that if a value or rule is updated, everything that uses that value or rule is also changed. Dynamic instantiation is supported, providing a potentially short turnaround for changes to the rule-set.

Features supporting pragmatic quality of KBE data

The experiences from the Aker use case indicate that it is very important to be able to provide rule visualizations, and that these can be annotated with meta-data and additional information. Standard classes in the AML framework allow you to interrogate AML models and develop reports. Data can be visualized any way you want in AML, and if the required visualization is not part of the standard AML framework, it can be created.
It is practical to have everything working in the same environment, but it can be difficult for inexperienced users to find the right functionality.

Features supporting social quality of KBE data

KBE is a particular solution for engineering knowledge, and as the experiences from the use case illustrate, there is not always agreement on the rules represented. The KBEDesign tool is used for developing platform designs, but for other engineering and design tasks, other tools are used. Export to tools used company-wide, such as PDMS, is important to establish agreement, and thus the social quality of the models.

Features supporting deontic quality of KBE data

An important aspect of object-oriented, rule-based approaches of this sort is the potential for supporting reuse across domains.

Summarizing relative to factors for waste reduction in lean engineering

Searching: Storing all rules in the KBE system is useful in this regard, but they are structured only to a limited degree, e.g. relative to which rules influence each other,

which are there to follow a certain standard, etc. The rules are stored as part of the code and thus are not easily found;

Undercommunication: Since AML rules are accessible as code only, it can be hard to understand why different design decisions are enforced;

Misunderstanding: Can result from not having access to the rules directly;

Interpreting: Additional time might be needed for interpretation, for the above-mentioned reason;

Waiting: If support for updating rules (when necessary) does not come quickly, this can be an issue. The use of dynamic instantiation described under semantic quality can alleviate this; on the other hand, one needs people with very specific coding skills to add rules;

Extra processing: Rules might need to be represented differently to be useful in new situations. On the other hand, if the abstraction mechanisms are used well, this can be addressed. Note that full use of the OO mechanisms works best in stable domains.

Quality of data in PDM/PLM tools

Product lifecycle management (PLM) is the process of managing the entire lifecycle of a product from its conception, through design and manufacture, to service and disposal. Whereas CAD systems focus on early phases of design, PLM attempts to take a full lifecycle view. PLM intends to integrate people, data, processes and business systems, and provides a product information backbone for companies and their extended enterprise. There are a number of different PDM/PLM tools. Some tools that were previously CAD tools, like Catia, have extended their functionality to become PLM tools. The following is particularly based on a literature review and interviews with representatives for Teamcenter, which according to Gartner Group is the market leader internationally for PLM tools.
Aspects relative to how such tools are used together with other tools in order and change management are discussed in section There is typically a core group of people creating information for such tools, and a vast group of people consuming this information.

Features supporting physical quality of PDM/PLM data

Core product data is held in an internal database supported by a common data model. The data can be under revision/version and security (access) control. Some data related to the product might be held in external files, e.g. office documents. There can also be integration with CAD tools and ERP tools (both ways). For Teamcenter, for instance, there is CAD integration (with AutoCAD, Autodesk, SolidWorks, Unigraphics, I-deas NX, Solid Edge, Catia V5, Pro Engineer) and ERP integration (bi-directional with SAP ERP (R/3), MS Dynamics and Oracle). In addition to access on workstations, it is also possible to access the data on mobile platforms such as the iPad. Data can also be shared with e.g. suppliers, supporting secure data access across an extended enterprise. This kind of functionality should also make it easier to support access to data in the PLM system from the outside (e.g. from a Virtual Obeya).
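The revision/version and access control described above can be sketched as follows. This is a toy model with hypothetical class and field names, written to illustrate the idea, not Teamcenter's actual API or data model.

```python
from dataclasses import dataclass, field

@dataclass
class Revision:
    rev: str
    payload: dict

@dataclass
class ProductItem:
    """A toy PDM record: a revision chain plus a simple access list.
    Illustrative only -- real PLM systems are far richer."""
    item_id: str
    readers: set = field(default_factory=set)
    revisions: list = field(default_factory=list)

    def add_revision(self, user, payload):
        if user not in self.readers:           # crude access control
            raise PermissionError(f"{user} has no access to {self.item_id}")
        next_rev = chr(ord("A") + len(self.revisions))  # A, B, C, ...
        self.revisions.append(Revision(next_rev, payload))
        return next_rev

item = ProductItem("PUMP-100", readers={"alice"})
print(item.add_revision("alice", {"mass_kg": 42}))  # A
print(item.add_revision("alice", {"mass_kg": 40}))  # B
```

The sketch shows the two properties the text names: every change creates a new, identifiable revision, and only users on the access list can make one.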

Teamcenter has multi-site functionality, but working against the same database over long distances does not work well.

Features supporting empirical quality of PDM/PLM data

PLM tools typically support 2D and 3D visualization of the product and product structure within the tool. These visualizations are typically made in CAD tools, see above.

Features supporting syntactic quality of PDM/PLM data

Storage of PLM data is typically done according to existing standards. PLM XML (an emerging Siemens PLM Software format for facilitating product lifecycle interoperability using XML) is supported, in addition to the formats needed for export to the CAD (e.g. using JT) and ERP tools mentioned under physical quality. Newer standards such as QLM (see D1.1) are only used to a limited degree.

Features supporting semantic quality of PDM/PLM data

As the name implies, the main data kept in PLM systems is product data, including data relevant for the process the product undergoes through its lifecycle (e.g. through BOL, MOL, EOL; cf. the description of the QLM SOM Model in Deliverable D1.1). Schedule information and workflow modelling are supported in tools such as Teamcenter but, similar to CAD tools, the function of the parts in the product is not represented in most tools. Compliance management modules can support the representation of regulations (a type of rules).

Features supporting pragmatic quality of PDM/PLM data

Relevant context information can be added to the product description, supporting understanding. PLM systems have become very complex and, as such, more difficult to use and comprehend. The size of products (number of parts) has also increased over the years: whereas a jet engine in the 1960s had 3000 parts, in 2010 it might have parts. Reporting is traditionally done in Excel, but newer tools support running reports on the 3D model, presenting the results as annotations to the 3D model. The tool has been reported to be hard to learn if you are not an engineer.
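As an illustration of the kind of XML-based product-structure exchange that a format like PLM XML enables, the following sketch builds and serializes a tiny assembly tree with Python's standard library. Note that the element and attribute names are invented for illustration and do not follow the actual PLM XML schema.

```python
import xml.etree.ElementTree as ET

# Build a small product-structure fragment. The vocabulary below
# ("ProductStructure", "Item", "revision") is hypothetical -- the real
# PLM XML format defines its own schema.
root = ET.Element("ProductStructure")
pump = ET.SubElement(root, "Item", id="PUMP-100", revision="B")
ET.SubElement(pump, "Item", id="HOUSING-01", revision="A")
ET.SubElement(pump, "Item", id="IMPELLER-02", revision="C")

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

The value of such a representation for interoperability is that the nesting carries the assembly hierarchy and the attributes carry revision state, so any XML-capable tool can parse it back without access to the originating PLM system.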
Features supporting social quality of PDM/PLM data

PLM systems are enterprise-integrating systems. When implementing PLM systems, one needs to agree on the system set-up, data coding etc. across the organization. Thus, when these kinds of systems are successfully implemented, one can expect high agreement in the organization on the data found in the tool. Note that a similar issue to one found in ERP systems can occur, the so-called work and benefit disparity (originally discovered in so-called groupware systems (Grudin, 1994)): company-wide applications often require additional work from individuals who do not perceive a direct benefit from the use of the application. When e.g. creating new parts, many attributes need to be added, so data entry takes longer in the beginning (since the data is meant to serve the organization as a whole).

Features supporting deontic quality of PDM/PLM data

Looking at the waste forms, we can say the following:

Searching: Large models and a lot of extra data might make it difficult to get an overview and find all (and only) the relevant information. On the other hand, since there is a common data model, it should be easier to find all the data relevant for a given product.

Undercommunication: Since extra data has to be added up front for use later in the product lifecycle, there is a danger that not all necessary data is added (or that it is added with poor quality), which can lead to the next two issues:

Misunderstanding: Can be a result of under-communication.

Interpreting: When engineers and other groups need to communicate, one should also be aware of possible misunderstandings, given that these tools seem to be hard to learn if you are not an engineer. Also, given that only a few people are actually adding data (modelling), a lot of people need to interpret these models without actively producing them.

Waiting: It can be a challenge for a change to propagate to e.g. ERP systems and supplier systems. For some types of data this propagation is automatic.

Extra processing: It is necessary to add data up front. It can be a challenge, when changes are needed, to have the data produced in earlier phases updated.

Quality of data in ERP tools

The American Production and Inventory Control Society (APICS) defines ERP as a "framework for organizing, defining, and standardizing the business processes necessary to effectively plan and control an organization so the organization can use its internal knowledge to seek external advantage" (Blackstone, 2008). We will in particular focus on SAP ERP, previously known as SAP R/3, in this overview, although examples from other available tools such as MS Dynamics will be included.
ERP systems are very comprehensive tool-sets; we will therefore partly focus on the core modules of SAP ERP that are the most relevant as seen from the use case partners. The following are common functional areas covered by ERP systems in general. In many ERP systems these are called, and grouped together as, ERP modules that share some common data structures and may be used individually, together, or even together with third-party extensions:

Financial Accounting: General Ledger, Fixed Asset, Payables, Receivables, Cash Management, Financial Consolidation

Management Accounting:

Budgeting, Costing, Cost Management, Activity Based Costing

Human Resources: Recruiting, Training, Payroll, Benefits including pensions, Diversity Management, Retirement, Separation

Manufacturing: Engineering, Bill of Materials, Work Orders, Scheduling, Capacity, Workflow Management, Quality Control, Manufacturing Process, Manufacturing Projects, Manufacturing Flow, Product Life Cycle Management

Supply Chain Management: Supply Chain Planning, Supplier Scheduling, Order to Cash, Purchasing, Inventory, Product Configuration, Claim Processing

Project Management: Project Planning, Resource Planning, Project Costing, Work Breakdown Structure, Billing, Time and Expense, Performance Units, Activity Management

Customer Relationship Management: Sales and Marketing, Commissions, Service, Customer Contact, Call Center Support. CRM systems are not always considered part of ERP systems, but are rather handled in separate systems.

Data Services: Various "self-service" interfaces for customers, suppliers and/or employees

Access Control: Management of user privileges for various processes

In e.g. SAP ERP (R/3), the main modules are:

Financial Accounting (FI)
Financial Supply Chain Management (FSCM)
Controlling (CO)
Materials Management (MM)
Sales and Distribution (SD)
Logistics Execution (LE)
Production Planning (PP)
Quality Management (QM)
Plant Maintenance (PM)

Project System (PS)
Human Resources (HR)

ERP is usually implemented as a large monolithic system covering the different business functions of a company, with a common central database. Managing and tracking the vast number of transactions in a company within one system yields potentially many benefits over older, scattered legacy systems, but in practice the organization will still have a number of systems supporting functionality not found in the ERP system. Since ERP systems support the running of the main business, innovative work in new areas of the organization often needs to be supported by other tools, until it has matured to such an extent that it can be part of the core business of the organization. ERP systems like SAP R/3 offer central data dictionaries and programming environments that simplify the implementation of add-ons or the integration with other software packages. ERP systems have traditionally focused on internal process integration of well-understood business functions, such as sales, production, and inventory management (Kelle and Akbulut, 2005). In an exploratory study, Akkermans et al. (2003) identified some shortcomings of ERP systems in the field of supply chain management. The term "ERP II" was coined in the early 2000s. It describes web-based software that allows both employees and partners (such as suppliers and customers) real-time access to the systems. In connection with SAP, the SAP SCM part of the SAP Business Suite takes care of SCM in a way that can be integrated with SAP ERP.

Features supporting physical quality of ERP data

All the data in an ERP system is arranged in accordance with a central data dictionary that defines all the system's entities and their attributes and relationships. The data dictionary is based on the following structure:

System configuration tables: These are tables that are maintained primarily by the software vendor or the IT department.
Control tables: Control tables contain parameters that govern the actual behavior of the system. They can, for example, control whether it is possible to process invoices for goods that have not been delivered, or who is allowed to approve purchase requisitions at which approval level. The tables also contain the organizational structures of the company, e.g. which purchasing organization works on behalf of which company codes. Important in this respect is an underlying organizational ontology that constrains how companies' organizational structures may be modeled in the ERP system. If there is no good match between a company's structures and the organizational constraints of the system, the company may need to map its structures onto ERP structures in non-trivial ways that need to be investigated in detail beforehand for unintended consequences on functionality.

Master data tables: Master data tables and transactional data tables define the application data of the system. Both types are updated when the system is in operation.

An initial set of master data is set up in the course of the implementation project. Master data describes business entities that are reused and referred to by the daily transactions in the system. Examples of master data are G/L accounts (FI), assets (AM), cost centers (CO), materials (MM), customers (SD), vendors (MM), employees (HR), plants (PM) and work centers (PP).

Transactional data tables: Individual purchase orders, sales orders and plant maintenance orders are all transactional data, and each document tends to be split into different parts that are stored in different tables. The information about vendors in purchase orders, for example, is stored in a different table from the actual materials being requested. Information from the appropriate vendor is copied into the transactional data table from the vendor master data table.

In SAP there are in total around database tables. As discussed under integration with PLM systems, it is possible to update the data of ERP systems from the outside, and to export data to outside systems. User profiles correspond to the daily tasks and responsibilities of the user and should ensure that a user can only perform transactions that belong to his or her job. The combination of profile (e.g. purchasing manager) and organizational position (e.g. purchasing group X in purchasing organization Y) typically decides which transactions and master data are available, though individual profiles can be defined for maximum flexibility. Security/access control supported within the tool is, as with other tools, hard to enforce when providing data in settings external to the tool. SAP ERP transaction data can be exported to data warehouse tools using SAP Netweaver.
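The copying of vendor master data into transactional documents described above can be sketched as follows. Table contents and field names are invented for illustration; real SAP tables are far more elaborate.

```python
# Master data vs. transactional data: when a purchase order is created,
# relevant fields are copied from the vendor master record into the
# order document, so the order stays self-contained and consistent.
vendor_master = {
    "V-100": {"name": "Acme Metals", "currency": "EUR", "payment_terms": "NET30"},
}

purchase_orders = []  # the "transactional data table"

def create_purchase_order(vendor_id, material, quantity):
    vendor = vendor_master[vendor_id]       # look up the master record once
    order = {
        "vendor_id": vendor_id,
        "vendor_name": vendor["name"],      # copied from the master table
        "currency": vendor["currency"],     # copied from the master table
        "material": material,
        "quantity": quantity,
    }
    purchase_orders.append(order)
    return order

po = create_purchase_order("V-100", "STEEL-PLATE", 20)
print(po["vendor_name"], po["currency"])  # Acme Metals EUR
```

This is the pragmatic benefit noted in the text: the user entering the order supplies only vendor ID, material and quantity, while the copied fields arrive automatically and consistently from the master data.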
The new SAP HANA architecture is providing an in-memory database solution that can integrate transactional data and analytical queries in the same environment.

Features supporting empirical quality of ERP data

The presentation of data within ERP systems is normally through traditional forms and tables. Tools exist to extract the overall process structure of the events and transactions (such as process mining tools), which can in this way visualize not only the intended process, but also the process that is actually performed. Note that it is not particularly easy to do process mining against SAP. Van Giressel (2004) concludes that although SAP R/3 logs all required data for process mining, it is not logged in a suitable manner. On the other hand, successful attempts at process mining exist (Ingvaldsen, 2011), but for this to be done automatically, statistical techniques are necessary in addition to the log data and the reference model. A so-called reference model (often described as SAP's process ontology) exists to document all the standard functionality of an ERP system. It is delivered as part of the system environment and specifies the total capabilities of the application. In the first round, an implementation project may compare the reference models of different applications as part of the vendor selection process. As the project proceeds, the model is used to

select which modules and which transactions are relevant;

identify missing transactions in the application;

verify how the transactions can be used as part of business processes;

estimate and plan the customization and add-ons needed.

For SAP R/3, such reference models are provided in the EPC language (Scheer, 1999), a visual process modeling language.

Features supporting syntactic quality of ERP data

ERP data follows the data model of the ERP system, which is rigorously defined and can also be accessed from the outside, although the data dictionary is not adaptable/responsive to external data models. The data is implemented using relational database technology. You can export and import data in simple, standard formats such as .csv.

Features supporting semantic quality of ERP data

ERP systems are often looked upon as primarily being process-oriented systems, with a large emphasis on process information. Information on products, organizational structure and roles is usually also captured, whereas information about the business rules applied is less easy to extract. Business rules and policies are usually not explicitly stored in an ERP system, even though they are reflected in the configuration of system components. Tool information (e.g. which tool is used in which steps) is typically not emphasized, at least not conceptually. Completeness of data is often enforced in the tool, whereas validity is more difficult to enforce, especially when people have to enter data that is not for their own use, but for the use of someone elsewhere in the organization. On the other hand, since modules share a common data dictionary, an ERP system can to a large extent verify the consistency of data across modules and business areas.

Features supporting pragmatic quality of ERP data

The majority of research on ERP systems has been on the phases of configuration, implementation and deployment.
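The flat-file exchange mentioned under syntactic quality can be illustrated with a small CSV round trip using Python's standard library. The column names are invented for illustration; real ERP extracts define their own layouts.

```python
import csv
import io

# Round-trip a small table through CSV, the kind of flat export/import
# an ERP system typically supports for external data exchange.
rows = [
    {"material": "STEEL-PLATE", "plant": "0001", "stock": "120"},
    {"material": "COPPER-WIRE", "plant": "0002", "stock": "75"},
]

buf = io.StringIO()  # stands in for a file on disk
writer = csv.DictWriter(buf, fieldnames=["material", "plant", "stock"])
writer.writeheader()
writer.writerows(rows)

buf.seek(0)
reimported = list(csv.DictReader(buf))
print(reimported[0]["material"])  # STEEL-PLATE
```

Note that everything comes back as strings: CSV carries no type information, which is exactly why the text characterizes it as a simple format rather than a semantically rich one.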
Researchers have noted that there is a need for more focus on better use of the systems and continuous improvement of the systems in use (Botta-Genoulaz and Millet, 2005). As described under empirical quality, the user interface of ERP systems consists of traditional forms and tables, although a hierarchical structure for accessing relevant transactions is also provided. The concept of master data addresses the pragmatic support given by SAP. Master data serves two purposes: 1. consistency across transactions; 2. ease of data entry. Because all transaction data in ERP systems is linked to one or more master data tables (which the user does not even need to know exist), additional relevant information is automatically filled in and is guaranteed to be correct/consistent/complete. This means that we have a very economical way of entering, expressing, retrieving and manipulating large data sets without even knowing the complexity of the data we are dealing with. There are more than transaction codes, so some structuring of these is necessary. Decision making often involves looking at several different tables to get a complete view of a

situation. To support decision making, there is a need for graphical information that goes beyond single tables or single graphs, to get a broader picture of the system (Parush et al., 2007). An experimental study by Parush et al. (2007) found that for users of ERP systems for supply chain management, graphical visualization of data improved performance, especially for inexperienced users. Visualization enables professionals to comprehend information quickly and allows immediate action (Chorafas, 2005). Given the lack of visualization in ERP systems, juxtaposed with the recognized benefits of useful data display, visualization emerges as a requirement for improved ERP functionality. The visual process models in e.g. EPC are not available to the normal end-user (neither the reference process models nor the adapted models).

Features supporting social quality of ERP data

ERP solutions like SAP R/3 are widely used. Since ERP systems are meant to integrate data from many sources, the organizational agreement on the data found in them will often be high (at least once the tool has been operating successfully for a while). Note that the integrated nature of ERP tools means that in many cases one might need to enter data that is not only useful for one's own task, but also useful for others (Grudin, 1994), as also discussed under PLM systems.

Features supporting deontic quality of ERP data

Looking at the waste forms in lean engineering, we can say the following:

Searching: A trend of enterprise systems is that they cover more and more business functions, connecting back-office operations with front-office systems, as well as moving towards real-time tracking and monitoring of operations (Lyons, 2005). The result is that the amount of information within ERP systems is increasing. This means improved possibilities for decision making, but also poses a challenge as to how to present data in a way that is useful for those making the decisions.
On the other hand, since there is a common data model, it is potentially easier to find relevant data (at least if the data is within the ERP system). ERP systems are generally notorious for bad search interfaces: exactly because there is such a structured data dictionary, it becomes very difficult to search for records unless you know quite a lot about their structure and how data is stored using them. Generally, ERP systems are good for precise queries and bad for vague, exploratory queries.

Undercommunication: A possible effect of the disparity issue is, as with PLM systems, that some data might not be available in good quality (in the sense of being valid).

Misunderstanding: Can be a result of undercommunication.

Interpreting: The limited support for pragmatic quality might make the interpretation of data difficult.

Waiting: If one depends on others to fill in relevant data, this can result in unnecessary waiting.

Extra processing: If data has been provided but is wrong, extra processing is necessary to make it right. Handling of changes might, as with PLM systems, result in a lot of extra processing.

Important learning points for LinkedDesign

This section has illustrated that different tools used in the organization typically have different user roles and focus on different types of knowledge (e.g. from different domains, cf ). Different tools have a varying degree of explicit meta-model (data model), and this is available to a varying degree; e.g., in many export formats one loses some of the important information on product data. Even when different tools support e.g. process data, it is often process data at different granularities. The tools alone all have challenges relative to waste in lean engineering. In a Virtual Obeya environment, one would explicitly want to combine data from different sources in a context-driven manner to address these reasons for waste. Depending somewhat on the concrete knowledge sources to combine, this indicates that it is often a partly manual job to prepare for such matching. In D2.2 a first prototype is described for addressing some of these problems. One approach in D2.2 is to use the semantic mediator service (cf. Figure 4). This component strongly relies on wrappers and mappings to integrate different types of formally defined data sources into a global integrated schema. Since that process is complex and time-consuming, a specific service for semi-automatically computing mappings is developed in D2.2: the Schema Matching Service. The second kind of integration follows a less schema-dependent path of data integration based on the Smart Link Storage. In contrast to the Semantic Mediator, the Smart Link Storage strives for instance-level integration.
The Smart Link Storage is a repository for storing links between instance objects, and it allows for an integrated analysis of highly heterogeneous data sources. Even if this is an interesting approach, one should not underestimate the need for matching concepts across data sources. Also, the differing levels of agreement on data from different sources (social quality) can influence the use of this approach in practice. In both cases, a sort of intermediate storage following the structure of a core ontology, like what is illustrated in the AKM approach described in Appendix 4, is regarded as beneficial.

3.3 Knowledge creation in lean engineering environments

An essential aspect relative to the goal of minimizing waste is to be able to capture knowledge as it arises in the collaborative engineering activities, and to distribute relevant knowledge to relevant people in a timely fashion: in context and on relevant platforms. This is specifically relevant to address under-communication, misunderstanding, interpreting and waiting (although enforcing data input without a specific plan for this might result in extra processing, and increase the searching problem).

Given the sources of waste identified and prioritized by the industry partners in section 3.1, LEAP could add value by supporting a continuum from informal, ad hoc collaboration (low-threshold mechanisms for capturing knowledge created as a result of ad hoc creativity in teams) to more formalized engineering processes. A combination of pull-based (search) and push-based knowledge provision mechanisms could represent added value in complex, multidisciplinary and collaborative situations such as knowledge assists, where the person(s) in need of knowledge may or may not know exactly what to look for, how to retrieve it or what data could be of interest.

Supporting knowledge creation in a Virtual Obeya

Nonaka and Takeuchi's theory of organisational knowledge creation (Nonaka, 1994) uses the following definition: knowledge is "justified true belief". It can be argued that this definition only applies to some aspects of human knowledge. Nonaka and Takeuchi tightly link knowledge to human activity. Central to their theory is that organisational knowledge is created through a continuous dialogue between tacit and explicit knowledge, performed by organisational communities of interaction that contribute to the amplification and development of new knowledge. Thus their theory of knowledge creation is based on two dimensions:

1. The epistemological dimension, which embraces the continued dialogue between explicit and tacit knowledge

2. The ontological dimension, which is associated with the extent of social interaction between individuals developing and sharing knowledge.

Nonaka and Takeuchi identify four patterns of interaction between tacit and explicit knowledge, commonly called modes of knowledge conversion:

1. Socialisation: Creating tacit knowledge from existing tacit knowledge through shared experience

2. Externalisation: Conversion from tacit (or unstated explicit) to explicit knowledge

3. Combination: Creation of new explicit knowledge from existing explicit knowledge

4.
Internalisation: Conversion of explicit knowledge to tacit knowledge.

The internalisation mode of knowledge creation is closely related to "learning by doing"; hence the internalisation process is deeply related to action. Nonaka and Takeuchi criticise previous theories of organisational learning for not addressing the notion of externalisation and for having paid little attention to the importance of socialisation. They also argue that a double-loop learning ability (cf. Argyris and Schøn, 1978) is implicitly built into the knowledge creation model, since organisations continuously make new knowledge by reconstructing existing perspectives, frameworks or premises on a day-to-day basis. When tacit and explicit knowledge interact, innovation emerges. Nonaka proposes that the interaction is shaped by shifts between modes of knowledge conversion, induced by several triggers. As depicted in Fig. 5, we have the socialisation mode starting with building a field of

interaction facilitating the sharing of experience and mental models. This triggers the externalisation mode through meaningful dialogue and collective reflection, where the use of metaphor or analogy helps articulate tacit knowledge that is otherwise hard to communicate. The combination mode is triggered by networking newly created knowledge with existing organisational knowledge, and finally, learning by doing triggers internalisation.

Figure 5: Knowledge spiral for knowledge growth

These contents of knowledge interact with each other as indicated in the spiral of Figure 5, illustrating the epistemological dimension of knowledge. Adding Nonaka and Takeuchi's ontological dimension of knowledge creation, we end up with the idealized spiral of organisational knowledge creation depicted in Fig. 6, which shows how the organisation can mobilise tacit knowledge created and accumulated at the individual level, organisationally amplified through the four modes of knowledge conversion and crystallised at higher ontological levels. Thus the authors propose that the interaction between tacit and explicit knowledge becomes larger in scale as the knowledge creation process proceeds up the ontological levels. The spiral process of knowledge creation starts at the individual level and potentially moves upwards through expanding interaction communities, crossing sectional, departmental, divisional and possibly organisational boundaries.

Figure 6: Spiral of organisational knowledge creation (axes: the epistemological dimension, from tacit to explicit knowledge; the ontological dimension, from individual via group and organization to inter-organization)

Whereas in D5.1 we looked at the interaction between collaborative tasks at the individual and small-group level, and the iterative shifts between user collaboration and automated design processes, it is in many cases also relevant to spread the knowledge thus established to the organizational level. Note that it is not a given that we want knowledge to move to the right in Figure 6. To discuss this in more detail, we base the discussion on the work on knowledge maturing from the EU FP7 MATURE project. The following is based on Kump et al. (2011).

Figure 7: Framework for Knowledge maturing (from Kump et al., 2011)

The Knowledge Maturing Model outlines the following phases (see Figure 7):

Ia. Expressing ideas: New ideas are developed by individuals, either in informal discussions or by 'browsing' the knowledge available within the organization and beyond. Extensive search and retrieval activities potentially result in loads of materials facilitating idea generation.

Ib. Appropriating ideas (individuation): New ideas that have been enriched, refined, or otherwise contextualized with respect to their use are now appropriated by the individual. Knowledge sources and new knowledge are bookmarked so that the individual can benefit from their future (re-)use.

II. Distributing in communities (community interaction): This phase is driven by social motives, such as belonging to a preferred social group or the expectation of reciprocal knowledge exchange within the community or project. A common terminology for individual contributions is developed and shared among community members.

III. Formalising (information): Artefacts created in the preceding phases are often unstructured and restricted to a limited local context. They are only comprehensible to people in this community, as shared knowledge is still needed for interpretation. In Phase III, structured documents are created in which knowledge is made more transferable, and context is made explicit with the purpose of easing the transfer to people other than those in the originating community or project.

From Phase IV on, there are two alternative paths of knowledge maturing:

IV 1. Ad-hoc training (instruction): Activities related to creating training materials out of documents that are typically not suited as learning material, as they have not been made with didactical considerations in mind. Topics are refined to ease teaching, consumption, or re-use. Tests help assess the knowledge level and select learning objects or paths. Knowledge can be used for formal training in Phase V (V1 a. Formal training (instruction)).
The subject area becomes teachable to novices. A curriculum integrates learning content into a sequence, using didactical concepts to guide learners in their learning process. Learning modules and courses can be combined into programmes used, for example, to prepare for taking on a new role.

IV2. Piloting (implementation): Experiences are deliberately collected with a test case, stressing pragmatic action: a solution is tried before a larger roll-out of a product or service to an external community, or of new rules, procedures, or processes to an internal target community such as project teams or other organizational units. Know-how can be institutionalized at the beginning of Phase V.

V2a. Institutionalising (introduction): Within an organization, formalized documents that have been learned by knowledge workers are implemented into the organizational infrastructure in the form of business rules, processes or standard operating

procedures. In the organization-external case, products or services are launched on the market.

Vb. Standardising (incorporation): This last phase covers standardization or certification. Certificates confirm that participants of formal trainings have achieved a certain degree of proficiency, or certify compliance with a set of rules that the organizations have agreed to fulfil. Standards also help connect products or services, or show that they fulfil laws or recommendations before being offered on a certain market.

Often the individual engineer works individually with tools (e.g. CAD tools) made for specific tasks. When meeting for collaboration in the Virtual Obeya, knowledge from different experts is brought together and should be presented in an interrelated way, also taking other context information into account. In such sessions, in addition to learning from each other, participants often produce new knowledge, and it is often important to be able to represent this for later use. This knowledge might be relevant on different levels (referring to the levels in the model above):

1. For the workers only, to be used as private guidelines
2. For the whole project
3. For a wider community of experts within an area
4. For the whole department or the whole organization/enterprise (different levels can be imagined here, based on the size of the organization and the organizational structure)
5. Beyond the organization, e.g. as input to existing standards in the process of further developing the standard. Other examples are being part of an extended enterprise or of a network involved in open innovation (see next section)

Note that learnings and experiences will often need to be more restricted the more widely one wants to share the knowledge.
Thus, whereas it is beneficial to capture knowledge at the source (cf. AKM principles described in Appendix 4), new knowledge captured should have different visibility depending on its nature and the goals of knowledge creation. Another dimension is the formality of the knowledge captured, which is closely related to the level of knowledge maturity one wants to achieve. It can be:

1. Informal annotations
2. Relations (e.g. as linked open data) to or between existing knowledge sources
3. Annotations linked to the context (thus potentially linked to concepts in the common ontology). In AKM, for instance, this is often the setting, using the IRTV dimensions as the structuring ontology
4. Codified knowledge (e.g. as new formal KBE rules)

5. Structured to act as a basis for ad-hoc or organizational training
6. Structured to act as a tool, product or process to be reused across the organization
7. Structured to act as input to external standards

The availability and formality of the captured data might also be temporally restricted; e.g. all new knowledge is kept within the project until the end of the project, and only then is information shared more widely, e.g. through reflection sessions shared either on the instance level (e.g. as examples of good or bad practices) or on the type level (e.g. updating some organizational methodology). The last type (process improvement based on reflection) is dealt with in the next section. In the LinkedDesign use cases, the focus is mostly on the first four areas above. For a concrete example, see for instance D9.2. In all cases it is important to be clear on the level of knowledge maturing one is on, and the ambitions to increase the level of maturity. Aiming too high will result in waste due to extra processing. Aiming too low might result in waste through under-communication, with possible waste due to undue searching, interpretation work and misunderstandings.

Important learning points for LinkedDesign

This section has looked in particular at general mechanisms for knowledge creation and how these mechanisms can be applied in LinkedDesign. Some important points are:

- Keep in mind the ambition of knowledge maturity one wants to achieve, and choose a relevant level of visibility and formality for new knowledge developed when working in the Virtual Obeya, based on the current and desired level of maturity.
- In organizations, the level of dynamics of processes differs. Whereas e.g. innovative processes can be very dynamic, other processes are more static. Whereas some processes move from dynamic to more static after one has done the same thing several times, other types of processes will always be dynamic.
To attempt to formalize these too much will probably be counterproductive. We will investigate this last aspect in more detail in the next section, when looking at how working in a Virtual Obeya can support the process improvement of an organization.
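As a concrete illustration of the formality levels discussed in this section, the sketch below shows how a captured annotation might be stored at the first three levels: as free text, as a relation between existing knowledge sources, and as an annotation linked to a concept in a common ontology. All names here are hypothetical; this is not an interface of any LinkedDesign component.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Annotation:
    """A captured piece of knowledge at one of three formality levels.
    Field names are illustrative, not from any LinkedDesign API."""
    author: str
    text: str
    source: Optional[str] = None            # level 2: an existing knowledge source
    target: Optional[str] = None            # level 2: a second source the relation links to
    ontology_concept: Optional[str] = None  # level 3: concept in a common ontology

    def formality_level(self) -> int:
        if self.ontology_concept:
            return 3  # annotation linked to context/ontology
        if self.source and self.target:
            return 2  # relation between existing knowledge sources
        return 1      # informal free-text annotation

# Hypothetical examples of increasingly formal captured knowledge
note = Annotation("engineer-1", "Check weld clearance before release")
link = Annotation("engineer-1", "derived from", source="doc:cad-42", target="doc:rule-7")
typed = Annotation("engineer-2", "clearance rule", source="doc:cad-42",
                   target="doc:rule-7", ontology_concept="irtv:Task")

print([a.formality_level() for a in (note, link, typed)])  # [1, 2, 3]
```

The point of such a structure is that the same captured item can start informal and be promoted to higher formality later, matching the intended level of knowledge maturity.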

3.4 Diagnostics frameworks and process improvements in lean engineering environments

Collaboration and knowledge sharing are primary drivers for creativity and innovation in today's networked businesses. As work is becoming more collaborative (see e.g. Chapter 3 in D5.1 or Kristensen and Kijl, 2010), both productivity and the ability to innovate depend on advances in the way engineers and other knowledge workers collaborate. The figure below illustrates modes of process improvement within an organization (inspired by work on process knowledge management (Jørgensen, 2004)).

Figure 8: Modes of knowledge reuse

Depending on the knowledge maturity of the processes in an organization, how process improvement is done varies. When knowledge exists only on the individual level, reuse is the transfer of personal experiences on the instance level to a new situation at the same level (mode 1 above). The same can happen within the community. When formalizing knowledge, one typically enters mode 2 in the figure and develops general procedures, products or rules, typically based on the experiences from a number of instances. Traditionally, this mode is carried out by humans through workshops, focus groups or other ways of gathering experiences from many instances. External reference processes might also be consulted in this mode. In some settings, one could also imagine using more automated techniques, such as process mining (van der Aalst, 2011). Process improvement in the traditional sense is mode 3: creating a new type-level process based on an existing type-level process and existing instances resulting from applying the type-level process. One might also do the improvement purely based on the type-level model, e.g. through process simulation, as indicated as mode 4. The final mode (mode 5) is relevant when a new (individual or project) task is to use the type-level process model in new concrete instances, e.g.
as guidance, but typically adapting the generic process description to the specific task. In e.g. a workflow system, this activation would be automatic. In between the automatic workflow activation and a purely manual

activation, one finds the approach of interactive activation, a mode often used in connection with emergent and interactive workflow (Krogstie and Jørgensen, 2004).

Process improvement is typically the result of a reflective process (reflection on action). Reflection might also be relevant within the task (reflection in action) for rapid adaptations, but our prime focus in this section is reflection on action, for learning and knowledge maturing. As described in Appendix 4, reflection entails adding an additional dimension to existing knowledge representations. To discuss potential tool support for reflection further, we use a model of computer-supported reflective learning currently being developed in the FP7 project MIRROR (Krogstie et al., 2012), as depicted in Figure 9. Although this model is meant to cover both individual and collaborative reflection, our main focus here is collaborative reflection.

Figure 9: Model of computer-supported reflective learning (from Krogstie et al., 2012)

Overall, when doing process improvement, it is important to ensure that relevant data for improvement is captured as part of work, to frame the improvement sessions, to have the necessary information available in the session, and to ensure the availability of the improved knowledge. Note that even if the reflection sessions are singled out above, they will often in practice be closely integrated with normal work activities. What to have available when doing

reflection is obviously very dependent on the mode of knowledge reuse. E.g. in mode 1, only instance-level information is available (typically only from the current project). In mode 3, both information on the type-level process and information on one or more instantiations of the type-level process is useful to have available (in addition to knowledge about how the knowledge was adapted to the specific project). More precisely:

1. One needs support for data relevant to reconstruct and reflect on experiences from work, e.g. to capture design decisions in the Virtual Obeya
2. Data on behaviour might also be useful (e.g. tracking what functionality in the Virtual Obeya is used or not)
3. Tools can contain reminders on suitable times for reflection, or provide information relevant for the decision to reflect on action (e.g. when situations of waste are discovered during work)
4. In addition to the trigger for reflection, it can be important to have sufficient information about the context of the work that triggered the reflection
5. When the people involved in reflection are not present at the same place, ways to share the experience and other material relevant for the reflection, e.g. using the Virtual Obeya, can be important
6. Relevant data from different sources might be important to bring into the reflection process, both on the instance and the type level, as indicated above
7. To make sense of the experiences, one might need information about the surrounding context
8. Conclusions from reflection might be structured in a certain form (e.g. A3 sheets) or according to concepts found e.g. in a common ontology or terminology, using this to structure the newly developed knowledge
9. The results from the reflection should be captured, e.g. in a knowledge management system, or linked to the original knowledge sources
10. The whole process should be embedded in methods for using the working environment (in our case the Virtual Obeya)
11.
Also the way of performing reflection sessions can be improved, using experiences from how they have been conducted
12. The results from the reflection must be made available in the normal work tools, or influence the normal work practice in the organization
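Points 1-3 above (capturing work data and triggering reflection when waste is observed) can be sketched in a few lines. The class and field names below are hypothetical illustrations of the principle, not an actual Virtual Obeya interface.

```python
from collections import Counter

class ReflectionLog:
    """Hypothetical event log for a collaborative session: records design
    decisions and observed waste, and suggests a reflection session once
    enough waste has accumulated (cf. points 1-3 in the list above)."""
    def __init__(self, waste_threshold: int = 3):
        self.events = []
        self.waste_threshold = waste_threshold

    def record(self, kind: str, detail: str):
        # kind is e.g. "decision" or "waste"; detail is free text
        self.events.append((kind, detail))

    def reflection_due(self) -> bool:
        counts = Counter(kind for kind, _ in self.events)
        return counts["waste"] >= self.waste_threshold

log = ReflectionLog()
log.record("decision", "use bracket design B")
log.record("waste", "duplicate search for load data")
log.record("waste", "waiting for missing CAD file")
log.record("waste", "misunderstood requirement, rework")
print(log.reflection_due())  # True
```

The captured events serve double duty: they are the trigger for reflection and, later, the raw material for reconstructing the work experience during the reflection session itself.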

3.4.1 Collaborative diagnostics: Tools for lean engineering collaboration system analysis and optimisation

As we understand from the above model, process improvement, and the reflection needed to perform it, is anchored in individual and collaborative work activities. The value of collaboration in engineering contexts, as observed in the LinkedDesign industry partners' respective organisations, is strongly related to collaboration as a purposeful activity. Furthermore, collaboration is dynamic; it evolves over time and co-exists in a large and growing number of different forms. Effectively supporting high-performance collaborative work patterns across a variety of different engineering contexts requires competencies for both individual knowledge workers and process owners. Moreover, performance is typically linked to collaboration on both a strategic and an operational level, both for work and for reflection on work for learning and process improvement. While technology is an important enabler of new collaborative work forms with attractive characteristics, technology alone is not sufficient to enable new, high-performance lean engineering practices. Broader change initiatives, including smart combinations of people resources, technology, work processes, business culture and organizational models, are needed to fully exploit the value of collaboration. This is further complicated by the current lack of well-known industrial frameworks for A) evaluating the impact of, and B) systematically improving, collaboration. There are, however, a number of tools and diagnostic frameworks that can be applied as decision support to make informed decisions that reduce risk and improve the chance of success, e.g. Hansen (2009), Rosen (2007) and Mattesich et al. (2001); additional information can be found in the table below.
Promote awareness: Finding relevant examples from a range of industries to identify a suitable level of ambition
Define precise collaboration objectives: Using available knowledge to define a set of precise, actionable objectives that link collaboration to overall business objectives
Identify collaboration barriers: Using state-of-the-art diagnostics tools to identify and benchmark collaboration barriers
Define precise collaboration requirements: Using a comprehensive database as an external resource when working systematically with eliciting user requirements
Additional analyses & success factors: Additional fit-for-purpose analysis; overall collaboration system and collaboration success factors
Identify solutions: Decision support with respect to identifying solutions that fit the problem, whatever the problem may be (process, organisation, technology, culture)
Realise benefits: Decision support with regard to ensuring learning is put to use, through alignment of chosen solution(s), smooth operations and continuous improvement

Table 5: Process template (objective / process step and description) for improving user collaboration

Appendix 7 (Collaborative diagnostics toolbox for consideration) contains a repository of tools and activities for each of the objectives / process steps above, which can be used as decision support when working to systematically improve user collaboration, and when addressing commonly occurring sources of waste in collaboration. When exploring and piloting in-context collaborative work in environments developed for effective and efficient knowledge sharing, visualization and joint decision making, these tools can add value by accurately pinpointing non-technology collaboration barriers and appropriate solutions, improving overall performance.

Succeeding with collaboration is a complex undertaking, and few companies succeed in exploiting the full potential of deep collaboration. One of the main reasons for this is that collaboration suffers from major coherency disconnects:

1. First, improvement efforts often fail because strategic initiatives and decisions are not followed up by operational measures: instating a policy that collaboration constitutes a main element in running the business does not lead to change unless it is followed by specific, clearly defined work practices that spell out how to use collaboration operationally to achieve business objectives. Collaborative engineering must not remain a loosely defined, ambiguous term; it must be given a clearly defined content.

2. Second, improvement efforts often fail because collaboration is seen and treated as something domain-specific rather than as an enterprise-wide concept. This happens in part because the provision side has a strong position, and is able to influence terminology and shape managers' thinking on collaboration. Collaboration does not equal a single collaboration tool or platform, or even a set of tools or platforms; indeed, most activities in engineering companies today include some collaborative aspect(s).
Unless these misconceptions are cleared up, and a proper understanding of how to facilitate and manage collaboration as a broad set of business activities is in place, it will be difficult to reap the full benefits of collaboration and, equally, to avoid misapplications of collaboration (which can reduce performance). An analysis of collaboration patterns, to gain further insight into various collaboration instances (including, but not limited to, outgoing requests for collaboration and related assistance by the knowledge provider, in line with the activities described in the table above and in Appendix 7), will be further explored in D9.3, together with new user collaboration concepts as a strategic approach to reducing waste, covering both intra-process and inter-process perspectives: an exploration of lean product development principles as a systematic approach to reduce waste and improve performance. This is based on an analysis of lean product development principles based on semantics and front-end value drivers, using a Virtual Obeya to reduce common sources of waste in knowledge-intensive user collaboration processes. This will in part build on previous work as documented in D5.1 (Figure 10 'From Obeya to Virtual Obeya', Table 5 'Requirements: Obeya factors promoting efficiency and effectiveness', and the chapter 'Productivity Dimensions').

A final aspect here is the support of improvement across organizations, achieving externally visible innovation. As described in Pisano et al. (2008), there are several possible collaboration modes:

- In the open, hierarchical mode, anyone can offer ideas, but your organization defines the problem and chooses the solution
- In the open, flat mode, anyone can solicit and offer ideas, and no single participant has the authority to decide what is or is not a valid innovation
- In the closed, hierarchical mode, your organization selects certain participants and decides which ideas get developed
- In the closed, flat mode, a selected group is invited to offer ideas; participants share information and IPR and make critical decisions together (a consortium)

A strong form of such knowledge sharing is promoted in the area of open innovation, which focuses on an open, flat mode. A fundamental problem with the way knowledge work is currently organised in many areas is that there is an enormous waste of ideas. For every idea that leads to something valuable, there are a huge number of ideas that do not lead to anything because they are abandoned at some stage before implementation. Some are dropped simply because they are bad ideas. Others may actually have been good, but the originator did not have the capacity or competence to follow them through. These abandoned ideas might have led to a valuable outcome if taken over by others, but this could not happen because nobody else knew about them, as product or process ideas are often well-guarded secrets of the originating organization. The naive answer to this would be a shift from proprietary idea generation within each organization to full transparency of knowledge artefacts across organizations. This could avoid the double work and wasted effort discussed above, as organizations could build on each other's ideas and artefacts, and learn from each other's knowledge about successes and failures.
This is what has been termed "open innovation" (Chesbrough, 2011), meaning that companies should use external as well as internal ideas, and exploit both external and internal paths to the market, when advancing their own products and business models. The total productivity of the participants is likely to increase due to such shared usage of ideas and the synergies achieved by enhanced connections between experts working on related problems. Note that there is a trade-off, since open networks will produce many ideas, and the screening of ideas can be costly. Open innovation will have to rely heavily on ICT, facilitating virtual communities of nomadic, human/organizational actors co-working on partially shared digital artefacts. The term digital ecosystem has recently been used to generalize such communities, emphasizing that their actors constantly interact and cooperate with other actors in both local and remote ecosystems. A digital ecosystem (Krogstie, 2012b) is a metaphor inspired by natural ecosystems, describing a distributed, adaptive, and open

socio-technical system. Such systems are characterized by self-organization, scalability and sustainability, and, at their best, provide both economic and social value. Looking back at the MATURE model, we see that the third (organizational) level of knowledge maturing can be side-stepped in an open innovation model, going directly from the internal community to the external world (much like how academic research is done).

Important learning points for LinkedDesign

This section has looked at some generic approaches to process improvement and how they can support the process improvement of an organization through a Virtual Obeya. Ideas for functionality in this regard relate to:

- Functionality to support collaborative reflection in a Virtual Obeya, related to the MIRROR reflection model
- Functionality to support collaborative diagnostics, described in detail in Appendix 7

4 Collaborative planning through effective order management (T5.3)

The goal of Task 5.3 is to provide solutions for project supply chains through a platform for collaboration in engineering projects. It is important to keep in mind the differences between production strategies and contexts when developing such solutions. Research and practices from one production context may not be applicable to other production contexts (Cox, 2004). For instance, much of the operations management literature is based on high-volume industries, often classified as make-to-stock companies. In comparison, literature on engineering production contexts, characterized by lower volumes and higher uncertainty, is relatively scarce. The engineering production situation is often referred to as engineer-to-order (ETO), and is classified through having an order penetration point (OPP) 5 at the design stage of its operations. Most of the operations management and production literature classifies companies into a manufacturing continuum spanning four types: make-to-stock (MTS); assemble-to-order (ATO); make-to-order (MTO); and engineer-to-order (ETO), as illustrated below (Wortmann, 1992; Amaro et al., 1999; Olhager, 2003).

[Figure 10: Production situations and the order penetration point (Olhager, 2003, p. 320). The product delivery process runs through design, fabrication and procurement, final assembly, and shipment; each strategy (MTS, ATO, MTO, ETO) has its OPP at a successively earlier stage, with ETO decoupling already at design. Dotted lines mean forecast-driven, solid lines mean customer-order-driven.]

It should be noted that, often, a company will not fit perfectly into one production situation; rather, a company may have characteristics from various production situations (Porter et al., 1999).
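The classification in Figure 10 can be expressed as a small lookup: everything upstream of the order penetration point is forecast-driven, everything downstream is customer-order-driven. The sketch below is an illustrative encoding of that idea (stage names and OPP indices follow the common textbook placement), not a model taken from the cited literature.

```python
# Stages of the product delivery process, in order (after Olhager, 2003)
STAGES = ["design", "fabrication and procurement", "final assembly", "shipment"]

# Order penetration point per strategy: index of the first
# customer-order-driven stage (illustrative placement)
OPP = {"MTS": 3, "ATO": 2, "MTO": 1, "ETO": 0}

def driven_by(strategy: str, stage: str) -> str:
    """Return whether a stage is forecast-driven or customer-order-driven
    for the given production strategy."""
    i = STAGES.index(stage)
    return "customer order" if i >= OPP[strategy] else "forecast"

print(driven_by("ETO", "design"))          # customer order
print(driven_by("MTS", "final assembly"))  # forecast
```

Moving the OPP index down the dictionary is exactly the move from MTS towards ETO: ever more of the process becomes driven by a concrete customer order, with the uncertainty that entails.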
Further, Caron and Fiore (1995) note that there is also a span within the engineering classification alone, arguing that there is a difference between engineering companies conducting operations at the company's premises and those of a more contractual nature, moving from site to site doing projects. Nonetheless, Caron and Fiore (1995) argue that these companies share many characteristics: the processes can be characterized as non-repetitive or pulse processes, with discontinuity aspects such as

5 The order penetration point is also referred to as the customer order decoupling point (CODP), e.g. Vollmann et al. (2005)

temporariness, uniqueness and multi-functionality. As such, many firms can be characterized as engineer-to-order; ETO companies can range from being highly integrated with in-house manufacture to pure design-and-contract organizations (McGovern et al., 1999). In an extreme form, an ETO setting consists of a multi-project situation where the form of the finished product only becomes apparent during the execution of the project (Wortmann, 1995). We find the inherent project and engineering nature of ETO companies a natural basis for further work. The context of the report is hence set to be of an engineering nature, applying ETO theory where it seems fit. The use of such theory enables us to include Task 5.3's elements of engineering, projects, supply chain, uncertainty and engineering change management.

4.1 Literature review - project supply chain collaboration

The key challenges for many enterprises today are the complexity of products and the uncertainty of processes. Customer requirements are subject to frequent changes, calling for enterprise agility and flexibility. At the same time, companies must uphold high efficiency to keep up with increasing global competition. An increasing demand for high flexibility, short and precise lead times, high quality and low costs represents a significant challenge for companies managing operations in networks. This involves changes from customers and suppliers as well as internal changes caused by lack of materials or resources, changed priorities etc. If decisions are made without an overview of options and consequences, this might lead to high inventories, long lead times, low customer satisfaction or low resource utilization. To meet these conditions, enterprises require better tools for collaborative planning.
The literature review is structured in three main areas: overall collaborative planning models and systems, order management, and change management.

Collaborative planning

The term collaborative planning is used to describe planning processes that span multiple planning domains. This means connecting local planning processes by sharing relevant data between the planning domains, and creating a common, mutually agreed-upon plan. Thus, input data is updated faster and planning results become more accurate. Collaborative planning concepts can be applied to the planning processes that interface with customers (e.g. sales planning) and to those that interface with suppliers (e.g. procurement planning). These are examples of inter-organisational collaboration, or supply chain collaboration. Collaborative planning is also relevant in an intra-organizational context. Particularly larger enterprises have multiple business units, departments, engineering units, factories and sales offices, spread across geographical locations and time zones. The three LinkedDesign industrial case companies, Volkswagen, Comau and Aker Solutions, are good examples of

businesses with dispersed activities that require good solutions for intra-organisational collaboration. Further, collaborations can be distinguished by the domains that are exchanged and collaboratively planned. The following list of collaboration types shows the domains or areas where collaborative planning can be needed:

Planning domain | Main driver | Material or service related
Demand collaboration | Supplier | Material-related
Inventory collaboration | Supplier | Material-related
Procurement collaboration | Customer | Material-related
Capacity collaboration | Customer | Service-related
Transport collaboration | Customer | Service-related
Integrated projects | Network | Material and service related
Multi-tier collaboration networks | Customer | Material and service related

Table 6: Types of collaborative planning (adapted from Kilger & Reuter, 2005)

State-of-the-art collaborative models

Several collaborative models for coordinating network activities have been developed, such as collaborative planning, forecasting and replenishment (CPFR), vendor-managed inventory (VMI), and automated replenishment programs. The aim of these models is to achieve seamless inter-organisational interfaces of materials and information flow by specifying control principles and operational models (Ivert and Johnson, 2007; Holweg et al., 2005). CPFR and VMI are described in further detail below.

Vendor-managed inventory (VMI) is a collaborative model where a vendor (supplier) of a product takes full responsibility for maintaining an agreed inventory of material at the customer's premises. VMI solutions are based on frequent information sharing between the supplier and customer, often using Electronic Data Interchange (EDI) formats, EDI software and statistical methodologies to forecast and maintain correct inventory in the supply chain (Stadtler and Kilger, 2005).
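The VMI control principle described above, where the supplier monitors the customer's stock and replenishes it toward an agreed level, can be sketched in a few lines. This is a toy illustration of the principle only; real VMI solutions run over EDI messages and statistical forecasts, and the parameter names below are invented for the example.

```python
def vmi_replenishment(stock: int, reorder_point: int, target_level: int) -> int:
    """Supplier-side decision: quantity to ship so the customer's inventory
    returns to the agreed target level once stock falls to the reorder point.
    A minimal order-up-to policy, illustrative only."""
    if stock <= reorder_point:
        return target_level - stock
    return 0

# Hypothetical agreed policy: replenish to 100 units whenever
# the customer's stock drops to 30 or below
print(vmi_replenishment(stock=25, reorder_point=30, target_level=100))  # 75
print(vmi_replenishment(stock=60, reorder_point=30, target_level=100))  # 0
```

The essential shift relative to traditional ordering is visible even in this sketch: the decision logic runs at the supplier, on data shared by the customer, rather than being triggered by a purchase order from the customer.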

Collaborative planning, forecasting and replenishment (CPFR) is a business practice that combines the intelligence of multiple trading partners in the planning and fulfilment of customer demand. By linking sales and marketing best practices, such as category management, to supply chain planning and execution processes, CPFR aims to increase availability while reducing inventory, transportation and logistics costs. Since the 1998 publication of the VICS CPFR guidelines, over 300 companies, especially within retail and wholesaling, have implemented the process. Case studies of CPFR projects document improvements of 2-8% in in-stock availability in stores, and inventory reductions of 10-40% across the supply chain (VICS, 2013).

System support for collaborative planning

In the past decades, there has been a continuous evolution in manufacturing planning and control approaches. The milestone systems in this evolution started with simple reorder point (ROP) systems, and advanced through materials requirement planning (MRP) systems, manufacturing resource planning (MRP-II) systems, manufacturing execution systems (MES) and enterprise resource planning (ERP) systems, to advanced planning and scheduling (APS) systems and extended ERP (Mabert et al., 2000; McClellan, 1997; McDermott, 1999; Orlicky, 1975; Vollmann et al., 2005; Ivert et al., 2010). ERP systems, together with manufacturing execution systems (MES), are the dominating planning systems in industry today. They have a single-company focus and mainly support centralized production planning and control (Alvarez, 2007). ERP and its related systems, like customer relationship management (CRM) and supply chain management (SCM), limit the expandability and re-configurability of manufacturing systems, as the evolution has focused on improvements inside the enterprise and has limitations in inter-enterprise support and collaboration with customers and suppliers (Vollmann et al., 2005; Davenport, 2000).
ERP systems are also traditionally centralised and suited for sequential manufacturing process planning, scheduling and control mechanisms, and lack the flexibility to respond to changing production styles and high-mix, low-volume production environments (Shen et al., 2006). But global competition and rapidly changing customer requirements are forcing major changes in the production styles and configuration of manufacturing enterprises, which must collaborate with each other, their customers and suppliers to leverage each other's competences and remain competitive. This has increased the need for moving towards borderless collaborative enterprises (Bénaben et al., 2006), which imposes challenges of coordination and integration within and across enterprises. Lack of information sharing and transparency is often recognised as a key issue for companies that seek to increase coordination and integration in their manufacturing planning and control (Vollmann et al., 2005). A highly transparent system, where relevant information can be accessed in real time, can ensure that decision making and execution are consistent throughout the planning processes (Strandhagen et al., 2006).

Planning in supply chains faces the main challenge that no coordination system fully shares all relevant information among the companies involved (Alvarez, 2007). All relevant information from the involved companies needs to be integrated, up to date and accessible in real time from anywhere in the network. Each company should ideally be able to see the real-time situation in the network, downstream as well as upstream (Handsfield and Nichols, 2002). Although information visibility and system integration are regarded as keys for enabling collaboration, very few networks have successfully achieved this [10]. Advanced Planning and Scheduling (APS) is the most comprehensive system for supply chain planning and control today. APS systems are based on the principles of hierarchical planning and make extensive use of solution approaches known from mathematical programming, algorithms and meta-heuristics. They perform optimization and/or simulation of planning and scheduling decisions under finite capacity: they assume a finite capacity of materials and work centres and schedule operations by optimizing against these constraints to meet an objective (APICS, 2007). The main focus is on supporting the material flow across a supply chain, with planning processes related to procurement, production, transport, distribution and sales (Stadtler, 2005). There are several commercial APS software packages available, e.g. from SAP, Oracle, i2, Lawson, Manugistics and JDA. APS provides planning functionality that supplements existing ERP systems. The ERP system handles the basic activities and transactions such as customer orders and accounting, whereas the APS system focuses on the operational activities related to decision-making, planning, scheduling and control of a supply chain and related management activities, which are not explicitly well covered in ERP systems.
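The finite-capacity principle that distinguishes APS from classic infinite-capacity MRP loading can be illustrated with a deliberately simplified sketch. Commercial APS engines use mathematical programming and meta-heuristics far beyond this greedy rule; the job names, single work centre and capacity figures here are hypothetical:

```python
def forward_schedule(jobs, capacity_per_period, horizon):
    """Greedy finite-capacity loading of jobs onto one work centre.

    jobs: list of (job_name, required_hours) tuples, in priority order.
    capacity_per_period: available hours in each planning period.
    horizon: number of planning periods considered.
    Returns {job_name: list of periods used}; a value of None marks a
    job that cannot be fitted within the horizon (a bottleneck signal).
    """
    load = [0.0] * horizon          # hours already booked per period
    plan = {}
    for name, hours in jobs:
        remaining, periods = hours, []
        for t in range(horizon):
            free = capacity_per_period - load[t]
            if free <= 0:
                continue            # period fully booked, try the next one
            used = min(free, remaining)
            load[t] += used
            remaining -= used
            periods.append(t)
            if remaining <= 0:
                break
        plan[name] = periods if remaining <= 0 else None
    return plan

# Two orders competing for 8 hours of capacity over 3 periods:
plan = forward_schedule([("order_A", 10), ("order_B", 6)], 8, 3)
```

Unlike an infinite-capacity MRP loading, which would book both orders into period 0 regardless of load, the sketch spreads order_A over periods 0 and 1 and pushes order_B into the remaining capacity of period 1.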
Unlike ERP, APS tries to create feasible, near-optimal plans across the supply chain, while taking potential bottlenecks into consideration. APS has the capability to simulate different scenarios for decision support, and to plan and schedule on-line as well as off-line. Planning and scheduling is by nature a proactive process, which can initiate an event in another area of the business or at partners based on the workflow control. Although APS aims at automating and computerizing the planning processes by use of simulation and optimization, decision-making is still done by planners with insight into the particular supply chain, know-how on the system constraints and, equally important, a feeling for the feasibility of the created plans. Thus, APS aims to bridge the gap between supply chain complexity and day-to-day operative decisions. This requires, however, that planners are able to model and set up decision rules for the planning and optimization. APS models easily become quite large and complex (Stadtler, 2005), and APS systems are best suited to manage complexity associated with product variety, volumes and global operations, while companies with simple products or narrow product lines may find negative returns from APS systems due to the additional effort required to manage them (Setia et al., 2008). There are several problems involved in implementing and using APS software (Ivert & Jonnson, 2011). APS systems have a high entry level and require that decision makers have a general understanding of optimisation and of how data is structured. Often external consultants are required to set up and implement the system. The planning requires very high data quality, and discipline from users in the supply chain, who have to update data and parameters. Complex models make it difficult to interpret the results and detect errors, and plans generated by the APS system might contain errors or not be considered feasible by the planners. In addition, APS systems are not very user-friendly, which leads to limited use of the system functionality and to the use of parallel systems (Ivert & Jonnson, 2011). According to Jonnson et al. (2007), APS systems may yield significant benefits if they are used properly, e.g. improved decision support, reduced overall planning time, cost savings, reduced inventory levels and increased customer satisfaction. A study by Funk (2001), however, showed that only 20% of the APS installations investigated were successful (based on a threshold of achieving 70% of the projected gain to count as a success).

4.1.2 Order management

In this section, order management and its importance in engineering projects are first described in general. A distinction between order acquisition and order fulfillment is made before the two phases are described in more detail: the goal, tasks and risks/challenges of each phase, together with the need for information sharing, cross-functional integration and/or coordination. This is followed by considerations of order management in a supply chain perspective, and a presentation of technology support for order management. Finally, a user scenario describing the need for information in the order acquisition phase is presented.

Introduction to order management

In engineering projects, products are manufactured by means of unique engineering design or substantial customization, based on orders specifying what to produce and when it should be finished. Order management involves creating and maintaining this information about product specifications and the promised delivery dates (Tenhiälä and Ketokivi 2012). In its most general form, the order management process starts when a company receives orders from a customer and ends when the final goods are delivered (Lin and Shaw 1998). Customized products, whatever the degree of customization, can only be made, or at least finished, to order (Amaro, Hendry et al. 1999). The substantial customization in engineering projects makes order management a key activity in meeting customer demand in an effective and efficient manner. At the same time, a higher degree of customization makes it increasingly difficult for the organization to collect, store and process the information describing customer orders (Forza and Salvador 2002). As such, there is a need for systems that support the engineering project organization in managing orders in a clear and accessible manner.
In general, order management can be divided into two phases: order acquisition and order fulfillment (Forza and Salvador 2002). The latter is sometimes used interchangeably with the overall term order management (Waller, Woolsey et al. 1995; Croxton 2003); however, as the two phases have different challenges (Tenhiälä and Ketokivi 2012) the distinction between order acquisition and order fulfillment is adopted through the remainder of this section. The order management process is often viewed as transactional and part of the logistics function within a firm (Croxton 2003); however, as will be evident in subsequent parts, order management activities are not limited to a single function in an engineering project organization. Further, order management is argued to consist of strategic as well as operational elements. For example, the order management process has to be designed in terms of configuring the network, establishing procedures and practices and deciding on the role of technology in the process, which may be considered to be at a strategic level (Croxton 2003). The role of technology in order management will be treated later in this section.

Order acquisition

In the order acquisition phase, the goal is to obtain an order whose terms are a consensus between the project organization and the customer, by eliciting customer needs and communicating available options (Tenhiälä and Ketokivi 2012). In an engineering setting, this will typically happen through two successive stages: marketing and tendering. In the marketing stage, the decision of whether or not to respond to an invitation to tender is made based on customer requirements, commercial factors, the organization's ability to compete and the likelihood of success (Hicks, McGovern et al. 2000). If the organization decides to respond to the invitation, preliminary development of the conceptual design and definitions of major components and systems are conducted, including technical features, delivery terms, price and commercial terms (Hicks, McGovern et al. 2000). There are several risks associated with the order acquisition phase. Muntslag (1994) makes a distinction between three types of order-dependent risk in ETO situations: technical (or quality) risk, time risk and financial risk. Technical risk refers to the situation where a product cannot be technically produced, which necessarily leads to more product engineering and detailed design. Time risk is the risk of encountering a throughput time in engineering and manufacturing that is longer than what was estimated in the tendering stage, whereas financial risk is the risk of the costs of engineering and production being higher than estimated in the tendering stage. Financial risk may result from technical and time risk (Muntslag 1994). These risks are present because order characteristics such as routings, material and machine requirements are usually not fully known at the stage of order acceptance, making it extremely difficult to measure the impact a new order has on the production system (Ebben, Hans et al. 2005).
According to Tenhiälä and Ketokivi (2012), typical risks include customers becoming confused by the offered variety and the manufacturer making mistakes in configuring the products. These may be considered technical risks. Several of the above-mentioned risks may be removed, or at least severely reduced, through information sharing and cross-functional coordination or integration. When responding to a tender, sales personnel must, for example, be able to ensure both technical viability and delivery time feasibility (Salvador and Forza 2004), referring to technical and time risk (Muntslag 1994), respectively. This necessitates an effective flow of information between sales/marketing and manufacturing/engineering functions (Sawhney and Piper 2002; Zorzini, Corti et al. 2008; Tenhiälä and Ketokivi 2012). Such coordination is often lacking due to the differing objectives of sales and manufacturing (Kingsman, Worden et al. 1993; Danese and Romano 2004; Ebben, Hans et al. 2005; Pandit and Zhu 2007): sales/marketing wish to maximize the number of orders received, adapting quickly to shifting market demand and cutting quotation prices and delivery times. Manufacturing, however, prefers a stable and smooth workload over time, to cut product, overhead and inventory costs (Kingsman, Worden et al. 1993). Overcoming this type of sub-optimization by increasing the coordination between the functions is likely to result in more realistic lead times quoted to the customer (Konijnendijk 1994; Zorzini, Corti et al. 2008) and fewer production planning problems (Kingsman, Worden et al. 1993; Konijnendijk 1994; Ebben, Hans et al. 2005), reducing the time risk. McGovern et al. (1999) argue that a way to reduce costs is to involve the purchasing function in design and tendering decisions. Burt and Doyle (1993) found that % of total avoidable cost is controllable at the design stage. This implies that early involvement of the purchasing function in tendering and design is essential to reduce cost, which in turn reduces the financial risk (see Muntslag 1994). Hicks et al. (2000) argue that such an integration of purchasing needs to be in place in order to do proactive purchasing. Purchasing holds knowledge about suppliers' capabilities and performance, and about its own inventory status, and can as such mitigate technical, time and financial risk caused by marketing operating with insufficient information (Danese and Romano 2004).

Order fulfillment

Given that the order acquisition phase led to the generation of a customer order, the purpose of the order fulfillment phase is filling, delivering and servicing what is ordered within the agreed delivery date (Croxton 2003). Possible activities carried out in this phase are order entry, routing, assembly and picking, shipping, installation, invoicing and collection (Waller, Woolsey et al. 1995). The activities carried out in the order fulfillment phase are mainly operational and straightforward. The risks associated with the order acquisition phase are, however, carried over into the order fulfillment phase, where inaccurate estimates of product specifications and delivery due dates may become evident. The main challenge in the order fulfillment phase is to cope with occurring modifications to product specifications and delivery dates (Tenhiälä and Ketokivi 2012). Such change orders are a common feature for ETO companies (Riley, Diller et al.
2005), and the capability to respond to these short-term dynamics is a prerequisite for success in many engineering organizations (Little, Rollins et al. 2000). Krajewski et al. (2005) present three measures to cope with this: (1) the use of supply contracts to get tighter control over a buyer's demand changes; (2) the use of suppliers' capabilities in adaptive production scheduling; and (3) the use of postponement. It should be mentioned that change orders are not always a disadvantage for the project; they may as well result in additional income, cost savings or performance improvements (Terwiesch and Loch 1999). Many organizations that conduct order-driven manufacturing feel pressure to approve changes because freezing configurations and delivery schedules in the order acquisition phase is not considered acceptable customer service (Tenhiälä and Ketokivi 2012). For the same reason, charging extra for change orders is not a matter of course. As for the order acquisition phase, lack of communication between marketing and other functions may lead to capacity and materials shortages, which may decrease the manufacturer's reliability (Hanna, Camlic et al. 2004). Finally, it is worth mentioning that not all change orders are initiated by the customer. There may also be engineering changes proposed by designers that need to be communicated to e.g. sales/marketing, production planning and the customer (Danese and Romano 2004). As such, tools supporting order fulfillment should take into account changes initiated internally as well as changes imposed by the customer.

Order management in a supply chain perspective

In the demand-driven nature of supply chain management, where the ultimate aim is to improve the competitiveness of a supply chain as a whole, the starting point of supply chain planning is available and planned customer orders (Stadtler 2005). Meeting the requirements of the orders in a timely manner is key to effective supply chain management (Lambert and Cooper 2000). Order fulfillment is one of the eight supply chain processes identified by the Global Supply Chain Forum, and is a key process in managing the supply chain (Croxton 2003). As is evident from the above, the actors involved in order management activities are not limited to various functions within a company, but also cut across other supply chain actors, i.e. tier suppliers and customers (Cooper, Lambert et al. 1997), see Figure 11.

Figure 11: Integrating business processes across the supply chain (adapted from Lambert & Cooper, 2000)

While managers often view order management as a logistical activity, Croxton (2003) argues that it is "the integration with other functions in the firm and other firms in the supply chain that becomes key in defining order fulfillment as a supply chain process" (p. 22). This may, however, not be straightforward for all production situations. According to Ireland (2004), demand regularity and power and dependence characteristics dictate which supply chain management techniques are suitable and which are not. Cox (2004) further states that what is ideal practice for an industry characterized by high volumes is not necessarily feasible for an industry having other characteristics, thus limiting the potential use of identified best practice.
This is supported by Briscoe and Dainty (2005), who find that the large number of supply chain partners and the significant level of fragmentation in an engineering production situation may constrain supply chain integration. Further, as manufacturing practice is increasingly outsourced, the order fulfillment process is likely to be executed throughout supply chain networks (Lin and Shaw, 1998), which again increases the level of fragmentation. As emphasized above, information sharing, cross-functional integration and coordination are important in both the order acquisition and order fulfillment phases. In a supply chain perspective, this means that there needs to be a flow of information between the customer and any required suppliers of materials or components. For example, if the product is dependent upon a component made by a supplier, the supplier's capacity should perhaps be visible prior to agreeing on a delivery due date, in order to avoid potential over-expenditure. However, supply chain visibility does not mean sharing all information with all partners in the supply chain, but rather that the shared information should be relevant and meaningful. The level of supply chain visibility is determined by the extent to which the shared information is accurate, trusted, timely, useful and in a readily usable format (Bailey and Pearson 1983). Visibility across the supply chain is a key element in supply chain competition, and ideally, all members in a supply chain should have access to updated information and performance figures regarding the main processes of their partners (Caridi, Crippa et al. 2010). Further, supply chain transparency should be sought; that is, sharing data regarding current order and production statuses as well as plans and forecasts with the various supply chain partners involved (Akkermans, Bogerd et al. 2004).

Technology supporting order management

The current proliferation of products, i.e. the general increase in product variety, brings an increase in the volume of information exchanged between various actors during the order process (Forza and Salvador 2002).
In general, technology enables a higher degree of integration between the various actors in the order management process, and the order management process is now highly affected by the growth of technology in the supply chain environment. What used to be manual and clerical activities are now often automated by the adoption of technology such as EDI, the internet, available-to-promise (ATP) and capable-to-promise (CTP) systems, ERP and advanced planning and scheduling (APS) systems, transportation management systems (TMS) and inventory visibility tools (Croxton 2003). This technology enables information to flow more easily between actors (Beckman and Rosenfield 2008), facilitates supply chain visibility (Barratt and Oke 2007) and eases the sharing of information about production/delivery schedules and order status for tracking/tracing (Lee and Whang 2000). In the order acquisition phase, PC tools may also help ensure technical feasibility by formalizing the rules about how products can be configured and by providing user interfaces that help sales/marketing translate customer requests into technical specifications (Tenhiälä and Ketokivi 2012). As such, modern configuration systems can potentially offer important support (Forza and Salvador 2002). Further, ATP and CTP systems are widely used to estimate delivery dates, and ERP systems typically offer techniques for doing so (Tenhiälä and Ketokivi 2012). With respect to order fulfillment, general applications of information sharing technology apply. With the occurrence of change orders, an overview of affected components is needed, together with some sort of notification about the change to the people handling the different components. Today's ERP systems have a "where-used" functionality (pegging) that to some extent can do this; however, it is often underused by companies (Koh and Saad 2006). Hicks et al. (2000) point to the types of information elements that are needed for effective sharing of information; it "requires use of common databases that support tendering, design, procurement, and project management. This requires records of previous designs, standard components and subsystems, together with costing, planning, vendor performance and sourcing information" (p. 189). Kärkkäinen et al. (2003) study the use of web and product identification technologies to overcome logistical challenges in project supply chains, concluding that there are many benefits from the use of identity systems for tracing, tracking and control of project deliveries. However, such technology comes with a cost, both for the focal firm and for the potential customers or suppliers who implement the same technology. Implementation of a cross-organizational information sharing tool may be costly, time-consuming and risky (Lee and Whang 2000). As for all investments, a company has to weigh the costs against the value added by the technology in question; also in this setting, the technology should be judged by its ability to "streamline the process and integrate the SC" (Croxton 2003, p. 28). Finally, it should be noted that information sharing is merely an enabler of coordination and planning in the supply chain; companies will have to develop the right capabilities to utilize the shared information in an effective manner (Lee and Whang 2000).
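To make the role of ATP logic in delivery-date estimation concrete, the following is a minimal sketch of how a promise period could be derived from projected inventory. It is an illustrative simplification, not the ATP algorithm of any specific ERP product (real implementations also net future commitments against the promised stock), and all quantities are hypothetical:

```python
def earliest_promise_period(on_hand, receipts, committed, order_qty):
    """Return the earliest period in which a new order of order_qty
    could be promised, given current on-hand stock, scheduled receipts
    per period and already-committed demand per period.

    Returns None if the order cannot be covered within the horizon.
    """
    available = on_hand
    for period, (received, consumed) in enumerate(zip(receipts, committed)):
        available += received - consumed   # projected inventory position
        if available >= order_qty:
            return period
    return None

# 10 units on hand, receipts and committed demand over four periods:
promise = earliest_promise_period(
    on_hand=10,
    receipts=[0, 20, 0, 30],
    committed=[15, 5, 5, 5],
    order_qty=25,
)
```

Even this toy version shows why ATP requires accurate, timely data from across the supply chain: a stale receipt schedule or an unrecorded commitment shifts the promise date the customer is quoted.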

4.1.3 Engineering change

Engineering change affecting operations has long been recognised by researchers as a major obstacle to customer value in the supply chain, and the effects increase even more when collaboration is involved. Some of these obstacles have been identified as wastes in section 3.1. In the context of this work in LinkedDesign, our research is concerned with changes and alterations to products or processes within engineering and production. This means that other applications of change management within business, such as organizational change, are outside the scope of this work.

Definition

Most authors refer to engineering change as modification of a product's components once it has entered production (Wright, 1997; Tavcar & Duhovnik, 2005); changes to drawings or software that have already been released (Terwiesch et al., 1999); or modifications after product design is released (Huang et al., 1999, 2001). Three relevant definitions are as follows:

An engineering change (EC) is a modification to a component of a product, after that product has entered production (Wright 1997).

Engineering changes are the changes and modifications in forms, fits, materials, dimensions, functions, etc. of a product or a component (Huang and Mak 1999).

Engineering change orders (ECOs) are changes to parts, drawings or software that have already been released (Terwiesch and Loch 1999).

Based on the work of Terwiesch et al. (1999), Jarratt et al. (2004a) proposed a comprehensive definition: an engineering change is an alteration made to parts, drawings or software that have already been released during the product design process.
The change can be of any size or type; the change can involve any number of people and take any length of time.

Classification of engineering changes (EC)

ECs have been classified in accordance with their impact on the company (Maull et al., 1992); Dale (1982) and Reidelbach (1991) classified ECs based on time, and Balcerak and Dale (1992) based on urgency. Huang and Mak (1997) developed an EC taxonomy based on the following categories: routine, expedite, emergency, high risk and mandatory. There is often a dependency between different ECs due, for example, to a block change strategy, where a company decides to carry out several ECs at a specific point in time, or to a pending situation in which an EC is based on another EC that has to be carried out (Wänström 2006). Based on the LinkedDesign workshop (section 3.1), we can group some of the changes as delays due to reviews, approvals and bottlenecks, handover of communications or artefacts, etc.

The engineering change process

Various authors have divided the EC process into different steps: Dale (1982) described the process in two steps; Terwiesch et al. (1999), Lee et al. (2006) and Kocar et al. (2010) in three steps; Maull et al. (1992) in five steps; and Jarratt et al. (2004a) in a comprehensive six-step process.

A generic engineering change process

A comprehensive six-step process has been suggested by Jarratt et al. (2004a). This is shown in Figure 12.

1. A request for an engineering change must be made. Most companies have standard forms (either electronic or on paper) that must be completed. The person raising the request must outline the reason for the change, the priority of the change, the type of change, which components or systems are likely to be affected, etc. This form is then sent to a change controller who will enter it into an engineering database.

2. Potential solutions to the request for change must then be identified, but often only a single one is examined. This can be due to a variety of reasons: time pressures, the fact that the solution is obvious, or because engineers stop investigating once one workable solution is found.

3. The impact or risk of implementing each solution must then be assessed. Various factors must be considered, e.g. the impact upon design and production schedules, how relationships with suppliers will be affected, and whether a budget overrun will occur. The further through the design process a change is implemented, the more potential for disruption there is.

4. Once a particular solution has been selected, it must be approved. Most companies have some form of Engineering Change Board or Committee, which reviews each change, making a cost-benefit analysis for the company as a whole, and then grants approval for implementation. The Engineering Change Board must contain a range of middle- to senior-ranking staff from all the key functions connected to the product, e.g. product design, manufacture, marketing, supply, quality assurance, finance, product support, etc. A thorough list of suitable functions to consider is provided by DiPrima (1982).

5. Implementation of the engineering change can either occur immediately or be phased in. Which option is followed will depend upon various factors, such as the nature of the change (e.g. if it is a safety issue, then immediate implementation must occur) and when during the product life cycle the change is occurring. Paperwork must also be updated. One of the major problems frequently associated with EC is that of ensuring that only current documentation is available to manufacturing areas (Wright 1997).

6. Finally, after a period of time, the change should be reviewed to see if it achieved what was initially intended and what lessons can be learned for future change processes. This aspect is emphasised by DiPrima (1982). The review should examine whether the product and associated processes (e.g. manufacturing) are functioning as expected. Often surprises can be discovered, for example more obsolete stock than originally accounted for. Not all companies carry out such a review process properly.

Figure 12: A model of a generic change process from Jarratt et al. (2004a)

Factors affecting engineering change process

Unidentified change propagation: Possessing the capability to identify change propagation has been recognized as an important and critical skill in the ECM process (Giffin et al., 2007). Change propagation stems from components being coupled with each other, either directly or indirectly (Eckert et al., 2005). This implies that when there is a coupling between components, there is a chance that changing one component will also require the other component to change. The stronger these couplings are, the more likely a change is to cause further downstream changes (Cheng et al., 2010). Complex products often experience more change propagation than other products, due to more couplings (Cheng & Chu, 2010). Another issue with complex products is that very few people have a good understanding of the entire product, and thus will have problems identifying change propagation throughout the product (Eckert et al., 2004). Hence, it is important to involve several disciplines in order to get different views on the change. Change propagation is a very challenging issue in ECM. A study conducted by the Aberdeen Group reveals that only 11% of the participating companies had the capability both to provide a precise list of components affected by a change and to automatically propagate the change to related components (Brown, 2006).

Knowledge management: For new product development, knowledge management is considered to be critical (Lee et al., 2006). Changes are more likely to propagate because of the innovation involved, due to the low degree of knowledge and information available (Jarratt et al., 2011). Today's ECM systems do not possess the capabilities to easily capture and manage knowledge that is generated from collaboration and the decision-making process (Lee et al., 2006). Hence, the knowledge base available to decision makers is significantly reduced, and decisions will rely more heavily on personal experience. Identifying the engineering resources lost due to poor knowledge management can, however, be a difficult task. If there are no knowledge management systems, there is no knowledge to compare with; hence it is almost impossible to identify whether an issue has been experienced, and possibly solved, in the past. As a result, this problem might appear as a subtle one.
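The coupling-driven propagation described above can be sketched as a reachability computation over a component coupling structure, as used in design structure matrix approaches. This is an illustrative simplification (it treats every coupling as certain to propagate, whereas real ECM tools weight couplings by propagation likelihood), and the component names are hypothetical:

```python
def affected_components(couplings, changed):
    """Return all components transitively reachable from the changed
    component via direct couplings, i.e. candidates for change impact.

    couplings: dict mapping each component to the set of components
    it is directly coupled to.
    """
    affected, stack = set(), [changed]
    while stack:
        component = stack.pop()
        for neighbour in couplings.get(component, ()):
            if neighbour not in affected and neighbour != changed:
                affected.add(neighbour)
                stack.append(neighbour)   # follow indirect couplings too
    return affected

# A small hypothetical product structure:
couplings = {
    "motor": {"mounting_bracket", "controller"},
    "mounting_bracket": {"frame"},
    "controller": set(),
    "frame": set(),
}
impact = affected_components(couplings, "motor")
```

Changing the motor flags not only its directly coupled components but also the frame, which is reached only indirectly through the mounting bracket; this is the kind of downstream propagation that, according to the Aberdeen study cited above, most companies cannot list precisely.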
However, by evaluating a company's knowledge management systems, one would be able to identify whether or not this could be occurring.

Distributed environment: As stated, the ECM process is a rather complex process, involving different disciplines both internally (e.g. production-supply) and externally (e.g. design collaboration between multiple companies) (Terwiesch et al., 1999). Although the importance of collaboration and information sharing has been stressed, rather than moving towards efficient integrated systems and close company collaboration, companies tend to work in a decentralized manner, even within internal departments (Koçoğlu et al., 2011). Awad and Nassar (2010), however, point out that achieving such a unified collaboration network is not an easy task, and several obstacles have to be surmounted. From the ECM literature, the following issues related to information sharing and integration in the ECM process can be identified:

Different IT systems exist within companies today, meaning that information is stored in several locations, making it less accessible for the people who might need it (Huang et al., 2001; Gao et al., 2008).

Simultaneous access to an engineering change request by multiple disciplines is usually not permitted, especially when they are geographically distributed. This results in sequential processing, causing excessive throughput time (Huang et al., 2001).

Different disciplines and companies can view the same phenomena in different ways, resulting in errors due to misinterpretations (Little et al., 2000).

In collaboration across company borders, different companies might have different incentives, different IT systems and different priorities (Wasmer et al., 2011).

Summary of literature review

In chapter 4.1, literature on general collaborative planning strategies, order management and change management has been reviewed.

Collaborative planning strategies

Collaborative planning is used to describe processes that span multiple planning domains, such as demand, inventory, procurement, capacity and transport planning. This means connecting local planning processes by sharing relevant data between the planning domains, and creating a common and mutually agreed-upon plan. Collaborative planning concepts can be applied both to inter-organisational contexts (supply chain collaboration) and to intra-organizational contexts within an enterprise. Collaboration is needed across business units, factories, geographical locations and time zones. Collaborative planning, forecasting and replenishment (CPFR) and vendor managed inventory are some of the models that have been implemented, with varying degrees of success. ERP systems are the dominating planning systems in industry today, but have a single-company focus, support centralized production planning, and have limitations with inter-organizational collaboration.
State of the art within supply chain planning systems is Advanced Planning and Scheduling (APS). APS systems are based on hierarchical planning and perform optimization based on mathematical programming and algorithms, but they are complex systems that require high competence levels to be used properly.

Derived functional requirements - a collaborative planning workbench should:

# 1) have a degree of information visibility that enables collaboration (visibility)
# 2) be able to share relevant and meaningful key information with relevant stakeholders. The shared information must be accurate, trusted, timely and useful, and in a readily usable format (visibility)
# 3) have simple user interfaces that ensure low threshold levels, so that the workbench can be employed by a range of users (serviceability)
# 4) be able to process orders that vary in scope (flexibility)

Order management

It was found that order management can be divided into two phases, order acquisition and order fulfillment, which have different challenges. In the order acquisition phase, the goal is to obtain an order, possibly through a tendering process. Here, different functions within the organization are required to contribute input in order to estimate both the costs and the delivery schedule of the bid. Given that the order acquisition phase results in an order, the order fulfillment phase is concerned with filling, delivering and servicing what is ordered within the agreed delivery date. This includes more straightforward tasks, such as order entry, routing, assembly and picking, shipping, installation, invoicing and collection. Change orders are a common feature for engineering organizations, and the capability to respond to them may be a prerequisite for success. Lack of communication between different functions in the organization may lead to sub-optimal solutions with respect to change orders. As such, common to the two phases is the need for cross-functional information sharing in a timely manner. It is therefore easy to envisage how an information sharing platform may be of use in the order management process of engineering projects.
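To make the two phases concrete, the order lifecycle described above can be sketched as an explicit state model. This is purely illustrative; the phase names and transition rules below are our assumptions, not features of any system discussed in this deliverable.

```python
from enum import Enum, auto

class OrderPhase(Enum):
    # Order acquisition: obtaining the order, possibly via tendering
    TENDERING = auto()
    BID_SUBMITTED = auto()
    ORDER_WON = auto()
    # Order fulfillment: filling, delivering and servicing the order
    ENTERED = auto()
    IN_PRODUCTION = auto()
    SHIPPED = auto()
    INVOICED = auto()
    CLOSED = auto()

# Allowed transitions; a change order may loop fulfillment back to ENTERED.
TRANSITIONS = {
    OrderPhase.TENDERING: {OrderPhase.BID_SUBMITTED},
    OrderPhase.BID_SUBMITTED: {OrderPhase.ORDER_WON, OrderPhase.CLOSED},
    OrderPhase.ORDER_WON: {OrderPhase.ENTERED},
    OrderPhase.ENTERED: {OrderPhase.IN_PRODUCTION},
    OrderPhase.IN_PRODUCTION: {OrderPhase.SHIPPED, OrderPhase.ENTERED},
    OrderPhase.SHIPPED: {OrderPhase.INVOICED},
    OrderPhase.INVOICED: {OrderPhase.CLOSED},
}

def advance(current: OrderPhase, target: OrderPhase) -> OrderPhase:
    """Move an order to `target`, rejecting transitions the process does not allow."""
    if target not in TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal transition {current.name} -> {target.name}")
    return target
```

Making the transitions explicit is one way an information sharing platform could expose "where the order is in the process" (requirement #8) to all functions involved.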
Derived functional requirements - a collaborative planning workbench should:

# 5) have functionality for supporting both marketing and tendering activities
# 6) be able to distribute signals from the market to the organization, and vice versa
# 7) ease order management and minimize the time spent on administrative matters in order management (performance)
# 8) facilitate EDI ordering, and easily illustrate where the order is in the process and when it is to be delivered (features)
# 9) address and mitigate technical, time and financial risks (reliability)
# 10) be able to handle rescheduling (robustness)
# 11) facilitate a rapid process of error handling, correction and change management (serviceability)
# 12) align the order fulfillment process output with the customer specifications and requirements (conformance)
# 13) share current order and production statuses as well as plans and forecasts with the various partners involved. The data should be real-time. (transparency)

Engineering change management

Based on the interviews, studies and literature, we found that engineering change management is still a major concern for organizations. There is still a need for more information sharing between and across organizations to avoid confusion on the shop floor, in the design phase, and so on; this will help organizations better organize and authorize changes in real time. So far, most organizations have not effectively addressed the issue of cost control due to engineering changes, which in turn has a negative impact on production and, ultimately, on customer satisfaction. There is therefore a major need for a simple collaborative tool that can improve visibility, communication and information sharing for better internal and external customer satisfaction across and within the organization, and effectively manage engineering changes.

Derived functional requirements - a collaborative planning workbench should:

# 14) facilitate documentation according to company standards, e.g. providing the reason for change, priority of the change, type of change, etc.
# 15) enable risk assessment of multiple change alternatives, e.g. the impact on design, production schedules, supplier relationships, costs, etc.
# 16) include functionality for approving changes, e.g. by a change board/committee
# 17) ensure updating of documentation so that only prevailing documentation is readily available
# 18) incorporate functionality for change propagation
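As an illustration of requirements #14-#16, the documentation, alternative-assessment and approval data of a change request could be captured in a record along these lines. This is a minimal sketch; all class and field names are assumptions of ours, not taken from any assessed system.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ChangeAlternative:
    """One candidate solution with its assessed impacts (requirement #15)."""
    description: str
    design_impact: str
    schedule_impact_days: int
    cost_impact: float

@dataclass
class ChangeRequest:
    """Documentation fields per requirement #14; approval per requirement #16."""
    reason: str
    priority: str     # e.g. "high", "medium", "low"
    change_type: str  # e.g. "design", "process"
    alternatives: List[ChangeAlternative] = field(default_factory=list)
    approved_by: Optional[str] = None  # set by the change board (requirement #16)

    def approve(self, board_member: str) -> None:
        """Record the change board's approval decision."""
        self.approved_by = board_member
```

Keeping several `ChangeAlternative` entries on one request is what allows the risk of multiple alternatives to be compared before the board approves one of them.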

4.2 Assessment of existing PLM and ERP systems

This section studies existing Product Lifecycle Management (PLM) and Enterprise Resource Planning (ERP) systems in the context of collaborative planning. We study how current state-of-the-art systems support the phases of a formal engineering change process and the involvement of different stakeholders in managing engineering changes and error handling. We also study how selected systems support information sharing and knowledge generation in relation to order management in engineering and manufacturing.

In this section we assess some of the most well-known ERP systems (SAP, Microsoft Dynamics AX) and PLM systems (Teamcenter, SAP). The selection of systems to study was based on strength in the market, aiming to select market leaders within their domain, and on relevance for the LinkedDesign consortium. The Gartner group classifies different business systems using a two-dimensional matrix that evaluates vendors based on their completeness of vision and ability to execute. The Magic Quadrant has 15 weighted criteria that plot vendors based on their relative strengths in the market (Gartner, 2011). As shown in Figure 13, a study by the Gartner group (2008) identifies Siemens PLM Software (Teamcenter) as a leading PLM solution. This system was therefore selected as the PLM case to study further.

Figure 13: Magic quadrant for Manufacturing Product Life Cycle Management systems (Gartner, 2008)

As shown in Figure 14, a similar Gartner group (2012) study of ERP for product-centric medium-sized companies highlights Microsoft Dynamics AX and SAP Business All-in-One as the two market leaders. Two of the industrial partners in LinkedDesign (Comau and Volkswagen) use SAP for their ERP needs. Further, SAP Research is part of LinkedDesign. It was therefore chosen to study Microsoft Dynamics AX and SAP Business Suite, which targets larger enterprises, as the ERP cases.

Figure 14: Magic Quadrant for ERP for Product-Centric Midmarket Companies (Gartner, 2012)

We have used a combination of document studies and interviews with vendors to produce the descriptions and analyses in the following sections, studying Teamcenter, AX and SAP.

4.2.1 Teamcenter

Teamcenter has been used as an example system in order to map existing PLM functionality against theory on engineering change management. This provides evidence on the current status of a world-leading PLM system, and thus a good foundation for comparing current functionality with findings from the engineering change management literature. The information presented in the following section is based on information from Siemens (2013a) and the Teamcenter resource library (Siemens, 2013b). Another important source of information has been empirical data gathered through three interviews with the Norwegian certified vendor of Teamcenter, Summit Systems (see Appendix 5 for details).

Teamcenter is a PLM system provided by Siemens PLM Software, currently the leading global provider of product lifecycle management software (CIMdata, 2010). Teamcenter intends to power innovation and improve productivity by connecting people across global product development and manufacturing organizations with the product and process knowledge they need to succeed.

Functionality

By consolidating the different IT system needs of an organization, Teamcenter intends to provide a single, organized and secure source of product engineering and process knowledge. The purpose is to seamlessly connect different engineering and design teams, enabling them to work together as a single entity regardless of location. Teamcenter comprises a wide range of functionalities, as illustrated in Figure 15.

Figure 15: PLM functionalities in Teamcenter

Teamcenter provides functionality for initiating, administering, reviewing, approving and executing product changes. By automating the change process, one can minimize change-related rework and coordinate tasks to be performed by individuals across the organization. Because Teamcenter change management leverages product structure definitions, one can evaluate the impact of changes, track the status and completion of tasks, and maintain a comprehensive history of product changes throughout the lifecycle. Change management in Teamcenter is tightly integrated with a schedule manager and workflows to schedule implementation activities and guide a change through its phases. In addition, Teamcenter supports a simplified change process through its issue manager. The issue manager automatically provides access to design review and issue resolution tools, including the CAD tool NX, and lifecycle visualization.

The change management solution in Teamcenter is based on the following change objects:

Problem report (PR): Initiates a change. A PR defines a problem or an enhancement. The processing of a PR can lead to the creation of an enterprise change request.

Enterprise change request (ECR): Initiates a proposal that recommends a change and captures business decisions associated with the change. An ECR contains a solution to the problem, with cost estimates and benefits of making the change. The actual solution (for example, a new item revision) is implemented in the change notice.

Enterprise change notice (ECN): Provides a detailed work plan to resolve one or more ECRs, or a portion of one ECR. An ECN identifies all items and documents affected by a change and authorizes the actions that address the change.

Deviation request: Seeks consent to deviate from a solution in production in order to resolve a set of problems or initiate improvements.

Figure 16: Different change request types in Teamcenter

Engineering change management process in Teamcenter

Figure 17 describes the workflow for managing a change in Teamcenter. Each of the steps is explained below.

Figure 17: Workflow of engineering change management in Teamcenter

1. Author a problem report (PR): A requestor creates a problem report to identify a problem or enhancement, provide a preliminary assessment, and show the steps necessary to reproduce the problem.
2. Approve the problem report: A change specialist assigns a priority to the problem report and assigns it to an analyst for technical review.
3. Create an enterprise change request (ECR): A requestor creates an ECR to address the problem report. An analyst develops one or several alternative solutions, by creating markups on documents, Word documents, presentations, and so on. No decision has been made at this stage about whether to proceed or what new items or item revisions may be required.
4. Evaluate the impacts: The analyst identifies the items impacted by the change, prepares supporting documentation, and prepares a high-level proposal for the actions required to implement the change.
5. Make a business decision: A change specialist submits the ECR to a change review board, which decides whether the change will be made. The change review board can approve the change request, reject it, or require additional investigation. If this is a fast-track change, the review board owns the change and the process moves directly to step 10 (execution).
6. Derive an enterprise change notice (ECN): The ECN addresses the implementation details of the change. The requestor can delegate responsibility for elaborating the details of the implementation plan.
7. Prepare an implementation plan: The analyst develops a detailed plan to address the set of approved ECRs covered by the ECN. At this stage the agreed solution is implemented in the new/revised items.
8. Approve the ECN: The change implementation board reviews and approves the plan to address the change.
9. Assign effectivity: A change specialist can assign effectivities to the ECN. The effectivities specify the timing of when the change takes effect.
10. Execute the change: The analyst implements and tracks the detailed plan for addressing the change. A change specialist tracks the implementation progress at a high level.
11. Close the change: The analyst closes the associated levels of the implementation plan. When all the actions associated with each level of the implementation plan are complete, a change specialist closes the change.

Teamcenter strengths

Teamcenter provides one source of information, where all related information can be represented in relationship networks. This also encompasses change information, which means that when a PR, ECR or ECN has been linked to a part, this change information becomes part of the relationship network for that part. This allows easy retrieval of change information throughout the product lifecycle. For each of these change objects, there are predefined change process workflows. By default, the processes are in accordance with the industry-standard CMII closed-loop change model. When a change request is submitted, this workflow drives the tasks associated with the ECR, such as developing multiple change strategies and performing change propagation analysis. This is achieved by using different workflow tasks, paths, approvals and dynamic user assignments to ensure that the right people get the right information at the right time to complete the task. It also enables the change participants to keep track of the ECM process by providing real-time status of the workflow.

Through the interviews (see Appendix 5), several strengths of Teamcenter were identified. It should be noted that many of these benefits are relevant for many other PLM solutions in the market:

- Single source of product knowledge
- Coherent data model - one version of the "truth"
- Extended enterprise functionality, with suppliers and customers logging into the database
- Standardized ERP integrations and import/export formats
- Maintaining multiple Bills-of-Materials: as-designed, as-built, as-maintained

Teamcenter challenges and future direction

Based on the interviews and our research, we have identified challenges and shortcomings of Siemens PLM Teamcenter for handling engineering changes.

Propagation analysis support: This refers to where-used/where-referenced functionality, which intuitively identifies the components and assemblies directly linked to the component in question. This is especially beneficial for complex products, where there may be large numbers of connections. Such functionality does, however, require a complete and up-to-date Bill of Materials in order to provide accurate results.

Real-time workflow support: This is an additional feature which was essential to achieve real-time collaboration in Teamcenter.
This functionality was also identified as one of the key enablers for best-in-class change performance by the Aberdeen Group (2007). This functionality has therefore been added in our mock-up.

Knowledge management and information storage and reuse: Central storage of engineering change management related data. This information retrieval functionality should aid easy retrieval of product and process knowledge and thus facilitate efficient reuse of knowledge. The necessary extent of this functionality will naturally depend on the extent of the database in question. The ideal, and most demanding, situation would be a fully integrated system acting as a single source of knowledge.

Information overload: The information can be extremely difficult to learn or use if you are not an engineer. This may result either in delayed decisions or in wrong decisions.

Organisational implementation challenges: Teamcenter is a comprehensive system with many options for configuration. The organisational and cultural challenges within the company are often experienced as a major challenge in implementation projects. Working through Teamcenter requires new procedures and business processes. Companies spend much energy on agreeing on how Teamcenter shall be set up, for example how items should be numbered and how future workflows should look (PLM interview #3).

Frontloading information needs: When introducing Teamcenter, new routines require much information about a part or item to be registered before design and engineering tasks can start. This frontloading of information, with many attributes to fill out, can be experienced as a challenge in the beginning (PLM interview #3).

In our interviews, we also asked about the future direction of the Teamcenter system. The vision for Teamcenter is what Siemens calls high definition PLM. This involves three elements: intelligently integrated information, future-proof architecture, and role-based user experiences with information relevant to the context of the work. The following areas were highlighted as essential developments in the coming years (PLM interview #3):

- Improved user interfaces, easing the entry barrier to the system.
- Closer to real-time capture of change data.
- Improved change propagation across enterprises and supply chains.
- Mobile notifications.
- Expanding in the value chain, moving from a product-development-based tool to include early and late stages, such as requirements, manufacturing, and maintenance.

Some of these challenges and future opportunities for Teamcenter are addressed in the workbench mock-ups presented in section 4.3.
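The propagation analysis discussed under the challenges above is, at its core, a transitive where-used traversal of the Bill of Materials. A minimal sketch of that idea follows; the BOM data and names are invented for illustration and this is not Teamcenter code.

```python
# BOM expressed as a "where-used" index: component -> parent assemblies.
# The example data below is invented for illustration.
WHERE_USED = {
    "bolt-M8": {"bracket-A", "housing-B"},
    "bracket-A": {"frame-X"},
    "housing-B": {"frame-X", "frame-Y"},
    "frame-X": set(),
    "frame-Y": set(),
}

def affected_items(changed_item: str) -> set:
    """All assemblies reached transitively from the changed component,
    i.e. everything a change could propagate to."""
    seen, stack = set(), [changed_item]
    while stack:
        item = stack.pop()
        for parent in WHERE_USED.get(item, set()):
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen
```

As the challenges section notes, results are only as good as the BOM: an incomplete or stale `WHERE_USED` index silently omits affected assemblies.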

4.2.2 Microsoft Dynamics AX

Microsoft Dynamics is a line of enterprise resource planning (ERP) and customer relationship management (CRM) software applications developed by Microsoft. Microsoft Dynamics AX, formerly known as Axapta, was originally developed as a collaboration between IBM and the Damgaard Company. IBM dropped out of the project early, and Damgaard later merged with Navision Software A/S; the combined company, initially NavisionDamgaard, later Navision A/S, was ultimately acquired by the Microsoft Corporation in the summer of 2002 (Damgaard, 2013).

Microsoft Dynamics AX is Microsoft's core business management solution, designed to meet the requirements of midsized companies and multinational organizations. The system is used for organization, logistics, and supply and demand, supporting most administrative tasks in a business. The current version is known as Microsoft Dynamics AX 2012, which has a new user interface and layout compared to AX 2009.

Order management features

Sales orders
When a customer wants to buy something from a large manufacturing company today, the customer sends an order. The order manager receives the order and enters it into the company's ERP system as a sales order. A sales order contains information such as the sales order number, who the customer is, and when the order should be delivered. Each sales order also has a separate window showing which items the order consists of and their quantities.

Figure 18: Microsoft Dynamics AX 2012 sales orders (Lerberg, 2012)

Production orders
A production order is made to satisfy the specific need of a sales order or to restock the inventory. Production orders can be generated automatically from sales orders, or they can be made manually. The production order tells the people on the shop floor what they have to make and when it needs to be done. Materials and parts used in production are automatically deducted from the inventory when they are used. This is important because the system needs to know the current inventory levels to be able to do inventory management.

Figure 19: Microsoft Dynamics AX 2012 production orders (Lerberg, 2012)

Purchase orders
Purchase orders are set up when the company needs to buy items or materials. They are made to keep the preset inventory levels for stock items, or to directly satisfy the need of a sales order as material or product. A purchase order is built up like a sales order, but with purchase order fields. Items and materials included in the purchase order are, as in the sales order, shown in a separate window.
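The automatic generation of production orders from sales orders, with stock deducted so that the system always knows current inventory levels, can be illustrated with a small sketch. This is our simplified illustration of the logic, not actual Dynamics AX code or APIs.

```python
from dataclasses import dataclass

@dataclass
class SalesLine:
    """One line of a sales order: an item and the ordered quantity."""
    item: str
    quantity: int

# Illustrative on-hand stock; real systems track this per warehouse.
inventory = {"gearbox": 2, "motor": 0}

def production_orders_for(sales_lines):
    """Generate production orders only for the shortfall not covered by stock,
    deducting the reserved stock so inventory levels stay current."""
    orders = []
    for line in sales_lines:
        on_hand = inventory.get(line.item, 0)
        reserved = min(on_hand, line.quantity)
        inventory[line.item] = on_hand - reserved
        shortfall = line.quantity - reserved
        if shortfall > 0:
            orders.append((line.item, shortfall))
    return orders
```

The same shortfall logic would apply to purchase orders for bought-in items rather than manufactured ones.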

Figure 20: Microsoft Dynamics AX 2012 purchase orders (Lerberg, 2012)

Notifications
The notification functionality in Microsoft Dynamics AX 2012 helps users track critical events. This functionality is not tied to a specific module, but can be used across modules. Every user can set up alerts to help them keep track of events affecting their own daily tasks.

Figure 21: Microsoft Dynamics AX 2012: Create alert rule (Lerberg, 2012)

To set up a notification, the user can right-click on a field connected to the specific notification and choose Create alert rule. This takes the user to a new window where the notification can be specified. Figure 21 shows an example where a notification will be given when a purchase order status changes. The notification options include who will receive the notification and whether it will only be a pop-up or also an e-mail.

Engineering change management features

The Engineering Change Management module from Microsoft provides a workplace for designing new products and making changes to existing products before they are released into production. It is fully compliant with Dynamics AX best practices, as it is built right into AX with the same tools used by Microsoft, sharing the same GUI, the same database and the same development suite.

The process for making a change in Dynamics AX goes through the engineering change request. Once the request is approved, an engineering change order is created. In this workplace, changes can be made to items, BOMs or routes. Document handling is integrated to relate all drawings or procedures. Each change supports: Revise Existing, New Revision, Create New, or Revise Existing and New Revision. Line changes to routes and BOMs support: No Change, Update Item, Delete Item, Add Item.

Figure 22: ECM user interface in Dynamics AX (version 4.0)

The table below shows areas in Dynamics AX and the relevant functionality for engineering changes:

Revision master:
- User defined revision numbers
- Historical tracking for engineering change orders and different change types
- Engineering drawing ID numbering
- Document handling

Authorizations:
- User defined steps and areas for engineering approvals
- Parameter setting to distribute approvals either in, or out of, sequence
- Digital signature with authorization required for approval
- User and password definitions table
- Field to associate user signature

Engineering Standard Procedures (ESP):
- Compliant to ISO Standard Operating Procedures
- Revision control
- Automated submittal for approvals
- Digital signature for approvals
- Revision status: Active, Inactive, Under Construction, Under Revision
- Automatic association to engineering change orders and engineering research and design orders

Engineering desk:
- Electronic or physical document storage
- Document check in/check out
- Digital signature on document check in/check out
- Document transaction tracking

Engineering change requests:
- Task group assignments
- Document handling
- Engineering reason codes
- Associations to customers/vendors/internal references
- User defined approval sequences with digital signatures

Engineering change orders:
- Task group assignments
- Document handling
- Make changes to items, BOMs or routes
- Change types: Create New; Revise Existing; New Revision
- Build BOMs using existing AX items or new engineering items
- Assume items, BOMs or routes from other engineering projects
- Items, BOMs and routes use the same functionality as standard AX
- User defined approval sequences with digital signatures
- Approval-based release into AX

Engineering research and design requests:
- Task group assignments
- Document handling
- Engineering reason codes
- Associations to customers/vendors/internal references
- User defined approval sequences with digital signatures

Engineering research and design orders:
- Task group assignments
- Document handling
- Create new items, BOMs or routes
- Build BOMs using existing AX items or new engineering items
- Assume items, BOMs or routes from other engineering projects
- Items, BOMs and routes use the same functionality as standard AX
- User defined approval sequences with digital signatures
- Approval-based release into AX

Table 7: Change management areas and related functionality in Dynamics AX

AX strengths

Some of the key strengths of Microsoft Dynamics AX have been found to be:

- Scope and flexibility: Discrete, process and lean manufacturing principles have been integrated into one solution and can be used in parallel (Gartner, 2012).
- Microsoft integration: Leverages the Microsoft application platform and productivity suite, with enhanced integrations to Microsoft's Visual Studio, SQL Server, BI and SharePoint. Throughout the system, analytical capabilities have been more deeply integrated. Early adopters whom Gartner (2012) interviewed highlighted the ease of adding data visualizations and drill-down capabilities.
- Role-based user interface personalization: Seen by customers as being very flexible. However, Gartner (2012) estimates that only 20% to 25% of the user base personalizes the UI.
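Conceptually, the alert-rule mechanism described in the Notifications subsection is a matcher from field-change events to subscribed users. The sketch below is our own approximation of that idea; none of the names come from the actual AX API.

```python
from dataclasses import dataclass

@dataclass
class AlertRule:
    """A user's subscription to changes of one field on one table."""
    table: str      # e.g. "PurchaseOrder" (hypothetical name)
    field: str      # e.g. "status"
    recipient: str

def fire_alerts(rules, event):
    """Return one notification message per rule matching a field-change event.
    `event` is a dict like {"table": ..., "field": ..., "new_value": ...}."""
    return [
        f"to {r.recipient}: {event['table']}.{event['field']} changed to {event['new_value']}"
        for r in rules
        if r.table == event["table"] and r.field == event["field"]
    ]
```

Delivery of each message as a pop-up, an e-mail, or both would then be a per-rule option, as in the AX dialog shown in Figure 21.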

AX challenges

When comparing with the requirements for collaborative planning and engineering change management, there seems to be a gap between the state-of-the-art methods found in the literature and current AX functionality in the following areas:

- Storage and management of knowledge and skills.
- Support for collaboration and coordination of engineering change management.
- Support for interoperability to allow free interaction between different entities.
- Addressing errors, operational issues, uncertainties, engineering changes, variations and risks explicitly, coping with dynamicity and complexity, and supporting real-time decision making.
- Documentation for the final solution sometimes lacks details. Even though Microsoft's implementation methodology, SureStep, assists in documentation production, partners are often too stretched to finish the job effectively (Gartner, 2012).

4.2.3 SAP Business Suite

SAP is a German software company providing integrated enterprise applications. It is the market leader for business software in terms of software and software-related revenues (Leimbach, 2008). The company currently has 232,000 customers, 65,000 employees and locations in more than 130 countries (SAP, 2013). In order to understand their dominance within the enterprise resource planning domain, it is interesting to highlight the main milestones in their 40-year history:

- 1972: Foundation. Five former IBM employees started SAP under the name Systemanalyse und Programmentwicklung.
- 1991: SAP R/3. Launch of SAP R/3, with a client-server concept, a uniform graphical interface and the use of relational databases. The system was intended for midsize companies and was an important source of growth for SAP (Leimbach, 2008).
- 1999: mySAP. New product strategy to combine e-commerce solutions with SAP's existing ERP applications on the basis of Web technology.
- 2004: SAP NetWeaver. An integration platform to join multiple data sources and a service-oriented application platform; the technical foundation for SAP Business Suite.
- 2010: Mobile strategy. Acquisition of Sybase, market leader in information management software and mobile data use, enabling "wireless" companies.
- 2011: SAP HANA. A platform for data analysis: an in-memory product that enables customers to analyze big data in seconds rather than days.

Currently, SAP focuses on five market categories, among them Applications. In the area of business applications, different solutions are available depending on the size of the company and the type of industry. SAP Business Suite is targeted towards larger enterprises, whereas SAP Business One, SAP Business ByDesign and SAP All-in-One are solutions targeted at small and medium-sized enterprises. In the remainder of this chapter, SAP Business Suite is studied, as it illustrates state-of-the-art functionality from SAP. While some of the other SAP categories, such as Mobile, Analytics and Cloud, are also relevant to collaborative planning, it is outside the scope of this work to study them in detail.

The SAP Business Suite is structured in five main components, as illustrated in Figure 23:

- SAP ERP - enterprise resource planning
- SAP CRM - customer relationship management
- SAP SCM - supply chain management
- SAP SRM - supplier relationship management
- SAP PLM - product lifecycle management

Figure 23: Structure of SAP Business Suite and SAP HANA (SAP, 2013)

In addition, other SAP platforms are important for building collaborative business processes:

- SAP HANA: platform for real-time reporting and data analytics.
- NetWeaver: technical foundation and integration platform.
- NetWeaver Cloud: supports Software-as-a-Service, enabling cloud-based apps.

These five key words were used for further searches in the extensive SAP.com libraries and product data sheets. Table 8 below summarizes the identified best-practice elements.

SAP ERP:
- Order fulfilment: Visualize and analyze customer value, plan more effective sales tactics and identify the right target audience for marketing, sales and service initiatives.
- Supplier Collaboration: Includes elements such as Delivery Schedule collaboration, Inventory collaboration, Replenishment Order collaboration, Return Delivery collaboration and Supply Chain Exception collaboration.
- Manufacturing Work Order Collaboration: A view on a manufacturing work order that both the customer and the supplier share.
- Integrated Sales and Operations Planning (S&OP): Supports Supply Network Planning and Service Level Optimization.
- Mobility for ERP: Mobile access to SAP ERP via mobile units, such as iPhone and Android devices.

SAP CRM:
- SAP 360 Customer: A new component based on HANA, cloud, social media and mobility, which can be linked with CRM.
- On-Demand Sales: On-demand sales and mobile sales for sales team collaboration.

SAP SCM:
- Advanced Planning and Optimization (APO): Offers planning and optimization functionalities in Demand Planning, Supply Planning, Supply and Demand Matching, Production Planning Detailed Scheduling, Global Available to Promise, etc.
- SAP Supply Network Collaboration: Offers a way for suppliers to connect with a customer using a web browser to trigger and update replenishment and invoice data. Supports VMI, the purchase order process, dynamic replenishment, web-based supplier Kanban, and the supply network inventory process.
- SAP Demand Planning: Supports the demand planning and forecasting processes. User-specific planning layouts and interactive planning books enable integration of people from different departments and companies into the forecasting process.
- Collaborative Response Management: For reacting to changes in demand and supply; communication with customers and suppliers.
- SAP Track & Trace: Supports fulfilment visibility, procurement visibility, item serialization and product traceability, and product genealogy.

SAP SRM:
- Supplier Order Collaboration: Extends the functionality of a web-based order management system, seamlessly connecting a purchaser and small suppliers, no matter what software they use. Supports Customer Invoice Processing, Goods and Service Acknowledgement Processing, Goods and Service Confirmation Processing, Purchase Order Processing, Sales Order Processing and Supplier Invoice Processing collaboration.

SAP PLM:
- Collaboration PLM tools: PLM tools such as integrated product development, CAD integration, and portfolio and project management.
- SAP Visual Enterprise Viewer: 3D visualization viewer for Windows that allows collaboration, analytics, as well as assembly and maintenance work instructions, to be delivered in interactive real-time 3D.

Table 8: SAP collaborative planning functionality (source: SAP.com)

It should also be noted that, in addition to the best-practice functionalities described in Table 8, there exist numerous industry solutions. However, a mapping of these was deemed to be outside the scope of this work.

Summary

Current ERP and PLM solutions offer extensive functionality, and the chapter has shown that there is good support for collaborative planning in current ERP and PLM systems. However, some of the drawbacks and limitations in existing solutions were found to be the following:
- Lack of propagation analysis support
- Lack of real-time workflow support
- Lack of knowledge management and information storage and reuse
- Need to address errors, operational issues, uncertainties, engineering changes, variations, and risks explicitly, cope with dynamicity and complexity, and support real-time decision making
- Material plans and project plans are not linked
- Lack of support for information dependencies within and between manufacturing enterprises
- Lack of support for interoperability to allow free interaction between different entities

4.3 Workbench: system/functionality analysis and design

In this section, we describe the work that was done to design technological support for collaborative planning, based on the approach described in Section 2.

User stories and scenarios

A set of user stories was created based on interviews and workshops with the users; an overview of this is shown in Figure 24. Based on these user stories and the literature review conducted in Section 4.1, we have considered the following processes in collaborative planning as the relevant ones for LinkedDesign:
1. Order acquisition: tendering processes leading to an industrial project / production order
2. Order fulfillment: planning and follow-up of a sales order or production order
3. Engineering changes: management of engineering changes
4. Exceptions or error handling: managing exceptions and errors occurring during life cycle stages
5. Applications: life cycle analysis and risk analysis

Figure 24: Merged User Stories in LinkedDesign

Scenario description

The following scenario description attempts to illustrate the above processes in more detail to enable the identification of requirements for the design of the system. (Note that this scenario aligns well with the scenario that has been designed for the review in spring 2013.) The characters in the scenario are as follows:

- Manny Manufacturer: Manufacturing Manager
- Marcus Marketing: Marketing and Sales Manager
- Knut KE: Knowledge Engineer
- Steve Supplier: Key supplier representative
- Chris Customer: Customer representative
- Alex Expert: Expert in the domain
- Doris Designer: Parts Designer
- Lenny Lifecycle: Expert in lifecycle calculations
- Volker Wagen: Quality Controller

Collaborative planning scenario, part #1

Manny hears a beep on his mobile device and sees that a message has appeared on his screen indicating an error in the production system (Manny has access to the project data through a mobile app).

REQ: Visual and audio alert to capture Manny's timely attention.

Manny uses the system to obtain more information about the error, such as the type of the error and the conditions that caused it, so that he can identify the potential error source.

REQ: Manny has access to the necessary information from the system to attend to the error and take the appropriate action and decision.

Manny, who is an experienced manufacturer, is able to take the appropriate action very quickly, correct the error and confirm that the error has been attended to.

REQ: Manny adds feedback to the system after attending to the error. This is part of recording a history of events, which could be useful for him and others, or even for management decision-making.

There may also be situations where Manny wishes to obtain some expert opinion or advice before he acts on an error. In this case, Manny checks his system to see if he can contact an expert in error handling (Alex), who may be within or outside his organization, but who will be able to provide advice in real time. Manny sees that Alex is online and contacts Alex via an online chat communication tool (e.g. Skype or Microsoft Communicator) for a quick chat to verify that his interpretation of the situation is correct. Reassured by the feedback from Alex, Manny goes ahead to correct the error and adds a comment as feedback to other manufacturers.

REQ: Manny has access to real-time expert advice through various communication technologies. Such expert advice may also be collected in an experience database that is available, giving the possibility to tap into collective knowledge from experts at any time.
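The error notification and feedback loop in the scenario above can be illustrated with a minimal data model. This is a sketch only: `ErrorEvent`, `ErrorLog` and all field names are our own illustrative assumptions, not part of any LinkedDesign component.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class ErrorEvent:
    """A production error as it might be stored in an experience database."""
    error_id: str
    error_type: str
    conditions: str                                     # context that caused the error
    reported_at: datetime
    feedback: List[str] = field(default_factory=list)   # operator comments for later shifts

class ErrorLog:
    """Minimal store supporting the scenario: alert, inspect, comment."""
    def __init__(self):
        self.events: List[ErrorEvent] = []

    def report(self, event: ErrorEvent) -> str:
        # A visual and audio alert would be triggered at this point (REQ: timely attention).
        self.events.append(event)
        return f"ALERT: {event.error_type} ({event.error_id})"

    def add_feedback(self, error_id: str, comment: str) -> None:
        # Recording a history of events for later decision-making (REQ: feedback).
        for e in self.events:
            if e.error_id == error_id:
                e.feedback.append(comment)

# Usage: Manny receives an alert, inspects the event, and leaves feedback.
log = ErrorLog()
msg = log.report(ErrorEvent("E-001", "sensor fault", "temperature spike",
                            datetime(2013, 2, 28, 9, 30)))
log.add_feedback("E-001", "Recalibrated sensor; verify at next shift.")
```

The point of the sketch is that a single event record carries both the machine-generated context and the human feedback, so later shifts and the knowledge engineer read from the same source.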

During the shift, Manny encounters several other errors that he deals with in a similar manner. At one point, Manny notices that the error message box has changed color from orange to red, indicating consecutive errors. Manny examines this situation using the information from the system and determines that it could have been due to a faulty box. Manny takes the necessary action and enters a message in the system to inform others and later shifts. Manny also checks if a similar sequence of errors has occurred over the last two weeks.

REQ: Visual information about a different situation of errors. Availability of historical information to support judgement and decision-making. Feedback and information sharing through the system.

Manny feels that this shift has been a busy and eventful one and that he has experienced more errors than normal. So Manny decides to review the errors that he has encountered. Manny also looks at the types and amount of errors that occurred during his shifts over the last month to see how this shift compares to his other shifts, to have an indication of his performance. He also accesses information on the rate of errors in general to see how this shift compares to normal operations.

REQ: Ability to look at performance data for an individual as well as in general for the operations. This can support learning and provide indirect feedback and reassurance to people.

Knut, as a part of his regular work, reviews the list of errors from the previous day and sees the errors that occurred during Manny's shift. Knut finds Manny's feedback in the system very useful, particularly regarding the sequence of errors. Knut decides that the parameters that were causing those errors require a closer look and decides to take this up with his team during their next meeting. He makes a note of this in the system so that his team has access to the information, and thanks Manny for the feedback.

REQ: Knut, the Knowledge Engineer, has access to updated information about errors and how and by whom they were handled. Knut is able to share relevant and timely information (from the source) with his team.

Similar to the error-handling scenario, when a design engineer, Doris, encounters a change request, she could be notified by the system, which also provides her the possibility to access relevant information to act upon it in a timely manner. Similarly, the system could also facilitate access to other people that could help, who may be peers in other organisations or experts in the field. Doris should be able to provide and store her feedback via the system so that other design engineers, as well as other roles in the company such as the manufacturers and knowledge engineers (Manny and Knut), may be able to utilize it. Most importantly, when Doris makes a change, the change is propagated through the design and the system provides capabilities to alert others that may be affected by the change. For example:

Collaborative planning scenario, part #2

Steve has been notified that the batch of sensors that Doris is using in her design is faulty and requests Doris to replace the sensors in the design with another type with the same parameters. This may, however, affect the heat profiles of the other components in the product and the overall cost of the product. When Doris attempts to replace the sensors in her design, she gets a notification that she should get approval from Lenny, the lifecycle cost analyst.

REQ: Doris is notified of the checks that she has to do before she can make changes in her design.

Lenny notices that there is an alert on his system informing him of this change, and he is able to check the details and consequences of this change on the lifecycle costs of the product via the system.

REQ: Doris is able to make others aware of the engineering change requests that she has addressed. Lenny is made aware of this change and has access to updated information about it.

Lenny accesses the lifecycle cost analysis module on the system and recalculates the lifecycle costs. The green bar on his system indicates that the consequences of the change in design are within acceptable levels. Lenny triggers a message to Doris to inform her that she can go ahead with the changes.

REQ: Lenny is able to access the technological support needed to conduct lifecycle cost analysis through his system. Lenny is able to inform others, e.g. Doris, through his system.

Scenario Analysis to identify Collaborative Processes

The above scenario provides an insight into the collaborative processes that take place in the users' organisations. One of the limitations of the information that is available from the user stories is the insight into how the different roles in the organisation collaborate, may need to collaborate or may depend on each other. The scenario analysis helps us to address this more explicitly, providing input to conduct further interviews. Based on the scenario analysis, one of the things that we have done is to describe how the different roles in the organisations may interact with one another; in particular, how one role may depend on another or benefit from another. Two such examples of collaboration are illustrated in Figure 25 and Figure 26.

Figure 25: Collaboration in Error Handling: Manufacturer and Knowledge Engineer

A possible situation when an error occurs is shown in Figure 25, where Manny updates an Error Database. Knut is able to review the errors from the Error Database; here, Knut can be considered as collaborating with Manny through the electronic support systems, in this case indirectly. The communication possibilities could further be enhanced by facilitating Knut to send a message to Manny or contact him directly by other means. The Error Database may well be designed as a part of the Knowledge Management System, depending on the preference of the user. The main point here is the identification of the need for an Error Database and the purpose it may serve to determine the functionality.

Figure 26: Collaboration in Error Handling: Manufacturer and Expert

In addition to the indirect communication that is illustrated in Figure 25, Manny may also have a need to collaborate directly with someone in real time, as illustrated in Figure 26. An example of this is the need for expert advice for reassurance or verification of the intended action before he carries it out. So, before Manny takes action to correct an error, he may wish to consult the Knowledge Management System, which may have examples of how similar errors have been dealt with by other people, or ideally one or several experts in the domain available to advise him.

Collaborative platform: Motivation and structure

The discussions in the previous chapters, both in the literature reviews and the analysis of the user stories, highlight that there is a need for a collaborative platform that supports order and change management processes within an enterprise. The user stories brought forth the perspectives of the different roles that are involved in the engineering processes, e.g. the Production Manager, the Knowledge Engineer and the Design Engineer (see Figure 24). Although each of these roles has specific needs, it is evident from their stories that there are dependencies among these roles and that they need to collaborate with one another, as discussed in the scenario analysis above. Thus, the main motivation for our design is to bring forth a holistic view and design a collaborative platform to provide support for the different roles in the various processes.
An overview of the generic capabilities that were desired by all the users can be summarised as below (note that this is not an exhaustive list):
- Personalised and role-based support and capabilities
- Easy access to all relevant information to perform their tasks
- Access to historical information
- Real-time access to domain knowledge and expertise
- An overview of all tasks
- Support for internal knowledge and experience transfer

- Life cycle assessment

To provide personalised or role-based support, it is beneficial for each user to have their own profile on the platform and some support for their specific as well as general activities. We have designed some mock-ups using the Pencil application to illustrate some ideas for a collaborative platform to address the needs of the users. A possible entry point to the platform, a login window, is shown in Figure 27. Once the user has logged in, s/he will be presented with a front page consisting of a set of modules or processes that are relevant for him/her, e.g. change management or order acquisition; see Figure 28.

Figure 27: Login page for the Collaborative Platform

Figure 28: Personalised Front page

All the users addressed common needs such as the need to access correct and timely information, to have an overview of relevant information related to e.g. errors or changes, and access to relevant knowledge, both in terms of historical information as well as human

expertise, an overview of their tasks, etc. To address such needs for all users and the various roles, we have designed the collaborative platform with some generic functionality that will be accessible to everyone while executing all processes. An overview of the layout that we envisage is shown in Figure 29. The screen layout is designed to categorise the different types of functionalities that are incorporated into a common workspace for any individual. In general, the following categories of functionalities have been considered:

- Common for all: functionalities that are available for all roles and individuals, framed in a rectangle with solid lines. Some of these functionalities are available at the top of the screen and are dedicated to accessing information sources and searching for information. Another block of common functionalities is placed to the left of the page; these provide real-time access to other people and can give an overview of people who are online and available at any time. A similar capability is available at the bottom left of the page, where expert feedback is available.
- Personalised: functionalities personalised to the user's profile, framed in a rectangle with dotted lines. Such functionalities include the calendar and task list, which are available at the top right-hand side and the bottom of the screen.
- Role-specific: functionalities tailored to the role of the person, framed in a rectangle with broken and dotted lines; e.g. a design engineer will see a different set of functionalities compared to a Knowledge Engineer. Role-specific functionalities are available at the bottom left of the screen and at the very bottom of the screen.
- Module-specific: functionalities tailored to the module or the relevant process, such as change management, framed in a rectangle with broken lines. They are placed at the top right and the bottom left of the screen.

The area in the middle of the screen is dedicated to the role- and module-specific content. For example, a design engineer will have content that is specific to his/her role and tasks, shown at the top right of the area. This will be explained in more detail in the following subsections.

In the following sections, we analyze the users' needs for the four processes order acquisition, change management, order fulfillment, and error and exception handling in detail, and propose the design of the collaborative platform to support these processes. Note that we have focused on these main processes in collaborative planning and therefore have not considered every single topic that was included in the user scenarios and the interviews. The design that is presented in this deliverable is the result of the interviews with the users and is based on their feedback; we have not conducted any validation of the layout. So far, we have focused on addressing the needs for the various functionalities.
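As a sketch, the four categories described above could drive a simple composition of each user's workspace at login. All block names, roles and modules below are illustrative placeholders of our own, not the actual platform vocabulary.

```python
# Hypothetical registries of functionality blocks, mirroring the four categories.
COMMON = ["information sources", "search", "people online", "expert feedback"]
PERSONALISED = ["calendar", "task list"]
ROLE_SPECIFIC = {
    "design engineer": ["CAD viewer", "design rules"],
    "knowledge engineer": ["error review", "knowledge base editor"],
}
MODULE_SPECIFIC = {
    "change management": ["change requests", "propagation analysis"],
    "order acquisition": ["tender list", "marketing panel"],
}

def compose_workspace(role: str, module: str) -> dict:
    """Assemble the workspace for a logged-in user from the four categories."""
    return {
        "common": COMMON,                            # same for everyone
        "personalised": PERSONALISED,                # tied to the user's profile
        "role": ROLE_SPECIFIC.get(role, []),         # tied to the user's role
        "module": MODULE_SPECIFIC.get(module, []),   # tied to the selected process
    }

# Usage: a design engineer opening the change management module.
ws = compose_workspace("design engineer", "change management")
```

The design choice illustrated here is that the common blocks are injected unconditionally, so every screen keeps the same search, contact and expert-feedback areas regardless of role or module.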

Figure 29: Rationale for the design of the functionalities in the Collaborative Platform

Reducing waste in collaboration

In developing the required functionality and designing the mock-ups, there has also been a goal to reduce the amount of waste in collaboration (see Table 3, section Workshop results) to the extent possible. Table 9 below explains how the collaborative platform relates to the sources of waste in collaboration, and how the waste is reduced by utilizing the collaborative platform.

- Divergence (wasted efforts due to politics, mismatch of goals): Visualization of the same information for multiple partners, contributing to a joint view on status and plans. Real-time collaboration with relevant actors depending on your role.
- Misunderstanding (disconnect in understanding): Tailored GUIs for various modules and roles. Possibility for the user to personalize his GUI to some extent, enabling aggregated knowledge representations to fit individual needs.
- Undercommunicating (excess or not enough time spent in collaboration): Real-time collaboration with relevant actors. Integrated functionality for e-mail, shared calendars, contacts etc.

- Interpreting (time spent interpreting communication or artifacts): Activity-centric GUIs adapted to the user's affiliation, containing only functionality relevant for the logged-in user. Functionality for analysis and aggregated knowledge representations fitted to the user. Interactive expert and management feedback.
- Searching (time spent searching for information and relationships): Overall search capabilities. Knowledge repository with both archived and active activities. Functionality for searching for people in the organization. Notifications flagged for follow-up readily available.
- Motion (handover of artifacts or communications): Stage gate functionality for ensuring effective and efficient handover of tasks. Visual representation of priority and risk profile, readily available to relevant stakeholders.
- Extra processing (excess creation of artefacts or information): Overview of status (e.g. of plant). Lists of projects and their status (e.g. due date, priority). Relevant information displayed depending on affiliation.
- Translation (time spent conforming objects to new inputs): Only relevant information/documentation readily available. Stage gate functionality to ensure task execution conforming to the organization's standards.
- Waiting (delays due to reviews, approvals, and bottlenecks): Information available to relevant stakeholders. Real-time feedback. Status of resources indicated (online, offline, etc.). Indicator of progress (% completed) and which stages are completed.
- Misapplication (incorrect use of methods and technologies): Access to various functionality (e.g. tools for various analyses) is limited by the user's affiliation.

Table 9: Mitigating waste in collaboration

Order acquisition: functionality and mock-ups

When interviewing the three companies, we found that they had different foci with respect to order acquisition. This was perhaps not that surprising, given their diverse characteristics and customer bases. For example, Volkswagen sells large volumes of basically identical products, leading to a focus on forecasting and prognoses, whereas the focus for Comau and Aker lies on tendering/front-end studies due to their highly customized products. Table 10 below summarizes the companies' situations and needs with respect to order acquisition.

Main focus:
- Volkswagen: Forecasting/prognoses
- Comau: Tendering
- Aker: Tendering (front-end studies)

Current situation:
- Volkswagen: Two different departments handle marketing; one handles future predictions, whereas the other handles short-term demand. Availability of parts is shared with customers and suppliers.
- Comau: Low degree of marketing, with simple market analysis for machining; for other products, the customers call all suppliers and invite them to tender. Typically attend two tendering processes at a time (with different teams). All teams are capable of managing a proposal; all have the same type of specialists. A low-detail proposal is developed in 2-3 weeks. Tenders are prioritized first by profit, then by due date. All information from previous tenders is kept, both won and lost; previous tenders are checked manually for commonality and used as a basis for estimates.
- Aker: Two kinds of product types with different requirements; for more standardized products (process equipment, subsea, drilling) marketing is important, whereas for one-of-a-kind products tendering/front-end studies are most important. Involved in front-end studies at the time. Within one division you may compete with the same resources on different bids; a lot of sharing of resources, especially during front-end studies. Stage-gate model; you have to pass gates before tendering, etc., based on a risk system. The organizational layer above the project manager coordinates front-end projects.

Tools used:
- Volkswagen: Database with available parts
- Comau: AutoCAD for simple layout proposals
- Aker: CRM system; SAP for production planning; Excel for calculations; risk assessment database; KBE system to accumulate knowledge

Current needs/challenges (requirement references: OA_Req_3, OA_Req_4, OA_Req_5, OA_Req_6, OA_Req_7):
- Understanding the cost of delivering what the customer wants
- Risk identification
- A way to propose multiple solutions in the proposal
- A way to include LCC data in the proposal
- A system to communicate good solutions, lessons learned and experience
- People need to know rules and put them into the KBE system; the system does not know anything

Table 10: Order acquisition related information from interviews of use case companies

By combining some of the most common requirements acquired during the interviews with requirements found in literature (see Section 4.1), we have summarized seven requirements for the order acquisition module. These are described below, followed by illustrations of how the mock-ups relate to the requirements:

- OA_Req_1: Functionality for both marketing and tendering activities
- OA_Req_2: Possibility to distribute signals from the market to the organization, and vice versa
- OA_Req_3: Possibility to manage several tenders at a time
- OA_Req_4: Functionality for evaluating the risk of tenders
- OA_Req_5: Stage gate functionality to avoid losing important information in tendering
- OA_Req_6: Should store information about previous tenders and enable automatic searching within these
- OA_Req_7: Should accommodate sharing of resources during tendering

We will now illustrate how requirements OA_Req_1-7 are accommodated through the proposed mock-ups of the order acquisition module. In Figure 30 below, the first two order acquisition requirements (OA_Req_1 and OA_Req_2) are raised. As illustrated in the frame OA_Req_1, functionality for conducting both marketing and tendering is included in the mock-up, as these would be activities conducted by Marcus Marketing. The frame OA_Req_2 shows the marketing panel. Here, Marcus is able to swiftly signal the organization with, for example, market trends, new regulations or complaints.

Figure 30: Order acquisition mock-up #1

In Figure 31 below, OA_Req_2 is raised again; however, this figure illustrates how Marcus may receive important information from other personnel in the organization. This information may either be communicated to reach new customers, or be used as a way of strengthening the relationship with existing customers.

Figure 31: Order acquisition mock-up #2

In Figure 32 below, the tendering functionality is illustrated. Here, Marcus has the possibility to manage several tenders at a time (OA_Req_3), as illustrated in the corresponding frame. In the example, on-going tender processes can be sorted according to number, due date, client, priority or person in charge. Relevant documentation, like the invitation to tender, should be

readily available. In the example, functionality for registering new tendering processes is illustrated through a button. Here, one would typically have to fill in information about due date, client, etc. (see above), whereas the tender number is generated automatically. Further, as can be seen in the rightmost frame, the personalized space to the right has changed somewhat: it now illustrates the risk profile of on-going tenders, satisfying OA_Req_4.

Figure 32: Order acquisition mock-up #3

The last mock-up of the order acquisition module is illustrated in Figure 33 below. This panel appears when clicking on a specific tender process in the previous panel (Figure 32). Stage gate functionality is illustrated by means of check boxes and a corresponding progress bar, meeting OA_Req_5, whereas functionality for searching within previous tenders is illustrated at the bottom of the panel (OA_Req_6). The last requirement (OA_Req_7) is that the module should accommodate sharing of resources during tendering. We have illustrated this functionality by including relevant files connected to the tender, readily available for whatever resource is assigned to the task.
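The tender overview just described, with automatically generated tender numbers, sortable lists and a stage-gate progress bar, could be sketched as below. The class and field names are our own illustration under assumed conventions, not part of the mock-ups themselves.

```python
from dataclasses import dataclass, field

@dataclass
class Tender:
    number: str
    client: str
    due_date: str          # ISO date string, e.g. "2013-04-01"
    priority: int          # lower number = higher priority (assumed convention)
    gates: dict = field(default_factory=dict)  # stage name -> completed?

    def progress(self) -> int:
        """Stage-gate progress as % completed (the progress bar, cf. OA_Req_5)."""
        if not self.gates:
            return 0
        done = sum(1 for completed in self.gates.values() if completed)
        return round(100 * done / len(self.gates))

class TenderRegistry:
    """Overview of several parallel tenders (cf. OA_Req_3)."""
    def __init__(self):
        self._counter = 0
        self.tenders = []

    def register(self, client, due_date, priority, gates):
        # The tender number is generated automatically, as in the mock-up.
        self._counter += 1
        tender = Tender(f"T-{self._counter:04d}", client, due_date, priority, gates)
        self.tenders.append(tender)
        return tender

    def sorted_by(self, key):
        # Sort by number, due date, client or priority, as in the tender list.
        return sorted(self.tenders, key=lambda t: getattr(t, key))

# Usage: register two tenders and inspect progress and ordering.
reg = TenderRegistry()
t1 = reg.register("Client A", "2013-05-01", 2, {"pre-study": True, "approval": False})
t2 = reg.register("Client B", "2013-04-01", 1, {"pre-study": True, "approval": True})
```

ISO date strings sort correctly as plain text, which keeps the sorting sketch free of date parsing.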

Figure 33: Order acquisition mock-up #4

In the next section, the change management module is further described.

Engineering changes: functionality and mock-ups

As shown in Table 11, the use case companies provided different viewpoints on engineering change. Volkswagen is concerned with managing design alterations to existing cars and collaborating with suppliers for new product development. In Comau, the critical issue is to learn from previous projects in ongoing engineering change processes. Aker Solutions, a core engineering company, pays strong attention to compliance with standards, as these are numerous within the oil and gas industry. The table below summarizes the companies' situations and needs with respect to engineering changes.

Main focus:
- Volkswagen: Design
- Comau: Reuse of knowledge
- Aker: Compliance with company standards

Current situation:
- Volkswagen: Employs modular platforms consisting of fewer parts in order to minimize the risk of designing something unfeasible.
- Aker: Changes are handled differently based on contract type (lump-sum or reimbursement).

Current situation (continued):
- Unfeasibility in the desired design is sometimes revealed during the ramp-up phase.
- Several functions are needed in case of a change, due to change propagation affecting subsequent manufacturing stages.
- A strict change control system is used for any change, regardless of who initiated it; this is an organizational setup with a change board.
- The change board (typically department managers) meets once a week to address the change propositions of the week.
- Company-wide guidelines and rules on how to define a change, etc.
- The change board's once-a-week meetings mitigate time and distance challenges due to spread geographic locations.
- Change is coordinated by the change manager.

Tools used:
- Catia for design changes
- PLM system (ENOVIA) to handle redesign of parts; track changes and modifications
- Viewer module for CAD files
- Web service (Aker specific) supporting change control
- COMOS as information carrier in engineering (followed by MIPS for fabrication and procurement)
- E-mail for internal communication (sent by the change system) and notifications

Current needs/challenges:
- Direct expert feedback for early feasibility checks
- Lack of knowledge-based software that saves information about unfeasible designs
- Lack of a database about how changes were handled in previous projects
- Estimating fabrication consequences, i.e. are changes possible without cutting off what you have already done? How does this affect schedule and costs?

Table 11: Engineering change related information from interviews of use case companies (requirement references: ECM_Req_1-ECM_Req_7)

Some of the most common requirements acquired during the interviews have been combined with requirements found in literature (see section 4.1.3), resulting in seven requirements for the change management module. These are summarized below, followed by illustrations of mock-ups and their relation to the requirements.

106 ECM_Req_1: Facilitate documentation of on-going engineering changes according to company standards, e.g. providing the reason for change, priority of the change, type of change, etc. ECM_Req_2: Enable risk assessment of multiple change alternatives, e.g. the impact on design, production schedules, supplier relationships, costs, etc. ECM_Req_3: Include functionality for approving changes, e.g. by a change board/committee ECM_Req_4: Ensure updating of documentation so that only prevailing documentation is readily available ECM_Req_5: Incorporate functionality for change propagation ECM_Req_6: Provide direct expert feedback for rapid feasibility control ECM_Req_7: Store information about previous changes both feasible and unfeasible In Figure 34 below, the requirements are addressed. In the frame at the bottom of the Design panel, Doris Designer gets an overview of active engineering change requests that are relevant for her (ECM_Req_1). In the same frame, ECM_Req_2 is taken into account by illustrating how the system has performed an assessment of the corresponding risk, evaluating the change at hand to have a medium risk level. A risk evaluation tool is, as shown, also available at the right-hand menu. Further, at the bottom it is illustrated how Doris may release a change; i.e. send it to a change board, or committee, for approval (ECM_Req_3). In the right-hand menu, Doris is also able to access PLM/CAD systems, where only the prevailing documentation is, or should be, available (ECM_Req_4). As such, she is unable to employ old design rules that may propagate problems and considerable amounts of rework in a longer term. Even if Doris employs the correct design rules, any changes will propagate to other functions nonetheless. As such, functionality for propagation analysis is incorporated in order to satisfy ECM_Req_5. 
Finally, the common-for-all functionality (see Figure 34) highlighted in the frames located at the left side of the screen covers ECM_Req_6 (expert feedback) and ECM_Req_7 (information sources), respectively.
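The workflow behind ECM_Req_1 to ECM_Req_3 — documented change requests, risk assessment of alternatives, and release to a change board — can be sketched as a simple data model. This is an illustrative sketch only, not the workbench's actual implementation; all class and field names are hypothetical, and the risk aggregation rule is a placeholder assumption.

```python
from dataclasses import dataclass, field
from enum import Enum

class Risk(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3

@dataclass
class ChangeAlternative:
    description: str
    design_impact: Risk
    schedule_impact: Risk
    cost_impact: Risk

    def overall_risk(self) -> Risk:
        # Placeholder aggregation rule: the worst individual impact dominates.
        return max((self.design_impact, self.schedule_impact, self.cost_impact),
                   key=lambda r: r.value)

@dataclass
class ChangeRequest:
    # ECM_Req_1: document the change per company standards.
    reason: str
    priority: str
    change_type: str
    alternatives: list = field(default_factory=list)  # ECM_Req_2
    status: str = "draft"

    def release_to_change_board(self):
        # ECM_Req_3: changes are approved by a board, not the originator.
        self.status = "awaiting approval"

cr = ChangeRequest(reason="Ramp-up infeasibility", priority="high",
                   change_type="design")
cr.alternatives.append(ChangeAlternative("Redesign bracket",
                                         Risk.MEDIUM, Risk.LOW, Risk.MEDIUM))
cr.release_to_change_board()
print(cr.status, cr.alternatives[0].overall_risk().name)
# awaiting approval MEDIUM
```

A real system would of course persist such records in the PLM backbone and attach the propagation analysis (ECM_Req_5) to each alternative rather than a single aggregated risk level.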

Figure 34: Engineering change management mock-up (annotated with ECM_Req_1 to ECM_Req_7)

Errors and exceptions: functionality and mock-ups

Similar to order acquisition and change management, the importance of error and exception handling differed between the users. We gathered that, at least within the context of LinkedDesign, error handling was most important for Volkswagen. Nevertheless, it is relevant for all three users, and we have tried to capture their needs in Table 12.

Volkswagen
- Main focus: Error handling/tracking during the manufacturing process.
- Current situation: Information related to the specific part is difficult to obtain. Actual parts are not inspected on the production site; errors are therefore detected much later in the process, often causing delays in production.

Comau
- Main focus: Tendering process. "Non-conformity" is the terminology used.
- Current situation: The requirements on the product of Comau (e.g. an assembly line) are very vague. Errors are often related to wrong and outdated information.

Aker
- Current situation: Errors are detected by the quality routines. Generic checklists are used in the project execution model.

Volkswagen
- Tools used: TRIMEK provides some of the information.
- Current needs: Errors detected on the production site. Detect the error and understand why it occurred. Information related to the specific part easily available for the different roles involved (personalization). Access to historical information about errors and how they were dealt with.

Comau
- Current situation: No database to record the problems encountered and how they were resolved. Minor modifications (several encountered per day) are not recorded. Once the product is accepted by the customer and shipped, any problems are handled by the after-sales department.
- Tools used: x-ENOVIA.
- Current needs: Support for quality control (difficult to achieve due to one-of-a-kind manufacturing). Overview of problems and how they were resolved.

Aker
- Current situation: Errors usually happen because engineers use wrong or old information; the information sources used in the calculations are not in sequence. Information exchange between departments, e.g. fabrication and procurement, is inadequate, which can cause errors.
- Tools used: COMOS, MIPS.
- Current needs: Synchronization of information between departments so that engineers have updated and correct information for their work. Support for relevant checklists in the quality process. Support for keeping the checklists "alive" by providing functionality to detect new generic concepts that may appear. Timely detection of errors, with the reasons and context of the error and the action taken. Overview of the cost of an error.

Related requirements (REQ. REF.): ER_Req_1, ER_Req_3, ER_Req_4

Table 12: Error handling related information from interviews of use case companies

Based on the input from the user stories and the interviews, we identified the following general requirements for error and exception handling:

ER_Req_1: Errors detected during the manufacturing phase, or as early as possible
ER_Req_2: Real-time access to information about errors
ER_Req_3: Access to historical information about errors, such as a list of previous errors and how similar errors have been dealt with in the past
ER_Req_4: Information about the cost of an error
ER_Req_5: Overview of cost-related errors
ER_Req_6: Create and forward error reports to relevant people

To address the need from Volkswagen that errors be detected during the manufacturing phase on the shop floor (ER_Req_1), the manufacturer, Manni, receives a notification in his workspace as shown in Figure 35. The "Plant layout and status" panel shows a red dot where the error has occurred. When the red indicator is selected, a grey box appears showing information about the specific error, along with some functionality to deal with it. A button labeled "Read more" can be used to obtain more information related to the specific error, partly addressing ER_Req_2. ER_Req_2 is also addressed by the button labeled "Propagation Analysis" in the bottom part of the screen, which is dedicated to role-specific functionalities. The expert feedback functionality, as well as the real-time communication capabilities available from the right side of the screen, may also be used by Manni to obtain additional relevant information and help in dealing with the error. A button labeled "Forward error report" can be used to create an error report and forward it to other relevant people who may be affected by the error or who need to be informed about it.
For example, other people working on the production site, such as the machine handler or a supervisor, may need to have this information in real time. This functionality addresses ER_Req_6. Historical information regarding errors, as in ER_Req_3, can be accessed via the "Corrective Action" functionality available at the bottom of the screen. This can be a list of similar errors that have occurred in the past and the corrective actions that were taken in those situations. Similarly, ER_Req_4 and ER_Req_5 can be addressed by the "Cost Analysis" function available at the bottom of the screen. This could be a complete cost analysis of the product after an error has been dealt with, as well as other more specific cost analyses related to the specific error or the product itself. Note that in the mockups we have considered errors as a part of engineering change. However, we are aware that errors are not always a consequence of a change, and that errors may not necessarily lead to a change in the design.
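The notification flow described above — a detected error, its real-time forwarding to affected roles, and a lookup of how similar errors were handled before — can be sketched as follows. This is a minimal illustration, not the workbench's implementation; all names are hypothetical, and the keyword match stands in for whatever similarity search a real ER_Req_3 repository would use.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ErrorEvent:
    station: str
    description: str
    detected_at: datetime = field(default_factory=datetime.now)  # ER_Req_1

class ErrorLog:
    """History of errors, corrective actions and costs (ER_Req_3, ER_Req_4)."""
    def __init__(self):
        self._history = []

    def record(self, event, corrective_action, cost):
        self._history.append((event, corrective_action, cost))

    def similar(self, keyword):
        # Naive keyword match standing in for a real similarity search.
        return [(e, a) for e, a, _ in self._history
                if keyword.lower() in e.description.lower()]

def forward_error_report(event, recipients):
    # ER_Req_6: notify the machine handler, supervisor, etc. in real time.
    return {role: f"Error at {event.station}: {event.description}"
            for role in recipients}

log = ErrorLog()
past = ErrorEvent("Press 2", "Weld seam misaligned")
log.record(past, "Recalibrated fixture", cost=1200.0)

new = ErrorEvent("Press 2", "Weld seam misaligned on door panel")
reports = forward_error_report(new, ["machine handler", "supervisor"])
print(len(log.similar("weld seam")))  # 1
```

In a deployed system the forwarded report would be pushed to the recipients' workspaces rather than returned as a dictionary, and the cost figures would feed the "Cost Analysis" function (ER_Req_4, ER_Req_5).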

Figure 35: Error handling during the manufacturing phase (annotated with ER_Req_1 to ER_Req_6)

In addition to detecting errors during the manufacturing phase, it is also important to detect errors during the quality processes. Today, errors are detected during the quality checking phase, and the users feel that this is too late in the production process to detect and address them; it can also be costly. Thus, there was a strong need to detect errors during the manufacturing phase, as discussed earlier in this section. We believe that if this were possible, there would ideally be no errors, or at least fewer errors, during the quality checking processes. However, to ensure that all errors are detected, it is also important to have this capability available during the quality checking processes, as shown in Figure 34 (engineering change) and Figure 36 (error detection).

Figure 36: Error Detection in Quality Checks

Order fulfillment: functionality and mock-ups

The production strategies of the three companies vary. Volkswagen has a make-to-stock strategy. Comau and Aker have a mix of make-to-stock and engineer-to-order, with a majority of their activity within one-off engineering projects. Even though their main focus differs, many similar challenges and future needs were identified. Table 13 below summarizes the companies' status and needs with respect to order fulfillment.

Volkswagen
- Main focus: Modular design strategy. Supplier collaboration is critical.
- Current situation: Modular product platforms - fewer parts that are used for several car models. Has a good database for stock levels and lead times; it is uncertain to which degree this is shared.

Comau
- Main focus: Internal collaboration on change orders; learning from previous projects.
- Current situation: PowerTrain unit: engineering projects. Robot unit: make-to-order/make-to-stock. Sometimes the lead time of a particular material is longer than the lead time of the project.

Aker
- Main focus: Communication with fabrication units and subcontractors, across technical disciplines and time zones.
- Current situation: One-off projects: front-end studies. Some mass production (process equipment for subsea). Coordination of resources across multiple projects is conducted by the Project Management Office.

Volkswagen
- Current situation (continued): Production plans are updated weekly, then put into a database and distributed to the tier producers. Decreasing CO2 emissions is critical.
- Tools used: Catia for design tasks.
- Future needs: Collaboration with suppliers in new product development is essential. Sharing stock level visibility with suppliers and customers is important. Mobile phone/smart phone apps for internal information dissemination are interesting. "Today, you have to know the person involved in the design process. It would be a nice way to access one or multiple experts. Would like functionality for assigning and receiving tasks. I think people would be able to work with this. Fast and quick overview."

Comau
- Current situation (continued): Many frequent change orders. Changes come from two sources: the customer, or internally when potential for process improvements is seen (OF_Req_3, OF_Req_4). Changes need to be passed around internally; technical managers need to work with the project managers who approve changes.
- Tools used: ENOVIA PLM for changes. SAP for orders. MS Project for Gantt charts. "We have several interfaces, but the project manager almost always uses Microsoft Project, while the technical leader uses PLM; another person, who is project manager of purchasing and manufacturing, uses SAP."
- Future needs: "Often the customer does not know from the beginning what he really wants. Our challenge is to understand what it will cost to provide what the customer wants." "One problem is that sometimes we make very similar projects. What we miss is knowledge about how we resolved past problems. We have thought about an internal Wikipedia for how the project developed over time - not drawings etc., but the issues we met and the communication. The problem is that we do not have any tool or database where we record all the problems we find and the steps taken to resolve them."

Aker
- Current situation (continued): Late changes in a project have severe consequences for fabrication.
Comau (continued)
- Future needs: Searching knowledge history would be very interesting and useful. A specific tool for risk management.

Aker (continued)
- Future needs: Communication with subcontractors and fabrication units can be challenging, particularly in southern/eastern countries. It is important to know how far fabrication has progressed and whether changes are possible without cutting off what has already been done. "A lot of the time they invent good solutions in one project, and the other project would like to use them." Passing around lessons learnt is a challenge. Collaboration support across multiple time zones.

Related requirements (REQ. REF.): OF_Req_1, OF_Req_2, OF_Req_3

Table 13: Order fulfilment related information from interviews of use case companies

By combining some of the most common requirements acquired during the interviews with requirements found in literature (see section ), we have summarized six requirements for the order fulfillment module. These are described below, followed by an illustration of how the mock-up relates to these requirements:

OF_Req_1: Facilitate collaboration across domains, e.g. project manager and technical leads jointly assess change orders to align business issues with technical questions
OF_Req_2: Easily find how issues were handled and solved in previous projects
OF_Req_3: Overview of numerous changes and keep track of change history
OF_Req_4: Differentiate between different sources of change orders
OF_Req_5: Visualize and share current orders, production statuses, plans and forecasts for the various parties involved, with real-time data
OF_Req_6: Track orders from acceptance, through order entry, order routing, assembly, shipping, installation and invoicing

In Figure 37, the order fulfillment module of the collaborative planning workbench is illustrated. At the left side of the screen, a collaboration area is shown, where Manny the Manufacturing Manager can contact the Project Manager and Technical Leads. Within the workbench, they can see the same information and jointly assess change orders to align business issues with technical questions (OF_Req_1). At the top of the screen the knowledge repository is highlighted. Here, the user can browse or search for the history of how issues were handled and solved in previous projects (OF_Req_2). Further, at the bottom right of the Change order panel, the highlighted area shows a full overview of current changes within the project, with ID numbers, date of request, completion date, responsible person and change reports (OF_Req_3).
The top right area of the Change order panel shows how Manny the Manufacturing Manager can enter different types of changes, for instance whether a change relates to a delivery schedule due date, an extra product feature, a new customer requirement, etc. (OF_Req_4). The rightmost panel in Figure 37 shows the progress status of order completion. Manny can share this with supply chain partners by sending a link to this information element (OF_Req_5). Finally, at the left side of Figure 37, the functionalities for order tracking are shown, with functions for viewing or entering new information about order acceptance, order entry, order routing, assembly, shipping, installation and invoicing (OF_Req_6).
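Order tracking through the stages of OF_Req_6, together with the change log (OF_Req_3), the change source (OF_Req_4) and a shareable progress figure (OF_Req_5), can be sketched as a small state machine. This is an illustrative sketch under our own assumptions, not the workbench's actual data model; the class, stage names and the percentage formula are hypothetical.

```python
# OF_Req_6 stages, in order.
STAGES = ["acceptance", "order entry", "order routing", "assembly",
          "shipping", "installation", "invoicing"]

class Order:
    def __init__(self, order_id):
        self.order_id = order_id
        self.stage_index = 0
        self.change_log = []          # OF_Req_3: history of change orders

    @property
    def stage(self):
        return STAGES[self.stage_index]

    def advance(self):
        # Move the order to the next lifecycle stage, stopping at invoicing.
        if self.stage_index < len(STAGES) - 1:
            self.stage_index += 1

    def request_change(self, source, description):
        # OF_Req_4: record where the change originated (customer / internal).
        self.change_log.append({"source": source, "description": description,
                                "at_stage": self.stage})

    def progress(self):
        # OF_Req_5: a shareable completion status for supply chain partners.
        return round(self.stage_index / (len(STAGES) - 1) * 100)

o = Order("PO-4711")
for _ in range(3):
    o.advance()
o.request_change("customer", "Extra product feature")
print(o.stage, f"{o.progress()}%", o.change_log[0]["source"])
# assembly 50% customer
```

Recording the stage at which each change was requested is what makes "how far has fabrication reached, and is the change still possible?" answerable later.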

Figure 37: Order fulfilment mock-up (annotated with OF_Req_1 to OF_Req_6)

4.4 Summary of collaborative planning requirements

The requirements for functionality which were described and illustrated through mock-ups above are summarized in Table 14 below.

General requirements for the collaborative planning workbench:
- Personalized and role-based support and capabilities
- Easy access to all relevant information needed to perform tasks
- Access to historical information
- Real-time access to domain knowledge and expertise
- An overview of all tasks
- Support for internal knowledge and experience transfer

Order acquisition
- OA_Req_1: Functionality for both marketing and tendering activities
- OA_Req_2: Possibility to distribute signals from the market to the organization, and vice versa
- OA_Req_3: Possibility to manage several tenders at a time
- OA_Req_4: Functionality for evaluating the risk of tenders
- OA_Req_5: Stage gate functionality to avoid losing important information in tendering
- OA_Req_6: Should store information about previous tenders and enable automatic searching within these
- OA_Req_7: Should accommodate sharing of resources during tendering

Engineering change management
- ECM_Req_1: Facilitate documentation of on-going engineering changes according to company standards, e.g. providing the reason for change, priority of the change, type of change, etc.
- ECM_Req_2: Enable risk assessment of multiple change alternatives, e.g. the impact on design, production schedules, supplier relationships, costs, etc.
- ECM_Req_3: Include functionality for approving changes, e.g. by a change board/committee
- ECM_Req_4: Ensure updating of documentation so that only prevailing documentation is readily available
- ECM_Req_5: Incorporate functionality for change propagation
- ECM_Req_6: Provide direct expert feedback for rapid feasibility control
- ECM_Req_7: Store information about previous changes, both feasible and unfeasible

Errors and exceptions
- ER_Req_1: Errors detected during the manufacturing phase, or as early as possible
- ER_Req_2: Real-time access to information about errors
- ER_Req_3: Access to historical information about errors, such as a list of previous errors and how similar errors have been dealt with in the past
- ER_Req_4: Information about the cost of an error
- ER_Req_5: Overview of cost-related errors
- ER_Req_6: Create and forward error reports to relevant people

Order fulfillment
- OF_Req_1: Facilitate collaboration across domains, e.g. project manager and technical leads jointly assess change orders to align business issues with technical questions
- OF_Req_2: Easily find how issues were handled and solved in previous projects
- OF_Req_3: Overview of numerous changes and keep track of change history
- OF_Req_4: Differentiate between different sources of change orders
- OF_Req_5: Visualize and share current orders, production statuses, plans and forecasts for the various parties involved, with real-time data
- OF_Req_6: Track orders from acceptance, through order entry, order routing, assembly, shipping, installation and invoicing

Table 14: Summary of collaborative planning requirements

The next section describes how the collaborative planning workbench is positioned within the LEAP architecture.

4.5 LEAP and the Collaborative Planning Workbench

This section describes the collaborative planning workbench within the context of the Linked Engineering and Manufacturing Platform (LEAP) architecture. The proposed collaborative platform is based on a personal as well as role-based workspace for the various people in an organisation. Functionalities are packaged and made available to users according to their roles and the processes that they may need to perform, such as engineering change management. It brings together ideas from the Virtual Obeya and "MyObeya" concepts and the "Streamwork Collaboration Dashboard" described in Deliverable D1.1, Data Management and Integration Components in the LinkedDesign Architecture. The collaboration platform can be considered a detailed view of "MyObeya", describing the kinds of functionalities each role may want for performing specific tasks within the organisation. The mockups show possible interfaces for "MyObeyas" that provide access to dedicated as well as general functionalities required to perform various tasks.
Some of these functionalities may be supported by the Streamwork Collaboration Dashboard, while others may be other dedicated capabilities, such as the widgets for KBE or LCA, as described in D1.1.

Figure 38: Connection between LEAP and the Collaborative Planning workbench

While the design of the collaborative workspace has been driven by the user stories, it has also been influenced by ideas of Active Knowledge Models (AKM) and the idea of modelling to support role-based workspaces. Context-rich workspace models may be created using the IRTV modelling language, where IRTV represents Information, Role, Task, View. The models are composed of content that focuses on dependencies among and between roles (R), their main tasks (T), supporting views (V) and relevant information elements (I). For example, the mockups show the workspace for specific roles, where each role is able to interact with other roles through the real-time communication capabilities as well as the timely propagation of information to the relevant roles, and where the information that is relevant for each role to conduct specific tasks is made easily available. Models created using this approach can automatically generate workspaces for the specific roles, and users can affect the model through their workspaces, which automatically updates the model. This methodology is described in detail in Appendix 4.
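The idea of generating a role-specific workspace from an IRTV-style model can be sketched as below. The dictionary representation of the model, the role names and the `generate_workspace` function are our own illustrative assumptions; an actual IRTV model is a graphical model, not a Python structure.

```python
# Hypothetical IRTV-style model: each role (R) is mapped to its main
# tasks (T), supporting views (V) and relevant information elements (I).
model = {
    "Doris Designer": {
        "tasks": ["assess change request"],
        "views": ["Design panel", "Risk evaluation"],
        "information": ["active change requests", "design rules"],
    },
    "Manni Manufacturer": {
        "tasks": ["handle shop-floor error"],
        "views": ["Plant layout and status"],
        "information": ["error reports", "corrective actions"],
    },
}

def generate_workspace(role):
    """Derive a role-specific workspace configuration from the model,
    mirroring how AKM-style models can generate role workspaces."""
    spec = model[role]
    return {"role": role,
            "panels": spec["views"],       # what the role sees
            "feeds": spec["information"]}  # what is propagated to the role

ws = generate_workspace("Doris Designer")
print(ws["panels"])  # ['Design panel', 'Risk evaluation']
```

The point of the sketch is the direction of derivation: the workspace is a projection of the model, so editing the model updates every affected role's workspace, and (as described above) changes made in a workspace can flow back into the model.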

4.6 Collaborative planning framework

The purpose of this section is to propose a concept of a collaborative planning, information and decision support system (CoPIDSS) to carry out collaboration between partners within and across enterprises, and thereby meet the main requirements for collaborative platform support identified through the user scenarios. Findings in the literature, observations in industry, and discussions with practitioners imply that collaboration system support can be assigned to at least three categories: manual coordination, bolt-on software, and integrated collaborative systems.

The first category is manual coordination, which refers to an arrangement where both the enterprises and their information systems are separated from one another. In such a setting, design engineers run their specialized design software and the production planners use the ERP system's production planning module. The coordination is handled informally through meetings, e-mails and spreadsheets, and at best with groupware solutions. Over the years, some published works have described arrangements that would clearly fall under this category (e.g. Hoevers, 1986; Jahnukainen et al., 1995; Tenhiälä & Eloranta, 2005). Moreover, personal experiences and discussions with industry representatives suggest that this approach is relatively common.

The second category is bolt-on software offered for conventional ERP systems. The term stands for supplementary applications that are integrated with enterprise systems through standard interfaces (Gupta, 2000). It seems that some software developers have awakened to the deficiencies of ERP systems' own project modules. For example, the German company Wassermann offers a solution which seems to tackle project-based MPC.
The company states that a conventional enterprise system's planning module "presents only specific parts of the process, without demonstrating the important dependencies and synchronization issues between them as a whole or in detail" (Wassermann, 2005). The last category is integrated collaborative systems. Such platforms can provide support for multi-user editing, archiving, information sharing and other collaborative activities. They also allow users concurrent information creation, communication, data transfer, revision, etc.

Figure 39: Proposal for collaborative planning, information and decision support system (CoPIDSS)

CoPIDSS is a platform where the separate systems used within and across manufacturing enterprises are connected to the various functions of collaborative planning, which include product portfolio management, collaborative product customization, collaborative product development, collaborative product manufacturing, collaborative component supply and extended product service (Terwiesch and Loch, 1999). For example, a collaborative product manufacturing function consists of an ERP component, for which some of the most widely used vendor solutions are SAP and Microsoft Dynamics. These systems are then integrated through middleware approaches such as enterprise application integration (EAI) or service-oriented architectures (SOA). The users get access through a common portal, where the information pertaining to a specific group and role is accessible. A user in a designer role sees drawings in a PLM system, such as Teamcenter, which he will modify or change as per the requirements. The drawings can then be checked by a knowledge engineer or design engineer, possibly from a different enterprise. They can collaborate using electronic communication tools such as e-mail, web publishing, revision control, digital mock-ups, and online conferencing and chat applications in real time. In addition, real-time feedback or expert opinion can be available for better decision support. CoPIDSS can help the users collect, organize, manage, store and convert data within and between enterprises, and it can avoid loss of information. Also, once verified, these data can be stored in the form of knowledge and skills in engineering change and error management systems (ECEMS) and in engineering knowledge management systems (EKMS), respectively.
ECEMS and EKMS are repositories which contain descriptions of knowledge sources on engineering changes, errors, time data, source descriptions, information and links between knowledge sources. In case of errors or engineering changes, the ECEMS and EKMS can be designed to proactively alert

the design engineer or project manager to similar situations that have occurred in the past. This can also help the enterprises carry out real-time joint viewing, digital mock-ups and joint decision making. These repositories are linked to a central element called a connector, which ensures understanding between different users through ontology-based services (Maedche et al., 2001). The connector element consists of sub-elements which manage all the applications and data conversions (style and format), and ensure a safe and flexible information system with robust security and confidentiality characteristics. The portal not only ensures unified access but can also provide personalization after authentication. This gives the different actors and users the flexibility to interact and communicate within the portal. The open and flexible nature of the framework will enable firms to integrate their technology, processes and information with all their partners along their extended value chain. These partners in turn may also plug in their respective technologies, processes and information, thus creating a collaborative planning network. This will give the enterprises' decision makers (internal and external) access to all the relevant and detailed information for collaboration and real-time decision making.

Figure 40: Collaborative Planning Framework
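The portal behaviour described above — unified access plus personalization after authentication, with each role seeing only its own systems and functions — can be sketched minimally as follows. All user names, passwords, role labels and system names here are invented for illustration; a real CoPIDSS portal would delegate authentication and role mapping to the enterprise identity infrastructure.

```python
# Hypothetical role-to-system mapping for the CoPIDSS portal idea:
# after authentication, a user sees only what is mapped to their role.
ROLE_ACCESS = {
    "designer": ["PLM (e.g. Teamcenter)", "digital mock-ups", "expert feedback"],
    "production planner": ["ERP planning module", "order status"],
}

# Illustrative user store: username -> (password, role).
USERS = {"alice": ("secret", "designer"),
         "bob": ("hunter2", "production planner")}

def login(username, password):
    """Authenticate at the portal and return a personalized session,
    or None if the credentials are rejected."""
    stored = USERS.get(username)
    if stored is None or stored[0] != password:
        return None
    role = stored[1]
    return {"user": username, "role": role, "systems": ROLE_ACCESS[role]}

session = login("alice", "secret")
print(session["role"], session["systems"])
assert login("alice", "wrong") is None
```

The design point is that personalization happens once, at the portal boundary: downstream systems receive an already-scoped session instead of each enforcing role rules separately.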

5 Conclusions and further work

This deliverable, together with D5.1 and D5.3, builds the generic framework for the development of the Virtual Obeya prototype in D5.4. Specific applications of the results are meant to be adapted and used, as appropriate, also in the use case work packages.

5.1 Contribution

The main contributions from this deliverable are outlined as follows:

Task 5.2 Lean engineering collaboration system diagnostics and optimization:
- A generic approach to evaluate information sources for reuse;
- An assessment of highly relevant knowledge sources to be used in a Virtual Obeya. The assessment indicated opportunities and challenges related to integrating data from different knowledge sources into a single, meaningful user interface supporting different roles in an organization;
- A framework for understanding different approaches, based on different levels of ambition, to knowledge creation in a Virtual Obeya, which can be used as a guide for the particular support for this provided in the Virtual Obeya;
- An overview of modes of knowledge reuse and process improvement in collaborative environments. Specifically important is the link to potential tool support for achieving learning and improvement through reflection, which provides ideas for functionality to include in a Virtual Obeya.

Task 5.3 Collaborative planning through effective order management:
- A literature review on collaborative planning, order management and engineering changes, leading to a set of requirements for functionality;
- A study of a PLM system (Teamcenter) and two ERP systems (Microsoft Dynamics AX and SAP) to identify state-of-the-art solutions and current shortcomings;
- A collaborative planning framework;
- A collaborative planning workbench;
- Functionality and user interfaces of the collaborative planning workbench.
- The mockups can serve as a starting point for further development within the LEAP platform, the Virtual Obeya and for European commercial system providers;

- Of special importance is the link to potential tool support for achieving learning and improvement through reflection, which should provide ideas for functionality to include in a Virtual Obeya;
- A link between functional requirements and user interface mock-ups for:
  o Order acquisition business processes
  o Engineering change management processes
  o Exceptions and error handling processes
  o Order fulfillment business processes

5.2 Limitations

This work has addressed a large problem area, and we have had to make some pragmatic choices to be able to provide an analysis useful for the project. Basing the work on the specific use cases is, on the one hand, positive for the project as such, making it possible to come up with approaches and frameworks that can be directly useful for the other parts of the project. As with any (multi-)case study, though, this approach has some limitations with regard to generalizability. We have tried to address this threat to the general validity of the results by also basing the work on more general frameworks and on the results of a broader literature study.

5.3 Further work and research

As for the assessment of knowledge sources, the specific use of the SEQUAL framework can be examined in more detail on these and other types of tools, but it should also be used on other cases to validate the generic framework. For the other frameworks used as well, additional case studies in other settings should be performed and used to feed back into the further development of these frameworks. Work is ongoing in collaboration with other EU projects to pursue these possibilities. Further, the collaborative planning workbench concepts will be studied further in the remaining tasks of WP 5 and 6, such as:
- Task 5.5 Develop solution for Virtual Obeya;
- Task 6.5 Application scenarios digital factory design;

6 Scientific references

Ahire, S.L. and Waller, M.A. (1994). "Incremental and Breakthrough Process Improvement: An Integrative Framework". International Journal of Logistics Management, 5(1).
Akkermans, H., Bogerd, P. and Van Doremalen, J. (2004). "Travail, transparency and trust: A case study of computer-supported collaborative supply chain planning in high-tech electronics". European Journal of Operational Research, 153(2).
Akkermans, H.A., Bogerd, P., Yücesan, E. and Van Wassenhove, L.N. (2003). "The impact of ERP on supply chain management: Exploratory findings from a European Delphi study". European Journal of Operational Research, 146(2).
Alvarez, E. (2007). "Multi-plant production scheduling in SMEs". Robotics and Computer-Integrated Manufacturing, 23(6).
Amaro, G., Hendry, L. and Kingsman, B. (1999). "Competitive advantage, customisation and a new taxonomy for non make-to-stock companies". International Journal of Operations & Production Management, 19(4).
APICS (2007). Using Information Technology to Enable Supply Chain Management. APICS Certified Supply Chain Professional Learning System, The Association for Operations Management, Alexandria, VA.
Argyris, C. and Schön, D. (1978). Organizational Learning: A Theory of Action Perspective. Addison-Wesley, Reading, MA, USA.
AVEVA (2012). Software for Plant Engineering and Design. Available at: [Last accessed ]
Awad, H.A.H. and Nassar, M.O. (2010). "Supply chain integration: Definition and challenges". Proceedings of the International Multiconference of Engineers and Computer Scientists, IMECS 2010, March 17-19, Hong Kong.
Bailey, J.E. and Pearson, S.W. (1983). "Development of a Tool for Measuring and Analyzing Computer User Satisfaction". Management Science, 29(5).
Balcerak, K.J. and Dale, B.G. (1992). "Engineering change administration: the key issues". Computer Integrated Manufacturing Systems, 5(2).
Barratt, M. and Oke, A. (2007).
"Antecedents of supply chain visibility in retail supply chains: A resource-based theory perspective". Journal of Operations Management, 25(6).
Batini, C. and Scannapieco, M. (2006). Data Quality: Concepts, Methodologies and Techniques. Springer-Verlag, Berlin.
Beckman, S.L. and Rosenfield, D.B. (2008). Operations Strategy: Competing in the 21st Century. McGraw-Hill/Irwin, Boston, USA.
Bertin, J. (1983). Semiology of Graphics: Diagrams, Networks, Maps. University of Wisconsin Press.
Bertrand, J.W.M. and Muntslag, D.R. (1993). "Production control in engineer-to-order firms". International Journal of Production Economics, 30-31.
Blackstone, J.H. (2008). APICS Dictionary. APICS, Athens, GA.
Briscoe, G. and Dainty, A. (2005). "Construction supply chain integration: an elusive goal?". Supply Chain Management: An International Journal, 10(4).
Brown, J. (2006). Managing Product Relationships: Enabling Iteration and Innovation in Design. Aberdeen Group, Boston, Massachusetts.
Burt, D.N. and Doyle, M.F. (1993). The American Keiretsu: A Strategic Weapon for Global Competitiveness. Business One Irwin, Homewood, IL, USA.

Caridi, M., Crippa, L., Perego, A., Sianesi, A. and Tumino, A., (2010). "Measuring visibility to improve supply chain performance: A quantitative approach". Benchmarking: An International Journal, 17(4), pp.
Caron, F. and Fiore, A., (1995). "Engineer to order companies: how to integrate manufacturing and innovative processes". International Journal of Project Management, 13(5), pp.
Carroll, J. M., (1999). "Five Reasons for Scenario-Based Design". In: 32nd Hawaii International Conference on System Sciences, Hawaii, IEEE.
Carroll, J.M., (1995). Scenario-based Design: Envisioning Work and Technology in System Development. John Wiley & Sons, Inc., New York, NY, USA.
Chen, P. P-S., (1976). "The entity-relationship model: Towards a unified view of data". ACM Transactions on Database Systems, 1(1), pp.
Cheng, H. and Chu, X., (2010). "A network-based assessment approach for change impacts on complex product". Journal of Intelligent Manufacturing, 23(4), pp.
Chesbrough, H., (2011). Open Services Innovation: Rethinking Your Business to Grow and Compete in a New Era. Jossey-Bass, San Francisco, CA, USA.
Chorafas, D. N., (2005). The Real-Time Enterprise. Auerbach Publications, Boca Raton, FL.
CIMdata, (2010). Teamcenter unified: Siemens PLM Software's Next Generation PLM Platform. White paper, June 2010.
Cooper, M. C., Lambert, D. M. and Pagh, J. D., (1997). "Supply Chain Management: More Than a New Name for Logistics". International Journal of Logistics Management, 8(1), pp. 1-14.
Cox, A., (2004). "The art of the possible: relationship management in power regimes and supply chains". Supply Chain Management: An International Journal, 9(5), pp.
Croxton, K., (2003). "The Order Fulfilment Process". International Journal of Logistics Management, 14(1), pp.
Dale, B. G., (1982). "The management of engineering change procedure". Engineering Management International, 1(3), pp.
Damgaard, (2013). Company History. Available at: [Last accessed ]
Danese, P. and Romano, P., (2004). "Improving Inter-Functional Coordination to Face High Product Variety and Frequent Modifications". International Journal of Operations & Production Management, 24(9/10), pp.
DiPrima, M. R., (1982). "Engineering change control and implementation considerations". Production and Inventory Management Journal, 23(1), pp.
Ebben, M., Hans, E. and Olde Weghuis, F., (2005). "Workload Based Order Acceptance in Job Shop Environments". OR Spectrum, 27(1), pp.
Eckert, C. M., Clarkson, P. J. and Earl, C. F., (2005). "Predictability of change in engineering: a complexity view". In: Proceedings of the 2005 ASME International Design Engineering Technical Conferences (IDETC/CIE2005), Long Beach, California, USA.
Eckert, C. M., Clarkson, P. J. and Zanker, W., (2004). "Change and customisation in complex engineering domains". Research in Engineering Design, 15(1), pp.
Elfving, J.A., Tommelein, I.D. and Ballard, G., (2005). "Consequences of competitive bidding in project-based production". Journal of Purchasing and Supply Management, 11(4), pp.
Falkenberg, E. D., Hesse, W., Lindgreen, P., Nilsson, B. E., Oei, J. L. H., Rolland, C., Stamper, R. K., Assche, F. J. M. V., Verrijn-Stuart, A. A. and Voss, K., (1996). A Framework of Information System Concepts - The FRISCO Report. IFIP WG 8.1 Task Group FRISCO.

Forza, C. and Salvador, F., (2002). "Managing for Variety in the Order Acquisition and Fulfilment Process: The Contribution of Product Configuration Systems". International Journal of Production Economics, 76(1), pp.
Funk, G., (2001). "Enterprise integration: join the successful 20%". Hydrocarbon Processing, 80(4), pp.
Gao, Q., Du, Z. and Qu, Y., (2008). "Analysis on Engineering Change Management Based on Information Systems". In: Yan, X.T., Ion, W.J. and Eynard, B. (Eds.), Global Design to Gain a Competitive Edge, Springer, London.
Gartner, (2008). "Magic Quadrant for Manufacturing Product Life Cycle Management, 4Q07", by analysts Marc Halpern & Dan Miklovic, published 4 Jan 2008, ID: G.
Gartner, (2011). "Magic Quadrants and MarketScopes: How Gartner Evaluates Vendors Within a Market". Available at: [last accessed ]
Gartner, (2012). "Magic Quadrant for Single-Instance ERP for Product-Centric Midmarket Companies", by analysts Christian Hestermann, Chris Pang & Nigel Montgomery, published 27 June 2012, ID: G.
Giffin, M., de Weck, O., Bounova, G., Keller, R., Eckert, C. M. and Clarkson, P. J., (2007). "Change propagation analysis in complex technical systems". In: Proceedings of the 2007 ASME International Design Engineering Technical Conferences & Computers and Information in Engineering Conference (IDETC/CIE 2007).
Grudin, J., (1994). "Groupware and social dynamics: eight challenges for developers". Communications of the ACM, 37(1), pp.
Handfield, R. B. and Nichols, E. L., (2002). Supply Chain Redesign: Transforming Supply Chains into Integrated Value Systems. Financial Times Prentice Hall, Upper Saddle River.
Hanna, A. S., Camlic, et al., (2004). "Cumulative Effect of Project Changes for Electrical and Mechanical Construction". Journal of Construction Engineering and Management, 130(6), pp.
Hansen, M. T. The Collaboration Toolkit. Harvard Business Press.
Hansen, M. T., (2009). Collaboration: How Leaders Avoid the Traps, Create Unity, and Reap Big Results. Harvard Business Press, Boston, USA.
Hicks, C., McGovern, T. and Earl, C.F., (2000). "Supply chain management: A strategic issue in engineer to order manufacturing". International Journal of Production Economics, 65(2), pp.
Holweg, M., Disney, S., Holmström, J. and Småros, J., (2005). "Supply Chain Collaboration: Making Sense of the Strategy Continuum". European Management Journal, 23(2), pp.
Høydalsvik, G. M. and Sindre, G., (1993). "On the purpose of object-oriented analysis". In: Proceedings of the Conference on Object-Oriented Programming Systems, Languages, and Applications (OOPSLA'93), ACM Press, pp.
Huang, G. Q. and Mak, K. L., (1999). "Current practices of engineering change management in UK manufacturing industries". International Journal of Operations & Production Management, 19(1), pp.
Huang, G. Q., Yee, W. Y. and Mak, K. L., (2001). "Development of a web-based system for engineering change management". Robotics and Computer-Integrated Manufacturing, 17(3), pp.
Huang, G.Q. and Mak, K. L., (1998). "Computer aids for engineering change control". Journal of Materials Processing Technology, 76(1-3), pp.
Ingvaldsen, J. E., (2011). Semantic Process Mining of Enterprise Transaction Data. PhD thesis, NTNU, Trondheim, Norway.
Ireland, P., (2004). "Managing appropriately in construction power regimes: understanding the impact of regularity in the project environment". Supply Chain Management: An International Journal, 9(5), pp.

Ivert, L. K. and Jonsson, P., (2011). "Problems in the onward and upward phase of APS systems implementation: why do they occur?". International Journal of Physical Distribution & Logistics Management, 41(4), pp.
Ivert, L. K. and Jonsson, P., (2010). "The potential benefits of advanced planning and scheduling systems in sales and operations planning". Industrial Management & Data Systems, 110(5), pp.
Jarratt, T. A. W., Eckert, C. M., Caldwell, N. H. M. and Clarkson, P. J., (2011). "Engineering change: an overview and perspective on the literature". Research in Engineering Design, 22, pp.
Jarratt, T. A. W., Eckert, C. M. and Clarkson, P. J., (2004a). "Engineering change". In: Clarkson, P.J. and Eckert, C.M. (eds), Design Process Improvement, Springer, New York.
Jonsson, P., Kjellsdotter, L. and Rudberg, M., (2007). "Applying advanced planning systems for supply chain planning: three case studies". International Journal of Physical Distribution and Logistics Management, 37(19), pp.
Jørgensen, H. D., (2004). Interactive Process Models. PhD thesis, NTNU, Trondheim, Norway.
Karkkainen, M., Holmstrom, J., Framling, K. and Artto, K., (2003). "Intelligent products - a step towards a more effective project delivery chain". Computers in Industry, 50(2), pp.
Kelle, P. and Akbulut, A., (2005). "The role of ERP tools in supply chain information sharing, cooperation, and cost optimization". International Journal of Production Economics, 93-94, pp.
Kilger, C. and Reuter, B., (2005). "Collaborative Planning". In: Stadtler, H. and Kilger, C. (Eds), Supply Chain Management and Advanced Planning: Concepts, Models, Software and Case Studies, 3rd ed., Springer-Verlag, Berlin.
Kim, J., Pratt, M. J., Iyer, R. and Sriram, R., (2007). Data Exchange of Parametric CAD Models Using ISO. NISTIR.
Kingsman, B., et al., (1993). "Integrating Marketing and Production Planning in Make-to-Order Companies". International Journal of Production Economics, 30, pp.
Kocar, V. and Akgunduz, A., (2010). "ADVICE: a virtual environment for engineering change management". Computers in Industry, 61(1), pp.
Koçoğlu, İ., İmamoğlu, S. Z., İnce, H. and Keskin, H., (2011). "The effect of supply chain integration on information sharing: Enhancing the supply chain performance". Procedia - Social and Behavioral Sciences, 24, pp.
Koh, S. C. L. and Saad, S. M., (2006). "Managing Uncertainty in ERP-Controlled Manufacturing Environments in SMEs". International Journal of Production Economics, 101(1), pp.
Konijnendijk, P. A., (1994). "Coordinating marketing and manufacturing in ETO companies". International Journal of Production Economics, 37(1), pp.
Krajewski, L., Wei, J.C. and Tang, L.-L., (2005). "Responding to schedule changes in build-to-order supply chains". Journal of Operations Management, 23(5), pp.
Kristensen, K. and Kijl, B., (2010). "Collaborative Performance: Addressing the ROI of Collaboration". International Journal of e-Collaboration, 6(1).
Krogstie, J. and Lillehagen, F., (2008). "Methodologies for Active Knowledge Modelling". In: Halpin, T., Proper, E. and Krogstie, J. (eds), Innovation in Information Systems Modelling: Methods and Best Practices. IGI.
Krogstie, B. R., Prilla, M., Wessel, D. and Knipfer, K., (2012). "Computer-Support for reflective learning at the workplace: A model". In: Proceedings of the IEEE 12th International Conference on Advanced Learning Technologies (ICALT).
Krogstie, J. and Jørgensen, H., (2004). "Interactive models for supporting networked organisations". In: 16th Conference on Advanced Information Systems Engineering, Riga, Latvia. Springer, Berlin Heidelberg New York.
Krogstie, J., (2001). "A Semiotic Approach to Quality in Requirements Specifications". In: Proceedings of the IFIP 8.1 Working Conference on Organizational Semiotics, July, Montreal, Canada.

Krogstie, J., (2012). Model-based Development and Evolution of Information Systems: A Quality Approach. Springer-Verlag, Berlin.
Krogstie, J., (2012b). "Modeling of Digital Ecosystems: Challenges and Opportunities". In: Proceedings of the 13th IFIP WG 5.5 Working Conference on Virtual Enterprises, PRO-VE.
Kump, B., Knipfer, K., Pammer, V., Schmidt, A., Maier, R., Kunzmann, C., Cress, U. and Lindstaedt, S., (2011). "The Role of Reflection in Maturing Organizational Know-how". In: Proceedings of the 6th European Conference on Technology Enhanced Learning (ECTEL).
La Rocca, G., (2012). "Knowledge based engineering: Between AI and CAD. Review of a language based technology to support engineering design". Advanced Engineering Informatics, 26(2), pp.
Lambert, D. M. and Cooper, M. C., (2000). "Issues in Supply Chain Management". Industrial Marketing Management, 29(1), pp.
Lee, H. J., Ahn, H. J., Kim, J. W. and Park, S. J., (2006). "Capturing and reusing knowledge in engineering change management: a case of automobile development". Information Systems Frontiers, 8, pp.
Lee, H. L. and Whang, S., (2000). "Information sharing in a supply chain". International Journal of Manufacturing Technology and Management, 1(1), pp.
Leimbach, T., (2008). "The SAP Story: Evolution of SAP within the German Software Industry". Annals of the History of Computing, IEEE, 30(4), pp.
Lerberg, A., (2012). Development of a Software Application for Supply Chain Order Management. Unpublished master's thesis, Institute for Production and Quality Engineering, NTNU, Trondheim, Norway.
Lillehagen, F. and Krogstie, J., (2008). Active Knowledge Modelling of Enterprises. Springer-Verlag, Berlin.
Lin, F. R. and Shaw, M. J., (1998). "Reengineering the Order Fulfilment Process in Supply Chain Networks". The International Journal of Flexible Manufacturing Systems, 10, pp.
Little, D., Rolling, R., Peck, M. and Porter, K., (2000). "Integrated planning and scheduling in the engineer-to-order sector". International Journal of Computer Integrated Manufacturing, 13(6), pp.
Liu, L. and Yu, E. S., (2002). "Designing Web-Based Systems in Social Context: A Goal and Scenario Based Approach". In: Pidduck, A. B., Ozsu, M. T., Mylopoulos, J. and Woo, C. C. (eds), Advanced Information Systems Engineering: 14th International Conference, CAiSE 2002, Springer-Verlag, Berlin Heidelberg.
Lyons, M. H., (2005). "Future ICT systems - understanding the business drivers". BT Technology Journal, 23(5), pp.
Maiden, N. A. M., Jones, S. V., Manning, S., Greenwood, J. and Renou, L., (2004). "Model-driven Requirements Engineering: Synchronising Models in an Air Traffic Management Case Study". In: Persson, A. and Stirna, J. (eds), CAiSE, Springer-Verlag, Berlin Heidelberg.
Mattessich, P. W., Murray-Close, M. and Monsey, B. R., (2001). Collaboration: What Makes It Work, 2nd ed. Fieldstone Alliance, St. Paul, MN.
Maull, R., Hughes, D. and Bennett, J., (1992). "The role of the bill-of-materials as a CAD/CAPM interface and the key importance of engineering change control". Computing & Control Engineering Journal, 3(2), pp.
McGovern, T., Hicks, C. and Earl, C. F., (1999). "Modelling Supply Chain Management Processes in Engineer-to-Order Companies". International Journal of Logistics Research and Application, 2(2), pp.
Moody, D. L. and Shanks, G., (2003). "Improving the quality of data models: empirical validation of a quality management framework". Information Systems, 28(6), pp.
Moody, D. L., (1998). "Metrics for Evaluating the Quality of Entity Relationship Models". In: Proceedings of the Seventeenth International Conference on Conceptual Modelling (ER '98), Singapore, Lecture Notes in Computer Science, 1507, pp.
Moody, D. L., (2009). "The 'Physics' of Notations: Toward a Scientific Basis for Constructing Visual Notations in Software Engineering". IEEE Transactions on Software Engineering, 35(6), pp.

Moody, D. L. and Shanks, G. G., (1994). "What Makes a Good Data Model? Evaluating the Quality of Entity Relationship Models". In: Proceedings of the 13th International Conference on the Entity-Relationship Approach (ER '94), Manchester, England, pp.
Morris, C., (1938). "Foundations of the Theory of Signs". In: International Encyclopedia of Unified Science, vol. 1. University of Chicago Press, London.
Muntslag, D. R., (1994). "Profit and Risk Evaluation in Customer Driven Engineering and Manufacturing". International Journal of Production Economics, 36(1), pp.
Nawrocki, J., Jasinski, M., Walter, B. and Wojciechowski, A., (2002). "Extreme Programming Modified: Embrace Requirements Engineering Practices". In: IEEE Joint International Conference on Requirements Engineering (RE'02), IEEE.
Nonaka, I., (1994). "A dynamic theory of organizational knowledge creation". Organization Science, 5(1), pp.
Olhager, J., (2003). "Strategic positioning of the order penetration point". International Journal of Production Economics, 85(3), pp.
Pandit, A. and Zhu, Y., (2007). "An ontology-based approach to support decision-making for the design of ETO (Engineer-To-Order) products". Automation in Construction, 16(6), pp.
Parush, A., Hod, A. and Shtub, A., (2007). "Impact of visualization type and contextual factors on performance with enterprise resource planning systems". Computers & Industrial Engineering, 52(1), pp.
Pisano, G.P. and Verganti, R., (2008). "Which Kind of Collaboration Is Right for You?". Harvard Business Review, December.
Porter, K., Little, D. and Peck, M., (1999). "Manufacturing classifications: relationships with production control systems". Integrated Manufacturing Systems, 10(4), pp.
Price, R. and Shanks, G., (2004). "A Semiotic Information Quality Framework". In: IFIP WG 8.3 International Conference on Decision Support Systems (DSS2004), Prato, Italy, 1-3, 2004, pp.
Price, R. and Shanks, G., (2005). "A semiotic information quality framework: Development and comparative analysis". Journal of Information Technology, 20(2), pp.
Quinn, F., (2003). "The Elusive Goal of Integration". Supply Chain Management Review, 7.
Reidelbach, M. A., (1991). "Engineering change management for long-lead-time production environments". Production and Inventory Management Journal, 32(2), pp.
Riley, D. R., Diller, B. E. and Kerr, D., (2005). "Effects of Delivery Systems on Change Order Size and Frequency in Mechanical Construction". Journal of Construction Engineering and Management, 131(9), pp.
Rosen, E., (2007). The Culture of Collaboration. Red Ape Pub.
Salvador, F. and Forza, C., (2004). "Configuring Products to Address the Customization-Responsiveness Squeeze: A Survey of Management Issues and Opportunities". International Journal of Production Economics, 91(3), pp.
SAP, (2013a). SAP Company History. Available at: [last accessed ]
SAP, (2013b). SAP Business Suite, Powered by SAP HANA. Applications brochure, CMP23378. Available at: [last accessed ]
Sawhney, R. and Piper, C., (2002). "Value Creation through Enriched Marketing-Operations Interfaces: An Empirical Study in the Printed Circuit Board Industry". Journal of Operations Management, 20(3), pp.
Scheer, A.W., (1999). ARIS - Business Process, 3rd ed. Springer, Berlin Heidelberg, New York.
Setia, P., Sambamurthy, V. and Closs, D., (2008). "Realizing business value of agile IT applications: antecedents in the supply chain networks". Information Technology and Management, 9(1), pp.

Shannon, C. E. and Weaver, W., (1963). The Mathematical Theory of Communication. University of Illinois Press.
Shneiderman, B., (1992). Designing the User Interface: Strategies for Effective Human-Computer Interaction, 2nd ed. Addison Wesley, Reading, Massachusetts.
Siemens, (2013a). Explore Siemens PLM Software. Available at: [last accessed ]
Siemens, (2013b). Teamcenter Resource Library. Available at: [last accessed ]
Stadtler, H. and Kilger, C., (2005). Supply Chain Management and Advanced Planning: Concepts, Models, Software and Case Studies, 3rd ed. Springer, Berlin.
Stadtler, H., (2005). "Supply chain management and advanced planning - basics, overview and challenges". European Journal of Operational Research, 163(3), pp.
Strandhagen, O., Alfnes, E. and Dreyer, H. C., (2006). "Supply Chain Control Dashboards". In: Conference Proceedings, Production and Operations Management Society (POMS), Boston, US.
Suchman, L., (1995). "Making Work Visible". Communications of the ACM, 38(9), pp.
Tavcar, J. and Duhovnik, J., (2005). "Engineering change management in individual and mass production". Robotics and Computer-Integrated Manufacturing, 21(3), pp.
Tenhiälä, A. and Ketokivi, M., (2012). "Order Management in the Customization-Responsiveness Squeeze". Decision Sciences, 43(1), pp.
Terwiesch, C. and Loch, C. H., (1999). "Managing the process of engineering change orders: the case of the climate control system in automobile development". Journal of Product Innovation Management, 16(2), pp.
Tsichritzis, D. and Klug, A., (1978). "The ANSI/X3/SPARC DBMS Framework". Information Systems, 3, pp.
Van der Aalst, W. M. P., (2011). Process Mining: Discovery, Conformance and Enhancement of Business Processes. Springer-Verlag.
van Giressel, N., (2004). Process Mining in SAP R/3. Master's thesis, Eindhoven University of Technology.
VICS, (2013). Voluntary Inter-industry Commerce Solutions. Available at: [last accessed ]
Vollman, T. E., Berry, W. L., Whybark, D. C. and Jacobs, F. R., (2005). Manufacturing Planning and Control for Supply Chain Management. McGraw-Hill, Boston, USA.
Waller, K., Woolsey, D. and Seaker, R., (1995). "Reengineering Order Fulfilment". International Journal of Logistics Management, 6(2), pp.
Wand, Y. and Weber, R., (1993). "On the Ontological Expressiveness of Information Systems Analysis and Design Grammars". Journal of Information Systems, 3(4), pp.
Wänström, C., (2006). Materials Planning in Engineering Change Situations. PhD dissertation, Volume 2484, Chalmers Tekniska Högskola.
Ware, C., (2000). Information Visualization. Morgan Kaufmann.
Wasmer, A., Staub, G. and Vroom, R.W., (2011). "An industry approach to shared, cross-organizational engineering change handling - The road towards standards for product data processing". Computer-Aided Design, 43(5), pp.
Wortmann, H., (1995). "Comparison of information systems for engineer-to-order and make-to-stock situations". Computers in Industry, 26(3), pp.
Wright, I. C., (1997). "A review of research into engineering change management: implications for product design". Design Studies, 18(1), pp.

Yu, E. S., (1997). "Towards Modeling and Reasoning Support for Early-Phase Requirements Engineering". In: Proceedings of the Third IEEE International Symposium on Requirements Engineering.
Zorzini, M., Corti, D. and Pozzetti, A., (2008). "Due Date Quotation and Capacity Planning in Make-to-Order Companies: Results from an Empirical Analysis". International Journal of Production Economics, 112(2), pp.

7 Appendices

7.1 Appendix 1: Tools used in use cases

Information Sources vs LinkedDesign. Legend: X - relevant for the case in LinkedDesign; x - mentioned in D3.1 (or found later).

Tool | Aker Solutions | Comau | VW
Office Automation (Spreadsheet, etc.) | X - SharePoint | X - Excel | x
3D Computer Aided Design (CAD 3D) | X - PDMS (Aveva) / x - Catia V5 | X - Catia V5 | Autocad, Solidworks
Knowledge Based Engineering (KBE) | X - KbeDesign | |
Product Data Management (PDM/PLM) | x - Enovia | x - internal software |
Enterprise Resource Planning (ERP) | Kesys MIPS, COMOS | x - SAP | SAP
Lifecycle Analysis (LCA) software | X - no tool yet | x - in other dep. |
Supply Chain Management (SCM) | | | SAP
Customer Relationship Management (CRM) | | | SAP
Change Management | Change Control System (CSS), in-house web application | Enovia |
Risk Management | Risk Dashboard (in-house application) | None |
Project Management Software | x - Trello | x - MS Project | x - R-plan, MS Project
Computer Aided Engineering (CAE) | x - (Solidworks, not in KBEDesign) | x - Solidworks | x - Catia
2D Computer Aided Design (CAD 2D) | x - Microstation, Autocad | x - Autocad |
Finite Element Analysis Method (FEA/FEM) | x - GeniE, SACS, Staad Pro, Robot | x - Ansys, LSDyna, Matlab, Ansys | x - LSDyna, consol, Autoform, Ansys

Computational Fluid Dynamics (CFD) | | |
Computer Aided Manufacturing (CAM) | | |
Computer Aided Process Planning (CAPP) / Digital Manufacturing / Factory Planning | | |
Discrete Event Simulation (DES) | x - Arena/Automod | x - Ansys | x - HLS
Document Management System (DMS) | x - ProArc | x - SharePoint | x - DMS
Workflow Management System (WMS) | | |

7.2 Appendix 2: Ideas from Turin front-end workshop and take-up

For each collaborative waste source (#), the repository of ideas (6) and its take-up in the application scenarios (7) are listed.

1 Divergence - Low priority in LD
No submitted ideas for this collaborative waste source.

2 Misunderstanding - Partially addressed
1. What: Information sharing, information assets recommendation / How: Information tagging based on LDO
2. Format for structuring the communicated knowledge
3. Provide illustration of rules / Show modification simulations (SAP presentation) / In-context support with relevant experts
4. Polimi / Holonix / Comau: Improve visualisation and understanding / User-friendly interface
5. Strong link of data/files with the author using a social media platform such as StreamWork (search for user profile)
6. What: Identify source of waste / How: The right info visualised comprehensively
7. What: Local, context-based views on common knowledge / information in distributed, collaborative situations / How: Better understanding based on local context
8. Ways of providing collaborative user interfaces which are dependent on the context of all the collaborators => Less misunderstanding
9. From the context-driven visualization mechanism developed by BIBA: It is very interesting and useful that a real context-driven visualization mechanism could be developed, in which the users of different users could be liked to go them

3 Undercommunicating - Partially addressed
1. Direct link through QLM messenger from the machine to the user to retrieve data
2. Undercommunicating: Within error reports, work logs, etc. - link to relevant documentation, models, and people
3. What: Undercommunicating: Aker-like case => Direct addressing of the question to the correct person / How: , messenger, etc.
4. What: Undercommunication - Communication as information sharing from maintenance / How: Mock-up shown by SAP; where maintenance was done
5. Undercommunicating: Link general information generated in different areas or departments that refers to the task / point on which the person is working
6. Allow a transition from paper / whiteboard-based collaboration to a more durable/reusable digital format => More efficient use of key players' time due to less rework

(6) See objectives and agenda for the Turin front-end workshop in an Appendix in this report.
(7) See D1.2 report.

7. Undercommunicating: In-context support with experts
8. Shared view on a report: Less processing / less waiting

4 Interpreting - Partially addressed
1. Interpretation: Give examples, write a unique vocabulary and a glossary
2. Ways to add knowledge ad hoc which can be retrieved in a context-dependent way => Support interpretation and searching
3. Long interpretation of an analysis result / Describe (possible) outcomes and their meaning in a common manner

5 Searching - Partially addressed
1. What: Addressing searching / finding the most appropriate template from a previous project / customer / order / How: Better search functionality with metadata that allows better refining of searches, across multiple information systems
2. Data integration / semantic mediator
3. How: Automated information gathering and processing through keyword search / something like an internal Google search
4. Search: Overlay or place side-by-side related information from heterogeneous sources
5. What: Efficient search for the right "thing" / How: Use of the right semantics etc.
6. What: Too much information in the wrong places / How: Subscriptions only to relevant information
7. Use context to filter available information, so that search is easier
8. Appropriate queries
9. In-context contacting of an expert. Inquiry with background info (context) embedded
10. BIBA mock-up: Identification of piece quality and defects. Quick decision making and current production state
11. Searching: Enabling searching through a single interface
12. Enterprise search engine: Linked with the ontology / taxonomy. Note: It has not been shown as a demo

6 Motion - Low priority in LD
No submitted ideas for this collaborative waste source.

7 Extra processing - Partially addressed
1. Extra processing through integration of PLM data. Evaluations and LCC, LCA and maintenance analysis. Can be much simplified, almost as in the mock-ups
2. Presentation of already correlated info items
3. Shared (collaborative) view on a data source (e.g. data tables) => Less processing => Less waiting to show the data to others
4. What: Addressing extra processing (manually entering and changing information in multiple information systems) / How: Allow for direct manipulation of data, sending messages, contacting persons etc. in the LEAP platform
5. What: The loss of most of the information requires work to be replicated / How: The possibility to save work removes the need to reprocess it

6. Extra processing: Simplicity of workflow / optimization assisting the user
7. Show critical processes in the management view. People could be re-planning the process timeline

8 Translation - Low priority in LD
No submitted ideas for this collaborative waste source.

9 Waiting - Partially addressed
1. Active targeted notification
2. What: Real-time data integration (e.g. Aalto) targets waiting / How: Displaying data in a user-specific way online
3. What: Near real-time notification of events / How: QLM + middleware
4. A tool to immediately diagnose the machine failure to the maintenance team through mobile devices
5. Implement data banks inside the tools

10 Misapplication - Not yet addressed
1. Context-driven user interface / Ontology
2. Misapplication: What: Specific tool by customer or inside company which cannot be applied / How: A unique standard mock-up / tool could be a solution to speed up the work
3. Misapplication: Structure correctly the GUI of the overall application and of each component; it can support pushing the processes

7.3 Appendix 3: Quality of data and data representations

As a basis for the evaluation of the data sources in chapter 3, we use a generic framework for the evaluation of models and modeling languages (SEQUAL - the semiotic quality framework), specialized for assessing the quality of data sources of the kind particularly relevant for LinkedDesign. We first describe the generic SEQUAL framework, which is phrased relative to the quality of models in general, before we describe its specialization to data quality, looking at data as a model on the instance level. (Models, e.g. data models, are usually thought of as being on the type level, although in enterprise modeling one usually represents both instance-level and type-level concepts.)

Overview of SEQUAL

SEQUAL has the following properties (Krogstie, 2012):
- It distinguishes between goals and means by separating what you are trying to achieve (quality of models) from how to achieve it.
- It can be used for evaluation of models and modeling languages in general, but can also be extended for the evaluation of particular types of models.
- It is closely linked to linguistic and semiotic concepts. In particular, the core of the framework, including the discussion on syntax, semantics, and pragmatics, is related to the use of these notions in the semiotic theory of Morris (1938). Extensions are partly based on extensions in organizational semiotics (Falkenberg et al., 1996), and we have kept the original terminology from these areas.
- It is based on a constructivistic world-view, recognizing that models are usually created as part of a dialogue between the participants involved in modeling, whose knowledge of the modeling domain, and potentially the domain itself, changes as modeling takes place.

The framework has earlier been used for evaluation of models and modeling languages from a large number of perspectives, including data, object, process, enterprise, and goal-oriented modeling. It has been used both for models on the type level and on the instance level (i.e. data quality), which is the focus here and will be described further below.

The framework is illustrated in Figure 41. Quality has been defined with reference to the correspondence between statements belonging to the following sets:
- G, the set of goals of the modeling task.
- L, the language extension, i.e. the set of all statements that are possible to make according to the rules of the modeling languages used.
- D, the domain, i.e. the set of all statements that can be stated about the situation.
- M, the externalized model itself.
- K, the explicit knowledge relevant to the domain of the audience.
- I, the social actor interpretation, i.e. the set of all statements that the audience interprets that an externalized model consists of.
- T, the technical actor interpretation, i.e. the statements in the model as 'interpreted' by modeling tools.
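Because SEQUAL defines quality in terms of correspondences between these sets of statements, some of its quality notions reduce to plain set differences. The following sketch is purely illustrative and not part of the deliverable: SEQUAL is a conceptual framework rather than an algorithm, and the statement strings, set contents, and helper names below are invented for the example. It shows how syntactic quality (M vs. L) and semantic validity/completeness (M vs. D) could be checked mechanically if statements were enumerable.

```python
# Illustrative toy: SEQUAL quality notions as set correspondences.
# All statement strings and function names are hypothetical.

def syntactic_errors(M, L):
    """Statements in the model not expressible in the language (M \\ L)."""
    return M - L

def semantic_invalidity(M, D):
    """Model statements that do not hold in the domain (M \\ D); empty set = valid."""
    return M - D

def semantic_incompleteness(D, M):
    """Domain statements the model fails to capture (D \\ M); empty set = complete."""
    return D - M

# Hypothetical example sets of statements:
L = {"order has id", "order has date", "customer has name"}   # expressible in the language
D = {"order has id", "order has date", "customer has name"}   # true in the domain
M = {"order has id", "order has colour", "customer has name"} # the externalized model

assert syntactic_errors(M, L) == {"order has colour"}
assert semantic_invalidity(M, D) == {"order has colour"}
assert semantic_incompleteness(D, M) == {"order has date"}
```

The same pattern would extend to the interpretation sets I and T (e.g. pragmatic quality as the gap between M and I), but those correspondences involve human understanding and are not mechanically checkable.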

Figure 41: SEQUAL framework for discussing quality of models

The defined quality types are:

1. Physical quality: the basic quality goal is that the externalized model M is available to the relevant social and technical actors.
2. Empirical quality deals with comprehension and predictable error frequencies when a model M is read or written by different social actors. Before evaluating empirical quality, physical quality should be addressed.
3. Syntactic quality is the correspondence between the model M and the language extension L. Before evaluating syntactic quality, physical quality should be addressed.
4. Semantic quality is the correspondence between the model M and the domain D. This includes both validity and completeness. Before evaluating semantic quality, syntactic quality should be addressed. Domains can be divided into two parts, exemplified by looking at a software requirements specification (Krogstie, 2001):
   - Everything the computerized information system (CIS) is supposed to do (for the moment ignoring the different views the stakeholders have on the CIS to be produced). This is termed the primary domain.
   - Constraints on the model because of earlier baselined models, such as system-level requirements specifications, enterprise architecture models, statements of work, and earlier versions of the requirements specification with which the new requirements specification model must be compatible. This is termed the modeling context.
   Perceived semantic quality is the similar correspondence between the social actor interpretation I of a model M and his or her current knowledge K of domain D. Before evaluating perceived semantic quality, pragmatic quality should be addressed.
5. Pragmatic quality is the correspondence between the model M and the actor interpretation (I and T) of it. One differentiates between social pragmatic quality (to what extent people understand the models) and

technical pragmatic quality (to what extent tools can be made that interpret the model). Before evaluating pragmatic quality, empirical quality should be addressed.
6. The goal defined for social quality is agreement among the social actors' interpretations (I). Before evaluating social quality, perceived semantic quality should be addressed.
7. The deontic quality of the model relates to whether all statements in the model M contribute to fulfilling the goals of modeling G, and whether all the goals of modeling G are addressed through the model M. In particular, one often includes under deontic quality the extent to which the participants, after interpreting the model, learn from it (increase K), and the extent to which the audience is able to change the domain D if this is beneficial for achieving the goals of modeling (if the model is prescriptive). This area was earlier called organizational quality. The term deontic is from the Greek 'deon' (duty), from the impersonal 'dei' ('it behoves', i.e. it is fitting), relating to the goal one wants to achieve.

Language quality goals are looked upon as means in the framework. Six areas of language quality are identified:

1. Domain appropriateness. This relates the language to the domain. Ideally, the language must be powerful enough to express anything in the domain, not exhibiting what Wand and Weber (1993) term construct deficit. On the other hand, one should not be able to express things that are not in the domain, i.e. what is termed construct excess (Wand and Weber, 1993). Domain appropriateness is primarily a means to achieve high semantic quality.
2. Comprehensibility appropriateness relates the language to the social actor interpretation. The goal is that the participants in the modeling effort using the language understand all the possible statements of the language. Comprehensibility appropriateness is primarily a means to achieve empirical and pragmatic quality.
3. Participant appropriateness relates the social actors' explicit knowledge to the language (i.e. do the participants know the language being used, or are they easily able to learn it). Participant appropriateness is primarily a means to achieve high semantic and pragmatic quality.
4. Modeler appropriateness: this area relates the language extension to the knowledge of the modeler. The goal is that there are no statements in the explicit knowledge of the modeler that cannot be expressed in the language. Modeler appropriateness is primarily a means to achieve high semantic quality.
5. Tool appropriateness relates the language to the technical audience interpretations. For tool interpretation, it is especially important that the language lends itself to automatic reasoning. This requires formality (i.e. formal syntax and semantics that are operational and/or logical), but formality is not necessarily enough, since the reasoning must also be efficient to be of practical use. This is covered by what we term analysability (the ability to exploit the mathematical semantics, if any, efficiently) and executability (the ability to exploit the operational semantics, if any, efficiently). The different aspects of tool appropriateness are means to achieve high syntactic, semantic and pragmatic quality (through formal syntax, mathematical semantics, and operational semantics respectively).
6. Organizational appropriateness relates the language to standards and other organizational needs within the organizational context of modeling. These are means to support deontic quality.

Data Quality

In LinkedDesign it is particularly relevant to look at the assessment of the underlying data model and data quality together. There are a number of approaches to the dimensions of data quality, and different authors use similar terms somewhat differently. We base this section on the framework presented in Batini and Scannapieco (2006), where the following aspects are discussed relative to data values. The examples relate to data in a relational database, with tables containing tuples whose attributes have values of pre-defined types, together with integrity constraints.

The dimensions, their categories, and definitions are as follows:

- Accuracy (syntactic, semantic): the distance between a stored value v and the correct value v', e.g. measured by edit distance.
- Completeness: the degree to which all values are present in a data collection.
- Time-related aspects:
  - Currency: the degree to which the data is up-to-date.
  - Volatility: the frequency with which the data vary over time.
  - Timeliness: how current the data is for the task at hand.
- Consistency: the coherence of the same datum represented in multiple copies, or of different data with respect to integrity constraints and rules.
- Interpretability: concerns the documentation, including the data model and other metadata, available to correctly interpret the meaning of the data.
- Accessibility: the data is accessible, in a format that can be understood, for those needing access to it.
- Quality of information source:
  - Believability: is the data provided true, real and credible?
  - Reputation: is the source normally credible?
  - Objectivity: is the source believed to be objective?

Table 15: Dimensions of data quality

When looking at data quality in isolation, the underlying data model can be regarded as part of the context (i.e. a pre-existing model that this model should relate to). Obviously, how the data is meant to be used, and actually is used, influences the perceived quality of the data. To also capture this, Price and Shanks (2004; 2005) use the term information quality to combine a product-based and a service-based view on data quality. The product-based perspective, covered by traditional data quality properties, focuses on the design and internal IS view. From this view, quality is defined in terms of the degree to which the data meets initial requirements specifications, or the degree to which the data corresponds to the relevant real-world phenomena that it represents.
The limitation with this is that even if the data corresponds to a requirements specification or to the real world, there can still be quality deficiencies with respect to actual use-related data requirements, which may differ from the planned uses catered for in the initial specifications. This leads to a service-based perspective on data quality, often called information quality, which focuses on the information consumer's response to their task-based interactions with the IS. Price and Shanks (2004) define this area building upon semiotic theory. Based on empirical evaluations of the original framework presented in 2004, the following quality categories have been defined (Price and Shanks, 2005):

Syntactic criteria (based on rule conformance):
- Conforming to metadata, i.e. integrity rules. Data follows specified database integrity rules.

Semantic criteria (based on external correspondence):
- Mapped completely. Every real-world phenomenon is represented.
- Mapped unambiguously. Each identifiable data unit represents at most one specific real-world phenomenon.
- Phenomena mapped correctly. Each identifiable data unit maps to the correct real-world phenomenon.
- Properties mapped correctly. Non-identifying (i.e. non-key) attribute values in an identifiable data unit match the property values for the represented real-world phenomenon.
- Mapped consistently. Each real-world phenomenon is represented by at most one identifiable data unit, or by multiple but consistent identifiable units, or by multiple identifiable units whose inconsistencies are resolved within an acceptable time frame.
- Mapped meaningfully. Each identifiable data unit represents at least one specific real-world phenomenon.
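The mapping-based semantic criteria above can be sketched in Python by treating the correspondence between identifiable data units and real-world phenomena as a dictionary. The row and machine names are hypothetical, chosen only to illustrate the checks; this is not part of the Price and Shanks framework itself.

```python
# Hypothetical correspondence between data units (rows) and real-world
# phenomena (machines). A dict already guarantees "mapped unambiguously":
# each data unit maps to at most one phenomenon.
data_to_world = {
    "row1": "machine-A",
    "row2": "machine-B",
    "row3": "machine-A",   # two data units for the same phenomenon
}
world = {"machine-A", "machine-B", "machine-C"}

mapped = set(data_to_world.values())
mapped_completely   = world <= mapped                          # every phenomenon represented
mapped_meaningfully = mapped <= world                          # every unit maps to a real phenomenon
mapped_consistently = len(mapped) == len(data_to_world)        # at most one unit per phenomenon

print(mapped_completely, mapped_meaningfully, mapped_consistently)
# False True False
```

Here machine-C is unrepresented (not mapped completely) and machine-A is represented by two units (not mapped consistently, unless the two rows are kept consistent as the criterion allows).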

Pragmatic criteria (use-based consumer perspective):
- Accessible (easy, quick). Data is easy and quick to retrieve.
- Suitably presented (suitably formatted, precise, and measured in appropriate units). Data is presented in a manner appropriate for its use, with respect to format, precision, and units.
- Flexibly presented (easily aggregated; format, precision, and units easily converted). Data can be easily manipulated and its presentation customized as needed, with respect to aggregating data and changing the data format, precision, or units.
- Timely. The currency (age) of the data is appropriate for its use.
- Understandable. Data is presented in an intelligible manner.
- Secure. Data is appropriately protected from damage or abuse (including unauthorized access, use, or distribution).
- Type-sufficient. The data includes all of the types of information important for its use.
- Allowing access to relevant metadata. Appropriate metadata is available to define, constrain, and document the data.
- Perceptions of the syntactic and semantic criteria defined earlier.

For open data, Berners-Lee (2010) has proposed a five-star maturity model:
- 1 star: publishing data on the web, even in proprietary and desktop-centric formats
- 2 stars: publishing data in machine-readable formats, such as spreadsheet documents
- 3 stars: publishing data in machine-readable and non-proprietary formats using open standards, e.g. CSV
- 4 stars: publishing data using linked data principles
- 5 stars: linking the available data

As also discussed in Batini and Scannapieco (2006), data quality should be considered in connection with the quality of the defining data models. Data models are a type of structural model used both for human sense-making and communication and as a context for systems development. Approaches within the structural modeling perspective concentrate on describing the static structure of a domain. The main construct of such languages is the "entity".
Going back to the ANSI/SPARC work (Tsichritzis and Klug, 1978), one differentiates between three levels of data models:
- Conceptual models (e.g. ER models (Chen, 1976))
- Logical models (e.g. in the form of relational tables)
- Physical models (e.g. a physical implementation of a relational database using a DBMS)

There are well-defined ways of moving between these levels, although automatic mappings are often not sufficient in practice to obtain ideal database performance based on the conceptual and logical models. Some of the early work on quality of models focused on data models, starting with the quality model of Moody and Shanks (1994), which was extended in (Moody and Shanks, 1998; Moody and Shanks, 2003) based on empirical investigations of its use. The quality model in Moody and Shanks (1994) contains the following properties:

- Correctness is defined as whether the model conforms to the rules of the data modeling technique (i.e. whether it is a valid data model). This includes diagramming conventions, naming rules, definition rules, rules of composition and normalization.
- Completeness refers to whether the data model contains all information required to support the required functionality of the system.

- Integrity is defined as whether the data model defines all business rules that apply to the data.
- Flexibility is defined as the ease with which the data model can cope with business and/or regulatory change.
- Understandability is defined as the ease with which the concepts and structures in the data model can be understood.
- Simplicity means that the data model contains the minimum possible entities and relationships.
- Integration is defined as the consistency of the data model with the rest of the organization's data.
- Implementability is defined as the ease with which the data model can be implemented within the time, budget and technology constraints of the project.

Another overview of data model (schema) quality is presented in (Batini and Scannapieco, 2006), covering the following areas:
- Correctness with respect to the model, which concerns the correct use of the concepts in the language. A negative example is to represent FirstName as an entity rather than as an attribute (since FirstName does not have a unique existence in the real world).
- Correctness with respect to requirements
- Minimalization: no requirement is represented more than once
- Completeness
- Pertinence, which measures how many unnecessary conceptual elements are included
- Readability through aesthetics
- Readability through simplicity
- Normalization

Whereas normalization first becomes relevant at the logical level, the others apply also at the conceptual level. Below, we discuss means within each quality level in detail, positioning the areas specified by Batini and Scannapieco (2006) and Price and Shanks (2004; 2005). These are emphasized in italics.

Physical data quality

Aspects of persistence, data being accessible (Price) for all (accessibility (Batini)), currency (Batini), and security (Price) cover aspects on the physical level.
This area can be looked upon relative to measures of persistence, currency and availability that apply also to models of other types:

- Persistence: how persistent is the data; how protected is it against loss or damage? For data on disk, physical quality is higher if there is a backup copy, and higher still if this backup is on another disk whose failure is independent of the original. The way of storing the data should also be efficient, i.e. not use more space than necessary.
- Currency: how long ago were the data entered into the database (assuming the data was current when entered)? Depending on the type of data, the age of the data is of differing importance. When the domain changes rapidly (has high volatility), the currency of the data matters more for the data to have appropriate timeliness. Metrics on currency can easily be devised and calculated if the database supports time-stamping of statements. This area relates to semantic and perceived semantic quality (see below), relative not only to the time the data was entered, but also to the last time the data was validated.
- Availability: how available is the data to the audience? Clearly, this depends on the data first being made persistent. Availability also depends on distributability, especially when members of the audience are geographically dispersed. Data in an electronically distributable format is more easily distributed than data which must be printed on paper. It may also matter exactly what is distributed, e.g. the data in an editable form or merely in an output format. Thus, the maturity level according to Berners-Lee (2010) is relevant. A metric for availability is the proportion of the data relevant for a member of the audience that is actually available to that audience member.

In connection with currency and availability, the term 'timeliness' is often used: the data is not only current, but is also available in time for the events that correspond to its usage. This relates directly to the goal of modeling; thus timeliness is set up as a deontic goal. A possible measurement of timeliness consists of (i) a currency measurement and (ii) a check that the data is available before the planned usage time. Security, as mentioned by Price, relates to ensuring that people not meant to have access to the data do not get such access. Tool functionality for achieving physical quality is based on traditional database functionality, potentially enriched with semantic web technology.

Empirical data quality

This is addressed by understandable (Price). Since data can be presented in many different ways, this relates to how the data is presented and visualized. How best to present data depends largely on the underlying data type. Depending on the breadth of the user group, guidelines for universal access might also be important, with standards such as WCAG 2.0. There are a number of generic guidelines within data visualization and related areas that can be applied, and we mention only a few of them here. For textual data, one can look at work on readability indexes, e.g. Flesch (1948), but we do not go into a detailed description of such techniques here. For computer output specifically, many of the principles and tools used for improving human-computer interfaces are relevant at the empirical level (Shneiderman, 1992). For visual presentation of data, one can also base guidelines on work in e.g. cognitive psychology and cartography, on the basis that data is meant to be useful for communication between people.
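The currency and timeliness measurements described above can be sketched as follows, assuming each record carries a timestamp for when it was entered or last validated. The function names and thresholds are illustrative assumptions, not from the deliverable.

```python
from datetime import datetime, timedelta

# Sketch of the currency/timeliness measurements described above, assuming
# records carry an entry/last-validated timestamp (names are hypothetical).
def currency(last_validated: datetime, now: datetime) -> timedelta:
    """Age of the datum: time elapsed since entry or last validation."""
    return now - last_validated

def is_timely(last_validated: datetime, now: datetime,
              planned_use: datetime, max_age: timedelta) -> bool:
    """Timely = (i) current enough and (ii) available before planned usage."""
    return currency(last_validated, now) <= max_age and now <= planned_use

now = datetime(2013, 2, 1, 12, 0)
ok = is_timely(datetime(2013, 1, 30), now,
               planned_use=datetime(2013, 2, 2), max_age=timedelta(days=7))
print(ok)  # True
```

The acceptable maximum age (max_age) would in practice be set per data type, higher for stable data and lower for highly volatile domains, reflecting the volatility dimension above.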
Going back to Shannon and Weaver (1963), communication entails both encoding by the sender and decoding by the receiver. Encoding has been discussed in detail in the work of Bertin (1983). According to Bertin (1983), there are four possible effects of encoding:
1. Association: the marks can be perceived as similar
2. Selection: the marks can be perceived as different
3. Order: the marks can be perceived as ordered
4. Quantity: the marks can be perceived as proportional

Eight different variables are presented for conveying one or more of these meanings in a visualization:
- Planar variables: horizontal position, vertical position
- Retinal variables: shape (association and selection), size (selection, order and quantity), color (association and selection), brightness/value (selection and order), orientation (association), texture (association, selection and order)

For decoding, Moody (2009) presents a model differentiating between aspects of perception and cognition:
- Perceptual discrimination: features are detected by specialized feature detectors. Based on this, the visualization is parsed into its parts.
- Perceptual configuration: structure and relationships among elements are inferred. Within the area of Gestalt psychology, a number of principles for how to convey meaning through perceptual means are provided (Ware, 2000).

- Attention management: all or part of the perceived image is brought into working memory, which has very limited capacity. To be understood, statements in the model must be integrated with prior knowledge in the long-term memory of the interpreter. Differences in prior knowledge (e.g. expert-novice differences) greatly affect the speed and accuracy of processing.

Rules for color usage are also useful when evaluating data visualizations (if different colors are used to signal differences in meaning). Around 10% of the male population and 1% of the female population suffer from some form of color vision deficiency (Ware, 2000). On the other hand, color is an important differentiator in other visual representations that are meant to be widely used (e.g. maps; see Bertin (1983)). Shneiderman (1992) has listed a number of guidelines for the usage of color in visual displays in general:

- Use color conservatively. Limit the number of colors used; many design guidelines suggest limiting the number of colors in a display to four, with an upper limit of seven. According to opponent process theory (Ware, 2000), there are six elementary colors, arranged perceptually as opponent pairs along three axes: black-white, red-green, and yellow-blue. There is both physiological and linguistic support for using these colors for differentiation. In a study of more than 100 languages from many different cultures, it was shown that primary color terms are quite consistent across cultures: in languages with only two basic color words, they are always black and white; if a third color is present, it is always red; the fourth and fifth are either yellow then green, or green then yellow; the sixth is always blue; the seventh is brown; followed by pink, purple, orange, and grey in no particular order.
- Red attracts the eye more than other colors.
- Ensure that the use of color supports the task.
- Have color coding appear with minimal user effort.
- Place the application of color coding under (guided) user control.
- Use color to help in formatting.
- Be consistent in color coding.
- Be aware of common expectations about color codes; these can depend on the local culture.
- Be aware of problems with color pairings. If saturated (pure) red and blue appear on a display at the same time, it may be difficult for users to absorb the information: red and blue are at opposite ends of the spectrum, and the muscles surrounding the eye are strained by attempts to produce a sharp focus for both colors simultaneously. Blue text on a red background is especially difficult to read; other combinations, such as yellow on purple and magenta on green, are similarly difficult. Too little contrast is also a problem (yellow letters on a white background, or brown letters on a black background). Note that colors might be rendered differently on different screens and projectors.
- Use color changes to indicate status changes.
- Use color in graphic displays for greater information density.
- When using color coding, take into account that the visualization might need to be presented or distributed in gray-scale (for example, when printed).

The use of emphasis can also be made to reflect the relative importance of the data. Factors that have an important impact on visual emphasis are:
- Size (the big is more easily noticed than the small)
- Solidity (e.g. bold letters vs. ordinary letters, full lines vs. dotted lines, thick lines vs. thin lines, filled boxes vs. non-filled boxes)

- Difference from the ordinary pattern (e.g. slanted letters attract attention among a large number of ordinary ones)
- Foreground/background differences (if the background is white, things are more easily noticed the darker they are)
- Change (blinking or moving symbols attract attention)
- Position (looking at a two-dimensional arrangement, people tend to start at its middle)
- Connectivity (objects that have connections to many other objects (having a high degree) attract attention compared to objects with few connections)

Syntactic data quality

From the generic SEQUAL framework we have one main syntactic quality characteristic, syntactic correctness, meaning that all statements in the model are according to the syntax and vocabulary of the language. Syntax errors are of two kinds:
- Syntactic invalidity, in which words or graphemes not part of the language are used.
- Syntactic incompleteness, in which constructs or information needed to obey the language's grammar are lacking.

Conforming to metadata (Price), including that the data conforms to the expected data type (as described in the data model/data schema or in a standard), is the main measure of syntactic data quality. This is typically related to syntactic invalidity, e.g. when the data is of the wrong data type.

Semantic data quality

When looking at semantic quality relative to the primary domain of modeling, we have the following properties:
- Completeness is described by completeness (Batini), mapped completely (Price), and mapped unambiguously (Price).
- Validity is described by accuracy (Batini), both syntactic and semantic accuracy as they define it (the difference between the two being rather a matter of deciding how incorrect the data is), phenomena mapped correctly (Price), properties mapped correctly (Price), and mapped meaningfully (Price). When the rules of representation are formally given, consistency (Batini)/mapped consistently (Price) is normally also related to validity.

Properties related to the model context concern the adherence of the data to the data model. One would, for instance, expect that:
- All tables of the data model include tuples
- The data follows the constraints defined in the data model

The possibility of ensuring high semantic quality of the data is closely related to the semantic quality of the underlying data model. When looking at the semantic quality of the data model relative to the primary domain of modeling, we have the following properties:
- Completeness (Moody) (number of missing requirements) and integrity (Moody) (number of missing business rules) relate to completeness, as does Batini's point on completeness.
- Completeness (Moody) (number of superfluous requirements) and integrity (Moody) (number of incorrect business rules) relate to validity. The same applies to Batini's points on correctness with respect to the model and correctness with respect to requirements.
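The "conforming to metadata" check for syntactic data quality can be sketched as a comparison of values against the data types declared in a schema. The attribute names and types below are invented for illustration; real checks would also cover integrity constraints, not only types.

```python
# Minimal sketch of "conforming to metadata": checking values against the
# expected data types declared in a schema (attribute names are hypothetical).
schema = {"order_id": int, "quantity": int, "due_date": str}

def syntactic_violations(record: dict) -> list:
    """Return the attribute names whose values do not match the declared type."""
    return [attr for attr, expected in schema.items()
            if not isinstance(record.get(attr), expected)]

print(syntactic_violations({"order_id": 42, "quantity": "ten", "due_date": "2013-02-28"}))
# ['quantity']
```

A value of the wrong type ("ten" where an integer is expected) is an instance of syntactic invalidity; a missing attribute would also be flagged, corresponding to syntactic incompleteness.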

Pragmatic data quality

Pragmatic quality, as we define it, relates to the comprehension of the model by the participants. Two aspects can be distinguished:
- That the interpretation of the data by human stakeholders is correct relative to what the data is meant to express. In addition to the data itself, it is often useful to have various metadata represented (such as the source of the data).
- That the tool interpretation is correct relative to what the data is meant to express.

Starting with the human comprehension part, pragmatic quality on this level is the correspondence between the data and the audience's interpretation of it. Moreover, it is not only important that the data has been understood, but also who has understood (the relevant parts of) the data. Note that the pragmatic goal is stated as comprehension, i.e. that the data has been understood, not as comprehensibility, i.e. the ability of the data to be understood (which is dealt with under empirical quality above). There are several reasons for this. First, the goal here is that the data is understood by the involved stakeholders, not that it is understandable per se, although understandability can be an important means to achieve comprehension. Actual comprehension can depend strongly on the process by which the data is developed, the way the participants communicate with each other, and various kinds of tool support. The main aspects at this level are interpretability (Batini), data being suitably presented (Price), and data being flexibly presented (Price). Allowing access to relevant metadata (Price) together with the main data is an important means to achieve comprehension.

Social data quality

The goal defined for social quality is agreement. Six kinds of agreement can be identified, according to the following dimensions:
- Agreement in knowledge vs. agreement in interpretation. In the case where two data representations are made based on the views of two different actors, we can also talk about agreement between the representations.
- Relative agreement vs. absolute agreement. Relative agreement means that the various sets to be compared are consistent; hence, there may be many statements in the data representation of one actor that are not present in that of another, as long as they do not contradict each other. Absolute agreement, on the other hand, means that all data are the same. In practice, relative agreement is what one should strive to achieve.

The area quality of information source (Batini) is an important means for the social quality of the data: if the source has a good reputation, this increases the probability of agreement (although one should be aware of the dangers of model monopoly (Bråten, 1973)). In some cases one needs to combine different data sources. This consists of combining the data models and then transferring the data from the sources into the new schema. Schema and data integration is described in detail in Deliverable D.

Deontic data quality

A number of aspects at this level relate to the goals of having the data in the first place. Whereas currency (Batini) was placed at the physical level, deciding on volatility (Batini) and timeliness (Batini)/timely (Price) needs to relate to the goal of having and distributing the data. The same is the case for type-sufficient (Price), the inclusion of all the types of information important for the data's use.
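The distinction between relative and absolute agreement can be sketched by modelling two actors' data representations as attribute-value mappings. The part data below is hypothetical and only illustrates the two checks.

```python
# Two actors' data representations as attribute->value dicts (hypothetical).
a = {"part": "P-100", "material": "steel"}
b = {"part": "P-100", "weight_kg": 2.5}

def relative_agreement(x: dict, y: dict) -> bool:
    """Consistent: no attribute present in both has contradictory values."""
    return all(x[k] == y[k] for k in x.keys() & y.keys())

def absolute_agreement(x: dict, y: dict) -> bool:
    """All data are the same."""
    return x == y

print(relative_agreement(a, b), absolute_agreement(a, b))  # True False
```

Here the two representations agree relatively (the shared attribute "part" matches) without agreeing absolutely, which, as noted above, is the realistic target in practice.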

7.4 Appendix 4: Using Active Knowledge Models (AKM) to structure and capture user knowledge

Active Knowledge Modeling (AKM) describes an approach for representing knowledge as visual models, where complex, rigid, software-oriented languages are replaced by visual models.

Core concepts

A model is a representation of some aspects of real-world entities and phenomena, as interpreted by some actor(s). A model is active if it also influences the reality that it represents. Knowledge is held by people, so knowledge modelling languages should support human communication, sense-making and learning. Activation, the process whereby a model influences reality, cannot be based solely on automated execution. Instead, users must be supported in flexibly interpreting the models and acting upon them in the situations that arise. This principle is called interactive activation. It implies partial automation, where the automation boundary may be adapted by the users. The more precise and detailed a model is, the more automatic execution is possible. However, exception handling requires that users are allowed to access the underlying model and change its default, automatic interpretation, something which is typically not possible in traditionally coded applications.

A knowledge architecture consists of knowledge explicitly represented in structured models, and of the mental views of the people involved in creating and using these models. Knowledge is explicitly represented as information and data structures. Data consists of symbols used for conveying information and knowledge. Data becomes information when its meaning is interpreted by some actor. We thus see data as a one-dimensional representation, a stream of symbols. Information adds a second dimension that reflects the meaning of the original data, often called metadata. Knowledge implies a justification of the information, or that the information guides action. Knowledge establishes structures and relationships between information elements, and uses them to manage dependencies. Knowledge representations must thus possess at least three dimensions: the data, its meaning, and the structure, justifications, and actions that the knowledge results in. In order to support reflection on knowledge, a fourth dimension is needed. Reflection on knowledge is required for e.g. learning, knowledge management, design, innovation, collaboration, creation of shared understanding, and other creative tasks. We refer to representations that contain four reflective dimensions as knowledge spaces. These concepts are illustrated in Figure 42.

Figure 42: Dimensions in Modelling

In software systems, the second dimension is typically represented as program code that defines how the data is manipulated, stored and presented to the user. Among humans, the second dimension can be illustrated by the capability to understand a certain language, such as English: if you do not understand the language, speech or text becomes incomprehensible data.

Few computerized systems are really knowledge-based. Their data structures and program code are fixed. By using the system, humans can bring in the extra dimensions of interpretation and reflection needed for knowledge and learning. However, these additional dimensions cannot be reflected back into the computerized system as updated data structures or program logic. There is no easy way to affect the behaviour of the system other than manually changing the code. Thus, reflection, knowledge and learning cannot be shared directly among the people using the system. What a user can learn from the system is limited to the two dimensions that were coded into it from the start.

The concept of Model-Generated Workplaces (MGWP) can be used to make the models available in a tailorable, context-specific way. An MGWP is a working environment for professionals (including designers and engineers) involved in running the business operations of the enterprise. It is a user platform that provides the graphical front-end for human users to interact with software services supporting their day-to-day business activities. The workplace can be tailored to meet the specific requirements of different roles or persons within an enterprise, providing customized, context-specific presentation and operation views based on data in the enterprise systems.

What is an Active Knowledge Architecture (AKA)?

Holistic design must be performed in teams with enterprise architects, methodology experts and leading users, supported by agile architecting methods and practitioner-driven approaches.
The many kinds of models are integrated in, and reused through, a visual, role-oriented Active Knowledge Architecture (AKA).

Figure 43: Dimensions in Enterprise Knowledge Spaces

An AKA is built by a modelling team and emerges as leading users express knowledge of roles and workspaces. Architects govern principles and create modelling constructs to allow as much work-centric knowledge as possible to be captured. Modelling in a practical, work-centric context, capturing both approach and execution, allows users to express knowledge that would otherwise remain tacit or even be lost. Context-rich workspace models are created using the IRTV language (Information, Role, Task, View). The models are composed of contents that focus on dependencies among and between roles (R), their main tasks (T), supporting views (V) and relevant information elements (I), e.g. related to product structure, as illustrated in Figure 43. Models created using this approach can automatically generate workspaces for the specific roles, and users can act through their workspaces to automatically update the model.

Example from practice
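To make the IRTV constructs more concrete, the following is a minimal sketch of how Information, Role, Task and View elements and their dependencies might be encoded, and how a role-specific workspace could be derived from them. All names and classes are hypothetical illustrations: IRTV is a visual modelling language, not a programming API, and this is not part of the AKM tooling.

```python
from dataclasses import dataclass

# Hypothetical, simplified rendering of the four IRTV constructs.
# In AKM these are visual model elements, not program classes.

@dataclass(frozen=True)
class Information:       # I: an information element, e.g. part of a product structure
    name: str

@dataclass(frozen=True)
class View:              # V: a view presenting selected information elements
    name: str
    presents: tuple

@dataclass(frozen=True)
class Task:              # T: a unit of work supported by views
    name: str
    views: tuple

@dataclass(frozen=True)
class Role:              # R: a role responsible for a set of tasks
    name: str
    tasks: tuple

def generate_workspace(role: Role) -> dict:
    """Derive a role-specific workspace from the model: the role's
    tasks, the views supporting them, and the information they present."""
    views = [v for t in role.tasks for v in t.views]
    info = [i for v in views for i in v.presents]
    return {"role": role.name,
            "tasks": [t.name for t in role.tasks],
            "views": [v.name for v in views],
            "information": [i.name for i in info]}

# Tiny example model, loosely inspired by the piping pilot described below
pipe_spec = Information("Piping specification")
layout = View("Area layout view", presents=(pipe_spec,))
design_pipe = Task("Design piping run", views=(layout,))
piping_engineer = Role("Piping engineer", tasks=(design_pipe,))

workspace = generate_workspace(piping_engineer)
```

The point of the sketch is the direction of derivation: the workspace is generated from the model, so updating the model updates what each role sees.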

Industrial pilots have been developed that demonstrate the AKM approach and its potential user benefits. A pilot example from an oil and gas field engineering project is presented below. Its focus is on the design of a selected piping system platform area and the three roles involved.

Figure 44: Link between common AKA and workplaces

Figure 44 shows an AKA with just the application model visible. The two workplaces generated by the AKA provide tasks for users to enhance the model and build new ones. Workplace models are composed of content specific to roles, their main tasks, supporting views and relevant information elements. The three workplaces shown in Figure 45 were modelled, embedded in, and configured by tools and models embedded in the AKA.

Figure 45: The model-based workplaces of the engineering project pilot, from left: the piping engineer, the piping area manager, and the methodology manager

The objectives of improved visual collaboration and coordination to improve work planning, reporting and execution were met in a few weeks. Practitioners must collaborate to design and apply variant, modularization and design rules. Tasks being executed, as well as new tasks, are captured in task patterns. These may later contribute to improved processes for design, coordination, traceability, predictability and reusability. New design principles and methodologies are developed (Krogstie and Lillehagen, 2008), including modelling principles and services for composing modelling languages to be used in life-cycle maintenance and support. A common understanding of operations and tasks, roles and responsibilities, interoperability and information exchange, and business and engineering data was achieved. Stakeholder and user involvement in modelling and in architecture development and management was facilitated, as was continuous learning and innovation.

AKA Benefits

The advantages of developing an AKA to capture the approach, emerging architectures, and enterprise evolution are:
- Enable model-based, AKA-driven execution
- Support work in context, modelling knowledge spaces
- Support role-oriented, architecture-driven workspaces
- Capture local nuances, practices and rules, and rich context
- Give users control over data, information flows and views
- Close the gap between design and execution of IT solutions
- Allow users to extend networking platforms to invite new users
- Integrate and provide role-specific operational views
- Give control of IT solutions and services to practitioners
- Implement new methods by combining mental and digital models
- Produce event- and situation-driven communications and views

Building an AKA starts by modelling a customer scenario using what AKM calls the Enterprise Knowledge Architecture (EKA) as a model template.
The EKA is a generic knowledge model, able to represent any AKA content as information, roles, tasks and views (reflecting the concepts of the IRTV language). Whereas much of the work in engineering is done within organizations, there is increasing benefit in also sharing (parts of) the knowledge between organizations, including in early phases. Examples of AKM use, e.g. in the MAPPER project (Lillehagen and Krogstie, 2008), highlight the potential for such knowledge sharing with a case from Kongsberg Automotive (KA). The needs of Kongsberg are very similar to the needs expressed by companies in other industrial sectors, such as aerospace, construction and energy. In the short term, Kongsberg's needs and goals were to:
- Capture and correctly interpret customer requirements
- Create role-specific, simple-to-use and re-configurable workplaces
- Create effective workplace views and services for data handling
- Improve the quality of specifications for customers and suppliers
- Improve communications and coordination among stakeholders

- Find a sound methodology for product parameterization, automating most of the tasks for product model customized engineering.

To fulfill these goals, KA investigated the AKM approach, adapting several methodologies and building model-based workplaces, as illustrated in the following case. Material specifications are the core knowledge of the collaboration between the customer, represented by Kongsberg Automotive (KA), and the supplier, represented by Elektrisola (E). As illustrated in Figure 46, the material specification is today managed as a document, typically created in Microsoft Word. The content of a specific version of the material specification is put together by one person in KA and approved by one person in E, and both companies file one copy of the approved material specification. Over time, additional customer requirements and changes to requirements need to be communicated, resulting in new parameter values in new versions of the document. The biggest disadvantages of this solution are:
- The content of the material specifications is not easily accessed and cannot contribute directly to the two companies' operations.
- The process and work logic needed to achieve a consistent specification is not captured, making integration with other processes impossible.
- The involvement and commitment of the supplier is not encouraged; there is no support for mutual adjustments in supply and demand.
- Keeping the material specifications updated in both companies can be quite time consuming.

Figure 46: Illustrating the current work logic with the material specification document (Lillehagen and Krogstie, 2008)

The general approach, as illustrated in Figure 47, has been to replace the document with an operational knowledge architecture built using the Configurable Visual Workplace (CVW) module developed by AKM within the MAPPER project. The biggest advantages of the model-based, knowledge-architected solution are:
- The content of the material specifications will be easy to access for both companies and can be part of each company's complete knowledge architecture, provided that the model-based solution replaces the document-based solution for other applications within the companies.
- The involvement of the supplier will be encouraged and the supplier's commitment will be more visible.
- The time for updating the material specifications is expected to be reduced in both companies.
- There is no longer any real need for filed paper copies.

Figure 47: Model-configured Workplaces driven by Active Knowledge Architectures

Enterprise Knowledge Architecture (EKA)

Figure 48 defines the core constructs used for representing models on the technical layer. All constructs are regarded as Elements. Models contain elements, but one element may be found in multiple models. Models can capture partial, overlapping and incomplete views.

Figure 48: Core EKA elements

Conventionally, most model elements will be Objects. All kinds of elements may have Properties, and Relationships link two elements through Origin and Target Roles. Relationships, roles and properties are also elements, so they may possess properties and have relationships to other elements. The EKA does not separate between meta-classes, classes and instances, because "one person's roof is another one's floor": an instance in one view may be a class in another. AKM models represent mutually reflective views. Instead, a special relationship called Is between two objects (or relationships, or properties) denotes that the origin is defined by the target, and can thus express both specialization (Student is Person) and instantiation (George is-a Person). The instantiation relationship Is-a has similar meaning to Is, but it is used to separate meta-levels (for the modelling contexts where this is required). Finally, Equals is a bi-directional Is-relationship, which implies that the two elements represent the same thing. Equals is typically used for representing mappings between models that represent different perspectives on the same domain. Other relationship types include general links and associations, and decomposition with (Part) and without (Member) ownership. Relationships and properties have cardinality. Note that this approach enables classification, decomposition and states of properties, relationships and views, just like objects. Values are also represented as first-class model elements. This implies that values can be related to other values (e.g. derived-from or in-conflict-with), that values may be shared between multiple elements, and that value sets may be defined (using member relationships). Roles are values that reference other elements. Properties are modelled as relationships from an element to a value.
A Parameter is a relationship between a task and a value, signifying that the task processes or is influenced by the value. The right column of the EKA core concepts model defines basic constructs for representing the dynamic execution aspects of model elements. As a core concept, Task represents any unit of behaviour or action. Events represent the statement that something has happened, and may trigger tasks. A Rule is a special kind of task that defines constraints, laws or intentions that should be enforced.
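The EKA constructs described above can be illustrated with a small sketch. This is a hypothetical, simplified encoding for illustration only; the actual EKA is defined as a visual metamodel, not as code. The key ideas shown are that everything, including relationships, is an Element that may carry properties, and that Is relationships express "origin is defined by target" without fixed meta-levels, covering both specialization and instantiation:

```python
class Element:
    """Everything in the EKA is an Element and may carry properties
    and relationships; relationships are themselves elements."""
    def __init__(self, name):
        self.name = name
        self.properties = {}       # property name -> value
        self.relationships = []    # Relationship elements with this as origin

class Relationship(Element):
    """Links two elements through an origin and a target."""
    def __init__(self, kind, origin, target):
        super().__init__(kind)
        self.origin, self.target = origin, target
        origin.relationships.append(self)

def is_defined_by(element, definer):
    """Follow 'Is' relationships transitively: the origin is defined by
    the target, whether the link expresses specialization (Student is
    Person) or instantiation (George is-a Person)."""
    for rel in element.relationships:
        if rel.name == "Is" and (rel.target is definer
                                 or is_defined_by(rel.target, definer)):
            return True
    return False

# The Student/Person/George example from the text
person = Element("Person")
student = Element("Student")
george = Element("George")
Relationship("Is", student, person)   # specialization
Relationship("Is", george, student)   # instantiation, via the same construct
```

Because the Is chain is followed transitively, `is_defined_by(george, person)` holds even though George is linked only to Student, which mirrors how the EKA avoids a hard split between instances and classes.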

7.5 Appendix 5: Interview guide for ERP and PLM systems

Interview guide LinkedDesign: Analysis of collaborative planning in ERP and PLM systems

PART 1: Key data on vendor and software
- Market share size, in Norway, in Europe?
- Main types of customers: SMEs or large companies?
- Does the customer buy the full package or selected modules? Base installation and add-ons?
- What infrastructure or other systems are required for using the software?

PART 2: Current functionality
- What functionality currently supports:
  o Collaboration in product development
  o Collaboration in the design & engineering phase
  o Collaboration in the manufacturing stage
  o The end-of-life phase: maintenance, removal from service and disposal
- How does the system currently support:
  o Change orders
  o Component changes
  o Cost information related to changes?
- How does it support "generic products" or product families overall?
- How is the software package structured into main modules?
- Which industry-tailored solutions, if any, exist?
- How does the ERP/PLM system support the PLAN phase of PLM processes?

PART 3: Future trends and functionality
- In which areas does the application need to improve?
- What are the customers asking for that cannot currently be solved with the solution?

PART 4: Data model
- How does the tool represent data on:
  o Products/product structures
  o Processes and tasks
  o Goals and rules
  o Persons and roles
  o Environment and context
- How is access to data controlled (data management, security, etc.)?
- Is it possible to export/import data in standard formats? If so, which standards are supported?
- Is it possible to invoke services/access data in the tool from outside the tool?

The following Teamcenter PLM interviews were conducted:

ID | Date | Interviewer | Source | Duration
PLM # | | Ottar Bakås | Jørn Atle Husbyn, Application Engineer, Summit Systems | 1 hour
PLM # | | Pavan Sriram | Jørn Atle Husbyn, Application Engineer, Summit Systems | 1 hour
PLM # | | Ottar Bakås, Pavan Sriram | Jørn Atle Husbyn, Application Engineer, Summit Systems; Lars Fossum, Pre-sales, Summit Systems | 2 hours

7.6 Appendix 6: Interview guide for industrial partners

Interview guide LinkedDesign: Interviews with use case companies Aker Solutions, Volkswagen and Comau

Introduction: The purpose of this document is to serve as a guideline for industry-targeted interviews in LinkedDesign. The interview guide is written for Task 5.3, Collaborative Planning through effective order management. The target interviewees are project managers, engineers and manufacturing managers from the three industrial users in the project: Aker Solutions, Comau and Volkswagen. The interview guide is divided into four topic areas:
1. Order acquisition: tendering processes leading to an industrial project / production order
2. Order management: planning and follow-up of a sales order or production order
3. Engineering changes: management of engineering changes
4. Exceptions handling: managing exceptions and errors occurring during the life cycle

Base scenario: Some of the questions in the interview are loosely based on the review story scenario, which includes the following roles:
- Pablo Project: Project Manager
- Doris Designer: Parts Designer
- Lenny Lifecycle: Expert in lifecycle calculations
- Volker Wagen: Quality Controller

The following roles have been included in addition:
- Manny Manufacturer: Manufacturing Manager
- Marcus Marketing: Marketing and Sales Manager
- Knut KE: Knowledge engineer
- Steve Supplier: Key supplier representative
- Chris Customer: Customer representative

The relevant parts of the review story scenario are included in the text below.

1. Order acquisition

New scenario: Marcus Marketing is attending a professional conference where he learns about new EC regulations that will affect their design. He also talks to existing customers who have ideas for new features in one of the company's existing products. Marcus Marketing needs to communicate this information to Doris Designer.
- How do you collaborate within the organization to capture market knowledge and acquire new projects?
- How important is this pre-sales process for the design solutions in the final project?
- How is the risk associated with new projects assessed?
- Which tools do you use today to support the order acquisition process?
- What are the limitations of the current support?

Future possibility: Mock-ups, see presentation

2. Order fulfillment
- How do you collaborate internally and with your suppliers and customers to execute an ongoing project?
- How do you collaborate internally on fulfilling customer orders / projects?
- In which areas do you collaborate most frequently with suppliers and customers in order fulfillment?
- How do you assign key resources across projects?
- How do the tools support this today?
- What are the limitations of the current support?
- Who can provide additional support for this process? (e.g. external experts)

Future possibility: mock-ups

3. Change order management / engineering changes

Scenario story sequence #2: Pablo Project is a manager at Volkswagen. Due to EC regulations on emission requirements for cars, the mission is to reduce the weight of the new Volkswagen Golf. Note: the request for a change may be internal or external, and due to a multitude of reasons.
- How would you deal with requests for change in your designs?
- How do you manage change requests from the customer?
- What information is required to act on the change request?
- Who will be impacted by the change? And how will they be impacted?
- What types of changes are most common?
- What are the most frequent impacts of changes: time, cost, resources? All?
- How do your current tools support this today?
- What are the limitations of the current support?
- Who can provide additional support for this process? (e.g. external experts)

Future possibility: mock-up

4. Error handling

Scenario story sequence #13: While Lenny and Volker discuss the numbers on the phone, a notification appears in Volker's inbox. A defective part has just been produced.
- How would you deal with errors during production? Who is involved?
- Who will be impacted by the error? And how will they be impacted?
- What are the impacts of this error: time, cost, resources?
- What information is required to act on the error and correct it?
- How do the tools support this?
- What are the limitations of the current support?
- Who can provide additional support for this process? (e.g. external experts)

Future possibility: Collaborative process


Contents. visualintegrator The Data Creator for Analytical Applications. www.visualmetrics.co.uk. Executive Summary. Operational Scenario About visualmetrics visualmetrics is a Business Intelligence (BI) solutions provider that develops and delivers best of breed Analytical Applications, utilising BI tools, to its focus markets. Based in

More information

Extracting Business. Value From CAD. Model Data. Transformation. Sreeram Bhaskara The Boeing Company. Sridhar Natarajan Tata Consultancy Services Ltd.

Extracting Business. Value From CAD. Model Data. Transformation. Sreeram Bhaskara The Boeing Company. Sridhar Natarajan Tata Consultancy Services Ltd. Extracting Business Value From CAD Model Data Transformation Sreeram Bhaskara The Boeing Company Sridhar Natarajan Tata Consultancy Services Ltd. GPDIS_2014.ppt 1 Contents Data in CAD Models Data Structures

More information

EPM Live Presentation. EPM Live Solution Overview

EPM Live Presentation. EPM Live Solution Overview Presentation Solution Overview Social Project & Work Management Social Project Management Product Focus Area Customers interested in an Enterprise Project and Social Project Management application to manage

More information

Whitepaper Data Governance Roadmap for IT Executives Valeh Nazemoff

Whitepaper Data Governance Roadmap for IT Executives Valeh Nazemoff Whitepaper Data Governance Roadmap for IT Executives Valeh Nazemoff The Challenge IT Executives are challenged with issues around data, compliancy, regulation and making confident decisions on their business

More information

Anatomy of an Enterprise Software Delivery Project

Anatomy of an Enterprise Software Delivery Project Chapter 2 Anatomy of an Enterprise Software Delivery Project Chapter Summary I present an example of a typical enterprise software delivery project. I examine its key characteristics and analyze specific

More information

Microsoft SOA Roadmap

Microsoft SOA Roadmap Microsoft SOA Roadmap Application Platform for SOA and BPM Thomas Reimer Enterprise Technology Strategist, SOA and BPM Microsoft Corporation (EMEA) Trends and Roadmap THE FUTURE OF DYNAMIC IT Market Trends

More information

A Closer Look at BPM. January 2005

A Closer Look at BPM. January 2005 A Closer Look at BPM January 2005 15000 Weston Parkway Cary, NC 27513 Phone: (919) 678-0900 Fax: (919) 678-0901 E-mail: info@ultimus.com http://www.ultimus.com The Information contained in this document

More information

Franco Furlan Middle and Eastern Europe CoE for Analytics

Franco Furlan Middle and Eastern Europe CoE for Analytics Franco Furlan Middle and Eastern Europe CoE for Analytics 1 Creating Value through Finance Organizations Business Partnership Compliance Financial Planning and Analysis Accounting and Financial Close Treasury

More information

The overall aim for this project is To improve the way that the University currently manages its research publications data

The overall aim for this project is To improve the way that the University currently manages its research publications data Project Plan Overview of Project 1. Background The I-WIRE project will develop a workflow and toolset, integrated into a portal environment, for the submission, indexing, and re-purposing of research outputs

More information

Semantic Business Process Management Lectuer 1 - Introduction

Semantic Business Process Management Lectuer 1 - Introduction Arbeitsgruppe Semantic Business Process Management Lectuer 1 - Introduction Prof. Dr. Adrian Paschke Corporate Semantic Web (AG-CSW) Institute for Computer Science, Freie Universitaet Berlin paschke@inf.fu-berlin.de

More information

Using Measurement to translate Business Vision into Operational Software Strategies

Using Measurement to translate Business Vision into Operational Software Strategies Using Measurement to translate Business Vision into Operational Software Strategies Victor R. Basili University of Maryland and Fraunhofer Center - Maryland BUSINESS NEEDS Any successful business requires:

More information

Model Driven Interoperability through Semantic Annotations using SoaML and ODM

Model Driven Interoperability through Semantic Annotations using SoaML and ODM Model Driven Interoperability through Semantic Annotations using SoaML and ODM JiuCheng Xu*, ZhaoYang Bai*, Arne J.Berre*, Odd Christer Brovig** *SINTEF, Pb. 124 Blindern, NO-0314 Oslo, Norway (e-mail:

More information

EU CUSTOMS BUSINESS PROCESS MODELLING POLICY

EU CUSTOMS BUSINESS PROCESS MODELLING POLICY EUROPEAN COMMISSION MASP Revision 2014 v1.1 ANNEX 4 DIRECTORATE-GENERAL TAXATION AND CUSTOMS UNION Customs Policy, Legislation, Tariff Customs Processes and Project Management Brussels, 03.11.2014 TAXUD.a3

More information

Advancing Your Business Analysis Career Intermediate and Senior Role Descriptions

Advancing Your Business Analysis Career Intermediate and Senior Role Descriptions Advancing Your Business Analysis Career Intermediate and Senior Role Descriptions The role names listed in the Career Road Map from International Institute of Business Analysis (IIBA) are not job titles

More information

Project Execution Guidelines for SESAR 2020 Exploratory Research

Project Execution Guidelines for SESAR 2020 Exploratory Research Project Execution Guidelines for SESAR 2020 Exploratory Research 04 June 2015 Edition 01.01.00 This document aims at providing guidance to consortia members on the way they are expected to fulfil the project

More information

Independent process platform

Independent process platform Independent process platform Megatrend in infrastructure software Dr. Wolfram Jost CTO February 22, 2012 2 Agenda Positioning BPE Strategy Cloud Strategy Data Management Strategy ETS goes Mobile Each layer

More information

Business Process Management In An Application Development Environment

Business Process Management In An Application Development Environment Business Process Management In An Application Development Environment Overview Today, many core business processes are embedded within applications, such that it s no longer possible to make changes to

More information

A SOA visualisation for the Business

A SOA visualisation for the Business J.M. de Baat 09-10-2008 Table of contents 1 Introduction...3 1.1 Abbreviations...3 2 Some background information... 3 2.1 The organisation and ICT infrastructure... 3 2.2 Five layer SOA architecture...

More information

To introduce software process models To describe three generic process models and when they may be used

To introduce software process models To describe three generic process models and when they may be used Software Processes Objectives To introduce software process models To describe three generic process models and when they may be used To describe outline process models for requirements engineering, software

More information

forecasting & planning tools

forecasting & planning tools solutions forecasting & planning tools by eyeon solutions january 2015 contents introduction 4 about eyeon 5 services eyeon solutions 6 key to success 7 software partner: anaplan 8 software partner: board

More information

NASCIO EA Development Tool-Kit Solution Architecture. Version 3.0

NASCIO EA Development Tool-Kit Solution Architecture. Version 3.0 NASCIO EA Development Tool-Kit Solution Architecture Version 3.0 October 2004 TABLE OF CONTENTS SOLUTION ARCHITECTURE...1 Introduction...1 Benefits...3 Link to Implementation Planning...4 Definitions...5

More information

CAD/CAE systems and cost engineering

CAD/CAE systems and cost engineering CAD/CAE systems and cost engineering The purpose of this article is to explain how total life-cycle solutions can help engineering procurement and construction and plant owner companies meet today s business

More information

D 8.2 Application Definition - Water Management

D 8.2 Application Definition - Water Management (FP7 609081) Date 31st July 2014 Version [1.0] Published by the Almanac Consortium Dissemination Level: Public Project co-funded by the European Commission within the 7 th Framework Programme Objective

More information

Enterprise Architecture Assessment Guide

Enterprise Architecture Assessment Guide Enterprise Architecture Assessment Guide Editorial Writer: J. Schekkerman Version 2.2 2006 Preface An enterprise architecture (EA) establishes the organization-wide roadmap to achieve an organization s

More information

A discussion of information integration solutions November 2005. Deploying a Center of Excellence for data integration.

A discussion of information integration solutions November 2005. Deploying a Center of Excellence for data integration. A discussion of information integration solutions November 2005 Deploying a Center of Excellence for data integration. Page 1 Contents Summary This paper describes: 1 Summary 1 Introduction 2 Mastering

More information

How To Use Cenitspin

How To Use Cenitspin cenitspin The PLM Software Solution cenitspin: Simple, Intuitive and User-Friendly PLM DESKTOP www.cenit.com/cenitspin Everything at a Glance with the PLM Desktop The easy-to-operate PLM software cenitspin

More information

FTA Technology 2009 IT Modernization and Business Rules Extraction

FTA Technology 2009 IT Modernization and Business Rules Extraction FTA Technology 2009 IT Modernization and Business Rules Extraction August 5th, 2009 _experience the commitment TM Agenda IT Modernization Business Rules Extraction Automation Tools for BRE BRE Cost and

More information

Enterprise resource planning Product life-cycle management Information systems in industry ELEC-E8113

Enterprise resource planning Product life-cycle management Information systems in industry ELEC-E8113 Enterprise resource planning Product life-cycle management Information systems in industry ELEC-E8113 Contents Enterprise resource planning (ERP) Product data management (PDM) Product lifecycle management

More information

Data Governance, Data Architecture, and Metadata Essentials Enabling Data Reuse Across the Enterprise

Data Governance, Data Architecture, and Metadata Essentials Enabling Data Reuse Across the Enterprise Data Governance Data Governance, Data Architecture, and Metadata Essentials Enabling Data Reuse Across the Enterprise 2 Table of Contents 4 Why Business Success Requires Data Governance Data Repurposing

More information

Total Exploration & Production: Field Monitoring Case Study

Total Exploration & Production: Field Monitoring Case Study Total Exploration & Production: Field Monitoring Case Study 1 Summary TOTAL S.A. is a word-class energy producer and provider, actually part of the super majors, i.e. the worldwide independent oil companies.

More information

Engineering Document Release Management

Engineering Document Release Management Engineering Document Release Management Design Release Archive ImageSite is a comprehensive yet easy-to-use Engineering Document Management system that automates the engineering data lifecycle across your

More information

Long Term Knowledge Retention and Preservation

Long Term Knowledge Retention and Preservation Long Term Knowledge Retention and Preservation Aziz Bouras University of Lyon, DISP Laboratory France abdelaziz.bouras@univ-lyon2.fr Recent years: How should digital 3D data and multimedia information

More information

Simplify and Automate IT

Simplify and Automate IT Simplify and Automate IT Expectations have never been higher Reduce IT Costs 30% increase in staff efficiency Reduce support costs by 25% Improve Quality of Service Reduce downtime by 75% 70% faster MTTR

More information

Software Engineering. Software Processes. Based on Software Engineering, 7 th Edition by Ian Sommerville

Software Engineering. Software Processes. Based on Software Engineering, 7 th Edition by Ian Sommerville Software Engineering Software Processes Based on Software Engineering, 7 th Edition by Ian Sommerville Objectives To introduce software process models To describe three generic process models and when

More information

Modeling Guidelines Manual

Modeling Guidelines Manual Modeling Guidelines Manual [Insert company name here] July 2014 Author: John Doe john.doe@johnydoe.com Page 1 of 22 Table of Contents 1. Introduction... 3 2. Business Process Management (BPM)... 4 2.1.

More information

Family Evaluation Framework overview & introduction

Family Evaluation Framework overview & introduction A Family Evaluation Framework overview & introduction P B Frank van der Linden O Partner: Philips Medical Systems Veenpluis 4-6 5684 PC Best, the Netherlands Date: 29 August, 2005 Number: PH-0503-01 Version:

More information

Combining Security Risk Assessment and Security Testing based on Standards

Combining Security Risk Assessment and Security Testing based on Standards Jürgen Großmann (FhG Fokus) Fredrik Seehusen (SINTEF ICT) Combining Security Risk Assessment and Security Testing based on Standards 3 rd RISK Workshop at OMG TC in Berlin, 2015-06-16 3 rd RISK Workshop

More information

Issue in Focus: Integrating Cloud PLM. Considerations for Systems Integration in the Cloud

Issue in Focus: Integrating Cloud PLM. Considerations for Systems Integration in the Cloud Issue in Focus: Integrating Cloud PLM Considerations for Systems Integration in the Cloud 1 Tech-Clarity, Inc. 2012 Table of Contents Introducing the Issue... 3 Start with the Business in Mind... 4 Choose

More information

ORACLE HYPERION PLANNING

ORACLE HYPERION PLANNING ORACLE HYPERION PLANNING ENTERPRISE WIDE PLANNING, BUDGETING, AND FORECASTING KEY FEATURES Hybrid data model facilitates planning, analysis and commentary Flexible workflow capabilities Reliability with

More information

Oracle Hyperion Planning

Oracle Hyperion Planning Oracle Hyperion Planning Oracle Hyperion Planning is an agile planning solution that supports enterprise wide planning, budgeting, and forecasting using desktop, mobile and Microsoft Office interfaces.

More information

11 Tips to make the requirements definition process more effective and results more usable

11 Tips to make the requirements definition process more effective and results more usable 1 11 Tips to make the s definition process more effective and results more usable This article discusses what I believe are the key techniques for making s definition process repeatable from project to

More information

Extended Enterprise Architecture Framework Essentials Guide

Extended Enterprise Architecture Framework Essentials Guide Extended Enterprise Architecture Framework Essentials Guide Editorial Writer: J. Schekkerman Version 1.5 2006 Preface An enterprise architecture (EA) establishes the organization-wide roadmap to achieve

More information

TECHNOLOGY SOLUTIONS FOR THE INTERNAL AUDITOR

TECHNOLOGY SOLUTIONS FOR THE INTERNAL AUDITOR TECHNOLOGY SOLUTIONS FOR THE INTERNAL AUDITOR (BUY VS BUILD) APRIL 17, 2015 LEVERAGING TECHNOLOGY FOR AUDIT Utilizing Software to Administrate Audit Process 40% 35% 30% 37% Tools Leveraged 32% 36% Yes

More information

Data Governance, Data Architecture, and Metadata Essentials

Data Governance, Data Architecture, and Metadata Essentials WHITE PAPER Data Governance, Data Architecture, and Metadata Essentials www.sybase.com TABLE OF CONTENTS 1 The Absence of Data Governance Threatens Business Success 1 Data Repurposing and Data Integration

More information

MDM and Data Warehousing Complement Each Other

MDM and Data Warehousing Complement Each Other Master Management MDM and Warehousing Complement Each Other Greater business value from both 2011 IBM Corporation Executive Summary Master Management (MDM) and Warehousing (DW) complement each other There

More information

Microsoft SharePoint THE PLATFORM ENTERPRISES NEED

Microsoft SharePoint THE PLATFORM ENTERPRISES NEED Microsoft SharePoint THE PLATFORM ENTERPRISES NEED Presentation Outline Purpose of the Presentation The Right Team Introduction to SharePoint SharePoint as DMS SharePoint ac ECM SharePoint for Workflows

More information

Ontology and automatic code generation on modeling and simulation

Ontology and automatic code generation on modeling and simulation Ontology and automatic code generation on modeling and simulation Youcef Gheraibia Computing Department University Md Messadia Souk Ahras, 41000, Algeria youcef.gheraibia@gmail.com Abdelhabib Bourouis

More information

Regulated Documents. A concept solution for SharePoint that enables FDA 21CFR part 11 compliance when working with digital documents

Regulated Documents. A concept solution for SharePoint that enables FDA 21CFR part 11 compliance when working with digital documents Regulated Documents A concept solution for SharePoint that enables FDA 21CFR part 11 compliance when working with digital documents Contents Life science industry challenges Regulated Documents our service

More information

Microsoft Dynamics AX 2012 A New Generation in ERP

Microsoft Dynamics AX 2012 A New Generation in ERP A New Generation in ERP Mike Ehrenberg Technical Fellow Microsoft Corporation April 2011 Microsoft Dynamics AX 2012 is not just the next release of a great product. It is, in fact, a generational shift

More information

PRODUCT DEVELOPMENT BEST PRACTICES AND ASSESSMENT

PRODUCT DEVELOPMENT BEST PRACTICES AND ASSESSMENT PRODUCT DEVELOPMENT BEST PRACTICES AND ASSESSMENT Kenneth Crow DRM Associates 310-377-5569 k.crow@npd-solutions.com BEST PRACTICES Multi-year effort by consortium to identify best practices of product

More information

Solution Architecture Overview. Submission Management. 2015 The Value Enablement Group, LLC. All rights reserved.

Solution Architecture Overview. Submission Management. 2015 The Value Enablement Group, LLC. All rights reserved. Solution Architecture Overview Submission Management 1 Submission Management Overview Sources of Record MDM Manually Captured Lifecycle Events PLM Repository Data Domain Objects supporting Submission Process

More information

Microsoft Business Analytics Accelerator for Telecommunications Release 1.0

Microsoft Business Analytics Accelerator for Telecommunications Release 1.0 Frameworx 10 Business Process Framework R8.0 Product Conformance Certification Report Microsoft Business Analytics Accelerator for Telecommunications Release 1.0 November 2011 TM Forum 2011 Table of Contents

More information

Data Consistency Management Overview January 2014. Customer

Data Consistency Management Overview January 2014. Customer Data Consistency Management Overview January 2014 Customer Agenda Motivation SAP Solution Manager as Tool for Data Consistency Management Transactional Correctness (TC) Guided Self Service Data Consistency

More information

The Accenture/ Siemens PLM Software Alliance

The Accenture/ Siemens PLM Software Alliance The Accenture/ Siemens PLM Software Alliance Enabling Efficient Product Lifecycle Management Companies in a wide range of industries rely upon Product Lifecycle Management (PLM) to grow their business,

More information

Capabilities, Sample Use Cases, Case Studies

Capabilities, Sample Use Cases, Case Studies Capabilities, Sample Use Cases, Case Studies Core capabilities of Diaku Axon Visibility & Understanding Analysis & Alignment Control Measurability Collaborate on a shared understanding of the organisation

More information

NSW Government Open Data Policy. September 2013 V1.0. Contact

NSW Government Open Data Policy. September 2013 V1.0. Contact NSW Government Open Data Policy September 2013 V1.0 Contact datansw@finance.nsw.gov.au Department of Finance & Services Level 15, McKell Building 2-24 Rawson Place SYDNEY NSW 2000 DOCUMENT CONTROL Document

More information

Building a Data Quality Scorecard for Operational Data Governance

Building a Data Quality Scorecard for Operational Data Governance Building a Data Quality Scorecard for Operational Data Governance A White Paper by David Loshin WHITE PAPER Table of Contents Introduction.... 1 Establishing Business Objectives.... 1 Business Drivers...

More information

HP Systinet. Software Version: 10.01 Windows and Linux Operating Systems. Concepts Guide

HP Systinet. Software Version: 10.01 Windows and Linux Operating Systems. Concepts Guide HP Systinet Software Version: 10.01 Windows and Linux Operating Systems Concepts Guide Document Release Date: June 2015 Software Release Date: June 2015 Legal Notices Warranty The only warranties for HP

More information

How to simplify the evolution of business process lifecycles

How to simplify the evolution of business process lifecycles How to simplify the evolution of business process lifecycles Dr Alexander Samarin Independent consultant, Switzerland www.improving-bpm-systems.com samarin@bluemail.ch Abstract. My experience shows that

More information

OpenAIRE Research Data Management Briefing paper

OpenAIRE Research Data Management Briefing paper OpenAIRE Research Data Management Briefing paper Understanding Research Data Management February 2016 H2020-EINFRA-2014-1 Topic: e-infrastructure for Open Access Research & Innovation action Grant Agreement

More information

Increasing Development Knowledge with EPFC

Increasing Development Knowledge with EPFC The Eclipse Process Framework Composer Increasing Development Knowledge with EPFC Are all your developers on the same page? Are they all using the best practices and the same best practices for agile,

More information