Industrialised Test Automation


Whitepaper
Industrialised Test Automation
Harmonising Testing for Mechatronic Systems

Authors:
Rainer Anders, Senior Consultant
Peter Bölter, Delivery Manager
Thomas Thurner, Market Unit Director ISS

SQS Software Quality Systems AG, Germany
Published: August 2013

SQS: the world's leading specialist in software quality (sqs.com)

Rainer Anders
Senior Consultant & SPICE Assessor, Champion of Innovation Group Automotive Engineering
rainer.anders@sqs.com

Rainer Anders has been working for SQS since November 2000. As an ISO 15504/SPICE Assessor and Senior Consultant, he is responsible for process improvement as well as project, quality and test management in large and complex projects and organisational environments. He interprets the relevant norms from a practitioner's point of view and integrates this interpretation into the customer's organisation. For more than 20 years, he has been involved in process improvement and testing for various enterprises in the automotive industry, IT service providers and financial services.

Peter Bölter
Delivery Manager
peter.boelter@sqs.com

Peter Bölter is a business graduate who specialised in management and applied computer science. He has worked for SQS since 1988 in the fields of testing, quality assurance and quality management. As Director of the Bootstrap Institute, he played an important role in developing and establishing the SPICE software process assessment model in Europe in the 1990s. Since 2006 he has been President of intacs e.V., which defines and regularly checks training standards for process assessors. He has undertaken national and international management tasks within SQS, including as Managing Director of SQS Nederland B.V. Mr Bölter is currently Delivery Manager in the Industrial Services and Solutions Market Unit and responsible for SQS's solutions in the customer environment. He is also the Quality Representative for this Market Unit's ISO 9001-certified QM system.

Thomas Thurner
Market Unit Director Industrial Services and Solutions
thomas.thurner@sqs.com

Thomas Thurner holds a diploma in Electrical Engineering with a focus on Data Processing and Telecommunication. He has worked with SQS for six years, responsible for the Market Unit Industrial Services and Solutions. Before that, he worked for 19 years in several different positions as an Engineer, Project Manager and Division Manager in the field of automotive embedded systems. His spectrum includes the development and testing of (safety-related) mechatronic systems, real-time operating systems, data bus networks, quality assurance and fault-tolerant architectures.

Contents
1. Management Summary
2. Introduction
2.1. Industrialisation
2.2. The challenge
2.3. Setting the Scene
3. Market: Current Status and Outlook
4. Industrial Test Automation
4.1. Testing
4.2. Test Automation
4.3. Test Processes
4.4. ISO 15504/SPICE
4.5. TestSPICE
4.5.1. Primary Life Cycle Processes
4.5.2. Supporting Life Cycle Processes
4.5.3. Organisational Life Cycle Processes
4.6. TestSPICE Level
4.7. Industrialised Test Automation
5. Conclusion and Outlook
6. Bibliographical References

1. Management Summary

An IT system's value depends mainly on the software that defines how the system works, how it behaves and how it can be maintained. Because of this dominant role of software, the discipline of software testing is well understood and has been established for more than 30 years. For mechatronic systems, this awareness is still quite new. Mechatronic systems consist of combinations of software and electronic and mechanical hardware. They are usually decomposed into many subsystems with a number of electrical and mechanical interfaces. Mechatronic systems take many different signals as inputs and drive actuators as outputs that might also affect safety (safety-relevant systems). Moreover, the physical reaction of many outputs is measured and used as input (closed control loop). In these mechatronic systems, software is also becoming more and more important. Today, a typical car consists of more than 200 different subsystems with, altogether, more than 10 million lines of code, requiring us to increase and improve test activities for software as well. Mechatronic systems are usually composed of many different mechatronic subsystems, each with its own supplier. Each supplier, in turn, tries to deliver its subsystems to as many customers as possible, so its focus is on delivering, developing and testing product lines. These product lines are usually carefully constructed from many different models that can also be used for testing. By substituting more and more simulated modules with real-world testware in a hardware-in-the-loop (HIL) environment, the different test stages evolve towards the final system test with a dedicated HIL test system.
From a system manufacturer's perspective, the overall management of the supply chains is much more difficult in terms of testing: their focus is on integration testing, verifying millions of combinations of different mechatronic subsystems delivered by many different suppliers, each having its own simulation environment, its specific test process and its specific test approach. This paper proposes an industrial test automation approach that harmonises these different testing activities applied to the different subsystems by different suppliers. This harmonisation enables reuse of testware, test environments, simulations, HILs, test automation and testing concepts across the whole supply chain. Even the combination of model-based testing (simulation) and HIL can be reused at the system level. For harmonisation, this paper proposes TestSPICE as a well-established process reference. The focus of this paper is therefore on improving the underlying testing process for a mechatronic system itself, not directly on improving product quality, which is a long-term derived consequence. The overall goal is to increase test efficiency through a high degree of reuse and to increase effectiveness by combining different test environments to provide more intermediate test stages on the integration path. Most of the benefits can be leveraged by those stakeholders that integrate subsystems into systems (OEMs and higher-level suppliers). However, even lower-level suppliers can benefit from this concept, as it simplifies fulfilling SLAs and gives a clear framework that can be followed directly.

2. Introduction

Even today's modern IT systems can be characterised by the IPO principle. Following this principle, an IT system consists of the following three major components:

- I, Input: the information, resources and ideas that are necessary as a precondition to run an implemented algorithm are collected. Even programs without any obvious input data need at least a "go" signal as input to start execution.
- P, Processing: based on the input data, the calculation given by the implemented algorithm is executed. If some temporary data needs to be stored, a temporary storage system can be modelled as well.
- O, Output: the output of the calculation is given to the outside of the application. Typical output channels are displays, databases or files.

These IT systems are well known due to their global impact. Big IT system clusters like Google or Amazon change everybody's lives, and at present a number of big IT systems are replacing many small IT systems (local office installations, local backup software etc.). Another domain, although it has far more tangible assets, is greatly underestimated in its external perception in terms of software: mechatronic systems. Mechatronic systems are special IT systems embedded in mechanically engineered devices. Mechatronic systems in general apply an adjusted IPO principle:

- The input data as well as the output data is usually available as signals. They can be physical, represented in analogue form or digitally encoded; they can be corrupted by electronic noise and broken off by peaks or interruptions.
- The processing step usually has to be done in real time, i.e. there is a clear contract in terms of time behaviour specifying when an input has to be valid and when an output has to be ready.
- The output data is usually directly connected with a mechanical engineering system. This can be, for example, a robot, an engine, a braking system or a conveyor belt, i.e. mechanical systems might have a safety aspect.
- The output causes a reaction by the mechatronic system, which may be measured and used as input to create a closed control loop.

These additional characteristics explain the interdisciplinary character of mechatronic systems (abbreviated as mechatronics), as displayed in Figure 1.

Figure 1: Involved disciplines of mechatronic systems (taken from Bliss): mechatronics at the intersection of mechanical systems, electronic systems, control systems and digital control systems, with application areas such as automotive, aerospace, medical, xerography, manufacturing, consumer products and defence systems.

These specifics of mechatronic systems have caused many differences in terms of production between typical IT systems and mechatronic systems. The biggest difference is the level of industrialisation.

2.1. Industrialisation

Industrialisation in general means the process of transitioning from an agrarian society into an industrial one, applying industrial production processes. This transition is defined by four major dimensions:

- Standardisation and automation: the many different disciplines involved in mechatronic systems are urged very early to use product standards wherever possible. The computer area, for example, has established a few well-accepted bus standards that are used to enable communication between different mechatronic systems. If there is a general decision to apply mechatronics to a system, then economies of scale suggest replacing as many systems as possible with mechatronics, as the reproduction of the IT part of mechatronics is low-cost and mainly consists of copying.
- Continuous improvement: the existence of standards simplifies continuous improvement, as a benchmark can easily be applied. If two suppliers deliver a system fulfilling a standard, a comparison between the two parts is easily possible and can be used for further improvement.
- Modularisation: based on the (improved) standards, the level of granularity in terms of subsystems can be reduced. Putting standardised parts together might create another part at a higher added-value level. Instead of selling single screws or cables, modularisation suggests selling robots (with a lot of screws and cables).
- Focus on core competency: if a company produces a lot of different parts in order to sell a higher-level module, it will, on the other hand, consume some parts/services to be able to do this. Industrialisation suggests focusing on a specific core competency and outsourcing parts that do not belong to one's own core competency to a supplier.

Industrialisation heavily affects productivity, which increases dramatically for industrialised companies. Companies producing mechatronic systems have been applying this concept for more than 50 years. IT service providers have now started to copy this approach. They started very late and are catching up fast (Walter Brenner). Figure 2 demonstrates this overall trend.
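The closed control loop that distinguishes mechatronic systems from plain IPO systems can be illustrated with a minimal simulation. This is an assumed, purely illustrative sketch (the names `run_closed_loop`, `kp` etc. are not from any real control system): a proportional controller processes a sensed input and drives an actuator, and the plant's physical reaction is fed back as the next input.

```python
# Minimal sketch of the adjusted IPO principle with a closed control loop.
# Input (I): the measured physical state; Processing (P): a proportional
# control law; Output (O): an actuator command that moves the plant, whose
# reaction is measured again on the next cycle.

def run_closed_loop(setpoint, steps=200, dt=0.01, kp=2.0):
    """Drive a first-order plant towards `setpoint` with a P-controller."""
    position = 0.0                               # physical state of the plant
    for _ in range(steps):
        measured = position                      # I: sensor reading (ideal, no noise)
        command = kp * (setpoint - measured)     # P: control law in real time
        position += command * dt                 # O: actuator moves the plant
    return position

if __name__ == "__main__":
    final = run_closed_loop(setpoint=1.0)
    print(f"final position: {final:.3f}")        # converges towards the setpoint
```

A real mechatronic loop adds exactly the complications listed above: noisy signals, hard timing contracts on each cycle, and safety consequences when the actuator misbehaves, which is why testing such loops needs the simulated environments discussed later.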

Figure 2: IT industrialisation follows the classical production domain (taken from Walter Brenner, Source: University of St. Gallen): productivity rises from 1920 to 2020 through standardisation/automation, continuous improvement, modularisation and focus on core competencies, with IT services following industrial production at a delay of several decades.

The concept of focusing on one's core competency has a strong impact on the global delivery chain. Porsche, for example, a typical German car manufacturer, leverages external suppliers for 80 % to 90 % of the overall process chain (cf. Online and Porsche, 2004/2005). This dramatic change of production chains requires the renaming of the basic stakeholders:

- OEM (original equipment manufacturer) refers to a company that consumes products/services made by a second company for use in its own products/services.
- Supplier (or OEM sales) refers to a company that produces products/services for another company, which retails this product/service under its own brand name.

Porsche is a typical OEM. It becomes clear that an OEM can be a supplier as well, i.e. Porsche's parts are used in other cars too.

2.2. The challenge

Today an OEM has a complex, globally dispersed process chain with several suppliers. From a testing perspective, an OEM's focus is mainly on integration testing:

"Testing performed to expose defects in the interfaces and in the interactions between integrated components or systems."
Definition 1: Integration testing (ISTQB definition in Mario Winter, 2012)

Figure 3: Testing activities of OEM and supplier in the adjusted V-model (E/E = electric/electronic). OEM: E/E requirements analysis on system level, E/E system design, requirements analysis on component level, component design; E/E system test on vehicle level, E/E system integration and test (HIL, breadboarding), component test, component integration. Supplier: software design, software implementation, software integration and test, module test.

Integration testing assumes that the internal behaviour has already been tested by the supplier through low-level activities focusing on the implementation and internal structures. This adjusted assignment of responsibilities is presented in the adjusted V-model in Figure 3. As mentioned above, a supplier can itself leverage other suppliers and thereby act as an OEM as well. The concept is quite similar to the famous matryoshka dolls; however, while matryoshka dolls have a natural limit of refinement depth, process chains do not. The challenge covered in this paper can be explained using the matryoshka doll metaphor (see Figure 4). The basic assumptions here are:

- An OEM's integration testing can reuse much of the testware created in the supplier's system testing. Regarding the figure, this means that OEM 1's testing activities can reuse much of the testware created by the supplier 1 system test.
- The supplier's integration test can reuse much of the testware created in the former OEM's system testing.

Of course, both statements are similar; but it is important to note that harmonising and reusing any testware is a win-win constellation for both suppliers and OEMs. The challenge within the mechatronic systems domain is to establish a governance system that enables harmonisation of testing processes and testware (as a result of testing processes) to allow for synergies between different testing activities.
A good example of this approach is the AUTOSAR concept (AUTOSAR, 2012), which describes many software interfaces and their interoperability in a virtual electronic control unit (ECU). AUTOSAR is therefore a basis for unifying the related test processes between different software suppliers along the complete ECU software integration chain.
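The reuse idea behind the matryoshka metaphor can be sketched in a few lines. All names here (`TestwareBundle`, `integration_suite`, the subsystem and test names) are hypothetical and only illustrate the concept: each supplier ships its testware alongside its product, and the next level up reuses those bundles as regression tests, adding only its own interface tests.

```python
# Illustrative sketch: supplier system-test bundles are reused verbatim in the
# OEM's integration suite, so the OEM writes only the cross-subsystem tests.
from dataclasses import dataclass, field

@dataclass
class TestwareBundle:
    subsystem: str
    system_tests: list = field(default_factory=list)  # supplier's system tests

def integration_suite(own_tests, supplier_bundles):
    """OEM integration suite = reused supplier testware + OEM interface tests."""
    suite = []
    for bundle in supplier_bundles:
        # Reuse: the supplier's system tests become OEM regression tests.
        suite.extend((bundle.subsystem, t) for t in bundle.system_tests)
    suite.extend(("integration", t) for t in own_tests)
    return suite

brakes = TestwareBundle("brake_ecu", ["pedal_response", "abs_engage"])
engine = TestwareBundle("engine_ecu", ["idle_speed"])
suite = integration_suite(["brake_engine_torque_interface"], [brakes, engine])
print(len(suite))  # 4 test cases: 3 reused from suppliers, 1 new OEM test
```

The same composition repeats at every matryoshka level: an OEM's suite, bundled with its product, becomes reusable testware for the next-higher integrator.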

Figure 4: OEMs become suppliers for higher process chain levels (towards higher system levels): nested OEM 1 to OEM 3 and supplier 1 to supplier 3.

2.3. Setting the Scene

The approach used in this paper to achieve these goals is to apply a systematic and consistent testing process. While there are different test process maturity models, we focus on TestSPICE, as this is the only maturity model that is compliant with the overall maturity model definition standard ISO 15504, which allows an organisation to apply a set of consistent maturity models (e.g. a combination of SPICE for development processes and TestSPICE for testing processes). Nevertheless, the overall idea of industrial test automation is not restricted to TestSPICE but could utilise many other maturity models as well. The overall message, that testing synergies arise and can be leveraged when the different parties in a complicated process chain apply a consistent and interoperable process, is independent of the specific process model. So the solution provided in Section 4 is detailed using TestSPICE, but other test processes can be used as well.

3. Market: Current Status and Outlook

The market for mechatronic systems is growing very fast, although this trend is often not directly visible. For example, "it now takes dozens of microprocessors running 100 million lines of code to get a premium car out of the driveway, and this software is only going to get more complex" (Charette). The radio and navigation system in the current S-Class Mercedes-Benz alone requires over 20 million lines of code, and the S-Class contains nearly as many ECUs (electronic control units) as the new Airbus A380 (excluding the plane's in-flight entertainment system). This trend is not restricted to the automotive industry but covers all industry-related business domains where mechatronics is relevant. Avionics is another impressive example: "The avionics system in the F-22 Raptor, the current U.S. Air Force frontline jet fighter, consists of about 1.7 million lines of software code. The F-35 Joint Strike Fighter [...] will require about 5.7 million lines of code to operate its on-board systems" (Charette). And this overall growth trend will continue. The e-mobility area, for example, gives a further boost to the importance of IT in mechatronic systems: for makers of hybrid electric vehicles, a transition is anticipated that will transform new cars into "[...] software centric platforms. We can safely say that there's a 40 per cent to 60 per cent increase in software code relative to another car." (Dignan, November 1, 2010) Another trend strongly related to mechatronics is the internet of things, which refers to uniquely identifiable objects (things) and their virtual representations in an Internet-like structure (Wikipedia). Since most things have some mechanical aspects, any growth of the internet of things is strongly related to mechatronics. Even today there are around 9 billion connected devices, each having IT on board that needs to be tested.
"The total figure is set to rise to [...] 24 billion connections by 2020 as a wide range of new devices, machines and vehicles connect to networks" (GSMA). Some more numbers and the associated revenue opportunities are presented in Figure 5. Of course, not all of the total revenue is spent on quality assurance. But today's benchmarks suggest spending between 20 % and 30 % of the overall IT budget on quality assurance (e.g. A. Spillner, 2011). Together with Gartner's analysis (Leclerque, 2010) that the overall IT market will have an outsourcing ratio of about 25 %, this generates the following market view on testing of mechatronic systems:

- Mechatronics will be the key driver for future IT business. The business of classic IT systems will stagnate or even decrease.
- Outsourcing, i.e. having suppliers deliver parts of a system, will be a very important delivery model.
- Testing will be key to allowing the safe use of mechatronic systems.
- Test automation that leverages synergies between different suppliers and/or between supplier and OEM is an excellent lever for saving money in the overall process chain.

Figure 5: Connected devices and associated business for mechatronic systems. The connected life by 2020: total connected devices grow from 9 billion (2011) to 24 billion (2020), mobile connected devices from 6 billion to 12 billion. The revenue opportunity for mobile network operators in 2020 is $1.2 trillion, a 7x increase on 2011 expected revenues; regional revenue opportunities for connected devices in vertical sectors are $241 billion (North America), $305 billion (Europe), $92 billion (Latin America), $87 billion (Middle East and Africa) and $447 billion (Asia Pacific), with vertical sectors of health ($97 billion), automotive ($202 billion), consumer electronics ($445 billion) and utilities ($36 billion): creating opportunities through cross-industry collaboration.

4. Industrial Test Automation

4.1. Testing

Testing is a well-known activity in our daily lives. However, there is a big difference between testing at home and professional testing in the business area. While testing at home is often an ad-hoc final check before an everyday activity (e.g. let's test whether the water is boiling before putting in the noodles), testing in the commercial area has undergone a lot of maturation. Today, testing in commercial areas:

- Starts as soon as possible (e.g. before checking whether the water is boiling, we test whether the oven is switched on)
- Includes all the different quality attributes and is not restricted to functionality (e.g. we also test whether the noodles came from controlled farming)
- Includes all the different attributes that might have an impact on the overall result (e.g. we also test whether the water is clean)

This modern understanding of testing culminates in the ISTQB definition:

"The process consisting of all life cycle activities, both static and dynamic, concerned with planning, preparation and evaluation of software products and related work products to determine that they satisfy specified requirements, to demonstrate that they are fit for purpose and to detect defects."
Definition 2: ISTQB definition of testing (December 2007)

This definition has many implications for mechatronic systems:

- All intermediate results in the supply chain have to be tested as well, even the smallest parts, because the overall quality is defined by the weakest link in the overall process chain.
- The further down the supply chain an intermediate result is located, the more difficult it is to test, since the complete test frame has to be simulated/emulated.
- Mechatronic systems are tested not only for functionality but also for safety, performance, reliability etc.
- It is not only the mechatronic system itself that has to be tested but all related work products as well. This includes documentation, environmental constraints, interfaces etc.

Two aspects are important for the further direction of this solution:

- If all suppliers had a harmonised test approach, and if all suppliers delivered not only the required product but also the testware itself, this would generate a lot of synergies for testing activities between the different suppliers and the OEMs.
- If there are reusable assets between different suppliers, there is a high probability that test automation will be possible, i.e. some activities could be done by IT without any manual interaction.

4.2. Test Automation

According to the ISTQB, test automation is defined as:

"The use of software to perform or support test activities, e.g. test management, test design, test execution and results checking."
Definition 3: Test automation (ISTQB, December 2007)

This automation definition obviously focuses on common IT systems and does not reflect mechatronic systems. Test automation of mechatronic systems usually needs a lot of hardware to simulate and/or emulate the environment. The deeper a product under test is located in the overall process chain, the more difficult it is to test, since the natural environment, i.e. the final system that contains the sub-product, is completely missing. For example, to test a car in the system acceptance test, the test environment consists mainly of a road and some petrol. To test a single wheel, you need a test axle, a test engine to rotate the wheel, a speed measurement device etc. In mechatronics, the dedicated hardware used to implement test automation is usually called a HIL simulator:

"A real-time simulator constructed by hardware and software, which is configured for the control system under consideration and interfaced to the target system or component through appropriate I/O. During testing with an HIL simulator the target system or component will not experience significant difference from being connected to the real system."
Definition 4: HIL simulator (Det Norske Veritas AS)

Both parts of HIL simulators, hardware and software, need a lot of customising for efficient and effective use in specific hardware domains. The hardware does not change the overall character of test automation: even though it increases the overall costs of testing, the most difficult part remains systematically planning and executing tests and compiling meaningful test automation.
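The shape of an automated HIL-style test can be sketched in pure software. This is an assumed, simplified illustration (the unit under test `brake_controller`, the harness `hil_test_braking` and all parameters are invented for this sketch): the harness closes the loop through a simulated plant and checks both the functional result and a real-time contract, here expressed as a maximum number of control cycles to settle.

```python
# Illustrative sketch of an automated HIL-style test run: a software plant
# model stands in for the real rig, the harness stimulates the unit under
# test cycle by cycle, and the verdict covers function AND timing.

def brake_controller(speed, target):
    """Toy unit under test: request deceleration proportional to the error."""
    return max(0.0, 0.5 * (speed - target))

def hil_test_braking(initial_speed=30.0, target=0.0, dt=0.1, deadline_cycles=100):
    """Check that the controller brings the simulated vehicle near the target
    speed within `deadline_cycles` control cycles (the timing contract)."""
    speed = initial_speed
    for cycle in range(deadline_cycles):
        decel = brake_controller(speed, target)  # stimulate the unit under test
        speed = max(0.0, speed - decel * dt)     # simulated vehicle dynamics
        if speed <= target + 0.5:                # settled within tolerance?
            return {"passed": True, "cycles": cycle + 1}
    return {"passed": False, "cycles": deadline_cycles}

result = hil_test_braking()
print(result["passed"], result["cycles"])        # settles within the deadline
```

On a real HIL rig the plant model runs on dedicated real-time hardware and the signals pass through physical I/O, but the harness logic, stimulus, loop closure and pass/fail criteria, is exactly the reusable part this paper argues should be harmonised across the supply chain.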
Applied to the concept of long, distributed process chains, the possible synergy is obvious: the complete HIL simulator, consisting of hardware and software, can be reused at the next level of engineering. Even if the OEM and the supplier have completely different hardware parts, the basic concepts can be reused for Industrialised Test Automation. Typical examples are test concepts, test architectures, test models and test data. If these parameters are consistent between the different suppliers and the OEM, there is a big opportunity for increasing both test effectiveness, i.e. reusing assets from lower-level tests and focusing efforts on the current level, and test efficiency, i.e. automating test execution as much as possible.

4.3. Test Processes

The major challenge is to harmonise HIL simulators in complex process chains between many different suppliers. In general, such initiatives can start from three different points of view, which are classified within the excellence framework for quality management (see Figure 6). The three dimensions, Leadership, Processes and Key performance results, define those dimensions that can be used for the governance of organisations. For the overall goal of harmonisation they can be refined into:

Figure 6: Overall structure of the Excellence Framework for Quality Management (EFQM). Enablers (Leadership; People; Policy & Strategy; Partnerships & Resources; Processes) feed Results (People Results; Customer Results; Society Results; Key Performance Results), linked back through Innovation & Learning.

- Organisational harmonisation (Leadership in EFQM), i.e. establishing common structures within organisations that simplify interoperability between units, roles, hierarchies, leadership etc. of different companies.
- Process harmonisation, i.e. establishing common process frameworks within organisations that simplify interoperability between different sets of activities, quality gates, responsibilities, governance etc. of different companies.
- Product harmonisation (Key Performance Results in EFQM), i.e. harmonising techniques, tools, hardware devices etc.

In this paper we explicitly address the second dimension, process harmonisation. The reason is that product harmonisation might be seen as too fine-grained a governance approach: no company would accept being forced to use specific tools, hardware devices, programming languages and dialects etc. It would further reduce the possibility of establishing a company's unique selling points. On the other hand, harmonisation of the organisation would have too small an impact: the organisation of an ice cream provider can look pretty similar to that of an IT development house; however, this generates only a limited degree of harmonisation. In this paper the harmonisation of processes is recommended: it has defined interfaces to the organisational structure and enforces some level of harmonisation for products. In particular, the overall quality of the products is heavily influenced

by the applied processes. Figure 7 shows a published empirical data set demonstrating how process maturity, i.e. a metric measuring the conformance of an applied process to established reference models, affects product quality. As can be seen, higher process capability goes hand in hand with product maturity being reached as planned.

Figure 7: Relation between process maturity and product quality (cf. J. Etzkorn). Plotting product maturity against process capability ratings (0.00 to 3.50) reveals three clusters. Cluster 1: low process capability, product maturity much later than planned. Cluster 2: transition phase, project management incomplete, product maturity differs. Cluster 3: high process capability, product maturity as planned.

4.4. ISO 15504/SPICE

There are different ways to define process standards. The most common approach is to define so-called maturity levels: instead of a binary decision in terms of fulfilled/not fulfilled, there are different levels of fulfilment. The levels should support the following interpretations:

- A lower level represents a lower degree of fulfilment, i.e. a lower process maturity. It relates to a set of best practices that are not fulfilled completely enough to constitute a perfect process.
- A higher level represents a higher degree of fulfilment, i.e. a higher process maturity. It relates to a set of best practices that are compliant with a reference model.
- The lowest level represents all processes that are not compliant with any best practices. Even the worst process (i.e. one that does not explicitly exist) has this lowest level.
- The highest level represents processes that fulfil all best practices that are part of the reference model.

Since there are many different maturity models measuring different process aspects, it makes sense to apply a standard for measuring process maturity. This enables benchmarking between different process aspects and guarantees some preconditions for levelling. The best-known reference model for measuring process maturity is established in ISO/IEC 15504. This framework can be used by organisations involved in planning, managing, monitoring, controlling and improving the acquisition, supply, development, operation, evolution and support of products and services. It consists of the following five parts:

- ISO/IEC 15504-1:2004, Information technology - Process assessment - Part 1: Concepts and vocabulary. Here, all information on the concepts of process assessment and its use in the two contexts of process improvement and process capability determination is given.
- ISO/IEC 15504-2:2003, Information technology - Process assessment - Part 2: Performing an assessment. Here, all requirements for performing process assessment as a basis for use in process improvement and capability determination are described.
- ISO/IEC 15504-3:2004, Information technology - Process assessment - Part 3: Guidance on performing an assessment.
- ISO/IEC 15504-4:2004, Information technology - Process assessment - Part 4: Guidance on use for process improvement and process capability determination.
- ISO/IEC 15504-5:2006, Information technology - Process assessment - Part 5: An exemplar Process Assessment Model.

It is important to note that SPICE, a specific maturity model for software development, is only presented as an example; the focus of this standard is to define a framework for any process maturity model. For the purpose of this paper, we focus on TestSPICE, a reference maturity model for test processes that is compliant with ISO 15504.

4.5. TestSPICE

The TestSPICE reference model supports a process assessment model that is compliant with ISO 15504 and covers all aspects of testing. Figure 8 gives a high-level overview of the different TestSPICE process categories, which can be classified as primary life cycle processes, supporting life cycle processes and organisational life cycle processes. It is outside the scope of this paper to fully explain TestSPICE; the following subsections give only a short overview. Each subsection ends with a short explanation of how the respective area can be leveraged for process harmonisation in industrialised test automation.

4.5.1. Primary Life Cycle Processes

The primary life cycle process category covers those activities that are directly connected with testing. An overview is given in Figure 9. The purpose of this process group is defined in SIG (2012): "The primary life cycle processes category consists of processes that may be used by the customer when acquiring test services from a supplier, and by the supplier when responding to a tender and delivering test services to the customer, including the testing processes needed for preparation and execution of tests."

Figure 8: TestSPICE process categories and process groups. The model distinguishes three process categories: Primary Life Cycle Processes (process groups: Test Service Acquisition; Testing; Test Service Supply; Test Environment Operation), Supporting Life Cycle Processes (Support) and Organisational Life Cycle Processes (Integrated Management; Resource & Infrastructure; Process Improvement; Test Process Management; Test Regression, Reuse & Maintenance).

As can be seen in Figure 9, the primary life cycle processes consist of the following four subgroups refining this process category (for a more detailed description cf. SIG, 2012):

- The Test Service Acquisition (TSA) process group consists of processes that are performed by the customer in order to acquire a test service. Key processes here are acquisition preparation, supplier selection, contract agreement, test service monitoring and test service acceptance.
- The Testing (TST) process group consists of processes that directly elicit and manage the product or testing requirements and specify, implement or maintain the software or system tests. Key processes here are test requirements analysis, test analysis and design, test realisation and execution, test results analysis and reporting, test automation design and test automation implementation.

Figure 9: TestSPICE's Primary Life Cycle Processes.
- Test Service Acquisition: TSA.1 Acquisition preparation; TSA.2 Supplier selection; TSA.3 Contract agreement; TSA.4 Test service monitoring; TSA.5 Test service acceptance
- Testing: TST.1 Test requirements analysis; TST.2 Test analysis & design (specification); TST.3 Test realisation and execution; TST.4 Test results analysis and reporting; TST.5 Test automation design; TST.6 Test automation implementation; TST.7 Test environment testing
- Test Service Supply: TSS.1 Test supplier tendering; TSS.2 Test service delivery; TSS.3 Test service acceptance support
- Test Environment Operation: TEO.1 Operational use of test environment; TEO.2 Test environment user support

- The Test Service Supply (TSS) process group consists of processes performed by the supplier in order to supply a product or service. Key processes here are test supplier tendering, test service delivery and test service acceptance support.
- The Test Environment Operation (TEO) process group consists of processes that directly address the definition, planning, set-up and support of the test environment. Key processes here are operational use of the test environment and test environment user support.

4.5.2. Supporting Life Cycle Processes

The supporting life cycle process category covers those activities that may be employed by any of the other processes at various points in the life cycle. They are performed to support activities of the primary or management life cycle. An overview is given in Figure 10.

Figure 10: TestSPICE's Supporting Life Cycle Processes: SUP.1 Quality assurance; SUP.2 Verification; SUP.3 Validation; SUP.4 Joint review; SUP.5 Audit; SUP.6 Product and service evaluation for test; SUP.7 Documentation; SUP.8 Configuration management; SUP.9 Problem resolution management; SUP.10 Change request management.
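For assessment tooling, groupings like the ones above are easy to capture as data. The sketch below encodes the primary life cycle processes of Figure 9 as a dictionary and derives the qualified process IDs; the dictionary layout and the `process_ids` helper are our own illustration, not part of TestSPICE.

```python
# Primary life cycle process groups per Figure 9; the dictionary layout is
# an illustrative choice, not prescribed by TestSPICE.
PRIMARY_LIFE_CYCLE = {
    "TSA": ["Acquisition preparation", "Supplier selection",
            "Contract agreement", "Test service monitoring",
            "Test service acceptance"],
    "TST": ["Test requirements analysis", "Test analysis & design",
            "Test realisation and execution",
            "Test results analysis and reporting",
            "Test automation design", "Test automation implementation",
            "Test environment testing"],
    "TSS": ["Test supplier tendering", "Test service delivery",
            "Test service acceptance support"],
    "TEO": ["Operational use of test environment",
            "Test environment user support"],
}


def process_ids(groups):
    """Map qualified IDs such as 'TSA.3' to process names."""
    return {f"{group}.{i}": name
            for group, names in groups.items()
            for i, name in enumerate(names, start=1)}
```

An assessor's checklist can then be keyed by these IDs; for example, `process_ids(PRIMARY_LIFE_CYCLE)["TSA.3"]` yields "Contract agreement".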

Figure 11: TestSPICE's Organisational Life Cycle Processes.
- Integrated Project Management: IPM.1 Organisational alignment; IPM.2 Organisation management; IPM.3 Project management; IPM.4 Quality management; IPM.5 Risk management; IPM.6 Measurement
- Resource & Infrastructure: RIN.1 Human resource management; RIN.2 Training; RIN.3 Knowledge management; RIN.4 Test infrastructure
- Process Improvement: PIM.1 Process establishment; PIM.2 Process assessment; PIM.3 Process improvement
- Test Process Management: TPM.1 Test strategy; TPM.2 Test planning; TPM.3 Test execution and controlling; TPM.4 Test closing and reporting
- Test Regression, Reuse & Maintenance: TRM.1 Test asset management; TRM.2 Test reuse programme management; TRM.3 Regression test management; TRM.4 Testware maintenance

4.5.3. Organisational Life Cycle Processes

The organisational life cycle process category covers those activities that establish the business goals of the organisation in terms of testing process, product and resource assets. As can be seen in Figure 11, the organisational life cycle processes consist of the following five subgroups refining this process category (for a more detailed description cf. SIG, 2012):

- The Integrated Project Management (IPM) process group consists of processes containing practices that may be used by anyone who manages any type of project or process within the life cycle. Key processes here are organisational alignment, organisation management, project management, quality management, risk management and measurement.
- The Resource & Infrastructure (RIN) process group consists of processes performed in order to make available the necessary human resources and infrastructure. Key processes here are human resource management, training, knowledge management and test infrastructure.
- The Process Improvement (PIM) process group consists of processes performed in order to define, deploy and improve the processes performed in the organisational unit. Key processes here are process establishment, process assessment and process improvement.
- The Test Process Management (TPM) process group consists of processes performed in order to plan, monitor, control and report the testing. Key processes here are test strategy, test planning, test execution and controlling, and test closing and reporting.

- The Test Regression, Reuse & Maintenance (TRM) process group consists of processes performed in order to systematically exploit reuse opportunities in organisations, to support reuse programmes and to manage regression tests. Key processes here are test asset management, test reuse programme management, regression test management and testware maintenance.

4.6. TestSPICE Levels

The three process categories explained in Section 4.5 cover all process areas that have to be considered for systematic testing. It is important to note that the fundamental motivation of this paper, establishing Industrialised Test Automation for mechatronic systems, is explicitly addressed in the process group Test Regression, Reuse & Maintenance. Systematically applying the best practices of that process group in the globally dispersed process chain supports the idea that an OEM's integration testing can reuse much testware created in a supplier's system testing. However, TestSPICE explicitly addresses the problem that cherry picking in terms of process groups, or even single processes, does not make much sense: there is a high interdependence between the different process categories and process areas, and some processes are preconditions for others. As an example, the process group Test Regression, Reuse & Maintenance depends heavily on the primary life cycle processes, in which the test assets are defined and thus made available for reuse in the first place. This strongly suggests thinking holistically about all process categories when defining and improving a test process. A perfect test process fulfils all requirements of all process areas, while a chaotic test process fails one, or even all, process areas. Between these two extremes there are different levels of maturity, representing different process maturities for all process areas.
For this purpose TestSPICE defines 5+1 maturity levels (the "+1" represents the status where activities are carried out without any explicit goal, process or strategy). An overview of these maturity levels is given in Figure 12. Each level has specific characteristics in terms of process maturity:

- Level 0 (not shown in Figure 12), Incomplete process: the process is not implemented, or fails to achieve its process purpose (if any exists).
- Level 1, Performed process: the implemented process achieves its process purpose.
- Level 2, Managed process: the performed process is now implemented in a managed fashion (planned, monitored and adjusted) and its work products are appropriately established, controlled and maintained.
- Level 3, Established process: the managed process is now implemented using a defined process that is capable of achieving its process outcomes.
- Level 4, Predictable process: the established process now operates within defined limits to achieve its process outcomes.
- Level 5, Optimising process: the predictable process is continuously improved to meet relevant current and projected business goals.

Each level defines specific process attributes for all process areas that have to be fulfilled to reach that maturity level. Each process attribute measures a particular aspect of process capability. This is done by so-called process capability indicators that are defined for all levels. In addition, TestSPICE defines so-called process performance indicators exclusively for capability level 1.
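Because the levels are ordered, they map naturally onto an ordered enumeration. The following sketch uses the level names from the list above; the `meets_reuse_bar` helper is our own illustration of how a tool might compare assessed levels against a required minimum:

```python
from enum import IntEnum


class CapabilityLevel(IntEnum):
    """The 5+1 TestSPICE capability levels; IntEnum makes them comparable."""
    INCOMPLETE = 0
    PERFORMED = 1
    MANAGED = 2
    ESTABLISHED = 3
    PREDICTABLE = 4
    OPTIMISING = 5


def meets_reuse_bar(level):
    """Level 3 (established) is the minimum this paper recommends for reuse."""
    return level >= CapabilityLevel.ESTABLISHED
```

With this, an assessment report can state machine-checkable conditions such as `meets_reuse_bar(assessed_level)` for every process area of every party.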

Figure 12: Maturity levels of TestSPICE, depicted as a stack from Level 1 (performed) through Level 2 (managed), Level 3 (established) and Level 4 (predictable) up to Level 5 (optimising), each with defined inputs and outputs.

4.7. Industrialised Test Automation

The focus of this paper is to address the challenges mentioned in Section 2.2. The challenge within the mechatronic systems domain is to establish a governance system that enables harmonisation of testing processes and testware (as a result of testing processes) to allow for synergies between different testing activities along the supply chain. The process landscape introduced by TestSPICE (cf. Section 4.5) provides an excellent framework for that purpose: if both OEMs and suppliers define, establish and improve their test processes in terms of TestSPICE, the harmonisation of testing processes and testware is already in progress. However, the matryoshka concept shown in Figure 4 imposes some additional requirements:

- Each party (OEM and supplier) should apply the same process landscape. TestSPICE is not the only test process model (TMap and TPI are others, for example) but it is one that can easily be synchronised with development processes (e.g. SPICE). This paper suggests TestSPICE for all OEMs and suppliers; however, similar results can be expected when all parties apply another reference model consistently.
- The overall process maturity is defined by the lowest maturity of any party in the globally dispersed process chain. If, for example, a supplier does not have any process maturity, the corresponding OEM will not be able to reuse anything from it.

- The process models reflect only two directly connected parties, the OEM and the supplier, which are in sync if they synchronise their processes. Since most mechatronic systems have a globally dispersed, deep process chain, this synchronisation should be extended to all parties (all suppliers and all OEMs of that process chain).

The Industrialised Test Automation approach suggests the following best practices for a harmonised test approach that allows for synergies between all parties:

- All involved parties should work to reach TestSPICE level 3 (as a minimum). This is the first maturity level at which the test process is well established and the corresponding work products are defined and controlled, and thus the first at which reuse can take place systematically. For a detailed list of work products referring to that level and leveraging reuse, see chapter 8 in SIG (2012).
- For synchronisation of the test processes between different parties, an independent third party has to be utilised. This party guarantees not only that each involved party is working on reaching level 3 but also that this is done in sync with all other parties.

A typical job description for the latter point is that of a mediator. Usually two parties, even if they are in a very close relationship, have different goals and objectives, and this heterogeneity increases the more parties are involved in the overall process chain. Applying a process framework consistently decreases the heterogeneity to some degree, because suppliers and OEMs are both covered by it (see for example the primary life cycle processes, Section 4.5.1). But even then, some small differences in terms of testing processes might occur between party A (an OEM) and party B (its supplier). These differences add up to a big problem if B is itself an OEM subcontracting to a supplier C, and so on: the difference between A and C might be much bigger than that between A and B, since there is no direct link between A and C.
The concept of a mediator helps here: the mediator is responsible neither for any process execution nor for any work products. The only goal is to ensure maximum consistency between the different testing processes and their maturity levels. The mediator mediates not only between A and B but also between A and C to enable an overall harmonisation. This in turn enables OEM A to reuse testware from supplier B, who in turn reuses much testware from supplier C; in doing so, A is indirectly reusing testware from C as well. Experience shows that, by applying the TestSPICE approach consistently for all involved parties and leveraging a mediator, the advantages are obvious: in many cases the complete HIL simulator, consisting of hardware and software, could be reused at the next level of engineering. And even if the OEM and the supplier have completely different hardware parts (for whatever reason), the basic concepts could be reused for Industrialised Test Automation. Typical examples are test concepts, test architectures, test models and test data. Applying TestSPICE ensures that these parameters are consistent between different suppliers and OEMs, creating a big opportunity for increasing test effectiveness, i.e. reusing assets from lower-level tests and focusing efforts on the current level, and test efficiency, i.e. automating test execution as much as possible.
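The weakest-link rule from the requirements above, i.e. that the chain's effective maturity is the lowest maturity of any party, is trivial to operationalise. The party names and levels below are invented example data:

```python
# Assessed TestSPICE capability levels per party in a dispersed process
# chain (invented example data).
CHAIN = {"OEM A": 4, "Supplier B": 3, "Supplier C": 2}


def chain_maturity(parties):
    """Effective maturity of the whole chain: the minimum over all parties."""
    return min(parties.values())


def below_reuse_bar(parties, minimum=3):
    """Parties that block systematic reuse by not reaching the minimum level."""
    return sorted(p for p, level in parties.items() if level < minimum)
```

Here `chain_maturity(CHAIN)` is 2: despite OEM A's level 4, Supplier C caps the whole chain, which is exactly why the mediator pushes every party towards level 3.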

5. Conclusion and Outlook

The market outlook for mechatronic systems is overwhelming: in future, most devices will have some IT on board, calling for typical software quality assurance activities. The concept of industrialisation will force all parties to focus on their core competency, i.e. to delegate all other activities to somebody else. The process chains of mechatronic systems will thus be more and more globally dispersed over dozens of different parties all over the world. Nowadays, each party focuses on its part in that chain and concentrates solely on delivering the required product or service. Collateral products like testware are usually not delivered, forcing all parties to start from scratch when thinking about testing. The reason for this is not only a mono-dimensional contract focusing on the core deliverable, but also the fact that each party might have its own processes, test concepts and test methodologies, which makes any reuse nearly impossible. Industrialised Test Automation suggests another approach: a strong harmonisation of all testing-related activities will enable a high level of reuse between different parties. Most of the testware created by a supplier can be reused by the OEM if the fundamental testing process is similar. For that purpose this paper suggests TestSPICE as the process reference model. It defines a complete catalogue of process areas that are relevant for testing. Each process can fulfil specific process attributes that are clustered into five different maturity levels. The set of process attributes relevant for level 3 enables the required reuse: the supplier and the OEM have comparable processes in place, generating similar sets of work products and having similar responsibilities. If an independent party who controls the overall process chain, including all involved parties, takes responsibility for the overall harmonisation, many synergies can be leveraged by reusing testware.
In practice, this reduces test efforts significantly for all involved parties and improves the overall quality of the mechatronic systems, since each partner can focus its test activities on those aspects that are core to its deliverables. In doing so, Industrialised Test Automation enables a layered automated integration test, in which each part that is integrated and tested into a larger part itself contains parts that have been integrated and tested in the same way.

6. Bibliographical References

AUTOSAR. (2012). AUTOSAR. Retrieved February 13, 2013, from www.autosar.org

Bliss, R. (n.d.). Mechatronics. Retrieved 2012, from http://en.wikipedia.org/wiki/file:mecha_workaround.svg

Brenner, W., Ebert, N., Hochstein, A. & Übernickel, F. (n.d.). IT-Industrialisierung: Was ist das? [IT industrialisation: what is it?]. Retrieved 2007, from Computerwoche: IT-Strategie: http://www.computerwoche.de/management/it-strategie/592035/

Charette, R. N. (n.d.). This car runs on code. Retrieved February 2009, from IEEE Spectrum: http://spectrum.ieee.org/green-tech/advanced-cars/this-car-runs-on-code

Det Norske Veritas AS. (n.d.). Standard for certification, No. 2.24: Hardware in the Loop Testing (HIL). Retrieved 2011, from http://exchange.dnv.com/publishing/stdcert/standard2-24.pdf

Dignan, L. (2010, November 1). GM's Volt: 10 million lines of code. CBSNews.

Etzkorn, J., BMW Group. (n.d.). Suppliers, SPICE & beyond: A decade's experience report. VDA Automotive SYS Conference, Berlin, 5th July.

Focus Money Online. (n.d.). Porsche startet Boxster-Produktion in Osnabrück [Porsche starts Boxster production in Osnabrück]. Retrieved September 18, 2012, from http://www.focus.de/finanzen/news/auto-porsche-startet-boxster-produktion-inosnabrueck_aid_821791.html

Glossary Working Party of the International Software Testing Qualifications Board. (December 2007). Standard glossary of terms used in software testing. ISTQB.

GSMA. (n.d.). Connected Life: Smart connectivity and why it matters. Retrieved 2012, from http://connectedlife.gsma.com/wp-content/uploads/2012/07/gsma_intelligent_mobile_networks.pdf

Leclerque, K. (2010, September 9). Outsourcing im Jahr 2020 [Outsourcing in the year 2020]. Retrieved July 19, 2011, from www.cio.de: http://www.cio.de/knowledgecenter/outsourcing/2247905/index.html

Porsche (2004/2005). Porsche Geschäftsbericht: Porsche Konzern in Zahlen [Porsche annual report: the Porsche Group in figures]. Porsche Aktiengesellschaft, Stuttgart.

SIG, TestSPICE (2012-11-01). TestSPICE: Process Assessment Model. Accepted by intacs.

Smith, A. (2008).
An Inquiry into the Nature and Causes of the Wealth of Nations. Oxford University Press.

Spillner, A., Vosseberg, K., Winter, M. & Haberl, P. (2011). Softwaretest-Umfrage 2011 [Software testing survey 2011]. Available at http://www.softwaretest-umfrage.de

Srivastava, P. R. (2009). Estimation of software testing effort: An intelligent approach. Birla Institute of Technology and Science, Pilani, Rajasthan, India: Computer Science and Information System Group.

Wikipedia. (n.d.). Internet of things. Retrieved December 2012, from http://en.wikipedia.org/wiki/internet_of_things

Wikipedia. (2011, July 27). Independent test organization. Retrieved July 27, 2011, from http://en.wikipedia.org/wiki/independent_test_organization

Winter, M., Ekssir-Monfared, M., Sneed, H., Seidl, R. & Borner, L. (2012). Der Integrationstest [Integration testing] (1st ed.). Hanser-Verlag.