Managing detailed development data in a PLM framework
Jan Söderberg, Systemite AB (presenter); Peter Thorngren, Volvo GTT
2nd Interoperability for Embedded Systems Development Environments
Systemite AB, Fürstenbergsgatan 4, SE-416 64 Göteborg, Sweden. Phone: +46 31 719 93 00
Embedded Systems - PLM scale
Complexity - Scale
Complexity - Scale, contd. One ECU out of 50!
Complexity - Versioning
Complexity - Connectivity
Complexity - Product Line 130+ Specifications and Reports 10+ Products 40+ Suites
Complexity - Change > 2000 Change Requests per Baseline
Complexity - IT Environment [Figure: tool integrations via APIs, message queues (MQ), and automated file management]
Tool Integration
Gimme All Your Requirements Which requirements?
Sample Meta Model, around Requirement Functional Requirement Which System? Which Version? Which Status? (Approval, status etc)
Integration Case - Automated Environment [Figure: meta model for the development and execution scopes, linking Suites, Sub Suites, Specifications, Specification Items, Requirements, Test Cases, case status, and the System Under Test artifacts]
Integration Case - Automated Environment [Figure: SystemWeaver development & management connected via APIs to the customer execution application (PNTool)]. Integration based on: an open meta model, based on EAST-ADL and custom extensions; automation scripts; the SystemWeaver C# API.
Findings
- It is indeed possible to support embedded systems development in a PLM framework like SystemWeaver, even at the detailed design level
- Tool integration can be done by file exchange or by API integration
- OSLC can be a new alternative for integration
- Integration without open meta models and ontologies is extremely difficult, if not pointless
- There is much to gain by using standard or customized domain-specific Architecture Description Languages (ADLs), like EAST-ADL
Automated in RIG CW EB - Vehicle Engineering, Peter Thorngren
Development of complex systems is typically not managed in PLM systems today. Examples are development of software, embedded systems, or test automation. The approaches used instead are mostly based on file-based version control or custom solutions. A reason for this situation is that many development tools are still file based, and the scale of individual development activities is small enough for them to be carried out in the traditional way. However, the growing scale of systems like automotive electric/electronic systems, the tight schedules of large development projects, and the need for integration and collaboration between different development activities have made the traditional approaches less feasible. In this paper we look into the problem and describe, using an industrial case as an example, why the file-based approach faces substantial challenges and why the PLM approach is promising.

In traditional file-based development, individual artifacts are defined in separate files. File-based versioning can keep track of changes and versions of such files. It also offers basic configuration support, usually through some label mechanism, where the files included in a configuration are tagged with a label representing that configuration. A limitation of this approach is that the actual system configuration has to be built outside the versioning system by some build mechanism, like the traditional make command for software. Such built configurations then exist only on the PC or workstation of an individual developer, and changes to any of the (source) files have to be checked in to the versioning system. This approach works well for development of software where an individual developer can develop a module of the software contained in a file, over a period of hours or days. This level of granularity and collaboration is not sufficient in a complex system or system-of-systems context.
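The label-based configuration mechanism described above can be sketched in a few lines. This is an illustrative model only (all class and function names are invented for the example): each file accumulates numbered versions, a configuration is just a label attached to one version of each file, and the actual system only materializes through a separate build step on a developer's machine.

```python
# Illustrative sketch of label-based configuration in file-based versioning.
# All names here are hypothetical, not any real version-control API.

class FileRepo:
    def __init__(self):
        self.versions = {}   # path -> list of contents; index = version number
        self.labels = {}     # label -> {path: version}

    def check_in(self, path, content):
        """Store a new version of a file; returns its version number."""
        self.versions.setdefault(path, []).append(content)
        return len(self.versions[path]) - 1

    def tag(self, label, selection):
        """Attach a label to a chosen version of each file."""
        self.labels[label] = dict(selection)

    def build(self, label):
        """The configuration only exists via an external build step,
        assembled outside the versioning system itself."""
        return {path: self.versions[path][ver]
                for path, ver in self.labels[label].items()}

repo = FileRepo()
v0 = repo.check_in("ecu_a.c", "int main() { return 0; }")
repo.check_in("ecu_a.c", "int main() { return 1; }")   # a newer version
repo.tag("Baseline_1", {"ecu_a.c": v0})
workspace = repo.build("Baseline_1")   # lives only on the developer's machine
```

The point of the sketch is the last line: the built configuration is a derived artifact outside the repository, which is exactly what breaks down at the rate of change described in the next section.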
An example that stresses the need for efficient data management is testing of large systems. The challenge here arises because the results of several processes meet in the testing activities: the test cases, the specifications used as references for the test cases, the artifacts comprising the System Under Test, and the test environment and equipment, to mention the most important. The realization of individual artifacts is typically uncorrelated on the time scale relevant for testing activities: new artifacts arrive every day, requirements and test cases constantly change, and regression tests have to be performed daily. This rate of change means that formal waterfall or baseline-based configuration management is not effective, since many changes will be included within each iteration, between each formal baseline. It also means that test data representing the developed system can change from one moment to the next, so that defining configurations based on labels and performing the check-out, check-in, and merge operations required for file-based configuration management no longer works.

An alternative solution is to use a PLM approach where all information is managed in a coherent framework. In this approach the actual system configuration, for example as defined by requirements, test cases and test results, is explicit in the PLM system, with no need for a separate build process. By direct access to the representation of the developed system in the PLM system, data produced by test activities can update the configuration in real time. This kind of real-time access and collaboration is not limited to human intellectual interaction but can also be used for automated processes like regression tests. Some experiences that validate this approach will be presented from the development and testing of the TEA+ E/E system, developed by Volvo Group Trucks Technology.
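The PLM alternative can be contrasted with a small sketch: the configuration (requirements, test cases, results) is held explicitly in the repository, so an automated regression run can record its verdicts directly against it, with no separate build, check-in, or merge step. All class and method names below are hypothetical illustrations, not the actual SystemWeaver API.

```python
# Hypothetical sketch of a PLM repository where test activities update the
# explicit system configuration in real time. Names are illustrative only.

import datetime

class PLMRepository:
    def __init__(self):
        self.items = {}      # item id -> attributes (the explicit configuration)
        self.results = []    # (test_case_id, verdict, timestamp), append-only

    def add_item(self, item_id, **attrs):
        self.items[item_id] = attrs

    def record_result(self, test_case_id, verdict):
        """Called by an automated regression run; the shared configuration
        is updated immediately, with no check-out/check-in cycle."""
        stamp = datetime.datetime.now(datetime.timezone.utc)
        self.results.append((test_case_id, verdict, stamp))

    def status(self, test_case_id):
        """Latest verdict for a test case, visible to all tools at once."""
        for tc, verdict, _ in reversed(self.results):
            if tc == test_case_id:
                return verdict
        return "not run"

repo = PLMRepository()
repo.add_item("TC-001", kind="TestCase", verifies="REQ-042")
repo.record_result("TC-001", "passed")
```

Because the repository itself holds the configuration, both humans and automated processes read and write the same live state, which is the property the paragraph above relies on for daily regression testing.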
One of the characteristics of complex systems, like embedded systems, is the multitude of aspects that require attention during development. The classical (minimal) PDM approach is to manage the product structure of the system, where detailed information is kept as proprietary, black-box representations for each block in the structure. For established mechanical systems this is a working solution. Unfortunately, for complex systems the product or module structure is one of the least important structures to be managed. Other more or equally important structures or viewpoints are, to mention a few:

- Connectivity: the interfaces between components, defining the responsibilities between them
- Interaction: how the components interact through the interfaces in order to achieve desired functional properties
- Allocation: how functionality or responsibility is allocated to the physical structure
- Dependability: how the system components and requirements fulfill the safety goals of the system
- Requirements: how the properties of the system components and derived requirements fulfill product requirements
- Verification: the implementation status of the system as proven by test

To enable interoperability of the tools involved in the development process, all the above structures must be open. Open means that they must be visible and accessible; moreover, their semantics must be defined. One efficient approach to achieve this is to share a common meta model, accepted by the product domain. The level of required interoperability, derived from the needed viewpoints (listed above), defines the level of granularity of this openness, and a coherent domain-specific definition of this openness for embedded (automotive) systems has been defined in the EAST-ADL meta model. Historically we have built the interoperability in SystemWeaver, for tool and process integration, by defining a meta model to be used within SystemWeaver.
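The idea of an open meta model with explicit viewpoints can be sketched minimally: items of declared types connected by typed links, so that structures like allocation and verification are visible and queryable rather than hidden inside proprietary black-box files. The item and link type names below are loosely inspired by EAST-ADL concepts but are illustrative assumptions, not the actual EAST-ADL meta model.

```python
# Minimal sketch of an open meta model: typed items with typed links.
# Type and link names are illustrative, not the real EAST-ADL definitions.

from dataclasses import dataclass, field

@dataclass
class Item:
    item_id: str
    item_type: str                              # e.g. "Requirement", "Component"
    links: list = field(default_factory=list)   # list of (link_type, target Item)

    def link(self, link_type, target):
        """Typed links make viewpoints such as allocation and verification
        explicit structures that tools can traverse and query."""
        self.links.append((link_type, target))

    def related(self, link_type):
        return [t for lt, t in self.links if lt == link_type]

req = Item("REQ-042", "Requirement")
comp = Item("ECU-A", "Component")
tc = Item("TC-001", "TestCase")
req.link("allocatedTo", comp)    # allocation viewpoint
tc.link("verifies", req)         # verification viewpoint
```

With the types and link semantics agreed across tools, any tool can answer questions such as "which requirements are allocated to this component?" without parsing another tool's files, which is the openness the paragraph above calls for.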
By the use of industrially accepted meta models (like EAST-ADL) and the means of technical integration offered by OSLC, this level of integration can also be achieved between different PLM systems and tools. Note that provisions for technical integration are not enough on their own: there must also be a shared definition of the semantics of the shared data.