SQMB '11 Automated Model Quality Rating of Embedded Systems




SQMB '11 Automated Model Quality Rating of Embedded Systems
Jan Scheible (jan.scheible@daimler.com), Daimler AG - Group Research and Advanced Engineering
Hartmut Pohlheim (pohlheim@model-engineers.com), Model Engineering Solutions GmbH
GR/PSS / Model-based Development / 21.02.2011

Agenda
- Challenge
- Approach
- Further Evaluation
- Summary and Outlook

Challenge (1/2) Current Situation
- Model-based software development is increasingly becoming the standard in the automotive industry.
- The size and complexity of models are growing ever larger. Example of a large MATLAB Simulink model from the passenger vehicle domain: approx. 15,000 blocks, 700 subsystems, and a subsystem hierarchy with 16 levels.
- Considerable time pressure in development.
- Despite the high abstraction level, developers have much freedom, which leaves many possibilities for potential errors.

Challenge (2/2) Goals
- Rate model quality in an automatic and comprehensible manner.
- Reduce costs and effort.
- Use existing information to as high a degree as possible.
- Provide a compact overview of model quality.
- Visualize the current status and the progress of model quality.

Agenda
- Challenge
- Approach: Model Quality; Quality Model in the Development Process; Evaluation of Measured Values; Aggregation of Evaluations
- Further Evaluation
- Summary and Outlook

Approach Overview
Framework for the automatic rating of model quality:
- analytic: a quality model (factors, criteria, metrics) yields the quality rating.
- constructive: patterns and guidance (architecture patterns, guidelines).


Model Quality (1/2) High-Quality Simulink Models
- A model may fulfill criteria that have a positive influence on desired factors (e.g. maintainability, comprehensibility, or reliability).
- The more criteria a Simulink model fulfills, the higher its quality.
- All criteria that allow inference about desired factors are of interest.

Model Quality (2/2) Objective Rating
- An objective rating is possible with the help of a quality model.
- Factors are desired properties of high-quality models (Factor 1, Factor 2, ...).
- Criteria are indicators of the desired properties (Criterion 1, Criterion 2, ...).
- Metrics check the fulfillment of the criteria (Metric 1, Metric 2, Metric 3, ...).
- The quality model currently consists of 6 factors, 18 criteria, and 46 metrics.
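The factor/criterion/metric hierarchy described above can be sketched as a small data structure. This is only an illustration: the class and field names are our own, and the example values are a single slice of the full quality model (6 factors, 18 criteria, 46 metrics).

```python
from dataclasses import dataclass

@dataclass
class Metric:
    """Checks the fulfillment of a criterion; yields an evaluation in [0, 1]."""
    name: str
    evaluation: float

@dataclass
class Criterion:
    """An indicator of a desired property, checked by one or more metrics."""
    name: str
    metrics: list

@dataclass
class Factor:
    """A desired property of high-quality models, indicated by criteria."""
    name: str
    criteria: list

# One illustrative slice of the hierarchy (values taken from the later slides)
maintainability = Factor("Maintainability", [
    Criterion("Scale", [Metric("#Subsystems", 0.20)]),
])
print(maintainability.criteria[0].metrics[0].name)  # #Subsystems
```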

Quality Model in the Development Process (1/2)
[Diagram: the development process from software requirements through software design, models, and implementation (code), with test specification, test implementation, module test, software integration, and integration test. The quality model framework rates model quality from artifacts such as: requirements coverage of the model, quality criteria of the model, guideline conformity of the model, complexity of the model, requirements coverage of the test specification, model coverage of the test implementation, result of the model test, and result of the static code analysis.]

Quality Model in the Development Process (2/2) Classification
- Factors: Correctness, Comprehensibility, Testability, Code generation ability, Reliability, Maintainability, Functionality.
- Measurements: scale, test coverage, runtime errors, standard conformity, requirements coverage of the model, complexity of the model, requirements coverage of the test specification, result of the model test, result of the static code analysis, guideline conformity of the model, model coverage of the test implementation.


Evaluation of Measured Values (1/5) Measurements
- Quality tree: model quality; factors Comprehensibility, Maintainability, Correctness; criteria Readability, Scale, Avoidance of implicit constructs, Functional correctness.
- Metrics and measured values: #Crossed lines v1 = 828; #Subsystems v2 = 87; Ø Signals per bus v3 = 18.1; #DataStore Read v4 = 54; Passed tests v5 = 0.83; Requirements coverage v6 = 1.0.

Evaluation of Measured Values (2/5) Determination of Permissible Values
- Limits (min/max) are defined for each metric in order to check whether its measured value is permissible.
- Ratio metrics can be evaluated directly: v5 = 0.83 yields e5 = 83% and v6 = 1.0 yields e6 = 100%; the limits for v1 to v4 are still open.

Evaluation of Measured Values (3/5) Reference Model
- A reference model describes how an average Simulink model should look.
- Instantiated for a model with 3,735 blocks: #Crossed lines min 0, max 2,146 (v1 = 828 yields e1 = 100%); #Subsystems min 0, max 869 (v2 = 87 yields e2 = 20%); Ø Signals per bus min 0.0, max 36.4 (v3 = 18.1 yields e3 = 98%). #DataStore Read (v4 = 54) is still unevaluated.

Evaluation of Measured Values (4/5) Rules
- Rules specify relations between measured values.
- Example: #DataStore Read (v4 = 54) is related to #DataStore Memory (57); the rule evaluates v4 to e4 = 50%.

Evaluation of Measured Values (5/5) Visualization
[Chart: measured values plotted against their limits for metrics such as Ø signals per bus, # internal states, # lines, # clones, # crossed lines, Ø outport links, # subsystems, depth of the subsystem hierarchy, Ø height and width of the subsystem representation, and Ø width of the input and output interfaces.]

Aggregation of Evaluations (1/5) Evaluated Metrics
- Metric evaluations: #Crossed lines e1 = 100%; #Subsystems e2 = 20%; Ø Signals per bus e3 = 98%; #DataStore Read e4 = 50%; Passed tests e5 = 83%; Requirements coverage e6 = 100%.

Aggregation of Evaluations (2/5) Algorithm
- Aggregated values provide a quick overview of a model's quality.
- An arithmetic mean is not suitable: if one or more metrics are insufficiently fulfilled, this must lead to a devaluation.
- Example evaluations: e1 = [25%, 30%, 35%] and e2 = [10%, 80%, 90%]. [Chart: the aggregated evaluation (in percent) decreases with the number of evaluations under the threshold.]
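The devaluation idea can be sketched as follows. The actual formula appears only as a figure in the backup slides, so the penalty factor below is an assumption; the sketch merely shows the principle that every evaluation under the threshold pushes the aggregate well below the plain mean.

```python
def aggregate(evaluations, threshold=0.30, penalty=0.5):
    """Aggregate metric evaluations (each in [0, 1]) into one score.

    A plain arithmetic mean would hide badly fulfilled metrics, so the
    mean is devalued by a penalty factor for every evaluation that
    falls below the threshold. The penalty factor of 0.5 is an
    assumption, not the formula used by the authors.
    """
    mean = sum(evaluations) / len(evaluations)
    violations = sum(1 for e in evaluations if e < threshold)
    return mean * (penalty ** violations)

e1 = [0.25, 0.30, 0.35]   # mean 30%, one value under the 30% threshold
e2 = [0.10, 0.80, 0.90]   # mean 60%, one value under the threshold

print(round(aggregate(e1), 3))  # 0.15
print(round(aggregate(e2), 3))  # 0.3
```

Note how e2, despite its much higher mean, is still devalued for the single metric it fails, which matches the intent stated on the slide.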

Aggregation of Evaluations (3/5) Aggregation in the Quality Model (threshold = 30%)
- Metric level: #Crossed lines e1 = 100%; #Subsystems e2 = 20%; Ø Signals per bus e3 = 98%; #DataStore Read e4 = 50%; Passed tests e5 = 83%; Requirements coverage e6 = 100%.
- Criteria level: Readability e7 = 100%; Scale e8 = 29.5%; Avoidance of implicit constructs e9 = 50%; Functional correctness e10 = 91.5%.
- Factor level: Comprehensibility e11 = 32.4%; Maintainability e12 = 50%; Correctness e13 = 91.5%.
- Model quality: e14 = 58%.

Aggregation of Evaluations (4/5) Visualization
[Chart: aggregated evaluations across the quality model hierarchy, with percentage values between 0% and 100% at each node.]

Aggregation of Evaluations (5/5) Retracing the Quality Rating over Time
- The model quality rating provides a snapshot of a model's evaluations at a specific point in time.
- Trend analysis is used to retrace the evolution of the measured values and their evaluations over time.
- It shows how the metrics' measured values lie in relation to their limits. [Chart: evaluations of Metric 1 and Metric 2 over three iterations, together with their measured values relative to the minimum and maximum limits.]

Agenda
- Challenge
- Approach
- Further Evaluation
- Summary and Outlook

Further Evaluation
1. Compare a model's quality rating with the model quality perceived by its developers.
2. Verify whether the number of findings in model reviews correlates with the quality rating.
3. Verify whether the occurrence of actual bugs correlates with the quality rating.
- Rate as many models as possible.

Agenda
- Challenge
- Approach
- Further Evaluation
- Summary and Outlook

Summary
- A quality model is used to define our notion of model quality.
- The most important artifacts of the model-based development process are taken into account.
- Metrics check the fulfillment of the desired quality criteria.
- The measured values of the metrics are evaluated.
- The evaluations are aggregated in the quality model to provide a quick overview.
- The quality rating can be retraced over time.

Outlook
Evaluation:
- Verify the normalization by block count in the reference model [in progress].
- Check whether the approach is only applicable to models within the same project [in progress].
Quality model:
- The expressive power of the quality model may need to be enhanced (e.g. handling of conflicting measurements).
Metrics:
- Integrate more Simulink-specific metrics [in progress].
- Identify the metrics with the highest relevance [in progress].
- Add more tool interfaces for automatic rating [in progress].
- Compare the frequency distributions of the per-subsystem measurements.

Thank you for your attention!

Backup

Reference Model
- Average values of arbitrary models, normed against the model's reference size (# blocks).

Measurements database:
Metric          | Model 1 | Model 2 | Model 3 | Model 4 | Model 5 | Average | Variance
# Gotos         | 44      | 9       | 0       | 32      | 47      | 0.01318 | 0.01551
# Subsystems    | 340     | 677     | 231     | 150     | 1,083   | 0.17428 | 0.12372
# Crossed lines | 299     | 931     | 6,683   | 828     | 2,581   | 0.45065 | 0.20779
# Blocks        | 1,120   | 2,924   | 9,017   | 2,526   | 4,309   | 1.0     | 0.0

Instantiated reference model:
Metric          | 200 blocks (min/max) | 1,000 blocks (min/max) | 5,000 blocks (min/max)
# Subsystems    | 14 / 46              | 70 / 230               | 348 / 1,149
# Lines         | 224 / 256            | 1,120 / 1,280          | 5,600 / 6,400
# Crossed lines | - / 96               | - / 480                | - / 2,400

- Average values are multiplied by the reference size (# blocks).
- Min and max result from the variance of the averages.
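The instantiation step above can be sketched in a few lines. The per-block averages and variances are taken from the measurements database on this slide, but the slide does not spell out how min and max are derived from the variance, so the half-width computation below (a fixed fraction of the variance, scaled by block count) is an assumption and will not reproduce the slide's numbers exactly.

```python
# Per-block averages and variances from the measurements database.
REFERENCE = {
    "# Gotos":         (0.01318, 0.01551),
    "# Subsystems":    (0.17428, 0.12372),
    "# Crossed lines": (0.45065, 0.20779),
}

def instantiate(blocks, spread=0.5):
    """Instantiate the reference model for a model of a given size.

    Each per-block average is scaled by the block count to a center
    value; the permissible [min, max] range is opened around it using
    the variance (the spread factor is an assumption).
    """
    ranges = {}
    for metric, (avg, var) in REFERENCE.items():
        center = avg * blocks
        half = spread * var * blocks
        ranges[metric] = (max(0.0, center - half), center + half)
    return ranges

for metric, (lo, hi) in instantiate(1000).items():
    print(f"{metric}: {lo:.0f} .. {hi:.0f}")
```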

Prototype Infrastructure
- Quality model (metamodel): factors, criteria, metrics.
- Measurement object: the Simulink model.
- Data providers: MXAM, MXRAY, sldiagnostics, manual inputs.
- Metric 1 to Metric n compute measured value 1 to measured value n.
- The quality rating of the measurement results is then presented.

Aggregation of Evaluations: Algorithm and Formula
- Example evaluations: e1 = [25%, 30%, 35%] and e2 = [10%, 80%, 90%]. [The aggregation formula itself is shown only as a figure on the original slide.]

Evaluation of Measured Values: Limits
[Figure: evaluation functions with tolerance bands. A measured value is rated 100% between the minimum and the maximum and falls linearly to 0% across a tolerance band outside a limit; one-sided variants with only a minimum or only a maximum also exist.]
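A minimal sketch of this limit scheme, assuming the trapezoid shape shown in the figure; the tolerance widths are not quantified on the slide, so the tolerance parameter here is an assumption, and the framework's actual evaluation functions may be more refined.

```python
def evaluate(value, minimum=None, maximum=None, tolerance=1.0):
    """Map a measured value to an evaluation between 0.0 and 1.0.

    Values inside [minimum, maximum] are rated 100%; outside a limit
    the rating falls linearly to 0% across a tolerance band of the
    given width. Either limit may be omitted for a one-sided check.
    """
    if minimum is not None and value < minimum:
        return max(0.0, 1.0 - (minimum - value) / tolerance)
    if maximum is not None and value > maximum:
        return max(0.0, 1.0 - (value - maximum) / tolerance)
    return 1.0

# #Crossed lines with the reference limits [0, 2,146] from the example:
# v1 = 828 lies inside the range and is rated 100%
print(evaluate(828, minimum=0, maximum=2146, tolerance=200))   # 1.0
# A value half-way into the tolerance band above the maximum is rated 50%
print(evaluate(2246, minimum=0, maximum=2146, tolerance=200))  # 0.5
```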