Experience report from the independent laboratories on certification of the CCS subsystem
ERA ETCS Conference on Testing and Certification, 29th of March, Lille
Contents
- Brief introduction to the Subset-076 structure, method and process
- Test experience of the independent laboratories
- Status quo: results achieved
- Lessons learned to improve the quality of the outcome
Introduction to Subset-076
Relation between component, integration and system tests
(RU = railway undertaking, IM = infrastructure manager, NoBo = notified body)
Introduction to Subset-076
Structure, approach and test process:
- Structure
- Test creation process
- Test application process
(TSD = Test Sequence Debugger)
Introduction of the independent labs
Subset-076 testing experience

Cedex & INECO:
- Member of the Subset-076 working group since 2002
- Leadership of the Subset-076 WG: Dr. Jaime Tamarit / Dr. Jorge Iglesias
- First OBU tests with all UNISIG suppliers; 6 Subset-076 test sequences run in 2004
- Test campaign performed with one ETCS OBU in 2006
- Projects in service using laboratory tests with project data and a real EVC
- Use of common formats for JRU records, project data and track layout

DLR:
- Member of the Subset-076 working group since 2002
- Partial test campaign performed with another OBU in 2009/2010
- Test campaign performed with 2 different ETCS OBUs in 2010
- Support of the supplier concerning in-house tests
- Test automation

Multitel:
- Member of the Subset-076 working group since 2006
- Test campaign performed with one ETCS OBU in 2010
- Support of suppliers concerning in-house tests
- Semi-automatic test result analysis
Status quo: results achieved
Good cooperation between the UNISIG companies and the independent labs over several years, i.e.:
- Nearly all of the original UNISIG companies have had their OBUs tested, partially or completely, against Subset-076 in an independent lab
- Subset-094 is an agreed and solid basis as an interface, enabling remote OBU preparation in advance of a one-week integration in the lab after shipping
- The independent labs are accepted partners, supporting the development of OBUs with their test experience and facilities, and able to perform a Subset-076 test campaign in 6 months (e.g., integration in 2 weeks, test runs in 8 weeks, evaluation and detailed analysis in 8 weeks, report* including review, discussions and finalization in 6 weeks)
* Report of over 500 DIN A4 pages containing detailed information on the approx. 42,000 test steps
Status quo: accreditation

Multitel:
- Has completed the ISO 17025 accreditation by BELAC for the EVC certification process; certificate number 427-TEST
- Certified to ISO 9001 since January 2004; certificate number BXL07000029

Cedex:
- Has been granted an ISO 17025 accreditation for Eurobalises; certificate number 465/LE1003
- CEDEX will extend the accreditation by ENAC to cover EVC certification

DLR:
- Certified to ISO 9001 since 2003 and VDA 6.2 since 2006
- The DLR laboratory responding to the tender has applied to the DGA (Deutsche Gesellschaft für Akkreditierung, German Association for Accreditation) for ISO 17025 accreditation for the EVC, Annex Quality-DLR-02
Status quo: backward compatibility
First results of the backward compatibility work:
- Under the contract signed by MULTITEL with the EEIG Users Group on 29th April 2010, no major backward compatibility problem was found between the TSD* for Baseline 3 and the Subset-076 test sequences for SRS v2.3.0d
- The test campaign was performed during summer 2010 by CEDEX and INECO from Spain, DLR and IQST from Germany, and TECH4RAIL and MULTITEL from Belgium
- RINA from Italy participated in some tests carried out on the CEDEX premises, with a main focus on the test methodology
- Although this first backward compatibility result is promising, a second test is necessary once Subset-076 and the TSD* for Baseline 3 have been debugged
* TSD - Test Sequence Debugger
Statistics from an arbitrary test execution
- Test sequences executed until the end: 25
- Aborted due to the OBU: 21
- Aborted due to the test sequence: 47
- Summary of total steps: 39% covered steps, 61% not executed steps
[Bar chart: percentage of execution (0-100%) per test sequence]

Conclusions:
- Subset-076 is applicable and provides feedback to the OBU supplier
- Both the products and Subset-076 are in consolidation, taking into account the experiences and results from a test campaign
- Neither an OBU nor the test standard can be considered approved without the two coming together in a test campaign performed in an independent lab
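The summary figures above can be derived mechanically from per-sequence results. A minimal sketch of that aggregation, with hypothetical outcome labels and step counts (not the labs' actual tooling or data):

```python
# Aggregate per-test-sequence results into the kind of summary shown on this
# slide: counts per outcome, plus the overall percentage of covered steps.
# The outcome names and numbers below are illustrative only.
from collections import Counter

def summarise(results):
    """results: list of (outcome, covered_steps, total_steps) per test sequence."""
    outcomes = Counter(outcome for outcome, _, _ in results)
    covered = sum(c for _, c, _ in results)
    total = sum(t for _, _, t in results)
    return outcomes, round(100 * covered / total, 1)

runs = [
    ("completed", 400, 400),        # ran until the end
    ("abort_obu", 120, 400),        # aborted due to the OBU
    ("abort_sequence", 90, 450),    # aborted due to the test sequence
]
outcomes, pct_covered = summarise(runs)
print(outcomes["completed"], pct_covered)  # 1 48.8
```

An abort partway through a sequence still contributes its executed steps to coverage, which is why step coverage and sequence completion give different percentages.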
Extension of the Subset-076 improvement process
- Test creation procedure
- Test campaign process
(E&H = Errors and Hints, as feedback from the Subset-076 application in test campaigns)
Errors and hints as the main part of the improvement process
All issues (problems, questions, ...) arising during the application of Subset-076 are collected, reviewed and discussed in the Subset-076 working group (ERA, independent labs, UNISIG representatives).
Version 50 from 14th of March 2011 contains:
- 1634 entries (including multiple entries) as feedback from the application of Subset-076 (compared with roughly 42,000 test steps across all test sequences, this represents 3.89%)
- 1509 entries (single steps) are under review in order to improve the test specification by replacing them with test cases or test case corrections*
* Many single steps were introduced in order to establish the pre-conditions of certain test cases
Categorisation of errors and hints entries
Cat 1-3 will be managed in the Subset-076 working group; Cat 4-5 will be forwarded to the steering committee.

Cat 1 - Editorial error
Example (E&H l752): A comment should be added to clarify that the min safe rear end must be used (the position refers to the estimated front end). See 3.13.3.3.1b)

Cat 2 - Sequence error
Example (E&H l406): The permitted speed indication is, according to table 4.7, not applicable in UN.

Cat 3 - Variability (options, set of options, parameterization, ...)
Example (E&H l482): Packet 5 is wrong; the NID_C variable is empty, and the value 1 is also necessary in the N_ITER variable.

Cat 4 - Different specification interpretations (DMI inclusion, ...)
Example (E&H l303): Set these steps to optional. We have CR338 for this functionality/behaviour, which is optional to implement.

Cat 5 - Impact on other subsets
Example (E&H l178): Consistency problem because of Subset-040 v210: packet 72 is not allowed.
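The categorisation and routing rule described on this slide is simple enough to state as code. A minimal sketch, assuming a data model of our own invention (the working group's actual E&H database is not specified here):

```python
# Illustrative model of the five E&H categories and the routing rule:
# Cat 1-3 stay in the Subset-076 working group, Cat 4-5 are forwarded
# to the steering committee.
from enum import IntEnum

class EhCategory(IntEnum):
    EDITORIAL = 1        # Cat 1: editorial error
    SEQUENCE = 2         # Cat 2: sequence error
    VARIABILITY = 3      # Cat 3: options / set of options / parameterization
    INTERPRETATION = 4   # Cat 4: different specification interpretations
    OTHER_SUBSETS = 5    # Cat 5: impact on other subsets

def route(cat: EhCategory) -> str:
    """Decide which body handles an E&H entry of the given category."""
    return "working group" if cat <= EhCategory.VARIABILITY else "steering committee"

print(route(EhCategory.SEQUENCE))       # working group
print(route(EhCategory.OTHER_SUBSETS))  # steering committee
```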
Key issues to improve the test specification
The following key issues are intended to improve the next Subset-076 version for Baseline 3:
- Temporal behaviour of the Test Sequence Debugger: The non-redundant and non-safe functional implementation of the OBU does not provide realistic temporal behaviour, e.g. the time to boot an OBU, the time to enter a complete set of train parameters on the real DMI, or the time to establish a safe radio connection to the trackside.
- Diverging train dynamics of real OBUs: Real OBUs to be tested mostly respect certain train types and characteristics, such as acceleration and braking forces, weight, length, axle load, ... This causes deviations from the movement calculated in the test sequence according to the generic train data (i.e., acceleration, braking, max speed, train stop locations of Movement Authorities, ...).
(OBU = ETCS onboard unit)
Key issues to shorten the test application
- Increase the modularity of the test sequences: The sequential structure of the test sequences, and the mixing of Level 0, 1, 2 and infill test cases within one test sequence, increase the testing effort and impose overly strong restrictions and inflexibility on the OBU configuration.
- Formalization of test events, enabling test automation (especially of the test evaluation): DMI and JRU test steps should be specified more formally, e.g. when the driver enters the driver ID, or when message 24 or packet 27 is logged.
Relationship between operational and technical tests
Ensuring the coherence, consistency and compatibility of the operational requirements with the technical system (Subset-026) requirements