LISA Pathfinder SUMMARY




Page 2 of 36

SUMMARY

This document defines the Independent Software Verification and Validation (ISVV) requirements for the Implementation Phase of the LISA Pathfinder project.

TABLE OF CONTENTS

1 SCOPE ... 5
1.1 Applicable and Reference Documents ... 6
1.1.1 Applicable Documents ... 6
1.1.2 Reference Documents ... 6
2 ACRONYMS ... 7
3 PROGRAMME OVERVIEW ... 8
3.1 Project Framework ... 8
4 DEFINITIONS ... 9
4.1 Prime ... 9
4.2 Function definition ... 9
4.3 Criticality Definition ... 9
4.4 Independent Software Verification and Validation (ISVV) ... 9
5 PROGRAMME ACTIVITIES AND SCOPE ... 11
5.1 Software identification ... 11
5.2 Milestones ... 11
5.3 Scope of work ... 11
5.3.1 Software Criticality Analysis ... 12
5.3.2 Requirements Verification ... 12
5.3.3 Design Verification ... 12
5.3.4 Coding Verification ... 14
5.3.5 Verification and Validation Testing ... 14
5.3.6 Evaluation of Documentation ... 16
5.4 Software items subject to ISVV ... 17
5.5 Deliverable documents ... 18
5.6 Software Non-Conformance and Problem Reporting ... 19
5.7 Risk Management ... 19
6 ORGANISATION ... 20
6.1 Capability ... 20
6.2 Tools, Techniques and Methods ... 20
ANNEX A ... 21
Specimen Programme Flowchart ... 21
Flow Chart Key ... 22
ANNEX B ... 23

REPORT GUIDELINES ... 23
B.1 EXPECTED CONTENT OF A VERIFICATION REPORT ... 23
B.2 PERIODIC REPORTING, MEETINGS AND PARTICIPATION IN REVIEWS ... 23
ANNEX C ... 25
ISVV REQUIREMENTS MATRIX ... 25

LIST OF TABLES

Table 5.4.1 Software Items Subject to ISVV ... 17
Table 5.5.1 Timing of Reports ... 18

1 SCOPE

This specification, designed to complement the Statement of Work (SOW), defines the requirements for the Independent Software Verification and Validation (ISVV) activities to be performed on the LISA Pathfinder project. The requirements are applicable to the LISA Pathfinder Implementation Phase activities of EADS Astrium Ltd., EADS Astrium GmbH and any subcontracted project software elements. The scope therefore encompasses the software development activities of core team member SciSys, their Data Handling Software subcontractor, as well as other equipment containing a software element external to the OBC. In accordance with the LISA Pathfinder Product Assurance Plan [RD4], on-board software designated as mission critical shall be subject to ISVV. The on-board software comprises the application software (incorporating data handling software) which runs on the spacecraft's central on-board computer (OBC), together with embedded software in avionics equipments. From previous experience of similar projects, it is envisaged that there will be no safety-critical software on this programme, since safety hazards are expected to be mitigated by non-software means. This condition will be verified through hazard analyses conducted at system and sub-system level. Mission-critical software is identified as that software whose anomalous behaviour would cause or contribute to the permanent loss or degradation of the spacecraft's capability to perform its mission. Note that whereas the software embedded in the primary payload instruments (LTP and DRS) is outside the scope of this specification, hazard analyses conducted at system level will take due account of the interfaces to these instruments as part of the identification of critical software items on the spacecraft side of these interfaces.
The terms subcontractor and developer in this document are limited to those subcontractors (and subcontractors thereto) developing or supplying any kind of software for the LISA Pathfinder project.

1.1 APPLICABLE AND REFERENCE DOCUMENTS

1.1.1 Applicable Documents

[AD1] S2.ASU.RS.1005 LISA Pathfinder Sub-Contractor PA Requirements, Issue 6
[AD2] S2.SYS.RS.1001 LISA Pathfinder Software Product Assurance Requirements for Subcontractors, Issue 2
[AD3] S2.ASU.RS.1023 LISA Pathfinder Subcontractor Project Management Requirements and Tasks, Issue 3
[AD4] S2-ASU-RS-1034 CADM Requirements, Issue 2

1.1.2 Reference Documents

[RD1] S2.ASU.LI.2003 Critical Items List (CIL), Issue 1
[RD2] ECSS-Q-80B Software Product Assurance, Latest Issue
[RD3] ECSS-E-40B Software Part 1: Principles and Requirements, Latest Issue
[RD4] S2.ASU.PL.1003 LISA Pathfinder Product Assurance Plan, Issue 2
[RD5] S2.ASU.LI.1006 LISA Pathfinder Acronyms List, Issue 2

2 ACRONYMS

See [RD5]

3 PROGRAMME OVERVIEW

3.1 PROJECT FRAMEWORK

LISA Pathfinder (LISA-PF) will address the technology needs of LISA. It is required to place a spacecraft into an orbit compatible with the technology demonstration requirements. The payload of the LISA-PF mission will be the technology demonstration packages (the LISA Test Package, LTP, and the Disturbance Reduction System, DRS), the micropropulsion subsystems and the Drag-Free Attitude Control System (DFACS). The Prime Contractor responsible to the Agency for the LISA-PF programme is EADS Astrium Ltd, with EADS Astrium GmbH as core team member responsible for the DFACS and SciSys as core team member responsible for the OBC software. EADS Astrium GmbH is responsible under separate contract for the LTP industrial lead. The objectives of the LISA-PF Implementation Phase are to finalise the detailed design of the LISA-PF satellite, to manufacture and assemble the satellite, and to verify its performance.

4 DEFINITIONS

4.1 PRIME

References to Prime within this document will mean EADS Astrium Ltd., who shall act as customer to the subcontractor.

4.2 FUNCTION DEFINITION

Validation and Verification (V&V) are defined as follows:

Validation: Validation is a confirmation process, through the provision of objective evidence, that the requirements baseline functions and performances are correctly and completely implemented in the final product for the specific intended use or application of the software.

Verification: Verification is a confirmation process, through the provision of objective evidence, that each software product properly reflects and fulfils the specified requirements.

4.3 CRITICALITY DEFINITION

The definition of criticality classes applicable to all software items within the LISA Pathfinder project is as follows:

Safety-critical: Software whose anomalous behaviour would cause or contribute to a failure of the satellite system resulting in loss of life, personal injuries, or damage to other equipment.

Mission-critical: Software whose anomalous behaviour would cause or contribute to a failure of the satellite system resulting in permanent and/or non-recoverable loss of the satellite's capability to perform its planned mission.

Non-critical: Software whose anomalous behaviour would cause or contribute to a failure of the satellite system with negligible or minor effect on the satellite's operability, with the possibility to remove the fault, for example, by patching the malfunctioning software.

The term critical software encompasses software items that are either safety-critical or mission-critical, or both.

4.4 INDEPENDENT SOFTWARE VERIFICATION AND VALIDATION (ISVV)

Independent Software Verification and Validation: the activities listed below are undertaken, under contract to Prime, by an organisation independent from the software development organisation(s).
These activities take place in parallel with the activities of the software developers and need to relate to the developers' planned development lifecycles. The V&V activities performed under the ISVV programme shall not substitute in any way for the V&V activities to be performed by the software developers.
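The criticality classes of 4.3 amount to a simple classification rule keyed on the worst-case effect of an item's anomalous behaviour. The sketch below is illustrative only; the effect strings and the classification helper are assumptions for illustration, not project definitions:

```python
from enum import Enum

class Criticality(Enum):
    SAFETY_CRITICAL = "safety-critical"    # loss of life, injury, damage to other equipment
    MISSION_CRITICAL = "mission-critical"  # permanent/non-recoverable loss of mission capability
    NON_CRITICAL = "non-critical"          # negligible or minor effect, fault removable (e.g. by patching)

def classify(worst_case_effect: str) -> Criticality:
    """Map the worst-case effect of an item's anomalous behaviour
    onto the 4.3 criticality classes (effect categories are invented)."""
    effect = worst_case_effect.lower()
    if effect in ("loss of life", "personal injury", "damage to other equipment"):
        return Criticality.SAFETY_CRITICAL
    if effect in ("permanent loss of mission", "non-recoverable degradation"):
        return Criticality.MISSION_CRITICAL
    return Criticality.NON_CRITICAL

def is_critical(c: Criticality) -> bool:
    """'Critical software' covers both safety- and mission-critical items (4.3)."""
    return c is not Criticality.NON_CRITICAL
```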

The main activities related to ISVV are subdivided as follows:

Requirements Verification: This activity ensures that all the originating requirements are expressed correctly and fully in higher-level specifications and are consistently allocated to subordinate software requirements.

Design Verification: This activity verifies the requirements cross-referencing between the software requirements baseline and the software design. This activity will be split further to accommodate the software development programme as defined by the software developer.

Code Verification: This activity includes the evaluation of consistency and correctness between the code and the software design, plus an assessment of how well the code follows applicable standards and practices (e.g. as defined in the governing Software Development Plan or Software Quality Assurance Plan). A static code analysis tool (ideally differing from any tool employed by the developer) is expected to be used in the assessment of code.

Test Evaluation: This activity includes the evaluation of each software developer verification test for consistency, feasibility and coverage of the requirements.

Validation: This activity, in addition to the validation activities performed by the software developer, concentrates on testing special error cases and worst-case scenarios.

The scheduling of the above activities needs to reflect the stage reached by each developer (a typical specimen programme flow chart is given at Annex A). Repeat verification and validation shall be performed, as necessary, to reflect the potential evolution of requirements, design or code. Wherever possible, observations arising from ISVV shall be available in time to be raised as RIDs during the course of formal software developer milestone reviews.
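At its core, the requirements-verification activity above is a bidirectional traceability check: every system requirement must be allocated to at least one software requirement, and every software requirement must trace back to a valid parent. A minimal illustrative sketch (requirement IDs and the data layout are hypothetical, not project artefacts):

```python
def check_traceability(system_reqs, software_reqs, trace):
    """trace maps software-requirement ID -> set of parent system-requirement IDs.
    Returns (unallocated system reqs, orphan software reqs, dangling links)."""
    allocated, orphans, dangling = set(), set(), set()
    for sw_id in software_reqs:
        parents = trace.get(sw_id, set())
        if not parents:
            orphans.add(sw_id)                  # no trace to the requirements baseline
        allocated |= parents
        dangling |= parents - set(system_reqs)  # trace to a non-existent parent
    unallocated = set(system_reqs) - allocated  # system reqs never allocated downward
    return unallocated, orphans, dangling

sys_reqs = ["SYS-010", "SYS-020"]
sw_reqs = ["SW-100", "SW-110", "SW-120"]
trace = {"SW-100": {"SYS-010"}, "SW-110": {"SYS-010"}}
print(check_traceability(sys_reqs, sw_reqs, trace))  # ({'SYS-020'}, {'SW-120'}, set())
```

The same check, run in both directions over each refinement step (system to software requirements, requirements to design, design to code), covers the traceability aspects listed in 5.3.2 through 5.3.4.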

5 PROGRAMME ACTIVITIES AND SCOPE

This section describes the scope of work, deliverable documents and associated milestones for the ISVV programme.

5.1 SOFTWARE IDENTIFICATION

Software subject to ISVV shall be identified by Prime using various dependability analyses as defined within the LISA Pathfinder Product Assurance Plan [RD4], including Hazard Analyses and Hardware/Software Interaction Analyses (HSIA). The output of these analyses will include identified software items categorised with appropriate criticality designations. Critical software items will be listed in the LISA Pathfinder Critical Items List (CIL) [RD1].

5.2 MILESTONES

The dates for all milestones relevant to the ISVV programme will be as defined in the Statement of Work. These dates may evolve in the course of the project. Prime will, on an ongoing basis, ensure that the ISVV organisation is kept adequately apprised of all relevant milestone dates in order that ISVV activities may be planned accordingly.

5.3 SCOPE OF WORK

Prime will define within the CIL the software items considered to be critical and thus subject to ISVV. The CIL is a living document and therefore subject to change. All critical software items shall be addressed, and the contractor shall be free to provide additional focus to the analysis of any software items that they deem to merit particular scrutiny. Reports on the status of ISVV activities shall be provided to Prime on an ongoing basis. In addition, a detailed report and conclusion shall be created for each of the software engineering processes defined in the following subsections. Annex B gives guidelines for the content of these detailed reports and also summarises requirements for ongoing progress reports and attendance at software developer milestone reviews. The contractor shall devise independent verification and validation methods for the software requirements, design and code, avoiding where possible the use of tools already employed by the software developers.
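One way to stay independent of the developers' toolchains is to build lightweight checks in-house. As a purely illustrative sketch of the approach (the flight software itself is in C/Assembler; Python, the node set and the threshold here are assumptions, not project choices), an approximate McCabe cyclomatic-complexity scan using only the standard library:

```python
import ast

# Node types counted as decision points (a simplification of McCabe's definition).
DECISIONS = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.IfExp, ast.BoolOp)

def cyclomatic_complexity(func_node):
    """Approximate McCabe complexity: 1 + number of decision points."""
    return 1 + sum(isinstance(n, DECISIONS) for n in ast.walk(func_node))

def flag_complex_functions(source, limit=10):
    """Name and score of every function whose complexity exceeds `limit`."""
    flagged = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            score = cyclomatic_complexity(node)
            if score > limit:
                flagged.append((node.name, score))
    return flagged

sample = "def f(x):\n    if x > 0:\n        return 1\n    return 0\n"
print(flag_complex_functions(sample, limit=1))  # [('f', 2)]
```

Complexity measurement of this kind feeds directly into the criticality analysis of 5.3.1 and the coding-standards assessment of 5.3.4.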
It is the responsibility of the contractor to understand the division of the OBC software into its sub-components and to ensure the interfaces between software components are suitably documented and tested. Additional software validation test cases shall be devised (see 5.3.5) to test for exceptions and to subject the software system to pre-determined stresses, including error conditions, worst-case scenarios and deadlock situations. The tests shall be designed to provide a measure of actual performance against specified budgets. Depending on the software under test and the test requirements, the tests may need to employ an individual software developer's SDE, the SVF at

SciSys premises or a more fully integrated MDVE at EADS Astrium premises. In all cases the contractor shall approve the test scripts, evaluate all results obtained and (with the possible exception of repeat testing) witness the tests being performed at the premises of the software supplier concerned.

5.3.1 Software Criticality Analysis

The contractor shall assess the criticality level of the software items listed in Table 5.4.1, as a means to scope their activities to the areas deemed the most critical, and shall publish the results in an Independent Software Criticality Analysis Report. The analysis shall consider the system level requirements, the technical specifications, the design and the code, and shall be supported by traceability analysis, control flow/call graph analysis and complexity measurement.

5.3.2 Requirements Verification

System level requirements for the on-board software, whether in respect of the OBC application software or embedded software in flight equipments, will be defined by Prime. In the case of equipment embedded software, equipment level requirements generated by Prime will be the principal customer input to the software developers for determining the required software functionality. Prime will also originate the detailed AOCS algorithms to be implemented in the OBC application software, via an AOCS Software Systems Specification. The contractor shall evaluate how each software developer has responded to customer requirements and publish the evaluation in a Requirements Verification Report (structured as per Annex B). Requirements verification shall address the following aspects:

a. Software requirements are traceable to system requirements.
b. Software requirements are correctly derived from system requirements.
c. Software requirements are externally and internally consistent.
d. Software requirements are verifiable.
e. Feasibility of software design.
f. Feasibility of operations and maintenance.

5.3.3 Design Verification

5.3.3.1 Architectural Design

Before embarking on a detailed software design, each software developer would normally generate an architectural design identifying the essential software structure. Typically, this would encompass the identification of the main modules, the principal functions of each module, software interfaces, overall mode logic, timing and sizing budgets, etc. The contractor shall evaluate each software

developer's architectural design and publish the evaluation in an Architectural Design Verification Report (structured as per Annex B). The contractor shall also review any ongoing evolution of requirements during this activity, and shall update the Requirements Verification Report (from the previous engineering phase) as necessary. Architectural design verification shall address the following aspects:

a. Traceability from the requirements to the software structure.
b. Internal consistency between the software components.
c. Feasibility of producing a detailed design.
d. Feasibility of operations and maintenance.
e. Design evaluation, including reliability and robustness aspects.
f. Correctness of the design with respect to the requirements and the interfaces.
g. Correct implementation of sequences of events, inputs, outputs, interfaces, logic flow, allocation of timing and sizing budgets, error definition, isolation and recovery.
h. Software partitioning integrity, to assess whether the division into components is safe and does not allow failures to be propagated between components.

5.3.3.2 Detailed Design

Each software developer will provide documentation containing the detailed software design. Either a dedicated detailed design document will be produced or the detailed design information will reside within a combined architectural/detailed design document. In either case, the software design will be documented in sufficient detail to allow coding activities to commence. The contractor shall evaluate each software developer's detailed design and publish the evaluation in a Detailed Design Verification Report (structured as per Annex B). The contractor shall also review any ongoing evolution of requirements and/or software architecture during this activity, updating the Requirements Verification and Architectural Design Verification Reports (from previous engineering phases) as necessary. Detailed design verification shall address the following aspects:

a. Traceability from the architectural design to the detailed design.
b. Internal consistency between the software components.
c. Feasibility of testing.
d. Feasibility of operations and maintenance.
e. Correctness of the design with respect to the requirements and the interfaces.
f. Correct implementation of sequences of events, inputs, outputs, interfaces, logic flow, allocation of timing and sizing budgets, error definition, isolation and recovery.
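Several of the design-verification aspects above are mechanically checkable once the design data is captured; for instance, internal consistency between components can be screened by matching required interfaces against provided ones. A hypothetical sketch (component and interface names are invented):

```python
def check_interfaces(components):
    """components: name -> {"provides": set of interface IDs,
                            "requires": set of interface IDs}.
    Returns, per component, the interfaces it requires that no
    component in the design provides."""
    provided = set()
    for spec in components.values():
        provided |= spec["provides"]
    missing = {}
    for name, spec in components.items():
        unmet = spec["requires"] - provided
        if unmet:
            missing[name] = unmet  # inconsistency: unsatisfied required interface
    return missing

design = {
    "modemgr":   {"provides": {"IF-MODE"}, "requires": {"IF-TM"}},
    "tmhandler": {"provides": {"IF-TM"},   "requires": {"IF-TIME"}},
}
print(check_interfaces(design))  # {'tmhandler': {'IF-TIME'}}
```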

5.3.4 Coding Verification

The contractor shall evaluate each software developer's code and publish the evaluation in a Code Verification Report (structured as per Annex B). The contractor shall also review any ongoing evolution of requirements and/or software architecture and/or software detailed design during this activity, updating the relevant Verification Reports (from previous engineering phases) as necessary. Code verification shall address the following aspects:

a. Traceability to design and requirements, testability, correctness and conformity to coding standards.
b. Internal consistency between the software units.
c. Feasibility of integration and testing.
d. Feasibility of operations and maintenance.
e. Testability of each code unit (whether acceptance criteria can be defined for each unit).
f. Correct implementation of sequences of events, completeness, consistent interfaces, correct data and control flow, appropriate allocation of timing and sizing budgets and appropriate mechanisms for error definition, isolation and recovery.
g. Suitability and effectiveness of the software developer's test V&V documentation (i.e. that for each requirement of the software item, a set of test cases (inputs, outputs, test criteria) is included).

5.3.5 Verification and Validation Testing

The contractor shall evaluate the verification and validation tests performed by each software developer as described more fully below. In addition, the contractor shall produce additional validation tests, using test scenarios and procedures created independently of the software developer. These validation tests will be kept separate from the software developers' own validation tests to ensure independence. The contractor shall also review any ongoing evolution of requirements and/or design during this activity, updating the Verification Reports from previous engineering phases as necessary.

5.3.5.1 Test Evaluation

This task is intended to encompass unit and integration testing, and validation against the technical specification and against the requirements baseline. The contractor shall evaluate each software developer's verification tests and publish the evaluation in a Test Evaluation Report (structured as per Annex B). The contractor shall also evaluate each software developer's validation tests, and publish the evaluation in a Test Evaluation Report (structured as per Annex B). This shall be delivered to Prime in time for each software developer's QR, unless otherwise agreed with Prime.

Test evaluation shall address the following aspects:

a. Traceability to software design.
b. Test coverage of the requirements of the software item.
c. Conformance of test results with test pass/fail criteria.
d. Evaluation of the operations manual / user manual (if applicable) for completeness and clarity.
e. Coverage of system level requirements by the validation test cases.
f. Extent to which validation tests exercise stress conditions.
g. Extent to which software is shown to perform reliably in a representative operational environment.
h. Response to injected errors, boundary and singular inputs; ability to isolate and minimise the effect of errors.
i. Conformance of the software to the operational and functional requirements relating to critical software.
j. Statement and branch coverage.
k. Identification of potential tests for independent validation testing.

5.3.5.2 Validation Testing

This task will be more focused on validating the software against the Technical Specification and the Requirements Baseline, including user level requirements, demonstrating robustness and reliability, etc. It is in this area that the contractor is required to contribute additional 'validation' test cases. As part of the Validation Testing activity, the contractor shall evaluate each software developer's problem and non-conformance reporting system. The contractor shall track any non-conformances identified during validation testing, in discussion with Prime, until a successful resolution is reached. The evaluation of the software developer's problem reporting system shall be included in the Validation Test Report. In respect of the ISVV-specified additional validation tests, the contractor shall detail these, and their justification, in an Independent Validation Test Plan. The results from these tests shall be published in an Independent Validation Test Results document.

Note that software developers may also elect to incorporate these tests within their own validation test documentation. The ISVV-specified additional validation tests will be performed at times mutually agreed between the software developers, Prime and the contractor. In general, these tests will take place during or shortly after each developer's validation test campaign, to maximise the benefit of the test equipment being already set up. In some cases, tests may be performed at higher levels of integration of the overall software system. In both cases, Prime, in agreement with the software

developers, will ensure the test equipment is available to the contractor for an agreed duration for each stage of the testing. In the event that changes to software functionality arise in the period following the execution of the ISVV-specified additional validation test cases, the contractor shall revisit the test cases and make any modifications deemed necessary. The contractor shall closely monitor any repeated ISVV-specified tests. Note that it is not anticipated that repeat visits to software developers' premises would be required in these circumstances. The contractor shall ensure that the final version of the flight software is subject to the full set of additional validation tests.

5.3.6 Evaluation of Documentation

During each of the processes above, the software developer's documentation shall be evaluated and the evaluation incorporated into the Verification Report for the applicable engineering phase. Documentation verification shall address the following aspects:

a. Documentation is relevant, adequate, complete and consistent.
b. Documentation conforms to prescribed configuration process controls, including references, format, content, review, approval and authorising, release, update and storage.
c. Records are established and maintained in accordance with a documented process defining identification, storage, protection, retrieval and disposition.

5.4 SOFTWARE ITEMS SUBJECT TO ISVV

The LISA Pathfinder on-board software items are as presented in Table 5.4.1.

Code Description      | Language    | Size (Lines of code) | Processor | Supplier
OBC ASW               | C           | ~75000               | ERC32     | SciSys
OBC API SW and SUSW   | C/Assembler | ~5000                | ERC32     | TBC
Star Tracker          | C/Assembler | ~20000               | TBC       | TBC
Sun Sensor            | C/Assembler | ~5000                | TBC       | TBC
Gyro                  | C/Assembler | ~5000                | TBC       | TBC
FEEPS                 | C/Assembler | ~5000                | TBC       | TBC
RTOS Managers         | C/Assembler | See Note 2           | ERC32     | TBC
TBC                   | TBC         | TBC                  | TBC       | EADS Astrium

Table 5.4.1 Software Items Subject to ISVV

Note 1: The table content is currently largely TBC. The definitive list, including critical category designations, will be found in the CIL [RD1], which will become more specific and be reiterated as subsystem analyses progress.

Note 2: The RTOS employed for the LISA Pathfinder OBC Application Software is RTEMS, which shall be subject to validation testing to the extent necessary to demonstrate that the relevant managers perform acceptably under realistic worst-case conditions. As input to this, the contractor shall review all schedule analyses performed by the OBC Application Software developer.
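As a first-pass cross-check on the schedule analyses mentioned in Note 2, the classical Liu and Layland rate-monotonic utilisation bound gives a quick, sufficient (but not necessary) schedulability test for a fixed-priority periodic task set. The task parameters below are invented for illustration, not taken from the OBC software:

```python
def rm_utilisation_test(tasks):
    """tasks: list of (wcet, period) pairs in the same time unit.
    Sufficient schedulability test for rate-monotonic scheduling:
    the set is schedulable if U <= n * (2**(1/n) - 1)."""
    n = len(tasks)
    utilisation = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1 / n) - 1)
    return utilisation, bound, utilisation <= bound

# Hypothetical task set: (WCET ms, period ms)
tasks = [(2, 10), (4, 40), (10, 100)]
u, bound, ok = rm_utilisation_test(tasks)
print(f"U = {u:.3f}, bound = {bound:.3f}, schedulable: {ok}")
# U = 0.400, bound = 0.780, schedulable: True
```

A utilisation above the bound does not prove unschedulability; it only flags the task set for exact response-time analysis, which is where review of the developer's own schedule analyses comes in.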

5.5 DELIVERABLE DOCUMENTS

The ISVV organisation shall prepare and deliver documentation as per Table 5.5.1, for input to the referenced software developer review milestones. The documents below are delivered against the software review milestones Kick-off, SRR, PDR, CDR, QR and AR:

- Management Plan
- Quality Plan
- Requirements Verification Report (initial comments; report; update)
- Architectural Design Verification Report (initial comments; report; final update)
- Detailed Design Verification Report (initial comments; report; final update)
- Code Verification Report (report; final update)
- Test Evaluation Report
- Independent Software Criticality Analysis Report
- Independent Validation Test Plan
- Independent Validation Test Results
- Validation Test Report
- Summary Report (provided for each system level review)
- Periodic PA/progress reports (periodically, as defined in the SOW)

Table 5.5.1 Timing of Reports

Initial comments for each milestone report shall be presented to Prime as inputs to the relevant milestone review, as indicated by the flowchart at Annex A, and will be based on the information contained in the milestone review data pack delivered two weeks before the review. Each report will be updated with information provided during the review and in the data pack. The updated report will

Page 19 of 36 be presented to Prime after an agreed duration after each initial set of comments and no later than the review milestone marked in the table 5.5.1. In addition, a final update shall be presented to Prime not later than the milestone marked in table 5.5.1. Where the developer is utilising pre-existing software, the reporting sequence shall reflect those milestones still available within the development lifecycle and encompass reporting information that would have otherwise been included in the earlier report(s). The Management Plan content shall conform to LISA Pathfinder project management requirements [AD3] and shall include the following: Identification of software requiring and software developer. Independency of organisation. Planned interaction relating to the software developer s development lifecycle, including how, when and where the activities will take place. reports shall include the content relating to verification criteria and shall comply with the guidelines given in Annex B to this document. 5.6 SOFTWARE NON-CONFORMANCE AND PROBLEM REPORTING Software problems and non-conformances detected by the contractor shall be processed in accordance with the requirements defined in the LISA Pathfinder Software Product Assurance Requirements for Subcontractors [AD2]. 5.7 RISK MANAGEMENT The contractor shall undertake suitable risk management with respect to the undertaking of their own responsibilities defined within this specification. Any anticipated difficulties in generating timely reports shall be reported to Prime for agreement of risk reduction actions. The contractor shall also assess each software developer s risk management process as an input to the overall evaluation of each supplier s ability to deliver a quality product within contractual timescales. The contractor shall report any concerns to Prime.

6 ORGANISATION

6.1 CAPABILITY

The contractor's organisation shall have the management, quality and technical skills required to undertake the responsibilities defined in this specification. Technical staff shall be capable of understanding the languages, tools and methods used by the applicable software developers on the LISA Pathfinder project. The contractor shall demonstrate in its proposal that personnel with the appropriate skills are available.

6.2 TOOLS, TECHNIQUES AND METHODS

The tools and methods to be deployed for the verification activities in each of the software lifecycle phases shall be declared and shall be subject to approval by Prime.

ANNEX A  SPECIMEN PROGRAMME FLOWCHART

[Flowchart: the software development lifecycle phases (system requirements; software requirements; architectural design; detailed design; code; testing; integration; verification and validation; system installation and system test) are shown running in parallel with the corresponding ISVV activities (requirements verification; design verification; code, test and integration verification; test evaluation; validation), with the lifecycle reviews SRR, PDR, CDR (Prime), QR and AR marking the reporting points.]

FLOW CHART KEY

Acronyms:
AR   Acceptance Review
CDR  Critical Design Review
PDR  Preliminary Design Review
QR   Qualification Review
SRR  System Requirements Review

Key symbols:
- SW development lifecycle path
- Parallel operation
- ISVV investigation activity
- Reporting path

ANNEX B  ISVV REPORT GUIDELINES

B.1 EXPECTED CONTENT OF A VERIFICATION REPORT

These reports shall contain an evaluation of the engineering-phase-dependent activities of the software developers, and shall include:
- Introduction:
  - Context
  - Identification of the software developer and of the items that were subject to ISVV
  - Software development phase
  - Approach used (tools etc.)
- References
- Contributors
- Summary of findings (including quality of documentation)
- Classification scheme used for findings (this should be consistent for all analyses)
- Status of any relevant NCRs or SPRs pertaining to report material
- Review Item Discrepancies (to include recommendations)
- Change history

B.2 PERIODIC REPORTING, MEETINGS AND PARTICIPATION IN REVIEWS

Bi-monthly progress meetings shall be held with Prime at the contractor's premises. ESA shall be invited to these meetings. A progress report shall be provided at least 5 days prior to each meeting, covering commercial aspects, the current document list, the action item list, meetings or test campaigns held in the previous period, risk status and the activities planned for the next period.

The contractor shall plan and prepare for attendance at software developer milestone reviews, by agreement with Prime. A number of the RIDs raised at these reviews may be based on ISVV findings. It is anticipated that attendance at these reviews will also provide the contractor with insights into the software developers' engineering processes. (Note that whereas software developer milestone review meetings may occasionally be held on Astrium premises, meetings required for test witnessing during test campaigns shall in general be held at the premises of the respective software developer.)

The contractor shall also plan for participation in system level reviews with ESA and shall contribute an ISVV summary report to the review data pack.


ANNEX C  ISVV REQUIREMENTS MATRIX

The requirements herein shall be read and understood in conjunction with the text in the main body of the document. The table provides each requirement with a unique reference for compliance monitoring purposes, and may be used as a template for a compliance matrix.

Req 1 (para 1): Software designated as safety critical shall be subject to independent verification and validation. [Responsible: All]

Req 2 (para 1): Software designated as mission critical shall be subject to independent verification and validation. [Responsible: All]

Req 3 (para 4.4): Independent verification and validation shall take place in parallel with the activities of the software developer and shall relate to each developer's software development lifecycle. [Responsible: All]

Req 4 (para 4.4): The software developer shall perform its own V&V and shall not rely on ISVV. [Responsible: Developer]

Req 5 (paras 4.4, 5.3.3): Any iteration of the requirements phase (or part thereof) shall also be verified. [Comments: Within design phase verification]

Req 6 (paras 4.4, 5.3.4): Any iteration of the requirements/design phases (or part thereof) shall also be verified. [Comments: Within code phase verification]

Req 7 (paras 4.4, 5.3.5): Any iteration of the requirements/design/code phases (or part thereof) shall also be verified and validated. [Comments: Within verification and validation]

Req 8 (para 4.4): Observations arising from ISVV shall be available in time to be raised as RIDs during the course of formal software developer milestone reviews.

Req 9 (throughout): Prime approval shall be given by both the LISA Pathfinder Project Manager and the PA Manager, or their delegates. [Comments: Approval; Responsible: Prime]

Req 10 (para 5.1): Software subject to ISVV shall be identified by various dependability analyses, as defined within the LISA Pathfinder Product Assurance Plan. [Comments: Software identification; Responsible: Prime]

Req 11 (para 5.1): The analyses and the declaration of safety-critical or mission-critical software shall be determined by Prime. [Comments: Software identification; Responsible: Prime]

Req 12 (para 5.1): The software items declared critical shall be listed in the LISA Pathfinder CIL. [Comments: Software identification; Responsible: Prime]

Req 13 (para 5.2): The dates for all milestones relevant to the ISVV programme shall be as declared in the SOW. [Responsible: Prime]

Req 14 (para 5.2): Prime will, on an ongoing basis, ensure that the ISVV organisation is kept adequately apprised of all relevant milestone dates in order that ISVV activities may be planned accordingly. [Responsible: Prime]

Req 15 (para 5.3): The contractor shall devise independent verification and validation methods for the software requirements, design and code, avoiding where possible the use of tools already employed by the software developers.

Req 16 (para 5.3): The contractor shall devise additional software test cases to test for exceptions and to subject the software system to pre-determined stresses, including error conditions, worst-case scenarios and deadlock situations. The tests shall be designed to provide a measure of actual performance against specified budgets. [Comments: Scope of work]

Req 17 (para 5.3): In all cases the contractor shall witness the tests and evaluate all results obtained. [Comments: Scope of work]

Req 18 (paras 5.3, Annex B.2): The contractor shall evaluate the software developer's engineering process activities by attending software developer milestone reviews, by agreement with Prime. [Comments: Scope of work]

Req 19 (para 5.3.2): The contractor shall evaluate how each software developer has responded to customer requirements and shall publish the evaluation in a Requirements Verification Report (structured as per Annex B). [Comments: Software requirements process]

Req 20 (para 5.3.2): Requirements verification shall address the following aspects: [Comments: Software requirements process]
a) software requirements are traceable to system requirements;
b) software requirements are correctly derived from system requirements;
c) software requirements are externally and internally consistent;
d) software requirements are verifiable;
e) feasibility of the software design;
f) feasibility of operations and maintenance.

Req 21 (para 5.3.3.1): The contractor shall evaluate each software developer's architectural design and publish the evaluation in an Architectural Design Verification Report (structured as per Annex B). The contractor shall also review any ongoing evolution of requirements during this activity, and shall update the Requirements Verification Report (from the previous engineering phase) as necessary. [Comments: Software design process]

Req 22 (para 5.3.3.1): Architectural design verification shall address the following aspects: [Comments: Software design process]
a) traceability from the requirements to the software structure;
b) internal consistency between the software components;
c) feasibility of producing a detailed design;
d) feasibility of operations and maintenance;
e) design evaluation, including reliability and robustness aspects;
f) correctness of the design with respect to the requirements and the interfaces;
g) correct implementation of sequences of events, inputs, outputs, interfaces, logic flow, allocation of timing and sizing budgets, and error definition, isolation and recovery;
h) software partitioning integrity, to assess whether the division into components is safe and does not allow failures to be propagated between components.

Req 23 (para 5.3.3.2): The contractor shall evaluate each software developer's detailed design and publish the evaluation in a Detailed Design Verification Report (structured as per Annex B). The contractor shall also review any ongoing evolution of requirements and/or software architecture during this activity, updating the Requirements Verification and Architectural Design Verification Reports (from previous engineering phases) as necessary. [Comments: Software design process]

Req 24 (para 5.3.3.2): Detailed design verification shall address the following aspects: [Comments: Software design process]
a) traceability from the architectural design to the detailed design;
b) internal consistency between the software components;
c) feasibility of testing;
d) feasibility of operations and maintenance;
e) correctness of the design with respect to the requirements and the interfaces;
f) correct implementation of sequences of events, inputs, outputs, interfaces, logic flow, allocation of timing and sizing budgets, and error definition, isolation and recovery.
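The traceability checks required by the verification requirements above (software requirements traced to system requirements, and no system requirement left uncovered) can be sketched mechanically. The following is an illustrative sketch only; all requirement IDs are invented examples, not project identifiers.

```python
def check_traceability(trace: dict[str, list[str]], system_reqs: set[str]):
    """trace maps a software-requirement ID to its parent system-requirement IDs.
    Returns (software reqs with no parent,
             parent IDs that are not known system reqs,
             system reqs covered by no software req)."""
    untraced = sorted(r for r, parents in trace.items() if not parents)
    dangling = sorted(p for parents in trace.values() for p in parents
                      if p not in system_reqs)
    covered = {p for parents in trace.values() for p in parents}
    uncovered = sorted(system_reqs - covered)
    return untraced, dangling, uncovered

system_reqs = {"SYS-010", "SYS-020", "SYS-030"}
trace = {"SW-001": ["SYS-010"], "SW-002": ["SYS-010", "SYS-020"], "SW-003": []}
print(check_traceability(trace, system_reqs))
# → (['SW-003'], [], ['SYS-030'])
```

In practice the trace dictionary would be exported from the developer's requirements management tool; the point of the sketch is that each finding category maps directly onto a verification report discrepancy.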

Req 25 (para 5.3.4): The contractor shall evaluate each software developer's code and publish the evaluation in a Code Verification Report (structured as per Annex B). [Comments: Software coding process]

Req 26 (para 5.3.4): Code verification shall address the following aspects: [Comments: Software coding process]
a) traceability to design and requirements, testability, correctness, and conformity to software requirements and coding standards;
b) internal consistency between the software units;
c) feasibility of integration and testing;
d) feasibility of operations and maintenance;
e) testability of each code unit (whether acceptance criteria can be defined for each unit);
f) correct implementation of sequences of events, completeness, consistent interfaces, correct data and control flow, appropriate allocation of timing and sizing budgets, and appropriate mechanisms for error definition, isolation and recovery;
g) suitability and effectiveness of the software developer's test V&V documentation (i.e. for each requirement of the software item, a set of test cases (inputs, outputs, test criteria) is included).

Req 27 (para 5.3.5.1): The contractor shall evaluate each software developer's verification tests and publish the evaluation in a Test Evaluation Report (structured as per Annex B). This shall be delivered to Prime in time for each software developer's QR, unless otherwise agreed with Prime. [Comments: Test evaluation]

Req 28 (para 5.3.5.1): The contractor shall evaluate each software developer's verification tests and publish the evaluation in a Test Evaluation Report (structured as per Annex B). This shall be delivered to Prime in time for each software developer's AR, unless otherwise agreed with Prime. [Comments: Test evaluation]

Req 29 (para 5.3.5.1): Test evaluation shall address the following aspects: [Comments: Test validation]
a) traceability to the software design;
b) test coverage of the requirements of the software item;
c) conformance of test results with test pass/fail criteria;
d) evaluation of the operations manual / user manual (if applicable) for completeness and clarity;
e) coverage of system level requirements by the validation test cases;
f) extent to which validation tests exercise stress conditions;
g) extent to which the software is shown to perform reliably in a representative operational environment;
h) response to injected errors, boundary and singular inputs, and the ability to isolate and minimise the effect of errors;
i) conformance of the software to the operational and functional requirements relating to critical software;
j) statement and branch coverage;
k) identification of potential tests for independent validation testing.

Req 30 (para 5.3.5.2): Deleted.
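Item (j) above distinguishes statement coverage from branch coverage because the two are not equivalent: a test set can execute every statement while leaving a branch outcome unexercised. The toy function below is an invented illustration, not project code.

```python
def clamp(value: float, limit: float) -> float:
    """Invented example: limit a value to an upper bound."""
    if value > limit:
        value = limit
    return value

# One test case executes every statement (the 'if' body is taken),
# giving 100% statement coverage...
assert clamp(5.0, 3.0) == 3.0

# ...but full branch coverage also requires the false outcome of the
# 'if', i.e. a case where the body is skipped:
assert clamp(1.0, 3.0) == 1.0
```

This is why the test evaluation looks at both measures: statement coverage alone would have accepted the single-case test above even though the untaken branch was never demonstrated.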

Req 31 (para 5.3.5.2): As part of the validation testing activity, the contractor shall evaluate each software developer's problem and non-conformance reporting system. Any non-conformances identified in the course of performing validation tests shall be tracked by the contractor, in discussion with Prime, until a successful resolution is reached. The evaluation of the software developer's problem reporting system shall be included in the Validation Test Report. [Comments: NCR/SPR evaluation]

Req 32 (para 5.3.5.2): In respect of the ISVV-specified additional validation tests, the contractor shall detail these, and their justification, in an Independent Validation Test Plan, which shall be delivered to Prime in time for each software developer's CDR, unless otherwise agreed with Prime. The results from these tests shall be published in an Independent Validation Test Results document, which shall be delivered to Prime in time for each software developer's AR. [Comments: Documentation of independent validation tests]

Req 33 (para 5.3.5.2): In the event that changes to software functionality arise in the period following the execution of the ISVV-specified additional validation test cases, the contractor shall revisit the test cases and make any modifications deemed necessary. Any repeated ISVV-specified tests performed shall be monitored closely by the contractor. [Comments: Repeat validation tests]

Req 34 (para 5.3.5.2): The contractor shall ensure that the final version of the flight software is subject to the full set of additional validation tests. [Comments: Validation of final software]

Req 35 (para 5.3.6): During each of the processes above (paras 5.3.1 to 5.3.5) the software developer's documentation shall be evaluated and the evaluation incorporated into the verification report for the applicable engineering phase. [Comments: Documentation verification]
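The additional validation tests covered by the requirements above include measuring actual performance against specified budgets under worst-case inputs (cf. requirement 16). A budget-oriented test case might take the following shape; the routine under test, the frame size and the budget are all hypothetical placeholders, not project values.

```python
import time

def telemetry_checksum(frame: bytes) -> int:
    """Placeholder unit under test: a simple 16-bit additive checksum."""
    return sum(frame) & 0xFFFF

def run_budget_test(budget_s: float, frame: bytes) -> tuple[bool, float]:
    """Execute the unit under a worst-case input and compare the measured
    elapsed time against the specified timing budget."""
    start = time.perf_counter()
    telemetry_checksum(frame)
    elapsed = time.perf_counter() - start
    return elapsed <= budget_s, elapsed

# Hypothetical worst-case input: a 1 MiB frame, budget of 1.0 s.
worst_case_frame = bytes(range(256)) * 4096
passed, elapsed = run_budget_test(budget_s=1.0, frame=worst_case_frame)
print(passed)
```

On a real target the measurement would be taken on the flight processor with interrupts and bus load representative of the operational environment, since host-machine timings say nothing about ERC32 behaviour.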

Req 36 (para 5.3.6): Documentation verification shall address the following aspects: [Comments: Documentation verification]
a) documentation is relevant, adequate, complete and consistent;
b) documentation conforms to prescribed configuration process controls, including references, format, content, review, approval and authorisation, release, update and storage;
c) records are established and maintained in accordance with a documented process defining identification, storage, protection, retrieval and disposition.

Req 37 (para 5.4): The RTOS employed for the LISA Pathfinder OBC Application Software is RTEMS, which shall be subject to validation testing to the extent necessary to demonstrate that the relevant managers perform acceptably under realistic worst-case conditions. [Comments: RTOS]

Req 38 (para 5.4): As input to the ISVV of RTEMS, the contractor shall review all schedule analyses performed by the OBC Application Software developer. [Comments: RTOS]

Req 39 (para 5.5): Initial comments for each milestone report shall be presented to Prime as inputs to the relevant milestone review, as indicated by the flowchart at Annex A, and will be based on the information contained in the milestone review data pack delivered two weeks before the review. The report will be updated with information provided during the review and in the data pack. The updated report will be presented to Prime within an agreed period after each initial set of comments, and no later than the review milestone marked in Table 5.5.1. In addition, a final update shall be presented to Prime no later than the milestone marked in Table 5.5.1. [Comments: Timing of reports]

Req 40 (para 5.5): Where the developer is utilising pre-existing software, the reporting sequence shall reflect those milestones still available within the development lifecycle and shall encompass the reporting information that would otherwise have been included in the earlier report(s). [Comments: Pre-existing software]

Req 41 (para 5.5): The Management Plan content shall conform to the LISA Pathfinder project management requirements and shall include the following: identification of the software requiring ISVV and of each software developer; independence of the ISVV organisation; and planned ISVV interaction with each software developer's development lifecycle, including how, when and where the ISVV activities will take place. [Comments: ISVV organisation documentation]

Req 42 (para 5.5): ISVV reports shall include content relating to verification criteria and shall comply with the guidelines given in Annex B. [Comments: ISVV organisation documentation]

Req 43 (para 5.6): Software problems and non-conformances detected by the contractor shall be processed in accordance with the requirements defined in the LISA Pathfinder Software Product Assurance Requirements for Subcontractors. [Comments: Problem reporting]

Req 44 (para 5.7): The contractor shall undertake suitable risk management with respect to its own responsibilities as defined within this specification. Any anticipated difficulties in generating timely ISVV reports shall be reported to Prime for agreement of risk reduction actions. [Comments: Risk management]

Req 45 (para 5.7): The contractor shall also assess each software developer's risk management process as an input to the overall evaluation of each supplier's ability to deliver a quality product within contractual timescales. The contractor shall report any concerns to Prime. [Comments: Risk management]

Req 46 (para 6.1): The contractor's organisation shall have the management, quality and technical skills required to undertake the responsibilities defined in this specification. Technical staff shall be capable of understanding the languages, tools and methods used by the applicable software developers on the LISA Pathfinder project. [Comments: Organisation]

Req 47 (para 6.1): The contractor shall demonstrate in its proposal that personnel with the appropriate skills will be available for the project. [Comments: Capability]

Req 48 (para 6.2): The tools and methods to be deployed for the verification activities in each of the software lifecycle phases shall be declared and shall be subject to approval by Prime. [Comments: Tools]

Req 49 (para 6.3): Deleted.

Req 50 (para 5.3.1): The contractor shall assess the criticality level of the software items listed in Table 5.4.1 as a means to scope its ISVV activities to the areas deemed most critical, and shall publish the results in an Independent Software Criticality Analysis Report. [Comments: Software criticality analysis]

Req 51 (para 5.3.1): The contractor's assessment shall consider the system level, the technical specifications, the design and the code, and shall be supported by traceability analysis, control flow/call graph analysis and complexity measurement. [Comments: Software criticality analysis]
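The complexity measurement named in requirement 51 is commonly McCabe cyclomatic complexity, computed from a control-flow graph as M = E - N + 2P (edges, nodes, connected components). The following sketch, with an invented example graph, shows the calculation; the specification does not mandate a particular metric, so this is illustrative only.

```python
def cyclomatic_complexity(edges: list[tuple[str, str]],
                          nodes: set[str],
                          components: int = 1) -> int:
    """McCabe cyclomatic complexity M = E - N + 2P for a control-flow graph."""
    return len(edges) - len(nodes) + 2 * components

# Invented CFG of a function containing a single if/else:
# entry -> branch -> (a | b) -> exit
nodes = {"entry", "branch", "a", "b", "exit"}
edges = [("entry", "branch"), ("branch", "a"), ("branch", "b"),
         ("a", "exit"), ("b", "exit")]
print(cyclomatic_complexity(edges, nodes))  # → 2
```

A value of 2 matches the intuition of one decision point; in a criticality analysis, units whose complexity exceeds an agreed threshold would be flagged for deeper ISVV attention.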


DOCUMENT CHANGE DETAILS

Issue   Change Authority   Class   Relevant Information/Instructions
4       S2CA120            -       Added comments from preteb #2
3       S2CA090            -       Added comments from preteb
2       S2CA088            -       Second issue for ITT data pack
1       -                  -       First draft for System PDR

DISTRIBUTION LIST

Internal: M. Backler, G. Adams, Configuration Management, T. Remion, W. Fichter
External: Luisella Giulicchi (ESA), Paolo Maldari (ESOC), Juan Carenza (ESA)