Integrated Software Dependent Systems (ISDS)
OFFSHORE STANDARD DNV-OS-D203
Integrated Software Dependent Systems (ISDS)
DECEMBER 2012

The electronic PDF version of this document is the officially binding version.
FOREWORD

DNV is a global provider of knowledge for managing risk. Today, safe and responsible business conduct is both a license to operate and a competitive advantage. Our core competence is to identify, assess, and advise on risk management. From our leading position in certification, classification, verification, and training, we develop and apply standards and best practices. This helps our customers safely and responsibly improve their business performance. DNV is an independent organisation with dedicated risk professionals in more than 100 countries, with the purpose of safeguarding life, property and the environment.

DNV service documents consist of, among others, the following types of documents:
- Service Specifications. Procedural requirements.
- Standards. Technical requirements.
- Recommended Practices. Guidance.

The Standards and Recommended Practices are offered within the following areas:
A) Qualification, Quality and Safety Methodology
B) Materials Technology
C) Structures
D) Systems
E) Special Facilities
F) Pipelines and Risers
G) Asset Operation
H) Marine Operations
J) Cleaner Energy
O) Subsea Systems
U) Unconventional Oil & Gas

Det Norske Veritas AS, December 2012

Any comments may be sent to [email protected]

This service document has been prepared based on available knowledge, technology and/or information at the time of issuance of this document, and is believed to reflect the best of contemporary technology. The use of this document by others than DNV is at the user's sole risk. DNV does not accept any liability or responsibility for loss or damages resulting from any use of this document.
CHANGES

General
This document supersedes the May edition of DNV-OS-D203. Text affected by the main changes in this edition is highlighted in red colour. However, if the changes involve a whole chapter, section or sub-section, normally only the title will be in red colour.

Changes
The document is totally revised.
CONTENTS

CH. 1 INTRODUCTION
Sec. 1 General
  A. General
    A 100 Introduction
    A 200 Objectives
    A 300 Organisation of content
    A 400 Assumptions
    A 500 Scope and application
    A 600 Types of software within scope
    A 700 Alterations and additions of approved systems
  B. References
    B 100 International or national references
    B 200 DNV references

CH. 2 TECHNICAL PROVISIONS
Sec. 1 Principles
  A. General
    A 100 Process requirements
    A 200 System hierarchy
Sec. 2 Confidence Levels
  A. Confidence Levels
    A 100 Definition of confidence levels
Sec. 3 Responsibilities
  A. Activities and Roles
    A 100 Activities
    A 200 Roles
Sec. 4 Project Phases and Process Areas
  A. Project phases
    A 100 Introduction
    A 200 Basic engineering phase (A)
    A 300 Engineering phase (B)
    A 400 Construction phase (C)
    A 500 Acceptance phase (D)
    A 600 Operation phase (E)
  B. Process Areas
    B 100 Introduction
    B 200 Requirements engineering (REQ)
    B 300 Design (DES)
    B 400 Implementation (IMP)
    B 500 Acquisition (ACQ)
    B 600 Integration (INT)
    B 700 Verification and Validation (VV)
    B 800 Reliability, Availability, Maintainability and Safety (RAMS)
    B 900 Project Management (PM)
    B 1000 Risk Management (RISK)
    B 1100 Process and Quality Assurance (PQA)
    B 1200 Configuration Management (CM)
Sec. 5 ISDS Requirements for Owners
  A. Owner requirements
    A 100 Requirements under the owner's responsibility
    A 200 Acceptance criteria for owner assessments
    A 300 Documentation criteria for the owner
Sec. 6 ISDS Requirements for System Integrators
  A. System integrator requirements
    A 100 Requirements under the system integrator's responsibility
    A 200 Acceptance criteria for system integrator assessments
    A 300 Documentation criteria for the system integrator
Sec. 7 ISDS Requirements for Suppliers
  A. Supplier requirements
    A 100 Requirements under the supplier's responsibility
    A 200 Acceptance criteria for supplier assessments
    A 300 Documentation criteria for the supplier
Sec. 8 ISDS Requirements for the Independent Verifier
  A. Independent verifier requirements
    A 100 Activities for which the independent verifier is responsible

CH. 3 CLASSIFICATION AND CERTIFICATION
Sec. 1 Requirements
  A. General
    A 100 Introduction
    A 200 Organisation of Chapter
    A 300 Classification principles
    A 400 Compliance of Activities
    A 500 Approval of Documents
    A 600 Rating of compliance
    A 700 Reporting and milestone meetings
  B. Class notation
    B 100 Designation
    B 200 Scope
  C. In operation assessments
    C 100 Objectives
    C 200 Scope of annual assessments
    C 300 Scope of renewal assessments

App. A DEFINITIONS AND ABBREVIATIONS
  A. Definitions
    A 100 Verbal Forms
    A 200 Definitions
  B. Abbreviations

App. B REQUIREMENT DEFINITION
  A. Requirement definition
    A 100 General
    A 200 Activity definition basic engineering
    A 300 Activity definition engineering
    A 400 Activity definition construction
    A 500 Activity definition acceptance
    A 600 Activity definition operation
    A 700 Activity definition several phases
CHAPTER 1 INTRODUCTION

Sec. 1 General
SECTION 1 GENERAL

A. General

A 100 Introduction
101 This standard contains requirements and guidance on the process of design, construction, commissioning and operation of Integrated Software Dependent Systems (ISDS). ISDS are integrated systems whose overall behaviour depends on the behaviour of the systems' software components.
102 This standard focuses on the integration of the software dependent systems, sub-systems and system components, and the effects these have on the overall performance of the unit (ship, rig etc.) in terms of functionality, quality, reliability, availability, maintainability and safety. This standard is intended to help system integrators, suppliers and owners to:
- reduce the risk of delays in new-build and modification projects,
- reduce the risk of downtime and accidents caused by software in the operation phase,
- improve the processes for maintenance and upgrades of software dependent systems throughout the life cycle,
- improve the knowledge of the relevant systems and software across the organisations,
- work within a common framework to deliver on schedule while achieving functionality, quality, reliability, availability, maintainability and safety targets,
- communicate and resolve key issues related to integration challenges at an early stage and throughout the whole life cycle.

A 200 Objectives
201 The objectives of this standard are to:
- provide an internationally acceptable standard for integrated software dependent systems by defining requirements for the work processes during design, construction, commissioning and operation,
- serve as a contractual reference document between suppliers and purchasers,
- serve as a guideline for designers, suppliers, purchasers and regulators,
- specify processes and requirements for units or installations subject to DNV certification and classification services.
A 300 Organisation of content
301 This document is divided into the following chapters and appendices:
- Ch.1 gives a general introduction, scope and references.
- Ch.2 lists the requirements for the different roles, including assessment and document requirements.
- Ch.3 gives procedures and principles applicable when this standard is used as part of DNV classification.
- Appendix A lists definitions and abbreviations used in this standard.
- Appendix B gives a detailed description of the activities introduced in Ch.2.

A 400 Assumptions
401 The requirements of this standard are based on the assumption that the personnel are qualified to execute the assigned activities.
402 The requirements of this standard are based on the assumption that the parties involved in the different processes are familiar with the intended function(s) of the system(s) subject to ISDS.

A 500 Scope and application
501 The requirements of this standard apply to the processes that manage ISDS throughout the life cycle of a ship or offshore unit, and apply to new-builds, upgrades and modification projects. The standard puts requirements on the ways of working, but does not contain any specific product requirements.
502 The requirements of this standard apply to systems, sub-systems and software components created, modified, parameterized, tuned and/or configured for the specific project where this standard is applied. This standard focuses on the software aspect in the context of system and unit requirements.
503 The voluntary ISDS class notation, as specified in Ch.3, may be assigned when DNV has verified compliance. DNV's verification activities include all the activities specified under the independent verifier role in Ch.2, for the relevant confidence level.
A 600 Types of software within scope
601 This standard focuses on achieving high software quality and takes into consideration all typical types of software. The requirements differ depending on whether the software is new or reused:
- New software (typically application software) developed within the project is qualified for use in the ISDS by showing that the supplier's development process is compliant with this standard.
- All reused software shall be qualified for use. Reused software is either COTS or base products. The term base product is here used to describe any kind of existing product, component, software library, software template or similar on which the supplier bases the development (or automatic generation) of the custom specific product. The qualification of reused software shall be performed using one of these options:
1) Demonstrating compliance with this standard.
2) Assessing the quality through due diligence of the software.
3) Demonstrating that the software is proven-in-use.
4) Procurement of COTS software as described in this standard.

A 700 Alterations and additions of approved systems
701 When an alteration or addition to the approved system(s) is proposed, applicable ISDS requirements shall be applied and relevant information shall be submitted to DNV. The alterations or additions shall be presented for assessment and verification.

B. References

B 100 International or national references
101 The standards listed in Table B1 are referenced in this standard.
Table B1 International or national references

Reference | Title
IEC IEV 191 | Dependability and quality of service
IEC | Functional safety of electrical/electronic/programmable electronic safety-related systems
IEC | Functional safety – Safety instrumented systems for the process industry sector
IEC 19501:2005 | Unified Modelling Language Specification
IEEE :1990 | Glossary of software engineering terminology
IEEE | Software configuration management plan
IEEE | Software test documentation
IEEE 1074:2006 | Developing software life cycle processes
INCOSE SE 2004 | INCOSE System Engineering Handbook, 2004
ISO/IEC 9126 | Software engineering – Product quality
ISO/IEC | Life Cycle Management – System Life Cycle Processes
ISO 9000 | Quality management systems
SWEBOK 2004 | Guide to the Software Engineering Body of Knowledge (SWEBOK), 2004 Version
B 200 DNV references
201 This standard is complementary to the standards listed in Tables B2 and B3.

Table B2 DNV Offshore Standards

Standard | Title
DNV-OSS-101 | Rules for Classification of Offshore Drilling and Support Units
DNV-OSS-102 | Rules for Classification of Floating Production, Storage and Loading Units
DNV-OSS-103 | Rules for Classification of LNG/LPG Floating Production and Storage Units or Installations
DNV-OSS-300 | Risk Based Verification
DNV-OS-A101 | Safety Principles and Arrangements
DNV-OS-D101 | Marine and Machinery Systems and Equipment
DNV-OS-D201 | Electrical Installations
DNV-OS-D202 | Automation, Safety and Telecommunication Systems
DNV-OS-D301 | Fire Protection
DNV-OS-E101 | Drilling Plant
DNV-OS-E201 | Oil and Gas Processing Systems
DNV-OS-E301 | Position Mooring

Table B3 Other DNV references

Reference | Title
DNV-RP-D201 | Recommended Practice for Integrated Software Dependent Systems
DNV-RP-A201 | Plan Approval Documentation Types – Definitions
DNV-RP-A203 | Recommended Practice for Qualification of New Technology
Pt.6 Ch.7 | Rules for Classification of Ships – Dynamic Positioning Systems
Pt.6 Ch.26 | Rules for Classification of Ships – Dynamic Positioning System – Enhanced Reliability DYNPOS-ER
SfC 2.24 | Standards for Certification – Hardware in the Loop Testing (HIL)
CHAPTER 2 TECHNICAL PROVISIONS

Sec. 1 Principles
Sec. 2 Confidence Levels
Sec. 3 Responsibilities
Sec. 4 Project Phases and Process Areas
Sec. 5 ISDS Requirements for Owners
Sec. 6 ISDS Requirements for System Integrators
Sec. 7 ISDS Requirements for Suppliers
Sec. 8 ISDS Requirements for the Independent Verifier
SECTION 1 PRINCIPLES

A. General

A 100 Process requirements
101 This standard provides requirements for a process. These requirements are formulated as a set of activities that apply for specific roles during specific project phases and at specific confidence levels.

A 200 System hierarchy
201 In order to describe the different parts that make up Integrated Software Dependent Systems, this standard uses the hierarchy defined in Fig.1.

Figure 1 The hierarchy terms used in this standard
SECTION 2 CONFIDENCE LEVELS

A. Confidence Levels

A 100 Definition of confidence levels
101 Confidence levels are assigned by the owner to a selection of the unit's functions and systems, based on evaluations of the importance of these functions in relation to reliability, availability, maintainability and safety.
102 Confidence levels define the required level of trust that a given function (implemented by one or more systems) will perform as expected. This standard defines confidence levels 1 through 3, where a higher confidence level requires a higher project ambition level with the aim of increasing the dependability of the systems in scope. The higher confidence levels also include the activities required for the lower ones.
103 Table A1 shows the differences between confidence levels 1, 2 and 3.
104 Ch.3, Sec.1, B200 shows the recommended confidence levels for systems and components relevant for selected unit types (drilling unit, FPSO etc.). See DNV-RP-D201 for general guidance on the principles of assigning confidence levels to functions and systems.

Table A1 The differences between the confidence levels

Confidence level | Characteristics | Focus | Key activities
1 | Basic software confidence | System | Project management; defined ways of working; design and verification of software within a system
2 | Enhanced integration confidence | Systems; system integration | Interface definition; describing interaction between systems; traceability of requirements; qualitative RAMS; obsolescence management
3 | Enhanced quantified confidence | Systems; system integration; high dependability | Quantitative RAMS; high involvement of independent verifier; enhanced verification
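Because each confidence level includes the activities required for the levels below it, the set of activities applying to a system can be resolved mechanically. The following Python sketch illustrates this cumulative rule; the activity names are abbreviated examples taken from Table A1, not the normative activity lists of Appendix B, and the function name is hypothetical:

```python
# Activities newly introduced at each confidence level
# (illustrative subset of the key activities in Table A1).
NEW_AT_LEVEL = {
    1: {"project management", "defined ways of working",
        "design and verification of software within a system"},
    2: {"interface definition", "traceability of requirements",
        "qualitative RAMS", "obsolescence management"},
    3: {"quantitative RAMS", "independent verifier involvement",
        "enhanced verification"},
}

def required_activities(cl: int) -> set:
    """Activities required at confidence level `cl`, including all
    activities inherited from the lower levels (Sec.2 A102)."""
    if cl not in NEW_AT_LEVEL:
        raise ValueError("confidence level must be 1, 2 or 3")
    return set().union(*(NEW_AT_LEVEL[level] for level in range(1, cl + 1)))

# Each level is a strict superset of the one below it.
assert required_activities(1) < required_activities(2) < required_activities(3)
```

The strict-subset assertion at the end captures the key property: raising the confidence level never removes an obligation, it only adds new ones.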
SECTION 3 RESPONSIBILITIES

A. Activities and Roles

A 100 Activities
101 Each requirement of this standard is formulated as an activity which is assigned to a role, a defined project phase, and a confidence level. The activities are listed in Sec.5 to Sec.7 and described in detail in Appendix B. Each required activity has a unique identifier. The identifier is structured in three parts: Z.YYY.NN.
- The first part ("Z") of the activity identifier refers to the project phase.
- The second part ("YYY") of the activity identifier refers to the process area.
- The third part ("NN") of the activity identifier is a unique number for the activity.
For example, A.REQ.2 is the identifier of the 2nd activity of the requirements process area for the basic engineering phase. Some activities are performed in two or more phases. In this case, the activity's phase is denoted "X", and each X activity describes in which phases it shall be performed. For example, X.REQ.1 is the common activity no. 1 of the requirements process area for managing requirement changes in all phases.
102 Several activities require communication between different roles to be carried out. For these activities the contributing role(s) are specified, in addition to the responsible role. The expected contributions are specified in this standard, and the contributing role shall provide the specified information to the responsible role when requested.

Figure 1 An overview of communication and exchange of information between the roles
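The Z.YYY.NN identifier scheme is regular enough to validate mechanically, which is convenient when activity references are tracked in project tooling. A minimal Python sketch (hypothetical helper; the phase letters and process-area codes are taken from Sec.4 of this standard, everything else is illustrative):

```python
import re

# Process-area codes from Ch.2 Sec.4 B.
AREAS = {"REQ", "DES", "IMP", "ACQ", "INT", "VV",
         "RAMS", "PM", "RISK", "PQA", "CM"}

# Phase letters A-E from Ch.2 Sec.4 A; X marks activities spanning phases.
_ID_RE = re.compile(r"^([A-EX])\.([A-Z]+)\.(\d+)$")

def parse_activity_id(identifier: str):
    """Split an ISDS activity identifier into (phase, area, number).

    Raises ValueError if the identifier does not follow Z.YYY.NN or
    uses an unknown phase or process-area code.
    """
    m = _ID_RE.match(identifier)
    if not m:
        raise ValueError(f"malformed activity identifier: {identifier!r}")
    phase, area, number = m.group(1), m.group(2), int(m.group(3))
    if area not in AREAS:
        raise ValueError(f"unknown process area: {area!r}")
    return phase, area, number

print(parse_activity_id("A.REQ.2"))   # ('A', 'REQ', 2)
print(parse_activity_id("X.PQA.4"))   # ('X', 'PQA', 4)
```

The character class `[A-EX]` accepts exactly the five phase letters plus the multi-phase marker X, so malformed references are rejected at parse time rather than surfacing later as dangling cross-references.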
A 200 Roles
201 This standard defines requirements on several organisations with responsibilities within the system life cycle. Each role is assigned activities and has the responsibility to perform the activity with an outcome that fulfils the specified criteria.
202 Each organisation is assigned one of four predefined roles. The four roles are:
- Owner (OW): In the context of this standard the owner is the organisation who decides to develop the unit (ship, rig etc.), and provides funding. The operator of the system can be part of the owner's organisation, or can be a subcontractor acting on behalf of the owner. For a definition of the term owner, reference is also made to DNV-OSS-101 Ch.1, Sec.1, B.
- System integrator (SI): Responsible for the integration of all systems included in the scope of this standard. The system integrator is normally the shipyard, but parts of the integration may be delegated to other parties. In such cases this shall be clearly defined and documented.
- Supplier (SU): Responsible for the integration and delivery of one or more single systems (drilling control system, power management system etc.). If the supplier purchases products and services from other organisations, these are regarded as sub-suppliers, and are under the supplier's responsibility.
- Independent verifier (IV): An organisation that is mandated to independently verify that the system is developed according to this standard. As part of the classification process for the ISDS notation, DNV shall take the role of independent verifier.
SECTION 4 PROJECT PHASES AND PROCESS AREAS

A. Project phases

A 100 Introduction
101 All activities in this standard are mapped to the typical project life cycle, see Fig.1.
102 The transitions between the phases represent ISDS milestones. At each milestone the following information shall be reported by the involved roles:
- status of compliance with this standard,
- action plans for handling non-conformities,
- risk observations made by the independent verifier.
103 At each milestone a milestone meeting should be arranged. The system integrator is responsible for arranging such meetings at all milestones, except M5, for which the owner is responsible.
104 An ISDS milestone is completed when the owner, the system integrator and the independent verifier endorse the information presented at the milestone.

Figure 1 Process chart describing the relationship between project phases (A to E) and ISDS milestones (M1 to M5). ISDS milestone M5 is only applicable for modifications made during the operation phase.

A 200 Basic engineering phase (A)
201 During this phase the technical specification and design of the unit are established, including RAMS requirements. The main systems which will be included in the scope of the ISDS requirements, and their confidence levels, are identified. The contract between the owner and the system integrator is established during this phase. The following activities normally take place before the contract between the owner and the system integrator is signed, depending on the confidence level for the different systems:
- Define mission, objectives and shared vision (A.REQ.1, CL1 and above).
- Define operational modes and scenarios to capture expected behaviour (A.REQ.3, CL2 and above).
- Define RAM related requirements and objectives (A.RAMS.2, CL2 and above).
- Define procedures (owner) (A.PQA.1, CL1 and above).
A 300 Engineering phase (B)
301 In this phase contracts are established with suppliers, and the suppliers are involved in setting up the development/configuration of each system. The detailed design of the unit and systems is documented. The verification, validation, testing and integration strategies, and major interfaces, are established, and RAMS analyses are carried out. The ISDS process is normally aligned with the overall building schedule so that steel cutting takes place towards the end of the ISDS engineering phase. Normally, only one phase B activity takes place before the contracts between the system integrator and the suppliers are signed:
- Submit proposals to system integrator with compliance status (B.REQ.1, CL1 and above).
A 400 Construction phase (C)
401 In this phase the construction and integration of the systems are carried out. Detailed software design, coding and/or parameterization are performed. Systems and interfaces are tested, validated and verified as part of this phase.

A 500 Acceptance phase (D)
501 In this phase, the functionality, performance, RAMS requirements and other aspects of the integrated systems are validated through commissioning activities, integration testing, and sea trials.

A 600 Operation phase (E)
601 In this phase the unit is in operation. Maintenance and small upgrades are performed as deemed necessary by the owner.
602 Large upgrades should be managed as separate projects, following a distinct life cycle based on this standard. Any planned upgrade resulting in the shutdown of the unit or ship for an extended period of time should be regarded as a large upgrade.

B. Process Areas

B 100 Introduction
101 Activities are logically grouped in process areas based on the typical engineering discipline they address. Each process area spans multiple project phases.

B 200 Requirements engineering (REQ)
201 The requirements engineering process area covers the activities needed to define, document and manage the requirements on the unit, the systems and the related software.
On CL1, the overall goal and vision for the unit are defined and the requirements on the unit and relevant systems are specified. A dialogue between the owner/system integrator on one hand and the potential suppliers on the other is expected to take place in order to align the requirements with the suppliers' systems. The allocation of requirements to systems shall be documented. No specific methods or formats for the unit or system requirements are expected.
On CL2, operational modes and scenarios are defined in order to put the requirements into an operational context, and to detail the interaction between different systems.
A trace from requirements to design and verification shall be documented and maintained.
CL3 introduces additional independent verifier activities.

B 300 Design (DES)
301 The design process area consists of activities to establish a design on different levels. Together with the interface related activities in the integration process area (INT), this creates the design basis from which the systems and related software can be produced and verified.
On CL1, each system is designed with identification of major sub-systems and components. The external interfaces shall be documented and coordinated with other relevant systems.
On CL2, the unit design is documented, maintained, and analysed with a focus on integration of the systems. A strategy for handling current and future obsolescence is expected to be defined, and design guidelines to be established. The architecture of relevant systems and software components is detailed and documented, including new, reused, and parameterized software. The documentation related to any base products shall be kept up to date.
CL3 introduces additional independent verifier activities.

B 400 Implementation (IMP)
401 The implementation process area covers the coding and configuration activities needed to create and customize software modules in order to fulfil the specified design. In addition, associated support documentation shall be created.
On CL1, the software components are programmed, parameterized and tested based on a baselined design.
On CL2, implementation guidelines and methods are expected to be used as additional input to the programming and parameterization, and support documentation like user manuals is produced.
CL3 does not add any requirements.
B 500 Acquisition (ACQ)
501 The acquisition process area includes activities that apply when a supplier uses sub-suppliers to develop or deliver components/systems. It covers both the situation where configuration and development of the software components is subcontracted and the situation where the supplier buys commercial off-the-shelf (COTS) systems.
On CL1, specific contracts between the supplier and the sub-supplier are established and followed up. The components or systems are verified at delivery and it shall be ensured that the delivered component/system can be integrated into the system/unit in question.
On CL2, the COTS systems are selected based on defined criteria. Intermediate deliveries are reviewed, and acquired components or systems are monitored with regard to obsolescence during the operation phase.
CL3 does not add any requirements.

B 600 Integration (INT)
601 The integration process area covers the assembly of the systems into the unit, and activities to coordinate the interfaces between the systems.
On CL1, the responsibilities regarding each system and how it is to be integrated are defined.
On CL2, a specific integration plan is produced. Inter-system interfaces are coordinated and systems are checked against pre-defined criteria before the integration takes place.
CL3 introduces additional independent verifier activities.

B 700 Verification and Validation (VV)
701 The verification and validation process area addresses the quality assurance activities needed to ensure that each software component, each system and the integrated systems in concert perform as expected and fulfil the needs of the intended use.
On CL1, the focus is on the individual systems. It is required that a verification strategy is defined, and that basic verification activities like FAT, peer reviews, and qualification of reused software are prepared and performed according to this strategy.
It is also expected that the owner performs validation testing during the acceptance phase, and when modifications and upgrades are performed during the operation phase.
On CL2, the focus is on the functionality and performance of the integrated systems working together, and on early verification of the correctness and completeness of requirements, interface specifications, user interfaces, and design documentation. The verification and validation results are expected to be analysed and compared with defined targets.
On CL3, the focus is on elaborated system testing, and on testing of the system(s) by an independent party. Additional independent verifier activities are also introduced.

B 800 Reliability, Availability, Maintainability and Safety (RAMS)
801 The reliability, availability, maintainability and safety (RAMS) process area gathers the activities dealing with the definition, analysis and verification of the RAMS properties of the unit and specific systems. Security aspects are also included.
On CL1, the focus is on the safety part, meaning that applicable laws and regulations regarding safety are identified, and that software and software failures are taken into account when doing safety analysis. In the operation phase a structured way of doing maintenance is required.
On CL2, the focus is on the reliability, availability and maintainability (RAM) of the systems in question. Goals regarding RAM are established, analysed and verified, but the goals can be qualitative in nature. Risks and mitigations related to the RAMS aspects are managed. The activities related to handling of the RAMS aspects are planned and followed up during the project. Security aspects are dealt with by performing security audits.
On CL3, the focus is on RAM objectives that are explicitly defined, analysed and proven fulfilled. In order to achieve this, the RAM objectives need to be quantitative. Additional independent verifier activities are also introduced.
B 900 Project Management (PM)
901 The project management process area covers the activities required to make sure that the project plans for the different organizations involved are created, synchronized and followed up.
On CL1, basic project management activities regarding project planning and tracking are required, and there are no additional requirements at CL2 and CL3.
B 1000 Risk Management (RISK)
1001 The risk management process area covers activities related to identifying, mitigating and tracking product and project risks related to systems and software. Based on the risks, the different systems are assigned a confidence level.
On CL1, risks are identified, reviewed, tracked, and updated.
On CL2, the risk mitigation actions shall also be tracked to verify that they have the expected effect on the risk.
CL3 introduces additional independent verifier activities.

B 1100 Process and Quality Assurance (PQA)
1101 The process and quality assurance process area covers the activities needed to define and follow up the way of working within the project. It also covers the activities needed to make sure that the involved organizations fulfil the requirements in this standard.
On CL1, the applicable procedures for each organization are defined and, when necessary, coordinated with the other roles. The adherence to the defined procedures is followed up by each organization. DNV will follow up on the adherence to the requirements in this standard. There are no additional requirements at CL2 and CL3.

B 1200 Configuration Management (CM)
1201 The configuration management process area covers activities to make sure that changes to documents and software are performed in a controlled way, and to ensure the integrity and consistency of the systems, their configuration, and all related work products (requirements, design, interface specifications, source code, documentation, etc.).
On CL1, the configuration management area includes all required activities, and there are no additional requirements at CL2 and CL3.
SECTION 5 ISDS REQUIREMENTS FOR OWNERS

A. Owner requirements

A 100 Requirements under the owner's responsibility
101 Table A1 lists the requirements under the owner's responsibility. See also Table A2 for the associated acceptance criteria and Table A3 for documentation criteria.
102 The owner shall also contribute to requirements that are under the responsibility of other roles.
103 Appendix B fully specifies the requirements for all roles.

Table A1 Requirements under the owner's responsibility

Reference | Required activity | Contributor(s) | Phase | CL
A.PQA.1 | Define procedures (owner) | – | Basic engineering | CL1
A.RAMS.2 | Define RAM related requirements and objectives | – | Basic engineering | CL2
A.REQ.1 | Define mission, objectives and shared vision | – | Basic engineering | CL1
A.REQ.3 | Define operational modes and scenarios to capture expected behaviour | – | Basic engineering | CL2
A.RISK.1 | Define a strategy for risk management | System integrator | Basic engineering | CL2
A.RISK.3 | Assign confidence levels | System integrator | Basic engineering | CL1
A.VV.1 | Validate the concept of the unit with the users | System integrator | Basic engineering | CL2
B.DES.5 | Define obsolescence strategy | System integrator, Supplier | Engineering | CL2
D.VV.1 | Perform validation testing | System integrator, Supplier | Acceptance | CL1
D.VV.2 | Perform validation with operational scenarios | System integrator, Supplier | Acceptance | CL2
D.VV.3 | Analyse validation results with respect to targets | – | Acceptance | CL2
E.CM.1 | Manage change requests during operation | – | Operation | CL1
E.CM.2 | Perform configuration audits | Supplier | Operation | CL1
E.PQA.1 | Define procedures for problem resolution, change handling, and maintenance activities | Supplier | Operation | CL1
E.RAMS.1 | Maintain and execute the plan for maintenance in operation | Supplier | Operation | CL1
E.RAMS.2 | Collect RAMS data | – | Operation | CL2
E.RAMS.3 | Analyse RAMS data and address discrepancies | – | Operation | CL2
E.RAMS.4 | Perform RAMS impact analysis of changes | – | Operation | CL2
E.RAMS.5 | Periodically perform security audits of the systems in operation | – | Operation | CL2
E.VV.1 | Perform validation testing after changes in the systems in operation | Supplier | Operation | CL1
E.VV.2 | Perform validation with operational scenarios after changes in the systems in operation | Supplier | Operation | CL2
X.PQA.1 | Control procedures (owner) | – | Several | CL1
X.PQA.4 | Follow-up of ISDS assessment gaps (owner) | – | Several | CL1
A 200 Acceptance criteria for owner assessments
201 The following Table A2 lists the acceptance criteria for assessments of the owner. The following evidence shall be presented to the independent verifier during assessments to document that the required activities have been performed.
202 See also Table A3 for the required documentation criteria.

Table A2: Acceptance criteria for assessments of owner
Reference | Assessment criteria
A.PQA.1 | A quality system, documents, minutes of meetings, or other relevant information showing: a defined way of working for the major activities in the project, clear roles and responsibilities, and defined ways of interaction between the different organisations (e.g. owner, system integrator, supplier, independent verifier, and others).
A.RAMS.2 | Listing of RAM requirements. For CL2: qualitative requirements are acceptable. For CL3: quantitative requirements (objectives) are required.
A.REQ.1 | Unit design intention and philosophy: the vision of the unit/system, descriptions of the unit's/systems' overall behaviour and the expected business/safety/environmental performance.
A.REQ.3 | Vessel specification: description of the operational modes and corresponding key operational scenarios, detailed to the level of the different systems.
A.RISK.1 | Risk management procedure. Blank risk register.
A.RISK.3 | Confidence level matrix for the relevant systems.
A.VV.1 | Unit concept presentation: simulations and minutes of the system concept review meeting. FEED study.
B.DES.5 | Obsolescence management plan: authorised vendor list, spare parts list (hardware & software), stock, alternate spare parts list, management of intellectual property. Obsolescence criteria for software. Manufacturer preferred equipment list.
D.VV.1 | Test procedure: black box tests, boundary tests, software behaviour, parameterisation and calibration. Test reports: executed consistent with procedure. Test issue list: deviations (punches) and variations.
D.VV.2 | Test procedure: operational scenarios. Test reports: tests performed in compliance with procedure and coverage of scenarios.
D.VV.3 | Test procedure: quality criteria. Test reports: analysis of the results. Test issue list.
E.CM.1 | Change requests. Impact analysis. Change orders. Work orders. Problem reports. Release notes. Maintenance logs.
E.CM.2 | Configuration audit reports.
E.PQA.1 | Configuration management plan. Configuration management procedure: migration issues and software obsolescence (ref. E.ACQ.1). Maintenance procedures: procedures for the maintenance, software update, migration and retirement, backup and restore procedures, and procedures for receiving, recording, resolving and tracking problems and modification requests. Change management procedure. Issue tracking and resolution procedure.
E.RAMS.1 | Maintenance plan: configuration items, audit activities, maintenance activities, expected software update, migration and retirement activities, maintenance intervals and tailored procedures for the maintenance in operation. Malicious software scan log files/records. Maintenance logs.
E.RAMS.2 | RAMS data collection system. RAMS data collected.
E.RAMS.3 | RAMS analysis.
E.RAMS.4 | Impact analysis showing RAMS evaluation.
E.RAMS.5 | Security audit report.
E.VV.1 | Test procedure: includes black box tests and boundary tests. Test reports: consistent with procedure.
E.VV.2 | Test procedures: covering relevant operational scenarios. Test reports: tests performed in compliance with procedure and analysis of the results.
X.PQA.1 | Proof that process adherence is being assessed: quality control records, project control records and minutes of meetings, or other relevant information.
X.PQA.4 | Corrective action plan: responsibility allocation for actions, records of actions taken and evidence of implementation of the actions.

A 300 Documentation criteria for the owner
301 The table below lists all documents to be sent to the independent verifier and in which activities the independent verifier is going to use the different documents.
302 When the independent verifier is expected to comment on the document, the word "reviewed" is employed. For documents which serve as background information to put the reviewed documents in a context, the word "used" is employed.
303 Most documents are provided for information (FI). The only document that is sent to the independent verifier for approval (AP) is the corrective action plan.

Table A3: Documents required for review
Reference | Documents
A.PQA.1 | No documentation to be submitted to DNV for review.
A.RAMS.2 | List of RAM requirements, unit (FI): reviewed in A.IV.2 and B.IV.4 at CL3; used in D.IV.3 at CL3. List of RAM requirements, system (FI): used in C.IV.3 at CL3.
A.REQ.1 | Design philosophy (FI): used in A.IV.1 at CL3.
A.REQ.3 | Vessel specification (FI): reviewed in A.IV.1 at CL3.
A.RISK.1 | No documentation to be submitted to DNV for review.
A.RISK.3 | Vessel specification (confidence levels) (FI): reviewed in A.IV.1 at CL3; used in A.IV.2 and B.IV.4 at CL3.
A.VV.1 | No documentation to be submitted to DNV for review.
B.DES.5 | No documentation to be submitted to DNV for review.
D.VV.1 | Test procedure for quay and sea trials (FI) and report from quay and sea trials (FI): reviewed in D.IV.1 at CL2 and CL3. Report from quay and sea trials (FI): used in D.IV.2 at CL3.
D.VV.2 | Test procedure (FI) and test report (FI): reviewed in D.IV.1 at CL2 and CL3. Test report (FI): used in D.IV.2 at CL3.
D.VV.3 | Verification analysis report (FI): reviewed in D.IV.2 at CL3.
E.CM.1 | No documentation to be submitted to DNV for review.
E.CM.2 | No documentation to be submitted to DNV for review.
E.PQA.1 | No documentation to be submitted to DNV for review.
E.RAMS.1 | No documentation to be submitted to DNV for review.
E.RAMS.2 | No documentation to be submitted to DNV for review.
E.RAMS.3 | No documentation to be submitted to DNV for review.
E.RAMS.4 | No documentation to be submitted to DNV for review.
E.RAMS.5 | No documentation to be submitted to DNV for review.
E.VV.1 | Test procedure (FI) and test report (FI): reviewed in E.IV.1 at CL3.
E.VV.2 | Test procedure (FI) and test report (FI): reviewed in E.IV.1 at CL3.
X.PQA.1 | No documentation to be submitted to DNV for review.
X.PQA.4 | Corrective action plan (AP): reviewed and approved in X.IV.1.
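Several of the owner's operation-phase requirements (E.RAMS.2, E.RAMS.3) concern collecting and analysing RAMS data. As an illustration only, with hypothetical figures and field names that are not taken from the standard, such an analysis could start from simple reliability figures like MTBF and availability:

```python
# Illustrative only: a minimal RAMS data analysis of the kind an owner might
# run on data collected in operation. All figures and names are hypothetical.

def mtbf_hours(operating_hours: float, failures: int) -> float:
    """Mean time between failures over the observation period."""
    return float("inf") if failures == 0 else operating_hours / failures

def availability(uptime_hours: float, downtime_hours: float) -> float:
    """Fraction of the observation period during which the system was operable."""
    return uptime_hours / (uptime_hours + downtime_hours)

# One year of collected RAMS data for a single system (hypothetical record)
record = {"operating_hours": 8000.0, "failures": 4, "downtime_hours": 60.0}

mtbf = mtbf_hours(record["operating_hours"], record["failures"])
# Modelling choice for this sketch: uptime = operating hours minus downtime
avail = availability(record["operating_hours"] - record["downtime_hours"],
                     record["downtime_hours"])
print(f"MTBF = {mtbf:.0f} h, availability = {avail:.4f}")
```

Discrepancies between such computed values and the RAM objectives defined earlier in the project would then be addressed under E.RAMS.3.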
SECTION 6 ISDS REQUIREMENTS FOR SYSTEM INTEGRATORS

A. System integrator requirements

A 100 Requirements under the system integrator's responsibility
101 The following Table A1 lists the requirements under the system integrator's responsibility. See also Table A2 for the associated acceptance criteria and Table A3 for documentation criteria.
102 The system integrator shall also contribute to requirements that are under the responsibility of other roles.
103 Appendix B fully specifies the requirements for all roles.

Table A1: Requirements under system integrator's responsibility
Reference | Required activity | Contributor(s) | Phase | CL
A.CM.1 | Establish a baseline of requirements for the unit | Owner | Basic engineering | CL1
A.DES.1 | Establish the unit design | Owner | Basic engineering | CL2
A.PM.1 | Establish the master plan | Owner | Basic engineering | CL1
A.PQA.2 | Define procedures (system integrator) | - | Basic engineering | CL1
A.RAMS.1 | Determine safety rules, standards and laws applicable | Owner | Basic engineering | CL1
A.RAMS.3 | Develop the RAMS plan for the unit | Owner | Basic engineering | CL2
A.REQ.2 | Collect requirements for the unit and systems | Owner | Basic engineering | CL1
A.REQ.4 | Allocate functions and requirements to systems | Owner | Basic engineering | CL1
A.REQ.5 | Consult potential suppliers for acquiring of systems | Owner | Basic engineering | CL1
A.REQ.6 | Establish traceability of requirements | - | Basic engineering | CL2
A.RISK.2 | Jointly identify risks | Owner | Basic engineering | CL1
A.VV.2 | Verify the unit and system requirements | - | Basic engineering | CL2
B.CM.1 | Establish baselines of requirements and design | Owner | Engineering | CL1
B.CM.2 | Establish and implement configuration management | Owner | Engineering | CL1
B.DES.3 | Use established design guidelines and methods | - | Engineering | CL2
B.DES.4 | Analyse and refine the unit design | Owner, Supplier | Engineering | CL2
B.INT.1 | Define integration plan | Supplier | Engineering | CL2
B.INT.2 | Coordinate inter-system interfaces | Supplier | Engineering | CL2
B.PM.1 | Establish the project plan for each organisation | Owner | Engineering | CL1
B.PM.2 | Coordinate and integrate the project plans with the master plan | Owner, Supplier | Engineering | CL1
B.RAMS.1 | Identify software-related RAMS risks and priorities | Owner | Engineering | CL2
B.RAMS.2 | Identify RAMS risk mitigation actions | - | Engineering | CL2
B.VV.1 | Define verification and validation strategy | Owner | Engineering | CL1
B.VV.2 | Review the design with respect to requirements and design rules | Owner | Engineering | CL2
B.VV.3 | Review consistency between design and operational scenarios | - | Engineering | CL2
B.VV.4 | Review interface specifications | - | Engineering | CL2
B.VV.5 | Validate critical or novel user-system interactions | Owner, Supplier | Engineering | CL2
C.INT.1 | Check readiness status of systems and components before integration | - | Construction | CL2
C.PQA.1 | Establish procedures for problem resolution and maintenance activities in the construction and acceptance phases | Supplier | Construction | CL1
C.VV.9 | Arrange independent testing | Supplier | Construction | CL3
D.CM.1 | Manage software changes during commissioning | Owner, Supplier | Acceptance | CL1
D.CM.2 | Establish a release note for the systems in ISDS scope | - | Acceptance | CL1
D.CM.3 | Transfer responsibility for system configuration management to owner | Owner, Supplier | Acceptance | CL1
D.RAMS.1 | Demonstrate achievement of unit RAMS requirements | Supplier | Acceptance | CL2
D.RAMS.2 | Collect data and calculate RAM values | Supplier | Acceptance | CL3
D.RAMS.3 | Perform a security audit on the deployed systems | - | Acceptance | CL2
D.VV.4 | Perform systems integration tests | Supplier | Acceptance | CL2
X.CM.1 | Track and control changes to the baselines | - | Several | CL1
X.PM.1 | Monitor project status against plan | Owner, Supplier | Several | CL1
X.PM.2 | Perform joint project milestone reviews | Owner, Supplier | Several | CL1
X.PQA.2 | Control procedures (system integrator) | - | Several | CL1
X.PQA.5 | Follow-up of ISDS assessment gaps (system integrator) | - | Several | CL1
X.REQ.1 | Maintain requirements traceability information | - | Several | CL2
X.RISK.1 | Track, review and update risks | Owner, Supplier | Several | CL1
X.RISK.2 | Decide, implement and track risk mitigation actions to closure | Owner | Several | CL2
X.VV.2 | Detail procedures for testing | - | Several | CL1

A 200 Acceptance criteria for system integrator assessments
201 The following Table A2 lists the acceptance criteria for assessments of the system integrator. The following evidence shall be presented to the independent verifier during assessments to document that the required activities have been performed.
202 See also Table A3 for the required documentation criteria.
Table A2: Acceptance criteria for assessments of system integrator
Reference | Assessment criteria
A.CM.1 | Approved and controlled unit requirements document. Revision history of unit requirements document.
A.DES.1 | Unit design: unit design specifications, systems/network topology and functional descriptions.
A.PM.1 | Master plan: activities, work breakdown structure (WBS), schedule and milestones.
A.PQA.2 | A quality system, documents, minutes of meetings, or other relevant information showing: a defined way of working for the major activities in the project, clear roles and responsibilities, and defined ways of interaction between the different organisations (e.g. owner, system integrator, supplier, independent verifier, and others).
A.RAMS.1 | Listing of regulatory requirements that apply regarding safety. Resolution of conflicting rules. Application guidelines.
A.RAMS.3 | Plan(s) showing the methods, tools and procedures to be used for RAMS activities. Schedule of RAMS activities. Expectations on the suppliers' RAMS plans. RAM data to be collected (CL3).
A.REQ.2 | Vessel specification: operational requirements, functional requirements, non-functional requirements and technical constraints.
A.REQ.4 | Design specification (or requirements) for the relevant systems.
A.REQ.5 | System request for proposal (RfP): functional specifications, generic system requirements and obsolescence information. Requirements compliance information (on CL2 and above).
A.REQ.6 | Traceability information between requirements on unit level and requirements on the different systems. Defined mechanisms and ambition level regarding requirements traceability.
A.RISK.2 | Project risk list: risk list with risks related to e.g. requirements, schedule, effort, quality, performance, consistency and obsolescence (for both hardware and software).
A.VV.2 | Review records of the unit requirements. Review records for the system requirements.
B.CM.1 | Baseline repositories. Identification of baselines. Approved and controlled documents (baselines) for: unit specifications, unit design, system requirements, system design, interface specifications and base products.
B.CM.2 | Configuration management plan: definition of a Change Control Board (CCB) process or similar, identification of required baselines, required baseline content, change request forms. Change requests and change decisions. Version history information of baselines. Defined rules and mechanisms for version control. Effective implementation of version control mechanisms.
B.DES.3 | System design guidelines, including RAMS related aspects. Unit design guidelines, including RAMS related aspects.
B.DES.4 | Updated unit design documentation: unit design specifications, systems/network topology with software components, interface specifications, and functional descriptions.
B.INT.1 | Plan for integration of systems into unit: the responsibilities of the different organisations, dependencies among systems, sequence for integration, integration environment, tests and integration readiness criteria. Plan for integration of sub-systems and components into systems (when required): dependencies among systems, sub-systems and components, sequence for integration, integration environment, tests and integration readiness criteria.
B.INT.2 | Interface overview/matrix information with assigned responsibilities. Agreed inter-system interface specifications containing: protocol selected, definition of commands, messages, data and alarms to be communicated, and specifications of message formats. Interface definition and verification status. Schedule.
B.PM.1 | Project plan: WBS, technical attributes used for estimating, effort and cost estimates, deliverables and milestones, configuration management plan. Resource allocation.
B.PM.2 | Master plan. Project plans.
B.RAMS.1 | RAMS hazard and risk list showing consideration of software risks. Defined risk identification and analysis methods. Relevant risks are communicated to other roles.
B.RAMS.2 | RAMS hazard and risk mitigation list showing mitigation actions for software risks. Relevant mitigation actions are communicated to other roles.
B.VV.1 | Verification strategy: which parts to verify (unit, system, sub-system, component, module, design documents, method specification documents, etc.); which methods to use for this verification (testing, inspection, code analysis, simulation, prototyping, peer review techniques, quality criteria and targets); which test types to use (functional, performance, regression, user interface, negative); what environment to use for verification; identification of the test stages (e.g. sea trials, integration tests, commissioning, FAT, internal testing, component testing) to be used for the verification; and the schedule for those tests. Validation strategy: products to be validated, validation criteria, operational scenarios, methods and environments.
B.VV.2 | Documented design review records addressing: requirements verification, design rules and verification of uncertainties.
B.VV.3 | Minutes from review: review results considering consistency of interface/function/component/scenarios.
B.VV.4 | Interface specification reviews addressing at least: consistency between input and output signals, frequency and scan rates, deadlocks, propagation of failures from one part to another, engineering units, network domination.
B.VV.5 | Validation records including: workshop minutes, user representative's participation and comments, and agreed action lists.
C.INT.1 | Integration readiness criteria fulfilled per component and per system.
C.PQA.1 | Agreed maintenance procedures: procedures for general system maintenance activities and procedures for software update, backup and roll-back. Agreed problem resolution procedures: procedures for receiving, recording, resolving and tracking problems (punches) and modification requests.
C.VV.9 | Test procedure: covering the system and its interfaces. Test report.
D.CM.1 | Defined software configuration management: definition of Change Control Board (CCB), change request forms, description of change process for software, impact analysis, identification of items to be controlled, configuration management tool (including issue, change, version and configuration tracking) that prevents unauthorised changes. Modification records justifying changes: configuration records, version histories, release notes, change orders.
D.CM.2 | Overall release note for the systems in ISDS scope.
D.CM.3 | Approved configuration management plan. Records of transmission of software, documentation and data, or responsibility thereof.
D.RAMS.1 | RAMS compliance analysis information.
D.RAMS.2 | Calculations of RAM values for relevant systems and the unit. RAM data.
D.RAMS.3 | Security audit records.
D.VV.4 | Integration test procedures covering system interfaces and inter-system functionality. Integration test reports.
X.CM.1 | Change requests/orders. Version histories for baselines. Changes to: unit requirements, unit design, system requirements, system design, software design, interface specifications and software. Configuration records from document or software repositories.
X.PM.1 | Master schedule. Master plan (updated). Project status report. Project action list. Minutes of review meetings. Progress report.
X.PM.2 | Minutes of joint milestone meetings. ISDS compliance status. Action plans.
X.PQA.2 | Proof that process adherence is being assessed: quality control records, project control records and minutes of meetings, or other relevant information.
X.PQA.5 | Corrective action plan: responsibility allocation for actions, records of actions taken and evidence of implementation of the actions.
X.REQ.1 | Up-to-date traceability information: from owner to system requirements; from system requirements to functional specifications (where applicable); from system requirements to base product and configuration data (where applicable); from functional specifications to sub-system/component specifications; and from requirements to test procedures (when the test procedures are available). Completeness and consistency review records of the traceability information.
X.RISK.1 | Project risk management plan. Updated internal risk register (per organisation). Updated project risk register (jointly managed).
X.RISK.2 | Updated internal risk register: risk list, mitigation actions and follow-up records (per organisation). Updated project risk register: risk list, mitigation actions and follow-up records (jointly managed).
X.VV.2 | Existence of relevant test procedures.
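The completeness review of traceability information expected under X.REQ.1 can be illustrated with a small sketch. This is not part of the standard; the identifiers and link structure are invented for illustration. The idea is simply that every upstream item must trace to at least one downstream item.

```python
# Hypothetical sketch of a completeness check over requirements traceability
# information: unit requirements allocated to system requirements, and system
# requirements covered by test procedures. All identifiers are invented.

unit_to_system = {          # unit requirement -> allocated system requirements
    "U-001": ["S-101", "S-102"],
    "U-002": ["S-103"],
    "U-003": [],            # gap: not allocated to any system
}
system_to_test = {          # system requirement -> covering test procedures
    "S-101": ["TP-01"],
    "S-102": ["TP-01", "TP-02"],
    "S-103": [],            # gap: no test procedure yet
}

def trace_gaps(trace: dict) -> list:
    """Return the upstream items that have no downstream trace links."""
    return sorted(item for item, targets in trace.items() if not targets)

print("unallocated unit requirements:", trace_gaps(unit_to_system))
print("untested system requirements:", trace_gaps(system_to_test))
```

A non-empty gap list is exactly the kind of finding a completeness and consistency review of the traceability information would record.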
A 300 Documentation criteria for the system integrator
301 The table below lists all documents to be sent to the independent verifier and in which activities the independent verifier is going to use the different documents.
302 When the independent verifier is expected to comment on the document, the word "reviewed" is employed. For documents which serve as background information to put the reviewed documents in a context, the word "used" is employed.
303 Most documents are provided for information (FI). The only document that is sent to the independent verifier for approval (AP) is the corrective action plan.

Table A3: Documents required for review
Reference | Documents
A.CM.1 | No documentation to be submitted to DNV for review.
A.DES.1 | No documentation to be submitted to DNV for review.
A.PM.1 | No documentation to be submitted to DNV for review.
A.PQA.2 | No documentation to be submitted to DNV for review.
A.RAMS.1 | List of regulatory requirements, unit (FI): reviewed in A.IV.2 at CL3; used in B.IV.4 and D.IV.3 at CL3. List of regulatory requirements, system (FI): used in C.IV.3 at CL3.
A.RAMS.3 | Plan for handling of RAMS (FI): reviewed in A.IV.2 at CL3; used in B.IV.4 at CL3.
A.REQ.2 | Vessel specification (FI): reviewed in A.IV.1 at CL3.
A.REQ.4 | Specification (FI): reviewed in A.IV.1 at CL3.
A.REQ.5 | No documentation to be submitted to DNV for review.
A.REQ.6 | Traceability matrices (FI): used in A.IV.1 at CL3.
A.RISK.2 | No documentation to be submitted to DNV for review.
A.VV.2 | No documentation to be submitted to DNV for review.
B.CM.1 | No documentation to be submitted to DNV for review.
B.CM.2 | No documentation to be submitted to DNV for review.
B.DES.3 | RAMS design guidelines and methods for the vessel (FI): used in B.IV.1 at CL3. RAMS design guidelines and methods for the system (FI): used in B.IV.1 at CL3.
B.DES.4 | Interface description (FI): reviewed in B.IV.1 at CL3. Functional description (FI): reviewed in B.IV.1 at CL3. Block (topology) diagram (FI): reviewed in B.IV.1 at CL3; used in B.IV.2 at CL2 and CL3.
B.INT.1 | Integration plan (FI): reviewed in B.IV.2 at CL2 and CL3; used in C.IV.1 at CL3.
B.INT.2 | Interface description (FI): used in B.IV.2 at CL2 and CL3.
B.PM.1 | No documentation to be submitted to DNV for review.
B.PM.2 | No documentation to be submitted to DNV for review.
B.RAMS.1 | RAMS risk register (FI) and RAMS risk analysis documentation (FI): reviewed in B.IV.3 at CL3.
B.RAMS.2 | RAMS risk register (FI): reviewed in B.IV.3 at CL3; used in C.IV.3 and D.IV.3 at CL3.
B.VV.1 | Verification and validation strategy (FI): reviewed in B.IV.2 at CL2 and CL3; used in C.IV.1 at CL3, and in C.IV.2 and D.IV.1 at CL2 and CL3.
B.VV.2 | No documentation to be submitted to DNV for review.
B.VV.3 | No documentation to be submitted to DNV for review.
B.VV.4 | No documentation to be submitted to DNV for review.
B.VV.5 | No documentation to be submitted to DNV for review.
C.INT.1 | No documentation to be submitted to DNV for review.
C.PQA.1 | No documentation to be submitted to DNV for review.
C.VV.9 | Independent test procedure (FI) and independent test report (FI): reviewed in D.IV.1 at CL3. Independent test report (FI): used in D.IV.2 at CL3.
D.CM.1 | No documentation to be submitted to DNV for review.
D.CM.2 | No documentation to be submitted to DNV for review.
D.CM.3 | No documentation to be submitted to DNV for review.
D.RAMS.1 | RAMS compliance report (FI): reviewed in D.IV.3 at CL3.
D.RAMS.2 | RAM report, unit (FI): reviewed in D.IV.3. RAM report, system (FI): used in D.IV.3.
D.RAMS.3 | Security audit report (FI): used in D.IV.3 at CL3.
D.VV.4 | Test procedure (FI) and test report (FI): reviewed in D.IV.1 at CL2 and CL3. Test report (FI): used in D.IV.2 at CL3.
X.CM.1 | No documentation to be submitted to DNV for review.
X.PM.1 | No documentation to be submitted to DNV for review.
X.PM.2 | No documentation to be submitted to DNV for review.
X.PQA.2 | No documentation to be submitted to DNV for review.
X.PQA.5 | Corrective action plan (AP): reviewed and approved in X.IV.1.
X.REQ.1 | Traceability matrices (FI): used in B.IV.1 and C.IV.1 at CL3, and in C.IV.2 at CL2 and CL3.
X.RISK.1 | No documentation to be submitted to DNV for review.
X.RISK.2 | No documentation to be submitted to DNV for review.
X.VV.2 | No documentation to be submitted to DNV for review.
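Requirement D.CM.2 obliges the system integrator to establish an overall release note for the systems in ISDS scope. A minimal sketch of such a record is shown below; it is illustrative only, and the system and software names are invented:

```python
# Illustrative only: assembling an overall release note for the systems in
# ISDS scope at handover. System and software names are hypothetical.

systems = [
    {"system": "Power management", "software": "PMS-ctrl", "version": "3.4.1"},
    {"system": "Dynamic positioning", "software": "DP-core", "version": "7.0.2"},
]

def release_note(entries: list) -> str:
    """Render one line per system, recording the software version delivered."""
    lines = ["RELEASE NOTE - systems in ISDS scope"]
    for e in entries:
        lines.append(f"  {e['system']}: {e['software']} v{e['version']}")
    return "\n".join(lines)

print(release_note(systems))
```

Whatever its form, the point is a single controlled record of the software versions actually deployed, which configuration audits in operation can later be checked against.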
SECTION 7 ISDS REQUIREMENTS FOR SUPPLIERS

A. Supplier requirements

A 100 Requirements under the supplier's responsibility
101 The following Table A1 lists the requirements under the supplier's responsibility. See also Table A2 for the associated acceptance criteria and Table A3 for documentation criteria.
102 The supplier shall also contribute to requirements that are under the responsibility of other roles.
103 Appendix B fully specifies the requirements for all roles.

Table A1: Requirements under supplier's responsibility
Reference | Required activity | Contributor(s) | Phase | CL
B.ACQ.1 | Select COTS products based on defined criteria | - | Engineering | CL2
B.ACQ.2 | Establish contract with sub-suppliers | - | Engineering | CL1
B.CM.1 | Establish baselines of requirements and design | Owner | Engineering | CL1
B.CM.2 | Establish and implement configuration management | Owner | Engineering | CL1
B.DES.1 | Design the system | System integrator | Engineering | CL1
B.DES.2 | Design each software component | - | Engineering | CL2
B.DES.3 | Use established design guidelines and methods | - | Engineering | CL2
B.DES.5 | Define obsolescence strategy | System integrator, Supplier | Engineering | CL2
B.INT.1 | Define integration plan | Supplier | Engineering | CL2
B.PM.1 | Establish the project plan for each organisation | Owner | Engineering | CL1
B.PQA.1 | Define procedures (supplier) | - | Engineering | CL1
B.RAMS.1 | Identify software-related RAMS risks and priorities | Owner | Engineering | CL2
B.RAMS.2 | Identify RAMS risk mitigation actions | - | Engineering | CL2
B.RAMS.3 | Consider software failure modes in safety analysis activities | - | Engineering | CL1
B.RAMS.4 | Develop the RAMS plan for the system | System integrator | Engineering | CL2
B.REQ.1 | Submit proposals to system integrator with compliance status | - | Engineering | CL1
B.REQ.2 | Refine system requirements into software component requirements | - | Engineering | CL2
B.REQ.3 | Detail operational scenarios | - | Engineering | CL2
B.VV.1 | Define verification and validation strategy | Owner | Engineering | CL1
B.VV.2 | Review the design with respect to requirements and design rules | Owner | Engineering | CL2
B.VV.3 | Review consistency between design and operational scenarios | - | Engineering | CL2
B.VV.4 | Review interface specifications | - | Engineering | CL2
C.ACQ.1 | Accept deliverables | - | Construction | CL1
C.ACQ.2 | Ensure transition and integration of the delivered product | - | Construction | CL1
C.IMP.1 | Develop and configure the software components from design | - | Construction | CL1
C.IMP.2 | Develop support documentation | Owner, System integrator | Construction | CL2
C.IMP.3 | Perform software component testing | - | Construction | CL1
C.IMP.4 | Use established software implementation guidelines and methods | - | Construction | CL2
C.INT.1 | Check readiness status of systems and components before integration | - | Construction | CL2
C.RAMS.1 | Demonstrate achievement of system RAMS requirements | - | Construction | CL2
C.RAMS.2 | Evaluate software systems and software components against RAM objectives | - | Construction | CL3
C.RAMS.3 | Prepare a plan for system maintenance during operation | Owner | Construction | CL1
C.VV.1 | Perform peer-reviews of software | - | Construction | CL1
C.VV.2 | Review software parameterisation data | System integrator | Construction | CL1
C.VV.3 | Perform internal testing | System integrator | Construction | CL2
C.VV.4 | Perform high integrity internal testing | System integrator | Construction | CL3
C.VV.5 | Perform code analysis on new and modified software | - | Construction | CL2
C.VV.6 | Analyse verification results with respect to targets | System integrator | Construction | CL2
C.VV.7 | Qualify reused software | - | Construction | CL1
C.VV.8 | Perform Factory Acceptance Tests (FAT) | Owner, System integrator | Construction | CL1
E.ACQ.1 | Manage and monitor obsolescence | - | Operation | CL2
E.CM.1 | Manage change requests during operation | - | Operation | CL1
E.RAMS.3 | Analyse RAMS data and address discrepancies | - | Operation | CL2
E.RAMS.4 | Perform RAMS impact analysis of changes | - | Operation | CL2
X.ACQ.1 | Monitor contract execution and changes | - | Several | CL1
X.ACQ.2 | Review intermediate deliverables | - | Several | CL2
X.CM.1 | Track and control changes to the baselines | - | Several | CL1
X.CM.2 | Establish a release note for the delivered system | - | Several | CL1
X.DES.1 | Update the base-product design documentation | - | Several | CL2
X.PM.1 | Monitor project status against plan | Owner, Supplier | Several | CL1
X.PQA.3 | Control procedures (supplier) | - | Several | CL1
X.PQA.6 | Follow-up of ISDS assessment gaps (supplier) | - | Several | CL1
X.REQ.1 | Maintain requirements traceability information | - | Several | CL2
X.RISK.1 | Track, review and update risks | Owner, Supplier | Several | CL1
X.RISK.2 | Decide, implement and track risk mitigation actions to closure | Owner | Several | CL2
X.VV.1 | Perform verification and validation on added and modified software components | Owner | Several | CL1
X.VV.2 | Detail procedures for testing | - | Several | CL1

A 200 Acceptance criteria for supplier assessments
201 The following Table A2 lists the acceptance criteria for assessments of the supplier.
The following evidence shall be presented to the independent verifier during assessments to document that the required activities have been performed. 202 See also Table A3 for the required documentation criteria. Table A2: Acceptance criteria for assessments of supplier Reference Assessment criteria COTS product selection procedure: obsolescence management. B.ACQ.1 COTS product selection matrix: rationale for selection, selection criteria, evaluations and selection. Supplier agreement: product or component specifications, functional specifications, technical acceptance B.ACQ.2 criteria, ownership transfer conditions, delivery strategy, provisions for review of intermediate deliveries. Baseline repositories. Identification of baselines. B.CM.1 Approved and controlled documents (baselines) for: unit specifications, unit design, system requirements, system design, interface specifications and base products. Configuration management plan: Definition of a Change Control Board (CCB) process or similar, identification of required baselines, required baseline content, change request forms. Change requests and change decisions. B.CM.2 Version history information of baselines. Defined rules and mechanisms for version control. Effective implementation of version control mechanisms. Design for system (hardware & software): functional description, user interface descriptions, block/ B.DES.1 topology diagrams with software components, external interface descriptions and internal interface descriptions. Component design for each software component, in sufficient detail so as to proceed to the making of the software: structural description, functional description, behaviour description, parameters (default, B.DES.2 intervals, as-designed), interfaces description, allocation of software to hardware and assumptions and known limitations of the design.
Ch.2 Sec.7 Page 30
Table A2: Acceptance criteria for assessments of supplier (Continued)

B.DES.3: System design guidelines: including RAMS related aspects. Unit design guidelines: including RAMS related aspects.
B.DES.5: Obsolescence management plan: authorised vendor list, spare parts list (hardware & software), stock, alternate spare parts list, management of intellectual property. Obsolescence criteria for software. Manufacturer preferred equipment list.
B.INT.1: Plan for integration of systems into unit: the responsibilities of the different organizations, dependencies among systems, sequence for integration, integration environment, tests and integration readiness criteria. Plan for integration of sub-systems and components into systems (when required): dependencies among systems, sub-systems and components, sequence for integration, integration environment, tests and integration readiness criteria.
B.PM.1: Schedule. Project plan: WBS, technical attributes used for estimating, effort and costs estimates, deliverables and milestones, configuration management plan. Resource allocation.
B.PQA.1: A quality system, documents, minutes of meetings, or other relevant information showing: a defined way of working for the major activities in the project, clear roles and responsibilities, and defined ways of interaction between the different organizations (e.g. owner, system integrator, supplier, independent verifier, and others).
B.RAMS.1: RAMS hazard and risk list showing consideration of software risks. Defined risk identification and analysis methods. Relevant risks are communicated to other roles.
B.RAMS.2: RAMS hazard and risk mitigation list showing mitigation actions for software risks. Relevant mitigation actions are communicated to other roles.
B.RAMS.3: Safety analysis showing consideration of software failure modes.
B.RAMS.4: Plan showing objectives, methods, tools, and procedures to be used, consistent with the RAMS plan for the unit. Schedule of RAMS activities. RAM data to be collected (CL3).
B.REQ.1: Submitted technical proposal for the system: system breakdown, alternatives and options, description of customisation or parameterisation of existing products (including software), requirements compliance matrix and software lifecycle information (including licensing, ownership and obsolescence).
B.REQ.2: Refined component requirements and specification. Requirement allocation matrix.
B.REQ.3: System/component behaviour and interaction specification and descriptions: use cases, sequences (including signal usage), state diagrams, interlocks, degraded sequences, performance targets, and constraints and limitations.
B.VV.1: Verification strategy: which part to verify (unit, system, sub-system, component, module, design documents, method specification documents, etc.); which methods to use for this verification (testing, inspection, code analysis, simulation, prototyping, peer review techniques, quality criteria and targets); which test types to use (functional, performance, regression, user interface, negative); what environment to use for verification; and identification of the test stages (e.g. sea trials, integration tests, commissioning, FAT, internal testing, component testing) to be used for the verification and the schedule for those tests.
B.VV.2: Validation strategy: products to be validated, validation criteria, operational scenarios, methods and environments.
B.VV.3: Documented design review records addressing: requirements verification, design rules and verification of uncertainties. Minutes from review: review results considering consistency of interface/function/component/scenarios.
B.VV.4: Interface specification reviews addressing at least: consistency between input and output signals, frequency and scan rates, deadlocks, propagation of failures from one part to another, engineering units, network domination.
C.ACQ.1: Component acceptance data: acceptance criteria, component acceptance (FAT, SAT) test procedures, component acceptance test records, component acceptance issue and problems list, and component acceptance coverage measurements (requirements, structural).
C.ACQ.2: Supplier agreement on: list of deliverables, review and approval plans, and support and maintenance agreement. Product documentation. Operation manual. Configuration information.
C.IMP.1: Developed component release note. Commented software source code. Parameters and configuration files. I/O list. Development environment configuration.
C.IMP.2: System and component support documentation: data sheets, user manuals, administration manuals, operating and maintenance procedures, training material and FAQs, known defects and troubleshooting guides. Review records for the support documentation.
C.IMP.3: Software test log: list of defects, date of test, tester, test scope and pass or fail. Software defect list.
C.IMP.4: Software guidelines/standards/rules/checklists/automated checks. Review records.
C.INT.1: Integration readiness criteria fulfilled per component and per system.
C.RAMS.1: RAMS compliance analysis information.
C.RAMS.2: RAM report: calculations of RAM values for designated systems and RAM data.
C.RAMS.3: Maintenance management plan: configuration items, rules for operation/maintenance, backup and restore procedures, expected maintenance activities, expected software update, migration and retirement activities, schedules, and tailored procedures for maintenance in operation.
C.VV.1: Peer review methodology description. Peer review schedule. Peer review records. Peer review check lists.
C.VV.2: Parameter list review report: name, value, tolerance, function.
C.VV.3: Test procedures. Test reports.
C.VV.4: Test procedures. Test reports.
C.VV.5: Software code verification: peer review reports, code analysis reports and code rule set.
C.VV.6: Verification result evaluation: result analyses, punch lists, action lists, defect correction and focus on defect prone software.
C.VV.7: Software qualification report: reused software component list, qualification method for each reused software component and qualification data.
C.VV.8: System FAT procedure: coverage of requirements, functionality, performance, RAMS (when applicable), integration testing, hardware/software integration, interfaces and degraded modes. System FAT report: consistent with procedure, deviations identified and coverage measured.
E.ACQ.1: Obsolescence strategy document. Obsolescence management plan: authorised vendor list, spare parts list (HW & compatible SW), alternate spare parts list, and management of intellectual property.
E.CM.1: Change requests. Impact analysis. Change orders. Work orders. Problem reports. Release notes. Maintenance logs.
E.RAMS.3: RAMS analysis.
E.RAMS.4: Impact analysis showing RAMS evaluation.
X.ACQ.1: Sub-supplier progress review schedule. Sub-supplier progress review reports. Sub-supplier project control records. Sub-supplier quality control records.
X.ACQ.2: Supplier agreement: list of deliverables and review and approval plans. Review records/minutes.
X.CM.1: Change requests/orders. Version histories for baselines. Changes to: unit requirements, unit design, system requirements, system design, software design, interface specifications and software.
X.CM.2: Configuration records from document or software repositories. Component release note: including list of changes to previous version of component.
X.DES.1: Base product design description. Revision information for updated base-product components.
X.PM.1: Master schedule. Master plan (updated). Project status report. Project action list. Minutes of review meetings. Progress report.
X.PQA.3: Proof that process adherence is being assessed: quality control records, project control records and minutes of meetings, or other relevant information.
X.PQA.6: Corrective action plan: responsibility allocation for actions, records of actions taken, and evidence of implementation of the actions.
X.REQ.1: Up to date traceability information: from owner to system requirements; from system requirements to functional specifications (where applicable); from system requirements to base-product and configuration data (where applicable); from functional specifications to sub-system/component specifications; and from requirements to test procedures (when the test procedures are available). Completeness and consistency review records of the traceability information.
X.RISK.1: Project risk management plan. Updated internal risk register (per organization). Updated project risk register (jointly managed).
X.RISK.2: Updated internal risk register: risk list, mitigation actions and follow-up records (per organization). Updated project risk register: risk list, mitigation actions and follow-up records (jointly managed).
X.VV.1: Test procedure: consistent with change or upgrade scope. Test report: consistent with test procedure.
X.VV.2: Existence of relevant test procedures.

A 300 Documentation criteria for the supplier
301 The table below lists all documents to be sent to the independent verifier, and in which activities the independent verifier is going to use the different documents.
302 When the independent verifier is expected to comment on the document, the word "reviewed" is employed. For documents which serve as background information to put the reviewed documents in a context, the word "used" is employed.
303 Most documents are provided for information (FI). The only document that is sent to the independent verifier for approval (AP) is the corrective action plan.

Table A3: Documents required for review
Reference: Documents

B.ACQ.1: No documentation to be submitted to DNV for review.
B.ACQ.2: No documentation to be submitted to DNV for review.
B.CM.1: No documentation to be submitted to DNV for review.
B.CM.2: No documentation to be submitted to DNV for review.
B.DES.1: Interface description (FI), Functional description (FI) and Block (topology) diagram (FI) reviewed in B.IV.1 at CL3.
B.DES.2: Software design description (FI) reviewed in B.IV.1 at CL3.
B.DES.3: RAMS design guidelines and methods for the vessel (FI) used in B.IV.1 at CL3. RAMS design guidelines and methods for the system (FI) used in B.IV.1 at CL3.
B.DES.5: No documentation to be submitted to DNV for review.
B.INT.1: Integration plan (FI): reviewed in B.IV.2 at CL2 and CL3; used in C.IV.1 at CL3.
B.PM.1: No documentation to be submitted to DNV for review.
B.PQA.1: No documentation to be submitted to DNV for review.
B.RAMS.1: RAMS risk register (FI) and RAMS risk analysis documentation (FI) reviewed in B.IV.3 at CL3.
B.RAMS.2: RAMS risk register (FI): reviewed in B.IV.3 at CL3; used in C.IV.3 and D.IV.3 at CL3.
B.RAMS.3: Safety assessment report (FI) used in C.IV.3 at CL3.
B.RAMS.4: Plan for handling of RAMS (FI): reviewed in B.IV.4 at CL3; used in C.IV.3 at CL3.
B.REQ.1: Specification (FI) used in B.IV.1 at CL3.
B.REQ.2: Specifications (FI) used in B.IV.1 at CL3.
B.REQ.3: Specifications (FI) used in B.IV.1 at CL3.
B.VV.1: Verification and validation strategy (FI): reviewed in B.IV.2 at CL2 and CL3; used in C.IV.1 at CL3, and in C.IV.2 and D.IV.1 at CL2 and CL3.
B.VV.2: No documentation to be submitted to DNV for review.
B.VV.3: No documentation to be submitted to DNV for review.
B.VV.4: No documentation to be submitted to DNV for review.
C.ACQ.1: No documentation to be submitted to DNV for review.
C.ACQ.2: No documentation to be submitted to DNV for review.
C.IMP.1: No documentation to be submitted to DNV for review.
C.IMP.2: No documentation to be submitted to DNV for review.
C.IMP.3: No documentation to be submitted to DNV for review.
C.IMP.4: No documentation to be submitted to DNV for review.
C.INT.1: No documentation to be submitted to DNV for review.
C.RAMS.1: RAMS compliance report (FI) reviewed in C.IV.3 at CL3.
C.RAMS.2: RAM report (FI) reviewed in C.IV.3.
C.RAMS.3: No documentation to be submitted to DNV for review.
C.VV.1: Software peer review records (FI) used in C.IV.1 at CL3.
C.VV.2: No documentation to be submitted to DNV for review.
C.VV.3: No documentation to be submitted to DNV for review.
C.VV.4: Test procedure at manufacturer (FI) and Test report at manufacturer (FI) used in C.IV.1 at CL3.
C.VV.5: Software code analysis record (FI) used in C.IV.1 at CL3.
C.VV.6: Verification analysis report (FI) reviewed in C.IV.1 at CL3.
C.VV.7: No documentation to be submitted to DNV for review.
C.VV.8: System FAT procedure (FI) and System FAT report (FI): reviewed in C.IV.2 at CL2 and CL3; used in C.IV.1 at CL3.
E.ACQ.1: No documentation to be submitted to DNV for review.
E.CM.1: No documentation to be submitted to DNV for review.
E.RAMS.3: No documentation to be submitted to DNV for review.
E.RAMS.4: No documentation to be submitted to DNV for review.
X.ACQ.1: No documentation to be submitted to DNV for review.
X.ACQ.2: No documentation to be submitted to DNV for review.
X.CM.1: No documentation to be submitted to DNV for review.
X.CM.2: No documentation to be submitted to DNV for review.
X.DES.1: No documentation to be submitted to DNV for review.
X.PM.1: No documentation to be submitted to DNV for review.
X.PQA.3: No documentation to be submitted to DNV for review.
X.PQA.6: Corrective action plan (AP) reviewed and approved in X.IV.1.
X.REQ.1: Traceability matrices (FI) used in B.IV.1 and C.IV.1 at CL3, and in C.IV.2 at CL2 and CL3.
X.RISK.1: No documentation to be submitted to DNV for review.
X.RISK.2: No documentation to be submitted to DNV for review.
X.VV.1: Test procedure (FI) and Test report (FI) used in E.IV.1 at CL3.
X.VV.2: No documentation to be submitted to DNV for review.
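The traceability information required under X.REQ.1 (requirements linked through to test procedures) lends itself to a simple mechanical completeness check before the review records are produced. The following is an illustrative sketch only, not part of the standard; all identifiers and the matrix layout are hypothetical:

```python
# Illustrative sketch (not from the standard): flag requirements that are
# not covered by any test procedure in a traceability matrix (cf. X.REQ.1).
# Requirement and test-procedure IDs below are invented for the example.

def uncovered_requirements(trace_matrix, requirements):
    """Return the requirement IDs with no linked test procedure.

    trace_matrix: dict mapping requirement ID -> list of test procedure IDs
    requirements: iterable of requirement IDs that must be covered
    """
    return sorted(r for r in requirements if not trace_matrix.get(r))

# Example: REQ-002 has no linked test procedure, so a completeness
# review of the traceability information would flag it.
matrix = {"REQ-001": ["TP-10"], "REQ-002": [], "REQ-003": ["TP-11", "TP-12"]}
print(uncovered_requirements(matrix, ["REQ-001", "REQ-002", "REQ-003"]))
# -> ['REQ-002']
```

A check like this only covers the "completeness" half of X.REQ.1; consistency (e.g. stale links after requirement changes) still needs the review records the table calls for.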
SECTION 8 ISDS REQUIREMENTS FOR THE INDEPENDENT VERIFIER

A. Independent verifier requirements

A 100 Activities for which the independent verifier is responsible
101 The following table describes the activities that shall be performed by the independent verifier.
102 As part of the classification process for the ISDS notation, DNV shall take the role of independent verifier.
103 Most documents are reviewed for information (FI); only the corrective action plan is reviewed for approval (AP).
104 Document types listed in the table are further described in RP-A201, and a description of the content in an ISDS context is given in the assessment criteria for each of the referenced activities.

Table A1 Requirements under independent verifier's responsibility

A.IV.1 (CL 3) - Review technical specification
Description: The technical specification of the unit shall be reviewed to verify that CL3 systems are working together without inconsistencies. Interfaces to other systems are also addressed in order to see that there are no compromising impacts on the CL3 systems.
Document types (input): Z050-Design Philosophy (A.REQ.1); Z040-Vessel specification (A.REQ.2 & A.REQ.3); Z100-Specification (A.REQ.4); Z290-Record - Traceability matrices (A.REQ.6); Z040-Vessel specification - ISDS Confidence levels (A.RISK.3).
Acceptance criteria: CL3 systems are designed to work together without inconsistencies. Other systems do not have a compromising impact on the CL3 system(s).
Activity Ref.: A.REQ.1, A.REQ.2, A.REQ.3, A.REQ.4, A.REQ.6, A.RISK.3.
Verification output: Review comments on Z040 and Z100 document types.

Offshore Standard DNV-OS-D203, December 2012, Ch.2 Sec.8 Page 34
A.IV.2 (CL 3) - Review the RAMS approach for the unit
Description: Review of RAMS requirements, the allocation to systems and the plan for performing RAM activities, to ensure completeness and consistency.
Document types (input): I300-RAMS documentation for the vessel - List of regulatory requirements (A.RAMS.1); I300-RAMS documentation for the vessel - List of RAM requirements for the vessel (A.RAMS.2); I300-RAMS documentation for the vessel - Plan for handling of RAMS (A.RAMS.3); Z040-Vessel specification - ISDS Confidence levels (A.RISK.3).
Acceptance criteria: Completeness: a) listing of laws, standards, and rules that apply regarding safety; b) plan showing objectives, methods, tools, and procedures to be used; c) schedule of RAMS activities; d) RAM data to be collected (CL3); e) RAMS requirements; f) confidence levels. For CL2: qualitative requirements. For CL3: quantitative requirements are required.
Activity Ref.: A.RAMS.1, A.RAMS.2, A.RAMS.3, A.RISK.3.
Verification output: Review comments on all I300 input document types.
B.IV.1 (CL 3) - Review technical specification and design
Description: Review of functional design specification and software design specification agreed upon by supplier and yard against vessel specification, as well as other criteria and RAMS objectives.
Document types (input): Z100-Specification (B.REQ.1, B.REQ.2, B.REQ.3); Z060-Functional description (B.DES.4); I020-Functional description (B.DES.1); I030-Block (topology) diagram (B.DES.1, B.DES.4); I220-Interface description - Inter-system interface specification (B.DES.4); I220-Interface description - Intra-system interface specification (B.DES.1); I290-Software design description (B.DES.2); I300-RAMS documentation for the vessel - RAMS design guidelines and methods (B.DES.3); I310-RAMS documentation for the system - RAMS design guidelines and methods (B.DES.3); Z290-Record - Traceability matrices (X.REQ.1).
Acceptance criteria: Consistency of functional design specification and software design specification agreed upon by supplier and yard against vessel specification. Consistency with RAMS objectives.
Activity Ref.: B.DES.1, B.DES.2, B.DES.3, B.DES.4, B.REQ.1, B.REQ.2, B.REQ.3, X.REQ.1.
Verification output: Review comments on I020, I030, I220, I290 and Z060 document types.

B.IV.2 (CL 2) - Review integration, verification and validation strategy
Description: Review verification/validation/integration strategy against technical specification for completeness of verification intended to be achieved.
Document types (input): I140-Verification and Validation Strategy (B.VV.1); I210-Integration plan (B.INT.1); I220-Interface description - Inter-system interface specification (B.INT.2); I030-Block (topology) diagram (B.DES.4).
Acceptance criteria: Completeness of verification intended to be achieved (functions, interfaces). Completeness of validation intended to be achieved (requirements, scenarios). Completeness of integration plan with regards to software.
Activity Ref.: B.DES.4, B.INT.1, B.INT.2, B.VV.1.
Verification output: Review comments on I140 and I210 document types.
B.IV.3 (CL 3) - Review and comment on the software part of the safety analysis for critical functions in scope of the ISDS
Description: Review the software-related RAMS risks and priorities, and the resulting software-related RAMS mitigation actions.
Guidance note: Special methods for RAMS risk analysis and mitigation are used, such as HAZID, HAZOP, FMEAs and FMECAs.
Document types (input): I300-RAMS documentation for the vessel - RAMS risk register (B.RAMS.1 & B.RAMS.2); I310-RAMS documentation for the system - RAMS risk register (B.RAMS.1 & B.RAMS.2); I300-RAMS documentation for the vessel - RAMS risk analysis documentation (B.RAMS.1); I310-RAMS documentation for the system - RAMS risk analysis documentation (B.RAMS.1).
Acceptance criteria: Consistency of the scope selection for risk identification. Consistency between unit and system level risk analysis. Consideration of the software failure modes. Consideration of the hardware-induced software failure modes.
Activity Ref.: B.RAMS.1, B.RAMS.2.
Verification output: Review comments on I300 and I310 document types.

B.IV.4 (CL 3) - Review the RAMS approach for the system
Description: Review of RAMS requirements for the system, and the plan for performing RAM activities, to ensure completeness and consistency.
Document types (input): I300-RAMS documentation for the vessel - List of regulatory requirements (A.RAMS.1); I300-RAMS documentation for the vessel - Plan for handling of RAMS (A.RAMS.3); I310-RAMS documentation for the system - Listing of RAM requirements for the system (A.RAMS.2); I310-RAMS documentation for the system - Plan for handling of RAMS (B.RAMS.4); Z040-Vessel specification - ISDS Confidence levels (A.RISK.3).
Acceptance criteria: Completeness: a) listing of laws, standards, and rules that apply regarding safety; b) plan showing objectives, methods, tools, and procedures to be used; c) schedule of RAMS activities; d) RAM data to be collected (CL3); e) RAMS requirements. For CL2: qualitative requirements are good enough. For CL3: quantitative requirements are required.
Activity Ref.: A.RAMS.1, A.RAMS.2, A.RAMS.3, A.RISK.3, B.RAMS.4.
Verification output: Review comments on the I310 document type.
C.IV.1 (CL 3) - Review the verification activities in the construction phase
Description: Evaluate how completely the verification strategy has been executed in the construction phase (through the FAT). Determine whether or not all required testing, peer reviews, and other verification activities have been conducted. Confirm that any issues or problems identified during these activities are being tracked to closure.
Guidance note: Records of V&V activities (e.g. test reports, peer review reports) should be matched to configuration items. Traceability matrices should be checked to see that requirements are included in test plans or other V&V activities.
Document types (input): Z290-Record - Software peer review (C.VV.1); Z120-Test procedure at manufacturer (C.VV.4); Z130-Report from test at manufacturer (C.VV.4); Z290-Record - Software code analysis (C.VV.5); Z120-Test procedure at manufacturer - System FAT procedure (C.VV.8); Z130-Report from test at manufacturer - System FAT report (C.VV.8); Z241-Measurement report - Verification analysis report (C.VV.6); I210-Integration plan (B.INT.1); I140-Software quality plan - Verification and validation strategy (B.VV.1); Z290-Record - Traceability matrices (X.REQ.1).
Acceptance criteria: Planned verification activities have been successfully completed and problems resolved. Verification and validation is on track as expected by the strategies.
Activity Ref.: B.INT.1, B.VV.1, C.VV.1, C.VV.4, C.VV.5, C.VV.6, C.VV.8, X.REQ.1.
Verification output: Review comments on the Z241 document type.

C.IV.2 (CL 2) - Review and witness the FAT
Description: Review and contribute to the FAT procedure. Witness the FAT execution.
Document types (input): Z120-Test procedure at manufacturer - System FAT procedure (C.VV.8); Z130-Report from test at manufacturer - System FAT report (C.VV.8); I140-Software quality plan - Verification and validation strategy (B.VV.1); Z290-Record - Traceability matrices (X.REQ.1).
Acceptance criteria: Consistency between verification strategy and test procedures. Completeness of test procedures (functions, interfaces, design). Completeness of test cases. Test reports reflect the actual test results with pass/fail judgements and deviations from procedures.
Activity Ref.: B.VV.1, C.VV.8, X.REQ.1.
Verification output: Review comments on Z120 and Z130 document types.
C.IV.3 (CL 3) - Review RAMS arguments and evidence for the system
Description: The RAMS arguments and reviews shall be reviewed to ensure completeness and compared with the RAMS requirements for the system.
Document types (input): I310-RAMS documentation for the system - RAMS compliance report (C.RAMS.1); I310-RAMS documentation for the system - RAM report (C.RAMS.2); I310-RAMS documentation for the system - List of RAM requirements (A.RAMS.2); I310-RAMS documentation for the system - List of regulatory requirements (A.RAMS.1); I310-RAMS documentation for the system - RAMS risk register (B.RAMS.2); I310-RAMS documentation for the system - Safety assessment report (B.RAMS.3); I310-RAMS documentation for the system - Plan for handling of RAMS (B.RAMS.4).
Acceptance criteria: Completeness of content: a) RAM data and RAM calculations; b) RAMS compliance report. Risks to RAM and software safety are under control.
Activity Ref.: A.RAMS.1, A.RAMS.2, B.RAMS.2, B.RAMS.3, B.RAMS.4, C.RAMS.1, C.RAMS.2.
Verification output: Review comments on I310-RAMS compliance report and I310-RAM report document types.

D.IV.1 (CL 2) - Review and witness the commissioning tests
Description: Review and contribute to the commissioning tests. Witness the commissioning execution.
Document types (input): I140-Software quality plan - Verification and validation strategy (B.VV.1); Z140-Test procedure for quay and sea trial (D.VV.1, D.VV.2); Z150-Report from quay and sea trials (D.VV.1, D.VV.2); Z140-Test procedure for quay and sea trial - Integration test procedure (D.VV.4); Z150-Report from quay and sea trials - Integration test report (D.VV.4); Z140-Test procedure for quay and sea trial - Independent test procedure, if at CL3 (C.VV.9); Z150-Report from quay and sea trials - Independent test report, if at CL3 (C.VV.9).
Acceptance criteria: Consistency between verification strategy and test procedures. Completeness of test procedures (functions, interfaces, design). Completeness of test cases. Consistent with RAMS objectives. Test reports reflect the actual test results with pass/fail judgements and deviations from procedures.
Activity Ref.: B.VV.1, C.VV.9, D.VV.1, D.VV.2, D.VV.4.
Verification output: Review comments on Z140 and Z150 document types.
D.IV.2 (CL 3) - Review commissioning test results
Description: Review and analyse the results of all test activities done by the system integrator and owner.
Document types (input): Z150-Report from quay and sea trials (D.VV.1, D.VV.2); Z150-Report from quay and sea trials - Integration test report (D.VV.4); Z150-Report from quay and sea trials - Independent test report (C.VV.9); Z241-Measurements report - Verification analysis report (D.VV.3).
Acceptance criteria: Planned verification and validation activities have been successfully completed and problems resolved. Verification and validation strategies have achieved their objectives. RAMS objectives are met.
Activity Ref.: C.VV.9, D.VV.1, D.VV.2, D.VV.3, D.VV.4.
Verification output: Review comments on the Z241 document type.

D.IV.3 (CL 3) - Review RAMS arguments and evidence for the unit
Description: The RAMS arguments and reviews shall be reviewed to ensure completeness and compared with the RAMS requirements for the unit.
Document types (input): I300-RAMS documentation for the vessel - List of regulatory requirements (A.RAMS.1); I300-RAMS documentation for the vessel - List of RAM requirements (A.RAMS.2); I300-RAMS documentation for the vessel - RAMS risk register (B.RAMS.2); I300-RAMS documentation for the vessel - RAMS compliance report (D.RAMS.1); I300-RAMS documentation for the vessel - RAM report (D.RAMS.2); I310-RAMS documentation for the system - RAM report (D.RAMS.2); I300-RAMS documentation for the vessel - Security audit report (D.RAMS.3).
Acceptance criteria: Completeness of content of reports: a) calculations of RAM values for designated systems; b) RAMS compliance report. For CL2 systems: qualitative RAMS are good enough. For CL3 systems: quantitative RAMS are required. Risks to RAM and software safety are under control. Security audit has been performed.
Activity Ref.: A.RAMS.1, A.RAMS.2, B.RAMS.2, D.RAMS.1, D.RAMS.2, D.RAMS.3.
Verification output: Review comments on I300-RAMS compliance report and I300-RAM report document types.

E.IV.1 (CL 3) - Witness upgrade commissioning tests
Description: Review and contribute to the upgrade tests. Witness the test execution.
Guidance note: This activity focuses on the review of activities performed after the C.IV.1 activity.
Document types (input): Z140-Test procedure for quay and sea trial (E.VV.1, E.VV.2); Z150-Report from quay and sea trials (E.VV.1, E.VV.2); Z120-Test procedure at manufacturer (X.VV.1); Z130-Report from test at manufacturer (X.VV.1).
Acceptance criteria: Completeness of test procedures (impacted functions, impacted requirements, impacted scenarios). Consistent with RAMS objectives.
Activity Ref.: E.VV.1, E.VV.2, X.VV.1.
Verification output: Review comments on Z140 and Z150 document types.
X.IV.1 (CL 1) - Assess compliance to the ISDS standard
Description: Processes shall be assessed to determine how they comply with the applicable ISDS standard. An activity status report is issued by the independent verifier for each milestone. The corrective action plan shall be reviewed for approval. The implementation of the corrective actions shall be assessed in the subsequent phase. This activity shall be performed in phases A to D and may be performed in phase E. For the E phase, see the description of annual and renewal assessments in Ch.3 Sec.1 C.
Guidance note: Each organization is responsible for its quality assurance. The quality assurance is responsible for ensuring the ISDS practices are achieved. The purpose of the independent verifier is not to bear responsibility for the organizations to perform their activities as intended, but to verify that such a target has been achieved.
Document types (input): Q030-Corrective action plan (X.PQA.4, X.PQA.5 and X.PQA.6). In addition, relevant documentation is reviewed during the assessment.
Acceptance criteria: Compliance to ISDS activities and document requirements.
Activity Ref.: X.PQA.4, X.PQA.5, X.PQA.6.
Verification output: Review comments on, and approval of, the Q030 document type.

X.IV.2 (CL 1) - Analyse and present the ISDS assessment and IV activities results for the phase
Description: Gather the results from the various independent verification activities. Analyse the results from the independent verifier activities and analyse the associated risks. Present the analysis result and associated risks at the milestone meeting. This activity shall be performed in phases A to D.
Guidance note: The inputs to this activity are the results from all the IV activities performed in the phase.
Document types (input): All documents produced by IV activities.
Activity Ref.: IV activities.
Verification output: Milestone report. Milestone presentation.
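Table A1 assigns each independent-verifier activity a CL column value. On the assumption, suggested by the tables (X.IV.1 and X.IV.2 at CL1 apply throughout, B.IV.2 and C.IV.2 are performed "at CL2 and CL3", and the CL3 rows only for CL3 systems), that an activity listed at CLn applies at confidence level n and above, the applicability of IV activities to a given system can be sketched as below. This is an illustrative reading aid, not part of the standard:

```python
# Illustrative sketch: which Table A1 IV activities apply at a given
# confidence level, assuming an activity listed at CLn applies at
# level n and above (an interpretation, not a clause of the standard).

TABLE_A1_CL = {
    "A.IV.1": 3, "A.IV.2": 3,
    "B.IV.1": 3, "B.IV.2": 2, "B.IV.3": 3, "B.IV.4": 3,
    "C.IV.1": 3, "C.IV.2": 2, "C.IV.3": 3,
    "D.IV.1": 2, "D.IV.2": 3, "D.IV.3": 3,
    "E.IV.1": 3,
    "X.IV.1": 1, "X.IV.2": 1,
}

def activities_for(cl):
    """IV activity IDs applicable to a system at confidence level `cl`."""
    return sorted(a for a, min_cl in TABLE_A1_CL.items() if cl >= min_cl)

print(activities_for(2))
# -> ['B.IV.2', 'C.IV.2', 'D.IV.1', 'X.IV.1', 'X.IV.2']
```

For a CL2 system this yields only the compliance assessments plus the B.IV.2, C.IV.2 and D.IV.1 reviews, while a CL3 system attracts the full set of fifteen activities.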
OFFSHORE STANDARD DNV-OS-D203 INTEGRATED SOFTWARE DEPENDENT SYSTEMS

CHAPTER 3 CLASSIFICATION AND CERTIFICATION

CONTENTS
Sec. 1 Requirements... 43
Ch.3 Sec.1 Page 43

SECTION 1 REQUIREMENTS

A. General

A 100 Introduction
101 As well as representing DNV's interpretation of safe and recommended engineering practice for general use by the offshore industry, the offshore standards also provide the technical basis for DNV classification, certification and verification services.
102 A complete description of principles, procedures, applicable class notations and technical basis for offshore classification is given by the offshore service specifications, see Table A1.

Table A1 Offshore Service Specifications
DNV-OSS-101: Rules for Classification of Offshore Drilling and Support Units
DNV-OSS-102: Rules for Classification of Floating Production, Storage and Loading Units
DNV-OSS-103: Rules for Classification of LNG/LPG Floating Production and Storage Units or Installations

A 200 Organisation of Chapter
Chapter 3 identifies the specific documentation, certification and surveying requirements to be applied when using this standard for certification and classification purposes.

A 300 Classification principles
301 The requirements of this standard shall only be applied for systems also subject to classification through main class and other additional class notations, as relevant.
302 As part of the classification process DNV shall take the role as the independent verifier (IV). Independent verifier activities are given in Ch.2, Sec.8.

A 400 Compliance of Activities
401 The requirements and corresponding acceptance criteria for all activities are listed in Ch.2, Sec.5 to 7, and explained in detail in Appendix B.
402 Compliance of activities is verified through assessments carried out by the independent verifier. Each role shall be assessed by the independent verifier during the project.
For the operation phase (E), assessments shall be performed regularly as defined in C100 to
During the assessment the independent verifier shall assess each activity for the relevant role, confidence level and project phase. Findings shall be reported by the independent verifier to the assessed role in an assessment report.

A 500 Approval of Documents
501 Based on the assessment performed by the independent verifier, the assessed role shall present an action plan for approval, with specific and time-bound actions for each non-conformity.
502 The independent verifier shall verify that the actions in the approved action plan are carried out according to the plan.
503 Documentation, other than the action plan, listed under the documentation criteria for each role in Ch.2 in this standard, shall be reviewed and commented on by the independent verifier for information.

A 600 Rating of compliance
601 DNV rates the different activities defined for each role in Ch.2 based on observations during assessments and document review, and lists these as findings in an assessment report.
602 Three different types of ratings are used: High compliance, Medium compliance and Low compliance.
603 In the assessment report High compliance is represented by the colour green, Medium compliance by the colour orange and Low compliance by the colour red.

A 700 Reporting and milestone meetings
701 Each assessed organisation will receive an assessment report with the major findings and status with regards to approval of activities.
702 DNV will provide input to the milestone meetings M1, M2, M3 and M4 in the form of a presentation of an ISDS status report which summarises the different organisations' compliance with this standard as assessed by DNV.
703 DNV will at the same time provide an assessment of the risks associated with the total project's compliance status towards this standard.

B. Class notation

B 100 Designation
101 Units built and tested in compliance with the requirements of this standard can be assigned the optional class notation for Integrated Software Dependent Systems (ISDS).
102 The notation can be assigned to a new-build when compliance is verified for phases A through D. To maintain the notation, compliance must be verified for phase E.
103 The designation for the class notation shows the notation name, which is ISDS, the systems included in the scope of the notation, and the confidence level specified for each system:

ISDS (system1CL, system2CL, ..., systemNCL)

Example: ISDS (DP2, PMS2, WCS3): these systems have been developed according to the scope and confidence levels identified in Ch.3, Sec.1, B200. The DP abbreviation refers to the system itself and not to class notations such as DYNPOS AUTR etc.
104 The ISDS class notation does not replace, but is complementary to, other class notations.

B 200 Scope
201 Table B1 defines the systems and confidence levels recommended for inclusion in the scope of the ISDS class notation.
202 Unless otherwise agreed by Owner and DNV, and specified by Owner, the scope and confidence levels defined in Table B1 apply for the ISDS class notation.
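The designation format in B103 (the notation name followed by system qualifiers, each suffixed with its confidence level digit) can be illustrated with a small parsing sketch. The function name and regular expressions below are illustrative assumptions, not part of the standard:

```python
import re

def parse_isds_notation(notation: str) -> dict[str, int]:
    """Parse a designation such as 'ISDS (DP2, PMS2, WCS3)' into a
    mapping of system qualifier -> confidence level (1 to 3).
    Illustrative helper only; not defined by DNV-OS-D203."""
    match = re.fullmatch(r"ISDS\s*\(([^)]*)\)", notation.strip())
    if match is None:
        raise ValueError(f"not an ISDS designation: {notation!r}")
    systems = {}
    for item in match.group(1).split(","):
        item = item.strip()
        # Each item is a system qualifier followed by a single CL digit.
        m = re.fullmatch(r"([A-Za-z&]+)([123])", item)
        if m is None:
            raise ValueError(f"malformed system/CL pair: {item!r}")
        systems[m.group(1)] = int(m.group(2))
    return systems

print(parse_isds_notation("ISDS (DP2, PMS2, WCS3)"))
# {'DP': 2, 'PMS': 2, 'WCS': 3}
```

This mirrors the B103 example: the DP and PMS systems at confidence level 2 and the WCS system at confidence level 3.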
Table B1: ISDS class notation scope, system definitions and confidence levels for selected systems

Function: Prevent escalation of abnormal conditions

System that may be included in scope for ISDS: Shutdown and Disconnection Systems (SDS)
Typical sub-systems and components: Network; Emergency Shutdown (ESD) system, including input devices, interfaces towards other safety systems, central control unit, output actuators, signal transfer lines and power supply; Process Shutdown (PSD) system; Emergency Disconnect System (EDS)/Emergency Quick Disconnect (EQD) system; Critical Alarm and Action Panel (CAAP); High Integrity Protection System (HIPS)
DNV Rule reference*: OS-A101, OS-E101, OS-D202, OS-E201
CL: 2
Applicable for: Drilling Unit, Well Intervention Unit, FPSO, FSU

System that may be included in scope for ISDS: Fire & Gas System (F&G)
Typical sub-systems and components: Network; Fire and gas detection system; Alarm and communication system; Systems for automatic action
DNV Rule reference*: OS-D301
CL: 2
Applicable for: Drilling Unit, Well Intervention Unit, FPSO, FSU
Table B1: ISDS class notation scope, system definitions and confidence levels for selected systems (Continued)

Function: Well control

System: Well Control System (WCS)
Typical sub-systems and components: Choke and kill system; Diverter system; Blow Out Prevention (BOP) system or Well Control Package (WCP), including Topside panels, Network, Subsea Electronic Modules (SEM)
DNV Rule reference*: OS-E101 Table A2
CL: 3
Applicable for: Drilling Unit, Well Intervention Unit

Function: Drilling

System: Drilling Control system (DCS)
Typical sub-systems and components: HVAC for driller's cabin; Driller's chair; Network; Zone management/anti-collision system; Drilling data acquisition system
DNV Rule reference*: OS-E101
CL: 2
Applicable for: Drilling Unit, Well Intervention Unit

System: Equipment Handling (EH)
Typical sub-systems and components: Top drive; Drawwork; Rotary table; Vertical pipe handler; Horizontal pipe handler; Fingerboard; Make up system; BOP handler
DNV Rule reference*: OS-E101 Table A5
CL: 2
Applicable for: Drilling Unit, Well Intervention Unit

System: Heave Compensation and Tensioning System (HCTS)
Typical sub-systems and components: Marine riser tensioners, including re-coil system; Active compensation systems; Heave motion compensators
DNV Rule reference*: OS-E101 Table A3
CL: 2**
Applicable for: Drilling Unit, Well Intervention Unit

System: Drilling fluid circulation and cementing (MUD)
Typical sub-systems and components: Mud circulation system, high pressure; Cementing system
DNV Rule reference*: OS-E101 Table A6
CL: 2
Applicable for: Drilling Unit, Well Intervention Unit

System: Well Testing Systems (WT)
Typical sub-systems and components: Production shut down system; Blow down system
DNV Rule reference*: OS-E101 Table A7
CL: 2
Applicable for: Drilling Unit, Well Intervention Unit

Function: Work Over/Completion

System: Work Over Control System (WOCS)
Typical sub-systems and components: WOCS Container; WOCS's chair; Data acquisition system; Network
DNV Rule reference*: OS-E101
CL: 2
Applicable for: Well Intervention Unit

Function: Power generation and distribution

System: Power Management System (PMS)
Typical sub-systems and components: Power generation remote control and monitoring; Power distribution remote control and monitoring; Blackout prevention; Load dependent start/stop, including drilling drive and thruster drive power limitations; Engine change over; Load sharing in remote droop mode (symmetric, asymmetric and manual); Blackout recovery; Network; Operator stations; Integration with system specific PMS (drilling system PMS etc.), if applicable
DNV Rule reference*: OS-D201
CL: 2
Applicable for: All unit types

System: Power Plant Control System (PPC)
Typical sub-systems and components: Governor & synchronizing unit; Turbine controls; Protection relays; Thruster drives; Drilling drives; High voltage or low voltage switchboard
DNV Rule reference*: OS-D201
CL: 2
Applicable for: All unit types
Table B1: ISDS class notation scope, system definitions and confidence levels for selected systems (Continued)

Function: Position keeping

System: Dynamic Positioning System (DP)
Typical sub-systems and components: DP control computer(s); Independent joystick; Sensor systems; Display systems; Operator panels; Network; Positioning reference systems; Thruster control mode selection system
DNV Rule reference*: Pt.6 Ch.7, Pt.6 Ch.26
CL: 2
Applicable for: All unit types

System: POSMOOR system (POS)
Typical sub-systems and components: POSMOOR control computer(s); Independent joystick; Sensor systems; Display systems; Operator panels; Network; Positioning reference systems; Active mooring equipment, e.g. windlass and winch
DNV Rule reference*: OS-E301
CL: 2
Applicable for: Drilling Unit, Well Intervention Unit, FPSO, FSU

System: Jacking Systems (JACK)
Typical sub-systems and components: Hydraulic system; Control system; Drives
DNV Rule reference*: OS-D101
CL: 2
Applicable for: Drilling Unit, Well Intervention Unit, Wind Installation Unit

Function: Control and monitoring

System: Integrated Control and Monitoring System (ICM)
Typical sub-systems and components: Sea water, fresh water, hot water, high pressure wash down system; Fuel oil system; HVAC system; Service, instrument and bulk air system; Hydraulic power system; Ballast system; Load and stability system
DNV Rule reference*: OS-D202
CL: 2
Applicable for: All unit types

* Only systems and components with software should be considered.
** CL3 should be applied for units involved in fixed-to-bottom operations.

203 Systems other than those listed in Table B1 may be granted the ISDS notation at CL1, 2 or 3 as specified below:
Crane (CR)
Navigation Systems (NAV)
Process Control System (PCS)
Propulsion System (PROP)
Steering Systems (ST).

204 For each system a reference to the applicable DNV Rule or Offshore Standard (OS) is listed. The rules or standards identified provide a more specific description of the systems. DNV-OS-D202 provides generic common requirements to, and is applicable for, all systems listed above.

C.
In operation assessments C 100 Objectives 101 The ISDS class notation is maintained through demonstrating compliance to the requirements of this standard in the operation phase. 102 DNV, as the independent verifier, shall perform an annual assessment of the unit in operation to verify compliance. 103 DNV shall perform a renewal assessment of the unit every fifth year. 104 Annual and renewal assessments for the ISDS notation should be carried out at the same time and interval as the periodical classification survey of the unit, and can also be carried out on the basis of documentation and evidence assessed onshore. 105 An action plan that demonstrates how findings will be closed shall be prepared by the owner, and
approved by DNV. DNV shall verify, through re-assessment(s), that the action plan is implemented and the notation maintained.
106 The owner is to inform DNV whenever a system with the ISDS notation is modified. For major upgrades or conversions of the unit in operation, the full set of requirements in this standard may apply.

C 200 Scope of annual assessments
201 The purpose of the annual assessment is to ensure that the confidence that has been built into the unit is actually maintained. The effective implementation and continuous maintenance of the activities required by this standard for phase E, operation, shall be assessed.
202 As part of the annual assessment, any changes to the systems within ISDS scope introduced after the latest assessment are to be addressed. An impact analysis of changes shall be reviewed and confirmed. Any follow-up activities are to be agreed.
203 Updated evidence is to be kept and made available for review by the attending surveyor. Relevant evidence includes (with reference to the required activity in parentheses):
Inventory and spare part records with focus on obsolescence (produced in E.ACQ.1)
Configuration management plan and logs for configuration management activities, including SW change orders (produced in E.CM.1)
Change request logs including impact assessments (produced in E.CM.1)
Configuration audit reports (produced in E.CM.2)
Procedures for modification requests (produced in E.PQA.1)
Maintenance plans and procedures (produced in E.RAMS.1 and E.PQA.1)
Corrective action plans (produced in X.PQA.4 and X.PQA.6)
Quality control and project control records, as relevant (produced in X.PQA.1 and X.PQA.3)
Maintenance in operation plans, including migration and SW retirement plans (produced in E.RAMS.1)
Records of RAMS data (produced in E.RAMS.2)
Analysis and investigation reports from RAMS incidents/failures (produced in E.RAMS.3)
Records of RAMS impact analysis (produced in E.RAMS.4)
Security audit reports
(produced in E.RAMS.5)
Records from validation, verification and testing, as relevant if systems have been changed in operation (produced in E.VV.1 and E.VV.2)
Version histories for baselines, requirements trace and configuration records (produced in X.CM.1).

C 300 Scope of renewal assessments
301 The scope of the annual assessment is to be carried out. In addition, the renewal assessment will have a specific focus on identified process areas or activities. These areas or activities are to be selected based on a discussion with the owner of specific focus areas, and should also be based on important or frequent findings from the annual assessments carried out since the last renewal.
48 OFFSHORE STANDARD DNV-OS-D203 INTEGRATED SOFTWARE DEPENDENT SYSTEMS

APPENDICES

CONTENTS
App. A DEFINITIONS AND ABBREVIATIONS
App. B REQUIREMENT DEFINITION... 56
49 App.A Page 49 APPENDIX A DEFINITIONS AND ABBREVIATIONS A. Definitions A 100 Verbal Forms Shall: Indicates a mandatory requirement to be followed for fulfilment or compliance with the present standard. Deviations are not permitted unless formally and rigorously justified, and accepted by all parties. Should: Indicates a recommendation that a certain course of action is preferred or particularly suitable. Alternative courses of action are allowable under the standard when agreed between contracting parties, but shall be justified, documented and approved by DNV. May: Indicates permission, or an opinion, which is permitted as a part of conformance with the standard. Can: Indicates a conditional possibility. A 200 Definitions Acceptance criteria: The criteria that a system or component must satisfy in order to be accepted by a user, customer, or other authorized entity [IEEE :1990]. Activity: A defined body of work to be performed, including its required input and output information. [IEEE 1074:2006]. Availability: The ability of the system to provide access to its resources in a timely manner for a specified duration [IEC IEV ], alternatively, the time or proportion of time that the system is functioning as intended. Baseline: A consistent set of specifications or products that have been formally reviewed and agreed upon, that thereafter serve as the basis for further development, and that can be changed only through formal change control procedures [ISO IEC 15288:2008]. Base Product: A pre-existing product which is reused, configured, qualified, modified or enhanced to meet the specific needs of new projects. Black-box: (1) A system or component whose inputs, outputs, and general function are known but whose contents or implementation are unknown or irrelevant. (2) Pertaining to an approach that treats a system or component as in (1). [IEEE :1990]. Black-box testing: see Functional testing. 
Block diagram: A diagram of a system, computer, or device in which the principal parts are represented by suitably annotated geometrical figures to show both the functions of the parts and their functional relationships [IEEE :1990]. Change control board: See Configuration control board. Code review: Systematic examination (often as peer review) of computer source code intended to find and fix mistakes overlooked in the initial development phase, improving the overall quality of software. Code review may also be partly automated. Commercial Off-The-Shelf (COTS): COTS products are ready-made packages sold off-the-shelf to the acquirer who had no influence on its features and other qualities. Typically the software is sold pre-wrapped with its user documentation [ISO/IEC 25051:2006(E)]. Commissioning (tests): Verifying and documenting that the unit and all of its systems are designed, installed, tested and can be operated and maintained to meet the owner's requirements. Component: A logical grouping of other components or modules inside a system or sub-system. Component testing: Testing of individual hardware or software components or groups of related components [IEEE :1990]. Configuration audits: Activities ensuring that the configuration management process is followed and that the evolution of a product is compliant to specifications, policies, and contractual agreements. Functional configuration audits are intended to validate that the development of a configuration item has been completed and it has achieved the performance and functional characteristics specified in the System Specification (functional baseline). The physical configuration audit is a technical review of the configuration item to verify that the as-built maps to the technical documentation [INCOSE SE 2004]. 
Configuration control board: A group of people or a person responsible for evaluating and approving or disapproving proposed changes to configuration items, and for ensuring implementation of approved changes.
Configuration data: Data used to configure/tailor a system or component. The data needs to be quality assured. Configuration item: An aggregation of hardware, software, or both, that is designated for configuration management and treated as a single entity in the configuration management process [IEEE :1990]. Consequence (failure): Real or relative magnitude of the seriousness of the failure (business, environmental and safety). Consistency: The degree of uniformity, standardization, and freedom from contradiction among the documents or parts of a system or component [IEEE :1990]. Coverage: The amount or proportion of a software component that has been tested. It is commonly quantified by counting the execution of statements, decision outcomes, and I/O values. Coverage measures help to identify unreachable and untested code. Critical: Any function or component whose failure could interfere significantly with the operation or activity under consideration. Criticality: The degree of impact that a requirement, module, error, fault, failure, or other item has on the development or operation of a system [IEEE :1990]. Cycle time: (1) The period of time required to complete a sequence of events. (2) A set of operations that is repeated regularly in the same sequence, possibly with variations in each repetition; for example, a computer's read cycle. [IEEE :1990]. Deadlock: A situation in which computer processing is suspended because two or more devices or processes are each awaiting resources assigned to the others [IEEE :1990]. Decision Coverage: A measure of the amount of software tested, typically expressed as the number or percentage of outcomes of decision statements in the component that have been tested. Defect: Non-fulfilment of a requirement related to an intended or specified use [ISO 9000:2005].
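The coverage and decision-coverage definitions above can be illustrated with a small example. The function below is purely illustrative; it shows how a single test can leave decision outcomes (and the statements behind them) unexercised:

```python
def clamp(value, low, high):
    # Two decision statements, so four decision outcomes in total
    # (each `if` can evaluate True or False).
    if value < low:
        value = low
    if value > high:
        value = high
    return value

assert clamp(5, 0, 10) == 5    # both decisions False: assignments not run
assert clamp(-1, 0, 10) == 0   # first decision True
assert clamp(99, 0, 10) == 10  # second decision True
# After all three calls every statement has executed and all four
# decision outcomes have occurred; the first call alone exercises
# neither assignment statement and only two of the four outcomes.
```

This is why decision coverage is the stricter measure: a test set can execute every statement while still missing decision outcomes, but not vice versa.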
Dependability: Collective term used to describe the availability performance and its influencing factors: reliability performance, maintainability performance and maintenance support performance. See also RAMS, to which it adds Security [IEC IEV ]. Due diligence (software): An investigation, validation and verification of a software system/sub-system/component/module for proving its usefulness and conformity in a given context. Error: A discrepancy between a computed, observed or measured value or condition and the true, specified or theoretically correct value or condition [IEC ]. Essential function (or system): A system supporting the function, which needs to be in continuous operation or continuously available for on demand operation for maintaining the unit's safety [DNV-OS-D202]. Established design: A design that has, for the most part, been successfully implemented previously. Such designs are often the basis for turnkey fixed price systems such as current generation drill ships. Factory Acceptance Tests (FAT): Acceptance testing (see above) of a component, sub-system or system before delivery and integration. Failure: The termination of the ability of a functional unit to perform a required function on demand. Note: a fault in a part of the system may lead to the failure of its function, itself leading to a fault in other linked parts or systems etc. Failure mode: A defined manner in which a failure can occur. Failure modes can be seen as scenarios for how a system can go wrong. Fault: Abnormal condition that may cause a reduction in, or loss of, the capability of a functional unit to perform a required function excluding the inability during preventive maintenance or other planned actions, or due to lack of external resources [IEC ]. Finding: The result of approval, assessment and renewal activities by DNV, identifying the most important issues, problems or opportunities for improvement within the scope of this standard.
Firmware: The combination of a hardware device and computer instructions and data that reside as read-only software on that device [IEEE :1990]. Functional requirement: A requirement that specifies a function that a system or system component must be able to perform [IEEE :1990]. Functional testing: (1) Testing that ignores the internal mechanism of a system or component and focuses solely on the outputs generated in response to selected inputs and execution conditions. (2) Testing conducted to evaluate the compliance of a system or component with specified functional requirements. [IEEE :1990].
Generic System Requirement (GSR): an attribute of the system or its function, relating to performance, quality or RAMS, as seen by the owners [DNV-RP-D201]. Guidance note: Contains advice which is not mandatory for the assignment or retention of class, but with which DNV, in light of general experience, advises compliance [DNV-OSS-101]. Impact analysis: Analysis which identifies all systems and software products affected by a software change request and develops an estimate of the resources needed to accomplish the change and determines the risk of making the change [SWEBOK 2004]. Integration strategy: An assembly sequence and strategy that minimizes system integration risks. This strategy may permit verification against a sequence of progressively more complete component configurations and be consistent with a fault isolation and diagnosis strategy. It defines the schedule of component availability and the availability of the verification facilities, including test jigs, conditioning facilities, assembly equipment [ISO/IEC 15288]. Integrated Software Dependent System: An integrated software dependent system is an integrated system for which the overall behaviour is dependent on the behaviour of its software components. Integrated System: An integrated system is a set of elements which interact according to a design, where an element of a system can be another system, called a subsystem, which may be a controlling system or a controlled system and may include hardware, software and human interaction [IEC ]. Interface: (1) A shared boundary across which information is passed. (2) A hardware or software component that connects two or more other components for the purpose of passing information from one to the other. (3) To connect two or more components for the purpose of passing information from one to the other. (4) To serve as a connecting or connected component as in (2) [IEEE :1990].
(5) A collection of operations that are used to specify a service of a component. Interface Specification: Data describing the communications and interactions among systems and subsystems. Interface testing: Testing conducted to evaluate whether systems or components pass data and control correctly to one another [IEEE :1990]. Maintainability: (1) The ease with which a software system or component can be modified to correct faults, improve performance or other attributes, or adapt to a changed environment. (2) The ease with which a hardware system or component can be retained in, or restored to, a state in which it can perform its required functions [IEEE :1990]. Migration: System migration involves moving a set of instructions or programs, e.g., PLC programs, from one platform to another, minimizing reengineering. Migration of systems can also involve downtime, while the old system is replaced with a new one. Milestone: A scheduled event marking the transition from one project phase to the next. This standard identifies 5 milestones. Mitigation: Action that reduces the consequence(s) of a hazardous event or risk [IEC ]. Modification request: See Change request. Module: In this standard used to describe the lowest branches in the system hierarchy. Modules can be made of hardware (HW) or software (SW). Non-functional requirement: A requirement that specifies a characteristic or property that is not described as function, e.g. performance requirements. Non-standard legacy software: non-standard software is software that was not designed to be reused, but may be reused anyway. Novel: A feature, capability, or interface that largely is not present in the base product. Obsolescence (Risk): Risk associated with technology within the system that becomes obsolete before the end of the Expected Shelf or Operations Life, and cannot provide the planned and desired functionality. This risk may be mitigated by the Portability Generic System Requirement [DNV-RP-D201]. 
Peer review: A process of subjecting an author's work to the scrutiny of others who are experts in the same field. Portability: The ease with which a system or component can be transferred from one hardware or software environment to another [IEEE :1990]. Probability (of failure on demand): For hardware and random failures, probability that an item fails to operate when required. This probability is estimated by the ratio of the number of failures to operate for a given number of commands to operate (demands) [IEC IEV ].
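The probability-of-failure-on-demand estimate above (the ratio of failures to operate to the number of demands) can be sketched as a small calculation. The function name below is illustrative, not taken from the standard:

```python
def probability_of_failure_on_demand(failures: int, demands: int) -> float:
    """Point estimate of PFD as the ratio of the number of failures
    to operate to the number of commands to operate (demands).
    Illustrative helper only."""
    if demands <= 0:
        raise ValueError("at least one demand is required")
    if not 0 <= failures <= demands:
        raise ValueError("failures must lie between 0 and demands")
    return failures / demands

# e.g. 2 failures to operate observed over 1000 demands:
print(probability_of_failure_on_demand(2, 1000))  # 0.002
```

Note this is only the simple ratio estimator named in the definition; a full reliability analysis would also account for confidence bounds on so few observed failures.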
Project: In the context of this standard the term Project refers to the activities, responsibilities and roles involved in a new build or upgrade project where the ISDS standard is applied. Prototype: A model implemented to check the feasibility of implementing the system against the given constraints, to communicate the specifier's interpretation of the system to the customer, in order to locate misunderstandings. A subset of system functions, constraints, and performance requirements are selected. A prototype is built using high-level tools. At this stage, constraints such as the target computer, implementation language, program size, maintainability, reliability and availability need not be considered [ISO/IEC :2010]. Prototyping: A hardware and software development technique in which a preliminary version of part or all of the hardware or software is developed to permit user feedback, determine feasibility, or investigate timing or other issues in support of the development process [IEEE ]. Quality target: The objective or criteria agreed by the stakeholders to be reached for a quality characteristic. ISO/IEC proposes many quality characteristics for consideration. Record: Information or documents stating results achieved or providing evidence of activities performed. Redundancy: The existence of more than one means for performing a required function or for representing information [IEC ]. Redundancy prevents the entire system from failing when one component fails. Regression testing: Selective retesting of a system or component to verify that modifications have not caused unintended effects and that the system or component still complies with its specified requirements [IEEE :1990]. Release note: The term release is used to refer to the distribution of a software configuration item outside the development activity.
This includes internal releases as activities might contain provisions which cannot be satisfied at the designated point in the life cycle [IEEE :c6s2.6]. The release notes typically describe new capabilities, known problems, and platform requirements necessary for proper product operation [SWEBOK 2004]. Reliability: The capability of the ISDS to maintain a specified level of performance when used under specified conditions. Reliability, Availability, Maintainability, (Functional) Safety (RAMS): A set of commonly linked generic system attributes that often need to be dealt with in a systematic manner. See also Dependability. Requirement: A condition or capability that must be met or possessed by a system or system component to satisfy a contract, standard, specification, or other formally imposed documents [IEEE :1990]. Reused software: Software integrated into the system that is not developed during the project, i.e., both standard software and non-standard legacy software. Software can be reused as-is or be configured or modified. Review: Activity undertaken to determine the suitability, adequacy and effectiveness of the subject matter to achieve established objectives [ISO 9000:2005]. Revision control: Management of multiple revisions of the same unit of information (also known as version control, source control or (source) code management). Risk: The qualitative or quantitative likelihood of an accident or unplanned event occurring, considered in conjunction with the potential consequences of such a failure. In quantitative terms, risk is the quantified probability of a defined failure mode times its quantified consequence [DNV-OSS-300]. Role: A role is an organization with responsibilities within the system lifecycle. A role has specific activities to perform. See Ch.2, Sec.3, A200. Safety integrity level (SIL): A relative level of risk-reduction provided by a safety function, or to specify a target level of risk reduction. 
[IEC 61508] defines four levels, where SIL 4 is the most dependable and SIL 1 is the least. SIL is not to be confused with confidence level, which is defined in Ch.2, Sec.2. Software: Computer programs, procedures, and possibly associated documentation and data pertaining to the operation of a computer system [IEEE :1990]. Software Component: A software component is an interacting set of software modules. A software component is a configuration item. Software lifecycle: The period of time that begins when a software product is conceived and ends when the software is no longer available for use. The software life cycle typically includes a concept phase, requirements phase, design phase, implementation phase, test phase, installation and checkout phase, operation and maintenance phase, and, sometimes, retirement phase. Note: These phases may overlap or be performed iteratively [IEEE :1990]. Software Module: Separately compilable or executable piece of source code. It is also called Software Unit or Software Package [ISO/IEC 12207:2008]. A small self-contained program which carries out a clearly defined task and is intended to operate within a larger program. Single point of failure: Component or interface of a system for which no backup or redundancy exists and the failure of which will disable the entire system.
Simulation: One of real-world simulation, process simulation, electronic circuit simulation, fault simulation, simulation of Software in the Loop (SIL), simulation of the system's environment or Hardware-In-the-Loop (HIL) [SfC 2.24]. Simulators serve various purposes and use different modelling techniques [ISO/IEC :2010]. Site Acceptance Tests: An acceptance test for a fully integrated system. May be part of commissioning, or may be performed in the factory before delivery to the unit. Source code: Computer instructions and data definitions expressed in a form suitable for input to an assembler, compiler, interpreter or other translator [IEEE :1990]. Specification: A document that specifies, in a complete, precise, verifiable manner, the requirements, design, behaviour, or other characteristics of a system or component, and, often, the procedures for determining whether these provisions have been satisfied [IEEE :1990]. Standard software: Ready-made and packaged software intended to be used in different systems, for example COTS software, the ISDS supplier's own developed standard software components, and open source software. State diagram: A diagram that depicts the states that a system or component can assume, and shows the events or circumstances that cause or result from a change from one state to another [IEEE :1990]. Statement Coverage: A measure of the amount of software that has been tested, typically expressed as a percentage of the statements executed out of all the statements in the component tested. Static Analysis: The process of evaluating a system or component based on its form, structure, content, or documentation. Contrast with: dynamic analysis [IEEE :1990]. Statistical Testing: A testing method based on allocating test cases to components and functions in proportion to their expected use in operational scenarios, also called random testing. Data from statistical testing can be used to predict operational reliability.
Sub-supplier: (in the context of this standard) Organisations and companies delivering products and services to suppliers. Sub-system: A Sub-system is a part of a system. For example, Choke and Kill is a part of the Well Control System, and the Independent joystick is part of the Dynamic Positioning system (ref. definition in Ch.3 Sec.1B). System: A defined product which contains sub-systems. For the purposes of the ISDS notation, a system refers to the qualifier identified in the notation, for example DP, DCS, PMS, etc. (ref. Ch.3 Sec.1B). System Architecture: A selection of the types of system elements, their characteristics, and their arrangement [INCOSE SE 2004]. System Design Review: A review conducted to evaluate the manner in which the requirements for a system have been allocated to configuration items, the system engineering process that produced the allocation, the engineering planning for the next phase of the effort, manufacturing considerations, and the planning for production engineering [IEEE :1990]. System testing: Testing conducted on a complete, integrated system to evaluate the system's compliance with its specified requirements. See also: component testing; integration testing; interface testing [IEEE :1990]. Stress testing: Testing conducted to evaluate a system or component at or beyond the limits of its specified requirements [IEEE :1990]. Target environment: The configuration of network protocols, computers, PLCs, sensors, final elements and other hardware on which a software integrated system is intended to be executed in operations. Test case: A specification of a test in terms of:
a description of the purpose of the test
pre-conditions (e.g. the state of the software under test and its environment)
actions organized in one or several scenarios (including what data to provide)
expected results
Traceability: Linkage between requirements and subsequent work products, e.g. design documentation and test documentation.
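The test-case elements listed above (purpose, pre-conditions, actions, expected results) can be captured in a simple record. The field names and example content below are illustrative only and not prescribed by this standard:

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    """Minimal record mirroring the test-case elements defined above;
    names and structure are illustrative, not part of DNV-OS-D203."""
    purpose: str
    preconditions: list[str]     # state of the software under test and its environment
    actions: list[str]           # one or more scenarios, including what data to provide
    expected_results: list[str]

# Hypothetical example for a PMS blackout-prevention check:
tc = TestCase(
    purpose="Verify blackout prevention sheds load before overload",
    preconditions=["two generators online", "load at 85% of capacity"],
    actions=["apply a step load increase of 20%"],
    expected_results=["non-essential consumers are shed", "no blackout occurs"],
)
print(tc.purpose)
```

Keeping test cases in a structured form like this also eases building the traceability links to requirements described in the adjacent entries.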
Traceability matrix: A matrix that records the relationship between two or more products of the development process; for example, a matrix that records the relationship between the requirements and the design of a given software component [IEEE :1990]. Unit: The vessel that will get the ISDS notation, typically a Mobile Offshore Unit or a Ship. Use Case: The Use Case specifies the concepts used primarily to define the behaviour and the functionality of a system or a subsystem, without specifying its internal structure. Use cases and actors (users) interact when the services of the system are used. In this context an actor plays a coherent set of roles when interacting with the system. Note that in the Use Case context an actor may be a user or another system interacting with the first one [IEC 19501:2005]. Validation: Confirmation, through the provision of objective evidence that the requirements for a specific intended use or application have been fulfilled [ISO 9000:2005].
Validation strategy: Identification (e.g., list) of validation activities to be performed, along with validation methods, objectives, and responsibility assigned to them. The purpose of this strategy is to minimize redundancy and maximize effectiveness of the various validation activities.

Verification: Tasks, actions and activities performed to evaluate progress and effectiveness of the evolving system solutions (people, products and process) and to measure compliance with requirements. Analysis (including simulation), demonstration, test and inspection are verification approaches used to evaluate: risk; people, product and process capabilities; compliance with requirements; and proof of concept [INCOSE SE 2004].

Verification strategy: Identification (e.g., list) of verification activities to be performed, along with verification methods, objectives, and responsibility assigned to them. The purpose of this strategy is to minimize redundancy and maximize effectiveness of the various verification activities.

Version: Software items evolve as a software project proceeds. A version of a software item is a particular identified and specified item. It can be thought of as a state of an evolving item [SWEBOK 2004].

White box testing: A testing method that uses knowledge of the internal organization of the software to select test cases that provide adequate coverage of the software. White box testing is also called structural testing.

Work product: A deliverable or outcome that must be produced to prove the completion of an activity or task. Work products may also be referred to as artefacts.

B. Abbreviations
The abbreviations in Table B1 are used.
Table B1 Abbreviations
AP: For Approval
BOP: Blow Out Prevention
CAAP: Critical Alarm and Action Panel
CCB: Change Control Board
CL-<n>: Confidence Level <n> (n=1 to 3)
CMC: Certification of Materials and Components
COTS: Commercial off-the-shelf
CPU: Central Processing Unit
DNV: Det Norske Veritas
DP: Dynamic Positioning
EDS: Emergency Disconnect System
EQD: Emergency Quick Disconnect
ESD: Emergency Shut Down
F&G: Fire and Gas
FAT: Factory Acceptance Tests
FEED: Front-End Engineering and Design
FI: For Information
FMECA: Failure Modes, Effects, Criticality Analysis
GSR: Generic System Requirement
HCTS: Heave Compensation and Tensioning System
HIPS: High Integrity Protection System
HVAC: Heating, Ventilation, and Air Conditioning
HW: Hardware (as opposed to software)
IAS: Integrated Automation System
IEC: The International Electrotechnical Commission
IEEE: The Institute of Electrical and Electronics Engineers
INCOSE: International Council on Systems Engineering
IO: Input/output (also I/O)
ISDS: Integrated Software Dependent Systems
ISO: International Organization for Standardization
IV: Independent Verifier
MOU: Mobile Offshore Unit
MTTF: Mean Time To Failure
MTBF: Mean Time Between Failures
Table B1 Abbreviations (Continued)
MTTR: Mean Time To Repair
MUD: Bulk Storage, drilling fluid circulation and cementing
OS: Offshore Standard
OSS: Offshore Service Specifications
OW: Owner
PLC: Programmable Logic Controller
PMS: Power Management System
PPC: Power Plant Control System
PRH: Pipe / Riser Handling
PSD: Process Shut Down
RAM: Reliability, Availability, Maintainability
RAMS: Reliability, Availability, Maintainability, Safety
RfP: Request for Proposal
RMS: Riser Monitoring System
RP: Recommended Practice
SAT: Site Acceptance Tests
SEM: Subsea Electronic Module
SI: System Integrator
SU: Supplier
SW: Software
UIO: Unit In Operation
WCS: Well Control System
WCP: Well Control Package
WOCS: Work Over Control System
APPENDIX B
REQUIREMENT DEFINITION

A. Requirement definition

A 100 General

101 The following lists the required activities for all phases and all roles.

102 Each activity is presented in the same manner. The list of activities is sorted alphabetically by the activity ID and contains the following elements:
- The header describes the unique activity identifier (ID), enabling easy reference and traceability, and the name of the activity. The identifier is structured in three parts: Z.YYY.NN. The first part (Z) of the activity identifier refers to the project phase. The second part (YYY) refers to the process area. The third part (NN) is a unique number for the activity.
- The phase and the confidence level at which the activity is to be performed.
- Assignment of responsible roles for unit and system level.
- The requirement definition provides a detailed description of the activity requirements.
- A guidance note is provided when needed.
- The assessment criteria field provides typical evidence to be made available by the responsible role(s) for the assessment.
- The documentation criteria field provides a list of required documentation to be submitted to DNV for approval (AP) or for information (FI).
- The contributions field lists the roles that are expected to contribute to the activity and the details of the expected contributions.

A 200 Activity definition basic engineering

A.CM.1 Establish a baseline of requirements for the unit
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: (none).
Requirement definition: A baseline of the unit requirements shall be established at the end of the basic engineering phase. This baseline shall be used as the reference when requirements evolve in later phases. The purpose of an explicit requirements baseline is to achieve control of any changes to the requirements.
Approved and controlled unit requirements document. Revision history of unit requirements document.
Acceptable contributions from Owner: Approval of the requirements document.

A.DES.1 Establish the unit design
Phase: Basic engineering. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: (none).
Requirement definition: A top level architecture of the systems and software within ISDS scope shall be established, identifying the systems and their interrelationships, including the required interfaces.
Unit design: unit design specifications, systems/network topology and functional descriptions.
Acceptable contributions from Owner: Review the unit design/top level architecture.
A.PM.1 Establish the master plan
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: (none).
Requirement definition: A master plan for the development of the unit's systems in ISDS scope shall be established. The plan shall contain a high level master schedule, taking into account rough estimations and discussions with potential suppliers. Milestones for the whole project shall be established, including milestones related to the ISDS standard. All stakeholders shall be in agreement on the plan.
The development plan and the milestones should typically show the requirements and design freeze dates.
Master plan: Activities, work breakdown structure (WBS), schedule, and milestones.
Acceptable contributions from Owner: Provide inputs on specific schedule constraints and master schedule.

A.PQA.1 Define procedures (owner)
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: Owner. System level responsible: (none).
Requirement definition: Procedures to be used within the project shall be defined, coordinated, and agreed within and between organisations participating in the project. Roles, responsibilities and specific requirements as defined in this standard shall be explicitly addressed.
The word procedures in this context is used to represent all documentation regarding the way of working, e.g. process descriptions, standard operating procedures, work instructions, checklists, guidelines etc. Some procedures may need to be coordinated with the system integrator's procedures when the system integrator is selected (see A.PQA.2). The defined procedures are normally made up of quality management system documents, e.g. standard operating procedures, process descriptions, checklists, and document templates, along with any project specific adaptations of these.
The following areas are normally expected to be covered:
- All activities required to be performed by the owner, listed in Ch.2 Sec.5.
- Responsibilities and authorities of different disciplines and roles.
- Mechanisms for submitting and receiving information (documents) between different organisations.
- Mechanisms for defining baselines of information (documents).
- Mechanisms for handling of documents and information while they are work in progress.
- Mechanisms for review and approval of drawings and other documents.
- Mechanisms for approval of deliverables (verification mechanisms).
- Mechanisms for handling of changes to already agreed technical scope, schedule or costs.
- Mechanisms for follow-up of process adherence (see X.PQA.1 and X.PQA.4).
A quality system, documents, minutes of meetings, or other relevant information showing: a defined way of working for the major activities in the project, clear roles and responsibilities, and defined ways of interaction between the different organizations (e.g. owner, system integrator, supplier, independent verifier, and others).
A.PQA.2 Define procedures (system integrator)
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: (none).
Requirement definition: Procedures to be used within the project shall be defined, coordinated, and agreed within and between organisations participating in the project. Roles, responsibilities and specific requirements as defined in this standard shall be explicitly addressed.
The word procedures in this context is used to represent all documentation regarding the way of working, e.g. process descriptions, standard operating procedures, work instructions, checklists, guidelines etc. Some procedures may need to be coordinated with the owner (see A.PQA.1) and the suppliers (see B.PQA.1) when the suppliers are selected. The defined procedures are normally made up of quality management system documents, e.g. standard operating procedures, process descriptions, checklists, and document templates, along with any project specific adaptations of these.
The following areas are normally expected to be covered:
- All activities required to be performed by the system integrator, listed in Ch.2 Sec.6.
- Responsibilities and authorities of different disciplines and roles.
- Mechanisms for submitting and receiving information (documents) between different organisations.
- Mechanisms for defining baselines of information (documents).
- Mechanisms for handling of documents and information while they are work in progress.
- Mechanisms for review and approval of drawings and other documents.
- Mechanisms for approval of deliverables (verification mechanisms).
- Mechanisms for handling of changes to already agreed technical scope, schedule or costs.
- Mechanisms for escalation of problems (see C.PQA.1).
- Mechanisms for follow-up of process adherence (see X.PQA.2 and X.PQA.5).
- Mechanisms allowing management insight into the project's status.
- Internal procedures and rules for the work to be carried out in the project, e.g. design guidelines (see B.DES.3).
- Internal procedures and rules regarding how to document the requirements, design, implementation, verification & validation, and acceptance of the systems within ISDS scope.
A quality system, documents, minutes of meetings, or other relevant information showing: a defined way of working for the major activities in the project, clear roles and responsibilities, and defined ways of interaction between the different organizations (e.g. owner, system integrator, supplier, independent verifier, and others).

A.RAMS.1 Determine safety rules, standards and laws applicable
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: System integrator.
Requirement definition: Rules, standards and applicable laws for safety requirements shall be identified, reviewed and agreed upon.
Sources of statutory rules include flag states, shelf states, and classification societies. Also, industry standards such as IEC 61508/61511 or ISO may be applied. This activity is an extension of the A.REQ.2 activity.
Listing of regulatory requirements that apply regarding safety. Resolution of conflicting rules. Application guidelines.
List of regulatory requirements unit (FI): reviewed in A.IV.2 at CL3; used in B.IV.4 and D.IV.3 at CL3.
List of regulatory requirements system (FI): used in C.IV.3 at CL3.
Acceptable contributions from Owner: Make clear any specific considerations such as sovereignty of intended area of operation.
A.RAMS.2 Define RAM related requirements and objectives
Phase: Basic engineering. Confidence level: 2 and above.
Unit level responsible: Owner. System level responsible: Owner.
Requirement definition: Reliability, availability, and maintainability related requirements shall be defined and documented. These requirements shall be detailed down to the system level and shall be quantitative for CL3.
RAM requirements may be explicitly defined as quantitative or qualitative targets or objectives that shall be met by the functions or the systems. As an example of a quantitative requirement, MTTF or MTBF for a given system in a given environment may be used as a reliability target. Relative reliability (e.g., more reliable than) is an example of a qualitative requirement. RAM requirements may be part of an overall requirements document. This activity should be coordinated with the activity A.REQ.2, and the RAM requirements are normally documented in the same document as the other unit and system requirements.
Listing of RAM requirements. For CL2: qualitative requirements are acceptable. For CL3: quantitative requirements (objectives) are required.
List of RAM requirements unit (FI): reviewed in A.IV.2 and B.IV.4 at CL3; used in D.IV.3 at CL3.
List of RAM requirements system (FI): used in C.IV.3 at CL3.

A.RAMS.3 Develop the RAMS plan for the unit
Phase: Basic engineering. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: (none).
Requirement definition: RAMS activities shall be planned, including methods and tools to be used. Expectations on the suppliers' RAMS activities, plans and reporting shall be identified.
The methods, tools and procedures used in all RAMS-related activities should be defined, documented and put under configuration control. This activity provides input to the activity B.RAMS.4. The RAMS plan should cover all relevant RAMS activities included in this standard. This activity may be coordinated with the activities A.PM.1 and A.PQA.2, and the RAMS plan may be a separate document, or a part of the project plan or quality assurance plan. Safety may be dealt with separately, for example in a functional safety management plan.
Plan(s) showing the methods, tools, and procedures to be used for RAMS activities. Schedule of RAMS activities. Expectations on the suppliers' RAMS plan. RAM data to be collected (CL3).
Plan for handling of RAMS (FI): reviewed in A.IV.2 at CL3; used in B.IV.4 at CL3.
Acceptable contributions from Owner: Review and comment on the RAMS plan for the unit.
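The quantitative RAM measures named in A.RAMS.2 (MTTF, MTBF) relate to each other and to availability in a simple way for repairable systems. The sketch below illustrates the arithmetic only; the figures and function names are invented for illustration and are not requirements of this standard.

```python
# Illustrative sketch only: figures and function names are invented,
# not taken from the standard.

def mtbf(mttf_hours: float, mttr_hours: float) -> float:
    """MTBF of a repairable system: mean time to failure plus mean time to repair."""
    return mttf_hours + mttr_hours

def steady_state_availability(mttf_hours: float, mttr_hours: float) -> float:
    """Long-run fraction of time the system is operational: MTTF / (MTTF + MTTR)."""
    return mttf_hours / (mttf_hours + mttr_hours)

# Hypothetical quantitative target for one system: MTTF of 8760 h (one year)
# with an MTTR of 24 h.
print(f"MTBF: {mtbf(8760.0, 24.0):.0f} h")
print(f"Availability: {steady_state_availability(8760.0, 24.0):.4f}")
```

A quantitative CL3 objective could then be expressed as, for example, a minimum availability figure per system, which the supplier's RAM data collection (A.RAMS.3) can later be checked against.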
A.REQ.1 Define mission, objectives and shared vision
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: Owner. System level responsible: (none).
Requirement definition: Mission, vision and objectives of the unit shall be defined. The vision of the unit's final solution shall be formalised and distributed to all stakeholders.
Agreeing on the overall vision, mission and objectives makes it possible for all involved roles to make correct decisions regarding the functionality and capability of the systems in question. The results may be a design-basis document or part of a FEED study or similar report.
Unit design intention and philosophy: The vision of the unit/system, descriptions of the unit's/systems' overall behaviour and the expected business/safety/environmental performance.
Design Philosophy (FI): used in A.IV.1 at CL3.

A.REQ.2 Collect requirements for the unit and systems
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: (none).
Requirement definition: Owner requirements shall be collected. These requirements shall focus on operational needs, along with the constraints the owner is placing on the unit and systems.
It is important that requirements on individual systems are defined while taking into account interacting systems and the whole unit. There are no specific demands regarding the format or analysis of the requirements, but it is still important that all relevant aspects like functionality, performance, reliability, availability, maintenance and safety are covered. For a more detailed description of what aspects to cover, see DNV RP-D201, Appendix A: Generic System Requirements. Collecting the requirements may be supported by prototyping, FEED studies, technology qualification or other means.
Vessel specification: operational requirements, functional requirements, non-functional requirements and technical constraints.
Vessel specification (FI): reviewed in A.IV.1 at CL3.
Acceptable contributions from Owner: Provide requirements. Review of requirements documents. Participation in clarification of requirements.

A.REQ.3 Define operational modes and scenarios to capture expected behaviour
Phase: Basic engineering. Confidence level: 2 and above.
Unit level responsible: Owner. System level responsible: (none).
Requirement definition: The different operational modes of the unit shall be described and the corresponding operational scenarios defined. The interaction between the operators and the different systems shall be defined for key operational scenarios. The key operational scenarios shall be explicitly described in order to enable the system integrator and suppliers to focus on these.
This activity is an extension of the activity A.REQ.2. An operational scenario is a description of a typical sequence of events that includes the interaction of the system environment and users, as well as interaction among its components. Nominal scenarios should be described as well as the important degraded ones. The task of selecting which of the operational scenarios to define as key can be done by evaluating the scenarios towards criteria like the novelty of the scenario, novelty of the systems involved, associated risks etc. During the basic engineering phase operational scenarios are used to evaluate the vessel specification.
At a later stage the operational scenarios are used as input to more detailed use cases and to create test cases for the interaction between different systems. The operational modes and scenarios may also serve as input to operation manuals. The documentation of operational modes and scenarios can be achieved by creating a concept of operations document (CONOPS). For more information, see the ANSI/AIAA G (focusing on new systems) and IEEE Standard 1362 (focusing on upgrades to existing systems).
Vessel specification: description of the operational modes and corresponding key operational scenarios, detailed to the level of the different systems.

A.REQ.4 Allocate functions and requirements to systems
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: (none).
Requirement definition: The requirements shall be organized into packages that can serve as the basis for the development of each system. The allocation of requirements shall ensure that systems on a lower confidence level do not have compromising impact on systems on a higher confidence level.
The system integrator normally requests information from possible suppliers to specify the system requirements. Some requirements may be unique to a system; other requirements may be common across all or multiple systems.
Design specification (or requirements) for the relevant systems.
Vessel specification (FI): reviewed in A.IV.1 at CL3.

A.REQ.5 Consult potential suppliers for acquiring of systems
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: (none).
Requirement definition: Potential subcontractors for developing/delivering systems shall be consulted. On CL2 and above, the requirements allocated to the system shall be used to track compliance of the supplier's offer.
The requirements allocation to systems is defined in activity A.REQ.4 and the traceability information in activity A.REQ.6. Make/buy/reuse analyses are normally used to determine which/what part of a system could be acquired. The system integrator should make sure that the system is not over-specified before consulting what suppliers have to offer. The system integrator should communicate the hard operational constraints so that the suppliers understand the need for their system.
System request for proposal (RfP): functional specifications, generic system requirements and obsolescence information. Requirements compliance information (on CL2 and above).
Specification (FI): reviewed in A.IV.1 at CL3.
Acceptable contributions from Owner: Review of system level functional specifications/requirements documents, participation in clarification of requirements. Contribution of Owner to System request for proposal (RfP): Technical constraints, Generic System Requirements (GSR).
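On CL2 and above, A.REQ.5 requires the requirements allocated to a system to be used to track compliance of the supplier's offer. A minimal sketch of such compliance tracking might look as follows; the requirement IDs, the status vocabulary and the helper function are hypothetical, not prescribed by the standard.

```python
# Hypothetical sketch of supplier-offer compliance tracking against the
# requirements allocated to a system (cf. A.REQ.5, CL2 and above).
# Requirement IDs and the status vocabulary are invented for illustration.
from dataclasses import dataclass

@dataclass
class ComplianceEntry:
    requirement_id: str
    supplier_statement: str  # "compliant", "partial" or "non-compliant"

def open_items(entries):
    """Requirements for which the supplier has not declared full compliance."""
    return [e.requirement_id for e in entries if e.supplier_statement != "compliant"]

offer = [
    ComplianceEntry("SYS-REQ-001", "compliant"),
    ComplianceEntry("SYS-REQ-002", "partial"),
    ComplianceEntry("SYS-REQ-003", "non-compliant"),
]
# Items to clarify with the supplier before committing to the offer:
print(open_items(offer))
```

Keeping the offer evaluation keyed to the allocated requirement IDs makes the compliance information directly reusable in the traceability documentation of A.REQ.6.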
A.REQ.6 Establish traceability of requirements
Phase: Basic engineering. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: (none).
Requirement definition: The mechanisms, ambition level, and any tools to be used for the traceability of requirements shall be defined and communicated to suppliers. Traceability between different levels of requirements shall be documented, and shall at this point in time at least cover the trace from unit level requirements to system level requirements.
Three different types of traceability are normally expected to be documented during the project (see X.REQ.1):
1) Traceability between requirements on different levels (e.g. from unit to system).
2) Traceability from a requirement to where and how it is designed and implemented.
3) Traceability from a requirement to where and how it is verified and validated.
Different levels and granularity of traceability may be defined depending on the complexity and the risk of not achieving the requirements and the confidence level of the system in question. The traceability information should be kept up to date as described in activity X.REQ.1; this also includes updates that the suppliers are expected to perform. Traceability matrices are normally used to document the requirement allocation, but also databases or references from documents to documents can be used as long as the traceability information is explicit and reviewable.
Traceability information between requirements on unit level and requirements on the different systems. Defined mechanisms and ambition level regarding requirements traceability.
Traceability matrices (FI): used in A.IV.1 at CL3.

A.RISK.1 Define a strategy for risk management
Phase: Basic engineering. Confidence level: 2 and above.
Unit level responsible: Owner. System level responsible: (none).
Requirement definition: A risk management strategy shall be defined. It shall identify the sources of the risks, how they are categorised, how they are characterised (attributes) and prioritised, how risks are reported, who is responsible, which risk mitigations to use and when the risk should be mitigated.
The risk management strategy is typically a common asset for individual projects. The ISDS standard focuses on the risk management of common risks which impact several systems or the whole project.
Risk management procedure. Blank risk register.
Acceptable contributions from System integrator: Review the risk strategy.
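The three traceability types listed in A.REQ.6 can be represented very compactly, whether in a matrix, a database or cross-document references. The sketch below is a minimal illustration; all identifiers are invented, and a real project would of course hold this information in its configuration-managed traceability documentation.

```python
# Minimal sketch of the three traceability types listed in A.REQ.6.
# All identifiers are invented for illustration.

# 1) Traceability between requirement levels (unit -> system).
unit_to_system = {"UNIT-REQ-010": ["DP-REQ-001", "PMS-REQ-004"]}

# 2) Traceability from a requirement to where it is designed/implemented.
system_to_design = {"DP-REQ-001": ["DP-DES-3.2"]}

# 3) Traceability from a requirement to where it is verified/validated.
system_to_test = {"DP-REQ-001": ["DP-FAT-TC-017"]}

def unverified(requirements, req_to_test):
    """System requirements that have no verification evidence linked yet."""
    return [r for r in requirements if not req_to_test.get(r)]

all_system_reqs = [r for reqs in unit_to_system.values() for r in reqs]
print(unverified(all_system_reqs, system_to_test))
```

A query like `unverified` is one reason the standard asks for traceability to be explicit and reviewable: gaps in verification coverage fall out of the data mechanically instead of being hunted for in documents.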
A.RISK.2 Jointly identify risks
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: (none).
Requirement definition: Risks impacting the project plans or the efforts shall be identified at the beginning of the project. Consequences (schedule, effort, quality, obsolescence, etc.) and the associated probability shall be evaluated and documented.
Risks in this process area should focus on project risks (schedule and effort) and consider product risks which potentially impact several systems or the whole project. RAMS risks are managed in the RAMS process area. A clear distinction should be made between risks (not having occurred yet) and issues (already present).
Project risk list: risk list with risks related to e.g. requirements, schedule, effort, quality, performance, consistency and obsolescence (for both hardware and software).
Acceptable contributions from Owner: Provide inputs on operational and business risks relevant for the project, the unit and systems.

A.RISK.3 Assign confidence levels
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: Owner. System level responsible: (none).
Requirement definition: The confidence level (CL) of the functions and systems shall be assigned. The detailed content and borders of the different systems within scope of ISDS shall be documented.
Confidence levels of the functions and systems can be assigned using three different practices:
- functions and systems in scope of this standard are assigned a confidence level based on the recommendations in Ch.3, Sec.1, B200,
- the above functions and systems may be assigned a higher confidence level by the owner,
- additional functions and systems may be assigned a confidence level based on the procedure described in DNV-RP-D201.
Confidence level matrix for the relevant systems.
Vessel specification (confidence levels) (FI): reviewed in A.IV.1 at CL3; used in A.IV.2 and B.IV.4 at CL3.
Acceptable contributions from System integrator: Contribute to assigning confidence levels to functions and systems within the scope.
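A.RISK.1 asks for risks to be characterised by attributes and prioritised; A.RISK.2 asks for consequences and probability to be evaluated and documented. One common way to combine the two is a risk register scored by probability times consequence. The sketch below is hypothetical: the scales, the example risks and the scoring scheme are invented for illustration and are not mandated by the standard.

```python
# Hypothetical risk-register sketch for A.RISK.1/A.RISK.2: each risk carries
# probability and consequence attributes and is prioritised by their product.
# Scales, example risks and scoring are invented for illustration.
from dataclasses import dataclass

@dataclass
class Risk:
    risk_id: str
    description: str
    probability: int  # e.g. 1 (rare) .. 5 (almost certain)
    consequence: int  # e.g. 1 (negligible) .. 5 (severe)

    @property
    def priority(self) -> int:
        return self.probability * self.consequence

register = [
    Risk("R-01", "Late delivery of DP software", 3, 4),
    Risk("R-02", "COTS component becomes obsolete", 2, 3),
    Risk("R-03", "Ambiguous interface spec between PMS and IAS", 4, 4),
]

# Report the highest-priority risks first, per the agreed reporting mechanism.
for risk in sorted(register, key=lambda r: r.priority, reverse=True):
    print(risk.risk_id, risk.priority)
```

Whatever scheme is chosen, the strategy document from A.RISK.1 is the place where the scales, attributes and reporting thresholds are fixed, so that all organisations score risks the same way.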
A.VV.1 Validate the concept of the unit with the users
Phase: Basic engineering. Confidence level: 2 and above.
Unit level responsible: Owner. System level responsible: Owner.
Requirement definition: The concept of the unit and its systems shall be presented to representatives of the end users to ensure it can fulfil their needs. Comments and changes to the concept shall be shared with all stakeholders.
The concept should represent an overall idea of the unit and its purpose. The concept validation allows the project to get early feedback from the users, increase the user awareness of the future unit, and reduce issues during commissioning and sea trials. Prototypes or simulations may be used to validate the concept of the system with the users, providing a different point of view of the system in addition to the specifications. Other means of validation may be used, such as using existing concepts already validated by the end users.
Unit concept presentation: Simulations and Minutes of System Concept Review Meeting. FEED study.
Acceptable contributions from System integrator: Provide inputs to the unit concept and participate in the presentation of the concept to the end users.

A.VV.2 Verify the unit and system requirements
Phase: Basic engineering. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: (none).
Requirement definition: The unit requirements shall be verified in the context of operational scenarios and the defined mission and vision of the unit. The correctness, completeness and consistency of the allocation and derivation of requirements to individual systems and known components from the unit requirements shall be verified by reviewing the requirement traceability information.
This activity is a quality assurance of the results from activities A.REQ.1, A.REQ.3, A.REQ.4 and A.DES.1. Traceability information (from activity A.REQ.6) is normally used to facilitate this activity.
Review records of the unit requirements. Review records for the system requirements.
A 300 Activity definition engineering

B.ACQ.1 Select COTS products based on defined criteria
Phase: Engineering. Confidence level: 2 and above.
Unit level responsible: (none). System level responsible: Supplier.
Requirement definition: In case of COTS acquisition, selection criteria shall be established that include functionality, RAMS, obsolescence management and other relevant criteria.
This standard focuses on the technical aspects of the selection process, not the financial aspects, which are out of scope. The COTS products in question may be used as a stand-alone product or as a component in the supplier's system. This activity should be coordinated with the results of activity B.DES.5.
COTS product selection procedure: obsolescence management. COTS product selection matrix: rationale for selection, selection criteria, evaluations and selection.

B.ACQ.2 Establish contract with sub-suppliers
Phase: Engineering. Confidence level: 1 and above.
Unit level responsible: (none). System level responsible: Supplier.
Requirement definition: Contracts with sub-suppliers shall be established, and shall contain at least the following items:
- Products, functions and components to be included in the delivery.
- Relevant ISDS standard requirements.
- Criteria for acceptance of the acquired product.
- Proprietary, usage, ownership, warranty and licensing rights.
- Maintenance and support in the future.
If the development, updating or configuration of software dependent systems is sub-contracted, relevant parts of the ISDS standard apply to the sub-supplier and shall be included in the contract.
This standard focuses on the technical aspects of the selection process, not the financial aspects, which are out of scope. If a sub-supplier delivers non-COTS system(s) with custom software, the contractual agreements should clarify which parts of the ISDS standard the sub-contractor should adhere to and be assessed towards, and which parts of the standard the supplier will take care of.
Supplier agreement: product or component specifications, functional specifications, technical acceptance criteria, ownership transfer conditions, delivery strategy, provisions for review of intermediate deliveries.
66 App.B Page 66 B.CM.1 Establish baselines of requirements and design Phase: Engineering. Confidence level: 1 and above. Unit level responsible: System integrator. System level responsible: Supplier. Requirement definition: Baselines of the unit specifications, unit design, system requirements, system designs, base products, and interface specifications shall be established before the information is used for further detailing of design, implementation and verification. Changes to these baselines require review and approval by appropriate stakeholders, ref. X.CM.1. The term base products is here used to describe any kind of existing product, component, software library, software template or similar on which the supplier bases the development (or automatic generation) of the custom specific product. Baseline repositories. Identification of baselines. Approved and controlled documents (baselines) for: unit specifications, unit design, system requirements, system design, interface specifications and base products. Acceptable contributions from Owner. Approval of requirements and design documents. B.CM.2 Establish and implement configuration management Phase: Engineering. Confidence level: 1 and above. Unit level responsible: System integrator. System level responsible: Supplier. Requirement definition: The roles, responsibilities and mechanisms for managing changes and handling versions of all software and documents included in the baselines shall be defined and implemented. The project specific details regarding configuration items and responsibilities shall be documented. These mechanisms shall apply until a Change Control Board (CCB) for commissioning is established in the acceptance phase. The configuration management plan may be a part of the project plan for the organisation, see activity B.PM.1. Separate configuration management mechanisms may be established for the unit and individual systems. 
The unit level mechanism typically does not provide for a software repository, but this should be in place at the supplier level. A typical way to manage changes to baselines is to let one decision body (normally called a CCB) make an impact analysis of proposed changes, make a yes or no decision, and communicate the approved changes to be performed. The version control rules should also encompass software/documents/information that has not been included in a formal baseline. Tools for version control are often utilised. This activity focuses on the configuration management during the engineering and construction phases of the project; for the acceptance phase, other mechanisms are typically needed, see D.CM.1.

Configuration management plan: definition of a Change Control Board (CCB) process or similar, identification of required baselines, required baseline content, and change request forms. Change requests and change decisions. Version history information of baselines. Defined rules and mechanisms for version control. Effective implementation of version control mechanisms.

Acceptable contributions from Owner: Approval of configuration management mechanisms and participation in the Change Control Board (CCB) for unit level baselines.
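The CCB mechanism described above (impact analysis of a proposed change, then a yes/no decision that is communicated) can be sketched as a small change-request workflow. The class names, status values and example change request below are illustrative assumptions, not part of the standard.

```python
from dataclasses import dataclass

@dataclass
class ChangeRequest:
    """A proposed change to a baselined item (illustrative sketch)."""
    cr_id: str
    baseline_item: str          # e.g. "unit design", "interface spec"
    description: str
    impact_analysis: str = ""   # filled in before the CCB decides
    status: str = "submitted"   # submitted -> analysed -> approved/rejected

class ChangeControlBoard:
    """Minimal CCB: records an impact analysis, then a yes/no decision."""
    def __init__(self):
        self.log = []

    def analyse(self, cr, impact):
        cr.impact_analysis = impact
        cr.status = "analysed"

    def decide(self, cr, approve):
        # The CCB only decides on changes whose impact has been analysed.
        if cr.status != "analysed":
            raise ValueError("CCB decides only after impact analysis")
        cr.status = "approved" if approve else "rejected"
        self.log.append((cr.cr_id, cr.status))

ccb = ChangeControlBoard()
cr = ChangeRequest("CR-001", "interface spec", "add heartbeat signal")
ccb.analyse(cr, "affects two suppliers; retest of FAT cases 12-14")
ccb.decide(cr, approve=True)
```

The log kept by the board doubles as the version history information of baselines mentioned in the assessment items.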
B.DES.1 Design the system

Phase: Engineering. Confidence level: 1 and above. Unit level responsible:. System level responsible: Supplier.

Requirement definition: The design for the system shall be developed and documented. The design shall identify the major subsystems and components and how they interact. All external interfaces shall be documented and coordinated with other relevant systems. The design shall show the software components of the system and indicate which parts are developed for the specific project and which parts are reused/configured.

The definition of external interfaces is related to the coordination of the inter-system interfaces, see activity B.INT.2. Some of the interface definition may be done during the design of software components, see B.DES.2. The system design can normally not be finalized until the unit design has been refined in activity B.DES.4. CL2 & CL3 systems are usually designed to be free of, or resilient to, single points of failure.

Design for system (hardware & software): functional description, user interface descriptions, block/topology diagrams with software components, external interface descriptions and internal interface descriptions.

Acceptable contributions from System integrator: Review the functional descriptions and system topology.

B.DES.2 Design each software component

Phase: Engineering. Confidence level: 2 and above. Unit level responsible:. System level responsible: Supplier.

Interface description (FI), Functional description (FI) and Block (topology) diagram (FI) reviewed in B.IV.1 at CL3.

Requirement definition: Design for each software component shall be documented. The design shall identify the major parts of the software component, their functions and behaviour and describe the interactions and behaviour between these parts and with the hardware. The design shall describe which hardware runs the software component and its parts.
The intent and assumptions of the designer shall be clearly visible. The known limitations of the design shall be listed. The design shall show which parts are developed within the project and which parts are reused, e.g. standard software or legacy software.

This activity aims at detailing the software design, which is a sub-set of the system design defined in B.DES.1. The software component design normally contains information about the structure and behaviour of software components (e.g. state diagrams, sequence diagrams, class diagrams, etc.) along with a description of the functionality of the component and internal interfaces. The design may be documented using various recognised methods and means, from models and databases to diagrams and documents. The design should not be confused with the actual software.

Component design for each software component, in sufficient detail to proceed to implementation of the software: structural description, functional description, behaviour description, parameters (default, intervals, as-designed), interface descriptions, allocation of software to hardware, and assumptions and known limitations of the design.

Software design description (FI) reviewed in B.IV.1 at CL3.
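A behaviour description of the kind mentioned above (e.g. a state diagram) can also be captured in a machine-checkable form. The sketch below expresses a hypothetical state diagram for a pump-control software component as a transition table; all states and events are invented for illustration, not taken from the standard.

```python
# Hypothetical state diagram for a pump-control component, expressed as a
# transition table: (current state, event) -> next state.
TRANSITIONS = {
    ("stopped", "start_cmd"): "starting",
    ("starting", "pressure_ok"): "running",
    ("starting", "timeout"): "fault",
    ("running", "stop_cmd"): "stopped",
    ("running", "overpressure"): "fault",
    ("fault", "reset"): "stopped",
}

def run(events, state="stopped"):
    """Replay an event sequence against the design's transition table."""
    for ev in events:
        # Events with no defined transition leave the state unchanged.
        state = TRANSITIONS.get((state, ev), state)
    return state

final = run(["start_cmd", "pressure_ok", "overpressure", "reset"])
```

Writing the behaviour down this way makes the designer's assumptions and known limitations (e.g. which events are ignored in which states) explicit and reviewable.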
B.DES.3 Use established design guidelines and methods

Phase: Engineering. Confidence level: 2 and above. Unit level responsible: System integrator. System level responsible: Supplier.

Requirement definition: Appropriate guidelines and methods shall be used to ensure consistent design of the inter-system behaviour, the systems and their components.

In this context the term 'guidelines' also includes methods, techniques, tools etc. Some guidelines may be mandatory to follow and are referred to as rules. The definition and identification of applicable guidelines can be performed as a part of the activities A.PQA.2 and B.PQA.1. The supplier typically uses the design guidelines as input to the design activities B.DES.1, B.DES.2 and B.RAMS.2. The system integrator typically uses the design guidelines as input to the activities B.DES.4 and B.RAMS.2. If applicable, the guidelines should also be selected according to the safety integrity level. Established and well-known design guidelines, techniques and measures are commonly available; for example, IEEE 12207, IEEE 1016, IEC 61499, part 1 or IEC 61508, part 7.

System design guidelines: including RAMS related aspects. Unit design guidelines: including RAMS related aspects.

B.DES.4 Analyse and refine the unit design

Phase: Engineering. Confidence level: 2 and above. Unit level responsible: System integrator. System level responsible:.

This activity can usually not be completed until the suppliers and systems have been selected and awarded contracts, because both documentation and participation from the suppliers are normally needed in order to establish the common solutions. The system design from the suppliers is created in activity B.DES.1. The unit architecture/design is created in activity A.DES.1. The operational scenarios are defined in activity A.REQ.3.
The unit architecture may contain a number of views:

- The physical layout describes the way components and networks are physically distributed around the facility, including the way they are interconnected (network and main CPUs).
- The logical view of the function describes the function in terms of functionality towards its users.
- The scenarios or dynamic view describes the primary interaction between components when a function is executed, and how the users interact with the functions.
- The process view of the function describes the processes and components that compose the function, as well as the responsibility of individual components and processes, their interaction, triggers and cycle times.
- The physical view of the function (or allocation) describes the mapping of the main processes/software components onto the hardware components.
- The development view of the function describes the breakdown of the function into sub-functions.

RAMS design guidelines and methods for the vessel (FI) used in B.IV.1 at CL3. RAMS design guidelines and methods for the system (FI) used in B.IV.1 at CL3.

Requirement definition: The system designs shall be analysed in conjunction with the unit architecture and operational scenarios to identify critical behavioural interactions and complex data flows. Common solutions shall be established, standard alarms and signals defined, common constraints identified, and rules for handling of shared data resources defined. The unit architecture documentation shall be updated to reflect the results of the analysis.

Updated unit design documentation: unit design specifications, systems/network topology with software components, interface specifications, and functional descriptions. Interface description (FI) reviewed in B.IV.1 at CL3; Functional description (FI) reviewed in B.IV.1 at CL3; Block (topology) diagram (FI) reviewed in B.IV.1 at CL3 and used in B.IV.2 at CL2 and CL3.

Acceptable contributions from Owner and Supplier.
Owner: Review the unit design documents. Supplier: Give input to the system integrator in the form of system design information, topology, software structure, and interfaces for the supplier's system(s). Participate in review of the common solutions (unit design).
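The allocation view above (mapping of software components onto hardware) can be checked mechanically for single points of failure, which CL2 and CL3 systems are usually expected to avoid (see B.DES.1). The component and node names below are assumptions for illustration only.

```python
from collections import defaultdict

# Hypothetical allocation: (function, redundant instance) -> hardware node.
ALLOCATION = {
    ("dp_controller", "A"): "cpu1",
    ("dp_controller", "B"): "cpu2",   # redundant pair on separate CPUs
    ("alarm_server", "A"): "cpu1",
    ("alarm_server", "B"): "cpu1",    # both instances on one CPU: SPOF
}

def single_points_of_failure(allocation):
    """Functions whose every redundant instance runs on one hardware node."""
    nodes = defaultdict(set)
    for (function, _instance), node in allocation.items():
        nodes[function].add(node)
    return sorted(f for f, n in nodes.items() if len(n) == 1)
```

Such a check is only a sketch of one analysis step; a real unit design analysis would also cover network paths, power supplies and other shared resources.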
B.DES.5 Define obsolescence strategy

Phase: Engineering. Confidence level: 2 and above. Unit level responsible: Owner. System level responsible: Supplier.

Requirement definition: Obsolescence risks shall be assessed, based on top level architecture and make/buy/reuse analyses. The strategy/plan for the future management of obsolescence shall be defined.

The owner is responsible for the overall obsolescence strategy while the supplier makes obsolescence statements for the individual system(s).

Obsolescence management plan: authorised vendor list, spare parts list (hardware & software), stock, alternate spare parts list, management of intellectual property. Obsolescence criteria for software. Manufacturer preferred equipment list.

Acceptable contributions from System integrator and Supplier. System integrator: Review of obsolescence strategy/plan. Assessment of obsolescence of the computer and networking hardware and software. Supplier: Provide obsolescence statements regarding the systems and components, including software components.

B.INT.1 Define integration plan

Phase: Engineering. Confidence level: 2 and above. Unit level responsible: System integrator. System level responsible: Supplier.

Requirement definition: Roles and responsibilities for planning and executing the integration of all parts of the systems in ISDS scope shall be defined and documented. The plan for integration of the different systems into the unit shall be defined and documented. The criteria, procedures, and the environment for the integration shall be described. The integration plan shall also describe the type of tests that will be performed to ensure that systems and components interact correctly. As a minimum, the supplier must define the integration readiness criteria for the components. When required, the system integrator shall ask the supplier for a more comprehensive integration plan for integration of components into a system.
The integration plan should describe the steps to follow to assemble the systems and components of the system. The integration plan should take into account the need for test stubs or simulators to perform integration tests (see C.VV.3 and D.VV.4). The integration readiness criteria will be used in activity C.INT.1. The output from this activity can be combined with the output from B.VV.1 Define verification and validation strategy, in order to avoid duplication of information regarding test strategies. The various roles and responsibilities regarding integration should be defined and documented at an early stage, while the integration sequences and environments may be defined later. This means that the integration plan may need to be updated in the construction phase. Special attention should be placed on identifying boundaries and exclusions of each organization's responsibility. A separate integration plan for a system is normally only required for complex systems, e.g. the drilling package.

Plan for integration of systems into the unit: the responsibilities of the different organizations, dependencies among systems, sequence for integration, integration environment, tests and integration readiness criteria. Plan for integration of sub-systems and components into systems (when required): dependencies among systems, sub-systems and components, sequence for integration, integration environment, tests and integration readiness criteria. Integration plan (FI): reviewed in B.IV.2 at CL2 and CL3; used in C.IV.1 at CL3.

Acceptable contributions from Supplier: Provide information about the system dependencies, and other considerations for integrating their system(s) into the unit. Acknowledge assigned responsibilities. When requested, provide a plan for integration of sub-systems and components into a system.
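The integration readiness criteria required of the supplier in B.INT.1 can be made explicit as a checklist evaluated before a component enters integration (C.INT.1). The criteria and field names below are an assumed example; the standard does not prescribe a specific set.

```python
# Hypothetical readiness check for a component entering integration.
def integration_ready(component):
    criteria = {
        "module tests passed": component.get("module_tests_passed", False),
        "interfaces baselined": component.get("interfaces_baselined", False),
        "no open blocking defects": component.get("open_blockers", 1) == 0,
        "version identified": bool(component.get("version")),
    }
    failed = [name for name, ok in criteria.items() if not ok]
    return (len(failed) == 0, failed)

ok, failed = integration_ready({
    "module_tests_passed": True,
    "interfaces_baselined": True,
    "open_blockers": 0,
    "version": "1.4.2",
})
```

Recording the failed criteria, not just a pass/fail flag, gives the system integrator the evidence trail the plan calls for.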
B.INT.2 Coordinate inter-system interfaces

Phase: Engineering. Confidence level: 2 and above. Unit level responsible: System integrator. System level responsible:.

Requirement definition: An interface matrix or a similar mechanism shall be established to identify inter-system interfaces and assign the responsibility for defining and testing them. The status of the interface definition and verification shall be tracked. Changes to the inter-system interfaces shall be controlled and coordinated.

This activity is related to the specification of systems and their interaction (A.DES.1, A.REQ.3, and B.DES.4). Controlling the changes to the inter-system interfaces may be performed by including the interface matrix and the interface definitions in baselines (see B.CM.1). Interface specifications are typically documented by the suppliers as a part of the design of the systems (B.DES.1) and may be included in design documents or documented separately. Multiple interface specification documents may be prepared, or they may be packaged into a common document. Typically, the system integrator will manage the baseline of inter-system interface specifications.

Interface overview/matrix information with assigned responsibilities. Agreed inter-system interface specifications containing: protocol selected, definition of commands, messages, data and alarms to be communicated, and specifications of message formats. Interface definition and verification status. Interface description (FI) used in B.IV.2 at CL2 and CL3.

Acceptable contributions from Supplier: Notify the system integrator when changes are needed to inter-system interfaces. Cooperate with other suppliers in the definition of inter-system interfaces. Update inter-system interface specifications when required.

B.PM.1 Establish the project plan for each organisation

Phase: Engineering. Confidence level: 1 and above. Unit level responsible: System integrator. System level responsible: Supplier.

Requirement definition: A project plan shall be established and maintained, in coordination with other stakeholders, for engineering and all succeeding phases. The plan shall include at least schedules and resource allocation of software related activities and take into account the activities for the different phases of the project.

In practice, the plan may be detailed for the current phase and roughly outlined for later phases; as those phases approach, the plan may then be detailed. As the acceptance phase is often a critical period for the project, it is recommended that this phase be planned in detail to reduce project risks. The plan for the phases should be established and maintained in coordination with other stakeholders, based on the master plan for the project.

Schedule. Project plan: WBS, technical attributes used for estimating, effort and costs estimates, deliverables and milestones, configuration management plan. Resource allocation.

Acceptable contributions from Owner: Provide inputs to the project plan, including users' schedules and constraints.
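The interface matrix and status tracking described in activity B.INT.2 can be sketched as a simple table keyed by system pair; the system names, responsible parties and status fields below are illustrative assumptions.

```python
# Hypothetical inter-system interface matrix: each entry tracks who is
# responsible and how far definition and verification have progressed.
matrix = {
    ("power_mgmt", "thruster_ctrl"): {"responsible": "Supplier A",
                                      "defined": True,  "verified": True},
    ("drilling", "alarm_system"):    {"responsible": "Supplier B",
                                      "defined": True,  "verified": False},
    ("dp_system", "vdr"):            {"responsible": "Supplier C",
                                      "defined": False, "verified": False},
}

def open_interfaces(matrix):
    """Interfaces not yet fully defined and verified, for status tracking."""
    return sorted("-".join(pair) for pair, s in matrix.items()
                  if not (s["defined"] and s["verified"]))
```

A report like this gives the system integrator the "interface definition and verification status" the activity asks to have tracked.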
B.PM.2 Coordinate and integrate the project plans with the master plan

Phase: Engineering. Confidence level: 1 and above. Unit level responsible: System integrator. System level responsible:.

Requirement definition: The master plan of the system should be refined and made consistent with suppliers' or other stakeholders' plans and schedules.

The initial master plan is provided in activity A.PM.1. The integration plan (see B.INT.1) and the RAMS plan (see B.RAMS.4) should be taken into account and made visible in the master and project plans. The verification and validation strategies (see B.VV.1) normally also give inputs that impact the project plans.

Master plan. Project plans.

Acceptable contributions from Owner and Supplier: Coordinate plans and schedules.

B.PQA.1 Define procedures (supplier)

Phase: Engineering. Confidence level: 1 and above. Unit level responsible:. System level responsible: Supplier.

Requirement definition: Procedures to be used within the project shall be defined, coordinated, and agreed within and between organisations participating in the project. Roles, responsibilities and specific requirements as defined in this standard shall be explicitly addressed.

The word procedures in this context is used to represent all documentation regarding the way of working, e.g. process descriptions, standard operating procedures, work instructions, checklists, guidelines etc. Some of the supplier's procedures should be coordinated with the procedures given by the owner (see A.PQA.1) and the system integrator (see A.PQA.2). The defined procedures are normally made up of quality management system documents, e.g. standard operating procedures, process descriptions, checklists, and document templates, along with any project specific adaptations of these. The following areas are normally expected to be covered:

- All activities required to be performed by the supplier, listed in Ch.2 Sec.7.
- Responsibilities and authorities of different disciplines and roles.
- Mechanisms for submitting and receiving information (documents) between different organisations.
- Mechanisms for defining baselines of information (documents).
- Mechanisms for handling of documents and information while they are work in progress.
- Mechanisms for review and approval of drawings and other documents.
- Mechanisms for approval of deliverables (verification mechanisms).
- Mechanisms for handling of changes to already agreed technical scope, schedule or costs.
- Mechanisms for escalation of problems (see C.PQA.1).
- Mechanisms for follow-up of process adherence (see X.PQA.3 and X.PQA.6).
- Mechanisms allowing management insight into the project's status.
- Internal procedures and rules for the work to be carried out in the project, e.g. guidelines for design and implementation (see B.DES.3 and C.IMP.4).
- Internal procedures and rules regarding how to document the requirements, design, implementation, verification & validation, and acceptance of the systems within ISDS scope.

A quality system, documents, minutes of meetings, or other relevant information showing: a defined way of working for the major activities in the project; clear roles and responsibilities; and defined ways of interaction between the different organizations (e.g. owner, system integrator, supplier, independent verifier, and others).
B.RAMS.1 Identify software-related RAMS risks and priorities

Phase: Engineering. Confidence level: 2 and above. Unit level responsible: System integrator. System level responsible: Supplier.

Requirement definition: Software-related RAMS hazards and risks shall be identified and prioritised. Established methods shall be used for the risk identification. Risks identified at unit and system level shall be shared between relevant roles in the project.

The RAMS risks are those risks related to the unit (and systems) in operation. At CL1, RAMS risks may be identified along with the project risks. At higher confidence levels, special methods should be used, such as: Fault Tree Analysis, HAZID, HAZOP, FME(C)A.

RAMS hazard and risk list showing consideration of software risks. Defined risk identification and analysis methods. Relevant risks are communicated to other roles.

Acceptable contributions from Owner: Give input to, and participate in, risk identification on unit level.

B.RAMS.2 Identify RAMS risk mitigation actions

Phase: Engineering. Confidence level: 2 and above. Unit level responsible: System integrator. System level responsible: Supplier.

This activity builds upon the output from activity B.RAMS.1. Mitigation actions can be identified on all levels: the unit, system, subsystem, hardware components, and software components. Typical risk mitigation actions include: changes to requirements or design, changed or added verification activities, changes to operational procedures, changes to documentation etc. RAMS risk register (FI) and RAMS risk analysis documentation (FI) reviewed in B.IV.3 at CL3.

Requirement definition: Identified software-related hazards and risks shall be analysed to identify mitigation actions. Relevant mitigation actions shall be shared between relevant roles in the project.

RAMS hazard and risk mitigation list showing mitigation actions for software risks.
Relevant mitigation actions are communicated to other roles.

B.RAMS.3 Consider software failure modes in safety analysis activities

Phase: Engineering. Confidence level: 1 and above. Unit level responsible:. System level responsible: Supplier.

Requirement definition: When performing safety analysis, potential software failure modes shall be included in the analysis.

Examples of safety analysis methods include, but are not limited to: FME(C)A, HAZOP, HAZID, Fault Tree Analysis, Bowtie, safety cases, etc. Examples of software failure modes include, but are not limited to: no output or control action not provided when needed; wrong output or wrong control action provided; spurious output or control action provided when not needed; late output or control action not provided on time; output or control action stops too soon.

RAMS risk register (FI): reviewed in B.IV.3 at CL3; used in C.IV.3 and D.IV.3 at CL3. Safety analysis showing consideration of software failure modes. Safety assessment report (FI) used in C.IV.3 at CL3.
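One way to apply the generic software failure modes of B.RAMS.3 systematically is to generate one analysis row per function per failure mode, so that no combination is silently skipped. The function names below are invented for illustration; the failure-mode list paraphrases the examples in the activity.

```python
# Generic software failure modes, following the examples in B.RAMS.3.
FAILURE_MODES = [
    "no output / control action not provided when needed",
    "wrong output or wrong control action provided",
    "spurious output / control action provided when not needed",
    "late output / control action not provided on time",
    "output or control action stops too soon",
]

def fmea_worksheet(functions):
    """One empty analysis row per (function, failure mode) pair."""
    return [{"function": f, "failure_mode": m, "effect": "", "mitigation": ""}
            for f in functions for m in FAILURE_MODES]

rows = fmea_worksheet(["ballast valve control", "fire damper control"])
```

The empty effect/mitigation fields are then filled during the analysis session, which keeps the evidence that every failure mode was considered.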
B.RAMS.4 Develop the RAMS plan for the system

Phase: Engineering. Confidence level: 2 and above. Unit level responsible:. System level responsible: Supplier.

Requirement definition: RAMS activities shall be planned, including methods and tools to be used. The RAMS plan for the system shall be derived from the RAMS plan for the unit.

This activity builds on the output from activity A.RAMS.3. The methods, tools and procedures used in all RAMS-related activities should be defined, documented and put under configuration control. The RAMS plan should cover all relevant RAMS activities included in this standard. This activity may be coordinated with the activities B.PM.1 and B.PQA.1, and the RAMS plan may be a separate document, a part of the project plan, or part of a quality assurance plan. Safety may be dealt with separately, for example in a functional safety management plan.

Plan showing objectives, methods, tools, and procedures to be used, consistent with the RAMS plan for the unit. Schedule of RAMS activities. RAM data to be collected (CL3). Plan for handling of RAMS (FI): reviewed in B.IV.4 at CL3; used in C.IV.3 at CL3.

Acceptable contributions from System integrator: Review the RAMS plan for the system to check consistency with the RAMS plan for the unit.

B.REQ.1 Submit proposals to system integrator with compliance status

Phase: Engineering. Confidence level: 1 and above. Unit level responsible:. System level responsible: Supplier.

Requirement definition: The supplier shall submit its proposal in response to the system integrator's request. The proposal shall contain software lifecycle information and a compliance status towards the system requirements.

The supplier should provide a breakdown of the systems conforming to the make/buy/reuse analyses, and take care to identify any required customisation or parameterisation of existing products, in particular of existing software products.
Submitted technical proposal for the system: system breakdown, alternatives and options, description of customisation or parameterisation of existing products (including software), requirements compliance matrix and software lifecycle information (including licensing, ownership and obsolescence). Specification (FI) used in B.IV.1 at CL3.

B.REQ.2 Refine system requirements into software component requirements

Phase: Engineering. Confidence level: 2 and above. Unit level responsible:. System level responsible: Supplier.

Requirement definition: For each software component of the system, requirements shall be refined and allocated into component requirements. If the system is configured or generated from an existing base product or components, there shall be explicit allocation of requirements to the base-product part and the custom part.

It should be possible to distinguish which requirements are satisfied by the base product as-is, which requirements can be satisfied by a simple customisation, and which requirements actually need specific developments.

Refined component requirements and specification. Requirement allocation matrix. Specifications (FI) used in B.IV.1 at CL3.
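The explicit allocation B.REQ.2 requires (base product as-is, simple customisation, or specific development) can be captured in a small allocation matrix; the requirement IDs and categories below are illustrative assumptions.

```python
# Hypothetical allocation of system requirements to software parts.
# Categories follow B.REQ.2: satisfied by the base product as-is,
# by configuration/customisation, or by new development.
ALLOCATION = {
    "SR-001": ("base_product", "standard alarm handling"),
    "SR-002": ("configuration", "site-specific setpoints"),
    "SR-003": ("new_development", "custom drilling sequence"),
}

def unallocated(requirement_ids, allocation):
    """Requirements with no allocation yet; should be empty before design."""
    return sorted(r for r in requirement_ids if r not in allocation)
```

A completeness check like `unallocated` supports the requirement allocation matrix named in the assessment items.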
B.REQ.3 Detail operational scenarios

Phase: Engineering. Confidence level: 2 and above. Unit level responsible:. System level responsible: Supplier.

Requirement definition: The key operational scenarios shall be developed and detailed to show the interaction between the end user and the system(s), along with the interaction between different sub-systems or components in the system(s).

Use case descriptions may be used in order to drill down into operational scenarios and describe the interaction between the end user and the system. Use case descriptions should include both the normal sequence and relevant alternative and error sequences. Performance targets should be defined for use cases. Use case realisation diagrams (often in the form of sequence diagrams or collaboration diagrams) may be used to show the interaction between different systems and components when performing specific use cases.

System/component behaviour and interaction specification and descriptions: use cases, sequences (including signal usage), state diagrams, interlocks, degraded sequences, performance targets, and constraints and limitations.

B.VV.1 Define verification and validation strategy

Phase: Engineering. Confidence level: 1 and above. Unit level responsible: System integrator. System level responsible: Supplier.

The purpose of the verification and validation strategies is to ensure that testing and evaluation is performed efficiently and effectively through a combination of complementary methods. Relevant stakeholders, e.g. owner, system integrator and supplier, should review and approve the verification and validation strategy for the elements in their scope. The supplier's verification strategy should take into account the software module test (C.IMP.3) in addition to all relevant VV activities described in this standard.
The system integrator should define the verification and validation strategy at the unit level, while the suppliers should define their verification and validation strategies at the lower levels. Both should ensure their respective strategies are consistent with each other, and that relevant verification and validation activities and requirements described in this standard are covered. The validation strategy should be consistent with, complementary to, and as far as possible non-redundant with the verification strategy. Care should be taken to identify the purpose of each test activity performed (for example, test for verification against a requirement, or test for validation against a user scenario). Detailed procedures and tools for testing may be established during the construction phase (see C.VV.3, C.VV.8 and D.VV.1). The information listed under the Assessment criteria may be a part of a project plan, a project quality plan or similar. Specifications (FI) used in B.IV.1 at CL3.

Requirement definition: The verification and validation strategies shall be defined and documented. The strategy shall include a list of verification and validation activities, the test environment, the purpose of each activity, the method to be employed in the activity, the quality criteria (quality objectives) to be fulfilled, and the organisational responsibility for conducting and recording the activity. On the system level, the verification and validation strategies shall distinguish between base product software, configured software, modified software, newly developed software, and defect corrections. The verification strategy shall define the means to ensure the unit/system meets its requirements.
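The distinction the requirement makes between software categories can be made operational as a matrix mapping each category to a minimum set of verification activities. The mapping below is an assumed example, not mandated by the standard; a real strategy would be agreed with the stakeholders.

```python
# Assumed minimum verification activities per software category (B.VV.1).
STRATEGY = {
    "base_product":      ["qualification of reused software"],
    "configured":        ["configuration review", "functional test"],
    "modified":          ["code review", "module test", "regression test"],
    "newly_developed":   ["code review", "module test", "functional test"],
    "defect_correction": ["module test", "regression test"],
}

def planned_activities(components):
    """Union of verification activities implied by the components' categories."""
    acts = set()
    for category in components.values():
        acts.update(STRATEGY[category])
    return sorted(acts)

plan = planned_activities({"hmi": "configured", "plc_logic": "modified"})
```

Deriving the plan from the categories makes it easy to show that every category of software in scope is covered by at least one verification method.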
Verification strategy: which parts to verify (unit, system, sub-system, component, module, design documents, method specification documents, etc.); which methods to use for this verification (testing, inspection, code analysis, simulation, prototyping, peer review techniques) and the quality criteria and targets; which test types to use (functional, performance, regression, user interface, negative); what environment to use for verification; and identification of the test stages (e.g. sea trials, integration tests, commissioning, FAT, internal testing, component testing) to be used for the verification and the schedule for those tests.

Validation strategy: products to be validated, validation criteria, operational scenarios, methods and environments.

Verification and validation strategy (FI): reviewed in B.IV.2 at CL2 and CL3; used in C.IV.1 at CL3, and in C.IV.2 and D.IV.1 at CL2 and CL3.

Acceptable contributions from Owner: Provide inputs based on operational scenarios, quality criteria, user needs etc. Provide inputs regarding FAT, commissioning, integration, and quay or sea trials.

B.VV.2 Review the design with respect to requirements and design rules

Phase: Engineering. Confidence level: 2 and above. Unit level responsible: System integrator. System level responsible: Supplier.

In order to secure that all relevant aspects are covered, this activity normally encompasses one or more review meetings. Both intra- and inter-disciplinary reviews are normally performed. The review should include software considerations. The design rules are defined in activity B.DES.3. Good traceability between different requirements levels eases this review; see A.REQ.4, B.REQ.2 and X.REQ.1. This activity verifies the outcomes of all DES activities in phase B.

Requirement definition: The design information shall be reviewed to verify the completeness of the design with respect to requirements (e.g.
functions, interfaces, performance, and RAMS properties), design rules, and uncertainties (e.g. technical risks, design margins, design assumptions).

Documented design review records addressing: requirements verification, design rules, and verification of uncertainties.

Acceptable contributions from Owner: Review the design from the operation and user point of view.

B.VV.3 Review consistency between design and operational scenarios

Phase: Engineering. Confidence level: 2 and above. Unit level responsible: System integrator. System level responsible: Supplier.

Requirement definition: Consistency between functions and interfaces assigned to each system and the defined operational scenarios shall be reviewed.

Minutes from review: review results considering consistency of interface/function/component/scenarios.
B.VV.4 Review interface specifications

Phase: Engineering. Confidence level: 2 and above. Unit level responsible: System integrator. System level responsible: Supplier.

Requirement definition: All inter-system interface specifications and intra-system interfaces shall be reviewed by relevant stakeholders. The review of inter-system interfaces shall be coordinated by the system integrator.

This activity verifies the outcomes of B.INT.2 and B.DES.1. Interface specifications may be included in design documents or prepared separately. Typical stakeholders for inter-system interfaces are all suppliers relying on the interface information. Typical stakeholders for intra-system interfaces are the different departments and disciplines at the supplier relying on the interface information.

Interface specification reviews addressing at least: consistency between input and output signals, frequency and scan rates, deadlocks, propagation of failures from one part to another, engineering units, network domination.

B.VV.5 Validate critical or novel user-system interactions

Phase: Engineering. Confidence level: 2 and above. Unit level responsible: System integrator. System level responsible:.

Requirement definition: Within the specification constraints for the unit, critical or novel user-system interactions shall be validated for usability and consistency.

This validation may be performed by reviewing the different systems against each other to check for usability and consistency. The validation is usually done as one or several workshops with supplier and user representatives. A document review alone is usually not sufficient. Diverse means may be used, such as simulation (from whiteboard to 3D modelling), demonstrations of similar existing systems and their interactions, etc. Definitions of typical user-system interactions, or classes of such interactions, may be used to select the interactions to be validated.
Validation records including: workshop minutes, user representatives' participation and comments, and agreed action lists.
Acceptable contributions from Owner and Supplier. Owner: provide inputs based on operational scenarios, quality criteria, user needs, etc. The users comment on usability. Supplier: provide information about system usage.
A 400 Activity definition – construction

C.ACQ.1 Accept deliverables
Phase: Construction. Confidence level: 1 and above. Unit level responsible:. System level responsible: Supplier.
Requirement definition: Deliverables from the sub-suppliers shall be submitted to formal acceptance, using predefined criteria and procedures. Acceptance tests shall at least verify that the delivered product is compliant with its specification. In case of COTS acquisition, the acquired products shall be qualified for use in the ISDS scoped system to be delivered.
The principles for the qualification mentioned in this activity are outlined in the guidance note of activity C.VV.7 - Qualify reused software. If software development or software configuration is performed by a sub-supplier, the sub-supplier shall be assessed against relevant parts of this ISDS standard.
Component acceptance data: acceptance criteria, component acceptance (FAT, SAT) test procedures, component acceptance test records, component acceptance issue and problem list, and component acceptance coverage measurements (requirements, structural).

C.ACQ.2 Ensure transition and integration of the delivered product
Phase: Construction. Confidence level: 1 and above. Unit level responsible:. System level responsible: Supplier.
Requirement definition: Documentation, examples, or support necessary to ensure the integratability of the delivered components and systems shall be planned and provided.
This activity is an extension of C.ACQ.1 and addresses the delivery from the supplier to the system integrator. The aim of this activity is to help the system integrator avoid extra integration work originating from the sub-suppliers' components.
Supplier agreement on: list of deliverables, review and approval plans, and support and maintenance agreement. Product documentation. Operation manual. Configuration information.
C.IMP.1 Develop and configure the software components from design
Phase: Construction. Confidence level: 1 and above. Unit level responsible:. System level responsible: Supplier.
Requirement definition: Software components shall be developed according to their design. Configuration and parameterisation of software shall be considered part of the development.
The software components should be developed from the design (and not the other way around). Software development includes creation of new software components, modification of existing software components, or parameterisation and configuration of new, modified and existing software components.
Developed component release note. Commented software source code. Parameters and configuration files. I/O list. Development environment configuration.

C.IMP.2 Develop support documentation
Phase: Construction. Confidence level: 2 and above. Unit level responsible:. System level responsible: Supplier.
Requirement definition: Support documentation for the components and the whole system shall be developed and checked against each other for consistency. Focus should be put on delivering this documentation early enough to be available for testing.
System and component support documentation: data sheets, user manuals, administration manuals, operating and maintenance procedures, training material and FAQs, known defects and troubleshooting guides. Review records for the support documentation.
Acceptable contributions from System integrator and Owner. System integrator: provide guidelines and rules for consistency of support documentation; review support documentation. Owner: review support documentation.

C.IMP.3 Perform software component testing
Phase: Construction. Confidence level: 1 and above. Unit level responsible:. System level responsible: Supplier.
Requirement definition: Software components shall be tested before verification or acceptance, according to the verification strategy.
The extent (coverage) of the testing and the results shall be documented and reported. This particularly applies to testing of new, modified, configured or impacted software.
Custom-made function blocks should be tested, while standard libraries and standard function blocks are usually not explicitly tested. This testing is usually performed by the software development team and is often also called software unit test. The tests are white box, meaning that knowledge about the software's internal structure is utilised to ensure that all relevant parts of the code are covered by the tests.
Software test log: list of defects, date of test, tester, test scope and pass or fail. Software defect list.
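As a minimal sketch of such a white-box component test, the hypothetical scaling function below is illustrative only (the function, its ranges and the test data are assumptions, not part of the standard); the test inputs are chosen from knowledge of the code's internal structure so that every statement, including both clamp branches, is executed:

```python
# Hypothetical software component: scales a raw sensor reading to
# engineering units, clamping out-of-range input to the valid range.
def scale_reading(raw: int, lo: float, hi: float) -> float:
    """Map a 0..4095 raw value onto [lo, hi], clamping out-of-range input."""
    if raw < 0:
        raw = 0
    if raw > 4095:
        raw = 4095
    return lo + (hi - lo) * raw / 4095.0

# White-box unit tests: inputs chosen so every statement (including
# both clamp branches) is executed at least once.
def test_scale_reading():
    assert scale_reading(0, 0.0, 100.0) == 0.0       # lower bound
    assert scale_reading(4095, 0.0, 100.0) == 100.0  # upper bound
    assert scale_reading(-5, 0.0, 100.0) == 0.0      # low clamp branch
    assert scale_reading(5000, 0.0, 100.0) == 100.0  # high clamp branch

test_scale_reading()
```

The software test log would then record the scope of these tests, the tester, the date and the pass/fail outcome per case.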
C.IMP.4 Use established software implementation guidelines and methods
Phase: Construction. Confidence level: 2 and above. Unit level responsible:. System level responsible: Supplier.
Requirement definition: Appropriate guidelines and methods shall be used to enhance RAMS of system and components.
Guidelines typically address naming conventions, coding style, patterns to be used or avoided, etc. Some guidelines may be mandatory to follow and are referred to as rules. IEC references a set of guidelines and methods. Confidence level and safety level should be considered when selecting and defining guidelines. Verification that the guidelines and methods have been followed is typically done in activity C.VV.5 Perform code analysis on new and modified software and in the review activities C.VV.1 and C.VV.2.
Software guidelines/standards/rules/checklists/automated checks. Review records.

C.INT.1 Check readiness status of systems and components before integration
Phase: Construction. Confidence level: 2 and above. Unit level responsible: System integrator. System level responsible: Supplier.
Requirement definition: Readiness of the systems and components for integration shall be checked before starting integration. The criteria defined in the integration plan shall be used to assess readiness.
The verification analysis report from C.VV.6 and the RAMS compliance report from C.RAMS.1 may be used as a basis for the check. See activity B.INT.1 for a description of the integration plan.
Integration readiness criteria fulfilled per component and per system.

C.PQA.1 Establish procedures for problem resolution and maintenance activities in the construction and acceptance phases
Phase: Construction. Confidence level: 1 and above. Unit level responsible: System integrator. System level responsible:.
Requirement definition: Establish procedures for receiving, recording, resolving, and tracking problems and modification requests.
Maintenance procedures covering software updates, backup and rollback shall also be established.
The procedures required here should be coordinated with the requirements in D.CM.1 Manage software changes during commissioning. Care should be taken to describe procedures for resolving joint problems at the interface of two or more systems within ISDS scope. Starting at FAT, there should be a joint process for resolving problems.
Agreed maintenance procedures: procedures for general system maintenance activities and procedures for software update, backup and rollback. Agreed problem resolution procedures: procedures for receiving, recording, resolving and tracking problems (punches) and modification requests.
Acceptable contributions from Supplier. Provide inputs to the procedures for problem resolution and maintenance activities and participate in the problem resolution process.
C.RAMS.1 Demonstrate achievement of system RAMS requirements
Phase: Construction. Confidence level: 2 and above. Unit level responsible:. System level responsible: Supplier.
Requirement definition: RAMS arguments and evidence shall be assembled to demonstrate achievement of RAMS requirements and objectives on the system level.
The RAMS requirements and objectives are defined in activities A.RAMS.1 and A.RAMS.2. The arguments and evidence may include: safety analysis, FME(C)A, test cases, test results, inspections, computations and simulations, certifications, qualification of legacy systems, etc.
RAMS compliance analysis information. RAMS compliance report (FI) reviewed in C.IV.3 at CL3.

C.RAMS.2 Evaluate software systems and software components against RAM objectives
Phase: Construction. Confidence level: 3. Unit level responsible:. System level responsible: Supplier.
Requirement definition: Software systems and software components shall be specifically checked against RAM requirements and objectives using data collected from internal testing, FAT and other verification activities.
This activity is an extension of the activity C.RAMS.1. The RAM data is typically statistical data collected and analysed using models such as Weibull, PDS (SINTEF STF38A97434), etc.
RAM report: calculations of RAM values for designated systems and RAM data. RAM report (FI) reviewed in C.IV.3.

C.RAMS.3 Prepare a plan for system maintenance during operation
Phase: Construction. Confidence level: 1 and above. Unit level responsible:. System level responsible: Supplier.
The output from this activity forms the basis for the execution of the maintenance in activity E.RAMS.1. The finalised plan is normally needed in order to perform the activity D.CM.3 Transfer responsibility for system configuration management to owner.
Requirement definition: A plan for the maintenance of the system during operation shall be defined, describing maintenance-related functions like restarts, backups and replacement of equipment/parts during operation.
Maintenance management plan: configuration items, rules for operation/maintenance, backup and restore procedures, expected maintenance activities, expected software update, migration and retirement activities, schedules and tailored procedures for maintenance in operation.
Acceptable contributions from Owner. Provide inputs to the development of the maintenance plan.
C.VV.1 Perform peer-reviews of software
Phase: Construction. Confidence level: 1 and above. Unit level responsible:. System level responsible: Supplier.
Requirement definition: Peer reviews of new, modified and configured/parameterised software shall be performed. The consistency of the software with its design documents shall be checked.
In this context the term software is used to represent all types of source code, graphical notations and models that are readable by humans and used to generate machine-readable programs that can be executed by a computer. Peer reviews are a means complementary to testing for detecting defects as early as possible in the development cycle. Peer reviews are often also used to verify that applicable guidelines and methods have been applied in the software, see activity C.IMP.4.
Peer review methodology description. Peer review schedule. Peer review records. Peer review check lists. Software peer review records (FI) used in C.IV.1 at CL3.

C.VV.2 Review software parameterisation data
Phase: Construction. Confidence level: 1 and above. Unit level responsible:. System level responsible: Supplier.
Requirement definition: Review the software parameterisation for completeness and correctness.
Successful parameterisation of the software depends on the quality of the data taken into account by the supplier and provided by the system integrator documenting the physical or dynamic properties of the unit and its various systems. In practice, the supplier responsible and the system integrator usually review the quality of the parameterisation together. The input data for the parameterisation is typically an outcome of B.DES.1.
Parameter list review report: name, value, tolerance, function.
Acceptable contributions from System integrator. Participate in the review of key parameters, checking for: completeness, minimum level of uncertainties, accurate description of the unit.
C.VV.3 Perform internal testing
Phase: Construction. Confidence level: 2 and above. Unit level responsible:. System level responsible: Supplier.
Requirement definition: Developed systems and components shall be verified before the decision to release for FAT. The system and component verification shall be analysed and compared with expected quality attribute targets, e.g. RAMS (qualitative). Tests shall include:
- white box tests with 100% statement coverage of new and modified software
- interface testing covering I/O for both bus interfaces and hardwired interfaces
- tests of relevant intra-system interface dynamics by defined scenarios
- tests of relevant inter-system interface dynamics by defined scenarios
- normal functionality
- error situations and corresponding alarms
- degraded functionality
- integration between hardware and software.
All requirements mapped to the system shall be tested. Functions or properties that cannot be tested before FAT or even in FAT shall be clearly identified.
The ambition levels for these tests should be clearly described in the verification strategy (see B.VV.1) and based on the minimum assessment criteria described here. Details regarding the mapping of requirements to tests are covered in the activity X.REQ.1. The software component test described in activity C.IMP.3 can be used as a means of white box testing. The other aspects of the internal testing are normally performed after the software component test and are sometimes referred to as an internal acceptance test. The limitations of the internal test environment and the use of tools like e.g. test drivers, simulators and emulators should be clearly described.
Test procedures. Test reports.
Acceptable contributions from System integrator. Review the internal test results and analyse the impact on the verification strategy.

C.VV.4 Perform high integrity internal testing
Phase: Construction. Confidence level: 3. Unit level responsible:. System level responsible: Supplier.
Requirement definition: Internal high integrity tests shall be performed. Tests shall include:
- white box tests with 100% decision coverage of new and modified software
- statistical testing
- detailed performance testing
- stress testing
- FME(C)A based testing
- security related testing
- start, restart and shutdown testing.
Functions or properties that cannot be tested before FAT or even in FAT shall be clearly identified.
This activity is an extension of the internal test described in C.VV.3, taking into account the additional quantitative RAMS requirements at CL3. The ambition levels for these tests should be clearly described in the verification strategy and based on the minimum assessment criteria described here.
Test procedures. Test reports. Test procedure at manufacturer (FI) and Test report at manufacturer (FI) used in C.IV.1 at CL3.
Acceptable contributions from System integrator. Review the internal test results and analyse the impact on the verification strategy.
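The difference between the 100% statement coverage required in C.VV.3 and the 100% decision coverage required here can be illustrated with a minimal sketch (the alarm function and the limit values are hypothetical, chosen only to show the distinction):

```python
# Hypothetical alarm check used to contrast the two coverage criteria.
def check_level(level: float, limit: float) -> str:
    status = "normal"
    if level > limit:
        status = "alarm"
    return status

# A single test executes every statement, giving 100% statement coverage:
assert check_level(12.0, 10.0) == "alarm"

# 100% decision coverage additionally requires the False outcome of the
# `if` to be exercised, i.e. a second test:
assert check_level(8.0, 10.0) == "normal"
```

Decision coverage therefore strictly subsumes statement coverage, which is why it is reserved for the higher confidence level.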
C.VV.5 Perform code analysis on new and modified software
Phase: Construction. Confidence level: 2 and above. Unit level responsible:. System level responsible: Supplier.
Requirement definition: Adequate software source code verification shall be performed against a predefined set of rules. Code analysis shall be performed on new and modified software.
For example, the source code verification may be performed by peer reviews, static analysis, or complementary specific analyses such as schedulability analysis. Semi-automated or manual methods may be used. See ISO for other recommended means. See ISO 9126 for internal quality characteristics. If this activity is performed by peer reviews, it can be combined with C.VV.1 provided the software is checked against both the design and the rule set. The code rule set depends on the type of programming language (e.g. a strongly typed language) and the application of e.g. PLCs or standard software, and can be combined with the rules mentioned in activity C.IMP.4.
Software code verification: peer review reports, code analysis reports and code rule set. Software code analysis record (FI) used in C.IV.1 at CL3.

C.VV.6 Analyse verification results with respect to targets
Phase: Construction. Confidence level: 2 and above. Unit level responsible:. System level responsible: Supplier.
Requirement definition: The results of all verification activities performed by the supplier shall be evaluated and compared with the verification targets as defined in the verification strategy. Defects (punches) shall be tracked and analysed. Criteria for classification of defects shall be established. Defects shall be analysed for trends to identify defect-prone software.
The purpose of this activity is to check if the system is fit for purpose and to identify actions to improve defect-prone software in a fast and effective manner. The targets are defined in activity B.VV.1.
The quality status of the system under development shall be measured.
Verification result evaluation: result analyses, punch lists, action lists, defect correction and focus on defect-prone software. Verification analysis report (FI) reviewed in C.IV.1 at CL3.
Acceptable contributions from System integrator. Review the verification analysis information.
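The required trend analysis for defect-prone software can be sketched as a simple count of punch-list entries per component against a project-defined threshold (component names, severities and the threshold below are illustrative assumptions, not values from the standard):

```python
from collections import Counter

# Hypothetical defect records from the punch list; the component
# names and counts are illustrative only.
defects = [
    {"id": 1, "component": "ballast_ctrl", "severity": "major"},
    {"id": 2, "component": "ballast_ctrl", "severity": "minor"},
    {"id": 3, "component": "hmi",          "severity": "minor"},
    {"id": 4, "component": "ballast_ctrl", "severity": "major"},
]

# Flag components whose defect count exceeds a project-defined threshold.
THRESHOLD = 2
per_component = Counter(d["component"] for d in defects)
defect_prone = [c for c, n in per_component.items() if n > THRESHOLD]
print(defect_prone)  # ['ballast_ctrl']
```

In practice the classification criteria (severity, phase found, system) would drive a richer analysis, but the principle of measuring and comparing against targets is the same.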
C.VV.7 Qualify reused software
Phase: Construction. Confidence level: 1 and above. Unit level responsible:. System level responsible: Supplier.
Requirement definition: Standard software and non-standard legacy software shall be qualified for use in the project. The qualification shall be performed by either a) applying the ISDS standard, b) assessing the quality through due diligence of the software, or c) demonstrating that the software is proven in use. Modified reused software (modified source code) shall be treated as new software.
Qualification of software can be done using the following methods:
a) The process for developing the software is compliant with this ISDS standard.
b) The due diligence should evaluate the component against predefined quality criteria. It should review relevant documentation like existing qualification certificates (e.g. type approvals), in-use feedback, requirements specifications, design descriptions, source code, test reports, release notes, user manuals and other support documentation. When needed, critical functionality, performance and other characteristics should be tested in a test environment similar to the target environment, with the configuration data used in the systems in the ISDS scope.
c) In order to demonstrate that the software is proven in use, arguments based on analysis of experiences from previous systems where the component has been used should be provided. The analysis should compare how the component has been used, e.g. how it has been configured, with how it will be used in the current system. Differences should be documented, and it should be demonstrated, e.g. by testing or analysis, that the differences do not imply any unacceptable risks.
The proven-in-use analysis should investigate failure data from previous systems that have been operated in a controlled way, e.g. all errors and software changes must have been recorded (the requirements in this standard for the operation phase apply).
Software qualification report: reused software component list, qualification method for each reused software component and qualification data.
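A proven-in-use argument ultimately rests on aggregated in-service figures. As a minimal sketch, the observed failure rate and MTBF can be computed from recorded operating hours and failures per previous installation (all numbers below are invented for illustration):

```python
# Hypothetical in-service records for a reused software component:
# (operating hours, recorded failures) per previous installation.
field_data = [(26000, 1), (18000, 0), (31000, 2)]

hours = sum(h for h, _ in field_data)
failures = sum(f for _, f in field_data)

failure_rate = failures / hours          # failures per operating hour
mtbf = hours / failures if failures else float("inf")
print(f"observed failure rate: {failure_rate:.2e}/h, MTBF: {mtbf:.0f} h")
```

Such figures are only meaningful when, as the standard requires, all errors and changes in the previous systems were systematically recorded, and when the previous usage is comparable to the intended one.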
C.VV.8 Perform Factory Acceptance Tests (FAT)
Phase: Construction. Confidence level: 1 and above. Unit level responsible:. System level responsible: Supplier.
Requirement definition: Factory acceptance tests (FAT) shall be performed to ensure the system is compliant with its requirements and acceptable for the owner. The FAT shall include:
- tests to ensure the system is compliant with selected operational requirements, including normal modes, error situations and degraded modes
- tests covering the hardware/software integration
- tests of relevant inter-system interface dynamics by defined scenarios
- tests of relevant intra-system interface dynamics by defined scenarios.
Functions or properties that cannot be tested in the FAT shall be clearly identified.
The purpose of the FAT is for the system integrator and owner to accept the system. Full coverage of requirements is not required as long as the owner and the system integrator are satisfied. The demonstration of previously run tests is typically performed by providing test reports from internal tests. The limitations of the FAT environment and the use of tools like e.g. test drivers, simulators and emulators should be clearly described.
System FAT procedure: coverage of requirements, functionality, performance, RAMS (when applicable), integration testing, hardware/software integration, interfaces and degraded modes. System FAT report: consistent with procedure, deviations identified and coverage measured.
Acceptable contributions from Owner and System integrator. System integrator: approve the FAT test procedures before the start of the FAT. Both: review, analyse and approve the test results. Both: accept or reject the system as necessary, based on objective criteria.

C.VV.9 Arrange independent testing
Phase: Construction. Confidence level: 3. Unit level responsible:. System level responsible: System integrator.
Requirement definition: The system and its interfaces shall be testeded by an independent party in its representative environment. The environment shall be documented, including its assumptions and limitations.
A representative environment is the actual unit or an integration of relevant systems creating a test situation similar to the final operational environment. HIL testing may be one method to be used for independent testing (see SfC 2.24). The system integrator is responsible for arranging the independent testing, but the independent party performing the testing is normally not the system integrator or the supplier. The term system in this context refers to a system at CL3, including all interfaces to other systems regardless of their confidence level.
System FAT procedure (FI) and System FAT report (FI): reviewed in C.IV.2 at CL2 and CL3; used in C.IV.1 at CL3.
Test procedure: covering the system and its interfaces. Test report. Independent test procedure (FI) and Independent test report (FI) reviewed in D.IV.1 at CL3. Independent test report (FI) used in D.IV.2 at CL3.
Acceptable contributions from Supplier. Contribute to the preparation of the independent tests.
A 500 Activity definition – acceptance

D.CM.1 Manage software changes during commissioning
Phase: Acceptance. Confidence level: 1 and above. Unit level responsible: System integrator. System level responsible:.
Requirement definition: Any change in a system or sub-system/component shall be submitted to a change control board, which analyses the impact and makes a decision regarding the change. The roles, responsibilities and mechanisms for the change control board shall be defined. These shall ensure that the deployed configuration of the systems that have passed FAT cannot change outside of a strict change control procedure. The system integrator should coordinate changes across all systems.
This activity should be related to the activity C.PQA.1 Establish procedures for problem resolution and maintenance activities in the construction and acceptance phases, and also to X.REQ.1 Maintain requirements traceability information. The system integrator should consult with the suppliers about the design of the configuration management system. The configuration management system may be described in the configuration management plan.
Defined software configuration management: definition of Change Control Board (CCB), change request forms, description of the change process for software, impact analysis, identification of items to be controlled, configuration management tool (including issue, change, version and configuration tracking) and prevention of unauthorised changes. Modification records justifying changes: configuration records, version histories, release notes, change orders.
Acceptable contributions from Owner and Supplier. Approval of software and document change requests.

D.CM.2 Establish a release note for the systems in ISDS scope
Phase: Acceptance. Confidence level: 1 and above. Unit level responsible: System integrator. System level responsible:.
Requirement definition: Before entering into operation, the configuration of the systems in the ISDS scope shall be documented. A release note shall be produced, describing all the items and their versions, as well as their status (e.g. known defects). In the case of changes, differences with respect to the previous version shall be documented in the release note.
The overall release note can link to each system's release note. The system integrator will base the release note on inputs from the suppliers, see X.CM.2.
Overall release note for the systems in ISDS scope.
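The release note's required content — items, versions and differences against the previous release — can be sketched as a simple comparison of two version records (the system names and version numbers below are invented for illustration):

```python
# Hypothetical release-note records for the systems in ISDS scope;
# names and versions are illustrative only.
previous = {"dp_control": "3.1.0", "power_mgmt": "2.0.4", "hmi": "1.9.1"}
current  = {"dp_control": "3.1.2", "power_mgmt": "2.0.4", "hmi": "2.0.0"}

# Differences with respect to the previous release, as the release
# note is required to document them.
changed = {k: (previous.get(k), v) for k, v in current.items()
           if previous.get(k) != v}
print(changed)  # {'dp_control': ('3.1.0', '3.1.2'), 'hmi': ('1.9.1', '2.0.0')}
```

The same record later serves the configuration audits in operation (E.CM.2), where installed versions are checked against it.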
D.CM.3 Transfer responsibility for system configuration management to owner
Phase: Acceptance. Confidence level: 1 and above. Unit level responsible: System integrator. System level responsible:.
Requirement definition: After acceptance, the owner shall take responsibility for the configuration management of the systems within ISDS scope. The system integrator shall deliver all necessary software, documentation, and data to the owner. The owner may adopt the configuration management mechanisms already defined or modify them appropriately.
Approved configuration management plan. Records of transmission of software, documentation and data, or responsibility thereof.
Acceptable contributions from Owner and Supplier. Owner: approval of the configuration management system. Supplier: supply the system integrator with relevant software, documentation and data.

D.RAMS.1 Demonstrate achievement of unit RAMS requirements
Phase: Acceptance. Confidence level: 2 and above. Unit level responsible: System integrator. System level responsible:.
Requirement definition: RAMS arguments and evidence shall be assembled to demonstrate achievement of RAMS requirements and objectives on the unit level.
The RAMS requirements and objectives are defined in activities A.RAMS.1 and A.RAMS.2. The arguments and evidence may include: safety analysis, FME(C)A, test cases, inspections, computations and simulations, certifications, etc. The suppliers normally provide information gathered in the activity C.RAMS.1.
RAMS compliance analysis information.
Acceptable contributions from Supplier. Provide RAMS arguments and evidence for the system.

D.RAMS.2 Collect data and calculate RAM values
Phase: Acceptance. Confidence level: 3. Unit level responsible: System integrator. System level responsible: System integrator.
This activity is an extension of D.RAMS.1. Several methods for estimating reliability are available, for example PDS (SINTEF STF38A97434).
Requirement definition: Data from commissioning and integration testing shall be collected and used together with data from earlier verification activities to estimate reliability, availability and maintainability values relative to the RAM objectives.
Reliability and maintainability data can be used to calculate availability. Maintainability for software may differentiate activities such as restoration of production (typically a fast activity) and defect correction (typically a longer one). The information supplied by the suppliers normally comes from the activity C.RAMS.2.
RAMS compliance report (FI) reviewed in D.IV.3 at CL3. Calculations of RAM values for relevant systems and the unit. RAM data. RAM report (FI) unit reviewed in D.IV.3. RAM report (FI) system used in D.IV.3.
Acceptable contributions from Supplier. Provide RAM data and evaluations for the system.
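The remark that availability follows from reliability and maintainability data, with software maintainability split into fast restoration versus slower defect correction, can be sketched with the standard steady-state formula A = MTBF / (MTBF + MTTR); the figures below are invented for illustration:

```python
# Hypothetical RAM figures for one system: MTBF estimated from
# verification data, MTTR split into fast production restoration
# and slower off-line defect correction.
mtbf_h = 8760.0        # mean time between failures, hours
mttr_restore_h = 0.5   # restart/restore production after a failure
mttr_correct_h = 40.0  # off-line defect correction, does not stop production

# Operational availability driven by the restoration time alone:
availability = mtbf_h / (mtbf_h + mttr_restore_h)
print(f"availability: {availability:.5f}")  # ≈ 0.99994
```

Separating the two maintainability figures matters because only the restoration time enters the production-availability calculation, while the correction time drives the maintenance workload.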
D.RAMS.3 Perform a security audit on the deployed systems
Phase: Acceptance. Confidence level: 2 and above. Unit level responsible: System integrator. System level responsible: System integrator.
Requirement definition: A security audit shall be performed on the relevant systems to verify that the security related requirements are fulfilled.
DNV-OS-D202 contains some security related requirements, and the owner may request additional requirements to be met. The security requirements are normally documented together with the other unit and system level requirements, see activity A.REQ.2. A number of different standards are available and one of them may be used as a basis for the security audit, e.g. ISA 99, BS 5750, and ISO.
Security audit records. Security audit report (FI) used in D.IV.3 at CL3.

D.VV.1 Perform validation testing
Phase: Acceptance. Confidence level: 1 and above. Unit level responsible: Owner. System level responsible: Owner.
Requirement definition: The validation procedure shall be established according to the validation strategy. The testing steps for validation shall be identified and clearly separated from the parameterisation and calibration steps. They shall take software into consideration.
Validation testing is usually broken down and may be performed partly during commissioning, integration testing, and quay or sea trials. Parameterisation and calibration (of the software) should be performed prior to commissioning; a pre-commissioning step is often suitable. In some cases the validation tests also include tests that have not been possible to run during internal tests at the supplier (C.VV.3) or at the FAT (C.VV.8).
Test procedure: black box tests, boundary tests, software behaviour, and parameterisation and calibration. Test reports: executed consistent with procedure. Test issue list: deviations (punches) and variations.
Acceptable contributions from System integrator and Supplier.
Both: provide input to test procedures and test data.

D.VV.2 Perform validation with operational scenarios
Phase: Acceptance. Confidence level: 2 and above. Unit level responsible: Owner. System level responsible: Owner.
Requirement definition: Operational scenarios shall be demonstrated on the systems in the ISDS scope in an environment representative of the target environment.
Qualified simulators, or the unit itself in integration testing, quay or sea trials, qualify as representative of the target environment.
Test procedure: operational scenarios. Test reports: tests performed in compliance with procedure and coverage of scenarios. Test procedure for quay and sea trials (FI) and Report from quay and sea trials (FI) reviewed in D.IV.1 at CL2 and CL3. Report from quay and sea trials (FI) used in D.IV.2 at CL3. Test procedure (FI) and Test report (FI) reviewed in D.IV.1 at CL2 and CL3. Test report (FI) used in D.IV.2 at CL3.
Acceptable contributions from System integrator and Supplier. Both: provide input to test procedures and test data.
D.VV.3 Analyse validation results with respect to targets
Phase: Acceptance. Confidence level: 2 and above. Unit level responsible: Owner. System level responsible: Owner.
Requirement definition: The validation results and their analyses, including the evaluation of the validation method, the defects identified, and the comparison between the expected results and the actual results, shall be recorded. Quality criteria shall be evaluated and compared with quality targets.
The decision authorising the system or unit to go into operation should be made on the basis of the validation results from commissioning, integration testing and quay and sea trials. The quality targets are defined in activity B.VV.1.
Test procedure: quality criteria. Test reports: analysis of the results. Test issue list. Verification analysis report (FI) reviewed in D.IV.2 at CL3.

D.VV.4 Perform systems integration tests
Phase: Acceptance. Confidence level: 2 and above. Unit level responsible: System integrator. System level responsible:.
Requirement definition: Integration tests shall be performed to ensure that inter-system functionality is working as expected and that the interfaces between systems are compliant with the requirements. Scenarios involving interfaces between the different parts shall be included in the test cases.
The interaction between the different systems should be tested, not only the match between output signals and input signals. Both the nominal behaviour and the degraded behaviour should be tested. To test the complete function, black-box testing and boundary testing may be applied. Black-box testing is normally guided: a) by the structure of the requirements, use cases, operational scenarios or results from simulations, b) by the structure of the input data, c) by the risks, or d) randomly.
For more information on black-box testing and boundary testing (or analysis), see IEC. Integration test procedures covering system interfaces and inter-system functionality. Integration test reports. Test procedure (FI) and Test report (FI) reviewed in D.IV.1 at CL2 and CL3. Test report (FI) used in D.IV.2 at CL3. Acceptable contributions from Supplier. Contribute with inputs from FAT (activity C.VV.8).
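To illustrate the boundary-testing method named in D.VV.4 above, the following is a minimal sketch of boundary-value test-case generation for a single interface signal. The signal (a thruster command in percent) and its range are invented for illustration; they are not taken from the standard.

```python
# Hypothetical sketch of boundary-value testing for one interface signal.
# The signal name and range below are assumptions made for illustration.

def boundary_cases(lo, hi, step=1):
    """Return the classic boundary-value set for the input range [lo, hi]:
    just below, at, and just above each boundary."""
    return [lo - step, lo, lo + step, hi - step, hi, hi + step]

def within_range(value, lo, hi):
    """Stand-in for the system under test: accept a signal only in range."""
    return lo <= value <= hi

# Example: a thruster command signal specified as 0..100 percent.
LO, HI = 0, 100
results = {v: within_range(v, LO, HI) for v in boundary_cases(LO, HI)}
```

Run against a real interface, each probe and its expected accept/reject outcome would be recorded in the integration test procedure, and the observed behaviour in the integration test report.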
A 600 Activity definition operation E.ACQ.1 Manage and monitor obsolescence Phase: Operation. Confidence level: 2 and above. Unit level responsible:. System level responsible: Supplier. Requirement definition: The acquired components and systems shall be monitored so that pro-active actions can be taken before parts become obsolete. The responsibilities for monitoring obsolescence and taking action when needed shall be clearly defined within the organisation. Obsolescence strategy document. Obsolescence management plan: Authorised vendor list, Spare parts list (HW & compatible SW), Alternate spare parts list and Management of intellectual property. E.CM.1 Manage change requests during operation Phase: Operation. Confidence level: 1 and above. Unit level responsible: Owner. System level responsible: Supplier. Requirement definition: Change requests shall be systematically handled. Potential changes shall be analysed to assess their impact on operation, as well as effects on other systems. A Change Control Board (CCB) shall consider the impact analysis before approving proposed changes. Change requests. Impact analysis. Change orders. Work orders. Problem reports. Release notes. Maintenance logs. E.CM.2 Perform configuration audits Phase: Operation. Confidence level: 1 and above. Unit level responsible: Owner. System level responsible:. Requirement definition: There shall be regular configuration audits to verify the integrity of the configuration in operation. Configuration audits shall confirm 1) that the operational software versions match supplier records and onboard back-ups, 2) that documentation versions match the operational software versions, and 3) that the configuration management plan is being followed. Configuration audit reports. Acceptable contributions from Supplier. Identification of software versions as installed from supplier.
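The version-matching part of a configuration audit (E.CM.2 above) can be sketched as a simple three-way comparison. This is a minimal illustration only; the system names and version strings are invented, and a real audit would draw them from the configuration management system.

```python
# Hypothetical sketch of the E.CM.2 version check: installed software
# versions compared against supplier records and onboard back-ups.
# All names and versions below are assumptions made for illustration.

def audit_versions(installed, supplier_records, backups):
    """Return (system, finding) pairs where the installed version disagrees
    with the supplier record or with the onboard back-up."""
    findings = []
    for system, version in installed.items():
        if supplier_records.get(system) != version:
            findings.append((system, "mismatch with supplier record"))
        if backups.get(system) != version:
            findings.append((system, "mismatch with onboard back-up"))
    return findings

installed = {"DP control": "3.2.1", "PMS": "1.4.0"}
supplier_records = {"DP control": "3.2.1", "PMS": "1.4.1"}
backups = {"DP control": "3.2.1", "PMS": "1.4.0"}
findings = audit_versions(installed, supplier_records, backups)
```

Each finding would be recorded in the configuration audit report and followed up through the change handling process of E.CM.1.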
E.PQA.1 Define procedures for problem resolution, change handling, and maintenance activities Phase: Operation. Confidence level: 1 and above. Unit level responsible: Owner. System level responsible:. Requirement definition: The owner shall establish procedures for receiving, recording, resolving, and tracking problems, modification requests, and maintenance activities. Software update, migration, retirement, backup and restore procedures shall also be included. The problem resolution should be based on a systematic problem solving method (e.g. identified in the unit's International Safety Management (ISM) procedures) to investigate and provide a detailed response (including corrective action) to the problem(s) identified. Configuration management plan. Configuration management procedure: migration issues and software obsolescence (ref E.ACQ.1). Maintenance procedures: procedures for maintenance, software update, migration and retirement; backup and restore procedures; and procedures for receiving, recording, resolving and tracking problems and modification requests. Change management procedure. Issue tracking and resolution procedure. Acceptable contributions from Supplier. Provide inputs regarding the supplier's system(s) to the problem resolution and change handling process. E.RAMS.1 Maintain and execute the plan for maintenance in operation Phase: Operation. Confidence level: 1 and above. Unit level responsible: Owner. System level responsible: Owner. Requirement definition: The plan for the maintenance in operation of the unit and systems shall be maintained and executed, identifying the configuration items, the rules for maintenance in operation (e.g. access control and logging of maintenance), and the expected activities for maintenance and security patches. Configuration audits, security audits, obsolescence of hardware and software, migration and software retirement issues shall also be addressed in this plan. 
The maintenance plans are based on the maintenance related input from the suppliers in activity C.RAMS.3. Some elements of this plan may be addressed in the configuration management system. Maintenance plan: configuration items, audit activities, maintenance activities, expected software update, migration and retirement activities, maintenance intervals and tailored procedures for the maintenance in operation. Malicious software scan log records. Maintenance logs. Acceptable contributions from Supplier. Provide input regarding the supplier's system(s) on maintenance and obsolescence.
E.RAMS.2 Collect RAMS data Phase: Operation. Confidence level: 2 and above. Unit level responsible: Owner. System level responsible: Owner. Requirement definition: A system to collect RAMS data from the unit in operation shall be established. Data shall be forwarded to the relevant suppliers. RAMS data typically include: failures and incidents, downtime, uptime, number of demands, time to repair, etc. Even small defects and incidents should be reported, particularly if repeated. RAMS data collection system. RAMS data collected. E.RAMS.3 Analyse RAMS data and address discrepancies Phase: Operation. Confidence level: 2 and above. Unit level responsible: Owner. System level responsible: Supplier. Requirement definition: RAMS data shall be analysed in order to assess and improve the RAMS performance. For the software, the analysis must be broken down to the level of software components in order to identify the components that are candidates for improvement. Discrepancies between RAMS requirements/objectives and the actual RAMS results shall be addressed. RAMS analysis typically results in numbers for MTBF, MTTR etc. RAMS analysis. E.RAMS.4 Perform RAMS impact analysis of changes Phase: Operation. Confidence level: 2 and above. Unit level responsible: Owner. System level responsible: Supplier. Requirement definition: Before changes are made to the system in operation, the impacts on RAMS properties shall be analysed. For major changes or changes with major RAMS impact, the owner shall inform DNV before the change is made. This activity is an extension of the activity E.CM.1. For major changes, or changes with major RAMS impact, DNV will typically review and witness the commissioning of the systems in question as described in activity E.IV.1. Major changes are changes that impact functionality, performance or the interfaces of systems on CL2 or CL3. Bug fixes are normally not considered a major change. Impact analysis showing RAMS evaluation.
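The MTBF and MTTR figures mentioned under E.RAMS.3 above follow directly from the collected incident log. The following is a minimal sketch of that calculation; the component names, operating hours and repair durations are invented for illustration.

```python
# Hypothetical sketch of the RAMS metrics named in E.RAMS.3: MTBF and MTTR
# computed from a per-component incident log. All figures are assumptions.

def mtbf(uptime_hours, n_failures):
    """Mean Time Between Failures: operating time divided by failure count."""
    return uptime_hours / n_failures

def mttr(repair_hours):
    """Mean Time To Repair: average of the recorded repair durations."""
    return sum(repair_hours) / len(repair_hours)

# Example log: operating hours and repair durations (hours) per component.
log = {
    "DP control": {"uptime": 8000, "repairs": [2.0, 4.0]},
    "PMS":        {"uptime": 8000, "repairs": [1.0]},
}
analysis = {
    name: {"MTBF": mtbf(d["uptime"], len(d["repairs"])),
           "MTTR": mttr(d["repairs"])}
    for name, d in log.items()
}
```

Comparing these numbers against the RAMS targets set earlier in the project is what exposes the discrepancies that E.RAMS.3 requires to be addressed.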
E.RAMS.5 Periodically perform security audits of the systems in operation Phase: Operation. Confidence level: 2 and above. Unit level responsible: Owner. System level responsible: Owner. Requirement definition: Relevant systems in ISDS scope shall periodically be audited from a security point of view. DNV OS-D202 contains some security-related requirements. A number of different standards are available and one of them may be used as a basis for the security audit, e.g. ISA 99, BS 5750 and ISO. Security audit report. E.VV.1 Perform validation testing after changes in the systems in operation Phase: Operation. Confidence level: 1 and above. Unit level responsible: Owner. System level responsible: Owner. Requirement definition: After minor upgrades and corrections, validation testing shall be done for the systems affected. Major upgrades or conversions require the application of the whole ISDS standard. Test procedure: includes black-box tests and boundary tests. Test reports: consistent with procedure. Acceptable contributions from Supplier. Contribute to preparing test procedures and to executing tests. E.VV.2 Perform validation with operational scenarios after changes in the systems in operation Phase: Operation. Confidence level: 2 and above. Unit level responsible: Owner. System level responsible: Owner. This activity is an extension of E.VV.1. Major upgrades or conversions require the application of the whole ISDS standard. Test procedure (FI) and Test report (FI) reviewed in E.IV.1 at CL3. Requirement definition: After minor upgrades and corrections, validation with operational scenarios shall be done for the systems affected. The validation procedure shall be established according to the validation strategy. The testing steps for validation shall be identified and clearly separated from the parameterisation and calibration steps. They shall take software into consideration. 
The validation tests shall be performed according to the test procedure. Test procedures: covering relevant operational scenarios. Test reports: tests performed in compliance with procedure and analysis of the results. Acceptable contributions from Supplier. Contribute to preparing test procedures and to executing tests. Test procedure (FI) and Test report (FI) reviewed in E.IV.1 at CL3.
A 700 Activity definition several phases X.ACQ.1 Monitor contract execution and changes Phase: Several. Confidence level: 1 and above. Unit level responsible:. System level responsible: Supplier. Requirement definition: When development or configuration of a software component is subcontracted, the buying organisation shall monitor that the contract is executed as agreed. The ISDS requirements specified in the contract shall be followed up. Progress and quality shall be tracked. Progress reviews with sub-suppliers shall be planned and held. The impact of contract changes on software components shall be considered. This activity shall be performed in phases B-E. Sub-supplier progress review schedule. Sub-supplier progress review reports. Sub-supplier project control records. Sub-supplier quality control records. X.ACQ.2 Review intermediate deliverables Phase: Several. Confidence level: 2 and above. Unit level responsible:. System level responsible: Supplier. Requirement definition: Selected intermediate deliverables from sub-suppliers shall be provided for information and review, in order to give visibility into status and progress. The review of deliverables shall be planned. This activity shall be performed in phases B-C. Supplier agreement: list of deliverables and review and approval plans. Review records/minutes. X.CM.1 Track and control changes to the baselines Phase: Several. Confidence level: 1 and above. Unit level responsible: System integrator. System level responsible: Supplier. Requirement definition: The development of any part of the unit shall follow the rules in the configuration management plan to maintain consistency at all levels and among all software components. Changes to requirements, design, interface definitions and software baselines shall be tracked and controlled. As changes are made to lower level designs and code, higher level designs and requirements shall be updated appropriately. 
This activity shall be performed in phases A-D for the system integrator and in phases B-E for the supplier. Typical information tracked includes: date, author, contents of the change, rationale for the change, components impacted, and version and configuration number changes. Proposed changes should be reviewed and approved. This activity should follow the configuration management plan. Change requests/orders. Version histories for baselines. Changes to: unit requirements, unit design, system requirements, system design, software design, interface specifications and software. Configuration records from document or software repositories.
X.CM.2 Establish a release note for the delivered system Phase: Several. Confidence level: 1 and above. Unit level responsible:. System level responsible: Supplier. Requirement definition: Each delivered system/sub-system shall come with a release note, describing the functional content of the delivery (versions of the applicable specifications) as well as its physical content (list of items with their versions). In the case of a new delivery of a software component, differences with respect to the previous version shall be documented in the release note. This activity shall be performed in phases C-E. Component release note: including a list of changes to the previous version of the component. X.DES.1 Update the base-product design documentation Phase: Several. Confidence level: 2 and above. Unit level responsible:. System level responsible: Supplier. Requirement definition: If the system/software is created using base-products, the base-product design documentation shall be kept up to date. The documentation of tools and environments needed to configure or generate the system/software from the base-products shall also be kept up to date. This activity shall be performed in phases B-E. Tools and environments needed to configure or generate a project specific product should be considered a part of the same product repository. Base-product design description. Revision information for updated base-product components.
X.PM.1 Monitor project status against plan Phase: Several. Confidence level: 1 and above. Unit level responsible: System integrator. System level responsible: Supplier. Requirement definition: The project's activities shall be monitored against the plan and reported. Corrective actions shall be taken when significant deviations from the plan occur. When needed, coordinated actions shall be undertaken with other stakeholders. This activity shall be performed in phases A-D for the system integrator and phases B-E for the supplier. It is strongly recommended that joint meetings between the different organisations/roles are conducted on a regular basis in order to coordinate and track the progress of the activities in this standard. Corrective actions may include actions to bring the project's status back to the plan, or actions to establish new work estimates and/or an updated plan. Policies (information security) or strategies (obsolescence, verification, validation, and integration) should be updated when needed. During operation, most activities are constrained by the maintenance or operation plan. Significant upgrades or corrections requiring coordination with several stakeholders may require a specific project plan, as specified by this standard. Master schedule. Master plan (updated). Project status report. Project action list. Minutes of review meetings. Progress report. Acceptable contributions from Owner and Supplier. Supplier to provide inputs on specific schedule constraints and deviations to plan for the unit level. Owner to contribute to updating policies and strategies and to establishing corrective actions. Owner to provide the master plan in phase E, when applicable. X.PM.2 Perform joint project milestone reviews Phase: Several. Confidence level: 1 and above. Unit level responsible: System integrator. System level responsible:. 
Requirement definition: Joint project milestone reviews to check achievement of phase objectives shall be planned and carried out. Significant risks, issues and their impact shall be documented and tracked until closure. Decisions whether or not, or how, to progress to the next phase shall be recorded. In the A phase, the system integrator and the owner shall participate. In the B-D phases, all roles shall participate. The ISDS milestones are intended to be the critical event to decide whether to proceed further in the project, weighing the risks of moving to the next phase versus the risks of postponing it. In some cases, the decision to move forward may include considerable project risks. The milestone is a good practice to communicate the extent of such risks to all stakeholders. The milestone review is typically the last activity to be performed in each phase. Minutes of joint milestone meetings. ISDS compliance status. Action plans. Acceptable contributions from Owner and Supplier. Owner and Supplier to participate in the milestone meeting and provide status and plans. Owner and Supplier to contribute to the decisions related to the further progress of the project.
X.PQA.1 Control procedures (owner) Phase: Several. Confidence level: 1 and above. Unit level responsible: Owner. System level responsible:. Requirement definition: Procedures shall be controlled to ensure defined procedures are followed and that the activities required by this standard are executed in practice. This activity shall be performed in phases A-E. Follow-up of the procedures may result in increased process adherence, improvement of procedures, or training as required. Proof that process adherence is being assessed: Quality control records, Project control records and Minutes of meetings, or other relevant information. X.PQA.2 Control procedures (system integrator) Phase: Several. Confidence level: 1 and above. Unit level responsible: System integrator. System level responsible:. Requirement definition: Procedures shall be controlled to ensure defined procedures are followed and that the activities required by this standard are executed in practice. This activity shall be performed in phases A-D. Follow-up of the procedures may result in increased process adherence, improvement of procedures, or training as required. Proof that process adherence is being assessed: Quality control records, Project control records and Minutes of meetings, or other relevant information. X.PQA.3 Control procedures (supplier) Phase: Several. Confidence level: 1 and above. Unit level responsible:. System level responsible: Supplier. Requirement definition: Procedures shall be controlled to ensure defined procedures are followed and that the activities required by this standard are executed in practice. This activity shall be performed in phases B-E. Follow-up of the procedures may result in increased process adherence, improvement of procedures, or training as required. Proof that process adherence is being assessed: Quality control records, Project control records and Minutes of meetings, or other relevant information.
X.PQA.4 Follow-up of ISDS assessment gaps (owner) Phase: Several. Confidence level: 1 and above. Unit level responsible: Owner. System level responsible:. Requirement definition: If the independent verifier finds gaps towards this standard during a process assessment, the organisation in question shall plan and implement actions to close those gaps within reasonable time. A corrective action plan outlining the actions to be taken shall be submitted to the independent verifier for approval. This activity shall be performed in phases A-E. The reasonable time can be decided from case to case, but normally an action plan is expected to be submitted for approval within 14 days after the assessment report is issued, and gaps are expected to be closed within the ISDS project phase in question, and not later than 3 months after the assessment report. The assessed organisation is expected to closely follow up on the activities in the corrective action plan. Corrective action plan: Responsibility allocation for actions, Records of actions taken and Evidence of implementation of the actions. X.PQA.5 Follow-up of ISDS assessment gaps (system integrator) Phase: Several. Confidence level: 1 and above. Unit level responsible: System integrator. System level responsible:. The reasonable time can be decided from case to case, but normally an action plan is expected to be submitted for approval within 14 days after the assessment report is issued, and gaps are expected to be closed within the ISDS project phase in question, and not later than 3 months after the assessment report. The assessed organisation is expected to closely follow up on the activities in the corrective action plan. Corrective action plan (AP) reviewed and approved in X.IV.1. Requirement definition: If the independent verifier finds gaps towards this standard during a process assessment, the organisation in question shall plan and implement actions to close those gaps within reasonable time. 
A corrective action plan outlining the actions to be taken shall be submitted to the independent verifier for approval. This activity shall be performed in phases A-D. Corrective action plan: Responsibility allocation for actions, Records of actions taken and Evidence of implementation of the actions. Corrective action plan (AP) reviewed and approved in X.IV.1.
X.PQA.6 Follow-up of ISDS assessment gaps (supplier) Phase: Several. Confidence level: 1 and above. Unit level responsible:. System level responsible: Supplier. Requirement definition: If the independent verifier finds gaps towards this standard during a process assessment, the organisation in question shall plan and implement actions to close those gaps within reasonable time. A corrective action plan outlining the actions to be taken shall be submitted to the independent verifier for approval. This activity shall be performed in phases B-E. The reasonable time can be decided from case to case, but normally an action plan is expected to be submitted for approval within 14 days after the assessment report is issued, and gaps are expected to be closed within the ISDS project phase in question, and not later than 3 months after the assessment report. The assessed organisation is expected to closely follow up on the activities in the corrective action plan. Corrective action plan: Responsibility allocation for actions, Records of actions taken and Evidence of implementation of the actions. Corrective action plan (AP) reviewed and approved in X.IV.1. X.REQ.1 Maintain requirements traceability information Phase: Several. Confidence level: 2 and above. Unit level responsible: System integrator. System level responsible: Supplier. Depending on the confidence level there may be different ambition levels for traceability, but regardless of the ambition level the trace information should be kept up to date during the project. The traceability information between unit and systems is established in activities A.REQ.6 and B.INT.2. The traceability information within a system emerges in activities B.REQ.2, B.DES.1 and B.DES.2. The traceability information from requirements to verification and validation emerges in activities C.VV.3/4/5/8, D.VV.1/2/4 and X.VV.1/2 as applicable. 
Traceability matrices are normally used to document the traceability information, but also databases or references from documents to documents can be used as long as the traceability information is explicit and reviewable. Requirement definition: Traceability of requirements shall be kept up to date. Three kinds of traceability information are required: 1) Traceability between requirements on different levels (e.g. from unit to system). 2) Traceability from a requirement to where and how it is designed and implemented. 3) Traceability from a requirement to where and how it is verified and validated. This activity shall be performed in phases A-D for the system integrator and in phases B-E for the supplier. Up to date traceability information: from owner to system requirements, from system requirements to functional specifications (where applicable), from system requirements to base-product and configuration data (where applicable), from functional specifications to subsystem/component specifications and from requirements to test procedures (when the test procedures are available). Completeness and consistency review records of the traceability information. Traceability matrices (FI) used in B.IV.1 and C.IV.1 at CL3, and in C.IV.2 at CL2 and CL3.
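The completeness check behind a traceability matrix review (X.REQ.1 above) can be sketched in a few lines: every requirement must trace to at least one design element and at least one test procedure. The requirement, design and test identifiers below are invented for illustration.

```python
# Hypothetical sketch of a traceability completeness check (X.REQ.1).
# All identifiers are assumptions made for illustration only.

def untraced(requirements, to_design, to_tests):
    """Return the requirements lacking a design trace or a test trace,
    i.e. the gaps a completeness review of the matrix should report."""
    return sorted(
        r for r in requirements
        if not to_design.get(r) or not to_tests.get(r)
    )

requirements = ["SYS-001", "SYS-002", "SYS-003"]
to_design = {"SYS-001": ["DES-010"], "SYS-002": ["DES-011"], "SYS-003": []}
to_tests  = {"SYS-001": ["TP-100"], "SYS-002": [], "SYS-003": ["TP-101"]}
gaps = untraced(requirements, to_design, to_tests)
```

The same check applies in each direction named in the activity: between requirement levels, from requirement to design, and from requirement to verification and validation.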
X.RISK.1 Track, review and update risks Phase: Several. Confidence level: 1 and above. Unit level responsible: System integrator. System level responsible: Supplier. Requirement definition: The risk list shall be reviewed and updated regularly, in order to re-evaluate the risk attributes or to take into account new risks. Risks involving other stakeholders shall be regularly shared, reviewed and updated jointly. This activity shall be performed in phases B-D for the system integrator and phases B-E for the supplier. For the E phase, the owner shall establish and maintain the risk list for significant upgrades or conversions. It is important that a consistent picture of the risks involving other stakeholders is regularly shared, both with external stakeholders, and with the stakeholders within the organisation. Project risk management plan. Updated internal risk register (per organisation). Updated project risk register (jointly managed). Acceptable contributions from Owner and Supplier. Supplier shall provide input on risk identification at unit level. Owner shall provide inputs on operational and business risks relevant for the project and related to the product. Owner shall manage the risk list for the E phase. X.RISK.2 Decide, implement and track risk mitigation actions to closure Phase: Several. Confidence level: 2 and above. Unit level responsible: System integrator. System level responsible: Supplier. Requirement definition: Risk mitigation actions shall be decided and planned, according to the risk strategy. Status of these actions shall be monitored regularly. Efficiency of mitigation actions shall be assessed and new actions taken as needed. Mitigation actions involving other stakeholders shall be coordinated. This activity shall be performed in phases A-D for the system integrator and phases B-E for the supplier. Major upgrades or conversions require the application of the whole ISDS standard. 
New stakeholders introduced throughout the project should be informed about the risk strategy and the jointly identified risks and be given an opportunity to provide inputs to necessary updates. Updated internal risk register: risk list, mitigation actions and follow-up records (per organisation). Updated project risk register: risk list, mitigation actions and follow-up records (jointly managed). Acceptable contributions from Owner. Contribute to identifying and deciding on risk mitigation actions and to closing them.
X.VV.1 Perform verification and validation on added and modified software components Phase: Several. Confidence level: 1 and above. Unit level responsible:. System level responsible: Supplier. Requirement definition: Developed and integrated software components shall be verified, validated and tested for regression before the decision to accept the component. This also applies if changes to software components occur after defined baselines, such as FAT. The status of the component verification and validation shall be analysed and compared with expected process and quality attribute targets. This activity shall be performed in phases C-E. Changes to software components should not only trigger verification and validation of the component that has been changed, but also of other related components to prevent regression. Test procedure: consistent with change or upgrade scope. Test report: consistent with test procedure. Acceptable contributions from Owner. Provide inputs on expected process and quality attribute targets. X.VV.2 Detail procedures for testing Phase: Several. Confidence level: 1 and above. Unit level responsible: System integrator. System level responsible: Supplier. Test procedure (FI) and Test report (FI) used in E.IV.1 at CL3. Requirement definition: Detailed procedures for testing shall be completed (test cases), and documented. The testing steps shall be identified and clearly separated from the parameterisation and calibration steps. The expected results for each test case shall be specified. Traceability to requirements/function specifications shall be taken into consideration. This activity shall be performed in phases C-E for the supplier and phase D for the system integrator. Procedures for testing should take software into consideration, and should focus on new, modified and parameterised software. In case of (semi-)automated testing, implementation of the test cases should be performed beforehand. 
For testing performed in C.IMP.3, a test log is usually sufficient. Existence of relevant test procedures.
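The regression note under X.VV.1 above — that a change should also trigger re-verification of related components — amounts to walking the dependency graph from the changed component. The following is a minimal sketch; the component names and dependency map are invented for illustration.

```python
# Hypothetical sketch of regression-scope selection (X.VV.1): the changed
# component plus everything that depends on it, directly or transitively,
# is the candidate set for re-test. All names below are assumptions.

def regression_scope(changed, depends_on):
    """Return the changed component and all its transitive dependents."""
    # Invert the dependency map: for each component, who depends on it.
    dependents = {}
    for comp, deps in depends_on.items():
        for d in deps:
            dependents.setdefault(d, set()).add(comp)
    scope, frontier = {changed}, [changed]
    while frontier:
        current = frontier.pop()
        for comp in dependents.get(current, ()):
            if comp not in scope:
                scope.add(comp)
                frontier.append(comp)
    return scope

depends_on = {
    "hmi":       {"alarm", "io_driver"},
    "alarm":     {"io_driver"},
    "io_driver": set(),
    "logger":    set(),
}
scope = regression_scope("io_driver", depends_on)
```

Components outside the returned set (here, a stand-alone logger) need not be re-tested for this change, which keeps the test procedure consistent with the change or upgrade scope as the activity requires.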
CMMI: Specific Goals and Practices
Software Engineering for Outsourced & Offshore Development CMMI: Specific Goals and Practices PeterKolb Software Engineering CMMI Process Areas for R&D Projects Slide 2 Content Management in Projects Project
Template K Implementation Requirements Instructions for RFP Response RFP #
Template K Implementation Requirements Instructions for RFP Response Table of Contents 1.0 Project Management Approach... 3 1.1 Program and Project Management... 3 1.2 Change Management Plan... 3 1.3 Relationship
Validating Enterprise Systems: A Practical Guide
Table of Contents Validating Enterprise Systems: A Practical Guide Foreword 1 Introduction The Need for Guidance on Compliant Enterprise Systems What is an Enterprise System The Need to Validate Enterprise
Project Plan for <project name>
Note: Text displayed in blue italics is included to provide guidance to the author and should be deleted or hidden before publishing the document. This template can be used at it is, or to complete and
DNVGL-RU-0050 Edition October 2014
RULES FOR CLASSIFICATION DNVGL-RU-0050 Edition October 2014 The content of this service document is the subject of intellectual property rights reserved by ( DNV GL ). The user accepts that it is prohibited
IT Project: System Implementation Project Template Description
2929 Campus Drive Suite 250 IT Project: System Implementation Project Template Description Table of Contents Introduction... 2 Project Phases... 3 Initiation & Requirements Gathering Milestone... 3 Initiation
NODIS Library Program Formulation(7000s) Search
NODIS Library Program Formulation(7000s) Search NASA Procedural Requirements This Document Is Uncontrolled When Printed. Check the NASA Online Directives Information System (NODIS) Library to verify that
System Development Life Cycle Guide
TEXAS DEPARTMENT OF INFORMATION RESOURCES System Development Life Cycle Guide Version 1.1 30 MAY 2008 Version History This and other Framework Extension tools are available on Framework Web site. Release
THE PROJECT MANAGEMENT KNOWLEDGE AREAS
THE PROJECT MANAGEMENT KNOWLEDGE AREAS 4. Project Integration Management 5. Project Scope Management 6. Project Time Management 7. Project Cost Management 8. Project Quality Management 9. Project Human
ISO 9001: 2008 Construction Quality Management System Sample - Selected pages (not a complete plan)
ISO 9001: 2008 Construction Quality Management System Sample - Selected pages (not a complete plan) Part 1: Project-Specific Quality Plan Part 2: Company Quality Manual Part 3: Submittal Forms Part 4:
IEC 61508 Overview Report
IEC 61508 Overview Report A Summary of the IEC 61508 Standard for Functional Safety of Electrical/Electronic/Programmable Electronic Safety-Related Systems exida Sellersville, PA 18960, USA +1-215-453-1720
STANDARD. Maritime training providers DNVGL-ST-0029:2014-04 DNV GL AS
STANDARD DNVGL-ST-0029:2014-04 Maritime training providers The electronic pdf version of this document found through http://www.dnvgl.com is the officially binding version. The documents are available
Implementation of ANSI/AAMI/IEC 62304 Medical Device Software Lifecycle Processes.
Implementation of ANSI/AAMI/IEC 62304 Medical Device Software Lifecycle Processes.. www.pharmout.net Page 1 of 15 Version-02 1. Scope 1.1. Purpose This paper reviews the implementation of the ANSI/AAMI/IEC
SOFTWARE ASSURANCE STANDARD
NOT MEASUREMENT SENSITIVE National Aeronautics and NASA-STD-8739.8 w/change 1 Space Administration July 28, 2004 SOFTWARE ASSURANCE STANDARD NASA TECHNICAL STANDARD REPLACES NASA-STD-2201-93 DATED NOVEMBER
074-8432-552 Page 1 of 7 Effective Date: 12/18/03 Software Supplier Process Requirements
Page 1 of 7 Software Supplier Process Requirements 1.0 QUALITY SYSTEM FRAMEWORK 1.1 QUALITY POLICY The Seller shall document and implement a quality program in the form of Quality manual or detailed Quality
Software Test Plan (STP) Template
(STP) Template Items that are intended to stay in as part of your document are in bold; explanatory comments are in italic text. Plain text is used where you might insert wording about your project. This
Procedure for Assessment of System and Software
Doc. No: STQC IT/ Assessment/ 01, Version 1.0 Procedure for Assessment of System and Software May, 2014 STQC - IT Services STQC Directorate, Department of Electronics and Information Technology, Ministry
TfNSW Standard Requirements TSR T Technical Management
Template Applicable to: Transport Projects Quality Management System Status: Division: Approved Transport Projects Version: 5.0 Desksite No.: 3455797_1 Date of issue: 1 July 2014 Effective date: 1 July
SOFTWARE CONFIGURATION MANAGEMENT GUIDEBOOK
Office of Safety and Mission Assurance NASA-GB-9503 SOFTWARE CONFIGURATION MANAGEMENT GUIDEBOOK AUGUST 1995 National Aeronautics and Space Administration Washington, D.C. 20546 PREFACE The growth in cost
PHASE 3: PLANNING PHASE
PHASE 3: PLANNING PHASE The Planning Phase focuses principally on required project planning work. Proper comprehensive project planning is essential to a successful IT project, and incomplete project planning
CMS Policy for Configuration Management
Chief Information Officer Centers for Medicare & Medicaid Services CMS Policy for Configuration April 2012 Document Number: CMS-CIO-POL-MGT01-01 TABLE OF CONTENTS 1. PURPOSE...1 2. BACKGROUND...1 3. CONFIGURATION
Software Project Management Plan (SPMP)
Software Project Management Plan (SPMP) The basic template to be used is derived from IEEE Std 1058-1998, IEEE Standard for Software Project Management Plans. The following is a template for the SPMP.
AEO Guide to Engineering Management
Management standard AEO Guide to Engineering Management Issued Date: 4 June 2013 Important Warning This document is one of a set of standards developed solely and specifically for use on the rail network
Department of Administration Portfolio Management System 1.3 June 30, 2010
E 06/ 30/ 2010 EX AM PL 1. 3 06/ 28/ 2010 06/ 24/ 2010 06/ 23/ 2010 06/ 15/ 2010 06/ 18/ 2010 Portfolio System 1.3 June 30, 2010 Contents Section 1. Project Overview... 1 1.1 Project Description... 1 1.2
A COMPARISON OF PRINCE2 AGAINST PMBOK
Introduction This comparison takes each part of the PMBOK and gives an opinion on what match there is with elements of the PRINCE2 method. It can be used in any discussion of the respective merits of the
SOFTWARE DEVELOPMENT PLAN
SOFTWARE DEVELOPMENT PLAN This document outline is based on the IEEE Standard 1058.1-1987 for Software Project Management Plans. This is the controlling document for managing a software project, and it
How To Write An Slcm Project Plan
SLCM 2003.1 Artifacts in a Nutshell ( as of 01/21/2005) Project Development Phases Pension Benefit Guaranty Corporation s (PBGC) System Life Cycle Methodology (SLCM) is comprised of five project development
MKS Integrity & CMMI. July, 2007
& CMMI July, 2007 Why the drive for CMMI? Missed commitments Spiralling costs Late delivery to the market Last minute crunches Inadequate management visibility Too many surprises Quality problems Customer
How To Write A Contract For Software Quality Assurance
U.S. Department of Energy Washington, D.C. NOTICE DOE N 203.1 Approved: Expires: 06-02-01 SUBJECT: SOFTWARE QUALITY ASSURANCE 1. OBJECTIVES. To define requirements and responsibilities for software quality
Software Engineering Reference Framework
Software Engineering Reference Framework Michel Chaudron, Jan Friso Groote, Kees van Hee, Kees Hemerik, Lou Somers, Tom Verhoeff. Department of Mathematics and Computer Science Eindhoven University of
Project QA and Collaboration Plan for <project name>
Note: Text displayed in blue italics is included to provide guidance to the author and should be deleted or hidden before publishing the document. This template can be used at it is, or to complete and
Develop Project Charter. Develop Project Management Plan
Develop Charter Develop Charter is the process of developing documentation that formally authorizes a project or a phase. The documentation includes initial requirements that satisfy stakeholder needs
OPERATIONAL STANDARD
1 of 11 1. Introduction The International Safe Transit Association (ISTA), a non-profit association whose objective is to prevent product damage and excess packaging usage within the distribution environment.
Linac Coherent Light Source (LCLS)
Linac Coherent Light Source (LCLS) An X Ray Free Electron Laser LLNL Quality Implementation Plan PMD 003 r0 May 2004 Prepared for the US Department of Energy under contract numbers: SLAC DE AC03 76SF00515
INTEGRATED MANAGEMENT SYSTEM MANUAL IMS. Based on ISO 9001:2008 and ISO 14001:2004 Standards
INTEGRATED MANAGEMENT SYSTEM MANUAL IMS Based on ISO 9001:2008 and ISO 14001:2004 Standards Approved by Robert Melani Issue Date 30 December 2009 Issued To Management Representative Controlled Y N Copy
CalMod Design-Build Electrification Services
SECTION 01800 SYSTEMS INTEGRATION AND INTEGRATOR REQUIREMENTS PART 1 GENERAL DESCRIPTION A. This section specifies the system-wide integration requirements for the Caltrain Electrification system, i.e.
How to Upgrade SPICE-Compliant Processes for Functional Safety
How to Upgrade SPICE-Compliant Processes for Functional Safety Dr. Erwin Petry KUGLER MAAG CIE GmbH Leibnizstraße 11 70806 Kornwestheim Germany Mobile: +49 173 67 87 337 Tel: +49 7154-1796-222 Fax: +49
ITS Projects Systems Engineering Process Compliance Checklist
ITS Projects Systems Engineering Process Compliance Checklist FHWA Final Rule (23 CFR 940) This checklist is to be completed by the MDOT or LPA Project Management Staff. Please refer to the accompanying
PHASE 3: PLANNING PHASE
PHASE 3: PLANNING PHASE The ning Phase focuses principally on required project planning work. Proper comprehensive project planning is essential to a successful IT project, and incomplete project planning
Certified Software Quality Engineer (CSQE) Body of Knowledge
Certified Software Quality Engineer (CSQE) Body of Knowledge The topics in this Body of Knowledge include additional detail in the form of subtext explanations and the cognitive level at which the questions
R214 SPECIFIC REQUIREMENTS: INFORMATION TECHNOLOGY TESTING LABORATORY ACCREDITATION PROGRAM
The American Association for Laboratory Accreditation Document Revised: R214: Specific Requirements: Information Technology Testing Laboratory Accreditation July 13, 2010 Program Page 1 of 26 R214 SPECIFIC
Engineering Procedure
Engineering Procedure Design EPD 0020 INVENTORY MANAGEMENT ENGINEERING RESPONSIBILITIES Owner: Approved by: Manager, Engineering Standards and Configurations Jagath Peiris Manager Engineering Standards
Version: 1.0 Latest Edition: 2006-08-24. Guideline
Management of Comments on this report are gratefully received by Johan Hedberg at SP Swedish National Testing and Research Institute mailto:[email protected] Quoting of this report is allowed but please
REGULATORY GUIDE 1.170 (Draft was issued as DG-1207, dated August 2012)
Purpose U.S. NUCLEAR REGULATORY COMMISSION July 2013 Revision 1 REGULATORY GUIDE OFFICE OF NUCLEAR REGULATORY RESEARCH REGULATORY GUIDE 1.170 (Draft was issued as DG-1207, dated August 2012) Technical
Quality management systems
L E C T U R E 9 Quality management systems LECTURE 9 - OVERVIEW Quality management system based on ISO 9000 WHAT IS QMS (QUALITY MANAGEMENT SYSTEM) Goal: Meet customer needs Quality management system includes
A Guide To The Project Management Body of Knowledge (PMBOK) Significant Changes from the 3 rd edition to the 4 th edition
A Guide To The Project Body of Knowledge (PMBOK) Significant Changes from the 3 rd edition to the 4 th edition Major Changes The adoption of the verb-noun format for process names Amplification as to Enterprise
Intland s Medical Template
Intland s Medical Template Traceability Browser Risk Management & FMEA Medical Wiki Supports compliance with IEC 62304, FDA Title 21 CFR Part 11, ISO 14971, IEC 60601 and more INTLAND codebeamer ALM is
MNLARS Project Audit Checklist
Audit Checklist The following provides a detailed checklist to assist the audit team in reviewing the health of a project. Relevance (at this time) How relevant is this attribute to this project or audit?
Certification of Materials and Components
OIL & GAS Certification of Materials and Components Standardisation of the industry s approach to quality control and assurance processes Martin Fowlie 12th February 2015 1 DNV GL 2013 12th February 2015
IRCA Briefing note ISO/IEC 20000-1: 2011
IRCA Briefing note ISO/IEC 20000-1: 2011 How to apply for and maintain Training Organization Approval and Training Course Certification IRCA 3000 Contents Introduction 3 Summary of the changes within ISO/IEC
NABL NATIONAL ACCREDITATION
NABL 160 NABL NATIONAL ACCREDITATION BOARD FOR TESTING AND CALIBRATION LABORATORIES GUIDE for PREPARING A QUALITY MANUAL ISSUE NO. : 05 AMENDMENT NO : 00 ISSUE DATE: 27.06.2012 AMENDMENT DATE: -- Amendment
Space Project Management
EUROPEAN COOPERATION FOR SPACE STANDARDIZATION Space Project Management Configuration Management Secretariat ESA ESTEC Requirements & Standards Division Noordwijk, The Netherlands Published by: Price:
INFORMATION TECHNOLOGY SECURITY STANDARDS
INFORMATION TECHNOLOGY SECURITY STANDARDS Version 2.0 December 2013 Table of Contents 1 OVERVIEW 3 2 SCOPE 4 3 STRUCTURE 5 4 ASSET MANAGEMENT 6 5 HUMAN RESOURCES SECURITY 7 6 PHYSICAL AND ENVIRONMENTAL
Lecture Slides for Managing and Leading Software Projects. Chapter 1: Introduction
Lecture Slides for Managing and Leading Software Projects Chapter 1: Introduction developed by Richard E. (Dick) Fairley, Ph.D. to accompany the text Managing and Leading Software Projects published by
NORWEGIAN PROJECT MANAGEMENT PRINCIPLES APPLIED IN THE JURONG ROCK CAVERN PROJECT
NORWEGIAN PROJECT MANAGEMENT PRINCIPLES APPLIED IN THE JURONG ROCK CAVERN PROJECT FINN FAGERVIK 1, PETTER PLASSBAK 1 and TEO TIONG YONG 2 1 Sintef-Tritech-Multiconsult (STM) Consortium, Singapore E-mail:[email protected]
Micron Quality Manual
Micron Quality Manual The Quality Management System (QMS) is an integral component in allowing Micron to achieve the next level in Quality Control and in delivering Total Quality Excellence to our customers.
Appendix V Risk Management Plan Template
Appendix V Risk Management Plan Template Version 2 March 7, 2005 This page is intentionally left blank. Version 2 March 7, 2005 Title Page Document Control Panel Table of Contents List of Acronyms Definitions
Engineering Procurement Construction Quality Plan
Engineering Procurement Construction Quality Plan Index 1 Introduction... 4 1.1 Project Background... 4 1.2 Document Purpose... 4 1.3 Change Control... 4 1.4 Contract... 4 1.5 Quality system... 4 1.6 Distribution...
IEC 61508 Functional Safety Assessment. Project: K-TEK Corporation AT100, AT100S, AT200 Magnetostrictive Level Transmitter.
61508 SIL 3 CAPABLE IEC 61508 Functional Safety Assessment Project: K-TEK Corporation AT100, AT100S, AT200 Magnetostrictive Level Transmitter Customer: K-TEK Corporation Prairieville, LA USA Contract No.:
Capability Maturity Model Integration (CMMI SM ) Fundamentals
Capability Maturity Model Integration (CMMI SM ) Fundamentals Capability Maturity Model Integration and CMMI are are service marks of Carnegie Mellon University 2008, GRafP Technologies inc. 1 What is
FAA WILLIAM J. HUGHES TECHNICAL CENTER ATLANTIC CITY INTERNATIONAL AIRPORT, NEW JERSEY 08405
FAA WILLIAM J. HUGHES TECHNICAL CENTER TEST AND EVALUATION HANDBOOK DOCUMENT # VVSPT-A2-PDD-013 VERSION # VERSION 3.0 VERSION DATE SEPTEMBER 24, 2013 FAA WILLIAM J. HUGHES TECHNICAL CENTER ATLANTIC CITY
Superseded by T MU AM 04001 PL v2.0
Plan T MU AM 04001 PL TfNSW Configuration Management Plan Important Warning This document is one of a set of standards developed solely and specifically for use on the rail network owned or managed by
Testing Automated Manufacturing Processes
Testing Automated Manufacturing Processes (PLC based architecture) 1 ❶ Introduction. ❷ Regulations. ❸ CSV Automated Manufacturing Systems. ❹ PLCs Validation Methodology / Approach. ❺ Testing. ❻ Controls
ELECTROTECHNIQUE IEC INTERNATIONALE 61508-3 INTERNATIONAL ELECTROTECHNICAL
61508-3 ª IEC: 1997 1 Version 12.0 05/12/97 COMMISSION CEI ELECTROTECHNIQUE IEC INTERNATIONALE 61508-3 INTERNATIONAL ELECTROTECHNICAL COMMISSION Functional safety of electrical/electronic/ programmable
CP14 ISSUE 5 DATED 1 st OCTOBER 2015 BINDT Audit Procedure Conformity Assessment and Certification/Verification of Management Systems
Certification Services Division Newton Building, St George s Avenue Northampton, NN2 6JB United Kingdom Tel: +44(0)1604-893-811. Fax: +44(0)1604-893-868. E-mail: [email protected] CP14 ISSUE 5 DATED 1 st OCTOBER
Hardware safety integrity Guideline
Hardware safety integrity Comments on this report are gratefully received by Johan Hedberg at SP Swedish National Testing and Research Institute mailto:[email protected] Quoting of this report is allowed
Request for Proposal for Application Development and Maintenance Services for XML Store platforms
Request for Proposal for Application Development and Maintenance s for ML Store platforms Annex 4: Application Development & Maintenance Requirements Description TABLE OF CONTENTS Page 1 1.0 s Overview...
Copyright 2014 Carnegie Mellon University The Cyber Resilience Review is based on the Cyber Resilience Evaluation Method and the CERT Resilience
Copyright 2014 Carnegie Mellon University The Cyber Resilience Review is based on the Cyber Resilience Evaluation Method and the CERT Resilience Management Model (CERT-RMM), both developed at Carnegie
IAEA-TECDOC-1328 Solutions for cost effective assessment of software based instrumentation and control systems in nuclear power plants
IAEA-TECDOC-1328 Solutions for cost effective assessment of software based instrumentation and control systems in nuclear power plants Report prepared within the framework of the Technical Working Group
Company Management System. Business Continuity in SIA
Company Management System Business Continuity in SIA Document code: Classification: Company Project/Service Year Document No. Version Public INDEX 1. INTRODUCTION... 3 2. SIA S BUSINESS CONTINUITY MANAGEMENT
Business Operations. Module Db. Capita s Combined Offer for Business & Enforcement Operations delivers many overarching benefits for TfL:
Module Db Technical Solution Capita s Combined Offer for Business & Enforcement Operations delivers many overarching benefits for TfL: Cost is reduced through greater economies of scale, removal of duplication
Network Certification Body
Network Certification Body Scheme rules for assessment of railway projects to requirements of the Railways Interoperability Regulations as a Notified and Designated Body 1 NCB_MS_56 Contents 1 Normative
Fuel Treatment and Conditioning Systems
RULES FOR CLASSIFICATION OF Ships PART 6 CHAPTER 14 NEWBUILDINGS SPECIAL EQUIPMENT AND SYSTEMS ADDITIONAL CLASS Fuel Treatment and Conditioning Systems JULY 2006 This chapter has been amended since the
Positive Train Control (PTC) Program Management Plan
Positive Train Control (PTC) Program Management Plan Proposed Framework This document is considered an uncontrolled copy unless it is viewed online in the organization s Program Management Information
STATE BOARD OF ELECTIONS P.O. BOX 6486, ANNAPOLIS, MD 21401-0486 PHONE (410) 269-2840
MARYLAND STATE BOARD OF ELECTIONS P.O. BOX 6486, ANNAPOLIS, MD 21401-0486 PHONE (410) 269-2840 Bobbie S. Mack, Chairman David J. McManus, Jr., Vice Chairman Rachel T. McGuckian Patrick H. Murray Charles
Domain 1 The Process of Auditing Information Systems
Certified Information Systems Auditor (CISA ) Certification Course Description Our 5-day ISACA Certified Information Systems Auditor (CISA) training course equips information professionals with the knowledge
Agile Project Execution
ebook Agile Project Execution The future of Industrial Process Automation projects v1.4 EMK(VDS)-TR-EB-01 APEX ebook Table of Contents Intro Agile Project Execution Page 2. Chapter 1 Conventional Project
Independent Verification and Validation of SAPHIRE 8 Software Project Plan
INL/EXT-09-17022 Rev. 2 Independent Verification and Validation of SAPHIRE 8 Software Project Plan March 2010 The INL is a U.S. Department of Energy National Laboratory operated by Battelle Energy Alliance
