DNV-OS-D203: Integrated Software Dependent Systems (ISDS)

OFFSHORE STANDARD
DNV-OS-D203
Integrated Software
Dependent Systems (ISDS)
DECEMBER 2012
The electronic pdf version of this document found through http://www.dnv.com is the officially binding version
DET NORSKE VERITAS AS
FOREWORD
DNV is a global provider of knowledge for managing risk. Today, safe and responsible business conduct is both a license
to operate and a competitive advantage. Our core competence is to identify, assess, and advise on risk management. From
our leading position in certification, classification, verification, and training, we develop and apply standards and best
practices. This helps our customers safely and responsibly improve their business performance. DNV is an independent
organisation with dedicated risk professionals in more than 100 countries, with the purpose of safeguarding life, property
and the environment.
DNV service documents consist of among others the following types of documents:
— Service Specifications. Procedural requirements.
— Standards. Technical requirements.
— Recommended Practices. Guidance.
The Standards and Recommended Practices are offered within the following areas:
A) Qualification, Quality and Safety Methodology
B) Materials Technology
C) Structures
D) Systems
E) Special Facilities
F) Pipelines and Risers
G) Asset Operation
H) Marine Operations
J) Cleaner Energy
O) Subsea Systems
U) Unconventional Oil & Gas
© Det Norske Veritas AS December 2012
Any comments may be sent by e-mail to [email protected]
This service document has been prepared based on available knowledge, technology and/or information at the time of issuance of this document, and is believed to reflect the best of
contemporary technology. The use of this document by others than DNV is at the user's sole risk. DNV does not accept any liability or responsibility for loss or damages resulting from
any use of this document.
Offshore Standard DNV-OS-D203, December 2012
Changes – Page 3
CHANGES
General
This document supersedes DNV-OS-D203, May 2011.
Text affected by the main changes in this edition is highlighted in red colour. However, if the changes involve
a whole chapter, section or sub-section, normally only the title will be in red colour.
Changes
The document is totally revised.
CONTENTS
CH. 1 INTRODUCTION .............................................................................................................................. 6
Sec. 1 General ................................................................................................................................................ 7
A. General ............................................................................................................................................................................ 7
A 100 Introduction........................................................................................................................................................... 7
A 200 Objectives ............................................................................................................................................................. 7
A 300 Organisation of content......................................................................................................................................... 7
A 400 Assumptions.......................................................................................................................................................... 7
A 500 Scope and application ........................................................................................................................................... 7
A 600 Types of software within scope ............................................................................................................................ 8
A 700 Alterations and additions of approved systems .................................................................................................... 8
B. References ....................................................................................................................................................................... 8
B 100 International or national references ...................................................................................................................... 8
B 200 DNV references .................................................................................................................................................... 9
CH. 2 TECHNICAL PROVISIONS.......................................................................................................... 10
Sec. 1 Principles........................................................................................................................................... 11
A. General .......................................................................................................................................................................... 11
A 100 Process requirements .......................................................................................................................................... 11
A 200 System hierarchy................................................................................................................................................. 11
Sec. 2 Confidence Levels............................................................................................................................. 12
A. Confidence Levels........................................................................................................................................................ 12
A 100 Definition of confidence levels........................................................................................................................... 12
Sec. 3 Responsibilities ................................................................................................................................. 13
A. Activities and Roles ...................................................................................................................................................... 13
A 100 Activities ............................................................................................................................................................. 13
A 200 Roles ................................................................................................................................................................... 14
Sec. 4 Project Phases and Process Areas................................................................................................... 15
A. Project phases................................................................................................................................................................ 15
A 100 Introduction......................................................................................................................................................... 15
A 200 Basic engineering phase (A) ............................................................................................................................... 15
A 300 Engineering phase (B)......................................................................................................................................... 15
A 400 Construction phase (C) ....................................................................................................................................... 16
A 500 Acceptance phase (D) ......................................................................................................................................... 16
A 600 Operation phase (E) ............................................................................................................................................ 16
B. Process Areas ................................................................................................................................................................ 16
B 100 Introduction......................................................................................................................................................... 16
B 200 Requirements engineering (REQ)....................................................................................................................... 16
B 300 Design (DES) ...................................................................................................................................................... 16
B 400 Implementation (IMP) ........................................................................................................................................ 16
B 500 Acquisition (ACQ).............................................................................................................................................. 17
B 600 Integration (INT)................................................................................................................................................. 17
B 700 Verification and Validation (VV) ....................................................................................................................... 17
B 800 Reliability, Availability, Maintainability and Safety (RAMS)........................................................................... 17
B 900 Project Management (PM).................................................................................................................................. 17
B 1000 Risk Management (RISK) .................................................................................................................................. 18
B 1100 Process and Quality Assurance (PQA) ............................................................................................................... 18
B 1200 Configuration Management (CM) ...................................................................................................................... 18
Sec. 5 ISDS Requirements for Owners...................................................................................................... 19
A. Owner requirements ...................................................................................................................................................... 19
A 100 Requirements under the owner’s responsibility.................................................................................................. 19
A 200 Acceptance criteria for owner assessments......................................................................................................... 20
A 300 Documentation criteria for the owner ................................................................................................................. 21
Sec. 6 ISDS Requirements for System Integrators .................................................................................. 22
A. System integrator requirements .................................................................................................................................... 22
A 100 Requirements under the system integrator’s responsibility ................................................................................ 22
A 200 Acceptance criteria for system integrator assessments....................................................................................... 23
A 300 Documentation criteria for the system integrator ............................................................................................... 26
Sec. 7 ISDS Requirements for Suppliers................................................................................................... 28
A. Supplier requirements ................................................................................................................................................... 28
A 100 Requirements under the supplier’s responsibility............................................................................................... 28
A 200 Acceptance criteria for supplier assessments...................................................................................................... 29
A 300 Documentation criteria for the supplier .............................................................................................................. 32
Sec. 8 ISDS Requirements for the Independent Verifier ........................................................................ 34
A. Independent verifier requirements ................................................................................................................................ 34
A 100 Activities for which the independent verifier is responsible .............................................................................. 34
CH. 3 CLASSIFICATION AND CERTIFICATION .............................................................................. 42
Sec. 1 Requirements .................................................................................................................................... 43
A. General .......................................................................................................................................................................... 43
A 100 Introduction......................................................................................................................................................... 43
A 200 Organisation of Chapter 3................................................................................................................................... 43
A 300 Classification principles...................................................................................................................................... 43
A 400 Compliance of Activities .................................................................................................................................... 43
A 500 Approval of Documents...................................................................................................................................... 43
A 600 Rating of compliance .......................................................................................................................................... 43
A 700 Reporting and milestone meetings...................................................................................................................... 43
B. Class notation ................................................................................................................................................................ 44
B 100 Designation ......................................................................................................................................................... 44
B 200 Scope................................................................................................................................................................... 44
C. In operation assessments ............................................................................................................................................... 46
C 100 Objectives ........................................................................................................................................................... 46
C 200 Scope of annual assessments .............................................................................................................................. 47
C 300 Scope of renewal assessments ............................................................................................................................ 47
App. A DEFINITIONS AND ABBREVIATIONS ..................................................................................... 49
A. Definitions..................................................................................................................................................................... 49
A 100 Verbal Forms ...................................................................................................................................................... 49
A 200 Definitions .......................................................................................................................................................... 49
B. Abbreviations ................................................................................................................................................................ 54
App. B REQUIREMENT DEFINITION .................................................................................................... 56
A. Requirement definition ................................................................................................................................................. 56
A 100 General................................................................................................................................................................ 56
A 200 Activity definition basic engineering.................................................................................................................. 56
A 300 Activity definition engineering ........................................................................................................................... 65
A 400 Activity definition construction .......................................................................................................................... 77
A 500 Activity definition acceptance ............................................................................................................................ 86
A 600 Activity definition operation............................................................................................................................... 90
A 700 Activity definition several phases....................................................................................................................... 94
OFFSHORE STANDARD
DNV-OS-D203
INTEGRATED SOFTWARE DEPENDENT SYSTEMS
CHAPTER 1
INTRODUCTION
CONTENTS
Sec. 1 General................................................................................................................................................ 7
SECTION 1
GENERAL
A. General
A 100 Introduction
101 This standard contains requirements and guidance on the process of design, construction,
commissioning and operation of Integrated Software Dependent Systems (ISDS). ISDS are integrated systems
where the overall behaviour depends on the behaviour of the systems’ software components.
102 This standard focuses on the integration of the software dependent systems, sub-systems and system
components, and the effects these have on the overall performance of the unit (ship, rig etc.) in terms of
functionality, quality, reliability, availability, maintainability and safety.
This standard is intended to help system integrators, suppliers and owners to:
— reduce the risk of delays in new-build projects and modification projects,
— reduce the risk of downtime and accidents caused by software in the operation phase,
— improve the processes for maintenance and upgrades of software dependent systems throughout the life
cycle,
— improve the knowledge of the relevant systems and software across the organisations,
— work within a common framework to deliver on schedule while achieving functionality, quality, reliability,
availability, maintainability and safety targets,
— communicate and resolve key issues related to integration challenges at an early stage and throughout the
whole life cycle.
A 200 Objectives
201 The objectives of this standard are to:
— provide an internationally acceptable standard for integrated software dependent systems by defining
requirements for the work processes during design, construction, commissioning and operation,
— serve as a contractual reference document between suppliers and purchasers,
— serve as a guideline for designers, suppliers, purchasers and regulators,
— specify processes and requirements for units or installations subject to DNV certification and classification
services.
A 300 Organisation of content
301 This document is divided into the following chapters and appendices:
— Ch.1 gives general introduction, scope and references.
— Ch.2 lists the requirements for the different roles, including assessment and document requirements.
— Ch.3 gives procedures and principles applicable when this standard is used as part of DNV classification.
— Appendix A lists definitions and abbreviations used in this standard.
— Appendix B gives a detailed description of the activities introduced in Ch.2.
A 400 Assumptions
401 The requirements of this standard are based on the assumption that personnel are qualified to execute
the assigned activities.
402 The requirements of this standard are based on the assumption that the parties involved in the different
processes are familiar with the intended function(s) of the system(s) subject to ISDS.
A 500 Scope and application
501 The requirements of this standard apply to the processes that manage ISDS throughout the life cycle of
a ship or offshore unit, and apply to new-builds, upgrades and modification projects. It puts requirements on
the ways of working, but does not contain any specific product requirements.
502 The requirements of this standard apply to systems, sub-systems and software components created,
modified, parameterized, tuned and/or configured for the specific project where this standard is applied. This
standard focuses on the software aspect in the context of system and unit requirements.
503 The voluntary ISDS class notation, as specified in Ch.3, may be assigned when DNV has verified
compliance. DNV’s verification activities include all the activities specified under the independent verifier role
in Ch.2, for the relevant confidence level.
A 600 Types of software within scope
601 This standard focuses on achieving high software quality and takes into consideration all typical types
of software. The requirements differ depending on whether the software is new or reused:
— New software (typically application software) developed within the project is qualified for use in the ISDS
by showing that the supplier’s development process is compliant with this standard.
— All reused software shall be qualified for use. Reused software is either COTS or ‘base products’. The term
‘base product’ is here used to describe any kind of existing product, component, software library, software
template or similar on which the supplier bases the development (or automatic generation) of the custom
specific product.
The qualification of reused software shall be performed by using one of these options:
1) Demonstrating compliance with this standard.
2) Assessing the quality through due diligence of the software.
3) Demonstrating that the software is proven-in-use.
4) Procurement of COTS software as described in this standard.
A 700 Alterations and additions of approved systems
701 When an alteration or addition to the approved system(s) is proposed, applicable ISDS requirements
shall be applied and relevant information shall be submitted to DNV. The alterations or additions shall be
presented for assessment and verification.
B. References
B 100 International or national references
101 The standards listed in Table B1 are referenced in this standard.
Table B1 International or national references

Reference            Title
IEC IEV 191          Dependability and quality of service
IEC 61508            Functional safety of electrical/electronic/programmable electronic safety-related systems
IEC 61511            Functional safety – Safety instrumented systems for the process industry sector
IEC 19501:2005       Unified Modelling Language Specification
IEEE 610.12:1990     Glossary of software engineering terminology
IEEE 828-1983        Software configuration management plan
IEEE 829-1983        Software test documentation
IEEE 1074:2006       Developing software life cycle processes
INCOSE SE 2004       INCOSE System Engineering Handbook, 2004
ISO/IEC 9126         Software engineering — Product quality
ISO/IEC 15288        Life Cycle Management — System Life Cycle Processes
ISO 9000             Quality management systems
SWEBOK 2004          Guide to the Software Engineering Body of Knowledge (SWEBOK), 2004 Version
B 200 DNV references
201 This standard is complementary to the standards listed in Tables B2 and B3.
Table B2 DNV Offshore Standards

Standard         Title
DNV-OSS-101      Rules for Classification of Offshore Drilling and Support Units
DNV-OSS-102      Rules for Classification of Floating Production, Storage and Loading Units
DNV-OSS-103      Rules for Classification of LNG/LPG Floating Production and Storage Units or Installations
DNV-OSS-300      Risk Based Verification
DNV-OS-A101      Safety Principles and Arrangements
DNV-OS-D101      Marine and Machinery Systems and Equipment
DNV-OS-D201      Electrical Installations
DNV-OS-D202      Automation, Safety and Telecommunication Systems
DNV-OS-D301      Fire Protection
DNV-OS-E101      Drilling Plant
DNV-OS-E201      Oil and Gas Processing Systems
DNV-OS-E301      Position Mooring
Table B3 Other DNV references

Reference        Title
DNV-RP-D201      Recommended practice for Integrated Software Dependent Systems
DNV-RP-A201      Plan Approval Documentation Types – Definitions
DNV-RP-A203      Recommended Practice for Qualification of New Technology
Pt.6 Ch.7        Rules for Classification of Ships - Dynamic Positioning Systems
Pt.6 Ch.26       Rules for Classification of Ships - Dynamic Positioning System - Enhanced Reliability (DYNPOS-ER)
SfC 2.24         Standards for Certification - Hardware in the Loop Testing (HIL)
OFFSHORE STANDARD
DNV-OS-D203
INTEGRATED SOFTWARE DEPENDENT SYSTEMS
CHAPTER 2
TECHNICAL PROVISIONS
CONTENTS
Sec. 1 Principles............................................................................................................................ 11
Sec. 2 Confidence Levels ............................................................................................................. 12
Sec. 3 Responsibilities .................................................................................................................. 13
Sec. 4 Project Phases and Process Areas ...................................................................................... 15
Sec. 5 ISDS Requirements for Owners......................................................................................... 19
Sec. 6 ISDS Requirements for System Integrators....................................................................... 22
Sec. 7 ISDS Requirements for Suppliers...................................................................................... 28
Sec. 8 Requirements for the Independent Verifier ....................................................................... 34
SECTION 1
PRINCIPLES
A. General
A 100 Process requirements
101 This standard provides requirements for a process. These requirements are formulated as a set of
activities that apply for specific roles during specific project phases and at specific confidence levels.
A 200 System hierarchy
201 In order to describe the different parts that make up Integrated Software Dependent Systems, this
standard uses the hierarchy defined in Fig.1.
Figure 1
The hierarchy terms used in this standard
SECTION 2
CONFIDENCE LEVELS
A. Confidence Levels
A 100 Definition of confidence levels
101 Confidence levels are assigned by the owner to a selection of the unit’s functions and systems, based on
evaluations of the importance of these functions in relation to reliability, availability, maintainability and
safety.
102 Confidence levels define the required level of trust that a given function (implemented by one or more
systems) will perform as expected. This standard defines the confidence levels 1 through 3 where the higher
confidence level will require a higher project ambition level with the aim of increasing the dependability of the
systems in scope. The higher confidence levels also include the activities required for the lower ones.
103 Table A1 shows the difference between confidence levels 1, 2 and 3.
104 Ch.3, Sec.1, B200 shows the recommended confidence levels for systems and components relevant for
selected unit types (drilling unit, FPSO etc.).
Guidance note:
See DNV-RP-D201 for general guidance on the principles of assigning confidence levels to functions and systems.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Table A1 The difference between the confidence levels

Confidence level 1
  Characteristics: Basic software confidence
  Focus:
  — System
  Key activities:
  — Project management
  — Defined ways of working
  — Design and verification of software within a system

Confidence level 2
  Characteristics: Enhanced integration confidence
  Focus:
  — Systems
  — System integration
  Key activities:
  — Interface definition
  — Describing interaction between systems
  — Traceability of requirements
  — Qualitative RAMS
  — Obsolescence management

Confidence level 3
  Characteristics: Enhanced quantified confidence
  Focus:
  — Systems
  — System integration
  — High dependability
  Key activities:
  — Quantitative RAMS
  — High involvement of independent verifier
  — Enhanced verification
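The cumulative nature of the confidence levels (see 102: each higher level also includes the activities required for the lower ones) can be illustrated with a short sketch. This is an illustrative data model only, not part of the standard; the activity groupings follow Table A1, while the function and dictionary names are our own:

```python
# Key activities per confidence level, grouped as in Table A1 (illustrative only)
KEY_ACTIVITIES = {
    1: ["Project management", "Defined ways of working",
        "Design and verification of software within a system"],
    2: ["Interface definition", "Describing interaction between systems",
        "Traceability of requirements", "Qualitative RAMS",
        "Obsolescence management"],
    3: ["Quantitative RAMS", "High involvement of independent verifier",
        "Enhanced verification"],
}

def required_activities(confidence_level):
    """Per 102, a higher confidence level includes the activities of the lower ones."""
    if confidence_level not in KEY_ACTIVITIES:
        raise ValueError("this standard defines confidence levels 1 through 3")
    activities = []
    for level in range(1, confidence_level + 1):
        activities.extend(KEY_ACTIVITIES[level])
    return activities

# A level-3 system therefore also carries all level-1 and level-2 key activities
print(len(required_activities(3)))  # 11
```

The sketch simply accumulates the key-activity lists from level 1 up to the selected level, mirroring the inclusion rule stated in 102.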
SECTION 3
RESPONSIBILITIES
A. Activities and Roles
A 100 Activities
101 Each requirement of this standard is formulated as an activity which is assigned to a role, a defined project
phase, and a confidence level. The activities are listed in Sec.5 to 7 and described in detail in Appendix B.
Guidance note:
Each required activity has a unique identifier. The identifier is structured in three parts: Z.YYY.NN. The first part
(“Z”) of the activity identifier refers to the project phase. The second part (“YYY”) of the activity identifier refers to
the process area. The third part (“NN”) of the activity identifier is a unique number for the activity. For example,
A.REQ.2 is the identifier of the 2nd activity of the requirements process area for the basic engineering phase. Some
activities are performed in two or more phases; in this case, the activity’s phase is denoted “X”, and each “X”
activity describes in which phases it shall be performed. For example, X.REQ.1 is the common activity no. 1 for the
requirements process area, for managing requirement changes in all phases.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
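The Z.YYY.NN identifier scheme described in the guidance note can be sketched as a small parser. This is an illustration only, not part of the standard; the phase and process-area codes are taken from Sec.4, and the function name is our own:

```python
import re

# Phase codes from Ch.2 Sec.4; "X" marks an activity common to several phases
PHASES = {"A": "Basic engineering", "B": "Engineering", "C": "Construction",
          "D": "Acceptance", "E": "Operation", "X": "Several phases"}
# Process-area codes from Ch.2 Sec.4 B
AREAS = {"REQ", "DES", "IMP", "ACQ", "INT", "VV", "RAMS", "PM", "RISK", "PQA", "CM"}

def parse_activity_id(identifier):
    """Split an activity identifier Z.YYY.NN into (phase, process area, number)."""
    match = re.fullmatch(r"([A-EX])\.([A-Z]+)\.(\d+)", identifier)
    if not match or match.group(2) not in AREAS:
        raise ValueError(f"not a valid ISDS activity identifier: {identifier}")
    phase, area, number = match.groups()
    return PHASES[phase], area, int(number)

print(parse_activity_id("A.REQ.2"))  # ('Basic engineering', 'REQ', 2)
print(parse_activity_id("X.REQ.1"))  # ('Several phases', 'REQ', 1)
```

The parser rejects any string whose phase letter or process-area code falls outside the sets defined in Sec.4, which is exactly the validity condition the identifier scheme implies.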
102 Several activities require communication between different roles to be carried out. For these activities
the contributing role(s) are specified, in addition to the responsible role. The expected contributions are
specified in this standard, and the contributing role shall provide the specified information to the responsible
role when requested.
Figure 1
An overview of communication and exchange of information between the roles.
A 200 Roles
201 This standard defines requirements on several organisations with responsibilities within the system life
cycle. Each role is assigned activities and has the responsibility to perform the activity with an outcome that
fulfils the specified criteria.
202 Each organisation is assigned one of four predefined roles. The four roles are:
— Owner (OW): In the context of this standard the owner is the organisation that decides to develop the unit
(ship, rig etc.), and provides funding. The operator of the system can be part of the owner's organisation,
or can be a subcontractor acting on behalf of the owner. For a definition of the term owner, reference is also
made to DNV-OSS-101 Ch.1, Sec.1, B.
— System integrator (SI): Responsible for the integration of all systems included in the scope of this standard.
The system integrator is normally the shipyard, but parts of the integration may be delegated to other
parties. In such cases, this shall be clearly defined and documented.
— Supplier (SU): Responsible for the integration and delivery of one or more single systems (drilling control
system, power management system etc.). If the supplier purchases products and services from other
organisations, these are regarded as sub-suppliers, and are under the supplier's responsibility.
— Independent verifier (IV): An organisation that is mandated to independently verify that the system is
developed according to this standard. As part of the classification process for the ISDS notation, DNV shall
take the role of independent verifier.
SECTION 4
PROJECT PHASES AND PROCESS AREAS
A. Project phases
A 100 Introduction
101 All activities in this standard are mapped to the typical project life cycle, see Fig.1.
102 The transitions between the phases represent ISDS milestones. At each milestone the following
information shall be reported by the involved roles:
— status for the compliance to this standard,
— action plans for handling non-conformities,
— risk observations made by the independent verifier.
103 At each milestone a “milestone meeting” should be arranged. The system integrator is responsible for
arranging such meetings at all milestones, except M5 for which the owner is responsible.
104 An ISDS milestone is completed when the owner, the system integrator and the independent verifier
endorse the information presented at the milestone.
Figure 1
Process chart describing the relationship between project phases (A to E) and ISDS milestones (M1 to
M5). ISDS milestone M5 is only applicable for modifications made during the operation phase.
A 200 Basic engineering phase (A)
201 During this phase the technical specification and design of the unit are established, including RAMS
requirements. The main systems which will be included in the scope for ISDS requirements and their
confidence levels are identified. The contract between the owner and the system integrator is established during
this phase.
Guidance note:
The following activities normally take place before the contract between the owner and the system integrator is
signed, depending on the confidence level for the different systems:
— Define mission, objectives and shared vision (A.REQ.1, CL1 and above).
— Define operational modes and scenarios to capture expected behaviour (A.REQ.3, CL2 and above).
— Define RAM related requirements and objectives (A.RAMS.2, CL2 and above).
— Define procedures (owner) (A.PQA.1, CL1 and above).
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
A 300 Engineering phase (B)
301 In this phase contracts are established with suppliers, and the suppliers are involved in setting up the
development/configuration of each system. The detailed design of the unit and systems is documented. The
verification, validation, testing and integration strategies, and major interfaces are established, and RAMS
analyses are carried out.
Guidance note:
The ISDS process is normally aligned with the overall building schedule so that steel cutting takes place towards the
end of the ISDS engineering phase.
Normally, only one phase B activity takes place before the contracts between the system integrator and the suppliers
are signed:
— Submit proposals to system integrator with compliance status (B.REQ.1, CL1 and above).
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
A 400 Construction phase (C)
401 In this phase the construction and integration of the systems are carried out. Detailed software design,
coding and/or parameterization are performed. Systems and interfaces are tested, validated and verified as part
of this phase.
A 500 Acceptance phase (D)
501 In this phase, the functionality, performance, RAMS requirements and other aspects of the integrated
systems are validated, through commissioning activities, integration testing, and sea trials.
A 600 Operation phase (E)
601 In this phase the unit is in operation. Maintenance and small upgrades are performed as deemed
necessary by the owner.
602 Large upgrades should be managed as separate projects, following a distinct lifecycle based on this
standard.
Guidance note:
Any planned upgrade resulting in the shutdown of the unit or ship for any extended period of time should be regarded
as a large upgrade.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
B. Process Areas
B 100 Introduction
101 Activities are logically grouped in process areas based on the typical engineering discipline they address.
Each process area spans multiple project phases.
B 200 Requirements engineering (REQ)
201 The requirements engineering process area covers the activities needed to define, document and manage
the requirements on the unit, the systems and the related software.
On CL1, the overall goal and vision for the unit are defined and the requirements on the unit and relevant
systems are specified. A dialogue between the owner/system integrator on one hand and the potential suppliers
on the other is expected to take place in order to align the requirements with the supplier’s systems. The
allocation of requirements to systems shall be documented. No specific methods or formats for the unit or
system requirements are expected.
On CL2, operational modes and scenarios are defined in order to put the requirements into an operational
context, and to detail the interaction between different systems. A trace from requirements to design and
verification shall be documented and maintained.
CL3 introduces additional independent verifier activities.
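The CL2 trace from requirements to design and verification can take many forms; the standard deliberately prescribes no method or format. As a minimal sketch (all identifiers, field names and example entries below are hypothetical), a traceability record only needs links in both directions and a way to find gaps:

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    """A unit- or system-level requirement with its trace links (sketch only)."""
    req_id: str
    text: str
    design_refs: list = field(default_factory=list)        # e.g. design document sections
    verification_refs: list = field(default_factory=list)  # e.g. test procedures, reviews

# Hypothetical example entries; real projects would typically hold these in a
# requirements management tool or a maintained traceability matrix.
requirements = [
    Requirement("UR-012", "The unit shall ...", ["DES-3.2"], ["FAT-07"]),
    Requirement("UR-013", "The DP system shall ...", ["DES-4.1"], []),
]

# A simple trace-gap check: requirements not yet linked to any verification.
gaps = [r.req_id for r in requirements if not r.verification_refs]
print(gaps)  # ['UR-013']
```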
B 300 Design (DES)
301 The design process area consists of activities to establish a design on different levels. Together with the
interface related activities in the integration process area (INT), this creates the design basis from which the
systems and related software can be produced and verified.
On CL1, each system is designed with identification of major sub-systems and components. The external
interfaces shall be documented and coordinated with other relevant systems.
On CL2, the unit design is documented, maintained, and analysed with focus on integration of the systems. A
strategy for handling of current and future obsolescence is expected to be defined, and design guidelines to be
established. The architecture of relevant systems and software components is detailed and documented,
including new, reused, and parameterized software. The documentation related to any base-products shall be
kept up to date.
CL3 introduces additional independent verifier activities.
B 400 Implementation (IMP)
401 The implementation process area covers the coding and configuration activities needed to create and
customize software modules in order to fulfil the specified design. In addition, associated support
documentation shall be created.
On CL1, the software components are programmed, parameterized and tested based on a baselined design.
On CL2, implementation guidelines and methods are expected to be used as additional input to the
programming and parameterization, and support documentation like user manuals is produced.
CL3 does not add any requirements.
B 500 Acquisition (ACQ)
501 The acquisition process area covers the activities needed when a supplier uses sub-suppliers to develop
or deliver components or systems. It covers both the situation where configuration and development of the
software components is subcontracted and the situation where the supplier buys 'commercial off the shelf'
(COTS) systems.
On CL1, specific contracts between the supplier and the sub-supplier are established and followed-up. The
components or systems are verified at delivery and it shall be ensured that the delivered component/system can
be integrated into the system/unit in question.
On CL2, COTS systems are selected based on defined criteria. Intermediate deliveries are reviewed, and
acquired components or systems are monitored with regard to obsolescence during the operation phase.
CL3 does not add any requirements.
B 600 Integration (INT)
601 The integration process area covers the assembly of the systems into the unit, and activities to coordinate
the interfaces between the systems.
On CL1, the responsibilities regarding each system and how it is to be integrated are defined.
On CL2, a specific integration plan is produced. Inter-system interfaces are coordinated and systems checked
against pre-defined criteria before the integration takes place.
CL3 introduces additional independent verifier activities.
B 700 Verification and Validation (VV)
701 The verification and validation process area addresses the quality assurance activities needed to ensure
that each software component, each system and the integrated systems in concert perform as expected and
fulfil the needs of the intended use.
On CL1, the focus is on the individual systems. It is required that a verification strategy is defined, and that
basic verification activities like FAT, peer reviews, and qualification of reused software are prepared and
performed according to this strategy. It is also expected that the owner performs validation testing during the
acceptance phase, and when modifications and upgrades are performed during the operation phase.
On CL2 the focus is on the functionality and performance of the integrated systems working together, and on
early verification of the correctness and the completeness of requirements, interface specifications, user
interfaces, and design documentation. The verification and validation results are expected to be analysed and
compared with defined targets.
On CL3, the focus is on an elaborated system testing, and that an independent party performs testing on the
system(s). Additional independent verifier activities are also introduced.
B 800 Reliability, Availability, Maintainability and Safety (RAMS)
801 The reliability, availability, maintainability and safety (RAMS) process area gathers the activities
dealing with the definition, analysis and verification of the RAMS properties of the unit and specific systems.
Security aspects are also included.
On CL1, the focus is on the safety part, meaning that applicable laws and regulations regarding safety are
identified, and that software and software failures are taken into account when doing safety analysis. In the
operation phase a structured way of doing maintenance is required.
On CL2, the focus is on the reliability, availability and maintainability (RAM) of the systems in question.
Goals regarding RAM are established, analysed and verified, but the goals can be qualitative in nature. Risks
and mitigations related to the RAMS aspects are managed. The activities related to handling of the RAMS
aspects are planned and followed up during the project. Security aspects are dealt with by performing security
audits.
On CL3, the focus is on RAM objectives that are explicitly defined, analysed and proven fulfilled. In order to
achieve this, the RAM objectives need to be quantitative. Additional independent verifier activities are also
introduced.
B 900 Project Management (PM)
901 The project management process area covers the activities required to make sure that the project plans
for the different organizations involved are created, synchronized and followed-up.
On CL1, basic project management activities regarding project planning and tracking are required, and there
are no additional requirements at CL2 and CL3.
B 1000 Risk Management (RISK)
1001 The risk management process area covers activities related to identifying, mitigating and tracking
product and project risks related to systems and software. Based on the risks, the different systems are assigned
a confidence level.
On CL1, risks are identified, reviewed, tracked, and updated.
On CL2, the risk mitigation actions shall also be tracked, to verify that they have the expected effect on the risk.
CL3 introduces additional independent verifier activities.
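The difference between CL1 and CL2 here is the closed loop on mitigations. A minimal sketch of a risk register entry follows; it is illustrative only, and the field names and example entry are our own, not a format required by the standard:

```python
from dataclasses import dataclass, field

@dataclass
class Mitigation:
    action: str
    implemented: bool = False
    effect_verified: bool = False  # CL2: verify the expected effect on the risk

@dataclass
class Risk:
    risk_id: str
    description: str
    status: str = "open"
    mitigations: list = field(default_factory=list)

# Hypothetical entry: CL1 requires risks to be identified, reviewed, tracked
# and updated; CL2 additionally requires tracking each mitigation action
# until its effect on the risk has been verified.
risk = Risk("R-004", "Obsolete fieldbus interface on the drilling control system")
risk.mitigations.append(Mitigation("Agree an upgrade path with the supplier"))

open_actions = [m.action for m in risk.mitigations if not m.effect_verified]
print(open_actions)  # ['Agree an upgrade path with the supplier']
```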
B 1100 Process and Quality Assurance (PQA)
1101 The process and quality assurance process area covers the activities needed to define and follow up the
way of working within the project. It also covers the activities needed to make sure that the involved
organizations fulfil the requirements in this standard.
On CL1, the applicable procedures for each organization are defined and when necessary coordinated with the
other roles. The adherence to the defined procedures is followed-up by each organization. DNV will follow-up
on the adherence to the requirements in this standard.
There are no additional requirements at CL2 and CL3.
B 1200 Configuration Management (CM)
1201 The configuration management area covers activities to make sure that changes to documents and
software are performed in a controlled way, and to ensure the integrity and consistency of the systems, their
configuration, and all related work products (requirements, design, interface specifications, specifications,
source code, documentation, etc.).
On CL1, the configuration management area includes all required activities, and there are no additional
requirements at CL2 and CL3.
SECTION 5
ISDS REQUIREMENTS FOR OWNERS
A. Owner requirements
A 100 Requirements under the owner’s responsibility
101 The following Table A1 lists the requirements under the owner’s responsibility. See also Table A2 for
the associated acceptance criteria and Table A3 for documentation criteria.
102 The owner shall also contribute to requirements that are under the responsibility of other roles.
103 Appendix B fully specifies the requirements for all roles.
Table A1: Requirements under owner’s responsibility

Reference | Required activity | Contributor(s) | Phase | CL
A.PQA.1 | Define procedures (owner) | - | Basic engineering | CL1
A.RAMS.2 | Define RAM related requirements and objectives | System integrator | Basic engineering | CL2
A.REQ.1 | Define mission, objectives and shared vision | System integrator | Basic engineering | CL1
A.REQ.3 | Define operational modes and scenarios to capture expected behaviour | System integrator | Basic engineering | CL2
A.RISK.1 | Define a strategy for risk management | - | Basic engineering | CL2
A.RISK.3 | Assign confidence levels | System integrator, Supplier | Basic engineering | CL1
A.VV.1 | Validate the concept of the unit with the users | System integrator, Supplier | Basic engineering | CL2
B.DES.5 | Define obsolescence strategy | System integrator, Supplier | Engineering | CL2
D.VV.1 | Perform validation testing | - | Acceptance | CL1
D.VV.2 | Perform validation with operational scenarios | - | Acceptance | CL2
D.VV.3 | Analyse validation results with respect to targets | Supplier | Acceptance | CL2
E.CM.1 | Manage change requests during operation | - | Operation | CL1
E.CM.2 | Perform configuration audits | - | Operation | CL1
E.PQA.1 | Define procedures for problem resolution, change handling, and maintenance activities | Supplier | Operation | CL1
E.RAMS.1 | Maintain and execute the plan for maintenance in operation | Supplier | Operation | CL1
E.RAMS.2 | Collect RAMS data | - | Operation | CL2
E.RAMS.3 | Analyse RAMS data and address discrepancies | - | Operation | CL2
E.RAMS.4 | Perform RAMS impact analysis of changes | - | Operation | CL2
E.RAMS.5 | Periodically perform security audits of the systems in operation | - | Operation | CL2
E.VV.1 | Perform validation testing after changes in the systems in operation | Supplier | Operation | CL1
E.VV.2 | Perform validation with operational scenarios after changes in the systems in operation | Supplier | Operation | CL2
X.PQA.1 | Control procedures (owner) | - | Several | CL1
X.PQA.4 | Follow-up of ISDS assessment gaps (owner) | - | Several | CL1
A 200 Acceptance criteria for owner assessments
201 The following Table A2 lists the acceptance criteria for assessments of the owner. This
evidence shall be presented to the independent verifier during assessments to document that the required
activities have been performed.
202 See also Table A3 for the required documentation criteria.
Table A2: Acceptance criteria for assessments of owner

A.PQA.1
— A quality system, documents, minutes of meetings, or other relevant information showing: a defined way of working for the major activities in the project, clear roles and responsibilities, and defined ways of interaction between the different organizations (e.g. owner, system integrator, supplier, independent verifier, and others).

A.RAMS.2
— Listing of RAM requirements.
— For CL2: qualitative requirements are acceptable.
— For CL3: quantitative requirements (objectives) are required.

A.REQ.1
— Unit design intention and philosophy: the vision of the unit/system, descriptions of the unit's/systems' overall behaviour and the expected business/safety/environmental performance.

A.REQ.3
— Vessel specification: description of the operational modes and corresponding key operational scenarios, detailed to the level of the different systems.

A.RISK.1
— Risk management procedure.
— Blank risk register.

A.RISK.3
— Confidence level matrix for the relevant systems.

A.VV.1
— Unit concept presentation: simulations and minutes of the system concept review meeting.
— FEED study.

B.DES.5
— Obsolescence management plan: authorised vendor list, spare parts list (hardware and software), stock, alternate spare parts list, management of intellectual property, obsolescence criteria for software.
— Manufacturer preferred equipment list.

D.VV.1
— Test procedure: black box tests, boundary tests, software behaviour, parameterisation and calibration.
— Test reports: executed consistent with procedure.
— Test issue list: deviations (punches) and variations.

D.VV.2
— Test procedure: operational scenarios.
— Test reports: tests performed in compliance with procedure and coverage of scenarios.

D.VV.3
— Test procedure: quality criteria.
— Test reports: analysis of the results.
— Test issue list.

E.CM.1
— Change requests.
— Impact analysis.
— Change orders.
— Work orders.
— Problem reports.
— Release notes.
— Maintenance logs.

E.CM.2
— Configuration audit reports.

E.PQA.1
— Configuration management plan.
— Configuration management procedure: migration issues and software obsolescence (ref. E.ACQ.1).
— Maintenance procedures: procedures for maintenance, software update, migration and retirement; backup and restore procedures; and procedures for receiving, recording, resolving and tracking problems and modification requests.
— Change management procedure.
— Issue tracking and resolution procedure.

E.RAMS.1
— Maintenance plan: configuration items, audit activities, maintenance activities, expected software update, migration and retirement activities, maintenance intervals and tailored procedures for the maintenance in operation.
— Malicious software scan log records.
— Maintenance logs.

E.RAMS.2
— RAMS data collection system.
— RAMS data collected.

E.RAMS.3
— RAMS analysis.

E.RAMS.4
— Impact analysis showing RAMS evaluation.

E.RAMS.5
— Security audit report.
Table A2: Acceptance criteria for assessments of owner (Continued)

E.VV.1
— Test procedure: includes black box tests and boundary tests.
— Test reports: consistent with procedure.

E.VV.2
— Test procedures: covering relevant operational scenarios.
— Test reports: tests performed in compliance with procedure and analysis of the results.

X.PQA.1
— Proof that process adherence is being assessed: quality control records, project control records and minutes of meetings, or other relevant information.

X.PQA.4
— Corrective action plan: responsibility allocation for actions, records of actions taken and evidence of implementation of the actions.
A 300 Documentation criteria for the owner
301 The table below lists all documents to be sent to the independent verifier, and the activities in which the
independent verifier will use each document.
302 When the independent verifier is expected to comment on the document, the word ‘reviewed’ is
employed. For documents which serve as background information to put the reviewed documents in a context,
the word ‘used’ is employed.
303 Most documents are provided for information (FI). The only document that is sent to the independent
verifier for approval (AP) is the corrective action plan.
Table A3: Documents required for review

A.PQA.1: No documentation to be submitted to DNV for review.
A.RAMS.2: List of RAM requirements unit (FI): reviewed in A.IV.2 and B.IV.4 at CL3; used in D.IV.3 at CL3. List of RAM requirements system (FI): used in C.IV.3 at CL3.
A.REQ.1: Design Philosophy (FI): used in A.IV.1 at CL3.
A.REQ.3: Vessel specification (FI): reviewed in A.IV.1 at CL3.
A.RISK.1: No documentation to be submitted to DNV for review.
A.RISK.3: Vessel specification (confidence levels) (FI): reviewed in A.IV.1 at CL3; used in A.IV.2 and B.IV.4 at CL3.
A.VV.1: No documentation to be submitted to DNV for review.
B.DES.5: No documentation to be submitted to DNV for review.
D.VV.1: Test procedure for quay and sea trials (FI) and Report from quay and sea trials (FI): reviewed in D.IV.1 at CL2 and CL3. Report from quay and sea trials (FI): used in D.IV.2 at CL3.
D.VV.2: Test procedure (FI) and Test report (FI): reviewed in D.IV.1 at CL2 and CL3. Test report (FI): used in D.IV.2 at CL3.
D.VV.3: Verification analysis report (FI): reviewed in D.IV.2 at CL3.
E.CM.1: No documentation to be submitted to DNV for review.
E.CM.2: No documentation to be submitted to DNV for review.
E.PQA.1: No documentation to be submitted to DNV for review.
E.RAMS.1: No documentation to be submitted to DNV for review.
E.RAMS.2: No documentation to be submitted to DNV for review.
E.RAMS.3: No documentation to be submitted to DNV for review.
E.RAMS.4: No documentation to be submitted to DNV for review.
E.RAMS.5: No documentation to be submitted to DNV for review.
E.VV.1: Test procedure (FI) and Test report (FI): reviewed in E.IV.1 at CL3.
E.VV.2: Test procedure (FI) and Test report (FI): reviewed in E.IV.1 at CL3.
X.PQA.1: No documentation to be submitted to DNV for review.
X.PQA.4: Corrective action plan (AP): reviewed and approved in X.IV.1.
SECTION 6
ISDS REQUIREMENTS FOR SYSTEM INTEGRATORS
A. System integrator requirements
A 100 Requirements under the system integrator’s responsibility
101 The following Table A1 lists the requirements under the system integrator’s responsibility. See also
Table A2 for the associated acceptance criteria and Table A3 for documentation criteria.
102 The system integrator shall also contribute to requirements that are under the responsibility of other roles.
103 Appendix B fully specifies the requirements for all roles.
Table A1: Requirements under system integrator’s responsibility

Reference | Required activity | Contributor(s) | Phase | CL
A.CM.1 | Establish a baseline of requirements for the unit | Owner | Basic engineering | CL1
A.DES.1 | Establish the unit design | Owner | Basic engineering | CL2
A.PM.1 | Establish the master plan | Owner | Basic engineering | CL1
A.PQA.2 | Define procedures (system integrator) | - | Basic engineering | CL1
A.RAMS.1 | Determine safety rules, standards and laws applicable | Owner | Basic engineering | CL1
A.RAMS.3 | Develop the RAMS plan for the unit | Owner | Basic engineering | CL2
A.REQ.2 | Collect requirements for the unit and systems | Owner | Basic engineering | CL1
A.REQ.4 | Allocate functions and requirements to systems | Owner | Basic engineering | CL1
A.REQ.5 | Consult potential suppliers for acquiring of systems | Owner | Basic engineering | CL1
A.REQ.6 | Establish traceability of requirements | - | Basic engineering | CL2
A.RISK.2 | Jointly identify risks | Owner | Basic engineering | CL1
A.VV.2 | Verify the unit and system requirements | - | Basic engineering | CL2
B.CM.1 | Establish baselines of requirements and design | Owner | Engineering | CL1
B.CM.2 | Establish and implement configuration management | Owner | Engineering | CL1
B.DES.3 | Use established design guidelines and methods | - | Engineering | CL2
B.DES.4 | Analyse and refine the unit design | Owner, Supplier | Engineering | CL2
B.INT.1 | Define integration plan | Supplier | Engineering | CL2
B.INT.2 | Coordinate inter-system interfaces | Supplier | Engineering | CL2
B.PM.1 | Establish the project plan for each organisation | Owner | Engineering | CL1
B.PM.2 | Coordinate and integrate the project plans with the master plan | Owner, Supplier | Engineering | CL1
B.RAMS.1 | Identify software-related RAMS risks and priorities | Owner | Engineering | CL2
B.RAMS.2 | Identify RAMS risk mitigation actions | - | Engineering | CL2
B.VV.1 | Define verification and validation strategy | Owner | Engineering | CL1
B.VV.2 | Review the design with respect to requirements and design rules | Owner | Engineering | CL2
B.VV.3 | Review consistency between design and operational scenarios | - | Engineering | CL2
B.VV.4 | Review interface specifications | - | Engineering | CL2
B.VV.5 | Validate critical or novel user-system interactions | - | Engineering | CL2
C.INT.1 | Check readiness status of systems and components before integration | Owner, Supplier | Construction | CL2
Table A1: Requirements under system integrator’s responsibility (Continued)

Reference | Required activity | Contributor(s) | Phase | CL
C.PQA.1 | Establish procedures for problem resolution and maintenance activities in the construction and acceptance phases | Supplier | Construction | CL1
C.VV.9 | Arrange independent testing | Supplier | Construction | CL3
D.CM.1 | Manage software changes during commissioning | Owner, Supplier | Acceptance | CL1
D.CM.2 | Establish a release note for the systems in ISDS scope | - | Acceptance | CL1
D.CM.3 | Transfer responsibility for system configuration management to owner | Owner, Supplier | Acceptance | CL1
D.RAMS.1 | Demonstrate achievement of unit RAMS requirements | Supplier | Acceptance | CL2
D.RAMS.2 | Collect data and calculate RAM values | Supplier | Acceptance | CL3
D.RAMS.3 | Perform a security audit on the deployed systems | - | Acceptance | CL2
D.VV.4 | Perform systems integration tests | Supplier | Acceptance | CL2
X.CM.1 | Track and control changes to the baselines | - | Several | CL1
X.PM.1 | Monitor project status against plan | Owner, Supplier | Several | CL1
X.PM.2 | Perform joint project milestone reviews | Owner, Supplier | Several | CL1
X.PQA.2 | Control procedures (system integrator) | - | Several | CL1
X.PQA.5 | Follow-up of ISDS assessment gaps (system integrator) | - | Several | CL1
X.REQ.1 | Maintain requirements traceability information | - | Several | CL2
X.RISK.1 | Track, review and update risks | Owner, Supplier | Several | CL1
X.RISK.2 | Decide, implement and track risk mitigation actions to closure | Owner | Several | CL2
X.VV.2 | Detail procedures for testing | - | Several | CL1
A 200 Acceptance criteria for system integrator assessments
201 The following Table A2 lists the acceptance criteria for assessments of the system integrator. This
evidence shall be presented to the independent verifier during assessments to document that the
required activities have been performed.
202 See also Table A3 for the required documentation criteria.
Table A2: Acceptance criteria for assessments of system integrator

A.CM.1
— Approved and controlled unit requirements document.
— Revision history of unit requirements document.

A.DES.1
— Unit design: unit design specifications, systems/network topology and functional descriptions.

A.PM.1
— Master plan: activities, work breakdown structure (WBS), schedule, and milestones.

A.PQA.2
— A quality system, documents, minutes of meetings, or other relevant information showing: a defined way of working for the major activities in the project, clear roles and responsibilities and defined ways of interaction between the different organizations (e.g. owner, system integrator, supplier, independent verifier, and others).

A.RAMS.1
— Listing of regulatory requirements that apply regarding safety.
— Resolution of conflicting rules.
— Application guidelines.

A.RAMS.3
— Plan(s) showing the methods, tools, and procedures to be used for RAMS activities.
— Schedule of RAMS activities.
— Expectations on the suppliers’ RAMS plan.
— RAM data to be collected (CL3).

A.REQ.2
— Vessel specification: operational requirements, functional requirements, non-functional requirements and technical constraints.

A.REQ.4
— Design specification (or requirements) for the relevant systems.

A.REQ.5
— System request for proposal (RfP): functional specifications, generic system requirements and obsolescence information.
— Requirements compliance information (on CL2 and above).

A.REQ.6
— Traceability information between requirements on unit level and requirements on the different systems.
— Defined mechanisms and ambition level regarding requirements traceability.

A.RISK.2
— Project risk list: risk list with risks related to e.g. requirements, schedule, effort, quality, performance, consistency and obsolescence (for both hardware and software).
Table A2: Acceptance criteria for assessments of system integrator (Continued)

A.VV.2
— Review records of the unit requirements.
— Review records for the system requirements.

B.CM.1
— Baseline repositories.
— Identification of baselines.
— Approved and controlled documents (baselines) for: unit specifications, unit design, system requirements, system design, interface specifications and base products.

B.CM.2
— Configuration management plan: definition of a Change Control Board (CCB) process or similar, identification of required baselines, required baseline content, change request forms.
— Change requests and change decisions.
— Version history information of baselines.
— Defined rules and mechanisms for version control.
— Effective implementation of version control mechanisms.

B.DES.3
— System design guidelines: including RAMS related aspects.
— Unit design guidelines: including RAMS related aspects.

B.DES.4
— Updated unit design documentation: unit design specifications, systems/network topology with software components, interface specifications, and functional descriptions.

B.INT.1
— Plan for integration of systems into unit: the responsibilities of the different organizations, dependencies among systems, sequence for integration, integration environment, tests and integration readiness criteria.
— Plan for integration of sub-systems and components into systems (when required): dependencies among systems, sub-systems and components, sequence for integration, integration environment, tests and integration readiness criteria.

B.INT.2
— Interface overview/matrix information with assigned responsibilities.
— Agreed inter-system interface specifications containing: protocol selected, definition of commands, messages, data and alarms to be communicated and specifications of message formats.
— Interface definition and verification status.

B.PM.1
— Schedule.
— Project plan: WBS, technical attributes used for estimating, effort and cost estimates, deliverables and milestones, configuration management plan.
— Resource allocation.

B.PM.2
— Master plan.
— Project plans.

B.RAMS.1
— RAMS hazard and risk list showing consideration of software risks.
— Defined risk identification and analysis methods.
— Relevant risks are communicated to other roles.

B.RAMS.2
— RAMS hazard and risk mitigation list showing mitigation actions for software risks.
— Relevant mitigation actions are communicated to other roles.

B.VV.1
— Verification strategy: which parts to verify (unit, system, sub-system, component, module, design documents).
— Method specification documents, etc.: which methods to use for this verification (testing, inspection, code analysis, simulation, prototyping, peer review techniques, quality criteria and targets); which test types to use (functional, performance, regression, user interface, negative); what environment to use for verification; and identification of the test stages (e.g. sea trials, integration tests, commissioning, FAT, internal testing, component testing) to be used for the verification, and the schedule for those tests.
— Validation strategy: products to be validated, validation criteria, operational scenarios, methods and environments.

B.VV.2
— Documented design review records addressing: requirements verification, design rules and verification of uncertainties.

B.VV.3
— Minutes from review: review results considering consistency of interface/function/component/scenarios.

B.VV.4
— Interface specification reviews addressing at least: consistency between input and output signals, frequency and scan rates, deadlocks, propagation of failures from one part to another, engineering units, network domination.

B.VV.5
— Validation records including: workshop minutes, user representatives’ participation and comments and agreed action lists.

C.INT.1
— Integration readiness criteria fulfilled per component and per system.

C.PQA.1
— Agreed maintenance procedures: procedures for general system maintenance activities and procedures for software update, backup and roll-back.
— Agreed problem resolution procedures: procedures for receiving, recording, resolving and tracking problems (punches) and modification requests.

C.VV.9
— Test procedure: covering the system and its interfaces.
— Test report.
DET NORSKE VERITAS AS
Offshore Standard DNV-OS-D203, December 2012
Ch.2 Sec.6 – Page 25
Table A2: Acceptance criteria for assessments of system integrator (Continued)
Reference
D.CM.1
D.CM.2
D.CM.3
D.RAMS.1
D.RAMS.2
D.RAMS.3
D.VV.4
X.CM.1
X.PM.1
X.PM.2
X.PQA.2
X.PQA.5
X.REQ.1
X.RISK.1
X.RISK.2
X.VV.2
Assessment criteria
Defined software configuration management: definition of a Change Control Board (CCB), change request
forms, description of the change process for software, impact analysis, identification of items to be
controlled, and a configuration management tool (including issue, change, version and configuration
tracking) that prevents unauthorised changes.
Modification records justifying changes:
configuration records,
version histories,
release notes,
change orders.
Overall release note for the systems in ISDS scope.
Approved configuration management plan.
Records of transmission of software, documentation and data, or responsibility thereof.
RAMS compliance analysis information.
Calculations of RAM values for relevant systems and the unit.
RAM data.
Security audit records.
Integration test procedures covering system interfaces and inter-system functionality.
Integration test reports.
Change requests/orders.
Version histories for baselines.
Changes to: unit requirements, unit design, system requirements, system design, software design,
interface specifications and software.
Configuration records from document or software repositories.
Master schedule.
Master plan (updated).
Project status report.
Project action list.
Minutes of review meetings.
Progress report.
Minutes of joint milestone meetings.
ISDS compliance status.
Action plans.
Proof that process adherence is being assessed: Quality control records, Project control records and
Minutes of meetings, or other relevant information.
Corrective action plan: Responsibility allocation for actions, Records of actions taken and Evidence of
implementation of the actions.
Up to date traceability information: from owner to system requirements, from system requirements to
functional specifications (where applicable), from system requirements to base-product and
configuration data (where applicable), from functional specifications to sub-system/component
specifications and from requirements to test procedures (when the test procedures are available).
Completeness and consistency review records of the traceability information.
Project risk management plan.
Updated internal risk register (per organization).
Updated project risk register (jointly managed).
Updated internal risk register: risk list, mitigation actions and follow-up records (per organization).
Updated project risk register: risk list, mitigation actions and follow-up records (jointly managed).
Existence of relevant test procedures.
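The traceability criteria above (X.REQ.1) lend themselves to mechanical checking. A minimal sketch, with purely illustrative requirement IDs and artefact names (not taken from the standard):

```python
# Hypothetical traceability data: each requirement maps to the artefacts
# that refine and verify it. IDs are illustrative, not from the standard.
trace = {
    "SYS-001": {"functional_spec": "FS-10", "test_procedure": "TP-01"},
    "SYS-002": {"functional_spec": "FS-11", "test_procedure": None},  # gap
}

def traceability_gaps(matrix):
    """Return (requirement, missing-link-names) pairs for incomplete rows."""
    gaps = []
    for req, links in sorted(matrix.items()):
        missing = [name for name, target in links.items() if target is None]
        if missing:
            gaps.append((req, missing))
    return gaps

print(traceability_gaps(trace))
```

A completeness and consistency review of the traceability information could start from such a gap list.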
A 300 Documentation criteria for the system integrator
301 The table below lists all documents to be sent to the independent verifier and indicates in which
activities the independent verifier will use each document.
302 When the independent verifier is expected to comment on the document, the word ‘reviewed’ is
employed. For documents which serve as background information to put the reviewed documents in a context,
the word ‘used’ is employed.
303 Most documents are provided for information (FI). The only document that is sent to the independent
verifier for approval (AP) is the corrective action plan.
Table A3: Documents required for review
Reference
A.CM.1
A.DES.1
A.PM.1
A.PQA.2
A.RAMS.1
A.RAMS.3
A.REQ.2
A.REQ.4
A.REQ.5
A.REQ.6
A.RISK.2
A.VV.2
B.CM.1
B.CM.2
B.DES.3
B.DES.4
B.INT.1
B.INT.2
B.PM.1
B.PM.2
B.RAMS.1
B.RAMS.2
B.VV.1
B.VV.2
B.VV.3
B.VV.4
B.VV.5
C.INT.1
C.PQA.1
C.VV.9
D.CM.1
D.CM.2
Documents
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
List of regulatory requirements unit (FI):
- reviewed in A.IV.2 at CL3
- used in B.IV.4 and D.IV.3 at CL3.
List of regulatory requirements system (FI) used in C.IV.3 at CL3.
Plan for handling of RAMS (FI):
- reviewed in A.IV.2 at CL3
- used in B.IV.4 at CL3.
Vessel specification (FI) reviewed in A.IV.1 at CL3.
Specification (FI) reviewed in A.IV.1 at CL3.
No documentation to be submitted to DNV for review.
Traceability matrices (FI) used in A.IV.1 at CL3.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
RAMS design guidelines and methods for the vessel (FI) used in B.IV.1 at CL3.
RAMS design guidelines and methods for the system (FI) used in B.IV.1 at CL3.
Interface description (FI) reviewed in B.IV.1 at CL3,
Functional description (FI) reviewed in B.IV.1 at CL3,
Block (topology) diagram (FI):
- reviewed in B.IV.1 at CL3
- used in B.IV.2 at CL2 and CL3.
Integration plan (FI):
- reviewed in B.IV.2 at CL2 and CL3
- used in C.IV.1 at CL3.
Interface description (FI) used in B.IV.2 at CL2 and CL3.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
RAMS risk register (FI) and RAMS risk analysis documentation (FI) reviewed in B.IV.3 at CL3.
RAMS risk register (FI):
- reviewed in B.IV.3 at CL3
- used in C.IV.3 and D.IV.3 at CL3.
Verification and validation strategy (FI):
- reviewed in B.IV.2, at CL2 and CL3
- used in C.IV.1 at CL3, and C.IV.2 and D.IV.1 at CL2 and CL3.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
Independent test procedure (FI) and Independent test report (FI) reviewed in D.IV.1 at CL3.
Independent test report (FI) used in D.IV.2 at CL3.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
Table A3: Documents required for review (Continued)
Reference
D.CM.3
D.RAMS.1
D.RAMS.2
D.RAMS.3
D.VV.4
X.CM.1
X.PM.1
X.PM.2
X.PQA.2
X.PQA.5
X.REQ.1
X.RISK.1
X.RISK.2
X.VV.2
Documents
No documentation to be submitted to DNV for review.
RAMS compliance report (FI) reviewed in D.IV.3 at CL3.
RAM report for the unit (FI) reviewed in D.IV.3.
RAM report for the system (FI) used in D.IV.3.
Security audit report (FI) used in D.IV.3 at CL3.
Test procedure (FI) and Test report (FI) reviewed in D.IV.1 at CL2 and CL3.
Test report (FI) used in D.IV.2 at CL3.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
Corrective action plan (AP) reviewed and approved in X.IV.1.
Traceability matrices (FI) used in B.IV.1 and C.IV.1 at CL3, and in C.IV.2 at CL2 and CL3.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
SECTION 7
ISDS REQUIREMENTS FOR SUPPLIERS
A. Supplier requirements
A 100 Requirements under the supplier’s responsibility
101 The following Table A1 lists the requirements under the supplier’s responsibility. See also Table A2 for
the associated acceptance criteria and Table A3 for documentation criteria.
102 The supplier shall also contribute to requirements that are under the responsibility of other roles.
103 Appendix B fully specifies the requirements for all roles.
Table A1: Requirements under supplier’s responsibility
Reference
B.ACQ.1
B.ACQ.2
B.CM.1
B.CM.2
B.DES.1
B.DES.2
B.DES.3
Required activity
Select COTS products based on defined criteria
Establish contract with sub-suppliers
Establish baselines of requirements and design
Establish and implement configuration management
Design the system
Design each software component
Use established design guidelines and methods
B.DES.5
Define obsolescence strategy
B.INT.1
B.PM.1
B.PQA.1
B.RAMS.1
B.RAMS.2
Define integration plan
Establish the project plan for each organisation
Define procedures (supplier)
Identify software-related RAMS risks and priorities
Identify RAMS risk mitigation actions
Consider software failure modes in safety analysis
activities
Develop the RAMS plan for the system
Submit proposals to system integrator with compliance
status
Refine system requirements into software component
requirements
Detail operational scenarios
Define verification and validation strategy
Review the design with respect to requirements and design
rules
Review consistency between design and operational
scenarios
Review interface specifications
Accept deliverables
Ensure transition and integration of the delivered product
Develop and configure the software components from
design
B.RAMS.3
B.RAMS.4
B.REQ.1
B.REQ.2
B.REQ.3
B.VV.1
B.VV.2
B.VV.3
B.VV.4
C.ACQ.1
C.ACQ.2
C.IMP.1
Contributor(s)
Phase
Engineering
Engineering
Engineering
Engineering
Engineering
Engineering
Engineering
CL
CL2
CL1
CL1
CL1
CL1
CL2
CL2
Engineering
CL2
Engineering
Engineering
Engineering
Engineering
Engineering
CL2
CL1
CL1
CL2
CL2
Engineering
CL1
Engineering
CL2
Engineering
CL1
Engineering
CL2
Owner
Engineering
Engineering
CL2
CL1
Owner
Engineering
CL2
Engineering
CL2
Engineering
Construction
Construction
CL2
CL1
CL1
Construction
CL1
Construction
CL2
Construction
CL1
Construction
CL2
Construction
CL2
Construction
CL2
Construction
CL3
Construction
Construction
CL1
CL1
Owner
Owner
System integrator
System integrator,
Supplier
Supplier
Owner
Owner
System integrator
Owner,
System integrator
C.IMP.2
Develop support documentation
C.IMP.3
Perform software component testing
Use established software implementation guidelines and
methods
Check readiness status of systems and components before
integration
Demonstrate achievement of system RAMS requirements
Evaluate software systems and software components
against RAM objectives
Prepare a plan for system maintenance during operation (contributor: Owner)
Perform peer-reviews of software
C.IMP.4
C.INT.1
C.RAMS.1
C.RAMS.2
C.RAMS.3
C.VV.1
Table A1: Requirements under supplier’s responsibility (Continued)
Reference
C.VV.2
C.VV.3
C.VV.4
C.VV.5
C.VV.6
C.VV.7
Required activity
Review software parameterisation data
Perform internal testing
Perform high integrity internal testing
Perform code analysis on new and modified software
Analyse verification results with respect to targets
Qualify reused software
C.VV.8
Perform Factory Acceptance Tests (FAT)
E.ACQ.1
E.CM.1
E.RAMS.3
E.RAMS.4
X.ACQ.1
X.ACQ.2
X.CM.1
X.CM.2
X.DES.1
Manage and monitor obsolescence
Manage change requests during operation
Analyse RAMS data and address discrepancies
Perform RAMS impact analysis of changes
Monitor contract execution and changes
Review intermediate deliverables
Track and control changes to the baselines
Establish a release note for the delivered system
Update the base-product design documentation
X.PM.1
Monitor project status against plan
X.PQA.3
X.PQA.6
X.REQ.1
Control procedures (supplier)
Follow-up of ISDS assessment gaps (supplier)
Maintain requirements traceability information
X.RISK.1
Track, review and update risks
X.RISK.2
X.VV.1
X.VV.2
Contributor(s)
System integrator
System integrator
System integrator
System integrator
Owner,
System integrator
Owner,
Supplier
Owner,
Supplier
Decide, implement and track risk mitigation actions to closure (contributor: Owner)
Perform verification and validation on added and modified software components (contributor: Owner)
Detail procedures for testing
Phase
Construction
Construction
Construction
Construction
Construction
Construction
CL
CL1
CL2
CL3
CL2
CL2
CL1
Construction
CL1
Operation
Operation
Operation
Operation
Several
Several
Several
Several
Several
CL2
CL1
CL2
CL2
CL1
CL2
CL1
CL1
CL2
Several
CL1
Several
Several
Several
CL1
CL1
CL2
Several
CL1
Several
CL2
Several
CL1
Several
CL1
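Several of the activities above (B.CM.1, B.CM.2, X.CM.1) revolve around baselines and controlled change. One possible minimal record structure, with hypothetical identifiers, illustrating how a CCB decision maps to a new baseline version:

```python
from dataclasses import dataclass, field

@dataclass
class Baseline:
    """Minimal controlled-baseline record: a version number plus an audit
    trail of change requests and CCB decisions. Purely illustrative."""
    name: str
    version: int = 1
    history: list = field(default_factory=list)

    def apply_change(self, request_id: str, ccb_approved: bool):
        if ccb_approved:
            self.version += 1          # approved change yields a new version
        decision = "approved" if ccb_approved else "rejected"
        self.history.append((request_id, decision, self.version))

bl = Baseline("system-requirements")
bl.apply_change("CR-101", ccb_approved=True)   # baseline moves to version 2
bl.apply_change("CR-102", ccb_approved=False)  # rejected: version unchanged
print(bl.version, bl.history)
```

The version history produced this way is the kind of evidence the acceptance criteria below ask for (change requests, change decisions, version history information of baselines).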
A 200 Acceptance criteria for supplier assessments
201 The following Table A2 lists the acceptance criteria for assessments of the supplier. The listed
evidence shall be presented to the independent verifier during assessments to document that the required
activities have been performed.
202 See also Table A3 for the required documentation criteria.
Table A2: Acceptance criteria for assessments of supplier
Reference
B.ACQ.1
B.ACQ.2
B.CM.1
B.CM.2
B.DES.1
B.DES.2
Assessment criteria
COTS product selection procedure: obsolescence management.
COTS product selection matrix: rationale for selection, selection criteria, evaluations and selection.
Supplier agreement: product or component specifications, functional specifications, technical acceptance
criteria, ownership transfer conditions, delivery strategy, provisions for review of intermediate deliveries.
Baseline repositories.
Identification of baselines.
Approved and controlled documents (baselines) for: unit specifications, unit design, system requirements,
system design, interface specifications and base products.
Configuration management plan: Definition of a Change Control Board (CCB) process or similar,
identification of required baselines, required baseline content, change request forms.
Change requests and change decisions.
Version history information of baselines.
Defined rules and mechanisms for version control.
Effective implementation of version control mechanisms.
Design for system (hardware & software): functional description, user interface descriptions, block/
topology diagrams with software components, external interface descriptions and internal interface
descriptions.
Component design for each software component, in sufficient detail to proceed to implementation of the
software: structural description, functional description, behaviour description, parameters (default,
intervals, as-designed), interfaces description, allocation of software to hardware and assumptions and
known limitations of the design.
Table A2: Acceptance criteria for assessments of supplier (Continued)
Reference
B.DES.3
B.DES.5
B.INT.1
B.PM.1
B.PQA.1
B.RAMS.1
B.RAMS.2
B.RAMS.3
B.RAMS.4
B.REQ.1
B.REQ.2
B.REQ.3
B.VV.1
B.VV.2
B.VV.3
B.VV.4
C.ACQ.1
C.ACQ.2
C.IMP.1
Assessment criteria
System design guidelines: including RAMS related aspects.
Unit design guidelines: including RAMS related aspects.
Obsolescence management plan: Authorised vendor list, Spare parts list (hardware & software), stock,
alternate spare parts list, management of intellectual property. Obsolescence criteria for software.
Manufacturer preferred equipment list.
Plan for integration of systems into unit: The responsibilities of the different organizations, dependencies
among systems, sequence for integration, integration environment, tests and integration readiness criteria.
Plan for integration of sub-systems and components into systems (when required): Dependencies among
systems, sub-systems and components, sequence for integration, integration environment, tests and
integration readiness criteria.
Schedule.
Project plan: WBS, technical attributes used for estimating, effort and costs estimates, deliverables and
milestones, configuration management plan.
Resource allocation.
A quality system, documents, minutes of meetings, or other relevant information showing: A defined way
of working for the major activities in the project, Clear roles and responsibilities and Defined ways of
interaction between the different organizations (e.g. owner, system integrator, supplier, independent
verifier, and others).
RAMS hazard and risk list showing consideration of software risks.
Defined risk identification and analysis methods.
Relevant risks are communicated to other roles.
RAMS hazard and risk mitigation list showing mitigation actions for software risks.
Relevant mitigation actions are communicated to other roles.
Safety analysis showing consideration of software failure modes.
Plan showing objectives, methods, tools, and procedures to be used, consistent with the RAMS plan for
the unit.
Schedule of RAMS activities.
RAM data to be collected (CL3).
Submitted technical proposal for the system: system breakdown, alternatives and options, description of
customisation or parameterisation of existing products (including software), requirements compliance
matrix and software lifecycle information (including licensing, ownership and obsolescence).
Refined component requirements and specification.
Requirement allocation matrix.
System/component behaviour and interaction specification and descriptions: use cases, sequences
(including signal usage), state diagrams, interlocks, degraded sequences, performance targets and
constraints and limitations.
Verification strategy: the parts to verify (unit, system, sub-system, component, module, design
documents).
Method specification documents, etc., defining: the methods to use for this verification (testing,
inspection, code analysis, simulation, prototyping, peer review techniques, quality criteria and targets);
the test types to use (functional, performance, regression, user interface, negative); the environment to
use for verification; the test stages to be used for the verification (e.g. sea trials, integration tests,
commissioning, FAT, internal testing, component testing); and the schedule for those tests.
Validation strategy: products to be validated, validation criteria, operational scenarios, methods and
environments.
Documented design review records addressing: requirements verification, design rules and verification of
uncertainties.
Minutes from review: review results considering consistency of interface/function/component/scenarios.
Interface specification reviews addressing at least: consistency between input and output signals,
frequency and scan rates, deadlocks, propagation of failures from one part to another, engineering units,
network domination.
Component acceptance data: acceptance criteria, component acceptance (FAT, SAT) test procedures,
component acceptance test records, component acceptance issue and problems list and component
acceptance coverage measurements (requirements, structural).
Supplier agreement on: list of deliverables, review and approval plans and support and maintenance
agreement.
Product documentation.
Operation manual.
Configuration information.
Developed component release note.
Commented software source code.
Parameters and configuration files.
I/O List.
Development environment configuration.
Table A2: Acceptance criteria for assessments of supplier (Continued)
Reference
C.IMP.2
C.IMP.3
C.IMP.4
C.INT.1
C.RAMS.1
C.RAMS.2
C.RAMS.3
C.VV.1
C.VV.2
C.VV.3
C.VV.4
C.VV.5
C.VV.6
C.VV.7
C.VV.8
E.ACQ.1
E.CM.1
E.RAMS.3
E.RAMS.4
X.ACQ.1
X.ACQ.2
X.CM.1
X.CM.2
X.DES.1
Assessment criteria
System and component support documentation: data sheets, user manuals, administration manuals,
operating and maintenance procedures, training material and FAQs, known defects and troubleshooting
guides.
Review records for the support documentation.
Software test log: list of defects, date of test, tester, test scope and pass or fail.
Software defect list.
Software guidelines/standards/rules/checklists/automated checks.
Review records.
Integration readiness criteria fulfilled per component and per system.
RAMS compliance analysis information.
RAM report: Calculations of RAM values for designated systems and RAM data.
Maintenance management plan: configuration items, rules for operation/maintenance, backup and restore
procedures, expected maintenance activities, expected software update, migration and retirement
activities, schedules and tailored procedures for maintenance in operation.
Peer review methodology description.
Peer review schedule.
Peer review records.
Peer review check lists.
Parameter list review report: name, value, tolerance, function.
Test procedures.
Test reports.
Test procedures.
Test reports.
Software code verification: peer review reports, code analysis reports and code rule set.
Verification result evaluation: result analyses, punch lists, action lists, defect correction and focus on
defect prone software.
Software qualification report: reused software component list, qualification method for each reused
software component and qualification data.
System FAT procedure: coverage of requirements, functionality, performance, RAMS (when applicable),
integration testing, hardware/software integration, interfaces and degraded modes.
System FAT report: consistent with procedure, deviations identified and coverage measured.
Obsolescence strategy document.
Obsolescence management plan: Authorised vendor list, Spare parts list (HW & compatible SW),
Alternate spare parts list and Management of intellectual property.
Change requests.
Impact analysis.
Change orders.
Work orders.
Problem reports.
Release notes.
Maintenance logs.
RAMS analysis.
Impact analysis showing RAMS evaluation.
Sub-supplier progress review schedule.
Sub-supplier progress review reports.
Sub-supplier project control records.
Sub-supplier quality control records.
Supplier agreement: list of deliverables and review and approval plans.
Review records/minutes.
Change requests/orders.
Version histories for baselines.
Changes to: unit requirements, unit design, system requirements, system design, software design, interface
specifications and software.
Configuration records from document or software repositories.
Component release note: including list of changes to previous version of component.
Base product design description.
Revision information for updated base-product components.
Table A2: Acceptance criteria for assessments of supplier (Continued)
Reference
X.PM.1
X.PQA.3
X.PQA.6
X.REQ.1
X.RISK.1
X.RISK.2
X.VV.1
X.VV.2
Assessment criteria
Master schedule.
Master plan (updated).
Project status report.
Project action list.
Minutes of review meetings.
Progress report.
Proof that process adherence is being assessed: Quality control records, Project control records and
Minutes of meetings, or other relevant information.
Corrective action plan: Responsibility allocation for actions, Records of actions taken and Evidence of
implementation of the actions.
Up to date traceability information: from owner to system requirements, from system requirements to
functional specifications (where applicable), from system requirements to base-product and configuration
data (where applicable), from functional specifications to sub-system/component specifications and from
requirements to test procedures (when the test procedures are available).
Completeness and consistency review records of the traceability information.
Project risk management plan.
Updated internal risk register (per organization).
Updated project risk register (jointly managed).
Updated internal risk register: risk list, mitigation actions and follow-up records (per organization).
Updated project risk register: risk list, mitigation actions and follow-up records (jointly managed).
Test procedure: consistent with change or upgrade scope.
Test report: consistent with test procedure.
Existence of relevant test procedures.
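The RAM criteria above (C.RAMS.2) call for calculations of RAM values. A common first-order sketch, assuming steady-state availability A = MTBF/(MTBF + MTTR) and a purely series arrangement of components; the MTBF/MTTR figures are illustrative, not values from the standard:

```python
def availability(mtbf_h, mttr_h):
    """Steady-state availability from mean time between failures (h)
    and mean time to repair (h)."""
    return mtbf_h / (mtbf_h + mttr_h)

def series_availability(components):
    """A series system is up only when every component is up, so the
    component availabilities multiply."""
    a = 1.0
    for mtbf_h, mttr_h in components:
        a *= availability(mtbf_h, mttr_h)
    return a

# Illustrative component figures: (MTBF hours, MTTR hours).
components = [(8760.0, 24.0), (4380.0, 12.0)]
print(round(series_availability(components), 4))
```

Redundant (parallel) arrangements and repair policies need a more detailed model; this sketch only shows the shape of the calculation a RAM report documents.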
A 300 Documentation criteria for the supplier
301 The table below lists all documents to be sent to the independent verifier and indicates in which
activities the independent verifier will use each document.
302 When the independent verifier is expected to comment on the document, the word ‘reviewed’ is
employed. For documents which serve as background information to put the reviewed documents in a context,
the word ‘used’ is employed.
303 Most documents are provided for information (FI). The only document that is sent to the independent
verifier for approval (AP) is the corrective action plan.
Table A3: Documents required for review
Reference
B.ACQ.1
B.ACQ.2
B.CM.1
B.CM.2
B.DES.1
B.DES.2
B.DES.3
B.DES.5
B.INT.1
B.PM.1
B.PQA.1
B.RAMS.1
B.RAMS.2
B.RAMS.3
B.RAMS.4
B.REQ.1
Documents
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
Interface description (FI),
Functional description (FI) and
Block (topology) diagram (FI) reviewed in B.IV.1 at CL3.
Software design description (FI) reviewed in B.IV.1 at CL3.
RAMS design guidelines and methods for the vessel (FI) used in B.IV.1 at CL3.
RAMS design guidelines and methods for the system (FI) used in B.IV.1 at CL3.
No documentation to be submitted to DNV for review.
Integration plan (FI):
- reviewed in B.IV.2 at CL2 and CL3
- used in C.IV.1 at CL3.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
RAMS risk register (FI) and RAMS risk analysis documentation (FI) reviewed in B.IV.3 at CL3.
RAMS risk register (FI):
- reviewed in B.IV.3 at CL3
- used in C.IV.3 and D.IV.3 at CL3.
Safety assessment report (FI) used in C.IV.3 at CL3.
Plan for handling of RAMS (FI):
- reviewed in B.IV.4 at CL3
- used in C.IV.3 at CL3.
Specification (FI) used in B.IV.1 at CL3.
Table A3: Documents required for review (Continued)
Reference
B.REQ.2
B.REQ.3
B.VV.1
B.VV.2
B.VV.3
B.VV.4
C.ACQ.1
C.ACQ.2
C.IMP.1
C.IMP.2
C.IMP.3
C.IMP.4
C.INT.1
C.RAMS.1
C.RAMS.2
C.RAMS.3
C.VV.1
C.VV.2
C.VV.3
C.VV.4
C.VV.5
C.VV.6
C.VV.7
C.VV.8
E.ACQ.1
E.CM.1
E.RAMS.3
E.RAMS.4
X.ACQ.1
X.ACQ.2
X.CM.1
X.CM.2
X.DES.1
X.PM.1
X.PQA.3
X.PQA.6
X.REQ.1
X.RISK.1
X.RISK.2
X.VV.1
X.VV.2
Documents
Specifications (FI) used in B.IV.1 at CL3.
Specifications (FI) used in B.IV.1 at CL3.
Verification and validation strategy (FI):
- reviewed in B.IV.2, at CL2 and CL3
- used in C.IV.1 at CL3, and C.IV.2 and D.IV.1 at CL2 and CL3.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
RAMS compliance report (FI) reviewed in C.IV.3 at CL3.
RAM report (FI) reviewed in C.IV.3.
No documentation to be submitted to DNV for review.
Software peer review records (FI) used in C.IV.1 at CL3.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
Test procedure at manufacturer (FI) and
Test report at manufacturer (FI) used in C.IV.1 at CL3.
Software code analysis record (FI) used in C.IV.1 at CL3.
Verification analysis report (FI) reviewed in C.IV.1 at CL3.
No documentation to be submitted to DNV for review.
System FAT procedure (FI) and System FAT report (FI):
- reviewed in C.IV.2 at CL2 and CL3
- used in C.IV.1 at CL3.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
Corrective action plan (AP) reviewed and approved in X.IV.1.
Traceability matrices (FI) used in B.IV.1 and C.IV.1 at CL3, and in C.IV.2 at CL2 and CL3.
No documentation to be submitted to DNV for review.
No documentation to be submitted to DNV for review.
Test procedure (FI) and Test report (FI) used in E.IV.1 at CL3.
No documentation to be submitted to DNV for review.
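The interface specification reviews required earlier in this section (B.VV.4) include checking consistency between input and output signals. A sketch of one mechanical check over two hypothetical signal lists:

```python
# Hypothetical signal lists for two interfacing systems. An interface
# review can mechanically flag signals with no matching counterpart.
outputs_sys_a = {"pump_start", "pump_stop", "alarm_ack"}
inputs_sys_b = {"pump_start", "pump_stop", "esd_trip"}

unconsumed = sorted(outputs_sys_a - inputs_sys_b)  # sent but never read
unsourced = sorted(inputs_sys_b - outputs_sys_a)   # expected but never sent
print(unconsumed, unsourced)
```

Frequency and scan rates, engineering units, deadlocks and failure propagation still need review against the agreed inter-system interface specification; only the signal-name matching is automated here.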
SECTION 8
ISDS REQUIREMENTS FOR THE INDEPENDENT VERIFIER
A. Independent verifier requirements
A 100 Activities for which the independent verifier is responsible
101 The following table describes the activities that shall be performed by the independent verifier.
102 As part of the classification process for the ISDS notation, DNV shall take the role of independent verifier.
103 Most documents are reviewed for information (FI); only the corrective action plan is reviewed for approval (AP).
104 Document types listed in the table are further described in RP-A201 and a description of the content in an ISDS context is given in the assessment criteria for
each of the referenced activities.
Table A1 Requirements under independent verifier’s responsibility
ID: A.IV.1    CL: 3
Activity: Review technical specification
Description: The technical specification of the unit shall be reviewed to verify that CL3 systems work together without inconsistencies. Interfaces to other systems are also addressed in order to see that there are no compromising impacts on the CL3 systems.
Guidance Note: None
Input (document type, with activity ref.):
- Z050-Design Philosophy (A.REQ.1)
- Z040-Vessel specification (A.REQ.2 & A.REQ.3)
- Z100-Specification (A.REQ.4)
- Z290-Record-Traceability matrices (A.REQ.6)
- Z040-Vessel specification - ISDS Confidence levels (A.RISK.3)
Acceptance criteria (verification):
- CL3 systems are designed to work together without inconsistencies.
- Other systems do not have a compromising impact on the CL3 system(s).
Output: Review comments on Z040 and Z100 document types.
Table A1 Requirements under independent verifier’s responsibility (Continued)
ID: A.IV.2    CL: 3
Activity: Review the RAMS approach for the unit
Description: Review of RAMS requirements, the allocation to systems and the plan for performing RAM activities to ensure completeness and consistency.
Guidance Note: None
Input (document type, with activity ref.):
- I300-RAMS documentation for the vessel - List of regulatory requirements (A.RAMS.1)
- I300-RAMS documentation for the vessel - List of RAM requirements for the vessel (A.RAMS.2)
- I300-RAMS documentation for the vessel - Plan for handling of RAMS (A.RAMS.3)
- Z040-Vessel specification - ISDS Confidence levels (A.RISK.3)
Acceptance criteria (verification):
- Completeness:
  a) listing of laws, standards, and rules that apply regarding safety
  b) plan showing objectives, methods, tools, and procedures to be used
  c) schedule of RAMS activities
  d) RAM data to be collected (CL3)
  e) RAMS requirements
  f) confidence levels
- For CL2: qualitative requirements.
- For CL3: quantitative requirements are required.
Output: Review comments on all I300 input document types.
Table A1 Requirements under independent verifier’s responsibility (Continued)

ID: B.IV.1   CL: 3
Activity: Review technical specification and design
Description: Review of the functional design specification and software design specification agreed upon by supplier and yard against the vessel specification, as well as other criteria and RAMS objectives.
Guidance Note: None
Input Document types: Z100-Specification (B.REQ.1, B.REQ.2, B.REQ.3); Z060-Functional description (B.DES.4); I020-Functional description (B.DES.1); I030-Block (topology) diagram (B.DES.1, B.DES.4); I220-Interface description-Inter-system interface specification (B.DES.4); I220-Interface description-Intra-system interface specification (B.DES.1); I290-Software design description (B.DES.2); I300-RAMS documentation for the vessel-RAMS design guidelines and methods (B.DES.3); I310-RAMS documentation for the system-RAMS design guidelines and methods (B.DES.3); Z290-Record-Traceability matrices (X.REQ.1)
Acceptance Criteria:
— Consistency of the functional design specification and software design specification agreed upon by supplier and yard against the vessel specification
— Consistency with RAMS objectives
Activity Ref.: B.DES.1, B.DES.2, B.DES.3, B.DES.4, B.REQ.1, B.REQ.2, B.REQ.3, X.REQ.1
Verification Output: Review comments on I020, I030, I220, I290 and Z060 document types

ID: B.IV.2   CL: 2
Activity: Review integration, verification and validation strategy
Description: Review the verification/validation/integration strategy against the technical specification for completeness of the verification intended to be achieved.
Guidance Note: None
Input Document types: I140-Verification and Validation Strategy (B.VV.1); I210-Integration plan (B.INT.1); I220-Interface description-Inter-system interface specification (B.INT.2); I030-Block (topology) diagram (B.DES.4)
Acceptance Criteria:
— Completeness of verification intended to be achieved (functions, interfaces)
— Completeness of validation intended to be achieved (requirements, scenarios)
— Completeness of the integration plan with regards to software
Activity Ref.: B.DES.4, B.INT.1, B.INT.2, B.VV.1
Verification Output: Review comments on I140 and I210 document types
Table A1 Requirements under independent verifier’s responsibility (Continued)

ID: B.IV.3   CL: 3
Activity: Review and comment on the software part of the safety analysis for critical functions in scope of the ISDS
Description: Review the software-related RAMS risks and priorities, and the resulting software-related RAMS mitigation actions.
Guidance Note: Special methods for RAMS risk analysis and mitigation are used, such as: HAZID, HAZOP, FMEAs, and FMECAs.
Input Document types: I300-RAMS documentation for the vessel-RAMS risk register (B.RAMS.1 & B.RAMS.2); I310-RAMS documentation for the system-RAMS risk register (B.RAMS.1 & B.RAMS.2); I300-RAMS documentation for the vessel-RAMS risk analysis documentation (B.RAMS.1); I310-RAMS documentation for the system-RAMS risk analysis documentation (B.RAMS.1)
Acceptance Criteria:
— Consistency of the scope selection for risk identification
— Consistency between unit and system level risk analysis
— Consideration of the software failure modes
— Consideration of the hardware-induced software failure modes
Activity Ref.: B.RAMS.1, B.RAMS.2
Verification Output: Review comments on I300 and I310 document types

ID: B.IV.4   CL: 3
Activity: Review the RAMS approach for the system
Description: Review of RAMS requirements for the system, and the plan for performing RAM activities to ensure completeness and consistency.
Guidance Note: None
Input Document types: I300-RAMS documentation for the vessel-List of regulatory requirements (A.RAMS.1); I300-RAMS documentation for the vessel-Plan for handling of RAMS (A.RAMS.3); I310-RAMS documentation for the system-Listing of RAM requirements for the system (A.RAMS.2); I310-RAMS documentation for the system-Plan for handling of RAMS (B.RAMS.4); Z040-Vessel specification-ISDS Confidence levels (A.RISK.3)
Acceptance Criteria:
— Completeness:
  a) Listing of laws, standards, and rules that apply regarding safety
  b) Plan showing objectives, methods, tools, and procedures to be used
  c) Schedule of RAMS activities
  d) RAM data to be collected (CL3)
  e) RAMS requirements
— For CL2: qualitative requirements are good enough
— For CL3: quantitative requirements are required
Activity Ref.: A.RAMS.1, A.RAMS.2, A.RAMS.3, A.RISK.3, B.RAMS.4
Verification Output: Review comments on the I310 document type
Table A1 Requirements under independent verifier’s responsibility (Continued)

ID: C.IV.1   CL: 3
Activity: Review the verification activities in the construction phase
Description: Evaluate how completely the verification strategy has been executed in the construction phase (through the FAT). Determine whether or not all required testing, peer reviews, and other verification activities have been conducted. Confirm that any issues or problems identified during these activities are being tracked to closure.
Guidance Note: Records of V&V activities (e.g., test reports, peer review reports) should be matched to configuration items. Traceability matrices should be checked to see that requirements are included in test plans or other V&V activities.
Input Document types: Z290-Record-Software peer review (C.VV.1); Z120-Test procedure at manufacturer (C.VV.4); Z130-Report from test at manufacturer (C.VV.4); Z290-Record-Software code analysis (C.VV.5); Z120-Test procedure at manufacturer-System FAT procedure (C.VV.8); Z130-Report from test at manufacturer-System FAT report (C.VV.8); Z241-Measurement report-Verification analysis report (C.VV.6); I210-Integration plan (B.INT.1); I140-Software quality plan-Verification and validation strategy (B.VV.1); Z290-Record-Traceability matrices (X.REQ.1)
Acceptance Criteria:
— Planned verification activities have been successfully completed and problems resolved
— Verification and validation is on track as expected by the strategies
Activity Ref.: B.INT.1, B.VV.1, C.VV.1, C.VV.4, C.VV.5, C.VV.6, C.VV.8, X.REQ.1
Verification Output: Review comments on the Z241 document type

ID: C.IV.2   CL: 2
Activity: Review and witness the FAT
Description: Review and contribute to the FAT procedure. Witness the FAT execution.
Guidance Note: None
Input Document types: Z120-Test procedure at manufacturer-System FAT procedure (C.VV.8); Z130-Report from test at manufacturer-System FAT report (C.VV.8); I140-Software quality plan-Verification and validation strategy (B.VV.1); Z290-Record-Traceability matrices (X.REQ.1)
Acceptance Criteria:
— Consistency between verification strategy and test procedures
— Completeness of test procedures (functions, interfaces, design)
— Completeness of test cases
— Test reports reflect the actual test results with pass/fail judgements and deviations from procedures
Activity Ref.: B.VV.1, C.VV.8, X.REQ.1
Verification Output: Review comments on Z120 and Z130 document types
Table A1 Requirements under independent verifier’s responsibility (Continued)

ID: C.IV.3   CL: 3
Activity: Review RAMS arguments and evidence for the system
Description: The RAMS arguments and reviews shall be reviewed to ensure completeness and compared with the RAMS requirements for the system.
Guidance Note: None
Input Document types: I310-RAMS documentation for the system-RAMS compliance report (C.RAMS.1); I310-RAMS documentation for the system-RAM report (C.RAMS.2); I310-RAMS documentation for the system-List of RAM requirements (A.RAMS.2); I310-RAMS documentation for the system-List of regulatory requirements (A.RAMS.1); I310-RAMS documentation for the system-RAMS risk register (B.RAMS.2); I310-RAMS documentation for the system-Safety assessment report (B.RAMS.3); I310-RAMS documentation for the system-Plan for handling of RAMS (B.RAMS.4)
Acceptance Criteria:
— Completeness of content:
  a) RAM data and RAM calculations
  b) RAMS compliance report
— Risks to RAM and software safety are under control
Activity Ref.: A.RAMS.1, A.RAMS.2, B.RAMS.2, B.RAMS.3, B.RAMS.4, C.RAMS.1, C.RAMS.2
Verification Output: Review comments on I310-RAMS compliance report and I310-RAM report document types

ID: D.IV.1   CL: 2
Activity: Review and witness the commissioning tests
Description: Review and contribute to the commissioning tests. Witness the commissioning execution.
Guidance Note: None
Input Document types: I140-Software quality plan-Verification and validation strategy (B.VV.1); Z140-Test procedure for quay and sea trial (D.VV.1, D.VV.2); Z150-Report from quay and sea trials (D.VV.1, D.VV.2); Z140-Test procedure for quay and sea trial-Integration test procedure (D.VV.4); Z150-Report from quay and sea trials-Integration test report (D.VV.4); Z140-Test procedure for quay and sea trial-Independent test procedure, if at CL3 (C.VV.9); Z150-Report from quay and sea trials-Independent test report, if at CL3 (C.VV.9)
Acceptance Criteria:
— Consistency between verification strategy and test procedures
— Completeness of test procedures (functions, interfaces, design)
— Completeness of test cases
— Consistent with RAMS objectives
— Test reports reflect the actual test results with pass/fail judgements and deviations from procedures
Activity Ref.: B.VV.1, C.VV.9, D.VV.1, D.VV.2, D.VV.4
Verification Output: Review comments on Z140 and Z150 document types
Table A1 Requirements under independent verifier’s responsibility (Continued)

ID: D.IV.2   CL: 3
Activity: Review commissioning test results
Description: Review and analyse the results of all test activities done by the system integrator and owner.
Guidance Note: This activity focuses on the review of activities performed after the C.IV.1 activity.
Input Document types: Z150-Report from quay and sea trials (D.VV.1, D.VV.2); Z150-Report from quay and sea trials-Integration test report (D.VV.4); Z150-Report from quay and sea trials-Independent test report (C.VV.9); Z241-Measurements report-Verification analysis report (D.VV.3)
Acceptance Criteria:
— Planned verification and validation activities have been successfully completed and problems resolved
— Verification and validation strategies have achieved their objectives
— RAMS objectives are met
Activity Ref.: C.VV.9, D.VV.1, D.VV.2, D.VV.3, D.VV.4
Verification Output: Review comments on the Z241 document type

ID: D.IV.3   CL: 3
Activity: Review RAMS arguments and evidence for the unit
Description: The RAMS arguments and reviews shall be reviewed to ensure completeness and compared with the RAMS requirements for the unit.
Guidance Note: None
Input Document types: I300-RAMS documentation for the vessel-List of regulatory requirements (A.RAMS.1); I300-RAMS documentation for the vessel-List of RAM requirements (A.RAMS.2); I300-RAMS documentation for the vessel-RAMS risk register (B.RAMS.2); I300-RAMS documentation for the vessel-RAMS compliance report (D.RAMS.1); I300-RAMS documentation for the vessel-RAM report (D.RAMS.2); I310-RAMS documentation for the system-RAM report (D.RAMS.2); I300-RAMS documentation for the vessel-Security audit report (D.RAMS.3)
Acceptance Criteria:
— Completeness of content of reports:
  a) Calculations of RAM values for designated systems
  b) RAMS compliance report
— For CL2 systems: qualitative RAMS are good enough
— For CL3 systems: quantitative RAMS are required
— Risks to RAM and software safety are under control
— Security audit has been performed
Activity Ref.: A.RAMS.1, A.RAMS.2, B.RAMS.2, D.RAMS.1, D.RAMS.2, D.RAMS.3
Verification Output: Review comments on I300-RAMS compliance report and I300-RAM report document types

ID: E.IV.1   CL: 3
Activity: Witness upgrade commissioning tests
Description: Review and contribute to the upgrade tests. Witness the test execution.
Guidance Note: None
Input Document types: Z140-Test procedure for quay and sea trial (E.VV.1, E.VV.2); Z150-Report from quay and sea trials (E.VV.1, E.VV.2); Z120-Test procedure at manufacturer (X.VV.1); Z130-Report from test at manufacturer (X.VV.1)
Acceptance Criteria:
— Completeness of test procedures (impacted functions, impacted requirements, impacted scenarios)
— Consistent with RAMS objectives
Activity Ref.: E.VV.1, E.VV.2, X.VV.1
Verification Output: Review comments on Z140 and Z150 document types
Table A1 Requirements under independent verifier’s responsibility (Continued)

ID: X.IV.1   CL: 1
Activity: Assess compliance to the ISDS standard
Description: Processes shall be assessed to determine how they comply with the applicable ISDS standard. An activity status report is issued by the independent verifier for each milestone. The corrective action plan shall be reviewed for approval. The implementation of the corrective actions shall be assessed in the subsequent phase. This activity shall be performed in phases A to D and may be performed in phase E. For the E phase, see description of annual and renewal assessments in Ch.3 Sec.1 C200-300.
Guidance Note: Each organization is responsible for its quality assurance. The quality assurance is responsible for ensuring the ISDS practices are achieved. The purpose of the independent verifier is not to bear responsibility for the organizations to perform their activities as intended, but to verify that such a target has been achieved.
Input Document types: Q030-Corrective action plan (X.PQA.4, X.PQA.5 and X.PQA.6). In addition, relevant documentation is reviewed during the assessment.
Acceptance Criteria:
— Compliance to ISDS activities and document requirements
Activity Ref.: X.PQA.4, X.PQA.5, X.PQA.6
Verification Output: Review comments on, and approval of, the Q030 document type

ID: X.IV.2   CL: 1
Activity: Analyse and present the ISDS assessment and IV activities results for the phase
Description: Gather the results from the various independent verification activities. Analyse the results from the independent verifier activities and analyse the associated risks. Present the analysis result and associated risks at the milestone meeting. This activity shall be performed in phases A to D.
Guidance Note: The inputs to this activity are the results from all the IV activities performed in the phase.
Input Document types: All documents produced by IV activities.
Verification Output: Milestone report; Milestone presentation
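The traceability check described in the guidance note for activity C.IV.1 — confirming that every requirement in a traceability matrix (Z290) is linked to at least one test plan or other V&V activity — is mechanical enough to sketch in code. The matrix layout, requirement IDs and activity IDs below are hypothetical illustrations, not prescribed by this standard:

```python
# Illustrative sketch of the C.IV.1 traceability check: every requirement
# should map to at least one V&V activity. Data layout and IDs are invented.

def find_untraced(matrix: dict[str, list[str]]) -> list[str]:
    """Return requirement IDs with no associated V&V activity."""
    return sorted(req for req, activities in matrix.items() if not activities)

traceability_matrix = {
    "REQ-001": ["FAT-12", "PEER-REVIEW-3"],
    "REQ-002": ["FAT-12"],
    "REQ-003": [],  # not yet covered -> a finding to track to closure
}

print(find_untraced(traceability_matrix))  # -> ['REQ-003']
```

Any requirement reported by such a check would be raised as a finding and tracked to closure, as the activity description requires.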
OFFSHORE STANDARD
DNV-OS-D203
INTEGRATED SOFTWARE DEPENDENT SYSTEMS
CHAPTER 3
CLASSIFICATION AND CERTIFICATION
CONTENTS
Sec. 1 Requirements ..................................................................................................................... 43
SECTION 1
REQUIREMENTS
A. General
A 100 Introduction
101 As well as representing DNV’s interpretation of safe and recommended engineering practice for general
use by the offshore industry, the offshore standards also provide the technical basis for DNV classification,
certification and verification services.
102 A complete description of principles, procedures, applicable class notations and technical basis for
offshore classification is given by the offshore service specifications, see Table A1.
Table A1 Offshore Service Specifications
Document reference no.
DNV-OSS-101
DNV-OSS-102
DNV-OSS-103
Title
Rules for Classification of Offshore Drilling and Support Units
Rules for Classification of Floating Production, Storage and Loading Units
Rules for Classification of LNG/LPG Floating Production and Storage Units or Installations
A 200 Organisation of Chapter 3
201 Chapter 3 identifies the specific documentation, certification and surveying requirements to be applied
when using this standard for certification and classification purposes.
A 300 Classification principles
301 The requirements of this standard shall only be applied for systems also subject to classification through
main class and other additional class notations, as relevant.
302 As part of the classification process DNV shall take the role as the independent verifier (IV). Independent
verifier activities are given in Ch.2, Sec.8.
A 400 Compliance of Activities
401 The requirements and corresponding acceptance criteria for all activities are listed in Ch.2, Sec.5 to 7,
and explained in detail in Appendix B.
402 Compliance of activities is verified through assessments carried out by the independent verifier. Each
role shall be assessed by the independent verifier during the project. For the operation phase (E), assessments
shall be performed regularly as defined in C100 to 300.
403 During the assessment the independent verifier shall assess each activity for the relevant role, confidence
level and project phase. Findings shall be reported by the independent verifier to the assessed role in an
assessment report.
A 500 Approval of Documents
501 Based on the assessment performed by the independent verifier the assessed role shall present an action
plan for approval, with specific and time bound actions for each non-conformity.
502 The independent verifier shall verify that the actions in the approved action plan are carried out according
to the plan.
503 Documentation, other than the action plan, listed under the documentation criteria for each role in Ch.2
in this standard, shall be reviewed and commented by the independent verifier for information.
A 600 Rating of compliance
601 DNV rates the different activities defined for each role in Ch.2 based on observations during assessments
and document review, and lists these as findings in an assessment report.
602 Three different types of rating are used: High compliance, Medium compliance and Low compliance.
603 In the assessment report High compliance is represented by the colour green, Medium compliance by
the colour orange and Low compliance by the colour red.
A 700 Reporting and milestone meetings
701 Each assessed organisation will receive an assessment report with the major findings and status with
regards to approval of activities.
702 DNV will provide input to the milestone meetings M1, M2, M3 and M4 in the form of a presentation of
an ISDS status report which summarises the different organisation’s compliance with this standard as assessed
by DNV.
703 DNV will at the same time provide an assessment of the risks associated with the total project’s
compliance-status towards this standard.
B. Class notation
B 100 Designation
101 Units built and tested in compliance with the requirements of this standard can be assigned the optional
class notations for Integrated Software Dependent Systems (ISDS).
102 The notation can be assigned to a new-build when compliance is verified for phases A through D. To
maintain the notation, compliance must be verified for phase E.
103 The designation for the class notation shows the notation name, which is ISDS, the systems included in
the scope of the notation, and the confidence level specified for each system:
ISDS (system1CL, system2CL,…, systemnCL)
Guidance note:
Example: ISDS (DP2, PMS2, WCS3): these systems have been developed according to the scope and confidence
levels identified in Ch.3, Sec.1, B200. The DP abbreviation refers to the system itself and not to the class notations
such as DYNPOS AUTR etc.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
104 The ISDS class notation does not replace, but is complementary to, other class notations.
B 200 Scope
201 Table B1 defines the systems and confidence levels recommended to include in the scope of the ISDS
class notation.
202 Unless otherwise agreed by Owner and DNV, and specified by Owner, the scope and confidence levels
defined in Table B1 apply for the ISDS class notation.
Table B1: ISDS class notation scope, system definitions and confidence levels for selected systems

Function: Prevent escalation of abnormal conditions
System that may be included in scope for ISDS: Shutdown and Disconnection Systems (SDS)
Typical sub-systems and components:
— Network
— Emergency Shutdown (ESD) system, including:
  — Input devices
  — Interfaces towards other safety systems
  — Central control unit
  — Output actuators
  — Signal transfer lines
  — Power supply
— Process Shutdown (PSD) system
— Emergency Disconnect System (EDS)/Emergency Quick Disconnect (EQD) system
— Critical Alarm and Action Panel (CAAP)
— High Integrity Protection System (HIPS)
DNV Rule reference*: OS-A101
CL: 2
Applicable for: Drilling Unit, Well Intervention Unit, FPSO, FSU

System that may be included in scope for ISDS: Fire & Gas System (F&G)
Typical sub-systems and components:
— Network
— Fire and gas detection system
— Alarm and communication system
— Systems for automatic action
DNV Rule reference*: OS-E101, OS-D202, OS-E201, OS-D301
CL: 2
Applicable for: Drilling Unit, Well Intervention Unit, FPSO, FSU
Table B1: ISDS class notation scope, system definitions and confidence levels for selected systems (Continued)

Function: Well control
System: Well Control System (WCS)
Typical sub-systems and components:
— Choke and kill system
— Diverter system
— Blow Out Prevention (BOP) system or Well Control Package (WCP), including:
  — Topside panels
  — Network
  — Subsea Electronic Modules (SEM)
DNV Rule reference*: OS-E101 Table A2
CL: 3
Applicable for: Drilling Unit, Well Intervention Unit

Function: Drilling
System: Drilling Control system (DCS)
Typical sub-systems and components:
— HVAC for driller’s cabin
— Driller’s chair
— Network
— Zone management/anti-collision system
— Drilling data acquisition system
DNV Rule reference*: OS-E101
CL: 2
Applicable for: Drilling Unit, Well Intervention Unit

System: Equipment Handling (EH)
Typical sub-systems and components:
— Top drive
— Drawwork
— Rotary table
— Vertical pipe handler
— Horizontal pipe handler
— Fingerboard
— Make up system
— BOP handler
DNV Rule reference*: OS-E101 Table A5
CL: 2
Applicable for: Drilling Unit, Well Intervention Unit

System: Heave Compensation and Tensioning System (HCTS)
Typical sub-systems and components:
— Marine riser tensioners, including re-coil system
— Active compensation systems
— Heave motion compensators
DNV Rule reference*: OS-E101 Table A3
CL: 2**
Applicable for: Drilling Unit, Well Intervention Unit

System: Drilling fluid circulation and cementing (MUD)
Typical sub-systems and components:
— Mud circulation system, high pressure
— Cementing system
DNV Rule reference*: OS-E101 Table A6
CL: 2
Applicable for: Drilling Unit, Well Intervention Unit

System: Well Testing Systems (WT)
Typical sub-systems and components:
— Production shut down system
— Blow down system
DNV Rule reference*: OS-E101 Table A7
CL: 2
Applicable for: Drilling Unit, Well Intervention Unit

Function: Work Over/Completion
System: Work Over Control System (WOCS)
Typical sub-systems and components:
— WOCS Container
— WOCS’s chair
— Data acquisition system
— Network
DNV Rule reference*: OS-E101
CL: 2
Applicable for: Well Intervention Unit

Function: Power generation and distribution
System: Power Management System (PMS)
Typical sub-systems and components:
— Power generation remote control and monitoring
— Power distribution remote control and monitoring
— Blackout prevention
— Load dependent start/stop including drilling drive and thruster drive power limitations
— Engine change over
— Load sharing in remote droop mode (symmetric, asymmetric and manual)
— Blackout recovery
— Network
— Operator stations
— Integration with system specific PMS (drilling system PMS etc.), if applicable
DNV Rule reference*: OS-D201
CL: 2
Applicable for: All unit types

System: Power Plant Control System (PPC)
Typical sub-systems and components:
— Governor & synchronizing unit
— Turbine controls
— Protection relays
— Thruster drives
— Drilling drives
— High voltage or low voltage switchboard
DNV Rule reference*: OS-D201
CL: 2
Applicable for: All unit types
Table B1: ISDS class notation scope, system definitions and confidence levels for selected systems (Continued)

Function: Position keeping
System: Dynamic Positioning System (DP)
Typical sub-systems and components:
— DP control computer(s)
— Independent joystick
— Sensor systems
— Display systems
— Operator panels
— Network
— Positioning reference systems
— Thruster control mode selection system
DNV Rule reference*: Pt.6 Ch.7, Pt.6 Ch.26
CL: 2
Applicable for: All unit types

System: POSMOOR system (POS)
Typical sub-systems and components:
— POSMOOR control computer(s)
— Independent joystick
— Sensor systems
— Display systems
— Operator panels
— Network
— Positioning reference systems
— Active mooring equipment, e.g. windlass and winch
DNV Rule reference*: OS-E301
CL: 2
Applicable for: Drilling Unit, Well Intervention Unit, FPSO, FSU

System: Jacking Systems (JACK)
Typical sub-systems and components:
— Hydraulic system
— Control system
— Drives
DNV Rule reference*: OS-D101
CL: 2
Applicable for: Drilling Unit, Well Intervention Unit, Wind Installation Unit

Function: Control and monitoring
System: Integrated Control and Monitoring System (ICM)
Typical sub-systems and components:
— Sea water, fresh water, hot water, high pressure wash down system
— Fuel oil system
— HVAC system
— Service, instrument and bulk air system
— Hydraulic power system
— Ballast system
— Load and stability system
DNV Rule reference*: OS-D202
CL: 2
Applicable for: All unit types

* Only systems and components with software should be considered.
** CL3 should be applied for units involved in fixed-to-bottom operations.
203 Systems other than those listed in Table B1 may be granted the ISDS notation at CL1, 2 or 3, as specified
below:

— Crane (CR)
— Navigation Systems (NAV)
— Process Control System (PCS)
— Propulsion System (PROP)
— Steering Systems (ST).
204 For each system a reference to the applicable DNV Rule or Offshore Standard (OS) is listed. The rules
or standards identified provide a more specific description of the systems. DNV-OS-D202 provides generic
requirements common to, and is applicable for, all systems listed above.
C. In operation assessments
C 100 Objectives
101 The ISDS class notation is maintained through demonstrating compliance to the requirements of this
standard in the operation phase.
102 DNV, as the independent verifier, shall perform an annual assessment of the unit in operation to verify
compliance.
103 DNV shall perform a renewal assessment of the unit every fifth year.
104 Annual and renewal assessments for the ISDS notation should be carried out at the same time and interval
as the periodical classification survey of the unit, and can also be carried out on the basis of documentation and
evidence assessed onshore.
105 An action plan that demonstrates how findings will be closed shall be prepared by the owner, and
approved by DNV. DNV shall verify, through re-assessment(s), that the action plan is implemented and the
notation maintained.
106 The owner is to inform DNV whenever a system with the ISDS notation is modified. For major upgrades
or conversions of the unit in operation the full set of requirements in this standard may apply.
C 200 Scope of annual assessments
201 The purpose of the annual assessment is to ensure that the confidence that has been built into the unit is
actually maintained. The effective implementation and continuous maintenance of the activities required by
this standard for phase E, operation, shall be assessed.
202 As part of the annual assessment any changes, introduced after the latest assessment, to the systems
within ISDS scope are to be addressed. An impact analysis of changes shall be reviewed and confirmed. Any
follow up activities are to be agreed.
203 Updated evidence is to be kept and made available for review by the attending surveyor. Relevant
evidence includes (with reference to the required activity in parentheses):
— Inventory and spare part records with focus on obsolescence (produced in E.ACQ.1)
— Configuration management plan and logs for configuration management activities, including SW change
orders (produced in E.CM.1)
— Change request logs including impact assessments (produced in E.CM.1)
— Configuration audit reports (produced in E.CM.2)
— Procedures for modifications request (produced in E.PQA.1)
— Maintenance plans and procedures (produced in E.RAMS.1 and E.PQA.1)
— Corrective action plans (produced in X.PQA.4 and X.PQA.6)
— Quality control and project control records, as relevant (produced in X.PQA.1 and X.PQA.3)
— Maintenance in operation plans, including migration and SW retirement plans (produced in E.RAMS.1)
— Records of RAMS data (produced in E.RAMS.2)
— Analysis and investigation reports from RAMS incidents/failures (produced in E.RAMS.3)
— Records of RAMS impact analysis (produced in E.RAMS.4)
— Security audit reports (produced in E.RAMS.5)
— Records from validation, verification and testing, as relevant if systems have been changed in operation
(produced in E.VV.1 and E.VV.2)
— Version histories for baselines, requirements trace and configuration records (produced in X.CM.1).
C 300 Scope of renewal assessments
301 The renewal assessment covers the scope of the annual assessment. In addition, it has a specific focus on
identified process areas or activities. These areas or activities are to be selected based on a discussion with the
owner of specific focus areas, and should also be based on important or frequent findings from the annual
assessments carried out since the last renewal.
OFFSHORE STANDARD
DNV-OS-D203
INTEGRATED SOFTWARE DEPENDENT SYSTEMS
APPENDICES
CONTENTS
App. A DEFINITIONS AND ABBREVIATIONS ........................................................................ 49
App. B REQUIREMENT DEFINITION ....................................................................................... 56
APPENDIX A
DEFINITIONS AND ABBREVIATIONS
A. Definitions
A 100 Verbal Forms
Shall: Indicates a mandatory requirement to be followed for fulfilment or compliance with the present standard.
Deviations are not permitted unless formally and rigorously justified, and accepted by all parties.
Should: Indicates a recommendation that a certain course of action is preferred or particularly suitable.
Alternative courses of action are allowable under the standard when agreed between contracting parties, but
shall be justified, documented and approved by DNV.
May: Indicates permission, or an opinion, which is permitted as a part of conformance with the standard.
Can: Indicates a conditional possibility.
A 200 Definitions
Acceptance criteria: The criteria that a system or component must satisfy in order to be accepted by a user,
customer, or other authorized entity [IEEE 610.12:1990].
Activity: A defined body of work to be performed, including its required input and output information.
[IEEE 1074:2006].
Availability: The ability of the system to provide access to its resources in a timely manner for a specified
duration [IEC IEV 191-02-05]; alternatively, the time or proportion of time that the system is functioning as
intended.
Baseline: A consistent set of specifications or products that have been formally reviewed and agreed upon, that
thereafter serve as the basis for further development, and that can be changed only through formal change
control procedures [ISO IEC 15288:2008].
Base Product: A pre-existing product which is reused, configured, qualified, modified or enhanced to meet the
specific needs of new projects.
Black-box:
(1) A system or component whose inputs, outputs, and general function are known but whose contents or
implementation are unknown or irrelevant.
(2) Pertaining to an approach that treats a system or component as in (1).
[IEEE 610.12:1990].
Black-box testing: see Functional testing.
Block diagram: A diagram of a system, computer, or device in which the principal parts are represented by
suitably annotated geometrical figures to show both the functions of the parts and their functional relationships
[IEEE 610.12:1990].
Change control board: See Configuration control board.
Code review: Systematic examination (often as peer review) of computer source code intended to find and fix
mistakes overlooked in the initial development phase, improving the overall quality of software. Code review
may also be partly automated.
Commercial Off-The-Shelf (COTS): COTS products are ready-made packages sold off-the-shelf to the acquirer,
who has had no influence on their features and other qualities. Typically the software is sold pre-wrapped with
its user documentation [ISO/IEC 25051:2006(E)].
Commissioning (tests): Verifying and documenting that the unit and all of its systems are designed, installed,
tested and can be operated and maintained to meet the owner's requirements.
Component: A logical grouping of other components or modules inside a system or sub-system.
Component testing: Testing of individual hardware or software components or groups of related components
[IEEE 610.12:1990].
Configuration audits: Activities ensuring that the configuration management process is followed and that the
evolution of a product is compliant to specifications, policies, and contractual agreements. Functional
configuration audits are intended to validate that the development of a configuration item has been completed
and it has achieved the performance and functional characteristics specified in the System Specification
(functional baseline). The physical configuration audit is a technical review of the configuration item to verify
that the as-built maps to the technical documentation [INCOSE SE 2004].
Configuration control board: A group of people or a person responsible for evaluating and approving or
disapproving proposed changes to configuration items, and for ensuring implementation of approved changes.
Configuration data: Data used to configure/tailor a system or component. The data needs to be quality assured.
Configuration item: An aggregation of hardware, software, or both, that is designated for configuration
management and treated as a single entity in the configuration management process [IEEE 610.12:1990].
Consequence (failure): Real or relative magnitude of the seriousness of the failure (business, environmental
and safety).
Consistency: The degree of uniformity, standardization, and freedom from contradiction among the documents
or parts of a system or component [IEEE 610.12:1990].
Coverage: The amount or proportion of a software component that has been tested. It is commonly quantified
by counting the execution of statements, decision outcomes, and I/O values. Coverage measures help to identify
unreachable and untested code.
Critical: Any function or component whose failure could interfere significantly with the operation or activity
under consideration.
Criticality: The degree of impact that a requirement, module, error, fault, failure, or other item has on the
development or operation of a system [IEEE 610.12:1990].
Cycle time:
(1) The period of time required to complete a sequence of events.
(2) A set of operations that is repeated regularly in the same sequence, possibly with variations in each
repetition; for example, a computer's read cycle.
[IEEE 610.12:1990].
Deadlock: A situation in which computer processing is suspended because two or more devices or processes
are each awaiting resources assigned to the others [IEEE 610.12:1990].
Decision Coverage: A measure of the amount of software tested, typically expressed as the number or
percentage of outcomes of decision statements in the component that have been tested.
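The difference between statement coverage and decision coverage can be shown with a toy function (invented for illustration):

```python
# Toy function with one decision and two outcomes.
def clamp_positive(x):
    if x < 0:     # decision: two outcomes (True / False)
        x = 0     # statement executed only on the True outcome
    return x

# The single input -5 executes every statement of the function
# (full statement coverage) but exercises only the True outcome
# of the decision, i.e. 50% decision coverage.
assert clamp_positive(-5) == 0
# Adding an input for which x < 0 is False completes decision coverage.
assert clamp_positive(7) == 7
```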
Defect: Non-fulfilment of a requirement related to an intended or specified use [ISO 9000: 2005].
Dependability: Collective term used to describe the availability performance and its influencing factors:
reliability performance, maintainability performance and maintenance support performance. See also RAMS,
to which it adds Security [IEC IEV 191-02-03].
Due diligence (software): An investigation, validation and verification of a software system/sub-system/
component/module for proving its usefulness and conformity in a given context.
Error: A discrepancy between a computed, observed or measured value and condition and the true, specified
or theoretically correct value or condition [IEC 61508-4].
Essential function (or system): A system supporting the function, which needs to be in continuous operation or
continuously available for on demand operation for maintaining the unit's safety [DNV-OS-D202].
Established design: A design that has, for the most part, been successfully implemented before. Such designs are often the basis for turnkey fixed-price systems such as current-generation drill ships.
Factory Acceptance Tests (FAT): Acceptance testing (see above) of a component, sub-system or system before
delivery and integration.
Failure: The termination of the ability of a functional unit to perform a required function on demand.
Note: a fault in a part of the system may lead to the failure of its function, itself leading to a fault in other linked
parts or systems etc.
Failure mode: A defined manner in which a failure can occur. Failure modes can be seen as scenarios for how
a system can go wrong.
Fault: Abnormal condition that may cause a reduction in, or loss of, the capability of a functional unit to
perform a required function excluding the inability during preventive maintenance or other planned actions, or
due to lack of external resources [IEC 61508-4].
Finding: The result of approval, assessment and renewal activities by DNV, identifying the most important
issues, problems or opportunities for improvement within the scope of this standard.
Firmware: The combination of a hardware device and computer instructions and data that reside as read-only
software on that device [IEEE 610.12:1990].
Functional requirement: A requirement that specifies a function that a system or system component must be
able to perform [IEEE 610.12:1990].
Functional testing:
(1) Testing that ignores the internal mechanism of a system or component and focuses solely on the outputs
generated in response to selected inputs and execution conditions.
(2) Testing conducted to evaluate the compliance of a system or component with specified functional
requirements.
[IEEE 610.12:1990].
Generic System Requirement (GSR): an attribute of the system or its function, relating to performance, quality
or RAMS, as seen by the owners [DNV-RP-D201].
Guidance notes: Contain advice which is not mandatory for the assignment or retention of class, but with which DNV, in light of general experience, advises compliance [DNV-OSS-101].
Impact analysis: Analysis, which identifies all systems and software products affected by a software change
request and develops an estimate of the resources needed to accomplish the change and determines the risk of
making the change [SWEBOK 2004].
Integration strategy: An assembly sequence and strategy that minimizes system integration risks. This strategy
may permit verification against a sequence of progressively more complete component configurations and be
consistent with a fault isolation and diagnosis strategy. It defines the schedule of component availability and
the availability of the verification facilities, including test jigs, conditioning facilities, assembly equipment
[ISO/IEC 15288].
Integrated Software Dependent System: An integrated software dependent system is an integrated system for
which the overall behaviour is dependent on the behaviour of its software components.
Integrated System: An integrated system is a set of elements which interact according to a design, where an
element of a system can be another system, called a subsystem, which may be a controlling system or a
controlled system and may include hardware, software and human interaction [IEC 61508-4].
Interface:
(1) A shared boundary across which information is passed.
(2) A hardware or software component that connects two or more other components for the purpose of passing
information from one to the other.
(3) To connect two or more components for the purpose of passing information from one to the other.
(4) To serve as a connecting or connected component as in (2) [IEEE 610.12:1990].
(5) A collection of operations that are used to specify a service of a component.
Interface Specification: Data describing the communications and interactions among systems and subsystems.
Interface testing: Testing conducted to evaluate whether systems or components pass data and control correctly
to one another [IEEE 610.12:1990].
Maintainability:
(1) The ease with which a software system or component can be modified to correct faults, improve
performance or other attributes, or adapt to a changed environment.
(2) The ease with which a hardware system or component can be retained in, or restored to, a state in which it
can perform its required functions [IEEE 610.12:1990].
Migration: System migration involves moving a set of instructions or programs, e.g., PLC programs, from one
platform to another, minimizing reengineering. Migration of systems can also involve downtime, while the old
system is replaced with a new one.
Milestone: A scheduled event marking the transition from one project phase to the next. This standard identifies
5 milestones.
Mitigation: Action that reduces the consequence(s) of a hazardous event or risk [IEC 61511-1].
Modification request: See Change request.
Module: In this standard used to describe the lowest branches in the system hierarchy. Modules can be made
of hardware (HW) or software (SW).
Non-functional requirement: A requirement that specifies a characteristic or property that is not described as
function, e.g. performance requirements.
Non-standard legacy software: Software that was not designed to be reused, but may be reused anyway.
Novel: A feature, capability, or interface that largely is not present in the base product.
Obsolescence (Risk): Risk associated with technology within the system that becomes obsolete before the end
of the Expected Shelf or Operations Life, and cannot provide the planned and desired functionality. This risk
may be mitigated by the Portability Generic System Requirement [DNV-RP-D201].
Peer review: A process of subjecting an author's work to the scrutiny of others who are experts in the same
field.
Portability: The ease with which a system or component can be transferred from one hardware or software
environment to another [IEEE 610.12:1990].
Probability (of failure on demand): For hardware and random failures, probability that an item fails to operate
when required. This probability is estimated by the ratio of the number of failures to operate for a given number
of commands to operate (demands) [IEC IEV-191-29-1].
Project: In the context of this standard the term Project refers to the activities, responsibilities and roles
involved in a new build or upgrade project where the ISDS standard is applied.
Prototype: A model implemented to check the feasibility of implementing the system against the given constraints,
to communicate the specifier’s interpretation of the system to the customer, in order to locate misunderstandings.
A subset of system functions, constraints, and performance requirements are selected. A prototype is built using
high-level tools. At this stage, constraints such as the target computer, implementation language, program size,
maintainability, reliability and availability need not be considered [ISO/IEC 61508-7:2010].
Prototyping: A hardware and software development technique in which a preliminary version of part or all of
the hardware or software is developed to permit user feed-back, determine feasibility, or investigate timing or
other issues in support of the development process [IEEE 610.12-1990].
Quality target: The objective or criteria agreed by the stakeholders to be reached for a quality characteristic.
ISO/IEC 9126-1 proposes many quality characteristics for consideration.
Record: Information or documents stating results achieved or providing evidence of activities performed.
Redundancy: The existence of more than one means for performing a required function or for representing
information [IEC 61508-4]. Redundancy prevents the entire system from failing when one component fails.
Regression testing: Selective retesting of a system or component to verify that modifications have not caused
unintended effects and that the system or component still complies with its specified requirements [IEEE
610.12:1990].
Release note: The term “release” is used to refer to the distribution of a software configuration item outside the
development activity. This includes internal releases as activities might contain provisions which cannot be
satisfied at the designated point in the life cycle [IEEE12207.0-96:c6s2.6]. The release notes typically describe
new capabilities, known problems, and platform requirements necessary for proper product operation
[SWEBOK 2004].
Reliability: The capability of the ISDS to maintain a specified level of performance when used under specified
conditions.
Reliability, Availability, Maintainability, (Functional) Safety (RAMS): A set of commonly linked generic
system attributes that often need to be dealt with in a systematic manner. See also Dependability.
Requirement: A condition or capability that must be met or possessed by a system or system component to
satisfy a contract, standard, specification, or other formally imposed documents [IEEE 610.12:1990].
Reused software: Software integrated into the system that is not developed during the project, i.e., both
standard software and non-standard legacy software. Software can be reused “as-is” or be configured or
modified.
Review: Activity undertaken to determine the suitability, adequacy and effectiveness of the subject matter to
achieve established objectives [ISO 9000:2005].
Revision control: Management of multiple revisions of the same unit of information (also known as version
control, source control or (source) code management).
Risk: The qualitative or quantitative likelihood of an accident or unplanned event occurring, considered in
conjunction with the potential consequences of such a failure. In quantitative terms, risk is the quantified
probability of a defined failure mode times its quantified consequence [DNV-OSS-300].
Role: A role is an organization with responsibilities within the system lifecycle. A role has specific activities
to perform. See Ch.2, Sec.3, A200.
Safety integrity level (SIL): A relative level of risk-reduction provided by a safety function, or to specify a target
level of risk reduction. [IEC 61508] defines four levels, where SIL 4 is the most dependable and SIL 1 is the
least. SIL is not to be confused with confidence level, which is defined in Ch.2, Sec.2.
Software: Computer programs, procedures, and possibly associated documentation and data pertaining to the
operation of a computer system [IEEE 610.12:1990].
Software Component: A software component is an interacting set of software modules. A software component
is a configuration item.
Software lifecycle: The period of time that begins when a software product is conceived and ends when the
software is no longer available for use. The software life cycle typically includes a concept phase, requirements
phase, design phase, implementation phase, test phase, installation and checkout phase, operation and
maintenance phase, and, sometimes, retirement phase. Note: These phases may overlap or be performed
iteratively [IEEE 610.12:1990].
Software Module: Separately compilable or executable piece of source code. It is also called “Software Unit”
or “Software Package” [ISO/IEC 12207:2008]. A small self-contained program which carries out a clearly
defined task and is intended to operate within a larger program.
Single point of failure: Component or interface of a system for which no backup or redundancy exists and the
failure of which will disable the entire system.
Simulation: Any of: real-world simulation, process simulation, electronic circuit simulation, fault simulation, Software-in-the-Loop (SIL) simulation, simulation of the system’s environment, or Hardware-in-the-Loop (HIL) simulation [SfC 2.24]. Simulators serve various purposes and use different modelling techniques [ISO/IEC 61508-7:2010].
Site Acceptance Tests: An acceptance test for a fully integrated system. May be part of commissioning, or may
be performed in the factory before delivery to the unit.
Source code: Computer instructions and data definitions expressed in a form suitable for input to an assembler,
compiler, interpreter or other translator [IEEE 610.12:1990].
Specification: A document that specifies, in a complete, precise, verifiable manner, the requirements, design,
behaviour, or other characteristics of a system or component, and, often, the procedures for determining
whether these provisions have been satisfied [IEEE 610.12:1990].
Standard software: Ready-made and packaged software intended to be used in different systems, for example
COTS software, the ISDS supplier’s own developed standard software components, and open source software.
State diagram: A diagram that depicts the states that a system or component can assume, and shows the events
or circumstances that cause or result from a change from one state to another [IEEE 610.12:1990].
Statement Coverage: A measure of the amount of software that has been tested, typically expressed as a
percentage of the statements executed out of all the statements in the component tested.
Static Analysis: The process of evaluating a system or component based on its form, structure, content, or
documentation. Contrast with: dynamic analysis [IEEE 610.12:1990].
Statistical Testing: A testing method based on allocating test cases to components and functions in proportion
to their expected use in operational scenarios, also called random testing. Data from statistical testing can be
used to predict operational reliability.
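Allocating test cases in proportion to expected use can be sketched as below; the operational profile and test budget are invented for illustration:

```python
# Hypothetical operational profile: fraction of operating time spent
# in each function (illustrative values only).
operational_profile = {"station_keeping": 0.70, "joystick": 0.20, "standby": 0.10}
test_budget = 50  # total number of test cases to allocate

# Distribute the budget over functions in proportion to expected use.
allocation = {fn: round(test_budget * share)
              for fn, share in operational_profile.items()}
print(allocation)  # {'station_keeping': 35, 'joystick': 10, 'standby': 5}
```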
Sub-supplier: (in the context of this standard) Organisations and companies delivering products and services
to suppliers.
Sub-system: A Sub-system is a part of a system. For example, Choke and Kill is part of the Well Control System, and the independent joystick is part of the Dynamic Positioning system (ref. definition in Ch.3 Sec.1B).
System: A defined product which contains sub-systems. For the purposes of the ISDS notation, a system refers
to the qualifier identified in the notation, for example DP, DCS, PMS, etc. (ref. Ch.3 Sec.1B).
System Architecture: A selection of the types of system elements, their characteristics, and their arrangement
[INCOSE SE 2004].
System Design Review: A review conducted to evaluate the manner in which the requirements for a system have
been allocated to configuration items, the system engineering process that produced the allocation, the
engineering planning for the next phase of the effort, manufacturing considerations, and the planning for
production engineering [IEEE 610.12:1990].
System testing: Testing conducted on a complete, integrated system to evaluate the system's compliance with its
specified requirements. See also: component testing; integration testing; interface testing [IEEE 610.12:1990].
Stress testing: Testing conducted to evaluate a system or component at or beyond the limits of its specified
requirements [IEEE 610.12:1990].
Target environment: The configuration of network protocols, computers, PLCs, sensors, final elements and
other hardware on which a software integrated system is intended to be executed in operations.
Test case: A specification of a test in terms of:
— a description of the purpose of the test
— pre-conditions (e.g. the state of the software under test and its environment)
— actions organized in one or several scenarios (including what data to provide)
— expected results
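The four elements above can be expressed directly in an automated test. The sketch below uses a hypothetical clamp function as the item under test; none of the names are taken from this standard:

```python
# Toy function standing in for the item under test (hypothetical).
def clamp(value, low, high):
    return max(low, min(high, value))

# A test case carrying the four elements named above.
test_case = {
    "purpose": "Verify values above the upper limit are clamped",
    "pre_conditions": {"low": 0, "high": 100},   # state of the environment
    "actions": [("clamp", 150)],                 # scenario, including input data
    "expected": 100,                             # expected result
}

limits = test_case["pre_conditions"]
(_, value), = test_case["actions"]
result = clamp(value, limits["low"], limits["high"])
assert result == test_case["expected"]
print("PASS:", test_case["purpose"])
```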
Traceability: Linkage between requirements and subsequent work products, e.g. design documentation and test
documentation.
Traceability matrix: A matrix that records the relationship between two or more products of the development
process; for example, a matrix that records the relationship between the requirements and the design of a given
software component [IEEE 610.12:1990].
Unit: The vessel that will get the ISDS notation, typically a Mobile Offshore Unit or a Ship.
Use Case: The Use Case specifies the concepts used primarily to define the behaviour and the functionality of
a system or a subsystem, without specifying its internal structure. Use cases and actors (users) interact when
the services of the system are used. In this context an actor plays a coherent set of roles when interacting with
the system. Note that in the Use Case context an actor may be a user or another system interacting with the first
one [IEC 19501:2005].
Validation: Confirmation, through the provision of objective evidence that the requirements for a specific
intended use or application have been fulfilled [ISO 9000:2005].
Validation strategy: Identification (e.g., list) of validation activities to be performed, along with validation
methods, objectives, and responsibility assigned to them. The purpose of this strategy is to minimize
redundancy and maximize effectiveness of the various validation activities.
Verification: Tasks, actions and activities performed to evaluate progress and effectiveness of the evolving system
solutions (people, products and process) and to measure compliance with requirements. Analysis (including
simulation, demonstration, test and inspection) are verification approaches used to evaluate: risk; people, product
and process capabilities; compliance with requirements, and proof of concept [INCOSE SE 2004].
Verification strategy: Identification (e.g., list) of verification activities to be performed, along with verification
methods, objectives, and responsibility assigned to them. The purpose of this strategy is to minimize
redundancy and maximize effectiveness of the various verification activities.
Version: Software items evolve as a software project proceeds. A version of a software item is a particular
identified and specified item. It can be thought of as a state of an evolving item [SWEBOK 2004].
White box testing: A testing method that uses knowledge of the internal organization of the software to select
test cases that provides adequate coverage of the software. White box testing is also called structural testing.
Work product: A deliverable or outcome that must be produced to prove the completion of an activity or task.
Work products may also be referred to as artefacts.
B. Abbreviations
The abbreviations in Table B1 are used.
Table B1 Abbreviations
AP        For Approval
BOP       Blow Out Prevention
CAAP      Critical Alarm and Action Panel
CCB       Change Control Board
CL-<n>    Confidence Level <n> (n=1 to 3)
CMC       Certification of Materials and Components
COTS      Commercial off-the-shelf
CPU       Central Processing Unit
DNV       Det Norske Veritas
DP        Dynamic Positioning
EDS       Emergency Disconnect System
EQD       Emergency Quick Disconnect
ESD       Emergency Shut Down
F&G       Fire and Gas
FAT       Factory Acceptance Tests
FEED      Front-End Engineering and Design
FI        For Information
FMECA     Failure Modes, Effects, Criticality Analysis
GSR       Generic System Requirement
HCTS      Heave Compensation and Tensioning System
HIPS      High Integrity Protection System
HVAC      Heating, Ventilation, and Air Conditioning
HW        Hardware (as opposed to software)
IAS       Integrated Automation System
IEC       The International Electrotechnical Commission
IEEE      The Institute of Electrical and Electronics Engineers
INCOSE    International Council on Systems Engineering
IO        Input/output (also I/O)
ISDS      Integrated Software Dependent Systems
ISO       International Organization for Standardization
IV        Independent Verifier
MOU       Mobile Offshore Unit
MTTF      Mean Time To Failure
MTBF      Mean Time Between Failures
Table B1 Abbreviations (Continued)
MTTR      Mean Time To Repair
MUD       Bulk Storage, drilling fluid circulation and cementing
OS        Offshore Standard
OSS       Offshore Service Specifications
OW        Owner
PLC       Programmable Logical Controller
PMS       Power Management System
PPC       Power Plant Control System
PRH       Pipe / Riser Handling
PSD       Process Shut Down
RAM       Reliability, Availability, Maintainability
RAMS      Reliability, Availability, Maintainability, Safety
RfP       Request for Proposal
RMS       Riser Monitoring System
RP        Recommended Practice
SAT       Site Acceptance Tests
SEM       Subsea Electronic Module
SI        System Integrator
SU        Supplier
SW        Software
UIO       Unit In Operation
WCS       Well Control System
WCP       Well Control Package
WOCS      Work Over Control System
APPENDIX B
REQUIREMENT DEFINITION
A. Requirement definition
A 100 General
101 The following lists the required activities for all phases and all roles.
102 Each activity is presented in the same manner. The list of activities is sorted alphabetically by the activity
ID and contains the following elements:
— The header describes the unique activity identifier (ID), enabling easy reference and traceability, and the
name of the activity. The identifier is structured in three parts: Z.YYY.NN. The first part (“Z”) of the
activity identifier refers to the project phase. The second part (“YYY”) of the activity identifier refers to
the process area. The third part (“NN”) of the activity identifier is a unique number for the activity.
— The phase and the confidence level at which the activity is to be performed.
— Assignment of responsible roles for unit and system level.
— The requirement definition provides a detailed description of the activity requirements.
— A guidance note is provided when needed.
— The assessment criteria field provides typical evidence to be made available by the responsible role(s) for
the assessment.
— The documentation criteria field provides a list of required documentation to be submitted to DNV for
approval (AP) or for information (FI).
— The contributions field lists the roles that are expected to contribute to the activity and the details of the
expected contributions.
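The Z.YYY.NN identifier structure described above can be split mechanically. The helper below is a hypothetical sketch for illustration, not part of the standard:

```python
import re

# Hypothetical helper: split an activity identifier of the form Z.YYY.NN
# into phase (one letter), process area (letters), and activity number.
ACTIVITY_ID = re.compile(r"^([A-Z])\.([A-Z]+)\.(\d+)$")

def parse_activity_id(identifier):
    match = ACTIVITY_ID.match(identifier)
    if match is None:
        raise ValueError(f"not a valid activity identifier: {identifier!r}")
    phase, area, number = match.groups()
    return {"phase": phase, "process_area": area, "number": int(number)}

print(parse_activity_id("A.RAMS.1"))
# {'phase': 'A', 'process_area': 'RAMS', 'number': 1}
```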
A 200 Activity definition basic engineering
A.CM.1 Establish a baseline of requirements for the unit
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: A baseline of the unit requirements shall be established at the end of the basic engineering phase.
This baseline shall be used as the reference when requirements evolve in later phases.
Guidance note:
The purpose of an explicit requirements baseline is to achieve control of any changes to the requirements.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Approved and controlled unit requirements document.
Revision history of unit requirements document.
Acceptable contributions from Owner.
Approval of the requirements document.
Documents required for review:
None
A.DES.1 Establish the unit design
Phase: Basic engineering. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: A top level architecture of the systems and software within ISDS scope shall be established,
identifying the systems and their interrelationships, including the required interfaces.
Assessment criteria:
Unit design: unit design specifications, systems/network
topology and functional descriptions.
Acceptable contributions from Owner.
Review the unit design/top level architecture.
Documents required for review:
None
A.PM.1 Establish the master plan
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: A master plan for the development of the unit’s systems in ISDS scope shall be established. The
plan shall contain a high level master schedule, taking into account rough estimations and discussions with potential
suppliers. Milestones for the whole project shall be established, including the milestones defined in this ISDS standard.
All stakeholders shall be in agreement on the plan.
Guidance note:
The development plan and the milestones should typically show the requirements and design freeze dates.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Master plan: Activities, work breakdown structure (WBS), schedule, and milestones.
Acceptable contributions from Owner.
Provide inputs on specific schedule constraints and master schedule.
Documents required for review:
None
A.PQA.1 Define procedures (owner)
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: Owner. System level responsible: None.
Requirement definition: Procedures to be used within the project shall be defined, coordinated, and agreed within and
between organisations participating in the project.
Roles, responsibilities and specific requirements as defined in this standard shall be explicitly addressed.
Guidance note:
The word ‘procedures’ in this context is used to represent all documentation regarding the way of working, e.g.
process descriptions, standard operating procedures, work instructions, checklists, guidelines etc.
Some procedures may need to be coordinated with the system integrator’s procedures when the system integrator is
selected (see A.PQA.2).
The defined procedures are normally made up of quality management system documents: e.g. standard operating
procedures, process description, checklists, and document templates along with any project specific adaptations of
these.
The following areas are normally expected to be covered:
— All activities required to be performed by the owner, listed in Ch.2 Sec.5.
— Responsibilities and authorities of different disciplines and roles,
— Mechanisms for submitting and receiving information (documents) between different organisations,
— Mechanisms for defining baselines of information (documents),
— Mechanisms for handling of documents and information while they are work in progress,
— Mechanisms for review and approval of drawings and other documents,
— Mechanisms for approval of deliverables (verification mechanisms),
— Mechanisms for handling of changes to already agreed technical scope, schedule or costs,
— Mechanisms for follow-up of process adherence (see X.PQA.1 and X.PQA.4).
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
A quality system, documents, minutes of meetings, or other relevant information showing: a defined way of working for the major activities in the project, clear roles and responsibilities, and defined ways of interaction between the different organizations (e.g. owner, system integrator, supplier, independent verifier, and others).
Documents required for review:
None
Contributions: No required contributions.
A.PQA.2 Define procedures (system integrator)
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: Procedures to be used within the project shall be defined, coordinated, and agreed within and
between organisations participating in the project.
Roles, responsibilities and specific requirements as defined in this standard shall be explicitly addressed.
Guidance note:
The word ‘procedures’ in this context is used to represent all documentation regarding the way of working, e.g.
process descriptions, standard operating procedures, work instructions, checklists, guidelines etc.
Some procedures may need to be coordinated with the owner (see A.PQA.1) and the suppliers (see B.PQA.1) when
the suppliers are selected.
The defined procedures are normally made up of quality management system documents: e.g. standard operating
procedures, process description, checklists, and document templates along with any project specific adaptations of these.
The following areas are normally expected to be covered:
— All activities required to be performed by the system integrator, listed in Ch.2 Sec.6.
— Responsibilities and authorities of different disciplines and roles,
— Mechanisms for submitting and receiving information (documents) between different organisations,
— Mechanisms for defining baselines of information (documents),
— Mechanisms for handling of documents and information while they are work in progress,
— Mechanisms for review and approval of drawings and other documents,
— Mechanisms for approval of deliverables (verification mechanisms),
— Mechanisms for handling of changes to already agreed technical scope, schedule or costs,
— Mechanisms for escalation of problems (see C.PQA.1),
— Mechanisms for follow-up of process adherence (see X.PQA.2 and X.PQA.5),
— Mechanisms allowing management insight into the project’s status,
— Internal procedures and rules for the work to be carried out in the project e.g. design guidelines (see B.DES.3).
— Internal procedures and rules regarding how to document the requirements, design, implementation, verification
& validation, and acceptance of the systems within ISDS scope.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
A quality system, documents, minutes of meetings, or other relevant information showing: A defined way of working for the major activities in the project, clear roles and responsibilities and defined ways of interaction between the different organizations (e.g. owner, system integrator, supplier, independent verifier, and others).
Documents required for review:
None
Contributions: No required contributions.
A.RAMS.1 Determine safety rules, standards and laws applicable
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: System integrator.
Requirement definition: Rules, standards and applicable laws for safety requirements shall be identified, reviewed and
agreed upon.
Guidance note:
Sources of statutory rules include flag states, shelf states, and classification societies. Also, industry standards such
as IEC 61508/61511 or ISO 17894 may be applied.
This activity is an extension of the A.REQ.2 activity.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Listing of regulatory requirements that apply regarding safety.
Resolution of conflicting rules.
Application guidelines.
Documents required for review:
List of regulatory requirements unit (FI):
— reviewed in A.IV.2 at CL3
— used in B.IV.4 and D.IV.3 at CL3.
List of regulatory requirements system (FI) used in C.IV.3 at CL3.
Acceptable contributions from Owner.
Make clear any specific considerations such as sovereignty of intended area of operation.
A.RAMS.2 Define RAM related requirements and objectives
Phase: Basic engineering. Confidence level: 2 and above.
Unit level responsible: Owner. System level responsible: Owner.
Requirement definition: Reliability, availability, and maintainability related requirements shall be defined and
documented. These requirements shall be detailed down to the system level and shall be quantitative for CL3.
Guidance note:
RAM requirements may be explicitly defined as quantitative or qualitative targets or objectives that shall be met by
the functions or the systems.
As an example of a quantitative requirement, MTTF or MTBF for a given system in a given environment may be used
as a reliability target. Relative reliability (e.g., more reliable than…) is an example of a qualitative requirement. RAM
requirements may be part of an overall requirements document.
This activity should be coordinated with the activity A.REQ.2, and the RAM requirements are normally documented
in the same document as the other unit and system requirements.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Listing of RAM requirements.
For CL2: qualitative requirements are acceptable.
For CL3: quantitative requirements (objectives) are
required.
Documents required for review:
List of RAM requirements unit (FI):
— reviewed in A.IV.2 and B.IV.4 at CL3
— used in D.IV.3 at CL3.
List of RAM requirements system (FI) used in C.IV.3 at CL3.
Contributions: No required contributions.
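The guidance note above mentions MTTF or MTBF as quantitative reliability targets for CL3. As a minimal illustration only (the figures and the target value below are invented for the sketch, not values taken from this standard), a quantitative availability objective derived from MTTF and MTTR could be checked as follows:

```python
# Illustrative sketch of checking a quantitative RAM target (A.RAMS.2).
# All numbers are hypothetical assumptions, not requirements of this standard.

def steady_state_availability(mttf_hours: float, mttr_hours: float) -> float:
    """Classic steady-state availability: A = MTTF / (MTTF + MTTR)."""
    return mttf_hours / (mttf_hours + mttr_hours)

# Hypothetical CL3 objective: availability of at least 0.999 for a system
# with an assumed MTTF of one year (8760 h) and an assumed MTTR of 4 h.
target = 0.999
achieved = steady_state_availability(mttf_hours=8760.0, mttr_hours=4.0)
meets_target = achieved >= target
```

A qualitative CL2 requirement ("more reliable than the previous installation") has no such arithmetic form, which is why the standard asks for quantitative objectives only at CL3.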
A.RAMS.3 Develop the RAMS plan for the unit
Phase: Basic engineering. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: RAMS activities shall be planned, including methods and tools to be used.
Expectations on the suppliers’ RAMS activities, plans and reporting shall be identified.
Guidance note:
The methods, tools and procedures used in all RAMS-related activities should be defined, documented and put under
configuration control. This activity provides input to the activity B.RAMS.4. The RAMS plan should cover all
relevant RAMS activities included in this standard. This activity may be coordinated with the activities A.PM.1 and
A.PQA.2 and the RAMS plan may be a separate document, or a part of the project plan or quality assurance plan.
Safety may be dealt with separately, for example in a functional safety management plan.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Plan(s) showing the methods, tools, and procedures to be used for RAMS activities.
Schedule of RAMS activities.
Expectations on the suppliers’ RAMS plan.
RAM data to be collected (CL3).
Documents required for review:
Plan for handling of RAMS (FI):
— reviewed in A.IV.2 at CL3
— used in B.IV.4 at CL3.
Acceptable contributions from Owner.
Review and comment on the RAMS plan for the unit.
A.REQ.1 Define mission, objectives and shared vision
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: Owner. System level responsible: None.
Requirement definition: Mission, vision and objectives of the unit shall be defined. The vision of the unit’s final
solution shall be formalised and distributed to all stakeholders.
Guidance note:
Agreeing on the overall vision, mission and objectives makes it possible for all involved roles to make correct
decisions regarding the functionality and capability of the systems in question.
The results may be a design-basis document or part of a FEED study or similar report.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Unit design intention and philosophy: The vision of the unit/system, descriptions of the unit/systems overall behaviour and the expected business/safety/environmental performance.
Documents required for review:
Design Philosophy (FI) used in A.IV.1 at CL3.
Contributions: No required contributions.
A.REQ.2 Collect requirements for the unit and systems
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: Owner requirements shall be collected. These requirements shall focus on operational needs,
along with the constraints the owner is placing on the unit and systems.
Guidance note:
It is important that requirements on individual Systems are defined while taking into account interacting System and
the whole Unit.
There are no specific demands regarding the format or analysis of the requirements, but it is still important that all
relevant aspects like functionality, performance, reliability, availability, maintenance, safety are covered. For a more
detailed description of what aspects to cover, see DNV RP-D201, Appendix A: Generic System Requirements.
Collecting the requirements may be supported by prototyping, FEED studies, technology qualification or other means.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Vessel specification: operational requirements, functional
requirements, non-functional requirements and technical
constraints.
Acceptable contributions from Owner.
Provide requirements.
Review of requirements documents.
Participation in clarification of requirements.
Documents required for review:
Vessel specification (FI) reviewed in A.IV.1 at CL3.
A.REQ.3 Define operational modes and scenarios to capture expected behaviour
Phase: Basic engineering. Confidence level: 2 and above.
Unit level responsible: Owner. System level responsible: None.
Requirement definition: The different operational modes of the unit shall be described and the corresponding operational
scenarios defined.
The interaction between the operators and the different systems shall be defined for key operational scenarios.
The key operational scenarios shall be explicitly described in order to enable the system integrator and suppliers to focus
on these.
Guidance note:
This activity is an extension of the activity A.REQ.2.
An operational scenario is a description of a typical sequence of events that includes the interaction of the system
environment and users, as well as interaction among its components. Nominal scenarios should be described as well
as the key degraded ones.
The task of selecting which of the operational scenarios to define as ‘key’ can be done by evaluating the scenarios
towards criteria like the novelty of the scenario, novelty of the systems involved, associated risks etc.
During the basic engineering phase operational scenarios are used to evaluate the vessel specification.
At a later stage the operational scenarios are used as input to more detailed use-cases and to create tests-cases for the
interaction between different systems.
The operational modes and scenarios may also serve as input to operation manuals.
The documentation of operational modes and scenarios can be achieved by creating a concept of operation document
(CONOPS).
For more info see the ANSI/AIAA G-043-1992 (focusing on new systems) and IEEE Standard 1362 (focusing on
upgrades to existing systems).
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Vessel specification: description of the operational modes and corresponding key operational scenarios, detailed to the level of the different systems.
Documents required for review:
Vessel specification (FI) reviewed in A.IV.1 at CL3.
Contributions: No required contributions.
A.REQ.4 Allocate functions and requirements to systems
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: The requirements shall be organized into packages that can serve as the basis for the
development of each system. The allocation of requirements shall ensure that systems on a lower confidence level do not
have compromising impact on systems on a higher confidence level.
Guidance note:
The system integrator normally requests information from possible suppliers to specify the system requirements.
Some requirements may be unique to a system; other requirements may be common across all or multiple systems.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Design specification (or requirements) for the relevant systems.
Documents required for review:
Specification (FI) reviewed in A.IV.1 at CL3.
Acceptable contributions from Owner.
Review of system level functional specifications/requirements documents, participation in clarification of requirements.
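A.REQ.4 requires that the allocation of requirements to systems does not let a lower-confidence-level system compromise a higher one. One way to read this is that each requirement carries the CL of the function it supports, and no requirement may be allocated to a system with a lower CL. A hedged sketch (system names, requirement identifiers and CL values are all invented for illustration):

```python
# Hypothetical check of the A.REQ.4 allocation rule: a requirement must not be
# allocated to a system whose confidence level is lower than the requirement's.

system_cl = {"power-management": 3, "hvac": 1}  # assumed CL per system

# Allocation packages: system -> list of (requirement id, required CL).
allocation = {
    "power-management": [("REQ-021", 3), ("REQ-022", 2)],
    "hvac": [("REQ-101", 1)],
}

def misallocated(system_cl, allocation):
    """Return (system, requirement) pairs where the system CL is too low."""
    problems = []
    for system, reqs in allocation.items():
        for req_id, req_cl in reqs:
            if req_cl > system_cl[system]:
                problems.append((system, req_id))
    return problems
```

An empty result indicates the packages respect the confidence-level ordering; any hit would need re-allocation or a raised system CL before contracts are let.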
A.REQ.5 Consult potential suppliers for acquiring of systems
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: Potential subcontractors for developing/delivering systems shall be consulted. On CL2 and
above, the requirements allocated to the system shall be used to track compliance of the supplier’s offer.
Guidance note:
The requirements allocation to systems is defined in activity A.REQ.4 and the traceability information in activity
A.REQ.6.
Make/buy/reuse analyses are normally used to determine which/what part of a system could be acquired.
The system integrator should make sure that the system is not over-specified before consulting what suppliers have
to offer.
The system integrator should communicate the hard operational constraints so that the suppliers understand the need
for their system.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
System request for proposal (RfP): functional specifications, generic system requirements and obsolescence information.
Requirements compliance information (on CL2 and above).
Documents required for review:
None
Acceptable contributions from Owner.
Contribution of Owner to System request for proposal (RfP):
— Technical constraints,
— Generic System Requirements (GSR).
A.REQ.6 Establish traceability of requirements
Phase: Basic engineering. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: The mechanisms, ambition-level, and any tools to be used for the traceability of
requirements shall be defined and communicated to suppliers. Traceability between different levels of requirements shall
be documented, and shall at this point in time at least cover the trace from unit level requirements to system level
requirements.
Guidance note:
Three different types of traceability are normally expected to be documented during the project (see X.REQ.1):
1) Traceability between requirements on different levels (e.g. from unit to system).
2) Traceability from a requirement to where and how it is designed and implemented.
3) Traceability from a requirement to where and how it is verified and validated.
Different levels and granularity of traceability may be defined depending on the complexity and the risk of not
achieving the requirements and the confidence level of the system in question.
The traceability information should be kept up to date as described in activity X.REQ.1; this also includes updates
that the suppliers are expected to perform.
Traceability matrices are normally used to document the requirement allocation, but also databases or references from
documents to documents can be used as long as the traceability information is explicit and reviewable.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Traceability information between requirements on unit
level and requirements on the different systems.
Defined mechanisms and ambition-level regarding
requirements traceability.
Contributions: No required contributions.
Documents required for review:
Traceability matrices (FI) used in A.IV.1 at CL3.
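The guidance note above names type-1 traceability (unit-level to system-level requirements) as the minimum at this stage, and notes that a matrix, database, or explicit references will do as long as the trace is reviewable. A minimal sketch of such a trace as a mapping, with a check a reviewer might run (all identifiers below are hypothetical):

```python
# Hypothetical type-1 traceability record (A.REQ.6): each unit-level
# requirement maps to the system-level requirements derived from it.

trace = {
    "UNIT-001": ["SYS-A-010", "SYS-B-003"],
    "UNIT-002": ["SYS-A-011"],
    "UNIT-003": [],  # gap: not yet traced to any system requirement
}

def untraced(trace):
    """Unit-level requirements with no system-level trace - review findings."""
    return sorted(req for req, targets in trace.items() if not targets)
```

The same structure extends naturally to the other two trace types (requirement-to-design and requirement-to-verification) by adding further mappings per requirement.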
A.RISK.1 Define a strategy for risk management
Phase: Basic engineering. Confidence level: 2 and above.
Unit level responsible: Owner. System level responsible: None.
Requirement definition: A risk management strategy shall be defined. It shall identify the sources of the risks, how they
are categorised, how they are characterised (attributes) and prioritised, how risks are reported, who is responsible, which
risk mitigations to use and when the risk should be mitigated.
Guidance note:
The risk management strategy is typically a common asset for individual projects. The ISDS standard focuses on the
risk management of common risks which impact several systems or the whole project.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Risk management procedure.
Blank risk register.
Acceptable contributions from System integrator.
Review the risk strategy.
Documents required for review:
None
A.RISK.2 Jointly identify risks
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: Risks impacting the project plans or the efforts shall be identified at the beginning of the project.
Consequences (schedule, effort, quality, obsolescence, etc.) and the associated probability shall be evaluated and
documented.
Guidance note:
Risks in this process area should focus on project (schedule and effort) and consider product risks which potentially
impact several systems or the whole project.
RAMS risks are managed in the RAMS process area.
A clear distinction should be made between risks (not having occurred yet) and issues (already present).
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Project risk list: risk list with risks related to e.g. requirements, schedule, effort, quality, performance, consistency and obsolescence (for both hardware and software).
Documents required for review:
None
Acceptable contributions from Owner.
Provide inputs on operational and business risks relevant for the project, the unit and systems.
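A.RISK.2 asks for each risk's consequences and associated probability to be evaluated and documented, and A.RISK.1 for risks to be prioritised. A common way to do this, sketched here with an assumed 1-5 scale and invented entries (neither the scale nor the entries come from this standard), is to rank risks by the product of probability and consequence:

```python
# Hedged illustration of risk evaluation and prioritisation (A.RISK.2).
# Scale (1-5) and risk entries are assumptions made for this sketch.

risks = [
    {"id": "R1", "desc": "supplier schedule slip", "prob": 4, "cons": 3},
    {"id": "R2", "desc": "obsolete COTS component", "prob": 2, "cons": 4},
    {"id": "R3", "desc": "interface spec ambiguity", "prob": 3, "cons": 5},
]

def prioritised(risks):
    """Sort risks by descending score = probability x consequence."""
    return sorted(risks, key=lambda r: r["prob"] * r["cons"], reverse=True)

top = prioritised(risks)[0]["id"]
```

Note the guidance note's distinction: entries like these are risks (not yet occurred); anything already present belongs in an issue list, not this register.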
A.RISK.3 Assign confidence levels
Phase: Basic engineering. Confidence level: 1 and above.
Unit level responsible: Owner. System level responsible: None.
Requirement definition: The confidence level (CL) of the functions and systems shall be assigned. The detailed content
and borders of the different systems within scope of ISDS shall be documented.
Guidance note:
Confidence levels of the functions and systems can be assigned using three different practices:
— functions and systems in scope of this standard are assigned a confidence level based on the recommendations in
Ch.3, Sec.1, B200,
— the above functions and systems may be assigned a higher confidence level by the owner,
— additional functions and systems may be assigned a confidence level based on the procedure described in DNV-RP-D201.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria: Confidence level matrix for the
relevant systems.
Documents required for review:
Vessel specification (confidence levels) (FI):
— reviewed in A.IV.1 at CL3
— used in A.IV.2 and B.IV.4 at CL3.
Acceptable contributions from System integrator.
Contribute to assigning confidence levels to functions and systems within the scope.
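The guidance note for A.RISK.3 lists three assignment practices: take the confidence level recommended by Ch.3 Sec.1 B200, allow the owner to assign a higher one, and assign CLs to additional systems per DNV-RP-D201. The first two can be sketched as taking the maximum of the recommended and owner-assigned levels (system names and CL values below are illustrative assumptions):

```python
# Sketch of confidence level assignment (A.RISK.3): the owner may raise, but
# not lower, the level recommended by the standard. All values are invented.

recommended = {"dynamic-positioning": 3, "ballast": 2, "hvac": 1}
owner_override = {"hvac": 2}  # owner chooses a higher CL for this system

def assigned_cl(recommended, owner_override):
    """Final CL per system: max of recommended level and owner assignment."""
    return {
        system: max(cl, owner_override.get(system, cl))
        for system, cl in recommended.items()
    }
```

The resulting table corresponds to the confidence level matrix named in the assessment criteria.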
A.VV.1 Validate the concept of the unit with the users
Phase: Basic engineering. Confidence level: 2 and above.
Unit level responsible: Owner. System level responsible: Owner.
Requirement definition: The concept of the unit and its systems shall be presented to representatives of the end users to
ensure it can fulfil their needs.
Comments and changes to the concept shall be shared with all stakeholders.
Guidance note:
The concept should represent an overall idea of the unit and its purpose.
The concept validation allows the project to get early feedback from the users, increase the user awareness of the
future unit, and reduce issues during commissioning and sea trials.
Prototypes or simulations may be used to validate the concept of the system with the users, providing a different point
of view of the system in addition to the specifications.
Other means of validation may be used, such as using existing concepts already validated by the end users.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Unit concept presentation: Simulations and Minutes of System Concept Review Meeting.
FEED study.
Documents required for review:
None
Acceptable contributions from System integrator.
Provide inputs to the unit concept and participate in the presentation of the concept to the end users.
A.VV.2 Verify the unit and system requirements
Phase: Basic engineering. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: The unit requirements shall be verified in the context of operational scenarios and the defined
mission and vision of the unit.
The correctness, completeness and consistency of the allocation and derivation of requirements to individual systems
and known components from the unit requirements shall be verified by reviewing the requirement traceability
information.
Guidance note:
This activity is a quality assurance of the results from activities A.REQ.1, A.REQ.3, A.REQ.4 and A.DES.1.
Traceability information (from activity A.REQ.6) is normally used to facilitate this activity.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Review records of the unit requirements.
Review records for the system requirements.
Contributions: No required contributions.
Documents required for review:
None
A 300 Activity definitions – engineering
B.ACQ.1 Select COTS products based on defined criteria
Phase: Engineering. Confidence level: 2 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: In case of COTS acquisition, selection criteria shall be established that include functionality,
RAMS, obsolescence management and other relevant criteria.
Guidance note:
This standard focuses on the technical aspects of the selection process, not the financial aspects, which are out of
scope.
The COTS products in question may be used as a stand-alone product or as a component in the supplier’s system.
This activity should be coordinated with the results of activity B.DES.5.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
COTS product selection procedure: obsolescence
management.
COTS product selection matrix: rationale for selection,
selection criteria, evaluations and selection.
Contributions: No required contributions.
Documents required for review:
None
B.ACQ.2 Establish contract with sub-suppliers
Phase: Engineering. Confidence level: 1 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Contracts with sub-suppliers shall be established, and shall contain at least the following items:
— Products, functions and components to be included in the delivery.
— Relevant ISDS standard requirements.
— Criteria for acceptance of the acquired product.
— Proprietary, usage, ownership, warranty and licensing rights.
— Maintenance and support in the future.
If the development, updating or configuration of software dependent systems is sub-contracted, relevant parts of the
ISDS standard apply to the sub-supplier and shall be included in the contract.
Guidance note:
This standard focuses on the technical aspects of the selection process, not the financial aspects, which are out of
scope.
If a sub-supplier delivers non-COTS system(s) with custom software, the contractual agreements should clarify which
parts of the ISDS standard the sub-contractor should adhere to and be assessed towards, and which parts of the
standard the supplier will take care of.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Supplier agreement: product or component specifications, functional specifications, technical acceptance criteria, ownership transfer conditions, delivery strategy, provisions for review of intermediate deliveries.
Documents required for review:
None
Contributions: No required contributions.
B.CM.1 Establish baselines of requirements and design
Phase: Engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: Supplier.
Requirement definition: Baselines of the unit specifications, unit design, system requirements, system designs, base
products, and interface specifications shall be established before the information is used for further detailing of design,
implementation and verification.
Guidance note:
Changes to these baselines require review and approval by appropriate stakeholders, ref. X.CM.1.
The term ‘base products’ is here used to describe any kind of existing product, component, software library, software
template or similar on which the supplier bases the development (or automatic generation) of the custom specific
product.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Baseline repositories.
Identification of baselines.
Approved and controlled documents (baselines) for: unit
specifications, unit design, system requirements, system
design, interface specifications and base products.
Acceptable contributions from Owner.
Approval of requirements and design documents.
Documents required for review:
None
B.CM.2 Establish and implement configuration management
Phase: Engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: Supplier.
Requirement definition: The roles, responsibilities and mechanisms for managing changes and handling versions of all
software and documents included in the baselines shall be defined and implemented. The project specific details
regarding configuration items and responsibilities shall be documented.
These mechanisms shall apply until a Change Control Board (CCB) for commissioning is established in the acceptance
phase.
Guidance note:
The configuration management plan may be a part of the project plan for the organisation, see activity B.PM.1.
Separate configuration management mechanisms may be established for the unit and individual systems. The unit
level mechanism typically does not provide for a software repository, but this should be in place at the supplier level.
A typical way to manage changes to baselines is to let one decision-body (normally called a CCB) make an impact
analysis of proposed changes and then make a ‘yes’ or ‘no’ decision and communicate the approved changes to be
performed.
The version control rules should also encompass software/documents/information that has not been included in a
formal baseline. Tools for version control are often utilised.
This activity focuses on the configuration management during the engineering and construction phases in the project,
for the acceptance phase other mechanisms are typically needed, see D.CM.1.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Configuration management plan: Definition of a Change Control Board (CCB) process or similar, identification of required baselines, required baseline content, and change request forms.
Change requests and change decisions.
Version history information of baselines.
Defined rules and mechanisms for version control.
Effective implementation of version control mechanisms.
Documents required for review:
None
Acceptable contributions from Owner.
Approval of configuration management mechanisms and participation in the Change Control Board (CCB) for unit level
baselines.
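The guidance note for B.CM.2 describes the typical mechanism: a CCB performs an impact analysis of a proposed change against a baseline, makes a yes/no decision, and only approved changes are carried out. A hedged sketch of that decision step (baseline content, item names, and revision labels are all hypothetical):

```python
# Illustrative sketch of the CCB decision flow from B.CM.2. Only an approved
# change request produces a new baseline version; a rejected one leaves the
# baseline untouched. All identifiers below are invented for illustration.

baseline = {"version": 3, "items": {"ifc-spec": "rev B", "sys-design": "rev C"}}

def ccb_decide(baseline, change, approved: bool):
    """Apply an approved change as a new baseline version; reject otherwise."""
    if not approved:
        return baseline  # baseline unchanged; the decision is still recorded
    items = dict(baseline["items"])
    items.update(change)
    return {"version": baseline["version"] + 1, "items": items}

new = ccb_decide(baseline, {"ifc-spec": "rev C"}, approved=True)
```

Returning a fresh dictionary rather than mutating the old one mirrors the requirement that earlier baselines stay identifiable: the version history the assessment criteria ask for falls out of keeping every superseded baseline.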
B.DES.1 Design the system
Phase: Engineering. Confidence level: 1 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: The design for the system shall be developed and documented.
The design shall identify the major subsystems and components and how they interact.
All external interfaces shall be documented and coordinated with other relevant systems.
The design shall show the software components of the system and indicate which parts are developed for the specific
project and which parts are reused/configured.
Guidance note:
The definition of external interfaces is related to the coordination of the inter-system interfaces, see activity B.INT.2.
Some of the interface definition may be done during the design of software components, see B.DES.2.
The system design can normally not be finalized until the unit design has been refined in activity B.DES.4.
CL2 & CL3 systems are usually designed to be free of or resilient to single points of failure.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Design for system (hardware & software): functional
description, user interface descriptions, block/topology
diagrams with software components, external interface
descriptions and internal interface descriptions.
Acceptable contributions from System integrator.
Review the functional descriptions and system topology.
Documents required for review:
Interface description (FI),
Functional description (FI) and
Block (topology) diagram (FI) reviewed in B.IV.1 at CL3.
B.DES.2 Design each software component
Phase: Engineering. Confidence level: 2 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Design for each software component shall be documented. The design shall identify the major
parts of the software component, their functions and behaviour and describe the interactions and behaviour between these
parts and with the hardware. The design shall describe which hardware runs the software component and its parts. The
intent and assumptions of the designer shall be clearly visible. The known limitations of the design shall be listed. The
design shall show which parts are developed within the project and which parts are reused, e.g., standard software or
legacy software.
Guidance note:
This activity aims at detailing the software design which is a sub-set of the system design defined in B.DES.1.
The software component design normally contains information about the structure and behaviour of software
components, (e.g. state diagrams, sequence diagrams, class diagrams, etc.) along with a description of the
functionality of the component and internal interfaces.
The design may be documented using various recognised methods and means from models and databases to diagrams
and documents.
The design should not be confused with the actual software.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Component design for each software component, in sufficient detail so as to proceed to the making of the software: structural description, functional description, behaviour description, parameters (default, intervals, as-designed), interfaces description, allocation of software to hardware and assumptions and known limitations of the design.
Documents required for review:
Software design description (FI) reviewed in B.IV.1 at CL3.
Contributions: No required contributions.
B.DES.3 Use established design guidelines and methods
Phase: Engineering. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: Supplier.
Requirement definition: Appropriate guidelines and methods shall be used to ensure consistent design of the inter-system
behaviour, the systems and their components.
Guidance note:
In this context the term ‘guidelines' also includes methods, techniques, tools etc. Some guidelines may be mandatory
to follow and are referred to as ‘rules’. The definition and identification of applicable guidelines can be performed as
a part of the activities A.PQA.2 and B.PQA.1
The supplier typically uses the design guidelines as input to the design activities B.DES.1, B.DES.2 and B.RAMS.2.
The system integrator typically uses the design guidelines as input to the activities B.DES.4 and B.RAMS.2.
If applicable, the guidelines should also be selected according to the safety integrity level.
Established and well-known design guidelines, techniques and measures are commonly available; for example, IEEE
12207, IEEE 1016, IEC 61499, part 1 or IEC 61508, part 7.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
System design guidelines: including RAMS related aspects.
Unit design guidelines: including RAMS related aspects.
Documents required for review:
RAMS design guidelines and methods for the vessel (FI) used in B.IV.1 at CL3.
RAMS design guidelines and methods for the system (FI) used in B.IV.1 at CL3.
Contributions: No required contributions.
B.DES.4 Analyse and refine the unit design
Phase: Engineering. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: The system designs shall be analysed in conjunction with the unit architecture and operational
scenarios to identify critical behavioural interactions and complex data flows. Common solutions shall be established,
standard alarms and signals defined, common constraints identified, and rules for handling of shared data resources
defined. The unit architecture documentation shall be updated to reflect the results of the analysis.
Guidance note:
This activity usually cannot be completed until the suppliers and systems have been selected and contracts awarded,
because both documentation and participation from the suppliers are normally needed in order to establish the common solutions.
The system design from the suppliers is created in activity B.DES.1. The unit architecture/design is created in activity
A.DES.1. The operational scenarios are defined in activity A.REQ.3.
The unit architecture may contain a number of views:
— The physical layout describes the way components and networks are physically distributed around the facility,
including the way they are interconnected (network and main CPUs).
— The logical view of the function describes the function in terms of functionality towards its users.
— The scenarios or dynamic view describes the primary interaction between components when a function is executed,
and how the users interact with the functions.
— The process view of the function describes the processes and components that compose the function, as well as the
responsibility of individual components and processes, their interaction, triggers and cycle times.
— The physical view of the function (or allocation) describes the mapping of the main processes/software components
onto the hardware components.
— The development view of the function describes the breakdown of the function into sub-functions.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria: Updated unit design documentation:
unit design specifications, systems/network topology with
software components, interface specifications, and
functional descriptions.
Documents required for review:
Interface description (FI) reviewed in B.IV.1 at CL3,
Functional description (FI) reviewed in B.IV.1 at CL3,
Block (topology) diagram (FI):
— reviewed in B.IV.1 at CL3
— used in B.IV.2 at CL2 and CL3.
Acceptable contributions from Owner and Supplier.
Owner: Review the unit design documents.
Supplier: Give input to the system integrator in the form of system design information, topology, software structure, and
interfaces for the supplier’s system(s). Participate in review of the common solutions (unit design).
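The architecture views listed in the guidance note above lend themselves to a simple documentation checklist. The following Python sketch is illustrative only; the standard prescribes no tooling, and the view names are shorthand for the descriptions in the guidance note:

```python
# Illustrative sketch only: the standard does not prescribe any tooling.
# The six unit architecture views from the guidance note, tracked per function.
ARCHITECTURE_VIEWS = (
    "physical_layout",    # components/networks distributed around the facility
    "logical",            # functionality towards users
    "scenarios",          # dynamic interaction between components and users
    "process",            # processes/components, responsibilities, triggers, cycle times
    "physical_function",  # mapping of processes/software onto hardware
    "development",        # breakdown of the function into sub-functions
)

def missing_views(documented):
    """Return the architecture views not yet documented for a function."""
    return [view for view in ARCHITECTURE_VIEWS if view not in documented]

# Example: a function documented with only three of the six views.
views_done = {"physical_layout", "logical", "scenarios"}
print(missing_views(views_done))
```

Such a checklist can support the update of the unit architecture documentation required at the end of this activity.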
DET NORSKE VERITAS AS
Offshore Standard DNV-OS-D203, December 2012
App.B – Page 69
B.DES.5 Define obsolescence strategy
Phase: Engineering. Confidence level: 2 and above.
Unit level responsible: Owner. System level responsible: Supplier.
Requirement definition: Obsolescence risks shall be assessed, based on top level architecture and make/buy/reuse
analyses. The strategy/plan for the future management of obsolescence shall be defined.
Guidance note:
The owner is responsible for the overall obsolescence strategy while the supplier makes obsolescence statements for
the individual system(s).
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria: Obsolescence management plan:
Authorised vendor list, Spare parts list (hardware &
software), stock, alternate spare parts list, management of
intellectual property. Obsolescence criteria for software.
Manufacturer preferred equipment list.
Documents required for review:
None
Acceptable contributions from System integrator and Supplier.
System integrator: Review of obsolescence strategy/plan. Assessment of obsolescence of the computer and networking
hardware and software.
Supplier: Provide obsolescence statements regarding the systems and components, including software components.
B.INT.1 Define integration plan
Phase: Engineering. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: Supplier.
Requirement definition: Roles and responsibilities for planning and executing the integration of all parts of the systems
in ISDS scope shall be defined and documented. The plan for integration of the different systems into the unit shall be
defined and documented. The criteria, procedures, and the environment for the integration shall be described. The
integration plan shall also describe the type of tests that will be performed to ensure that systems and components interact
correctly.
As a minimum, the supplier shall define the integration readiness criteria for the components.
When required, the system integrator shall ask the supplier for a more comprehensive integration plan for integration of
components into a system.
Guidance note:
The integration plan should describe the steps to follow to assemble the systems and components of the system. The
integration plan should take into account the need for test stubs or simulators to perform integration tests (See C.VV.3
and D.VV.4).
The integration readiness criteria are going to be used in activity C.INT.1.
The output from this activity can be combined with the output from B.VV.1 “Define verification and validation
strategy”, in order to avoid duplication of information regarding test strategies.
The various roles and responsibilities regarding integration should be defined and documented at an early stage, while
the definition of integration sequences and environments may be defined at a later stage. This means that the
integration plan may need to be updated in the construction phase.
Special attention should be placed on identifying boundaries and exclusions of each organization’s responsibility.
A separate integration plan for a system is normally only required for complex systems, e.g. the drilling package.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Plan for integration of systems into the unit: the responsibilities of the different organizations, dependencies
among systems, sequence for integration, integration environment, tests and integration readiness criteria.
Plan for integration of sub-systems and components into systems (when required): dependencies among systems,
sub-systems and components, sequence for integration, integration environment, tests and integration readiness
criteria.
Documents required for review:
Integration plan (FI):
— reviewed in B.IV.2 at CL2 and CL3
— used in C.IV.1 at CL3.
Acceptable contributions from Supplier.
Provide information about the system dependencies, and other considerations for integrating their system(s) into the unit.
Acknowledge assigned responsibilities.
When requested, provide a plan for integration of sub-systems and components into a system.
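The integration readiness criteria required above can be evaluated mechanically once defined. A minimal Python sketch, assuming hypothetical criteria; the standard requires that such criteria exist but does not mandate their content:

```python
# Hypothetical readiness criteria for a component; the standard requires the
# supplier to define integration readiness criteria but leaves their content open.
def integration_ready(component):
    """True if the component meets all (illustrative) readiness criteria."""
    criteria = [
        component.get("module_tests_passed", False),
        component.get("release_note_issued", False),
        component.get("interfaces_frozen", False),
        component.get("known_blocking_defects", 1) == 0,
    ]
    return all(criteria)

candidate = {
    "module_tests_passed": True,
    "release_note_issued": True,
    "interfaces_frozen": True,
    "known_blocking_defects": 0,
}
print(integration_ready(candidate))  # True
```

These criteria are then applied in activity C.INT.1, as noted in the guidance note.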
B.INT.2 Coordinate inter-system interfaces
Phase: Engineering. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: An interface matrix or a similar mechanism shall be established to identify inter-system
interfaces and assign the responsibility for defining and testing them. The status of the interface definition and
verification shall be tracked.
Changes to the inter-system interfaces shall be controlled and coordinated.
Guidance note:
This activity is related to the specification of systems and their interaction (A.DES.1, A.REQ.3, and B.DES.4).
Controlling the changes to the inter-system interfaces may be performed by including the interface matrix and the
interface definitions in baselines (see B.CM.1).
Interface specifications are typically documented by the suppliers as a part of the design of the systems (B.DES.1)
and may be included in design documents or documented separately. Multiple interface specification documents may
be prepared or they may be packaged into a common document. Typically, the system integrator will manage the
baseline of inter-system interface specifications.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Interface overview/matrix information with assigned responsibilities.
Agreed inter-system interface specifications containing: protocol selected, definition of commands, messages, data
and alarms to be communicated and specifications of message formats.
Interface definition and verification status.
Documents required for review:
Interface description (FI) used in B.IV.2 at CL2 and CL3.
Acceptable contributions from Supplier.
Notify the system integrator when changes are needed to inter-system interfaces. Cooperate with other suppliers in the
definition of inter-system interfaces. Update inter-system interface specifications when required.
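The interface matrix called for in the requirement definition can be as simple as a table mapping each inter-system interface to a responsible party and its definition and verification status. A hedged Python sketch with invented interface and supplier names:

```python
# Illustrative interface matrix: each row assigns responsibility for defining
# and testing one inter-system interface, and tracks its status.
interface_matrix = [
    {"interface": "DP<->PMS",       "responsible": "Supplier A",
     "defined": True,  "verified": True},
    {"interface": "Drilling<->VMS", "responsible": "Supplier B",
     "defined": True,  "verified": False},
]

def open_interfaces(matrix):
    """Interfaces whose definition or verification is still outstanding."""
    return [row["interface"] for row in matrix
            if not (row["defined"] and row["verified"])]

print(open_interfaces(interface_matrix))  # ['Drilling<->VMS']
```

Tracking the open items in this way supports the required status tracking and the baselining of interface definitions (see B.CM.1).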
B.PM.1 Establish the project plan for each organisation
Phase: Engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: Supplier.
Requirement definition: A project plan shall be established and maintained, in coordination with other stakeholders, for
engineering and all succeeding phases.
The plan shall include at least schedules and resource allocation of software related activities and take into account the
activities for the different phases of the project.
Guidance note:
In practice, the plan may be detailed for the current phase and only roughly outlined for later phases; the details can
then be added when those phases are reached. As the acceptance phase is often a critical period for the project, it is
recommended that this phase be planned in detail to reduce project risks.
The plan for the phases should be established and maintained in coordination with other stakeholders, based on the
master plan for the project.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Schedule.
Project plan: WBS, technical attributes used for estimating, effort and costs estimates, deliverables and
milestones, configuration management plan.
Resource allocation.
Documents required for review:
None
Acceptable contributions from Owner.
Provide inputs to the project plan, including users’ schedules and constraints.
B.PM.2 Coordinate and integrate the project plans with the master plan
Phase: Engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: The master plan of the system should be refined and made consistent with suppliers’ or other
stakeholders’ plans and schedules.
Guidance note:
The initial master plan is provided in activity A.PM.1.
The integration plan (see B.INT.1) and the RAMS plan (see B.RAMS.4) should be taken into account and made
visible in the master and project plans. The verification and validation strategies (see B.VV.1) normally also give
inputs that impact the project plans.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Master plan.
Project plans.
Documents required for review:
None
Acceptable contributions from Owner and Supplier.
Coordinate plans and schedules.
B.PQA.1 Define procedures (supplier)
Phase: Engineering. Confidence level: 1 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Procedures to be used within the project shall be defined, coordinated, and agreed within and
between organisations participating in the project.
Roles, responsibilities and specific requirements as defined in this standard shall be explicitly addressed.
Guidance note:
The word ‘procedures’ in this context is used to represent all documentation regarding the way of working, e.g.
process descriptions, standard operating procedures, work instructions, checklists, guidelines etc.
Some of the supplier’s procedures should be coordinated with the procedures given by the owner (see A.PQA.1) and
the system integrator (see A.PQA.2).
The defined procedures are normally made up of quality management system documents: e.g. standard operating
procedures, process description, checklists, and document templates along with any project specific adaptations of these.
The following areas are normally expected to be covered:
— All activities required to be performed by the supplier, listed in Ch.2 Sec.7.
— Responsibilities and authorities of different disciplines and roles,
— Mechanisms for submitting and receiving information (documents) between different organisations,
— Mechanisms for defining baselines of information (documents),
— Mechanisms for handling of documents and information while they are work in progress,
— Mechanisms for review and approval of drawings and other documents,
— Mechanisms for approval of deliverables (verification mechanisms),
— Mechanisms for handling of changes to already agreed technical scope, schedule or costs,
— Mechanisms for escalation of problems (see C.PQA.1),
— Mechanisms for follow-up of process adherence (see X.PQA.3 and X.PQA.6),
— Mechanisms allowing management insight into the project’s status,
— Internal procedures and rules for the work to be carried out in the project, e.g. guidelines for design and
implementation (see B.DES.3 and C.IMP.4).
— Internal procedures and rules regarding how to document the requirements, design, implementation, verification
& validation, and acceptance of the systems within ISDS scope.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Documents required for review:
Assessment criteria:
A quality system, documents, minutes of meetings, or other relevant information showing: a defined way of working
for the major activities in the project, clear roles and responsibilities, and defined ways of interaction between
the different organizations (e.g. owner, system integrator, supplier, independent verifier, and others).
Documents required for review:
None
Contributions: No required contributions.
B.RAMS.1 Identify software-related RAMS risks and priorities
Phase: Engineering. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: Supplier.
Requirement definition: Software-related RAMS hazards and risks shall be identified and prioritised. Established
methods shall be used for the risk identification. Risks identified at unit and system level shall be shared between relevant
roles in the project.
Guidance note:
The RAMS risks are those risks related to the unit (and systems) in operation. At CL1 RAMS risks may be identified
along with the project risks. At higher confidence levels special methods should be used, such as: Fault Tree
Analysis, HAZID, HAZOP, FME(C)A.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
RAMS hazard and risk list showing consideration of software risks.
Defined risk identification and analysis methods.
Relevant risks are communicated to other roles.
Documents required for review:
RAMS risk register (FI) and RAMS risk analysis documentation (FI) reviewed in B.IV.3 at CL3.
Acceptable contributions from Owner.
Give input to, and participate in risk identification on unit level.
B.RAMS.2 Identify RAMS risk mitigation actions
Phase: Engineering. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: Supplier.
Requirement definition: Identified software-related hazards and risks shall be analysed to identify mitigation actions.
Relevant mitigation actions shall be shared between relevant roles in the project.
Guidance note:
This activity builds upon the output from activity B.RAMS.1.
Mitigation actions can be identified on all levels: the unit, system, subsystem, hardware components, and software
components. Typical risk mitigation actions include: changes to requirements or design, changed or added verification
activities, changes to operational procedures, changes to documentation etc.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
RAMS hazard and risk mitigation list showing mitigation actions for software risks.
Relevant mitigation actions are communicated to other roles.
Documents required for review:
RAMS risk register (FI):
— reviewed in B.IV.3 at CL3
— used in C.IV.3 and D.IV.3 at CL3.
Contributions: No required contributions.
B.RAMS.3 Consider software failure modes in safety analysis activities
Phase: Engineering. Confidence level: 1 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: When performing safety analysis, potential software failure modes shall be included in the analysis.
Guidance note:
Examples of safety analysis methods include, but are not limited to:
FME(C)A, HAZOP, HAZID, Fault Tree Analysis, Bowtie, Safety cases, etc.
Examples of software failure modes include, but are not limited to:
— No output or control action not provided when needed,
— Wrong output or wrong control action provided,
— Spurious output or control action provided when not needed,
— Late output or control action not provided on-time,
— Output or control action stops too soon.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Safety analysis showing consideration of software failure modes.
Documents required for review:
Safety assessment report (FI) used in C.IV.3 at CL3.
Contributions: No required contributions.
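The generic software failure modes listed in the guidance note map naturally onto rows of an FME(C)A worksheet. A Python sketch that enumerates them for one software function; the function name is invented, and effects and mitigations are left for the analyst:

```python
# The five generic software failure modes from the guidance note of B.RAMS.3.
SW_FAILURE_MODES = (
    "no output / control action not provided when needed",
    "wrong output / wrong control action provided",
    "spurious output / control action provided when not needed",
    "late output / control action not provided on time",
    "output / control action stops too soon",
)

def fmea_rows(function_name):
    """Skeleton FME(C)A rows for one software function; effects and
    mitigations are to be filled in by the analyst."""
    return [{"function": function_name, "failure_mode": mode,
             "effect": None, "mitigation": None}
            for mode in SW_FAILURE_MODES]

rows = fmea_rows("ballast pump control")  # function name is illustrative
print(len(rows))  # 5
```

Enumerating the modes systematically helps demonstrate that every software failure mode has been considered in the safety analysis.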
B.RAMS.4 Develop the RAMS plan for the system
Phase: Engineering. Confidence level: 2 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: RAMS activities shall be planned, including methods and tools to be used.
The RAMS plan for the system shall be derived from the RAMS plan for the unit.
Guidance note:
This activity builds on the output from activity A.RAMS.3.
The methods, tools and procedures used in all RAMS-related activities should be defined, documented and put under
configuration control. The RAMS plan should cover all relevant RAMS activities included in this standard. This
activity may be coordinated with the activities B.PM.1 and B.PQA.1 and the RAMS plan may be a separate document,
a part of the project plan, or in a quality assurance plan. Safety may be dealt with separately, for example in a
functional safety management plan.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Plan showing objectives, methods, tools, and procedures to be used, consistent with the RAMS plan for the unit.
Schedule of RAMS activities.
RAM data to be collected (CL3).
Documents required for review:
Plan for handling of RAMS (FI):
— reviewed in B.IV.4 at CL3
— used in C.IV.3 at CL3.
Acceptable contributions from System integrator.
Review the RAMS plan for the system to check consistency with the RAMS plan for the unit.
B.REQ.1 Submit proposals to system integrator with compliance status
Phase: Engineering. Confidence level: 1 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: The supplier shall submit its proposal in response to the system integrator’s request. The
proposal shall contain software lifecycle information and a compliance status towards the system requirements.
Guidance note:
The supplier should provide a breakdown of the systems conforming to the Make/Buy/Reuse analyses, and take care
to identify any required customisation or parameterisation of existing products, in particular of existing software
products.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Submitted technical proposal for the system: system breakdown, alternatives and options, description of
customisation or parameterisation of existing products (including software), requirements compliance matrix and
software lifecycle information (including licensing, ownership and obsolescence).
Documents required for review:
Specification (FI) used in B.IV.1 at CL3.
Contributions: No required contributions.
B.REQ.2 Refine system requirements into software component requirements
Phase: Engineering. Confidence level: 2 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: For each software component of the system, requirements shall be refined and allocated into
component requirements.
If the system is configured or generated from an existing base-product or components, there shall be explicit allocation
of requirements to the base-product part and the custom part.
Guidance note:
It should be possible to distinguish which requirements are satisfied by the base product as-is, which requirements can
be satisfied by a simple customisation and which requirements actually need specific developments.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Refined component requirements and specification.
Requirement allocation matrix.
Documents required for review:
Specifications (FI) used in B.IV.1 at CL3.
Contributions: No required contributions.
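The explicit allocation of requirements between base-product and custom parts can be recorded as a simple allocation matrix. A Python sketch using the three categories from the guidance note; the requirement IDs are invented:

```python
# Allocation categories from the guidance note: satisfied by the base product
# as-is, by simple customisation, or by specific development.
allocation = {
    "REQ-001": "base-product",
    "REQ-002": "customisation",
    "REQ-003": "specific-development",
    "REQ-004": "base-product",
}

def by_category(matrix):
    """Group requirement IDs by how they are to be satisfied."""
    groups = {}
    for req, cat in sorted(matrix.items()):
        groups.setdefault(cat, []).append(req)
    return groups

print(by_category(allocation))
```

Grouping the requirements this way makes it easy to see which parts of the system need specific development effort and verification.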
B.REQ.3 Detail operational scenarios
Phase: Engineering. Confidence level: 2 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: The key operational scenarios shall be developed and detailed to show the interaction between
the end user and the system(s), along with the interaction between different sub-systems or components in the system(s).
Guidance note:
Use case descriptions may be used in order to ‘drill down’ in operational scenarios and describe the interaction
between the end user and the system.
Use case descriptions should include both the normal sequence and relevant alternative and error sequences.
Performance targets should be defined for use cases.
Use case realisation diagrams (often in the form of sequence diagrams or collaboration diagrams) may be used to show
the interaction between different systems and components when performing specific use cases.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
System/component behaviour and interaction specification and descriptions: use cases, sequences (including signal
usage), state diagrams, interlocks, degraded sequences, performance targets and constraints and limitations.
Documents required for review:
Specifications (FI) used in B.IV.1 at CL3.
Contributions: No required contributions.
B.VV.1 Define verification and validation strategy
Phase: Engineering. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: Supplier.
Requirement definition: The verification and validation strategies shall be defined and documented.
The strategy shall include a list of verification and validation activities, the test environment, the purpose of each activity,
the method to be employed in the activity, the quality criteria (quality objectives) to be fulfilled, and the organisational
responsibility for conducting and recording the activity.
On the system level, the verification and validation strategies shall distinguish between base product software, configured
software, modified software, newly developed software, and defect corrections.
The verification strategy shall define the means to ensure the unit/system meets its requirements.
Guidance note:
The purpose of the verification and validation strategies is to ensure that testing and evaluation is performed
efficiently and effectively through a combination of complementary methods.
Relevant stakeholders, e.g. owner, system integrator and supplier should review and approve the verification and
validation strategy for the elements in their scope. The supplier's verification strategy should take into account the
software module test (C.IMP.3) in addition to all relevant VV activities described in this standard.
The system integrator should define the verification and validation strategy at the unit level, while the suppliers should
define their verification and validation strategy at the lower levels. Both should ensure their respective strategies are
consistent with each other, and that relevant verification and validation activities and requirements described in this
standard are covered.
The validation strategy should be consistent and complementary with the verification strategy, with as little
redundancy as possible between the two.
Care should be taken to identify the purpose of each test activity performed (for example, test for verification against
a requirement, or test for validation against a user scenario).
Detailed procedures and tools for testing may be established during the construction phase (See C.VV.3, C.VV.8 and
D.VV.1).
The information listed under the Assessment criteria may be a part of a project plan, a project quality plan or similar.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Verification strategy: which part to verify: unit, system, sub-system, component, module, design documents.
Method specification documents, etc.: which methods to use for this verification: testing, inspection, code analysis,
simulation, prototyping, peer review techniques, quality criteria and targets; which test types to use: functional,
performance, regression, user interface, negative; what environment to use for verification and identification of the
test stages (e.g. sea trials, integration tests, commissioning, FAT, internal testing, component testing) to be used
for the verification and the schedule for those tests.
Validation strategy: products to be validated, validation criteria, operational scenarios, methods and environments.
Documents required for review:
Verification and validation strategy (FI):
— reviewed in B.IV.2 at CL2 and CL3
— used in C.IV.1 at CL3, and C.IV.2 and D.IV.1 at CL2 and CL3.
Acceptable contributions from Owner.
Provide inputs based on operational scenarios, quality criteria, user needs etc.
Provide inputs regarding FAT, commissioning, integration, and quay or sea trials.
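The required system-level distinction between software classes can be captured as a mapping from each class to its planned verification methods. The methods shown below are hypothetical examples, not requirements of this standard:

```python
# Hypothetical mapping of software class to verification methods; the standard
# requires the strategy to distinguish these classes but leaves the concrete
# methods to the supplier's verification and validation strategy.
VERIFICATION_BY_CLASS = {
    "base product":      ["regression test"],
    "configured":        ["configuration review", "functional test"],
    "modified":          ["code review", "module test", "regression test"],
    "newly developed":   ["code review", "module test", "functional test"],
    "defect correction": ["code review", "regression test"],
}

def methods_for(sw_class):
    """Verification methods planned for a given software class."""
    return VERIFICATION_BY_CLASS[sw_class]

print(methods_for("modified"))
```

Making the mapping explicit helps the reviewer confirm that each software class receives an appropriate and non-redundant set of verification activities.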
B.VV.2 Review the design with respect to requirements and design rules
Phase: Engineering. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: Supplier.
Requirement definition: The design information shall be reviewed to verify the completeness of the design with respect
to requirements (e.g. functions, interfaces, performance, and RAMS properties), design rules and uncertainties (e.g.
technical risks, and design margins, design assumptions).
Guidance note:
In order to secure that all relevant aspects are covered, this activity normally encompasses one or more review
meetings. Both intra- and inter-disciplinary reviews are normally performed.
The review should include software considerations. The design rules are defined in activity B.DES.3.
Good traceability between different requirement levels eases this review; see A.REQ.4, B.REQ.2 and X.REQ.1.
This activity verifies the outcomes of all DES activities in phase B.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Documented design review records addressing: requirements verification, design rules and verification of
uncertainties.
Documents required for review:
None
Acceptable contributions from Owner.
Review the design from the operation and user point of view.
B.VV.3 Review consistency between design and operational scenarios
Phase: Engineering. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: Supplier.
Requirement definition: Consistency between functions and interfaces assigned to each system and the defined
operational scenarios shall be reviewed.
Assessment criteria:
Minutes from review: review results considering consistency of interface/function/component/scenarios.
Documents required for review:
None
Contributions: No required contributions.
B.VV.4 Review interface specifications
Phase: Engineering. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: Supplier.
Requirement definition: All inter-system interface specifications and intra-system interfaces shall be reviewed by
relevant stakeholders. The review of inter-system interfaces shall be coordinated by the system integrator.
Guidance note:
This activity verifies the outcomes of B.INT.2 and B.DES.1.
Interface specifications may be included in design documents or prepared separately.
Typical stakeholders for inter-system interfaces are all suppliers relying on the interface information.
Typical stakeholders for intra-system interfaces are different departments and disciplines at the supplier, relying on
the interface information.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Interface specification reviews addressing at least: consistency between input and output signals, frequency
and scan rates, deadlocks, propagation of failures from one part to another, engineering units, network domination.
Documents required for review:
None
Contributions: No required contributions.
B.VV.5 Validate critical or novel user-system interactions
Phase: Engineering. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: Within the specification constraints for the unit, critical or novel user-system interactions shall
be validated for usability and consistency.
Guidance note:
This validation may be performed by reviewing the different systems against each other to check for usability and
consistency.
The validation is usually done as one or several workshops with supplier and user representatives. A document review
alone is usually not sufficient.
Diverse means may be used such as simulation (from whiteboard to 3D modelling), demonstrations of similar existing
systems and their interactions, etc.
The definition of typical or classes of user system interactions may be used to select the interactions to be validated.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria: Validation records including:
workshop minutes, user representative’s participation and comments, and agreed action lists.
Documents required for review:
None
Acceptable contributions from Owner and Supplier.
Owner: provide inputs based on operational scenarios, quality criteria, user needs etc. The users comment on usability.
Supplier: provide information about system usage.
A 400 Activity definition construction
C.ACQ.1 Accept deliverables
Phase: Construction. Confidence level: 1 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Deliverables from the sub-suppliers shall be submitted for formal acceptance, using predefined
criteria and procedures. Acceptance tests shall at least verify that the delivered product is compliant with its specification.
In case of COTS acquisition, the acquired products shall be qualified for use in the ISDS scoped system to be delivered.
Guidance note:
The principles for the qualification mentioned in this activity are outlined in the guidance note of activity C.VV.7 “Qualify reused software”.
If software development or software configuration is performed by a sub-supplier, the sub-supplier will be
assessed against the relevant parts of this ISDS standard.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Component acceptance data: acceptance criteria, component acceptance (FAT, SAT) test procedures,
component acceptance test records, component acceptance issue and problems list and component acceptance
coverage measurements (requirements, structural).
Documents required for review:
None
Contributions: No required contributions.
C.ACQ.2 Ensure transition and integration of the delivered product
Phase: Construction. Confidence level: 1 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Documentation, examples, or support necessary to ensure the integrateability of the delivered
components and systems shall be planned and provided.
Guidance note:
This activity is an extension of C.ACQ.1 and addresses the delivery from the supplier to the system integrator.
The aim of this activity is to help the system integrator avoid extra integration work originating from the sub-suppliers' components.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Supplier agreement on: list of deliverables, review and
approval plans and support and maintenance agreement.
Product documentation.
Operation manual.
Configuration information.
Contributions: No required contributions.
Documents required for review:
None
DET NORSKE VERITAS AS
C.IMP.1 Develop and configure the software components from design
Phase: Construction. Confidence level: 1 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Software components shall be developed according to their design. Configuration and
parameterisation of software shall be considered part of the development.
Guidance note:
The software components should be developed from the design (and not the other way around).
Software development includes creation of new software components, modification of existing software components,
or parameterisation and configuration of new, modified and existing software components.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Documents required for review:
None
Assessment criteria:
Developed component release note.
Commented software source code.
Parameters and configuration files.
I/O List.
Development environment configuration.
Contributions: No required contributions.
C.IMP.2 Develop support documentation
Phase: Construction. Confidence level: 2 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Support documentation for the components and the whole system shall be developed and
checked against each other for consistency.
Guidance note:
Focus should be put on delivering this documentation early enough to be available for testing.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Documents required for review:
None
Assessment criteria:
System and component support documentation: data sheets, user manuals, administration manuals, operating and maintenance procedures, training material and FAQs, known defects and troubleshooting guides.
Review records for the support documentation.
Acceptable contributions from System integrator and Owner.
System integrator: provide guidelines and rules for consistency of support documentation; review support documentation.
Owner: review support documentation.
C.IMP.3 Perform software component testing
Phase: Construction. Confidence level: 1 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Software components shall be tested before verification or acceptance, according to the
verification strategy.
The extent (coverage) of the testing and the results shall be documented and reported.
Guidance note:
This particularly applies to testing of new, modified, configured or impacted software.
Custom-made function blocks should be tested, while standard libraries and standard function blocks are usually not explicitly
tested. This testing is usually performed by the software development team and is often also called ‘software unit test’.
The tests are ‘white box’, meaning that knowledge about the software’s internal structure is utilized to ensure that all
relevant parts of the code are covered by the tests.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Software test log: list of defects, date of test, tester, test
scope and pass or fail.
Software defect list.
Contributions: No required contributions.
Documents required for review:
None
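The ‘software unit test’ described above, together with the pass/fail test log called for by the assessment criteria, can be sketched as follows. The alarm-limit function, its thresholds, and the log fields are invented for illustration and are not prescribed by this standard:

```python
# Hypothetical component under test: classify a sensor reading against
# configured alarm limits (illustrative only; not from the standard).
def classify_reading(value, low_alarm, high_alarm):
    if value < low_alarm:
        return "LOW_ALARM"
    if value > high_alarm:
        return "HIGH_ALARM"
    return "NORMAL"

# White-box tests: cases are chosen with knowledge of the internal
# structure so that every statement (both branches and the
# fall-through) is executed.
def run_unit_tests():
    results = {
        "low":    classify_reading(1.0, low_alarm=2.0, high_alarm=8.0),
        "high":   classify_reading(9.0, low_alarm=2.0, high_alarm=8.0),
        "normal": classify_reading(5.0, low_alarm=2.0, high_alarm=8.0),
    }
    expected = {"low": "LOW_ALARM", "high": "HIGH_ALARM", "normal": "NORMAL"}
    # A minimal test-log entry: scope and pass/fail per case.
    return {case: ("pass" if results[case] == want else "fail")
            for case, want in expected.items()}

print(run_unit_tests())
```

In practice the log would also record the date of test, tester, and defect references, as listed in the assessment criteria.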
C.IMP.4 Use established software implementation guidelines and methods
Phase: Construction. Confidence level: 2 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Appropriate guidelines and methods shall be used to enhance RAMS of system and components.
Guidance note:
Guidelines typically address naming conventions, coding style, patterns to be used or avoided, etc. Some guidelines
may be mandatory to follow and are referred to as ‘rules’. IEC 61508-7 references a set of guidelines and methods.
Confidence level and safety level should be considered when selecting and defining guidelines.
Verification that the guidelines and methods have been followed is typically done in activity C.VV.5 “Perform code
analysis on new and modified software” and in the review activities C.VV.1 and C.VV.2.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Software guidelines/standards/rules/checklists/automated
checks.
Review records.
Contributions: No required contributions.
Documents required for review:
None
C.INT.1 Check readiness status of systems and components before integration
Phase: Construction. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: Supplier.
Requirement definition: Readiness of the systems and components for integration shall be checked before starting
integration. The criteria defined in the integration plan shall be used to assess readiness.
Guidance note:
The verification analysis report from C.VV.6 and the RAMS compliance report from C.RAMS.1 may be used as a
basis for the check.
See activity B.INT.1 for a description of the integration plan.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Integration readiness criteria fulfilled per component and
per system.
Contributions: No required contributions.
Documents required for review:
None
C.PQA.1 Establish procedures for problem resolution and maintenance activities in the construction
and acceptance phases
Phase: Construction. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: Procedures for receiving, recording, resolving and tracking problems and
modification requests shall be established. Maintenance procedures covering software updates, backup and rollback shall also be established.
Guidance note:
The procedures required here should be coordinated with the requirements in D.CM.1 “Manage software changes
during commissioning”.
Care should be taken to describe procedures for resolving joint problems at the interface of two or more systems
within the ISDS scope. From FAT onwards there should be a joint process for resolving problems.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Documents required for review:
None
Assessment criteria:
Agreed maintenance procedures: procedures for general system maintenance activities and procedures for software update, backup and roll-back.
Agreed problem resolution procedures: procedures for receiving, recording, resolving and tracking problems (punches) and modification requests.
Acceptable contributions from Supplier.
Provide inputs to the procedures for problem resolution and maintenance activities and participate in the problem
resolution process.
C.RAMS.1 Demonstrate achievement of system RAMS requirements
Phase: Construction. Confidence level: 2 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: RAMS arguments and evidence shall be assembled to demonstrate achievement of RAMS
requirements and objectives on the system level.
Guidance note:
The RAMS requirements and objectives are defined in activities A.RAMS.1 and A.RAMS.2.
The arguments and evidence may include: Safety analysis, FME(C)A, test cases, test results, inspections,
computations and simulations, certifications, qualification of legacy systems, etc.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
RAMS compliance analysis information.
Contributions: No required contributions.
Documents required for review:
RAMS compliance report (FI) reviewed in C.IV.3 at CL3.
C.RAMS.2 Evaluate software systems and software components against RAM objectives
Phase: Construction. Confidence level: 3.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Software systems and software components shall be specifically checked against RAM
requirements and objectives using data collected from internal testing, FAT and other verification activities.
Guidance note:
This activity is an extension of the activity C.RAMS.1.
The RAM data is typically statistical data collected and analysed using models such as Weibull, PDS (SINTEF
STF38A97434), etc.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
RAM report: Calculations of RAM values for designated
systems and RAM data.
Contributions: No required contributions.
Documents required for review:
RAM report (FI) reviewed in C.IV.3.
C.RAMS.3 Prepare a plan for system maintenance during operation
Phase: Construction. Confidence level: 1 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: A plan for the maintenance of the system during operation shall be defined, describing
maintenance related functions like restarts, backups and replacement of equipment/parts during operation.
Guidance note:
The output from this activity forms the basis for the execution of the maintenance in activity E.RAMS.1.
The finalised plan is normally needed in order to perform the activity D.CM.3 “Transfer responsibility for system
configuration management to owner”.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Documents required for review:
None
Assessment criteria:
Maintenance management plan: configuration items, rules for operation/maintenance, backup and restore procedures, expected maintenance activities, expected software update, migration and retirement activities, schedules and tailored procedures for maintenance in operation.
Acceptable contributions from Owner.
Provide inputs to the development of the maintenance plan.
C.VV.1 Perform peer-reviews of software
Phase: Construction. Confidence level: 1 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Peer reviews of new, modified and configured/parameterised software shall be performed. The
consistency of the software with its design documents shall be checked.
Guidance note:
In this context the term ‘software’ is used to represent all types of source code, graphical notations and models that are
readable by humans and used to generate machine-readable programs that can be executed by a computer.
Peer reviews complement testing as a means to detect defects as early as possible in the development cycle.
Peer reviews are often also used to verify that applicable guidelines and methods have been applied in the software,
see activity C.IMP.4.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Documents required for review:
Software peer review records (FI) used in C.IV.1 at CL3.
Assessment criteria:
Peer review methodology description.
Peer review schedule.
Peer review records.
Peer review check lists.
Contributions: No required contributions.
C.VV.2 Review software parameterisation data
Phase: Construction. Confidence level: 1 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: The software parameterisation shall be reviewed for completeness and correctness.
Guidance note:
Successful parameterisation of the software depends on the quality of the data, provided by the system integrator and
taken into account by the supplier, documenting the physical or dynamic properties of the unit and its various systems.
In practice, the supplier responsible and the system integrator usually review the quality of the parameterisation
together.
The input data for the parameterisation is typically an outcome of B.DES.1.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Parameter list review report: name, value, tolerance, function.
Documents required for review:
None
Acceptable contributions from System integrator.
Participate in the review of key parameters, checking for:
— completeness,
— minimum level of uncertainties,
— accurate description of the unit.
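A parameter-list review of the kind described above can be partly mechanised. The sketch below checks a parameter list for completeness and for values within tolerance; the record fields and parameter names are invented examples, not prescribed by the standard:

```python
# Minimal sketch of a parameter-list completeness/correctness check.
# Each record carries name, value, expected value and tolerance
# (field names are illustrative only).
def review_parameters(param_list, required_names):
    findings = []
    names = {p["name"] for p in param_list}
    # Completeness: every required parameter must be present.
    for missing in sorted(required_names - names):
        findings.append(f"missing parameter: {missing}")
    # Correctness: each value must lie within its stated tolerance.
    for p in param_list:
        if abs(p["value"] - p["expected"]) > p["tolerance"]:
            findings.append(f"out of tolerance: {p['name']}")
    return findings

params = [
    {"name": "draft_max_m", "value": 12.1, "expected": 12.0, "tolerance": 0.2},
    {"name": "thrust_limit_kN", "value": 950.0, "expected": 900.0, "tolerance": 10.0},
]
required = {"draft_max_m", "thrust_limit_kN", "displacement_t"}
print(review_parameters(params, required))
```

The findings list would feed the parameter list review report named in the assessment criteria.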
C.VV.3 Perform internal testing
Phase: Construction. Confidence level: 2 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Developed systems and components shall be verified before the decision to release for FAT. The
system and component verification shall be analysed and compared with expected quality attribute targets, e.g. RAMS
(qualitative).
Tests shall include:
— white box tests with 100% statement coverage of new and modified software
— interface testing covering I/O for both bus-interfaces and hardwired interfaces
— tests of relevant intra-system interface dynamics by defined scenarios
— tests of relevant inter-system interface dynamics by defined scenarios
— normal functionality
— error situations and corresponding alarms
— degraded functionality
— integration between hardware and software.
All requirements mapped to the system shall be tested.
Functions or properties that cannot be tested before FAT or even in FAT shall be clearly identified.
Guidance note:
The ambition levels for these tests should be clearly described in the verification strategy (see B.VV.1), and based on
the minimum assessment criteria described here. Details regarding the mapping of requirements to tests are covered
in the activity X.REQ.1. The software component test described in activity C.IMP.3 can be used as a means of white
box testing. The other aspects of the internal testing are normally performed after the software component test and are
sometimes referred to as an ‘internal acceptance test’. The limitations of the internal test environment and the use of
tools like e.g. test-drivers, simulators and emulators should be clearly described.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Test procedures.
Test reports.
Documents required for review:
None
Acceptable contributions from System integrator.
Review the internal tests results and analyse impact on the verification strategy.
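The requirement that all requirements mapped to the system shall be tested can be checked mechanically against the traceability information maintained in X.REQ.1. A sketch, with invented requirement and test identifiers:

```python
def untested_requirements(requirement_ids, test_index):
    """Return requirement IDs not covered by any test.

    test_index maps each test case to the requirement IDs it verifies
    (the traceability information maintained per X.REQ.1).
    """
    covered = set()
    for req_ids in test_index.values():
        covered.update(req_ids)
    return sorted(set(requirement_ids) - covered)

reqs = ["SYS-001", "SYS-002", "SYS-003"]
tests = {"FAT-010": ["SYS-001"], "INT-020": ["SYS-001", "SYS-003"]}
print(untested_requirements(reqs, tests))  # SYS-002 is not covered by any test
```

Requirements returned by such a check would either get tests added or be listed among the functions that cannot be tested before FAT.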
C.VV.4 Perform high integrity internal testing
Phase: Construction. Confidence level: 3.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Internal high integrity tests shall be performed.
Tests shall include:
— white box tests with 100% decision coverage of new and modified software
— statistical testing
— detailed performance testing
— stress testing
— FME(C)A based testing
— security related testing
— start, restart, shutdown testing.
Functions or properties that cannot be tested before FAT or even in FAT shall be clearly identified.
Guidance note:
This activity is an extension of the internal test described in C.VV.3 taking into account the additional quantitative
RAMS requirements on CL3. The ambition levels for these tests should be clearly described in the verification
strategy, and based on the minimum assessment criteria described here.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Test procedures.
Test reports.
Documents required for review:
Test procedure at manufacturer (FI) and Test report at manufacturer (FI) used in C.IV.1 at CL3.
Acceptable contributions from System integrator.
Review the internal tests results and analyse impact on the verification strategy.
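The difference between the 100% statement coverage required at lower confidence levels (C.VV.3) and the 100% decision coverage required here can be illustrated with a short sketch; the limiter function is an invented example, not from the standard:

```python
# A branch with no 'else': one test can execute every statement without
# exercising both outcomes of the decision.
def limit(value, max_value):
    if value > max_value:      # the decision
        value = max_value      # the only branch-local statement
    return value

# 100% statement coverage: this single case executes every statement,
# but only the True outcome of the decision.
assert limit(12, 10) == 10
# 100% decision coverage additionally requires the False outcome:
assert limit(7, 10) == 7
print("statement and decision coverage cases pass")
```

Decision coverage therefore generally requires more test cases than statement coverage, which is consistent with the stricter quantitative ambitions at CL3.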
C.VV.5 Perform code analysis on new and modified software
Phase: Construction. Confidence level: 2 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Adequate software source code verification shall be performed against a predefined set of rules.
Code analysis shall be performed on new and modified software.
Guidance note:
For example, the source code verification may be performed by peer reviews, static analysis, or complementary
specific analyses such as schedulability analysis. Semi-automated or manual methods may be used. See IEC 61508
for other recommended means. See ISO/IEC 9126 for internal quality characteristics.
If this activity is performed by peer reviews, it can be combined with C.VV.1 provided the software is checked against
both design and rule set.
The code rule set depends on the type of programming language (e.g. a strongly typed language) and on the application
(e.g. PLCs or standard software), and can be combined with the rules mentioned in activity C.IMP.4.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Software code verification: peer review reports, code
analysis reports and code rule set.
Contributions: No required contributions.
Documents required for review:
Software code analysis record (FI) used in C.IV.1 at CL3.
C.VV.6 Analyse verification results with respect to targets
Phase: Construction. Confidence level: 2 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: The results of all verification activities performed by the supplier shall be evaluated and
compared with the verification targets as defined in the verification strategy.
Defects (punches) shall be tracked and analysed. Criteria for classification of defects shall be established. Defects shall
be analysed for trends to identify defect prone software.
The quality status of the system under development shall be measured.
Guidance note:
The purpose of this activity is to check if the system is fit for purpose and to identify actions to improve defect prone
software in a fast and effective manner. The targets are defined in activity B.VV.1.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Verification result evaluation: result analyses, punch lists, action lists, defect correction and focus on defect-prone software.
Documents required for review:
Verification analysis report (FI) reviewed in C.IV.1 at CL3.
Acceptable contributions from System integrator.
Review the verification analysis information.
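The trend analysis used to identify defect-prone software can be as simple as counting defects (punches) per component against a target from the verification strategy. A sketch, with an invented threshold and component names:

```python
from collections import Counter

# Sketch of a defect-trend check: count defects per component and flag
# those at or above a threshold defined in the verification strategy
# (the threshold value here is an invented example).
def defect_prone(defect_log, threshold=3):
    counts = Counter(entry["component"] for entry in defect_log)
    return sorted(comp for comp, n in counts.items() if n >= threshold)

log = [
    {"id": "P-101", "component": "ballast_ctrl"},
    {"id": "P-102", "component": "ballast_ctrl"},
    {"id": "P-103", "component": "ballast_ctrl"},
    {"id": "P-104", "component": "hmi"},
]
print(defect_prone(log))  # only ballast_ctrl reaches the example threshold
```

Components flagged this way would be the focus of the corrective actions required by this activity.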
C.VV.7 Qualify reused software
Phase: Construction. Confidence level: 1 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Standard software and non-standard legacy software shall be qualified for use in the project. The
qualification shall be performed by either a) applying the ISDS standard, b) assessing the quality through due diligence
of the software or c) by demonstrating that the software is proven-in-use.
Modified reused software (modified source code) shall be treated as new software.
Guidance note:
Qualification of software can be done using the following methods:
a) The process for developing the software is compliant to this ISDS standard.
b) The due diligence should evaluate the component against predefined quality criteria. It should review relevant
documentation like existing qualification certificates (e.g., type approvals), in-use feedback, requirements
specifications, design descriptions, source code, test reports, release notes, and user manuals and other support
documentation. When needed, critical functionality, performance, and other characteristics should be tested in a
test environment similar to the target environment with the configuration data used in the systems in the ISDS
scope.
c) In order to demonstrate that the software is proven-in-use, arguments based on analysis of experiences from
previous systems where the component has been used should be provided. The analysis should compare how the
component has been used, e.g., how it has been configured, with how it will be used in the current system.
Differences should be documented and it should be demonstrated, e.g., by testing or analysis, that the differences
do not imply any unacceptable risks. The proven-in-use analysis should investigate failure data from previous
systems that have been operated in a controlled way, e.g., all errors and software changes must have been recorded
(the requirements in this standard for the operation phase apply).
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Software qualification report: reused software component
list, qualification method for each reused software
component and qualification data.
Contributions: No required contributions.
Documents required for review:
None
C.VV.8 Perform Factory Acceptance Tests (FAT)
Phase: Construction. Confidence level: 1 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Factory acceptance tests (FAT) shall be performed to ensure the system is compliant with its
requirements and acceptable to the owner.
The FAT shall include:
— tests to ensure the system is compliant with selected operational requirements, including normal modes, error
situations, and degraded modes.
— tests covering the hardware/software integration.
— tests of relevant inter-system interface dynamics by defined scenarios.
— tests of relevant intra-system interface dynamics by defined scenarios.
Functions or properties that cannot be tested in the FAT shall be clearly identified.
Guidance note:
The purpose of the FAT is for the system integrator and owner to accept the system. Full coverage of requirements is
not required as long as the owner and the system integrator are satisfied. The demonstration of previously run tests is
typically performed by providing test reports from internal tests.
The limitations of the FAT environment and the use of tools like e.g. test-drivers, simulators and emulators should be
clearly described.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Documents required for review:
System FAT procedure (FI) and System FAT report (FI):
— reviewed in C.IV.2 at CL2 and CL3
— used in C.IV.1 at CL3.
Assessment criteria:
System FAT procedure: coverage of requirements, functionality, performance, RAMS (when applicable), integration testing, hardware/software integration, interfaces and degraded modes.
System FAT report: consistent with procedure, deviations identified and coverage measured.
Acceptable contributions from Owner and System integrator.
System integrator: approve the FAT test procedures before the start of the FAT.
Both: review, analyse and approve the tests results.
Both: accept or reject the system as necessary, based on objective criteria.
C.VV.9 Arrange independent testing
Phase: Construction. Confidence level: 3.
Unit level responsible: None. System level responsible: System integrator.
Requirement definition: The system and its interfaces shall be tested by an independent party in its representative
environment.
The environment shall be documented, including its assumptions and limitations.
Guidance note:
A representative environment is the actual unit or an integration of relevant systems creating a test situation similar
to the final operational environment. HIL testing may be one method to be used for independent testing (see SfC 2.24).
The system integrator is responsible for arranging the independent testing, but the independent party performing the
testing is normally not the system integrator or the supplier.
The term ‘system’ in this context refers to a system on CL3 including all interfaces to other systems regardless of their
confidence level.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Test procedure: covering the system and its interfaces.
Test report.
Documents required for review:
Independent test procedure (FI) and Independent test report
(FI) reviewed in D.IV.1 at CL3.
Independent test report (FI) used in D.IV.2 at CL3.
Acceptable contributions from Supplier.
Contribute to the preparation of the independent tests.
A 500 Activity definition acceptance
D.CM.1 Manage software changes during commissioning
Phase: Acceptance. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: Any change in a system or sub-system/component shall be submitted to a change control board
which analyses impact and makes a decision regarding the change.
The roles, responsibilities and mechanisms for the change control board shall be defined. These shall ensure that the
deployed configuration of the systems that have passed FAT cannot change outside of a strict change control procedure.
Guidance note:
The system integrator should coordinate changes across all systems. This activity should be related to the activity
C.PQA.1 “Establish procedures for problem resolution and maintenance activities in the construction and acceptance
phases”, and also to X.REQ.1 “Maintain requirements traceability information”.
The system integrator should consult with the suppliers about the design of the configuration management system.
The configuration management system may be described in the configuration management plan.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Documents required for review:
None
Assessment criteria:
Defined software configuration management: definition of Change Control Board (CCB), change request forms, description of the change process for software, impact analysis, identification of items to be controlled, and a configuration management tool (issue, change, version and configuration tracking) that prevents unauthorised changes.
Modification records justifying changes:
configuration records,
version histories,
release notes,
change orders.
Acceptable contributions from Owner and Supplier.
Approval of software and document change requests.
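The change-control gate described above, where the deployed configuration of a system that has passed FAT can only change through a recorded CCB decision, can be sketched as follows. The record fields and version identifiers are illustrative, not prescribed by the standard:

```python
# Sketch of a change-control gate for systems that have passed FAT:
# a change may only enter the baseline with a recorded CCB approval.
def apply_change(baseline, change_request):
    if change_request.get("ccb_decision") != "approved":
        raise PermissionError(
            f"change {change_request['id']} rejected or not yet decided")
    updated = dict(baseline)  # never mutate the controlled baseline in place
    updated[change_request["item"]] = change_request["new_version"]
    return updated

baseline = {"ballast_ctrl_sw": "2.1.0"}
cr = {"id": "CR-17", "item": "ballast_ctrl_sw", "new_version": "2.1.1",
      "ccb_decision": "approved",
      "impact_analysis": "patch; no interface change"}
print(apply_change(baseline, cr))
```

A request without an approval decision raises an error instead of silently changing the configuration, which is the behaviour the requirement definition asks the mechanisms to ensure.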
D.CM.2 Establish a release note for the systems in ISDS scope
Phase: Acceptance. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: Before entering into operation, the configuration of the systems in the ISDS scope shall be
documented. A release note shall be produced, describing all the items and their versions, as well as their status (e.g.
known defects).
In the case of changes, differences with respect to previous version shall be documented in the release note.
Guidance note:
The overall release note can link to each system's release note. The system integrator will base the release note on
inputs from the suppliers, see X.CM.2.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Overall release note for the systems in ISDS scope.
Contributions: No required contributions.
Documents required for review:
None
D.CM.3 Transfer responsibility for system configuration management to owner
Phase: Acceptance. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: After acceptance, the owner shall take the responsibility for the configuration management of
the systems within ISDS scope. The system integrator shall deliver all necessary software, documentation, and data to
the owner.
Guidance note:
The owner may adopt the configuration management mechanisms already defined or modify them appropriately.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Documents required for review:
None
Assessment criteria:
Approved configuration management plan.
Records of transmission of software, documentation and data, or responsibility thereof.
Acceptable contributions from Owner and Supplier.
Owner: Approval of configuration management system.
Supplier: Supply the system integrator with relevant software, documentation and data.
D.RAMS.1 Demonstrate achievement of unit RAMS requirements
Phase: Acceptance. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: RAMS arguments and evidence shall be assembled to demonstrate achievement of RAMS
requirements and objectives on the unit level.
Guidance note:
The RAMS requirements and objectives are defined in activities A.RAMS.1 and A.RAMS.2.
The arguments and evidence may include: Safety analysis, FME(C)A, test cases, inspections, computations and
simulations, certifications etc. The suppliers normally provide information gathered in the activity C.RAMS.1.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
RAMS compliance analysis information.
Acceptable contributions from Supplier.
Provide RAMS arguments and evidence for the system.
Documents required for review:
RAMS compliance report (FI) reviewed in D.IV.3 at CL3.
D.RAMS.2 Collect data and calculate RAM values
Phase: Acceptance. Confidence level: 3.
Unit level responsible: System integrator. System level responsible: System integrator.
Requirement definition: Data from commissioning and integration testing shall be collected and used together with data
from earlier verification activities to estimate reliability, availability and maintainability values relative to the RAM
objectives.
Guidance note:
This activity is an extension of D.RAMS.1.
Several methods for estimating reliability are available, for example: PDS (SINTEF STF38A97434).
Reliability and maintainability data can be used to calculate availability.
Maintainability for software may distinguish between activities such as restoration of production (typically a fast activity)
and defect correction (typically a longer one).
The information supplied by the suppliers normally comes from the activity C.RAMS.2.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Calculations of RAM values for relevant systems and the
unit.
RAM data.
Acceptable contributions from Supplier.
Provide RAM data and evaluations for the system.
Documents required for review:
RAM report (FI) unit reviewed in D.IV.3.
RAM report (FI) system used in D.IV.3.
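The estimation of RAM values from collected failure data can be sketched with the basic constant-failure-rate relationships; the model and all numbers below are illustrative only, and methods such as PDS use considerably more detailed models:

```python
import math

# Hedged sketch of basic RAM figures from collected operating data.
def ram_values(operating_hours, n_failures, mean_repair_hours):
    mtbf = operating_hours / n_failures             # mean time between failures
    availability = mtbf / (mtbf + mean_repair_hours)
    # Reliability over a 1000 h mission time, assuming exponentially
    # distributed times to failure (constant failure rate):
    reliability_1000h = math.exp(-1000.0 / mtbf)
    return {"MTBF_h": mtbf,
            "availability": round(availability, 5),
            "R(1000 h)": round(reliability_1000h, 4)}

print(ram_values(operating_hours=20000.0, n_failures=4, mean_repair_hours=8.0))
```

The computed figures would then be compared with the RAM objectives from the requirements phase, as this activity requires.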
D.RAMS.3 Perform a security audit on the deployed systems
Phase: Acceptance. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: System integrator.
Requirement definition: A security audit shall be performed on the relevant systems to verify that the security related
requirements are fulfilled.
Guidance note:
DNV-OS-D202 contains some security related requirements, and the owner may request additional requirements
to be met. The security requirements are normally documented together with the other unit and system level
requirements, see activity A.REQ.2. A number of different standards are available and one of them may be used as a
basis for the security audit, e.g. ISA 99, BS 5750, and IEC 62443.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Security audit records.
Contributions: No required contributions.
Documents required for review:
Security audit report (FI) used in D.IV.3 at CL3.
D.VV.1 Perform validation testing
Phase: Acceptance. Confidence level: 1 and above.
Unit level responsible: Owner. System level responsible: Owner.
Requirement definition: The validation procedure shall be established according to the validation strategy.
The testing steps for validation shall be identified and clearly separated from the parameterisation and calibration steps.
They shall take software into consideration.
Guidance note:
Validation testing is usually broken down and may be performed partly during commissioning, integration testing,
and quay or sea trials.
Parameterisation and calibration (of the software) should be performed prior to commissioning. A pre-commissioning
step is often suitable.
In some cases the validation tests also include tests that have not been possible to run during internal tests at the
supplier (C.VV.3) or at the FAT (C.VV.8).
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Documents required for review:
Test procedure for quay and sea trials (FI) and Report from quay and sea trials (FI) reviewed in D.IV.1 at CL2 and CL3.
Report from quay and sea trials (FI) used in D.IV.2 at CL3.
Assessment criteria:
Test procedure: black box tests, boundary tests, software behaviour, and parameterisation and calibration.
Test reports: executed consistent with procedure.
Test issue list: deviations (punches) and variations.
Acceptable contributions from System integrator and Supplier.
Both: provide input to test procedures and test data.
D.VV.2 Perform validation with operational scenarios
Phase: Acceptance. Confidence level: 2 and above.
Unit level responsible: Owner. System level responsible: Owner.
Requirement definition: Operational scenarios shall be demonstrated on the systems in the ISDS scope in an environment
representative of the target environment.
Guidance note:
Qualified simulators, or the unit itself in integration testing, quay or sea trials qualify as representative of the target
environment.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Test procedure: operational scenarios.
Test reports: tests performed in compliance with procedure and coverage of scenarios.
Documents required for review:
Test procedure (FI) and Test report (FI) reviewed in D.IV.1 at CL2 and CL3.
Test report (FI) used in D.IV.2 at CL3.
Acceptable contributions from System integrator and Supplier.
Both: provide input to test procedures and test data.
DET NORSKE VERITAS AS
Offshore Standard DNV-OS-D203, December 2012
App.B – Page 89
D.VV.3 Analyse validation results with respect to targets
Phase: Acceptance. Confidence level: 2 and above.
Unit level responsible: Owner. System level responsible: Owner.
Requirement definition: The validation results and their analyses, including the evaluation of the validation method, the
defects identified, and the comparison between the expected results and the actual results, shall be recorded.
Quality criteria shall be evaluated and compared with quality targets.
Guidance note:
Decision for authorising the system or unit to go into operation should be made on the basis of the validation results
from commissioning, integration testing and quay and sea trials.
The quality targets are defined in activity B.VV.1.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Test procedure: quality criteria.
Test reports: analysis of the results.
Test issue list.
Contributions: No required contributions.
Documents required for review:
Verification analysis report (FI) reviewed in D.IV.2 at CL3.
D.VV.4 Perform systems integration tests
Phase: Acceptance. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: Integration tests shall be performed to ensure that inter-system functionality is working as
expected and that the interfaces between systems are compliant with the requirements. Scenarios involving interfaces
between the different parts shall be included in the test cases.
Guidance note:
The interaction between the different systems should be tested, not only the match between output signals and input
signals.
Both the nominal behaviour and the degraded behaviour should be tested.
To test the complete function the following testing methods may be applied:
— black-box testing,
— boundary testing.
Black-box testing is normally guided:
a) by the structure of the requirements, use case, operational scenarios or results from simulations,
b) by the structure of the input data,
c) by the risks,
d) randomly.
For more information on black-box testing and boundary testing (or analysis), see IEC 61508-7.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Integration test procedures covering system interfaces and
inter-system functionality.
Integration test reports.
Documents required for review:
Test procedure (FI) and Test report (FI) reviewed in
D.IV.1 at CL2 and CL3.
Test report (FI) used in D.IV.2 at CL3.
Acceptable contributions from Supplier.
Contribute with inputs from FAT (activity C.VV.8).
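The black-box and boundary testing techniques referenced in the guidance note can be sketched minimally. The `valid_setpoint` function and its limits below are hypothetical, not taken from the standard; the pattern of exercising values at, just inside, and just outside each boundary is the general technique described in IEC 61508-7:

```python
# Hypothetical range check, as might guard a control-system setpoint.
# The limits (0-100) and step size are invented for illustration only.
def valid_setpoint(value: float, low: float = 0.0, high: float = 100.0) -> bool:
    """Accept a setpoint only if it lies within the configured limits."""
    return low <= value <= high

def boundary_cases(low: float, high: float, step: float = 1.0):
    """Test values at, just inside, and just outside each boundary."""
    return [low - step, low, low + step, high - step, high, high + step]

# Boundary test: pair each generated value with its expected verdict,
# treating the function purely as a black box (inputs and outputs only).
expected = [False, True, True, True, True, False]
actual = [valid_setpoint(v) for v in boundary_cases(0.0, 100.0)]
assert actual == expected
```

The same value-selection pattern applies regardless of what the function under test does internally, which is what makes it a black-box method.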
A 600 Activity definition operation
E.ACQ.1 Manage and monitor obsolescence
Phase: Operation. Confidence level: 2 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: The acquired components and systems shall be monitored so that pro-active actions can be
made before parts become obsolete.
The responsibilities for monitoring obsolescence and taking action when needed shall be clearly defined within the
organisation.
Assessment criteria:
Obsolescence strategy document.
Obsolescence management plan: Authorised vendor list, Spare parts list (HW & compatible SW), Alternate spare parts list and Management of intellectual property.
Documents required for review:
None
Contributions: No required contributions.
E.CM.1 Manage change requests during operation
Phase: Operation. Confidence level: 1 and above.
Unit level responsible: Owner. System level responsible: Supplier.
Requirement definition: Change requests shall be systematically handled. Potential changes shall be analysed to assess
their impact on operation, as well as effects on other systems. A Change Control Board (CCB) shall consider the impact
analysis before approving proposed changes.
Assessment criteria:
Change requests.
Impact analysis.
Change orders.
Work orders.
Problem reports.
Release notes.
Maintenance logs.
Contributions: No required contributions.
Documents required for review:
None
E.CM.2 Perform configuration audits
Phase: Operation. Confidence level: 1 and above.
Unit level responsible: Owner. System level responsible: None.
Requirement definition: There shall be regular configuration audits to verify the integrity of the configuration in
operation. Configuration audits shall confirm 1) that the operational software versions match supplier records and
onboard back-ups, 2) that documentation versions match the operational software versions, and 3) that the
configuration management plan is being followed.
Assessment criteria:
Configuration audit reports.
Documents required for review:
None
Acceptable contributions from Supplier.
Identification of software versions as installed from supplier.
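A configuration audit of the kind required above reduces to a three-way comparison of version records: supplier records, the software actually in operation, and the on-board back-ups. A minimal sketch; the system names and version strings are invented:

```python
# Hypothetical version records; in practice these come from supplier
# records, the running systems, and the on-board back-up media.
supplier_records = {"PMS": "4.2.1", "DP control": "7.0.3"}
installed = {"PMS": "4.2.1", "DP control": "7.0.2"}
backups = {"PMS": "4.2.1", "DP control": "7.0.3"}

def audit(supplier, installed, backups):
    """Report systems whose installed or backed-up version disagrees
    with the supplier record."""
    findings = []
    for system, ref in supplier.items():
        if installed.get(system) != ref or backups.get(system) != ref:
            findings.append((system, ref, installed.get(system), backups.get(system)))
    return findings

# One non-conformity: the installed DP control software lags the records.
assert audit(supplier_records, installed, backups) == [
    ("DP control", "7.0.3", "7.0.2", "7.0.3"),
]
```

The findings list would feed the configuration audit report named in the assessment criteria.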
E.PQA.1 Define procedures for problem resolution, change handling, and maintenance activities
Phase: Operation. Confidence level: 1 and above.
Unit level responsible: Owner. System level responsible: None.
Requirement definition: The owner shall establish procedures for receiving, recording, resolving, and tracking problems,
modification requests, and maintenance activities. Software update, migration, retirement, backup and restore
procedures shall also be included.
Guidance note:
The problem resolution should be based on a systematic problem solving method (e.g. identified in the unit’s
International Safety Management (ISM) procedures) to investigate and provide a detailed response (including
corrective action) to the problem(s) identified.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Configuration management plan.
Configuration management procedure: migration issues and software obsolescence (ref E.ACQ.1).
Maintenance procedures: procedures for the maintenance, software update, migration and retirement, backup and restore procedures and procedures for receiving, recording, resolving, tracking problems and modification requests.
Change management procedure.
Issue tracking and resolution procedure.
Documents required for review:
None
Acceptable contributions from Supplier.
Provide inputs regarding the supplier’s system(s) to the problem resolution and change handling process.
E.RAMS.1 Maintain and execute the plan for maintenance in operation
Phase: Operation. Confidence level: 1 and above.
Unit level responsible: Owner. System level responsible: Owner.
Requirement definition: The plan for the maintenance in operation of the unit and systems shall be maintained and
executed, identifying the configuration items, the rules for maintenance in operation (e.g. access control and logging of
maintenance), and the expected activities for maintenance and security patches.
Configuration audits, security audits, obsolescence of hardware and software, migration and software retirement issues
shall also be addressed in this plan.
Guidance note:
The maintenance plans are based on the maintenance related input from the suppliers in activity C.RAMS.3.
Some elements of this plan may be addressed in the configuration management system.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Maintenance plan: configuration items, audit activities, maintenance activities, expected software update, migration and retirement activities, maintenance intervals and tailored procedures for the maintenance in operation.
Malicious software scan log records.
Maintenance logs.
Documents required for review:
None
Acceptable contributions from Supplier.
Provide input on maintenance and obsolescence regarding the supplier’s system(s).
E.RAMS.2 Collect RAMS data
Phase: Operation. Confidence level: 2 and above.
Unit level responsible: Owner. System level responsible: Owner.
Requirement definition: A system to collect RAMS data from the unit in operation shall be established.
Data shall be forwarded to the relevant suppliers.
Guidance note:
RAMS data typically include:
Failures and incidents, downtime, uptime, number of demands, time to repair, etc.
Even small defects and incidents should be reported, particularly if repeated.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
RAMS data collection system.
RAMS data collected.
Contributions: No required contributions.
Documents required for review:
None
E.RAMS.3 Analyse RAMS data and address discrepancies
Phase: Operation. Confidence level: 2 and above.
Unit level responsible: Owner. System level responsible: Supplier.
Requirement definition: RAMS data shall be analysed in order to assess and improve the RAMS performance.
For the software, the analysis shall be broken down to the level of software components in order to identify the
components that are candidates for improvement.
Discrepancies between RAMS requirements/objectives and the actual RAMS results shall be addressed.
Guidance note:
RAMS analyses typically result in figures for MTBF, MTTR, etc.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
RAMS analysis.
Contributions: No required contributions.
Documents required for review:
None
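The MTBF and MTTR figures mentioned in the guidance note follow directly from the collected failure records. A minimal sketch, assuming each failure event is logged with the preceding operating time and the repair duration (all numbers invented):

```python
# Hypothetical RAMS records: (operating hours before failure, repair hours).
failure_log = [(1200.0, 4.0), (800.0, 2.0), (1000.0, 6.0)]

def mtbf(log):
    """Mean Time Between Failures: mean operating time between failure events."""
    return sum(up for up, _ in log) / len(log)

def mttr(log):
    """Mean Time To Repair: mean repair duration per failure."""
    return sum(rep for _, rep in log) / len(log)

assert mtbf(failure_log) == 1000.0  # (1200 + 800 + 1000) / 3
assert mttr(failure_log) == 4.0     # (4 + 2 + 6) / 3
```

Breaking the log down per software component, as the requirement asks, is then a matter of keeping one such log per component.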
E.RAMS.4 Perform RAMS impact analysis of changes
Phase: Operation. Confidence level: 2 and above.
Unit level responsible: Owner. System level responsible: Supplier.
Requirement definition: Before changes are made to the system in operation, the impacts on RAMS properties shall be
analysed. For major changes or changes with major RAMS impact, the owner shall inform DNV before the change is
made.
Guidance note:
This activity is an extension of the activity E.CM.1.
For major changes, or changes with major RAMS impact, DNV will typically review and witness the commissioning
of the systems in question as described in activity E.IV.1.
Major changes are changes that impact the functionality, performance or interfaces of systems on CL2 or CL3.
Bug fixes are normally not considered major changes.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Impact analysis showing RAMS evaluation.
Contributions: No required contributions.
Documents required for review:
None
E.RAMS.5 Periodically perform security audits of the systems in operation
Phase: Operation. Confidence level: 2 and above.
Unit level responsible: Owner. System level responsible: Owner.
Requirement definition: Relevant systems in ISDS scope shall periodically be audited from a security point of view.
Guidance note:
DNV-OS-D202 contains some security related requirements.
A number of different standards are available and one of them may be used as a basis for the security audit, e.g. ISA
99, BS 5750 and IEC 62443.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Security audit report.
Contributions: No required contributions.
Documents required for review:
None
E.VV.1 Perform validation testing after changes in the systems in operation
Phase: Operation. Confidence level: 1 and above.
Unit level responsible: Owner. System level responsible: Owner.
Requirement definition: After minor upgrades and corrections, validation testing shall be done for the systems affected.
Guidance note:
Major upgrades or conversions require the application of the whole ISDS standard.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Test procedure: includes black box tests and includes boundary tests.
Test reports: consistent with procedure.
Documents required for review:
Test procedure (FI) and Test report (FI) reviewed in E.IV.1 at CL3.
Acceptable contributions from Supplier.
Contribute to preparing test procedures and to execute test.
E.VV.2 Perform validation with operational scenarios after changes in the systems in operation
Phase: Operation. Confidence level: 2 and above.
Unit level responsible: Owner. System level responsible: Owner.
Requirement definition: After minor upgrades and corrections, validation with operational scenarios shall be done for the
systems affected.
The validation procedure shall be established according to the validation strategy.
The testing steps for validation shall be identified and clearly separated from the parameterisation and calibration steps.
They shall take software into consideration.
The validation tests shall be performed according to the test procedure.
Guidance note:
This activity is an extension of E.VV.1.
Major upgrades or conversions require the application of the whole ISDS standard.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Test procedures: covering relevant operational scenarios.
Test reports: tests performed in compliance with procedure and analysis of the results.
Documents required for review:
Test procedure (FI) and Test report (FI) reviewed in E.IV.1 at CL3.
Acceptable contributions from Supplier.
Contribute to preparing test procedures and to execute test.
A 700 Activity definition several phases
X.ACQ.1 Monitor contract execution and changes
Phase: Several. Confidence level: 1 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: When development or configuration of a software component is subcontracted, the buying
organisation shall monitor that the contract is executed as agreed. The ISDS requirements specified in the contract shall
be followed up. Progress and quality shall be tracked.
Progress reviews with sub-suppliers shall be planned and held.
The impact of contract changes on software components shall be considered.
This activity shall be performed in phases B-E.
Documents required for review:
None
Assessment criteria:
Sub-supplier progress review schedule.
Sub-supplier progress review reports.
Sub-supplier project control records.
Sub-supplier quality control records.
Contributions: No required contributions.
X.ACQ.2 Review intermediate deliverables
Phase: Several. Confidence level: 2 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Selected intermediate deliverables from sub-suppliers shall be provided for information and
review, in order to give visibility into status and progress.
The review of deliverables shall be planned.
This activity shall be performed in phases B-C.
Assessment criteria:
Supplier agreement: list of deliverables and review and
approval plans.
Review records/minutes.
Contributions: No required contributions.
Documents required for review:
None
X.CM.1 Track and control changes to the baselines
Phase: Several. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: Supplier.
Requirement definition: The development of any part of the unit shall follow the rules in the configuration management
plan to maintain consistency at all levels and among all software components.
Changes to requirements, design, interface definitions and software baselines shall be tracked and controlled. As changes
are made to lower level designs and code, higher level designs and requirements shall be updated appropriately.
This activity shall be performed in phases A-D for the system integrator and in phases B-E for the supplier.
Guidance note:
Typical information tracked includes: date, author, contents of the change, rationale for the change, components
impacted, and version and configuration number changes. Proposed changes should be reviewed and approved. This
activity should follow the configuration management plan.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Change requests/orders.
Version histories for baselines.
Changes to: unit requirements, unit design, system
requirements, system design, software design, interface
specifications and software.
Configuration records from document or software
repositories.
Contributions: No required contributions.
Documents required for review:
None
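The change-tracking information listed in the guidance note (date, author, contents, rationale, components impacted, version changes) can be captured in a simple record. A sketch only; the field names and example values are invented, not prescribed by the standard:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ChangeRecord:
    """One tracked change to a baseline; fields mirror the guidance note."""
    change_date: date
    author: str
    contents: str
    rationale: str
    components_impacted: list[str] = field(default_factory=list)
    version_before: str = ""
    version_after: str = ""
    approved: bool = False  # proposed changes should be reviewed and approved

# Hypothetical example record.
cr = ChangeRecord(
    change_date=date(2012, 12, 1),
    author="J. Smith",
    contents="Raise low-level alarm delay from 2 s to 5 s",
    rationale="Spurious alarms during transit",
    components_impacted=["alarm_handler"],
    version_before="1.4.0",
    version_after="1.4.1",
)
assert not cr.approved  # still awaiting review and approval
```

A configuration management tool would persist such records per baseline, giving the version histories named in the assessment criteria.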
X.CM.2 Establish a release note for the delivered system
Phase: Several. Confidence level: 1 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Each delivered system/sub-system shall come with a release note, describing the functional
content of the delivery (versions of the applicable specifications) as well as its physical content (list of items with their
versions). In case of new delivery of a software component, differences with respect to the previous version shall be
documented in the release note.
This activity shall be performed in phases C-E.
Assessment criteria:
Component release note: including list of changes to
previous version of component.
Contributions: No required contributions.
Documents required for review:
None
X.DES.1 Update the base-product design documentation
Phase: Several. Confidence level: 2 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: If the system/software is created using base-products, the base product design documentation
shall be kept up to date. The documentation of tools and environments needed to configure or generate the system/
software from the base-products shall also be kept up to date.
This activity shall be performed in phases B-E.
Guidance note:
Tools and environments needed to configure or generate a project specific product should be considered a part of the
same product repository.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Documents required for review:
Assessment criteria:
None
Base product design description.
Revision information for updated base-product components.
Contributions: No required contributions.
X.PM.1 Monitor project status against plan
Phase: Several. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: Supplier.
Requirement definition: The project’s activities shall be monitored against the plan and reported. Corrective actions shall
be taken when significant deviations from the plan occur. When needed, coordinated actions shall be undertaken with
other stakeholders.
This activity shall be performed in phases A-D for the system integrator and phases B-E for the supplier.
Guidance note:
It is strongly recommended that joint meetings between the different organisations/roles are conducted on a regular
basis in order to coordinate and track the progress of the activities in this standard.
Corrective actions may include actions to bring back the project’s status to the plan, or actions to establish new work
estimates and/or an updated plan.
Policies (information security) or strategies (obsolescence, verification, validation, and integration) should be updated
when needed.
During operation, most activities are constrained by the maintenance or operation plan. Significant upgrades or
corrections requiring coordination with several stakeholders may require a specific project plan, as specified by this
standard.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Master schedule.
Master plan (updated).
Project status report.
Project action list.
Minutes of review meetings.
Progress report.
Documents required for review:
None
Acceptable contributions from Owner and Supplier.
Supplier to provide inputs on specific schedule constraints and deviations to plan for the unit level.
Owner to contribute to updating policies and strategies and to establishing corrective actions.
Owner to provide the master plan in phase E, when applicable.
X.PM.2 Perform joint project milestone reviews
Phase: Several. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: Joint project milestone reviews to check achievement of phase objectives shall be planned and
carried out. Significant risks, issues and their impact shall be documented and tracked until closure. Decisions whether
or not, or how, to progress to the next phase shall be recorded.
In the A phase, the system integrator and the owner shall participate. In the B-D phases, all roles shall participate.
Guidance note:
The ISDS milestones are intended to be the critical event to decide whether to proceed further in the project, weighing
the risks of moving to the next phase versus the risks of postponing it. In some cases, the decision to move forward
may include considerable project risks. The milestone is a good practice to communicate the extent of such risks to
all stakeholders.
The milestone review is typically the last activity to be performed in each phase.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Minutes of joint milestone meetings.
ISDS compliance status.
Action plans.
Documents required for review:
None
ISDS compliance status.
Action plans.
Acceptable contributions from Owner and Supplier.
Owner and Supplier to participate in the milestone meeting and provide status and plans.
Owner and Supplier to contribute to the decisions related to the further progress of the project.
X.PQA.1 Control procedures (owner)
Phase: Several. Confidence level: 1 and above.
Unit level responsible: Owner. System level responsible: None.
Requirement definition: Procedures shall be controlled to ensure defined procedures are followed and that the activities
required by this standard are executed in practice.
This activity shall be performed in phases A-E.
Guidance note:
Follow up of the procedures may result in increased process adherence, improvement of procedures, or training as
required.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Proof that process adherence is being assessed: Quality
control records, Project control records and Minutes of
meetings, or other relevant information.
Contributions: No required contributions.
Documents required for review:
None
X.PQA.2 Control procedures (system integrator)
Phase: Several. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: Procedures shall be controlled to ensure defined procedures are followed and that the activities
required by this standard are executed in practice.
This activity shall be performed in phases A-D.
Guidance note:
Follow up of the procedures may result in increased process adherence, improvement of procedures, or training as
required.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Proof that process adherence is being assessed: Quality
control records, Project control records and Minutes of
meetings, or other relevant information.
Contributions: No required contributions.
Documents required for review:
None
X.PQA.3 Control procedures (supplier)
Phase: Several. Confidence level: 1 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Procedures shall be controlled to ensure defined procedures are followed and that the activities
required by this standard are executed in practice.
This activity shall be performed in phases B-E.
Guidance note:
Follow up of the procedures may result in increased process adherence, improvement of procedures, or training as
required.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Proof that process adherence is being assessed: Quality
control records, Project control records and Minutes of
meetings, or other relevant information.
Contributions: No required contributions.
Documents required for review:
None
X.PQA.4 Follow-up of ISDS assessment gaps (owner)
Phase: Several. Confidence level: 1 and above.
Unit level responsible: Owner. System level responsible: None.
Requirement definition: If the independent verifier finds gaps towards this standard during a process assessment, the
organisation in question shall plan and implement actions to close those gaps within reasonable time. A corrective action
plan outlining the actions to be taken shall be submitted to the independent verifier for approval.
This activity shall be performed in phases A-E.
Guidance note:
The ‘reasonable time’ can be decided from case to case, but normally an action plan is expected to be submitted for
approval within 14 days after the assessment report is issued, and gaps are expected to be closed within the ISDS
project phase in question, and not later than 3 months after the assessment report.
The assessed organisation is expected to closely follow up on the activities in the corrective action plan.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Corrective action plan: Responsibility allocation for
actions, Records of actions taken and Evidence of
implementation of the actions.
Contributions: No required contributions.
Documents required for review:
Corrective action plan (AP) reviewed and approved in
X.IV.1.
X.PQA.5 Follow-up of ISDS assessment gaps (system integrator)
Phase: Several. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: None.
Requirement definition: If the independent verifier finds gaps towards this standard during a process assessment, the
organisation in question shall plan and implement actions to close those gaps within reasonable time. A corrective action
plan outlining the actions to be taken shall be submitted to the independent verifier for approval.
This activity shall be performed in phases A-D.
Guidance note:
The ‘reasonable time’ can be decided from case to case, but normally an action plan is expected to be submitted for
approval within 14 days after the assessment report is issued, and gaps are expected to be closed within the ISDS
project phase in question, and not later than 3 months after the assessment report.
The assessed organisation is expected to closely follow up on the activities in the corrective action plan.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Corrective action plan: Responsibility allocation for
actions, Records of actions taken and Evidence of
implementation of the actions.
Contributions: No required contributions.
Documents required for review:
Corrective action plan (AP) reviewed and approved in
X.IV.1.
X.PQA.6 Follow-up of ISDS assessment gaps (supplier)
Phase: Several. Confidence level: 1 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: If the independent verifier finds gaps towards this standard during a process assessment, the
organisation in question shall plan and implement actions to close those gaps within reasonable time. A corrective action
plan outlining the actions to be taken shall be submitted to the independent verifier for approval.
This activity shall be performed in phases B-E.
Guidance note:
The ‘reasonable time’ can be decided from case to case, but normally an action plan is expected to be submitted for
approval within 14 days after the assessment report is issued, and gaps are expected to be closed within the ISDS
project phase in question, and not later than 3 months after the assessment report.
The assessed organisation is expected to closely follow up on the activities in the corrective action plan.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Corrective action plan: Responsibility allocation for
actions, Records of actions taken and Evidence of
implementation of the actions.
Contributions: No required contributions.
Documents required for review:
Corrective action plan (AP) reviewed and approved in
X.IV.1.
X.REQ.1 Maintain requirements traceability information
Phase: Several. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: Supplier.
Requirement definition: Traceability of requirements shall be kept up to date. Three kinds of traceability information are
required:
1) Traceability between requirements on different levels (e.g. from unit to system).
2) Traceability from a requirement to where and how it is designed and implemented.
3) Traceability from a requirement to where and how it is verified and validated.
This activity shall be performed in phases A-D for the system integrator and in phases B-E for the supplier.
Guidance note:
Depending on the confidence level there may be different ambition levels for traceability, but regardless of the
ambition level the trace information should be kept up to date during the project.
The traceability information between unit and systems is established in activities A.REQ.6 and B.INT.2.
The traceability information within a system emerges in activity B.REQ.2, B.DES.1 and B.DES.2.
The traceability information from requirements to verification and validation emerges in activity C.VV.3/4/5/8,
D.VV.1/2/4 and X.VV.1/2 as applicable.
Traceability matrices are normally used to document the traceability information, but also databases or references
from documents to documents can be used as long as the traceability information is explicit and reviewable.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
Assessment criteria:
Up to date traceability information: from owner to system requirements, from system requirements to functional specifications (where applicable), from system requirements to base-product and configuration data (where applicable), from functional specifications to subsystem/component specifications and from requirements to test procedures (when the test procedures are available).
Completeness and consistency review records of the traceability information.
Documents required for review:
Traceability matrices (FI) used in B.IV.1 and C.IV.1 at CL3, and in C.IV.2 at CL2 and CL3.
Contributions: No required contributions.
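A traceability matrix of the kind described can be checked mechanically: every requirement should link to at least one design element and one verification activity, matching the three kinds of trace information listed above. A sketch with invented identifiers:

```python
# Hypothetical trace links: requirement id -> (design elements, test procedures).
trace = {
    "SYS-001": (["DES-010"], ["TP-100"]),
    "SYS-002": (["DES-011", "DES-012"], ["TP-101"]),
    "SYS-003": ([], []),  # deliberately left untraced to show the check
}

def untraced(trace):
    """Requirements lacking a link to design or to verification."""
    return sorted(req for req, (design, tests) in trace.items()
                  if not design or not tests)

assert untraced(trace) == ["SYS-003"]
```

Such a completeness check supports the "completeness and consistency review records" named in the assessment criteria, whether the trace data lives in matrices, databases, or document cross-references.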
X.RISK.1 Track, review and update risks
Phase: Several. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: Supplier.
Requirement definition: The risk list shall be reviewed and updated regularly, in order to re-evaluate the risk attributes
or to take into account new risks. Risks involving other stakeholders shall be regularly shared, reviewed and updated
jointly.
This activity shall be performed in phases B-D for the system integrator and phases B-E for the supplier. For the E phase,
the owner shall establish and maintain the risk list for significant upgrades or conversions.
Guidance note:
It is important that a consistent picture of the risks involving other stakeholders is regularly shared, both with external
stakeholders, and with the stakeholders within the organisation.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
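X.RISK.1 requires the risk list to be reviewed regularly so that risk attributes can be re-evaluated. As a minimal sketch only, assuming a simple probability-times-consequence exposure model (the standard does not prescribe any particular risk attributes or scales), a register entry and a review step could look like this; all field names and values are hypothetical:

```python
# Illustrative sketch: a risk register entry whose attributes are
# re-evaluated at regular reviews, with each review logged.
# Fields and 1..5 scales are hypothetical, not prescribed by the standard.

from dataclasses import dataclass, field

@dataclass
class Risk:
    risk_id: str
    description: str
    probability: int          # e.g. 1 (low) .. 5 (high)
    consequence: int          # e.g. 1 (low) .. 5 (high)
    review_log: list = field(default_factory=list)

    @property
    def exposure(self):
        return self.probability * self.consequence

def review(risk, date, probability=None, consequence=None):
    """Re-evaluate risk attributes during a regular review and log the review."""
    if probability is not None:
        risk.probability = probability
    if consequence is not None:
        risk.consequence = consequence
    risk.review_log.append(date)
    return risk

r = Risk("R-12", "Late delivery of control software", probability=4, consequence=3)
review(r, "2012-11-01", probability=2)  # likelihood reduced since last review
print(r.exposure)  # 6
```

A jointly managed project register would hold the same kind of entries, shared and reviewed with the other stakeholders as the requirement demands.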
Assessment criteria:
Project risk management plan.
Updated internal risk register (per organization).
Updated project risk register (jointly managed).
Documents required for review:
None
Acceptable contributions from Owner and Supplier.
Supplier shall provide input on risk identification at unit level.
Owner shall provide inputs on operational and business risks relevant for the project and related to the product.
Owner shall manage the risk list for the E phase.
X.RISK.2 Decide, implement and track risk mitigation actions to closure
Phase: Several. Confidence level: 2 and above.
Unit level responsible: System integrator. System level responsible: Supplier.
Requirement definition: Risk mitigation actions shall be decided and planned, according to the risk strategy. Status of
these actions shall be monitored regularly. Efficiency of mitigation actions shall be assessed and new actions taken as
needed. Mitigation actions involving other stakeholders shall be coordinated.
This activity shall be performed in phases A-D for the system integrator and phases B-E for the supplier.
Major upgrades or conversions require the application of the whole ISDS standard.
Guidance note:
New stakeholders introduced throughout the project should be informed about the risk strategy and the jointly
identified risks and be given an opportunity to provide inputs to necessary updates.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
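X.RISK.2 requires mitigation actions to be decided, monitored regularly, and tracked to closure. As a purely illustrative sketch (status values, fields and IDs below are hypothetical, not taken from the standard), the monitoring step amounts to filtering the action list for anything not yet closed:

```python
# Illustrative sketch: tracking risk mitigation actions to closure.
# Status values and fields are hypothetical, not prescribed by the standard.

ACTION_STATUSES = ("planned", "in_progress", "closed")

def open_actions(actions):
    """Actions that still require monitoring, i.e. not yet closed."""
    return [a for a in actions if a["status"] != "closed"]

actions = [
    {"id": "A-1", "risk": "R-12", "status": "closed"},
    {"id": "A-2", "risk": "R-12", "status": "in_progress"},
    {"id": "A-3", "risk": "R-30", "status": "planned"},
]
print([a["id"] for a in open_actions(actions)])  # ['A-2', 'A-3']
```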
Assessment criteria:
Updated internal risk register: risk list, mitigation actions and follow-up records (per organization).
Updated project risk register: risk list, mitigation actions and follow-up records (jointly managed).
Documents required for review:
None
Acceptable contributions from Owner.
Contribute to identifying and deciding on risk mitigation actions and on closing them.
X.VV.1 Perform verification and validation on added and modified software components
Phase: Several. Confidence level: 1 and above.
Unit level responsible: None. System level responsible: Supplier.
Requirement definition: Developed and integrated software components shall be verified, validated and tested for
regression before the decision to accept the component. This also applies if changes to software components occur after
defined baselines, such as FAT. The status of the component verification and validation shall be analysed and compared
with the expected process and quality attribute targets.
This activity shall be performed in phases C-E.
Guidance note:
Changes to software components should not only trigger verification and validation of the component that has been
changed, but also of other related components to prevent regression.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
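The guidance note above says a change should trigger verification and validation not only of the changed component but also of related components. One common way to determine "related" is a dependency graph, as in this minimal sketch (the graph, component names and the approach itself are illustrative assumptions, not requirements of the standard): a change flags every component that transitively depends on the changed one.

```python
# Illustrative sketch: selecting components for regression verification.
# Given a hypothetical "is used by" graph, a change to one component also
# flags every component that transitively depends on it.

def affected(changed, dependents):
    """Transitive closure of components depending on the changed component."""
    seen = set()
    stack = [changed]
    while stack:
        comp = stack.pop()
        for dep in dependents.get(comp, []):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

# component -> components that use it (hypothetical example system)
dependents = {
    "thruster_driver": ["dp_controller"],
    "dp_controller": ["hmi", "alarm_system"],
}
print(sorted(affected("thruster_driver", dependents)))
# ['alarm_system', 'dp_controller', 'hmi']
```

The changed component itself is of course verified as well; the sketch only computes the additional regression scope.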
Assessment criteria:
Test procedure: consistent with change or upgrade scope.
Test report: consistent with test procedure.
Documents required for review:
Test procedure (FI) and Test report (FI) used in E.IV.1 at CL3.
Acceptable contributions from Owner.
Provide inputs on expected process and quality attribute targets.
X.VV.2 Detail procedures for testing
Phase: Several. Confidence level: 1 and above.
Unit level responsible: System integrator. System level responsible: Supplier.
Requirement definition: Detailed procedures for testing (test cases) shall be completed and documented. The testing
steps shall be identified and clearly separated from the parameterisation and calibration steps.
The expected results for each test case shall be specified.
Traceability to requirements/functional specifications shall be taken into consideration.
This activity shall be performed in phases C-E for the supplier and phase D for the system integrator.
Guidance note:
Procedures for testing should take software into consideration, and should focus on new, modified and parameterised
software.
In case of (semi-)automated testing, implementation of the test cases should be performed beforehand.
For testing performed in C.IMP.3, a test log is usually sufficient.
---e-n-d---of---G-u-i-d-a-n-c-e---n-o-t-e---
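Pulling the X.VV.2 requirement together, a detailed test case carries its steps, its expected result, its trace to requirements, and keeps parameterisation separate from the testing steps. As a minimal illustrative sketch only (all IDs, field names and the example content are hypothetical, not taken from the standard):

```python
# Illustrative sketch: a detailed test case with an expected result,
# traced to a requirement, with parameterisation kept separate from
# the testing steps. IDs and field names are hypothetical.

test_case = {
    "id": "TC-017",
    "traces_to": ["SYS-REQ-2"],       # traceability to requirements
    "parameterisation": [             # done before testing, not a test step
        "Load vessel-specific gain set G-3",
    ],
    "steps": [
        "Command heading change of 10 degrees",
        "Record time until heading is within 1 degree of setpoint",
    ],
    "expected_result": "Heading settles within 60 s without overshoot alarm",
}

def is_complete(tc):
    """A detailed test case needs steps, an expected result and a trace."""
    return bool(tc["steps"]) and bool(tc["expected_result"]) and bool(tc["traces_to"])

print(is_complete(test_case))  # True
```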
Assessment criteria:
Existence of relevant test procedures.
Documents required for review:
None
Contributions: No required contributions.