JRC – IE Petten

Benchmark Exercise of Safety Evaluation of Computer Based Systems

V. Kopustinskas (1), C. Kirchsteiger (1), B. Soubies (2), F. Daumas (2), J. Gassino (2), J.-C. Péron (2), P. Régnier (2), J. Märtz (3), M. Baleanu (3), H. Miedl (3), M. Kersken (3), U. Pulkkinen (4), M. Koskela (4), P. Haapanen (4), M.-L. Järvinen (5), H.-W. Bock (6), W. Dreves (6)

1) EC DG-JRC, IE (NL); 2) IRSN (FR); 3) ISTec (D); 4) VTT (FIN); 5) STUK (FIN); 6) Framatome ANP (D)



To be presented at the post-FISA'2003 workshop


Project partners

Project duration: 01/2001 – 12/2003
Project coordinator: EC-JRC Institute for Energy, Petten (Netherlands)
Project started by: EC-JRC-IPSC, Ispra (Italy)
Industrial partner: Framatome ANP (Germany, formerly Siemens)
Assessment teams: IRSN (France), ISTec (Germany), STUK and VTT (Finland)


Project objectives

The project's primary target is a comparative evaluation of the existing methodologies for the safety assessment of safety-critical computer-based systems in use in the nuclear field among EU regulators and technical support organisations.


Work packages

WP1: High-level specification of the Benchmark Exercise
WP2: Reference software study case definition and design
WP3: Final specification of the assessment methodologies
WP4: Application of the assessment methodologies
WP5: Comparison of the assessment methodologies
WP6: Project coordination and financial coordination


Project implementation

Framatome ANP provided a reference case study of a hypothetical reactor protection system, including the requirements and functional specification of a limited number of safety functions that were selected by the project partners. The proprietary documentation was made available to the assessor partners, namely IRSN, ISTec, STUK and VTT.

Each assessor applied their assessment methodology to the reference case study. The comparison study was performed to highlight the current practices and methods used in the field by major research and regulatory support organizations.


Reference study case

The case study comprised a limited part of a complete safety I&C modernization project, using the tools of the TELEPERM XS system platform.

Eight MADTEB group functions were selected from the safety functions of the KWU Konvoi plants; they were intended to be applied to the planned Finnish 1400 MW PWR plant (status as of 1993). The MADTEB functions are part of the reactor limitation system and limit the allowed range of process variables (mainly coolant pressure and pressurizer level) in the primary coolant loop of the reactor.


Reference study case: MADTEB functions
- A33 Reduction of leakage in case of steam generator tube rupture
- A34 Ensure effectiveness of extra borating system spraying
- C31 Prevent violation of maximum allowable working pressure
- D01 Prevent inadvertent opening of 1st pressurizer safety valve
- D02 Prevent response of 2nd and 3rd pressurizer safety valve
- D32 Pressurizer overfeed protection
- D33 Prevent loss of coolant via stuck-open 1st pressurizer safety valve
- J34 Prevent emptying of pressurizer


Reference study case: design process
• Provision of typical documents to be developed in a safety I&C modernization project;
• Specification of the requirements to be met by the system;
• Specification of the safety system on the basis of the TELEPERM XS system platform;
• Detailed design of the functions;
• Verification of the design using the SPACE engineering tools of TELEPERM XS;
• Production of code able to run on an existing test system;
• Demonstration of operation of the code in the test system;
• Validation tests of the software.


Reference study case: documentation
• All benchmark-related and generic TELEPERM XS system documentation was available to the assessors;
• Software source code was available upon request;
• A number of technical meetings were organised to provide clarifications and discuss technical questions;
• FANP made simulation testing of the benchmarked software available.


Reference study case: limitations

As a consequence of the limited scope, essential parts of a real project were not performed, e.g.:
- Validation of the functional requirements
- Design of interfaces to other systems
- Hardware procurement and manufacture
- Validation of the hardware


Assessment studies and results

To be presented by each partner: IRSN, ISTec, VTT/STUK


Comparison procedure

The comparison is based on the following main items:
- technical basis of the different approaches;
- depth of analysis the methodologies allow;
- availability of various methods and analysis tools used for assessment;
- assessment phases;
- assessment results and findings.


Comparison procedure

The comparison procedure does not aim to:
- identify any possible deficiencies in the methodologies;
- decide which methodology is better or worse;
- conclude anything about the safety of the study case software.


Comparison procedure

The comparison procedure resulted in a descriptive study of the following main items:
- Comparison of the methodological approaches;
- Comparison of the assessment studies;
- Comparison of the assessment results and findings.


Comparison: Regulatory requirements

All three assessment teams follow national regulatory requirements, which are mainly based on the international IEC 60880 standard. Although based on the same international standard, the requirements differ slightly at the national level. For example, the Finnish regulatory guide YVL 5.5 requires quantitative reliability analysis for safety-critical class 1 computer-based systems, while the French and German regulations do not. The Finnish regulation also explicitly requires an FMEA to be performed.
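Quantitative reliability requirements of this kind are typically addressed with probabilistic estimates of the per-demand failure probability. As an illustrative sketch only (this is not the method applied in the exercise, and the prior and test counts below are hypothetical), a Bayesian beta-binomial update from statistical testing could look like this:

```python
def posterior_failure_prob(alpha: float, beta: float,
                           demands: int, failures: int) -> float:
    """Posterior mean of the per-demand failure probability.

    A Beta(alpha, beta) prior is updated with `failures` observed
    in `demands` statistically independent test demands.
    """
    return (alpha + failures) / (alpha + beta + demands)

# Vague Beta(1, 1) prior, 5000 failure-free test demands (invented numbers):
p = posterior_failure_prob(1.0, 1.0, demands=5000, failures=0)
print(f"posterior mean failure probability: {p:.2e}")  # 2.00e-04
```

Even failure-free testing thus yields a finite posterior failure probability, which is why the amount of statistical testing matters for class 1 claims.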


Comparison: Life cycle

All assessment teams follow basically the same assessment steps, corresponding to the life cycle phases. The typical assessment procedure starts with a general assessment of the quality assurance and V&V plans and of the engineering process itself. Then each life cycle phase is evaluated. Although the phase titles differ, their content is nearly the same.

The following life cycle was used for comparison purposes:
- Requirements specification
- System specification
- Detailed design
- Code generation
- Testing


Comparison: Quality assurance and engineering process

A number of deficiencies were identified by the assessors in this phase. It is important to note that most of them are related to limitations of the study case. This especially concerns the FANP testing strategy, as all tests were performed with the simulation tool SIVAT rather than on a realistic system. In the frame of this benchmark study, testing on a real system would not have been justified given the limited resources of the project.

However, some identified deficiencies, such as the lack of a unified life cycle model description or the lack of rigorous V&V procedures, could be used to improve the software documentation.


Comparison: Requirements specification

The assessment results indicate the need for independent verification of each development step. Because validation of the requirements specification by a process engineer was out of the scope of BE-SECBS, a fault could be identified (incorrect operation of the AA011 valve).


Comparison: System specification

Like all previous steps, this assessment step was performed by the assessors only through a critical review of the documentation. One assessor also used the SPACE tool for navigating and tracing signal paths. No critical faults were identified. The deficiencies mentioned by the assessors could be used to improve the development process and documentation.


Comparison: Detailed design

Most of the reported deficiencies are related to limitations of the benchmark exercise. However, supported by the SPACE tool while checking the detailed design, one assessor detected an inconsistency in function diagram JEB00CS811. According to FANP, it has no impact on the functional behavior of the integrated system, but this unintended inconsistency passed the developer's verification process. This confirms that, in addition to any manufacturer's internal verification process, external independent validation by an assessor is needed.


Comparison: Source code

No major errors were reported, but source code analysis with static analysis tools (QAC, Polyspace Verifier, RETRANS) provided some interesting insights that would be hard to obtain manually.

In addition, a number of potential problem areas were reported that would require more detailed analysis; this was, however, not performed due to the limited project resources.
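To illustrate the kind of defect such tools target, here is a toy checker (a drastic simplification of industrial static analyzers like QAC or Polyspace, and not part of the exercise; the `limiter` example code is invented) that flags names read before they are ever assigned in a function body:

```python
import ast

def read_before_write(source: str) -> list[str]:
    """Toy static check: report names that are loaded before being
    stored inside each function body. Note: ast.walk is breadth-first,
    which approximates source order only for flat examples like this."""
    findings = []
    tree = ast.parse(source)
    for func in [n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]:
        assigned = {a.arg for a in func.args.args}  # parameters count as assigned
        for node in ast.walk(func):
            if isinstance(node, ast.Name):
                if isinstance(node.ctx, ast.Store):
                    assigned.add(node.id)
                elif isinstance(node.ctx, ast.Load) and node.id not in assigned:
                    findings.append(f"{func.name}: '{node.id}' read before assignment")
                    assigned.add(node.id)  # report each name only once
    return findings

code = """
def limiter(pressure):
    if pressure > limit:      # 'limit' is never assigned locally
        pressure = limit
    return pressure
"""
print(read_before_write(code))  # ["limiter: 'limit' read before assignment"]
```

Real tools perform a sound data-flow analysis over all paths; the point here is only the category of finding, not the technique.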


Comparison: Testing

The assessment results for the testing life cycle phase revealed some differences in the approaches and depth of the analysis:
• One assessment team concluded that, due to the normed structure of the TXS C code, the amount of testing required is determined only by the functionality of the system and not by code-coverage measurement. Within BE-SECBS, the amount of functional testing could be accepted as sufficient;
• Another assessment team identified missing tests and other non-compliances with the regulatory requirements and suggested additional tests.

These differences could stem from the different approaches, analysis tools and requirements applied.
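The functional-testing-versus-coverage distinction can be sketched as follows; the limiter function, its limit value and the test cases are all hypothetical, and the tracer is only a crude stand-in for a real coverage tool:

```python
import sys

def pressure_limiter(p: float, limit: float = 160.0) -> float:
    """Hypothetical limitation function: clamp a pressure value."""
    if p > limit:
        return limit
    if p < 0.0:
        return 0.0
    return p

def run_with_line_coverage(fn, cases):
    """Run functional test cases while recording which lines of `fn`
    execute, mimicking what a coverage tool measures."""
    covered = set()
    code = fn.__code__
    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is code:
            covered.add(frame.f_lineno)
        return tracer
    sys.settrace(tracer)
    try:
        results = [fn(*c) for c in cases]
    finally:
        sys.settrace(None)
    return results, covered

# Functional tests derived from the specification alone:
results, covered = run_with_line_coverage(pressure_limiter,
                                          [(150.0,), (170.0,)])
print(results)  # [150.0, 160.0]
# These spec-driven cases never reach the 'return 0.0' branch, so
# functional sufficiency and full code coverage are distinct criteria.
```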


Comparison: Quantitative reliability analysis

A Bayesian network was developed by VTT/STUK, consisting of the following variables: requirements specification, concept design, detailed design, application C code, code compilation and linking, platform software, and integrated system tests.

The main information on which the Bayesian network is built comes from the limited qualitative assessment, which is based mostly on a critical review of the documentation. No tools or other assessment methods were applied that could enhance the credibility of the quality rating.

The quantitative reliability study could be significantly improved with the information now available from the qualitative analyses performed by IRSN and ISTec using the QAC, Polyspace Verifier, Claire, Gatel and RETRANS tools.
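As a minimal illustration of how such a network propagates quality judgments along the development chain, the sketch below reduces the structure to three stages and uses invented probabilities; neither the numbers nor the three-node chain are taken from the VTT/STUK study:

```python
from itertools import product

# Hypothetical conditional probability tables (illustrative only):
p_spec = {True: 0.95, False: 0.05}              # P(requirements spec correct)
p_design = {True: {True: 0.97, False: 0.03},    # P(design | spec)
            False: {True: 0.40, False: 0.60}}
p_code = {True: {True: 0.98, False: 0.02},      # P(code | design)
          False: {True: 0.30, False: 0.70}}

def marginal_code_ok() -> float:
    """Marginal P(application code correct), computed by full
    enumeration over the chain spec -> design -> code."""
    total = 0.0
    for spec, design in product([True, False], repeat=2):
        total += (p_spec[spec]
                  * p_design[spec][design]
                  * p_code[design][True])
    return total

print(round(marginal_code_ok(), 4))  # 0.9402
```

Evidence from tool-based analyses (e.g. static analysis findings) would enter such a network by sharpening the conditional tables or by conditioning on observed node states, which is exactly the improvement suggested above.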


Comparative assessment conclusions
• Actual findings were made (in the requirements specification and the detailed design); this confirms the need for both independent and internal verification and validation processes;
• Assessment tools (in-house as well as standard) can greatly enhance the depth of the assessment and the credibility of the evaluation;
• Quantitative software reliability analysis is a useful analysis item that could also be used in PSA studies. Its credibility could be enhanced by information from the qualitative analyses performed with the various analysis tools used in the exercise.