
System Integration Verification and Validation

Upload: magdalena-banville

Post on 14-Dec-2015


System Integration Verification and Validation

Remember the V-Cycle for all Increments?

[V-cycle diagram: milestones V1-V4 span the phases System Planning, Discipline Planning, Realization, and Verification & Validation; EE Activities and ME Activities appear as parallel discipline branches.]

SE/PVV Activities:

System Requirements

System Architecture & Design

System Integration

System Verification

SW Activities:

SW Requirements

SW Architecture

SW Design

SW Coding (with Code Review)

SW Module Test

SW Integration

SW Verification

Test Overview: Test Area / Who Tests / What / Against What / Where

SW Module Test

Who: the SW developer

What: a SW package (or single modules)

Against: the SW design

Where: simulation on PC, target

SW Integration Test

Who: the SW test engineer

What: a SW component (possibly within a SW system)

Against: the SW architecture

Where: partial or whole system

SW Verification Test

Who: the SW test engineer

What: SW components (within a SW system)

Against: the SW requirements

Where: usually the whole system

Why SI?

Why do we need System Integration?

The purpose of System Integration is

to assemble a system from its defined components

to ensure that the interfaces between the components of the integrated system function properly

Focus is on the System Level, i.e.

releases of sub projects are handled as one part (system modules / discipline components)

interfaces between these parts are the object of tests

Remark:

Even if a project has no official System Integration Team, the activities are performed anyway

(by highest / last integration in the project)

Integration testing (interface testing)

Examines the interaction of software elements (components) after system integration

Integration is the activity of combining individual software components into larger subsystems

Further integration of subsystems is also part of the system integration process

Each component has already been tested for its internal functionality (component test). Integration tests examine the external functions

Scope:

Integration tests examine the interaction of software components (subsystems) with each other:

interfaces with other components

interfaces among GUIs / MMIs

Integration tests examine the interfaces with the system environment

Test cases may be derived from interface specifications, the architectural design, or data models
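As a minimal sketch of such an interface test, the following Python example exercises the interface between two already unit-tested components. The component names (Mp3Player, Hmi) and their methods are invented for illustration, echoing the module examples later in this deck; they are not part of any real product API.

```python
import unittest

# Hypothetical components: stand-ins for two integrated software parts
# (e.g. a player back end and a display front end). Names are illustrative.
class Mp3Player:
    def current_track(self):
        return {"title": "Track 01", "position_s": 42}

class Hmi:
    def __init__(self, player):
        self.player = player

    def render_status_line(self):
        # The interface under test: the HMI pulls state from the player.
        track = self.player.current_track()
        return f"{track['title']} @ {track['position_s']}s"

class Mp3HmiIntegrationTest(unittest.TestCase):
    """Integration test: checks the interface between player and HMI,
    not the internals of either component (those were component-tested)."""

    def test_status_line_reflects_player_state(self):
        hmi = Hmi(Mp3Player())
        self.assertEqual(hmi.render_status_line(), "Track 01 @ 42s")
```

Run with `python -m unittest` against the file: only the data flowing across the component boundary is asserted, which is what distinguishes this from a component test.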

Verification vs. Validation

Verification: Proof of compliance with the stated requirements (definition per ISO 9000)

"Did we proceed correctly when building the system?"

Validation: Proof of fitness for the expected use

"Did we build the right software system?"

Verification within the general V-Model

Each development level is verified against the contents of the level above it

to verify: to give proof or evidence

to verify: to check whether the requirements and definitions of the previous level were implemented correctly

requirements definition

functional system design

technical system design

component specification

programming

component test

integration test

system test

acceptance test

Development and Integration

Verification

System test (Verification)

Testing the integrated software system to prove compliance with the specified requirements

software quality is looked at from the user's point of view

System tests refer to

functional requirements: suitability, accuracy, interoperability, compliance, security ("what the system does")

nonfunctional requirements: reliability, usability, efficiency, portability, maintainability ("how the system works")

Terminology

RELIABILITY: the ability of the software product to perform required functions under stated conditions for a specified period of time or for a specified number of operations

USABILITY: testing to determine the extent to which the software product is understood, easy to learn, and easy to operate

EFFICIENCY: the process of testing to determine the efficiency of the software product

MAINTAINABILITY: the process of testing to determine the maintainability of the software product

PORTABILITY: the process of testing to determine the ease of transferring the software product from one hardware environment to another (or from one software environment to another)

SUITABILITY: the capability of a software product to provide an appropriate set of functions for specified tasks and user objectives

ACCURACY: the capability of a software product to provide the right or agreed results or effects with the needed degree of precision

COMPLIANCE: the ability of the software product to adhere to standards and conventions

INTEROPERABILITY: the ability of the software product to interact with one or more specified components or systems

SECURITY: attributes of software product that bear on its ability to prevent unauthorized access to programs and data

System test

Test cases may be derived from:

functional specifications

use cases

business processes

risk assessments

Scope:

Test of the integrated system from the user's point of view

The test environment should match the true environment:

All external interfaces are tested under true conditions

No tests in the real-life environment!

Validation within the general V-Model

Validation refers to the correctness of each development level

to validate: to give proof of having value

to validate: means to check the appropriateness of the results of one development level

requirements definition

functional system design

technical system design

component specification

programming

component test

integration test

system test

acceptance test

Development and Integration

Verification

Validation

Testing process @ Continental

[Process diagram: Requirements / Architecture feed the discipline teams (Entertainment, Navi, etc.) and System Integration; PVV (Product Verification and Validation) tests the integrated product. A failed test result feeds back into development; a passed result goes to PRODUCTION.]

Verification vs. Validation

Test design techniques

Deriving test cases from requirements

Designing test cases must be a controlled process

[Diagram: the test object and the requirements on it define the test requirements and test criteria, from which the test cases are derived]

Test cases can be created in a formal or in an informal way, depending on the project constraints and on the maturity of the process in use.

Test design techniques

Traceability

Tests should be traceable: which test case was included in the test portfolio, based on which requirement?

[Diagram: the test object and the requirements on it define the test requirements and test criteria; the resulting test cases are grouped into test scenarios]
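One minimal way to make that traceability mechanical is to record, per test case, the requirements it covers, and then invert the mapping to answer "which test cases exist because of which requirement?". All IDs and titles below are invented examples.

```python
from collections import defaultdict

# Invented test portfolio: each case records the requirement(s) it covers.
test_cases = {
    "TC-001": {"covers": ["REQ-10"], "title": "Play a local MP3 file"},
    "TC-002": {"covers": ["REQ-10", "REQ-11"], "title": "Pause and resume"},
    "TC-003": {"covers": ["REQ-12"], "title": "Show track title on HMI"},
}

def coverage_matrix(cases):
    """Invert the mapping: requirement -> test cases that trace to it."""
    matrix = defaultdict(list)
    for tc_id, tc in cases.items():
        for req in tc["covers"]:
            matrix[req].append(tc_id)
    return dict(matrix)

def uncovered(requirements, cases):
    """Requirements with no test case in the portfolio."""
    covered = set(coverage_matrix(cases))
    return sorted(set(requirements) - covered)

print(coverage_matrix(test_cases))
print(uncovered(["REQ-10", "REQ-11", "REQ-12", "REQ-13"], test_cases))
# REQ-13 is reported as uncovered: it has no traced test case.
```

The inverted matrix answers both traceability questions at once: forward (requirement to test cases) and gap analysis (requirements with no test at all).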

Test design techniques

Definitions

Test object

the subject to be examined (a document or a piece of software in the software development process)

Test condition

an item or an event: a function, a transaction, a quality criterion, or an element in the system

Test criteria

the test object has to conform to the test criteria in order to pass the test

Test design techniques

Test case description according to IEEE 829:

Input values: description of the input data on the test object

Preconditions: situation previous to test execution or characteristics of the test object before conducting the test case

Expected results: output data that the test object is expected to produce

Postconditions: characteristics of the test object after test execution, description of its situation after test

Dependencies: order of execution of test cases, reason for dependencies

Distinct identification: an ID or key used to link, for example, an error report to the test cases where it appeared

Requirements: characteristics of the test object that the test case will examine
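The fields listed above can be captured as a simple data structure. This is an illustrative sketch of those fields, not a normative IEEE 829 schema, and all concrete values are invented.

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    """One test case, with the description fields named on the slide."""
    identification: str   # distinct ID, links error reports to this case
    requirements: list    # characteristics of the test object under examination
    preconditions: str    # situation of the test object before execution
    input_values: str     # input data applied to the test object
    expected_results: str # output the test object is expected to produce
    postconditions: str   # situation of the test object after execution
    dependencies: list = field(default_factory=list)  # execution-order links

# Invented example instance:
tc = TestCase(
    identification="TC-042",
    requirements=["REQ-12: show track title on HMI"],
    preconditions="MP3 player running, track loaded",
    input_values="Press 'Play' on the HMI",
    expected_results="Track title appears on the status line",
    postconditions="Player in state PLAYING",
    dependencies=["TC-041"],
)
```

Keeping the fields explicit like this makes the distinct-identification and dependency information machine-readable, which is what later traceability and execution-ordering rely on.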

Test level

PVV (Product Verification and Validation) - Test Level Definition

To optimize test execution, the following test levels were defined:

XXS (extreme small): Screening-Test -> handover test between PVV and SI/SW

XS (extra small): Quick-Test -> to get a fast overview about all main functions/features

S (small): Pre-release-Test -> check all functions and find major errors (feedback for SW dev.)

M (medium): Release-Test (part 1) -> to find all serious errors

L (large): Release-Test (part 2) -> to find all errors

Remarks: the M- and L-levels together form 100% of all test cases; therefore all other test levels are subgroups of the complete set (e.g. S is a subgroup of M).

The complete release test can be divided into two parts (M- and L-level). This gives the opportunity to deliver the release to the customer with a finalized M-test first.

Note: It is not allowed to change the SW version throughout the whole release test run!
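One possible reading of the subgroup remark (each smaller level is contained in the next, and running L covers 100% of the cases) can be sketched as rank-based selection. The case IDs and their level tags below are invented.

```python
# Ordered test levels from the slide; each case is tagged with the
# smallest level that contains it, so selecting a level runs every
# case tagged at that level or below (S is a subgroup of M, etc.).
LEVELS = ["XXS", "XS", "S", "M", "L"]
RANK = {name: i for i, name in enumerate(LEVELS)}

cases = {
    "TC-001": "XXS",  # screening / handover check
    "TC-002": "S",    # pre-release check
    "TC-003": "M",    # release test, part 1
    "TC-004": "L",    # release test, part 2
}

def select(level, cases):
    """All cases included in the given test level."""
    return sorted(tc for tc, lv in cases.items() if RANK[lv] <= RANK[level])

print(select("S", cases))  # screening + pre-release cases
print(select("L", cases))  # the complete release test (100% of cases)
```

Under this model the two-part release delivery is simply `select("M")` first, then the remaining L-only cases on the same, unchanged SW version.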

Types of tests

Module tests (MP3 Player, GPS, HMI) - smoke tests

Integration tests (e.g. MP3 player vs HMI)

Verification tests

Validation tests

System requirements document?

Improvements for applications (from the user's point of view)?

Test suites

A test suite is a collection of test cases grouped based on one or more criteria

A system can have an unlimited number of test suites based on different criteria:

- A test suite for HMI tests

- A test suite for Performance tests

- A test suite for Navigation tests

- A test suite for Entertainment tests

- etc.
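Grouping by criterion can be sketched with Python's standard unittest suites; the test classes and their contents below are invented placeholders for real HMI and Navigation cases.

```python
import unittest

# Placeholder test case classes, one per grouping criterion.
class HmiTests(unittest.TestCase):
    def test_menu_opens(self):
        self.assertTrue(True)  # placeholder check

class NavigationTests(unittest.TestCase):
    def test_route_is_computed(self):
        self.assertTrue(True)  # placeholder check

def make_suite(case_classes):
    """Build one test suite from a group of test case classes."""
    loader = unittest.TestLoader()
    suite = unittest.TestSuite()
    for cls in case_classes:
        suite.addTests(loader.loadTestsFromTestCase(cls))
    return suite

hmi_suite = make_suite([HmiTests])                  # one criterion
full_suite = make_suite([HmiTests, NavigationTests])  # combined criteria
print(hmi_suite.countTestCases(), full_suite.countTestCases())  # → 1 2
```

Because the same test case classes can be added to any number of suites, a system can indeed have an unlimited number of suites over one fixed set of cases.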

Test plan - Test suite - Test package

Verification test - Examples

Validation test - Example

Evaluation

Create System requirements document (1p)

Create Test plan (1p)

Improvements for HMI, GPS, MP3 player and speech applications (1p)

Create 2 verification tests (1p)

Create 2 validation tests for each application (1p)

Free tests (EBTs - Experience Based Testing) on HMI, GPS, MP3 player, Speech applications (1p)

Find Bugs