INFSO-RI-261611 2010 © Members of EMI collaboration PUBLIC 1 / 47
EUROPEAN MIDDLEWARE
INITIATIVE
DSA2.1 – SOFTWARE QUALITY
ASSURANCE PLAN
EU DELIVERABLE: DSA2.1
Document identifier: EMI-DSA2.1-1277599-QA_Plan-v1.2.doc
Date: 31/05/2010
Activity: SA2 – Quality Assurance
Lead Partner: CERN
Document status: Final
Document link: http://cdsweb.cern.ch/record/1277599
Abstract:
This deliverable contains the definition of the global software QA processes, procedures,
roles and responsibilities and the related metrics and measurement methodologies for the
EMI (European Middleware Initiative) project. What is described in this document applies to the
software teams in their development, release and support activities.
Copyright notice:
Copyright (c) Members of the EMI Collaboration. 2010.
See http://www.eu-emi.eu/about/Partners/ for details on the copyright holders.
EMI (“European Middleware Initiative”) is a project partially funded by the European Commission. For more information on the project, its partners and contributors please see http://www.eu-emi.eu.
This document is released under the Open Access license. You are permitted to copy and distribute verbatim copies of this document containing this copyright notice, but modifying this document is not allowed. You are permitted to copy this document in whole or in part into other documents if you attach the following reference to the copied elements: "Copyright (C) 2010. Members of the EMI Collaboration. http://www.eu-emi.eu ".
The information contained in this document represents the views of EMI as of the date they are published. EMI does not guarantee that any information contained herein is error-free, or up to date.
EMI MAKES NO WARRANTIES, EXPRESS, IMPLIED, OR STATUTORY, BY PUBLISHING THIS DOCUMENT.
Delivery Slip
From: Maria Alandes Pradillo (CERN/SA2), 18/07/2010
Reviewed by: Giuseppe Fiameni (CINECA/SA1) and Andrea Ceccanti (INFN/JRA1), 08/08/2010
Approved by: PEB, 23/08/2010
Document Log
Issue Date Comment Author / Partner
0.1 23.06.2010 First draft of the document. Missing Procedures. Maria Alandes Pradillo
0.2 24.06.2010 Added Procedures. Maria Alandes Pradillo
0.3 02.07.2010 New version after phone conference feedback. Maria Alandes Pradillo
0.4 18.07.2010 Added Metrics and Quality Factors. Maria Alandes Pradillo
1.0 10.08.2010 Added comments from official reviewers. Maria Alandes Pradillo
1.1 20.08.2010 Added Deadlines for Guidelines. Maria Alandes Pradillo
1.2 23.08.2010 Updated Executive Summary and Conclusions Alberto Aimar
1.2 23.08.2010 Final release PEB
Document Change Record
Issue 1.1 – Deadlines for guidelines: the different satellite guideline documents need to have clear release dates, which were missing.
Issue 1.2 – Updated Executive Summary and Conclusions: the executive summary and conclusions were too short and not descriptive enough to be useful.
TABLE OF CONTENTS
1. INTRODUCTION ......................................................................................................................................... 6
1.1. PURPOSE .................................................................................................................................................. 6
1.2. DOCUMENT ORGANISATION ..................................................................................................................... 6
1.3. APPLICATION AREA ................................................................................................................................. 6
1.4. REFERENCES ............................................................................................................................................ 6
1.5. DOCUMENT AMENDMENT PROCEDURE .................................................................................................... 7
1.6. TERMINOLOGY ......................................................................................................................................... 7
2. EXECUTIVE SUMMARY ......................................................................................................................... 10
3. SQA FACTORS ........................................................................................................................................... 12
4. SQA MANAGEMENT ................................................................................................................................ 13
4.1. SQA ORGANISATION ............................................................................................................................. 13
4.2. SQA TASKS ........................................................................................................................... 14
4.2.1 SQA Tasks Summary .................................................................................................................. 14
4.2.2 Documentation tasks .................................................................................................................. 15
4.2.3 Minimum Documentation Requirements ............................................................................................... 15
4.2.4 Technical Development Plan ................................................................................................................ 18
4.2.5 Software Release Plan .......................................................................................................................... 18
4.2.6 Software Release Schedule ................................................................................................................... 19
4.2.7 Software Maintenance and Support Plan ............................................................................................. 19
4.2.8 QA Tools ............................................................................................................................................... 20
4.2.9 Continuous Integration and Certification Testbeds .............................................................................. 20
4.2.10 Security Assessment Plan ................................................................................................................... 20
4.2.11 Security Assessments .......................................................................................................................... 20
4.2.12 Review Tasks ............................................................................................................................. 21
4.2.13 Review of the Minimum Required Documentation ............................................................................. 22
4.2.14 Review the Technical Development Plan ............................................................................................ 22
4.2.15 Review of the Software Release Plan .................................................................................................. 22
4.2.16 Review of the EMI software components ............................................................................................ 23
4.2.17 Review the Software Release Schedule ............................................................................................... 23
4.2.18 Review the Software Maintenance and Support Plan ......................................................................... 24
4.2.19 Review of the EMI Production Releases ............................................................................................. 24
4.2.20 Review the Security Assessments ........................................................................................................ 24
4.2.21 Review the SQAP ................................................................................................................................ 25
4.2.22 Reporting tasks ......................................................................................................................... 25
4.2.23 Periodic QA Reports ........................................................................................................................... 25
4.2.24 Periodic Software Development Quality Control ................................................................................ 26
4.2.25 Periodic Software Maintenance Quality Control ................................................................................ 26
5. PROCEDURES ............................................................................................................................................ 27
5.1. MEETINGS .............................................................................................................................................. 27
5.2. HOW TO CARRY OUT QA REVIEWS ....................................................................................................... 27
5.3. SUPPORT TO PRODUCT TEAMS ............................................................................................................... 29
6. METRICS .................................................................................................................................................... 30
6.1. CRITICAL BUG METRICS ........................................................................................................................ 30
6.2. BUG SEVERITY DISTRIBUTION ................................................................................................................ 31
6.3. BACKLOG MANAGEMENT INDEX ............................................................................................................ 32
6.4. FAILED BUILDS METRIC .......................................................................................................................... 33
6.5. INTEGRATION TEST EFFECTIVENESS METRIC .......................................................................................... 34
6.6. UP-TO-DATE DOCUMENTATION METRIC ................................................................................................. 35
6.7. DELAY ON RELEASE SCHEDULE METRIC ................................................................................................. 36
6.8. UNIT TEST COVERAGE METRIC ............................................................................................................... 37
6.9. MEMORY LEAKAGE WARNINGS METRIC ................................................................................................. 38
6.10. CODE COMMENTING METRIC .................................................................................................................. 39
6.11. NUMBER OF SUPPORTED PLATFORMS METRIC ........................................................................................ 40
6.12. TOTAL BUG DENSITY METRIC ................................................................................................................. 41
6.13. BUG DENSITY PER RELEASE METRIC ....................................................................................................... 42
6.14. TOTAL USER INCIDENTS METRIC ............................................................................................................ 43
6.15. TRAINING AND SUPPORT INCIDENT METRIC ............................................................................................ 44
6.16. AVERAGE TIME TO DEAL WITH AN INCIDENT .......................................................................................... 45
7. STANDARD PRACTICES AND CONVENTIONS ................................................................................. 46
8. CONCLUSIONS .......................................................................................................................................... 47
1. INTRODUCTION
1.1. PURPOSE
The European Middleware Initiative is a close collaboration of the three major middleware
providers, ARC, gLite and UNICORE, and other software providers. It will deliver a
consolidated set of middleware components for deployment in EGI, PRACE and other
distributed computing infrastructures. The SQAP specifies the manner in which the EMI
project is to achieve its quality goals [R1] in terms of software development.
The SQAP applies to all EMI software components. The list of EMI software components can
be found in the EMI components table of the project deliverable DNA1.3.1 –
Technical Development Plan [R10].
The SQAP will specify which documents are needed, which reviews are going to be carried out and
the roles and responsibilities within the EMI project related to the software lifecycle.
1.2. DOCUMENT ORGANISATION
The Software Quality Assurance Plan is organized as follows:
Chapters 1 and 2 are the introduction and the executive summary respectively.
Chapter 3 presents the quality factors agreed for the project.
Chapter 4 describes the management: who is responsible for quality, what tasks need to be
carried out and which documents ensure the quality of the EMI software.
Chapter 5 describes the procedures: how the software quality assurance plan is managed.
Chapter 6 presents the metrics used to evaluate the quality factors.
Chapters 7 and 8 cover standard practices and conventions, and the conclusions.
1.3. APPLICATION AREA
The Software Quality Assurance Plan refers to the EMI software life cycle and primarily affects the
SA1, SA2 and JRA1 activities.
1.4. REFERENCES
Table 1: Table of References
R1 Eric J. Braude – Software Engineering – An Object-Oriented Perspective
R2 IEEE Standard For Software Quality Assurance Plans
R3 IEEE Guide for Software Quality Assurance Planning
R4 ISO/IEC 12207 – Software Life Cycle Processes
R5 ISO/IEC-9126 – Software Engineering – Product Quality
R6 ITIL v3
http://www.itil-officialsite.com/Qualifications/ITILV3QualificationScheme.asp
R7 LCG gLite Testing Guidelines
https://twiki.cern.ch/twiki/bin/view/LCG/LCGgliteTestWritingGuidelines
R8 EMI DoW
https://twiki.cern.ch/twiki/pub/EMI/EmiDocuments/EMI-Part_B_20100624-PUBLIC.pdf
R9 The McCall Quality Model
http://www.americanscience.org/journals/am-sci/am0603/22_2208_Qutaish_am0603_166_175.pdf
R10 EMI Technical Development Plan – Project Deliverable DNA1.3.1 (to be released)
R11 Atlassian Bamboo
http://www.atlassian.com/software/bamboo
R12 Valgrind framework
http://valgrind.org/
R13 The Eclipse Test & Performance Tools Platform
http://www.eclipse.org/tptp/
R14 EMI Incident Management System (GGUS)
https://gus.fzk.de/pages/home.php
R15 LCG-ROLLOUT mailing list
https://www.jiscmail.ac.uk/cgi-bin/webadmin?A0=LCG-ROLLOUT
1.5. DOCUMENT AMENDMENT PROCEDURE
Amendments, comments and suggestions should be sent to the SA2.2 task leader.
This document can be amended by the Quality Assurance team (SA2) further to any feedback
from the other teams. Minor changes, such as spelling corrections, content formatting or
minor text reorganisation not affecting the content and meaning of the document can be
applied by SA2 without prior review. Other changes must be peer reviewed and submitted
to the PEB for approval.
When the document is modified for any reason, its version number shall be incremented
accordingly. The document version number shall follow the standard EMI conventions for
document versioning. All versions of the document shall be maintained using the document
management tools selected by the EMI project.
1.6. TERMINOLOGY
This section explains the main terms and acronyms used in the document. These are consistent with
the definition in the EMI DoW [R8]. The Quality Assurance activities will use existing ISO standards
for the definition of the common terminology and software lifecycle phases and will follow the
guidelines of the good-practice methodologies described in CMMi and ITIL.
API Application Programming Interface
APT Advanced Packaging Tool
BDII Berkeley Database Information Index
CMMi Capability Maturity Model Integration
DoW Description of Work
GGUS Global Grid User Support
EMT Engineering Management Team
ITIL Information Technology Infrastructure Library
KLOC Thousands of Lines of Code
KPI Key Performance Indicator
LOC Lines of Code
PEB Project Executive Board
PT Product Team
PTB Project Technical Board
SDP Software Development Plan
SQA Software Quality Assurance
SQAP Software Quality Assurance Plan
SQC Software Quality Control
WP Work Package
YUM Yellowdog Updater Modified
Table 2: Table of Acronyms
Incident: An unplanned interruption to an IT Service or a reduction in the Quality of an IT Service.
Problem: A cause of one or more Incidents. The cause is not usually known at the time a Problem Record is created, and the Problem Management Process is responsible for further investigation.
Problem Record: A Record containing the details of a Problem. Each Problem Record documents the Lifecycle of a single Problem.
Product Team: Product Teams are teams of software developers fully responsible for the successful release of a particular software product (or group of tightly coupled related products) compliant with agreed sets of technical specifications and acceptance criteria.
Request: A request from a User for information, or advice, or for a Standard Change, or for Access to an IT Service.
Software Development Plan: A project plan for a software development project.
Software Measurement: A measurement is an indication of the size, quantity, amount or dimension of a particular attribute of a product or process. For example, the number of errors in a system is a measurement.
Software Metric: A metric is a measurement of the degree to which an attribute belongs to a system, product or process. For example, the number of errors per person-hour would be a metric.
Software Quality Assurance: A set of activities designed to evaluate the process by which software products are developed or manufactured.
Software Quality Assurance Plan: A document that describes the procedures, documents, roles and responsibilities that will ensure the quality of the software development process.
Software Quality Control: A set of activities designed to evaluate the quality of developed or manufactured software products. In addition, in EMI, SQC also verifies that User Requirements are satisfied.
Work Package: The EMI project is composed of two Networking Work Packages (NA1 and NA2), two Service Work Packages (SA1 and SA2) and one Joint Research Work Package (JRA1).
Table 3: Table of Definitions
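The distinction drawn above between a software measurement (a raw count) and a software metric (a relation between measurements) can be sketched with a small, purely illustrative example. This sketch is not part of the SQAP; the function name and all numbers are hypothetical:

```python
def bugs_per_kloc(bug_count: int, lines_of_code: int) -> float:
    """Metric: relates two measurements (bug count and code size)
    to give the number of bugs per thousand lines of code (KLOC)."""
    return bug_count / (lines_of_code / 1000)

# Measurements: raw counts of product attributes (hypothetical values).
bugs_found = 12
code_size_loc = 48_000

# Metric: errors per KLOC, derived from the two measurements above.
print(bugs_per_kloc(bugs_found, code_size_loc))  # prints 0.25
```

The same pattern applies to the person-hour example in the table: the metric is always a ratio or degree computed from one or more raw measurements.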
2. EXECUTIVE SUMMARY
The Software Quality Assurance Plan (SQAP) specifies the tasks needed to ensure quality, the
responsibilities for quality monitoring, the required documentation and the procedures followed
to manage the software process. In practice, the SQAP specifies the manner in which
the EMI project is going to achieve its quality goals.
The quality factors that have been agreed and their utility for the project are explained. Metrics and
measurements are associated with the quality factors in order to evaluate the quality of the EMI software
lifecycle. These focus on Product Operation, Revision and Transition.
SQA tasks, roles and responsibilities of the EMI technical activities (SA1, SA2 and JRA1) are
described as well as those of the EMI technical bodies (PTB and EMT). The SQA tasks follow the
EMI DoW guidelines and cover documentation tasks, reporting tasks and reviewing tasks.
The documentation tasks include the minimum documentation requirements for each software
component and service, which must be provided by every EMI product team. Other documents, on plans
and schedules, are used mainly for the strategic and technical coordination controlled by the PTB and the
EMT groups.
Two documents are particularly important as they define and guide the general project-wide technical
objectives and the plans to achieve them:
Technical Development Plan provides the details of the technical development plan for all
EMI services. It contains an initial status assessment, a list of known requirements,
requirement prioritisation and a plan with agreed deliverables and milestones.
Software Release Plan describes the release procedures and policies for the middleware
services and the complete EMI reference distributions. It also contains the initial release
schedule to be prepared by the EMT in collaboration with the PTB and the JRA1 work
package.
Additional documents are needed to describe the software engineering tools and the repository
management systems provided by SA2 to EMI and third-party users; and to describe the distributed
certification test beds for internal and acceptance certification and their access and usage requirements.
The reviews help ensure the quality of the EMI software. Review tasks are under the
supervision of SA2 as a whole and in particular of SA2.5, which is responsible for collecting and
storing the reviews and for making sure they are conducted following the SQAP. The reviews will verify all
artefacts and documents produced and create review reports and tables that summarize the status of the
software management and development activities.
This document also describes the procedures in place for managing the software quality assurance
process. It provides details about the communication means (meetings, mailing lists, twiki areas, etc.)
to be used for QA activity, how the reviews will be performed and the interaction workflow among the
different work packages and project bodies. It also describes the change management: details on how
to change the SQA documents if the results of a review do not meet the quality expectations.
The last section presents the list of metrics used for the different reviews planned in the SQAP. The
list includes metrics on the development and support of the products (external metrics) and
metrics on the artefacts such as documentation and code (internal metrics). These metrics can be modified
throughout the lifetime of the project, and it will be possible to add new metrics or remove obsolete
ones according to the needs of the SQA process.
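As an illustration of how such a metric might be computed, consider the backlog management index listed in section 6.3. A commonly used definition is the ratio of problems closed to problems arrived over a reporting period, expressed as a percentage; the exact EMI definition is given in the metrics section itself, so the formula and figures below are only a hedged sketch:

```python
def backlog_management_index(closed: int, arrived: int) -> float:
    """Commonly used BMI definition: (closed / arrived) * 100 over a
    reporting period. Values above 100 mean the bug backlog is shrinking;
    values below 100 mean it is growing."""
    if arrived == 0:
        raise ValueError("no problem arrivals in the reporting period")
    return 100.0 * closed / arrived

# Hypothetical quarter: 45 bugs closed while 50 new bugs were reported.
print(backlog_management_index(45, 50))  # prints 90.0
```

A BMI of 90 in this hypothetical quarter would indicate that the backlog grew slightly, which is the kind of signal the periodic QA reports are meant to surface.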
For technical details, five satellite documents are being maintained to serve as guidelines for some of
the software tasks (release, integration, certification, testing, etc.).
3. SQA FACTORS
This section presents the quality factors [R9] that the project has agreed to respect and their utility for the
project. Metrics and measurements will be associated with the quality factors to be able to evaluate the
quality of the EMI software lifecycle.
They are grouped in three parts:
Product Operation
Usability: The software must be easy to use, providing intuitive, effective and efficient
interfaces to end users.
Correctness: The software must meet the objectives of the end users and satisfy their
requirements.
Reliability: The software must consistently perform according to its specifications and report
any internal error that might influence its functionality.
Availability: The software must allow the system where it has been deployed to remain
operational even when faults occur. The system must remain operational even in the presence of a
high load on hardware resources and continue operating, possibly at reduced capacity, depending
on the number of user requests it is receiving.
Integrity: The software must guarantee a high level of security regarding the protection of data
against loss, theft or unauthorized access.
Product Revision
Maintainability: An indicator of the ease with which a component, device
or system can be maintained and repaired.
Flexibility: An indicator of the ease with which a component, device or system
can be adapted to changes.
Testability: An indicator of the ease with which a component, device or system
can be tested.
Product Transition
Portability: The software must be easy to port to different platforms.
4. SQA MANAGEMENT
4.1. SQA ORGANISATION
Each product team member is responsible for the quality of his or her work. In addition, the
following roles are part of the organisational structure that directly influences the quality of
the software:
SA2 – Quality Assurance
SA2 Quality Assurance activity leader: Alberto Aimar: responsible for the whole SQA layer
of EMI.
SA2.2 Quality Assurance plan task leader: Maria Alandes: responsible for the Software
Quality Assurance Plan, which is a plan written at project level to declare commitment to
follow an applicable set of standards, regulations, procedures and tools during the
development lifecycle.
SA2.3: Metrics and KPIs task leader: Eamonn Kenny: responsible for the definition, collection
and revision of software quality metrics.
SA2.4: Tools and Repositories task leader: Lorenzo Dini: responsible for the set up and
maintenance of the tools needed in the QA process.
SA2.5: QA Implementation review and support task leader: Jozef Cernak: responsible for the
review activities and support to the PTs as far as QA is concerned.
SA2.6: Test beds task leader: Danilo Dongiovanni: responsible for the set up and maintenance
of test beds.
JRA1 – Middleware Development, Evolution and Integration
JRA1 Quality Control task leader: Andrea Ceccanti: responsible for the Software Quality
Control layer, which ensures both SQA and SQAP defined by SA2 are being followed by the
development teams.
JRA1 leader: Morris Riedel, responsible for the Middleware Development, Evolution and
Integration WP.
JRA1.2 leader: Massimo Sgaravatto, responsible for the Compute Area.
JRA1.3 leader: Patrick Fuhrmann, responsible for the Data Area.
JRA1.4 leader: John White, responsible for the Security Area.
JRA1.5 leader: Laurence Field, Infrastructure Area.
SA1 – Maintenance and Support
SA1 Quality Control task leader: Giuseppe Fiameni: responsible for the Software Quality
Control layer, which ensures both SQA and SQAP defined by SA2 are being followed by the
development teams.
SA1 leader: Francesco Giacomini, responsible for the Software Maintenance and Support WP.
SA1.3 Release Manager: Cristina Aiftimiei.
The following boards and management roles may be involved in taking decisions affecting the
SQA procedures when there is a conflict or a critical change:
Technical Director: decides on specific technical matters within the project and leads the
Project Technical Board.
PEB: responsible for assisting the Project Director in the execution of the project plans and in
monitoring the milestones, achievements, risks and conflicts within the project. It is led by the
PD and is composed of the Work Package Leaders, the Technical Director and the Deputy
Technical Director.
PTB: is led by the Technical Director, composed of the Technical Area Leaders and a
representative, and is responsible for assisting the Technical Director in defining the technical
vision of the project and deciding on specific technical issues. The PTB members are the
coordinators of the project Technological Areas. The PTB can invite experts (e.g. product
team leaders, component developers, etc.) or delegate specific tasks to appointed working
groups as required.
EMT: is led by the Release Manager and composed of the Product Team Leaders (or duly
appointed representatives), a QA representative, a Security representative, representatives of
the Operations teams of the major infrastructures (EGI, PRACE, etc.) and invited experts as
necessary. The role of the EMT is to manage the release process and the 'standard' changes,
that is, all the changes that are pre-approved based on established policies and do not need to
be individually approved by the PEB or the PTB. This includes software defect triaging and
prioritization, technical management of releases, integration issues, and user support request
analysis.
4.2. SQA TASKS
4.2.1 SQA Tasks Summary
A summary of the SQA tasks is given below. They have been defined following the guidelines in
the EMI DoW [R8], the IEEE Standard for Software Quality Assurance Plans [R2] and the IEEE Guide for
Software Quality Assurance Planning [R3]. For a complete description of each task, please check the
following sections:
Documentation tasks:
o Definition of the Minimum Required Documentation for each software component.
o Definition of the Technical Development Plan: There is a project deliverable
DNA1.3.1 – Technical Development Plan.
o Definition of the Software Release Plan: There is a project deliverable DSA1.2 -
Software Release plan.
o Definition of the Software Maintenance and Support Plan: There is a project
deliverable DSA1.1 - Software Maintenance and Support Plan.
o Definition of the QA Tools Documentation: There are several project deliverables for
QA Tools Documentation: DSA 2.2.1, DSA2.2.2 and DSA2.2.3.
o Definition of the Continuous Integration and Certification Test beds: There is a
project deliverable DSA2.4 - Continuous Integration and Certification Test beds.
o Security Assessments.
Reporting tasks:
o Periodic QA Reports: There are project deliverables DSA2.3.1 – DSA2.3.4 at the end
of every project quarter.
o Periodic Software Development Quality Control: There are project deliverables
DJRA1.7.1 – DJRA1.7.4 at the end of every project quarter.
o Periodic Software Maintenance Quality Control: There are project deliverables
DSA1.3.1 – DSA1.3.4 at the end of every project quarter.
Review tasks:
o Review of the Minimum Required Documentation for each software component.
o Review of the status of the Technical Development Plan.
o Review of the status of the Software Release Plan.
o Review of the EMI software component releases.
o Review of the status of the Software Release Schedule.
o Review of the status of the Software Maintenance and Support Plan.
o Review of the EMI Production Releases.
o Review of the Security Assessments.
o Review of the SQAP.
4.2.2 Documentation tasks
This section defines the documents governing the development, verification and validation, use and
maintenance of the software.
The documents will be accessible from the SQAP twiki page under:
https://twiki.cern.ch/twiki/bin/view/EMI/SQAP#SQAP_Documentation
Changes and updates to these documents will be notified using the SA2.2 activity mailing list.
4.2.3 Minimum Documentation Requirements
Description: This document identifies the documentation governing the development, verification,
validation, use and maintenance of the software.
Deadline: It's part of the SQAP.
Author: Maria Alandes
Requirements: The Minimum Documentation Requirements for the software components
and services are:
Software Requirements Specifications
Software Design Description
Software Verification and Validation Plan
Software Verification and Validation Report
User Documentation
Installation Guide
Troubleshooting guide
Software Requirements Specifications
Description: The Software Requirements Specifications should define the requirements of the
software component.
Author: PTB
Requirements: The PTB will decide whether a detailed document describing the software
requirements is needed; if not, a tracking tool will be used to describe and track the
requirements. Validation criteria (which tests are needed to verify the requirements) should also be
included.
Software Design Description
Description: The Software Design Description should describe the software architecture of the
software component.
Author: PTB or a person delegated by the PTB.
Requirements: The PTB will decide, on a per-component basis, whether a detailed document
describing the software design is needed.
Software Verification and Validation Plan
Description: The Software Verification and Validation Plan document should describe the strategy
that will be adopted in testing each software component.
Author: Product teams and PTB.
Requirements: The set of tests included in the Software Verification and Validation Plan should
contain:
Unit and Functional tests
o Unit Testing: Testing of individual software units.
o Functional Testing: Testing conducted to evaluate the compliance of a component
with specified functional requirements:
CLI: every command of the CLI, every available option.
API: every function or method.
GUI: correct functioning of graphical interfaces.
Integration tests
o System Testing: verify that the component works within the grid in interaction with
other grid services.
o Deployment Testing: verify that the component installs and configures correctly for
both upgrade and clean installations.
Regression tests
Performance and Scalability tests
Definition of which set of tests is executed for each type of release (major, minor and
revision).
Complete and detailed guidelines on what should be included in the Verification and Validation Plan
will be given as described in section 7.
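The plan does not prescribe a test framework. As a minimal sketch of the unit and functional categories listed above (the component, its URL scheme and the CLI stand-in are hypothetical assumptions, not part of the plan), tests might look like:

```python
import subprocess
import sys
from urllib.parse import urlparse


def parse_storage_url(url):
    """Hypothetical software unit: split an SRM-style URL into host and port."""
    parsed = urlparse(url)
    return parsed.hostname, parsed.port


def test_unit_parse_storage_url():
    # Unit testing: exercise one individual software unit in isolation.
    assert parse_storage_url("srm://se.example.org:8443/data") == \
        ("se.example.org", 8443)


def test_functional_cli():
    # Functional testing: drive the component through its CLI, covering
    # every command and every available option; the Python interpreter
    # stands in here for a real component command.
    result = subprocess.run([sys.executable, "--version"],
                            capture_output=True, text=True)
    assert result.returncode == 0


if __name__ == "__main__":
    test_unit_parse_storage_url()
    test_functional_cli()
    print("all tests passed")
```

A real Verification and Validation Plan would enumerate one such test per CLI command and option, per API function, and per GUI interaction.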
Software Verification and Validation Report
Description: The Software Verification and Validation Report should contain the result of the tests
specified in the Software Verification and Validation Plan. This report should be provided for each
type of release, as specified in the Software Verification and Validation Plan.
Author: Product teams
Requirements: The Verification and Validation Report should at least contain:
Test coverage of the code
The test report should contain the test results of all the tests included in the Software
Verification and Validation plan according to the type of release (major, minor or revision).
Complete and detailed guidelines on what should be included in the Verification and Validation
Report will be given as described in section 7.
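As an illustration of how such a report entry could be assembled (the field names, the mapping of test categories to release types, and the figures are assumptions for the sketch, not part of the plan):

```python
# Map each release type to the test categories its report must cover
# (illustrative assignment, to be fixed by the V&V Plan itself).
TESTS_PER_RELEASE_TYPE = {
    "revision": {"unit", "functional", "regression"},
    "minor": {"unit", "functional", "regression", "deployment"},
    "major": {"unit", "functional", "regression", "deployment",
              "system", "performance"},
}


def build_report(release_type, results, coverage):
    """results maps a test category to a list of (test name, passed) pairs."""
    required = TESTS_PER_RELEASE_TYPE[release_type]
    missing = required - results.keys()
    pass_rate = {
        category: sum(1 for _, ok in runs if ok) / len(runs)
        for category, runs in results.items()
    }
    return {
        "release_type": release_type,
        "coverage": coverage,                   # test coverage of the code
        "pass_rate": pass_rate,                 # per-category pass rate
        "missing_categories": sorted(missing),  # categories with no results
    }


report = build_report(
    "revision",
    {"unit": [("parse_url", True)], "functional": [("cli_help", True)],
     "regression": [("bug_1234", True), ("bug_5678", False)]},
    coverage=0.71,
)
print(report["pass_rate"]["regression"])  # 0.5
```

The `missing_categories` field makes it easy for a reviewer to see at a glance whether the report covers all the tests required for that release type.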
User Documentation
Description: The User Documentation should contain all the necessary information needed by the
users of the software.
Author: Product teams.
Requirements: The User documentation for each EMI software component should contain, when
applicable:
Introduction to the software component describing the main functionality and architecture.
Installation and configuration instructions.
How to access, log on to, and sign off from the software.
Information on software commands: required parameters, optional parameters, default options,
order of commands, and syntax.
Description of all error situations which can occur and how to react.
Glossary with terms specific to the usage of the software component.
References to all documents or manuals intended for use by the users.
o Known Issues
o System Administrator Guide (if any)
o Tutorials
o Developer's Guide, i.e. API documentation and build instructions.
Installation Guide
Description: The Installation Guide should contain the necessary instructions on how to install and
configure the software.
Author: Product teams
Requirements: The Installation Guide for each EMI software component should contain:
The list of supported platforms
Software and Hardware requirements to install the software component
Installation instructions
o Supported installation tools
o Installation command
Configuration instructions
o Default configuration variables
o Mandatory configuration variables
o Configuration command
Known Issues during Installation and Configuration of the software component.
Troubleshooting guide
Description: The Troubleshooting Guide of each EMI software component should help users
track down and solve problems.
Author: Product teams
Requirements: The Troubleshooting Guide for each EMI software component should describe the
most common scenarios where users have problems in the following areas:
Installation
Configuration
Administration
Debugging
4.2.4 Technical Development Plan
Description: There is a project deliverable DNA1.3.1 – Technical Development Plan. This document
provides the details of the technical development plan for all EMI services. It contains an initial status
assessment, a list of known requirements and their prioritization and a plan with deliverables and
milestones for each Product Team.
Deadline: PM3.
Author: Balazs Konya
Requirements: The Technical Development Plan should specify the following items for each
software component:
The software component description.
The responsible product team.
The work plan for the implementation of the software component including the list of the
milestones concerning the release of each feature.
The description of the activities and resources required and planned to meet specifications
detailed in requirements documents.
4.2.5 Software Release Plan
Description: There is a project deliverable DSA1.2 - Software Release plan. This document describes
the release procedures and policies for the middleware services and the complete EMI reference
distributions. It also contains the initial release schedule to be prepared in collaboration with the PTB
and the JRA1WP.
Deadline: PM3.
Author: Cristina Aiftimiei
Requirements: The Software Release Plan should specify the way software components are going to
be released into the Production EMI Repository. It should define the following:
How to manage revision, minor and major releases.
How to manage external dependencies.
The build process.
The communication channels: mailing lists, meetings.
The software delivery strategy for the Production Infrastructures: i.e. where to upload software
packages and which metadata is required.
4.2.6 Software Release Schedule
Description: The Software Release Schedule should define the dates for which certain versions of the
software components need to be released in the Production EMI Repository. It has to take into account
the priorities of the project that need to be aligned with the priorities of the different infrastructures
using EMI.
Deadline: Every three months, according to the EMI DoW [7].
Author: Cristina Aiftimiei
Requirements: The Software Release Schedule should contain:
List of software components to be released in the next three months.
Estimated dates for the release of each component.
4.2.7 Software Maintenance and Support Plan
Description: There is a project deliverable DSA1.1 - Software Maintenance and Support Plan. This
document describes the Software Maintenance and Support processes, the roles and responsibilities
and the main metrics to be used for Service Level Agreements.
Deadline: PM3.
Author: Francesco Giacomini
Requirements: The Software Maintenance and Support Plan should describe the way to maintain and
support the EMI middleware. It should define the following:
How to handle incidents reported by EMI users using GGUS.
How to handle requests coming from EMI users or other PTs.
How to handle problems.
How to handle changes.
All these requirements should be measurable.
4.2.8 QA Tools
Description: There are project deliverables for the QA Tools Documentation: DSA2.2.1, DSA2.2.2
and DSA2.2.3. These documents describe the software engineering tools and the repository
management systems provided by SA2 to EMI and third-party users. The documentation is updated
and revised regularly.
Deadline: PM3, PM10 and PM22.
Author: Lorenzo Dini.
Requirements: The tools documentation should define the following:
How to sign software packages.
How to upload software packages in the EMI repository.
Which technologies are supported for installing the software packages, i.e. APT and YUM.
Description of the tools/scripts used to generate metrics from the tracking tools.
Up to date inventory of the tools used by the product teams.
Dashboard or any other tool needed by Quality Control to monitor the software components.
4.2.9 Continuous Integration and Certification Testbeds
Description: There is a project deliverable DSA2.4 - Continuous Integration and Certification Test
beds. This document describes the distributed certification test beds for internal and acceptance
certification, and their access and usage requirements.
Deadline: PM3.
Author: Danilo Dongiovanni
Requirements: The Continuous Integration and Certification Test beds document should define:
The test bed structure with the list of available services and their versions.
The procedure to access the test bed and run tests.
The monitoring strategy of the test bed.
4.2.10 Security Assessment Plan
Description: The Security Assessment Plan identifies which software components are going to be
assessed and when the assessments are going to take place.
Author: Elisa Heymann
Deadline: PM6
Requirements:
Description of the Security Criteria.
List of components and prioritisation for their assessment.
4.2.11 Security Assessments
Description: Vulnerability assessments will be carried out following the approach called First
Principles Vulnerability Assessment (FPVA). FPVA is a primarily analyst-centric (manual) approach
to assessment whose aim is to focus the analyst's attention on the parts of the software system and its
resources that are most likely to contain vulnerabilities. FPVA is designed to find new threats to a
system; it is not dependent on a list of known threats.
Deadline: As defined in the Security Assessment Plan.
Author: University of Wisconsin and Universitat Autonoma de Barcelona.
Requirements:
Architectural Analysis: identify the major structural components of the system, including
modules, threads, processes, and hosts. For each of these components, identify the way in
which they interact, both with each other and with users. The artefact produced at this stage is
a document that diagrams the structure of the system and the interactions amongst the
different components and with the end users.
Resource Identification: identify the key resources accessed by each component and the
operations supported on those resources. Resources include elements such as hosts, files,
databases, logs, and devices. For each resource, describe its value as an end target or as an
intermediate target. The artefact produced at this stage is an annotation of the architectural
diagrams with resource descriptions.
Trust and Privilege Analysis: identify the trust assumptions about each component, answering
such questions as how they are protected and who can access them. Trust evaluation is also
based on the hardware and software security surrounding the component. Associated with
trust is describing the privilege level at which each executable component runs. The privilege
levels control the extent of access for each component and, in the case of exploitation, the
extent of damage that it can accomplish directly. A complex but crucial part of trust and
privilege analysis is evaluating trust delegation. By combining the information from the first
two steps, we determine what operations a component will execute on behalf of another
component. The artefact produced at this stage is a further labelling of the basic diagrams with
trust levels and labelling of interactions with delegation information.
Component Evaluation: examine each component in depth. For large systems, a line-by-line
manual examination of the code is infeasible, even for a well-funded effort. A key aspect of
our technique is that this step is guided by information obtained in the first three steps, helping
to prioritize the work so that high value targets are evaluated first. The artefacts produced by
this step are vulnerability reports, perhaps with suggested fixes, to be provided to the software
developers.
Dissemination of Results: Vulnerabilities are reported to the PT leaders. Users are never
informed of vulnerabilities until they are fixed and the fix is available to the users.
4.2.12 Review Tasks
The reviews will help to ensure the quality of the EMI software. The metrics associated with each
review are described in section 6.
The review tasks will be under the supervision of the SA2.5 activity that is responsible for collecting
and storing the reviews making sure they happen according to the SQAP. Reviews and Review
Templates will be available from the SQAP twiki page under:
https://twiki.cern.ch/twiki/bin/view/EMI/SQAP#SQAP_Review_Schedule
The following reviews need to be carried out throughout the software lifecycle:
4.2.13 Review of the Minimum Required Documentation
Description: The Review of the Minimum Required Documentation should assess for each software
component the status of the documents that are defined in section 4.2.3.
Frequency: To be included in every SA2 QA report.
Author: Maria Alandes.
Review checklist: follow the checklist below for each software component:
All the documents exist and are up to date.
The contents of the documents conform to the descriptions given in section 4.2.3.
Output: A document containing a table with the EMI software components and the list of the
minimum required documents. A link to each document will be included if it exists and, if applicable,
any metrics or measurements associated with the document. If a document doesn't exist, it will be
marked as NONE. The table will be published in the Periodic QA report and sent to JRA1 so that
developers are aware of what documentation is missing for their software
components.
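The table described above could be assembled mechanically from the documentation links; a minimal sketch (the component name and link are hypothetical, and the real review is done by the named reviewer, not by a script):

```python
# The minimum required documents, as listed in section 4.2.3.
REQUIRED_DOCS = [
    "Software Requirements Specifications",
    "Software Design Description",
    "Software Verification and Validation Plan",
    "Software Verification and Validation Report",
    "User Documentation",
    "Installation Guide",
    "Troubleshooting Guide",
]


def documentation_status(components):
    """components maps a component name to a dict of document name -> link."""
    table = {}
    for name, docs in components.items():
        # A missing document is marked as NONE, as required by the review.
        table[name] = {doc: docs.get(doc, "NONE") for doc in REQUIRED_DOCS}
    return table


status = documentation_status({
    "example-component": {"Installation Guide": "https://example.org/install"},
})
print(status["example-component"]["User Documentation"])  # NONE
```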
4.2.14 Review the Technical Development Plan
Description: The Review of the Technical Development Plan should check that the plan is up to date
and that it reflects the real development plans of the project.
Frequency: PM6, PM9, PM12, PM15, PM18, PM21, PM24, PM27, PM30, PM33, PM36.
Author: Jozef Cernak.
Review checklist: follow the checklist below for each software component:
The software component deliverables and associated completion criteria are up to date. In case
of an extended deadline, a justification exists and is documented.
The schedule and the interrelationships with other software components are up to date.
The responsibility for the component is up to date.
Output: A document that contains a report on the status of every item in the checklist and the
metrics and measurements associated with the Technical Development Plan.
4.2.15 Review of the Software Release Plan
Description: The Review of the Software Release Plan should check that the plan is up to date and
that it describes the actual release process.
Frequency: PM6, PM9, PM12, PM15, PM18, PM21, PM24, PM27, PM30, PM33, PM36.
Author: Giuseppe Fiameni.
Review checklist: follow the checklist below:
The list of supported platforms corresponds to the actual set of platforms on which software
components are released.
Installation of external dependencies is well documented.
Instructions to use the supported build systems are up to date.
The list of supported delivery software formats is up to date (source and binary packages,
tarballs, package lists, etc).
The description of the process on how to handle changes is up to date.
The communication channels are published with updated information about:
Mailing lists developers should use for discussion, requests, announcements,
etc.
Meetings and the purpose of each of them.
The process for delivering software to the Production Infrastructures is up to date and
aligned with what the Production Infrastructures are expecting.
Output: A document that contains a report on the status of every item in the checklist and the
metrics and measurements associated with the Software Release Plan.
4.2.16 Review of the EMI software components
Description: The Review of the software component should check that the software components are
meeting the requirements described below.
Frequency: Every week.
Author: Andrea Ceccanti
Review checklist: follow the checklist below:
The software component contains the required code metrics.
The software component meets the naming and packaging conventions.
The software component builds successfully in the supported platforms.
The necessary tests (unit and functional tests) have been carried out and have been successful
according to the Test Plan of the component.
The corresponding bug tracker items are consistent and reflect the release status of the
component.
Tickets assigned to each product team are dealt with.
If applicable, security vulnerabilities have been addressed.
Output: The status of each item in the checklist should be integrated and monitored in a dashboard
that will automatically collect and publish the relevant metrics and measurements associated with the
software component. Alarms and notifications about deviations should be monitored by Andrea Ceccanti.
4.2.17 Review the Software Release Schedule
Description: The Review of the Software Release Schedule should check that the priorities of the
project are taken into account and reflected in the scheduled releases.
Frequency: PM6, PM9, PM12, PM15, PM18, PM21, PM24, PM27, PM30, PM33, PM36.
Author: Giuseppe Fiameni
Review checklist: follow the checklist below:
Check whether the previous schedule has been kept.
Check whether the new schedule takes into account what wasn't accomplished in the previous
schedule.
Check whether the new schedule is aligned with the Technical Development Plan and the
priorities of the project.
Output: A document that contains a report on the status of the checklist and the metrics and
measurements associated with the Software Release Schedule.
4.2.18 Review the Software Maintenance and Support Plan
Description: The Review of the Software Maintenance and Support Plan should check that the plan is
up to date and describes the actual maintenance and support processes and that the SLAs are
respected.
Frequency: PM6, PM9, PM12, PM15, PM18, PM21, PM24, PM27, PM30, PM33, PM36.
Author: Giuseppe Fiameni
Review checklist: follow the checklist below:
The process on how to handle incidents reported by EMI users using GGUS is up to date.
The process on how to handle requests coming from EMI users or other PTs is up to date.
The process on how to handle problems is up to date.
Output: A document that contains a report on the status of the checklist and the metrics and
measurements associated with the Software Maintenance and Support Plan.
4.2.19 Review of the EMI Production Releases
Description: The Review of the EMI Production Releases should check that the release is meeting the
requirements described below.
Frequency: For every EMI Production Release.
Author: Cristina Aiftimiei
Review Checklist: follow the checklist below:
Release notes: they exist and they summarise the most relevant changes in the release.
Repositories: they are updated with the new packages.
Announcement to the relevant users: it is done after the release notes are published and the
repositories updated.
Status of the relevant bug tracker items: it is consistent.
Test reports: they exist and they are successful. They also contain the minimum set of tests
necessary to pass the certification (deployment and regression tests).
Output: A verification report that should be part of the documentation shipped with the EMI release.
The verification report summarises the status of the items in the checklist and also contains any
metrics and measurements associated with the EMI release.
4.2.20 Review the Security Assessments
Description: The Review of the Security Assessments should check that the different stages described
in the First Principles Vulnerability Assessment (FPVA) approach are being followed.
Frequency: Part of the QC reports.
Author: Giuseppe Fiameni
Review Checklist: follow the checklist below:
The Architectural Analysis has been carried out and the output contains a diagram describing
the interactions among components and end users.
The Resource Identification has been carried out and the output contains the resource
descriptions.
The Trust and Privilege Analysis has been carried out and the output contains the trust levels
and the delegation information for all the components and their interactions.
The Component Evaluation has been carried out and the output contains identified
vulnerabilities and their suggested fixes.
The Dissemination of Results has been carried out.
4.2.21 Review the SQAP
Description: The Review of the SQAP should check that the plan is being followed and that all the
activities described in it are being carried out as planned.
Frequency: Every month.
Author: Maria Alandes
Review Checklist: follow the checklist below:
Check that the following tasks are carried out as defined in the plan:
o Documentation tasks
o Review tasks
o Reporting tasks
Check the list of responsibilities is up to date.
Check the description of the SQA procedures is up to date.
Check the list of metrics is up to date.
Check the guidelines are up to date.
Output: A document that contains a report on the status of the checklist.
4.2.22 Reporting tasks
Periodic QA Reports will be produced throughout the lifetime of the project. The following
reports are planned as project deliverables:
4.2.23 Periodic QA Reports
Description: QA Reports will summarise the metrics collected by the SA2.3 activities and the reviews
performed by the SQC task leaders. They will be done in collaboration with the SA2.5 activity.
Deadlines:
DSA2.3.1 - Periodic QA Reports - End of July 2010.
DSA2.3.2 - Periodic QA Reports - End of April 2011.
DSA2.3.3 - Periodic QA Reports - End of April 2012.
DSA2.3.4 - Periodic QA Reports - End of April 2013.
Author: Maria Alandes
Contents: QA reports should summarise the results of the different review tasks and evaluate whether
the quality factors defined by the project are respected.
4.2.24 Periodic Software Development Quality Control
Description: The Periodic Software Development Quality Control reports summarise the status and
performance of the software development activity.
Deadlines:
DJRA1.7.1 – Software Development Quality Control Report – End of July 2010.
DJRA1.7.2 - Software Development Quality Control Report – End of April 2011.
DJRA1.7.3 - Software Development Quality Control Report – End of April 2012.
DJRA1.7.4 - Software Development Quality Control Report – End of February 2013.
Author: Andrea Ceccanti
Contents: The Software Development Quality Control reports should summarise the results of the
EMI software component reviews. They should also contain details on the availability and execution of
unit, functional and compliance tests for the EMI components.
4.2.25 Periodic Software Maintenance Quality Control
Description: The Periodic Software Maintenance Quality Control reports summarise the status and
performance of the software maintenance activity.
Deadlines:
DSA1.3.1 - Software Maintenance Quality Control Report – End of October 2010.
DSA1.3.2 - Software Maintenance Quality Control Report – End of February 2011.
DSA1.3.3 - Software Maintenance Quality Control Report – End of February 2012.
DSA1.3.4 - Software Maintenance Quality Control Report – End of February 2013.
Author: Giuseppe Fiameni
Contents: The Software Maintenance Quality Control reports should summarise the results of the
Software Release Plan Review, the Software Release Schedule Review and the Software Maintenance and
Support Plan Review. They should also contain details on the availability and execution of regression
tests for the supported EMI components and various metrics on released components.
5. PROCEDURES
This section describes the procedures to manage the software quality assurance process. This includes:
- Meetings: Details about the meetings to be held for QA activity.
- Reviews: Details on how the reviews will be carried out and the workflow among the different
work packages and management. It also describes the change management: details on how to
change the SQA documents if the results of a review are not meeting the quality expectations.
- Support: What activities will be done to give support to product teams as far as QA is
concerned.
5.1. MEETINGS
The following meetings will be held to allow project members to discuss and organise the different QA
activities:
- Weekly phone meetings will be held every Wednesday at 10h00 to coordinate the whole SA2
activity. The meeting will be chaired by the SA2 task leader. All members of the SA2 activity
shall participate. SQC task leaders are also welcome to participate. The different task leaders
shall report about the progress of their tasks.
o Indico Agendas: http://indico.cern.ch/categoryDisplay.py?categId=3001
o Minutes of the meetings:
https://twiki.cern.ch/twiki/bin/view/EMI/EmiSa2WeeklyMeetings
Additional meetings will be organised as needed by the different SA2 task leaders to coordinate their
tasks. These meetings will be announced and the people who should attend will be contacted.
SA2 members may also be appointed to take part in other activity meetings to represent the SQA
activity or to ensure that SQA issues are taken into account across the project. One example is the
EMT meetings, where there is always a member of SA2.
Mailing lists are also an important communication channel to discuss open issues, make
announcements or distribute documentation. Registration is open, but the relevant activity or task
leaders should first give their approval. The following mailing lists are used within SA2:
- [email protected] : general mailing list to discuss about SQA issues.
- [email protected] : mailing list to discuss SA2.2 related issues (SQAP).
- [email protected] : mailing list to discuss SA2.3 related issues (Metrics).
- [email protected] : mailing list to discuss SA2.4 related issues (Tools).
- [email protected] : mailing list to discuss SA2.5 related issues (SQA review and support).
- [email protected] : mailing list to discuss SA2.6 related issues (Test beds).
5.2. HOW TO CARRY OUT QA REVIEWS
The QA Reviews should be carried out in the following way:
SA2.2 should prepare the Review Template and make it available from the
https://twiki.cern.ch/twiki/bin/view/EMI/SQAP#SQAP_Review_Schedule twiki page. SA2.2
will send the link to the Review responsible two weeks before the Review deadline.
The person responsible for carrying out the Review should fill in the Review Template. When
there is no such template, as is the case for the continuous review of the EMI software
components, the responsible person can send their comments directly to the EMT mailing list
and afterwards summarise the results in the QC reports.
The Review responsible should send the completed Review back to SA2.2, who will check that all
information is present.
In case any corrective measure is recommended, SA2.2 should inform the SA2 activity leader.
SA2.2 is responsible for updating
https://twiki.cern.ch/twiki/bin/view/EMI/SQAP#SQAP_Review_Schedule with the correct links to
the Review and for announcing on the SA2 mailing list that the Review is available.
The SA2 activity leader is responsible for informing the PEB/PTB about any deviation and
recommended corrective measures if the changes are critical. Otherwise, corrective measures
will be discussed at the EMT, or internally within SA2 if they affect SA2 documentation only.
PEB/PTB/EMT should approve or reject the proposed corrective measure.
The SA2 activity leader should inform SA2.2 whether the corrective measure has been approved
by the PEB/PTB.
SA2.2 should contact the relevant task leaders so that the affected documentation is updated.
The task leaders should send SA2.2 all the updated documentation. SA2.2 should check that
the corrective measure has actually been reflected in the documentation.
SA2.2 is responsible for updating
https://twiki.cern.ch/twiki/bin/view/EMI/SQAP#SQAP_Documentation with the correct links
to the updated documentation and for announcing on the SA2 mailing list that the updated
documents are available.
In order to monitor the implementation of the accepted corrective measures, SA2.2 is
responsible for including an item for each of the accepted corrective measures in the next
relevant Review template.
5.3. SUPPORT TO PRODUCT TEAMS
SA2.5 is responsible for giving support to PTs to implement the SQA process. The following support
activities will be implemented:
After each Review of the Minimum Required Documentation, for those PTs failing to meet
the requirements, SA2.5 will get in touch with them and supply them with templates so that
they can provide the necessary documents.
After each Review of the EMI software components, SA2.5 will contact the PTs failing to
meet the requirements and provide them with the necessary guidelines and standards.
Especially in the area of testing, SA2.5 should help the PTs design and implement the
necessary missing tests.
6. METRICS
The following sections present the list of metrics used in the different reviews planned in the SQAP.
The list of metrics can be modified throughout the lifetime of the project to add new metrics or remove
obsolete metrics according to the needs of the SQA process.
6.1. CRITICAL BUG METRICS
Metric id PRIORITYBUG
Name Average time to handle an immediate priority bug
Description This metric describes the time needed to provide a fix to a bug with the
highest level of priority. It can be calculated from the time a bug is
submitted to the bug tracker to the time the bug fix is released
(bug in "Ready for Review").
This metric could be presented as a time varying graph showing the time
spent in each state.
It could also be presented as a control chart (x axis sample time, y axis is
the average time to handle the bug, with a line for the upper limit stated
by the project).
Unit Time, in hours, and/or as a time-varying series.
Scope Per component, per product team.
Thresholds/target value <= project wide target (input is needed here)
Tools Bug tracker(s).
Review Review of the EMI software components
Quality factor Integrity, Reliability and Correctness.
Goals The goal of this metric is to improve/streamline the processes involved
in producing and releasing a critical fix.
Risks Use of different bug trackers will make convergence difficult.
Priority levels need to be the same for each bug tracker.
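The core calculation behind this metric can be sketched as follows. The tuple-based export format, timestamp layout and function name are illustrative assumptions, since the actual bug trackers and their export formats are still to be agreed:

```python
from datetime import datetime

def average_handle_time(bugs):
    """Average time, in hours, from bug submission to 'Ready for Review'.

    `bugs` is a list of (submitted, ready_for_review) ISO-8601 timestamp
    pairs; this export format is hypothetical, not that of any specific
    bug tracker used in the project.
    """
    fmt = "%Y-%m-%dT%H:%M:%S"
    deltas = [
        (datetime.strptime(done, fmt) - datetime.strptime(start, fmt)).total_seconds() / 3600
        for start, done in bugs
    ]
    return sum(deltas) / len(deltas)

bugs = [
    ("2010-05-01T09:00:00", "2010-05-02T09:00:00"),  # handled in 24 h
    ("2010-05-03T10:00:00", "2010-05-03T22:00:00"),  # handled in 12 h
]
print(average_handle_time(bugs))  # 18.0
```

The same per-bug deltas could feed the time-varying graph or control chart mentioned above, with the project-wide target drawn as the upper control limit.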
6.2. BUG SEVERITY DISTRIBUTION
Metric id BUGSEVERITYDISTRIBUTION
Name Bug severity distribution
Description This metric shows the distribution of bugs per severity level. It is shown
as a histogram with severity levels on the x axis and number of bugs
found in each category on the y axis.
Unit Distribution
Scope Per component, per product team.
Thresholds/target value None; however, analysis of the distribution will vary with time and be
ongoing throughout the project.
Tools Bug tracker
Review Review of the EMI software components
Quality factor Correctness, Reliability and Flexibility
Goals The goal of this metric is to highlight critical components. SA2/JRA1
should attempt to categorize/explain the reason for large numbers of
bugs related to single components.
Risks Usage of different bug trackers: the bug severity has to be clearly
defined for each bug, which is currently not likely to be tracked by all
bug tracking systems.
6.3. BACKLOG MANAGEMENT INDEX
Metric id BACKLOG
Name Backlog management index
Description The metric is calculated using the formula:
(number of problems (bugs) closed with a given release) / (number of
problem arrivals during the previous release period) * 100
It measures the backlog management capability. A value greater
than 100% means that the backlog is being reduced.
Unit %
Scope Per product, per product team
Thresholds/target value >= 100%
Tools Bug tracker
Review Review of the EMI software components
Quality factor Maintainability
Goals Improving quality in software maintenance. Highlighting areas where
the process is backing up on itself.
Risks Use of different bug trackers
6.4. FAILED BUILDS METRIC
Metric id SUCCESSFULBUILDS
Name % of failed nightly builds
Description Number of failed builds over the total number of builds, in %. The
availability of this metric depends on the build tool used. For Java,
Atlassian Bamboo [R11] is used, for example, in UNICORE components.
Unit %
Scope Per component, per product team, per platform type; time-varying.
Thresholds/target value To be defined
Tools Bamboo, ETICS (to be confirmed).
Review Review of the EMI software components
Quality factor Maintainability
Goals Improve software development process. To decrease the amount of time
a component is failing to build. Push developers not to commit code that
does not compile.
Risks Availability of tools to collect the metric. Its usefulness is related to
SDP.
Failed builds due to a problem with a dependency which is not under the
control of the PT.
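The percentage itself is a straightforward ratio; a minimal sketch, with an illustrative function name, could be:

```python
def failed_build_ratio(failed_builds, total_builds):
    """Percentage of failed nightly builds over the total number of
    builds, as defined for the SUCCESSFULBUILDS metric."""
    return failed_builds / total_builds * 100

# e.g. 3 failed nightly builds out of 20 in the sampling period
print(failed_build_ratio(3, 20))  # 15.0
```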
6.5. INTEGRATION TEST EFFECTIVENESS METRIC
Metric id CERTIFICATIONTESTSEFFECTIVENESS
Name Certification tests effectiveness
Description % of bugs found during integration tests of a release over the number of
bugs found in production for the same release.
It could be shown per release or as average over a certain time period.
Unit %
Scope Per component, per product team.
Thresholds/target value An increasing value over time would show positive results.
Tools Bug tracker
Review Review of the EMI software components
Quality factor Availability, Correctness and Usability
Goals Improve integration test effectiveness. The end result should be to
encourage product teams to keep their certification tests up to date and
to improve them, especially by analyzing open bugs.
Risks Bug detection area has to be correctly specified in the bug tracker. Use
of different bug trackers.
6.6. UP-TO-DATE DOCUMENTATION METRIC
Metric id UPDATEDOC
Name Up to date documentation
Description This metric should state whether the documentation is present and up to
date as defined in the Minimum Required Documentation in the SQAP.
Unit (Number of documents provided)/(Number of documents requested by
the SQAP)
Scope Product teams and project tasks
Thresholds/target value 1
Tools Static analysis
Review Review of the Minimum Required Documentation for each software
component.
Quality factor Usability and Maintainability
Goals Improved usability and learnability of the software, tools and processes
used and produced by the project.
Used by Reviews as stated in DSA2.1
Risks The main risk is related to the way the documentation is reviewed.
Sufficient manpower is needed in order to perform a thorough
documentation review.
6.7. DELAY ON RELEASE SCHEDULE METRIC
Metric id DELAYONTHERELEASE
Name Delay on the release schedule
Description This metric could be provided as a histogram showing the delay time (in
days) for each release, weighted using the release time.
Unit (release delay)/(release time) * 100
Scope Software component, EMI releases
Thresholds/target value Ideally the release deadlines should always be met, leading to zero delay
for each release. Proper thresholds have to be defined. The trend of the
delays over time could provide useful hints for process optimization.
Tools Static analysis of release plan.
Review Review of the Software Release Schedule.
Quality factor Flexibility
Goals Improve the release time estimation and/or the processes involved in the
release preparation.
Risks Proper tracking of releases. Automatic collection of delay values.
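The weighting described in the unit field, (release delay) / (release time) * 100, can be sketched as follows; the function name and day-based inputs are illustrative:

```python
def weighted_release_delay(delay_days, release_period_days):
    """Release delay weighted by the planned release period, in %.

    Weighting by the release time keeps a short slip on a long cycle
    comparable to the same slip on a short cycle.
    """
    return delay_days / release_period_days * 100

# e.g. a 6-day slip on a 60-day release cycle
print(weighted_release_delay(6, 60))  # 10.0
```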
6.8. UNIT TEST COVERAGE METRIC
Metric id TESTCOVERAGE
Name Test coverage
Description This metric measures the amount of code that is exercised by the tests.
Different types of coverage metric can be collected from coverage tools
(e.g. line, function, branch).
Unit % of code exercised during testing
Scope Per component, per product team
Thresholds/target value [0,25%) critical, [25%,75%) to be improved, [75%,100%] good. Levels
should be adjusted over time.
Tools Tools depend on the programming language. ETICS provides most of
them.
Review Review of the EMI software component releases.
Quality factors Reliability and Testability.
Goals Improve reliability of the software by improving code coverage of
components over time.
Risks In order to collect this metric we need to integrate a code coverage tool
in the integration tool/s used by different product teams and agree on the
type of coverage reported.
In theory, it is possible to have a code coverage value of 90% whilst
missing the most important functionality of a product team's work.
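The threshold bands stated above can be encoded directly; the function name is illustrative:

```python
def coverage_band(coverage_percent):
    """Map a coverage percentage onto the SQAP bands:
    [0, 25%) critical, [25%, 75%) to be improved, [75%, 100%] good.
    Band boundaries should be adjusted over time, as noted above.
    """
    if coverage_percent < 25:
        return "critical"
    if coverage_percent < 75:
        return "to be improved"
    return "good"

print(coverage_band(10))  # critical
print(coverage_band(50))  # to be improved
print(coverage_band(90))  # good
```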
6.9. MEMORY LEAKAGE WARNINGS METRIC
Metric id MEMORYLEAK
Name Memory leak warnings
Description It measures the number of warnings (or errors) related to memory leaks.
Unit Number of warnings
Scope Per component, per product team. The availability of this metric may
depend on the programming language used and on the availability of tools
to check memory leaks (Valgrind [R12] for C/C++, TPTP [R13] for Java in
Eclipse).
Thresholds/target value Need more understanding of the tool and its usage by product teams.
Tools It depends on the programming language
Review Review of the EMI software component releases.
Quality factor Reliability and Testability
Goals Improve the reliability of the software by reducing memory usage
warnings
Risks A good understanding of memory checking tools (e.g. Valgrind) is
necessary in order to have a meaningful metric that product teams can
rely on.
Valgrind can report false positives, so it is important to introduce it
at an experimental level first. If false positives persist, they must be
suppressed or the metric must be dropped.
6.10. CODE COMMENTING METRIC
Metric id CODECOMMENTS
Name Code comments
Description This metric describes the presence of an adequate amount of comments
in the code. It is calculated by counting the number of lines of comments
over the total number of lines of code.
Unit %
Scope Per component, per product team.
Thresholds/target value Proper targets have to be understood, depending on the language and
tool used.
Tools Tools depend on the programming language
Review Review of the EMI software component releases.
Quality factor Maintainability
Goals Improve maintainability of the software.
Risks Definition of proper thresholds, agreement with product teams.
Values are easy to fudge by replicating comment lines per class/function,
so periodic review by SA1 and/or SA2.5 is needed.
6.11. NUMBER OF SUPPORTED PLATFORMS METRIC
Metric id SUPPORTEDPLATFORMS
Name Number of supported platforms
Description This metric gives the number of platforms supported by each
component over the number of supported platforms as stated in the
SQAP. The term "supported" has to be better specified. Server and client
components might have different targets.
Unit Number of platforms
Scope Per component, per product team
Thresholds/target value Must be a project wide decision.
Tools Build tool
Review Review of the EMI Production Releases.
Quality factor Portability
Goals Improve portability
Risks The percentage of developer time allotted to moving from a
homogeneous infrastructure to a heterogeneous one is
fundamental to this task.
The flexibility of the build system, the programming language, POSIX
conformance and non-Linux-specific coding standards are
fundamental to this task.
The understanding of the term “supported” must be well defined and
understood to avoid confusion. Some server side APIs will never be
ported, for instance.
6.12. TOTAL BUG DENSITY METRIC
Metric id TOTALBUGDENSITY
Name Total bug density
Description In our context a bug is, in ITIL terminology, a problem that can be the
cause of one or more incidents (GGUS tickets [R14]). This metric states
the number of bugs per KLOC (it has to be clarified which type of LOC
measure we are going to adopt).
Unit Bugs per KLOC
Scope Per component, per product team.
Thresholds/target value No threshold can be stated in general for all the components. The trend
could provide useful information.
Tools KLOC per product can be measured with different tools, depending on
the programming language. Open bugs can be taken from the bug
tracker.
Review Review of the EMI software component releases.
Quality factor Correctness, Reliability and Flexibility
Goals Improving the intrinsic quality of the software
Risks LOC should not be used to compare different components.
A more meaningful metric should consider the bugs per KLOC of new
and changed code (bug02). Usage of different bug trackers.
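Whatever LOC convention is eventually adopted, the density itself is a simple ratio; the function name is illustrative:

```python
def bug_density(open_bugs, lines_of_code):
    """Open bugs per KLOC (thousand lines of code).

    The LOC counting convention (physical, logical, with or without
    comments) is still to be agreed project-wide; this sketch just
    normalizes to thousands of lines.
    """
    return open_bugs / (lines_of_code / 1000)

# e.g. 30 open bugs in a 120,000-line component
print(bug_density(30, 120_000))  # 0.25
```

As noted in the risks, the absolute value should not be compared across components; only the trend per component is meaningful.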
6.13. BUG DENSITY PER RELEASE METRIC
Metric id BUGDENSITYPERRELEASE
Name Bug density per release
Description In our context a bug is, in ITIL terminology, a problem that can be the
cause of one or more incidents (GGUS tickets). This metric states the
number of bugs found in a release per KLOC of new and changed code
shipped with the release (it has to be clarified which type of LOC
measure we are going to adopt).
Unit Bugs per KLOC of new and changed code
Scope Per component, per product team.
Thresholds/target value Taking the first value of this metric as a baseline, subsequent values
should decrease in order to show improvement of the SDP.
Tools Need investigation
Review Review of the EMI Production Releases
Quality factor Correctness, Reliability and Flexibility
Goals Improving the SDP
Risks Availability of a proper tool is the main risk of this metric. Bugs should
be linked to release and code.
6.14. TOTAL USER INCIDENTS METRIC
Metric id TOTALUSERINCIDENTS
Name Total user incidents per user month
Description This metric covers defects not only in the software but also in the
documentation, training and user support processes, per user month.
A user month means the number of users (in our case, deployed services?)
per month.
Unit GGUS tickets per user per month
Scope Per component, per product team.
Thresholds/target value It is difficult to state a threshold valid for all the product teams, in
general a decreasing trend would show positive results.
Tools GGUS plus BDII or operational dashboards.
Review Review the status of the Software Maintenance and Support Plan
Quality factor Usability
Goals Improving the SDP including documentation and user support.
Risks GGUS has to be updated in order to provide a unit for each product team
at the 3rd level of support.
The use of mailing lists such as LCG-ROLLOUT [R15] is harmful for
tracking incident-related metrics. All incident reports from users
should be submitted through GGUS.
Number of deployed services does not generally provide an estimate of
the number of users actually using a service. This metric should be
normalized with a measure of the usage of the software component.
In order to track incidents per component and per product team, a clear
assignment of 'type of problem' (in GGUS) to components and product
teams has to be made.
6.15. TRAINING AND SUPPORT INCIDENT METRIC
Metric id TRAININGSUPPORTINCIDENTS
Name Training and support incident per user month.
Description This metric covers defects in the training and user support processes, per
user month. A user month means the number of users (deployed services?)
per month. The training and support defects can be derived by
subtracting the tickets in status unsolved (tickets that generated a bug)
from the total number of opened tickets. It relies on proper bug opening
from GGUS tickets, especially regarding ambiguous or missing
documentation.
Unit Incident per user month.
Scope Per component, per product team.
Thresholds/target value Decreasing trend.
Tools GGUS plus BDII or operational dashboards.
Review Review the status of the Software Maintenance and Support Plan
Quality factor Usability
Goals Improving documentation, training and user support processes.
Risks GGUS has to be updated in order to provide a unit for each product team
at the 3rd level of support.
The use of mailing lists such as LCG-ROLLOUT is harmful for tracking
incident-related metrics. All incident reports from users should be
submitted through GGUS.
Number of deployed services does not generally provide an estimate of
the number of users actually using a service. This metric should be
normalized with a measure of the usage of the software component.
In order to track incidents per component and per product team, a clear
assignment of 'type of problem' (in GGUS) to components and product
teams has to be made.
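The subtraction described above can be sketched as follows; the function name, and the use of deployed services as a proxy for users, follow the (still tentative) definition in the description:

```python
def support_incidents_per_user_month(total_tickets, unsolved_tickets,
                                     deployed_services, months):
    """Training/support incidents per user month.

    Incidents not traced to a software bug (total opened tickets minus
    tickets in status unsolved, i.e. those that generated a bug),
    normalized by user months. 'Users' are approximated here by the
    number of deployed services, as the SQAP tentatively suggests.
    """
    user_months = deployed_services * months
    return (total_tickets - unsolved_tickets) / user_months

# e.g. 120 tickets opened, 20 traced to bugs, 50 services over 2 months
print(support_incidents_per_user_month(120, 20, 50, 2))  # 1.0
```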
6.16. AVERAGE TIME TO DEAL WITH AN INCIDENT
Metric id AVERAGETIMEFORUSERINCIDENTS
Name Average time to deal with an incident at the 3rd level of user support
Description This metric measures the effectiveness of a product team in providing
3rd level user support. The time is measured from the moment the
ticket reaches a PT's 3rd level support to the moment the ticket is
moved to the status solved or unsolved.
Unit Days
Scope Per component, per product team.
Thresholds/target value Need project wide agreement.
Tools GGUS.
Review Review the status of the Software Maintenance and Support Plan
Quality factor Flexibility and Availability
Goals Improving the effectiveness of 3rd level user support processes within
product teams.
Risks GGUS has to be updated in order to provide a unit for each product team
at the 3rd level of support.
The use of mailing lists such as LCG-ROLLOUT is harmful for tracking
incident-related metrics. All incident reports from users should be
submitted through GGUS.
7. STANDARD PRACTICES AND CONVENTIONS
The following guidelines are defined as part of the SQA activity to give support to other activities.
These guidelines should be used by SA1 and JRA1 in the different stages of the software life cycle
process. The guidelines are updated and maintained in the following twiki pages:
Configuration and Integration:
https://twiki.cern.ch/twiki/bin/view/EMI/EmiSa2ConfigurationIntegrationGuidelines
Packaging and Releasing:
https://twiki.cern.ch/twiki/bin/view/EMI/EmiSa2PackagingReleasingGuidelines
Change Management:
https://twiki.cern.ch/twiki/bin/view/EMI/EmiSa2ChangeManagementGuidelines
Metrics Generation:
https://twiki.cern.ch/twiki/bin/view/EMI/EmiSa2MetricsGenerationGuidelines
Certification and Testing:
https://twiki.cern.ch/twiki/bin/view/EMI/EmiSa2CertTestGuidelines
The following deadlines and responsible people have been defined to have a first version of the
guidelines available for PEB and PT review. Guidelines may be modified throughout the project
lifetime and will be kept up to date by the responsible people. Changes will be communicated to the
EMT mailing list.
Configuration and Integration – Lorenzo Dini – 31.08.2010
Packaging and Releasing – Lorenzo Dini – 29.10.2010
Change Management – Maria Alandes – 31.08.2010
Metrics Generation – Eamonn Kenny – 10.09.2010
Certification and Testing – Jozef Cernak – 29.10.2010
8. CONCLUSIONS
This Software Quality Assurance Plan is the foundation of the QA software activities in the EMI
project and is the baseline and reference for all the software and documentation reviews and reporting
tasks in the EMI collaboration. For more fine-grained technical details, five satellite documents
serving as specific software guidelines for SA1 and JRA1 are produced and maintained by SA2. The
preparation of these documents requires a careful analysis of the existing procedures in the
middleware collaborations within EMI and project-wide support to make sure they are effectively
adopted and enforced.
This plan was defined with input from experienced members of the project and will be periodically
discussed within EMI and tuned to better reach EMI objectives. Metrics and procedures will be
explained in more detail once all tools are in place and may also be modified as required. A yearly
update of this document is likely to be published to make sure every project member is properly
aware of changes and improvements.