[IEEE Comput. Soc First Asia-Pacific Conference on Quality Software - Hong Kong, China (30-31 Oct. 2000)] Proceedings First Asia-Pacific Conference on Quality Software

SOFTWARE PROJECT MEASUREMENT CRITERIA

I. M. Hampton B.Eng., C.Eng., MIEE (ian@mtrcom.com)
B. W. T. Quinn MIRSE (brianq@mtrcom.com)

1. INTRODUCTION

A sample of eight Mass Transit Railway Corporation (MTRC) management and engineering staff agreed that projects containing software products have proved difficult to deliver to programme and within budget. The problem is further exacerbated when a particular Supplier does not subscribe to a “mature” Software Development Process (SDP), thereby failing either to sustain consistency in application or to provide the client with visibility of progress throughout the development and manufacturing process.

The introduction of standard SDPs has recently become an integral part of many contract specifications, giving the client a defined measure of visibility of the SDP. Because software is nebulous in composition, rather than a physical commodity that can be visibly “measured”, the quality of the software product is deemed analogous to the quality of the processes adopted during production. The SDP can therefore be adapted to form a framework for establishing a robust set of measurement criteria.

This paper describes an approach adopted by MTRC in measuring the progress of a safety related, software intensive development project. It should be noted, however, that the exercise described in this paper is not confined to software development, and the methodology can be adopted to measure any phase, element or activity of any project.

2. PURPOSE OF MEASUREMENT CRITERIA

The derivation of a suite of project measurement indicators establishes a method to quantify, validate and measure parameters of a project, with the purpose of deriving an objective account of the work undertaken and hence the value of work completed. In addition, the indicators provide a process tool to assist in monitoring the progress of the work, and the data can be extrapolated to determine the rate of progress and the subsequent actions required to attain targets. The measurement criteria are not, however, intended to replace the project planning process. They are designed to complement the “planning function” by providing an alternative perspective of contract progress, thus mitigating the subjectivity of programme “inputs” with respect to the reporting of actual progress. The purpose of measurement criteria as applied to MTRC’s project can be summarised as follows:

0-7695-0825-1/00 $10.00 © 2000 IEEE

- re-enforce project planning and promote an objective account of the project progress;
- manage the supplier by conveying a representation of “actual” progress and resultant effects;
- promote a culture of anticipation as opposed to reaction for both the client and supplier;
- provide a building block for the client to aid future project valuation and estimates of similar software based products;
- act as a motivator for both client and supplier personnel by visibly depicting “progress” on a daily, weekly or monthly basis, as deemed appropriate;
- provide a medium to model project scenarios.

Figure 2 - Software Development Process (SDP)


Measurement Criteria:
- Planning & Design (measurement of the planning and design process)
- Validation Test Reports (measurement of the software manufacturing process and site test & ...)
- Site Installation

Strategic Completion Stages (SCS):
- Preliminary Design Review (PDR)
- Final Design Review (FDR)
- Factory Acceptance Test (FAT)
- System Acceptance Test (SAT)

Software Development Process (SDP) Phases:
1. Requirements Definition & Analysis
2. Overall System Design
3. Software Architectural Design
4. Detailed Software Design
5. Software Module Coding
6. Software Integration
7. System Integration
8. System Acceptance
9. Operation & Maintenance

Key Performance Indicators (KPIs):
1. Planning (foundation)
2. Preliminary Design
3. Systems Assurance
4. Detailed Design (PRC Software)
5. Detailed Design (ECS Software)
6. Detailed Design (ECS Logic)
7. PRC Software Test Reports
8. ECS Software Test Reports
9. PRC Site Test Reports
10. ECS Site Test Reports
11. Cable Installation
12. Cable Termination & Testing
13. ITP Installation
14. PRC Equipment to Site
15. PRC Installation
16. PRC Commissioning
17. PRC Decommissioning
18. ECS Equipment to Site
19. ECS Installation
20. ECS Cubicle Installation
21. ECS Cut-Over Panel Installation
22. ECS Commissioning
23. ECS Decommissioning

Table 4.1 - Decomposition of Measurement Criteria

3. SOFTWARE DEVELOPMENT PROCESS

A defined SDP provides all of the salient factors needed to develop a set of measurement criteria. The SDP adopted by the MTRC Contractor in question closely matches that described in the provisional European standard prEN50128 (“Railway Applications: Software for Railway Control & Protection Systems”) for safety related software systems, and comprises nine key development phases (refer Figure 2). Each development phase requires the software under development to be subjected to a number of tools, techniques and measures, for example Design Reviews, Traceability Analysis and Testing. This application is aimed at ensuring a defined level of integrity with respect to a system that has been assigned a Safety Integrity Level 2 (SIL 2) classification. The specification of SIL 2 implies the application of designated techniques that are described in the contract Software Quality Plan, Verification Plan and Validation Plan. In addition to the SDP, other support activities, for example Systems Assurance and Hardware Design, are required to be factored into the measurement criteria. These activities are also commensurate with the SDP and are described within separate project documents, for example the System Safety Plan and Hardware Design Specifications.


Planning Documentation:
- Configuration Management Plan
- Data Preparation Plan
- Design Plan
- Display Preparation Plan
- ECS Logic Quality Plan
- Maintainability Plan
- Electromagnetic Compatibility Plan
- Quality Audit Programme
- Quality Plan
- Reliability Plan
- Software Quality Plan
- System Safety Plan
- Training Plan
- System Inspection & Test Plan
- Validation Plan
- Verification Plan
- Manpower Plan
- Master Programme

Table 4.2 - Contractor’s Planning Documentation

4. ESTABLISHING MEASUREMENT CRITERIA

Prior to determining the measurement criteria, it is fundamental to develop a structured format to guide the process of establishing the basic building blocks of the measurement data. To facilitate this process, four Strategic Completion Stages (SCSs) were identified, comprising a Preliminary Design Review, a Final Design Review, a Factory Acceptance Test and a System Acceptance Test. The SCSs were subsequently categorised into Preliminary Design, Final Design, Manufacture and Site Installation (refer Figure 2). Each SCS incorporated two or more software development phases. For example, completion of the SCS Preliminary Design Review marks completion of the Preliminary Design phase, which incorporates two SDP phases, namely the Requirements Definition & Analysis phase and the Overall System Design phase. (Refer to Table 4.1 for the relationship between the remaining SCSs and the SDP phases.) The SCSs were superimposed on the SDP (refer Figure 2) to map their integration with the SDP lifecycle. This facilitated construction of a format for detailing the components of the measurement criteria. Three primary components of measurement criteria were subsequently established, comprising Planning & Design, Validation Test Reports and Site Installation (refer Table 4.1).

The measurement criteria were then transposed into a series of Key Performance Indicators (KPIs), which in essence are a composition of a number of KPI Elements. A KPI Element is an individual measurable item that is either a document or a commodity. At contract commencement, pertinent information can be extracted from the contract specification to develop the Foundation KPI. In this instance, the information extracted from the contract specification forms the basis for establishing the “Planning” KPI, which includes the 18 documents depicted in Table 4.2.

The contents of the planning documentation in Table 4.2 describe all of the works planned through the project life cycle. Therefore, only when the planning documentation is published can the foundations of the remaining KPIs be developed. For example, the Preliminary Design KPI is derived from a combination of the Software Quality Plan, ECS Logic Quality Plan, Validation Plan and Verification Plan (which are components of the “Planning” or Foundation KPI). MTRC’s Project Team subsequently established 23 KPIs as described in Table 4.1. However, it should be noted that it may not be possible to formulate KPIs for the project duration as a one-off exercise, because the total quantity required may not be known until commencement of the Detailed Software Design (e.g. PRC Software Test Reports). Therefore, as a general principle, project KPIs should be developed as information or critical paths emerge, and discarded as contract vagaries dictate.
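The decomposition above (KPIs composed of measurable KPI Elements, each a document or a commodity) can be sketched as a minimal data model. The class names and the three sample documents are illustrative, not an implementation used by the project:

```python
from dataclasses import dataclass, field

@dataclass
class KPIElement:
    """An individual measurable item: either a document or a commodity."""
    name: str
    complete: bool = False

@dataclass
class KPI:
    """A Key Performance Indicator: a composition of KPI Elements."""
    name: str
    elements: list = field(default_factory=list)

    def percent_complete(self) -> float:
        """Share of this KPI's elements marked complete, as a percentage."""
        if not self.elements:
            return 0.0
        done = sum(1 for e in self.elements if e.complete)
        return 100.0 * done / len(self.elements)

# The Foundation ("Planning") KPI is built from planning documents
# extracted from the contract specification (three of the 18 shown here).
planning = KPI("Planning (foundation)", [
    KPIElement("Software Quality Plan"),
    KPIElement("Validation Plan"),
    KPIElement("Verification Plan"),
])
planning.elements[0].complete = True
print(round(planning.percent_complete(), 1))  # → 33.3
```

Because each KPI is just a collection of countable elements, new KPIs can be added or discarded as information emerges, exactly as the general principle above recommends.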

5. USING MEASUREMENT CRITERIA

Following establishment of the KPIs, the KPI Element metrics can be presented statistically or graphically in a number of guises using standard software application packages (refer Figure 5.1 for an example).

Figure 5.2 - KPI Depicting “Approved In Principle”
Figure 5.3 - KPI Depicting “Approved”
Figure 5.4 - KPI Extrapolation

Figure 5.2 shows the Contractor’s “Planning” KPI based on achieving “Approved In Principle” status, while Figure 5.3 depicts the Contractor’s “Planning” KPI based on achieving “Approved” status. Both represent a measure of performance of the Contractor’s submitted planning documentation, but with a different degree of measurement, which facilitates a greater depth of understanding of actual progress as opposed to perceived progress. Figure 5.1 presents a graphical and tabulated description of the Planning & Design Measurement Criteria (shading represents completed work). The complete Planning & Design Measurement Criteria embracing all the KPI Elements correlates directly to the shaded regions depicted in Table 4.1. Evaluation of the KPIs can be used to prompt typical questions:

- Is the rate of progress commensurate with meeting the target completion dates?
- Are there periods of no progress?
- What-if scenarios?

Care must be taken to pitch the KPIs at an optimum level of performance; the onus is on keeping the project goal in mind as opposed to policing the Contractor. The following are two case examples extracted from the project in question.
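The questions above can be answered mechanically from a time series of cumulative KPI Element completions. A minimal sketch, with hypothetical monthly data:

```python
# Cumulative KPI Elements complete, sampled monthly (hypothetical data).
completed = [2, 5, 5, 5, 9, 12]
target_total = 20      # total KPI Elements in this KPI
months_remaining = 4   # months until the target completion date

# "Are there periods of no progress?" -- consecutive samples with no
# new completions.
stalled = [i for i in range(1, len(completed))
           if completed[i] == completed[i - 1]]

# "Is the rate of progress commensurate with the target?" -- compare the
# rate now required against the rate actually achieved recently.
required_rate = (target_total - completed[-1]) / months_remaining
actual_rate = (completed[-1] - completed[-4]) / 3  # last three intervals

print(stalled)                # → [2, 3]: two months saw zero progress
print(required_rate)          # → 2.0 elements/month needed
print(round(actual_rate, 2))  # → 2.33 elements/month achieved recently
```

Checks of this kind support the culture of anticipation described earlier: a stall or a shortfall in rate is flagged from the data before it surfaces as a missed milestone.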

5.1 Project Case Example 1

With reference to Figure 5.2, the Contractor’s progress with respect to the delivery of Planning documentation during the early months of the contract was deemed satisfactory. In November 1997 the rate of documentation return began to falter, and by January 1998 there was clear evidence that the Contractor was unlikely to achieve the original target completion date. The situation deteriorated until May 1998, when the Contractor revised their development strategy and implemented organisational changes to meet a revised target completion date.

5.2 Project Case Example 2

Figure 5.4 illustrates how depicting information in graphical format can prompt a user to extrapolate data and implement pro-active actions. In September 1998, the Contractor was presented with a required rate of production of one Preliminary Design document per week over 13 weeks to achieve the target completion date in late December 1998. In addition, the extrapolated trend was presented to the Contractor, forecasting an anticipated completion date in late February 1999 should the current situation continue unabated. The situation was compounded by the fact that Preliminary Design is a precursor to commencing the Detailed Design activities. Unfortunately, the Contractor in question was unable to recover the situation and attained a completion date of March 1999 (refer Figure 5.1). However, this serves to illustrate the accuracy of extrapolation in forecasting the completion of work.
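The extrapolation used in this example is a straight-line projection of the production rate. The sketch below works in the spirit of the case; the document counts and the observed rate are illustrative, not figures taken from the paper:

```python
from datetime import date, timedelta

def forecast_completion(as_of, done, total, docs_per_week):
    """Project a completion date assuming the current rate continues unabated."""
    weeks_needed = (total - done) / docs_per_week
    return as_of + timedelta(weeks=weeks_needed)

# Illustrative snapshot: 12 of 25 Preliminary Design documents done.
as_of, done, total = date(1998, 9, 7), 12, 25

# Required rate: one document per week over the remaining 13 weeks.
on_target = forecast_completion(as_of, done, total, 1.0)

# Extrapolated forecast at the observed (slower) production rate.
forecast = forecast_completion(as_of, done, total, 0.55)

print(on_target)  # → 1998-12-07: on the required rate, December 1998
print(forecast)   # → 1999-02-19: at the observed rate, February 1999
```

Presenting both dates side by side, as the project team did, turns an abstract rate shortfall into a concrete slip of several weeks.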


SDP Phases: Requirements Definition & Analysis; Overall System Design; Software Architectural Design; Detailed Software Design | SCS: Preliminary Design Review, Final Design Review | Measurement Criteria: Planning & Design | KPIs: 6 | KPI Elements: 319 | Weighting: x 2

SDP Phases: Software Module Coding; Software Integration; System Integration | SCS: Factory Acceptance Tests (Manufacture) | Measurement Criteria: Validation Test Reports | KPIs: 4 | KPI Elements: 323 | Weighting: x 3

SDP Phases: System Acceptance; Operation & Maintenance | SCS: System Acceptance Tests (Site Installation) | Measurement Criteria: Site Installation | KPIs: 13 | KPI Elements: 1264 | Weighting: x 1

Table 6 - SDP, SCS, Measurement Criteria, KPI, KPI Element and Weighting Relationship

6. PERFORMANCE MEASUREMENT APPROACH

The adoption of measurement criteria as used in the MTRC project may ultimately provide a more informed performance measure of the Actual Volume of Work Performed (AVWP) versus the Total Volume of Work Required (TVWR). To determine what actually constitutes AVWP and TVWR, the relationship between the KPI Elements and the Contract Sum needs to be understood. To explain this relationship, the contract sum (less mobilisation costs), which represents a finite amount of money for the TVWR, would be divided by the total number of project KPI Elements. Therefore, TVWR is synonymous with the total number of project KPI Elements, and AVWP is synonymous with the total number of completed KPI Elements.
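Under this scheme the value of work performed falls out of a simple ratio. A sketch with a hypothetical contract sum (the element counts are those of Table 6):

```python
# Contract sum less mobilisation costs: a finite amount of money covering
# the TVWR. The sum here is hypothetical; element counts follow Table 6.
contract_sum = 10_000_000
tvwr = 319 + 323 + 1264   # Total Volume of Work Required: all KPI Elements
avwp = 186                # Actual Volume of Work Performed: completed elements

value_per_element = contract_sum / tvwr
value_of_work_performed = avwp * value_per_element

print(round(value_per_element, 2))    # money attributed to each element
print(round(value_of_work_performed)) # → 975866
```

The design choice is that every KPI Element carries equal value; the weighting discussed next relaxes exactly that assumption.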

The Measurement Criteria can also be “weighted” to promote a finer degree of accuracy that will further refine the KPI approach to performance measurement. The relationship between the SDP, SCS, Measurement Criteria, KPI, KPI Element and Weighting is shown in Table 6, which also depicts the actual values of the parameters developed and used by the MTRC project team. The examples shown in a) and b) below depict a calculation of the Value of Work Completed (VWC). Example a) depicts KPI Elements complete; example b) depicts KPI Elements complete with “weighting” applied. The calculations utilise the KPI Elements completed as shown in Figure 5.1; the total number of KPI Elements and the Weighting are depicted in Table 6.


a) Measurement Criteria (without “weighting” applied)

Value of Work Complete = KPI Elements (Complete) / Σ(Measurement Criteria) x 100%
                       = 186 / (319 + 323 + 1264) x 100%
                       ≈ 10%

b) Measurement Criteria (with “weighting” applied)

Value of Work Complete = Σ(KPI Elements (Complete) x Weighting) / Σ(Measurement Criteria x Weighting) x 100%
                       = (186 x 2) / ((319 x 2) + (323 x 3) + (1264 x 1)) x 100%
                       ≈ 13%
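Both calculations can be reproduced directly from the Table 6 figures (the 186 completed Planning & Design elements come from Figure 5.1):

```python
# (complete, total, weighting) per Measurement Criterion, from Table 6;
# only Planning & Design elements are complete at this point in the project.
criteria = {
    "Planning & Design":       (186,  319, 2),
    "Validation Test Reports": (0,    323, 3),
    "Site Installation":       (0,   1264, 1),
}

# a) without weighting: simple ratio of completed to total elements.
unweighted = 100 * sum(c for c, _, _ in criteria.values()) \
                 / sum(t for _, t, _ in criteria.values())

# b) with weighting applied to both numerator and denominator.
weighted = 100 * sum(c * w for c, _, w in criteria.values()) \
               / sum(t * w for _, t, w in criteria.values())

print(round(unweighted))  # → 10
print(round(weighted))    # → 13
```

Running the two formulas over the same data makes the sensitivity to the weighting values immediately visible.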


The calculations above demonstrate that the application of “weighting” clearly influences the derived VWC. Therefore, care should be taken in assigning the “weighting” value to ensure that the calculation does not adversely manipulate the value of the VWC. An opportunity presents itself to further develop bespoke “weighting” criteria to reflect a measure of assigned value to the client, as opposed to feeding a payment structure founded on a Schedule of Prices (SOP) formulated before the Contract even begins. The concern is that the SOP is generally formulated by the supplier based on a scope of work detailed in the tender, when the software content is not yet developed and hence not fully understood.

The VWC prior to the commencement of the software validation works (refer to the Software Module Coding phase in Figure 2) constitutes the production of a number of Planning & Design documents, represented by the KPI Elements depicted in Figure 5.1. The Project Team concludes that the “real” VWC will emerge when the software validation exercise commences, allied to the resultant emergence of a functional software product. This has been reflected in the calculation above by assigning the greater number of Validation KPI Elements (323) a “weighting” of 3, compared to the Planning & Design KPI Elements (319) weighted at 2. However, this period in the Contract has not been assigned a value in the SOP with respect to the Contract in question, coupled with the fact that it is not a contract requirement to furnish the client with validation test report information. The MTRC project team concludes that the VWC should endeavour to bear a closer relationship to the contract SOP, to ensure payments are biased towards the delivery of commodities deemed “value added” by the client.

7. CONCLUSION

The MTRC project team are currently developing a measurement template for inclusion in future contract documents. In addition, the team is evaluating whether a Contractor should be required, as a future Contract condition, to sustain KPIs in electronic format in an attempt to create “real-time” information, allied to reporting by exception. Indeed, the MTRC project team believe it is questionable whether the Master Programme concept alone can continue to sustain the rigours of commercial scrutiny and accountability for complex software dependent projects, because of the subjectivity, variable number and untimeliness of programme inputs.

The MTRC project team considers the adoption of Key Performance Indicators (KPIs) in providing measurement criteria an essential component in monitoring the intrinsic health of any project. Furthermore, KPIs should not be constrained to “software” but should be adopted for any feature of a project. Future developments of the measurement criteria process allied to the MTRC project will include a process for “weighting” individual components of the KPIs to give more recognition to the amount of labour utilised to undertake a KPI Element and strategically related work components.

The Contractor’s “buy-in” during the establishment and formulation of the KPIs is of paramount importance and promotes ownership by both the client and supplier. This is important with respect to the timely supply of information, the shared publication of the results, and the resultant recommended changes in implementation strategy to counter under-performance. It is recommended that future contract SOPs are vetted by the client to ensure that the software development phase of a project is duly rewarded for the Validation and Verification outputs which underwrite the release of a software product.

Finally, the most important lesson learnt from the MTRC project measurement criteria exercise, and the resultant initiative in stimulating MTRC’s Contractor in the delivery of a software intensive control system, is:

“What Gets Measured, Gets Done”.

“When you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind. It may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced to the stage of science” - Lord (William Thomson) Kelvin (1824-1907).


Figure 5.1 - Planning & Design Measurement Criteria