Software Measurement and Process Improvement
Mehmet Tumer
Chicago Software Process Improvement Network
Introduction
Role of measurement in software process improvement
Examples of software development process metrics
Recommendations for establishing and maintaining a measurement program
Agenda
Overview of Publitec
Project Management Metrics
Defect Based Metrics
Development Process Metrics
Quality System Metric
Tools for Metrics System
Summary
Feedback and Experience Sharing
Publitec
Internal software house
Established in the Netherlands in 1987
Re-organization in 1991
Established the Quality Assurance Group
Process Improvement Approach
Involvement by everyone
Decisions based on facts
Prioritization: implement critical processes first
Define measurable processes
Major Milestones
Sep. ’91  Establish QA Group
Sep. ’92  Procedures implemented
Feb. ’93  ISO 9001 trial audit
Aug. ’93  ISO 9001 certification
Oct. ’96  SEI CMM training
Jan. ’97  CMM assessment by an independent organization
Objectives
Deliver at the required time within the estimated cost
» Project Management Metrics
Deliver the expected functionality
» Defect Metrics
» Development Process Metrics
Justify the cost of quality
» Quality System Metric
Product and Process Measurement Guidelines
Application: what the metric indicates
Primitives: the basic data and how to capture it
Implementation: how to calculate
Interpretation: meaning and targets (a sketch of one such definition follows below)
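A minimal sketch of how these four guideline dimensions could be recorded per metric; the MetricDefinition class and the sample entry are illustrative assumptions, not Publitec's actual tooling:

    from dataclasses import dataclass

    @dataclass
    class MetricDefinition:
        name: str
        application: str      # what the metric indicates
        primitives: str       # basic data and how to capture it
        implementation: str   # how to calculate
        interpretation: str   # meaning and targets

    # Hypothetical entry for a metric described later in this deck.
    plan_visibility = MetricDefinition(
        name="Project Plan Visibility",
        application="Overall visibility of project plans",
        primitives="Planned effort and planned earned value per period",
        implementation="sum(planned earned value) / sum(planned effort)",
        interpretation="Higher is better; track quarterly against a target",
    )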
Measurement
[Diagram: the metrics families — Project Management Metrics, Defect Metrics, Development Process Metrics, and Quality System Metrics.]
Project Management Metrics
Cost Performance Outlook
» Provides a tool for measuring cost and schedule against the project plan (see the earned-value sketch after this list).
» At the end of the project, provides historic data on the initial estimate, final estimate, and the actual cost and schedule.
Project Plan Visibility
» Indicates the overall visibility of project plans.
Software Development Effort Distribution
» Provides the distribution of effort for software development projects across the different phases of the development life cycle.
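The Cost Performance Outlook charts later in this deck plot baseline, planned, and actual cost against earned value. A minimal sketch of the standard earned-value arithmetic behind such a chart; the figures are invented, and the CPI/SPI indices are conventional derivations rather than something the slides show:

    # Earned-value snapshot for one reporting period (illustrative numbers).
    planned_value = 400_000   # budgeted cost of work scheduled to date
    earned_value = 360_000    # budgeted cost of work actually performed
    actual_cost = 420_000     # actual cost of work performed

    cost_variance = earned_value - actual_cost        # negative: over budget
    schedule_variance = earned_value - planned_value  # negative: behind schedule
    cpi = earned_value / actual_cost                  # cost performance index
    spi = earned_value / planned_value                # schedule performance index
    print(f"CV={cost_variance:,}  SV={schedule_variance:,}  CPI={cpi:.2f}  SPI={spi:.2f}")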
Growth of a Software Problem (or the tale of the boiling frog)
[Chart: "Appearances" vs. "Reality" of cost, schedule, and quality problems across the Requirement Definition, Design, Coding, and Testing phases.]
Visibility

Period  Planned Effort  Planned Earned Value
0       5               0
1       10              5
2       25              10
3       40              25
4       60              40
5       85              80
6       120             115
7       150             125
8       200             160
9       220             210
10      230             220
11      235             230
12      240             240
SUM     1620            1460

Visibility = 1460 / 1620 = 90%

[Chart: Project Visibility — planned effort vs. planned earned value per period.]
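A minimal sketch of the visibility calculation, using the per-period planned effort and planned earned value from the table above:

    # Plan visibility: share of planned effort covered by planned earned value.
    planned_effort = [5, 10, 25, 40, 60, 85, 120, 150, 200, 220, 230, 235, 240]
    planned_earned_value = [0, 5, 10, 25, 40, 80, 115, 125, 160, 210, 220, 230, 240]

    visibility = sum(planned_earned_value) / sum(planned_effort)
    print(f"Plan visibility: {visibility:.0%}")  # 1460 / 1620 = 90%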
Plan Visibility
[Chart: plan visibility per quarter, 4Q91 through 1Q98; y-axis from 70% to 98%.]
Cost Performance Outlook
[Chart: baseline, planned, and actual cost plus actual, planned, and baseline earned value, 1996 through Dec 97; y-axis 0 to 14,000,000.]
SD14 Cost Performance Outlook
[Chart: the same cost and earned-value series for SD14, 1997 through Dec 98; y-axis 0 to 900,000.]
PD07 Cost Performance Outlook
[Chart: the same cost and earned-value series for PD07, 1996 through Dec 97; y-axis 0 to 400,000.]
On Time Delivery
[Chart: on-time delivery per year, ’92 through ’98; y-axis 85% to 115%.]
Development Effort Distribution
[Chart: effort distribution per quarter (3Q95 through 1Q98) across Analysis, Design, Coding, Test & Review, Rework / Bug fix, Config. Control & Integration, and Others; y-axis 0% to 60%.]
Defect Based Metrics
Defect Days Number
» The effectiveness of the software development process depends upon the timely detection and removal of defects across the entire life cycle.
» The earlier in the development cycle a defect is identified, the cheaper it is to remove.
» This metric represents the number of days that defects spend in the software system from their introduction to their detection and removal (a sketch follows below).
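A minimal sketch of the defect-days calculation, assuming each defect record carries introduction, detection, and removal dates; the sample records are invented:

    from datetime import date

    # (introduced, detected, removed) per defect -- illustrative records.
    defects = [
        (date(1999, 1, 4), date(1999, 1, 18), date(1999, 1, 25)),
        (date(1999, 1, 11), date(1999, 3, 1), date(1999, 3, 15)),
    ]

    # Days from introduction to detection, and to removal, summed over defects.
    detection_days = sum((found - intro).days for intro, found, _ in defects)
    removal_days = sum((gone - intro).days for intro, _, gone in defects)
    print(f"Defect days (detected): {detection_days}")
    print(f"Defect days (removed):  {removal_days}")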
Defect Days Number (based on Detected Date)
[Chart: weekly values over weeks 1 through 51; y-axis 0 to 1400.]

Defect Days Number (based on Removed Date)
[Chart: weekly values over weeks 1 through 51; y-axis 0 to 400.]
Defect Based Metrics
Defect Distribution by Cause Category
This metric provides information on the causes of defects.
Defect Distribution by Discovery Mechanism
This metric indicates where in the development process the majority of defects are identified. It also shows the effectiveness of the review and testing processes in finding defects before they are found by customers (see the tally sketch below).
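A minimal sketch of tallying this distribution from per-defect records; the category names follow the Discovery Mechanism slide below, and the sample data is invented:

    from collections import Counter

    # Discovery mechanism recorded for each defect (illustrative sample).
    discovered_by = ["Review", "Test", "Test", "Review", "User Complaint",
                     "Proceeding Phases", "Test", "Review", "Test", "Review"]

    counts = Counter(discovered_by)
    total = sum(counts.values())
    for mechanism, n in counts.most_common():
        print(f"{mechanism}: {n / total:.0%}")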
Defect Distribution by Cause Category
[Chart: defect counts by cause category; y-axis 0 to 1400.]
Discovery Mechanism
Review: 39%
Test: 46%
Proceeding Phases: 8%
User Complaint: 7%
[Pie chart of defect counts by discovery mechanism.]
Development Process Metrics
These metrics are used to judge how effective the development process is in identifying and avoiding defects. They are plotted against the timeline to indicate improvement in the process (a sketch of the two threshold metrics follows below).
Analysis (Design) Review Efficiency
Indicates the percentage of analysis (design) defects found at the analysis (design) review, immediately after the analysis (design) activity.
Long Stay Defects
Percentage of defects that took more than 100 days to remove after their introduction.
Late Detect Defects
Percentage of defects that took more than 30 days to detect after their introduction.
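A minimal sketch of the two threshold metrics, assuming defect records with introduction, detection, and removal dates (100-day and 30-day thresholds per the definitions above; the records are invented):

    from datetime import date

    defects = [
        {"introduced": date(1998, 1, 5), "detected": date(1998, 1, 20),
         "removed": date(1998, 2, 1)},
        {"introduced": date(1998, 2, 2), "detected": date(1998, 4, 1),
         "removed": date(1998, 6, 15)},
    ]

    long_stay = sum((d["removed"] - d["introduced"]).days > 100 for d in defects)
    late_detect = sum((d["detected"] - d["introduced"]).days > 30 for d in defects)
    print(f"Long stay defects (>100 days):  {long_stay / len(defects):.0%}")
    print(f"Late detect defects (>30 days): {late_detect / len(defects):.0%}")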
Development Efficiency
[Chart: Long Stay Defects (>100 days), Late Detect Defects (>30 days), Analysis Review Efficiency, and Design Review Efficiency per quarter, 4Q91 through 4Q97; y-axis 0% to 100%.]
Quality System Metric
QUALITY LOSS
» Cost of Rework
» Cost of Scrap
QUALITY INVESTMENT
» Cost of Quality System
» Cost of Project QA Activities
QUALITY COST = QUALITY LOSS + QUALITY INVESTMENT
This metric is used to measure the effectiveness of the quality system (a sketch follows below).
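A minimal sketch of the formula, expressed as a share of development cost as in the Cost of Quality chart below; all figures are invented:

    # Quality cost = quality loss + quality investment, vs. development cost.
    cost_of_rework = 120_000
    cost_of_scrap = 30_000
    cost_of_quality_system = 60_000
    cost_of_project_qa = 90_000
    development_cost = 2_500_000

    quality_loss = cost_of_rework + cost_of_scrap
    quality_investment = cost_of_quality_system + cost_of_project_qa
    quality_cost = quality_loss + quality_investment
    print(f"Quality cost: {quality_cost / development_cost:.1%} of development cost")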
Cost of Quality
[Chart: Quality Loss, Quality Investment, and Total Quality Cost as a percentage of development cost per quarter, 4Q91 through 4Q97; y-axis 0% to 24%.]
Tools for Metrics System
Scheduling Tool
Tracking Tool (Time Registration Tool)
Defect Management Tool
Reporting Tool
Summary
Improvement is not possible without measurement
Design measurable processes
Involve all stakeholders in the description of the processes and metrics
Measure with a focus
Set targets and track achievement against the targets
Don’t ignore results; provide continuous feedback
Never use metrics to evaluate groups or individuals
Capture as much data as possible; report only those metrics that make sense
Audit the data collection process
Validate the conclusions with different metrics
Beware of aging metrics