
Page 1: Software Measurement and Process Improvement

Mehmet Tumer
Chicago Software Process Improvement Network
C-SPIN, 3/11/99

Page 2: Introduction

Role of measurement in software process improvement

Examples of software development process metrics

Recommendations for establishing and maintaining a measurement program

Page 3: Agenda

Overview of Publitec
Project Management Metrics
Defect Based Metrics
Development Process Metrics
Quality System Metric
Tools for Metrics System
Summary
Feedback and Experience Sharing

Page 4: Publitec

Internal software house
Established in the Netherlands in 1987
Re-organization in 1991; established the Quality Assurance Group

Page 5: Process Improvement Approach

Involvement by everyone
Decisions based on facts
Prioritization: implement the critical processes first
Define measurable processes

Page 6: Major Milestones

Sep. '91  Establish QA Group
Sep. '92  Procedures implemented
Feb. '93  ISO 9001 trial audit
Aug. '93  ISO 9001 certification
Oct. '96  SEI CMM training
Jan. '97  CMM assessment by an independent organization

Page 7: Objectives

Deliver at the required time, within the estimated cost » Project Management Metrics
Deliver the expected functionality » Defect Metrics, Development Process Metrics
Justify the cost of quality » Quality System Metric

Page 8: Product and Process Measurement Guidelines

Application: what the metric indicates
Primitives: what the basic data is and how to capture it
Implementation: how to calculate the metric
Interpretation: meaning and targets
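One way to make these guidelines concrete is to record all four facets alongside each metric definition. A minimal sketch in Python; the class and field names are illustrative, not part of the original material:

```python
from dataclasses import dataclass

@dataclass
class MetricDefinition:
    """The four facets each metric should document, per these guidelines."""
    name: str
    application: str     # what the metric indicates
    primitives: str      # the basic data and how it is captured
    implementation: str  # how the metric is calculated
    interpretation: str  # meaning and targets

# Illustrative entry for the Project Plan Visibility metric (see Page 12).
plan_visibility = MetricDefinition(
    name="Project Plan Visibility",
    application="Indicates the overall visibility of project plans",
    primitives="Planned effort and planned earned value per period",
    implementation="Sum of planned earned value / sum of planned effort",
    interpretation="Higher is better; track the trend per quarter",
)
print(plan_visibility.name)
```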

Page 9: Measurement

[Diagram: metrics grouped into four categories: Quality System Metrics, Project Management Metrics, Defect Metrics, and Development Process Metrics]

Page 10: Project Management Metrics

Cost Performance Outlook
» Provides a tool for tracking cost and schedule against the project plan.
» At the end of the project, provides historical data on the initial estimate, the final estimate, and the actual cost and schedule.

Project Plan Visibility
» Indicates the overall visibility of project plans.

Software Development Effort Distribution
» Provides the distribution of effort for software development projects across the different phases of the development life cycle.

Page 11: Growth of a Software Problem (or the tale of the boiling frog)

[Chart: cost, schedule, and quality problems plotted over time across the Requirement Definition, Design, Coding, and Testing phases; two curves, REALITY and APPEARANCES]

Page 12: Project Visibility

Period  Planned Effort  Planned Earned Value
  0            5                  0
  1           10                  5
  2           25                 10
  3           40                 25
  4           60                 40
  5           85                 80
  6          120                115
  7          150                125
  8          200                160
  9          220                210
 10          230                220
 11          235                230
 12          240                240
SUM         1620               1460

Visibility = 1460 / 1620 = 90%

[Chart: Planned Effort and Planned Earned Value per period, 0 through 12]
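The visibility figure on this page is just the ratio of the two column totals. A minimal sketch of the calculation, using the numbers from the table above:

```python
# Plan visibility: the share of planned effort that is backed by
# planned earned value (i.e., by concrete, verifiable milestones).
planned_effort       = [5, 10, 25, 40, 60, 85, 120, 150, 200, 220, 230, 235, 240]
planned_earned_value = [0, 5, 10, 25, 40, 80, 115, 125, 160, 210, 220, 230, 240]

visibility = sum(planned_earned_value) / sum(planned_effort)
print(f"Visibility = {sum(planned_earned_value)}/{sum(planned_effort)} = {visibility:.0%}")
# Prints: Visibility = 1460/1620 = 90%
```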

Page 13: Plan Visibility

[Chart: plan visibility per quarter, 4Q91 through 1Q98; vertical axis from 70% to 98%]

Page 14: Cost Performance Outlook

[Chart: cost performance outlook from 1996 through Dec. '97; vertical axis 0 to 14,000,000; series: Baseline Planned, Actual Cost, Actual Earned Value, Planned Earned Value, Baseline Earned Value]

Page 15: SD14 Cost Performance Outlook

[Chart: SD14 cost performance outlook from 1997 through Dec. '98; vertical axis 0 to 900,000; same five series]

Page 16: PD07 Cost Performance Outlook

[Chart: PD07 cost performance outlook from 1996 through Dec. '97; vertical axis 0 to 400,000; same five series]

Page 17: On Time Delivery

[Chart: on-time delivery by year, '92 through '98; vertical axis from 85% to 115%]

Page 18: Development Effort Distribution

[Chart: percentage of effort spent on Analysis, Design, Coding, Test & Review, Rework/Bug Fix, Config. Control & Integration, and Others, per quarter from 3Q95 to 1Q98; vertical axis 0% to 60%]

Page 19: Defect Based Metrics

Defect Days Number
» The effectiveness of the software development process depends on the timely detection and removal of defects across the entire life cycle.
» The earlier in the development cycle a defect is identified, the cheaper it is to remove.
» This metric represents the number of days that defects spend in the software system, from their introduction to their detection and removal.
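A minimal sketch of the calculation, assuming each defect record carries an introduction date and a detection date; the record layout is illustrative, not the original tooling's schema:

```python
from datetime import date

# Illustrative defect records: (introduced, detected). A removal-based
# variant would use the removal date in place of the detection date.
defects = [
    (date(1998, 1, 5),  date(1998, 2, 20)),
    (date(1998, 1, 12), date(1998, 1, 19)),
    (date(1998, 3, 1),  date(1998, 6, 15)),
]

# Defect Days Number: days each defect spent in the system from
# introduction to detection, summed over all defects.
defect_days = sum((detected - introduced).days
                  for introduced, detected in defects)
print(f"Defect Days Number: {defect_days}")
```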

Page 20: Defect Days Number

[Charts: Defect Days Number per week over one year, one panel based on detected date (vertical axis 0 to 1400) and one based on removed date (vertical axis 0 to 400)]

Page 21: Defect Based Metrics

Defect Distribution by Cause Category
This metric provides information on the causes of defects.

Defect Distribution by Discovery Mechanism
This metric indicates where in the development process the majority of defects are identified. It also shows how effective the review and testing processes are at finding defects before customers do.
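A minimal sketch of both distributions, assuming each defect record carries a cause category and a discovery mechanism; the sample data is made up:

```python
from collections import Counter

# Illustrative defect log: (cause category, discovery mechanism).
defects = [
    ("requirements", "Review"),
    ("design",       "Test"),
    ("coding",       "Test"),
    ("coding",       "User Complaint"),
    ("design",       "Review"),
]

total = len(defects)
by_cause     = Counter(cause for cause, _ in defects)
by_discovery = Counter(mechanism for _, mechanism in defects)

for cause, count in by_cause.most_common():
    print(f"{cause}: {count / total:.0%}")
for mechanism, count in by_discovery.most_common():
    print(f"{mechanism}: {count / total:.0%}")
```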

Page 22: Defect Distribution by Cause Category

[Chart: number of defects per cause category; vertical axis 0 to 1400]

Page 23: Discovery Mechanism

[Pie chart: defects by discovery mechanism: Review 39%, Test 46%, Proceeding Phases 8%, User Complaint 7%]

Page 24: Development Process Metrics

These metrics are used to judge how effective the development process is at identifying and avoiding defects. They are plotted against the timeline to show improvement in the process.

Analysis (Design) Review Efficiency
Indicates the percentage of analysis (design) defects found at the analysis (design) review, right after the analysis (design) activity.

Long Stay Defects
Percentage of defects that took more than 100 days to remove after their introduction.

Late Detect Defects
Percentage of defects that took more than 30 days to detect after their introduction.
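A minimal sketch of the Long Stay and Late Detect calculations, assuming each defect record carries introduction, detection, and removal dates; the record layout is illustrative, while the 100-day and 30-day cutoffs come from the definitions above:

```python
from datetime import date

# Illustrative defect records.
defects = [
    {"introduced": date(1997, 1, 10), "detected": date(1997, 1, 25), "removed": date(1997, 2, 5)},
    {"introduced": date(1997, 2, 1),  "detected": date(1997, 4, 15), "removed": date(1997, 6, 20)},
    {"introduced": date(1997, 3, 5),  "detected": date(1997, 3, 20), "removed": date(1997, 4, 1)},
]

total = len(defects)

# Long Stay Defects: more than 100 days from introduction to removal.
long_stay = sum(1 for d in defects
                if (d["removed"] - d["introduced"]).days > 100)

# Late Detect Defects: more than 30 days from introduction to detection.
late_detect = sum(1 for d in defects
                  if (d["detected"] - d["introduced"]).days > 30)

print(f"Long Stay Defects:   {long_stay / total:.0%}")
print(f"Late Detect Defects: {late_detect / total:.0%}")
```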

Page 25: Development Efficiency

[Chart: Long Stay Defects (>100 days), Late Detect Defects (>30 days), Analysis Review Efficiency, and Design Review Efficiency by quarter, 4Q91 through 4Q97; vertical axis 0% to 100%]

Page 26: Quality System Metric

QUALITY LOSS
» Cost of Rework
» Cost of Scrap

QUALITY INVESTMENT
» Cost of Quality System
» Cost of Project QA Activities

QUALITY COST = QUALITY LOSS + QUALITY INVESTMENT

This metric is used to measure the effectiveness of the quality system.
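A minimal sketch of the quality-cost arithmetic, using the components defined above; all figures are made up, and Page 27 reports these as a percentage of development cost:

```python
# Quality loss: the cost of doing work over.
cost_of_rework = 40_000
cost_of_scrap  = 10_000
quality_loss   = cost_of_rework + cost_of_scrap

# Quality investment: the cost of preventing rework and scrap.
cost_of_quality_system = 30_000
cost_of_project_qa     = 25_000
quality_investment     = cost_of_quality_system + cost_of_project_qa

# QUALITY COST = QUALITY LOSS + QUALITY INVESTMENT
quality_cost = quality_loss + quality_investment

development_cost = 1_000_000
print(f"Quality Loss:       {quality_loss / development_cost:.1%}")
print(f"Quality Investment: {quality_investment / development_cost:.1%}")
print(f"Total Quality Cost: {quality_cost / development_cost:.1%}")
```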

Page 27: Cost of Quality

[Chart: Quality Loss, Quality Investment, and Total Quality Cost as a percentage of development cost, per quarter from 4Q91 through 4Q97; vertical axis 0% to 24%]

Page 28: Tools for Metrics System

Scheduling Tool
Tracking Tool (Time Registration Tool)
Defect Management Tool
Reporting Tool

Page 29: Summary

Improvement is not possible without measurement
Design measurable processes
Involve all stakeholders in the description of the processes and metrics

Page 30: Summary

Measure with a focus
Set targets and track achievement against the targets
Don't ignore results; provide continuous feedback
Never use metrics to evaluate groups or individuals

Page 31: Summary

Capture as much data as possible; report only those metrics that make sense
Audit the data collection process
Validate the conclusions with different metrics
Beware of aging metrics