Software Metrics – Overview Blackboard by Sirisha N

TRANSCRIPT

Page 1: Metrics  Sirisha

Software Metrics – Overview Blackboard

by Sirisha N

Page 2: Metrics  Sirisha

Objectives

• Understand metrics

• Apply metrics to drive testing projects

• Identify data sources and means of capturing metrics

Page 3: Metrics  Sirisha

Agenda

• Operational definitions

• Introduction
– Why Measure
– Purpose
– Metrics in ISO, CMM & CMMI
– Basic Definitions
– Institutionalize Metrics Program
– What to Measure
– Data Collection Strategy

• Metrics based Project Management

Page 4: Metrics  Sirisha

Purpose

• To plan projects (estimation based on past data)

• To control the project’s process
– Taking corrective and preventive action in a timely manner
– Monitoring the goals of the project set by the client/organization

• To provide feedback on the performance of projects

• To improve the organization’s processes/tools/methods

• To identify training needs based on project performance

Page 5: Metrics  Sirisha

Metrics in ISO, CMM & CMMI

ISO 9001:2000
• Section 8 – Measurement, Analysis & Improvement

CMM
• Level 4 KPAs
– Quantitative Process Management
– Software Quality Management

CMMI
• Level 2 PA – Measurement and Analysis
• Level 4 PAs
– Quantitative Project Management
– Organizational Process Performance

Page 6: Metrics  Sirisha

Basic Definitions

Metric
“A quantitative measure of the degree to which a system, component or process possesses a given attribute” – IEEE
e.g., no. of defects per KLOC/FP, no. of test cases authored per hour, defects per person hour

Measurement
“is the act of determining a measure of something”

Measure
“is a single quantitative attribute of an entity – the basic building block for a measurement”
e.g., 100 LOC – 100 is the measure, LOC is the unit of measure
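To make the measure/metric distinction concrete, a small worked example in Python (the numbers are illustrative only):

    # 45 defects (a measure) against 12,000 LOC (a measure) give the
    # derived metric "defects per KLOC".
    defects = 45
    loc = 12_000
    print(defects / (loc / 1000))  # 3.75 defects/KLOC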

Page 7: Metrics  Sirisha

Institutionalize Metrics Program

CMMI L2 – PA 5: Measurement and Analysis

SG 1 Align Measurement and Analysis Activities
– SP 1.1 Establish Measurement Objectives
– SP 1.2 Specify Measures
– SP 1.3 Specify Data Collection and Storage Procedures
– SP 1.4 Specify Analysis Procedures

SG 2 Provide Measurement Results
– SP 2.1 Collect Measurement Data
– SP 2.2 Analyze Measurement Data
– SP 2.3 Store Data and Results
– SP 2.4 Communicate Results

[Flow diagram: Clarify Business Goals → Prioritize Issues → Select & Define Measures → Collect, Verify & Store Data → Analyze Process Behavior. If the process is not stable, remove assignable causes; if stable but not capable, change the process; if stable and capable, continually improve. The loop restarts whenever new issues, new measures or new goals arise.]

Page 8: Metrics  Sirisha

What to Measure?

Process Metrics: metrics used to control the processes in a software system (e.g., productivity, efficiency)

Product Metrics: metrics used to measure attributes of the software product itself (not within the scope of EQA)

Project Metrics: metrics used to control the project life cycle process (e.g., effort variation, schedule variation)

Quality Metrics: metrics used to control quality in a product or service (e.g., CSI, % TC modified)

Page 9: Metrics  Sirisha

… What to Measure?

Software Test Metrics

Product Metrics
1. Size variation
2. Defect density
3. Code coverage
4. MTBF

Process Metrics
1. TCA productivity
2. TCR productivity
3. TCE productivity
4. Test case challenged percentage

Project Metrics
1. Effort variation
2. Schedule variation
3. Schedule compliance
4. Staff utilization

Quality Metrics
1. Adhoc bug %
2. Challenged bug %
3. Rejected bug %
4. Customer satisfaction index

Page 10: Metrics  Sirisha

Data Collection Strategy

Source artifacts feeding the metrics database:

• WBS (.xls) / Project WBS EQA 1.0 D1.xls – Planned effort, actual effort, planned start date, planned finish date, interim start date, interim finish date, actual start date, actual finish date
• Time Sheet (.xls) – Resource name, project name, build name, planned tasks, unplanned tasks, time spent
• Estimation Sheet (.xlt) / Estimation methodology – Estimated size, estimated effort, estimated resource count
• DTS (.xls) – Testing defects, customer identified defects (CID): defect ID, description, source/location, identified date/by, defect type/class, detected in phase, injected in phase, defect severity, defect status
• RL (.xls) – Review errors & defects: error/defect ID, description, source/location, identified date/by, error/defect status
• Test Report (.xls) – Test case ID, executed by, execution date, test procedure, expected results, actual results, execution status, defect description
• CID Report (.xls) – Defect description, identified by, identified date, defect type, defect priority
• Guidelines, templates

Data is collected in PROJECT DATA COLLECTION EQA 2.0 D1.XLS and PROJECT WBS EQA 1.0 D1.XLS.

Derived metrics:
1. Effort variation
2. Schedule variation
3. TCA productivity
4. TCR productivity
5. TCE productivity
6. Challenged TC %
7. Adhoc bug %
8. Challenged bug %
9. Rejected bug %
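As a sketch of how a derived metric could be computed from the collected sheets (the pandas loading step and the exact column labels below are assumptions for illustration, not prescribed by the slides):

    import pandas as pd

    # Hypothetical: load the WBS workbook named on this slide.
    wbs = pd.read_excel("Project WBS EQA 1.0 D1.xls")

    # Derived metric 1 (effort variation) per activity, assuming
    # 'Planned Effort' and 'Actual Effort' columns in person hours.
    wbs["Effort Variation %"] = (
        (wbs["Actual Effort"] - wbs["Planned Effort"])
        / wbs["Planned Effort"] * 100
    )
    print(wbs[["Activity", "Effort Variation %"]])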

Page 11: Metrics  Sirisha

Operational Definitions

• For each metric
– Objective
– Definition
– Formula used
– Unit of measure
– Key note

• For each metric
– Data input
– Data source
– Responsibility
– Frequency

A. Metrics Operational Definitions (fields):
Metric | UOM | Attribute/Entity | Definition | Life Cycle | Base Measure | Measurement Method

B. Decision Criteria (fields):
Data Pattern | Reporting Format | Data Extraction Cycle | Reporting Cycle | Distribution | Availability

C. Data Collection Procedure (fields):
Data Item | Data Type | Database Record | Data Elements/Fields | Who Collects the Data | Data Collection Rules & Procedures
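Purely as an illustration of the template above (the class and the sample values are assumptions; field names mirror the slide), an operational definition could be held as a simple record:

    from dataclasses import dataclass

    @dataclass
    class OperationalDefinition:
        metric: str           # e.g. "Effort Variation"
        uom: str              # unit of measure, e.g. "%"
        definition: str
        formula: str
        data_input: str
        data_source: str
        responsibility: str
        frequency: str

    effort_variation = OperationalDefinition(
        metric="Effort Variation",
        uom="%",
        definition="% deviation of actual effort from estimated effort",
        formula="(Actual - Estimated) / Estimated * 100",
        data_input="Estimated and actual effort (PH)",
        data_source="Project WBS EQA 1.0 D1.xls, Time Sheet",
        responsibility="PL/TL (EQA)",
        frequency="Weekly",
    )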

Page 12: Metrics  Sirisha

Effort Variation

Objective
– To improve estimation and productivity

Definition
– Effort variation is the % deviation of the actual effort spent on a project/phase/activity from the estimated effort

Formula used
Effort Variation (%) = ((Actual Effort – Estimated Effort) / Estimated Effort) × 100

Unit of Measure
– %
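A minimal sketch of the formula in code (the function name and the guard against a zero estimate are assumptions):

    def effort_variation_pct(actual_effort: float, estimated_effort: float) -> float:
        # % deviation of actual effort (PH) from estimated effort (PH).
        if estimated_effort <= 0:
            raise ValueError("estimated effort must be positive")
        return (actual_effort - estimated_effort) / estimated_effort * 100

    print(effort_variation_pct(120, 100))  # 20.0 -> 20% more effort than estimated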

Page 13: Metrics  Sirisha

Effort Variation

Key Note
– If the figure is negative, the effort put into the project is less than estimated
– If the figure is positive, the effort put into the project is more than estimated
– Can be used as a multiplication factor on the estimated effort to arrive at a more realistic figure

Data Input
– Activity code/Activity name
– Actual effort (PH)
– Estimated effort (PH)
– Estimation methodology

Other Inputs
– Effort type (Requirement Analysis / Test Design / Test Execution)
– Product/Build/Module/Phase

Page 14: Metrics  Sirisha

Effort Variation

Effort can be derived from size, if the productivity factor is known:

Effort (PH) = Size (# TC) / Productivity (TCA, TCR or TCE per Hr.)
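A one-line illustration of the derivation with hypothetical numbers:

    # 400 test cases at an execution productivity of 5 TCE/hr
    size_tc = 400
    productivity_per_hr = 5
    print(size_tc / productivity_per_hr)  # 80.0 person hours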

Data Collection Sheet for Effort Variation – Phase Wise

Columns: Activity Code | Product | Build | Modules | Phase | Activity | Planned Effort (in person hrs) | Actual Effort (in person hrs) | % Variation

Artifact: Project WBS EQA 1.0 D1.xls

Page 15: Metrics  Sirisha

Effort Variation

Responsibility
– PL/TL (EQA) will provide the estimated effort data
– Actual effort data per activity will be provided by every staff member via the time sheet

Frequency
– Estimated effort will be sent to SEPG at the time of estimation and effort distribution (including re-estimates)
– Actual effort data will be sent to SEPG on a weekly basis (time sheet)

Data Source
– Estimated effort: Proposal.doc, Contract.doc, Test Effort Estimation Sheet.doc
– Actual effort: Project Data Collection.xls, Project WBS EQA.xls, Time Sheet.xls

Page 16: Metrics  Sirisha

Schedule Variation

Objective
– To identify the current status of the project and check whether the project can meet the schedule deadline

Definition
– Schedule variation is the % deviation of the actual duration from the planned duration of a project, phase or activity

Formula used
Schedule Variation (%) = ((Actual Finish – Planned Finish) / ((Planned Finish – Planned Start) + 1)) × 100

Unit of Measure
– %
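A minimal sketch of the formula in code; the +1 in the denominator makes the planned duration inclusive of both endpoints (the function name and dates are illustrative):

    from datetime import date

    def schedule_variation_pct(planned_start: date, planned_finish: date,
                               actual_finish: date) -> float:
        # Slip in days relative to the inclusive planned duration.
        planned_duration = (planned_finish - planned_start).days + 1
        slip = (actual_finish - planned_finish).days
        return slip / planned_duration * 100

    # Planned 1-10 Jan (10 calendar days), finished 12 Jan -> 20.0 (2 days late)
    print(schedule_variation_pct(date(2015, 1, 1), date(2015, 1, 10), date(2015, 1, 12)))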

Page 17: Metrics  Sirisha

Schedule Variation

Key Note
– If the figure is positive, the schedule has crossed the target duration (finished late)
– If the figure is negative, the schedule is ahead of the target duration
– Can be used as a multiplication factor to estimate a realistic schedule for each milestone

Data Input
– Activity code/Activity name
– Planned start & finish dates
– Planned duration (PD)
– Actual start & finish dates
– Actual duration (AD)
– % complete

Other Inputs
– Task type (Technical/Project, Planned/Unplanned)
– Product/Build/Module/Phase

Page 18: Metrics  Sirisha

Schedule Variation

Data Collection Sheet for Schedule Variation – Phase Wise

Columns: Act. Code | Product | Build | Modules | Phase | Activity | Plan Duration (cal. days) | Plan Start Date | Plan Finish Date | Actual Duration (cal. days) | Actual Start Date | Actual Finish Date | % Variation | % Complete

Artifact: Project WBS EQA 1.0 D1.xls

Page 19: Metrics  Sirisha

Schedule Variation

Responsibility
– PL/TL (EQA) will provide the planned schedule data

Frequency
– Schedule data will be reported to SEPG at WBS completion and as and when re-scheduled during the PLC phases
– Final schedule data will be sent at project closure

Data Source
– Planned duration: WBS Sheet.mpp, Test Plan.doc
– Actual duration: Project Data Collection.xls, Project WBS EQA.xls

Page 20: Metrics  Sirisha

Productivity

Objective
– To find out the productivity of a project

Definition
– The size of the task completed per person hour of effort

Formula used
(TCA/TCR/TCE) Productivity = Actual Size (# TC) / Effort (PH)

Unit of Measure
– # TCA/PH, # TCR/PH, # TCE/PH
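A minimal sketch of the three productivity figures in code (function name and sample numbers are assumptions):

    def productivity(test_cases: int, effort_ph: float) -> float:
        # Test cases authored/reviewed/executed per person hour.
        if effort_ph <= 0:
            raise ValueError("effort must be positive")
        return test_cases / effort_ph

    print(productivity(90, 30))   # TCA: 3.0 test cases authored per PH
    print(productivity(120, 20))  # TCR: 6.0 test cases reviewed per PH
    print(productivity(200, 40))  # TCE: 5.0 test cases executed per PH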

Page 21: Metrics  Sirisha

Productivity

Key Note
– No. of test steps shall also be taken into account
– In case of test scripts, lines of script (LOS) shall be considered
– In case of back-end scripting, LOS generated by the tool shall be counted

Data Input
– # TC authored
– # TC reviewed
– # TC executed
– TCA effort spent
– TCR effort spent
– TCE effort spent

Other Inputs
– Date
– Resource ID
– Product/Build/Module

Page 22: Metrics  Sirisha

Productivity

Data Collection Sheet for Test Case Productivity

Columns: Date | Resource ID | Product | Build | Module | # TC Authored | TC Authoring Effort | # TC Reviewed | TC Reviewing Effort | # TC Executed | TC Execution Effort

Artifact: Project Data Collection EQA 2.0 D1.xls

Page 23: Metrics  Sirisha

Productivity

Responsibility
– PL/TL will provide the size and effort data (design/documentation/manual testing)
– A size-capturing tool will provide size data for coding/automated testing

Frequency
– Productivity data will be sent to SEPG on a weekly basis (every Friday)
– Final size and effort details are reported at every phase milestone

Data Source
– Effort data: Project Data Collection EQA 2.0 D1.xls
– Size data: Project Data Collection EQA 2.0 D1.xls, size-capturing tools

Page 24: Metrics  Sirisha

Adhoc/Challenged/Rejected Bug %

Objective
– To effectively identify and report bugs early in the product

Definition
– The % of adhoc/challenged/rejected bugs compared to the total number of bugs identified in the product

Formula used
Adhoc/Challenged/Rejected Bug % = (# of adhoc/challenged/rejected bugs / Total # of bugs found) × 100

Unit of Measure
– %
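A minimal sketch of the formula in code (function name is an assumption):

    def bug_pct(category_count: int, total_bugs: int) -> float:
        # Adhoc/challenged/rejected bugs as a % of all bugs found.
        if total_bugs == 0:
            return 0.0
        return category_count / total_bugs * 100

    print(bug_pct(12, 150))  # 8.0 -> 8% of posted bugs were challenged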

Page 25: Metrics  Sirisha

Adhoc/Challenged/Rejected Bug %

Key Note
– Challenged bugs are challenged by the client and later accepted
– Adhoc bugs are identified during adhoc/exploratory testing

Data Input
– Bugs by testing type
– Total bugs posted
– # Enhancement bugs
– # Challenged bugs
– # Redundant bugs
– # Invalid bugs

Other Inputs
– Date
– Resource ID
– Product/Build/Module
– Severity (Cr/H/M/L)

Page 26: Metrics  Sirisha

Adhoc/Challenged/Rejected Bug %

Data Collection Sheet for Bug Details

Columns: Date | Resource ID | Product | Build | Module | Bugs by Testing Type | # Bugs Posted | # Enhancements | # Challenged Bugs | # Redundant Bugs | # Invalid Bugs

Artifact: Project Data Collection EQA 2.0 D1.xls

Page 27: Metrics  Sirisha

Adhoc/Challenged/Rejected Bug %

Responsibility
– PL/TL will provide the bug data

Frequency
– Bug data will be sent to SEPG on a weekly basis (every Friday)
– Final defect details are reported at project closure

Data Source
– Bug data: Project Data Collection EQA 2.0 D1.xls

Page 28: Metrics  Sirisha

Process Capability Baseline

Is the Process Stable/Capable?

– Variation brings inconsistency into a process
– Variations are due to either chance causes or assignable causes
– 80% of the variation is caused by 20% of the causes
– Eliminating assignable-cause variation yields a stable process
– However, a stable process may not be capable!
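A minimal sketch of the stability check behind "remove assignable causes", using an XmR (individuals and moving range) chart; the sample data are illustrative assumptions, and 2.66 is the standard XmR chart constant:

    # Flag points outside the natural process limits (mean +/- 2.66 * mR-bar).
    samples = [4.2, 3.9, 4.1, 4.0, 4.3, 4.1, 4.2, 4.0, 7.5, 4.1]  # e.g. weekly defect density

    mean = sum(samples) / len(samples)
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)

    ucl = mean + 2.66 * mr_bar  # upper natural process limit
    lcl = mean - 2.66 * mr_bar  # lower natural process limit

    signals = [x for x in samples if x > ucl or x < lcl]
    print(f"limits [{lcl:.2f}, {ucl:.2f}], assignable-cause signals: {signals}")  # flags 7.5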

Page 29: Metrics  Sirisha

Metrics based Project Mgmt.

Page 30: Metrics  Sirisha

Metrics based Project Mgmt.

Page 31: Metrics  Sirisha

Metrics based Project Mgmt.

Page 32: Metrics  Sirisha

Q & A

Thank You