
Lecture 5: Estimation (revised)

Estimate size, then estimate effort, schedule, and cost from size

CS 551

Project Metrics

Cost and schedule estimation
Measure progress
Calibrate models for future estimating
Metric scope: manager, product

Number of projects × number of metrics ≈ 15-20

Approaches to Cost Estimation

• By expert

• By analogies

• Decomposition

• Parkinson’s Law; work expands to fill time available

• Pricing to win: price is set at customer willingness to pay

• Lines of Code

• Function Points

• Mathematical Models: Function Points & COCOMO

[Figure: time vs. staff-months, showing the T_theoretical curve, the 75% × T_theoretical limit, the impossible-design region below it, and a linear increase beyond.]

Boehm: “A project cannot be done in less than 75% of theoretical time.”

T_theoretical = 2.5 × ∛(staff-months)

But how can I estimate staff-months?

Sizing Software Projects

Effort = (productivity)⁻¹ × (size)^c

productivity ≡ KLOC/staff-month

size ≡ KLOC

[Figure: staff-months vs. size (lines of code or function points).]

Understanding the equations

Consider a transaction project of 38,000 lines of code. What is the shortest time it will take to develop? Module development runs at about 400 SLOC per staff-month.

Effort = (productivity)⁻¹ × (size)^c
       = (1 / 0.400 KSLOC/SM) × (38 KSLOC)^1.02
       = 2.5 × (38)^1.02 ≈ 100 SM

Min time = 0.75 × T_theoretical = 0.75 × 2.5 × (SM)^(1/3)
         ≈ 1.875 × (100)^(1/3)
         ≈ 1.875 × 4.63 ≈ 9 months
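The arithmetic above can be sketched in a few lines of Python (a minimal sketch; the coefficient 2.5 and the exponent 1.02 are taken from the slide's transaction-project example):

```python
# Effort and minimum schedule for the 38 KSLOC transaction project.
# Productivity: 400 SLOC per staff-month = 0.4 KSLOC per staff-month.

size_ksloc = 38.0
productivity = 0.4          # KSLOC per staff-month
c = 1.02                    # size exponent for transaction projects

# Effort = (productivity)^-1 * (size)^c
effort_sm = (1.0 / productivity) * size_ksloc ** c     # about 100 staff-months

# Boehm: theoretical time = 2.5 * cube root of staff-months,
# and a project cannot be done in less than 75% of that.
t_theoretical = 2.5 * effort_sm ** (1.0 / 3.0)
min_months = 0.75 * t_theoretical                      # about 9 months

print(f"effort = {effort_sm:.0f} SM, minimum schedule = {min_months:.1f} months")
```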

How many software engineers?

1 full-time staff week = 40 hours; 1 student week = 20 hours.

Therefore, our estimate of 100 staff-months is actually 200 student-months.

The period of coding is December 2004 through April 2005, a period of 5 months.

200 student-months / 5 months = 40 student software engineers; therefore simplification is mandatory.
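The staffing arithmetic follows directly (a sketch; the 20-hour student week is the slide's assumption):

```python
# Convert staff-months to student-months, then derive head count.
effort_staff_months = 100
hours_full_time, hours_student = 40, 20

# A student works half a full-time week, so each staff-month costs two student-months.
student_months = effort_staff_months * hours_full_time // hours_student  # 200

calendar_months = 5                                   # Dec 2004 through Apr 2005
engineers_needed = student_months // calendar_months  # 40 students

print(engineers_needed)
```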

Productivity = f(size)

[Figure (as of 8 Nov 07): productivity in function points per staff-month (0-12) vs. project size in function points (20 to 40,960, doubling scale), comparing Bell Laboratories data with Capers Jones data.]

Average Change Processing Time: for two Systems of Systems

[Figure: average workdays to process changes (0-160), broken out by change type: within groups, across groups, contract mods.]

Thanks to Barry Boehm

Effect of ignoring software structure

[Diagram: a WBS-based C4ISR organization: systems engineering and PM over Platforms, Sensors, Networks, and WMI elements, each with its own embedded SW, alongside the original software group.]

• Software risks discovered too late
• Slow, buggy change management

Thanks to Barry Boehm

Software Development Schedule Trends

Number of years ≈ 0.04 × ∛(NCKSLOC)

[Figure: years to develop (0-20) vs. thousands of source lines of code (10 to 100,000, log scale), with separate trend curves for software (SW) and hardware (HW).]

05/22/2007 (c) USC-CSSE 12

The Cone of Uncertainty: Usual result of total commitment

[Figure: relative cost range vs. phases and milestones (Feasibility, Concept of Operation, Plans and Rqts., Rqts. Spec., Product Design, Product Design Spec., Detail Design, Detail Design Spec., Devel. and Test, Accepted Software). The 90% confidence limits (pessimistic above, optimistic below) narrow from 4x / 0.25x at feasibility through 2x / 0.5x and 1.5x / 0.67x to 1.25x / 0.8x; an inadequate PDR reopens the range.]

Better to buy information to reduce risk

Thanks to Barry Boehm


There is Another Cone of Uncertainty: Shorter increments are better

[Figure: the same cone of uncertainty (4x / 0.25x narrowing to 1.25x / 0.8x across the same phases and milestones), traversed once per short increment, so each increment resolves its uncertainty quickly.]

Uncertainties in competition, technology, organizations, mission priorities

The Incremental Commitment Life Cycle Process: Overview

Stage I (Definition): concurrently engineer OpCon, requirements, architecture, plans, and prototypes, gated by anchor point milestones.

Stage II (Development and Operations): concurrently engineer increment N (operations), N+1 (development), and N+2 (architecture), gated by anchor point milestones.

ICM Stage II: Increment View

[Diagram: the increment N baseline feeds a short, stabilized development of increment N (stable development increments, high assurance) with its own current V&V, ending in increment N transition/O&M. In parallel, an agile rebaselining team absorbs rapid and foreseeable change (plan) into future increment baselines, supported by future and continuous V&V; unforeseeable change is adapted to as it arrives. Concerns, artifacts, deferrals, and resources flow between the agile rebaselining and stabilized development teams.]

Thanks to Barry Boehm

RUP/ICM Anchor Points Enable Concurrent Engineering

[Diagram: RUP/ICM anchor point milestones aligned across the life cycle: VCR, ACR, DCR, CD, C, IOC, OCR.]

Thanks to Barry Boehm

Lines of Code

LOC ≡ line of code
KLOC ≡ thousands of LOC
KSLOC ≡ thousands of source LOC
NCSLOC ≡ new or changed source LOC

Productivity per staff-month:

» 50 NCSLOC for OS code (or real-time systems)

» 250-500 NCSLOC for intermediary applications (high risk, on-line)

» 500-1000 NCSLOC for normal applications (low risk, on-line)

» 10,000-20,000 NCSLOC for reused code

Reuse note: sometimes code that does not provide the exact functionality needed can still be reused by reformatting its input/output. This costs performance but dramatically shortens development time.

Bernstein’s rule of thumb
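The rule-of-thumb bands above can be turned into a rough effort estimator (a sketch; the band boundaries are the slide's, while the band names and midpoint choices are illustrative assumptions):

```python
# Rough effort from size and application type, using the productivity
# bands quoted above (NCSLOC per staff-month; midpoints assumed for ranges).
PRODUCTIVITY = {
    "os": 50,              # OS or real-time code
    "intermediary": 375,   # midpoint of 250-500 (high risk, on-line)
    "normal": 750,         # midpoint of 500-1000 (low risk, on-line)
    "reused": 15_000,      # midpoint of 10,000-20,000
}

def staff_months(ncsloc: int, kind: str) -> float:
    """Effort = size / productivity for the chosen application band."""
    return ncsloc / PRODUCTIVITY[kind]

print(staff_months(38_000, "normal"))   # roughly 51 staff-months
```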

Productivity: Measured in 2000

Approach                                  NCSLOC per staff-month
Classical rates                           130-195
Evolutionary or incremental approaches    244-325
New embedded flight software              17-105

QSE Lambda Protocol

Prospectus
Measurable Operational Value
Prototyping or Modeling
sQFD
Schedule, Staffing, Quality Estimates
ICED-T
Trade-off Analysis

Heuristics for requirements engineering

Move some of the desired functionality into version 2
Deliver the product in stages 0.2, 0.4, …
Eliminate features
Simplify features
Reduce gold plating
Relax the specific feature specifications

Function Point (FP) Analysis

Useful during the requirements phase
Substantial data supports the methodology
Software skills and project characteristics are accounted for in the Adjusted Function Points
FP is technology- and process-dependent, so technology changes require recalibration of project models
Convert Unadjusted FPs (UFP) to LOC for a specific language (technology), then use a model such as COCOMO

Function Point Calculations

Unadjusted Function Points

UFP = 4I + 5O + 4E + 10L + 7F, where

I ≡ count of input types: user inputs that change data structures
O ≡ count of output types
E ≡ count of inquiry types: inputs controlling execution [think menu selections]
L ≡ count of logical internal files: internal data used by the system [think index files; groups of logically related data entirely within the application boundary, maintained by external inputs]
F ≡ count of interfaces: data output to or shared with another application

Note that the constants in the nominal equation can be calibrated to a specific software product line.
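The unadjusted-function-point formula can be expressed directly (a sketch using the nominal weights 4, 5, 4, 10, 7 from the slide; the example counts are made up):

```python
def unadjusted_fp(inputs: int, outputs: int, inquiries: int,
                  logical_files: int, interfaces: int) -> int:
    """UFP = 4I + 5O + 4E + 10L + 7F, with the nominal weights."""
    return (4 * inputs + 5 * outputs + 4 * inquiries
            + 10 * logical_files + 7 * interfaces)

# Hypothetical system: 6 inputs, 8 outputs, 4 inquiries,
# 2 logical internal files, 1 interface.
print(unadjusted_fp(6, 8, 4, 2, 1))   # 24 + 40 + 16 + 20 + 7 = 107
```

In practice the weights would be replaced by the simple/average/complex values from the complexity table, or recalibrated to the product line as the note above suggests.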

Complexity Table

TYPE             SIMPLE   AVERAGE   COMPLEX
INPUT (I)           3        4         6
OUTPUT (O)          4        5         7
INQUIRY (E)         3        4         6
LOG INT (L)         7       10        15
INTERFACES (F)      5        7        10

Complexity Factors

1. Problem Domain ___
2. Architecture Complexity ___
3. Logic Design: Data ___
4. Logic Design: Code ___

Total ___

Complexity = Total / 4 = _________

Problem Domain
Measure of complexity (1 is simple and 5 is complex):

1. All algorithms and calculations are simple.
2. Most algorithms and calculations are simple.
3. Most algorithms and calculations are moderately complex.
4. Some algorithms and calculations are difficult.
5. Many algorithms and calculations are difficult.

Score ____

Architecture Complexity
Measure of complexity (1 is simple and 5 is complex):

1. Code ported from one known environment to another. Application does not change more than 5%.
2. Architecture follows an existing pattern. Process design is straightforward. No complex hardware/software interfaces.
3. Architecture created from scratch. Process design is straightforward. No complex hardware/software interfaces.
4. Architecture created from scratch. Process design is complex. Complex hardware/software interfaces exist, but they are well defined and unchanging.
5. Architecture created from scratch. Process design is complex. Complex hardware/software interfaces are ill defined and changing.

Score ____

Logic Design: Data

1. Simple, well-defined, and unchanging data structures. Shallow inheritance in class structures; no object classes have inheritance greater than three.
2. Several data element types with straightforward relationships. No object classes have inheritance greater than three.
3. Multiple data files, complex data relationships, many libraries, large object library. No more than ten percent of the object classes have inheritance greater than three. The number of object classes is less than 1% of the function points.
4. Complex data elements, parameter passing module-to-module, complex data relationships, and many object classes with inheritance greater than three. A large but stable number of object classes.
5. Complex data elements, parameter passing module-to-module, complex data relationships, and many object classes with inheritance greater than three. A large and growing number of object classes. No attempt to normalize data between modules.

Score ____

Logic Design: Code

1. Nonprocedural code (4GL, generated code, screen skeletons). High cohesion. Programs inspected. Module size constrained between 50 and 500 source lines of code (SLOC).
2. Program skeletons or patterns used. High cohesion. Programs inspected. Module size constrained between 50 and 500 SLOC. Reused modules. Commercial object libraries relied on.
3. Well-structured, small modules with low coupling. Object class methods well focused and generalized. Modules with single entry and exit points. Programs reviewed.
4. Complex but known structure. Randomly sized modules. Some complex object classes. Error paths unknown. High coupling.
5. Code structure unknown, randomly sized modules, complex object classes, and error paths unknown. High coupling.

Score ____

Complexity Factors

1. Problem Domain ___
2. Architecture Complexity ___
3. Logic Design: Data ___
4. Logic Design: Code ___

Total ___

Complexity = Total / 4 = _________
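The four factor scores combine into a single rating as the worksheet shows (a minimal sketch of that tally):

```python
def complexity(problem_domain: int, architecture: int,
               logic_data: int, logic_code: int) -> float:
    """Average of the four complexity factor scores, each rated 1-5."""
    scores = (problem_domain, architecture, logic_data, logic_code)
    if not all(1 <= s <= 5 for s in scores):
        raise ValueError("each factor is scored 1 (simple) to 5 (complex)")
    return sum(scores) / 4

print(complexity(2, 3, 4, 3))   # (2 + 3 + 4 + 3) / 4 = 3.0
```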

Computing Function Points

See http://www.engin.umd.umich.edu/CIS/course.des/cis525/js/f00/artan/functionpoints.htm

Function Points Qualifiers

Based on counting data structures
Focus is on-line database systems
Less accurate for Web applications
Even less accurate for games, finite state machines, and algorithm software
Not useful for extended machine software and compilers

An alternative to NCKSLOC because estimates can be based on requirements and design data.

Initial Conversion

Language       Median SLOC per function point
C              104
C++            53
HTML           42
JAVA           59
Perl           60
J2EE           50
Visual Basic   42

Source: http://www.qsm.com/FPGearing.html

SLOC

Function points = UFP × TCF = 78 × 0.96 = 74.88 ≈ 75 function points

78 UFP × 53 SLOC/UFP (C++) = 4,134 SLOC = 4.134 KSLOC

(Reference for SLOC per function point: http://www.qsm.com/FPGearing.html)
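The conversion chain (UFP → adjusted FP → SLOC) can be sketched with the gearing factors from the table above (the TCF of 0.96 and the 78-UFP count are this example's figures):

```python
# Median SLOC per function point (from http://www.qsm.com/FPGearing.html).
SLOC_PER_FP = {"C": 104, "C++": 53, "HTML": 42, "JAVA": 59,
               "Perl": 60, "J2EE": 50, "Visual Basic": 42}

ufp = 78
tcf = 0.96                         # technical complexity factor
adjusted_fp = ufp * tcf            # 78 * 0.96 = 74.88 adjusted function points
sloc = ufp * SLOC_PER_FP["C++"]    # 78 * 53 = 4,134 SLOC

print(f"{adjusted_fp:.2f} FP, {sloc} SLOC = {sloc / 1000:.3f} KSLOC")
```

The KSLOC figure can then feed a model such as COCOMO, as noted earlier.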

Function Point pros and cons

Pros:

• Language independent

• Understandable by client

• Simple modeling

• Hard to fudge

• Visible feature creep

Cons:

• Labor intensive
• Extensive training required
• Inexperience results in inconsistent results
• Weighted to file manipulation and transactions
• Systematic error introduced by a single rater; multiple raters advised

Heuristics to do Better Estimates

Decompose the Work Breakdown Structure to the lowest possible level and type of software
Review assumptions with all stakeholders
Do your homework: use past organizational experience
Retain contact with developers
Update estimates and track new projections (and warn)
Use multiple methods
Reuse makes it easier (and more difficult)
Use a ‘current estimate’ scheme

Specification for Development Plan

Project Feature List
Development Process
Size Estimates
Staff Estimates
Schedule Estimates
Organization
Gantt Chart