Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009


Page 1: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

Earned Value Management as a measure of

“Quality Technical Accomplishment”

Joseph Houser
Dan Milano

Paul Solomon

January 2009

Page 2: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

Objective:

Review the current expectation that EVM provide management with a measure of quality technical accomplishment and progress

• Current environment

• Review OMB and DoD Policy and Guides

• Review the ANSI/EIA 748 EVM Standard

• Identify gaps

• Next steps

Most descriptions of EVM include a measure of technical progress

2

Page 3: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

Earned Value Management has matured over the past 40+ years, with several success stories

• 1967 – DoD - Cost/Schedule Control Systems Criteria

• 1996 – OMB Circular A-11, Part 3

• 1997 – DoD - Earned Value Management Systems Criteria

• 1998 – ANSI/EIA-748 EVM Standard (Comm’l)

• 2002 – OMB Circular A-11, Part 7 (requires 748 compliance – all Agencies)

• 2002 – ANSI/EIA-748-A

• 2004 – NDIA EVMS Intent Guide

• 2005 – PMI EVMS Practice Standard

• 2006 – ANSI/EIA 748 (update to recognize Program Level EVM)

EVM has matured over the years and the Government accepts and endorses ANSI/EIA 748 EVM Standard

3

Page 4: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

MATURE PM PROCESSES AND PRACTICES USING EVM IMPROVES BUSINESS MEASURES

[Figure: four charts correlating Program Management Capability (Minimal Capability, Marginal Performer, Qualified Participant, Best in Class, Composite World Class) with business measures — Program CPARS Rating (worst to best), Program Award Fee Capture (percent award fee capture, roughly -8% to +8% around the mid-point), Company Return on Sales (roughly -15% to +15%), and 3-Year Average Company Win Rate (roughly -30% to +30%). Source: 00-Mar-21 DCMC Conference.]

4

Page 5: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

Improved cost and schedule control processes and practices do not have to increase PM costs

[Chart: Program Office FTEs as a percent of total program FTEs (roughly -30% to +30% around the mid-point) plotted against Program Management Capability (Minimal Capability, Marginal Performer, Qualified Participant, Best in Class, Composite World Class).

FTE = Full Time Equivalent. Program office functions counted: Program Manager(s), Deputy Program Manager(s), Financial Manager(s)/Financial Analyst(s), Scheduler(s)/Planner(s), Configuration and Data Manager(s), Chief Engineer(s)/Chief Technical Specialists, IPT or Functional Team Leads, Risk Focal Point(s), Subcontract Management, Administrative Support, and other program office functions.]

5

Page 6: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

[Chart: FAA Cost of Program Management Using EVM — "Cost of PM Using EVM (FY06)". PM % of total costs (roughly 6.5% to 34.5%) plotted against quality of EVM implementation (RED / YELLOW / GREEN), with weighted-average PM % of 22.2% (RED), 16.6% (YELLOW), and 13.0% (GREEN). Quality of EVM implementation is based on FAA EVM assessments (FAA EVM Flag); PM % of total cost is based on the FY06 Resource Planning Document (RPD).]

The FAA “Cost of EVM” study indicated that programs with mature EVM incur lower PM costs
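For reference, the chart's weighted averages follow from simple arithmetic: within each EVM flag color, the weighted-average PM% is total PM cost divided by total program cost. The sketch below illustrates the calculation with hypothetical program figures (not the FAA study data).

# Illustrative only: hypothetical programs grouped by FAA EVM flag color.
# For each color, the weighted-average PM% is total PM cost / total program cost.
programs = [
    # (EVM flag, total program cost $M, PM cost $M)
    ("RED",    120.0, 26.0),
    ("RED",     80.0, 18.5),
    ("YELLOW", 150.0, 25.0),
    ("YELLOW",  95.0, 15.5),
    ("GREEN",  200.0, 26.0),
    ("GREEN",  110.0, 14.0),
]

totals = {}
for flag, total_cost, pm_cost in programs:
    cost_sum, pm_sum = totals.get(flag, (0.0, 0.0))
    totals[flag] = (cost_sum + total_cost, pm_sum + pm_cost)

for flag, (cost_sum, pm_sum) in totals.items():
    print(f"{flag}: weighted-average PM% = {pm_sum / cost_sum:.1%}")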

6

Page 7: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

[Chart repeated from the previous slide: FAA Cost of Program Management Using EVM, "Cost of PM Using EVM (FY06)".]

EVM has produced several success stories, and the Government and industry are striving to add to them

7

Page 8: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

Most EVM training includes integration of technical / schedule / cost

“All programs have an element of risk requiring effective and integrated cost / schedule management processes.”

[Diagram: triangle with Technical, Schedule, and Cost at the vertices — superior technical solution, quick delivery, and low cost — with Risk Management at the center.]

8

Page 9: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

9

OMB requires Quality measurement during Acquisition of Capital Assets

Circular No. A-11, Section 300,

Planning, Budgeting, Acquisition and Management of Capital Assets, Section 300-5

• Performance-based acquisition management
• Based on EVMS standard
• Measure progress towards milestones:
  • Cost
  • Capability to meet specified requirements
  • Timeliness
  • Quality

Page 10: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

10

PMI PMBOK® Guide recognizes Product Scope and quality/technical parameters

10.5.1.1 Project Management Plan
• PMB:
  - Typically integrates scope, schedule, and cost parameters of a project
  - May also include technical and quality parameters

5. Project Scope Management, 2 elements
• Product scope. The features and functions that characterize a product, service, or result.
• Project scope. The work that needs to be accomplished to deliver a product, service, or result with the specified features and functions.

It can be argued that project management plans should always include technical and quality parameters

Page 11: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

GAO Expects EVM to measure technical progress

GAO Cost Guide:

“Reliable EVM data usually indicate monthly how well a program is performing in terms of cost, schedule, and technical matters.”

“A WBS is an essential part of EVM cost, schedule, and technical monitoring, because it provides a consistent framework from which to measure actual progress.”

“The benefits of using EVM are singularly dependent on the data from the EVM system. Organizations must be able to evaluate the quality of an EVM system in order to determine the extent to which the cost, schedule, and technical performance data can be relied on for program management purposes.”

11

Page 12: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

Management's expectation that EVM include measures of quality technical progress is reasonable

12

PMI PMBOK® Guide

10.5.1.1 Project Management Plan

5. Project Scope Management

GAO Cost Guide

Page 13: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

OSD: “the left bar chart illustrates the fact that roughly half of our key Earned Value data fields are empty, for a variety of reasons”

• EVM Data Quality: Unacceptable for Critical Measurements and Decision-making

• Funding Data Quality: Acceptable for Critical Measurements and Decision-making

13

Page 14: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

14

OSD: “the left bar chart illustrates the fact that roughly half of our key Earned Value data fields are empty, for a variety of reasons”

Page 15: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

March 2008:

• DCMA determined that the data were of poor quality and issued a report stating that they are deficient to the point where the government is not obtaining useful program performance data to manage risks.

A recent GAO report included a poor-quality-data finding on a major procurement

15

Page 16: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

GAO March 2008:

• DCMA determined that the data were of poor quality and issued a report stating that they are deficient to the point where the government is not obtaining useful program performance data to manage risks.

The EVM community needs to conduct a root cause analysis with corrective action to regain our customers' confidence

16

DCMA has significantly increased oversight with the intent to improve the usefulness of EVM to management

Page 17: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

Let's summarize:

• EVM works, with numerous success stories
• Some implementations go beyond the 32 ANSI guidelines and have integrated quality and technical parameters
• Integration of scope, schedule, cost, quality, and technical measures is “desired by our stakeholders using EVM data”
• EVM data integrity is a major issue with OSD

17

Page 18: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

• DoDI 5000.02, Operation of the Defense Acquisition System (POL)

• Defense Acquisition Guidebook (DAG)

• Systems Engineering Plan (SEP) Preparation Guide

• WBS Handbook, MIL-HDBK-881A (WBS)

• Integrated Master Plan & Integrated Master Schedule Preparation & Use Guide (IMP/IMS)

• Guide for Integrating SE into DoD Acquisition Contracts (Integ SE)

18

DoD Policy and Guides Specify Technical Performance

Page 19: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

• DAG, WBS, IMP/IMS, SEP
  • Integrated plans
    • WBS, SEP, IMP/IMS
  • Technical Performance Measures (TPM)
  • EVM

• Technical reviews
  • Event-driven timing
  • Success criteria
  • Assess technical maturity

• Integ SE Guide
  • Include technical baselines in IMP/IMS
  • During IBR, review:
    • Correlation of TPMs, IMP, IMS, EVM
    • Success criteria

19

DoD Policy and Guides: Common Elements

Page 20: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

ANSI/EIA 748 EVM Standard
Paragraph 3.8 – Performance Measurement

“Earned value is a direct measurement of the quantity of work accomplished. The quality and technical content of work performed is controlled by other processes. Earned value is a value added metric that is computed on the basis of the resources assigned to the completed work scope as budget.”

EXAMPLES:

1. If a test is complete (design meets the requirements), then it is acceptable to claim 100% earned value of the planned scope for “test”

2. If software design, code, and test are complete, then it is acceptable to claim 100% earned value of the planned scope for “SW Development”

ANSI does not require links or interfaces to quality or technical parameter measurement processes

ANSI does not require technical or quality parameter measurement, only measurement of the “quantity of work accomplished”
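For reference, a minimal sketch of the EVM arithmetic the quoted paragraph describes — earned value credited as the budget of completed work scope, independent of the technical or quality content of that work. The work packages and figures below are hypothetical.

# Minimal EVM arithmetic: earned value credits the budget of completed work scope,
# regardless of the technical or quality content of that work.
work_packages = [
    # (name, budget $K, planned fraction complete, actual fraction complete, actual cost $K)
    ("Design",         400.0, 1.00, 1.00, 450.0),
    ("SW Development", 600.0, 0.75, 0.60, 500.0),
    ("Test",           200.0, 0.25, 0.00,  30.0),
]

pv = sum(budget * planned for _, budget, planned, _, _ in work_packages)  # BCWS
ev = sum(budget * actual for _, budget, _, actual, _ in work_packages)    # BCWP
ac = sum(cost for _, _, _, _, cost in work_packages)                      # ACWP

print(f"PV={pv:.0f}  EV={ev:.0f}  AC={ac:.0f}")
print(f"SV={ev - pv:.0f}  CV={ev - ac:.0f}  SPI={ev / pv:.2f}  CPI={ev / ac:.2f}")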

20

Page 21: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

ANSI/EIA 748 EVM Standard

Paragraph 2.2 Planning, Scheduling, and Budgeting

• a) Schedule the authorized work in a manner which describes the sequence of work and identifies significant task interdependencies required to meet the requirements of the program.

• b) Identify physical products, milestones, technical performance goals, or other indicators that will be used to measure progress.

Note: Technical performance goals are acceptable, but not required

ANSI recognizes technical performance goals (but does not require them), and there are no references to quality parameters

21

Page 22: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

Let's summarize OMB, OSD policies, and ANSI/EIA 748 as related to EVM, quality, and technical performance:

• OMB Guide
  • Measure capability to meet requirements
  • Measure quality

• OSD Policies and Guides
  • Integrate WBS, IMP/IMS, EVM with TPMs
  • Include success criteria of technical reviews in IMS
  • Assess technical maturity

• ANSI/EIA 748 EVM Standard
  • EVM limited to “quantity of work accomplished”
  • Technical performance goals recognized, but not required
  • Quality performance is not referenced
  • Quality and technical content are specifically referenced as being “controlled by other processes”

22

Page 23: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

Is it possible to include quality and technical parameters with EVM?

• FAA has incorporated standard program milestones with exit criteria that represent:
  • Quality and technical parameters
  • Decision authority to verify acceptable quality and technical performance progress

23

Page 24: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

24

The FAA technical milestones (required by AMS policy) are defined with clear exit criteria and decision authority

STANDARD AMS SYSTEM MILESTONES, WBS MAPPING, AND DECISION AUTHORITY

S18 – Final investment decision: The JRC approves and baselines an investment program at the final investment decision. The decision document is the XPB and its planning attachments.
  Level: 1 | AMS Phase: Final investment analysis | Applicability: F, C, N
  WBS 2.2.3: Final Investment Decision - Documentation. All activities associated with completing the final investment decision, which includes the following documents: JRC briefing package, revalidated MNS, final requirements document, final XPB, final acquisition strategy, final integrated program plan (with risk management plan), and the final investment analysis report.
  Decision Authority: Joint Resources Council

S19 – Integrated baseline review completed: The IBR conduct is complete and the service team and leader agree on action items.
  Level: 2 | AMS Phase: Solution implementation | Applicability: F, C, N
  WBS 3.1.1: Solution Implementation - Program Planning, Authorization, Management and Control. All activities associated with developing the strategy for implementing and executing the overall program. All activities associated with planning, authorizing, and managing all actions and activities that must be accomplished for successful program development, which includes preparation of the acquisition strategy paper and the integrated program plan, and the project-specific input to agency-level planning documents, such as the call for estimates and the NAS architecture. It also includes all activities required to ensure that all cost, schedule, performance, and benefit objectives are met.
  Decision Authority: Service team leader

S24 – Preliminary design review (PDR) completed: PDR is conducted by the service team to determine conformity of functional characteristics of the design to baseline requirements. The PDR represents approval to begin detailed design. The PDR is complete when the service team determines that action items resulting from the review are sufficiently completed and the contracting officer authorizes the contractor to proceed.
  Level: 2 | AMS Phase: Solution implementation | Applicability: F, N
  WBS 3.2.3: Solution Development - Analysis, Design, and Integration. All activities associated with the overall analysis, design, test, and integration of the solution (e.g., hardware system, software, facility, and telecommunications). This includes design, integrity, test and analysis, intra- and inter-system compatibility assurance (interface identification, analysis, and design), and the integration and balancing of reliability, maintainability, producibility, safety, and survivability. Design includes allocating functions to appropriate elements (e.g., hardware, software, telecommunications, user functions, services, facilities, etc.), and presenting prepared design information at identified design reviews.
  Decision Authority: Service team leader

S31 – Operational test & evaluation (OT&E) completed: OT&E is conducted in an environment as operationally realistic as possible. This milestone is completed when government integration and shakedown testing has been performed at the test and evaluation site. It occurs when all test procedures have been successfully completed per the test plan.
  Level: 2 | AMS Phase: Solution implementation | Applicability: F, C, N
  WBS 3.5.2: System Operational Test and Evaluation. All activities associated with tests and evaluations conducted to assess the prospective system’s utility, operational effectiveness, operational suitability, and logistics supportability (including compatibility, interoperability, reliability, maintainability, logistics requirements, security administration, etc.). It includes all support activities (e.g., technical assistance, maintenance, labor, material, support elements, and testing spares) required during this phase of testing. All activities associated with development and construction of those special test facilities, test simulators, test beds, and models required for performance of the operational tests.
  Decision Authority: OT&E team leader (WJHTC)

Applicability legend: F = Full Scale Development; C = Commercial Off The Shelf; N = Non-Development Items
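A mapping like the one above can be carried into planning or EVM tooling as structured data. The sketch below is a hypothetical encoding of two of the milestones; the field names are illustrative assumptions, not an FAA schema.

# Hypothetical encoding of AMS standard milestones (not an official FAA schema).
AMS_MILESTONES = {
    "S18": {
        "description": "Final investment decision",
        "level": 1,
        "ams_phase": "Final investment analysis",
        "applicability": ["F", "C", "N"],
        "wbs": "2.2.3",
        "decision_authority": "Joint Resources Council",
    },
    "S24": {
        "description": "Preliminary design review (PDR) completed",
        "level": 2,
        "ams_phase": "Solution implementation",
        "applicability": ["F", "N"],
        "wbs": "3.2.3",
        "decision_authority": "Service team leader",
    },
}

def decision_authority(milestone_id: str) -> str:
    """Return the authority responsible for verifying the milestone's exit criteria."""
    return AMS_MILESTONES[milestone_id]["decision_authority"]

print(decision_authority("S24"))  # Service team leader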

Page 25: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

25

The FAA AMS technical milestones are used by multiple processes to align common performance measures

[Same STANDARD AMS SYSTEM MILESTONES, WBS MAPPING, AND DECISION AUTHORITY table as the previous slide.]

Page 26: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

FAA EVM summarizes the work and activities required to achieve the FAA AMS Standard Program Milestones

[Table: FAA “x300” program milestone EVM summary. Columns: planned completion date, planned total cost ($M), estimated completion date, actual completion date, total cost ($M) planned and actual, schedule variance (# days), cost variance ($M), and percent complete. Rows cover the total program baseline and the AMS milestones by phase — Planning: (S9) Initial Investment Decision, (S18) Final Investment Decision (FID); Solution Implementation (Acquisition), Design Phase (JRC approved): Program Management, (S19) IBR, (S20) Contract Award, (S24) PDR, (S25) CDR, (S26) Prod Demo Decision, Other; Product Demonstration Phase (JRC approved): Program Management, (S28) System Delivery, (S30) Development T&E (DT&E), (S31) Operational T&E (OT&E), (S34) Prod Readiness Review (PRR), (S35) Production Decision, Other; plus the planned Production and Deployment Phase.]

26

Page 27: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

The FAA implementation of ANSI/EIA 748 includes clear and well-understood quality and technical parameters

27

[Same FAA program milestone EVM summary table as the previous slide.]

Page 28: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

• Milestone success criteria met to claim 100% EV
  • CDR: Design solution meets
    – Allocated performance requirements
    – Functional performance requirements
    – Interface requirements

• Interim milestones with planned values for TPMs
  • Weight does not exceed 300 lb. at (date)
  • 90% of software functional requirements (“shalls”) met by (date)

• Base EV on 2 measures (see the sketch below):
  • Completion of enabling work products (drawings, code)
  • Meeting product requirements (as documented in the technical baseline)

28

Examples of integrating technical performance with EVM
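A minimal sketch of the last bullet group's idea — basing earned value on both completion of enabling work products and product requirements met — using hypothetical counts and a simple “earn the lesser fraction” rule; this is an illustration of the approach, not a prescribed formula.

# Hypothetical work package: EV is credited only to the extent that both the enabling
# work products are complete and the product requirements they support are verified.
budget = 500.0  # $K, budget at completion for the work package

drawings_complete, drawings_planned = 18, 24          # enabling work products
requirements_met, requirements_allocated = 40, 50     # product requirements verified

work_product_fraction = drawings_complete / drawings_planned
requirements_fraction = requirements_met / requirements_allocated

# One possible rule: earn the smaller of the two fractions, so unfinished technical
# work cannot be masked by completed drawings or code.
earned_value = budget * min(work_product_fraction, requirements_fraction)
print(f"EV = {earned_value:.1f} $K of {budget:.1f} $K budget")  # EV = 375.0 $K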

Page 29: Earned Value Management as a measure of “Quality Technical Accomplishment” Joseph Houser Dan Milano Paul Solomon January 2009

29

Some of the possible ANSI revisions are:

Section as is: 1. Introduction — “The principles of an EVMS are: Plan all work scope for the program to completion.”
Recommended clarification: Clarify that work scope includes technical and quality requirements.

Section as is: Paragraph 3.8 – Performance Measurement — “Earned value is a direct measurement of the quantity of work accomplished. The quality and technical content of work performed is controlled by other processes. Earned value is a value added metric that is computed on the basis of the resources assigned to the completed work scope as budget.”
Recommended clarification: Clarify that earned value is a direct measurement of the quantity of work accomplished and of technical/quality performance.