Project Management: Systems Engineering & Project Control Processes and Requirements

JPR 7120.3
March 2004

JPR 7120.3 JSC Procedures and Guidelines for Project Management:

Systems Engineering and Project Control Processes and Requirements

Development Team

This document was developed as a joint effort between the JSC Systems Engineering Working Group (J-SEWG) and the Project Management Working Group (PMWG).

Consultation support provided by the following personnel:

Booz-Allen & Hamilton, Inc./Jack Gavalas
Raytheon Technical Services Company, LLC/Larry Patrick
Raytheon Technical Services Company, LLC/Brian Vining
Raytheon Technical Services Company, LLC/Harold Smith

Editorial and graphics design support provided by the following personnel:

DynCorp/Betty Conaway
InDyne/Sharon Hecht

InDyne/Sid Jones

J-SEWG:
AG/James Ortiz (Chair)
AG/Jeff Phillips, Dwight Auzenne, Ralph Anderson
EA/Linda Bromley, Joyce Carpenter, Cliff Farmer
DA/Larry Bishop, Patricia Carreon
IA/Jon Symes, John Jurgensen
JA/Jay Hoover
NA/Daniel Freund
RA/Jason Noble
SA/Karen Morrison
XA/Cuong Nguyen
OA/Jonette Stecklein

PMWG:
AG/Lee Graham (Chair)
AG/Samuel Padgett
EA/Linda Bromley
DA/Jerome Yencharis
IA/Jon Symes
JA/Gary Wessels
AH/Brad Mudgett, Erica Vandersand
NA/Jeevan Perera
RA/Jason Noble
SA/Douglas Whitehead
LA/Richard Whitlock

Change Record

Revision Date Organization/Phone Description

Basic March 2004 AG/Lee Graham/281-244-5192

Change 1 April 2006 AG/Bobby Watkins/281-483-0243 Change in applicability to basic and applied research.

This JSC Procedures Guideline (JPG) is being changed to a JSC Procedures Requirement (JPR) document with no content changes. All JPGs referenced in this document will be evaluated during the document's annual review, and this document will be updated to reflect any changes to those referenced JPGs.

Table of Contents

Chapter 1 Introduction------------------------------------------------------------------------------- 1-1 1.1 Purpose------------------------------------------------------------------- 1-1 1.2 Applicability and Scope --------------------------------------------- 1-1 1.3 Authority ---------------------------------------------------------------- 1-2 1.4 References--------------------------------------------------------------- 1-3 1.4.1 External Documents -------------------------------------------------- 1-3 1.4.2 NASA Documents ---------------------------------------------------- 1-4 1.4.3 JSC Documents -------------------------------------------------------- 1-7 1.5 Cancellation------------------------------------------------------------- 1-7

Chapter 2 Overview of Project Management at JSC----------------------------------------- 2-1 2.1 Scope --------------------------------------------------------------------- 2-1 2.2 Systems Engineering ------------------------------------------------- 2-1 2.3 Project Control--------------------------------------------------------- 2-1 2.4 Project Types, Approaches, and Life Cycle Phases---------- 2-2 2.4.1 Project Types ----------------------------------------------------------- 2-2 2.4.1.1 Space flight system development projects---------------------- 2-2 2.4.1.2 Advanced technology development projects------------------- 2-2 2.4.1.3 Science research, applied research, and

advanced studies------------------------------------------------------- 2-2 2.4.1.4 Institutional------------------------------------------------------------- 2-3 2.4.1.5 Operations--------------------------------------------------------------- 2-3 2.4.2 Project Approaches --------------------------------------------------- 2-4 2.4.2.1 Staffing------------------------------------------------------------------- 2-4 2.4.2.2 Type of contract ------------------------------------------------------- 2-4 2.4.2.3 Magnitude--------------------------------------------------------------- 2-5 2.4.2.4 Implementation approach ------------------------------------------- 2-5 2.4.3 Project Life Cycle Phases ------------------------------------------- 2-6 2.5 Project Management Forums --------------------------------------- 2-7 2.5.1 Project-level Forums ------------------------------------------------- 2-7 2.5.2 Directorate-level Forums -------------------------------------------- 2-7 2.5.3 Center-level Forums -------------------------------------------------- 2-7 2.5.3.1 JSC Engineering Review Board (ERB) ------------------------- 2-7 2.5.3.2 JSC Facility Review Board (FRB)-------------------------------- 2-7 2.5.3.3 JSC Project Management Council -------------------------------- 2-8 2.6 Documentation Tree -------------------------------------------------- 2-11 2.7 Project Team------------------------------------------------------------ 2-12 2.7.1 Organization ------------------------------------------------------------ 2-12 2.7.2 Membership------------------------------------------------------------- 2-12 2.7.2.1 Project manager role -------------------------------------------------- 2-12 2.7.2.2 Deputy project manager (DPM) role ----------------------------- 2-13 2.7.2.3 Lead systems engineer (LSE) role -------------------------------- 2-13 2.7.2.4 Project control officer role ------------------------------------------ 2-13 2.7.2.5 Subsystem and discipline lead engineers role ----------------- 2-14 2.7.2.6 Verification lead role ------------------------------------------------- 2-14 2.7.2.7 Validation lead role --------------------------------------------------- 2-15 2.7.2.8 Project scientis t role -------------------------------------------------- 2-15 2.7.2.9 User representative role---------------------------------------------- 2-15 2.7.2.10 Administrative officer role ------------------------------------------ 2-15 2.7.3 Support Team ---------------------------------------------------------- 2-15 2.7.3.1 Line management support------------------------------------------- 2-15

STS104-723-014 (21 July 2001) Backdropped over a wide scene of topography in the Middle East, the International Space Station (ISS) passes over the Persian Gulf. This photograph was taken with a 70mm handheld camera during a fly-around inspection by the Space Shuttle Atlantis crew not long after the two spacecraft separated. Prominent on the starboard side of the outpost is the newly-installed Quest airlock.

STS104-315-005 (12-24 July 2001) With Earth’s horizon in the background, astronaut Michael L. Gernhardt, STS-104 mission specialist, participates in one of three spacewalks aimed at wrapping up work on the second phase of the ISS. Gernhardt was joined on this spacewalk by astronaut James F. Reilly.

2.7.3.2 Safety and mission assurance (S&MA) support -------------- 2-16 2.7.3.3 Mission operations support ----------------------------------------- 2-16 2.7.3.4 Test operations lead support---------------------------------------- 2-16 2.7.3.5 Facilities support ------------------------------------------------------ 2-17 2.7.3.6 Environmental support----------------------------------------------- 2-17 2.7.3.7 Counterintelligence support ---------------------------------------- 2-17 2.7.3.8 Export control support ----------------------------------------------- 2-17 2.7.3.9 Logistics, transportation, and shipping support --------------- 2-18 2.7.3.10 Procurement support ------------------------------------------------- 2-18 2.7.3.11 Office of the Chief Financial Officer (CFO)

support ------------------------------------------------------------------- 2-18 2.7.3.12 Office of the Chief Engineer support ---------------------------- 2-18 2.7.3.13 Legal support----------------------------------------------------------- 2-18 2.7.3.14 Communication and information technology

support ------------------------------------------------------------------- 2-19 2.7.3.15 Documentation and graphics support ---------------------------- 2-19 2.7.3.16 Photographic-video support ---------------------------------------- 2-19 2.7.3.17 Public Affairs Office support -------------------------------------- 2-19 2.7.3.18 Human factors support----------------------------------------------- 2-19 2.7.3.19 Technology transfer and intellectual property

management support ------------------------------------------------- 2-19 2.8 Project Management and Planning ------------------------------- 2-21 2.8.1 Project Management-------------------------------------------------- 2-21 2.8.2 Project Management Plan and Project Baseline--------------- 2-21 2.8.3 Program Operating Plan (Project Baseline

Updates) ----------------------------------------------------------------- 2-22 2.8.4 Customer-Project Agreement -------------------------------------- 2-22

Chapter 3 Project Life Cycle Requirements---------------------------------------------------- 3-1 3.1 Pre-Phase A – Advanced Studies --------------------------------- 3-2 3.1.1 Overview---------------------------------------------------------------- 3-2 3.1.2 Expected Results/Outcomes---------------------------------------- 3-2 3.1.3 Products------------------------------------------------------------------ 3-2 3.1.4 Process------------------------------------------------------------------- 3-2 3.1.5 Reviews – Concept Review ---------------------------------------- 3-2 3.1.6 Management Decision ----------------------------------------------- 3-2 3.1.7 References--------------------------------------------------------------- 3-3 3.2 Phase A – Preliminary Analysis ----------------------------------- 3-4 3.2.1 Overview---------------------------------------------------------------- 3-4 3.2.2 Expected Results/Outcomes---------------------------------------- 3-4 3.2.3 Products------------------------------------------------------------------ 3-4 3.2.4 Process------------------------------------------------------------------- 3-4 3.2.5 Reviews ------------------------------------------------------------------ 3-4 3.2.5.1 The Requirements Review------------------------------------------ 3-4 3.2.5.2 The Definition Review----------------------------------------------- 3-4 3.2.6 Management Decision ----------------------------------------------- 3-6 3.2.7 References--------------------------------------------------------------- 3-6 3.3 Phase B – Definition ------------------------------------------------- 3-7 3.3.1 Overview---------------------------------------------------------------- 3-7 3.3.2 Expected Results/Outcomes---------------------------------------- 3-7 3.3.3 Products------------------------------------------------------------------ 3-7 3.3.4 Process------------------------------------------------------------------- 3-7 3.3.5 Reviews ------------------------------------------------------------------ 3-7 3.3.5.1 The System Requirements Review------------------------------- 3-7

ISS01-323-010 (8 November 2000) Early film documentation of the Expedition 1 crewmembers on board the ISS shows cosmonauts Sergei K. Krikalev (left) and Yuri P. Gidzenko at work in the Zvezda Service Module. This is one of the first film images that was released from the Expedition 1 crew.

ISS01-E-5129 (December 2000) Astronaut William M. (Bill) Shepherd, Expedition One commander, works on the Ward Room table in the Zvezda Service Module aboard the ISS.

3.3.5.2 The System Definition Review------------------------------------ 3-8 3.3.5.3 The Preliminary Design Review ---------------------------------- 3-9 3.3.6 Management Decision ----------------------------------------------- 3-10 3.3.7 References--------------------------------------------------------------- 3-10 3.4 Phase C – Design------------------------------------------------------ 3-11 3.4.1 Overview---------------------------------------------------------------- 3-11 3.4.2 Expected Results/Outcomes---------------------------------------- 3-11 3.4.3 Products------------------------------------------------------------------ 3-11 3.4.4 Process------------------------------------------------------------------- 3-11 3.4.5 Reviews – The Critical Design Review------------------------- 3-11 3.4.6 Management Decision ----------------------------------------------- 3-12 3.4.7 References--------------------------------------------------------------- 3-12 3.5 Phase D – Development --------------------------------------------- 3-13 3.5.1 Overview---------------------------------------------------------------- 3-13 3.5.2 Expected Results/Outcomes---------------------------------------- 3-13 3.5.3 Products------------------------------------------------------------------ 3-13 3.5.4 Process------------------------------------------------------------------- 3-13 3.5.5 Reviews ------------------------------------------------------------------ 3-16 3.5.5.1 The Production Readiness Review ------------------------------- 3-16 3.5.5.2 The Test Readiness Review---------------------------------------- 3-17 3.5.5.3 The System Acceptance Review ---------------------------------- 3-17 3.5.5.4 The Deployment Readiness Review ----------------------------- 3-17 3.5.5.5 The Operational Readiness Review------------------------------ 3-18 3.5.6 Management Decision ----------------------------------------------- 3-18 3.5.7 References--------------------------------------------------------------- 3-18 3.6 Phase E – Operations------------------------------------------------- 3-19 3.6.1 Overview---------------------------------------------------------------- 3-19 3.6.2 Expected Results/Outcomes---------------------------------------- 3-19 3.6.3 Products------------------------------------------------------------------ 3-19 3.6.4 Process------------------------------------------------------------------- 3-19 3.6.5 Reviews ------------------------------------------------------------------ 3-19 3.6.5.1 The Delta Operational Readiness Review ---------------------- 3-19 3.6.5.2 The System Upgrade Review-------------------------------------- 3-19 3.6.5.3 The Safety Review---------------------------------------------------- 3-19 3.6.5.4 The Decommissioning Review ------------------------------------ 3-20 3.6.6 Management Decision ----------------------------------------------- 3-20 3.6.7 References--------------------------------------------------------------- 3-20 3.7 Project Termination--------------------------------------------------- 3-21 3.7.1 Overview---------------------------------------------------------------- 3-21 3.7.2 Activities ---------------------------------------------------------------- 3-21 3.7.2.1 Nominal termination ------------------------------------------------- 3-21 3.7.2.2 Off-nominal termination -------------------------------------------- 3-21 3.7.3 Products------------------------------------------------------------------ 3-22 3.7.4 
Process------------------------------------------------------------------- 3-22 3.7.5 Reviews ------------------------------------------------------------------ 3-22 3.7.5.1 The Termination Review-------------------------------------------- 3-22 3.7.5.2 The Decommissioning Review ------------------------------------ 3-22 3.7.6 Management Decision ----------------------------------------------- 3-24 3.7.7 References--------------------------------------------------------------- 3-24

STS105-725-006 (16 August 2001) Astronaut Daniel T. Barry, STS-105 mission specialist, traverses along the Space Shuttle Discovery’s payload bay, backdropped against the blue and white Earth, during one of two days of spacewalks. Barry was joined by astronaut Patrick G. Forrester, mission specialist, on both of the spacewalks scheduled for the STS-105 mission.

STS105-E-5168 (13 August 2001) Astronaut Yury V. Usachev works out on the treadmill device in the Zvezda Service Module aboard the ISS. The Expedition 2 mission commander is only days away from returning to Earth following five months aboard the orbital outpost.

Chapter 4 Project Management Processes and Requirements ----------------------------- 4-1 4.0.1 References--------------------------------------------------------------- 4-3 4.1 Systems Engineering Processes ----------------------------------- 4-4 4.1.1 Requirements Development ---------------------------------------- 4-4 4.1.1.1 Function------------------------------------------------------------------ 4-4 4.1.1.2 Objective ---------------------------------------------------------------- 4-4 4.1.1.3 Responsibilities -------------------------------------------------------- 4-4 4.1.1.4 Life cycle ---------------------------------------------------------------- 4-4 4.1.1.5 Inputs --------------------------------------------------------------------- 4-4 4.1.1.6 Steps---------------------------------------------------------------------- 4-4 4.1.1.7 Outputs ------------------------------------------------------------------- 4-6 4.1.1.8 Exit criteria -------------------------------------------------------------- 4-6 4.1.1.9 Measurement ----------------------------------------------------------- 4-7 4.1.1.10 Methods and techniques--------------------------------------------- 4-7 4.1.1.11 Software tools ---------------------------------------------------------- 4-7 4.1.1.12 References--------------------------------------------------------------- 4-7 4.1.2 Requirements Management----------------------------------------- 4-9 4.1.2.1 Function------------------------------------------------------------------ 4-9 4.1.2.2 Objective ---------------------------------------------------------------- 4-9 4.1.2.3 Responsibilities -------------------------------------------------------- 4-9 4.1.2.4 Life cycle ---------------------------------------------------------------- 4-9 4.1.2.5 Inputs --------------------------------------------------------------------- 4-9 4.1.2.6 Steps---------------------------------------------------------------------- 4-9 4.1.2.7 Outputs ------------------------------------------------------------------- 4-10 4.1.2.8 Exit criteria -------------------------------------------------------------- 4-10 4.1.2.9 Measurement ----------------------------------------------------------- 4-11 4.1.2.10 Methods and techniques--------------------------------------------- 4-11 4.1.2.11 Software tools ---------------------------------------------------------- 4-11 4.1.2.12 References--------------------------------------------------------------- 4-11 4.1.3 Operational Concept Development------------------------------- 4-12 4.1.3.1 Function------------------------------------------------------------------ 4-12 4.1.3.2 Objective ---------------------------------------------------------------- 4-12 4.1.3.3 Responsibilities -------------------------------------------------------- 4-12 4.1.3.4 Life cycle ---------------------------------------------------------------- 4-12 4.1.3.5 Inputs --------------------------------------------------------------------- 4-12 4.1.3.6 Steps---------------------------------------------------------------------- 4-12 4.1.3.7 Outputs ------------------------------------------------------------------- 4-13 4.1.3.8 Exit criteria -------------------------------------------------------------- 4-14 4.1.3.9 Measurement ----------------------------------------------------------- 4-14 4.1.3.10 Methods and techniques--------------------------------------------- 4-14 4.1.3.11 Software tools ---------------------------------------------------------- 4-14 4.1.3.12 
References--------------------------------------------------------------- 4-14 4.1.4 Decomposition--------------------------------------------------------- 4-15 4.1.4.1 Function------------------------------------------------------------------ 4-15 4.1.4.2 Objective ---------------------------------------------------------------- 4-15 4.1.4.3 Responsibilities -------------------------------------------------------- 4-15 4.1.4.4 Life cycle ---------------------------------------------------------------- 4-15 4.1.4.5 Inputs --------------------------------------------------------------------- 4-15 4.1.4.6 Steps---------------------------------------------------------------------- 4-15 4.1.4.7 Outputs ------------------------------------------------------------------- 4-17 4.1.4.8 Exit criteria -------------------------------------------------------------- 4-17 4.1.4.9 Measurement ----------------------------------------------------------- 4-17 4.1.4.10 Methods and techniques--------------------------------------------- 4-17 4.1.4.11 Software tools ---------------------------------------------------------- 4-18

STS100-347-025 (28 April 2001) The Canadian-built Space Station robotic arm transfers its launch cradle over to Endeavour’s Canadian-built robotic arm. A Canadian mission specialist, astronaut Chris A. Hadfield, was instrumental in the activity as he was at the controls of the original robot arm from his post on the aft flight deck of the Shuttle.

STS100-395-017 (19 April - 1 May 2001) A close look at the window in this picture of the Destiny laboratory reveals the faces of astronauts Susan J. Helms and James S. Voss, flight engineers for the Expedition 2 mission. One of the two STS-100 space walkers, astronauts Scott E. Parazynski and Chris A. Hadfield, exposed the image with a 35mm camera on one of two days of performing spacewalks.

4.1.4.12 References--------------------------------------------------------------- 4-18 4.1.5 Feasibility Study------------------------------------------------------- 4-19 4.1.5.1 Function------------------------------------------------------------------ 4-19 4.1.5.2 Objective ---------------------------------------------------------------- 4-19 4.1.5.3 Responsibilities -------------------------------------------------------- 4-19 4.1.5.4 Life cycle ---------------------------------------------------------------- 4-19 4.1.5.5 Inputs --------------------------------------------------------------------- 4-19 4.1.5.6 Steps---------------------------------------------------------------------- 4-19 4.1.5.7 Outputs ------------------------------------------------------------------- 4-20 4.1.5.8 Exit criteria -------------------------------------------------------------- 4-21 4.1.5.9 Measurement ----------------------------------------------------------- 4-21 4.1.5.10 Methods and techniques--------------------------------------------- 4-21 4.1.5.11 Software tools ---------------------------------------------------------- 4-21 4.1.5.12 References--------------------------------------------------------------- 4-22 4.1.6 Technology Planning------------------------------------------------- 4-23 4.1.6.1 Function------------------------------------------------------------------ 4-23 4.1.6.2 Objectives --------------------------------------------------------------- 4-23 4.1.6.3 Responsibilities -------------------------------------------------------- 4-23 4.1.6.4 Life cycle ---------------------------------------------------------------- 4-23 4.1.6.5 Inputs --------------------------------------------------------------------- 4-23 4.1.6.6 Steps---------------------------------------------------------------------- 4-23 4.1.6.7 Outputs ------------------------------------------------------------------- 4-25 4.1.6.8 Exit criteria -------------------------------------------------------------- 4-25 4.1.6.9 Measurement ----------------------------------------------------------- 4-25 4.1.6.10 Methods and techniques--------------------------------------------- 4-25 4.1.6.11 Software tools ---------------------------------------------------------- 4-25 4.1.6.12 References--------------------------------------------------------------- 4-25 4.1.7 Design-------------------------------------------------------------------- 4-26 4.1.7.1 Function------------------------------------------------------------------ 4-26 4.1.7.2 Objective ---------------------------------------------------------------- 4-26 4.1.7.3 Responsibilities -------------------------------------------------------- 4-26 4.1.7.4 Life cycle ---------------------------------------------------------------- 4-27 4.1.7.5 Inputs --------------------------------------------------------------------- 4-27 4.1.7.6 Steps---------------------------------------------------------------------- 4-27 4.1.7.7 Outputs ------------------------------------------------------------------- 4-28 4.1.7.8 Exit criteria -------------------------------------------------------------- 4-29 4.1.7.9 Measurement ----------------------------------------------------------- 4-29 4.1.7.10 Methods and techniques--------------------------------------------- 4-29 4.1.7.11 Software tools ---------------------------------------------------------- 4-29 4.1.7.12 References--------------------------------------------------------------- 4-30 4.1.8 Attainment 
-------------------------------------------------------------- 4-31 4.1.8.1 Function------------------------------------------------------------------ 4-31 4.1.8.2 Objective ---------------------------------------------------------------- 4-31 4.1.8.3 Responsibilities -------------------------------------------------------- 4-31 4.1.8.4 Life cycle ---------------------------------------------------------------- 4-31 4.1.8.5 Inputs --------------------------------------------------------------------- 4-31 4.1.8.6 Steps---------------------------------------------------------------------- 4-31 4.1.8.7 Outputs ------------------------------------------------------------------- 4-32 4.1.8.8 Exit criteria -------------------------------------------------------------- 4-33 4.1.8.9 Measurement ----------------------------------------------------------- 4-33 4.1.8.10 Methods and techniques--------------------------------------------- 4-33 4.1.8.11 Software tools ---------------------------------------------------------- 4-33 4.1.8.12 References--------------------------------------------------------------- 4-34 4.1.9 Integration--------------------------------------------------------------- 4-35 4.1.9.1 Function------------------------------------------------------------------ 4-35 4.1.9.2 Objective ---------------------------------------------------------------- 4-35

STS108-725-010 (5-17 December 2001) Backdropped by the blackness of space, the ISS was photographed by a crew member aboard the Space Shuttle Endeavour.

STS108-350-009 (10 December 2001) Astronaut Linda M. Godwin, STS-108 mission specialist, works during a four-hour, 12-minute spacewalk. The main objective of the spacewalk was to install thermal blankets on mechanisms that rotate the ISS’s main solar arrays.

4.1.9.3 Responsibilities -------------------------------------------------------- 4-35 4.1.9.4 Life cycle ---------------------------------------------------------------- 4-35 4.1.9.5 Inputs --------------------------------------------------------------------- 4-35 4.1.9.6 Steps---------------------------------------------------------------------- 4-35 4.1.9.7 Outputs ------------------------------------------------------------------- 4-40 4.1.9.8 Exit criteria -------------------------------------------------------------- 4-40 4.1.9.9 Measurement ----------------------------------------------------------- 4-40 4.1.9.10 Methods and techniques--------------------------------------------- 4-41 4.1.9.11 Software tools ---------------------------------------------------------- 4-41 4.1.9.12 References--------------------------------------------------------------- 4-41 4.1.10 Technical Work and Resource Management------------------- 4-42 4.1.10.1 Function------------------------------------------------------------------ 4-42 4.1.10.2 Objective ---------------------------------------------------------------- 4-42 4.1.10.3 Responsibilities -------------------------------------------------------- 4-42 4.1.10.4 Life cycle ---------------------------------------------------------------- 4-42 4.1.10.5 Inputs --------------------------------------------------------------------- 4-42 4.1.10.6 Steps---------------------------------------------------------------------- 4-43 4.1.10.7 Outputs ------------------------------------------------------------------- 4-45 4.1.10.8 Exit criteria -------------------------------------------------------------- 4-45 4.1.10.9 Measurement ----------------------------------------------------------- 4-45 4.1.10.10 Methods and techniques--------------------------------------------- 4-46 4.1.10.11 Software tools ---------------------------------------------------------- 4-46 4.1.10.12 References--------------------------------------------------------------- 4-46 4.1.11 Safety and Mission Success---------------------------------------- 4-47 4.1.11.1 Function------------------------------------------------------------------ 4-47 4.1.11.2 Objective ---------------------------------------------------------------- 4-47 4.1.11.3 Responsibilities -------------------------------------------------------- 4-47 4.1.11.4 Life cycle ---------------------------------------------------------------- 4-48 4.1.11.5 Inputs --------------------------------------------------------------------- 4-48 4.1.11.6 Steps---------------------------------------------------------------------- 4-48 4.1.11.7 Outputs ------------------------------------------------------------------- 4-51 4.1.11.8 Exit criteria -------------------------------------------------------------- 4-51 4.1.11.9 Measurement ----------------------------------------------------------- 4-51 4.1.11.10 Methods and techniques--------------------------------------------- 4-51 4.1.11.11 Software tools ---------------------------------------------------------- 4-51 4.1.11.12 References--------------------------------------------------------------- 4-51 4.1.12 Control ------------------------------------------------------------------- 4-53 4.1.12.1 Function------------------------------------------------------------------ 4-53 4.1.12.2 Objective ---------------------------------------------------------------- 4-53 4.1.12.3 Responsibilities -------------------------------------------------------- 4-53 4.1.12.4 Life cycle 
---------------------------------------------------------------- 4-53 4.1.12.5 Inputs --------------------------------------------------------------------- 4-53 4.1.12.6 Steps---------------------------------------------------------------------- 4-53 4.1.12.7 Outputs ------------------------------------------------------------------- 4-55 4.1.12.8 Exit criteria -------------------------------------------------------------- 4-55 4.1.12.9 Measurement ----------------------------------------------------------- 4-55 4.1.12.10 Methods and techniques--------------------------------------------- 4-56 4.1.12.11 Software tools ---------------------------------------------------------- 4-56 4.1.12.12 References--------------------------------------------------------------- 4-56 4.1.13 System Analysis ------------------------------------------------------- 4-57 4.1.13.1 Function------------------------------------------------------------------ 4-57 4.1.13.2 Objective ---------------------------------------------------------------- 4-57 4.1.13.3 Responsibilities -------------------------------------------------------- 4-57 4.1.13.4 Life cycle ---------------------------------------------------------------- 4-57 4.1.13.5 Inputs --------------------------------------------------------------------- 4-57 4.1.13.6 Steps---------------------------------------------------------------------- 4-58

ISS003-E-5415 (10 September 2001) Expedition 3 mission commander Frank L. Culbertson, Jr., conducts inflight maintenance with a ratchet under a panel in the Unity Node 1 on the ISS. This image was taken with a digital still camera.

ISS003-E-5634 (17 September 2001) Cosmonaut Mikhail Tyurin, Expedition 3 flight engineer, packs the docking probe in a stowage bag in Unity. The docking probe successfully guided the arrival of the Russian-built Pirs docking compartment to the ISS.

4.1.13.7 Outputs ------------------------------------------------------------------- 4-59 4.1.13.8 Exit criteria -------------------------------------------------------------- 4-59 4.1.13.9 Measurement ----------------------------------------------------------- 4-59 4.1.13.10 Methods and techniques--------------------------------------------- 4-60 4.1.13.11 Software tools ---------------------------------------------------------- 4-60 4.1.13.12 References--------------------------------------------------------------- 4-60 4.1.14 Verification ------------------------------------------------------------- 4-61 4.1.14.1 Function------------------------------------------------------------------ 4-61 4.1.14.2 Objective ---------------------------------------------------------------- 4-61 4.1.14.3 Responsibilities -------------------------------------------------------- 4-61 4.1.14.4 Life cycle ---------------------------------------------------------------- 4-61 4.1.14.5 Inputs --------------------------------------------------------------------- 4-61 4.1.14.6 Steps---------------------------------------------------------------------- 4-62 4.1.14.7 Outputs ------------------------------------------------------------------- 4-63 4.1.14.8 Exit criteria -------------------------------------------------------------- 4-63 4.1.14.9 Measurement ----------------------------------------------------------- 4-63 4.1.14.10 Methods and techniques--------------------------------------------- 4-63 4.1.14.11 Software tools ---------------------------------------------------------- 4-64 4.1.14.12 References--------------------------------------------------------------- 4-64 4.1.15 Validation --------------------------------------------------------------- 4-65 4.1.15.1 Function------------------------------------------------------------------ 4-65 4.1.15.2 Objective ---------------------------------------------------------------- 4-65 4.1.15.3 Responsibilities -------------------------------------------------------- 4-65 4.1.15.4 Life cycle ---------------------------------------------------------------- 4-65 4.1.15.5 Inputs --------------------------------------------------------------------- 4-65 4.1.15.6 Steps---------------------------------------------------------------------- 4-65 4.1.15.7 Outputs ------------------------------------------------------------------- 4-66 4.1.15.8 Exit criteria -------------------------------------------------------------- 4-67 4.1.15.9 Measurement ----------------------------------------------------------- 4-67 4.1.15.10 Methods and techniques--------------------------------------------- 4-67 4.1.15.11 Software tools ---------------------------------------------------------- 4-67 4.1.15.12 References--------------------------------------------------------------- 4-68 4.1.16 Reviews ------------------------------------------------------------------ 4-69 4.1.16.1 Function------------------------------------------------------------------ 4-69 4.1.16.2 Objective ---------------------------------------------------------------- 4-69 4.1.16.3 Responsibilities -------------------------------------------------------- 4-69 4.1.16.4 Life cycle ---------------------------------------------------------------- 4-69 4.1.16.5 Inputs --------------------------------------------------------------------- 4-69 4.1.16.6 Steps---------------------------------------------------------------------- 4-69 4.1.16.7 Outputs ------------------------------------------------------------------- 4-71 4.1.16.8 
Exit criteria -------------------------------------------------------------- 4-71 4.1.16.9 Measurement ----------------------------------------------------------- 4-71 4.1.16.10 Methods and techniques--------------------------------------------- 4-71 4.1.16.11 Software tools ---------------------------------------------------------- 4-72 4.1.16.12 References--------------------------------------------------------------- 4-72 4.2 Project Control Processes ------------------------------------------- 4-73 4.2.1 Resource Management----------------------------------------------- 4-73 4.2.1.1 Function------------------------------------------------------------------ 4-73 4.2.1.2 Objective ---------------------------------------------------------------- 4-74 4.2.1.3 Responsibilities -------------------------------------------------------- 4-74 4.2.1.4 Life cycle ---------------------------------------------------------------- 4-74 4.2.1.5 Inputs --------------------------------------------------------------------- 4-74 4.2.1.6 Steps---------------------------------------------------------------------- 4-74 4.2.1.7 Outputs ------------------------------------------------------------------- 4-74 4.2.1.8 Exit criteria -------------------------------------------------------------- 4-74 4.2.1.9 Measurement ----------------------------------------------------------- 4-74

JSC2001-E-24454 (8 August 2001) Astronaut John M. Grunsfeld, STS-109 payload commander, uses virtual reality hardware at the Johnson Space Center (JSC) to rehearse some of his duties on the upcoming STS-109 mission, NASA’s fourth servicing visit to the Hubble Space Telescope.

STS109-E-5048 (3 March 2002) The top portion of the Hubble Space Telescope is photographed some 350 miles above the Pacific Ocean southwest of Mexico, as the Space Shuttle Columbia is about to use its 50-foot-long robotic arm to lower the telescope into its cargo bay. The image was one of a series recorded with a digital still camera.

4.2.1.10 Methods and techniques--------------------------------------------- 4-74 4.2.1.11 Software tools ---------------------------------------------------------- 4-75 4.2.1.12 References--------------------------------------------------------------- 4-75 4.2.2 Planning------------------------------------------------------------------ 4-76 4.2.2.1 Function------------------------------------------------------------------ 4-76 4.2.2.2 Objective ---------------------------------------------------------------- 4-76 4.2.2.3 Responsibilities -------------------------------------------------------- 4-76 4.2.2.4 Life cycle ---------------------------------------------------------------- 4-76 4.2.2.5 Inputs --------------------------------------------------------------------- 4-76 4.2.2.6 Steps---------------------------------------------------------------------- 4-76 4.2.2.7 Outputs ------------------------------------------------------------------- 4-76 4.2.2.8 Exit criteria -------------------------------------------------------------- 4-77 4.2.2.9 Measurement ----------------------------------------------------------- 4-77 4.2.2.10 Methods and techniques--------------------------------------------- 4-78 4.2.2.11 Software tools ---------------------------------------------------------- 4-78 4.2.2.12 References--------------------------------------------------------------- 4-78 4.2.3 Documentation and Data Management-------------------------- 4-79 4.2.3.1 Function------------------------------------------------------------------ 4-79 4.2.3.2 Objective ---------------------------------------------------------------- 4-79 4.2.3.3 Responsibilities -------------------------------------------------------- 4-79 4.2.3.4 Life cycle ---------------------------------------------------------------- 4-79 4.2.3.5 Inputs --------------------------------------------------------------------- 4-79 4.2.3.6 Steps---------------------------------------------------------------------- 4-79 4.2.3.7 Outputs ------------------------------------------------------------------- 4-80 4.2.3.8 Exit criteria -------------------------------------------------------------- 4-80 4.2.3.9 Measurement ----------------------------------------------------------- 4-80 4.2.3.10 Methods and techniques--------------------------------------------- 4-80 4.2.3.11 Software tools ---------------------------------------------------------- 4-80 4.2.3.12 References--------------------------------------------------------------- 4-81 4.2.4 Cost Estimating -------------------------------------------------------- 4-82 4.2.4.1 Function------------------------------------------------------------------ 4-82 4.2.4.2 Objective ---------------------------------------------------------------- 4-82 4.2.4.3 Responsibilities -------------------------------------------------------- 4-82 4.2.4.4 Life cycle ---------------------------------------------------------------- 4-82 4.2.4.5 Inputs --------------------------------------------------------------------- 4-82 4.2.4.6 Steps---------------------------------------------------------------------- 4-82 4.2.4.7 Outputs ------------------------------------------------------------------- 4-83 4.2.4.8 Exit criteria -------------------------------------------------------------- 4-83 4.2.4.9 Measurement ----------------------------------------------------------- 4-83 4.2.4.10 Methods and techniques--------------------------------------------- 4-84 4.2.4.11 Software tools 
---------------------------------------------------------- 4-84 4.2.4.12 References--------------------------------------------------------------- 4-84 4.2.5 Performance Measurement ----------------------------------------- 4-85 4.2.5.1 Function------------------------------------------------------------------ 4-85 4.2.5.2 Objective ---------------------------------------------------------------- 4-86 4.2.5.3 Responsibilities -------------------------------------------------------- 4-86 4.2.5.4 Life cycle ---------------------------------------------------------------- 4-86 4.2.5.5 Inputs --------------------------------------------------------------------- 4-86 4.2.5.6 Steps---------------------------------------------------------------------- 4-87 4.2.5.7 Outputs ------------------------------------------------------------------- 4-89 4.2.5.8 Exit criteria -------------------------------------------------------------- 4-89 4.2.5.9 Measurement ----------------------------------------------------------- 4-89 4.2.5.10 Methods and techniques--------------------------------------------- 4-89 4.2.5.11 Software tools ---------------------------------------------------------- 4-89 4.2.5.12 References--------------------------------------------------------------- 4-90 4.2.6 Schedule Management----------------------------------------------- 4-91

STS108-E-5594 (15 December 2001) As seen in a medium view from a digital still camera aimed through a window on Endeavour’s aft flight deck, the ISS, now staffed with its fourth three-person crew, is contrasted against a patch of the blue and white Earth during a farewell look from the Shuttle following undocking. The Destiny laboratory is partially covered with shadows in the foreground.

ISS004-E-10071 (17 April 2002) Moments prior to the undocking of the Space Shuttle Atlantis from the ISS, an Expedition 4 crewmember took this digital still photograph from a window in the Pirs Docking Compartment. The STS-110 crew spent about a week aboard the ISS and successfully installed the S0 (S-zero) truss. Also visible in this image are the Soyuz Spacecraft, Space Station Remote Manipulator System/Canadarm2 and Pressurized Mating Adapter 3.

4.2.6.1 Function------------------------------------------------------------------ 4-91 4.2.6.2 Objective ---------------------------------------------------------------- 4-91 4.2.6.3 Responsibilities -------------------------------------------------------- 4-91 4.2.6.4 Life cycle ---------------------------------------------------------------- 4-91 4.2.6.5 Inputs --------------------------------------------------------------------- 4-91 4.2.6.6 Steps---------------------------------------------------------------------- 4-91 4.2.6.7 Outputs ------------------------------------------------------------------- 4-92 4.2.6.8 Exit criteria -------------------------------------------------------------- 4-92 4.2.6.9 Measurement ----------------------------------------------------------- 4-93 4.2.6.10 Methods and techniques--------------------------------------------- 4-93 4.2.6.11 Software tools ---------------------------------------------------------- 4-93 4.2.6.12 References--------------------------------------------------------------- 4-93 4.2.7 Project Analysis ------------------------------------------------------- 4-94 4.2.7.1 Function------------------------------------------------------------------ 4-94 4.2.7.2 Objective ---------------------------------------------------------------- 4-94 4.2.7.3 Responsibilities -------------------------------------------------------- 4-94 4.2.7.4 Life cycle ---------------------------------------------------------------- 4-94 4.2.7.5 Inputs --------------------------------------------------------------------- 4-94 4.2.7.6 Steps---------------------------------------------------------------------- 4-94 4.2.7.7 Outputs ------------------------------------------------------------------- 4-96 4.2.7.8 Exit criteria -------------------------------------------------------------- 4-96 4.2.7.9 Measurement ----------------------------------------------------------- 4-96 4.2.7.10 Methods and techniques--------------------------------------------- 4-96 4.2.7.11 Software tools ---------------------------------------------------------- 4-96 4.2.7.12 References--------------------------------------------------------------- 4-96 4.3 Crosscutting Processes----------------------------------------------- 4-97 4.3.1 Acquisition Management-------------------------------------------- 4-97 4.3.1.1 Function------------------------------------------------------------------ 4-97 4.3.1.2 Objective ---------------------------------------------------------------- 4-97 4.3.1.3 Responsibilities -------------------------------------------------------- 4-97 4.3.1.4 Life cycle ---------------------------------------------------------------- 4-97 4.3.1.5 Inputs --------------------------------------------------------------------- 4-97 4.3.1.6 Steps---------------------------------------------------------------------- 4-98 4.3.1.7 Outputs ------------------------------------------------------------------- 4-99 4.3.1.8 Exit criteria -------------------------------------------------------------- 4-99 4.3.1.9 Measurement ----------------------------------------------------------- 4-99 4.3.1.10 Methods and techniques--------------------------------------------- 4-100 4.3.1.11 Software tools ---------------------------------------------------------- 4-100 4.3.1.12 References--------------------------------------------------------------- 4-100 4.3.2 Risk Management ----------------------------------------------------- 4-101 4.3.2.1 
Function------------------------------------------------------------------ 4-101 4.3.2.2 Objective ---------------------------------------------------------------- 4-101 4.3.2.3 Responsibilities -------------------------------------------------------- 4-101 4.3.2.4 Life cycle ---------------------------------------------------------------- 4-101 4.3.2.5 Inputs --------------------------------------------------------------------- 4-101 4.3.2.6 Steps---------------------------------------------------------------------- 4-101 4.3.2.7 Outputs ------------------------------------------------------------------- 4-102 4.3.2.8 Exit criteria -------------------------------------------------------------- 4-103 4.3.2.9 Measurement ----------------------------------------------------------- 4-103 4.3.2.10 Methods and techniques--------------------------------------------- 4-103 4.3.2.11 Software tools ---------------------------------------------------------- 4-104 4.3.2.12 References--------------------------------------------------------------- 4-104 4.3.3 Configuration Management ---------------------------------------- 4-105 4.3.3.1 Function------------------------------------------------------------------ 4-105 4.3.3.2 Objective ---------------------------------------------------------------- 4-105 4.3.3.3 Responsibilities -------------------------------------------------------- 4-105

STS110-718-013 (13 April 2002) Astronaut Lee M. E. Morin, STS-110 mission specialist, anchored on the mobile foot restraint on the ISS Canadarm2, moves toward the Station’s newly installed S0 (S-zero) truss during this second scheduled spacewalk. Astronaut Jerry L. Ross, mission specialist, worked in tandem with Morin on the spacewalk.

STS110-366-002 (11 April 2002) Astronaut Steven L. Smith, STS-110 mission specialist, works inside the S0 truss, newly installed on the ISS. Astronaut Rex J. Walheim (out of frame), mission specialist, worked in tandem with Smith during the spacewalk.

4.3.3.4 Life cycle ---------------------------------------------------------------- 4-105 4.3.3.5 Inputs --------------------------------------------------------------------- 4-105 4.3.3.6 Steps---------------------------------------------------------------------- 4-106 4.3.3.7 Outputs ------------------------------------------------------------------- 4-107 4.3.3.8 Exit criteria -------------------------------------------------------------- 4-108 4.3.3.9 Measurement ----------------------------------------------------------- 4-108 4.3.3.10 Methods and techniques--------------------------------------------- 4-108 4.3.3.11 Software tools ---------------------------------------------------------- 4-108 4.3.3.12 References--------------------------------------------------------------- 4-109 4.3.4 Quality Management ------------------------------------------------- 4-110 4.3.4.1 Function------------------------------------------------------------------ 4-110 4.3.4.2 Objective ---------------------------------------------------------------- 4-110 4.3.4.3 Responsibilities -------------------------------------------------------- 4-110 4.3.4.4 Life cycle ---------------------------------------------------------------- 4-110 4.3.4.5 Inputs --------------------------------------------------------------------- 4-110 4.3.4.6 Steps---------------------------------------------------------------------- 4-111 4.3.4.7 Outputs ------------------------------------------------------------------- 4-112 4.3.4.8 Exit criteria -------------------------------------------------------------- 4-113 4.3.4.9 Measurement ----------------------------------------------------------- 4-113 4.3.4.10 Methods and techniques--------------------------------------------- 4-113 4.3.4.11 Software tools ---------------------------------------------------------- 4-113 4.3.4.12 References--------------------------------------------------------------- 4-113

Appendices Appendix A – Project Management Content Requirements ----------------- A-1 A.1 Title Page---------------------------------------------------------------- A-1 A.2 Introduction------------------------------------------------------------- A-1 A.3 Objectives --------------------------------------------------------------- A-1 A.4 Customer Definition and Advocacy------------------------------ A-1 A.5 Project Authority ------------------------------------------------------ A-1 A.6 Management------------------------------------------------------------ A-1 A.7 Project Requirements------------------------------------------------- A-1 A.8 Technical Summary -------------------------------------------------- A-1 A.9 Logistics ----------------------------------------------------------------- A-1 A.10 Schedules---------------------------------------------------------------- A-1 A.11 Resources---------------------------------------------------------------- A-1 A.12 Controls ------------------------------------------------------------------ A-2 A.13 Implementation Approach ------------------------------------------ A-2 A.14 Acquisition Summary ------------------------------------------------ A-2 A.15 Program/Project Dependencies ------------------------------------ A-2 A.16 Agreements ------------------------------------------------------------- A-2 A.17 Safety and Mission Success---------------------------------------- A-2 A.18 Risk Management ----------------------------------------------------- A-2 A.19 Environmental Impact ----------------------------------------------- A-2 A.20 Test and Verification ------------------------------------------------- A-2 A.21 Technology Assessment--------------------------------------------- A-2 A.22 Commercialization---------------------------------------------------- A-2 A.23 Reviews ------------------------------------------------------------------ A-3 A.24 Termination Review Criteria --------------------------------------- A-3 A.25 Tailoring ----------------------------------------------------------------- A-3 A.26 Change Log ------------------------------------------------------------- A-3

ISS005-E-19968 (6 November 2002) Astronaut Peggy A. Whitson, Expedition 5 flight engineer, exercises in the Destiny laboratory on the ISS.

ISS005-E-20309 (8 November 2002) Soyuz 5 Commander Sergei Zalyotin looks at a plant growth experiment in the Zvezda Service Module on the ISS.

Appendix B – Systems Engineering Management Plan Outline------------ B-1 B.1 Description of the System of Interest---------------------------- B-1 B.2 Description of the Technical Processes-------------------------- B-1 B.3 Software Development----------------------------------------------- B-1 B.4 Project Technical Management Processes ---------------------- B-1 B.5 Organization (Team) Structure ------------------------------------ B-2 B.6 Other Systems Engineering Concerns --------------------------- B-2 Appendix C – Trace of Project Management Processes to Life Cycle ---- C-1 Appendix D – Project Tailoring Guidelines-------------------------------------- D-1 Appendix E – Terms and Definitions---------------------------------------------- E-1 Appendix F – Acronyms -------------------------------------------------------------- F-1 Appendix G – Photograph Captions------------------------------------------------ G-1 Appendix H – “Spiral” Development Process----------------------------------- H-1

Figures

Chapter 2
2.1-1 Project management scope -------------------------------------------------------------- 2-1
2.4-1 Typical development processes for projects ---------------------------------------- 2-3
2.4-2 Project life cycle phases ----------------------------------------------------------------- 2-6
2.5-1 Pre-proposal/proposal/implementation approval process ------------------------ 2-9
2.6-1 Documentation tree ----------------------------------------------------------------------- 2-11

Chapter 3
3-1 Pre-Phase A advanced studies diagram ------------------------------------------------ 3-3
3-2 Phase A preliminary analysis diagram ------------------------------------------------- 3-5
3.3-1 Phase B system definition diagram --------------------------------------------------- 3-8
3.3-2 Phase B preliminary design diagram ------------------------------------------------- 3-9
3-4 Phase C design diagram ------------------------------------------------------------------- 3-12
3-5.1 Phase D development – fabrication and integration stage diagram ------------ 3-14
3-5.2 Phase D development – preparation for deployment stage diagram ----------- 3-15
3-5.3 Phase D development – deployment and operational verification stage diagram ---- 3-16
3-6 Phase E operations diagram -------------------------------------------------------------- 3-20
3-7.1 Off-nominal project termination diagram ------------------------------------------- 3-23
3-7.2 Nominal project termination diagram ------------------------------------------------ 3-23

Chapter 4
4-1 Example of three levels of system of interest ---------------------------------------- 4-2
4-2 Example of enabling systems ------------------------------------------------------------ 4-2
4-3 Decomposition of the systems of interest --------------------------------------------- 4-3
4-4 Relationship among needs, goals, and objectives ----------------------------------- 4-3
4.1-1 Requirements development process diagram -------------------------------------- 4-5
4.1-2 Requirements management process diagram --------------------------------------- 4-10
4.1-3 Operational concept development process diagram ------------------------------ 4-13
4.1-4 Decomposition process diagram ------------------------------------------------------ 4-16

STS111-383-009 (5-19 June 2002) Astronauts Kenneth D. Cockrell (left) and Paul S. Lockhart, STS-111 mission commander and pilot, participate in one of the STS-111 detailed test objectives (DTOs). The purpose of DTO 694 is to produce ultra-pure water from the Shuttle’s fuel cell water. This water can replace manifested ultra-pure water supplies, and significantly decrease the mass and volume required to support biotechnology payloads.

STS111-318-030 (5-19 June 2002) Astronaut Philippe Perrin, STS-111 mission specialist representing CNES, the French Space Agency, looks out an aft flight deck window of Endeavour.


4.1-5 Feasibility study process diagram ----- 4-20
4.1-6 Technology planning process diagram ----- 4-24
4.1-7 Design process diagram ----- 4-28
4.1-8 Attainment process diagram ----- 4-32
4.1-9 Integration process diagram ----- 4-36
“Vee” chart ----- 4-37
4.1-10 Technical work and resource management process diagram ----- 4-43
4.1-11 Safety and mission success process diagram ----- 4-48
4.1-12 Control process diagram ----- 4-54
4.1-13 System analysis process diagram ----- 4-58
4.1-14 Verification process diagram ----- 4-62
4.1-15 Validation process diagram ----- 4-66
4.1-16 Reviews process diagram ----- 4-70
4.2-1 Resource management process diagram ----- 4-75
4.2-2 The planning process diagram ----- 4-77
4.2-3 Documentation and data management process diagram ----- 4-80
4.2-4 Cost estimating process diagram ----- 4-83
4.2-5 Performance measurement process diagram ----- 4-87
4.2-6 Schedule management process diagram ----- 4-92
4.2-7 Project analysis process diagram ----- 4-95
4.3-1 Acquisition management process diagram ----- 4-98
4.3-2 Risk management process diagram ----- 4-102
4.3-3 Configuration management process diagram ----- 4-106
4.3-4 Quality management process diagram ----- 4-111

JSC2003-E-02172 (15 January 2003) An overall view of the Station flight control room (BFCR) in Houston’s Mission Control Center (MCC). A large screen at the front of the room shows astronauts Kenneth D. Bowersox and Donald R. Pettit, Expedition 6 mission commander and NASA ISS science officer, during the mission’s only scheduled spacewalk.



Chapter 1 Introduction

This document provides a description of the basic processes and general practices for the development and operation of all projects managed at the Lyndon B. Johnson Space Center (JSC). In addition to the processes and requirements documented here, all projects shall comply with applicable JSC directives and requirements established by applicable law, regulations, Executive Orders, and NASA directives. In the event of a conflict, the NASA directive, Executive Order, regulation, or law (in ascending order) shall have precedence.

Over the last ten years, JSC and NASA have undergone many changes, examples of which include the creation of the Center-level Systems Management Offices and the Headquarters-level Chief Financial Officer’s Cost Analysis Division, an increased emphasis on risk management, and the use of probabilistic risk analysis (PRA) and cost assessment requirements description (CARD) documentation. One of the more visible and significant changes is that the Agency has evolved a strategic planning process that is based on a family of Enterprises. JSC has also created new offices and processes to implement Agency, Congress, and White House input and direction. Similarly, NASA program and project development and management have evolved as documented in NASA Procedures and Requirements (NPR) 7120.5B, NASA Program and Project Management Processes and Requirements. The Agency has also developed interim guidance related to systems engineering (SE), as documented in draft NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements. These changes have been put in place to ensure that programs and projects are not only in concert with the Enterprise efforts, but are also efficiently and consistently planned, budgeted, and executed. Additional guidelines – e.g., risk management, software independent verification and validation (IV&V), logistics management, and export control – have also been put in place for similar purposes.

Another significant change has been that both the Agency and JSC have implemented the International Organization for Standardization (ISO) 9000 Quality Management System. This system is the day-to-day process by which JSC implements the requirement to continually improve Center processes, products, and services, and to use measurable objectives to establish the quality of its work. JSC has documented this in JSC Procedures and Guidelines (JPG) 5335.3, JSC Quality Management System Quality Manual. As part of this continual improvement effort, JSC has made a number of additional critical decisions on project processes and product control. These include:

• Implementing JSC Policy Directive (JPD) 8090.1, Systems Engineering Policy
• Implementing JPC 7120.2, JSC Project Management Council
• Implementing this document, JPG 7120.3, Project Management: Systems Engineering and Project Control Processes and Requirements
• Implementing JPD 7120.1, JSC Project Management Policy
• Aligning the SE processes to be used at JSC with interim Agency guidance and industry standards, practices, and maturity models – such as the Electronic Industries Alliance (EIA)-632, Processes for Engineering a System, the INCOSE Systems Engineering Handbook, and the Capability Maturity Model Integration (CMMI) for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing – and documenting them in this publication.

This document therefore combines, in coordination with the JSC Management System and the JSC organizational structure, project management, project control, and SE principles and practices. It does this in a fashion compatible with the previously mentioned Agency project management and SE guidelines and directives. In this document, a requirement is identified by a “shall,” a good practice by a “should,” permission by a “may” or a “can,” expectation by a “will,” and descriptive material by an “is.”

1.1 Purpose
The purpose of this document is to define Center-wide processes, practices, and product requirements to support the planning, operations, and management of projects at JSC. This document provides overall direction and processes to be followed by all projects and project team members at JSC.

1.2 Applicability and Scope
This document has been developed to ensure a common understanding and documentation of the processes, practices, and procedures that shall be required for development and integration of all hardware and software projects at JSC. These projects include activities in the areas of space flight system development, advanced technology development, advanced studies, institutional support operations, and any combination thereof. It also serves as an information-only source for activities such as:

• Basic and applied research investigations that are part of a portfolio (per NPR 7120.5C)
• Small Business Innovation Research (SBIR)
• Center Director Discretionary Fund (CDDF) funded projects


• Space Act Agreements that do not exceed $100,000 (including civil service labor, facilities, and materials cost)

• Research and Technology Objectives and Plans

The JSC Center Director is accountable for JSC support to projects managed by Agency programs. As such, JSC activities and JSC projects managed by Agency programs shall follow the processes stipulated in this document. Any disagreements that may arise between the requirements of this document and direction provided by the program(s) shall be brought forward to the JSC Engineering Review Board (ERB) for clarification on the potential level of risk being assumed. The ERB chairperson and the appropriate program manager will discuss the issue and resolve it. Final resolutions shall be appropriately documented in the project management plan and project-customer agreements (e.g., ITAs, MOUs, and Class 1 deliverables).

While basic common processes and general practices are discussed in this document, project managers, working with their project team members, should tailor implementation to the specific needs of the project consistent with the project scope, visibility, cost, complexity, criticality, and risk. For purposes of this document, tailoring should be considered a method to encourage innovation and achieve products in an efficient manner while still meeting the expectations of the customer. Tailoring approaches shall be coordinated with all affected parties as early as possible and documented in the project management plan (PMP) for approval, as a minimum, and in the program commitment agreement (PCA) and program plan, as appropriate.

1.3 Authority
1.3.1 Per NPR 7120.5B, paragraph P2.4, “Each Center is responsible for developing and implementing the Center-level policies, processes, procedures, and requirements necessary to ensure successful program/project execution.”

1.3.2 Per NPR 71xx.x (document number not yet assigned), paragraph 2.4, “Each Center is responsible for developing and implementing Center-level policies, processes, procedures and requirements necessary to ensure successful execution of the processes and requirements according to this document.”

1.3.3 Per JPD 8090.1, “The JSC Office of the Chief Engineer is responsible for: 1) Establishing policy, requirements, and guidelines for systems engineering processes and products for the development and operation of JSC systems.”

1.3.4 Per JPD 7120.1, “The JSC Chief Engineer is responsible for: 1) Serving as the process steward for JSC project management, including development and maintenance of JSC project management procedures and guidelines that are compliant with Agency policies and the JSC quality management system.”


1.4 References

1.4.1 External Documents

Number | Title | Paragraph
AD-A319533KKG, DTIC#: AD-A319 533\6\XAB | Continuous Risk Management Guidebook | 4.3.2.12
ANSI/AIAA G-043-1992 | Guide for Preparation of Operational Concept Documents | 4.1.3.12
CMMI-SE/SW/IPPD/SS | Capability Maturity Model Integration (CMMI) for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing | 1, 4.1.1.6, 4.1.1.7, 4.1.1.10, 4.1.1.11, 4.1.1.12, 4.1.2, 4.1.2.2, 4.1.2.4, 4.1.2.6, 4.1.2.7, 4.1.2.8, 4.1.2.11, 4.1.2.12, 4.1.3.1, 4.1.3.6, 4.1.3.7, 4.1.3.12, 4.1.4.5, 4.1.4.6, 4.1.4.7, 4.1.4.10, 4.1.4.12, 4.1.7.12, 4.1.8, 4.1.8.12, 4.1.9, 4.1.9.2, 4.1.9.5, 4.1.9.6, 4.1.9.12, 4.1.10.6, 4.1.10.7, 4.1.10.12, 4.1.11.12, 4.1.12.6, 4.1.12.12, 4.1.13, 4.1.13.6, 4.1.13.12, 4.1.14.12, 4.1.15.6, 4.1.15.12, 4.3.1.1, 4.3.1.4, 4.3.1.5, 4.3.1.6, 4.3.1.8, 4.3.1.10, 4.3.1.12, 4.3.2, 4.3.2.12, 4.3.3.6, 4.3.3.7, 4.3.3.12, 4.3.4, 4.3.4.6, 4.3.4.7, 4.3.4.12, App. E
– | CMMI Project Planning Process | App. E
EIA-632 | Processes for Engineering a System | 1, 4.1.1, 4.1.1.2, 4.1.1.5, 4.1.1.11, 4.1.1.12, 4.1.2.4, 4.1.2.7, 4.1.2.11, 4.1.2.12, 4.1.3.6, 4.1.3.7, 4.1.3.8, 4.1.3.12, 4.1.4.6, 4.1.4.7, 4.1.4.8, 4.1.4.12, 4.1.7, 4.1.7.11, 4.1.7.12, 4.1.8, 4.1.8.7, 4.1.8.12, 4.1.9.2, 4.1.9.5, 4.1.9.6, 4.1.9.12, 4.1.11.12, 4.1.12.6, 4.1.12.12, 4.1.13, 4.1.13.10, 4.1.13.11, 4.1.13.12, 4.1.14.1, 4.1.14.12, 4.1.15, 4.1.15.1, 4.1.15.12, 4.2.5, 4.2.5.1, 4.2.5.12, 4.3.1.6, 4.3.1.7, 4.3.1.9, 4.3.1.12, 4.3.3.12
EIA-731.1 | Systems Engineering Capability Model | 4.1.10, 4.1.10.12, 4.1.13.4, 4.1.13.12, 4.3.4, 4.3.4.2, 4.3.4.7, 4.3.4.8, 4.3.4.10, 4.3.4.11, 4.3.4.12
EIA-748A | Earned Value Management Systems | 4.2.5, 4.2.5.1, 4.2.5.12
EIA-748-98 | Industry Guidelines for Earned Value Management Systems | 4.2.7, 4.2.7.12
IEEE 1220-1998 | Institute of Electrical and Electronic Engineers (IEEE) Standard for Application and Management of Systems Engineering Processes | App. E
ISO 9001:2000 | Quality Management Systems – Requirements | 4.3.4, 4.3.4.12
MIL-HDBK-340A, Vols. 1 and 2 | Test Requirements for Launch, Upper Stage and Space Vehicles | 2.4.2.4
SP 800-12 | An Introduction to Computer Security: the NIST Handbook (Web page) | 4.3.2.12
SWELT:RM0.2 | Requirements Management Guidebook | 4.1.2.4, 4.1.2.7, 4.1.2.11, 4.1.2.12
– | Analysis of Automated Requirements Management Capabilities | 4.1.14.12, 4.1.15.12


– | Customer-Centered Products | 4.0.1, 4.1.2.4, 4.1.2.7, 4.1.2.11, 4.1.2.12
– | Expert Choice, Inc. (Web page) | 4.1.5.12
– | Fundamentals of Project Management | 4.2.2, 4.2.2.12
– | INCOSE Systems Engineering Handbook | 1, 4.1.1.5, 4.1.1.7, 4.1.1.8, 4.1.1.10, 4.1.1.11, 4.1.1.12, 4.1.2.2, 4.1.2.4, 4.1.2.5, 4.1.2.7, 4.1.2.11, 4.1.2.12, 4.1.3.2, 4.1.3.5, 4.1.3.6, 4.1.3.7, 4.1.3.8, 4.1.3.9, 4.1.3.10, 4.1.3.12, 4.1.4.2, 4.1.4.6, 4.1.4.7, 4.1.4.8, 4.1.4.9, 4.1.4.10, 4.1.4.11, 4.1.4.12, 4.1.6.6, 4.1.6.12, 4.1.7, 4.1.7.3, 4.1.7.5, 4.1.7.10, 4.1.7.12, 4.1.8.3, 4.1.8.10, 4.1.8.12, 4.1.9.5, 4.1.9.10, 4.1.9.11, 4.1.9.12, 4.1.11.12, 4.1.12.10, 4.1.12.11, 4.1.12.12, 4.1.13.9, 4.1.13.11, 4.1.13.12, 4.3.1.7, 4.3.1.12, 4.3.3.12, 4.3.4.6, 4.3.4.12
– | Project Management Toolbox | 4.2.2, 4.2.2.12, 4.2.4, 4.2.4.12, 4.2.5, 4.2.5.12, 4.2.6, 4.2.6.12
– | Risk Management Guide for DoD Acquisition | 4.3.2.12
– | System Engineering Fundamentals | 4.1.3.10, 4.1.3.12, 4.1.4.2, 4.1.4.5, 4.1.4.10, 4.1.4.12, 4.1.13.12
– | Visualizing Project Management | 4.2.2, 4.2.2.12, 4.2.6, 4.2.6.12
– | W. Lilly, Project Control Study for the National Academy of Public Administration (NAPA) | 2.1

1.4.2 NASA Documents

Number | Title | Paragraph
– | Full Cost Initiative Agencywide Implementation Guide (Web page) | 4.2.1, 4.2.1.12, 4.2.2.7
KHB 1700.7 | KSC Payload Ground Safety Handbook | 4.1.11.6, 4.1.11.12
NASA-GB-1740.13-96 | NASA Guidebook for Safety Critical Software – Analysis and Development | 4.1.11.6, 4.1.11.12
NASA-STD-2100-91 | NASA Software Documentation Standard | 4.1.11.6, 4.1.11.12
NASA-STD-2201-93 | Software Assurance Standard | 4.1.11.1, 4.1.11.6, 4.1.11.12
NASA-STD-8719.13A | Software Safety NASA Technical Standard | 4.1.11.6, 4.1.11.12
NASA-STD-8739.8 | Software Assurance | 4.1.11.6
NASA-STD-8739 series | – | 4.1.11.12
– | Next-Generation Space Telescope (NGST) (Web page) | 4.1.6.6, 4.1.6.12
NMI 7234.1 | Facilities Utilization Program | 2.5.3.2
NPD 1280.1 | NASA Management System Policy | 4.3.4, 4.3.4.2, 4.3.4.12
NPD 1440.6G | NASA Records Management | 4.2.6, 4.2.6.12
NPD 1441.1D | NASA Record Retention Schedule | 4.2.3, 4.2.3.12
NPD 2190 | NASA Export Control Program | 2.7.3.8
NPD 2820.1 | NASA Software Policies | 2.6, 4.1.11.6
NPD 7330.1F | Approval Authorities for Facilities Projects | 2.4.1.4, 2.6


NPD 7500.1 | Program/Project Logistics | 2.6
NPD 8010.2 | Metrics System | 2.6
NPD 8010.3 | Notification of Intent to Terminate Operating Space Systems | 2.6, 3.7, 3.7.5.2, 3.7.7
NPD 8730.4 | NASA Software Independent Verification and Validation (IV&V) Policy | 2.6, A.20
NPD 8800.14B | Policy for Real Property Management | 2.4.1.4
NPD 8810.2 | Master Planning for Real Property | 2.4.1.4
NPD 8820.2A | Design and Construction of Facilities | 2.4.1.4, 2.6
NPD 8820.3 | Facility Sustainable Design | 2.4.1.4, 2.6
NPD 8831.1D | Management of Institutional and Program Facilities and Related Equipment | 2.4.1.4, 2.6
NPD 9501.3A | Earned Value Performance Management | 4.2.5, 4.2.5.1, 4.2.5.12
NPR 1440.6G | Records Management | 2.6
NPR 2190.1 | Export Control | 2.6
NPR 71xx.x | NASA Systems Engineering Processes and Requirements (document number not yet assigned) | 1, 1.3, 4.0.1, 4.1.1, 4.1.1.1, 4.1.1.11, 4.1.1.12, 4.1.2, 4.1.2.1, 4.1.2.4, 4.1.2.7, 4.1.2.11, 4.1.2.12, 4.1.3, 4.1.3.1, 4.1.3.7, 4.1.3.12, 4.1.4, 4.1.4.1, 4.1.4.7, 4.1.4.12, 4.1.5, 4.1.5.1, 4.1.5.12, 4.1.6, 4.1.6.1, 4.1.6.6, 4.1.6.12, 4.1.7, 4.1.7.1, 4.1.7.2, 4.1.7.5, 4.1.7.8, 4.1.7.12, 4.1.8, 4.1.8.1, 4.1.8.2, 4.1.8.4, 4.1.8.5, 4.1.8.12, 4.1.9, 4.1.9.1, 4.1.9.12, 4.1.10, 4.1.10.1, 4.1.10.2, 4.1.10.12, 4.1.11, 4.1.11.1, 4.1.11.12, 4.1.12, 4.1.12.1, 4.1.12.12, 4.1.13, 4.1.13.1, 4.1.13.12, 4.1.14, 4.1.14.1, 4.1.14.12, 4.1.15, 4.1.15.1, 4.1.15.12, 4.1.16, 4.1.16.1, 4.1.16.12, 4.3.1, 4.3.1.1, 4.3.1.2, 4.3.1.6, 4.3.1.12, 4.3.2.1, 4.3.2.12, 4.3.3, 4.3.3.1, 4.3.3.12, 4.3.4, 4.3.4.1, 4.3.4.12, App. E
NPR 7120.5B | NASA Program and Project Management Processes and Requirements | 1, 1.3, 2.4.2.4, 2.6, 3.2, 3.2.7, 3.3, 3.3.7, 3.4, 3.4.7, 3.5, 3.5.7, 3.6, 3.6.7, 4.1.6, 4.1.6.1, 4.1.6.3, 4.1.6.12, 4.1.9.6, 4.1.9.12, 4.1.10.12, 4.1.11.1, 4.1.11.2, 4.1.11.12, 4.1.12.10, 4.1.12.12, 4.1.13, 4.1.13.12, 4.1.14, 4.1.14.12, 4.1.15.12, 4.2.1, 4.2.1.12, 4.2.4, 4.2.4.12, 4.3.1, 4.3.1.1, 4.3.1.2, 4.3.1.3, 4.3.1.6, 4.3.1.12, 4.3.2.5, 4.3.2.12, 4.3.3.12, 4.3.4, 4.3.4.12, App. A, A.18, B.4, App. E
NPR 7120.5C | NASA Program and Project Management Processes and Requirements (Draft) | App. E
NPR 8000.4 | Risk Management Procedures and Guidelines | 2.6, 4.3.2, 4.3.2.6, 4.3.2.10, 4.3.2.12, A.18
NPR 8705.2 | Human-Rating Requirements and Guidelines for Space Flight Systems | 4.1.7.6


NPR 8820.2E | Facility Project Implementation Guide | 2.4.1.4, 2.6
NPR 8831.2D | Facilities Maintenance Management | 2.4.1.4, 2.6
NPR 9501.3 | Earned Value Management Implementation on NASA Contract | 4.2.7, 4.2.7.12
NSTS 13830 | Payloads Safety Review and Data Submittal Requirements | 4.1.11.6, 4.1.11.12
NSTS 1700.7 | Safety Policy and Requirements for Payloads Using the STS | 4.1.11.6, 4.1.11.12
NSTS 1700.7 Addendum | Safety Policy Requirements for Payloads Using the International Space Station | 4.1.11.6, 4.1.11.12
NSTS 5300.4(1D-2) | Safety, Reliability, Maintainability, and Quality Provisions for the Space Shuttle Program | 4.1.11.6
RP 1358 | Systems Engineering “Toolbox” for Design-Oriented Engineers | 2.6, 4.1.1.11, 4.1.1.12, 4.1.2.4, 4.1.2.7, 4.1.2.11, 4.1.2.12, 4.1.5.10, 4.1.5.12, 4.1.6.10, 4.1.6.12, 4.3.2.10, 4.3.2.12
SP 6103 | NASA Readings in Project Control | 2.3, 4.2.1, 4.2.1.12, 4.2.4, 4.2.4.12
SP 6105 | NASA Systems Engineering Handbook | 2.4.2.4, 2.6, 3, 3.1, 3.1.7, 3.2, 3.2.7, 3.3, 3.3.7, 3.4, 3.4.7, 3.5, 3.5.7, 3.6, 3.6.7, 4.1.1.11, 4.1.1.12, 4.1.2.4, 4.1.2.7, 4.1.2.11, 4.1.2.12, 4.1.4.2, 4.1.4.3, 4.1.4.7, 4.1.4.10, 4.1.4.12, 4.1.5.1, 4.1.5.12, 4.1.6.4, 4.1.6.12, 4.1.7, 4.1.7.3, 4.1.7.6, 4.1.7.8, 4.1.7.12, 4.1.8.5, 4.1.8.6, 4.1.8.7, 4.1.8.12, 4.1.9.3, 4.1.9.6, 4.1.9.7, 4.1.9.12, 4.1.10.3, 4.1.10.5, 4.1.10.10, 4.1.10.12, 4.1.11, 4.1.11.1, 4.1.11.6, 4.1.11.12, 4.1.12.2, 4.1.12.12, 4.1.13, 4.1.13.2, 4.1.13.3, 4.1.13.6, 4.1.13.7, 4.1.13.10, 4.1.13.12, 4.1.16.1, 4.1.16.2, 4.1.16.4, 4.1.16.5, 4.1.16.6, 4.1.16.7, 4.1.16.8, 4.1.16.12, 4.2.4, 4.2.4.12, 4.2.6, 4.2.6.12, 4.3.1.5, 4.3.1.7, 4.3.1.10, 4.3.1.12, 4.3.3, 4.3.3.3, 4.3.3.12, 4.3.4.3, 4.3.4.12, App. E
SSP 41173 | Space Station Program Quality Assurance Requirements | 4.1.11.6
SSP 50038 | Computer-Based Control System Safety Requirements | 4.1.11.6, 4.1.11.12
SSP 51079 | International Space Station Program Risk Management Plan | 4.3.2.12
– | NASA Cost Estimating Handbook 2002 (Web page) | 4.2.4, 4.2.4.12, 4.2.7, 4.2.7.12
– | Probabilistic Risk Assessment Procedures Guidebook for NASA Managers and Practitioners | 4.3.2.12
– | Technology Readiness Levels: A White Paper | 4.1.6.12
WSP 09-0014 | Project Management | –


1.4.3 JSC Documents

Number | Title | Paragraph
AG-CWI-001 | JSC Lessons Learned Process | 4.3.4.6, 4.3.4.12
CWI J29W-01 | JSC Export Compliance | 2.7.3.8
JMI 2314.2L | Identifying and Processing JSC Scientific, Technical and Administrative Documents | 4.2.3, 4.2.3.12
JMI 5150.5F | Processing of JSC Procurements Through Delivery, Acceptance and Payment Stages | 4.3.1.12
JMI 5151.5B | Management of Support Contracts | 4.3.1.12
JPC 7120.2 | JSC Project Management Council | 1, 2.5.3.3
JPD 1382.1H | Release of Information to News Media | 2.7.3.17
JPD 1382.4K | Freedom of Information Act (FOIA) | 2.7.3.17
JPD 2200.1A | Release of JSC Scientific and Technical Information to External Audiences | 4.2.3, 4.2.3.12
JPD 5335.1 | JSC Quality Policy | 4.3.4, 4.3.4.12
JPD 7120.1 | JSC Project Management Policy | 1, 1.3, 2.6
JPD 8090.1 | JSC Systems Engineering Policy | 1, 1.3, 2.6
JPG 1440.3 | JSC Files and Record Management Procedures | 2.7.2.10, 4.2.3, 4.2.3.12, 4.2.6, 4.2.6.12
JPG 1700.1 | JSC Safety and Total Health Handbook | 4.1.11.6, 4.1.11.12
JPG 5335.2 | Space Act Agreements | 4.3.1.1, 4.3.1.3, 4.3.1.12
JPG 5335.3 | JSC Quality Management System Quality Manual | 1, 4.1.11.6, 4.3.4, 4.3.4.3, 4.3.4.12
JPG 8080.5 | JSC Design and Procedural Standard Manual | 4.1.11.6
JSC Announcement 02-035 | Charter of the Engineering Review Board | 2.5.3.1
– | JSC Earned Value Management Handbook | 2.6
JSC-24937 | Limited Life Time Cycle Items Program Requirements Document | 4.1.11.6, 4.1.11.12
LA-CWI-01 | Budget Planning Process | 4.2.1.6
SLP 4.6 | Procurement | 4.3.1.1, 4.3.1.3, 4.3.1.12
SLP 4.20 | Process Measurement and Improvement | 4, 4.2.1.10, 4.3.3.10, 4.3.3.12
– | JSC Cost Estimating (Web page) | 4.1.12.11, 4.1.13.12

1.5 Cancellation
None.


Chapter 2 Overview of Project Management at JSC

2.1 Scope
Project Management is the function of planning, overseeing, and directing the numerous activities required to achieve the requirements, goals, and objectives of the customer within specified cost, quality, and schedule constraints. The scope of project management includes two major areas of emphasis, both of equal weight and importance. These areas are systems engineering (SE) and project control. It is critical for the success of the project that the project management team understands that both major areas must be successfully implemented and continually used throughout the project life cycle. Definitions for SE and project control, as well as the main content areas under each, are depicted in Figure 2.1-1.

2.2 Systems Engineering
SE is a disciplined approach for the definition, implementation, integration, and operation of a system (product or service). The emphasis is on achieving stakeholder functional, physical, and operational performance requirements in the intended use environments over the planned life of the system and within cost and schedule constraints. SE includes the engineering and technical management processes that consider the interface relationships across all elements of the system, other systems, or as part of a larger system. The SE function systematically considers all technical aspects of a project in making design choices and is a continuous, iterative process that is used throughout the life cycle of the project. These iterative efforts result in the best system architecture, design, manufacturing, and operations possible for the given cost and schedule constraints. The success of every Johnson Space Center (JSC) project is highly dependent on the SE process being properly exercised at all levels of design and across all phases of the project life cycle.

Figure 2.1-1. Project management scope.

Project Management
• Configuration management
• Risk management
• Acquisition management
• Quality management

Systems Engineering – A disciplined approach for the definition, implementation, integration, and operation of a system (product or service). The emphasis is on achieving stakeholder functional, physical, and operational performance requirements in the intended use environment over its planned life. SE includes the engineering processes and technical management processes that consider the interface relationships across all elements of the system, other systems, or as part of a larger system.[1]
• Systems requirement development and management
• Operations concept development
• Systems architecture development (decomposition, feasibility studies, design, attainment, technology planning, systems analysis)
• Integration, verification, validation, safety, and mission success
• SE management (technical work and resource management, control, reviews)

Project Control – Total management process of establishing and maintaining project baselines for project content, scope, configuration, schedule, and cost.[2]
• Resource management
• Planning
• Project analysis
• Documentation and data management
• Schedule management
• Performance measurement
• Cost estimating

[1] NASA Systems Engineering Working Group definition, draft SE NPR, 12/2002.
[2] From W. Lilly, Project Control Study for the National Academy of Public Administration (NAPA), 1989.


2.3 Project Control
Project control is the total management process of establishing and maintaining project baselines and effectively supporting the project manager in meeting the overall objectives of the project. It functions in both a proactive and a reactive context. The proactive aspects of project control include baseline control of project processes and control of the project management plan and its changes over the life cycle. An example of such a proactive aspect might be identification of process trends that may eventually lead to problems in meeting cost or schedule. Reactive aspects of project control include performance measurement and control, and management of variances to the project cost and schedule. This includes taking corrective actions for these variances.

The success of projects depends on disciplined attention to numerous planning, resource, and scheduling variables, and the integration and optimization of interrelated activities. Project control ensures that project objectives are met by monitoring and measuring progress regularly to identify variances from plan. Corrective action can then be taken when necessary. For example, systematic attention to the implications of variances between planned baselines and actual performance on projects is critical to taking early remedial action, thereby reducing costly delays and increasing the probability of achieving success. Additional detailed background discussions on project control experiences can be found in Special Publication (SP) 6103, NASA Readings in Project Control.

2.4 Project Types, Approaches, and Life Cycle Phases
This section addresses two related but distinct subjects. First, in Section 2.4.1, the various types of project implementations at the Center are presented along with their defining characteristics. This is followed by a discussion, in Section 2.4.2, of the important subject of identifying and selecting the appropriate project approach to improve the likelihood of achieving project success. The project life cycle is presented in Section 2.4.3.

2.4.1 Project Types
A discussion of the types of projects at JSC needs to be prefaced with an understanding that most, if not all, projects involve one or more of the listed project types at some point during the project life cycle. The determination of which project type designation is most appropriate is based on the intended final product and the project approach used to complete it. The list of project “types” includes:

• Space flight system development project
• Advanced technology development project
• Science research, applied research, or advanced studies project
• Institutional project
• Operations project

2.4.1.1 Space flight system development projects
Space flight system development projects are generally those hardware (H/W) and/or software (S/W) projects that result in an end product for flight into space. These items may include products for flight experiments, flight crew equipment, or detailed test objectives (DTOs). A common characteristic of a space flight system development project is often a “waterfall” development process, as shown in Figure 2.4-1.

2.4.1.2 Advanced technology development projects
Advanced technology development projects are those H/W and/or S/W projects characterized by an end product for use in test or demonstration of a design concept(s). These intended final products may or may not result in a product to be operated in space or on an aircraft. The development of these products can be characterized by a “spiral” development process, as shown in Figure 2.4-1. A larger view of the “spiral” detail is provided in Appendix H. Examples of this type of project at JSC include Robonaut and the variable specific impulse magnetoplasma rocket (VASIMR).

2.4.1.3 Science research, applied research, and advanced studies
Science research and applied research projects are efforts that are intended to answer, or at least to address, fundamental research or development questions. Basic research can thus be understood to have no direct or foreseen applications. Applied research is most commonly an effort to develop advanced prototype H/W and/or S/W. Advanced studies examine the feasibility of approaches to a design problem or an operational alternative. Examples of advanced studies include studies performed to support strategic planning formulation. Since these final products are commonly developed based on a “pay as you go” level of effort (LOE) process tied to funding availability, the development effort may vary from year to year.

Because of the diverse scope and unique selection and review process for fundamental research, applied research, and advanced studies projects, a modified management approach is warranted. The end goals of research projects often do not fit the life cycle definitions shown here. However, sub-projects within a research project may fit within these requirements. Examples include the development of research facilities, test rigs, mock-ups, and other research equipment.


2.4.1.4 Institutional
Institutional projects are the buildings, site infrastructure, and H/W and/or S/W development efforts that culminate in an end product that supports a mission-critical infrastructure or Center-operational function. Examples of this project type include typical construction of facilities (CofF), Center-funded construction, infrastructure modification (e.g., telephone system), or education efforts (e.g., KC-135 Student Program). It also includes technical support system development such as the design and data management system (DDMS), Mission Control Center (MCC) local area network (LAN) upgrade, or aircraft H/W and S/W development efforts.

The CofF and associated support projects have a unique feature among institutional project types since they already have an existing set of project management processes, practices, and requirements. These include:

• NASA Policy Directive (NPD) 7330.1F, Approval Authorities for Facilities Projects
• NPD 8800.14B, Policy for Real Property Management
• NPD 8810.2, Master Planning for Real Property
• NPD 8820.2A, Design and Construction of Facilities
• NASA Procedures and Requirements (NPR) 8820.2E, Facility Project Implementation Guide
• NPD 8831.1D, Management of Institutional and Program Facilities and Related Equipment
• NPR 8831.2D, Facilities Maintenance Management
• NPD 8820.3, Facility Sustainable Design

In addition, the current versions of NPD 8820.3 and NPR 8820.2E require an engineering-procurement-construction (EPC) approach to facility delivery, commonly referred to as a “design-bid-build” process using performance-based, fixed-price contracts. Facilities and associated support projects shall be exempt from the requirements documented herein, with the exception of the various project management forums discussed in Section 2.5, because of the EPC approach, the existing laws and regulations concerning design and construction standards, and the existing body of knowledge and practice in this area.

2.4.1.5 Operations
Operations projects are those buildings, site infrastructure, and H/W and/or S/W development efforts that provide a foundation for future development and research to use or build on.

Figure 2.4-1. Typical development processes for projects.

Project Type | Typical Development Process
Space Flight System Development | “Waterfall” development
Advanced Technology Development | “Spiral” development based on technology readiness level (TRL)
Science and Applied Research | “Pay as you go” level of effort (LOE), typically sub-TRL 3 activities
Institutional | Either 1) waterfall, 2) spiral, or 3) LOE
Operations | Either 1) waterfall, 2) spiral, or 3) LOE
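As a purely illustrative sketch (not part of this JPR; the dictionary and helper below are hypothetical), the Figure 2.4-1 mapping from project type to typical development process could be captured as a small lookup:

```python
# Illustrative only: hypothetical lookup of the Figure 2.4-1 mapping from JSC
# project type to its typical development process. Not JPR 7120.3 requirements.
TYPICAL_DEVELOPMENT_PROCESS = {
    "space flight system development": "waterfall",
    "advanced technology development": "spiral (paced by technology readiness level)",
    "science and applied research": "pay-as-you-go level of effort (LOE)",
    "institutional": "waterfall, spiral, or LOE",
    "operations": "waterfall, spiral, or LOE",
}

def typical_process(project_type: str) -> str:
    """Return the typical development process for a given project type."""
    return TYPICAL_DEVELOPMENT_PROCESS[project_type.strip().lower()]
```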


This project type is typically characterized by the following major points:

• The vast majority of the life of the project is spent in the operations phase.
• The project product functions as a “facility” to allow additional H/W and/or S/W to be added, where this new S/W could accomplish an as-yet-unspecified new mission.
• The project management team makes decisions on a life cycle basis, as opposed to a near-term problem-resolution focus.
• Activities such as a preplanned product improvement (P3I) effort may be part of the core project.
• The project management process has a clear understanding of not precluding future as-yet-unspecified H/W and/or S/W modifications.

It should be understood that while the operations project H/W and/or S/W may have been developed under a space flight system waterfall process, an advanced technology spiral process, or even an LOE process, the project type is considered “operations” due to the role of a support facility for additional future capabilities. Examples of this project type include the Space Station Training Facility, the Hubble Space Telescope, and, at a program level, the Space Shuttle Program and the International Space Station Program.

2.4.2 Project Approaches
Given the project need and real-world constraints, the project team, stakeholders, and customer must identify the project approach that will most likely achieve success. It is critical that discussions of project approach include all stakeholders and customers, and that consensus results are formally documented in the project management plan (PMP) prior to authorization to proceed (ATP).* (See Section 2.5.3.3 for a discussion of the ATP process.)

*ATP is a formal approval provided by the JSC Project Management Council (PMC) to implement a JSC project. This approval marks a commitment by the Center to execute the project within the plans and resource envelopes authorized at the time approval is given.

Development and implementation of the project approach is the responsibility of the project manager and the project management team. For JSC projects, some characteristics for success should be implemented to the maximum extent possible. These include:

• The maximum use of contractor documentation and processes, once these have been determined to meet or exceed government-equivalent items, for contractor-supported/NASA-managed projects
• A very close contractor/NASA working relationship for all team members, especially including the NASA contracting officer, for contractor-supported/NASA-managed projects
• An agreed-to, documented minimum project documentation/deliverables list
• An agreed-to, documented minimum content for each project document/deliverable
• An absolute minimum set of requirements for the project (both stated and implied)
• An extremely aggressive effort to minimize requirement changes/additions over the life of the project
• A documented, agreed-to minimum amount of reporting and statusing to the customer, Center, Agency, or external forums
• Technically experienced team members in each technical area who are capable of successfully working with others
• A rapid but well-documented, streamlined change order review, approval, and implementation process
• A rapid but well-documented H/W and/or S/W modification process
• Collocation of the entire team to the greatest extent possible
• Maximum use of information technology (IT) resources to increase the productivity of each team member

The project approach is developed by carefully considering, discussing, and selecting from several sets of supportive options. Elements of the project approach to consider include the following (see the illustrative sketch after the list):

• Staffing
• Type of contract
• Magnitude
• Implementation approach
• Degree of implementation of the characteristics for success previously mentioned
• Customer desires
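A minimal sketch, assuming hypothetical field names (this is not JPR terminology), of how these project approach elements might be captured as a single record that the project team fills in and documents in the PMP:

```python
# Illustrative only: a hypothetical record of the project-approach elements
# discussed above. Field names are assumptions, not JPR 7120.3 terminology.
from dataclasses import dataclass, field

@dataclass
class ProjectApproach:
    staffing: str                      # e.g., GFE/civil service, contractor-developed/NASA-managed, or a mix
    contract_type: str                 # applicable only if contractor support is required
    magnitude_estimate_usd: float      # estimated life cycle cost, used to drive tailoring discussions
    implementation_approach: str       # "baseline", "X-project", "protoflight", or "protoqual"
    success_characteristics: list[str] = field(default_factory=list)  # characteristics for success adopted
    customer_desires: str = ""         # customer expectations captured during approach discussions

# Hypothetical usage; the agreed result is documented in the PMP prior to ATP.
approach = ProjectApproach(
    staffing="contractor-developed/NASA-managed",
    contract_type="fixed-price",
    magnitude_estimate_usd=4.0e6,
    implementation_approach="baseline",
    success_characteristics=["collocated team", "minimum deliverables list"],
    customer_desires="early operational demonstration",
)
```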

The potential breadth of answers to these discussions clearly shows the wide variation possible in many aspects of the project (e.g., deliverables), and the resulting impact to project schedules and resources.

2.4.2.1 Staffing
One of the first project approach discussions shall be the level of civil service involvement. This would include whether the project will be a government-furnished equipment (GFE)/data project (i.e., accomplished solely by civil service technical development), a contractor-developed/NASA-managed project, or a project that is a combination of both.

2.4.2.2 Type of contract
If the project requires contractor support, the next question that needs to be addressed is the type of contract to be used.


2.4.2.3 Magnitude
Still another supportive option to consider in developing the project approach is the overall life cycle cost estimate for the project. A project estimated to cost less than $1M could, and probably should, have a simpler project management approach than a project costing greater than $10M. The selected project approach should be tailored to effectively manage and accomplish the individual project. This could include reduced project team size, increased use of Web-based SE and project control applications, increased use of IT resources to support project decision making, reduced number of project deliverables, reduced content of the agreed-to list of project deliverables, etc. Discussion of the budget-driven project tailoring should be a significant factor in the development of the PMP, schedules, and deliverables. For example, for a small project under $1M it may be technically acceptable to combine the Preliminary Design Review (PDR) and Critical Design Review (CDR). In contrast, a large project greater than $10M may require not only a PDR and a CDR, but also H/W PDRs and CDRs separate from S/W PDRs and CDRs. In that case, integrated system PDRs and CDRs would be required as well. As the reader can tell, the budget-driven project tailoring approach is a key discussion when planning and defining a project.

2.4.2.4 Implementation approach
For space flight systems and advanced technology development projects, the project management team shall consider whether the project approach is a “baseline” approach, an “X-project” approach, a “protoflight” approach, or a “protoqual” approach. Each of these approaches is discussed further in this document. This document focuses primarily on the baseline approach. It shall be considered the default standard approach to be used by JSC projects. Although the other approaches are acceptable alternatives, the rationale for any alternate tailored approach must be documented in the PMP. It is of special importance, however, that the project team understands that use of other than a baseline approach needs to be discussed with and concurred in by the customer, directorate, and Center management as early as possible in the project formulation efforts and prior to ATP.

a) Baseline approach – A baseline approach can be considered to be a project following a “standard” life cycle and providing a complete project deliverables set such as is discussed in SP 6105, NASA Systems Engineering Handbook, and the latest version of NPR 7120.5B, NASA Program and Project Management Processes and Requirements. This includes planning and documentation development (e.g., trade studies, PMP, acceptance test plan, system documentation, etc.). It begins at Pre-Phase A or Phase A and continues on until termination of the project. A tailored baseline approach is generally the default approach for institutional and operational projects based on NPRs and NPDs for facilities.

b) X-project approach – An X-project approach is an option when the project management and the customer are willing to accept an increased level of risk, above that of the baseline approach, in meeting cost, schedule, and performance requirements for the project. Normally, this fast-paced approach is appropriate when the project incrementally builds on past development efforts and pushes the technological boundaries in only one or two well-defined areas. Typically, X-project types exhibit all of the characteristics for project success discussed earlier.

c) Protoflight approach – A protoflight approach is an option to consider when the project/program is also willing to accept an increased level of risk above the baseline approach, and when there are other factors that would make non-baseline approaches desirable. An example of this might be the need to develop a “quick reaction capability” to solve an urgent problem within an operational system. A protoflight approach requires a test program that combines the objectives of the qualification and acceptance test programs with the understanding that all protoflight components and assemblies are intended for subsequent flight use. The protoflight approach uses agreed-to and documented reduced test levels, cycles, and/or durations from the standard qualification test requirements. This is intended to allow the protoflight-tested H/W to be used later for flight. This approach is not intended, nor is it acceptable, to be used as a standard lower-cost, faster alternative to the baseline qualification and acceptance test programs. A protoflight approach carries a higher level of technical risk compared to a full qualification test program because there is no demonstrated flight duration capability (i.e., number of cycles, or time of operation or exposure to the service environment) and, in some cases, lower demonstrated margins over the service environment extremes. A key characteristic of a protoflight approach is that there is a significant amount of discussion with the customer prior to accepting this higher-risk approach as documented in the PMP.


d) Protoqual approach – A protoqual approach is another increased-risk alternative to the baseline flight qualification and acceptance testing approach. A protoqual approach is very similar to a protoflight approach, except that a modified qualification test program is conducted only on a single item, and that test item is considered available for flight. The normal full acceptance test program is then conducted on all other items intended for flight. Additional information regarding both the protoflight and the protoqual approaches can be obtained in MIL-HDBK-340A, Vols. 1 and 2, Test Requirements for Launch, Upper Stage and Space Vehicles.
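As a purely illustrative sketch of the choices described in Sections 2.4.2.3 and 2.4.2.4 (the enum and helper are hypothetical; only the $1M/$10M figures and the approach names come from the text above):

```python
# Illustrative only: hypothetical helpers reflecting the tailoring discussion in
# Sections 2.4.2.3 and 2.4.2.4. Real tailoring decisions are made with the customer,
# directorate, and Center management, and are documented in the PMP.
from enum import Enum

class ImplementationApproach(Enum):
    BASELINE = "baseline"        # default standard approach; complete deliverables set
    X_PROJECT = "X-project"      # accepted higher risk; pushes technology in one or two well-defined areas
    PROTOFLIGHT = "protoflight"  # combined qualification/acceptance testing; tested items are flown
    PROTOQUAL = "protoqual"      # modified qualification on a single item; full acceptance on the rest

def suggested_design_reviews(life_cycle_cost_usd: float) -> str:
    """Hypothetical magnitude-based suggestion for structuring design reviews."""
    if life_cycle_cost_usd < 1e6:
        return "Small project: consider combining the PDR and CDR"
    if life_cycle_cost_usd > 10e6:
        return "Large project: separate H/W and S/W PDRs/CDRs plus integrated system PDR/CDR"
    return "Standard PDR and CDR; document any tailoring in the PMP"
```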

2.4.3 Project Life Cycle Phases
All JSC project activities shall use the same project life cycle phase template to manage their efforts. This template shall consist of an interval-based approach, defined by the completion of various activities and products, as well as successful completion of the respective “control gate” at the end of the stage or phase. This control gate will be a defined management and/or technical milestone in the development of the project and will be documented in the PMP (see Figure 2.4-2). Documentation of these life cycle phases, their respective activities and products, and their control gates serves as a common template for project management efforts throughout the life of the project. Because of the wide diversity of project types at JSC, it is important to understand the intent behind each activity, phase, and stage to identify the applicable processes and control gates. While the terminology may not be identical across all project types, the intent is the same. For example, for research projects, the Phase A goals and objectives may be the Announcement of Opportunity (AO) the researcher is responding to, while for a space flight system development project goals and objectives may be provided as part of the Enterprise strategic plans. Another example is the risk management plan. The researcher will follow the same risk management process called out in Section 4.3.2; however, it may be documented as an integral part of the research plan developed in support of the effort. Detailed discussions of the various life cycle phases are provided in Chapter 3, Project Life Cycle Requirements.

Figure 2.4-2. Project life cycle phases.

Pre-Phase A (Advanced Studies) and Phase A (Preliminary Analysis)
  Stages: Project Feasibility; System Definition
  Objectives: understand the customer; identify feasible alternatives; analyze project requirements; establish optimum architecture
Phase B (Definition)
  Stages: System Definition; Preliminary Design
  Objectives: analyze system requirements; establish optimal system design; perform preliminary design
Phase C (Design)
  Stage: Final Design
  Objective: perform final design
Phase D (Development)
  Stages: Fabrication & Integration; Preparation for Deployment; Deployment & Operational Verification
  Objectives: manufacture & assembly; integration & test; verification & acceptance; prepare system for deployment; conduct deployment & operational verification
Phase E (Operations)
  Stages: Operations; Disposal
  Objectives: conduct operations; decommission or dispose of system

Each stage or phase concludes with a control gate documented in the PMP.
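As a minimal sketch, assuming a hypothetical representation (not part of the JPR), the Figure 2.4-2 template could be held as simple data, with each stage closed out by a control gate documented in the PMP:

```python
# Illustrative only: a hypothetical encoding of the Figure 2.4-2 life cycle template.
# Control gate names are project specific and documented in the PMP, so they are
# left as a free-form field here.
from dataclasses import dataclass

@dataclass
class Stage:
    name: str
    objectives: tuple[str, ...]
    control_gate: str = "defined in the PMP"  # milestone that closes the stage or phase

LIFE_CYCLE = {
    "Pre-Phase A / Phase A": (
        Stage("Project Feasibility", ("Understand the customer", "Identify feasible alternatives")),
        Stage("System Definition", ("Analyze project requirements", "Establish optimum architecture")),
    ),
    "Phase B (Definition)": (
        Stage("System Definition", ("Analyze system requirements", "Establish optimal system design")),
        Stage("Preliminary Design", ("Perform preliminary design",)),
    ),
    "Phase C (Design)": (
        Stage("Final Design", ("Perform final design",)),
    ),
    "Phase D (Development)": (
        Stage("Fabrication & Integration", ("Manufacture & assembly", "Integration & test", "Verification & acceptance")),
        Stage("Preparation for Deployment", ("Prepare system for deployment",)),
        Stage("Deployment & Operational Verification", ("Conduct deployment & operational verification",)),
    ),
    "Phase E (Operations)": (
        Stage("Operations", ("Conduct operations",)),
        Stage("Disposal", ("Decommission or dispose of system",)),
    ),
}
```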


2.5 Project Management Forums
JSC projects may have several different forums that provide management and technical oversight. The initial baseline project management oversight forum shall be the JSC PMC until formally delegated to a lower-level board. However, to the greatest degree possible, the Center management team will attempt to delegate oversight to the lowest possible board, thus putting the management and technical oversight as close as possible to the project and its efforts. Project managers and project teams must understand that:

• Projects may be required to report to Center-level forums.
• Projects may be required to report to program-level forums.
• Projects may be required to report to directorate-level forums.
• Projects may establish their own forums
  − At and across the project level.
  − At and across the system level.
  − At the subsystem level.

There are many different forms a forum can take. For example,

• Management councils
• Management boards
• Review boards
• Control boards
• Technical working groups

The forums used and established by the project shall be defined in the PMP.

2.5.1 Project-level Forums
The designation of the project-specific boards is at the discretion of the project manager and project management team as they see fit to ensure that project needs, goals, and objectives are satisfied. The project team may also establish technical forums down and across the project organization to communicate, resolve, and concur on technical aspects of the system. In both cases, some form of these boards should exist over the entire life cycle of the project.

2.5.2 Directorate-level Forums
Each directorate may have a unique number of technical and management boards, board structures, and placements within the organization deemed appropriate to manage their projects. One option for delegation of management oversight from the JSC PMC is to a directorate-level management board.

2.5.3 Center-level Forums

2.5.3.1 JSC Engineering Review Board (ERB)
The JSC ERB charter and membership are documented in JSC Announcement 02-035, Charter of the Engineering Review Board. The JSC ERB ensures consistent application of policies, guidelines, processes, standards, and requirements as related to SE across JSC. The Board also evaluates new program and institutional initiatives, ensuring consistency with the Center’s strategic objectives. Further, the Board provides the Center Director a means to ensure that viable project trade-offs and decisions are made. The ERB functions in concert with the JSC PMC, providing a technical review capability; however, ERB endorsement is not a prerequisite to a JSC PMC presentation or decision.

Types of ERB presentations include: project status/overview presentations, technical issue resolution presentations, Center process improvement initiative presentations, project pre-proposal authorization presentations, and project implementation approval presentations. Center project management-related process improvement initiatives are presented to the ERB for approval and status since the Board is responsible for ensuring consistent application of Center processes. ERB project pre-proposal authorization presentations may be required, by direction of the Center PMC. This would occur prior to Center PMC pre-proposal authorization. For projects requiring Center PMC ATP, project implementation approval presentations must be approved by the ERB. This approval reflects an ERB decision as to whether the system and its operation are well enough understood to warrant design and acquisition of the end items. Direction within the purview of the ERB may result during any presentation, and may require further coordination with the customer and stakeholders. Any changes affecting the PMP or project-customer agreements shall be documented.

2.5.3.2 JSC Facility Review Board (FRB)
The FRB provides senior management review and assessment of proposed facility and facility-associated support projects and functions as a Facilities Utilization Review Board in accordance with NASA Management Instruction (NMI) 7234.1. The responsible office is Center Operations, with membership from across the JSC directorates. The FRB charter is at: http://server-mpo.arc.nasa.gov/Services/CDMSDocs/Centers/JSC/Dirs/JPG/JPG1107.1AC4.html. The Board functions as a management review forum for facilities and associated support project types, as discussed in Section 2.4.1. It shall review all proposed facility and facility-associated support projects, and recommend to the PMC a primary project oversight forum for these project types. If delegated by the PMC, the FRB functions as the primary oversight forum for this project type.


2.5.3.3 JSC Project Management Council
In relation to JSC projects, the JSC PMC is the highest Center-level decision-making body, and it exists to ensure JSC project success. A detailed discussion of the JSC PMC applicability, functions, and membership is defined in JPC 7120.2. Because of the number of JSC projects, not all will be reviewed on a regular basis by the JSC PMC. Some shall be formally delegated, in writing, to a directorate-level project management board. However, any JSC project, whether delegated to a directorate board or not, shall be required to provide a status to the JSC PMC if requested by the JSC PMC chairperson.

As a general rule, JSC projects meet with the PMC to receive formal ATP as defined in the PMC pre-proposal, proposal, and implementation process outlined in Section 2.5.5.1. Once established, projects meet with the PMC to present status on an agreed-to basis. Other project topics may be covered at the discretion of the JSC PMC chairperson. Where the project manager for a JSC project is resident in a program office, the JSC PMC shall be limited to reviewing only the JSC support to that project. In this case, the presenter need not be the project manager. For those projects where the project manager is resident in a directorate, the JSC PMC will review the full project scope, including cost, technical, and schedule aspects. In this case, the presenter should be the project manager.

2.5.3.3.1 JSC Project Management Council processes
1) General Process Overview – The JSC PMC chairperson convenes the PMC quarterly as a minimum. The JSC PMC secretary coordinates requests for special JSC PMCs with the JSC PMC membership. The results of the JSC PMC discussions and assessments are documented in the form of minutes, findings, and actions. Outside-of-board (OSB) decision making by the JSC PMC chairperson is not the preferred process and is used on an exception basis only. All OSB decisions are documented in writing and provided to the JSC PMC membership by the JSC PMC secretary at the earliest opportunity. Unless otherwise documented herein, OSB activity is limited mainly to time-critical decisions. The following paragraphs define the standard JSC PMC milestones on which the JSC PMC bases its deliberations and decisions regarding the initiation or continuance of projects. The JSC PMC makes three major decisions in the life cycle of a project. These are to: (1) pursue new business by initiating or concurring on pre-proposal/proposal activity for development or support service projects; (2) approve implementation of a project; and (3) for projects that have been given ATP, concur with continued development and implementation of the project, authorize or enable the corrective action necessary to mitigate unfavorable impacts during project development, or terminate the project (Figure 2.5-1).

2) Pre-proposal Approval Process – The purpose of the pre-proposal approval process is to characterize for the JSC PMC any perceived business opportunities in which JSC may want to participate. This review occurs as soon as sufficient understanding of the customer's request and the resources required is available. It occurs before the concept review milestone in Pre-Phase A. The discussion at the JSC PMC will be focused around four major points. These points are:

• Whether the project being proposed is within the JSC mission and goals
• Whether there is the appropriate level of JSC support available to provide a reasonable probability of success for the pre-proposal and proposal development effort
• What the estimated costs for the pre-proposal and proposal efforts are, and what funding sources are available
• Whether this pre-proposal and proposal development effort was requested by a NASA program

A favorable outcome of this JSC PMC review results in approval to begin a pre-proposal effort. No set criterion is established to define, in advance, whether delegation to a lower-level directorate board should be done. However, at the same time as the JSC PMC provides pre-proposal approval, the project may formally request delegation to a lower-level board. The information requested is minimal in order that JSC PMC decisions may be timely; however, the information must provide assurance that there has been a complete and thorough assessment of all resources necessary to successfully complete the pre-proposal and proposal development effort. For facility and facility-associated support projects, the JSC PMC shall review the recommendations of the FRB and decide on the appropriate project review board.

If the pre-proposal and/or proposal efforts were specifically requested in writing by a NASA program, the pre-proposal team shall immediately proceed without waiting for JSC PMC approval and shall report their initiation of efforts at the earliest possible JSC PMC. Similarly, if the pre-proposal effort was not requested by the program but can be funded by internal directorate funds (e.g., Center general and administration (G&A) allocation) or external Center funding, the team may immediately proceed without waiting for JSC PMC approval. In such a case, the pre-proposal team shall report their initiation of efforts at the earliest possible JSC PMC. At that time, the JSC PMC must determine whether project oversight will be kept at the PMC or delegated to a directorate-level project management board. The JSC PMC approval for this effort may be given OSB. If the pre-proposal efforts are for a project in support of a non-NASA program and funded by internal Center funds, JSC PMC approval shall be required prior to initiation of these activities. Detailed PMC procedures in support of this process are documented on the JSC Office of the Chief Engineer's Web site.


Figure 2.5-1. Pre-proposal/proposal/implementation approval process. (Flowchart: the figure traces a project from directorate pre-proposal discussion, with Facility Review Board review and facility NPDs/NPRs applied to facility and facility-associated projects, through JSC PMC pre-proposal approval and possible delegation of oversight to directorate boards, Pre-Phase A and Phase A work, proposal presentation to the JSC PMC with ERB review for projects that are not program directed, ATP at the System Definition Review (SDR), customer approval review, and Phase B/C work with status presented to the JSC PMC on an agreed-to project-specific schedule. Unfavorable decisions result in assigned actions or termination of the activity per Section 3.7.)

Key: ATP – authorization to proceed; ERB – Engineering Review Board; FRB – Facility Review Board; JSC – Johnson Space Center; NPD – NASA Policy Directive; NPR – NASA Procedures and Requirements; OSB – outside of board; PMC – Project Management Council
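For readers who find it helpful, the following minimal sketch encodes a few of the branch points shown in Figure 2.5-1 and described in item 2 above. It is illustrative only; the function and field names are hypothetical and are not defined by this document, and the governing rules remain those stated in the text.

def initial_review_forum(is_facility_project: bool) -> str:
    # Facility and facility-associated support projects go to the FRB first;
    # the FRB then recommends a primary oversight forum to the JSC PMC.
    if is_facility_project:
        return "Facility Review Board review, then JSC PMC decides oversight"
    return "JSC PMC (oversight may be delegated to a directorate-level board)"

def may_begin_preproposal_work(pmc_approved: bool,
                               program_directed: bool,
                               directorate_or_external_funds: bool,
                               supports_non_nasa_program: bool) -> bool:
    # Simplified reading of the pre-proposal approval process (item 2 above).
    if pmc_approved:
        return True
    if supports_non_nasa_program:
        # Treated conservatively here: the text requires prior JSC PMC approval
        # when a non-NASA-program effort is funded with internal Center funds.
        return False
    # Program-directed or directorate/externally funded efforts may proceed and
    # report their initiation at the earliest possible JSC PMC (approval may be OSB).
    return program_directed or directorate_or_external_funds

if __name__ == "__main__":
    print(initial_review_forum(is_facility_project=True))
    print(may_begin_preproposal_work(pmc_approved=False, program_directed=True,
                                     directorate_or_external_funds=False,
                                     supports_non_nasa_program=False))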



3) Implementation Approval Process – The purpose of the implementation approval process is to review the project proposal developed and to determine whether it should be implemented. This review occurs at the same time as the System Definition Review (SDR) in Phase B of the project life cycle. The discussion at the JSC PMC will be focused around four major points:

• That the project is appropriate for the JSC mission and goals
• That there is the appropriate completeness and accuracy of the proposal development effort expended to date
• Whether the appropriate level of JSC support is available to provide a reasonable probability of success for the resulting project
• The appropriateness of delegating management oversight to a directorate-level board (and which directorate-level board to delegate to)

A favorable outcome results in an ATP for implementation of the project. An unfavorable outcome may result in additional work being required or termination of the effort (FIG. 2.5-1). For all projects given ATP, the JSC PMC shall document in its minutes the frequency of future status reviews. The JSC PMC project approval information requested is minimal in order that JSC PMC decisions may be timely; however, the information must also provide assurance that a complete and thorough proposal development effort has been made. If the proposal efforts were specifically requested in writing by a NASA program, the proposal team can immediately proceed without waiting for JSC PMC approval and shall report their initiation of efforts at the earliest possible time to the JSC PMC. JSC PMC approval for this effort may be given OSB. Detailed PMC procedures in support of this process are documented on the JSC Office of the Chief Engineer's Web site.

4) Implementation Review Process – The purpose of the JSC PMC implementation review is to look at the current status of the project or project support effort. The discussion at the JSC PMC will be focused around two major points; i.e., to:

• Demonstrate to the JSC PMC that the project is meeting its intended goals and requirements within the approved schedule and resource envelopes.
• Apprise the JSC PMC of any corrective action needed above the project level.

A favorable outcome results in an authorization to continue design, development, or operations activities on the project. An unfavorable outcome may result in additional work being required or termination of the project. The JSC PMC project status review information requested is minimal in order that JSC PMC decisions may be timely; however, the information must also provide assurance that there is complete and thorough project management of the design, development, or operations effort. All approved projects will be reviewed on an agreed frequency, and this frequency shall be formally documented in the JSC PMC minutes. Detailed PMC procedures in support of this process are documented on the JSC Office of the Chief Engineer's Web site.


2.6 Documentation Tree

The following figure (FIG. 2.6-1) is provided to graphically show the various levels and relationships of many project management or project management-related documents as they affect JSC projects. The listings shown are not a complete representation of all documents since there are many more documents (including NASA external documentation) that are applicable. The listing is instead intended to show the JSC Center-level and directorate-level implementation of these documents and their requirements.

Figure 2.6-1. Documentation tree. (The original figure arranges the following documents by level; each document is marked in the figure as either completed or in development.)

Headquarters level ("What"): NPR 7120.5B, Program/Project Management; NPR, Systems Engineering; facilities NPDs and NPRs (NPD 8820.2A, Design and Construction of Facilities; NPD 8820.3, Facility Sustainable Design; NPR 8820.2E, Facility Project Implementation Guide; NPR 8831.2D, Facilities Maintenance Management; NPD 8831.1D, Management of Institutional and Program Facilities and Equipment; NPD 7330.1F, Approval Authorities for Facilities Projects); NPR 1440.6G, Records Management; NPD 7500.1, Program/Project Logistics; NPD 8010.2, Metrics System; NPD 8730.4, S/W IV&V; NPD 8010.3, Terminate Space Systems; NPR 8000.4, Risk Management; NPD 9501, EVM; NPR 2190.1, Export Control; NPD 2820.1, Software Policies.

Center level ("Detail of What"/"How"): JPD 7120.1, Project Management; JPD 8090.1, Systems Engineering; JPG, Project Management; JPG, Software Guide and Requirements; JPD, Project Control (includes EVM); JSC EVM Handbook; NASA RP 1358, Systems Engineering Toolbox; SP 6105, Systems Engineering Handbook; NPR Systems Engineering Handbook.

Directorate level ("Detail of How"): Engineering (EA) – EA-WI-023; WSTF – WP 09-0014, Project Management; JA – various WIs; MOD, SLSD, IRD, and FCOD – TBD.


2.7 Project Team

JSC uses, and shall continue to use, a collaborative engineering and development approach for all Center projects. As such, the project and support teams include representation of all organizations necessary to successfully complete the project. The following project and support team member discussions are intended to document a minimum set of roles and responsibilities needed to effectively and successfully manage a project. The team member listings also highlight the need for a conscious project discussion as to individual team member roles and responsibilities.

The project must make a strong effort to identify the correct minimum number of individuals early in the project planning effort. In this identification process, however, there must be a clear understanding of the expected workload for each project and support team member. Especially important is an understanding of the workload for those individuals who take on multiple project roles (e.g., project manager and project control officer). The detailed discussion of what is required for each role is, therefore, provided to lay a common framework across all projects and directorates at JSC. The project manager can also use the project team membership discussion for additional purposes. For example, they can use it as:

• A quantitative planning tool for estimating the minimum number of project personnel.

• An introduction to the tailored management processes and procedures to be used.

Finally, these agreed-to project management positions shall be documented in the project management plan.

2.7.1 Organization

The project organization is the establishment of clear, non-conflicting authority relationships between positions that have been assigned specific tasks required to achieve project requirements. A full understanding of the project performance requirements, costs, and schedules is necessary to identify the specific sets of responsibilities involved. Delegation of authority to undertake these responsibilities is key to organization, and is one of the most important managerial abilities. Unless authority is effectively delegated, duties requiring coordination of group activities cannot be effectively assigned to a subordinate supervisor, who must, in turn, have adequate authority to accomplish those tasks or assign them to others.

A project's organizational structure and staffing are dependent on the character of the project and will change as the project matures and areas of importance and priorities shift. The project manager is responsible for identifying these necessary project changes and directing the associated replanning, reorganizing, and re-staffing. Not all project positions described in the following sections are necessary for every project.

2.7.2 Membership

The following paragraphs describe functions and roles of various positions in the project management team.

2.7.2.1 Project manager role

The project manager shall have the authority and responsibility to execute the project. The project manager is responsible for all aspects of the project, beginning with identification of the project team skill requirements and ending, ultimately, in ensuring that project requirements are met within budget and schedule. The project manager is responsible, in accordance with Federal regulations and NASA and JSC management directives (many of which are referenced herein), for the successful planning and implementation of project resources, schedule, and performance objectives. This includes being responsible for the overall project safety and risk management.

The project manager receives authority via a clear chain of delegation beginning with the program commitment agreement (PCA) and flowing through the program plan to the project management plan (for program-delegated projects) or by the approved project management plan (for JSC projects not tied to a specific program). The project manager monitors all aspects of project planning, definition, development, and implementation to develop a clear grasp of progress and problems. The project manager will then determine the necessary corrective action and implement it. The project manager is therefore responsible for determining when changes to the project schedule, design requirements, and available resources are necessary or when interim test results, failure investigations, or other unpredictable constraints also require changes in the project, such as parallel and/or repeated design activities. Additional key aspects of the project manager's responsibilities include:

a. Functioning as the project representative to upper management. As such, it is the project manager’s responsibility to remain cognizant of external issues and concerns and address them in a timely manner.

b. Identifying to line management the skills required to successfully achieve the project objectives and working with line management to identify team members who meet these requirements.

c. Ensuring that the SE functions are accomplished in accordance with this guideline.

d. Ensuring that the project control functions are accomplished in accordance with this guideline,


including incorporation of appropriate earned value management (EVM).

e. Ensuring formulation and maintenance of the project acquisition strategy.

f. Ensuring adequate training for the project team members.

g. Maintaining knowledge of the information generated within the project and taking appropriate action to protect it, depending on its sensitivity.

h. Ensuring project compliance with export control requirements.

i. Ensuring appropriate implementation of safety for personnel and protection of national security classified/sensitive/valuable unclassified information, material, facilities, and other property.

j. For a contractor-supported/NASA-managed project, developing and maintaining an understanding of the contractor's processes, procedures, and activities.

2.7.2.2 Deputy project manager (DPM) role

Depending on the project, it may be necessary to have a DPM in addition to the project manager. The DPM will normally be formally delegated all of the responsibilities held by the project manager when the project manager is absent. In addition, the DPM may be delegated primary responsibility for selected project management tasks by the project manager. Any project-unique responsibilities need to be documented in the PMP. Some examples of reasons why a DPM may be required include:

• A very large or complex project
• A geographically distributed project
• A project management developmental opportunity for an individual

2.7.2.3 Lead systems engineer (LSE) role

The LSE is a key member of the project team. The LSE is functionally responsible to the project manager for assuring that system implementation fulfills system needs and that proper system engineering practices are being followed. The LSE oversees the project system engineering activities, including in-house and any contractor responsibilities, to assure they are adequate and in compliance with the project constraints of performance requirements, cost, and schedule. As part of the effort to ensure that proper system engineering practices are being followed, the LSE must direct, monitor, or coordinate applicable SE tasks both within the Center and, to the appropriate level and as required, outside the Center. The LSE must also constantly review and evaluate the technical aspects of the project to ensure that the system/subsystems engineering processes are functioning appropriately. In addition, the LSE plays the major

role in determining tailoring, if any, for the SE practices of the project. Although the project manager will look to the subsystem managers as the authorities on subsystem performance, it is the responsibility of the LSE to:

• Ensure the analytical and physical integration of the subsystems.

• Monitor the performance of all subsystems.
• Ensure the technical performance of the overall integrated system.
• Identify, plan, schedule, appropriately resource load, and execute all system-level test activities to support the overall project schedule.

As part of their effort in ensuring that system implementation fulfills system needs, the LSE is responsible for managing and directing all of the project's SE activities performed using the SE processes outlined in Chapter 4. These activities include requirements development and flowdown, formulation of systems architectures and concepts, systems design, interface definition and control, development and monitoring of technical performance measurements (TPMs), system-level risk management, system-level trade studies, fabrication planning, test plan development, system integration and testing, and verification.

Another role of the LSE is to make recommendations on project design trade-offs where performance, cost, and schedule must be balanced. Through responsibility for the technical adequacy of all system-related activities of the project, the LSE must strive to ensure that system engineering is exercised in all project decisions. Finally, the skills and knowledge necessary for an LSE are normally accrued from multiple past subsystem or system assignments, training classes, and other professional developmental experiences.

2.7.2.4 Project control officer role

The project control officer shall be responsible to the project manager for all project documentation, for assessing project progress against baseline schedule(s), for providing integrated (i.e., technical, cost, and schedule) project-level non-technical recommendations, and for assisting the project manager in control of project resources and activities, including budget, schedules, customer agreement, and overall project management. The project control officer shall also be responsible for providing support to the total management process of establishing and maintaining project baselines and effectively supporting the project manager in meeting the overall objectives of the project. The project control officer is responsible for:

• Resource management
• Project planning


• Documentation and data management
• Project assessment
  − Cost estimating
  − Performance measurement
  − Project analysis
• Schedule management
• Acquisition management
• Risk management
• Configuration management (CM)
• Quality management

The project control officer shall:

a. Provide detailed recommendations early in the project planning phases of what minimum data set (including the deliverables list) is required throughout the project life cycle.

b. Ensure project progress and activities are consistent with approved project customer agreements, budgets, schedules, and acquisition strategies.

c. Ensure all aspects of the project cost and schedule status are integrated and documented (especially including subsystem status).

d. Manage the day-to-day project-support functions such as management information systems (including IT security plans), organizing the logistics of programmatic reviews, compiling and reporting project metrics, and reporting contract or NASA-internal performance measurement surveillance efforts (including any EVM efforts).

e. Ensure project document compliance with any NASA requirements and, if applicable, ensure contractor document compliance with any contractual requirements.

f. Fulfill the library and records retention function for storage and retrieval of project documents.

g. Develop the project data management procedures for the project.

h. Develop and implement planning for contract acquisition.

i. Provide detailed recommendations for the project acquisition strategy.

j. Provide the project input for the annual Center/program operating plan (POP) cycle.

2.7.2.5 Subsystem and discipline lead engineers role

The subsystem lead engineer (SLE) and the discipline lead engineer (DLE) are a significant factor in the successful technical and cost management of a project. The SLE is the project manager's primary contact on management of a particular subsystem. Depending on the project product complexity and staffing constraints, an SLE may be responsible for more than one subsystem. The DLE performs a similar function, but is responsible for the management of performance analyses for the system or subsystem. Examples of engineering disciplines that may be important to the project include: aerodynamics analysis; aerothermal analysis; guidance, navigation, and control; static and dynamic structural analyses; materials analysis; software/avionics development; and reliability, maintainability, availability, and human factors. SLEs and DLEs shall provide detailed subsystem or analysis status to the project LSE for technical review and integration, and to the project control officer for review and integration of cost and schedule status. The SLE or DLE shall:

a. Be responsible for ensuring the subsystem or analysis requirements are appropriately and correctly flowed down from system requirements.

b. Be responsible for ensuring the subsystem or analysis requirements are achieved within project budget and schedule.

c. Be responsible for the subsystem functions and, therefore, for the technical performance of their subsystem. [SLEs]

d. Be responsible for the technical accuracy and completeness of their discipline areas. [DLEs]

e. Make appropriate use of the management tools provided by the project control officer (e.g., EVM, critical path analysis, 5×5 risk management matrix, technical performance measurements, etc.); an illustrative earned value calculation appears after this list.

f. Be responsible for developing subsystem or analysis design documentation to support scheduled reviews.

g. Be responsible for the planning and conduct of subsystem fabrication and testing. [SLEs]

h. Ensure that, to the maximum extent possible, the system will be tested appropriately and can fulfill its requirements and perform as intended.

i. Ensure that all subsystem-level test activities and support are identified, planned, scheduled, appropriately resource loaded, and executed to support the overall project schedule.
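To make the earned value terminology referenced above concrete, the following minimal sketch computes the standard EVM indicators from planned value (PV), earned value (EV), and actual cost (AC). The function name and the example figures are hypothetical and are not prescribed by this document.

def evm_metrics(pv: float, ev: float, ac: float) -> dict:
    # Standard earned value indicators; positive variances and indices above 1.0
    # indicate performance that is better than planned.
    return {
        "cost_variance": ev - ac,       # CV = EV - AC
        "schedule_variance": ev - pv,   # SV = EV - PV
        "cpi": ev / ac,                 # cost performance index
        "spi": ev / pv,                 # schedule performance index
    }

if __name__ == "__main__":
    # Hypothetical status: $120K of work planned to date, $100K earned, $110K spent
    for name, value in evm_metrics(pv=120.0, ev=100.0, ac=110.0).items():
        print(f"{name}: {value:.2f}")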

2.7.2.6 Verification lead role

The verification lead is accountable to the project manager and is responsible for all verification aspects of the project. The verification lead shall:

a. Ensure requirements are verifiable.
b. Develop the verification plan.
c. Identify the most effective verification methods.
d. Identify verification criteria and coordinate activities with verification facilities, participants, and other team members.
e. Execute the verification plan.
f. Develop or coordinate verification procedures.


g. Collect and document verification results.
h. Evaluate results for compliance or need for reverification.

2.7.2.7 Validation lead role

The validation lead is accountable to the project manager and is responsible for all validation aspects of the project. The validation lead shall:

a. Develop the validation plan.
b. Identify the most effective validation method.
c. Coordinate activities with validation facilities, participants, and other team members.
d. Execute the validation plan.
e. Develop or coordinate the validation procedures.
f. Collect and document the validation results.

2.7.2.8 Project scientist role

Not every JSC project needs to have a project scientist. However, it may be necessary to have a project scientist on a project if there is significant scientific instrumentation/utilization in the project. For a science-based project, the project scientist shall be deemed necessary when no principal investigator (PI) exists, when multiple PIs exist, when the PI is external to the Center, or when the JSC PI cannot serve in the project scientist function for the project. The project scientist's role, duties, and responsibilities shall include:

a. Overseeing the scientific integrity of the project's mission within project constraints.

b. Ensuring that science requirements are adequately documented and verified in the project.

c. Ensuring that the project planning, definition, implementation, and operations are appropriate for the science requirements.

d. Serving as the scientific advisor to the project manager, advising on proposed changes to science objectives or requirements, when necessary.

e. Participating in appropriate project reviews to ensure the project science requirements are appropriately addressed.

f. Acting as the science interface for data analysis and plans.

It should be noted that when a PI serves in the role of project scientist, the PI may also serve as the project manager.

2.7.2.9 User representative role

A key input into the project day-to-day management often comes from the end user of the product. While not every project at JSC needs to have a user representative on its team, it is critical that the project team continually consider the user as it develops the end products. Some examples of user representatives include flight crew, flight controllers, surgeons, or the JSC IT community. For those projects that have

a flight crew representative, their role is to provide the operator's viewpoint and experience throughout all phases of the project, and to obtain consensus of the Astronaut Office on issues involving safety and mission success. The flight crew representative shall act as a consultant to the project team, and provide input into, and concurrence on, project requirements and products. The flight crew representative will also coordinate participation from the Astronaut Office in testing and checkout, procedures development and verification, and other activities as appropriate. Involvement of the Astronaut Office ensures that the on-orbit experience and lessons learned are carried forward in engineering activities dealing with human space flight. In addition, the Astronaut Office benefits by gaining in-depth knowledge of the systems they will operate on orbit.

2.7.2.10 Administrative officer role

The administrative officer is responsible for assisting the project manager in a variety of administrative activities in support of the project. Responsibilities may vary depending on the size and complexity of the project and the project manager's delegation of responsibilities and assignments. Responsibilities may include the tracking of project-related correspondence, handling of personnel actions, oversight and maintenance of internal task agreements (ITAs), and recording and tracking of action items (both internal to the project and assigned to the project externally). The administrative officer may also serve as the official training and records officer for the project, thereby ensuring compliance with JSC Procedures and Guidelines (JPG) 1440.3, JSC Files and Record Management Procedures.

2.7.3 Support Team

The following paragraphs describe various functions and roles in support of the project management team.

2.7.3.1 Line management support

JSC projects normally operate under a matrix organizational approach. As such, the primary functions of line management in the life cycle of a project include:

a. Participating in the project approval process.

b. Providing the resources, facilities, and tools required for the project.

c. Selecting competent and knowledgeable project managers.

d. Implementing Agency, Center, directorate, and division policies, procedures, and standards.

e. Enforcing safety, cost, schedule, technical, and management processes to ensure products and services are consistent and of high quality.

Page 42: NASA Project Management System Engineering and Project Control Processes and Requierments

Project Management: SE & Project Control Processes & Requirements

2-16

f. Providing support and oversight of technical management decisions.

g. Ensuring projects and project decisions that impact other projects, processes, or priorities established by the customers, the Agency, Center, directorate, and/or division are coordinated/communicated.

h. Ensuring communication and integration across projects and other divisional/organizational lines.

i. Mentoring the entire project team.

j. Providing technical and management guidance to the project manager based on project complexity and the experience level of the project manager.

While line management reserves the right to provide technical direction to project managers under their supervision, any resulting changes in baselined cost, schedule, or performance requirements shall be approved by the customer in accordance with the established CM process. Technical issues that cannot be resolved between the line management chain and the project manager shall be presented to the JSC ERB for final disposition prior to coordination with the customer. In cases where the project or line management feels a decision jeopardizes crew safety or mission success, appeals should be taken directly through the appropriate management chain or indirectly (anonymously) through the independent assessment (IA) organizations, the JSC Ombuds, the NASA Safety Reporting System (NSRS), the Government Accountability Office, and/or the NASA Inspector General.

Given line management responsibilities for the selection and performance of the project manager, line management can replace any or all of the project management team under their supervision when significant technical competency or management issues with the project management team are identified. In all cases, this removal should be coordinated with the project customer and the directorate.

2.7.3.2 Safety and mission assurance (S&MA) support

S&MA project support is provided through a Safety and Mission Assurance Directorate-assigned representative, usually collocated with the project team. In this capacity, the S&MA representative shall:

a. Assist the project manager in assuring that all system S&MA requirements (including personnel and facility safety) are appropriately defined and implemented.

b. Provide for an independent S&MA oversight and assessment function for all aspects of the project.

c. Serve as the single point-of-contact between the project team and the Safety and Mission Assurance Directorate, assuring proper coordination, review, or approval of all safety, reliability, maintainability, and quality assurance (QA) responsibilities and practices.

d. Ensure that S&MA processes are established within the project.

2.7.3.3 Mission operations support

Mission operations project support is provided through an assigned representative or representatives; typically, these representatives come from the Mission Operations Directorate. The representative shall be the project's interface with the mission and operations discipline organizations and personnel. The project team normally requires a person to lead efforts associated with ensuring that the project mission operations are properly defined, planned, and executed. Mission operations encompass the flight and ground operations support personnel, S/W, procedures, H/W, and facilities required to execute the flight mission. The responsibilities of the mission operations support team member are also to lead the efforts associated with defining the operations team and to ensure that the operations team is trained and ready to support operations. Ground operations are included as the ground segment of mission operations.

2.7.3.4 Test operations lead support

The test operations lead is accountable to the project manager and is the system test team individual responsible for ensuring that, to the maximum extent possible, the system will be tested appropriately and can fulfill its requirements and perform as intended. The test operations lead shall ensure that:

a. All system-level test requirements provided by the project team have been identified and documented prior to testing.

b. All system-level test procedures provided by the project team have been developed and verified prior to testing.

c. All subsystems supporting the system-level tests are ready to support and are appropriately documented.

d. The Test Readiness Review (TRR) appropriately addresses all system test performance and safety requirements and incorporates all applicable lessons learned from the JSC lessons learned database (http://iss-www.jsc.nasa.gov/ss/issapt/lldb/) as well as the NASA lessons learned database (http://llis.gsfc.nasa.gov/).

e. The required test and/or integration facilities have been verified to provide the necessary conditions and have been scheduled and are available.

f. The required ground support equipment (GSE) has been verified to provide the necessary support resources and has been scheduled and is available.

2.7.3.5 Facilities support

In some cases, it may be necessary for the project to have support for the modification or construction of Center facilities. Coordination of these activities may be accomplished through a Center Operations representative on the project team, or it may be accomplished through project team coordination directly with appropriate Center Operations personnel. For new facilities, it is critical that all requirements be identified in any budget submission, and the impact on other funding sources (e.g., institutional) also be defined. New facility requirements should be split between nonrecurring (outfitting) and recurring (operations and maintenance). Considerable and timely foresight must be given to facility requirements. Most requirements ideally can be accommodated with minimal changes to existing facilities. Where changes are required, however, cost and schedule considerations will determine the procedure for effecting change.

Facility projects may be funded by project or construction of facilities (CofF) funds. Project funding can be used only through use of an "Unforeseen Programmatic Document" that explains the urgency of the requirement and gives reasons why the requirement has not been included in the CofF budget cycle. The CofF budget cycle requires a minimum of three years from initial submission of the requirement to completion of construction. Initially, the project requirements are specified for inclusion in the CofF budget cycle. A preliminary engineering report further defines the requirements during the first year, design is accomplished during the second year, and construction is started during the third year. In most instances, construction can be completed in one year. A number of Center offices participate in the preliminary engineering report and design phases, ensuring the completed project will satisfy all requirements and meet Center objectives.

2.7.3.6 Environmental support

In some cases, it may be necessary for the project to have support for environmental concerns or impacts. Coordination of these activities may be accomplished through a Center Operations representative on the project team, or it may be accomplished through project team coordination directly with appropriate Center Operations personnel.

When planned project activities have the potential for environmental impact, the support function provides the capability to perform analyses and make assessments of the potential environmental impact. Some of the areas the environmental team supports include storm water and wastewater management, industrial and hazardous waste management, oil storage, asbestos and lead control, and spills and releases management.

2.7.3.7 Counterintelligence support

In almost all cases, it will be necessary for the project to have support for the counterintelligence function. Coordination of these activities may be accomplished through a Center Operations representative on the project team, or it may be accomplished through project team coordination directly with the appropriate Center Operations personnel. The Center counterintelligence function provides an assessment and support capability for the project, the Center, and the Agency. This function works to actively identify potential technologies and other products that could be used by foreign governments (businesses or individuals) for their economic/strategic advantage. This function is distinctly separate from the export control function in that it also works to actively assess technology development areas and analyzes them against specific threat areas. A key characteristic of these counterintelligence representatives is their ability to work proactively with the project management team to ensure both the appropriate awareness of, and responsibility for, protecting these technologies and to identify attempts to access them improperly.

2.7.3.8 Export control support

In some cases, it may be necessary for the project to have support for export control activities. Coordination of these activities may be accomplished through a Center Operations representative on the project team, or it may be accomplished through project team coordination directly with the appropriate Center Operations personnel. The purpose of the export control function is to ensure compliance with the JSC export control requirements documented in the JSC Common Work Instruction (CWI) J29W-01, JSC Export Compliance; the NASA export control directive, NPD 2190, NASA Export Control Program; and the export control requirements of the Departments of State and Commerce as well as other agencies. This important function covers all JSC projects and encompasses export control for all product technologies and technical information that have the potential for export outside of the U.S. In addition, the Export Control Office provides consultation and guidance for all project activities at JSC.


2.7.3.9 Logistics, transportation, and shipping support

In some cases, it may be necessary for the project to have support for logistics, transportation, and shipping. Coordination of these activities may be accomplished through a Center Operations representative on the project team, or it may be accomplished through project team coordination directly with appropriate Center Operations personnel. For logistics, transportation, and shipping support, it is critical that all requirements be identified as early as possible in the life cycle. Early identification and communication of any of these requirements will minimize the probability of project disruption from these support areas.

2.7.3.10 Procurement support

For contracted projects, the project manager's procurement interface is a Procurement Office representative to the project team. This representative may or may not be collocated. For a JSC internal project (e.g., GFE), the project Procurement Office representative may only function to assist in purchasing of materials, services, or components for the final project product. The Procurement Office representative shall be responsible for:

a. Development of the Master Buy Plan submission to the JSC Procurement Office.

b. Coordinating the acquisition strategy including scheduling and assisting in the development of the presentation to the Acquisition Strategy Meeting with JSC management and, if required, NASA Headquarters.

c. Assisting in the development of the statement of work (SOW) for the contract.

d. Development of the request for proposal (RFP).

e. Formation and operation of a Source Evaluation Board leading to source selection.

The Procurement Office representative is also responsible for contract negotiation for a contracted project. The Procurement Office representative also supports the development of a change order control system to implement contract changes. Timely development of change requirements and technical evaluation of change proposals by the Project Office will permit early change order negotiations and ensure a firm contract baseline. Formal authority to enter into and modify contracts rests solely with the contracting officer in the cognizant Procurement Office; however, there must be a close working relationship between the contracting office and the project contracting officer technical representative (COTR) to ensure appropriate management and direction of any contractor's support efforts.

2.7.3.11 Office of the Chief Financial Officer (CFO) support

The Office of the Chief Financial Officer is the focal point for financial planning and budget execution for the Center. The Office designs and implements the financial and resources systems required for proper data collection and reporting and ensures that Agency and Center-level financial and resource decisions are implemented. It reviews, approves, and implements financial and resources policies and systems and integrates the planning, implementation, management, and control of all resources for which the Center is responsible. The CFO provides guidance on the administrative responsibilities of the technical community with regard to the initiation and approval of purchase requests. The guidance is included in the IFM On-line Quick Reference, http://olqr-cf.ifmp.nasa.gov. Guidance regarding planning and execution of civil service travel will be provided at a later date.

2.7.3.12 Office of the Chief Engineer support

The Office of the Chief Engineer provides support to the projects primarily through the Systems Management Office (SMO). The SMO provides leadership, consultation services, and technical expertise on SE and project management. The SMO also reviews and determines Centerwide usage and consistency for common SE and project management processes, procedures, and practices. This includes the use and implementation of appropriate performance measurement systems requirements such as EVM.

As a project consultant, the SMO can assist the project in planning and accomplishing Pre-Phase A, Phase A, and even some Phase B efforts. Experience has shown that a significant percentage of projects that have failed to meet performance, cost, or schedule requirements have had an inadequate effort in these critical project phases. Therefore, the SMO can be called on to assist the project in executing these phases appropriately. As a project reviewer, the SMO can act as an independent reviewer, either at the request of the project itself, the program manager, the director, or the Center Director. In this capacity, the SMO can examine the project from a technical, cost, and/or schedule viewpoint and provide detailed recommendations for corrective actions.

2.7.3.13 Legal support

The project manager must maintain a close working relationship with the legal representative on all issues or concerns with legal implications or involving legal policy. Matters that may give rise


to claims or litigation should be communicated and coordinated as early as possible. Any correspondence or contacts by legal counsel external to NASA should be referred to or reported immediately to the Chief Counsel or the Chief Counsel's designated representative. Any court or administrative legal papers affecting NASA (e.g., lawsuits, claims, subpoenas, or summons) shall also be referred to the Chief Counsel for advice and guidance.

2.7.3.14 Communication and information technology support

JSC has significant existing IT and communications capabilities to support individual or project needs. Because of the diverse nature of projects at JSC, however, unique communication or IT requirements may develop. Coordination and discussion of these requirements may be accomplished through an Information Resources Directorate (IRD) representative on the project team, or it may be accomplished through project team coordination directly with appropriate IRD personnel. For information on existing directorate-specific IT tools, the project team should review the listings at http://cio.jsc.nasa.gov/Center/itp/standards/standards.htm.

These unique communications and IT requirements should be identified early in the project planning efforts to provide for effective long-range implementation and budget planning as part of the Agency and JSC forecast. With this information, early communications feasibility engineering for lease or contracting workloads for project execution can be accomplished. Timely scheduling to include communications in project planning will permit the required communications and IT support to be available to support the project for management, scientific, technical, and operational requirements.

2.7.3.15 Documentation and graphics support

JSC has significant existing documentation and graphics capabilities to support individual or project needs. Coordination and discussion of these needs may be accomplished through an IRD representative on the project team, or it may be accomplished through project team coordination directly with the appropriate IRD personnel. Some of the documentation and graphics services and products provided throughout the project life cycle include printing and reproduction, editing and technical writing, graphics, mail delivery/pickup services, and documentation repository.

2.7.3.16 Photographic-video support

The project may require photographic or video support over its life cycle. The IRD is responsible for providing the photographic and operational video

services for development and mission support. Coordination and discussion of these needs may be accomplished through an IRD representative on the project team, or it may be accomplished through project team coordination directly with the appropriate IRD personnel. For space flight mission support, this includes acquisition, distribution, recording, search/retrieval and playback, editing, duplication, and archiving. Operational services are also provided to satisfy project-level test, training, and administration video support requirements. This support team also operates the audio/video distribution network and the JSC cable TV system.

2.7.3.17 Public Affairs Office support

The Public Affairs Office provides support to the projects and the project team by serving as a liaison for project information going to the public and the media. Coordination of these activities may be accomplished through a Public Affairs Office representative on the project team (if there is enough demand), although it is more likely to be accomplished through project team coordination directly with the appropriate Public Affairs Office personnel. Additional guidance can be obtained through reference to JPD 1382.1H and JPD 1382.4K.

2.7.3.18 Human factors support

For cases in which the H/W or S/W being developed requires a human interface, it may be necessary for the project to have support for human factors and habitability concerns or impacts. Coordination of these activities may be accomplished through project team coordination directly with the appropriate representatives or by having an assigned representative on the team. Typically, these representatives come from the Space and Life Sciences Directorate. When planned project activities have the potential for human factors or habitability impact, the support function provides the capability to perform analyses and make assessments of potential impact. Involvement of human factors also ensures that on-orbit operational habitability experience reports and lessons learned are carried forward in project activities. Some of the areas human factors and habitability personnel support include physical and visual accessibility, human strength capabilities and limitations, workstation design, internal environment design, labeling and coding, and user/computer interaction.

2.7.3.19 Technology transfer and intellectual property management support

The Office of Technology Transfer & Commercialization provides support to projects in two areas: technology transfer and intellectual property management.


The technology transfer representative shall:

a. Assist the project manager in technology planning by
   • Identifying and negotiating with potential partners for joint technology development ventures.
   • Identifying technologies at other government organizations and universities that would be prime candidates for technology infusion into the project.

b. Assist the project manager in identification and documentation of new innovations that result from technology development performed by the project.

c. Provide guidance on all technology transfer functions and initial guidance on intellectual property matters.

The Patent Counsel and the Patent Counsel's staff shall assist the project manager on all matters pertaining to intellectual property, which includes but is not limited to invention disclosures, patent prosecution, copyrights, S/W usage agreements, new technology and patent rights clauses in contracts, and procurement counseling on related issues.


2.8 Project Management and Planning

Project management and planning activities provide the framework to ensure that a project meets the established budget, schedule, safety, and technical requirements to satisfy both the project's and the customer's objectives. The project management and planning activities shall operate over the entire life cycle of the project and must be developed and implemented such that a rapid assessment and response capability is inherent. The amount of project management and planning performed should be commensurate with project approach, type, size, risks, and complexity.

Project management and planning activities are included in a PMP (see Appendix A for template). This not only establishes project controls that include a project baseline to allow for performance to be monitored and corrective action taken, if necessary; it also allows for clear communication to project team members of the processes, methods, and activities that will be used in managing the project. Included in this shall be any tailoring approaches to be used during the lifetime of the project. In addition to the PMP, project management and planning activities create other products and documentation that supplement the PMP and support the annual program operating plan (POP) process.

Project activities will likely change the project baseline as the project progresses. Reviews may be conducted internally or externally by the project, program (if applicable), or Center, or through an independent assessment organization. If significant performance variances or risks are identified that jeopardize project objectives, changes to the PMP and project baseline may be required.

2.8.1 Project Management

One of the key aspects of project management is the formation and operation of a project team; that is, the art of transforming a group into a team that operates efficiently and effectively. While each project is unique, there are key underlying characteristics that should be understood and followed by each project manager at JSC. They are:

• Honesty
• Integrity
• Leadership
• Motivation
• Negotiation
• Communication
• Delegation
• Decision making

As the project begins Pre-Phase A activities and plans and progresses through Phase E, the project team will transform itself. This fact must be well understood by the project manager since each phase has specific

characteristics and, more importantly, requires a different management guidance style (also called situational leadership). Many descriptions of this transformation process exist. As an example, one description is called the Four-Stage Model. Simply stated, the team passes through four stages on its way to becoming a unified, effective work team. The stages and the associated management styles are listed below:

1. Forming – Telling (e.g., provides specific directions and closely supervises performance)

2. Storming – Selling (e.g., explains decisions and provides opportunity for clarification)

3. Norming – Participating (e.g., shares ideas and facilitates decision making)

4. Performing – Delegating (e.g., turns over responsibility for decisions and implementations)

Since a considerable number of knowledge sources exist on this topic, further discussion of the generic transformation process and the appropriate management guidance style for each stage is left for the reader to obtain outside this document. Available resources include the Human Resources Development Branch and the Academy of Program and Project Leadership (APPL) (http://appl.nasa.gov/home/).

2.8.2 Project Management Plan and Project Baseline

As the project develops, one of the key documents that captures and establishes the baseline for the project implementation activities is the PMP. A PMP documents project products and describes the overall plan for project implementation. The PMP shall serve as the basic agreement for the project among the project manager, the Center managing the project, and the program management (if the project is directly supporting a program). PMPs are unique to each project, and the level of detail may vary with the size, complexity, sensitivity, and other particular characteristics of the project. The PMP also contains additional information, and shall be prepared in accordance with Appendix A of this document.

The PMP shall create an integrated project baseline by linking scope, requirements, schedule, and cost to risk. To accomplish this, projects shall start the planning process in the Pre-Phase A efforts, a process that will mature with the continuation of the system definition and preliminary design process. The draft PMP shall be completed at the end of Phase A. The final PMP shall be completed during Phase B, and it shall be maintained and updated throughout the remaining project life cycle phases.

The planning process should be conducted in parallel with the definition of requirements. During Phase A, a product-oriented work breakdown structure (WBS) shall be developed and refined as requirements are further defined. In addition to a WBS, projects


In addition to a WBS, projects should have a high-level schedule, a life cycle cost estimate, and an organization concept developed. During Phase B, an integrated resource-loaded schedule shall be developed, with refinement of the WBS and schedule and development of a bottom-up budget estimate. A risk assessment should also be completed to determine the amount of budget and schedule management reserve needed. As the project progresses through the remaining life cycle phases, periodic reviews should be made to determine whether other changes within the project are necessary. (See Section 4.2.2 for an overview of the planning process.)
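To illustrate the kind of bottom-up estimating and reserve sizing described above, the following Python sketch rolls leaf-level cost estimates up a product-oriented WBS and sizes a management reserve from a simple probability-times-impact risk exposure. The WBS element names, cost figures, and risk entries are illustrative assumptions, not values taken from this document.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class WbsElement:
    """One element of a product-oriented WBS (hypothetical example)."""
    name: str
    cost_estimate: float = 0.0          # leaf-level estimate ($K); 0 for parent elements
    children: List["WbsElement"] = field(default_factory=list)

    def rolled_up_cost(self) -> float:
        """Bottom-up estimate: this element plus all lower-level elements."""
        return self.cost_estimate + sum(c.rolled_up_cost() for c in self.children)

@dataclass
class Risk:
    """Simple risk entry used to size budget management reserve."""
    title: str
    probability: float     # 0.0 - 1.0
    cost_impact: float     # $K if the risk is realized

def management_reserve(risks: List[Risk]) -> float:
    """Expected-value reserve: sum of probability-weighted cost impacts."""
    return sum(r.probability * r.cost_impact for r in risks)

# Illustrative (hypothetical) project data
wbs = WbsElement("Payload System", children=[
    WbsElement("Structure", cost_estimate=450.0),
    WbsElement("Avionics", children=[
        WbsElement("Flight Computer", cost_estimate=300.0),
        WbsElement("Flight Software", cost_estimate=220.0),
    ]),
    WbsElement("Integration & Test", cost_estimate=180.0),
])

risks = [
    Risk("Late sensor delivery", probability=0.3, cost_impact=150.0),
    Risk("Qualification test failure", probability=0.2, cost_impact=400.0),
]

print(f"Bottom-up baseline: ${wbs.rolled_up_cost():.0f}K, reserve: ${management_reserve(risks):.0f}K")
```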

2.8.3 Program Operating Plan (Project Baseline Updates)

Project management and planning efforts may be used to assess Center or directorate-level support efforts and take corrective action, if necessary. Every year, each project is required to submit to the program/Enterprise a budget estimate updating life cycle and phased funding, schedule, and workforce requirements. The program/Enterprise consolidates the various project budget estimates into a POP. The POP covers all aspects of the project and is the latest estimate of the funding needed to achieve baseline scope and schedule.

The development process for the POP begins with the issuance of POP guidelines. The project manager then conducts a review of the entire project. Included in this review are an assessment of any requirements changes, an assessment of the previous year's plan vs. the project's actual performance, major project risks and their potential cost and/or schedule impacts, and any readjustments from the original project life cycle based on unforeseen funding changes. However, any changes to requirements, schedule, and resulting budget, as well as the risk profile and reserve posture, should be managed through an internal project change process and be reflected as changes to funding needs. The resulting funding needs are what is presented annually as part of the POP process.

Center management reviews the POP submittal for consistency and compliance with Center commitments and responsibilities. The POP is then submitted to the programs and Headquarters for review, comment, and eventual approval. Through this review process and subsequent POP marks, current operating plans and future-year funding, workforce, and schedule requirements are established for each project.
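The following is a minimal sketch of how per-project phased funding submittals might be consolidated into a program-level POP total by fiscal year. The project names, fiscal years, and dollar figures are hypothetical and only illustrate the consolidation step described above.

```python
from collections import defaultdict
from typing import Dict

# Hypothetical per-project phased funding submittals ($K by fiscal year)
project_submittals: Dict[str, Dict[str, float]] = {
    "Project Alpha": {"FY05": 1200.0, "FY06": 1500.0, "FY07": 900.0},
    "Project Beta":  {"FY05": 400.0,  "FY06": 650.0,  "FY07": 700.0},
}

def consolidate_pop(submittals: Dict[str, Dict[str, float]]) -> Dict[str, float]:
    """Sum each project's phased funding needs into a program-level POP by fiscal year."""
    pop: Dict[str, float] = defaultdict(float)
    for phased_funding in submittals.values():
        for fiscal_year, amount in phased_funding.items():
            pop[fiscal_year] += amount
    return dict(pop)

for fy, total in sorted(consolidate_pop(project_submittals).items()):
    print(f"{fy}: ${total:.0f}K")
```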

2.8.4 Customer-Project Agreement

The customer-project agreement documents the requirements on the project related to how it will do business relative to the customer and the expectations for the product or service supplied (usually by reference to an attached specification). The project shall ensure that there is a written document of the customer-project agreement, agreed to and signed by both parties. Parties should be aware that the degree of clarity and definition of the customer-project agreement directly influences the success of the project. Several forms of customer-project agreements may exist. For technical agreements, the minimum requirements listed below may be implemented in the form of a contract, a letter of agreement, or documented according to a customer-specific template. The customer-project agreement should address, at a minimum:

• Project needs, goals, objectives, assumptions, and constraints (the context for the project).

• Product or service requirements, including performance, interfaces, quality, environmental and other constraints, and verification and validation requirements.

• Requirements on the conduct/implementation of the project. Includes the content, level, and frequency of reporting; data, H/W, and S/W deliverables; schedule information to support monitoring and incorporation into higher-level (e.g., program) schedules; process standards; and customer and project roles relative to the major technical reviews.

• Resources provided by the customer or other organizations to the project such as funding, personnel, equipment, facilities, and materials.

• Resources devoted by the supplier organization (the project) to accomplish the project, such as personnel, equipment, facilities, materials, and subcontracts.

The project agreement with the customer will flow down into the project PMP, either directly or as the implementing elements necessary to meet the agreement. While the elements of the customer-project agreement will be included in the PMP, the project benefits from having a separate agreement with the customer. In that way, the customer's approval is not required to change elements of the PMP that do not directly impact the customer.
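As one way of making the minimum agreement contents listed above concrete, the sketch below captures them as a simple structured record with a completeness check. The field names, example entries, and the check itself are illustrative assumptions, not a prescribed agreement format.

```python
from dataclasses import dataclass, field, fields
from typing import List

@dataclass
class CustomerProjectAgreement:
    """Minimum content areas of a customer-project agreement (illustrative only)."""
    needs_goals_objectives: str = ""        # context for the project
    product_requirements: str = ""          # performance, interfaces, quality, V&V
    conduct_requirements: str = ""          # reporting, deliverables, schedules, reviews
    customer_provided_resources: List[str] = field(default_factory=list)
    project_provided_resources: List[str] = field(default_factory=list)
    signed_by_customer: bool = False
    signed_by_project: bool = False

    def missing_items(self) -> List[str]:
        """Return the content areas that are still empty, plus missing signatures."""
        gaps = [f.name for f in fields(self)
                if isinstance(getattr(self, f.name), (str, list)) and not getattr(self, f.name)]
        if not (self.signed_by_customer and self.signed_by_project):
            gaps.append("signatures")
        return gaps

# Hypothetical, partially completed agreement
agreement = CustomerProjectAgreement(
    needs_goals_objectives="Provide a flight-qualified sensor package for ISS utilization.",
    customer_provided_resources=["GFE test facility", "Phase B funding"],
)
print("Open items:", agreement.missing_items())
```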


Chapter 3 Project Life Cycle Requirements

This chapter outlines requirements for Johnson Space Center (JSC) projects as they relate to the products, reviews, and management decisions that are expected for each phase of the project development life cycle.* A discussion of each life cycle phase includes:

• Overview – Provides a top-level summary narrative of major project activities accomplished during the life cycle phase.

• Expected Results/Outcomes – Lists the major results and outcomes that the project is expected to accomplish during the life cycle phase.

• Products – Lists the major products that the project is required or expected to produce during the life cycle phase.

• Process – Provides a flow diagram that depicts the sequence of major activities that occurs during this phase, and shows how the project management processes support these activities. NOTE: For an in-depth discussion of the process steps, refer to Special Publication (SP) 6105, NASA Systems Engineering Handbook.

• Reviews – Lists the project-level life cycle reviews that are performed during this life cycle phase. This discussion includes the purpose of the review; the objectives the project team should accomplish as part of the review; checklists to aid in determining successful completion of the review; and decisions that are made at the project level based on information from the review (Section 4.1.16, Reviews).

• Management Decisions – Outlines any management decisions made outside of the project team during the life cycle phase.

*Refer to Section 2.4 for a discussion of the project life cycle for facilities and associated support projects.


3.1 Pre-Phase A – Advanced Studies1

3.1.1 Overview

Pre-Phase A, Advanced Studies, is performed to develop an understanding of project needs and expectations, and to identify feasible project alternatives to fulfill these needs and expectations. This phase initiates the project development life cycle. Involvement from the project customer is essential at this stage to scope the project. The project customer provides a statement of project needs, goals, and objectives, and discloses project constraints and assumptions. This phase results in a feasibility assessment of whether project needs can be accomplished within the constraints and assumptions provided by the customer.

3.1.2 Expected Results/Outcomes

During Pre-Phase A, the project team should:

• Confirm customer project needs.
• Assess the technical and programmatic feasibility of the project.
• Define objectives and top-level functional and performance requirements.
• Produce a feasibility assessment report.
• Conduct a concept review.

3.1.3 Products

The feasibility assessment report shall be the primary product of this phase. It documents the studies completed during Pre-Phase A to assess project feasibility. The contents of this report are reviewed during the Concept Review (CR). The following are representative contents of this report:

• Project needs, goals, and objectives and relevance to the Center Implementation Plan
• Project assumptions, guidelines, and constraints
• Top-level functional and performance requirements
• Documentation of all system and functional concepts and architectures considered, and rationale for selection or rejection
• Operations concepts
• Evaluation criteria (also known as figures of merit or measures of effectiveness)
• Performance measures (cost, schedule, and technical)
• Life cycle schedule and cost estimates
• Technology development requirements
• Trades and analysis results
• A review and report on applicable lessons learned

3.1.4 Process

The diagram (FIG. 3-1) at the top of the following page indicates the steps in this phase of the project life cycle that should be accomplished when evolving the system of interest. Interaction of these steps and the associated reviews is also illustrated, along with the project management (PM) processes that should be referenced in executing each activity.

3.1.5 Reviews – Concept Review

A CR, the purpose of which is to validate project needs and objectives and to examine concepts for meeting those objectives, shall be performed to exit Pre-Phase A. At the review, the project team should be prepared to:

• Demonstrate that project objectives are complete and understandable.

• Confirm that proposed concepts demonstrate technical and programmatic feasibility for meeting stakeholder needs and objectives.

• Confirm that customer project needs are clear and achievable.

• Ensure that prioritized evaluation criteria (also known as figures of merit or measures of effectiveness) are provided for subsequent analysis.

• Identify skills needed for the next phase.

The following checklist is provided to aid in determining successful completion of the review:

• Has the project need been clearly identified?
• Are the project objectives clearly defined and stated? Are they unambiguous and consistent?
• Are project assumptions and constraints clearly defined?
• Will satisfaction of the preliminary set of requirements provide a system that will meet stakeholder needs and objectives?
• Have the concept evaluation criteria that are to be used in the candidate system's evaluation been identified and prioritized? (A weighted-scoring sketch follows this checklist.)
• Is the project feasible? Has at least one solution been identified that is technically feasible, and that meets stated assumptions and constraints? Is the rough cost estimate within the acceptable cost range?
• Are schedule and cost estimates credible?
• Was a technology search done to identify existing assets that could satisfy the project or part of the project?
• Were all applicable lessons learned identified, evaluated, and incorporated to the maximum extent possible?
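The following is a minimal sketch, using hypothetical concepts and criteria weights, of how prioritized evaluation criteria (figures of merit or measures of effectiveness) might be applied to score candidate concepts during the down-select described in this phase. It is illustrative only and not a prescribed JSC method.

```python
from typing import Dict

# Hypothetical prioritized evaluation criteria and their weights (sum to 1.0)
criteria_weights: Dict[str, float] = {
    "performance": 0.4,
    "life_cycle_cost": 0.3,
    "technical_risk": 0.2,
    "schedule": 0.1,
}

# Hypothetical candidate concepts scored 1 (poor) to 5 (excellent) against each criterion
concept_scores: Dict[str, Dict[str, float]] = {
    "Concept A": {"performance": 4, "life_cycle_cost": 3, "technical_risk": 2, "schedule": 4},
    "Concept B": {"performance": 3, "life_cycle_cost": 5, "technical_risk": 4, "schedule": 3},
}

def weighted_score(scores: Dict[str, float], weights: Dict[str, float]) -> float:
    """Figure of merit: weighted sum of criterion scores for one concept."""
    return sum(weights[criterion] * scores[criterion] for criterion in weights)

# Rank concepts from highest to lowest figure of merit to support the down-select
ranked = sorted(concept_scores,
                key=lambda c: weighted_score(concept_scores[c], criteria_weights),
                reverse=True)
for concept in ranked:
    print(f"{concept}: {weighted_score(concept_scores[concept], criteria_weights):.2f}")
```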

Project Decision – The project manager decides whether the project is ready to recommend authorization to proceed to the preliminary analysis phase.

3.1.6 Management Decision

The decision to proceed from Pre-Phase A, Advanced Studies, to Phase A, Preliminary Analysis, shall be provided by a directorate-level board or its delegate.


For projects that involve multiple directorates, approval to proceed shall be obtained from all directorate-level boards. For management decisions concerning entry into this phase, please refer to the PMC discussion in Section 2.5.3.3.

3.1.7 References

The following document, which was used to prepare this section, offers additional insights into Pre-Phase A, Advanced Studies:

1. SP 6105, NASA Systems Engineering Handbook, 1995.

[Figure 3-1. Pre-Phase A advanced studies diagram. The flow diagram shows the Pre-Phase A steps (1.1 Refine User Needs & Objectives; 1.2 Refine Constraints & Assumptions; 1.2a Assess Project Feasibility ("bid/no bid" decision); 1.2b Prepare PMC Proposal Approval Form; 1.3 Develop Top-Level Requirements; 1.4 Develop Top-Level Functional Concept; 1.5 Develop Evaluation Criteria; 1.6 Flowdown Top-Level Systems Requirements; 1.7 Develop Feasible Systems Concept(s); 1.8 Allocate Requirements; 1.9 Analyze & Evaluate; 1.10 Synthesize & Down Select), the supporting PM processes for each step, the Concept Review, PMC approval (refer to Section 2.5.3.3, JSC Project Management Council), and the management decision/control gates.]


3.2 Phase A – Preliminary Analysis1,2

3.2.1 Overview

Phase A, Preliminary Analysis, is performed to examine further the feasibility and desirability of a suggested new system and its compatibility with NASA strategic plans. During this phase, requirements and concepts are iterated to establish optimal system requirements and top-level system architecture.

3.2.2 Expected Results/Outcomes

During Phase A, the project team should:

• Refine both top-level functional and performance requirements and corresponding evaluation criteria and metrics.
• Develop and refine both system-level requirements and corresponding evaluation criteria and metrics.
• Develop interface requirements to external systems, if applicable.
• Identify alternative operations and logistics concepts.
• Demonstrate that credible, feasible system architecture designs exist.
• Examine alternative system architectural design concepts.
• Establish an optimized system architecture.
• Identify technical and technology risks, and outline mitigation plans.
• Initiate environmental impact studies, if applicable.
• Refine resource need estimates for the project.
• Produce a needs statement.
• Produce a top-level operations concept.
• Produce a preliminary set of project plans, including preliminary versions of the project management plan (PMP), systems engineering (SE) management plan, risk management plan, technology development plan, and logistics plan.

3.2.3 Products

The project definition package shall be the primary product of Phase A. This package documents the activities conducted during this phase to define project requirements, system architecture, and preliminary plans. The following are representative contents of this package:

• A needs statement for the project
• Project constraints and system boundaries
• Top-level project and system requirements with corresponding evaluation criteria and metrics (also known as figures of merit or measures of effectiveness)
• A list of credible, feasible system architecture designs considered
• Alternative operations and logistics concepts
• Feasibility studies
• Recommended system architecture
• Advanced technology requirements
• Risk studies, including mitigation plans
• Cost and schedule estimates
• A preliminary set of project plans, including preliminary versions of the PMP, SE management plan, risk management plan, technology development plan, and logistics plan

3.2.4 Process

The diagram at the top of the following page (FIG. 3-2) indicates the steps in this phase of the project life cycle that should be accomplished in the course of evolving the system of interest. Interaction of these steps and the associated reviews is also illustrated, along with the PM processes that should be referenced in executing each activity.

3.2.5 Reviews

3.2.5.1 The Requirements Review

A Requirements Review (RR) shall be conducted, usually early in Phase A, to determine whether the requirements development and the requirements management functions are sufficiently established to proceed with determining the optimum system architecture. At the RR, the project team should be prepared to:

• Demonstrate that project requirements are complete and understandable.

• Demonstrate that prioritized evaluation criteria are consistent with requirements and the operations and logistics concepts.
• Confirm that requirements and evaluation criteria are consistent with customer needs.
• Demonstrate that operations and architecture concepts support mission needs, goals, and objectives; assumptions, guidelines, and constraints; and project requirements.
• Demonstrate that the process for managing change in requirements is established, documented in the project information repository, and communicated to stakeholders.

Project Decision – Based on the results of the RR, the project manager shall decide whether to proceed with steps toward the establishment of optimum system architecture.

3.2.5.2 The Definition Review

A successful Definition Review (DR) shall be conducted prior to exiting Phase A. The purpose of the DR is to determine whether to proceed with developing the proposed system architecture design and any technology needed to accomplish project goals.


Review results should reinforce project merit and provide a basis for the system acquisition strategy. At the DR, the project team should be prepared to:

• Establish that the proposed system architecture design and the allocation of functional system requirements are optimized to satisfy project objectives with respect to the requirements trades and evaluation criteria established at the CR.

• Demonstrate that system requirements meet project objectives.

• Identify both technical and technology risks, and the plans to mitigate those risks.

• Present refined cost, schedule, and personnel resource estimates.

The following checklist should be used to aid in determining successful completion of the DR:

• Will the selected system architecture design meet system requirements and satisfy project objectives and stated needs?
• Are cost and schedule estimates in the preliminary plans realistic in view of system requirements and the selected architecture?
• Are system-level requirements complete, consistent, and verifiable? Have preliminary allocations been made to lower levels?
• Have the requirements trades converged on an optimized set of system requirements?
• Do the trades address project cost and schedule constraints as well as technical needs?
• Do the trades cover a broad spectrum of options?
• Have the remaining trades been identified to select a proposed system architecture design?
• Are plans in place for the design and development phases?
• Are the upper levels of the system product breakdown structure completely defined?
• Are decisions made as a result of the trades consistent with evaluation criteria established at the CR?
• Have major design issues been identified for the elements?
• Have technical and technology risks been identified, and have mitigation plans been developed? (A simple risk-register sketch follows this checklist.)
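The sketch below illustrates one simple way to record the technical and technology risks and mitigation plans called for above, as a small risk register with a likelihood-consequence ranking. The scoring scheme and the entries are illustrative assumptions, not a JSC-prescribed format.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class RiskEntry:
    """One row of an illustrative risk register."""
    risk_id: str
    description: str
    likelihood: int      # 1 (very unlikely) to 5 (near certain)
    consequence: int     # 1 (minimal) to 5 (catastrophic)
    mitigation_plan: str

    @property
    def score(self) -> int:
        """Simple likelihood x consequence ranking score."""
        return self.likelihood * self.consequence

register: List[RiskEntry] = [
    RiskEntry("R-01", "Sensor technology below required maturity", 4, 4,
              "Fund a parallel technology demonstration in Phase B"),
    RiskEntry("R-02", "Single-source supplier for radiation-hard parts", 2, 5,
              "Qualify an alternative vendor; hold schedule reserve"),
]

# Report risks from highest to lowest score so the top risks drive mitigation priority
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    print(f"{risk.risk_id} (score {risk.score}): {risk.description} -> {risk.mitigation_plan}")
```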

[Figure 3-2. Phase A preliminary analysis diagram. The flow diagram shows the Phase A steps (2.1 Refine Top-Level Project Requirements; 2.2 Refine Mission & Operations Concept(s); 2.3 Develop & Refine Evaluation Criteria; 2.4 Define & Refine System Requirements; 2.5 Develop Alternative System Architecture & Concepts; 2.6 Allocate Requirements to Elements; 2.7 Analyze & Evaluate Architectures & Concepts; 2.8 Synthesize & Down Select; Develop Tools and Methods), the supporting PM processes for each step, the Concept Review, Requirements Review, and Definition Review, and the management decision/control gates.]


Project Decision – The project manager decides whether the project is ready to proceed to Phase B, Definition, for developing the system architecture/design and any technology needed to accomplish the goals of the project. If the project is deemed ready, authorization is requested to proceed to Phase B, Definition.

3.2.6 Management Decision

The directorate-level board(s) or their delegates shall authorize the project to proceed to Phase B, Definition, to develop the system architecture/design and any technology needed to accomplish project goals.

3.2.7 References

The following documents, which were used to prepare this section, offer additional insights into Phase A, Preliminary Analysis:

1. SP 6105, NASA Systems Engineering Handbook, 1995.
2. NPR 7120.5B, NASA Program and Project Management Processes and Requirements, 2002.


3.3 Phase B – Definition1,2

3.3.1 Overview

Phase B, Definition, of the project life cycle is performed to establish an initial project "design-to" baseline. The baseline includes a formal flowdown of project-level performance requirements to a complete set of design specifications for the system of interest down to the lower levels of its architecture. During this phase, technical requirements are sufficiently detailed to establish firm schedule and cost estimates for the project.

Early in Phase B, the effort focuses on system requirements analysis and system definition and selection, allocating functions to particular items of hardware (H/W), software (S/W), personnel, etc. System functional and performance requirements, along with architectures and designs, are solidified as system of interest trade studies and subsystem trade studies iterate in an effort to seek out more cost-effective designs.

Later in Phase B, the effort shifts to establishing a functionally complete, preliminary design solution that meets project goals and objectives. Trade studies continue. Interfaces among major end items are defined. Engineering test items may be developed and used to derive data for further design work, and project risks are reduced by successful risk mitigation efforts, including technology developments and demonstrations. Phase B culminates in a series of Preliminary Design Reviews (PDRs), containing the system of interest PDR and PDRs for lower-level end items, as appropriate.

3.3.2 Expected Results/Outcomes

During Phase B, the project team should:

• Demonstrate that the system of interest architecture and design are acceptable, complete, and optimized relative to measures of effectiveness (e.g., figures of merit, performance goals, etc.), and that all requirement allocations to functional elements are complete.

• Show that the system of interest design is correct and can be built to meet project needs, goals, and objectives.

• Show that all major project risks have mitigation plans in place.

• Show that all requirements are complete and consistent.

• Show that the design meets cost and schedule constraints.

3.3.3 Products

A functionally complete solution that meets project goals and objectives shall be produced as the primary product of Phase B. This solution shall be expressed and controlled as a "design-to" specification at all levels of the system architecture.

A number of products should be prepared in support of the solution, including:

• New or revised plans, such as the PMP, SE management plan, risk management plan, and other supporting plans such as specialty engineering plans, safety plans, S/W management plans, test plans, and verification and validation plans.
• System of interest functional requirements defined to the lowest level as appropriate, including internal and external interfaces (interface control documents (ICDs))
• Trade studies and feasibility analyses
• Verification requirements matrix (a minimal sketch of such a matrix follows this list)
• Baseline concept of operation
• Work breakdown structure (WBS) and dictionary
• Statement(s) of work (SOW(s))
• System of interest cost-effectiveness model, technical resource estimates, and life cycle cost estimates
• Technology development test results
• Acquisition strategies and requirements
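As an illustration of what a verification requirements matrix captures, the sketch below maps each requirement to a verification method (test, analysis, inspection, or demonstration) and flags requirements with no assigned method. The requirement identifiers and assignments are hypothetical.

```python
from enum import Enum
from typing import Dict, List, Optional

class Method(Enum):
    TEST = "Test"
    ANALYSIS = "Analysis"
    INSPECTION = "Inspection"
    DEMONSTRATION = "Demonstration"

# Hypothetical verification requirements matrix: requirement ID -> assigned method (or None)
verification_matrix: Dict[str, Optional[Method]] = {
    "SYS-001": Method.TEST,
    "SYS-002": Method.ANALYSIS,
    "SYS-003": None,                 # not yet assigned
    "SYS-004": Method.DEMONSTRATION,
}

def unassigned_requirements(matrix: Dict[str, Optional[Method]]) -> List[str]:
    """Return requirement IDs that still lack a verification method."""
    return [req_id for req_id, method in matrix.items() if method is None]

print("Requirements without a verification method:", unassigned_requirements(verification_matrix))
```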

3.3.4 Process

Figures 3-3.1 and 3-3.2, which appear at the top of the following pages, indicate the steps in Phase B that should be accomplished when evolving the system of interest. The interaction of these steps and their associated reviews is also illustrated in the figures, along with the PM processes that should be referenced in executing each activity.

3.3.5 Reviews

3.3.5.1 The System Requirements Review

The System Requirements Review (SRR) shall be conducted early in Phase B. The objective of this review is to confirm that system-level requirements and specifications are sufficient to meet project objectives. At the SRR, the project team should be prepared to demonstrate that system-level requirements and specifications meet project objectives.

The following checklist is provided to aid in determining successful completion of the SRR:

• Are allocations contained in the system of interest specifications sufficient to meet project objectives?

• Are evaluation criteria established and realistic?
• Are measures of effectiveness established and realistic?
• Are cost estimates established and realistic?
• Has a system verification concept been identified?
• Are appropriate plans being initiated to support project system development milestones?


• Have technology development issues been identified, along with approaches to their solution?
• Have specialty engineering considerations been incorporated into the system-level requirements and specifications?

Project Decision – At the SRR, a decision is reached regarding whether the design team has demonstrated sufficient understanding of the mission for the system of interest, programmatic issues, and potential system of interest implementations to enable a commitment to the feasibility of the desired capability. Successful completion of the SRR freezes project requirements and leads to a formal decision to proceed with preparations for project implementation. Complicated systems of interest with multiple elements may require multiple SRRs. The decision to proceed may result in the preparation of a request for a formal project start decision.

3.3.5.2 The System Definition Review

A System Definition Review (SDR) shall be conducted to exit design selection and enter preliminary design. An SDR examines the proposed system architecture/design and the flowdown to all functional elements of the system to determine whether to proceed with developing the selected design and any technology needed to accomplish project goals. At the SDR, the project team should be prepared to:

• Demonstrate that the architecture/design is acceptable, requirements allocation is complete, and a system that fulfills project objectives can be built within constraints.

• Ensure that the verification concept and preliminary verification program are defined.
• Have established end item acceptance criteria.
• Ensure that adequate detailed information exists to support initiation of further development or acquisition efforts.

[Figure 3-3.1. Phase B system definition diagram. The flow diagram shows the Phase B system definition steps (3.1 Mission & Requirements Analysis; 3.2 Develop System Evaluation Criteria; 3.3 Flowdown & Refine Requirements; 3.4 Develop & Refine Prime/Critical Item Concepts; 3.5 Allocate Requirements to End Items; 3.6 Evaluate & Analyze System End Item Concepts; 3.7 Synthesize/Select Optimal Option; Develop Technology; 3.11 Develop & Refine Tools & Methods; Integrate), the supporting PM processes for each step, the Definition Review, System Requirements Review, and System Definition Review(s), and the management decision/control gates.]


The following checklist is provided to aid in determining successful completion of the SDR:

• Will the top-level system design selected meet system requirements, satisfy project objectives, and address operational needs?

• Can the selected top-level system design be built within cost constraints and in a timely manner?

• Have all system-level requirements been allocated to one or more lower levels?
• Have major design issues for the elements and subsystems been identified? Have major risk areas been identified and addressed with mitigation plans?
• Have plans been completed to control the development design process?
• Is a development verification/test plan in place to provide data for making informed design decisions? Is the minimum end item product performance documented in the acceptance criteria?

• Is there sufficient information to support contract proposal efforts? Is there a complete, validated set of requirements with sufficient system definition to support cost and schedule estimates (if required)?

Project Decision – At the SDR, a decision is made that indicates whether the system of interest and its operation are well enough understood to warrant design and acquisition of the end items. Successful completion of the SDR releases both approved specifications for the system of interest and its elements and preliminary specifications for the design of appropriate functional elements. Authorization to proceed with preliminary design activities is requested.

3.3.5.3 The Preliminary Design Review

A PDR shall be conducted for the system of interest and for each of its major subsystems and/or configuration items (CIs). The PDR is not a single review but is a number of reviews that includes the system of interest PDR and PDRs conducted on sub-elements of the system. The PDR demonstrates that the preliminary design meets all system requirements with acceptable risk; shows that the correct design option has been selected, interfaces have been identified, and verification methods have been satisfactorily described; and establishes the basis for proceeding with detailed design. At the PDR, the project team should:

• Ensure that all system requirements have been allocated, the requirements are complete, and flowdown is adequate to verify system performance.

[Figure 3-3.2. Phase B preliminary design diagram. The flow diagram shows the Phase B preliminary design steps (4.1 Analyze & Refine Requirements; 4.2 Perform Design Analysis; 4.3 Perform Engineering Development Tests; 4.4 Define Interfaces; 4.5 Perform Preliminary Design; 4.6 Evaluate, Verify, & Validate Design; 4.7 Complete Plans & Documentation for Qualification Items), the integration of systems, elements, and lower architectural elements, the supporting PM processes for each step, the System Definition Review(s), and the management decision/control gates.]


• Show that the proposed design is expected to meet the functional and performance requirements at the CI level.
• Show sufficient maturity in the proposed design approach to proceed to final design.
• Show that the design is verifiable and the risks have been identified, characterized, and mitigated, where appropriate.

The following checklist is provided to aid in determining successful completion of the PDR:

• Can the proposed preliminary design meet all requirements within planned cost and schedule?
• Have all external interfaces been identified?
• Have all system and segment requirements been allocated down to the CI level?
• Are all CI "design-to" specifications complete and ready for formal approval and release?
• Has an acceptable operations concept been developed?
• Does the proposed design satisfy requirements critical to human safety and project success?
• Do the human factors considerations of the proposed design support the intended end users' ability to perform and operate the system effectively?
• Have production, verification, operations, and other specialty engineering organizations reviewed the design?
• Is the proposed design producible? Have long lead items been considered?
• Do specialty engineering project plans and design specifications provide sufficient guidance, constraints, and systems requirements for design engineers to execute the design?
• Is the reliability analysis based on sound methodology, and does it allow for realistic logistics planning and life cycle cost analysis?
• Are sufficient project reserves and schedule slack available to proceed further?
• Are the integrated logistics support analyses mature enough, and are they realistic?
• Has the Safety and Mission Assurance Review Team approved the Phase I safety data package?

Project Decision – After the PDR is successfully completed, the "design-to" baseline and the Phase I safety data package are approved. Authorization is requested for the project to proceed to the design phase.

3.3.6 Management Decision

There are three decision points in Phase B. The SRR is sometimes referred to as an interim review rather than a "control gate" review. For large projects, the decision that is reached following the SRR is to begin/stop preparing for release of a request for proposal (RFP) for Phase B studies.

For smaller projects, the directorate-level decision may be to proceed with a commitment of in-house resources to Phase B project implementation.

The first control gate decision follows the SDR. For projects requiring Center PMC authority to proceed, an Engineering Review Board (ERB) decision is reached regarding whether the system and its operation are well enough understood to warrant design and acquisition of the end items. Specifications are approved for the system, as well as preliminary specifications for the design of appropriate elements. Plans to control and integrate the expanded technical process are in place. JSC PMC approval for a project approved by the ERB at this SDR control gate may be obtained outside of board (OSB).

At the conclusion of the PDR is another control gate management decision point. This one approves the "design-to" baseline and authorizes the project to proceed to the design phase. The "design-to" baseline may include preliminary engineering design drawings, end-item specifications, preliminary ICDs, and preliminary S/W specifications. As with the SDR control gate, a decision to proceed into Phase C, Design, should be obtained from the directorate-level board or its delegate. The decision may also be presented to the ERB on request.

3.3.7 References

The following documents, which were used to prepare this section, offer additional insights into Phase B, Definition:

1. SP 6105, NASA Systems Engineering Handbook, 1995.
2. NPR 7120.5B, NASA Program and Project Management Processes and Requirements, 2002.


3.4 Phase C – Design1,2

3.4.1 Overview

Phase C, Design, of a project establishes a complete "build-to" design baseline that is ready to fabricate (or code), integrate, and verify. Trade studies continue. Engineering test units, which more closely resemble actual H/W, are built and tested to establish confidence that the design will function in the expected environments. Engineering specialty analysis results are integrated into the design, and the manufacturing process and controls are defined and validated. At each step in the successive refinement of the final design, corresponding integration and verification activities are planned in greater detail.

Phase C culminates in a series of Critical Design Reviews (CDRs) that contain the system of interest-level CDR and CDRs corresponding to different levels of the system hierarchy. At this point, a detailed design of the system of interest and its associated subsystems, including its operations systems, is complete and ready to be released.

3.4.2 Expected Results/Outcomes

During Phase C, the project team should:

• Establish a “build-to” baseline of the system of interest.

• Verify that the detailed design meets system of interest requirements.

• Ensure that verification and acceptance testing requirements are satisfied.

• Ensure that system of interest performance and operations requirements are satisfied.

3.4.3 Products

During Phase C, a completed, detailed design of the system of interest and all its components – including subsystems, testing systems, and operations systems – shall be expressed and controlled as the "build-to" baseline. This baseline should include:

• Build-to specification(s) at all levels of the system architecture

• A baseline of all requirements and designs, including traceability to all levels

• Baseline updates to system architecture, verification requirements matrix, WBS, project plans, etc., reflecting project maturity
• Baselined system integration, operations, and manufacturing plans
• Completed and archived trade studies
• A refined integrated logistics support plan
• Refined verification and validation plans

3.4.4 Process

The diagram (FIG. 3-4) at the top of the following page indicates the steps that should be accomplished during this phase of the project life cycle. The interaction of these steps and the associated reviews is also illustrated, along with the PM processes that should be referenced to execute each activity.

3.4.5 Reviews – The Critical Design Review

A CDR shall be conducted for the system of interest and each of its major subsystems and/or CIs. The CDR is not a single review but is a number of reviews that includes the system of interest CDR and CDRs conducted on specific CIs. The CDR discloses the complete system of interest design in full detail; ascertains that technical problems and design anomalies have been resolved; and ensures that the design maturity justifies the decision to initiate fabrication/manufacturing (or coding), integration, and verification of project H/W and S/W. At the CDR, the project team should be prepared to:

• Ensure that the “build-to” baseline contains detailed H/W and S/W specifications that can meet functional and performance requirements.

• Ensure that the design has been satisfactorily audited by specialty engineering organizations.
• Ensure that production processes and controls are sufficient to proceed to fabrication (or coding).
• Provide evidence that planned quality assurance (QA) activities will establish perceptive verification and screening processes to produce a quality product.

• Verify that the final design fulfills specifications established at the PDR.

The following checklist is provided to aid in determining successful completion of the CDR:

• Can the proposed final design be expected to meet all requirements within planned cost and schedule?

• Is the design complete? Are drawings ready to begin production? Is the S/W product definition sufficiently mature to start coding?

• Is the "build-to" baseline sufficiently traceable to assure there are no orphan requirements? (A simple traceability-check sketch follows this checklist.)
• Has all qualification testing been completed?
• Are all internal interfaces completely defined and compatible? Are external interfaces current?
• Are integrated safety analyses complete? Do these analyses show that identified hazards have been controlled, or has the appropriate authority waived the remaining risks that cannot be controlled?
• Are production plans in place and reasonable?
• Are there adequate quality checks in the production process?


• Are the logistics support analyses adequate to identify integrated logistics support resource requirements?

• Are comprehensive system integration and verification plans complete?

• Has the Safety and Mission Assurance Review Team approved the Phase II safety data package?

Project Decision – As a result of successful CDR completion, the "build-to" baseline, production plans, and verification plans are approved. Approved drawings are released and authorized for fabrication. The decision is made to authorize coding of deliverable S/W and system qualification testing and integration. All open issues are resolved with closure actions and schedules. The Phase II safety data package has been approved. Authorization to proceed to Phase D is requested.

3.4.6 Management Decision

The affected directorate-level board(s) provide authorization to initiate fabrication/manufacturing (or coding), integration, and verification of project H/W and S/W.

3.4.7 References

The following documents, which were used to prepare this section, offer additional insights into Phase C, Design:

1. SP 6105, NASA Systems Engineering Handbook, 1995.
2. NPR 7120.5B, NASA Program and Project Management Processes and Requirements, 2002.

[Figure 3-4. Phase C design diagram. The flow diagram shows the Phase C steps (5.1 Define & Control Detailed Design Interfaces; 5.2 Perform Detailed Design; 5.3 Perform Engineering Test; 5.4 Fabricate/Test Qualification Items; 5.5 Evaluate, Verify, & Validate Design; 5.6 Complete Detailed Design & Production Plans), the integration of systems, elements, and lower architectural elements, the supporting PM processes for each step, the Preliminary Design Review(s) and Critical Design Review(s), and the management decision/control gates.]


3.5 Phase D – Development1,2

3.5.1 Overview

The purpose of Phase D, Development, is to build and verify the system of interest designed in the previous phase, deploy it, and prepare it for operations. Activities include fabrication (e.g., H/W, S/W, facilities construction), integration, verification, and deployment of the system of interest. Additional activities include initial training of operating personnel and implementation of the logistics support plans. The major product is the system of interest that has been shown to be capable of accomplishing the purpose for which it was created.

3.5.2 Expected Results/Outcomes

During Phase D, the project team should:

• Build, i.e., fabricate (or code), the end items (i.e., the lowest-level items in the system of interest architecture).
• Integrate end items according to the integration plan and perform verifications, thereby yielding verified components and subsystems. Repeat this process of successive integration until a verified system is achieved. (A sketch of this successive integration follows this list.)
• Verify and validate the system; i.e., develop verification and validation procedures at all levels, and perform system of interest qualification verification(s) and acceptance verification(s) and validation(s).
• Prepare for operations; i.e., prepare the operators' manuals and maintenance manuals. Train initial system operators and maintainers, perform operational verification(s), audit "as-built" configurations, and finalize and implement an integrated logistics support plan.
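The recursive sketch below illustrates, using hypothetical assembly names and a stubbed verification step, the successive integration described above: end items are verified, integrated into assemblies, and each assembly is verified in turn until the top-level system is verified.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Assembly:
    """A node in the system of interest architecture (an end item if it has no children)."""
    name: str
    children: List["Assembly"] = field(default_factory=list)

def verify(item: Assembly) -> bool:
    """Stand-in for the verification activity (test, analysis, inspection, or demonstration)."""
    print(f"Verifying {item.name}")
    return True

def integrate_and_verify(item: Assembly) -> bool:
    """Successive integration: verify all children, then verify the integrated parent."""
    children_ok = all(integrate_and_verify(child) for child in item.children)
    return children_ok and verify(item)

# Hypothetical system of interest hierarchy
system = Assembly("Instrument System", children=[
    Assembly("Sensor Subsystem", children=[Assembly("Detector"), Assembly("Optics")]),
    Assembly("Avionics Subsystem", children=[Assembly("Flight Computer")]),
])

print("Verified system achieved:", integrate_and_verify(system))
```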

3.5.3 Products

The primary product of Phase D shall consist of the completed, verified, validated, and accepted system of interest and associated documentation necessary for normal operations. Example products include:

• Verified and validated H/W and S/W products
• "As-built" and "as-deployed" documentation
• Updated logistics support plans
• Operations plans and procedures, including operations, maintenance, and disposal procedures
• Training materials
• Documentation of lessons learned
• Verification, validation, and acceptance data
• Problem/failure reports

3.5.4 Process

The following diagrams (FIGS. 3-5.1, 3-5.2, and 3-5.3) indicate the steps in Phase D that should be accomplished in the course of evolving the system of interest from design to operational use. The interaction of these steps and the associated reviews is also illustrated, along with the PM processes that should be referenced in executing each activity.

During this phase of the life cycle, the system of interest is transformed through three distinct stages, each of which is bounded by control gates. These three stages are fabrication and integration, preparation for deployment, and deployment and operational verification.


[Figure 3-5.1. Phase D development – fabrication and integration stage diagram. The flow diagram shows the steps (6.1 Ready Production Facilities; 6.2 Fabricate/Assemble End Item; 6.3 Complete End Item Verification Test; 6.4 Complete Plans/Documentation for End Item; 6.5 Test/Verify End Item; 6.6 Assemble & Physically Integrate System; 6.7 Complete Test Plans & Documentation for System; 6.8 Complete Plans & Documentation for System; 6.9 Test/Verify System; 6.10 Perform Acceptance Testing), the supporting PM processes for each step, the Critical Design Review(s), Production Readiness Review(s), Test Readiness Review(s), and System Acceptance Review, and the management decision/control gates.]


[Figure 3-5.2. Phase D development – preparation for deployment stage diagram. The flow diagram shows the steps (7.1 Deliver/Install System; 7.2 Configure Hardware; 7.3 Configure Software; 7.4 Configure Support System; 7.5 Prepare Personnel; 7.6 Update Ops Plans & Procedures; 7.7 Complete Integrated Pre-Deployment Checkout), the supporting PM processes for each step, the System Acceptance Review and Deployment Readiness Review, and the management decision/control gates.]


3.5.5 Reviews

3.5.5.1 The Production Readiness Review

A Production Readiness Review (ProRR) shall be conducted to ensure that production plans, facilities, and personnel are in place and ready to begin production. The ProRR is conducted after the CDR and prior to the start of production. At the ProRR, the project team should be prepared to:

• Demonstrate that all significant production engineering problems encountered during development, including S/W problems, have been resolved.
• Ensure that the design documentation is adequate to support manufacturing, fabrication, or coding.
• Ensure that production plans and preparations are adequate to begin manufacturing, fabrication, or coding.
• Establish that adequate resources have been allocated to support end item production.

The following checklist should be used to aid in determining successful completion of the ProRR:

• Is the design baselined? Have incomplete design elements been identified?

• Have risks been identified and characterized and mitigation efforts defined?

• Has the bill of materials been reviewed, and have critical parts been identified?
• Have delivery schedules been verified?
• Have alternative sources been identified?
• Have adequate spares been planned and budgeted?
• Are the facilities and tools sufficient for end item production? Are special tools and test equipment specified in proper quantities?
• Are personnel qualified?
• Are drawings baselined?
• Is production engineering and planning mature enough for cost-effective production?
• Are production processes and methods consistent with quality requirements, and are they compliant with occupational safety, environmental, and energy conservation regulations?

Project Decision – A successful ProRR results in certification of production readiness by the project manager and involved specialty engineering organizations. The project manager recommends whether to proceed to production.

[Figure 3-5.3. Phase D development – deployment and operational verification stage diagram. The flow diagram shows the steps (8.1 Deploy; 8.2 Configure for Checkout and Operations; 8.3 Demonstrate Operational Capability), the supporting PM processes for each step, the Deployment Readiness Review and Operational Readiness Review, and the management decision/control gates.]


3.5.5.2 The Test Readiness Review

A Test Readiness Review (TRR) shall be conducted to ensure that all test article H/W and S/W, the test facility, ground support personnel, and test procedures are ready for integrated testing, data acquisition, reduction, and control. The TRR is conducted after system fabrication and integration, but prior to the start of a formal test cycle. At the TRR, the project team should be prepared to:

• Confirm that in-place test plans meet verification requirements and specifications.
• Confirm that sufficient resources are allocated to the test effort.
• Examine detailed test procedures for completeness and safety.

• Determine that critical test personnel are test- and safety-certified.

• Confirm that test support S/W is adequate, pertinent, and verified.

The following checklist should be used to aid in determining successful completion of the TRR:

• Have the test cases been reviewed and analyzed for expected results?

• Are the results consistent with the test plans and objectives?

• Have the test procedures been "dry run"? If so, do they indicate satisfactory operation?
• Have test personnel received training and certification, if required, in test operations and safety procedures?
• Are resources available to adequately support planned tests as well as contingencies, including failed H/W replacement?
• Has the test support S/W been demonstrated to handle test configuration assignments, data acquisition, reduction, control, and archiving?

Project Decision – A successful TRR results in certification of formal system test readiness by the project manager and involved specialty engineering organizations. The project manager decides whether to proceed with planned system verification and acceptance testing.

3.5.5.3 The System Acceptance Review

A System Acceptance Review (SAR) shall be conducted to examine the system, its end items and documentation, and the test data and analyses that support its verification. The SAR also ensures that the system has sufficient technical maturity to authorize its shipment to and installation at the launch site or intended operational facility. The SAR is conducted near completion of the system fabrication and integration stage, and prior to preparations for deployment. At the SAR, the project team should be prepared to:

• Establish that the system is ready to be delivered and accepted under DD-250.
• Ensure that the system meets acceptance criteria established at the SDR.
• Establish that the system meets requirements and will function properly in the expected operational environments as reflected in test data, demonstrations, and analyses.
• Establish an understanding of the capabilities and operational constraints of the "as-built" system, and ensure that the documentation delivered with the system is complete and current.
• For flight systems, ensure that the system (H/W and S/W) has been certified for flight. This certification includes successful H/W qualification testing of the qualification unit and H/W and/or S/W acceptance testing of the flight unit. Certification documentation includes the following: identification (part number, part name), baseline requirements and associated verifications, safety data package, baseline test and analysis documentation (qualification and acceptance plans, procedures, reports), limited-life item list, approved waivers or deviations, material usage agreements (MUAs), etc.

The following checklist should be used to aid in determining successful completion of the SAR:

• Are tests and analyses complete?
• Do the tests and analyses indicate that the system will function properly in the expected operational environment(s)?
• Does the system meet the criteria described in the acceptance plans?
• Does the system meet the needs of the stakeholders?
• Is the system ready to be delivered (e.g., flight items to the launch site and non-flight items to the intended operational facility for installation)?
• Is the system documentation complete and accurate?
• Is it clear what is being bought?
• Has all open work been identified and dispositioned?
• Has the Safety and Mission Assurance Review Team approved the Phase III safety data package?

Project Decision – A successful SAR results in the approval of the “as-built” baseline by the project manager, involved specialty engineering organizations, and the customer. The project manager recommends whether to proceed to prepare the system for deployment.

3.5.5.4 The Deployment Readiness Review

A Deployment Readiness Review shall be conducted to demonstrate that the system is ready for deployment.


This review includes examining tests, demonstrations, analyses, and audits. It also ensures that all H/W, S/W, personnel, and procedures are operationally ready. In a flight environment, this review equates to a Flight Readiness Review (FRR), where system readiness for a safe and successful launch and for subsequent flight operations is determined. The Deployment Readiness Review is conducted on completion of delivery, installation, and configuration of the system but prior to system deployment and the operational capability demonstration. At the Deployment Readiness Review, the project team should be prepared to:

• Certify that system operations can safely proceed with acceptable risk.

• Confirm that system and support elements are properly configured and ready.

• Establish that all interfaces are compatible and function as expected.

• Establish that the system state supports a “go” decision based on go/no-go criteria.

The following checklist should be used to aid in determining successful completion of the Deployment Readiness Review:

• Are the system elements ready to support operations?

• Is the system safe and capable of achieving mission success?

• Are the system interfaces checked out and found to be functional?

• Are all environmental factors within constraints?
• Have all open items and waivers been examined and found to be acceptable?

Project Decision – Based on the results of the Deployment Readiness Review, the project manager recommends whether to proceed with steps toward deployment and operational verification of the system.

3.5.5.5 The Operational Readiness Review

An Operational Readiness Review (ORR) shall be conducted to examine the actual system characteristics and procedures used in the system of interest’s operation, and to ensure that all H/W, S/W, personnel, procedures, and user documentation accurately reflect the deployed state of the system of interest. The ORR is conducted when the system of interest and its operational and support equipment and personnel are ready to become operational. At the ORR, the project team should be prepared to:

• Establish that the system of interest is ready to transition into an operational mode through examination of available test results, analyses, and operational demonstrations.

• Confirm that the system of interest is operationally and logistically supported in a satisfactory manner considering all modes of operation and support (i.e., normal, contingency, and unplanned).

• Establish that operational documentation is complete and that this documentation represents the system of interest configuration and its planned modes of operation.

• Establish that the training function is in place and has demonstrated a capability to support all aspects of system of interest maintenance, preparation, operation, and recovery.

The following checklist should be used to aid in determining successful completion of the ORR:

• Are system of interest H/W, S/W, personnel, and procedures in place to support operation?

• Have all detected anomalies been resolved, documented, and incorporated into existing operational support data?

• Are the changes that are necessary to transition the system of interest to an operational configuration ready to be made?

• Are all waivers closed?
• Are the resources in place, or financially planned and approved, to support the system of interest during its operational lifetime?

Project Decision – Based on the results of the ORR, the project manager, with concurrence from the customer, recommends whether to assume normal operational use of the system of interest.

3.5.6 Management Decision

During Phase D, Development, two management decisions are necessary:

• Upon completion of system verification and acceptance testing, authorize the project to proceed in preparation for delivery and installation of the system for operational use.

• Upon completion of system deployment and demonstration of operational capabilities, authorize the project to commence normal operations for the system.

3.5.7 References

The following documents, which were used to prepare this section, offer additional insights into Phase D, Development:

1. SP 6105, NASA Systems Engineering Handbook, 1995.
2. NPR 7120.5B, NASA Program and Project Management Processes and Requirements, 2002.


3.6 Phase E – Operations1,2

3.6.1 Overview

The purpose of Phase E, Operations, is to use the system of interest to meet its initially identified need. The Operations phase encompasses the use of the system of interest; the maintenance and upgrade of the system of interest; and the training of personnel who operate, maintain, and upgrade the system of interest. Evolution of the system of interest is included in this phase, as long as the changes do not involve major changes to the architecture. (Changes of that scope constitute new requirements, and the project life cycle starts over.) This phase also deals with preparation for the safe decommission and disposal of the system of interest at the end of its operational life, though the costs and risks associated with decommission and disposal should be considered during earlier phases of the project life cycle.

3.6.2 Expected Results/Outcomes

During Phase E, the project team shall:

• Operationally use the system of interest.
• Maintain and upgrade the system of interest.
• Train operations and maintenance personnel.
• Prepare to conduct the decommissioning review.
• Document lessons learned.
• Conduct delta ORRs.
• Conduct safety reviews.
• Conduct upgrade reviews.

3.6.3 Products

Phase E products are the results of the operational use of the system of interest. Examples of these include:

• System-intended products or services delivered
• Engineering data on system, subsystem, and materials performance
• Operations and maintenance logs
• Problem/failure reports
• Decommissioning studies
• Lessons learned documents
• System upgrade proposals

3.6.4 Process

The diagram (FIG. 3-6) at the top of the following page indicates the steps in this phase of the project life cycle that should be accomplished in the course of evolving the system of interest. Interaction of these steps and the associated reviews are also illustrated, along with the PM processes that should be referenced in executing each activity.

3.6.5 Reviews

3.6.5.1 The Delta Operational Readiness Review

Delta ORRs shall be conducted in Phase E whenever significant system changes occur. These reviews are typically held a few weeks or even a few months before operational use of the system changes. Delta ORRs essentially have the same content as the ORR described in Section 3.5.5.5.

3.6.5.2 The System Upgrade Review

System Upgrade Reviews are held, as necessary, to present planned system upgrades to the project staff. This type of review is primarily informational; however, approval is required when any change to critical systems is needed or extra resources are required. At the System Upgrade Reviews, the project team should be prepared to:

• Show before and after physical architecture diagrams.

• Show before and after functional architecture diagrams.

• Show before and after data diagrams.
• Describe any H/W and/or S/W changes.
• Describe the impact to operational procedures or training.
• Provide an upgrade schedule.

The following checklist should be used to aid in determining successful completion of the System Upgrade Review:

• Has the rationale for the upgrade been sufficiently described?

• Are changes to the system fully described in diagrams and text?

• Do operations and training personnel completely understand the impacts to their procedures?

• Are all costs associated with the upgrades provided?
• Does the schedule allow enough time for necessary changes by the affected parties?
• Where is the appropriate life cycle phase to reenter to implement upgrades?
• Is a delta ORR required as a result of the changes?

Project Decision – The project manager decides whether to recommend proceeding with system upgrades.

3.6.5.3 The Safety Review

Periodic Safety Reviews are held to examine the established safety procedures and identified hazards. At the Safety Review, the project team should be prepared to:

• Examine current safety documentation.
• Review all safety incidents, their resolution, or plans for their resolution.
• Review all identified safety hazards and their mitigation procedures.


• Identify new safety hazards and plans to resolve or mitigate them.

The following checklist should be used to aid in determining successful completion of the Safety Review:

• Is current safety documentation up to date and accessible to all personnel?

• Have all safety incidents been properly addressed?
• Are current safety procedures still relevant and effective?

Project Decision – The project must decide whether current safety procedures and identified hazards and resolutions are acceptable.

3.6.5.4 The Decommissioning Review

Details of the Decommissioning Review are addressed in Section 3.7. In this phase, the project shall initiate decommissioning studies and begin preparations for the Decommissioning Review.

3.6.6 Management Decision

Directorate-level boards may or may not authorize the project to proceed with system operations. Customer management boards (whether Center- or program-level) may authorize recommended system upgrades.

3.6.7 References

The following documents, which were used to prepare this section, offer additional insights into Phase E, Operations:

1. SP 6105, NASA Systems Engineering Handbook, 1995.
2. NPR 7120.5B, NASA Program and Project Management Processes and Requirements, 2002.

[Figure 3-6 (diagram): Phase E steps – 9.1 Configure for Operations; 9.2 Operate the System; 9.3 Train Personnel; 9.4 Maintain System; 9.5 Support System; 9.6 Distribute System Products; 9.7 Assess Trends; 9.8 Update & Documentation; 9.9 Improvement Block Changes; 9.10 Sequential Production; 10.1 Preparation for Decommission/Disposal – shown with their associated reviews ((Delta) Operational Readiness Review, System Upgrade Review, Safety Review, Critical Design Review(s)), management decision/control gates, reentry into the appropriate life cycle phase, and supporting PM processes (e.g., work and resource management, configuration management, control, safety, quality, requirements development and management, technology planning, acquisition management, risk management, cost estimating, performance measurement, schedule management, project analysis, documentation and data management).]

Figure 3-6. Phase E operations diagram.


3.7 Project Termination1

3.7.1 Overview

During Project Termination, the project team methodically plans and performs actions that bring the project to a timely, orderly conclusion, and efficiently and safely remove the system from the field of operational or active NASA interest. Project termination generally occurs for one of the following reasons:

• The project has successfully completed Phase E, Operations, and the system of interest is nominally approaching the end of its planned useful service life.

• The project has exceeded safety, cost, and/or schedule limits to an unacceptable degree during its design, development, or operations.

• The project is anticipated to be unable to meet the commitments contained in project controlling agreements and plans.

• The project need is no longer valid.

Project termination activities are planned and conducted so that appropriate degrees of oversight and control are provided both to the concluding stages of the project, whether nominal or off-nominal, and to the justification of, preparation for, and execution and documentation of decommissioning and/or disposal of the system of interest.

3.7.2 Activities

The sets of activities differ for nominal and off-nominal termination.

3.7.2.1 Nominal termination

For nominal project termination during operations, the project shall:

• Monitor the schedule to stay informed of the approach of the nominal end of scheduled system service life.
  − At an appropriate time, in advance of the end of service life, begin and complete preparations to make removal of the system of interest from service as efficient, safe, and otherwise acceptable to the system operating environment as constraints permit.
    • Remind all stakeholders of the approach of the end of scheduled system service life.
    • Gather and prepare to present information, as available, that both supports and opposes on-time termination of the project and disposal of the system.
    • Coordinate and document detailed plans for nominal project termination, system decommissioning and disposal, and project/system personnel transition.
    • Maintain the capability to continue system operations in the event that decommissioning is postponed at the Decommissioning Review.
  − Participate in a Decommissioning Review. The Decommissioning Review will either approve termination and disposal as planned or with modifications, or decide on new schedule/performance/safety/cost-based trigger(s) for postponed termination and disposal.
    • If decommissioning is approved:
      − Terminate project activity and support disposal of the system of interest and transition of personnel as planned or as redirected from plans by the Decommissioning Review.
      − Document project closeout, support system decommissioning/disposal documentation insofar as project personnel are involved, and turn over control of the project information repository to the next-higher tier of management.
    • If decommissioning is approved with modifications, plan modifications shall be made and presented to the Decommissioning Review Board for concurrence.
    • If decommissioning is postponed, continue system operations and either:
      − Prepare as before to meet the Decommissioning Review’s new schedule-based termination target; or
      − Monitor and assess trends using Decommissioning Review-designated performance/safety/cost triggers to recommend the next appropriate termination target.

3.7.2.2 Off-nominal termination

Entry into the off-nominal termination phase occurs as a result of a Termination Review decision by the JSC Project Management Council (PMC) or the PMP-designated authority. The Termination Review is held due to violation of established thresholds, anticipated inability to meet established commitments, or the project need no longer being valid. The project shall:

• Support the project Termination Review conducted by the JSC PMC or the authority designated by the PMP.
  − The project Termination Review shall either direct termination or establish new schedule/performance/safety/cost-based trigger(s) or agreements to permit continued design, development, or operations.


• Upon direction to terminate, prepare to remove the system of interest from service in a manner that is efficient, safe, and acceptable to the system.
  − Inform all stakeholders of the decision to terminate the project early, and dispose of the system of interest or its unintegrated elements before the scheduled end of service life.
  − Gather and prepare to present information, as available, that both supports and opposes the proposed early termination of the project and disposal of the system.
  − Coordinate and document detailed plans for early project termination, system disposal, and personnel transition.
• Participate in a Decommissioning Review, which will approve decommissioning plans either as presented or with modifications.
  − If decommissioning is approved with modifications, the plan modifications shall be made and presented to the Decommissioning Review Board for concurrence.
  − If decommissioning is approved:
    • Terminate project activity, and support disposal of the system of interest and transition of project/system personnel as planned or as redirected from plans by the Decommissioning Review.
    • Document project closeout, support system decommissioning/disposal documentation insofar as project personnel are involved, and turn over control of the project information repository to the next-higher tier of management.

3.7.3 Products

During Project Termination planning, the following are documented:

• Evidence supporting and/or opposing Project Termination and system disposal at the planned time
• System decommissioning/disposal requirements and plans
• Project closeout requirements and plans
• Project/system personnel transition requirements and plans
• Lessons learned

3.7.4 Process

The diagrams (FIGS. 3-7.1 and 3-7.2) on the following page indicate the steps involved in off-nominal and nominal Project Termination and decommissioning. The interaction of these steps and the associated reviews is also illustrated, along with the processes that should be referenced in executing each activity.

3.7.5 Reviews

3.7.5.1 The Termination Review

A Termination Review is held in the off-nominal case to review violations of established project thresholds and/or anticipated inability to meet established commitments. At the review, the project should be prepared to detail:

• Current project status, concentrating on the contributors to the violation of established project thresholds and/or anticipated inability to meet the established commitments that necessitated the review.

• The justification and authority of the established threshold or agreement.

The result of this review will be a decision, by the governing authority (JSC PMC- or PMP-designated authority), to terminate the project or to establish new thresholds to continue design, development, or operations.

3.7.5.2 The Decommissioning Review

In the nominal case, a Decommissioning Review is conducted to determine whether and how to terminate the project and dispose of the system. In the off-nominal case, the review is conducted to determine only how to terminate the project and dispose of the system. In either case, as with most reviews, the project team contributes significant technical content. At the review, the project team should be prepared to:

• Cite the authority under which they initiated preparations for termination of the project, disposal of the system of interest, and transition of project/system personnel.

• Discuss the project schedule.
• Demonstrate that plans for project termination, system decommissioning and disposal, and project/system personnel transition at the proposed time are:
  − Feasible.
  − Consistent with requirements.
  − Acceptable to stakeholders.1
  − Efficient, safe, and acceptable to the system operating environment as constraints permit.

For the nominal case,

• Present information as available, both supporting and opposing the proposed termination of the project and disposal of the system.

For the off-nominal case,

• Cite the authoritative direction to terminate and dispose.
• Review any top-level implementing guidance that the project received during planning for termination and disposal.


[Figure 3-7.1 (diagram): During life cycle Phases B–E (Definition, Design, Development, Operations), continuous performance assessment (risk management, safety, quality, systems analysis, project control) feeds decisions on whether the project need is still valid and whether a PMP threshold violation or anticipated inability to meet a commitment has occurred. If so, a Termination Review (JSC PMC- or PMP-designated authority) either reestablishes thresholds and legitimizes the need or directs termination, leading to 10.1 Prepare for Disposal (alert stakeholders; write and coordinate detailed plans), a Decommissioning Review (approval as planned or with modifications), and the 10.2 Decommission/Dispose implementation plan (terminate project; dispose of system; transition personnel; document actions and decisions; give the repository to the next-higher tier), with verification of proper plan implementation and assurance provided to the Decommissioning Review authority.]

Figure 3-7.1. Off-nominal project termination diagram.

[Figure 3-7.2 (diagram): During life cycle Phase E (Operations), the project monitors the schedule (schedule management) for the approaching nominal end of life, then executes 10.1 Prepare for Disposal (inform stakeholders; gather information for and against; write and coordinate detailed plans; maintain the capability to continue operations), holds a Decommissioning Review (approved as planned, with modifications, or postponed), and, if approved, executes the 10.2 Decommission/Dispose implementation plan (terminate project; dispose of system; transition personnel; document actions and decisions; give the repository to the next-higher tier), with verification of proper plan implementation and assurance provided to the Decommissioning Review authority.]

Figure 3-7.2. Nominal project termination diagram.


A successful Decommissioning Review assures that the decommissioning and disposal of system items and processes are appropriate and effective. This review determines whether the following conditions are met prior to decommissioning:

• Reasons for decommissioning/disposal are documented

• Disposal plan is complete and in compliance with local, state, and Federal environmental regulations

• Disposal plan addresses disposition of H/W, S/W, facilities, processes, and any contractual/ procurement actions necessary

• Disposal risks have been assessed
• Data archival plans have been defined
• Sufficient resources are assigned to complete the disposal plan
• A personnel transition plan is in place

Subsequent to implementation of the decommissioning and disposal plans, assurance that these two plans were implemented as agreed is provided to the authority approving the plans.

3.7.6 Management Decision

Management will direct whether to proceed with project termination, system disposal, and personnel transition in accordance with specific plans described at the Termination Review and Decommissioning Review, and, if not, what modifications to prescribe.

3.7.7 References

The following document, which was used to prepare this section, offers additional insights into Project Termination:

1. NPD 8010.3, Notification of Intent to Terminate Operating Space Systems, 1999.


Chapter 4 Project Management Processes and Requirements

This chapter outlines the processes and requirements in support of Johnson Space Center (JSC) projects. It is divided into three sections. Section 4.1 addresses the systems engineering (SE) processes; Section 4.2 addresses the project control processes; and Section 4.3 presents those processes that cut across SE and project control. The following information is presented for each process:

• Process overview – Provides a top-level summary narrative of the overall objective of the process, major process steps, products, and relationships with other project management (PM) processes.

• Function – Provides the requirements for tasks and expected outcomes resulting from the process.

• Objective – Defines the purpose of the process.
• Responsibilities – Outlines the roles of key project team members in planning and executing the process steps.

• Life cycle – Identifies the project life cycle phases supported by the process.

• Inputs – Identifies the main inputs that support execution of the process.

• Steps – Outlines the required main steps and preferred sequence of steps involved in executing the process. Steps are annotated with supporting information that provides additional detail and clarification.

• Outputs – Identifies the primary outputs from the process.

• Exit criteria – Provides conditions for determining process completion.

• Measurement – Identifies example base and derived measures that may be used in conjunction with executing the process. Base measures are simple measures relating to the primary activity that is being performed. Derived measures apply some algorithm to the base measure to provide trending data for corrective action analysis and future planning and estimating. While example measures are provided, it is the project team’s responsibility to identify those measures that provide appropriate insight into the health of this project process. The procedure, which is defined in SLP 4.20, Process Measurement and Improvement, should be followed to properly identify, collect, analyze, and report project measures.

• Methods and techniques – Provides a list of available methodologies and techniques that could be used to execute the process steps.

• Software tools – Provides a brief discussion of the functions provided by typical software (S/W) tools available to support the process.

• References – Lists sources used to prepare the content of the process section or that can offer further insight into the process.

The following paragraphs introduce several concepts and nomenclatures that are key for understanding the content of the PM processes covered in this chapter. In this chapter, extensive reference is made to the terms “system,” “system of interest,” and “enabling system.” These concepts are briefly explained below.

• A “system” is the combination of elements that functions together to produce the capability required to meet a need. Elements include all of the hardware (H/W), S/W, equipment, facilities, personnel, and processes and procedures needed for this purpose.

• The “system of interest” is the entity that the project engineering team is responsible for designing, developing, integrating, and testing. The system of interest may be at any level in the hierarchy of a complex system. In considering a spacecraft, for example, the system of interest may be at the spacecraft level, the communications system level, or the avionics box level. The system of interest may include human-level interactions (FIG. 4-1).

• “Enabling systems” are those systems that complement the system of interest by providing essential services (e.g., launch, tracking and data relay, and navigation services) but do not contribute directly to its functioning (FIG. 4-2).

In this document, the nomenclature used to denote the decomposition of the system of interest is shown on page 4-3, Figure 4-3. Another important concept introduced here is that of project scope. Project scope constitutes the vision for the project and consists of the following elements:

• Need to develop or procure a system
• Goals and objectives of the customer and stakeholders
• Information about the customers and users of the system
• How the system will be developed or purchased, tested, deployed, and used

Project scope also includes the boundaries and constraints of both the project and the system.


Figure 4-2. Example of enabling systems.

Figure 4-1. Example of three levels of system of interest.


Figure 4-3. Decomposition of the systems of interest.

Key elements of the project scope are needs, goals, and objectives. The relationship among these is depicted in Figure 4-4. Project scope is defined up front, such that the vision and constraints of the project are clearly understood by the project team before requirements are written. This is an essential component that ensures the success of the project. The concept of project scope is reflected in the PM processes addressed in this chapter.

4.0.1 References

The following documents, which were used to prepare this section, offer additional insights into the PM processes:

1. NPR 71xx.x (document number not yet assigned), Systems Engineering Processes and Requirements.
2. Hooks I, Farry K. Customer-Centered Products, American Management Association (AMACOM), 2001.

[Figure 4-3 (diagram): decomposition hierarchy – Parent System; Enabling System(s) and System of Interest; System Element 1 through System Element N (or Subsystems); Lower Architectural Elements; End Items.]

[Figure 4-4 (diagram):
• Needs – the reason for the project; derived from the problem statement; should not change much over time. Example: “The Nation has a need for assured human-rated space transportation.”
• Goals – define what actions must be accomplished to meet the needs; derived from the needs. Example: “To fly safely.”
• Objectives – define how the needs and goals are accomplished (how do we know when we get there). Example: “One vehicle loss in 500 flights.”]

Figure 4-4. Relationship among needs, goals, and objectives.


4.1 Systems Engineering Processes

4.1.1 Requirements Development1,2

The Requirements Development process is performed to gather needs and constraints and to transform them into agreed-to requirements for the system of interest. These requirements, which are stated in acceptable technical terms, represent a complete description of the system of interest. The same Requirements Development process will be used to evaluate changes to requirements in an agreement or when other stakeholder requirements are identified that affect the system of interest.

4.1.1.1 Function1

In the Requirements Development process:

• Customer and stakeholder needs shall be defined.
• System of interest requirements shall be verified and validated.
• Requirements shall be evaluated and maintained due to changes in the needs or constraints of the system of interest.

• Requirements traceability/flow down shall be established.

• All technical requirements shall be documented.

4.1.1.2 Objective1,2

The primary objective is to produce and analyze requirements that will serve as the foundation for establishing the functions that a system of interest has to perform, how well the system of interest must perform, and with what systems it must interact. Requirements – which, depending on design maturity, take the form of specifications, drawings, models, or other design documents – are used to (1) build, code, assemble, and integrate end products; (2) verify end products against them; (3) obtain off-the-shelf products; or (4) assign a supplier for development of subsystem products.

4.1.1.3 Responsibilities

The Lead Systems Engineering function ensures that the Requirements Development process is fully implemented and the output of requirements meets intended criteria. The project or program manager is ultimately responsible for the content of the requirements, but the Lead Systems Engineering function assists the project or program manager in providing technical overview of the process.

4.1.1.4 Life cycle

Requirements development begins during the advanced studies phase of the development life cycle, carries through preliminary analysis, and is baselined in the definition phase. Requirements development may also take place in the design and development phases, if changes occur in agreements or enabling systems.

4.1.1.5 Inputs2,3

There are three types of inputs to the Requirements Development process: (1) statement of need or project request made by the customer from the agreement, other documents, enabling systems, and stakeholders that have a stake in the outcome of the system of interest; (2) requirements in the form of outcomes from other processes, such as technical plans and decisions from technical reviews; and (3) requested or approved changes to requirements of the first type. Typical inputs to the Requirements Development process should include, but are not limited to:

• Goals and objectives
• Project or customer needs
• Stakeholder needs
• Assumptions, guidelines, and constraints (e.g., from specialty and design engineering)
• Technology availability, maturity, costs, and risks
• Outputs from preceding projects
• Records of meetings and conversations with the customer
• Policies and procedures
• Laws and regulations
• Operations concepts

4.1.1.6 Steps4

SE performs all of the steps in the Requirements Development process. The diagram at the top of the following page (FIG. 4.1-1) illustrates the major steps and products of the Requirements Development process. The three major steps are to (1) develop customer requirements, (2) develop system of interest requirements, and (3) analyze and validate requirements. These steps are progressively and iteratively executed until a validated set of requirements is achieved. The internal steps are shown in this illustration as well. The following adds detail to the steps illustrated:

• Develop Customer Requirements – Stakeholder needs, expectations, constraints, and interfaces shall be collected and translated into customer requirements.
  − Elicit and Collect Stakeholder Needs
    • Elicit, identify, and collect stakeholder needs, expectations, constraints, and interfaces for all phases of the system of interest life cycle.
    • Engage relevant stakeholders using methods for eliciting needs, expectations, constraints, and external interfaces.
  − Develop Customer Requirements
    • Transform stakeholder needs, expectations, constraints, and interfaces into customer requirements.
    • Translate stakeholder needs, expectations, constraints, and interfaces into documented customer requirements.


    • Define constraints for verification and validation.
• Develop System of Interest Requirements – Customer requirements shall be refined and elaborated to develop system of interest and subsystem requirements.
  − Establish System of Interest and Subsystem Requirements
    • Establish and maintain system of interest and subsystem requirements that are based on customer requirements.
    • Develop requirements in the technical terms necessary for system of interest and subsystem design.
    • Derive requirements that result from design decisions.
    • Establish and maintain a relationship among requirements for consideration during change management and requirements allocation.
  − Allocate Subsystem Requirements
    • Allocate requirements for each system of interest component.
    • Allocate requirements to functions.
    • Allocate requirements to system of interest components.
    • Allocate design constraints to system of interest components.
    • Document relationships among allocated requirements.
  − Identify Interface Requirements
    • Identify interfaces that are both external and internal to the system of interest.
    • Develop requirements for identified interfaces.
• Analyze and Validate Requirements – Requirements shall be analyzed and validated, and a definition of required functionality shall be developed with documented rationale provided.
  − Establish and Maintain Operational Concepts and Associated Scenarios
    • Develop operational concepts and scenarios, including functionality, performance, maintenance, support, and disposal, as appropriate.
    • Define the environment in which the system of interest will operate, including boundaries and constraints.
    • Review operational concepts and scenarios to refine and discover requirements.
    • As system of interest components are selected, develop a detailed operational concept that defines the interaction of the system of interest, the end user, and the environment, and that satisfies operational, maintenance, support, and disposal needs.
  − Establish and Maintain Definition of Required Functionality
    • Analyze and quantify functionality required by end users.
    • Analyze requirements to identify logical or functional partitions.
    • Using established criteria, partition requirements into groups to facilitate and focus requirements analysis.

[Figure 4.1-1 (diagram): Develop Customer Requirements (Elicit and Collect Stakeholder Needs; Develop Customer Requirements) produces customer requirements; Develop System of Interest Requirements (Establish System of Interest and Subsystem Requirements; Allocate Subsystem Requirements; Identify Interface Requirements) produces system of interest requirements; Analyze and Validate Requirements (Establish and Maintain Operational Concepts and Associated Scenarios; Establish and Maintain a Definition of Required Functionality; Analyze Requirements; Analyze Requirements to Achieve Balance; Validate Requirements; Validate System of Interest Requirements).]

Figure 4.1-1. Requirements development process diagram.


    • Consider the sequencing of time-critical functions both initially and subsequently during system of interest development.
    • Allocate customer requirements to functional partitions, objects, people, or support elements to support the synthesis of solutions.
    • Allocate functional and performance requirements to functions and sub-functions.
  − Analyze Requirements
    • Analyze requirements to ensure they are necessary and sufficient.
    • Analyze stakeholder needs, expectations, constraints, and external interfaces to remove conflicts and to organize into related subjects.
    • Analyze requirements to determine whether they satisfy the objectives of higher-level requirements.
    • Analyze requirements to ensure they are specific, measurable, achievable, resource constrained, and time constrained (SMART). (A minimal illustrative screening sketch follows these steps.)
    • Identify key requirements that have a strong influence on cost, schedule, functionality, risk, or performance.
    • Identify technical performance measures that will be tracked during the development effort.
    • Analyze operational concepts and scenarios to refine customer needs, constraints, and interfaces and to discover new requirements.
  − Analyze Requirements to Achieve Balance
    • Analyze requirements to balance stakeholder needs and constraints.
    • Use proven models (e.g., simulations) and prototyping to analyze the balance of stakeholder needs and constraints.
    • Perform risk assessment on the requirements and functional architecture.
    • Examine system of interest life cycle concepts for requirements impacts on risks.
    • Prioritize requirements. Requirement priority information is key information to support any required project de-scope.
  − Validate Requirements
    • Validate requirements to ensure the resulting system of interest will perform appropriately and as intended in the customer environment, using multiple techniques as appropriate.
    • Analyze requirements to determine the risk that the resulting system of interest will perform appropriately in its intended-use environment.
    • Explore the adequacy and completeness of requirements by developing system of interest representations and obtaining feedback about them from relevant stakeholders.
    • Assess the design as it matures in the context of the requirements validation environment to identify validation issues and expose unstated needs and customer requirements.
    • Obtain customer concurrence with the requirements.
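The requirement-analysis steps above, in particular the SMART check and the screening for incomplete statements, lend themselves to simple automated support. The following is a minimal, hedged sketch, not a method prescribed by this JPR; the word lists, rules, and requirement identifiers are assumptions chosen only for illustration.

```python
import re

# Illustrative (assumed) word lists; a project would tailor these.
VAGUE_TERMS = ["user-friendly", "as appropriate", "adequate", "sufficient",
               "minimize", "maximize"]
PLACEHOLDERS = ["TBD", "TBS"]  # To Be Determined / To Be Supplied markers

def screen_requirement(req_id, text):
    """Return findings that make a requirement statement incomplete or hard to verify."""
    findings = []
    lowered = text.lower()
    if "shall" not in lowered:
        findings.append("no 'shall' statement")
    for term in VAGUE_TERMS:
        if term in lowered:
            findings.append("vague term: '%s'" % term)
    for tag in PLACEHOLDERS:
        if re.search(r"\b%s\b" % tag, text):
            findings.append("placeholder: %s" % tag)
    return ["%s: %s" % (req_id, f) for f in findings]

# Hypothetical requirement statements, for illustration only.
requirements = {
    "SYS-001": "The system shall provide telemetry at TBD Hz.",
    "SYS-002": "The display should be user-friendly.",
}
for rid, stmt in requirements.items():
    for finding in screen_requirement(rid, stmt):
        print(finding)
```

Findings from a screen of this kind would feed base measures such as Defects in Requirements and # of Incomplete Requirements discussed under Measurement in Section 4.1.1.9.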

4.1.1.7 Outputs3,4

The primary output from the Requirements Development process is a system of interest requirements document that becomes the design baseline in passing the System Requirements Review (SRR) milestone. This requirements document may include:

• Concept of operation
• Mission requirements
• Functional requirements
• Customer restraints on the conduct of verification/validation
• Derived requirements
• Product requirements
• Product-component requirements
• Interface, environmental, and nonfunctional requirements
• Performance requirements, including key performance parameters
• Design constraints
• Relationship among requirements (e.g., parent, child, peer requirements)
• Physical and functional interface requirements
• Product installation, operational, maintenance, and support concepts
• Use cases (primarily for S/W development)
• Timeline scenarios
• System architecture
• Functional architecture
• Activity diagrams
• Results of requirements validation
• Technical baseline

4.1.1.8 Exit criteria3

An approved, validated system of interest requirements document shall be produced, in which each individual requirement statement is:

• Clear, unique, consistent, stand-alone, and verifiable.
• Traceable to an identified source requirement.
• Not redundant to, nor in conflict with, any other known requirement.

Page 81: NASA Project Management System Engineering and Project Control Processes and Requierments

Project Management Processes and Requirements

4-7

• Not biased by any particular implementation.
• Identified with a verification method assigned to each requirement (a minimal check sketch follows this list).
• Traceable to a higher-level requirement.
• Documented with results of customer concurrence discussions.

While producing the system of interest requirements document, consider whether:

• The impact of enterprise and external constraints on system design is understood.
• The design team concurs with the list of requirements.
• Cost goals are established for the system.
• Stakeholder concurrence discussion results are documented.
• Requirements are verifiable via test, demonstration, examination, etc.
• All requirements are allocated.
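Once requirements are held in a structured repository (see Section 4.1.1.11), several of the exit criteria above can be checked mechanically. The following is a hedged sketch under an assumed record layout; the field names and the set of verification methods are illustrative assumptions, not definitions from this document.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Assumed set of acceptable verification methods; the JPR cites test,
# demonstration, examination, etc. "analysis" is added here for illustration.
VERIFICATION_METHODS = {"test", "demonstration", "examination", "analysis"}

@dataclass
class Requirement:
    req_id: str
    text: str
    parent_id: Optional[str] = None            # higher-level requirement traced to
    verification_method: Optional[str] = None
    allocated_to: List[str] = field(default_factory=list)

def exit_criteria_findings(reqs):
    """Flag records that would block the exit-criteria checks illustrated here."""
    findings = []
    for r in reqs:
        if r.verification_method not in VERIFICATION_METHODS:
            findings.append(f"{r.req_id}: no valid verification method assigned")
        if r.parent_id is None:
            findings.append(f"{r.req_id}: not traceable to a higher-level requirement")
        if not r.allocated_to:
            findings.append(f"{r.req_id}: not allocated")
    return findings

# Hypothetical example records.
reqs = [Requirement("SYS-010", "The system shall downlink imagery within 24 hours.",
                    parent_id="MIS-001", verification_method="test",
                    allocated_to=["comm subsystem"]),
        Requirement("SYS-011", "The system shall survive launch loads.")]
print(exit_criteria_findings(reqs))
```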

4.1.1.9 Measurement

The table at the end of this section provides example base and derived measures that can be used in conjunction with executing the Requirements Development process. See the discussion of Measurement on page 4-1.

4.1.1.10 Methods and techniques3,4

Some of the methods and techniques that may be used in developing requirements are:

• Questionnaires, interviews, and brainstorming
• Working groups
• Project reviews
• Mission analysis
• Requirements analysis
• Performance analysis
• Trade studies
• Prototypes and models
• Constraint evaluation
• Cost/benefit analysis

• Functional concept analysis
• Functional definition and decomposition
• Quality functional deployment (QFD)
• Extraction from sources such as documents, standards, or specifications
• Observation of existing products, environments, and workflow patterns
• Use cases (primarily for S/W development)
• Business case analysis
• Reverse engineering (for legacy products)

4.1.1.11 Software tools1–6

Some means of capturing requirements should be required. This usually is a repository, the nature of which is dependent on the size and complexity of the system of interest. At a minimum, requirement statements must be captured and controlled, relationships among primary and derived requirements must be identified and managed, and a means for tracking changes over time must be provided. There are a number of commercial off-the-shelf (COTS) S/W applications (e.g., DOORS, SLATE) that provide extensive capabilities supporting requirements development and management. For a list of potential tools, see the table provided by INCOSE at: http://www.incose.org/tools/eia632tax/eia632top.html.

4.1.1.12 References

The following documents, which were used to prepare this section, offer additional insights into the Requirements Development process:

1. NPR 71xx.x (document number not yet assigned), Systems Engineering Processes and Requirements.
2. EIA-632, Processes for Engineering a System, ANSI/EIA-632-1998, 1999.
3. INCOSE Systems Engineering Handbook, Version 2.0, 2000.
4. CMMI-SE/SW/IPPD/SS V1.1, Capability Maturity Model Integration (CMMI) for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing, 2002.
5. SP 6105, NASA Systems Engineering Handbook, 1995.
6. RP 1358, Systems Engineering “Toolbox” for Design-Oriented Engineers, 1994.

Example base and derived measures for the Requirements Development process (Section 4.1.1.9):

Base Measures | Derived Measures
• Total # of shall statements | Total # of requirements @ SRR – planned vs. actuals
• Requirements definition effort (allocated resources (full-time equivalents (FTEs)) vs. actual resources used) | Requirements definition productivity; requirements definition effort – planned vs. actuals; requirements definition rate charts; requirements definition effort as % of total engineering effort
• Defects in requirements (by phase) | Requirements defects – projected @ SRR vs. actuals
• # of incomplete requirements (# of blanks, To Be Determined (TBD), To Be Supplied (TBS)) | Incomplete requirements vs. planned @ SRR




4.1.2 Requirements Management1,2

The Requirements Management process is performed to manage the requirements of the system of interest and its subsystem components, and to identify inconsistencies among those requirements and the plans and work products of the project. This process manages all requirements received or generated by the project, including both technical and nontechnical requirements. The Requirements Development process is the primary source of these requirements.

4.1.2.1 Function1

In the Requirements Management process:

• Baselined requirements developed in the Requirements Development process shall be established.
• Proposed changes shall be reviewed by affected organizations to assess impact on project cost, schedule, and performance of the system of interest.
• Proposed changes shall be reviewed by interfacing organizations and stakeholders to assess impact on the system of interest.
• Requirements shall be maintained under configuration control.
• Changes to the baselined requirements shall be managed and assessed.

4.1.2.2 Objective2,3

The objective of requirements management is to establish a repository of baseline system of interest requirements to serve as a foundation for requirements refinement and revision by subsequent functions in the product development life cycle, for a nonambiguous and traceable flow of requirements to system of interest sub-components, and for support of corrective actions based on inconsistent requirements.

4.1.2.3 Responsibilities

The Lead Systems Engineering function is to ensure that the Requirements Management process is fully implemented and the output of requirements meets intended criteria. The project or program manager is ultimately responsible for the content of the requirements, but the Lead Systems Engineering function assists the project or program manager in providing technical overview of the process. The Lead Systems Engineering function should ensure that impacts from changes to requirements are examined thoroughly and fully tracked to completion in the Requirements Management process.

4.1.2.4 Life cycle1–8

Requirements management begins with project initiation and carries through all phases of the project.

4.1.2.5 Inputs3

Specific inputs should include those produced by the Requirements Development process, namely:

• Concept of operations
• Mission requirements
• Functional requirements
• Customer restraints on the conduct of verification/validation
• Derived requirements
• Product requirements
• Product-component requirements
• Interface, environmental, and nonfunctional requirements
• Performance requirements, including key performance parameters
• Design constraints
• Relationship among requirements
• Physical and functional interface requirements
• Product installation, operational, maintenance, and support concepts
• Use cases (primarily for S/W development)
• Timeline scenarios
• System architecture
• Functional architecture
• Activity diagrams
• Requirements defects reports
• Technical performance measurements (TPMs)
• Results of requirements validation
• Technical baseline

4.1.2.6 Steps2

SE performs all five steps in the Requirements Management process. These steps and their overall relationships are illustrated in Figure 4.1-2 at the top of the following page. In performing the Requirements Management process, users shall:

• Develop with the requirements providers an understanding of the meaning of the requirements.
  − Establish criteria for distinguishing appropriate requirements providers.
  − Establish objective criteria for the acceptance of requirements.
  − Analyze requirements to ensure that established criteria are met.
  − Reach an understanding of the requirements with the requirements provider so the project participants can commit to the requirements.
• Obtain commitment to the requirements from project participants.
  − Assess the impact of requirements on existing commitments.
  − Negotiate and record commitments.


• Manage changes to the requirements as they evolve during the project.
  − Capture all requirements and requirements changes given to or generated by the project.
  − Maintain the requirements change history with a rationale for the changes.
  − Evaluate the impact of requirement changes from the standpoint of relevant stakeholders.
  − Make the requirements and change data available to the project.
• Maintain bi-directional traceability among the requirements and the project plans and work products (a minimal illustrative sketch follows these steps).
  − Maintain requirements traceability to ensure that the source of lower-level (i.e., derived) requirements is documented.
  − Maintain requirements traceability from a requirement to its derived requirements as well as to its allocation of functions, objects, people, processes, and work products.
  − Maintain horizontal traceability from function to function and across interfaces.
  − Generate the requirements traceability matrix.
• Identify inconsistencies between the project plans and work products and the requirements.
  − Review project plans, activities, and work products for consistency with the requirements and changes made to the requirements.
  − Identify both the source of the inconsistency and the rationale.
  − Identify changes that need to be made to the plans and work products resulting from changes to the requirements baseline.
  − Initiate corrective action.
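The change-management and bi-directional traceability steps above can be pictured with a small data structure. The sketch below is a minimal illustration only, not a description of any particular tool (projects would normally use a COTS application such as those noted in Section 4.1.2.11); the class and field names are assumptions.

```python
from collections import defaultdict

class RequirementsRepository:
    """Minimal sketch of a repository with change history and bi-directional traceability."""

    def __init__(self):
        self.requirements = {}            # req_id -> {"text": ..., "parent": ...}
        self.children = defaultdict(set)  # parent req_id -> set of derived req_ids
        self.history = []                 # (req_id, action, rationale)

    def add(self, req_id, text, parent_id=None, rationale=""):
        self.requirements[req_id] = {"text": text, "parent": parent_id}
        if parent_id:
            self.children[parent_id].add(req_id)   # downward (parent-to-derived) trace
        self.history.append((req_id, "added", rationale))

    def change(self, req_id, new_text, rationale):
        self.requirements[req_id]["text"] = new_text
        self.history.append((req_id, "changed", rationale))  # change history with rationale

    def trace_up(self, req_id):
        """Walk upward from a derived requirement to its source requirement."""
        chain = []
        while req_id is not None:
            chain.append(req_id)
            req_id = self.requirements[req_id]["parent"]
        return chain

    def traceability_matrix(self):
        """One simple rendering of a traceability matrix: parent-to-derived listing."""
        return {parent: sorted(kids) for parent, kids in self.children.items()}

# Example usage with hypothetical requirement identifiers.
repo = RequirementsRepository()
repo.add("MIS-001", "The mission shall return imagery of the target site.")
repo.add("SYS-010", "The system shall downlink imagery within 24 hours.",
         parent_id="MIS-001", rationale="Derived from MIS-001 timeliness need.")
print(repo.trace_up("SYS-010"))       # ['SYS-010', 'MIS-001']
print(repo.traceability_matrix())     # {'MIS-001': ['SYS-010']}
```

The upward walk supports tracing a derived requirement back to its source, while the parent-to-derived map supports the downward view; together they illustrate the bi-directional traceability the process requires.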

4.1.2.7 Outputs1–8

The output of the Requirements Management process is a system of interest requirements repository that contains the validated set of requirements and provides control over requirement additions, changes, and deletions. In addition to requirements, other information is stored in the repository, including:

• Requirement rationale and assumptions
• Requirement assignments
• Requirement bi-directional traceability
• Any other information contained in the requirements document that needs to be controlled and made available to the various stakeholders
• Requirements defects reports

4.1.2.8 Exit criteria2

A configuration-controlled requirements repository shall have a documented requirements tracking method. The repository contains requirements that shall be:

• Clear, unique, consistent, stand-alone, and verifiable.
• Traceable to an identified source requirement.
• Not redundant to, nor in conflict with, any other known requirement.
• Not biased by any particular implementation.
• Identified with a verification method assigned to each requirement.
• Traceable to a higher-level requirement.
• Documented with results of customer concurrence discussions.

Figure 4.1-2. Requirements management process diagram.



4.1.2.9 Measurement

The table at the end of this section provides example base and derived measures that can be used in conjunction with executing the Requirements Management process. See the discussion of Measurement on page 4-1.

4.1.2.10 Methods and techniques

Some of the methods and techniques that may be used in the Requirements Management process are:

• Requirements analysis
• Traceability analysis

4.1.2.11 Software tools1–8

Some means of capturing requirements should be required. This usually is a repository of some kind, the nature of which is dependent on the size and complexity of the system of interest. At a minimum, requirement statements must be captured and controlled, relationships among primary and derived requirements must be identified and managed, and a means for tracking changes over time must be provided. A number of COTS S/W applications (e.g., DOORS, SLATE) provide extensive capabilities supporting requirements development and management. For a list of potential tools, see the table provided by INCOSE at: http://www.incose.org/tools/eia632tax/eia632top.html.

4.1.2.12 References

The following documents, which were used to prepare this section, offer additional insights into the Requirements Management process:

1. NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements.
2. CMMI-SE/SW/IPPD/SS V1.1, Capability Maturity Model Integration (CMMI) for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing, 2002.
3. INCOSE Systems Engineering Handbook, Version 2.0, 2000.
4. EIA-632, Processes for Engineering a System, ANSI/EIA-632-1998, 1999.
5. SP 6105, NASA Systems Engineering Handbook, 1995.
6. RP 1358, Systems Engineering “Toolbox” for Design-Oriented Engineers, 1994.
7. SWELT:RM0.2, Requirements Management Guidebook, 1996.
8. Hooks I, Farry K. Customer-Centered Products, American Management Association (AMACOM), 2001.

Example measures for the Requirements Management process (Section 4.1.2.9):

Base Measures | Derived Measures
Total # of Requirements (e.g., # of shall statements) | Requirements Managed – Planned @ SRR vs. Actuals
# of Requirements Added, Changed, or Deleted | Requirements Volatility = % Added, Changed, or Deleted (since SRR)
Requirements Management Effort (FTEs) | Requirements Management Productivity; Requirements Management Effort – Planned @ SRR vs. Actuals; Requirements Management Effort as % Total Engineering Effort
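
As a hedged illustration of how the derived measures above can be computed from the base measures, the following Python fragment walks through the arithmetic; every count in it is an invented placeholder rather than project data.

# Illustrative computation of derived measures from base measures.
# All numbers are invented placeholders.
total_requirements_at_srr = 250        # total # of requirements baselined at SRR
added, changed, deleted = 12, 30, 5    # # of requirements added/changed/deleted since SRR
rm_effort_ftes = 1.5                   # requirements management effort (FTEs)
total_engineering_ftes = 20.0          # total engineering effort (FTEs)

# Requirements Volatility = % added, changed, or deleted since SRR
volatility_pct = 100.0 * (added + changed + deleted) / total_requirements_at_srr

# Requirements Management Productivity (requirements handled per FTE)
rm_productivity = total_requirements_at_srr / rm_effort_ftes

# Requirements Management Effort as % of Total Engineering Effort
rm_effort_pct = 100.0 * rm_effort_ftes / total_engineering_ftes

print(f"Volatility: {volatility_pct:.1f}%  "
      f"Productivity: {rm_productivity:.0f} requirements/FTE  "
      f"RM effort: {rm_effort_pct:.1f}% of engineering effort")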


4.1.3 Operational Concept Development1
The Operational Concept Development process is performed to better understand the capabilities and performance of the system of interest within its proposed mission, use, or function. This process helps to focus the Requirements Development process so that systems operations, deployment, delivery, support (including maintenance and sustaining), training, and disposal, as well as all modes and states, are considered during system of interest definition, regardless of engineering discipline. The operations concept describes the "who, what, when, where, and how" of the system. Operational concept and scenario development evolves as an iterative process. Operational concepts are refined as solution decisions are made, system of interest components are defined, and lower-level detailed requirements are developed.

4.1.3.1 Function1,2
In this process, a high-level definition of the operational concept and scenarios for the system of interest shall be established and maintained to cover the operational stages and to define timeline, functions, and interfaces.

4.1.3.2 Objective3
The primary objective of the Operational Concept Development process is to document what the operational system will do and why, and to communicate with the end user so that operational needs are clearly understood. The secondary objective is to define any critical, top-level performance requirements (stated either qualitatively or quantitatively) and system rationale. Other objectives are to:

• Provide traceability between operational needs and written source requirements captured in the Requirements Development process.

• Establish a basis of requirements (e.g., personnel requirements, support requirements, etc.) to support the system of interest over its life.

• Establish a basis for integration test planning, interface and system-level requirements, and any requirements for environmental stimulators.

• Provide the basis for computation of system capacity, behavior under load/overload, and mission-effectiveness calculations.

• Validate requirements at all levels to discover implicit requirements overlooked in source documents.

4.1.3.3 Responsibilities
The Lead Systems Engineer is responsible for leading discussions with operations personnel (e.g., end users, system operators, facilities, flight crew, Mission Operations) and the Design Engineering Team to develop the concept of operations. All parties should concur on the defined concept of operations. The Lead Systems Engineer is also responsible for the performance of the process steps outlined below.
The Design Engineering Disciplines (e.g., mechanical, electrical, S/W) are responsible for using the defined concept of operations as the basis for functional decomposition and requirements allocation.
The Project Manager is responsible for reviewing and agreeing on the defined concept of operations.

4.1.3.4 Life cycle
Operations concept development begins during the advanced studies phase of the development life cycle, and these concepts are further refined and alternatives defined during preliminary analysis. As system of interest requirement changes are encountered during the definition, design, and development phases, operations concepts shall also be updated to ensure concurrency. Operations concepts and associated scenarios provide input into the development of operational products (e.g., operational plans and procedures) performed during the development phase.

4.1.3.5 Inputs3,4
Typical inputs to this process include, but are not limited to:

• Goals and objectives
• Mission needs
• Stakeholder needs
• Assumptions, guidelines, and constraints
• System of interest operating environment(s)
• System of interest concept and system hierarchy
• Technical operational requirements
• Statement of operational objectives
• System requirements document
• Data flow diagrams
• State/mode diagrams
• Behavior diagrams
• Customer standard operating procedures
• Existing operational infrastructure
• Records of meetings and conversations with the customer

4.1.3.6 Steps2–5
Figure 4.1-3 illustrates the major steps and products of this process. These are progressively and iteratively executed until a validated concept of operations is achieved. In performing the Operational Concept Development process, users shall take the following steps:

• Top-Level Operational Concepts and Associated Scenarios – Top-level operational concepts and associated scenarios shall be developed.


  − Develop top-level operational concepts and scenarios.
    • Include functionality, performance, maintenance, support, and disposal as appropriate.
    • Identify and develop scenarios consistent with the level of detail in the stakeholder needs, expectations, and constraints in which the proposed system of interest is expected to operate.
  − Define top-level operational environments.
    • Define the environments in which the system of interest will operate.
    • Include boundaries and constraints.
• Detailed Operational Concepts and Scenarios
  − Establish and maintain detailed operational concepts and scenarios.
  − Evolve operational concepts and scenarios.
    • As system of interest components are selected, further evolve operational concepts and scenarios to describe the conditions, operating modes, and operating states specific to each system of interest component.
  − Evolve operational environments.
    • Evolve the operational environments for each system of interest component defined.
    • Take into consideration that the environment for any given system of interest will be influenced by other system of interest components and by the external environment.
  − Review operational concepts and scenarios.
    • Integrate operational concepts and scenarios produced for each physical level of system of interest decomposition.
    • Refine requirements and discover overlooked implicit requirements.

• The following questions should be answered in the development of an operational concept (an illustrative scenario sketch follows this list):
  − Operational Distribution or Deployment: Where will the system be used?
  − Mission Profile or Scenario: How will the system accomplish its mission objective?
  − Performance and Related Parameters: What are the critical system parameters to accomplish the mission?
  − Utilization Environments: How are the various system components to be used?
  − Effectiveness Requirements: How effective or efficient must the system be in performing its mission?
  − Operational Life Cycle: How long will the system be operated by the user?
  − Environment: What environments will the system be expected to operate in effectively?
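
The sketch below is a hedged illustration, in Python, of one lightweight way to record an operational scenario as a timeline of steps, with each step tied to an actor, an operating mode, and a location, so the questions above have a concrete place to be answered. The scenario content, names, and times are invented for illustration and are not part of this document.

# Illustrative representation of an operational scenario as a timeline of steps.
# The scenario content below is invented for illustration only.
from dataclasses import dataclass
from typing import List


@dataclass
class ScenarioStep:
    time_s: float   # when (seconds from scenario start)
    actor: str      # who (crew, ground operator, system element)
    mode: str       # system operating mode or state during the step
    action: str     # what is done and how
    location: str   # where (deployment or utilization environment)


def print_timeline(name: str, steps: List[ScenarioStep]) -> None:
    print(f"Scenario: {name}")
    for step in sorted(steps, key=lambda s: s.time_s):
        print(f"  t+{step.time_s:6.1f}s  [{step.mode}] {step.actor}: "
              f"{step.action} ({step.location})")


nominal_activation = [
    ScenarioStep(0.0, "Ground operator", "STANDBY", "Command payload power-on", "Control center"),
    ScenarioStep(30.0, "Payload", "CHECKOUT", "Run built-in self-test", "On orbit"),
    ScenarioStep(120.0, "Crew", "OPERATE", "Configure instrument for first observation", "On orbit"),
]

print_timeline("Nominal payload activation (illustrative)", nominal_activation)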

4.1.3.7 Outputs1–5
The primary output from this process is a system of interest concept of operations document that is baselined at the SRR milestone. Outputs of this process include:

• Concept of operations document

Figure 4.1-3. Operational concept development process diagram. [Figure: Develop Top-Level Operational Concepts and Scenarios and Define Top-Level Operational Environments produce the Top-Level Con Ops and Scenarios; Evolve Operational Concepts and Scenarios and Evolve Operational Environments produce an Integrated Set of Detailed Con Ops and Scenarios; Review Operational Concepts and Scenarios produces Derived System of Interest Requirements. These steps are progressively and iteratively executed until a validated concept of operations is achieved.]


• Top-level and detailed operational concept definitions
• Operational, installation, support, maintenance, training, disposal, and support concepts
• Integrated set of component-level operational concepts and scenarios
• Operational behavior models for each operational mode traceable from source requirements
• Operational scenarios (step-by-step descriptions of how the proposed system of interest should operate and interact with its users and external interfaces)
• Use cases
• Timeline scenarios
• Timeline analyses of component interactions
• Trade analyses for the operations concept
• System test plan test threads and key test features
• Environmental simulation plan
• Derived requirements

4.1.3.8 Exit criteria3,5
Exit criteria include:

• Release and approval of the system of interest concept of operations document at the SRR.
• Assurance that all operational concept changes affecting the system of interest have been properly communicated to the interfacing and component suppliers.

4.1.3.9 Measurement3
The following table provides example base and derived measures that can be used in conjunction with executing the Operational Concept Development process. See discussion of Measurement on page 4-1.

4.1.3.10 Methods and techniques3,6
In addition to a scan of the listed input documents, additional operational insight can be achieved through:

• Interviews with operators of current/similar systems.
• Interviews with potential users.
• Interface working group (IFWG) meetings.
• Observation of existing products, environments, and workflow patterns.

The following diagramming techniques may be employed to capture operational concepts:

• Context diagrams
• Mission event sequence illustrations
• Functional flow diagrams
• Timeline charts
• N2 charts

4.1.3.11 Software tools
A standard word processor is sufficient for developing the concept of operations document and associated scenarios. Once the document and the associated scenarios have been developed, they should be placed under configuration management (CM) control to maintain and support their evolution. Basic spreadsheet and graphics tools are also applicable in the development of N2 charts, as well as context, event sequence, timeline, and functional flow diagrams.

4.1.3.12 References
The following documents, which were used to prepare this section, offer additional insights into the Operational Concept Development process:

1. NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements.
2. CMMI-SE/SW/IPPD/SS V1.1, Capability Maturity Model Integration (CMMI) for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing, 2002. CMMI is a service mark of Carnegie Mellon University.
3. INCOSE Systems Engineering Handbook, Version 2.0, 2000.
4. Section 4.3.1 of this document.
5. EIA-632, Processes for Engineering a System, ANSI/EIA-632-1998, 1999.
6. Defense Systems Management College, System Engineering Fundamentals, 2001.
7. ANSI/AIAA G-043-1992, Guide for Preparation of Operational Concept Documents.

Example measures for the Operational Concept Development process (Section 4.1.3.9):

Base Measures | Derived Measures
# of Functional Flow Diagrams | Functional Flow Diagrams – Required vs. Completed
# of System External Interfaces | System External Interfaces – Required vs. Completed
# of Unresolved Source Requirement Statements (i.e., TBD, TBS, etc.) | Unresolved Source Requirement Statements – % Completed


4.1.4 Decomposition1
The Decomposition process is performed to develop an architectural breakdown of the system of interest into lower-level elements to allocate the functions and requirements to the respective elements of the system of interest and to identify all physical and functional interfaces for the various levels. Decomposition is an iterative process that is performed until the system of interest is decomposed to the level needed to fully define it. Typically, the Decomposition process runs concurrent with development of the system concept of operations (Section 4.1.3) because the concept of operations is used as the basis for functional decomposition and requirements allocation.

4.1.4.1 Function1
In the Decomposition process:

• Physical and functional architectures shall be developed that identify the next lower-level physical or functional elements of the system of interest.

• The system architecture for the system of interest shall include an allocation of functions and requirements to the respective elements of the system of interest.

• Identification of interfaces shall also be developed and documented, thereby defining all physical and functional interfaces for the various levels of the system of interest.

• All physical and functional interfaces shall be identified, developed, and documented for the system of interest.

4.1.4.2 Objective2,3,4
The primary objectives are to (1) document the allocation of higher-level systems, requirements, or customer needs into lower-level components within the context of the system of interest; and (2) document the interfaces down to the lower-level components. This allows for a better understanding of what the system of interest has to do and in what ways it can perform, both logically and in terms of the performance required. Other objectives are to:

• Allocate higher-level functional and performance requirements to lower-level functions (i.e., requirements traceability).

• Document how requirements are parsed/detailed into requirements for functional or physical components.

• Identify priorities and conflicts associated with lower-level functions.

• Provide information essential to optimizing and evaluating solutions.

• Form the basis for design definition.

• Develop and maintain interface control or definition documentation that defines all physical and functional interfaces for the various levels of the system of interest.

• Identify and characterize interfaces between decomposed elements and functions.

4.1.4.3 Responsibilities3
The Lead Systems Engineer is responsible for the performance of the process steps outlined below. Process execution may include working and coordinating with multiple engineering disciplines (e.g., H/W, S/W, mechanical, electrical, environmental) to ensure the proper and appropriate level of system decomposition is achieved.
The Design Engineering Team is the primary customer of the products resulting from the Decomposition process. Decomposition process products are used by the Design Engineering Team as the basis for design.

4.1.4.4 Life cycle
System decomposition begins during the advanced studies phase of the project life cycle with the development of a high-level system architecture and breakdown structure. The system of interest architecture and interfaces are further refined, and architectures associated with alternative solutions are defined, during preliminary analysis. As the system of interest continues through the definition and design phases (i.e., system definition, preliminary design, and final design stages) of the life cycle, the architecture and interfaces of the selected solution are further decomposed to whatever level of decomposition is needed to fully define the system.

4.1.4.5 Inputs4,5
Typical inputs to this process include, but are not limited to:

• System goals and objectives
• System requirements, including functional, physical, environmental, and performance products and interface requirements
• System hierarchy concepts
• High-level architectures (functional and physical)
• Operational concepts and scenarios, including:
  − Use cases (for S/W projects)
  − Data flow diagrams
  − State/mode diagrams
  − Behavior diagrams
  − Timeline scenarios

4.1.4.6 Steps2,5,6
SE is responsible for all of the steps in the Decomposition process. The following diagram (Fig. 4.1-4) illustrates the major steps and products of this process.


Figure 4.1-4. Decomposition process diagram. [Figure: Functionally Decompose System of Interest and Physically Decompose System of Interest produce Lower-Level Functional and Physical Architectures; Allocate System Requirements to Lower-Level Architectures; Identify Functional and Physical Interfaces produces Interface Control Documentation. The process iterates, and lower-level architectures become higher-level ones in the next pass.]

Higher-level architectures, operational concepts, and system requirements are from the previous concurrent execution of the decomposition, operations concept development, and requirements development processes, respectively. This repetitive decomposition occurs until the architecture detail and allocation of requirements are sufficient for the design process to start, as follows (an illustrative decomposition and allocation sketch appears after the list):

• Functionally Decompose System of Interest – Functional decomposition shall be performed on the system of interest.
  − Decompose the system functions into sub-functions.
  − Iterate until the system is decomposed into its basic sub-functions and each sub-function at the lowest level is completely and uniquely defined by its requirements. The resulting product is the functional architecture.
• Physically Decompose System of Interest – Physical decomposition shall be performed on the system of interest.
  − Functional decomposition of the system occurs concurrently with physical decomposition of the system. The resulting product is the physical architecture.
• Allocate Requirements to the Architectures – System of interest requirements shall be allocated to the lower-level functional and physical architecture. To achieve this:
  − Distribute the functional system requirements to each of the functional system elements.
  − Distribute the physical system requirements to each of the physical system elements.
  − Perform and establish traceability of performance requirements (requirements allocation sheets (RASs)).
• Identify Interfaces – Functional and physical interfaces shall be identified and maintained.
  − Identify and document functional and physical interfaces between products internal to the system of interest.
  − Identify and document functional and physical interfaces associated with external items and enabling systems.
  − Identify and document interfaces between products and the product-related life cycle processes. For example, such interfaces may include those between a product that is to be fabricated and the jigs and fixtures that are used to enable fabrication during the manufacturing process.
  − Ensure that the documented interface solutions account for all of the interface requirements developed in the requirements development processes.
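
The following sketch is a hedged illustration, in Python, of the functional decomposition and requirements allocation described in the steps above; the element names and requirement identifiers are invented, and the structure shown is one possible representation rather than a format prescribed by this document. It records essentially the information a requirements allocation sheet (RAS) captures.

# Illustrative functional decomposition with requirement allocation.
# Element names and requirement IDs are invented placeholders.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Element:
    name: str
    allocated_reqs: List[str] = field(default_factory=list)   # requirements allocated to this element
    children: List["Element"] = field(default_factory=list)   # lower-level elements

    def decompose(self, child_name: str) -> "Element":
        child = Element(child_name)
        self.children.append(child)
        return child

    def allocation_sheet(self, level: int = 0) -> None:
        """Print a RAS-like view: element hierarchy with allocated requirements."""
        print("  " * level + f"{self.name}: {', '.join(self.allocated_reqs) or '(none)'}")
        for child in self.children:
            child.allocation_sheet(level + 1)


# Decompose the system of interest into lower-level functional elements.
system = Element("Provide science data", allocated_reqs=["SYS-001"])
acquire = system.decompose("Acquire measurements")
acquire.allocated_reqs += ["SYS-010", "SYS-011"]
downlink = system.decompose("Downlink data")
downlink.allocated_reqs += ["SYS-020"]

system.allocation_sheet()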



Some specific activities that can occur during the Decomposition process include:

• Task sequences and relationships (functional flow block diagram (FFBD))
• Process and data flows (i.e., integration definition for function modeling (IDEF0) diagrams)
• The time sequence of time-critical functions (timeline analysis sheets (TLSs))

4.1.4.7 Outputs1,2,3,5,6
Primary outputs from this process are:

• Functional architecture
• Physical architecture
• Product breakdown structure
• Interface control documents (ICDs)

4.1.4.8 Exit criteria2,6
Exit criteria include:

• Completion of FFBD documentation to level of detail required for project life cycle phase.
• Completion of interface definition descriptions for all interfaces (N2 charts may be sufficient).
• Completion of RASs (or tables) for all functional requirements.

4.1.4.9 Measurement2
The table below provides example base and derived measures that can be used in conjunction with the Decomposition process. See discussion of Measurement on page 4-1.

4.1.4.10 Methods and techniques2–5
Several methods and techniques are available to adequately define and represent the functional and physical architecture of the system of interest. These include:

• Functional Flow Block Diagramming – Diagrams that show the logical sequence (network) of actions that leads to fulfillment of a function.
• Timeline Analysis – Used to define the time sequence of time-critical functions. The TLS adds detail to defining durations of various functions. It defines concurrency, overlapping, and sequential relationships of functions and tasks. It identifies time-critical functions that directly affect system availability, operating time, and maintenance downtime. Finally, it is used to identify specific time-related design requirements.
• N2 Diagramming – A matrix displaying functional interactions, or data flows, at a particular hierarchical level (an illustrative matrix sketch follows this list).
• RASs – Used to identify allocated performance and to establish traceability of performance requirements.
• IDEF0 – A common modeling technique for the analysis, development, re-engineering, and integration of information systems, business processes, or SE analysis. Where the FFBD is used to show the functional flow of a product, IDEF0 is used to show data flow, system control, and the functional flow of life cycle processes.
• Trade Studies – Used to evaluate different decomposition (architecture) alternatives.
• Functional Thread Analysis – A thread consists of the system input and output. Analysis of threads identifies stimulus-condition-response threads to control S/W development.
• Modeling and Simulation – Used to verify the interpretation, definition, and viability of key functions.
• Real-Time Structural Analysis – An alternative to functional allocation that is usually applied to the design of S/W-intensive systems. Typical steps are to construct data flow diagrams and data dictionaries.
• Object-Oriented System Modeling – Used to construct an object that combines data structure and behavior in a single entity to represent the components of a system. This form of modeling allows conceptual issues to be clearly understood before the system is built.
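
The following is a hedged Python sketch of the N2 diagramming idea: diagonal cells name the functions, and off-diagonal cells record the data flowing from the row function to the column function. The functions and flows are invented placeholders, and in practice such a chart would typically be maintained in a spreadsheet or SE tool.

# Illustrative N2 matrix: diagonal cells hold functions; off-diagonal cells hold
# the data or interface flowing from the row function to the column function.
# Function names and flows are invented placeholders.
functions = ["Acquire data", "Process data", "Downlink data"]

# n2[i][j] = flow from functions[i] to functions[j]; None means no interface.
n2 = [[None] * len(functions) for _ in functions]
n2[0][1] = "raw samples"
n2[1][2] = "science packets"
n2[2][1] = "link status"

for i, row_fn in enumerate(functions):
    cells = []
    for j in range(len(functions)):
        if i == j:
            cells.append(f"[{row_fn}]")
        else:
            cells.append(n2[i][j] or "-")
    print(" | ".join(f"{c:18}" for c in cells))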

Example measures for the Decomposition process (Section 4.1.4.9):

Base Measures | Derived Measures
# of Functional Flow Diagrams | Functional Flow Diagrams – Required vs. Completed
# of System External Interfaces | ICDs – Required vs. Completed
# of Issues/Risks Identified | Unresolved Issues/Risks – % Completed/Mitigated
  | Depth of Functional Hierarchy – % vs. Target Depth
  | % of Requirements Allocated at the Lowest Level of Hierarchy


Architectures may also include standards and design rules governing the development of system components and their interfaces, as well as guidance to aid product developers.

4.1.4.11 Software Tools2
A standard word processor is sufficient for documenting the architectures and ICDs. Once developed, they should be placed under CM control to maintain and support their evolution. The use of basic spreadsheet and graphics tools is also applicable in the development of N2 charts, RASs, and TLSs, as well as functional flow and IDEF0 diagrams. Other S/W tools that can be used in the Decomposition process include:

• Analysis tools
• Modeling tools
• Prototyping tools
• Simulation tools
• Requirements traceability tools

4.1.4.12 References
The following documents, which were used to prepare this section, offer additional insights into the Decomposition process:

1. NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements.
2. INCOSE Systems Engineering Handbook, Version 2.0, 2002.
3. SP 6105, NASA Systems Engineering Handbook, 1995.
4. Defense Systems Management College, System Engineering Fundamentals, 2001.
5. CMMI-SE/SW/IPPD/SS V1.1, Capability Maturity Model Integration (CMMI) for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing, 2002. CMMI is a service mark of Carnegie Mellon University.
6. EIA-632, Processes for Engineering a System, ANSI/EIA-632-1998, 1999.


4.1.5 Feasibility Study1
The Feasibility Study process is performed to evaluate and iterate the needs and potential approaches for technical and programmatic feasibility. This process establishes technical and resource feasibility and identifies the associated risks of meeting requirements within the given constraints.

4.1.5.1 Function1,2
The Feasibility Study process is performed to evaluate the technical and programmatic possibility of success of an approach(es) to realizing the system of interest. Credibility of a candidate concept and its subsequent design is also at issue in this requirement. In the Feasibility Study process:

• Evaluation criteria for assessing feasibility of approaches shall be established and ranked.

• One or more technically and programmatically implementable approaches for the system of interest under consideration shall be established in the feasibility study.

• Cost estimates shall be determined for each possible system of interest approach.

4.1.5.2 Objective
The Feasibility Study process is performed to ascertain whether the system of interest can be designed, implemented as designed, and then accomplish system goals, all within the constraints imposed by the fiscal and thermal environments of the project.

4.1.5.3 Responsibilities
Depending on the size and complexity of the project, the Feasibility Study process is led by the Project Manager or by another individual who is acting in the role of Lead Systems Engineer and can take a broad view of the total system of interest. Consultation with experienced personnel who have performed similar projects may be useful. Other personnel/roles may participate in proportion to the scale and complexity of the project. Broad participation in the feasibility study by the Project Team and outside experts usually contributes to the fidelity of the study effort. For example, Design Engineering Disciplines may develop the system concept and architecture options for evaluation. Specialty Engineering Disciplines may contribute to the evaluation process or identify "show-stoppers" to proposed concept options. Cost Personnel may contribute to estimating the cost of the possible system approaches.

4.1.5.4 Life cycle
During Pre-Phase A, Advanced Studies, the process iterates between requirements and alternative concepts, with the goal of establishing feasibility by admitting one or more concepts. It is intended that the feasibility assessment be completed during Phase A, Preliminary Analysis, where requirements and concepts are refined to establish optimal system requirements and a single, optimum top-level architecture. However, circumstances may indicate the need for continuing assessments at lower levels in the architecture to complete optimization of the total system of interest.

4.1.5.5 Inputs
As shown in Section 4.1.5.6, the Feasibility Study process begins with an understanding of the system of interest stakeholders and their needs, the operational environment, and the system sub-elements. These are examined in the Requirements Development (Section 4.1.1), Requirements Management (Section 4.1.2), Operational Concept Development (Section 4.1.3), and Decomposition (Section 4.1.4) processes. Therefore, the outputs of these processes, listed below, should be used as inputs to the Feasibility Study process.

• Requirements document
• Contents of the system of interest requirements repository, including requirement:
  − Additions, changes, and deletions
  − Rationale and assumptions
  − Assignments
  − Bi-directional traceability
• Concept of operations document
• Functional architecture
• Physical architecture
• Product breakdown structure
• Preliminary interface definition

4.1.5.6 Steps
In Pre-Phase A, requirements and concepts are iterated to establish feasibility; and in Phase A, requirements and concepts are iterated to establish optimal system requirements and top-level architecture. Major steps and products of the Feasibility Study process are illustrated in Figure 4.1-5. The specific steps in the process follow. In taking these steps, study participants:

• Should Review Requirements – To understand the expected performance, conceptual configuration, and operating environment of the system of interest. Study participants should also examine the decomposition of stakeholder needs, assumptions, presumed scenarios, environmental constraints, and structure documented in the:
  − Requirements document.
  − Contents of the requirements repository.
  − Concept of operations.
  − Functional and physical architectures.


Figure 4.1-5. Feasibility study process diagram. [Figure: START with Review Requirements (inputs: Requirements Document, Requirements Repository, Concept of Operations, Functional/Physical Architectures, Preliminary Interface Document, Product Breakdown Structure); Identify Constraints; Select Options for Evaluation; Determine Option Costs, Benefits, Schedules, Risks (producing preliminary option costs, schedules, risks, ops concepts, interface definitions, and technical plans); Develop Evaluation Criteria (prioritized evaluation criteria); Select Feasible Options, producing the Feasibility Assessment Document.]

  − Preliminary interface definition.
  − Product breakdown structure.
• Should Identify Constraints
  − Analyze information gained from the review to identify environmental, operational, functional, or architectural constraints.
  − Identify critical physical or functional interfaces or project constraints.
• Shall Select Options for Evaluation
  − Develop alternative solutions or conceptual designs for the system of interest.
• Shall Determine Cost Estimates for Each Option
  − Identify alternative operations and logistics concepts.
  − Develop cost and schedule estimates for each alternate solution.
  − Identify technical, cost, schedule, programmatic, and safety risks for each alternate solution.
• Shall Develop and Prioritize the Evaluation Criteria
  − Establish prioritized evaluation criteria for assessing the feasibility of the alternative solutions. Select criteria appropriate to the system of interest. Assess the fit of the solution with JSC or organization strategic or implementation plans. Examples can include, but are not limited to, the factors shown in the table later in this section.
• Shall Select Feasible Options (an illustrative weighted-ranking sketch follows this list)
  − Use prioritized evaluation criteria to identify technically and programmatically feasible solutions for further study or development.
  − Produce a feasibility assessment document that includes, but is not limited to, the following:
    • Description of and rationale for evaluation criteria
    • Descriptions of alternatives considered, which can include preliminary option:
      − Costs, schedules, risks
      − Operations concepts
      − Interface definition
      − Technical plans
    • Ranked feasible alternative solutions list
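
The following hedged Python sketch illustrates one way prioritized evaluation criteria can be applied to rank alternative solutions, the kind of weighted comparison a tool such as Expert Choice automates. The criteria, weights, options, and scores are invented placeholders, not recommended values.

# Illustrative weighted-criteria ranking of alternative solutions.
# Criteria, weights, and scores are invented placeholders.
criteria_weights = {          # prioritized evaluation criteria (weights sum to 1.0)
    "performance": 0.35,
    "technology readiness": 0.25,
    "life cycle cost": 0.25,
    "safety/risk": 0.15,
}

# Each option is scored 1 (poor) to 5 (excellent) against each criterion.
option_scores = {
    "Option A (new development)":   {"performance": 5, "technology readiness": 2,
                                     "life cycle cost": 2, "safety/risk": 3},
    "Option B (modified heritage)": {"performance": 4, "technology readiness": 4,
                                     "life cycle cost": 4, "safety/risk": 4},
    "Option C (commercial item)":   {"performance": 3, "technology readiness": 5,
                                     "life cycle cost": 5, "safety/risk": 4},
}

ranked = sorted(
    ((sum(criteria_weights[c] * s for c, s in scores.items()), name)
     for name, scores in option_scores.items()),
    reverse=True,
)

print("Ranked feasible alternative solutions (illustrative):")
for total, name in ranked:
    print(f"  {total:.2f}  {name}")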

4.1.5.7 Outputs
For the Feasibility Study phase:

• For Pre-Phase A, the output of this process is a top-level or partial feasibility assessment.

• For Phase A, the output of this process is a completed feasibility assessment.



4.1.5.8 Exit criteria
For the Feasibility Study phase:

• For Pre-Phase A, a top-level or partial feasibility assessment shall be documented and accepted at the concept review.

• For Phase A, a completed feasibility assessment shall be documented and accepted at the definition review.

4.1.5.9 Measurement
The following table provides example base and derived measures that can be used in conjunction with executing the Feasibility Study process. See discussion of Measurement on page 4-1.

4.1.5.10 Methods and techniques3
The key techniques here are sound managerial and engineering judgment. The following technical and management devices may be useful in performing the assessments:

• Trade studies
• Cost-benefit studies
• Risk assessment matrix
• Success tree analysis
• Benchmarking
• Concurrent engineering

4.1.5.11 Software tools
Standard office automation tools that support identification and direct communication of the process inputs to the study team should generally be used; e.g., word processor, spreadsheet, and presentation slide S/W for personal computers (PCs). Some tools are available that provide for documentation of the options, input of the evaluation criteria, and display of the ranked list of options. One such tool is Expert Choice.4

Example evaluation criteria factors (referenced in Section 4.1.5.6):

• Performance – (as required)
• Technology Readiness – (as required)
• Safety – (as required)
• Life Cycle Cost, Budget, and Schedule – design, development, test, and evaluation (DDTE); deployment; skills, training; disposal; foreseeable required improvement/augmentation/complementary or follow-on system
• Physical Envelopes During All Applicable Project Phases – length; width; height; mass; electromagnetic compatibility/electromagnetic interference (EMC/EMI)
• Resources – power; information technology; facilities; thermal rejection; on-orbit crew time; crew training time; upmass; up volume; on-orbit stowage space; downmass; down volume; launch infrastructure; operations & communications infrastructure; ground processing infrastructure
• Specialty Engineering – reliability; maintainability; transportability; sustainability; producibility; supportability; human factors; safety; quality assurance; S/W assurance; environmental; fabrication; test & verification; training; operations; information systems; logistics & maintenance
• Risk – technical & performance; safety; cost; schedule

Example measures for the Feasibility Study process (Section 4.1.5.9):

Base Measures | Derived Measures
Total # of Alternative Solutions Studied | Total # of Alternative Solutions Studied – Planned vs. Actual; Alternative Solution Study Rate Charts
Feasibility Study Effort (FTEs) | Feasibility Study Productivity; Feasibility Study Effort – Planned vs. Actuals; Feasibility Study Effort as % Total Engineering Effort


4.1.5.12 References
The following documents and Web site, which were used to prepare this section, offer additional insights into the environment of the Feasibility Study process:

1. NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements.
2. SP 6105, NASA Systems Engineering Handbook, 1995.
3. RP 1358, System Engineering "Toolbox" for Design-Oriented Engineers, 1994.
4. Expert Choice, Inc., www.expertchoice.com


4.1.6 Technology Planning1,2
The Technology Planning process is performed to ensure that technologies critical to the success of the project are identified and the effort to advance the technologies is understood and feasible.

4.1.6.1 Function1,2
For a system to be realized, the technologies that enable achievement of system goals must be available within the resources and schedule of the project. A technology assessment is therefore performed and documented to identify and characterize requirements and development plans for the key enabling and enhancing technologies that the system of interest will require, and how these technologies will be acquired and incorporated into the system. In the Technology Planning process:

• An initial technology assessment shall be performed and documented that identifies any key enabling and enhancing technologies that will need to be infused into the system of interest and also identifies the existing technology readiness level (TRL) for each. (See the references for guidance on determining TRLs; an illustrative gap-assessment sketch follows this list.)

• Identified technologies shall be further assessed and documented to determine what is required to advance them to the level necessary for successful inclusion into the overall system of interest.

• Technology maturation and insertion planning shall be developed and documented for each identified technology that describes quantifiable milestones, decision gates, and fallback positions for incorporating the identified technologies into the system of interest.
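
The following hedged Python sketch illustrates the kind of record an initial technology assessment might produce: each key technology with its current TRL, the TRL assumed to be required for insertion, and the resulting gap that maturation planning must close. The technology names, TRL values, and required-TRL threshold are invented for illustration.

# Illustrative initial technology assessment: identify TRL gaps to be closed
# by technology maturation planning. All entries are invented placeholders.
from dataclasses import dataclass
from typing import List


@dataclass
class TechnologyAssessment:
    name: str
    current_trl: int     # existing technology readiness level (1-9)
    required_trl: int    # TRL assumed necessary for insertion into the system of interest

    @property
    def gap(self) -> int:
        return max(0, self.required_trl - self.current_trl)


assessments: List[TechnologyAssessment] = [
    TechnologyAssessment("Regenerative CO2 scrubber", current_trl=4, required_trl=6),
    TechnologyAssessment("Radiation-tolerant processor", current_trl=6, required_trl=6),
    TechnologyAssessment("Autonomous rendezvous sensor", current_trl=3, required_trl=6),
]

needs_maturation = [a for a in assessments if a.gap > 0]
for a in needs_maturation:
    print(f"{a.name}: TRL {a.current_trl} -> {a.required_trl} (gap {a.gap}); "
          "develop maturation plan with milestones, decision gates, and fallbacks")
if not needs_maturation:
    print("No new technology development required; document this and end the process.")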

4.1.6.2 Objectives
The Technology Planning process is performed to document the selection of system-required technologies and to schedule their maturation and incorporation into the system of interest. Technology planning includes identification of partnering opportunities with other agencies, centers, industry, and academia.

4.1.6.3 Responsibilities2
The Lead Systems Engineer provides primary technical assessment on needs and requirements for products, materials, and services. The Lead Systems Engineer is responsible for performing the planning steps below, which outline how technical assessment for the technology planning mechanisms is performed.
The Project Manager is responsible for formulating technology planning strategy(ies) for the project. The Project Manager also makes the final decisions concerning project technology planning and approves any corrective actions related to the performance of the technology insertion into the system of interest.

The Stakeholder Representative interfaces with the Project Manager and the Lead Systems Engineer for soliciting information in support of the technology planning function. The Stakeholder Representative also provides advice on assessing technology maturation and insertion.

4.1.6.4 Life cycle7
Technology planning begins in Pre-Phase A, Advanced Studies, where conceptual technology development requirements are documented to help answer the general question of mission feasibility. Subsequently, during Phase A, Preliminary Analysis, technology development requirements and plans are documented that reflect increased understanding gained from that phase's iterations between requirements and concepts to develop optimal system requirements and top-level architecture. Finally, during Phase B, Definition, technology development plans are updated to accommodate any changes indicated by the iterations between requirements and concepts to establish the optimal design-to baseline.

4.1.6.5 Inputs
Typical inputs should include, but not be limited to:

• Needs, goals, objectives
• Requirements
• Programmatic guidelines and constraints
• System specifications
• System concept and architecture
• Cost/effectiveness analyses
• Environmental assessment
• Feasibility assessment
• Specialty engineering studies
• Life cycle cost estimates
• Trade and analysis results
• Analysis model descriptions
• SE tool descriptions
• Program and project management plans (PMPs)
• Engineering master plan and master schedule
• Operations concepts
• Evaluation criteria
• Reference missions
• Technical performance measures

4.1.6.6 Steps1,3,6
Three main steps to this process are to review and understand requirements and concepts, to establish the key technologies needed by the system of interest, and to plan for their maturation and insertion. These major steps and products of the Technology Planning process, which are discussed further below, are illustrated in Figure 4.1-6.

• Requirements and concepts should be reviewed.
  − Examine current life cycle phase inputs.


Figure 4.1-6. Technology planning process diagram. [Figure: START with Review Requirements and Concepts (inputs for the current life cycle phase; requirements and concepts developing in the current life cycle phase); decision "Is New Technology Required?": if no, document that no new technologies are needed and stop; if yes, Establish Required Key Technologies (key technology definitions, TRLs, and gaps) and Plan for Technology Maturation and Insertion (TRL advancement plans, technology alternative employment criteria, low-TRL technology insertion criteria, maturation and insertion schedule), producing Technology Development Requirements and the Technology Development Plan.]

  − Keep abreast of concurrently developing requirements.
• Required key technologies shall be established.
  − Identify and document key technologies required by the system of interest using tools and resources such as the NASA Technology Portal to identify potential technologies; e.g., http://nasatechnology.nasa.gov/index.cfm.
  − Identify key partnerships.
  − Identify and document the TRL of each required key technology.
  − Identify where technology gaps exist, including gaps significant enough to question the viability for a concept to be realized.
  − Evaluate those technologies that can incrementally improve project capabilities, decrease risk, improve safety, or reduce cost.
  − Identify technologies that have distribution restrictions on the S/W, H/W, or data.
  − If no new technologies are required, document this and advise termination of the process.
• Technology maturation and insertion shall be planned for required low-TRL technologies.
  − Determine and document activities for advancing required technologies from lower TRLs to levels required for incorporation into the system of interest.
    • Generate technology development plans to remove identified technology gaps.
    • Explore innovative avenues to expand participation and infuse the latest technological and commercial capabilities into the project.
    • Ensure that plans for technological or commercial cooperation include a full description of the opportunities for partnering, the potential partners, the need to protect intellectual property, the likelihood of the partnership achieving fruition, the expected contribution, and the confidence that the partnership will remain in force.
  − Determine and document the criteria for assessing when required low-TRL technologies that must be matured can be inserted into the system of interest.
  − Schedule and document anticipated technology maturation and insertion for each identified required low-TRL technology, and protect against technology non-maturation.



    • Describe calendar-based milestones for the insertion of each required technology.
    • Document alternatives to incorporating the identified low-TRL technologies into the system of interest, and establish decision gates and decision criteria for proceeding with the alternatives.
  − As indicated by the life cycle phase, document technology development requirements and/or the technology development plan.
  − Identify potential technologies to disclose.

4.1.6.7 Outputs
For the Technology Planning phase:

• Pre-Phase A, Advanced Studies, is a set of conceptual technology development requirements.
• Phase A, Preliminary Analysis, is a set of technology development requirements and plans.

• Phase B, Definition, is any updates to the technology development plans.

4.1.6.8 Exit criteria
For the Technology Planning phase:

• Pre-Phase A, Advanced Studies, conceptual technology development requirements shall be documented and accepted by the concept review.

• Phase A, Preliminary Analysis, documented technology requirements and plans shall be accepted by the definition review.

• Phase B, Definition, any updates to the technology development plans shall be documented and accepted by the definition review.

4.1.6.9 Measurement
The table below provides example base and derived measures that can be used in conjunction with executing the Technology Planning process. See discussion of Measurement on page 4-1.

4.1.6.10 Methods and techniques4
The following technical and management methods – the key tools of which have been and will remain sound managerial and engineering judgment and decisiveness – may be useful in technology planning:

• Concurrent engineering
• Brainstorming
• Delphi technique
• Nominal group technique
• Cause and effect diagram (fishbone)
• Flowchart analysis
• TRL classification

4.1.6.11 Software tools
The Technology Planning process should include a review of technology databases and portals, and interviews with experienced technology developers whose expert judgment can shed light on potential technology challenges for the system of interest. Thus, standard office automation tools that support identification and direct, condensed communication of process inputs to the study team should generally be used; e.g., word processor, spreadsheet, technology databases/portals, data mining tools such as an "expert locator system," and presentation slide software for PCs.

4.1.6.12 References
The following documents and Web site, which were used to prepare this section, offer additional insights into the environment of the Technology Planning process:

1. NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements.
2. NPR 7120.5B, NASA Program and Project Management Processes and Requirements, 2002.
3. INCOSE Systems Engineering Handbook, Version 2.0, 2000.
4. RP 1358, System Engineering "Toolbox" for Design-Oriented Engineers, 1994.
5. Mankins JC. Technology Readiness Levels: A White Paper. Advanced Concepts Office, Office of Space Access and Technology, NASA, 1995.
6. Next-Generation Space Telescope (NGST) Web site explanation of TRLs, http://www.ngst.nasa.gov/public/unconfigured/doc_0852/rev_01/NGST_TRLs.html.
7. SP 6105, NASA Systems Engineering Handbook, 1995.

Example measures for the Technology Planning process (Section 4.1.6.9):

Base Measures | Derived Measures
Total # of Technologies Assessed | Total # of Technologies Assessed – Planned vs. Actuals; Technology Assessment Rate Charts
Technology Planning Effort (FTEs) | Technology Planning Productivity; Technology Planning Effort – Planned vs. Actuals; Technology Planning Effort as % Total Engineering Effort


4.1.7 Design1–4
The Design process is performed to provide a methodical and iterative effort that transforms system of interest requirements into qualitative and quantitative solutions. This process involves taking identified products from previous phases and establishing a system definition sufficient for moving into attainment. Requirements are iterated and decomposed into progressively lower levels using trade studies to optimize the selected system of interest. The selected solution is validated to meet needs, goals, objectives, costs, and schedules. This solution is evaluated for manufacturability, operations, safety, reliability, maintainability, parts/materials adequacy, requirements compliance, and operations concept consistency. All interfaces are identified and documented. Approaches to verify and validate all requirements and specifications are documented. The process completes the design of the system of interest by defining the set of identified configuration items and defining their interfaces, integrated as a system. The Design process is applied to all levels of the system of interest to:

• Establish specified requirements and detailed drawings or documents for system of interest products.
• Define initial and final specifications, including interface specifications, for subsystems of each system of interest product that requires further development.
• Identify enabling product requirements to facilitate meeting functional requirements during attainment, production, test, deployment, operations, training, support, and disposal.

Designs provide the appropriate content not only for attainment but also for other phases of the product life cycle, such as modification, re-procurement, maintenance, sustainment, and installation. Design documentation provides a reference to support mutual understanding of the design by relevant stakeholders and supports future changes to the design both during attainment and subsequent phases of the life cycle. A complete design description is documented in a technical data package that consists of a full range of features and parameters including form, fit, function, interface, manufacturing process characteristics, and other parameters. Established organizational or project design standards (e.g., checklists, templates, object frameworks) form the basis by which to achieve a high degree of definition and completeness in design documentation.
Design begins in the earliest phases of the project with development of preliminary system concepts and specifications and continues throughout system definition and design. Predominant activities, however, occur in the preliminary design and detailed design phases of the project life cycle. Specifically, preliminary design:

• Establishes a “design-to” solution that fully meets project needs, goals, and objectives.

• Completes test and verification plans.
• Establishes design-dependent requirements and interfaces.
• Completes implementation level of design.

Detail design:
• Establishes complete, validated "build-to" detailed design.
• Completes all design specialty audits.
• Establishes manufacturing processes and controls.
• Finalizes and integrates interfaces.

4.1.7.1 Function1
In the Design process:

• Requirements for the system of interest shall be further decomposed and allocated to elements of the system(s) of interest.

• Proposed solutions that address the system of interest requirements shall be identified.

• Trade studies of the proposed solutions shall be performed to provide an objective foundation for selection of the solution(s).

• The selected solution shall be documented.

4.1.7.2 Objective1
The primary objective of the Design process is to document that identified needs, goals, and objectives of the customer and other stakeholders can be achieved by a baselined system of interest solution, and that the system of interest can be attained, operated, and decommissioned within the technical, budget, and schedule scope of the program/project plans.

4.1.7.3 Responsibilities3,4
In the Design process, the Lead Systems Engineer is responsible for ensuring that the design addresses the relationships of the system of interest to its environment and subsystems rather than the internal details of how the system of interest will accomplish its objectives. The Lead Systems Engineer's viewpoint is broad rather than deep, encompassing system functionality from end to end and temporally from conception to disposition.
The Design Engineering Disciplines (e.g., mechanical, electrical, S/W) have the primary responsibility for actual system of interest design.
Specialty Engineering Disciplines and Support Functions (e.g., quality engineering, quality assurance (QA), S/W assurance, CM, data management) contribute to the design when and as required. For example, engineering specialty analysis results are integrated into the design, and the manufacturing process and controls are defined and validated.


4.1.7.4 Life cycle
Some early design activities take place in Phase A in support of synthesizing and down selecting proposed system concept(s) and drafting preliminary system specifications. Some design activities also take place early in Phase B in support of definition reviews, such as development of the systems evaluation criteria, development and refinement of the prime items concepts, and performing trades for selecting the optimal solution. The preliminary Design process begins following successful System Definition Reviews (SDRs) during Phase B and ends in a series of Preliminary Design Reviews (PDRs) containing the system of interest-level PDR as well as PDRs for lower-level components, as appropriate. Detailed design execution is conducted during Phase C, culminating in a series of Critical Design Reviews (CDRs) for the system of interest and all of its components.

4.1.7.5 Inputs1,3
Available and necessary inputs required for the Design process range from analyses, assessments, and studies to plans and requirements. These inputs will occur during various stages of development. Some will be conceptual or preliminary documents and only partially complete, while others will be fully complete and approved. Some of these inputs are further refined or completed during the Design process and are listed as outputs. The following provides a listing of typical inputs that should be considered during design:

• Results from evaluations, analyses, and studies of cost/effectiveness, risks, life cycle cost, logistics, support, producibility, reliability, human factors, maintainability, safety/hazards, specialty engineering, environmental, feasibility, trades, and failure modes and effects
• Concepts, including system, functional, operations, integrated logistics support, and integration and assembly
• All requirements, including top-level project, science, system/segment, allocated, disposal, flow down, refined, interface, lower level, performance, technology, development, verification, and validation
• Plans in various stages of development, including PMPs, SE management plans, CM plans, integration and assembly plans, integrated logistics support plans, QA plans, risk management plans, system safety plans, verification and validation plans, etc.

Other inputs include:
• Needs, goals, and objectives
• Assumptions, guidelines, and constraints
• Performance measures, including budgets and margins
• Design disclosure
• Product breakdown structure
• System specifications
• Applicable standards
• Verification requirements matrix
• ICDs
• Design-to specifications
• Design selection with rationale
• Development test results
• Engineering items
• Hardware/software list
• Integrated schematics
• Material and processes data

4.1.7.6 Steps4
Figure 4.1-7 illustrates the major steps and products of the Design process. This process begins with outputs from decomposition and ends with a fully characterized design solution. More detailed activities associated with these steps are provided below.
NOTE: Reference NPR 8705.2 for space flight systems that carry humans or whose function or malfunction may pose a hazard to NASA space systems that carry humans.

• Detailed Alternative Solutions and Selection Criteria Shall Be Developed – Various alterna-tive design solutions and association evaluation criteria should be established from which an op-timal design solution can be selected. These al-ternative solutions are refined and evaluated until a preferred design solution is selected. − Develop Evaluation Criteria – Evaluation

criteria should be established that address design issues for the life of the product.

− Synthesize and Down Select Optimal Solution – Establish project and system specifications from which potential solutions can be selected.

− Select Optimal Solution – Select and docu-ment product/components of the system of interest that best satisfies evaluation criteria.

• Preliminary Design Shall Be Established – Establish system of interest capabilities and architecture, including partitions, component identifications, system states and modes, major inter-component interfaces, and external interfaces.
− Analyze and Refine Requirements – Detail and update specifications, flow down requirements to the component level, and establish disposal and interface requirements. Use the Requirements Development process (Section 4.1.1) as required.

− Perform Design Analyses – Capture design analyses and trade study results in reports as required. The Systems Analysis process (Section 4.1.13) is applied here as necessary.


Figure 4.1-7. Design process diagram.

− Perform Engineering Development Tests – Identify engineering items and capture development test results. Engineering tests are conducted on test units created to resemble actual components to establish confidence that the design will function in the expected environment.

− Define Interfaces – Update the ICD and create integration schematics.

− Perform Preliminary Design – Prepare the design disclosure, create an engineering parts list, and generate a hardware/software list. Complete the “design-to” baseline.

− Complete Plans and Documentation for Qualification Items.

− Evaluate, Verify, and Validate Preliminary Design – Check the preliminary design for cost-effectiveness, life cycle cost, and other specialty engineering considerations. Use the Verification and Validation processes (Sections 4.1.14 and 4.1.15) as necessary.

• Detailed Design of the System of Interest Shall Be Developed – The system of interest structure and capabilities are defined in sufficient detail for conducting attainment, integration, verification, and validation.
− Perform Detailed Design – Update design disclosure, produce material and processes data, and develop integration and assembly plans. Complete the “build-to” baseline.

− Design and Control Detailed Interfaces – Update interface control documentation.

− Perform Engineering Tests – Capture/update development test results and finalize engineering items. Test units that closely resemble actual components are evaluated to ensure the design will operate as expected in the target environment.

− Fabricate/Test Qualification Items – Produce qualification items and capture qualification test results.

− Evaluate, Verify, and Validate Detailed Design – Check the final design to assure that it meets cost-effectiveness and life cycle cost goals and other specialty engineering considerations. Use the Verification and Validation processes (Sections 4.1.14 and 4.1.15) as necessary.

− Complete Detailed Design and Production Plans – Create “build-to” specifications and verification requirements and specifications.

4.1.7.7 Outputs In general, the output of the Design process is a technical data package that evolves over the project life cycle to ultimately include all remaining lower-level requirements and designs and “build-to” specifications at all levels. Depending on the stage of the design, this technical data package includes:

• Results from analysis and refinement of requirements

• Evaluations of cost/effectiveness, environmental factors, failure modes and effects analysis (FMEA), logistics support, reliability, maintainability, safety and hazards, and life cycle costs

• Revisions to plans such as the program management plans, SE management plans, integrated logistics support plans, S/W QA plans, verification and validation plans, risk management plans, etc.

• System specification
• Evaluation criteria
• Operations concept
• System concept and architecture
• Design disclosures
• Product breakdown structure
• Technology development requirements
• ICDs
• Integration and assembly concept and design
• Integrated logistics support concept
• Documented selection, including rationale
• Design analysis reports
• Trade analysis and results
• Electronic parts list
• Hardware/software list
• Integrated schematics
• Instrumentation program and command list
• Material and process data
• Integration and assembly design
• Development test results
• Engineering items
• Verification requirements and specifications

4.1.7.8 Exit criteria1,4 Although design may occur throughout the remainder of the system of interest life cycle, most design activities are considered complete when CDR requirements are satisfied. Therefore, to transition these design components from the Design process to the Attainment process (Section 4.1.8), the following criteria must be satisfied:

• A design solution (i.e., a build-to specification) shall be baselined and documented that satisfies system of interest requirements and meets stakeholder needs.

• Documented readiness to attain, verify, and validate the system of interest shall be established.

• Customer signature approval of updated internal task agreements (ITAs) or statements of work (SOWs) with firm cost to complete shall be acquired.

• Safety and Mission Assurance Review Team (SMART) approval of the safety data package shall be provided.

• Successful completion of subsystem design reviews – i.e., PDR, CDR – shall be documented.

• Successful completion of system of interest design reviews – i.e., PDR, CDR – shall be documented.

4.1.7.9 Measurement The table later in this section provides example base and derived measures that can be used in conjunction with executing the Design process. See discussion of Measurement on page 4-1.
4.1.7.10 Methods and techniques3 Methods and techniques used in the Design process depend, to some degree, on the nature of the system of interest. However, typical methods and techniques include:

• Brainstorming
• Literature searches
• Trade studies
• Analysis of risks, hazards, failure modes and effects, reliability, cause-consequence, etc.
• Dimensioning and tolerancing
• Surveys
• Vendor inquiries
• N2 charts
• System schematics
• Interface diagrams
• Tables and drawings of detailed interface data
• QFD
• Layout sketches
• Decision trees
• Functional flow diagrams
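The trade studies and decision trees listed above are often supported by a simple weighted-scoring comparison of candidate design solutions against the documented evaluation criteria. The following Python sketch is illustrative only; the criteria, weights, and scores are hypothetical placeholders, not values from this document, and would come from the project's own evaluation criteria and analyses.

```python
# Hypothetical weighted trade study for down-selecting a design solution.
criteria = {                 # evaluation criteria and relative weights (sum to 1.0)
    "life cycle cost": 0.30,
    "performance":     0.25,
    "safety/hazards":  0.25,
    "producibility":   0.20,
}

# Raw scores (1 = poor, 5 = excellent) assigned by the design team.
alternatives = {
    "Concept A": {"life cycle cost": 4, "performance": 3, "safety/hazards": 5, "producibility": 3},
    "Concept B": {"life cycle cost": 2, "performance": 5, "safety/hazards": 4, "producibility": 4},
    "Concept C": {"life cycle cost": 5, "performance": 2, "safety/hazards": 3, "producibility": 5},
}

def weighted_score(scores):
    """Weighted-sum figure of merit for one alternative."""
    return sum(weight * scores[name] for name, weight in criteria.items())

ranked = sorted(alternatives, key=lambda alt: weighted_score(alternatives[alt]), reverse=True)
for alt in ranked:
    print(f"{alt}: {weighted_score(alternatives[alt]):.2f}")
print("Preferred solution:", ranked[0])
```

Recording the criteria, weights, scores, and resulting ranking also supports the "documented selection, including rationale" output listed in Section 4.1.7.7.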

4.1.7.11 Software tools Designing a system of interest involves the potential use of numerous tools of various sizes. These range from tools intended to help express the physical characteristics of the system of interest to tools that are used to analyze associated design parameters. There are concept development tools, system safety and reliability analysis tools, design-related analytical tools, graphical data development and interpretation tools, statistical tools, QA tools, and trend analysis tools. Other tools include computer-aided design tools; drafting tools for preparation of drawings and schematics; and tools that provide archival, CM, and review of design documentation (e.g., design and data management system (DDMS)). For a current and extensive listing of commercially available design tools, see the table provided by INCOSE at: http://www.incose.org/tools/eia632tax/eia632top.html. This table lists tools that support the solution definition requirements of EIA-632.


4.1.7.12 References The following documents, which were used to prepare this section, offer additional insights into the Design process:

1NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements.
2EIA-632, Processes for Engineering a System, ANSI/EIA-632-1998, 1999.
3INCOSE Systems Engineering Handbook, Version 2.0, 2000.
4SP 6105, NASA Systems Engineering Handbook, 1995.
5CMMI-SE/SW/IPPD/SS V1.1, Capability Maturity Model Integration (CMMI) for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing, 2002.

Base Measures and Derived Measures
• Base: # of Design Elements (e.g., # of subsystems, # of H/W configuration items (CIs), # of S/W CIs). Derived: Planned vs. Actual Design Elements.
• Base: Number of Requirements Not Met by Design Solution. Derived: Planned vs. Actual Requirements Met by Design Solution.
• Base: # of Allocated Requirements (to subsystems, H/W, S/W). Derived: % Allocated Requirements – to Subsystems, H/W CIs, S/W CIs; % Requirements Not Allocated.
• Base: # of Requirements Verified by Analyses. Derived: Planned vs. Actual Requirements Verified.
• Base: # of “TBDs” in Design Solution. Derived: Planned vs. Actual TBDs.
• Base: # of Unresolved Interface Issues. Derived: Planned vs. Actual Unresolved Interface Issues.
• Base: Elements in Design Solution Defined. Derived: Planned vs. Actual Elements Defined.
• Base: Design Milestone Dates. Derived: Milestones – Planned vs. Actuals; Design % Complete – Planned vs. Actuals.
• Base: Design Effort (FTEs). Derived: Design Productivity; Design Effort – Planned vs. Actuals; Design Rate Charts; Design Effort as % Total Engineering Effort.
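As a worked illustration of how the derived measures follow from the base measures above, the short sketch below computes a few of them; all numbers are hypothetical, not project data.

```python
# Hypothetical base measures for the Design process; illustrative only.
base = {
    "design_elements_planned": 40,
    "design_elements_actual":  34,
    "requirements_total":      220,
    "requirements_allocated":  187,
    "design_effort_fte":       6.5,   # design effort (FTEs)
    "engineering_effort_fte":  21.0,  # total engineering effort (FTEs)
}

derived = {
    "Planned vs. Actual Design Elements (%)":
        base["design_elements_actual"] / base["design_elements_planned"] * 100,
    "% Allocated Requirements":
        base["requirements_allocated"] / base["requirements_total"] * 100,
    "% Requirements Not Allocated":
        (base["requirements_total"] - base["requirements_allocated"])
        / base["requirements_total"] * 100,
    "Design Effort as % Total Engineering Effort":
        base["design_effort_fte"] / base["engineering_effort_fte"] * 100,
}

for name, value in derived.items():
    print(f"{name}: {value:.1f}")
```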


4.1.8 Attainment1,2,3 The Attainment process is performed to transform each system of interest solution into an actual product. Attainment is accomplished through buying, building, supplying, or coding the product as determined by the trade studies. This process is executed to procure, acquire, and/or manufacture end items associated with each level of the system of interest. Resulting system of interest end items may then be assembled, integrated into other systems of interest, verified, and validated. See the Integration, Verification, and Validation processes (Sections 4.1.9, 4.1.14, and 4.1.15) for further details regarding the disposition of attained end items. The result of successfully completing this process is a fully integratable system of interest that satisfies its specified requirements. System of interest integration preparation during attainment ensures that both internal and external interfaces function according to requirements; that designed states, modes, dynamic allocations, or other operational switching functions perform as required; and that any designed overload conditions, reduced operational levels, or designed-in degraded modes of operations are included. Example characteristics of the Attainment process are: S/W is coded, data are documented, services are performed and documented, electrical and mechanical parts are fabricated, product-unique manufacturing processes are put into operation, processes are documented, and facilities are constructed.
4.1.8.1 Function1 In the Attainment process:

• The documented solution shall be made available and communicated among individuals involved with transforming the solution into a product.

• Procedures and work instructions and supporting enabling systems shall be in place for transforming the solution into a product.

• The solution shall be transformed into a product.
4.1.8.2 Objective1 The objective of the Attainment process is to build subsystems of interest and prepare these subsystems for progressive integration by creating the overarching system of interest while developing confidence that the results will meet system of interest requirements.
4.1.8.3 Responsibilities4 In the Attainment process, the Lead Systems Engineer is primarily responsible for requirements baseline maintenance and requirements feedback to design. The Lead Systems Engineer ensures and manages the technical integrity of the requirements baseline, continually updating it as various changes are imposed on it during the Attainment process and communicating changes to relevant stakeholders. The Lead Systems Engineer is also responsible for monitoring activities during the Attainment process to identify the potential effects of any issues that may surface in other parts of the system of interest or its external interfaces. Finally, the Lead Systems Engineer updates technical plans and schedules, as necessary, to reflect the technical resolution of these issues. Actual solution attainment is the responsibility of the Subsystem Lead Engineers. By working closely with the Lead Systems Engineer, the Subsystem Lead Engineers ensure subsystem solutions are transformed into appropriate system of interest solutions (i.e., fabricated items, S/W code, acquired components) that are ready for further integration and test.
4.1.8.4 Life cycle1 Attainment begins at the point where the documented solution (i.e., the “build-to” specification) for the system of interest is considered fixed and ends with the successful completion of system of interest acceptance. This corresponds to the end of Phase C, Design, and the initial stages of Phase D, Development.
4.1.8.5 Inputs1,5 The primary inputs to attainment are:

• Baselined “build-to” specifications that satisfy requirements and meet stakeholder needs.

• Documented readiness to produce, verify, and validate the system of interest.

Inputs into the Attainment process will arrive at various stages of completion. Some will be conceptual or preliminary documents and only partially complete, while others will be fully complete and approved. Following is a listing of typical inputs:

• System specification
• Technology development requirements
• Documented selection with rationale
• Factors such as cost/effectiveness, environmental, FMEA, logistics support, producibility, R&M, safety/hazard analysis, and life cycle costs
• Acceptance criteria
• Material and process data
• Electronics parts list
• Hardware/software list
• Users’ manual
• Plans such as the PMP, QA plan, acceptance plan, development plan, safety plan, reliability plan, integration and assembly plans, verification and validation plans, qualification item plan, etc.

4.1.8.6 Steps5 The following diagram (FIG. 4.1-8) illustrates the major steps and products of the Attainment process. This process begins with outputs from the Design process (Section 4.1.7) and ends with a fully tested system of interest. More detailed activities associated with these steps are provided below.
Figure 4.1-8. Attainment process diagram.
The basic steps in the Attainment process are:

• Products Shall Be Acquired, Fabricated, or Coded
− Communicate Solution – The documented solution is made available and communicated among those individuals involved in transforming the solution into a product.
− Receive, build, or reuse the end item(s); i.e., the lowest-level items in the system of interest architecture.
− Fabrication is conducted in accordance with the “build-to” design and manufacturing/production plans.

• Product(s) Shall Be Verified and Validated Against Requirements (Verification and Validation processes, Sections 4.1.14 and 4.1.15)
− Ensure the product(s) satisfy requirements.
− Ensure the item(s) satisfy the intent of the “build-to” solution.
• If more than one end item is required to create the system of interest:
− Product(s) shall be assembled.
• Assemble in accordance with the “build-to” design and assembly plans.
− Assemblies shall be verified (Verification process, Section 4.1.14).
• Verify any assemblies in accordance with the verification plan.
− Assemblies shall be validated (Validation process, Section 4.1.15).
• Validate any assemblies in accordance with the validation plan.

Other activities occur during the Attainment process that facilitate either end item acquisition or fabrication activities. These activities include:

• Develop Procedures – Procedures are required to fabricate end items. These procedures provide the specific instructions necessary at each stage of fabrication to create the item.

• Provide Enabling Systems – Define, plan, and establish any enabling systems needed to support fabrication of the end items. Examples of enabling systems include qualification unit and ground support equipment.

• Prepare User’s Manual – As the end item is acquired or fabricated, initiate preparation of the user’s manual.

• Prepare Maintenance Manuals – Begin maintenance manual development during the Attainment process.

• Train System Operators and Maintainers – Initiate training for system operators and maintainers.

• Document Lessons Learned – Lessons learned provide information for Attainment process refinement.

• Finalize Test Procedures – Finalize test procedures for acceptance testing.

4.1.8.7 Outputs2,5 Primary outputs from the Attainment process, which are completed, validated, and verified products, are:


• Acquired H/W, S/W, firmware end products or composites of system of interest products built or coded to their specified requirements, drawings, or descriptive documents; or other needed physical entities (e.g., trained personnel, certified facilities, special techniques, manuals).

• Lessons learned.
The following provides a listing of typical outputs:

• Evaluation, verification, and validation of design factors such as cost/effectiveness, environmental, producibility, FMEA, logistics support, reliability, maintainability, safety/hazard analysis, and life cycle costs.

• End items
• Support items
• Spares
• QA results
• As-built documentation
• Technical manuals and data
• User’s manual
• Integrated system
• Support equipment
• Operations procedures
• Delivered/installed system
• Final documentation
• Trained personnel
• Waivers
• Phase III safety data package
• Final test procedures

4.1.8.8 Exit criteria The following shall be accomplished to satisfy exit from this process:

• A system of interest is developed or acquired.
• Verification and validation of the system of interest is accomplished and documented.
4.1.8.9 Measurement The table later in this section provides example base and derived measures that can be used in conjunction with the Attainment process. See discussion of Measurement on page 4-1.
4.1.8.10 Methods and techniques4 Methods and techniques include:

• CM of requirements baseline
• Functional analysis tools:
− N2 charts
− Functional flow diagrams

4.1.8.11 Software tools Many S/W tools are required to attain system of interest end items. The nature of these tools depends on whether the end item is a fabricated item, a S/W item, or an acquired item. Most of the tools are also outside the domain of SE – i.e., they are unique tools associated with fabrication, S/W development, or acquisition – and, as such, are not referenced in this document. The SE tools used in this process are tools used for requirements management and, if necessary, functional analysis. For a list of potential tools, see the table provided by INCOSE at: http://www.incose.org/tools/eia632tax/eia632top.html.

Base Measures and Derived Measures
• Base: Defects in Design (by phase). Derived: Design Defects – Projected vs. Actuals.
• Base: Milestone Dates (reviews, change control boards, baseline document publication). Derived: Milestones – Planned vs. Actuals.
• Base: Code Effort (FTEs). Derived: Code Productivity; Code Effort – Planned vs. Actual; Code Rate Charts; Code Effort as % Total Engineering Effort.
• Base: Code Size (# lines of code). Derived: Code Size – Planned vs. Actual; Code % Complete – Planned vs. Actuals.
• Base: Defects in S/W Code (by phase). Derived: Code Defects – Projected vs. Actuals.
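For the coding-oriented measures above, the derived values follow directly from the base measures; the sketch below uses hypothetical numbers purely to show the arithmetic, not actual project data.

```python
# Hypothetical Attainment base measures; illustrative only.
code_size_planned = 50_000          # planned lines of code
code_size_actual  = 41_500          # lines of code completed to date
code_effort_fte   = 8.0             # code effort (FTEs)
engineering_fte   = 21.0            # total engineering effort (FTEs)
defects_projected = 60
defects_actual    = 14 + 37 + 9     # S/W code defects found, by phase

print(f"Code % complete:           {code_size_actual / code_size_planned * 100:.1f}%")
print(f"Code productivity:         {code_size_actual / code_effort_fte:,.0f} lines of code per FTE")
print(f"Code effort as % of engr.: {code_effort_fte / engineering_fte * 100:.1f}%")
print(f"Code defects, projected vs. actual: {defects_projected} vs. {defects_actual}")
```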


4.1.8.12 References The following documents, which were used to prepare this section, offer additional insights into the Attainment process:

1NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements.
2EIA-632, Processes for Engineering a System, ANSI/EIA-632-1998, 1999.
3CMMI-SE/SW/IPPD/SS V1.1, Capability Maturity Model Integration (CMMI) for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing, 2002.
4INCOSE Systems Engineering Handbook, Version 2.0, 2000.
5SP 6105, NASA Systems Engineering Handbook, 1995.


4.1.9 Integration1,2 The Integration process is performed to interconnect distinct and separable elements to form a complete final system for verification. An integral part of ensuring physical system integration success is comprehensive integration planning; i.e., continually checking that all of the pieces remain compatible and that they will support the intended product. In this activity, each system element is analyzed to ensure compatibility with other elements within the system of interest. Compatibility with other (external) enabling systems is also verified. This assessment ensures that each system element can be adequately accommodated with respect to the basic constraints imposed by the system of interest and the final integrated system. Another key aspect of the Integration process is management of internal and external interfaces of the system of interest. Physical integration is more than just a one-time assembly of elements at the conclusion of design and attainment. Integration is conducted incrementally, using an iterative process of assembling elements, evaluating these elements, and assembling more elements. This process steadily progresses through increasingly more realistic incremental functionality until the final system is achieved. There is a higher probability that a system of interest that has been integrated in this manner will pass verification and validation. For some systems, the last integration phase will occur when the system is deployed at its intended operational site. Execution of the physical interconnection system integration activities follows the SE Design and Attainment processes (Sections 4.1.7 and 4.1.8) and results in a fully integrated system that will be verified by the SE Verification process (Section 4.1.14).
4.1.9.1 Function1 In the Integration process:

• Planning for the needs (i.e., H/W, S/W, human factors, facility, personnel, etc.) to integrate individual elements into the system of interest shall be performed.

• Procedures and work instructions and supporting enabling systems shall be put into place for integrating the products.

• The configuration of products being integrated, as well as the integrated products as being representative of the configuration in which the system of interest will be used, shall be validated.

• Testing of the interface(s) shall be performed.
• Integration planning shall be performed to ensure success in the physical and operational integration of the system of interest.

4.1.9.2 Objective2,3 The primary objective of this process is to manage activities to achieve complete system integration through the progressive assembly of elements according to a defined integration sequence and procedures. The Integration process integrates end products, enabling systems, and external interfacing systems as appropriate to the level of the system of interest and as required for verification.
4.1.9.3 Responsibilities4 The Lead Systems Engineer is responsible for overall management of the Integration process, including the integration sequence. In this role, the Lead Systems Engineer ensures integration planning activities are performed, Test Readiness Reviews (TRRs) are conducted, and only verified CIs are integrated into the next-higher assembly for further verification. The Verification Lead supports the Lead Systems Engineer to ensure that development and documentation of system-level integration and test plans are accomplished and that the required integration facilities have been verified to provide the necessary conditions, have been scheduled, and are available.
4.1.9.4 Life cycle Integration is an activity that exists at all points in the project life cycle. The “Vee” chart, which appears as a foldout on page 35, represents systems definition and development in three dimensions. The horizontal axis of the chart is notional time. The vertical axis represents levels of system/element decomposition. The further down on the chart, the more detailed the system information and, hence, the system depth. Implicit in the chart is an axis perpendicular to the paper that represents the increasing number of subsystems identified and being decomposed in parallel (system breadth). On the upward Integration and Verification leg (i.e., the right side of the “Vee”), integration activities consist of executing the integration and verification plans developed on the accompanying downward leg level. The hard work for integration takes place on the downward leg. Integration in the downward leg is applied at each baseline level to ensure compatibility is maintained during system definition and design. While descending this leg, the process remains the same except for the level of detail analyzed. The horizontal lines going across the “Vee” chart indicate the baseline levels and reflect the connection between the integration plan and the assembly of elements into subsystems on the upward leg of the “Vee.” Simultaneously, analysis is taking place to assure integration on the perpendicular axis. The 3-way analysis approach produces detailed element-to-element integration and


optimizes the system with respect to overall effectiveness (rather than sub-optimizing the system by optimizing each element for its maximum performance). Integration in the downward leg consists of identifying all interfaces (physical, electrical, data/information, and operational) and documenting the corresponding interface requirements in ICDs. As for any requirement, interface requirements in the ICDs must have verification plans written as part of the baseline. The context for the verification plans for the interface requirements in the ICDs is that baseline level’s operations scenarios/concepts/plans/procedures. Finally, integration plans must be developed at that baseline level to describe how the elements will be assembled or integrated to be ready for test and verification. Commensurate ICDs, verification plans, operational concepts, and integration plans, which are all written at the same baseline level on the downward leg of the “Vee,” control the execution of integration and verification at the same baseline level on the upward leg of the “Vee.”
4.1.9.5 Inputs2,3,5 Typical inputs to this process include, but are not limited to:

• Operations concept
• ICDs
• Development schedule and status
• System requirements
• Design documentation
• Validated products to be integrated
• Product breakdown structure
• System hierarchy
• System architecture (functional, physical)

4.1.9.6 Steps2,3,4,6 The following diagram (FIG. 4.1-9) illustrates the major steps and products of the Integration process. The Lead Systems Engineer is responsible for ensuring all major steps in the Integration process are performed. Adherence to the major steps of this process directs the project toward meeting its integration requirements. For each major process step, additional guidance is provided as to the expected typical sub-process activities and products.

• Integration Planning Steps
− An integration plan and process shall be established.
• Develop the overall project plan and process to achieve system integration and include this as part of the Systems Engineering Management Plan (SEMP).
• Define the project integration responsibilities, especially as they relate to the project acquisition strategy (Section 4.3.1).
• Provide the project manager with cost and schedule inputs related to integration engineering life cycle activities in support of the project planning effort.
• Ensure a conceptual version of the integration plan is available at the SDR, a preliminary version is available at the PDR, and a final/approved version is available at both the CDR and the Production Readiness Review (ProRR).

Figure 4.1-9. Integration process diagram.


− An integration sequence shall be determined (a sequencing sketch derived from interface dependencies appears after these steps).
• Identify the products to be integrated.
• Identify the product integration verifications to be performed using the definition of the interfaces between products.
• Identify alternative product integration sequences.
• Select the best integration sequence.
• Periodically review the product integration sequence and revise, as needed, to ensure that variations in production and delivery schedules have not adversely impacted the sequence or compromised the factors on which earlier decisions were based.
• Record the rationale for decisions made and deferred.

− An integration environment shall be established.
• Identify requirements for the product integration environment.
• Identify verification criteria and procedures for the product integration environment.
• Decide whether to make or buy the needed product integration environment.
• Develop the integration environment if a suitable environment cannot be acquired.
• Maintain the product integration environment throughout the project.
• Dispose of those portions of the environment that are no longer useful.
− Integration procedures and criteria shall be established.
• Establish and maintain integration procedures for the products.
• Establish and maintain criteria for product integration and evaluation.
• Establish and maintain criteria for validation and delivery of the integrated product.
• Continually Perform Integration Planning

− Horizontal integration shall be performed to:
• Monitor and integrate system definition activities and results at the lower levels of the hierarchy to ensure compatibility.
• Monitor and integrate the preliminary design activities and results at lower levels of the hierarchy to ensure overall system integrity. This monitoring and integration includes tracking interfaces, TPMs, allocations, etc.
• Track parameters, budgets, and interfaces as final design progresses to ensure that the design will fit together and work, and to facilitate later physical integration of the system.

• Monitor and address lower-level hierarchy effects and issues to the system level.

• Interface Compatibility Steps
− Review and management of interface descriptions shall be performed.
• Review interface data for completeness and ensure complete coverage of all interfaces.
• Ensure that products and interfaces are marked to ensure easy and correct connection to the joining product.
• Periodically review adequacy of interface descriptions.
• Ensure compatibility of the interfaces throughout the life of the product.
• Resolve conflict, noncompliance, and change issues.
• Maintain a repository for interface data that is accessible to project participants.

• System Integration Steps
− Product readiness for integration shall be confirmed.
• Perform hardware/software integration (HSI) prior to the PDR to establish confidence that the H/W and S/W design concepts are adequate to meet functional interfaces.
• Perform HSI prior to CDR on engineering units to establish confidence that the H/W and S/W detailed designs meet requirements.

• Track status of all products as soon as they become available for integration.

• Ensure products are delivered to the product integration environment in accordance with the product integration sequence and available procedures.

• Confirm receipt of each properly identified product.

• Ensure each received product meets its description.

• Check configuration status against the expected configuration.

• Perform a pre-check (e.g., by performing a visual inspection and using basic measures) of all physical interfaces before connecting products.

− Assemble Products – Integration of system elements shall proceed in accordance with the integration plan.
• Ensure readiness of the product integration plan.
• Ensure the assembly sequence is properly performed.
• Revise the product integration sequence and available procedures as appropriate.


• Recommend verification and validation plans/procedures updates based on integration outcomes.

− Assembled product components shall be evaluated.
• Conduct a TRR of assembled products.
• Record the TRR results.
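As referenced under the integration sequence step above, selecting a sequence can be prototyped as a dependency ordering over the products to be integrated. The sketch below uses hypothetical product names and "must be integrated before" relationships drawn from notional ICDs; an actual sequence selection would also weigh delivery schedules, facility availability, and risk.

```python
# Hypothetical products and integration dependencies; illustrative only.
from graphlib import TopologicalSorter  # Python 3.9+ standard library

# Each product maps to the set of products that must be integrated before it.
depends_on = {
    "avionics box":    set(),
    "harness":         set(),
    "structure":       set(),
    "avionics pallet": {"avionics box", "harness"},
    "vehicle element": {"avionics pallet", "structure"},
}

sequence = list(TopologicalSorter(depends_on).static_order())
print("Candidate integration sequence:", " -> ".join(sequence))
```

Re-running the ordering as production and delivery schedules change supports the periodic review of the integration sequence called for above.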

4.1.9.7 Outputs4 Primary outputs from this process are:

• Integration plans
• System integration process (to include in SEMP)
• Integration procedures and criteria
• Integration sequence
• Integration environment
• Integration environment requirements
• TRR results
• Verification and validation plan/procedure updates
• Fully integrated system

4.1.9.8 Exit criteria Exit criteria include:

• Attainment of a fully integrated system of interest.

• Successful TRR in preparation for verification and acceptance of the system.

4.1.9.9 Measurement The following table provides example base and derived measures that can be used in conjunction with executing the Integration process. See discussion of Measurement on page 4-1.

Base Measures and Derived Measures
Requirements Management Measures
• Base: Total # of Integration Environment Requirements (e.g., # of shalls). Derived: Integration Requirements Managed – Planned vs. Actuals.
• Base: # of Integration Environment Requirements Added, Changed, Deleted. Derived: Integration Requirements Volatility = % Added, Changed, Deleted.
Requirements Development Measures
• Base: Total # of Shalls for Integration Environment. Derived: Total # of Integration Requirements – Planned vs. Actuals.
• Base: Integration Environment Requirements Effort (FTEs). Derived: Integration Requirements Definition Productivity; Integration Requirements Definition Effort – Planned vs. Actuals; Integration Requirements Definition Rate Charts; Integration Requirements Definition Effort as % Total Engineering Effort.
Technical Solution (TS) Measures
• Base: # of Products to Integrate. Derived: Integration Size – Planned vs. Actual; % Integration Completed – Planned vs. Actuals.
• Base: Integration Effort (FTEs). Derived: Integration Productivity; Integration Effort – Planned vs. Actuals; Integration Rate Charts; Integration Effort as % Total Engineering Effort.
• Base: Number of Integration Anomalies. Derived: Integration Anomalies – Projected vs. Actuals.
• Base: Number of Interface Defects. Derived: Interface Defects – Projected vs. Actuals.
Planning Measures
• Base: Integration Planning Milestone Dates (e.g., due dates on planning checklist). Derived: Integration Planning % Complete – Planned vs. Actuals; % Integration Planning Milestones Met – Planned vs. Actuals.
• Base: Integration Planning Effort (FTEs). Derived: Integration Planning Productivity; Integration Planning Effort – Planned vs. Actuals; Integration Planning Effort as % Total Engineering Effort.
• Base: Integration Plan Status. Derived: % Integration Plan Reviewed; % Approved by Identified Stakeholders.
Monitoring and Control Measures
• Base: Tracking Integration Milestone Dates. Derived: % Tracking Integration Milestones Met – Planned vs. Actuals.
• Base: Integration Issues/Actions Status. Derived: % Integration Issues Closed.
• Base: Overall Integration Management Effort. Derived: Integration Management Effort – Planned vs. Actuals; Integration Management Effort as % Total Engineering Effort.
CM Measures
• Base: CI Status. Derived: CI Status Summary.


4.1.9.10 Methods and techniques5 Several methods and techniques are available. These include:

• Integration Maps/Trees – An integration map or tree shows how elements are integrated to form the system of interest. Each node represents an element of the system of interest.

• N2 Diagramming – A matrix displaying functional interactions, or data flows, at a particular hierarchical level.

• Simulations – For example, the use of threads, rapid prototypes, virtual prototypes, and physical prototypes. The degree of virtual vs. physical prototyping used to support element integration depends on the functionality of the design tools, the complexity of the element, and the risk associated with that element.

• IFWGs – Can be established to review interface statements/drawings, and are a good means of ensuring direct interaction of all parties to the system of interest.

Horizontal integration methods and techniques include:

• Mathematical models (e.g., to ensure the power consumption needs of the overall system can be met by the power supplied by or to the system; or to integrate experiments, payloads, or payload complements at the rack, pallet, lab, or partner levels for purposes of transportation, accommodation, or operations)

• Data collection and analysis
• Compatibility analysis
• Impact assessments
• Planning and scheduling analyses
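An N2 diagram can be prototyped as a square matrix with the system elements on the diagonal and a mark wherever one element interfaces with another. The sketch below prints such a matrix for hypothetical elements and interfaces; as noted in Section 4.1.9.11, a basic spreadsheet serves the same purpose.

```python
# Hypothetical elements and directed interfaces; illustrative only.
elements = ["GN&C", "C&DH", "Power", "Thermal"]
interfaces = {("GN&C", "C&DH"), ("C&DH", "GN&C"), ("Power", "C&DH"),
              ("Power", "Thermal"), ("C&DH", "Thermal")}

width = max(len(e) for e in elements) + 4
print(" " * width + "".join(e.center(width) for e in elements))
for src in elements:
    row = src.ljust(width)
    for dst in elements:
        if src == dst:
            cell = "[" + src + "]"              # diagonal holds the element itself
        else:
            cell = "X" if (src, dst) in interfaces else "."
        row += cell.center(width)
    print(row)
```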

4.1.9.11 Software tools5 A standard office automation word processor is sufficient for documenting the integration-related plans, processes, procedures, requirements, analyses, and criteria developed and documented under this process. Once developed, these documents should be placed under CM control to maintain, control, and support their evolution. Also, to ensure that proper configuration of the elements required for integration is obtained, a standard CM tool should be employed to document a concurrent baseline that is consistent with the output of the project. The use of a basic office automation spreadsheet and graphical tools is applicable in the development of N2 charts and integration sequences (maps, trees), respectively. Mathematical S/W tools should be considered to support the development and execution of mathematical models during analytical integration activities.
4.1.9.12 References The following documents, which were used to prepare this section, offer additional insights into the SE Integration process:

1NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements.
2CMMI-SE/SW/IPPD/SS V1.1, Capability Maturity Model Integration (CMMI) for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing, 2002.
3EIA-632, Processes for Engineering a System, 1999.
4SP 6105, NASA Systems Engineering Handbook, 1995.
5INCOSE Systems Engineering Handbook, Version 2.0, 2000.
6NPR 7120.5B, NASA Program and Project Management Processes and Requirements, 2002.


4.1.10 Technical Work and Resource Management1,2

The Technical Work and Resource Management process is performed to plan how technical aspects of the work will be accomplished and to identify the resources that are necessary to accomplish this work. Execution of this process by the SE effort provides technical inputs to the overall project work and resource planning effort. There are two basic aspects to this process: the plan and organize aspect, and the monitor and control aspect. The plan and organize aspect identifies technical needs and constraints in support of the overall project planning effort. Technical requirements, which define the structure required to bring the system of interest into being, include identification, integration, and scheduling of all engineering functions and tasks; work breakdown structure (WBS) development; organizational structure definition (as related to the project); and descriptions of or references to key policies, standards, and processes. The results are documented in an SE management approach, often called the SEMP, which relates technical requirements to project requirements, providing the structure to guide and control the integration of engineering activities needed to achieve SE objectives that are consistent with a top-level PMP. The monitor and control aspect provides visibility into technical progress and risks, and detection of variances needing corrective action. Monitoring requires tailoring the level of control to the complexity and risk of the project, tracking data for the level of control, and initiating corrective action when measures do not meet expected results. Controlling requires setting thresholds of control limits and activating corrective actions based on risk analysis. A review process compares the results against the technical effort’s documented estimates, commitments, and plans. Adequate visibility enables timely corrective action to be taken before performance deviates significantly from plans.

• Technical content and technical products (e.g., documents, drawings) of the work to be accomplished shall be defined.

• Detailed technical schedule estimates necessary to accomplish the work shall be defined.

• Technical skills and capabilities necessary to perform the work shall be identified.

• Resource estimates (e.g., cost, labor) to perform the work shall be provided.

• Status relative to the cost, schedule, and technical progress of the work shall be provided.

4.1.10.2 Objective1 The objective of the Technical Work and Resource Management process is to document and status the SE effort required to accomplish the goals and objectives of the system of interest. This includes describing how the efforts are carried out, who accomplishes them, how they are controlled, and how technology is transitioned from the technology base to system of interest products. Basically, the objective of this process is to establish the SE management approach – whether it is documented in an SEMP or some other document – and record and report status against it.
4.1.10.3 Responsibilities3 The Lead Systems Engineer is responsible for managing the SE effort throughout the technical aspects of the project life cycle. This includes managing the system of interest decomposition and definition processes as well as the Integration, Verification, and Validation processes (Sections 4.1.9, 4.1.14, and 4.1.15). Attendant with this is the requirement to control the completeness and integrity of project technical baselines, ensuring an efficient and logical progression through these baselines while maintaining cost and schedule consistent with the business baseline. SE audits the design and coding process and the design engineering solutions for compliance to all higher-level baselines. SE also ensures that concurrent engineering is properly applied throughout the project life cycle by involving the required specialty engineering disciplines. The SEMP is the guiding document for all these activities. The Project Manager approves the SEMP, ensures that SEMP requirements and plans are properly integrated into the PMP, and allocates project resources to execute the SEMP. (NOTE: At the discretion of the Project Manager, the SEMP and the PMP may be combined into a single document.)
4.1.10.4 Life cycle The Technical Work and Resource Management process begins at the earliest stages of the project life cycle and continues throughout the project. Planning for and development of the SEMP is one of the first planning activities of any project. The SEMP then becomes the baseline against which SE tasks, budgets, and schedules are measured throughout the remainder of the project. The SEMP is not a static document, however. It must be reviewed and revised as conditions change on the project.
4.1.10.5 Inputs3 Typical inputs required by the Technical Work and Resource Management process to create an SEMP include:

• Needs, goals, and objectives


• Project plans and schedules, including the PMP and other plans and schedules

• Assumptions, guidelines, and constraints
• Concepts and architectures, including system, functional, and operations
• Product breakdown structure
• System specification
• Requirements, including top-level project, science, and system/segment
Once the SEMP is established and commitment to it is secured, inputs to this process typically consist of progress status with regard to tasks, budgets, and schedules.
4.1.10.6 Steps4 The following diagram (FIG. 4.1-10) illustrates the major steps and products of the Technical Work and Resource Management process. More detailed activities associated with these steps are provided below.

• Establish Estimates – SE planning parameters shall be established and maintained. Parameters include all of the information needed by the project to perform necessary SE planning, organizing, staffing, directing, coordinating, reporting, and budgeting.
− Estimate the Scope of the Project – Establish a system of interest structure to estimate the scope of the project. Identify and describe tasks in sufficient detail to specify estimates of project tasks, responsibilities, and schedule. Identify system of interest products and any enabling products to be produced, acquired, or reused.
− Establish Estimates of Technical Attributes – Establish and maintain estimates of technical attributes of work products and tasks. Size, connectivity, complexity, and structure of deliverable and non-deliverable work products, documents, and operational and support S/W are typical inputs to estimates. These parameters are used to support technical planning estimates of effort, cost, and schedule. See the Control process (Section 4.1.12) for identification and allocation of technical attributes.

Figure 4.1-10. Technical work and resource management process diagram.


− Define Project Life Cycle – Define the project life cycle phases on which to scope the major system of interest phase for the current state of the product, expected future phases, and the relationships and effects among the phases. Adjust planning parameters to account for relationships and effects among phases.

− Determine Estimates of Effort and Cost – Estimate the project effort and cost for the work products and tasks based on estimation rationale (a simple parametric sketch appears after these steps).

• Develop the SEMP – An SEMP shall be established and maintained as the basis for managing the SE aspects of the project. This SEMP details the work activities and work products of the integrated technical effort across the project.
− Establish Budget and Schedule – Establish and maintain the project budget and schedule.
− Identify Project Risks – Identify and analyze project risks (Risk Management process, Section 4.3.2).

− Plan for Data Management – Plan for the management of project data.

− Plan for Project Resources – Plan for necessary resources to perform the SE aspects of the project.

− Plan for Needed Knowledge and Skills – Plan for knowledge and skills needed to perform the project.

− Identify Processes – Identify technical and engineering management processes to be used in executing the project (see the Quality Management process, Section 4.3.4, for additional details regarding identifying and tailoring the project processes).

− Plan Stakeholder Involvement – Plan the involvement of identified stakeholders.

− Establish the SEMP – Establish and maintain the SEMP and its contents.

• Obtain Commitment to the SEMP – Commitments to the SEMP shall be established and maintained.
− Review Plans Affecting the Project – Review all plans that affect the project to understand project commitments; e.g., technology development plans, systems integration plans, verification and validation plans, and PMPs. Plans developed within other process areas typically contain information similar to that called for in the SEMP. These plans may provide additional detailed guidance and should be compatible with and support the SEMP to indicate who has authority, responsibility, accountability, and control.
− Reconcile Work and Resource Levels – Reconcile the SEMP to reflect available and estimated resources. When integrated teams are involved, special attention must be paid to resource commitments in circumstances of distributed integrated teams and when individuals are on multiple integrated teams in one or more projects.
− Obtain SEMP Commitments – Obtain commitment from the project manager and relevant stakeholders responsible for performing and supporting SEMP execution.

• Monitor Technical Effort – Technical effort shall be monitored against the SEMP.
− Monitor SEMP Planning Parameters – Monitor the actual values of the project planning parameters against the SEMP. See the Control process (Section 4.1.12) for actual tracking and reporting of significant technical attributes.
− Monitor Commitments – Monitor commitments against those identified in the SEMP.

− Monitor Project Risks – Monitor risk against those risks identified in the SEMP (see the Risk Management process, Section 4.3.2, for further details of this activity).

− Monitor Data Management – Monitor the management of project data against data management requirements in the SEMP.

− Monitor Stakeholder Involvement – Monitor stakeholder involvement against the SEMP.

− Conduct Progress Reviews – Periodically review the progress, performance, and issues of the project against the SEMP.

− Conduct Milestone Reviews – Review the accomplishments and results of the project at selected project milestones (see the Reviews process, Section 4.1.16, for further details of this activity).

• Manage Corrective Action to Closure – Significant deviations from plan shall be analyzed and corrective actions shall be managed to closure. See the Quality Management process (Section 4.3.4) and the Safety and Mission Success process (Section 4.1.11) for related details associated with managing corrective actions.
− Analyze Issues – Collect and analyze the issues and determine the corrective actions necessary to address these issues.

− Take Corrective Action – Take corrective action(s) on identified issues.

− Manage Corrective Action – Manage corrective actions to closure.
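As referenced under "Determine Estimates of Effort and Cost" above, a minimal parametric sketch is shown below. The work product sizes, productivity factors, labor rate, and reserve percentage are hypothetical placeholders, not JSC estimating models; an actual estimate would follow the project's documented estimation rationale.

```python
# Hypothetical parametric effort and cost estimate; illustrative only.
work_products = {   # name: (size estimate, productivity in size units per FTE-month)
    "requirements (# of shalls)": (300, 60),
    "drawings":                   (120, 10),
    "S/W (KSLOC)":                (40, 1.5),
}
labor_rate_per_fte_month = 15_000   # dollars, fully burdened (assumed)
risk_reserve = 0.25                 # reserve fraction driven by identified risks

effort = sum(size / productivity for size, productivity in work_products.values())
cost = effort * labor_rate_per_fte_month * (1 + risk_reserve)

print(f"Estimated effort: {effort:.1f} FTE-months")
print(f"Estimated cost (with {risk_reserve:.0%} reserve): ${cost:,.0f}")
```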


4.1.10.7 Outputs4 The output associated with planning and organizing technical work and resource management shall be an SEMP. The SEMP, whether it is a standalone document or is embedded in the PMP, is the single integrated technical planning document. It incorporates not only what the project needs to do to accomplish the SE effort, but also how the efforts are to be carried out, who will accomplish them, how they are to be controlled, and how the technology is to be transitioned from the technology base to the system of interest products. Actual structure and content of the SEMP will vary according to project needs, but generally the SEMP should be organized to describe the:

• System of interest and its structure.
• Technical processes.
• Project technical management processes.
• Organizational structure.
• Constraints and concerns.

An outline for developing an SEMP is found in Appendix D of this document. Specific content of the SEMP would typically include:

• SE task descriptions, including size and complexity of tasks and work products

• WBS, including work packages and task dictionary
• Technical approach and processes, including project life cycle phases
• Technical management approach and processes, including estimating models, effort and cost estimates with rationale, schedule and schedule dependencies, budget, and identified and prioritized risks

• Acquisition strategy and plans in support of SE activities

• References to or actual plans for engineering functions; e.g., reliability, maintainability, human engineering, safety, producibility, integrated logistics support, data management, CM, etc.

• Staffing requirements, including inventory of skills needed, and staffing and new hire plans

• Critical facilities and equipment lists
• Records indicating reviews and commitments to the SEMP
For the monitor and control aspect of this process, major outputs should include:

• Records of project performance and significant deviations

• Documented project progress and milestone review results

• List of issues needing corrective action and corrective actions results

4.1.10.8 Exit criteria The exit criteria for the Technical Work and Resource Management process shall be:
• Establishment of and commitment to an SEMP.
• Regular, documented reviews of technical effort against the SEMP.
4.1.10.9 Measurement The following table provides example base and derived measures that can be used in conjunction with executing the Technical Work and Resource Management process. See discussion of Measurement on page 4-1.

Base Measures and Derived Measures
• Base: Planning Milestone Dates. Derived: Planning % Complete – Planned vs. Actuals; % Planning Milestones Met – Planned vs. Actuals.
• Base: Planning Effort (FTEs). Derived: Project Planning Productivity; Project Planning Effort – Planned vs. Actuals; Project Planning Effort as % Total Engineering Effort.
• Base: Tracking Milestone Dates (e.g., monthly progress reviews, milestone reviews). Derived: % Tracking Milestones Met – Planned vs. Actuals.
• Base: Issues/Actions Status. Derived: % Issues Closed.
• Base: Overall Technical Project Management Effort. Derived: Project Management Effort – Planned vs. Actuals; Technical Management Effort as % Total Engineering Effort.
• Base: Tailoring Report Completion. Derived: Planned vs. Actual Tailoring Dates.
• Base: # of Processes Tailored (i.e., modified or tailored out). Derived: % Process Compliance; % Processes Modified or Tailored Out.
• Base: # of Plans and Associated Status. Derived: % Plans Reviewed; % Approved by Identified Stakeholders.


4.1.10.10 Methods and techniques3
The following methods and techniques are representative of those typically used in technical work and resource management:

• WBS analysis
• Network scheduling
• Workflow diagrams
• Program evaluation and review technique (PERT) charting
• Activity-on-arrow diagrams
• Precedence diagrams
• Gantt charts
• Resource leveling
• Decision trees
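The precedence-diagram and PERT-style techniques above reduce to a forward and backward pass over task durations and dependencies to find slack and the critical path. The following is a minimal sketch of that arithmetic only; the task names, durations, and dependency structure are hypothetical and not drawn from any JSC schedule.

```python
# Minimal critical-path sketch over a precedence network (hypothetical tasks and durations).
from collections import defaultdict

durations = {"A": 5, "B": 3, "C": 8, "D": 2}            # task -> duration (work-days)
predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

# Forward pass: earliest finish for each task.
early_finish = {}
def forward(task):
    if task in early_finish:
        return early_finish[task]
    start = max((forward(p) for p in predecessors[task]), default=0)
    early_finish[task] = start + durations[task]
    return early_finish[task]

project_duration = max(forward(t) for t in durations)

# Backward pass: latest finish, then slack; zero-slack tasks lie on the critical path.
successors = defaultdict(list)
for task, preds in predecessors.items():
    for p in preds:
        successors[p].append(task)

late_finish = {t: project_duration for t in durations}
for task in sorted(durations, key=lambda t: early_finish[t], reverse=True):
    if successors[task]:
        late_finish[task] = min(late_finish[s] - durations[s] for s in successors[task])

for task in durations:
    slack = late_finish[task] - early_finish[task]
    print(task, "EF:", early_finish[task], "LF:", late_finish[task], "slack:", slack,
          "(critical)" if slack == 0 else "")
```

Commercial scheduling tools perform the same computation with calendars, constraints, and resource leveling layered on top.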

4.1.10.11 Software tools
Depending on the size and complexity of the system of interest, the tools used to support the Technical Work and Resource Management process could vary from typical office products (e.g., spreadsheets, word processors, and graphical representation tools) to very sophisticated commercial products built on top of Microsoft Project or other standard project management applications. Generally, the basic requirements are to use tools that support capturing, analyzing, and publishing schedules, costs, budgets, and resources. Other tools may be required to support the capture and analysis of skill, knowledge, education, and training information. These could be commercial tools, when management and tracking of many people over a period of years is required, or a project built using spreadsheets or commonly available databases.

4.1.10.12 References
The following documents, which were used to prepare this section, offer additional insights into the Technical Work and Resource Management process:

1NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements.
2EIA-731.1, Systems Engineering Capability Model, 2002.
3SP 6105, NASA Systems Engineering Handbook, 1995.
4CMMI-SE/SW/IPPD/SS V1.1, Capability Maturity Model Integration (CMMI) for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing, 2002.
5NPR 7120.5B, NASA Program and Project Management Processes and Requirements, 2002.


4.1.11 Safety and Mission Success1,2
The Safety and Mission Success process is performed to provide for early identification, analysis, reduction, elimination, and control of hazards to ensure the safety, reliability, maintainability, and quality of the system of interest. The requirements on the part of SE are to monitor, coordinate, and integrate the overall project safety and mission success effort established by project management.

Project management and SE must rely on contributions from both the specialty engineering disciplines and the traditional design disciplines for functional expertise and specialized analytic methods related to safety and mission success. Specialty engineering areas include reliability, maintainability, QA, quality engineering, and safety engineering. Specialty engineers contribute throughout the SE process. As part of the technical effort, they apply specialized analytical techniques; define system requirements in their areas of expertise; perform analyses; and evaluate data packages, engineering change requests, test results, and documentation for major project reviews. A part of the systems engineer's function is to verify that these specialty engineering activities are coherently integrated into the project.

4.1.11.1 Function1,2,3
Safety engineering is performed to provide for the early identification, analysis, reduction, elimination, and/or control of hazards associated with the system of interest. Reliability engineering implements specific design features to ensure that the system of interest can perform in physical environments for the required amount of time. Maintainability engineering implements specific design features to optimize maintenance characteristics of the system in the physical environments. Quality engineering and QA verify that the system of interest is produced and delivered in accordance with its safety, performance, and design requirements, and that approved processes are followed during the project life cycle. S/W QA ensures that the S/W processes and products conform to requirements, standards, and procedures through a planned and systematic set of activities as stated in NASA-STD-2201-93.18 Safety and mission success personnel at JSC will comply with the documents maintained in the safety and mission assurance (S&MA) documentation tree (http://www.hq.nasa.gov/office/codeq/doctree/gdoc.htm). The Safety and Mission Success process will:

• Scope the amount of S&MA support needed.
• Ensure the system of interest is safe; i.e.,
− Potential hazards shall be identified.
− A method to monitor and control, eliminate, or reduce identified hazards shall be established.
− Hazard mitigation efforts shall be ensured to be implemented as designed and to perform the intended function.
− Systems safety shall be an integral part of the overall risk management process, using probability estimates and severity classes for understanding and managing safety risks (a minimal risk-matrix sketch follows this list).
• Ensure the system of interest is reliable.
• Ensure the system of interest is maintainable.
• Ensure the system of interest is of high quality.
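Where probability estimates and severity classes are combined to manage safety risk, as noted above, they are frequently arranged as a risk matrix. The sketch below is illustrative only; the severity classes, likelihood levels, and acceptability thresholds are hypothetical and are not JSC or S&MA criteria.

```python
# Illustrative hazard risk matrix (hypothetical severity/likelihood scales and thresholds).

SEVERITY = {"catastrophic": 4, "critical": 3, "marginal": 2, "negligible": 1}
LIKELIHOOD = {"probable": 4, "occasional": 3, "remote": 2, "improbable": 1}

def risk_index(severity: str, likelihood: str) -> int:
    """Combine a severity class and a probability estimate into a single risk index."""
    return SEVERITY[severity] * LIKELIHOOD[likelihood]

def risk_level(index: int) -> str:
    # Hypothetical acceptability bands; real projects define these with S&MA.
    if index >= 12:
        return "unacceptable - eliminate or control the hazard"
    if index >= 6:
        return "undesirable - requires mitigation and management approval"
    return "acceptable with monitoring"

hazards = [
    ("Loss of cabin pressure", "catastrophic", "remote"),
    ("Ground handling damage", "marginal", "occasional"),
]
for name, severity, likelihood in hazards:
    index = risk_index(severity, likelihood)
    print(f"{name}: index {index} -> {risk_level(index)}")
```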

4.1.11.2 Objective3
The primary objective of this process is to integrate specialized quality engineering and safety, reliability, maintainability, and quality assurance techniques/processes throughout all phases of the project life cycle. Other objectives are to ensure that:

• Safety and mission success processes will be used throughout the project life cycle to prevent potential system failures.
• Project decisions made will be consistent with JSC S&MA principles.
• The safety of the public, NASA flight crews, employees, and critical assets will not be jeopardized.

4.1.11.3 Responsibilities
The Project Manager ensures that safety and mission success processes are performed by the project.

The Lead Systems Engineer is responsible for integrating safety and mission success functions into the project and for coordinating the performance of these functions by the project team. This includes working with the S&MA Directorate at JSC to scope the best approach to meet the specific needs of the project as related to safety and mission success.

S&MA project support is provided through S&MA-assigned representatives. In this capacity, the S&MA Representatives will (1) assist the project manager in ensuring that all S&MA requirements are appropriately defined and implemented; (2) guide the project during life cycle development through the safety, assurance, procurement, certification, shipping, etc., processes; and (3) serve as a point of contact between the project team and the S&MA Directorate, assuring proper coordination, review, or approval of all safety and mission success responsibilities and practices. Support from numerous personnel is provided, as required, to meet the needs of the project. The safety engineer, reliability engineer, maintainability engineer, quality engineer, QA specialist, procurement quality assurance (PQA) specialist, and S/W QA engineer are all part of the S&MA Directorate. The differences in the roles of a quality engineer, QA specialist, PQA specialist, and S/W QA engineer are clarified below.


• The quality engineer ensures that products provided meet or exceed S&MA requirements.

• The QA specialist ensures that processes are implemented and followed.

• The PQA specialist ensures that S&MA requirements are imposed on contracts and delegates QA functions to other agencies (e.g., Defense Contract Management Agency (DCMA)).

• The S/W QA engineer is responsible for QA, quality engineering, and safety on S/W products.

4.1.11.4 Life cycle
The Safety and Mission Success process establishes and maintains an effective process through all phases of the project life cycle. S&MA requirements, reviews, and plans are integrated into the project life cycle.

4.1.11.5 Inputs
Typical inputs to this process include, but are not limited to:

• System goals and objectives
• System requirements
• Operational concepts and scenarios
• Trade studies
• Project plans
• Feasibility assessment
• Schedules/timelines
• System constraints
• Quality and test procedures
• Procurement documents (SOW, data requirements descriptions)
• Version description document
• ITA
• Development plans
• Engineering drawings
• Design documents
• Verification and validation plan/document
• ICDs
• Certification data package
• Acceptance data package (ADP)
• Problem reports

4.1.11.6 Steps
The project team, in accordance with the responsibilities outlined in Section 4.1.11.3 above, performs the process steps illustrated in Figure 4.1-11.

• Determine S&MA Support Level – S&MA support shall be scoped to determine the amount of support needed.
− Prepare and approve an ITA.
• Ensure System Safety – System safety shall be integrated into the life cycle development of the system of interest.
− Develop and execute a system safety plan.
− Provide regular reporting of system safety-related issues and status to project management.

Figure 4.1-11. Safety and mission success process diagram. (Diagram shows four parallel activities – Ensure System Safety, Ensure System Reliability, Ensure System Maintainability, and Ensure System Quality – each producing corresponding plans, procedures, analyses, reports, issues, and status.)


− Formulate the System Safety Design Criteria – The common goal of system safety design criteria is to achieve acceptable and minimized levels of design-associated hazards. Many hazards can be eliminated by selecting appropriate specifications, standards, materials, design features, and previously qualified components as expressed in design criteria.
− Review and approve verification/validation plan.
− Review and approve requirements documents and verification criteria.
− Prepare hazard analysis to identify potential hazards and drive design solutions. A safety data package is prepared and submitted to the Safety Review Panel (SRP). Phase I is submitted at the PDR, Phase II at the CDR, and Phase III after qualification testing. (NOTE: For government-furnished equipment (GFE), the safety review is performed by the SMART; and for GFE payloads, the safety review is performed by the Payload Safety Review Panel (PSRP).)5,6

− Perform Software Safety Analysis – S/W safety is defined in SSP 50038, NASA-STD-8719.13, and NASA-GB-1740.13-96.7,8,9
− Perform Safety Testing – Safety tests are planned to demonstrate that safety provisions, alarms, and procedures called out in the hazard analysis are adequate and to verify that hazard controls are implemented, effective, and satisfy requirements.
− Perform Ground Safety Analysis – The ground safety analysis report is an analysis required to alert vehicle integrators to any specific safety-related design features of the project. Normally this involves integrating the project through the Kennedy Space Center (KSC) as shown in KSC 1700.710 and further defined in NSTS 13830.11 The paragraphs identified in KSC 1700.7 and NSTS 13830 provide information on the intent and content of this analysis report. These documents are specific to KSC operations but cover most typical project safety-related delivery scenarios. If an organization other than KSC is integrating flight H/W onto a vehicle/flight article, a review of these paragraphs should ensure an understanding of the necessary material to support negotiations with the organization's safety group. Ground systems with S/W controls must meet the safety requirements in NASA-STD-8719.13.8
− Support TRRs.
− Ensure that the system and all ground operations and activities associated with the system comply with applicable Occupational Safety and Health Administration (OSHA) standards and JPG 1700.112 so as to prevent personnel injury, damage to the system, and damage to other property/systems.
− Review and approve certification documentation.
− Review and assess problem reports and nonconformances for safety impacts.
− Ensure that system operational issues are identified and documented on a problem report and that corrective actions are implemented.
− Support project tests or analyze products – not only at the expected operational conditions, but also at off-nominal conditions – to determine margin or robustness.

• Ensure System Reliability – System reliability shall be integrated into the life cycle development of the system of interest.2
− Develop and execute a reliability plan.
− Provide regular reporting of system reliability-related issues and status to project management.
− Perform criticality assessment of the system as part of the feasibility assessment.
− Develop and refine reliability prediction models.
− Establish and allocate reliability goals, fault-tolerance requirements, and environmental design requirements.
− Review and approve verification/validation plan.
− Review and approve requirements documents and verification criteria.
− Develop environment test requirements and specifications for H/W qualification.
− Prepare an FMEA/critical items list (CIL) to support design decisions. The FMEA/CIL is reviewed and approved by the appropriate R&M panel.
− Perform design trade studies covering issues such as the degree of redundancy, system availability, and reliability vs. maintainability.
− Support risk management by identifying design attributes that are likely to result in reliability problems and recommending appropriate risk mitigations.
− Identify and track limited life and cycle life items (e.g., JSC-2493713).
− Identify redundancy requirements.
− Perform analyses on qualification test data to verify reliability predictions, validate the system reliability prediction models, and understand and resolve anomalies.
− Review and approve electrical/electronic equipment (EEE) parts specifications, if applicable.
− Conduct and/or approve EEE parts application analysis, if applicable.


− Perform testing for qualification and acceptance of EEE parts, if applicable.

− Review and approve certification documentation.

− Prepare fault-tree analysis to identify failures.
• Ensure System Maintainability – System maintainability shall be integrated into the life cycle development of the system of interest.2
− Develop and execute a maintainability plan.
− Provide regular reporting of system maintainability-related issues and status to project management.
− Establish and allocate maintainability and availability requirements. The requirements should be consistent with the maintenance concept and traceable to system-level availability objectives.
− Perform an engineering design analysis to identify maintainability design deficiencies and predictions.
− Develop maintainability predictions to support logistics (spares), operations (e.g., crew time for repairs), and manifest.
− Perform analyses to quantify the system maintenance resource requirements, and document them in the maintenance plan.

• Ensure System Quality – System quality shall be integrated into the life cycle development of the system of interest.2
− Develop and execute a QA plan (S/W and H/W).
− Provide regular reporting of system quality-related issues and status to project management.
− Identify issues of design, materials, workmanship, fabrication and verification processes, and other characteristics that could degrade product system quality in support of major system design reviews (e.g., SRR, PDR, and CDR).
− Ensure engineering designs meet project requirements and comply with Center and program quality requirements. The requirements are detailed in the following documents:
• JPG 5335.3, Quality Manual
• NSTS 5300.4(1D-2), Safety, Reliability, Maintainability, and Quality Provisions for the Space Shuttle Program
• SSP 41173, Space Station Program Quality Assurance Requirements
• JPG 8080.5, JSC Design and Procedural Standard Manual
− Conduct process, product, and quality management system audits.
− Ensure the completeness of the CM plan, procedures, and documentation.
− Review and approve test procedures.
− Review and approve PMPs and program/project requirements documents. (NOTE: This is a joint review by all of S&MA.)
− Review and approve procurement documents.
− Participate in the evaluation and selection of procurement sources.
− Ensure NASA workmanship standards are used in the design and manufacture of EEE for high-reliability (flight H/W, critical ground support equipment, etc.) applications.14
− Ensure S/W is designed and developed in accordance with NASA standards established by the following documents:
• NPD 2820.1, NASA Software Policies
• NASA-STD-2100-91, NASA Software Documentation Standard
• NASA-STD-2201-93, Software Assurance Standard
• NASA-STD-8719.13A, Software Safety NASA Technical Standard
• NASA-STD-8739.8, Software Assurance
− Perform QA contract surveillance.
− Ensure verification requirements are properly specified, especially with respect to test environments, test configurations, and pass/fail criteria.
− Evaluate manufacturing/fabrication plans and processes.
− Assign mandatory inspection points.
− Inspect items and facilities during manufacturing/fabrication as well as items delivered to the NASA field centers.
− Inspect facilities to assure equipment, certification, and calibration.
− Ensure the adequacy of personnel training and technical documentation to be used during manufacturing/fabrication.
− Monitor qualification and acceptance tests to ensure compliance with verification requirements and test procedures, and to ensure that test data are correct and complete.
− Approve the resolution of nonconformances and problem/failure reports (P/FRs), and verify the implementation and effectiveness of corrective action(s).
− Perform assessment of as-designed vs. as-built configurations.
− Verify each ADP is complete and correct.
− Verify each version description document as complete and correct.
− Verify completion of certification documentation.
− Assess S/W development folders.
− Ensure flight equipment that is being shipped for flight is ready for shipment and follow-on flight processing/integration.


− Investigate mishaps and process escapes.

4.1.11.7 Outputs
Primary outputs from this process are:

• Safety and mission success program plans
• Safety and mission success requirements
• Hazard analysis
• Reliability prediction models
• FMEA/CIL
• Fault-tree analysis
• Safety and mission success status
• Review item dispositions (RIDs)
• QA surveillance data
• Flight H/W- and S/W-specific output, including:
− ADP
− Certification data package
− Pre-Shipment Readiness Review (JSC Form 1027)
− Government certification approval request (GCAR)

4.1.11.8 Exit criteria
The Safety and Mission Success process is ongoing throughout the life cycle of the project and does not end until the mission is complete and project deliverables are decommissioned.

4.1.11.9 Measurement
The table at the end of this section provides both base and derived measures that can be used in conjunction with the Safety and Mission Success process. See discussion of Measurement on page 4-1.

4.1.11.10 Methods and techniques
Methods and techniques used in the Safety and Mission Success process depend, to some degree, on the nature of the system of interest. However, typical methods and techniques include:

• Trade studies
• Risk assessment matrix
• Hazard analysis
• FMEA and criticality analysis
• Reliability block diagram
• Fault-tree analysis
• Event-tree analysis
• Combinatorial failure probability analysis
• Failure mode information propagation modeling
• Probabilistic design analysis
• Probabilistic risk assessment
• Tolerance stack-up analysis
• Design of experiments
• Brainstorming
• Checklists
• Reliability trend analysis
• Human reliability analysis
• Failure trend analysis
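Several of the techniques above (reliability block diagrams, fault-tree analysis, combinatorial failure probability analysis) reduce to combining component probabilities for series and redundant arrangements. The sketch below illustrates only that arithmetic; the component reliabilities and the example architecture are hypothetical.

```python
# Reliability block diagram arithmetic (hypothetical component reliabilities).
# Series: all items must work; parallel (redundant): at least one must work.
from math import prod

def series(reliabilities):
    return prod(reliabilities)

def parallel(reliabilities):
    return 1.0 - prod(1.0 - r for r in reliabilities)

# Example: a sensor string in series with a dual-redundant controller.
sensor_string = series([0.995, 0.998, 0.999])
controller_pair = parallel([0.98, 0.98])
system_reliability = series([sensor_string, controller_pair])

print(f"Sensor string reliability:   {sensor_string:.4f}")
print(f"Redundant controller pair:   {controller_pair:.4f}")
print(f"System reliability estimate: {system_reliability:.4f}")
```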

4.1.11.11 Software tools
Standard office automation tools, which support documentation reviews and the development of various analyses, should generally be used; e.g., word processor, spreadsheet, and presentation S/W for PCs. Tools that help in the analysis of systems should also be used. One such tool is PSpice.

4.1.11.12 References
The following documents, which were used to prepare this section, offer additional insights into the Safety and Mission Success process:

1NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements.
2SP 6105, NASA Systems Engineering Handbook, 1995.
3NPR 7120.5B, NASA Program and Project Management Processes and Requirements, 2002.

Base Measures | Derived Measures
S&MA Planning Milestone Dates (e.g., due dates on planning checklist) | S&MA Planning % Complete – Planned vs. Actuals
S&MA Planning Effort (FTEs) | S&MA Planning Productivity; S&MA Planning Effort – Planned vs. Actuals
Mishap Rate | Mishaps/Year for Last 5 Years; Cumulative Rate Projection; Cause Category
Open Work | % H/W and S/W Shipped with Open Certification
Timeliness | QA Turnaround Time for Discrepancy Report Processing


4NASA-STD-2100-91, NASA Software Documentation Standard.
5NSTS 1700.7, Safety Policy and Requirements for Payloads Using the STS.
6NSTS 1700.7 Addendum, Safety Policy Requirements for Payloads Using the International Space Station.
7SSP 50038, Computer-Based Control System Safety Requirements.
8NASA-STD-8719.13A, Software Safety NASA Technical Standard.
9NASA-GB-1740.13-96, NASA Guidebook for Safety Critical Software – Analysis and Development.
10KHB 1700.7, KSC Payload Ground Safety Handbook.
11NSTS 13830, Payloads Safety Review and Data Submittal Requirements.
12JPG 1700.1, JSC Safety and Total Health Handbook.
13JSC-24937, Limited Life Time Cycle Items Program Requirements Document, 2002.
14NASA-STD-8739 series.
15INCOSE Systems Engineering Handbook, Version 2.0, 2000.
16CMMI-SE/SW/IPPD/SS V1.1, Capability Maturity Model Integration (CMMI) for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing, 2002.
17EIA-632, Processes for Engineering a System.
18NASA-STD-2201-93, Software Assurance Standard.


4.1.12 Control1
The SE Control process is performed to identify, allocate, track, and control significant attributes of the system of interest (e.g., technical performance, technical resources, cost, schedule, and interfaces). These significant system attributes are defined and controlled by this process and are reported to the Technical Work and Resource Management process (Section 4.1.10). The Technical Work and Resource Management process, in turn, is responsible for communicating technical performance status and resolving discontinuities reported by the Control process. Establishing TPMs is useful for tracking technical performance and progress as well as for evaluating trades when trying to derive the best technical solution, since TPMs are typically based on key technical system attributes (e.g., weight, power, size, speed, performance, cost, and schedule).

4.1.12.1 Function1
In the SE Control process:

• Budgets and margins for significant system of interest attributes (e.g., power, weight) shall be defined and allocated.

• TPMs shall be established and tracked.
• Methods for tracking and reporting measures for technical resources, cost, and schedule shall be established.
• Interface controls shall be established, documented, and maintained.

4.1.12.2 Objective2
The objective of this process is to establish, monitor, control, and report the technical performance, cost, and schedule attributes, methods, and measures of the system of interest.

4.1.12.3 Responsibilities
The Project Manager makes the final decision concerning salient features and significant attributes of the control system.

The Lead Systems Engineer is responsible for ensuring the completeness and integrity of the SE control system, identifying and allocating budgeted resources of system attributes, establishing and tracking technical measures, and establishing interface controls.

The Project Control Officer is responsible for integrating the SE control systems into the overall project control function.

Specialty Engineering Disciplines are responsible for supporting the Lead Systems Engineer with discipline-specific technical performance, cost, and schedule data and measures.

4.1.12.4 Life cycle
The Control process is a crosscutting process that spans the entire project life cycle. Establishing methods for tracking, controlling, and reporting technical measures and attributes occurs during definition of the project. Identification of technical attributes and performance measures in support of SEMP development also occurs early in the project life cycle. Execution of process tracking, controlling, and reporting functions continues until the end of the project.

4.1.12.5 Inputs
Typical inputs to this process include, but are not limited to:

• System architecture
• WBS
• SEMP
• System requirements
• Technical performance measurement baseline (PMB)
• ICD
• Interface requirements specification

4.1.12.6 Steps3,4
Adhering to the major steps of this process directs the project toward meeting its SE control requirements. For each major process step, additional guidance is provided as to the typical sub-process steps and products that are expected. These steps are shown in Figure 4.1-12 and are discussed further below.

• Manage Technical Attributes
− Identify and Allocate Technical Attributes – Budgets and margins for significant system of interest attributes shall be defined and allocated.
• Define technical budgets and margins for significant system of interest attributes (e.g., size, power, weight, cost, and schedule).
• Allocate technical budgets and margins for significant attributes across and down the system of interest hierarchy.
− Maintain and Control Technical Attributes – Budgets and margins for significant system of interest attributes shall be maintained, tracked, and controlled (a minimal margin-tracking sketch follows these steps).
• Periodically assess the adequacy of technical resources, including margins, to meet project requirements.
• Develop and implement a recovery plan when margins of technical attributes for the system of interest become inadequate.


• Manage Performance Measures
− Establish Technical Performance Measures – Identification of technical performance measures and of collection, analysis, and reporting requirements shall be established. Note that the Control process measurement steps are very similar to those defined in the Quality Management process (Section 4.3.4); however, the focus of each differs. The Control measurement steps are specific to performance measures, whereas the Quality Management measures are applicable only to process and product measures.
• Establish measurement objectives that are derived from technical needs and objectives.
• Specify TPMs that address resource, cost, and schedule measurement objectives, including attributes (e.g., owner, frequency of collection, roll-up of lower-level measures). Most useful TPMs are those that provide visibility into the technical performance of key elements of the WBS, especially those that are cost drivers on the program, lie on the critical path, or represent high-risk items. TPMs are key to progressively assessing technical progress and, once defined, should be documented in the SEMP (Section 4.1.10).
• Specify the critical technical parameters, including update frequencies and level of tracking depth, used to derive the defined TPMs. This may be achieved by developing a technical parameter hierarchy that identifies all measurable key technical elements and establishes their relative relationships and importance.
• Specify measurement data collection methods, tools, and storage procedures.
• Specify measurement data analysis methods, procedures, and tools.
• Specify measurement result report types and communication mechanisms.
• Manage and store measurement specifications in accordance with defined storage procedures.
− Track Technical Performance Measures – Collection of technical performance measurement data shall be performed.
• Collect specified technical performance measurement data in accordance with the defined collection methods and tools.
• Track critical technical parameters relative to time, with dates established as to when progress will be checked and when full compliance will be met.
• Manage and store measurement data in accordance with defined storage procedures.
− Analyze Technical Performance Measures – Analysis of technical performance measurement data shall be performed.

Figure 4.1-12. Control process diagram. (Diagram shows three major activities – Manage Technical Attributes (Identify & Allocate Technical Attributes; Maintain & Control Technical Attributes), Manage Performance Measures (Establish, Track, Analyze, and Report Technical Performance Measures), and Manage Interfaces (Establish Interface Controls; Maintain & Control Interfaces) – with products including system of interest technical attributes, recovery plans for over-budgeted resources, technical performance measures and results, interface management plans and processes, and ICDs and interface requirements specifications.)


• Analyze specified technical performance measurement data in accordance with defined analysis methods, procedures, and tools.
• Apply statistical methods to understand performance measurement variation, where applicable.
• Analyze and interpret performance measurement data.
• Manage and store analysis results in accordance with defined storage procedures.
− Report TPMs – Reporting of technical performance results to relevant stakeholders shall be performed.
• Report technical performance measurement and analysis results to the project manager and other relevant stakeholders (e.g., design and SE, customer).
• Periodically review project performance against the SEMP (see Section 4.1.10 for conducting progress reviews).
• Establish a correction plan for performance measurement discontinuities (see Section 4.3.4 for managing corrective actions).

• Manage Interfaces
− Establish Interface Controls – The project technical plan for maintaining and controlling system interfaces shall be established. Development and documentation of the system of interest internal and external interfaces is performed as part of the SE Decomposition process (Section 4.1.4).
• Establish, document, and maintain the project interface controls (e.g., responsibilities, processes) in an interface management plan.
− Maintain and Control Interfaces – Management of system interfaces shall be performed by providing oversight of interface definition, control, compatibility, approval, and coordination in accordance with the interface management plan.
• Monitor system development and manage technical staff, budget, and schedules to ensure project interface management plans are being followed and supporting processes are being used.
• Ensure supervision and resources are provided to enable the interface management plans to be executed and commitments met.
• Make data pertinent to interface management readily accessible to project teams throughout the system of interest life cycle.
• Ensure all internal and external functional and physical interfaces for each element are identified, defined, assigned, documented, and managed.
• Ensure element design definitions are compatible in terms of form, fit, and function.
• Conduct frequent technical interchange meetings (TIMs) among the systems engineers of each organization involved to promote understanding of the mission functional requirements of the fully integrated system, the contribution each product is expected to make in fulfilling those requirements, and the interfaces required for the system to perform as intended.
• Establish an interface control working group to maintain interface requirements, and to coordinate and resolve any discrepancies in system interfaces (requirements and implementations).
• Ensure interface changes affecting the element and affected by the element are controlled to prevent adverse consequences.
• Update and maintain ICDs and interface requirements, as required.
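As referenced under Maintain and Control Technical Attributes above, allocating a technical budget and checking whether the remaining margin has become inadequate can be expressed very simply. The sketch below is hypothetical: the mass values, the allocation structure, and the margin threshold are illustrative and are not project requirements.

```python
# Technical-attribute budget and margin check (hypothetical mass budget, in kg).

budget_allocation = 100.0                 # allocated mass budget for this element
required_margin_fraction = 0.15           # hypothetical margin policy for this life cycle phase

current_best_estimates = {                # roll-up of lower-level elements
    "structure": 42.0,
    "avionics": 23.5,
    "thermal": 12.0,
    "harness": 9.0,
}

estimate_total = sum(current_best_estimates.values())
margin = budget_allocation - estimate_total
margin_fraction = margin / budget_allocation

print(f"Current best estimate: {estimate_total:.1f} kg of {budget_allocation:.1f} kg allocated")
print(f"Remaining margin: {margin:.1f} kg ({margin_fraction:.0%})")

if margin_fraction < required_margin_fraction:
    # Per the process, inadequate margin triggers a recovery plan.
    print("Margin below threshold - develop and implement a recovery plan.")
else:
    print("Margin adequate - continue periodic assessment.")
```

The same pattern applies to power, data rate, cost, or any other budgeted attribute tracked as a TPM against a planned margin-depletion profile.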

4.1.12.7 Outputs
Primary outputs from this process are:

• System technical attributes
• TPMs
• TPM analysis results
• Earned value status, if available
• Plans (resource recovery plans, interface management plans)
• ICDs (updated)
• Interface requirements specification (updated)

4.1.12.8 Exit criteria
The process is exited upon project termination.

4.1.12.9 Measurement
The table at the end of this section provides example base and derived measures that can be used in conjunction with executing the Control process. See discussion of Measurement on page 4-1.


4.1.12.10 Methods and techniques5,6
Performance management methods include:

• Earned Value Management (EVM) – Enables effective execution, management, and control as well as integrated evaluation of cost, schedule, and technical performance against the baseline (a minimal EVM computation sketch follows this list).
• Performance Assessments – Confirm satisfactory project performance and ensure timely identification of all problems throughout the life cycle.
• Schedule Management – Ensures the establishment, management, and control of the baseline master schedule and derivative schedules that provide the framework for time phasing and coordinating all project efforts into a master plan to ensure that objectives are accomplished within project or program commitments. Project performance against the baseline schedule represents a key element in managing risk.
• WBS – Serves as the structure for project technical planning, scheduling, cost estimating and budgeting, contract scope definition, documentation, product development, and status reporting and assessment (including integrated cost/schedule performance measurement).
• TPM – Is the ongoing process of predicting the value of performance parameters, which are critical to successful performance of the system, at completion of the system development; comparing the current actual values of those performance parameters to a planned profile of each parameter over development time; and identifying any design deficiency that could jeopardize system performance. The extent to which TPM is to be employed should be defined in the SEMP. TPM may also be used to: evaluate compliance with requirements; assess compliance to levels of technical risk; trigger development of recovery plans for identified deficiencies; and examine the marginal cost benefits of performance in excess of requirements.
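The standard earned value indicators can be computed directly from planned value, earned value, and actual cost. The numbers below are hypothetical status-period data; the project's EVM system remains the authoritative source for its own formulas and data.

```python
# Basic earned value indicators (hypothetical status-period data, in $K).

bcws = 400.0   # budgeted cost of work scheduled (planned value)
bcwp = 350.0   # budgeted cost of work performed (earned value)
acwp = 420.0   # actual cost of work performed
bac = 1200.0   # budget at completion

schedule_variance = bcwp - bcws          # negative => behind schedule
cost_variance = bcwp - acwp              # negative => over cost
spi = bcwp / bcws                        # schedule performance index
cpi = bcwp / acwp                        # cost performance index
eac = bac / cpi                          # simple CPI-based estimate at completion

print(f"SV = {schedule_variance:+.1f}  CV = {cost_variance:+.1f}")
print(f"SPI = {spi:.2f}  CPI = {cpi:.2f}  EAC = {eac:.1f}")
```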

Measurement and analysis methods include:

• Statistical analysis
• Cause-and-effect (fishbone) diagrams
• Control charts
• Flow diagrams
• Affinity diagrams
• Interrelationship digraphs
• Pareto charts
• Run charts
• Block diagrams
• Gantt charts

4.1.12.11 Software tools5
Many S/W tools are available to support the SE Control process. These S/W tools include typical spreadsheet, scheduling, database, and drawing tools for data visualization, planning, monitoring, controlling, and analysis, with commercial products available from many vendors. Also available are specialized tools for measuring management systems and statistical modeling analysis. For cost estimating and analysis, see the JSC Cost Estimating Web page at www.jsc.nasa.gov/bu2/. For an extensive listing of commercially available systems analysis tools, see the table provided by INCOSE at the following: www.incose.org/tools/eia632tax/eia632top.html. This table lists tools that support the Control process requirements of EIA-632.5

4.1.12.12 References
The following documents, which were used to prepare this section, offer additional insights into the Control process:

1NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements.
2SP 6105, NASA Systems Engineering Handbook, 1995.
3CMMI-SE/SW/IPPD/SS V1.1, Capability Maturity Model Integration (CMMI) for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing, 2002.
4EIA-632, Processes for Engineering a System, 1999.
5INCOSE Systems Engineering Handbook, Version 2.0, 2000.
6NPR 7120.5B, NASA Program and Project Management Processes and Requirements, 2002.

Base Measures | Derived Measures
Tracking Milestone Dates (e.g., monthly tracking book reviews, gate reviews) | % Tracking Milestones Met – Planned vs. Actuals
Issues/Actions Status | % Issues Closed
Overall Technical Management Effort | Technical Management Effort – Planned vs. Actuals; Management Effort as % Total Engineering Effort


4.1.13 System Analysis1–5
The System Analysis process (e.g., stress, thermal, cost, performance, safety, etc.) is performed to provide a specific, quantitative, engineering assessment that is used in making systematic, technical, and economic decisions and establishing design alternatives, recommendations, allocations, and budgets. This process is used to (1) provide a rigorous basis for technical decision making, resolution of requirements conflicts, and assessment of alternative physical solutions; (2) determine progress in satisfying system of interest requirements; (3) support risk management; and (4) ensure that decisions are made only after evaluating the cost, schedule, performance, and risk effects on the system of interest.

The System Analysis process is a structured approach to evaluating alternative solutions against established criteria to determine a recommended solution addressing an issue. A formal System Analysis process involves establishing criteria for evaluating alternatives, identifying alternative solutions, selecting methods for evaluating alternatives, evaluating solutions using established criteria and methods, and selecting recommended solutions from the alternatives based on the evaluation criteria. A formal System Analysis process reduces the subjective nature of the decision and increases the probability of selecting a solution that meets the multiple demands of the relevant stakeholders. In particular, trade studies are used to evaluate solutions to optimize the cost, schedule, performance, and risk of selected alternatives.

4.1.13.1 Function1
In the System Analysis process:

• Evaluation criteria (e.g., environment, dimensions, life cycle, weight, cost, etc.) shall be established and applied during the analysis.

• Analytical or physical modeling of the alternatives based on the established evaluation criteria shall be performed to obtain a quantitative prediction.

• A recommendation for the solution based on the quantitative outcome of the analytical or physical modeling shall be made.

4.1.13.2 Objective3
The objective of system analysis is to help decision makers choose an appropriate course of action. This is achieved by systematically examining the relevant objectives – and alternative policies and strategies for achieving them – and comparing quantitatively the economic costs, effectiveness, and risks of the alternatives.

4.1.13.3 Responsibilities3
The Lead Systems Engineer is responsible for all aspects of the System Analysis process, including construction of quantitative models, application of those models to identified alternatives, and analysis of resulting data. Alternative selection and documentation are also the responsibilities of the Lead Systems Engineer. Other participants in this process include the:

• Design Engineering Disciplines such as S/W, mechanical, electrical, etc., who are responsible for providing discipline and functional expertise in support of these analyses.
• Specialty Engineering Disciplines, including reliability, maintainability, logistics, test, production, transportation, human factors, QA, safety, and risk management, who are responsible for providing specific, specialty expertise in support of these analyses.
• Chief Financial Officer Representative, who is responsible for supporting, providing, and certifying financial information in these analyses.

4.1.13.4 Life cycle6
Technical issues requiring the System Analysis process may be identified during any phase of a program. The objective is to identify impending technical issues as early as possible in the life cycle to maximize the time available to deal with each issue. The greatest application of the System Analysis process is, therefore, during the early stages of the life cycle, beginning with Pre-Phase A, Advanced Studies, and continuing through Phase A, Preliminary Analysis, and Phase B, Definition, and ending in Phase C, Design. However, the process is applicable to the remaining stages of the life cycle and should be applied any time a technical issue requires analysis. The System Analysis process may be invoked from any of the other SE processes.

4.1.13.5 Inputs
Inputs to the System Analysis process vary depending on the stage in the life cycle and the focus of the analysis. Typical inputs will consist of:

• Functional concepts
• Physical concepts
• Operations concepts
• Requirements
• Design
• Measures of effectiveness (evaluation criteria)
• Plausible alternatives


4.1.13.6 Steps3,4
The following diagram (Figure 4.1-13) illustrates the major steps of the System Analysis process. Details of these steps are provided below.

Figure 4.1-13. System analysis process diagram. (Diagram shows three major activities – Prepare for System Analysis, Conduct System Analysis, and Select Solutions – with a decision on whether the tentative selection is acceptable, a loop to consider new alternatives, and a documented analysis report as the end product.)

• Preparation for Systems Analysis Shall Be Performed – Preparation for the analytical portion of systems analysis shall be performed, including functional analysis of the system of interest to help identify plausible alternative solutions.
− Establish Guidelines for System Analysis – Establish and maintain guidelines to determine which issues are subject to a formal system analysis process. Guidelines should be reviewed with and accepted by the design team.
− Define/Identify Goals, Objectives, and Constraints – State the goals, objectives, and constraints in general operational terms during the early stages of the project life cycle; as the architecture and design mature, state them more in terms of performance requirements that an aspect of the system of interest must meet (see Section 4.1.1, Requirements Development).
− Perform Functional Analysis – Systematically identify, describe, and relate the functions that the system of interest must perform to fulfill its goals and objectives (see Section 4.1.4, Decomposition).
− Define Plausible Alternatives – Create alternatives that can potentially achieve the goals and objectives of the system. Solicit alternatives from relevant stakeholders. Use brainstorming sessions, literature searches, interviews, and working groups to uncover alternatives. The number of alternatives considered can drive the cost of the analysis, so consider only clearly viable choices.
− Define Selection Criteria – Define how the figures of merit for each trade study or decision will be used to make a tentative selection of the preferred alternative. Define the range and scale (or weighting factors) for ranking the criteria, rank the criteria, assess the criteria and their relative importance, and document the rationale for the selection and rejection of criteria (a minimal weighted-scoring sketch follows these steps).

• Systems Analysis of Alternatives Shall Be Performed – Each plausible alternative shall be analyzed in sufficient detail to support selection of a course of action.



− Define Figures of Merit (also called "measures of effectiveness") – Define quantitatively the accomplishment of the system goals and objectives. Figures of merit are often expressed as polynomials (e.g., cost per ton launched, mean time to repair, data return time).

− Define Measurement Methods – Define which measurement methods are to be used to explicitly identify and model those variables associated with system effectiveness, system performance, technical attributes, and system cost, and that are important in meeting system of interest goals and objectives.
− Collect Data on Each Alternative – Collect data on each alternative to support evaluation by selected measurement methods.
− Compute System Effectiveness, Performance, or Technical Attributes – Compute an estimate of system effectiveness, performance, or technical attributes for each selected alternative.
− Compute Uncertainty Ranges – Compute, for non-point solution outcomes, uncertainty ranges for each alternative.
− Perform Sensitivity Analyses – Estimate, for uncertain key inputs, a range of output values to gauge the sensitivity of the alternative with regard to inputs. Especially in the early stages of design, weighting factors and some "quantified" data can be subjective, so the sensitivity analysis is crucial. If the solution can be changed by making relatively minor changes in data input, the study is probably invalid. Similarly, if significant changes in data input produce no change in output value, the methodology should be reviewed and revised.
− Perform Risk Analysis – Analyze the technical, schedule, and cost risks associated with each alternative.

• A Solution Shall Be Selected – Given the results of alternative analysis, a tentative solution shall be selected and a decision reached as to whether the solution meets the acceptance criteria.
− Make a Tentative Selection – Combine the selection rule with the results of the analysis, align alternatives from most preferred to least preferred, and make a tentative selection. In making this selection, the following questions should be considered:
• Have the goals, objectives, and constraints been met?
• Is the tentative selection robust? Is it heavily dependent on a particular set of input values to the measurement methods? Does it hold up under a range of reasonable input values? What are the risks?
• Is more analytical refinement needed to distinguish among alternatives?
• Have subjective aspects of the problem been addressed?
− Consider New Alternative Solutions – Consider new alternative solutions, criteria, or methods if the proposed alternatives do not test well; repeat the evaluations until alternatives test well.
− Document System Analysis Results – Establish and maintain a system analysis or trade study report documenting all aspects of the analysis. Archive this report to ensure traceability of decisions made throughout the SE process. Ideally, system analysis and trade study reports should be indexed to requirements and records of design decisions for the project.
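As referenced under Define Selection Criteria above, the selection and sensitivity steps can be illustrated with a simple weighted-scoring calculation and a crude check on weight sensitivity. The alternatives, criteria, weights, and scores below are hypothetical and exist only to show the mechanics.

```python
# Weighted-scoring trade study with a simple weight-sensitivity check (hypothetical data).

criteria_weights = {"performance": 0.4, "cost": 0.3, "risk": 0.3}

# Normalized scores (0-1, higher is better) for each alternative against each criterion.
alternatives = {
    "Alternative A": {"performance": 0.95, "cost": 0.50, "risk": 0.60},
    "Alternative B": {"performance": 0.60, "cost": 0.85, "risk": 0.80},
}

def weighted_score(scores, weights):
    return sum(weights[c] * scores[c] for c in weights)

def rank(weights):
    return sorted(alternatives,
                  key=lambda a: weighted_score(alternatives[a], weights),
                  reverse=True)

for name, scores in alternatives.items():
    print(f"{name}: weighted score {weighted_score(scores, criteria_weights):.3f}")
baseline_ranking = rank(criteria_weights)
print("Baseline ranking:", baseline_ranking)

# Sensitivity check: perturb each weight and renormalize; a rank flip flags a fragile selection.
for criterion in criteria_weights:
    perturbed = dict(criteria_weights)
    perturbed[criterion] += 0.1
    total = sum(perturbed.values())
    perturbed = {c: w / total for c, w in perturbed.items()}
    if rank(perturbed) != baseline_ranking:
        print(f"Ranking changes when the '{criterion}' weight is increased - selection is sensitive.")
```

If small weight changes flip the ranking, the study should revisit the criteria, scores, or alternatives rather than accept the tentative selection.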

4.1.13.7 Outputs3
The output of the System Analysis process is a report that describes:

• The system issue under analysis.
• The system goals and objectives (or requirements, as appropriate to the level of resolution) and constraints.
• The figures of merit used in analysis.
• The measurement methods (models) used.
• All data sources used.
• The alternatives chosen for analysis.
• The computational results, including uncertainty ranges and sensitivity analyses performed.
• The selection rule (or decision method) used.
• Recommended alternative and associated risks.
• Revisions to baselines for functional, physical, and operational concepts as a result of analysis.

4.1.13.8 Exit criteria
Completely document the recommended solution with the associated analysis.

4.1.13.9 Measurement7
The table at the end of this section provides example base and derived measures that can be used in conjunction with executing the System Analysis process. See discussion of Measurement on page 4-1. Each of the system analyses may have cost and schedule measures associated with planning and performing the analyses as well as progress measures with respect to completion of the analyses. Each type of analysis will also have specific technical measures related to the topic under analysis.


4.1.13.10 Methods and techniques2,3,7
The following are the methods and techniques used by the System Analysis process:

• Methods
− Tradeoff analysis
− Effectiveness analysis
• Cost effectiveness
• Total ownership cost
• Environmental impacts
• System effectiveness
− Risk analysis (technical, schedule, and cost)
− Functional analysis
− Decision analysis methods (e.g., analytic hierarchy process, etc.)
• Techniques
− Physical Modeling – Wind tunnel model, mockup, engineering model, breadboard model, thermal model, acoustic model
− Virtual Modeling
− Graphical Modeling – Functional flowcharts, behavior diagrams, timeline charts, N2 diagrams, PERT charts, logic trees, document trees, waterfall charts, floor plans, schematics, representative drawings, topographical representations, drawings of systems or components, QFD
− Mathematical Modeling – Dynamic motion models, structural analysis, thermal analysis, vibration analysis, electrical analysis, finite elements, linear programming, cost modeling, network or nodal analysis, decision analysis, flow field studies, hydrodynamics studies, control systems modeling, workflow analysis, reliability and availability models, human reliability analysis, maintainability analysis, process models, entity relationship models
− Statistical Modeling – Monte Carlo, logistical support, process modeling, manufacturing layout modeling, sequence estimation modeling, discrete, continuous

4.1.13.11 Software tools7
Many S/W tools are available to support the System Analysis process. These include typical spreadsheet, database, and drawing tools for visualization, imaging, and analysis, with commercial products available from many vendors. Also available are specialized tools for graphical modeling, mathematical modeling, and statistical modeling. For cost estimating and analysis, see the JSC Cost Estimating Web page at www.jsc.nasa.gov/bu2/. For an extensive listing of commercially available systems analysis tools, see the table that has been provided by INCOSE at www.incose.org/tools/eia632tax/eia632top.html. This table lists tools that support the System Analysis process requirements of EIA-632.2

4.1.13.12 References
The following documents and Web site, which were used to prepare this section, offer additional insights into the System Analysis process:

1NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements.
2EIA-632, Processes for Engineering a System, ANSI/EIA-632-1998, 1999.
3SP 6105, NASA Systems Engineering Handbook, 1995.
4CMMI-SE/SW/IPPD/SS V1.1, Capability Maturity Model Integration (CMMI) for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing, 2002.
5NPR 7120.5B, NASA Program and Project Management Processes and Requirements, 2002.
6EIA-731.1, Systems Engineering Capability Model, 2002.
7INCOSE Systems Engineering Handbook, Version 2.0, 2000.
8System Engineering Fundamentals, Defense Systems Management College, 2001.
9JSC Cost Estimating Web page: www.jsc.nasa.gov/bu2/.

Base Measures | Derived Measures
Planned Analysis Cost | Planned Analysis Cost vs. Actual Cost
Planned Analysis Schedule | Planned Analysis Schedule vs. Actual Schedule
Planned # of Alternatives | Planned # of Alternatives vs. Actual #


4.1.14 Verification1,2
Verification is the methodical development of evidence of system of interest compliance with requirements ("shalls"). It is accomplished at each level of the system architectural hierarchy.

4.1.14.1 Function1,3
The system Verification process is used to ascertain that end items at each level of the system architecture, from the bottom up, meet specified requirements. In the Verification process:

• Verification methods (i.e., test, analysis, inspection, and/or demonstration) shall be developed and documented to identify how each requirement is met.
• Verification planning information shall be documented to describe the detailed processes that will ensure the system of interest complies with its requirements.

• Verification success criteria shall be defined and documented that will indicate successful completion of each verification process.

• The method for submitting, reviewing, and tracking the results of the verification processes shall be defined and documented.

• Verification shall be performed and documented on each part of the system of interest at each level of the architecture hierarchy from the bottom to the top.

4.1.14.2 Objective
The Verification process is performed to determine that the system of interest complies with requirements.

4.1.14.3 Responsibilities
The Lead Systems Engineer reviews verification plans, ensures end-to-end verification is performed, reviews results as to adequacy and compliance, and recommends needs for retest or redesign.

The Verification Lead ensures that requirements are verifiable, develops a verification plan, identifies the most effective verification methods and verification criteria, and coordinates activities with verification facilities, participants, and other team members. The Verification Lead also executes the verification plan, develops or coordinates procedures, collects and documents results, and evaluates results for compliance or need for re-verification.

The Test Operations Lead assures that the required test facilities have been verified to provide necessary conditions and have been scheduled and are available.

The Project Manager assesses the verification plan, reviews results, determines the need for redesign, negotiates access to external facilities or resources, and reviews results with the customer. Also, the Project Manager assigns responsibility for resolution of unsuccessful verification activities and manages closure of discrepancy reports.

4.1.14.4 Life cycle
Verification-related activity of some kind is usually in progress during all phases through the System Acceptance Review (SAR), with conceptual planning documentation required as early as the Concept Review (CR). Execution usually concludes by the SAR. However, for some systems, verification continues by exception later in Phase D, Development, prior to launch, and even during Phase E, Operations (i.e., orbital testing).

4.1.14.5 Inputs
The Verification process is principally tracked by means of a verification matrix. This matrix is the project record of the identity, performance, and outcome of the verification activity for each of the system "shall" requirements from all project requirement documents. It is used first to capture the plan for the verification, and it is then used to summarize results for the record. The table below illustrates the headings and one entry of such a matrix.

Reqmt. No.: P-1
Document of Origin: JSC-nnnn
Paragraph: 3.2.1.1, Capability: Support Uplinked Data
Shall Statement Quotation: System shall provide a maximum ground-to-station uplink
Verification Success Criteria: 1. System locks to forward link at min./max. data rate tolerance. 2. System locks forward link at the min. and max. operating frequency tolerances.
Verification Method: Test
Facility or Lab ID: Electronic Systems Test Lab (ESTL)
Phase: 5 [code for "formal system-level functional phase"]
Acceptance Requirement?: [Is this requirement also verified during initial acceptance testing of each unit?]
Preflight Acceptance?: [Is this requirement also verified during any preflight or recurring acceptance testing of each unit?]
Performing Org.: JSC/EV
Results: Report nnn (identifies the documents that contain objective evidence that the requirement was satisfied)
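For a small or simple project, a matrix like the one above is often kept in a spreadsheet or an elementary custom tool (see 4.1.14.11). The following sketch is illustrative only and is not a requirement of this document: it assumes a hypothetical Python representation whose field names mirror the example headings, and it writes the matrix to a CSV file that spreadsheet S/W can open directly.

```python
import csv
from dataclasses import asdict, dataclass, fields

@dataclass
class VerificationMatrixRow:
    """One row of a project verification matrix (fields mirror the example headings above)."""
    reqmt_no: str
    document_of_origin: str
    paragraph: str
    shall_statement: str
    success_criteria: str
    method: str                # Test, Analysis, Inspection, or Demonstration
    facility_or_lab: str
    phase: str
    acceptance_reqmt: bool     # also verified during initial acceptance testing of each unit?
    preflight_acceptance: bool # also verified during preflight or recurring acceptance testing?
    performing_org: str
    results: str               # reference to objective evidence; blank until verification completes

# Hypothetical entry corresponding to the example row shown above.
row = VerificationMatrixRow(
    reqmt_no="P-1",
    document_of_origin="JSC-nnnn",
    paragraph="3.2.1.1",
    shall_statement="System shall provide a maximum ground-to-station uplink ...",
    success_criteria="Locks to forward link at min./max. data rate and frequency tolerances",
    method="Test",
    facility_or_lab="Electronic Systems Test Lab (ESTL)",
    phase="5",
    acceptance_reqmt=True,
    preflight_acceptance=False,
    performing_org="JSC/EV",
    results="Report nnn",
)

# Write the matrix to a CSV file so it can be reviewed and configuration-controlled like a spreadsheet.
with open("verification_matrix.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(VerificationMatrixRow)])
    writer.writeheader()
    writer.writerow(asdict(row))
```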


The following should be inputs to the process of creating the verification matrix, both from the perspective of aggregating the "shall" requirements and from the perspective of understanding the environment surrounding those requirements:

• Concept of operations documentation
• Mission needs and goals
• Requirements and specifications
• ICDs
• Testing standards and policies
• Agency standards and policies

4.1.14.6 Steps
The following diagram (Figure 4.1-14) illustrates the major steps and products of the Verification process. The three major steps are (1) prepare for verification, (2) perform verification, and (3) analyze and document results. These steps are progressively and iteratively executed until the complete system is verified for all requirements.

• Project Shall Prepare for Verification
− Select the system or subsystem components that are to be verified and the verification methods that will be used for each. The method is selected from test, analysis, demonstration, or inspection based on its ability to prove that the system meets its requirements.

− Develop the operational scenario, environment, or constraints for the verification activity and base these on a set of requirements to be verified.

− Verification criteria are established and documented.

− Verification criteria are approved by customers and S&MA.

− Procedures are developed for the verification activity and are approved by quality engineering.

• Project Shall Perform Verification Activity
− The verification activity is configured as appropriate to reflect the selected environment. H/W is configured, S/W is loaded, models are set up, tools are tested, and simulations are established as appropriate.
− The verification activity is performed and witnessed by QA.
− Quantitative results from the test, analysis, inspection, or demonstration activity are recorded as appropriate.

Figure 4.1-14. Verification process diagram. [The diagram shows three step groups and their products: Prepare for Verification (select product for verification; define and document verification method(s) and environment; develop and document verification success criteria; establish and document verification procedures and plans), Perform Verification (configure or model the system; perform the verification procedure; record results as data or observations), and Analyze and Document Results (evaluate results against criteria; if the decision is not "OK to proceed," address design issues; document results, analysis, and decisions in a report of verification results).]


− Discrepancies should be documented via discrepancy reports.

• Project Shall Analyze and Document Results
− Data resulting from the verification activity are analyzed against the defined verification criteria.
− Results of this analysis are used to determine whether the product at this point in the life cycle meets the requirements or fails to meet the requirements, resulting in a need for either a waiver or a decision on whether a modification to the design is warranted.

− All verification results, analyses, and conclusions concerning product adequacy are documented.

4.1.14.7 Outputs
The ultimate need is for documentation of objective evidence that all "shall" requirements have been verified. The following outputs support fulfillment of that need:

• Verification plans
• Completed verification requirements matrix
• Verification results and analysis
• Verification procedures
• Test, demonstration, inspection, and analysis reports
• Waivers
• Discrepancy reports and descriptions of their closures
4.1.14.8 Exit criteria
Process exit criteria include:

• Objective evidence of compliance with, or waiver of, each system of interest "shall" requirement shall be documented.

• The verification process shall not be considered or designated as complete until all discrepancy reports are closed.

4.1.14.9 Measurement
The following table provides example base and derived measures that can be used in conjunction with executing the Verification process. See discussion of Measurement on page 4-1.

Base measures (with corresponding derived measures in parentheses):
• Total # of "shall" requirements
• Total # of "shall" requirements verified to date (derived: total verified to date vs. total # of "shall" requirements, %)
• Total # of "shall" requirements complied with (derived: total complied with vs. total # of "shall" requirements, %)
• Total # of "shall" requirements waivered (derived: total waivered vs. total # of "shall" requirements, %)
• Total # of "shall" requirements re-verified (derived: re-verifications vs. total # of "shall" requirements verified, %)
• Total verification effort (FTEs) (derived: verification rates; verification productivity)
• # of major verifications, above a specific dollar or hour level (derived: # of major verifications vs. total # of "shall" requirements verified to date)

4.1.14.10 Methods and techniques
Verification may be accomplished by a number of techniques and methods that can be categorized as tests, analyses, inspections, and/or demonstrations. The following definitions are offered not to constrain verification planning with narrow a priori identities, but to stimulate ideas for planning, documenting, and employing the specific approaches that will be used in any given project to prove the fidelity of project-unique designs and products to original requirements.

• Test – A method of verification wherein formal project requirements are verified by measurement or functional test during or after the controlled application of functional and/or environmental stimuli. These measurements may require the use of laboratory equipment, recorded data, procedures, test support items, or specialized S/W.

• Analysis – A verification method using techniques and tools such as math models, prior test data, simulations, analytical assessments, etc. Verification by similarity is acceptable if the subject article is similar or identical in design, manufacturer, manufacturing process, and quality control to another article that has been previously verified to equivalent or more stringent criteria.

• Inspection – A method of verification of physical characteristics that determines compliance using only vision and visible-spectrum optics (microscopes, fiberscopes, etc.).

• Demonstration – A method of verification that evaluates the integrated properties of the subject end item by staging a loosely controlled operational scenario based on the operations concept. Demonstration participants are user-oriented; i.e., they interface with the subject end item primarily to employ it as fully as possible to meet user needs in that situation, with little emphasis on accommodating limitations of the item other than safety considerations. Demonstration can be conducted with or without special test equipment or instrumentation to verify required operational characteristics such as endurance, ruggedness, human engineering features, service and access features, transportability, and displayed data.

4.1.14.11 Software tools
A S/W tool intended to support verification management should possess the ability to track the identity, origin, verification method, facility, success criteria, and current degree of fulfillment of each requirement. For large or complex projects, COTS requirements management tools (e.g., DOORS and SLATE) usually include features that can convert requirements loaded in them into verification matrices with the required elements of information. INCOSE offers an excerpt from a S/W productivity solutions review of such tools at http://www.incose.org/tools/reqsmgmt.html. For small or simple projects, word processor or spreadsheet S/W is often adequate for manual tracking or building an elementary custom tool.
4.1.14.12 References
The following documents, which were used to prepare this section, offer additional insights into the Verification process:

1 NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements.
2 NPR 7120.5B, NASA Program and Project Management Processes and Requirements, 2002.
3 EIA-632, Processes for Engineering a System, ANSI/EIA-632-1998, 1999.
4 Analysis of Automated Requirements Management Capabilities. Developed in support of Advanced System Engineering Automation (ASEA) CSC-2.7 Requirements/Design Manager (Contract No. F30602-93-C-0123). Prepared for Rome Laboratory, Air Force Materiel Command C3CB, 525 Brooks Rd., Griffiss AFB, NY 13441, by Software Productivity Solutions, Inc., 122 4th Ave., Indialantic, FL 32903.
5 CMMI-SE/SW/IPPD/SS V1.1, Capability Maturity Model Integration (CMMI) for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing, 2002.


4.1.15 Validation1,2
While verification proves "whether the system was done right," validation proves "whether the right system was done." That is, verification provides objective evidence that every "shall" was met, whereas validation is performed for the benefit of the customers and users to ensure that the system functions in the expected manner when placed in the intended environment. This is achieved by examining the products of the system at every architectural level.
4.1.15.1 Function1,2
Validation is performed on system of interest and subsystem products. The Validation process is conducted to ensure that system and subsystem products are ready for the uses, functions, and missions implied or suggested by both the requirements and any other relevant project or program information. In the Validation process:

• Validation method(s) (i.e., test, analysis, inspection, and/or demonstration) shall be defined and documented.

• Validation planning information shall be developed and documented to describe the processes that will ensure the system of interest will meet customer needs.

• Validation success criteria shall be defined and documented that will indicate successful completion of each Validation process.

• The methods for submitting, reviewing, and tracking the results of the Validation process shall be defined and documented.

• Validation shall be conducted and documented on each part of the system of interest at each level of the architecture hierarchy, from the bottom to the top.

4.1.15.2 Objective
Validation is performed to determine whether customer needs can be met in the context of the expected operational scenarios of the system of interest.
4.1.15.3 Responsibilities
The Lead Systems Engineer works with customers and users to identify desired validation activities, reviews the validation plan, ensures end-to-end validation is performed, and reviews results to determine adequacy.

The Validation Lead develops the validation plan, identifies the most effective validation method, and coordinates activities with validation facilities, participants, and other team members. The Validation Lead also executes the validation plan, develops procedures, and collects and documents results.

The Project Manager assesses the validation plan, negotiates access to external facilities or resources, assigns responsibility for resolution of unsuccessful validation activities, and manages closure of issues and actions.
4.1.15.4 Life cycle
Validation-related activity is usually in progress during all phases through the SAR, with conceptual planning documentation required as early as the PDR. Although execution usually concludes by the SAR, for some systems validation continues by exception later in Phase D, Development, prior to launch, and even during Phase E, Operations (e.g., orbital testing).
4.1.15.5 Inputs
The Validation process is principally tracked by means of a validation matrix. This matrix, which is the project record of the identity, performance, and outcome of each validation activity, is used first to capture the plan for validation and then to summarize results for the record. The table below illustrates the headings and one entry of such a matrix:

Validation Product #: 1
Activity: The customer/sponsor will evaluate the candidate displays.
Objective: 1. Ensure legibility is acceptable. 2. Ensure overall appearance is acceptable.
Method: Demonstration
Facility or Lab ID: ESTL
Phase: 1 [code for "during product selection process"]
Performing Org.: JSC/EV
Results: Customer asserts that displays are both legible and acceptable overall.

The following should be inputs to the process of creating the validation matrix, both from the perspective of aggregating implied or suggested customer expectations beyond the "shall" requirements and from the perspective of understanding the environment surrounding those needs:

• Concept of operations documentation
• Mission needs and goals
• Requirements and specifications
• ICDs
• Testing standards and policies
• Agency standards and policies

4.1.15.6 Steps3
Figure 4.1-15 illustrates the major steps and products of the Validation process. The major steps are to (1) prepare for validation, (2) perform validation, and (3) analyze and document results. These steps are progressively and iteratively executed until the complete system is validated for all expected environments.



Figure 4.1-15. Validation process diagram. [The diagram shows three step groups and their products: Prepare for Validation (select product for validation; define and document validation method(s) and environment; develop and document validation success criteria; establish and document validation procedures and plans), Perform Validation (configure or model the system; perform validation procedures; record results as data or observations), and Analyze and Document Results (evaluate results against criteria; if the decision is not "OK to proceed," address design issues; document results, analysis, and decisions in a report of validation results).]

• Project Shall Prepare for Validation
− Select the system or subsystem components to be validated and the validation methods that will be used for each. The method is selected from test, analysis, demonstration, or inspection and is based on its ability to prove that user and customer needs are satisfied.

− Develop the operational scenario, environment, or constraints for the validation activity, and base these on customer and user needs and expectations.

− Ensure validation criteria are established and agreed to by users and customers.

− Develop and document procedures for the validation activity.

− Ensure that quality engineering approves procedures for the validation activity.

• Project Shall Perform the Validation Activity
− The validation activity is configured as appropriate to reflect the selected environment. H/W is configured, S/W is loaded, models are set up, tools are tested, and simulations are established as appropriate.

− The validation activity is performed and witnessed by QA, preferably with observation or participation by customer and users.

− Results of the validation activity are recorded as appropriate. User and customer subjective reactions to the activity are noted, as are quantitative results from the test, analysis, inspection, or demonstration.

− Issues and actions arising from validation activity should be documented and tracked.

• Project Shall Analyze and Document Results
− Data resulting from the validation activity are analyzed against the defined validation criteria.

− Results of the analysis are used to determine whether the product at this point in the life cycle meets customer and user needs and expectations, or whether a modification to the design is warranted.

− All validation results, analysis, and conclusions as to product adequacy are documented.

4.1.15.7 Outputs
The ultimate need is for documentation of objective evidence that shows all implied or suggested customer expectations have been validated. The following outputs support fulfillment of that need:



• Validation plans and procedures
• Validation evaluation results, including issues and actions found and descriptions of their resolution
4.1.15.8 Exit criteria
Exit criteria for this process include:

• Objective evidence of performance and results of each system of interest validation activity shall be documented.

• The validation process shall not be considered or designated as complete until all issues and actions are resolved.

4.1.15.9 Measurement
The table at the end of this section provides example base and derived measures that can be used in conjunction with executing the Validation process. See discussion of Measurement on page 4-1.
4.1.15.10 Methods and techniques
Validation may be accomplished by a number of techniques and methods that can be categorized as tests, analyses, inspections, and/or demonstrations. The following definitions are offered not to constrain validation planning with narrow a priori identities, but to stimulate ideas for planning, documenting, and employing the specific approaches that will be used in any given project to examine whether system of interest products will satisfy the customer:

• Test – A method of validation wherein requirements (performance, environment, etc.) are proven by measurement or functional test during or after the controlled application of functional and/or environmental stimuli. These measurements may require the use of laboratory equipment, recorded data, procedures, test support items, or specialized S/W.

• Analysis – A validation method using techniques and tools such as math models, prior test data, simulations, analytical assessments, etc. Validation by similarity is acceptable if the subject article is similar to or identical in design, manufacturer, manufacturing process, and quality control to another article that has been previously verified or validated to equivalent or more stringent criteria.

• Inspection – A method of confirmation of physical characteristics that determines compliance using only vision and visible-spectrum optics (microscopes, fiberscopes, etc.).

• Demonstration – A qualitative method of validation that evaluates the integrated properties of the subject end item by staging a loosely controlled operational scenario based on the operations concept. Demonstration participants are user-oriented; i.e., they interface with the subject end item primarily to use it as fully as possible to meet user needs in that situation, with little emphasis on accommodating limitations of the item other than safety considerations. Demonstrations can be conducted with or without special test equipment or instrumentation to validate implied customer expectations beyond "shall" requirements.

4.1.15.11 Software tools
An S/W tool intended to support validation management should possess the ability to track the identity, objective, validation method, facility, phase, performing organization, and results of each activity. For large or complex projects, COTS requirements management tools (e.g., DOORS and SLATE) usually include features that can convert the requirements loaded into them into verification matrices with the required elements of information. This feature can also be used in the same manner to produce validation matrices. INCOSE offers an excerpt from an S/W productivity solutions review of such tools at http://www.incose.org/tools/reqsmgmt.html. For small or simple projects, word processor or spreadsheet S/W is often adequate for manual tracking or building an elementary custom tool.

Base measures (with corresponding derived measures in parentheses):
• Total # of validation products
• Total # of project changes caused by validation (derived: total # of project changes caused by validation vs. total # of project changes, %)
• Total validation effort (FTEs) (derived: validation rates; validation productivity)
• # of major validation products, above a specific dollar or hour level
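The derived measures in the verification and validation tables are simple ratios of the corresponding base counts. The sketch below is illustrative only; the counts are hypothetical and the measure names are abbreviations of those in the tables.

```python
def percent(part: int, whole: int) -> float:
    """Derived measure: part as a percentage of whole (0.0 if whole is zero)."""
    return 100.0 * part / whole if whole else 0.0

# Hypothetical base measures collected at a monthly status point.
total_shalls = 240
shalls_verified_to_date = 180
shalls_complied_with = 168
shalls_waivered = 6
reverifications = 9
total_project_changes = 40
changes_caused_by_validation = 7

derived = {
    "Verified to date vs. total shalls (%)": percent(shalls_verified_to_date, total_shalls),
    "Complied with vs. total shalls (%)": percent(shalls_complied_with, total_shalls),
    "Waivered vs. total shalls (%)": percent(shalls_waivered, total_shalls),
    "Re-verifications vs. shalls verified (%)": percent(reverifications, shalls_verified_to_date),
    "Changes caused by validation vs. total changes (%)": percent(
        changes_caused_by_validation, total_project_changes
    ),
}

for name, value in derived.items():
    print(f"{name}: {value:.1f}")
```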


4.1.15.12 References
The following documents, which were used to prepare this section, offer additional insights into the Validation process:
1 EIA-632, Processes for Engineering a System, ANSI/EIA-632-1998, 1999.
2 NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements.
3 CMMI-SE/SW/IPPD/SS V1.1, Capability Maturity Model Integration (CMMI) for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing, 2002.
4 NPR 7120.5B, NASA Program and Project Management Processes and Requirements, 2002.
5 Analysis of Automated Requirements Management Capabilities. Developed in support of ASEA CSC-2.7 Requirements/Design Manager (Contract No. F30602-93-C-0123). Prepared for Rome Laboratory, Air Force Materiel Command C3CB, 525 Brooks Rd., Griffiss AFB, NY 13441, by Software Productivity Solutions, Inc., 122 4th Ave., Indialantic, FL 32903.


4.1.16 Reviews1
Several types of reviews are used in the project management environment, each with its own purpose. Reviews generally fall into three major categories: programmatic reviews, external reviews, and technical reviews. Programmatic reviews, such as the Engineering Review Board and the Project Management Council reviews, are used to maintain broad governance over project processes. External reviews, such as non-advocate reviews and independent assessments, provide insight into the health of specific technical and managerial areas of projects from non-affiliated expert authorities. This section deals with project technical reviews. The technical Reviews process is executed to establish an assessment of system of interest status. This status is then used to determine readiness to proceed through a progressive maturation of processes to develop the final system of interest. Technical reviews required in support of the project life cycle are detailed in Chapter 3.
4.1.16.1 Function1,2
Reviews are conducted to communicate an approach, demonstrate an ability to meet requirements, or establish status. Reviews help to develop a better understanding among project participants, open and maintain communication channels, alert participants and management to problems, and open avenues for solutions. In the Reviews process:

• The purpose, scope, and entry/exit criteria for conducting the review shall be established.

• The technical products to be assessed, as part of the specified review, shall be identified.

• A technical assessment of the review products shall be conducted, and issues shall be documented and tracked to closure.

• Customer and stakeholder participation and role(s) in the review shall be established.

• Objective evidence that the review objectives are met shall be documented.

4.1.16.2 Objective2
The purpose of a review is to furnish the forum and process through which to provide NASA management and their contractors assurance that the most satisfactory approaches, plans, or designs have been selected, that configuration items have been produced to meet specified requirements, or that configuration items are ready.
4.1.16.3 Responsibilities
The Convening Authority is responsible for initiating major milestone project technical reviews. The PMP identifies the convening authority, which may be review-dependent. Typically, in the case of an institutional support project, the Convening Authority resides within the project program or line management.

The Project Manager is responsible for working with the convening authority to ensure technical reviews occur at the proper level of project maturity. In addition, the Project Manager ensures findings of the review panel/board are properly addressed to enable management decisions allowing project life cycle progression.

The Lead Systems Engineer is responsible for facilitating the Reviews process and ensuring communication of project technical status to review participants.
4.1.16.4 Life cycle2
Typical reviews are found in or at the end of all project life cycle phases.
4.1.16.5 Inputs2
Reviews process input consists of a project with associated products possessing maturity consistent with the established review entry criteria.
4.1.16.6 Steps2
As with inputs and outputs, the specific steps of each review are unique. However, the general way of doing business for reviews is fairly consistent, as shown in Figure 4.1-16.

• Establish the Review – The review convening authority, in coordination with the project manager, shall establish and document initial guidance as follows:
− Purpose
− Scope
− Entry and exit criteria
• Verify Entry Criteria Are Met – The convening authority, in coordination with the project manager, shall determine whether requisite life cycle steps have been taken and whether the documents, H/W, S/W, and other items that are to be examined possess maturity consistent with review entry criteria.
• Appoint Review Board/Panel Members and Chair and Establish Participant Roles
− The convening authority shall appoint the members.
• Avoid appointing persons involved with the project as chair or as a majority of the members.
− Highly Recommended – In addition to board appointments, invite a review audience that includes both NASA and contractor personnel who are not directly associated with the project. This:
• Uses cross-discipline expertise.


Figure 4.1-16. Reviews process diagram. [The diagram shows the flow from Agency/Center guidelines and requirements through establishing the review (purpose, scope, entry/exit criteria), verifying that entry criteria are met, appointing review board/panel members and chair and establishing participant roles (project team, customer, other stakeholders), establishing review ground rules (publish the meeting agenda, identify technical products to assess, describe the CR and action item disposition processes), and conducting the technical assessment (review technical products, hold the review meeting, verify whether exit criteria are met), to documenting evidence that review objectives are or are not met in a review report.]

• Helps to identify design shortfalls and recommend design improvements.

• Should include non-project specialists in the area under review, production/fabrication, testing, QA, reliability, and safety. If indicated, both contractor and NASA contracting officers should be included.

− The convening authority shall delineate roles and responsibilities for all participants, including:
• Project team
• Customer
• Other stakeholders

• Establish Review Ground Rules – The chair shall publish and assure the availability of the:
− Agenda of the review meeting.
− Technical products to be assessed by the review, including all applicable documents necessary to support the review.
− Process for dispositioning requests for action and change requests.

• Conduct Technical Assessment
− Review Technical Products – Copies of prepared materials (e.g., presentation charts) should be provided to the review board and meeting attendees in advance of the review meeting.
• For major reviews, this could be as far in advance as 30 calendar days.
• Specifications, drawings, analysis reports, and any other documents/technical products capturing the approaches, plans, and designs to be reviewed should be included.
− Board members and attendees may submit requests for action or change requests (CRs) in advance of the meeting that document a concern, deficiency, or recommended improvement in the presented approaches, plans, or designs.

− Hold the Review Meeting
• Conduct Oral Presentations with Cognizant Subsystem Engineers to:
− Explain applicable project requirements.
− Describe approaches, plans, or designs devised to date to satisfy requirements.
− Propose to the chair dispositions of CRs and action items submitted prior to the meeting.



− Describe documentation/technical-product maturity.

− Request Baselining of completed documentation/technical products that require CM.

• Continue Change/Action Traffic – Board members and attendees may continue to submit requests for action or CRs at the meeting.

• Formulate and Record Decisions – After developing consensus among members, the chair declares and records:*
− Dispositions of CRs and action items submitted both prior to and during the meeting.
− Either that the appropriate approaches, plans, or designs briefed at the meeting or submitted out-of-board are accepted and that the next steps of the project are authorized; or that there are issues with these items, and enunciates those issues and assigns appropriate action items for the record.
− Either that completed documents/technical products capturing the accepted approaches, plans, or designs are baselined and are to be taken under CM to support subsequent project action; or that there are issues with these items, and enunciates those issues and assigns appropriate action items for the record.

− Verify Whether Exit Criteria Are Met – The board chair:
• Finalizes consensus on the findings of the board shortly following the review meeting, including:
− Recommendation for or against proceeding with subsequent project life cycle steps
− Products baselined
− Changes accepted, rejected, modified, and withdrawn

*For most reviews, the chair is supported by CM and other administrative resources in taking notes, generating minutes, tracking action items and CRs, and tracking products under configuration control.

− Actions assigned and the persons responsible for them
− All open issues and plans for closing them
− Risks from problem areas

• Decides, based on established review exit criteria, whether exit criteria have been met.

• Document Evidence That Review Objectives Are/Are Not Met – The chair shall submit a written report to the convening authority capturing all of the above findings.

4.1.16.7 Outputs
The output of the Reviews process consists of documented evidence produced by the review chair. This evidence addresses the results of the review, including the recommendation as to whether the project has or has not met the established review exit criteria. Characteristic outputs for individual technical reviews are listed in SP 6105.2

4.1.16.8 Exit criteria2
The Reviews process exit criteria shall consist of documentation stating that the review criteria have been met.
4.1.16.9 Measurement
The following table provides example base and derived measures that can be used in conjunction with executing the Reviews process. See discussion of Measurement on page 4-1.

Base measures (with corresponding derived measures in parentheses):
• Duration of the review (derived: duration of the review, planned vs. actual)
• Review effort (FTEs) (derived: review effort, planned vs. actual)

4.1.16.10 Methods and techniques
Methods and techniques useful and necessary during the Reviews process include:

• Effective Communication – Effective communication will be necessary to ensure that current project status is effectively conveyed to review board members and that all review participant inputs and concerns are effectively transmitted to the project team and, ultimately, to the convening authority.



• Configuration Management – This will be used to manage the review of products and to ensure all concerns are captured and addressed.

• Review Checklists – These are used to summarize items such as the desired composition of the review team, lists of supporting technical documentation, standard entry/exit criteria, etc.

• Templates – These are used to document the review plans, comments, reports, etc.

4.1.16.11 Software tools
Tools available to support the Reviews process include a variety of applications that enhance communications and maintain configuration control throughout the Reviews process, including accessibility of review products to review participants and generation and tracking of review comments and discrepancies to closure.
4.1.16.12 References
The following documents, which were used to prepare this section, offer additional insights into the Reviews process:
1 NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements.
2 SP 6105, NASA Systems Engineering Handbook, 1995.


4.2 Project Control Processes
The objective of the project control processes section of this document is to clearly redefine the scope of project control. Only after the proper scope of the project control function is clearly defined and agreed to can the issues of organizational responsibility and assignment be addressed. It should be emphasized that project control is not a collection of independent, stand-alone functions. It is instead a process that requires the integration of all of its functions to derive benefits for effective project management. This section of the document presents a "project control model" for the various functions, practices, and processes that are essential in supporting the successful management of NASA JSC projects. Sound project control practices over the years have contributed to the remarkable accomplishments attained by our Agency and our Center. JSC project control activities have four significant and key characteristics. These are to:

1. Continually focus project management attention on areas critical to successful project execution. (Focus on what is important.)

2. Establish and document a project baseline, including:
• A project requirements list.
• A project schedule consisting of significant project-level milestones.
• An interconnected, sequential network of tasks required to complete the project (through retirement).
• A time-phased life cycle cost (LCC) estimate achieved by "resource loading" (e.g., costs for facility, workforce, materials, other direct costs (ODCs), and indirect costs) of the project network.

3. Emphasize timely responses.
4. Provide a "closed-loop" system for measurement and analysis of the project that:
• Provides a common method for defining, collecting, analyzing, and reporting measurements.
• Provides guidance and requirements for identifying and managing measurement data.
• Is a useful tool for assessing progress, product, and process performance.

Measurement and analysis is an integral part of project planning, project estimating, project management, and continuous improvement and spans the entire life cycle of a project. It supports the identification, collection, analysis, and reporting of three measurement types: progress, product, and process.

• Progress measures are used to help work groups that build products assess the groups' performance against project goals.

• Product measures are used to ensure that defects are found early in the development life cycle, resulting in reduced risk and cost.

• Process measures are used to ensure processes meet customer expectations and are properly implemented.

Proactive use of these measurement types enables managers to identify and monitor trends; determine potential cost, schedule, and performance impacts; analyze and prioritize risks and opportunities; and propose corrective action to minimize or eliminate risks through integrated schedules and/or project plan revisions.
4.2.1 Resource Management1,2,3
4.2.1.1 Function
Resource Management plans and develops an integrated process that involves the preparation, review, and maintenance of project resource needs. It is responsible for reviewing, analyzing, and interpreting data in a timely manner to effectively support project implementation in support of the NASA Strategic Plan and associated Center roles and missions. A major responsibility of Resource Management is to define and lead the business process to ensure project success through the budget formulation and execution process. It also assesses the political environment, performs requirements analysis, performs metrics analysis, and develops resource strategies to facilitate the project and Center budget process and schedules. The Resource Management process will be able to rapidly respond to both internal and external inquiries during the entire budget formulation cycle of the project. The Resource Management function shall:

• Ensure that all project needs are adequately covered and properly time-phased in the budget submission and that significant resource issues are identified during the POP process.

• Monitor cost performance on an ongoing basis by conducting plan vs. actual cost assessments and related analyses.

• Ensure that the flow of funds is being planned, expedited, tracked, and analyzed to guarantee timely use of project resources.

• Make sure that the data and information provided to the project management team are timely and accurate.

• Recognize and quantify risks and uncertainties by allowing for adequate reserves and allowances. The recognition of uncertainties and quantification of risks are vital to the success of any project; and having contingency funds with a judicious process for allocating them is an essential element for managing projects, especially in the research and development (R&D) environment.

• Resolve and defend unforeseen funding requirements.

• Be available to advise project managers in all aspects of this critical project control function.


A key product of the Resource Management function is a funding needs profile that is formulated, monitored, and maintained, including such drivers as facilities and labor requirements. Because labor drives a significant portion of development costs, keeping abreast of the status and trends in this area is critical. The project budget profile, established late in the Phase A effort to be consistent with the cost, schedule, and technical baseline of the project, will be maintained to reflect only the impact of changed project scope or schedule. In contrast, the project funding requirements will be continually updated to reflect the latest assessment of resource needs and incorporated in the POP process. A primary value of the cost collection effort in the resource management function is to predict future performance and to contribute to project management's decision-making process. The Resource Management process relies heavily on project performance measurement efforts, which integrate the development and maintenance of the cost, schedule, and technical baseline of the project.
4.2.1.2 Objective
The objective of resource management is to establish and ensure consistency between resource availability and project resource needs.
4.2.1.3 Responsibilities
The Project Control Officer is responsible for all aspects of the Resource Management process, including notifying the Project Team of when and what data are required, coordinating with any project-external organizations, ensuring timely project-internal review of the final product, and delivering the final product. Other participants in this process include the:

• Project Manager, who is responsible for ensuring the Project Control Officer has the facilities and workforce needed to accomplish this task. The Project Manager also acts as a customer to the Project Control Officer for this process.

• Entire Project Team, which also participates in this process by providing data, supporting rationale, and recommendations for improvement.

4.2.1.4 Life cycle
Resource management begins in Phase A and is conducted throughout the life cycle of the project.
4.2.1.5 Inputs
Inputs to the Resource Management process include:

• Civil service labor costs (planned)
• Civil service labor costs (actual)
• Contractor labor costs (planned)
• Contractor labor costs (actual)
• Facility costs (planned)
• Facility costs (actual)
• Material costs (planned)
• Material costs (actual)
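One way to support the plan-versus-actual cost assessments described in 4.2.1.1 is to compare these planned and actual inputs by category each reporting period. The sketch below is illustrative only; the categories, dollar figures, and variance threshold are hypothetical.

```python
# Hypothetical planned vs. actual costs (in $K) for one reporting period,
# organized by the Resource Management input categories listed above.
planned = {"civil service labor": 320, "contractor labor": 540, "facility": 75, "material": 110}
actual = {"civil service labor": 331, "contractor labor": 562, "facility": 70, "material": 145}

THRESHOLD_PCT = 10.0  # hypothetical reporting threshold for cost variance

for category in planned:
    variance = actual[category] - planned[category]
    variance_pct = 100.0 * variance / planned[category]
    flag = "  <-- exceeds threshold, investigate" if abs(variance_pct) > THRESHOLD_PCT else ""
    print(f"{category:22s} plan {planned[category]:6.0f}  actual {actual[category]:6.0f}  "
          f"variance {variance:+6.0f} ({variance_pct:+.1f}%){flag}")
```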

4.2.1.6 Steps
Resource Management activities are initiated by the program operating plan (POP) process, as discussed in Section 2.8.3 and as described fully in LA-CWI-01, Budget Planning Process. The project manager shall prepare a POP in conformance with instructions and guidance provided by the Center Chief Financial Officer (CFO). The CFO is responsible for the POP process. Guidance for developing the POP is provided in the Center POP Call issued under signature of the CFO. In addition, common work instruction LA-CWI-01 documents the POP process at JSC. It is critical that the POP submission realistically project the cost required to proceed according to the project management plan (PMP). The project POP should also identify any over-guideline requirements and associated impact statements assessing the risk to the technical performance of project schedules due to lack of required funds. The project manager should structure the POP submittal to minimize the risk associated with normal fluctuations in available funding as a result of the authorization, appropriation, and apportionment process or any delays. The project manager should also include a projection of the total LCC of the project. These steps are illustrated in Figure 4.2-1.
4.2.1.7 Outputs
Outputs of the Resource Management process include:

• Annual project budget submissions (funding requirements)

• Completed POP (time-phased estimate to complete (ETC) by phase and to completion)

• Workforce forecasts (by phase and to completion)
• Cost forecasts (by phase and to completion)

4.2.1.8 Exit criteria
The exit criterion for the Resource Management process is a completed work breakdown structure (WBS).
4.2.1.9 Measurement
The following table provides example measures that can be used in conjunction with executing the Resource Management process. See discussion of Measurement on page 4-1.

Base measures (with corresponding derived measures in parentheses):
• Total # of internal task agreements (ITAs) developed (derived: time to complete each ITA, from draft to final signature; average time for all ITAs to be completed, with final signature)
• Total $ value of all ITAs (in full cost); ITA $ costed to date (in full cost) (derived: monthly ITA $ costed to date, planned vs. actual)
• Support contractor total $ value of contract; support contractor $ costed to date (derived: monthly contractor $ costed, planned vs. actual)
• Total # of civil servants on project (derived: monthly level of civil servants, planned vs. actual)
• Total # of contractors on project (derived: monthly level of contractor personnel, planned vs. actual)
• % of WBS development complete
• Total material costs (planned vs. actual), if not already captured in ITAs (derived: monthly material costs, planned vs. actual)
• Management reserve remaining ($) (derived: monthly level of management reserve ($) available)
• Purchase request (PR) tracking log

4.2.1.10 Methods and techniques
The methods and techniques that may be used in developing resource requirements are:


• Resource (workforce, cost, etc.) histogram
• Individual or group expert judgment
• Statistical analysis of historical data
• JSC SLP 4.20, Process Measurement and Improvement
• Contractor financial reporting systems for contractor-supported/NASA-managed projects
4.2.1.11 Software tools
The two principal considerations for a tool that would support the Resource Management process are (1) consistent, concise, and thorough documentation of the project funding status; and (2) common and convenient accessibility and visibility to all project team members and stakeholders. The following S/W tools satisfy these considerations:

• Excel
• Microsoft PowerPoint

4.2.1.12 References
The following documents, which were used to prepare this section, offer additional insights into the Resource Management process:
1 SP 6103, NASA Readings in Project Control, 1994.
2 NPR 7120.5B, NASA Program and Project Management Processes and Requirements, 2002.
3 Full Cost Initiative Agencywide Implementation Guide: http://ifmp.nasa.gov/codeb/library/fcimplementation.pdf


Figure 4.2-1. Resource management process diagram. [The diagram shows the flow from defining and documenting the resource management plan (strategy, processes including POP schedules, procedures, reserves management approach, and process metrics) through completing project resource needs by WBS; checking that resource projections are realistic, accurate, and cover the entire project scope and life cycle; identifying potential reserves levels and obtaining project manager approval; reviewing resource needs and phasing (e.g., proper phasing for contracts, workforce, and facility support schedules) and identifying potential issues and mitigation plans; documenting the project resource budget, obligation, and cost plan (the POP, with Form 533 inputs if applicable); obtaining project manager approval and submitting to the Center/program POP call; and, once the project operating plan (budget) is approved or a new budget is required per the annual POP schedule, developing monthly status reporting, reanalyzing the POP and process metrics monthly, analyzing any identified issues (including root cause and potential future resource impacts), identifying new performance/process metric indicators, and modifying the resource management plan accordingly.]


4.2.2 Planning1,2,3
4.2.2.1 Function
Project planning is determining what needs to be done, by whom, when, and with what resources to accomplish the project. Without appropriate planning there can be no project control, because planning provides the baseline that makes control possible. The Planning process is an iterative process that requires the active participation of all knowledgeable project and support team members, and that defines requirements, schedules, cost, and the resulting PMP. This process shall be achieved by establishing requirements, an overall project budget, a project WBS, a detailed networked schedule baseline, and a risk management process. This then allows the project to evaluate progress and determine and implement any "mid-course" corrections.
4.2.2.2 Objective
The objective of project and management planning revolves around establishing the project roadmap from inception to completion of stated goals. A key element in the Planning process is the ability of the program to effectively plan and organize activities to meet overall objectives. Strategic planning is determining what needs to be done, by whom, when, and at what expense of resources throughout the life cycle of the project. It carefully considers customer needs and how project resources can be best managed.
4.2.2.3 Responsibilities
The Project Control Officer is responsible for all aspects of the Planning process, including notifying the Project Team as to when and what data are required, coordinating with any project-external organizations, ensuring timely project-internal review of the final product, and delivering the final product. Other participants in this process include:

• Project Manager, who is responsible for ensuring the Project Control Officer has the facilities and workforce needed to accomplish this task. The Project Manager also acts as a customer to the Project Control Officer for this process.

• Entire Project Team, which also participates in this process by providing data, supporting rationale, and recommendations for improvement.

4.2.2.4 Life cycle
Planning is conducted throughout the life cycle of the project.
4.2.2.5 Inputs
Typical inputs to the Planning process are:

• Project requirements
• Project product list
• Project schedule

4.2.2.6 Steps
Project and management planning is an iterative process that entails laying out a unified strategy for project accomplishment and adjusting to changing conditions, updating the plan, and integrating requirements across all project disciplines. The basic elements of planning, which a business discipline will engage in, occur from the formulation phase throughout the implementation phase. These elements include:

• Developing objectives and requirements
• Developing the WBS
• Developing project requirements, including the master schedule
• Assuring cost and schedule are commensurate with technical scope
• Assisting in cost/schedule trades as they relate to system and design changes
• Conducting "what-if" analyses
• Assessing workforce needs and skill mix
• Developing the PMP
• Developing acquisition strategies

Although a significant initial effort is required during the formulation phase, the Planning process is a continual and iterative process of laying out and ensuring a unified effort in implementation, adjusting to changing conditions, maintaining the plan, and integrating technical, cost, and schedule requirements. Figure 4.2-2 illustrates the steps in the Planning process.
4.2.2.7 Outputs
Typical outputs of the Planning process are:

• Project WBS
• Master schedule
• Responsibility organization matrix
• Project organization and structure
• Resource-loaded, integrated schedule

Specific time-phased estimates should be developed for direct civil service labor and travel and for direct contractor labor and total cost. All other elements of project cost (service pools and general and administrative (G&A)) are planned and recorded as factors applied to labor or other direct costs and are not within the control of project management personnel. Planners should be familiar with the latest version of the Full Cost Initiative Agencywide Implementation Guide, which is available from the CFO.
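The resource-loaded, integrated schedule listed above is what produces the time-phased cost plan described at the beginning of Section 4.2. The sketch below is illustrative only; the WBS elements, durations, and monthly costs are hypothetical, and the roll-up simply sums the loaded tasks by month.

```python
from collections import defaultdict

# Hypothetical WBS tasks, each resource-loaded with a monthly cost ($K)
# over its start/finish months (inclusive). Elements and figures are examples only.
tasks = [
    {"wbs": "1.1 Design",       "start": 1, "finish": 4, "monthly_cost": 85},
    {"wbs": "1.2 Fabrication",  "start": 3, "finish": 8, "monthly_cost": 120},
    {"wbs": "2.1 Verification", "start": 7, "finish": 9, "monthly_cost": 60},
]

# Roll the resource-loaded tasks up into a time-phased cost plan by month.
time_phased = defaultdict(float)
for task in tasks:
    for month in range(task["start"], task["finish"] + 1):
        time_phased[month] += task["monthly_cost"]

for month in sorted(time_phased):
    print(f"Month {month:2d}: {time_phased[month]:6.0f} $K")

print(f"Estimated cost at completion: {sum(time_phased.values()):.0f} $K")
```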


Figure 4.2-2. Planning process diagram.
4.2.2.8 Exit criteria
The exit criterion for the Planning process is an integrated project baseline that has been developed and accepted through the POP process.
4.2.2.9 Measurement
The following table provides example base and derived measures that can be used in conjunction with executing the Planning process. See discussion of Measurement on page 4-1.

Base measures (with corresponding derived measures in parentheses):
• # of project support plans identified (e.g., PMP, Systems Engineering Management Plan (SEMP), risk management plan, documentation and data management plan, acquisition management plan, etc.) (derived: # of project support plans completed to draft level, actual vs. planned; # of project support plans completed to final level, actual vs. planned)
• % complete of responsibility organization matrix (derived: % complete of responsibility organization matrix, actual vs. planned)
• % complete of WBS development, % vs. schedule (derived: % complete of WBS development, actual vs. planned)
• # of detailed and high-level project schedules to be developed (derived: # of detailed and high-level project schedules developed, actual vs. planned)
• Organization structure (derived: draft version available; final version available)
• LCC estimate completion schedule (derived: LCC estimate completion schedule, actual vs. planned)


4.2.2.10 Methods and techniques
The methods and techniques that may be used in the Planning process are derived from individual or group expert judgment.
4.2.2.11 Software tools
The following S/W tools support the Planning process:

• Microsoft Project
• Microsoft Project Server
• Excel
• Microsoft PowerPoint
• SuperProject Expert

4.2.2.12 References
The following documents, which were used to prepare this section, offer additional insights into the Planning process:
1 Milosevic DZ, Project Management Toolbox, John Wiley & Sons, Inc., 2003.
2 Forsberg K, Mooz H, Cotterman H, Visualizing Project Management, 2nd edition, John Wiley & Sons, Inc., 2000.
3 Lewis JP, Fundamentals of Project Management, AMACOM, 2002.



4.2.3 Documentation and Data Management1–4

4.2.3.1 Function
Documentation and data management establishes the data policies, responsibilities, and procedures for identifying, selecting, archiving, and providing access to project data not directly associated with product configuration. Examples of these types of information include the PMP, meeting minutes, design review presentations, the acquisition management plan, etc. It must be clearly understood that documentation and data management is separate and distinct from configuration management (CM), which focuses on the actual project product configuration. Further information on CM can be found in Section 4.3.3. The project shall make every effort to retain only the minimum items required by regulations and those items needed to permit cost-effective support of research, development, production, cataloging, provisioning, training, operation, maintenance, and related logistics functions over the project life cycle. The project team must have timely, authorized access to accurate data and documentation, regardless of where the data are stored, how they are formatted, or how they are accessed. Data and documentation must be available when needed to reduce cycle time, increase data accuracy, and ultimately improve decision-making. There should be frequent, informal interaction with the project team (and with the parent program, if applicable) to determine information requirements and preferences. Additionally, the documentation and data management system should ensure adequate control of the data and documents once they are provided.

An additional aspect of documentation and data management is the display of information. Time invested in designing effective formats for communication of project data and documents early in the project advanced studies and definition phases will lead to significant returns, especially in the area of labor hours. Communication techniques should include the judicious use of candor by members of the project team, data visualization techniques (to depict and focus major points), and documentation and data scope and depth that help establish credibility and reliability for the communicated products. For contractor-supported projects, the project team will establish similar documentation and data management requirements and formally document them as a deliverable that is subject to immediate access or that must be made available to the project team within a specified period of time.
4.2.3.2 Objective
The objective of documentation and data management is to establish a formal and disciplined system for the scientific, technical, and management information required to support a project. It assures that the appropriate project management data are captured and that proper control is established for data and documents created during and after the life of the project.
4.2.3.3 Responsibilities
The Project Control Officer is responsible for all aspects of the Documentation and Data Management process, including notifying the Project Team as to when and what data are required, coordination with any project-external organizations, ensuring timely project-internal review of the final product, and delivery of the final product. Other participants in the process include the:

• Project Manager, who is responsible for ensuring the Project Control Officer has the facilities and workforce needed to accomplish the task. The Project Manager also acts as a customer to the Project Control Officer for this process.

• Entire Project Team, which also participates in this process by providing data, supporting rationale, and recommendations for improvement.

4.2.3.4 Life cycle
Documentation and data management is conducted throughout the project life cycle.

4.2.3.5 Inputs
Typical inputs to the Documentation and Data Management process include a detailed understanding of Federal, JSC, and, if appropriate, program-specific data and documentation regulations and requirements.

4.2.3.6 Steps
The Documentation and Data Management process for the project begins with a clear understanding of the documentation and data management requirements levied on the project by Federal, NASA, and JSC requirements. The project then identifies the documentation and data management strategy, approach, processes, procedures, methods, and metrics to be used. After these have been identified and documented, the project identifies a preliminary set of documentation and data to be retained and, potentially, archived throughout the complete project life cycle.

As the project executes the data and documentation plan over the project life cycle, it shall perform periodic audits of the processes. These audits include a review of the process metrics, output products, and, as required, retained and archived documentation and data. If an issue is identified during the audits, the project team (and any other necessary external organization) shall analyze the issue to determine its root cause and the preventive or corrective action required. After successful implementation of a preventive or corrective action, the project team may re-audit this area during the following normally scheduled audit or, if necessary, perform a special audit that focuses solely on the specific topic in question. Figure 4.2-3 illustrates the steps in the process.

4.2.3.7 Outputs
Typical outputs of the Documentation and Data Management process are:

• Documented project strategy and process for data and documentation management

• List of identified project data and documentation to be controlled, reported, and archived

• List of data and documentation management tools and techniques to be used by the project

• Project change log
• Change request form
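As a concrete illustration of the last two outputs, a project change log is just a structured record; the following Python sketch shows one way such a record might be kept. The field names and status values are illustrative assumptions, not fields prescribed by this document or by any JSC tool.

    from dataclasses import dataclass, field
    from datetime import date
    from typing import Optional

    @dataclass
    class ChangeLogEntry:
        """One row of a project change log (illustrative fields only)."""
        change_id: str                          # e.g., "CR-001"
        title: str                              # short description of the change
        originator: str                         # who submitted the change request
        date_submitted: date
        affected_documents: list = field(default_factory=list)
        status: str = "Open"                    # Open, Approved, Rejected, or Closed
        date_closed: Optional[date] = None

    # Example usage: log a change request against the PMP and count open items.
    log = [
        ChangeLogEntry("CR-001", "Update data retention list", "J. Smith",
                       date(2004, 3, 15), affected_documents=["PMP"]),
    ]
    open_items = [entry for entry in log if entry.status == "Open"]
    print(f"{len(open_items)} open change request(s)")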

4.2.3.8 Exit criteria
Exit criteria for the Documentation and Data Management process are:

• Project strategy and process for data and documentation management documented in the PMP

• Successful audit of the documentation and data process to ensure it adequately addresses all Federal, JSC, and, if applicable, program-specific regulations and requirements

• Successful audit of the implementation of all data and documentation management tools and techniques

4.2.3.9 Measurement
The table below provides example base and derived measures that can be used in conjunction with the Documentation and Data Management process. See discussion of Measurement on page 4-1.

4.2.3.10 Methods and techniques
The methods and techniques used in developing documentation and data management requirements are:

• Audit process
• Individual and group expert judgment

4.2.3.11 Software tools
The following S/W tools satisfy the requirements of documentation and data management:

• Change tracking tool (e.g., Excel)
• Document and data storage tool (e.g., Windchill)
• Process modeling tool (e.g., Process Model)

Figure 4.2-3. Documentation and data management process diagram.



4.2.3.12 References The following documents, which were used to prepare this section, offer additional insights into the Documentation and Data Management process:

[1] JMI 2314.2L, Identifying and Processing JSC Scientific, Technical and Administrative Documents, 1993.
[2] JPD 2200.1A, Release of JSC Scientific and Technical Information to External Audiences, 2000.
[3] JPG 1440.3, JSC Files and Records Management Procedures, 2001.
[4] NPD 1441.1D, NASA Record Retention Schedule, 2003.

Base Measures | Derived Measures
Documented audit results | Total # of completed preventive and corrective actions (planned vs. actual)
Documented preventive and corrective actions (including closure date) resulting from the audits | # of completed preventive and corrective actions per month (planned vs. actual)


4.2.4 Cost Estimating [1–5]

4.2.4.1 Function
Cost estimating and the development of accurate and defensible LCC estimates for JSC projects are critical for good project planning and execution. LCCs are the total of the direct, indirect, recurring, nonrecurring, and other related expenses incurred or estimated to be incurred in the design, development, verification, production, operation, maintenance, support, and retirement of a system over its planned lifetime. A project manager can use the LCC estimates not only as a project planning tool for workforce and deliverables, but also as an additional input or constraint into the design space for the project.

Cost estimating may be done at the project level or at some lower level. Project-level cost estimates are discussed in the following paragraphs. Cost estimates may also be done at lower levels; e.g., in individual change requests or as a comparison tool in trade studies. It is critical that the project team performing the cost estimate uses the proper tools and techniques for the project life cycle phase in which the estimate is being done.

NPG 7120.5B [3] provides the requirements and guidelines for the frequency and types of project-level cost estimates to be performed. Both types of estimates (i.e., project level and below) can be developed at the request of the project manager to the JSC Chief Financial Officer, Cost Estimating and Assessment Office.

Basically, two types of project-level cost estimates are required during the life cycle of the project. These are the advocacy cost estimate (ACE) and the independent cost estimate (ICE).

An ACE shall be required of all JSC projects and shall be documented in the PMP prior to approval. Although the project office can develop the LCC estimate for budgetary planning purposes based on its understanding of the technical requirements and schedules, it is strongly encouraged that the project coordinate this activity with the CFO Cost Estimating and Assessment Office. The ACE typically becomes the project baseline estimate and is performed during the Pre-Phase A and Phase A definition phases of a project. There may be several iterations of the ACE during these phases as trade studies are conducted and the project becomes more mature and better defined.

An ICE shall be accomplished as required by the JSC Center Director or NPG 7120.5B [3]. This independent cost review may also be accomplished as part of other independent reviews, such as a non-advocate review (NAR) or an independent assessment (IA). In each of these cases, members of the independent review team, as opposed to the project team, will develop the ICE. In addition to reviewing the project's selection of cost methodologies and model inputs, the independent review team will question project assumptions and identify and quantify technical and programmatic risks, risk mitigation strategies, and reserve strategies. Depending on the team's findings, the ICE from one of these reviews may result in a recommendation to change the project funding profile.

4.2.4.2 Objective
The objective of effective cost estimating is to assess project performance and aid in project decision-making. Beginning early in the life cycle and continuing throughout the project life, cost estimates must be generated that reflect realistic cost projections for achieving the documented technical requirements of a project.

4.2.4.3 Responsibilities
The Project Control Officer is responsible for all aspects of the cost-estimating process, including notifying the Project Team of when and what data are required, coordination with any project-external organizations, ensuring timely project-internal review of the final product, and delivery of the final product. Other participants in this process include the:

• Project Manager, who is responsible for ensuring the Project Control Officer has the facilities and workforce needed to accomplish this task. The Project Manager also acts as a customer to the Project Control Officer for this process.

• Entire Project Team, which also participates in this process by providing data, supporting rationale, and recommendations for improvement.

4.2.4.4 Life cycle
Cost estimating is conducted throughout the life cycle of the project.

4.2.4.5 Inputs
Inputs to the Cost Estimating process are dependent on the cost estimation method and technique to be used. For example, potential inputs could include a wide range of parameters such as weight, year of technology, quantities, technical complexity, schedule, estimated number of lines of S/W code, etc. Inputs are also provided to the process by the cost analysis requirements description (CARD).

4.2.4.6 Steps
The project team member:

• Should determine what the cost estimate scope and content should include. Care should be taken to ensure the cost estimate reflects all cost-generating areas within the scope of the estimate by establishing a comprehensive WBS.

• Should review the various models and techniques readily available for cost estimating during that life cycle phase. If a cost-estimating commercial off-the-shelf (COTS) or NASA-developed parametric model is used, the project team member should ascertain whether individual training on the use and nuances of the model(s) is necessary.

• Develops and documents the input parameters to be used for the model chosen. Documentation includes the rationale behind selection of individual parameters and values so as to capture the thought process used to develop them.

• Finally runs the model. The resulting cost estimate is reviewed for "reasonableness" using engineering judgment. Care must be taken not to discard a cost estimate simply because it is higher than the project team member's experience would suggest. Any model changes that are required should be incorporated and the model run again until a reasonable result is obtained. A sensitivity analysis should be done to determine both the primary "cost drivers" and the potential range of the cost estimate given project-realistic changes in the input parameters.
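As a concrete illustration of the document-inputs, run-the-model, and sensitivity-analysis steps above, the following Python sketch exercises a toy mass-based parametric model and sweeps each input one at a time. The cost coefficients, parameter names, and spreads are invented for illustration only; they are not taken from NAFCOM, PRICE, SEER, COCOMO, or any other model cited in this document.

    # Toy parametric model: cost ($M) = A * dry_mass_kg**B * complexity.
    # The coefficients below are illustrative assumptions, not calibrated values.
    A, B = 0.12, 0.85

    def estimate_cost(dry_mass_kg: float, complexity: float) -> float:
        """Return a point estimate in $M for the given inputs."""
        return A * dry_mass_kg**B * complexity

    baseline = {"dry_mass_kg": 450.0, "complexity": 1.3}
    print(f"Point estimate: ${estimate_cost(**baseline):.1f}M")

    # One-at-a-time sensitivity sweep to expose the primary cost drivers and
    # the plausible range of the estimate for project-realistic input changes.
    for param, spread in [("dry_mass_kg", 0.20), ("complexity", 0.15)]:
        low, high = dict(baseline), dict(baseline)
        low[param] *= 1 - spread
        high[param] *= 1 + spread
        print(f"{param}: ${estimate_cost(**low):.1f}M to ${estimate_cost(**high):.1f}M")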

Figure 4.2-4 (below) illustrates the steps taken in this process.

4.2.4.7 Outputs
Typical outputs of the Cost Estimating process are:

• Cost estimate
• Project data to develop the CARD

4.2.4.8 Exit criteria
Exit criteria for the Cost Estimating process are:

• Completion and documentation of cost estimate
• Method of cost estimate, including all input parameters and their rationales

4.2.4.9 Measurement
The table below provides example base and derived measures that can be used in conjunction with executing the Cost Estimating process. See discussion of Measurement on page 4-1.

Figure 4.2-4. Cost estimating process diagram.


4.2.4.10 Methods and techniques
The methods and techniques that may be used in cost estimating are:

• Parametric
• Grass roots
• Analogy
• Individual or group expert judgment
• Historical costs of analogous H/W and S/W systems
• Inflation indices; e.g., the NASA (Code B) R&D New Start Index

4.2.4.11 Software tools
The following S/W tools will support the Cost Estimating process:

• Parametric cost models; e.g., NAFCOM, PRICE, SEER, and COCOMO

• Risk analysis tools; e.g., @RISK
• Cost-phasing algorithms; e.g., the Beta Curve
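To show what a cost-phasing algorithm of the "Beta Curve" type does, the following Python sketch spreads a total estimate across fiscal-year increments using beta-distribution weights. The shape parameters and dollar figures are illustrative assumptions and do not reproduce the specific Beta Curve formulation used in NASA cost-phasing practice.

    import math

    def beta_pdf(x: float, a: float, b: float) -> float:
        """Beta(a, b) probability density on the open interval (0, 1)."""
        coeff = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
        return coeff * x ** (a - 1) * (1 - x) ** (b - 1)

    def phase_cost(total: float, periods: int, a: float = 2.0, b: float = 3.0) -> list:
        """Spread `total` over `periods` using beta-shaped weights (illustrative shape)."""
        midpoints = [(i + 0.5) / periods for i in range(periods)]
        weights = [beta_pdf(m, a, b) for m in midpoints]
        scale = total / sum(weights)
        return [w * scale for w in weights]

    # Example: phase a $24M estimate over six fiscal-year increments.
    for year, cost in enumerate(phase_cost(24.0, 6), start=1):
        print(f"FY{year}: ${cost:4.1f}M")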

4.2.4.12 References The following documents and Web site, which were used to prepare this section, offer additional insights into the Cost Estimating process:

[1] Milosevic DZ, Project Management Toolbox, John Wiley & Sons, Inc., 2003.
[2] NASA Cost Estimating Handbook 2002 at http://www.jsc.nasa.gov/bu2/NCEH/index.htm.
[3] NPR 7120.5B, NASA Program and Project Management Processes and Requirements, 2002.
[4] SP 6103, NASA Readings in Project Control.
[5] SP 6105, NASA Systems Engineering Handbook, 1995.

Base Measures | Derived Measures
# of cost estimates done | Time to accomplish cost estimate (in days)


4.2.5 Performance Measurement [1–4]
Performance measurement is a management tool that shows the current system or process status and identifies problem trends or problem areas as they develop. The three major types of performance measurement are technical performance measurement (TPM), process performance measurement, and earned value management (EVM) performance measurement. Although there are other taxonomies for performance measurement, they will not be addressed here.

The first of these performance measurements, TPM, addresses attributes of the system of interest. By measuring or estimating TPMs, the subsystem engineer, lead systems engineer, and project manager gain visibility into whether the delivered system will actually meet performance requirements. The goal of TPM is to provide early warning that specific quantifiable threshold values for performance (e.g., accuracy, thrust, reliability, etc.) or limits of resources (e.g., weight, power demand, thermal output, memory availability, central processing unit (CPU) usage, etc.) are in jeopardy. These thresholds and limits are key attributes of the system of interest. They are not applicable to all elements of a project, and they may or may not be additive. A detailed discussion of TPMs is found in Section 4.1.12.

The second type of performance measurement, process performance measurement, addresses the processes used in engineering the system of interest. SE metrics are included in this category, as are project control metrics involving production control, operations, and maintenance. Examples of SE and project control metrics are shown throughout Chapter 4. The focus of process performance measurement is on the progress of the project and the quality and productivity of the processes rather than on the performance of the system of interest. Such metrics will facilitate the detection of systemic problems (e.g., skills imbalances) or project difficulties (e.g., requirements instability). Process metrics will also guide process improvement initiatives. Further detailed discussion of process performance measurement is found in Section 4.3.4.

Both TPM and process performance measurement should be directed at issues in either the projects or the organizations in which the projects are performed. These measures are not universal and should not be applied to all of the elements of a project or an organization. Some issues requiring closer scrutiny and measurement will be identified in the early phases from project objectives (budgets, schedules, quality, performance, system capability), risk assessments, project constraints and assumptions, product acceptance requirements, or project external requirements. Finally, these issues may be identified from experience. Because the measurement process consumes resources, when issues are resolved it may be wise to delete the measures addressing them unless there is concern that these issues may recur.

The third and final type of performance measurement is EVM. EVM is a "leading process" whereby project tasks are arranged in resource-loaded schedules (a budget baseline) and checked off when accomplished. A detailed discussion of EVM is provided below.

4.2.5.1 Function
EVM is an approach to project management (planning and control) that requires the project manager to quantify accomplishments. Earned value is the measure of progress in completing the project; i.e., its degree of completion. The goal of EVM is to provide early visibility into technical problems that, if unrecognized, may impact project cost or schedule goals. The earlier such problems are recognized, the more likely the corrective action will be fruitful and the less likely the problems will result in large project overruns or schedule slips.

Before project management starts, a project must be defined. An effort becomes a project when the scope (with completion criteria), schedule (completion date and major milestones), and budget (the most likely level of resources that will be consumed) are clearly defined and conclusively agreed by the customer and the supplier or project manager. The EVM process starts when the project scope (statement of work (SOW)) is clearly divided into lower-level, relatively short-duration tasks that can be sequenced, scheduled, budgeted, and assigned by the project manager to responsible front-line managers for completion. A WBS is the most desired method of accomplishing this.

The EVM process continues into project implementation with monthly or weekly project schedule assessments in which the responsible front-line managers quantify progress (earned value) against the budgeted, short-duration tasks defined in the planning phase. In this assessment a task is either complete, in which case the budget value is "earned," or is not complete, in which case there is no "earned value." Significant differences between the volume of work completed and the volume planned (schedule variance) must be thoroughly analyzed to identify the nature of the problem and the reason for this variance. Once the cause of variance is understood, its likely impact on the same or future tasks can be accurately assessed and corrective actions can be planned and set in motion to eliminate or mitigate the effect on downstream work.

During these frequent assessments, front-line managers need to determine the actual cost of the completed work. The actual cost data must come from the accounting system. Significant differences between the volume of work completed and the actual cost incurred (cost variance) should be analyzed in a manner similar to that used to analyze the schedule variance.

Once the front-line managers clearly understand the nature of the problems and make reasonable plans for corrective actions, they can make informed forecasts of when the tasks can be completed and what level of resources they are likely to consume. These estimates are consolidated to produce an ETC for the project.

This cycle of quantifying progress against schedule and actual cost is repeated each period until project completion. If these assessments are to accurately reflect the project status and forecasts of future conditions, the plan (baseline) must be maintained. The goal of maintenance is that the baseline always reflects the agreement between the customer and the project manager. When the project changes (i.e., work is added, deleted, or moved in time), the baseline must change to reflect this new reality. The budgets for project tasks will change in total dollars or in phasing whenever significant work is added, deleted, or moved in time. The budgets will not change, however, unless the work changes.

EVM is described in significant detail in EIA-748A [1]. This standard has many parallels with EIA-632 [2]. The EIA-748A guidelines for EVM are as follows:

• Plan work scope for the project to completion.
• Break down the project work scope into finite pieces that can be assigned to a responsible person or organization for control of technical, schedule, and cost objectives.

• Integrate project work scope, schedule, and cost objectives into a performance measurement baseline plan against which accomplishments may be measured. Control changes to the baseline.

• Use actual costs incurred and recorded in accomplishing the work performed.

• Objectively assess accomplishments at the work performance level.

• Analyze significant variances from the plan, forecast impacts, and prepare an estimate at completion based on performance to date and work to be performed.

• Use earned value management system (EVMS) information in the center management process.

EIA-748A [1] also includes 32 principles that further promote objective and accurate assessment of the status and outlook of projects. These principles are appropriate for large, long-term projects in which accuracy, objectivity, and baseline traceability are particularly important. The principles are essentially the same as the Cost/Schedule Control Systems Criteria (C/SCSC) levied by the Department of Defense (DoD) on large weapons programs since the 1960s.

NASA policy for application of EVM can be found in NPD 9501.3A [3]. This document applies only to contracts. It requires the contractor to apply EVM to R&D or production contracts larger than $25M and of a duration longer than one year. If the contracts are larger than $70M for R&D or $300M for production, the contractor is required to implement a system that observes the 32 principles from EIA-748A [1]. It further requires "validation," meaning that the EVMS actually used by the contractor has been demonstrated and verified in writing as complying with the 32 principles or criteria. Earned value is not required if the contract is predominantly level-of-effort (no products).

A draft policy (to replace NPD 9501.3A [3] but renumbered to be part of program formulation policy, NPD 7XXX) is in development at the Agency level. This document will broaden the application from contracts to projects. In the interim, JSC shall implement an EVMS for all projects that is tailored to the specific project and approved by the Center EVM Council representative. Only the Center PMC or Center Director may approve waivers of NPD 9501.3A [3].

4.2.5.2 Objective
The objective of EVM is to provide early visibility into project technical problems so that corrective actions might prevent unfavorable schedule or cost outcomes.

4.2.5.3 Responsibilities
The Project Control Officer is responsible for all aspects of the performance measurement process. This includes notifying the Project Team of when and what data are required, coordination with any project-external organizations, ensuring timely project-internal review of the final product, and delivery of the final product. Other participants in the process include the:

• Project Manager, who is responsible for ensuring the Project Control Officer has the facilities and workforce needed to accomplish this task. The Project Manager also acts as a customer to the Project Control Officer for this process.

• Entire Project Team, which also participates in this process by providing data, supporting rationale, and recommendations for improvement.

4.2.5.4 Life cycle
EVM is applicable to any project life cycle phase that has conclusive completion criteria.

4.2.5.5 Inputs
Typical inputs to the EVM Performance Measurement process are:


• A product hierarchy (usually a WBS) with definitions for each element that clearly distinguish that element from all other elements and that provide conclusive guidance regarding completion criteria.

• Work authorization (scope, schedule, budget in terms of resource types, and signatures indicating a meeting of the minds):
− At the project level, between the customer and the performing organization.
− At the task level (scope entirely within a WBS element to be performed by a single functional organization), between the project manager and the task manager.

• Schedules that clearly define the period of performance, major enforced milestone dates, and delivery dates of customer-supplied products.

• An EVM tool that will capture resource-loaded schedules.

4.2.5.6 Steps
Figure 4.2-5 illustrates the steps that are taken in performing the EVM Performance Measurement process. These steps can be broken down into two phases: planning and control.

Figure 4.2-5. Performance measurement process diagram.

• Planning Phase
− A solicitation is received from the customer. The formality is less important than the specificity of the solicitation. It should clearly convey the functionality of the product as well as the major schedule milestones pertinent to the product.

− The directorate prepares a proposal describing the implementation, the budget in terms of time-phased resources (in-house labor by organization, contractor labor, material, facilities, travel, service pool, general and administrative (G&A) by fiscal year), and any other pertinent schedule events.

− Review and negotiation continues until both parties reach a mutual agreement regarding scope, schedule, and time-phased resources (budget). This agreement is recorded in an ITA or another document that clearly defines the scope, schedule, and budget. It is then signed by the director and the customer.

− The project manager performs sufficient internal planning to do a preliminary allocation of project tasks and resource levels to performing organization units. These plans will also include target dates for commencing and completing the tasks.

− Negotiation between the project manager and the task manager will continue until they reach a mutual agreement regarding all salient provisions of the task. The project manager, task manager, and functional manager, who supervises the task manager, will sign the work authorization document (WAD).

− Once the WAD is signed, the task manager will perform the detailed planning that breaks the task into near-term work packages (subtasks) and farther-term planning packages, indicating specific periods of performance but with less "granularity." Each work package and planning package will be composed of only a single resource type. Predecessor/successor relationships among the work/planning packages will be defined. A work package represents units of work at levels where work is performed. As such, it is clearly distinguished from all other work packages. Among the factors that distinguish it are:
• It is assigned to a single organizational element.
• It has scheduled start and completion dates and, as applicable, interim milestones that are representative of physical accomplishment.
• It has a budget or an assigned value expressed in terms of dollars, labor-hours, or other measurable units.
• Its duration is limited to a relatively short time span, or it is subdivided by discrete value milestones to facilitate the objective measurement of work performed, or it is level of effort.
• It is integrated with detailed engineering, manufacturing, or other schedules.

− Record this baseline information in the EVM tool. Baseline information includes schedule and budget data. Budget data will be broken into the performance measurement baseline, representing work formally authorized to task managers; undistributed budget, which represents work not yet authorized to task managers; and management reserve, representing a contingency held by the project manager to authorize work that has always been part of the scope but has not yet been assigned to any task manager.

• Control Phase
− Each month, the task manager will status the task-level schedule. Milestones and work packages will be reported as complete, not yet started, or in process.

− The schedule status information, when compared with the project baseline data, will indicate schedule variance. When compared with actual cost data, cost variances will be revealed. Both schedule and cost variances will be analyzed as to cause (i.e., the nature of the technical problem underlying the variance), impact on the current task and on successor tasks, and corrective action.

− A time-phased ETC will be prepared that reflects actual cost to date and knowledge of future conditions.

− The task manager will repeat this cycle each month, updating schedule status, analyzing variances, and building ETCs until the task is complete. Each month a report of this information will be provided to the project manager, directorate management, and customer.

− Occasions will arise when a work package or a milestone should be replanned. Replanning can take the form of a change in time (schedule) or implementation details (different work group, buy rather than make, etc.) within the overall schedule and budget constraints for the task as outlined in the work authorization. When such a need arises, a request will be prospectively prepared and forwarded to the project manager. When approved, the detailed project baseline will be modified accordingly.

− On infrequent occasions, changes in overall project scope or schedule will be necessary. These changes will follow the steps taken when establishing the original project agreement. Sometimes elements of the changes are urgent and must be pursued immediately, even if the changes have not yet been negotiated between the customer and the directorate. In such cases, the near-term effort will be scheduled and budgeted. At the same time, a change request will be prepared, negotiated, and recorded to reflect appropriate, equitable changes to the baseline values.
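A minimal Python sketch of the monthly control-phase arithmetic described above: a work package earns its full budget only when it is complete, and the resulting variances and estimate at completion roll up to the task or project level. All dollar values and package names are invented for illustration.

    # Each tuple: (work package, budget $K, complete?, actual cost to date $K).
    status = [
        ("WP-01 Requirements", 120.0, True,  135.0),
        ("WP-02 Design",       200.0, True,  210.0),
        ("WP-03 Fabrication",  350.0, False, 180.0),
    ]

    bcws = sum(budget for _, budget, _, _ in status)             # all three packages were scheduled
    bcwp = sum(budget for _, budget, done, _ in status if done)  # earned value: budget of completed work
    acwp = sum(actual for _, _, _, actual in status)             # actual cost from the accounting system

    sv = bcwp - bcws   # schedule variance (negative = behind schedule)
    cv = bcwp - acwp   # cost variance (negative = overrunning)
    print(f"BCWS={bcws:.0f}  BCWP={bcwp:.0f}  ACWP={acwp:.0f}  SV={sv:.0f}  CV={cv:.0f}")

    # Estimate at completion: actuals to date plus the managers' estimate to complete.
    etc = 240.0                 # illustrative bottom-up estimate for the remaining work, $K
    eac = acwp + etc
    print(f"EAC = {eac:.0f} $K")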

4.2.5.7 Outputs The outputs of the process are:

• Variance analyses characterizing the cause and impact of only significant variances as well as corrective action plans intended to mitigate or minimize the effects of technical problems.

• Estimates of cost at completion. These estimates are based on performance to date, commitment values for material, and estimates of future conditions.

• Estimated project completion date, including forecasts for significant milestones.

4.2.5.8 Exit criteria
There are no exit criteria, since this process is continuous.

4.2.5.9 Measurement
The following table provides example base and derived measures that can be used in conjunction with executing the Performance Measurement process. See discussion of Measurement on page 4-1.

4.2.5.10 Methods and techniques
The methods and techniques that may be used in performance measurement are defined by the EIA-748A [1] principles.

4.2.5.11 Software tools
The following S/W tools satisfy the principal considerations of the EVM Performance Measuring process:

• MS Project
• MS Project Server
• EV tools perform several essential tasks, including:
− Schedule planning (duration, sequence, resource types, resource levels)
− Schedule baseline (recording and maintaining the baseline approved by the project manager)
− Schedule status by work package or milestone (completed, not started, or in work/percent complete)
− Schedule forecast (estimated dates for baselined tasks)
− Recording of direct costs in a manner consistent with budgets, without allocation
− Reporting of variances (schedule and cost) at task level
− Reporting to facilitate variance analysis (automatically decomposing labor-cost variances into rate and efficiency variances and material-cost variances into price and usage variances)
− Accommodation of revisions to the baseline for:
• Internal task replanning
• Adding or deleting work from a task
• Changes to indirect cost rates or factors
− Consolidation for reporting purposes from task level to project level through hierarchies for organization (functional) and product (WBS)
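The last tool capability above, decomposing a labor-cost variance into rate and efficiency components, follows the standard breakdown sketched below in Python; the hours and rates are invented for illustration.

    # Illustrative labor figures for one work package.
    budget_rate = 100.0     # planned labor rate, $/hour
    actual_rate = 110.0     # rate actually paid, $/hour
    earned_hours = 400.0    # hours' worth of budgeted work actually accomplished
    actual_hours = 450.0    # hours actually charged

    bcwp = earned_hours * budget_rate     # earned value in dollars
    acwp = actual_hours * actual_rate     # actual cost in dollars
    cost_variance = bcwp - acwp

    # How much of the variance comes from paying a different rate, and how much
    # from charging more (or fewer) hours than were earned.
    rate_variance = (budget_rate - actual_rate) * actual_hours
    efficiency_variance = (earned_hours - actual_hours) * budget_rate

    assert abs(cost_variance - (rate_variance + efficiency_variance)) < 1e-6
    print(f"CV={cost_variance:.0f}  rate={rate_variance:.0f}  efficiency={efficiency_variance:.0f}")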

Base Measures | Derived Measures
Project: Plan (budgeted cost of work scheduled (BCWS)); Earned value (budgeted cost of work performed (BCWP)); Actual cost (actual cost of work performed (ACWP)); BAC; EAC | Schedule variance (SV) ($, %, comparative fit index (CFI), 6-month, etc.); Cost variance (CV) ($, %, CFI, 6-month, etc.); Variance at completion (VAC) (budget at completion (BAC) - estimate at completion (EAC)); Schedule performance index (SPI) (earned value (EV)/Plan); Cost performance index (CPI) (EV/ACWP); To-complete performance index (TCPI) ((BAC - EV)/(EAC - actual cost (AC)))
Fraction of work performed each period | Defined as measured, as opposed to level of effort (LOE)
Age of unnegotiated changes | Age of unnegotiated changes (actual vs. project-defined "yellow" and "red" age levels)


4.2.5.12 References The following documents, which were used to prepare this section, offer additional insights into the EVM Performance Measuring process:

[1] EIA-748A, Earned Value Management Systems, 2002.
[2] EIA-632, Processes for Engineering a System, ANSI/EIA-632-1998, 1999.
[3] NPD 9501.3A, Earned Value Performance Management, 2002.
[4] Milosevic DZ, Project Management Toolbox, John Wiley & Sons, Inc., 2003.


4.2.6 Schedule Management [1–5]

4.2.6.1 Function
Schedule management provides for the overall development and maintenance of a master schedule, coupled with the detail from the interrelated, lower-level schedules covering the overall project from start to finish. Additional benefits of this scheduling strategy include automated summarization of project cost and schedule data, resource leveling to stay within existing workforce constraints, efficient and timely reporting of project status, and more accurate impact assessments on external project interfaces.

Time and money are directly linked, and combining them in analysis proves very beneficial. Logic network-driven schedules should be resource-loaded in a simple and practical manner. These resource-loaded, networked schedules provide a useful tool in understanding interdependencies. Schedule analysis techniques shall involve review and assessment of critical paths, logic relationships between activities, slack and reserve usage management, milestone and task progress against plans, and other schedule analysis criteria. This then provides the capabilities for determining reliable cost and workforce estimates, EVM, trending for better EACs, what-if analysis for POP exercises and project workarounds, and generally better overall cost control.

For contractor-supported projects, the contractor should be held to "rolling wave" accountability on near-term schedules. The term "rolling wave" means that there are very detailed schedules for near-term activities, specifically for the upcoming 6-month period. These schedules should be measured in inch-stones, not milestones. The performer reports the budgeted value of the inch-stones met in a given month vs. the budgeted value of those planned. With this type of small, incremental planning and tracking, it is readily apparent how well the project is being executed. When the contractor reporting structure closely links schedule and cost-reporting milestones, care should be taken that the approach is not based on equal-value milestone performance, since this could lead to erroneous assessments.

Slippages in a project schedule can be extremely expensive. A permanent record of all schedule changes should be documented to support trend analysis and the project assessment function.

4.2.6.2 Objective
The objective of schedule management is to provide the key to implementing an effective integrated management process for NASA projects. It is the foundation for establishing a project baseline and an effective performance measurement process. The goal of scheduling is to provide a framework to time-phase and coordinate tasks into a master plan to complete the project within cost, schedule, and performance constraints. The schedule content, format, detail, and symbology shall be established. The key discipline in this area is taking time and effort early in the schedule development process to develop a detailed, logical, network-driven schedule. This scheduling technique is mandatory for successful control of both in-house and contractor efforts, and it proves invaluable in keeping management focused on the right tasks and knowing the right priorities for workforce assignments.

4.2.6.3 Responsibilities
The Project Control Officer is responsible for all aspects of the schedule management process, including notifying the Project Team of when and what data are required, coordination with any project-external organizations, ensuring timely project-internal review of the final product, and delivery of the final product. Other participants in this process include the:

• Project Manager, who is responsible for ensuring the Project Control Officer has the facilities and workforce needed to accomplish this task. The Project Manager also acts as a customer to the Project Control Officer for this process.

• Entire Project Team, which also participates in this process by providing data, supporting rationale, and recommendations for improvement.

4.2.6.4 Life cycle Schedule management is begun during Phase A and is conducted throughout the life cycle of the project. 4.2.6.5 Inputs Typical inputs to the Schedule Management process are:

• WBS
• WBS dictionary
• Project master schedule

4.2.6.6 Steps
The project effort shall be broken into tasks and milestones at a level of detail to allow for discrete progress measurement and for management visibility into the overall definition, development, fabrication, integration, test, delivery, and operational phases of a project.

The project should require the responsible lead engineer, along with other key project team members, to set aside time in the early stages of the project to review the WBS and determine associated task durations, resources required, and interdependencies. Each task should also be assessed for risk to determine the appropriate duration range to be used in the scheduling process. All of these efforts enable better organization of the work, improved estimates of task duration, better procurement planning, and reduced risk of accidental scope omission.


A logic-driven schedule is then developed within the framework of the approved project WBS. This schedule must, by definition, contain the entire scope of the work. Each task and milestone should be horizontally integrated with the appropriate predecessor/successor sequence relationships; they should also be vertically integrated to significant project milestones. This provides an end-to-end logic network of resource-loaded project tasks.

As the project progresses, the scheduling effort will shift to schedule status reporting and analysis on a regular basis so as to reflect the most current status of the integrated project schedule. Schedule analysis activities must also take place on a regular basis to determine whether significant project schedule changes have occurred, such as a change in critical path. Figure 4.2-6 illustrates the steps in the process.

Figure 4.2-6. Schedule management process diagram.
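A minimal Python sketch of the logic-network computation behind the critical-path step: a forward and a backward pass over a small, invented task network yield each task's float, and zero-float tasks form the critical path. Task names and durations are illustrative only.

    # Each task: (duration, list of predecessors). The network is invented for
    # illustration and is assumed to be listed in a valid precedence order.
    tasks = {
        "A": (3, []),          # e.g., requirements definition
        "B": (5, ["A"]),       # design
        "C": (2, ["A"]),       # long-lead procurement
        "D": (4, ["B", "C"]),  # fabrication and test
    }

    # Forward pass: earliest start (ES) and earliest finish (EF).
    es, ef = {}, {}
    for name, (duration, preds) in tasks.items():
        es[name] = max((ef[p] for p in preds), default=0)
        ef[name] = es[name] + duration
    project_finish = max(ef.values())

    # Backward pass: latest start (LS) and latest finish (LF).
    ls, lf = {}, {}
    for name in reversed(list(tasks)):
        duration, _ = tasks[name]
        successors = [s for s, (_, preds) in tasks.items() if name in preds]
        lf[name] = min((ls[s] for s in successors), default=project_finish)
        ls[name] = lf[name] - duration

    for name in tasks:
        total_float = ls[name] - es[name]
        marker = "  <- critical path" if total_float == 0 else ""
        print(f"{name}: ES={es[name]} EF={ef[name]} float={total_float}{marker}")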

4.2.6.7 Outputs
Typical outputs of the Schedule Management process are:

• Identified tasks for each appropriate WBS item
• Duration (range) for each task
• Resources required for each task
• Logic progression sequence of tasks (e.g., network)
• Integrated, resource-loaded project schedule
• Critical path analyses

4.2.6.8 Exit criteria Exit criteria for the Schedule Management process are:

• Completed top-level project schedule throughout the life cycle



• Completed detail-level project schedule throughout the life cycle

• Defined and documented schedule definition and analysis process

• Successful schedule trace from start milestone through completion with no breaks

4.2.6.9 Measurement
The following table provides example base and derived measures that can be used in conjunction with executing the Schedule Management process. See discussion of Measurement on page 4-1.

4.2.6.10 Methods and techniques
The methods and techniques that may be used in schedule management are:

• Cards-on-the-wall method
• Work flow diagram
• Individual and group expert judgment
• Start at completion and work backward
• Critical path analysis techniques (program evaluation and review technique (PERT), graphical evaluation and review technique (GERT), critical path method (CPM), etc.)
• Time-scaled arrow diagram
• Critical chain schedule
• Milestone charts
• Hierarchical schedule
• Line of balance
• B-C-F [baseline-current-future] analysis
• Backward pass method
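For the task-duration-range step, the PERT technique listed above reduces a three-point estimate to an expected duration and a spread, as in the short Python sketch below; the duration values are illustrative.

    def pert_estimate(optimistic: float, most_likely: float, pessimistic: float):
        """Classic PERT weighting of a three-point duration estimate (work days)."""
        expected = (optimistic + 4 * most_likely + pessimistic) / 6
        std_dev = (pessimistic - optimistic) / 6
        return expected, std_dev

    # Example: a task estimated at 8 days optimistic, 10 most likely, 18 pessimistic.
    expected, std_dev = pert_estimate(8, 10, 18)
    print(f"Expected duration: {expected:.1f} days, standard deviation: {std_dev:.1f} days")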

4.2.6.11 Software tools The following S/W tools support the Schedule Management process:

• Microsoft Project Server
• Microsoft Project
• Microsoft PowerPoint (Gantt charts)

4.2.6.12 References The following documents, which were used to prepare this section, offer additional insights into the Schedule Management Process:

[1] NPD 1440.6G, NASA Records Management, 2002.
[2] JPG 1440.3, JSC Files and Records Management Procedures, 2001.
[3] SP 6105, NASA Systems Engineering Handbook, 1995.
[4] Forsberg K, Mooz H, Cotterman H, Visualizing Project Management, John Wiley & Sons, Inc., 2002.
[5] Milosevic DZ, Project Management Toolbox, John Wiley & Sons, Inc., 2003.

Base Measures | Derived Measures
# of schedules to be developed | # completed (actual vs. planned)
Priority of schedules to be developed |
% complete for identification and documentation of each task duration (range) for each schedule | % complete for identification and documentation of each task duration (range) for each schedule (actual vs. planned)
% complete for identification and documentation of predecessor/successor for each task for each schedule | % complete for identification and documentation of predecessor/successor for each task for each schedule (actual vs. planned)
% complete for identification and documentation of required resources for each task for each schedule | % complete for identification and documentation of required resources for each task for each schedule (actual vs. planned)
Critical path analysis completion |
Slack/float (free and total) |


4.2.7 Project Analysis [1,2,3]

4.2.7.1 Function
This section addresses the need for project leaders to perform an effective and timely analysis of project data for existing projects, proposed changes, and future projects. This analysis shall be performed to determine that the current performance analysis plan is understood and that future plans are in place to assure project success. The analysis shall also be used to assess proposed changes and new project plans.

4.2.7.2 Objective
The objective of project analysis is to provide a disciplined process for analyzing and communicating existing project performance measurement data to determine project progress and issues, and to assess future plans. This analysis should result in an understanding of the accuracy of the data and progress on the project, project issues, and future plans to provide a basis for management action.

Project analysis also provides a process for analyzing proposed changes and proposed project plans to determine whether the plans are sound and to form the basis for accepting or rejecting the proposals. Project analysis shall be done at a level in the WBS that is appropriate to the complexity or risk of the task and at the overall project level. Project analysis shall be done throughout the life cycle of the project to provide increasingly reliable data concerning the viability of future projects or the performance on existing projects.

4.2.7.3 Responsibilities
The Project Control Officer is responsible for all aspects of the project analysis process, including notifying the Project Team of when and what data are required, coordination with any project-external organizations, ensuring timely project-internal review of the final product, and delivery of the final product. Other participants in this process include the:

• Project Manager, who is responsible for ensuring the Project Control Officer has the facilities and workforce needed to accomplish this task. The Project Manager also acts as a customer to the Project Control Officer for this process.

• Entire Project Team, which also participates in this process by providing data, supporting rationale, and recommendations for improvement.

4.2.7.4 Life cycle Project analysis applies during the entire life cycle of a project. 4.2.7.5 Inputs Typical inputs to the Project Analysis process are:

• BCWS, which is the sum of the time-phased budgets for all efforts scheduled to be accomplished within a time period; associated with this budget is a baseline schedule that has resources assigned to it

• BCWP, which is the sum of the time-phased budgets for all efforts completed during a specified time period

• ACWP, which is the cost actually incurred and recorded in accomplishing the BCWP within a given time period

• CV, which is BCWP - ACWP
• SV, which is BCWP - BCWS
• CPI, which is cumulative BCWP divided by cumulative ACWP
• SPI, which is cumulative BCWP divided by cumulative BCWS
• TCPI, which is (BAC - cumulative BCWP) divided by (EAC - ACWP)
• Actual costs for similar projects, vendor quotes, or best estimates from subject matter experts
• Basis of estimate (BOE), which is the basis for estimating new projects and changes to existing projects

• TPMs appropriate to the project life cycle

4.2.7.6 Steps
To analyze a proposed project in Pre-Phase A or Phase A, the project control officer and the team members acquire backup cost, schedule, and risk information from other similar projects or market surveys to form the basis of estimate and risk assessment. Where information is not available, subject matter experts (SMEs) can be used to provide a best engineering estimate using cost-estimating techniques appropriate for the product. SMEs can also be used to survey the market to assess the risk of available technologies to support the project. Team members and the project manager use this information to validate the proposed project cost, schedule, and risks. When assessing a vendor's proposal, similar comparison data are acquired for purposes of reviewing both the original proposal for a project and subsequent changes to that project.

When a project is sufficiently defined, the work is allocated to team members or vendor managers for implementation into an integrated project baseline. The project control officer, project manager, and other appropriate SMEs review and approve the baseline. If the project is a government-furnished item, the project control officer and team members implement the baseline. If the project is assigned to a vendor, the vendor's project control function and managers implement the baseline with oversight by the NASA project team. The project control officer and the vendor's project control function strictly control project baseline changes to assure that no unknown changes invalidate the baseline and the subsequent data for analysis.

The team members or vendor managers provide monthly status for both TPMs and EV reporting. The project control officer or the vendor's project control function takes these inputs and provides team members or vendor managers and the project manager with reports on project performance measurements for project analysis. The team members or vendor managers use these reports to assess variances in cost or schedule (usually greater than ±10%) and provide corrective action plans. The project manager analyzes project-level indices and variance information for project trends.

For variance analysis, CPI, SPI, and TCPI provide a quick rule-of-thumb measure of project health. CPI represents how efficiently the work is performed (BCWP) vs. the money being spent (ACWP). SPI represents how quickly the work is accomplished (BCWP) vs. what is baselined (BCWS). TCPI represents the cost efficiency required to achieve the project with the budget or EAC planned. In a perfect project, all three measures are 1.0. Unusual variations in these indices indicate a need to look further. For example, if the project CPI is 0.56 yet the TCPI is 1.40, the indication is that a large increase in efficiency will be needed to successfully complete the plan.

TPMs are analyzed to determine whether there are technical issues or risks in the trends observed. For example, a high change rate could indicate that the project is insufficiently defined for the current phase. Similarly, a high defect rate could indicate a process problem within the project. Figure 4.2-7 illustrates the Project Analysis process.

Figure 4.2-7. Project analysis process diagram.
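The rule-of-thumb indices discussed above reduce to simple ratios of the input measures; the following Python sketch reproduces the CPI-versus-TCPI situation from the example, using invented cumulative figures chosen only to yield roughly those index values.

    def evm_indices(bcwp: float, bcws: float, acwp: float, bac: float, eac: float) -> dict:
        """Cumulative EVM health indices as defined in the inputs to this process."""
        return {
            "CV": bcwp - acwp,
            "SV": bcwp - bcws,
            "CPI": bcwp / acwp,                    # cost efficiency achieved so far
            "SPI": bcwp / bcws,                    # schedule efficiency achieved so far
            "TCPI": (bac - bcwp) / (eac - acwp),   # efficiency required to finish within the EAC
        }

    # Illustrative cumulative values (all $K, invented) giving CPI near 0.56 and
    # TCPI near 1.40, the situation described in the text.
    indices = evm_indices(bcwp=280.0, bcws=350.0, acwp=500.0, bac=1000.0, eac=1014.0)
    for name, value in indices.items():
        print(f"{name}: {value:,.2f}")

    # A CPI well below 1.0 combined with a TCPI well above 1.0 signals that the
    # remaining work would have to be performed far more efficiently than the
    # work completed to date, which warrants a closer look.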


4.2.7.7 Outputs
Typical outputs of the Project Analysis process are:

• An integrated project status and POP
• ETC, which provides an updated time-phased estimate for the remaining work on the project
• Variance analysis report (VAR), which provides an analysis of the problem, program impact, and corrective action plan
• Risk assessment and technical performance issues
• Corrective action plans

4.2.7.8 Exit criteria Exit criteria for the Project Analysis process are:

• A revised estimate that more accurately reflects project costs for the project’s entire scope

• The corrective action plan in the variance analysis addresses the root cause of the technical, cost, or schedule issue

• A risk assessment and abatement plan for the potential future project problems

4.2.7.9 Measurement
The following table provides example base and derived measures that can be used in conjunction with executing the Project Analysis process. See discussion of Measurement on page 4-1.

4.2.7.10 Methods and techniques
The methods and techniques that may be used in project analysis are:

• Statistical analysis of defect closure rate and change rate

• Analysis of EV performance measurements
• Cost-estimating methods
• Individual or group expert judgment
• EVMS

4.2.7.11 Software tools S/W tools that will support the Project Analysis process are:

• Excel
• Word processor
• Cost-estimating models (e.g., SEER, NAFCOM, PRICE, etc.)

4.2.7.12 References
The following documents and Web site, which were used to prepare this section, offer additional insights into the Project Analysis process:

[1] NPR 9501.3, Earned Value Management Implementation on NASA Contracts, 2002.
[2] EIA-748-98, Industry Guidelines for Earned Value Management Systems, 1998.
[3] NASA Cost Estimating Handbook 2002 at http://www.jsc.nasa.gov/bu2/NCEH/index.htm.

Base Measures | Derived Measures
# Requirements Added, Changed, or Deleted | Requirements Volatility = % Added, Changed, or Deleted
Defects in Requirements (by Phase) | Requirements Defects - Projected vs. Actuals
Defects in Design (by Phase) | Design Defects - Projected vs. Actuals
Defects in S/W Code (by Phase) | Code Defects - Projected vs. Actuals
Issues/Actions Status | % Issues Closed
Risk Status (e.g., Open, In Work, Closed) | Projected vs. Actual Risk Status
CV | BCWP - ACWP
SV | BCWP - BCWS
CPI | Cumulative BCWP divided by Cumulative ACWP
SPI | Cumulative BCWP divided by Cumulative BCWS
TCPI | (BAC - Cumulative BCWP) divided by (EAC - ACWP)


4.3 Crosscutting Processes

4.3.1 Acquisition Management1,2
The Acquisition Management process is performed to acquire all products, materials, or services in support of the project. Project acquisitions are performed via acquisition instruments such as contracts, grants, cooperative agreements, or a combination of these. Component activities of the Acquisition Management process include:
• Defining requirements
• Formulating acquisition strategies for the project, including determining acquisition instruments and appropriate contract types
• Executing the acquisition instruments
• Monitoring and evaluating performance of the acquisitions
The end customer for the Acquisition Management process is the project manager. Acquisition management, which is a continuous process throughout the life cycle of the project, is exercised in accordance with project acquisition strategies.
4.3.1.1 Function1,2,3
In the Acquisition Management process:

• Technical requirements (e.g., detail specifications, interfaces, cost, schedule/long lead, etc.) and management requirements of the products, materials, or services being acquired shall be identified.
• Acquisition strategies to acquire required products, materials, and services shall be developed.
• Acquisition instruments in accordance with project acquisition strategies shall be executed.
• Execution of the acquisition instruments, including monitoring and performance evaluation, shall be managed.

All project acquisitions are performed in accordance with the following established JSC procedures: JSC SLP 4.6, Procurement;4 and JPG 5335.2, Space Act Agreements.5

4.3.1.2 Objective1,2
The primary objective of the Acquisition Management process is to acquire all required products, materials, and services to enable a firm commitment to accomplish project goals and objectives on schedule and within budget.
4.3.1.3 Responsibilities1
The Project Manager is responsible for formulating acquisition strategies for the project. The Project Manager also makes the final decisions concerning project acquisitions and approves any corrective actions relating to the performance of the acquisition mechanisms.
The Project Manager is supported by the Project Acquisition Team, whose principal participants include the Project Control Officer, the Lead Systems Engineer, the Procurement Representative, the Chief Financial Officer Representative, and, as necessary, the Office of the Chief Counsel Representative.
The Project Control Officer is responsible for performing the process steps outlined below. The Project Control Officer also provides the primary management inputs on needs and requirements for the products, materials, and services to be acquired in support of the project.
The Lead Systems Engineer provides primary technical input on needs and requirements for project acquisitions. The Lead Systems Engineer also supports execution of the acquisition instruments by participating in the development and review of solicitations and the evaluation process for selection of suppliers. Finally, the Lead Systems Engineer supports technical monitoring and evaluation of the acquisition instruments.
The Procurement Representative interfaces with the Project Control Officer and the Lead Systems Engineer to solicit information in support of project acquisitions and provides advice both on conducting acquisitions and on acquisition regulations.
The Chief Financial Officer Representative provides resource management, budget support, accounting, and financial transaction support for project acquisitions.
Other responsibilities are outlined in JSC SLP 4.64 and JPG 5335.2.5

4.3.1.4 Life cycle3
The Acquisition Management process is applicable to all phases of the project life cycle whenever acquisitions are required. Acquisition strategies are formulated based on project needs for products, materials, or services in support of the work performed at any given life cycle phase. Acquisition strategies may range from project-level strategies – such as whether the contractor or outside organization should perform any (or all) of the project – through focused acquisitions to procure products or materials to implement in-house designs. Once decisions are made to acquire from a contractor or outside organization, support of the contract execution and performance monitoring continues until acquisition is completed.
4.3.1.5 Inputs3,8
Typical inputs to the process include, but are not limited to:
• Concepts (mission, system, operations)
• Goals and objectives (mission, system)
• Needs (mission, system)
• Assumptions, guidelines, and constraints, including:


− Federal Acquisition Regulations (FARs) and other applicable NASA and JSC guidelines/regulations
• Requirements (mission, system, interface), including:
− Specifications, interfaces, design data, cost, schedule, quantities, and standards related to the item being acquired
• Historical data (previous NASA projects and contracts)
• Market/industry data
• Plans (project plan – project acquisition strategy)
• Procurement data, including:
− List of appropriate acquisition types
− Implementation timeline for each acquisition type

4.3.1.6 Steps1,2,3,7
The major steps and products that are implemented by this process are illustrated in Figure 4.3-1.
Figure 4.3-1. Acquisition management process diagram. (Steps and products: define acquisition requirements; develop and maintain acquisition strategies; execute acquisition instruments, producing contracts, grants, or agreements; and manage acquisition executions, producing contract performance evaluation and monitoring data and information in support of the next acquisition.)
• Project acquisition requirements shall be defined.
− Identify and document technical and management needs and requirements for the products, materials, and services that need to be acquired to support the project. Includes, as applicable, SOWs, detailed specifications, deliverables, cost, schedule, and any other applicable documents.
− Consider using a draft request for proposal (DRFP) to obtain industry comments on the set of acquisition requirements.
− Communicate in advance with procurement support personnel to coordinate potential project procurement needs and potential procurement instruments.
• Project acquisition strategies shall be developed and maintained.
− Gather technical and management inputs and recommendations to the project acquisition strategies based on judgments related to technology maturity, knowledge and skill base of the civil service workforce, cost effectiveness, work complexity, existence of facilities or other infrastructure to perform the work, or other relevant considerations.
− Evaluate whether project components should be developed, purchased, or reused based on established criteria and the needs of the project (i.e., Make/Buy Analysis; a scoring sketch appears after this list). Identification of work products, whether externally or internally acquired, helps to establish a top-level WBS to estimate the scope of the project.
− Determine the acquisition type for each product to be acquired (e.g., contract type such as sole source, full and open competition or "piggyback" on existing contract, grant, cooperative agreement, or interagency funds transfer) (see discussion, Section 2.8, Project Management and Planning).

− Periodically review and maintain the project acquisition strategies to account for changing project needs and updated Federal regulations.

• The project shall execute the acquisition instrument in accordance with project strategies.
− Develop solicitation and evaluation instruments to support solicitation, negotiations, and selection of the supplier (e.g., agreement, SOWs, RFPs, requests for quotes (RFQs), announcements of opportunities, evaluation criteria).
− Provide technical and management inputs to supplier selection based on an evaluation of the supplier's ability to meet specified requirements and established criteria.
− Support activities leading to the award of the contract or approval of agreements per established JSC procedures.
• The project shall manage execution of the acquisition instruments, including monitoring and performing evaluation.
− Designate members of the project team to act as task monitors or contracting officer's technical representatives (COTRs) to oversee the work performed.
− Develop a surveillance strategy. Note that, for contracts, the COTR and contracting officer are responsible for determining and implementing the level and type of performance monitoring required after considering risk factors associated with the work to be performed.
− Provide the necessary performance evaluation criteria and data to monitor and evaluate the supplier's performance.
− Analyze performance data vs. established criteria to assess the level of performance.
− Communicate the status to the project manager. Include recommendations for corrective actions as required (see Section 4.2).
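NOTE (illustrative example, not a requirement of this JPR): The Make/Buy Analysis referenced in the strategy-development step above is often supported by a simple weighted-criteria comparison. The Python sketch below shows such a scoring; the criteria, weights, alternatives, and 1-5 scores are hypothetical and would be replaced by the project's documented criteria.

# Hypothetical weights for make/buy/reuse evaluation criteria (sum to 1.0).
CRITERIA = {
    "technology maturity": 0.25,
    "in-house skill base": 0.20,
    "cost effectiveness": 0.30,
    "schedule risk": 0.15,
    "existing facilities": 0.10,
}

def weighted_score(scores):
    """Combine 1-5 criterion scores into a single weighted figure of merit."""
    return sum(CRITERIA[name] * scores[name] for name in CRITERIA)

alternatives = {
    "make (in-house design)": {"technology maturity": 3, "in-house skill base": 4,
                               "cost effectiveness": 3, "schedule risk": 2,
                               "existing facilities": 4},
    "buy (procure existing)": {"technology maturity": 5, "in-house skill base": 3,
                               "cost effectiveness": 4, "schedule risk": 4,
                               "existing facilities": 3},
    "reuse (prior project)":  {"technology maturity": 4, "in-house skill base": 5,
                               "cost effectiveness": 5, "schedule risk": 3,
                               "existing facilities": 5},
}

# Rank alternatives from highest to lowest weighted score.
for name, scores in sorted(alternatives.items(),
                           key=lambda kv: weighted_score(kv[1]), reverse=True):
    print(f"{name:26s} weighted score = {weighted_score(scores):.2f}")

The ranking supports, but does not replace, the documented make/buy decision and the supplier evaluation performed under established JSC procurement procedures.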

NOTE: These process steps are also applicable for the case where acquisitions are performed by adding deliveries to an existing contract. In this case, requirements would still need to be defined; the acquisition strategy will use the existing contractual vehicle; the supplier proposal would still need to be evaluated; and performance monitoring of the task would still need to be performed.
4.3.1.7 Outputs6,7,8
Primary outputs from this process are:

• Acquisition Requirements, including, as applicable, SOWs, detailed specifications, deliverables, cost, schedule, and any other applicable documents.
• Acquisition strategies.
• Surveillance plans.
• Acquisition instruments.
• Supplier performance evaluation and monitoring data.
4.3.1.8 Exit criteria3
Closeout activities are completed for each acquisition within the project.
4.3.1.9 Measurement7
The following table provides example base and derived measures that can be used in conjunction with the Acquisition Management process. See discussion of Measurement on page 4-1.

Base Measures | Derived Measures
Acquisition Requirements Definition (FTEs) | Acquisition Requirements Definition Productivity; Acquisition Requirements Definition Effort – Planned vs. Actuals; Acquisition Requirements Definition Rate Charts; Acquisition Requirements Definition Effort as % of Total Engineering Effort
# of Products to be Acquired | Planned vs. Actual Acquisition Products
Acquisition Strategy Milestone Schedule | Acquisition Strategy Milestone Schedule – Planned vs. Actuals
Contract Deliverables Milestone Dates | Contract Deliverables – Planned vs. Actual Dates
Data Requirements List Items (DRLI) Delivery Status | Delivery Status Summary – # On Time vs. # Late; % On Time Deliveries
Contract Deliverables Status (e.g., DRLI in development, review, approval, delivered) | Contract Deliverables Status Summary
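NOTE (illustrative example, not a requirement of this JPR): Derived measures such as "% On Time Deliveries" follow directly from contract deliverable records. A minimal Python sketch, with hypothetical deliverable names and dates:

from datetime import date

# (deliverable, planned due date, actual delivery date or None if not yet delivered)
deliverables = [
    ("DRLI-001 Monthly Status Report",   date(2004, 3, 1),  date(2004, 2, 27)),
    ("DRLI-002 Verification Plan",       date(2004, 3, 15), date(2004, 3, 20)),
    ("DRLI-003 Acceptance Data Package", date(2004, 4, 1),  None),
]

delivered = [(name, due, actual) for name, due, actual in deliverables if actual]
on_time = sum(1 for _, due, actual in delivered if actual <= due)
late = len(delivered) - on_time
print(f"# on time = {on_time}, # late = {late}, "
      f"% on-time deliveries = {100.0 * on_time / len(delivered):.0f}%")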


4.3.1.10 Methods and techniques3,8
Several methods and techniques are available to support the Acquisition Management process. These are:
• Market research.
• Make/buy/reuse analysis.
• Technology assessment.
• Trade study.
• Risk-based acquisition management.
4.3.1.11 Software tools
Some means of capturing evaluation criteria, computations, comparison, recommendations, and results associated with SE analyses is required. Typically a spreadsheet application is sufficient to capture, control, manipulate, present, and share this information. Access to historical data repositories should also be considered. Depending on the size, complexity, and amount of applicable data captured from previous projects, anywhere from standard office automation tools to large databases may be required.
4.3.1.12 References
The following documents, which were used to prepare this section, offer additional insights into the Acquisition Management process:
1. NPR 7120.5B, NASA Program and Project Management Processes and Requirements, 2002.
2. NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements.
3. CMMI-SE/SW/IPPD/SS V1.1, Capability Maturity Model Integration (CMMI) for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing, 2002.
4. JSC SLP 4.6, Procurement.
5. JPG 5335.2, Space Act Agreements.
6. INCOSE Systems Engineering Handbook, Version 2.0, 2000.
7. EIA-632, Processes for Engineering a System, ANSI/EIA-632-1998, 1999.
8. SP 6105, NASA Systems Engineering Handbook, 1995.
9. JMI 5150.5F, Processing of JSC Procurements Through Delivery, Acceptance and Payment Stages.
10. JMI 5151.5B, Management of Support Contracts.


4.3.2 Risk Management1,2
The Risk Management process is performed to identify potential problems before they occur, so that risk-handling activities may be planned and invoked, as needed, across the life of the project in an efficient and effective manner and to mitigate adverse impacts. In the Risk Management process, the project team is responsible for identifying, analyzing, planning, tracking, controlling, and communicating effectively the risks (and the steps being taken to handle them) both within the team and with management and stakeholders. Risk management is a continuous, iterative process with the ultimate goal of enabling mission success. It should be a key element and an integral part of project management and engineering processes.
4.3.2.1 Function3
In the Risk Management process:
• Continuous and iterative risk analysis that describes and quantifies the safety, performance, schedule, and cost risk (i.e., likelihood vs. consequences) shall be conducted.
• Controls and approach, including mitigation options for identified risks, shall be documented.
• Risks shall be tracked.
• Risks that impact mission success, performance, cost, and schedule shall be controlled.
4.3.2.2 Objective
The Risk Management process is performed to reduce the probability of adverse impacts and, as a result, maximize the probability of achieving system goals.
4.3.2.3 Responsibilities
The Project Control Officer has primary responsibilities in the tracking, control, and communication aspects of the Risk Management process. The Project Manager is the primary customer in that the Risk Management process supports project resource allocation decisions. All other Project Team Members are responsible for risk identification, analysis, and planning.
4.3.2.4 Life cycle
Risk management is conducted throughout the project life cycle. A preliminary risk management plan is required at the end of Phase A, Preliminary Analysis, and an update to this plan is required by the end of Phase B, Definition.
4.3.2.5 Inputs4
Inputs include the following:

• Program data
• Project data
• Constraints
• Fault-tree analysis (FTA) results
• Failure modes and effects analysis (FMEA) results
• Test data
• Expert opinion
• Hazard analysis
• Lessons learned
• Technical analysis
• Resources

4.3.2.6 Steps1
The project team shall document the Risk Management process within a risk management plan, and the project manager shall approve that process. Content requirements of the risk management plan are listed in NPR 8000.4, Section 2.7.4.2. Figure 4.3-2, which appears later in this section, illustrates this process.
The project Risk Management process shall include at a minimum: (a) risk identification, (b) risk analysis, (c) risk planning, (d) risk tracking, (e) risk control, and (f) identification of process responsibilities. These are outlined as shown below.

• Identify Risks
− Identify all project risks, including safety, performance, schedule, and cost risks.
− State the risks in terms of conditions and consequence(s).
− Capture the context of the risks; e.g., what, when, where, how, and why.
− Methods such as FMEA and fault-tree analysis can help to identify risks.
− Key areas to assess include safety, performance, schedule, budget, requirements, technology, management supportability, logistics and maintenance operations, and programmatic.

• Analyze Risks
− Evaluate risks, including:
• Probability
• Impact/severity
• Timeframe (when action needs to be taken)
− Classify/group similar/related risks.
− Prioritize (a likelihood-vs.-consequence scoring sketch appears after this list).
− Methods such as probabilistic risk assessment (PRA) can help to analyze risks.
• Plan for Action
− Assign responsibility for each risk.
− Determine approach for each risk (research, accept, mitigate, or monitor). Typical risk attributes for each approach are as follows:
• Research – High-cost mitigation, long time to effect, uncertainty in analysis of likelihood and consequence


• Accept – Low consequence, fairly low likelihood, any timeframe, any cost
• Mitigate – High-medium consequence, high and medium likelihood, near to mid term, high-medium and low cost to mitigate
• Monitor – Low-medium likelihood, low-medium consequence, mid and far term, high-medium and low cost to mitigate
− For each risk that will be mitigated, define mitigation level (e.g., action item list or more detailed task plan) and goal, and include budget estimates.

• Track
− Acquire/update, compile, analyze, and organize risk data.
− Report results/status.
− Verify and validate mitigation actions.
• Control
− Analyze results of mitigation actions.
− Decide how to proceed (re-plan, close the risk, invoke contingency plans, continue tracking).
− Execute the control decisions.
− The project manager shall personally review the formal acceptance and closure of all project risks.

• Communicate and Document
− Communicate essential risk status to the entire team on a regular basis.
− The project manager shall report project risk status, especially a status concerning primary risks (i.e., those with both high probability and high impact/severity), to the program manager, directorate-level organization (DLO) management, and appropriate project governing management forums (e.g., directorate project management board, JSC Engineering Review Board (ERB), or JSC Project Management Council (PMC)).
− Implement a system for documenting and tracking risk decisions.
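NOTE (illustrative example, not a requirement of this JPR): The analyze and plan-for-action steps above are often supported by a simple likelihood-vs.-consequence scoring. The Python sketch below uses a hypothetical 5x5 scale, hypothetical thresholds, and sample risks; actual scales, thresholds, and the research/accept/mitigate/monitor decision come from the project's approved risk management plan.

from dataclasses import dataclass

@dataclass
class Risk:
    statement: str
    likelihood: int   # 1 (remote) .. 5 (near certainty)
    consequence: int  # 1 (minimal) .. 5 (catastrophic)

    @property
    def score(self):
        return self.likelihood * self.consequence

    def suggested_approach(self):
        # Primary risks (high likelihood AND high consequence) warrant mitigation
        # and reporting to program/DLO management.
        if self.likelihood >= 4 and self.consequence >= 4:
            return "mitigate (primary risk)"
        if self.score >= 10:
            return "mitigate"
        if self.consequence <= 2 and self.likelihood <= 3:
            return "accept"
        return "monitor"

risks = [
    Risk("Late delivery of long-lead avionics unit", likelihood=4, consequence=4),
    Risk("Thermal margin shortfall found in vacuum test", likelihood=2, consequence=5),
    Risk("Minor documentation rework after audit", likelihood=3, consequence=1),
]

# Prioritize: highest score first (the "research" approach is omitted for brevity).
for r in sorted(risks, key=lambda r: r.score, reverse=True):
    print(f"score {r.score:2d}  {r.suggested_approach():24s} {r.statement}")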

4.3.2.7 Outputs
The Risk Management process is performed over the entire life cycle of the system of interest. A preliminary risk management plan is required during Phase A, with an update required during Phase B. In addition, outputs include risk:
• Statements
• Evaluations
• Mitigation plans
• Acceptance criteria

• Tracking plans
• Status reports
Figure 4.3-2. Risk management process diagram. (Steps: identify risks, analyze risks, plan for action, track, and control, drawing on inputs such as the risk management plan, project data, goals and constraints, resources, FTA/FMEA and PRA results, hazard analyses, lessons learned, expert opinion, test data, and other technical analyses, and producing statements of risks; risk evaluation, classification, and prioritization; risk mitigation plans, acceptance criteria, and tracking plans; and risk status reports. Communication and documentation occur throughout all of the activities.)

4.3.2.8 Exit criteria
Since risk management is conducted throughout the project life cycle, the overall exit criterion is the safe and orderly achievement of system of interest disposal.
4.3.2.9 Measurement
The following table provides example base and derived measures that can be used in conjunction with executing the Risk Management process. See discussion of Measurement on page 4-1.

Identify – Base Measures: Total # of Risks Identified; Total # of Risks of High Probability and/or High Impact (primary risks) Identified. Derived Measures: # of Primary Risks Encountered vs. Total # of Risks Identified – %; Risks Identified Rate Charts.
Analyze – Base Measures: # of Risks with Chosen Approach Research; # of Risks with Chosen Approach Accept; # of Risks with Chosen Approach Monitor; # of Risks with Chosen Approach Mitigate. Derived Measures: Total # of Risks Identified vs. Total # of Risks with Chosen Approach Research – %; Total # of Risks Identified vs. Total # of Risks with Chosen Approach Accept – %; Total # of Risks Identified vs. Total # of Risks with Chosen Approach Monitor – %; Total # of Risks Identified vs. Total # of Risks with Chosen Approach Mitigate – %.
Plan – Base Measures: Total # of Mitigation Plans Developed. Derived Measures: Mitigation Plan Development Rate Charts; Total # of Mitigation Plans Developed vs. Total # of Risks Identified – %; # of Mitigation Plans for Primary Risks Developed vs. Total # of Mitigation Plans Developed.
Track – Base Measures: Total # of Risks Eliminated. Derived Measures: Total # of Risks Eliminated vs. Total # of Risks Identified – %; # of Primary Risks Eliminated vs. Total # of Risks Eliminated; # of Primary Risks Eliminated vs. Total # of Primary Risks Identified; Risk Elimination Rate Charts.
Overall Process – Base Measures: Risk Management Effort (FTEs). Derived Measures: Risk Management Productivity; Risk Management Effort – Planned vs. Actuals; Risk Management Effort as % of Total Engineering Effort.

4.3.2.10 Methods and techniques1,5
Methods and techniques applicable to the Risk Management process include:
• Individual or group expert judgment.
• Statistical analysis of historical data.
• Uncertainty analysis of cost, performance, and schedule projects (consists of building and running a probabilistic model of the system under investigation, including the chance variation inherent in real-life cost, performance, and schedule; a minimal sketch appears after this list).


• PRA (also known as probabilistic safety assessment (PSA) and quantitative risk assessment (QRA)).
• FTA and FMEA.
• Ordinal risk scales.
• Comparison to analogous systems.
• Risk assessment matrix.
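NOTE (illustrative example, not a requirement of this JPR): The uncertainty-analysis method listed above amounts to building and running a probabilistic model. The Python sketch below runs a small Monte Carlo cost model; the WBS element names, triangular (low/most likely/high) spreads, and dollar values are hypothetical.

import random

ELEMENTS = {                      # (low, most likely, high) cost in $M
    "flight hardware":  (8.0, 10.0, 15.0),
    "software":         (3.0,  4.0,  7.0),
    "integration/test": (2.0,  2.5,  5.0),
}

def one_trial():
    """Sample one possible total cost from the triangular spreads."""
    return sum(random.triangular(lo, hi, mode) for lo, mode, hi in ELEMENTS.values())

def run(trials=20000):
    totals = sorted(one_trial() for _ in range(trials))
    mean = sum(totals) / trials
    p50 = totals[int(0.50 * trials)]
    p80 = totals[int(0.80 * trials)]
    print(f"mean = {mean:.1f} $M, 50th percentile = {p50:.1f} $M, "
          f"80th percentile = {p80:.1f} $M")

run()

The same pattern extends to schedule and performance parameters; the spread between the 50th and 80th percentiles is one way to express the cost risk carried by the project.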

4.3.2.11 Software tools
Principal considerations for a S/W tool that would support the Risk Management process are the consistent, concise, and thorough documentation both of the risk characterization (probability and impact) and of the mitigation progress of each risk. Common and convenient accessibility and visibility to all project team members and stakeholders is also a primary consideration. The Integrated Risk Management Application (IRMA) is a custom-built database tool that is used by the International Space Station Program. JSC projects should consider use of this already existing tool as their risk management application. A plan is being considered for adopting IRMA NASA-wide to support unifying the risk management effort across the Agency.
4.3.2.12 References
The following documents and Web sites, which were used to prepare this section, offer additional insights into the Risk Management process:

1. NPR 8000.4, Risk Management Procedures and Guidelines, 2002.
2. CMMI-SE/SW/IPPD/SS V1.1, Capability Maturity Model Integration (CMMI) for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing, 2002.
3. NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements.
4. NPR 7120.5B, NASA Program and Project Management Processes and Requirements, 2002.
5. RP 1358, Systems Engineering "Toolbox" for Design-Oriented Engineers, 1994.
6. Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners, Version 1.1, http://www.hq.nasa.gov/office/doeq/doctree/praguide.pdf
7. NTIS AD-319533KKG, DTIC#: AD-A319 533\6\XAB, Continuous Risk Management Guidebook, Software Engineering Institute at Carnegie Mellon University, 1996.
8. The Department of Defense, Risk Management Guide for DoD Acquisition, Defense Acquisition University and Defense Systems Management College, Second Edition, 1999. This document can be downloaded from: http://www.dsmc.dsm.mil/pubs/pubsgen.htm
9. SSP 51079, International Space Station Program Risk Management Plan, Revision A, 2002.

Information on:
• Risks associated with computer systems can be found in the National Institute of Standards and Technology (NIST) publication SP 800-12, An Introduction to Computer Security: the NIST Handbook, available at http://csrc.nist.gov/publications/nistpubs/800-12/.
• Implementation of risk-based acquisition management (R-BAM) can be found at http://www.grc.nasa.gov/WWW/spaceiso/rbam/.
• The Continuous Risk Management process may be found at http://www.srqa.jsc.nasa.gov/riskmgmt/.


4.3.3 Configuration Management1,2
Configuration management (CM) is performed on projects to establish and maintain the integrity of a system of interest's products and to control project baselines using configuration planning, configuration identification, configuration control, configuration status accounting, and configuration verification. The requirements on the part of the SE and project control efforts are to provide technical support to the overall project CM established by the project manager per NPR 7120.5B.3

4.3.3.1 Function1
In the Configuration Management process:

• The project’s overall plan and process for achieving CM shall be developed.

• The work products that are to be placed under configuration control shall be identified.

• The configuration of selected work products that compose the baselines at given points in time shall be identified.

• The control and communication of changes to configuration items (CIs) shall be conducted.

• Status accounting and current configuration data for configuration-controlled items shall be provided.

• Configuration verification shall be supported.
4.3.3.2 Objective
The primary objective of the SE and project control Configuration Management process is to assure that the physical configuration of a product is adequately identified, documented, and controlled to a level of detail sufficient to repeatedly produce that product and meet anticipated needs for operation, quality management, maintenance, repair, and replacement.
4.3.3.3 Responsibilities2
The Lead Systems Engineer is responsible for ensuring the completeness and technical integrity of the technical baseline and for ensuring that it is consistent with the costs and schedules in the business baseline.
The Project Manager is responsible for establishing a Configuration Management process for the project, including a Configuration Control Board (CCB) or equivalent.
The Project Control Officer conceives, implements, and manages the CM system, and documents it in a CM plan; acts as secretary of the project CCB (controls the change review process); controls baseline changes and releases; and initiates and acts on the results of configuration verification activities, including audits.
The Quality Assurance Engineer supports the configuration audits to evaluate the evolution of a product to ensure compliance to specifications, policies, and agreements.
The Engineering Disciplines support configuration identification for their specific areas, are extensive users of the CM system, and rely on configuration status accounting to manage their CIs and request the proper baselines. The Engineering Disciplines also support the Configuration Manager and the Quality Assurance Engineer with applicable configuration audits.
4.3.3.4 Life cycle
The Configuration Management process is concerned with all system of interest products and everything that describes those products from their name to their requirements and documentation. Therefore, CM begins during the preliminary analysis phase and continues throughout the life cycle of the system of interest. A CM plan is developed during the preliminary analysis phase.
4.3.3.5 Inputs
Typical inputs to the Configuration Management process consist primarily of product baselines and CRs to baselines. Examples of these inputs and their associated life cycle phase are as follows:

Life Cycle Phase | CM Process Input
Preliminary Analysis | Operations Concept
Definition | System/Subsystem Requirements; Interface Requirements Specification
Design | Architectural Design; Interface Design; Test Plan and Procedures
Product Delivery | Operational and Support Manuals; Training Materials
Operations | Technical Performance Data; Lessons Learned
Termination | Disposal Plan


4.3.3.6 Steps4
The following diagram (Figure 4.3-3) illustrates the major steps and products of the SE Configuration Management process. Adhering to the major steps of this process directs the project toward meeting its CM requirements. For each major process step, additional guidance is provided as to the typical sub-process steps and products that are expected.

• Planning Steps
− Perform CM Planning – A plan for performing CM on the project shall be defined.
• Develop the project's overall plan for achieving CM and include as part of the SEMP and PMP, as applicable.
• Define the project's CM roles and responsibilities.
• Develop the appropriate CM policies, processes, procedures, and guidelines required to meet the project's CM plan.
• Provide the project manager with cost and schedule inputs related to CM life cycle activities in support of the project planning effort.
• Ensure that at least a top-level/partial version of the CM plan is available at Mission Definition Review (MDR), and a final/approved version is available at System Design Review (SDR).

• Identification Steps
− Identify CIs and Baselines – CIs that will be placed under CM shall be identified. Typical items placed under CM control include plans, processes, designs, drawings, requirements, products, tools, technical baselines, change control forms, and program documentation.
• Select the CIs and the work products that compose the CIs based on documented criteria.
• Assign unique identifiers to CIs using a predetermined method.
• Specify important characteristics of each CI (e.g., author, document or file type, publication date or version identifier, and programming language for S/W code files).
• Specify when each CI is placed under CM. Example criteria for determining when to place work products under CM include the stage of the project life cycle, when the work product is ready for test, degree of control desired on the work product, cost and schedule limitations, and customer requirements.
• Identify the owner responsible for each CI.
• Identify CI baselines that may be used for internal use and for delivery to the customer.

Figure 4.3-3. Configuration management process diagram. (Steps and products: perform CM planning (CM plans and processes); establish CM systems (CM systems); identify CIs and baselines (project CI and baseline definitions); manage CRs (change control forms); control CIs; release baselines (CI baselines); establish CM records (configuration reports); and perform configuration audits (audit results).)


− Establish a CM System – A CM system, including a change management process for controlling work products, shall be established and maintained.
• Establish a mechanism to manage multiple control levels of CM; e.g., mechanisms for managing the differences in the levels of control needed at different times in the project life cycle, differences in the levels of control needed for different types of systems, and differences in the levels of control needed to satisfy privacy and security requirements for the CIs.
• Create CM reports from the CM system.
• Preserve the contents of the CM system (i.e., backup, archiving, and recovery functions).

• Control Steps
− Manage CRs – CRs to CIs shall be tracked.
• Initiate and record CRs in the CR repository.
− Problem/change report
− Specification change notice
− Engineering change proposal
− Request for deviation/waiver
• Analyze the impact of changes and fixes proposed in the CRs.
• Review CRs that will be addressed in the next baseline with those organizations that will be affected by the changes and get their agreement.
• Track the status of CRs to closure.
− Control CIs – Changes to CIs, throughout the life of the product, shall be controlled. Configuration control maintains the integrity of the CIs identified by facilitating approved changes and prevents the incorporation of unapproved changes into the baseline.
• Verify appropriate authorization was obtained before changed CIs are entered into the CM system. Authorization may be in the form of approval from a project-established CCB.
• Check in and check out CIs from the CM system for incorporation of changes in a manner that maintains the correctness and integrity of the CIs.
• Perform reviews to ensure that changes have not caused unintended effects on the baselines (e.g., ensure that changes have not compromised the safety and/or security of the system).
• Record changes to CIs and the reasons for the changes, as appropriate.

− Release Baselines – Baselines for internal use and for delivery to the customer shall be released. Release of a baseline involves approving a set of configuration data for the agreed-on set of CIs from the CM systems and releasing the baseline for further development. Multiple baselines may be used to define an evolving product during its development cycle.
• Obtain authorization from the CCB before creating or releasing baselines of CIs.
• Create or release baselines only from CIs in the CM system.
• Document the set of CIs that is contained in a baseline.
• Make the current set of baselines readily available.
• Status Steps
− Establish CM Records – Records describing CIs shall be established and maintained (a minimal status accounting sketch appears after this list).
• Record CM actions in sufficient detail so that the content and status of each CI is known and previous versions can be recovered.
• Ensure that relevant stakeholders have access to and knowledge of the configuration status of the CIs.
• Specify the latest version of the baselines.
• Identify the version of the CIs that constitute a particular baseline.
• Describe the differences between successive baselines.
• Revise the status and history (i.e., changes and other actions) of each CI, as necessary.

• Audit Steps
− Perform Configuration Verification – Configuration audits to maintain integrity of the configuration baselines and ensure process integrity shall be supported.
• Assess the integrity of the baselines.
• Confirm that the configuration records correctly identify the configuration of the CIs.
• Review the structure and integrity of the items in the CM system.
• Confirm the completeness and correctness of the items in the CM system.
• Confirm compliance with applicable CM standards and procedures.
• Track action items from audit to closure.
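NOTE (illustrative example, not a requirement of this JPR): The status accounting called for under Establish CM Records can be illustrated with a minimal CI registry that keeps every released version and the CR that authorized it, so previous baselines can be recovered. The Python sketch below is a toy model; the identifiers, field names, and sample entries are hypothetical, and a real project would use its CM system and database tools.

from dataclasses import dataclass, field

@dataclass
class Version:
    version_id: str      # e.g., "Rev A"
    change_request: str  # CR that authorized the change (CCB approval)
    description: str

@dataclass
class ConfigurationItem:
    ci_id: str           # unique identifier assigned per the CM plan
    owner: str
    history: list = field(default_factory=list)

    def release(self, version_id, change_request, description):
        """Record a newly released version (one status accounting entry)."""
        self.history.append(Version(version_id, change_request, description))

    def current(self):
        return self.history[-1]

ci = ConfigurationItem(ci_id="DOC-SEMP-001", owner="Lead Systems Engineer")
ci.release("Basic", "CR-0001", "Initial baseline release")
ci.release("Rev A", "CR-0017", "Updated verification approach")
print(f"{ci.ci_id}: current = {ci.current().version_id}, "
      f"{len(ci.history)} versions on record")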

4.3.3.7 Outputs4
Primary outputs from this process are:
• CM plans/processes/CM procedure(s)
• CM system


• Project CIs
• CI baselines
• Change control forms
• Configuration reports
• Configuration audit results

4.3.3.8 Exit criteria
Since CM is an ongoing function throughout the system of interest life cycle until retirement, there are no specific exit criteria. However, there are some items that are required to move from one life cycle phase to the next. Some examples of these are:
• Released and approved CM plan or updated version
• Established list of project CIs
• Implemented CM system containing the identified CIs
• Established and controlled CI baselines
• Published configuration audit results

4.3.3.9 Measurement
The following table provides example base and derived measures that can be used in conjunction with the Configuration Management process. See discussion of Measurement on page 4-1.
4.3.3.10 Methods and techniques
Several methods and techniques, such as general auditing techniques, are available. Additional process tools and techniques are listed in JSC International Standards Organization (ISO) SLP 4.20, Process Measurement and Improvement.5
Also, change control forms provide a standard method of reporting problems and enhancements that lead to changes in formal baselines and internally controlled items. The following examples provide an organized approach to change tracking:
• Problem/Change Report – To document problems and recommend enhancements to CIs or complementary documentation. Can be used to identify problems during design, development, integration, test, and operations.
• Specification Change Notice – To propose, transmit, and record changes to baselined specifications.
• Engineering Change Proposal – To propose changes to the customer. This proposal describes the advantages and disadvantages of the proposed change as well as available alternatives and the schedule and funding needed to proceed.
• Request for Deviation/Waiver – To request and document temporary deviations from configuration identification requirements when permanent changes to provide conformity to an established baseline are not acceptable.

4.3.3.11 Software tools
Suggested tools are:
• A database management system to capture and control CIs; to identify, control, and release baselines; and to provide status accounting reports.
• Standard office automation products to create standardized change control forms and audit reports.
• A workflow system to handle CR tracking and approval.

Base Measures | Derived Measures
# of CIs and Associated Size (or complexity) | CI Content – Planned vs. Actuals
CI Status | CI Status Summary (monthly as a minimum)
CR Status (classifications include change, problem, deviation, waiver) | CR Status Summary – # Open, Approved, Rejected, In-Process, Closed, Common Reason for Change; CM Rate Charts – Change Rate, Change Process, Cycle Time
Effort for CM (FTEs) | CM Effort – Planned vs. Actuals; per CR Type; per Life Cycle Phase; CM Effort as % Total Engineering Effort
CM Activity Status | CM Activities Status Summary
CM Records Status | CM Records Status Summary


4.3.3.12 References
The following documents, which were used to prepare this section, offer additional insights into the Configuration Management process:
1. NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements.
2. SP 6105, NASA Systems Engineering Handbook, 1995.
3. NPR 7120.5B, NASA Program and Project Management Processes and Requirements, 2002.
4. CMMI-SE/SW/IPPD/SS V1.1, Capability Maturity Model Integration (CMMI) for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing, 2002.
5. SLP 4.20, Process Measurement and Improvement.
6. INCOSE Systems Engineering Handbook, Version 2.0, 2000.
7. EIA-632, Processes for Engineering a System, 1999.


4.3.4 Quality Management1–6
The Quality Management process is performed to assure the use and integration of high standards of quality within the PM effort to reduce technical risk. This process addresses the quality of the processes that are being used to create the system of interest. High-quality products can only be produced, on a continuous basis, if a process exists to continuously measure and improve the quality of processes used to produce the system of interest products. This process emphasizes establishment of goals and subsequent measurements, analysis, and implementation of corrective actions to attain those goals. The intent of quality is to ensure products or services are offered that meet defined needs, satisfy stakeholders' expectations, and comply with applicable standards and procedures.
The Quality Management process involves (1) objectively evaluating performed processes, work products, and services against applicable process descriptions, standards, and procedures; (2) identifying and documenting noncompliance issues; (3) ensuring that noncompliance issues are addressed; (4) capturing, reporting, and reviewing lessons learned; and (5) providing feedback to project staff and managers on the results of quality activities. This process supports the delivery of high-quality products by providing project staff and managers at all levels with appropriate visibility into, and feedback on, processes and associated work products throughout the life of the project.
NASA and JSC policies, procedures, and guidelines (ref. NPD 1280.1,7 JPD 5335.1,8 and JPG 5335.34) establish organizational and project expectations to objectively evaluate processes and products. These expectations encompass assurances that (1) customer requirements are determined, maintained, and satisfied; (2) products and services are planned, developed under controlled conditions, measured, reviewed, and improved; and (3) processes and their interactions used to provide products and services are planned, measured, reviewed, and improved. This Quality Management process provides a mechanism for implementing many of these policies, procedures, and guidelines. See Section 4.1.11 for details on quality assurance (QA) activities.
4.3.4.1 Function1
In the Quality Management process:

• Reliable and repetitive procedures and processes shall be identified and established.

• Established procedures and processes shall be used to manage the project and design of the system.

• The quality process shall ensure that system of interest products meet the design (i.e., “as-built” meets the “as designed”) and the records of compliance are maintained.

• Continuous improvement and lessons learned within the procedures and processes shall be incorporated.

4.3.4.2 Objective2,7
The objective of this process is to identify and establish processes and procedures to manage and design the system, to measure process performance, and to perform continuous improvement of the processes based on objective data. Not only is quality expected of the system of interest as it matures but the processes used to build in quality are under continuous review for effectiveness, value to stakeholders, and improvement opportunities.
4.3.4.3 Responsibilities9
It is the responsibility of the Project Manager to ensure that project planning documents include well-defined, quality-related processes and strategies that will be followed by the project team and that are in accordance with the JSC Quality Management System.4
The Project Manager and the Lead Systems Engineer need to have confidence that the system of interest produced and delivered is in accordance with its functional, performance, and design requirements.
The Project Control Officer is responsible for implementing the Quality Management process for the project with technical support from the Lead Systems Engineer.
4.3.4.4 Life cycle
The Quality Management process is pervasive throughout the project life cycle. Quality Management process expectations begin at project conception and are applied increasingly as the system of interest matures through design, development, integration, test, deployment, operation, and disposal. During the early stages of the life cycle, the focus is on ensuring that the quality management system policy is being properly applied; i.e., appropriate processes and procedures are established based on the needs of the project. As the project matures, the emphasis shifts to assessments of process applications; i.e., are the processes being executed as described, and are the work products being produced in accordance with requirements and specifications.
4.3.4.5 Inputs
The following are typical inputs to the Quality Management process:

• Project plans
• Center processes, standards, and work instructions
• Process tailoring guidelines


4.3.4.6 Steps3,10,11
The following diagram (Figure 4.3-4) illustrates the major steps of the Quality Management process. Details of these steps are provided below.
• Apply Processes and Practices – All project-related processes and practices shall be properly established and tailored as required, and addressed in the project plan and associated project documentation.
• Review Lessons Learned – Review lessons learned to establish an awareness of issues and problems previously identified.
− Identify Applicable Processes – Determine which JSC processes are applicable to the project.
− Tailor Processes for the Project – Tailor identified processes to meet specific needs of the project.
− Identify Lessons Learned – Identify and capture lessons learned and best practices that could improve future projects and their processes. In particular, identify and report to process owners any particular strengths or weaknesses of the selected processes.

• Evaluate Process Performance – Performed processes shall be objectively evaluated against applicable process descriptions, standards, and procedures.
− Select Processes for Evaluation – Identify the processes that will be evaluated based on criticality of the process to meet project needs.
− Establish and Maintain Process Evaluation Criteria – Establish and maintain clearly stated criteria for process evaluations to determine appropriate levels of process performance based on project needs.
− Select Process Measures – Select a set of measurements that supports the evaluation criteria (see Section 4.2.5, Performance Measurement).
− Monitor Process Performance and Collect Process Measures

Figure 4.3-4. Quality management process diagram. (Steps: apply processes and practices – review lessons learned, identify applicable processes, tailor processes; evaluate process performance – select processes for evaluation, establish and maintain process evaluation criteria, select process measures, monitor processes, evaluate selected processes; evaluate work products – select work products for evaluation, establish and maintain work product evaluation criteria, select product measures, monitor work products, evaluate selected work products; ensure resolution of noncompliance issues – resolve, document, and escalate unresolved issues, analyze trends, inform relevant stakeholders, review open issues, track noncompliance issues; and establish records – capture lessons learned, record activities, revise status – producing lessons learned and a quality process status report.)


− Evaluate Selected Processes – Use the stated criteria to evaluate performed processes.

− Identify Lessons Learned – Identify and capture lessons learned and best practices that could improve future projects and their processes.

• Evaluate Work Products – Work products and services shall be objectively evaluated against the applicable product descriptions, standards, and procedures.
− Select Work Products for Evaluation – Identify work products to be evaluated based on documented sampling criteria, if sampling is used.
− Establish and Maintain Work Product Evaluation Criteria – Clearly state criteria by which to evaluate work products. The intent is to provide criteria based on project needs such as:
• What will be assessed during the evaluation of a work product?
• When or how often will a work product be evaluated?
• How will the evaluation be conducted?
• Who must be involved in the evaluation?
− Select Product Measures – Select a set of measurements that supports the evaluation criteria (see Section 4.2.5, Performance Measurement).
− Monitor the Development of Work Products and Collect Product Measures

− Evaluate Work Products – Evaluate work products by:
• Using the standard criteria during evaluations of work products.
• Evaluating work products before they are delivered to the customer.
• Evaluating work products at selected milestones in their development.
• Performing in-progress or incremental evaluations of work products and services against process descriptions, standards, and procedures.

• Identifying each case of noncompliance found during the evaluations.

− Identify Lessons Learned – Identify and capture lessons learned and best practices that could improve future projects and their work products.

• Ensure Resolution of Noncompliance Issues – Quality issues and resolution of noncompliance issues shall be communicated with project staff and management, regardless of whether issues are process or work product related.

− Resolve Noncompliance Issues – Resolve each noncompliance issue with the appropriate members of the project where possible.
− Document Unresolved Issues – Document noncompliance issues when they cannot be resolved within the project.
− Escalate Unresolved Issues – Escalate noncompliance issues that cannot be resolved within the project to the appropriate level of management.
− Analyze Trends – Analyze nonconformance issues to see if there are any quality trends that can be identified and addressed. Identify whether nonconformance issues are caused by noncompliance with established processes or by process deficiencies (a tallying sketch appears after this list).
− Inform Relevant Stakeholders – Ensure that relevant stakeholders are aware of the results of the evaluations and the quality trends. Communicate process deficiencies or improvement opportunities to the process owner so that they are addressed as part of continuous process improvement. The process owner can be at the project, directorate, or Center level. Examples of Center-level process owners include the JSC Systems Engineering Working Group (J-SEWG), the Software Engineering Process Group (SEPG), and the Project Management Working Group (PMWG).
− Review Open Noncompliance Issues – Periodically review open noncompliance issues and trends with the manager who is designated to receive and act on noncompliance issues.
− Track Noncompliance Issues – Track noncompliance issues to resolution.

• Establish Records – Quality records shall be established and maintained.
− Capture Lessons Learned – Record lessons learned that were identified in previous steps in this process. Ensure lessons learned are available for review. Report lessons learned, as required.
− Record Activities – Record process and product quality management activities in sufficient detail such that status and results are known.
− Revise Status – Revise status and history of the quality management activities, as necessary.
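NOTE (illustrative example, not a requirement of this JPR): The Analyze Trends step above, together with the Pareto-analysis technique listed under Methods and techniques in Section 4.3.4.10, can be as simple as tallying noncompliance reports by cause and ranking them so the most frequent causes are addressed first. The Python sketch below uses hypothetical causes and counts.

from collections import Counter

# Hypothetical cause recorded on each noncompliance report.
noncompliance_reports = [
    "process not followed", "missing review record", "process not followed",
    "outdated work instruction", "process not followed", "missing review record",
    "tool configuration error",
]

counts = Counter(noncompliance_reports)
total = sum(counts.values())
cumulative = 0.0
print(f"{'cause':28s} {'count':>5s} {'cum %':>6s}")
for cause, n in counts.most_common():
    cumulative += 100.0 * n / total
    print(f"{cause:28s} {n:5d} {cumulative:6.1f}")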

4.3.4.7 Outputs2,3
Outputs of the Quality Management process are:
• Project-specific processes and work instructions.
• Process and work product evaluation criteria.
• Process and work product evaluation reports.


• Lessons learned, best practices, and improvement opportunities.
• Trouble/noncompliance reports.
• Corrective action reports.
• Quality trends.

4.3.4.8 Exit criteria2
Quality management is an ongoing process that is never actually completed until project termination. However, throughout the project life cycle the following criteria shall be evident:

• Corrective action identification and mitigation plans

• Lessons learned identification, capture, and dissemination

• Improvement opportunity identification, capture, and dissemination

4.3.4.9 Measurement
The following table provides example base and derived measures that can be used in conjunction with executing the Quality Management process. See discussion of Measurement on page 4-1.
4.3.4.10 Methods and techniques2
The following are typical methods and techniques used in quality management:

• Cause and effects diagram
• Trend analysis
• Pareto analysis
• Fishbone (Ishikawa) diagrams
• Process maps and models
• Process simulations
• Surveys, assessments, and audits

4.3.4.11 Software tools2
Typical S/W tools used in the Quality Management process are analysis, simulation, and modeling tools as well as standard office products such as spreadsheets, word processors, and databases. Many of these tools fall into the following categories:
• Cause analysis tools
• Analysis tools
• Mathematical modeling tools
• Process illustration and simulation tools
• Process checklists

4.3.4.12 References
The following documents, which were used to prepare this section, offer additional insights into the Quality Management process:
1. NPR 71xx.x (document number not yet assigned), NASA Systems Engineering Processes and Requirements.
2. EIA-731.1, Systems Engineering Capability Model, 2002.
3. CMMI-SE/SW/IPPD/SS V1.1, Capability Maturity Model Integration (CMMI) for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing, 2002.
4. JPG 5335.3, JSC Quality Management System Quality Manual, 2003.
5. NPR 7120.5B, NASA Program and Project Management Processes and Requirements, 2002.
6. ISO 9001:2000, Quality Management Systems – Requirements, 2000.
7. NPD 1280.1, NASA Management System Policy, 2003. (NOTE: This NPD cancelled NPD 8730.3, NASA Quality Management System Policy.)
8. JPD 5335.1, JSC Quality Policy, 2003.
9. SP 6105, NASA Systems Engineering Handbook, 1995.
10. INCOSE Systems Engineering Handbook, Version 2.0, 2000.
11. AG-CWI-001, JSC Lessons Learned Process.

Base Measures | Derived Measures
Tailoring Report Completion | Planned vs. Actual Tailoring Dates
# of Processes Tailored (i.e., modified or tailored out) | % Process Compliance; % Processes Modified or Tailored Out
# of Process Assessments | Planned vs. Actual Process Assessments
# of Corrective Actions (e.g., discrepancy reports) | # of Corrective Action Items vs. # of Corrective Actions Resolved; # of Days Past Due for Corrective Action


Appendices


Appendix A – Project Management Plan Content Requirements1

A.1 Title Page
Includes, at a minimum, signatures of:
• Center Director or chair, directorate-level board if delegated
• Program manager (if project is in support of a program)
• Project manager
• S&MA organization

A.2 Introduction
The project is identified by an officially approved title, a NASA program, a program commitment agreement (PCA), and/or unique project number. A brief general history and summary are given, including the project's purpose, goals, overall approach, and time frame. For multiple NASA center projects, describe the NASA center's project in relationship to the other participating NASA centers.

A.3 Objectives
State the specific project objectives, the performance goals, and their relationship to the program objectives and goals. Performance goals should be expressed in an objective, quantifiable, and measurable form.

A.4 Customer Definition and Advocacy
State the main customers of the project (e.g., principal investigator (PI), science community, technology community, public, education community, program and Enterprise sponsor) and the process to be used to ensure customer advocacy.

A.5 Project Authority
Identify the center where the project manager resides and other center responsibilities, and the Governing Project Management Council (GPMC) responsible for oversight of the project. Provide a chain of accountability and decision path that outlines the roles and responsibilities of the project manager, program manager, Center Director, and other authorities as required.

A.6 Management
Describe the project management structure, including organization and responsibilities, its integration into the program management structure, and NASA center participation. Identify all significant interfaces with other contributing organizations. Identify specific management tools to support management in planning and controlling the project. Describe any use of special boards and committees. Address any requirement for a NASA Resident Office, including duties and authority.

A.7 Project Requirements
Document the project requirements, including performance requirements and success criteria, as a flow-down from the program requirements. This includes allocation of these requirements and success criteria among the systems to be developed, both hardware (H/W) and software (S/W).

A.8 Technical Summary
Present a technical description of the project. This includes the systems to be developed (H/W and S/W), use of the international system of units (SI) measurement system, facilities, flight plans, operations and logistics concepts, and planned mission results analysis and reporting. The technical summary includes the following:

• System(s)
• System operations concept
• System constraints
• Ground systems and support
• Facilities
• Mission results analysis and reporting
• End of life cycle

A.9 Logistics
Describe the project logistics requirements; e.g., spares, shipping and handling equipment, transportation, user manuals, simulators, training and training materials, and supporting personnel.

A.10 Schedules
Document the project master schedule for all major events, independent reviews, and other activities throughout the life cycle of the project. Include approval dates for principal project documentation, life cycle transitions, major reviews, program-controlled milestones, and significant contractor milestones. Identify lower-level schedules to be developed and maintained.

A.11 Resources
The following resources are to be addressed:

a. Funding Requirements – Present a funding requirements chart that includes the same elements as for the acquisition summary. Indicate the new obligation authority (NOA) in real-year dollars for the prior, current, and remaining fiscal years. The displayed detail should cover major elements of cost (typically reflecting at least the second level of the work breakdown structure (WBS) or its equivalent).


b. Institutional Requirements – Present institutional requirements (use of or development of facilities, workforce) for the entire project throughout its life cycle. Include civil service workforce requirements on the providing organizations for the prior (e.g., actuals), current, and remaining years.

A.12 Controls
All technical performance, cost, or schedule parameters specified as requiring approval by the Administrator, the Enterprise Associate Administrator (EAA), Center Director, program manager, or appropriate controlling authority should be identified. Examples include funding by year, success criteria, program requirements, project objectives, management structure, and major program/project documentation. Identify the thresholds associated with each parameter that could cause a change request. Describe the process by which project requirements are validated for compliance with program requirements. Describe the process for controlling changes to these requirements.
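For illustration only (not part of NPR 7120.5B or this JPR): one way a project might record controlled parameters and their thresholds, and flag when a change request is warranted. All parameter names, baselines, and threshold values below are hypothetical.

# Illustrative sketch; parameters, baselines, and thresholds are hypothetical.
controlled_parameters = {
    # parameter: (baseline value, allowed fractional growth before a change request)
    "life cycle cost ($M)": (250.0, 0.10),
    "launch mass (kg)":     (4200.0, 0.05),
    "months to PDR":        (18.0, 0.15),
}

def parameters_needing_change_request(current_values):
    """Return parameters whose current value exceeds baseline * (1 + threshold)."""
    flagged = []
    for name, (baseline, threshold) in controlled_parameters.items():
        if current_values[name] > baseline * (1.0 + threshold):
            flagged.append(name)
    return flagged

print(parameters_needing_change_request({
    "life cycle cost ($M)": 290.0,   # exceeds the 10% threshold
    "launch mass (kg)": 4300.0,      # within the 5% threshold
    "months to PDR": 19.0,           # within the 15% threshold
}))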

A.13 Implementation Approach
The implementation approach of the project is provided (e.g., in-house, NASA center, contractor prime) as well as a project WBS as follows:

• Implementation approach
• Project summary WBS (tied to the higher-level project or program)
• WBS dictionary
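For illustration only (not prescribed by this JPR): a minimal sketch of one way the project summary WBS and its dictionary entries could be represented. All field names and example values are hypothetical.

# Illustrative sketch; field names and example content are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class WBSElement:
    number: str           # WBS number, e.g., "1.2"
    title: str            # short element title
    dictionary_text: str  # WBS dictionary entry: scope of work covered by the element
    responsible_org: str  # organization accountable for the element
    children: List["WBSElement"] = field(default_factory=list)

summary_wbs = WBSElement(
    number="1.0", title="Example Project",
    dictionary_text="All work required to deliver the example flight system.",
    responsible_org="Project Office",
    children=[
        WBSElement("1.1", "Systems Engineering",
                   "SE technical and technical management processes.", "EA"),
        WBSElement("1.2", "Flight Hardware",
                   "Design, fabrication, and test of flight H/W.", "EA"),
    ],
)
print(summary_wbs.children[0].dictionary_text)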

A.14 Acquisition Summary
Provide summary information on procurement items, such as element (engineering design study, H/W and S/W development, mission and data operations support); type of procurement (competitive, Announcement of Opportunity (AO) for instruments); type of contract (cost-reimbursable, fixed-price); source (institutional, contractor, other government organizations); procuring activity; and surveillance.

A.15 Program/Project Dependencies
Other NASA, U.S. agency, and international activities, studies, and agreements are summarized with emphasis on their effect on the program as follows:

• Related activities and studies; e.g., space communications, launch services, crosscutting technology

• Related non-NASA activities and studies

A.16 Agreements
List all agreements necessary for project success and the projected dates of approval. Include all agreements concluded with the authority of the project manager and reference agreements concluded with the authority of the program manager and above, as listed below:

• NASA agreements; e.g., space communications, launch services

• Non-NASA agreements
  − Domestic
  − International

A.17 Safety and Mission Success
Safety and mission success planning is developed either as a section of this project plan or as a separate document. Address the activities and steps to be taken to ensure the safety of the public, the NASA astronauts and pilots, the NASA workforce, and NASA's high-value equipment and property. Address both H/W and S/W aspects of the project, and identify all activities (e.g., safety, reliability, maintainability, quality assurance, and environmental-related design and test, including orbital debris mitigation, project surveillance, and failure reporting/resolution) that are used to ensure the success and safety of the mission.

A.18 Risk Management
Summarize the risk management approach to be used for the project, including appropriate project de-scope plans. Also identify primary risks consistent with NPR 7120.5B, paragraph 4.3.2d. A risk management plan is also developed and includes the content shown in NPR 8000.4, Risk Management Procedures and Guidelines.

A.19 Environmental Impact
Identify the documentation and schedule of events associated with environmental compliance considerations (NEPA and other requirements). This may include an environmental assessment (EA) or an environmental impact statement (see NPR 7120.5B, paragraph 4.6.5).

A.20 Test and Verification
Describe the project approach to test and verification for assurance of project success. This should address requirements for H/W and S/W verification and validation, as well as S/W independent verification and validation per NPD 8730.4, NASA Software Independent Verification and Validation (IV&V) Policy.

A.21 Technology Assessment
Identify the NASA crosscutting or other technology thrusts to be used by the project. Identify the technologies the project expects to mature during the life of the program. Briefly describe how these technologies will be developed and infused. Describe how and when the project will evaluate the feasibility, readiness, cost, risk, and benefits of the new technologies.

A.22 Commercialization
Identify near-term opportunities for commercialization. Describe the methods to be used to identify additional opportunities throughout the project life cycle.


A.23 Reviews
Provide the names, purposes, content, and timing of all reviews. Explain the reporting requirements for program and project reviews.

A.24 Termination Review Criteria
Provide the technical, scientific, schedule, cost, and other threshold criteria that will be used to initiate a termination review.

A.25 Tailoring
Identify those requirements for which the approach to compliance has been tailored consistent with project characteristics such as scope, complexity, visibility, cost, safety, and acceptable risk. Provide the rationale for such tailoring.

A.26 Change Log
Changes to the project plan should be documented in a change log.


Appendix B – Systems Engineering Management Plan Outline

The Systems Engineering Management Plan (SEMP) is the top-level, integrated planning document for managing the technical effort. The SEMP defines how the project will be organized, structured, and conducted, and how the total engineering process will be controlled to provide a product that satisfies customer requirements. The SEMP should incorporate what the project needs to do to accomplish the technical efforts, describing how the efforts will be carried out, who will accomplish them, how they will be controlled, and how technology will be transitioned from the technology base to system-of-interest products.

This appendix provides a preferred outline for an SEMP. This outline is presented as an aid to the user of this Johnson Space Center (JSC) Procedures and Guidelines (JPG). The user may tailor the actual content of the SEMP.

B.1 Description of the System of Interest
Describe the:

a. System of interest structure, to include the products (subsystems or system elements) that will make up the system of interest and also the enabling systems that will be related to the system of interest over its life.

b. Key technical objectives and expected deliverables applicable to the systems engineering (SE) effort, based on the entry and exit criteria of the applicable system life cycle stage.

B.2 Description of the Technical Processes
For each system of interest, describe implementation of the following technical processes from Section 4 of this JPG:

• Requirements development
• Requirements management
• Operations concept development
• Decomposition
• Feasibility study
• Technology planning
• Design
• Attainment
• Integration
• Systems analysis
• Verification
• Validation

For each process describe:

• What work will be done in executing the process.

• Who has overall responsibility for the success of the process, and who provides support in the execution of the process.

• Who will do the work, when the work will be done, and where the work will be done.

• The specific inputs required to conduct the process, including dependencies among work efforts such as what information will be needed to perform the work, where the work will come from, and when the work will be needed.

• The actual steps to be followed and their sequence of occurrence. Indicate any deviations from the recommended approach.

• The expected products and outcomes to be produced in the execution of the process, including tracking with respect to who will use and/or require the results.

• Any measurements to be used in tracking and managing the execution of the process.

• How exit criteria will be satisfied.
• Any specific methods, techniques, and tools to be used to conduct the process.

B.3 Software Development (can refer to a standalone plan if one is applicable)
Software development entails describing:

a. The processes and tasks to be completed for developing software (S/W) products.
b. Methods and tools that are used to accomplish S/W development processes and/or tasks.
c. Any special organizational approaches to ensure that efficient use of embedded or standalone computer and S/W products will be accomplished in the SE efforts related to the system-of-interest structure.

B.4 Project Technical Management Processes
The project technical management process involves the following:

a. Describe how each of the following technical management processes from Section 4 of this JPG is used to support the technical work defined in Section B.2 above:
   • Acquisition management
   • Technical work and resource management
   • Risk management
   • Configuration management (CM)
   • Safety and mission success
   • Control


   • Quality management
   • Reviews

b. Describe the hooks/relationships to NASA NPR 7120.5 and other such management policy guidance documents.

c. Provide reference, as appropriate, to each standalone plan that provides more complete descriptions of processes such as cost and schedule control, risk management, logistics support, CM, interface management, quality assurance (QA), requirements management, change management, and information management.

B.5 Organization (Team) Structure
For the organization structure:

a. Describe the special skills and expertise required to provide members to teams involved with engineering the system of interest.

b. Describe any gaps in skills and/or knowledge of the existing workforce, and identify and describe training that will be accomplished to provide needed skills and/or knowledge.

c. Explain how personnel will be assigned to teams to ensure that integrated, multidisciplinary teamwork will be accomplished to define, design, and realize required products.

B.6 Other Systems Engineering Concerns
Other SE concerns include:

• Budget
• Schedule
• Planned technology transitions
• Long lead items
• Other considerations for support required from the project or customer


Appendix C – Trace of Project Management Processes to Life Cycle



Appendix D – Project Tailoring Guidelines

The basic requirements expected from a Johnson Space Center (JSC) project are detailed in this document. However, because of the wide variety and size of projects, it may be necessary to allow some form of tailoring. This is acceptable as an effort to optimize the project processes and requirements and to encourage development of innovative practices. The project should assess a wide variety of factors to determine both the appropriate and the necessary level of tailoring required. These factors include:

• Type of project
• Approach of project
• Project funding (including funding profile)
• Project timeline
• Level of risk acceptable to the customer
• Level of insight/control acceptable to the customer
• Criticality of the project (if any)
• Experience level of the project team members and support team members
• Lessons learned from other tailoring efforts by other projects
• Acquisition approach

Every project is strongly encouraged to hold tailoring discussions, which include the customer, to identify areas that are possible candidates for tailoring. These discussions and any final decision should be documented as part of the project documentation. As a final step, any project tailoring that is to be implemented on the project is recorded in the project management plan prior to formal directorate and JSC Project Management Council approval (see Section 1.2).


Appendix E – Terms and Definitions


Term(s) Definition Reference (if applicable)

Acquisition Recommendations Recommendations to the project acquisition strategies that include:
• Inputs based on technical judgments on items such as technology maturity, civil service workforce knowledge and skill base, cost effectiveness, work complexity, and existence of facilities or other infrastructure to perform the work
• Lists of products to be acquired and the applicable acquisition type based on a Make/Buy Analysis
• Evaluation of suppliers based on their ability to meet specified requirements and established criteria

Acquisition Requirements The acquisition team generates a requirement set for each acquisition, including, as applicable, statements of work (SOWs), solicitations (request for proposal (RFP), request for quotes (RFQ)), detailed specifications, documentation deliverables, cost, schedule/long lead, and any other applicable documents. Technical inputs related to the acquisition requirements are provided by the systems engineering (SE) function.

Attainment The transformation of a solution into a product.

Agreement Typically used to document requirements (deliverables and quantities, budget plans, and schedules) and responsibilities; examples include an internal task agreement (ITA), a memorandum of agreement (MOA), and an interdivisional agreement (IA).

Authorization to Proceed (ATP) A formal approval provided by the Johnson Space Center (JSC) Project Management Council (PMC) to implement a JSC project. This approval marks a commitment by the Center to execute the project within the plans and resource envelopes authorized at the time this approval is given.

Baseline • The technical performance and content, technology application, schedule milestones, and budget (including contingency and APA) that are documented in approved program and project plans.

• Official version of a configuration-controlled item that can only be changed through a formal review, evaluation, or approval procedure (Configuration Management (CM) Process section)

NPR 7120.5B

Certification Flight system certification documents the results of successful hardware (H/W) qualification testing of the qualification unit and H/W and/or software (S/W) acceptance testing of the flight unit. Certification documentation includes the following: identification (part number, part name), baseline requirements and associated verifications, safety data package, baseline test and analysis, documentation (qualification and acceptance plans, procedures, reports), limited-life item list, approved waivers or deviations and material usage agreements (MUAs), etc.

Control Gate A management decision process, by the designated review authority, that marks progress in project maturity and risk reduction. A control gate is a process rather than a single formal meeting. It is a tool to achieve consensus about the status of the project. It is a gate in the sense that the effort must successfully complete the associated prerequisite review(s) to move to the next step or life cycle phase in the project.

Configuration Item An aggregation of work products that is designated for CM and treated as a single entity in the CM process.

Configuration Management A management discipline applied over the product life cycle to provide visibility, and to control performance and functional and physical characteristics.

NPR 7120.5B

Convening Authority The individual designated by the project management plan (PMP) as being responsible for initiating a milestone review. The convening authority establishes review purpose, scope, and entry and exit criteria as well as assigns a review board chairperson.


Customer A “customer” is the party (individual, project, or organization) responsible for accepting the product or for authorizing payment. The customer is external to the project, but not necessarily external to the organization. The customer may be a higher-level project. Customers are a subset of stakeholders.

CMMI v1.1

Any individual, organization, or other entity to which a program or project provides a product(s) and/or service(s).

NPR 7120.5B

Decomposition The division and allocation of high-level systems, requirements, or customer needs into lower-level components or parts.

Enabling System A system that complements the system of interest but does not contribute directly to its functionality. Each system of interest has a set of enabling systems that allows the system of interest to perform its life cycle functions.

NPR 71xx.x (document number not yet assigned)

Entrance Criteria A collection of documents, decisions, and previously scheduled events that must be accomplished or available at the start of stage processes.

Exit Criteria A collection of documents, decisions, and previously scheduled events that must be accomplished at the end of the stage before proceeding to a subsequent stage.

Life Cycle Cost (LCC) The total of direct, indirect, recurring, nonrecurring, and other related expenses incurred or estimated to be incurred in the design, development, verification, production, operation, maintenance, support, and retirement of a system over its planned lifespan.

NPR 7120.5B

Logistics Support Plans Plans that describe what logistics activities are planned and how they will be conducted and integrated into the SE process. Logistics activities include maintenance, personnel and training, supply support, test and support equipment, transportation and handling, facilities, and disposal.

SP 6105

Metric A measurement taken over a period of time that communicates vital information about a process or an activity. A metric should drive appropriate action.

NPR 7120.5B

Mission A major activity required to accomplish an Agency goal or to effectively pursue a scientific, a technological, or an engineering opportunity directly related to an Agency goal. Mission needs are independent of any particular system or technological solution.

NPR 7120.5B

Operational Concepts Operational scenarios are a step-by-step description of how the proposed system should operate and interact with its users and its external interfaces (e.g., other systems).

Participant An active member of the project or program.

Program A major activity within an Enterprise that has defined goals, objectives, requirements, and funding levels, and consists of one or more projects.

NPR 7120.5B

Project An organized activity with a finite time span that has defined goals, objectives, requirements, and LCC, and that consumes resources and yields either:
a. A new or revised product or service that meets the Agency's strategic needs, or
b. A new capability applicable to current or future products or processes that will support future Agency needs.

Draft NPR 7120.5C

Requirement A statement that identifies a product or process operational, functional, or design characteristic or constraint that is unambiguous, testable or measurable, and necessary for product or process acceptability (by consumers or internal quality assurance (QA) guidelines).

IEEE 1220-1998

Risk The combination of the likelihood of various outcomes and their distinct consequences.

SP 6105


Risk Management An organized, systematic decision-making process that efficiently identifies, analyzes, plans, tracks, controls, communicates, and documents risk to increase the likelihood of achieving program/project goals.

NPR 7120.5B

Specialty Engineering The application of specific knowledge and analytic methods in support of SE processes. Specialty engineering disciplines typically include safety, reliability, maintainability, quality assurance, S/W assurance, environment, fabrication, test and verification, human factors, training, operations, information systems, logistics, and maintenance.

Stakeholder A “stakeholder” is a group or an individual that is affected by or in some way accountable for the outcome of an undertaking. Stakeholders may include project members, suppliers, customers, end users, and others.

CMMI v1.1

An individual or organization having an interest (or stake) in the outcome or deliverable of a program or project.

NPR 7120.5B

System The combination of elements that functions together to produce the capability required to meet a need. Elements include all H/W, S/W, equipment, facilities, personnel, and processes and procedures needed for this purpose.

NPR 7120.5B

System of Interest The entity that the engineering team is responsible for designing, developing, integrating, and testing.

Systems Engineering SE is defined as a disciplined approach for the definition, implementation, integration, and operation of a system (product or service). The emphasis is on achieving stakeholder functional, physical, and operational performance requirements in the intended use environments over the system's planned life and within cost and schedule constraints. SE includes the engineering processes and technical management processes that consider the interface relationships across all elements of the system, other systems, or as a part of a larger system.

NPR 71xx.x (document number not yet assigned)

Systems Engineering Management Plan (SEMP) A documented plan that addresses all relevant technical and engineering management planning items necessary to achieve the mutual understanding, commitment, and performance of individuals, groups, and organizations that must execute or support the technical effort. The SEMP defines all aspects of the technical effort, tying together in a logical manner: project life cycle considerations; technical and management tasks; budgets and schedules; milestones; data management; risk identification; resource and skill requirements; and stakeholder identification and interaction. Infrastructure descriptions include responsibility and authority relationships for the technical staff, management, and support organizations. Depending on the nature of the project, SEMP content may be contained in a standalone document or embedded in the PMP.

CMMI Project Planning Process

Technical Performance Measure (TPM) Key indicators of performance that, if not met, put the project at cost, schedule, and performance risk. The most useful TPMs are those that provide visibility into the technical performance of key elements of the work breakdown structure (WBS), especially those that are cost drivers on the program, lie on the critical path, or represent high-risk items. TPMs are key to progressively assessing technical progress.

IEEE 1220-1998

Validation Proof that the product accomplishes the intended purpose. May be determined by a combination of test, analysis, and demonstration.

NPR 7120.5B

Verification Proof of compliance with specifications. May be determined by a combination of test, analysis, demonstration, and inspection.

NPR 7120.5B


Appendix F – Acronyms


AC actual cost
ACE advocacy cost estimate
ACWP actual cost of work performed
ADP acceptance data package
AIAA American Institute of Aeronautics and Astronautics
AMACON American Management Association
ANSI American National Standards Institute
AO Announcement of Opportunity
APA allowance for program adjustment
ATP authorization to proceed
B-C-F baseline-current-future
BAC budget at completion
BCWP budgeted cost of work performed
BCWS budgeted cost of work scheduled
BOE basis of estimate
C/SCSC Cost/Schedule Control Systems Criteria
CARD cost analysis requirements description
CCB configuration control board
CD Center Director
CDDF Center Director Discretionary Fund
CDR Critical Design Review
CFI comparative fit index
CFO Chief Financial Officer
CI configuration item
CIL critical items list
CM configuration management
CMMI Capability Maturity Model Integration
CofF construction of facilities
COTR contracting officer technical representative
COTS commercial off-the-shelf
CPI cost performance index
CPM critical path method
CPU central processing unit
CR change request; Concept Review
CV cost variance
CWI Common Work Instruction
DCMA Defense Contract Management Agency
DDMS design and data management system
DDTE design, development, test, and evaluation
DLE discipline lead engineer
DLO directorate-level organization
DO delivery order
DoD Department of Defense
DPM deputy project manager
DR Definition Review
DRFP draft request for proposal
DRLI data requirements list items
DTO detailed test objective
EA environmental assessment
EAA Enterprise Associate Administrator
EAC estimate at completion
EEE electrical/electronic equipment
EIA Electronics Industries Alliance
EMC/EMI electromagnetic compatibility/electromagnetic interference
EPC engineering-procurement-construction
ERB Engineering Review Board
ESTL electronic systems test laboratory
ETC estimation to complete
EV earned value
EVM earned value management
EVMS earned value management system
FAR Federal Acquisition Regulation
FFBD functional flow block diagram
FMEA failure mode and effects analysis
FRB Facility Review Board
FRR Flight Readiness Review
FTA fault-tree analysis
FTE full-time equivalent
G&A general and administrative
GCAR government certification approval request
GERT graphical evaluation review technique
GFE government-furnished equipment
GPMC Governing Project Management Council
GSE group support equipment
H/W hardware
HSI hardware/software integration
IA independent assessment; interdivisional agreement
ICD interface control document
ICE independent cost estimate


IDEF0 integration definition for function modeling
IFWG interface working group
INCOSE International Council on Systems Engineering
IRD Information Resources Directorate
IRMA Integrated Risk Management Application
ISO International Organization for Standardization
IT information technology
ITA internal task agreement
IV&V independent verification and validation
J-SEWG JSC Systems Engineering Working Group
JPD JSC Policy Directive
JPG JSC Procedures and Guidelines
JSC Johnson Space Center
KSC Kennedy Space Center
LAN local area network
LCC life cycle cost
LOE level of effort
LSE lead systems engineer
MCC Mission Control Center
MDR Mission Definition Review
MOA memorandum of agreement
MUA material usage agreement
NAR non-advocate review
NGST Next-Generation Space Telescope
NIST National Institute of Standards and Technology
NMI NASA Management Instruction
NOA new obligation authority
NPD NASA Policies and Directives
NPR NASA Procedures and Requirements
NSRS NASA Safety Reporting System
ODC other direct cost
ORR Operational Readiness Review
OSB outside of board
OSHA Occupational Safety and Health Administration
P/FR problem/failure report
P3I preplanned product improvement
PC personal computer
PCA program commitment agreement
PDR Preliminary Design Review
PERT program evaluation review technique
PI principal investigator
PMB performance measurement baseline
PMC Project Management Council
PMP project management plan
PMWG Project Management Working Group
POP program operating plan
PQA procurement quality assurance
PR Purchase Request
PRA probabilistic risk assessment
ProRR Production Readiness Review
PSA probabilistic safety assessment
PSRP Payload Safety Review Panel
QA quality assurance
QFD quality functional deployment
QRA quantitative risk assessment
R-BAM risk-based acquisition management
R&D research and development
R&M reliability and maintainability
RAS requirements allocation sheet
RFP request for proposal
RFQ request for quotes
RID review item disposition
RP reference publication
RR Requirements Review
RTOP Research and Technology Objectives and Plans
S&MA safety and mission assurance
S/W software
SAR System Acceptance Review
SBIR Small Business Innovation Research
SDR System Definition Review
SE systems engineering
SEMP Systems Engineering Management Plan
SEPG Software Engineering Process Group
SI international system of units
SLE subsystem lead engineer
SLP system-level procedure
SMART Safety and Mission Assurance Review Team; specific, measurable, achievable, resource constrained, and time constrained


SME subject matter expert
SMO Systems Management Office
SOW statement of work
SPI schedule performance index
SRP Safety Review Panel
SRR System Requirements Review
SV schedule variance
TBD to be determined
TBS to be supplied
TCPI to complete performance index
TDRSS tracking and data relay satellite system
TIM technical interchange meeting
TLS timeline analysis sheet
TPM technical performance measure
TRL technology readiness level
TRR Test Readiness Review
TS technical solution
VAC variance at completion
VAR variance analysis report
VASIMR variable specific impulse magnetoplasma rocket
WAD work authorization document
WBS work breakdown structure


Appendix G – Photograph Captions


Chapter 1:
P1-2 STS110-730-079 (17 April 2002)
This full view of the International Space Station (ISS) was recorded by the STS-110 crewmembers on board the Space Shuttle Atlantis following the undocking of the two spacecraft some 247 statute miles above the North Atlantic.
P1-7 JSC2001-E-18949 (December 2001)
The Space Shuttle Atlantis launched this railcar, called the Mobile Transporter (MT), and an initial 43-foot section of track as it delivered the first segment of the ISS exterior truss. Designated “S0 (S-zero),” this first section of truss was carried aloft during the mission of STS-110.

Chapter 2:
P2-6 STS110-S-039 (19 April 2002)
The Space Shuttle Atlantis heads for touchdown on the runway at the KSC landing facility to complete a nearly 11-day journey.
P2-10 STS105-E-5226 (16 August 2001)
Now a member of the STS-105 crew, departing Expedition 2 Flight Engineer Susan J. Helms works out on the ergometer device on the middeck of the Space Shuttle Discovery.
P2-11 top ISS005-E-16521 (9 October 2002)
Backdropped against a blue and white Earth, this view of the Space Shuttle Atlantis was photographed by an Expedition 5 crewmember on board the ISS during rendezvous and docking operations.
P2-11 bottom ISS005-E-10926 (23 August 2002)
This image shows some of the devastating late summer 2002 European flooding. The image, captured on board the ISS during Expedition 5, shows flooding around the Danube Bend area just north of Budapest near the city of Vác, Hungary.
P2-20 top right ISS004-E-8652 (14 March 2002)
Astronaut Daniel W. Bursch, Expedition 4 flight engineer, works the controls of the Canadarm2, or Space Station Remote Manipulator System (SSRMS), in the Destiny laboratory on the ISS.
P2-20 bottom right ISS004-E-10288 (21 April 2002)
This view featuring the San Francisco Bay Area was photographed by an Expedition 4 crewmember on board the ISS.
P2-20 left ISS004-E-11792 (26 April 2002)
Astronaut Carl E. Walz, Expedition 4 flight engineer, works on the Elektron Oxygen Generator in the Zvezda Service Module on the ISS.
P2-22 ISS004-E-11958 (16 May 2002)
This image features fire scars and smoke plumes resulting from biomass burning in the savannahs of the southern Democratic Republic of the Congo. Expedition 4 crewmembers observed the seasonal increase in savannah burning.

Chapter 3:
P3-1 ISS007-E-17880 (20 October 2003)
European Space Agency (ESA) astronaut Pedro Duque of Spain prepares to set up the Cervantes program of tests by starting with the Microgravity Science Glovebox (MSG) in the Destiny laboratory on the ISS. Duque is working on an experiment that will investigate the growth processes of proteins during weightless conditions.
P3-3 ISS008-E-05181 (31 October 2003)
Astronaut C. Michael Foale, Expedition Eight mission commander and NASA ISS science officer, works with the Russian biomedical “Pilot” experiment in the Zvezda Service Module on the ISS. The experiment looks at psychological and physiological changes in crew performance during long-duration space flight.
P3-6 ISS008-E-12281 (9 January 2004)
Cosmonaut Alexander Y. Kaleri, Expedition 8 flight engineer, works at the Vozdukh CO2 scrubber in the Zvezda Service Module on the ISS.
P3-10 STS109-S-020 (1 March 2002)
The Space Shuttle Columbia passes through some pre-dawn clouds as it soars into the sky to begin its 27th flight, STS-109.
P3-13 STS112-326-015 (12 October 2002)
Astronaut Piers J. Sellers, STS-112 mission specialist, uses a handrail on the Destiny Laboratory and a foot restraint on the SSRMS or Canadarm2 to remain stationary while performing work at the end of the STS-112 mission's second spacewalk.
P3-14 left STS112-E-05901 (16 October 2002)
Astronaut Sandra H. Magnus, STS-112 mission specialist, works out on a bicycle ergometer on the middeck of the Space Shuttle Atlantis.


P3-14 right STS112-304-005 (12 October 2002)
This scene, showing a portion of the forward section of the Space Shuttle Atlantis, was photographed by a spacewalking astronaut. Astronauts Jeffrey S. Ashby, STS-112 mission commander; Pamela A. Melroy, pilot; and cosmonaut Fyodor N. Yurchikhin, mission specialist, can be seen through an overhead aft flight deck window.
P3-15 STS112-702-002 (7–18 October 2002)
Egypt's triangular Sinai Peninsula lies in the center of this view, photographed from the Space Shuttle Atlantis, with the dark greens of the Nile delta lower right.
P3-24 STS113-E-05201 (28 November 2002)
Astronaut Michael E. Lopez-Alegria, STS-113 mission specialist, works on the newly installed Port One (P1) truss on the ISS during the mission's second scheduled spacewalk. The spacewalk lasted 6 hours, 10 minutes.

Chapter 4:
P4-8 top ISS008-E-13212 (26 January 2004)
This image of the El Paso-Juarez area on the U.S.-Mexico border, photographed by an Expedition 8 crewmember, is the 100,000th photograph of Earth that astronauts have taken from the ISS.
P4-8 bottom STS112-709-073K (10 October 2002)
Astronaut David A. Wolf, STS-112 mission specialist, anchored to a foot restraint on the SSRMS or Canadarm2, carries the Starboard One (S1) outboard nadir external camera. The camera was installed on the end of the S1 Truss on the ISS during the mission's first scheduled spacewalk.
P4-11 STS111-307-017 (11 June 2002)
Astronaut Philippe Perrin, STS-111 mission specialist representing CNES, the French Space Agency, participates in the second scheduled spacewalk for the mission. During the spacewalk, Perrin and Chang-Diaz attached power, data and video cables from the ISS to the Mobile Base System (MBS) and used a power wrench to complete attachment of the MBS onto the MT.
P4-18 STS113-S-007 (23 November 2002)
Against a black night sky, Space Shuttle Endeavour heads toward Earth orbit and a scheduled link-up with the ISS.
P4-22 top right STS113-360-030 (26 November 2002)
Astronaut Michael E. Lopez-Alegria, STS-113 mission specialist, attired in his Extravehicular Mobility Unit (EMU) spacesuit, is pictured in the Quest Airlock on the ISS. Lopez-Alegria was about to begin the first of three scheduled spacewalks to perform work on the Station.
P4-22 bottom STS113-370-012 (2 December 2002)
The horizon of a blue and white Earth and the blackness of space form the backdrop as two miniature satellites are released from the Space Shuttle Endeavour as part of an experiment referred to as MEPSI. Funded by the Defense Advanced Research Projects Agency (DARPA), the two small satellites, which are tethered together, were released from Endeavour's payload bay (visible in foreground) to fly free for three days as a technology demonstration of the launcher and use of micro- and nano-technologies in space systems.
P4-30 STS107-400-004 (16 January – 1 February 2003)
This Earth view featuring the Sinai Peninsula, Red Sea, Egypt, Nile River, and Mediterranean was photographed by an STS-107 crewmember on board the Space Shuttle Columbia.
P4-32 fragment, top right STS113-332-035 (14 December 2002)
The STS-113 crewmembers used a 35mm still camera to record this image of Mt. Etna Volcano erupting on the island of Sicily. The oblique, south-looking view shows Mt. Etna's dark ash plume.
P4-34 top right STS113-305-007 (26 November 2002)
Astronaut John B. Herrington, STS-113 mission specialist, participates in the mission's first spacewalk. The opened hatch of the Quest Airlock is reflected in Herrington's helmet visor.
P4-34 bottom STS113-336-015 (2 December 2002)
Backdropped by a blue and white Earth, this full view of the ISS was photographed by a crewmember on board the Space Shuttle Endeavour following the undocking of the two spacecraft.
P4-41 left STS111-321-024 (5–19 June 2002)
This sunset over the Sahara Desert was photographed by the STS-111 crewmembers aboard the Space Shuttle Endeavour. When this photograph was taken, the Shuttle was in a position over the Sudan near the Red Sea coast.
P4-41 right STS111-367-014 (5–19 June 2002)
This view featuring Canadian forest fires was taken by STS-111 crewmembers aboard Space Shuttle Endeavour. It represents an oblique view northward of one of the numerous fires observed and reported burning in the dry boreal forests of Saskatchewan and Manitoba during the month of June.


P4-46 top ISS007-E-09860 (21 July 2003)
This view of Earth's horizon as the sun sets over the Pacific Ocean was taken by an Expedition 7 crewmember on board the ISS. Anvil tops of thunderclouds are also visible.
P4-46 bottom ISS007-E-10807 (8 July 2003)
Astronaut Edward T. Lu, Expedition 7 NASA ISS science officer and flight engineer, wearing squat harness pads, performs knee-bends using the Interim Resistive Exercise Device (IRED) equipment in the Unity node on the ISS.
P4-52 top JSC2003-E-31962 (24 April 2003)
The Soyuz rocket is erected at the launch pad at the Baikonur Cosmodrome, Kazakhstan. Expedition 7 is scheduled to launch on board the Soyuz on Saturday, April 26, 2003.
P4-52 bottom ISS008-E-12109 (6 January 2004)
Five-year-old icebergs near South Georgia Island are featured in this image photographed by an Expedition 8 crewmember on board the ISS. This oblique image shows two pieces of a massive iceberg that broke off from the Ronne Ice Shelf in Antarctica in October 1998.
P4-55 STS110-353-023 (8–19 April 2002)
Docked to the ISS, a Soyuz vehicle (foreground) and the Space Shuttle Atlantis were photographed by an STS-110 crewmember in the Pirs docking compartment on the orbital outpost.
P4-64 STS113-S-005 (23 November 2002)
Against a black night sky, the Space Shuttle Endeavour heads toward Earth orbit and a scheduled link-up with the ISS.
P4-68 STS111-373-001 (15 June 2002)
Backdropped by the blackness of space and a blue and white Earth, the ISS is now separated from the Space Shuttle Endeavour following the undocking of the two spacecraft over western Kazakhstan.
P4-72 JSC2003-E-59333 (20 October 2003)
This overall view of the Station flight control room (BFCR) in JSC's Mission Control Center (MCC) was photographed during rendezvous and docking operations between the Soyuz TMA-3 spacecraft and the ISS.
P4-77 STS108-311-010 (10 December 2001)
Astronauts Linda M. Godwin (red stripes) and Daniel M. Tani, STS-108 mission specialists, are pictured near the end of the Space Shuttle Endeavour remote manipulator system (RMS) robotic arm during the four-hour, 12-minute spacewalk.
P4-78 STS108-350-009 (10 December 2001)
Astronaut Linda M. Godwin, STS-108 mission specialist, works during a four-hour, 12-minute spacewalk. The main objective of the spacewalk was to install thermal blankets on mechanisms that rotate the ISS main solar arrays.
P4-81 left STS107-E-05359 (22 January 2003)
SPACEHAB Research Double Module as seen from Columbia's aft flight deck during STS-107.
P4-81 right STS107-E-05537 (25 January 2003)
One of the STS-107 crewmembers used a digital still camera to capture this image of clouds, shadows and sunglint during a moment away from Flight Day 10 science research aboard the Space Shuttle Columbia.
P4-84 STS104-315-013 (12–24 July 2001)
Holding onto the end effector of the Canadarm on the Space Shuttle Atlantis, astronaut Michael L. Gernhardt, STS-104 mission specialist, participates in one of three STS-104 spacewalks. The spacewalk was designed to help wrap up work on the second phase of the ISS.
P4-90 top ISS006-E-05070 (4 December 2002)
The new crewmembers aboard the ISS were able to document a rare occurrence early into their tour on the outpost. The dark area near Earth's horizon at center frame is actually a shadow cast by the Moon during the total solar eclipse of Dec. 4, 2002.
P4-90 bottom ISS006-E-07133 (9 December 2002)
Astronaut Donald R. Pettit, Expedition 6 NASA ISS science officer, works to set up Pulmonary Function in Flight (PuFF) hardware in preparation for a Human Research Facility (HRF) experiment in the Destiny laboratory on the ISS.
P4-93 STS112-705-011 (7–18 October 2002)
The light-blue region in the middle of this view, photographed from the Space Shuttle Atlantis, is the shallow flat platform known as the Great Bahama Bank.
P4-96 STS111-373-018 (15 June 2002)
Silhouetted over Earth, this full view of the ISS was photographed by a crewmember on board the Space Shuttle Endeavour following the undocking of the two spacecraft over western Kazakhstan.


P4-100 STS110-716-026 (13 April 2002)
Some 240 miles above the blue and white Earth, astronaut Lee M.E. Morin totes one of the S0 (S-zero) keel pins that was removed from its functional position on the truss and attached on the truss exterior for long-term stowage.
P4-104 ISS007-E-09855 (8 July 2003)
Astronaut Edward T. Lu, Expedition 7 NASA ISS science officer and flight engineer, exercises on the Cycle Ergometer with Vibration Isolation System (CEVIS) in the Destiny laboratory on the ISS.
P4-109 top STS109-326-008 (5 March 2002)
Astronaut Michael J. Massimino, mission specialist, works at the stowage area for the Hubble Space Telescope port side solar array. Astronauts Massimino and James H. Newman removed the old port solar array and stowed it in Columbia's payload bay for a return to Earth. They then went on to install a third-generation solar array and its associated electrical components.
P4-109 bottom STS109-329-021 (1–12 March 2002)
The horizon of a blue and white Earth and the blackness of space form the backdrop for this view of the cargo bay of the Space Shuttle Columbia, as seen through windows on the aft flight deck during the STS-109 mission. Pictured in the cargo bay is the Rigid Array Carrier (RAC) holding the new Hubble Solar Arrays. In its stowed position at right center of the frame is the Canadian-built RMS arm.

Appendices:
PA-3 top ISS007-E-10246 (15 July 2003)
The crew of the ISS had a great seat from which to observe tropical storm Claudette as she turned into a hurricane and came ashore with high winds and heavy rains that drenched their Houston home base and other Texas areas.
PA-3 bottom right ISS007-E-14440 (4 September 2003)
Astronaut Edward T. Lu, Expedition 7 NASA ISS science officer and flight engineer, wearing a Russian Sokol suit, floats in the Destiny laboratory on the ISS.
PA-3 bottom left ISS007-E-11507 (31 July 2003)
Cosmonaut Yuri I. Malenchenko, Expedition 7 mission commander, is pictured with the Plasma Crystal Experiment in the Zvezda Service Module transfer compartment on the ISS.

PB-2 STS109-713-014 (8 March 2002)
Astronauts John M. Grunsfeld (right) and Richard M. Linnehan, STS-109 payload commander and mission specialist, respectively, are photographed near the giant Hubble Space Telescope temporarily hosted in the Space Shuttle Columbia's cargo bay at the close of the fifth and final session of spacewalks.
PD-1 STS108-E-5349 (10 December 2001)
Astronaut Linda M. Godwin, STS-108 mission specialist, is pictured near the end of the Space Shuttle Endeavour RMS arm during a four-hour spacewalk.
PF-3 top ISS005-E-19267 (1 November 2002)
A Soyuz spacecraft approaches the Pirs docking compartment on the ISS carrying the Soyuz 5 taxi crew, Commander Sergei Zalyotin, Belgian Flight Engineer Frank DeWinne, and Flight Engineer Yuri V. Lonchakov for an eight-day stay on the Station. The new Soyuz TMA-1 vehicle was designed to accommodate larger or smaller crewmembers, and is equipped with upgraded computers, a new cockpit control panel and improved avionics.
PF-3 bottom ISS005-E-19024 (30 October 2002)
The three-member crew of the Expedition 5 mission on board the ISS was able to observe Mt. Etna's spectacular eruption, and photograph the details of the eruption plume and smoke from fires triggered by lava as it flowed down the 11,000 ft mountain. This image provides a three-dimensional profile of the eruption plume. This eruption was one of Etna's most vigorous in years.
Below ISS008-E-08453 (10 December 2003)
This view of a full Moon was photographed by one of the Expedition 8 crewmembers on board the ISS.


Appendix H – “Spiral” Development Process


[Figure: "Spiral" development process. Each spiral cycle, keyed in the figure as Planning, Develop Prototype, Build, and Evaluate/Determine Alternatives, advances the system from TRL n to TRL n + 1.]
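For illustration only (not part of this JPR): a minimal sketch of the spiral cycle shown above, in which each pass through Planning, Develop Prototype, Build, and Evaluate/Determine Alternatives advances the system of interest from TRL n to TRL n + 1. All function names below are hypothetical placeholders.

# Illustrative sketch; function names are hypothetical placeholders.
def plan(trl):
    print(f"Planning the spiral that starts at TRL {trl}")

def develop_prototype(trl):
    print(f"Developing a prototype at TRL {trl}")
    return f"prototype-TRL{trl}"

def build(prototype):
    print(f"Building {prototype}")
    return f"built-{prototype}"

def evaluate_and_determine_alternatives(product):
    print(f"Evaluating {product} and determining alternatives for the next spiral")

def spiral_development(start_trl, target_trl):
    """Each completed spiral raises the technology readiness level by one."""
    trl = start_trl
    while trl < target_trl:
        plan(trl)
        product = build(develop_prototype(trl))
        evaluate_and_determine_alternatives(product)
        trl += 1   # TRL n advances to TRL n + 1
    return trl

spiral_development(start_trl=4, target_trl=6)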


National Aeronautics and Space Administration

Lyndon B. Johnson Space Center
Houston, Texas