TRANSCRIPT
CMMI(SM)
Carnegie Mellon Software Engineering Institute
CMMI(SM) — How Will It Change the Industry?
Dennis R. Goldenson and D. Mike Phillips
Software Engineering Institute, Carnegie Mellon University
November 2000
Sponsored by the U.S. Department of Defense. © 2000 by Carnegie Mellon University
Page 2
Acknowledgments

Slides and ideas stolen liberally from other members of the CMMI Product Team:
• Thanks are due in particular to Dave Kitson, Joe Jarzombek, Gene Miluk, David White, and Dave Zubrow.
• Special thanks also are due to Khaled El Emam, Terry Rout, Angela Tuffley, and the members of the CMMI Test and Evaluation IPT.
® CMM and Capability Maturity Model are registered in the U.S. Patent and Trademark Office.
SM CMMI and IDEAL are service marks of Carnegie Mellon University.
Page 3
Today’s Talk
CMMI Background
Impact of process improvement: a brief review
The product suite
Test and evaluation in CMMI
Page 4
Today’s Talk
CMMI Background
• Purpose and scope
• Integrating the source models & methods
• Development & review process
• Transition to CMMI
Impact of process improvement: a brief review
The product suite
Test and evaluation in CMMI
Page 5
The Frameworks Quagmire
[Diagram: the tangle of overlapping process standards and models, including SW-CMM, People CMM, SA-CMM, SE-CMM, SSE-CMM, FAA-iCMM, IPD-CMM*, CMMI*, PSP, SECM* (EIA/IS 731), SCE, SDCE, SECAM, AF IPD Guide, DOD IPPD, SDCCR, Trillium, Baldrige, EQA, TickIT, BS5750, Q9000, ISO 9000 series, ISO 10011, ISO/IEC 12207, IEEE/EIA 12207, EIA/IEEE J-STD-016, ISO 15504* (SPICE), ISO 15288*, EIA/IS 632, EIA 632*, IEEE 1220, IEEE 1074, IEEE Stds. 730, 828, 829, 830, 1012, 1016, 1028, 1058, 1063, MIL-Q-9858, MIL-STD-498, MIL-STD-499B*, MIL-STD-1679, DOD-STD-2167A, DOD-STD-2168, DOD-STD-7935A, NATO AQAP 1, 4, 9, and DO-178B. * Not yet released. (quag14d: 5 June 1998)]
Courtesy Sarah Sheard, SPC. Also see www.software.org/quagmire
Page 6
The CMMI Project

Sponsored by the US Department of Defense and the National Defense Industrial Association (NDIA)

Collaborative endeavor
• Industry
• Government
• Software Engineering Institute

Over 100 people involved
• Product (Development) Team
• Steering Group
• Stakeholder reviewers from selected government and industry organizations
Page 7
CMMI Benefits

Efficient, effective assessment and improvement across multiple process disciplines in an organization

Reduced training and assessment costs

A common, integrated vision of improvement for all elements of an organization

A means of representing new discipline-specific information in a standard, proven process improvement context
Page 8
CMMI Source Models

Capability Maturity Model for Software V2, draft C (SW-CMM V2C)

EIA Interim Standard 731, Systems Engineering Capability Model (SECM)

Integrated Product Development Capability Maturity Model, draft V0.98 (IPD-CMM)

… "Reference documents" include ISO/IEC 12207 and ISO/IEC 15504
Page 9
Model Metrics

Release            PAs/FAs   Goals/Themes*   Activities/Practices**
SW-CMM V1.1          18          52               316
SW-CMM V2C           19          62               318
EIA/IS 731           19          77               383
IPD-CMM V0.98        23          60               865
CMMI V0.1 SE/SW      27         149               550
CMMI V0.2 SE/SW      24          80               528
CMMI V1.0 SE/SW      22          70               417

* Ratable components
** Key to implementation effort
Page 10
CMMI Design Goals

Integrate the source models; eliminate inconsistencies; reduce duplication

Reduce the cost of implementing model-based process improvement

Increase clarity and understanding
• Common terminology
• Consistent style
• Uniform construction rules
• Common components

Assure consistency with ISO/IEC 15504

Be sensitive to impact on legacy efforts
Page 11
Development & Review Process
[Organization chart:
• Steering Group (Co-Chairs: A. Dulai / B. Rassa)
• Product Development Team (Project Manager: M. Phillips / D. Ahern; Chief Architect: R. Bate)
• Coordinating IPT (Team Leads), with IPTs for Requirements/T&E, Architecture, Assessment Method, Training, and IPPD
• PA Author Groups, SE/SW/IPPD Expert Teams, New Discipline Teams
• Standards Team, Lotus Notes Team, Pilot Planning
• Editors / CCB (D. Ahern / M. Konrad)
• Stakeholder/Reviewers
IPT = integrated product team]
Page 12
CMMI Schedule

August 12, 2000   Released CMMI-SE/SW V1.0 for initial use
August 16, 2000   Released CMMI-SE/SW/A to stakeholders for review
Oct 2000          Released CMMI-SE/SW/IPPD V1.0 for initial use
Nov 2000          Release CMMI-SE/SW/A for initial piloting
Fall 2001         Publish models V1.1
Fall 2003         Complete sunset period for precursor models
Page 13
CMMI Transition Plan

Development Phase
• Development of CMMI products
• Verification and validation of CMMI products

Transition Phase
• Approval of initial CMMI products for public release
• Evidence of sufficient use
• Transition planning to help organizations use CMMI products

Sustainment Phase
• Upkeep and continuous improvement of the product suite
• Additional evidence of adoption and use
Page 14
Today’s Talk
CMMI Background
Impact of process improvement: a brief review
The product suite
Test and evaluation in CMMI
Page 15
Impact of Process Improvement

Process capability can, and often does, result in improved organizational performance
• Numerous case studies demonstrate improvements in productivity, product quality, and return on investment
  - E.g., improvements in cycle time, defect density, and productivity, and benefit-to-cost ratios in the range of 4.0:1 to 8.8:1 [Herbsleb 94]
• Similar results from studies comparing many projects or organizations
  - Comparing measures of process capability with schedule, budget, product quality, and other performance metrics [e.g., Harter 00, Krishnan 99, Clark 97, Lawlis 95, Goldenson 95]
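To make the cited benefit-to-cost ratios concrete: a ratio of 4.0:1 means every dollar spent on process improvement returned four dollars in benefits. A minimal sketch of the arithmetic follows; the dollar amounts are invented, and only the 4.0:1 to 8.8:1 range comes from [Herbsleb 94]:

```python
# Benefit-to-cost ratio as reported in the case studies: benefits attributed
# to process improvement divided by the cost of the improvement effort.
# The dollar figures below are invented for illustration.

def benefit_to_cost(benefits, cost):
    """Return the benefit-to-cost ratio of an improvement effort."""
    return benefits / cost

# A hypothetical organization spends $250k on improvement and attributes
# $1.5M of savings (reduced rework, shorter cycle time) to it:
ratio = benefit_to_cost(1_500_000, 250_000)
print(f"{ratio:.1f}:1")  # -> 6.0:1, inside the reported 4.0:1 to 8.8:1 range
```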
Page 16
For Example...
[Chart: percent of respondents reporting "good" or "excellent" performance at the Initial, Repeatable, and Defined maturity levels (0-100% scale), for product quality, customer satisfaction, staff productivity, ability to meet schedules, ability to meet budgets, and staff morale.]
Page 17
Select Bibliography -1

Clark, Bradford K. The Effects of Software Process Maturity on Software Development Effort. PhD Dissertation, Computer Science Department, University of Southern California, August 1997.

Dunaway, Donna K.; Goldenson, Dennis R.; Monarch, Ira A.; & White, David M. How Well is CBA IPI Working? User feedback presented at SEPG98, Chicago, Illinois, March 1998.

El-Emam, Khaled & Goldenson, Dennis R. "An Empirical Review of Software Process Assessments." In Advances in Computers, Marvin V. Zelkowitz (ed.), pp. 319-423. San Diego and other cities: Academic Press, 2000.

Goldenson, Dennis R. & Herbsleb, James D. After the Appraisal: A Systematic Survey of Process Improvement, Its Benefits and Factors that Influence Success (CMU/SEI-95-TR-009, ADA300225). Pittsburgh, Pa.: Software Engineering Institute, Carnegie Mellon University, 1995.

Hayes, Will & Zubrow, Dave. Moving On Up: Data and Experience Doing CMM-Based Process Improvement (CMU/SEI-95-TR-008, ADA300121). Pittsburgh, Pa.: Software Engineering Institute, Carnegie Mellon University, 1995.

Herbsleb, James; Carleton, Anita; Rozum, James; Siegel, Jane; & Zubrow, David. Benefits of CMM-Based Software Process Improvement: Initial Results (CMU/SEI-94-TR-013, ADA283848). Pittsburgh, Pa.: Software Engineering Institute, Carnegie Mellon University, 1994.
Page 18
Select Bibliography -2

Krasner, H. "The Payoff for Software Process Improvement: What it is and How to Get it." In Elements of Software Process Assessment and Improvement, Khaled El-Emam and Nazim H. Madhavji (eds.), pp. 151-176. New York: IEEE Press, 1999.

Krishnan, M.S. & Kellner, Marc I. "Measuring Process Consistency: Implications for Reducing Software Defects." IEEE Transactions on Software Engineering, 25(6), pp. 800-815, November/December 1999.

Lawlis, Patricia K.; Flowe, Robert M.; & Thordahl, James B. "A Correlational Study of the CMM and Software Development Performance." Crosstalk: The Journal of Defense Software Engineering, 8(9), pp. 21-25, September 1995.

McGibbon, Thomas. A Business Case for Software Process Improvement Revised: Measuring Return on Investment from Software Engineering and Management. Rome, New York: Data & Analysis Center for Software, September 1999.

Sheard, Sarah A.; Lykins, Howard; & Armstrong, James R. "Overcoming Barriers to Systems Engineering Process Improvement." Systems Engineering, 3(2), pp. 59-67, June 2000.
Page 19
Today’s Talk
CMMI Background
Impact of process improvement: a brief review
The product suite
• Framework & architecture
• Integrated models
• Measurement in CMMI
• Appraisal methods and training
• Relation to ISO/IEC 15504
Test and evaluation in CMMI
Page 20
CMMI Framework

Architecture instantiated in Lotus Notes

Generates CMMI models for specified disciplines
• Common, shared, and discipline-specific process areas
• In either staged or continuous representations

Capable of generating training and appraisal documents
Page 21
CMMI Model Representations

An organization may choose to approach process improvement from either the
• process capability approach
• organizational maturity approach

CMMI models support each approach with a representation:
• process capability approach: continuous representation
• organizational maturity approach: staged representation
Page 22
Comparing the Two Representations
Both representations provide the same essential content but organized in different ways, thus providing different ways of implementing process improvement to achieve business goals.
Page 23
Staged Representation
Provides a pre-defined roadmap for organizational improvement based on grouping & ordering of processes and associated organizational relationships
Page 24
Continuous Representation
Provides maximum flexibility for organizations to choose which processes to emphasize for improvement
Page 25
Staged Representations
Process Areas are grouped into stages (maturity levels) from 2 to 5
Each Process Area contains goals and implementing practices (activities) to achieve the purpose of the process area
For a Process Area at a given stage, institutionalization practices (common features) are integral to the process area
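The staged rating rule described above (a maturity level is attained only when every process area at that level and all lower levels is satisfied) can be sketched in code. The PA groupings follow the CMMI-SE/SW staged model from the slides; the `maturity_level` helper and the satisfaction data are illustrative, not part of any official appraisal method:

```python
# Sketch of staged-representation rating logic: an organization is rated at
# the highest maturity level for which all of that level's process areas,
# and all lower levels' process areas, are satisfied. The satisfaction set
# below is invented for illustration.

STAGES = {
    2: ["REQM", "PP", "PMC", "SAM", "MA", "PPQA", "CM"],
    3: ["REQD", "TS", "PI", "VER", "VAL", "OPF", "OPD", "OT", "IPM", "RSKM", "DAR"],
    4: ["OPP", "QPM"],
    5: ["OID", "CAR"],
}

def maturity_level(satisfied):
    """Return the maturity level implied by the set of satisfied PAs."""
    level = 1  # every organization starts at Initial
    for stage in sorted(STAGES):
        if all(pa in satisfied for pa in STAGES[stage]):
            level = stage
        else:
            break  # a gap at this stage caps the rating
    return level

# Example: all Level 2 and Level 3 PAs satisfied, but not OPP/QPM.
satisfied = set(STAGES[2]) | set(STAGES[3])
print(maturity_level(satisfied))  # -> 3
```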
Page 26
Level: Focus: Process Areas

5 Optimizing: Continuous Process Improvement: Organizational Innovation and Deployment; Causal Analysis and Resolution

4 Quantitatively Managed: Quantitative Management: Organizational Process Performance; Quantitative Project Management

3 Defined: Process Standardization: Requirements Development; Technical Solution; Product Integration; Verification; Validation; Organizational Process Focus; Organizational Process Definition; Organizational Training; Integrated Project Management; Risk Management; Decision Analysis and Resolution

2 Managed: Basic Project Management: Requirements Management; Project Planning; Project Monitoring and Control; Supplier Agreement Management; Measurement and Analysis; Process and Product Quality Assurance; Configuration Management

1 Initial

[Moving up the levels trades risk and rework for quality and productivity.]
Page 27
Structure of the CMMI Staged Representation
[Diagram: a Maturity Level contains Process Areas 1 through n. Each Process Area has Specific Goals, achieved by Specific Practices, and Generic Goals, achieved by Generic Practices organized into Common Features: Commitment to Perform, Ability to Perform, Directing Implementation, and Verification.]
Page 28
Continuous Representations

A Process Area contains specific goals and practices to achieve the purpose of the process area
• Some of these practices may reside at higher Capability Levels (advanced practices)

Generic practices are grouped by goal to define Capability Levels
• The generic practices and goals are considered along with the specific goals and practices of any process area to attain a capability level for that process area

The order in which Process Areas are addressed can follow a recommended staging
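As an illustrative sketch of the continuous bookkeeping, an appraisal result can be held as a capability level (0-5) per process area and rendered as a simple profile. The process-area ratings and the `render_profile` helper below are invented for illustration; they are not an official appraisal output format:

```python
# Sketch of a continuous-representation capability profile: each process
# area is rated on its own 0-5 capability scale (generic goals and practices
# considered together with the PA's specific goals and practices).
# The ratings below are invented for illustration.

profile = {"RSKM": 2, "PP": 3, "PMC": 3, "SAM": 1, "CM": 4}

def render_profile(profile):
    """Return simple text bars, one per process area."""
    lines = []
    for pa, level in sorted(profile.items()):
        bar = "#" * level + "." * (5 - level)
        lines.append(f"{pa:>5} |{bar}| CL{level}")
    return "\n".join(lines)

print(render_profile(profile))
```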
Page 29
Process Capability Levels
5 Optimizing
4 Quantitatively Managed
3 Defined
2 Managed
1 Performed
0 Incomplete
Page 30
Structure of the CMMI Continuous Representation
[Diagram: each Process Area comprises Specific Goals & Practices together with Generic Goals & Generic Practices.]
Page 31
An Example Organizational Process Maturity Profile
[Chart: capability level (0-5) plotted for each process area: RSKM, PP, PMC, etc.]
Page 32
CMMI Process Area Contents
Purpose
Introductory Notes
Goals: Specific and Generic
Generic Practices
Specific Practices
Notes
Work Products
Subpractices
Amplifications
Elaborations
Page 33
Commonalities

Both staged and continuous share …
• Same process areas, goals, practices, and informative materials
• Elaborations of generic practices in process area context
• Amplifications by discipline
Page 34
CMMI Structure

CMMI-SE/SW Staged
• Overview: Introduction; Structure of the Model; Model Terminology; Maturity Levels, Common Features, and Generic Practices; Understanding the Model; Using the Model
• Maturity Level 2: REQM, PP, PMC, SAM, MA, PPQA, CM
• Maturity Level 3: REQD, TS, PI, VER, VAL, OPF, OPD, OT, IPM, RSKM, DAR
• Maturity Level 4: OPP, QPM
• Maturity Level 5: OID, CAR
• Appendixes

CMMI-SE/SW Continuous
• Overview: Introduction; Structure of the Model; Model Terminology; Capability Levels and Generic Model Components; Understanding the Model; Using the Model
• Process Management: OPF, OPD, OT, OPP, OID
• Project Management: PP, PMC, SAM, IPM, RSKM, QPM
• Engineering: REQM, REQD, TS, PI, VER, VAL
• Support: CM, PPQA, MA, CAR, DAR
• (Each category is organized as PAs - Goals - Practices)
• Appendixes
Page 35
CMMI-SE/SW Compared to SW-CMM v1.1

Organizations using SW-CMM v1.1 should be able to transition to CMMI by focusing on the following changes:
• Measurement and Analysis at L2
• Risk Management & Decision Analysis and Resolution at L3
• Expansion of Software Product Engineering
• Refocus of Measurement and Analysis CF to Directing Implementation CF

Most SW-CMM v2 Draft C updates have been incorporated
Page 36
CMMI-SE/SW Compared to SECM

EIA 731 users should be able to transition to the CMMI-SE/SW model by recognizing:
• Continuous representation (+ "equivalent" staged representation)
• Some lower-level differences
• Application of common SE/SW practices to the SE community

Integrated Product and Process Development (IPPD) will be added:
• Integrated Team, and Organizational Environment for Integration
Page 37
Methods and Training

Assessment methods
• Comprehensive SCAMPI (Standard CMMI Assessment Method for Process Improvement)
• ARC (Assessment Requirements for CMMI) defines classes of CMMI assessment method

Training support
• Course development and delivery
• Model and method
Page 38
Classes of Assessment Methods

Class A:
• Comprehensive method
• Thorough model coverage
• Provides maturity level

Class B:
• Less comprehensive, less expensive
• Initial, partial, self-assessment
• Focus on areas needing attention
• No maturity level rating

Class C:
• Quick look
• Checking for specific risk areas
• Inexpensive, little training needed
Page 39
Standard CMMI Assessment Method for Process Improvement (SCAMPI)

Based on CMM®-Based Appraisal for Process Improvement (CBA IPI) and the EIA IS 731 appraisal method

Satisfies all ARC requirements

Must be led by an authorized SCAMPI Lead Assessor

Tailorable to organization and model scope

Artifacts:
• SCAMPI Method Description
• CMMI Appraisal Questionnaire, work aids, templates
Page 40
Value of Alternative Assessment Approaches

CMMI models pose a challenge to current assessment methods given the:
• expanded (and expanding) model scope
• need for adequate rigor
• need for a reasonable-length on-site period

Alternative assessment methods may require changes to the ARC

Linkage as part of an "assessment product suite"
Page 41
Coupling CMMI with the International SPICE Trials

Mapping and translation of CMMI

Pilot assessments in Australia first to produce 15504 process profiles

Integrating future CMMI assessments with 15504 trials
Page 42
Mapping and Translation of CMMI to ISO/IEC 15504

CMMI functional requirement 3.1.6: The CMMI Product Suite shall be consistent and compatible with ISO/IEC 15504
Page 43
Demonstration of Conformance

For CMMI v1.0 …
• To be completed in 4th quarter 2000
• Mapping of CMMI to ISO/IEC 15504 TR, part 2
  - Currently dependent on informative materials
  - Many-to-one mappings from CMMI
• Mapping & translation being prepared collaboratively with Australian colleagues

For CMMI v1.1 …
• Address 15504 mapping gaps identified in V1.0
• Publish demonstration of conformance as an integral part of the product suite
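The many-to-one mappings mentioned above can be pictured as a simple lookup from CMMI process areas to ISO/IEC 15504 processes, inverted for gap analysis. The particular pairings below are illustrative assumptions, not the published mapping:

```python
# Sketch of a many-to-one mapping from CMMI process areas to ISO/IEC 15504
# TR processes, plus its inversion. The pairings are illustrative only --
# the real mapping was produced by the CMMI team with Australian colleagues.

from collections import defaultdict

CMMI_TO_15504 = {  # several CMMI PAs may map to one 15504 process
    "Project Planning": "MAN.2 Project management",
    "Project Monitoring and Control": "MAN.2 Project management",
    "Configuration Management": "SUP.2 Configuration management",
}

def invert(mapping):
    """Group CMMI PAs by their 15504 target (the many-to-one view)."""
    inverted = defaultdict(list)
    for pa, process in mapping.items():
        inverted[process].append(pa)
    return dict(inverted)

for process, pas in invert(CMMI_TO_15504).items():
    print(f"{process}: {', '.join(pas)}")
```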
Page 44
Risks in Establishing Conformance

Mapping relationships is not a simple task
• Ambiguity and lack of clarity remain in both 15504 and CMMI
  - Difficulty in mapping due to uncertainty in the meaning of text
• Compounded by many mappers' lack of familiarity with both 15504 and CMMI
  - Insufficient contextual knowledge of one or the other
• Requires an iterative validation cycle
Page 45
Today’s Talk
CMMI Background
Impact of process improvement: a brief review
The product suite
Test and evaluation in CMMI
• The results so far
• What’s next?
• Coupling CMMI test and evaluation with the International SPICE trials
Page 46
Why Do Test & Evaluation for CMMI?

Objectively evaluate product quality as delivered
• What works well or poorly?

Inform product revision & development choices
• Continuing support for improvement of the CMMI product suite during sustainment

Support transition and adoption of the product suite
• Integrate test & evaluation into CMMI project management
Page 47
Phase 1 Test & Evaluation of the CMMI Product Suite
• Verification: Does the CMMI Product Suite meet specifications?
• Validation: Is the CMMI Product Suite usable, value-added, correct, and robust?
Page 48
What Have We Learned So Far?

Product value
• Considerable value added reported by appraisers, sponsors, and other organizational participants

Areas for improvement
• Recurring problems observed and reported by appraisers and organizational participants
  - Some addressed prior to release of v1.0
  - Others targeted for further improvement

Improvements to the product suite are helping
• As seen in the last of the initial pilots
Page 49
Progress Made So Far?

Improvements made during development and revision of the product suite

Verification that the CMMI product suite satisfactorily meets its functional requirements
• Deployment risk analysis completed
  - Based on product inspections, test & evaluation
  - Risk items traceable to CMMI requirements (A-spec)
  - Rated for priority and severity
• Independent review of risk analysis prior to release of v1.0
Page 50
How Was It Done? Inspection and Review

Numerous change requests from external stakeholders and public review
• Documented in ...
  - PA revision logs
  - IPT working documents
• Used by the PDT for product revision

Feedback from "focus groups" associated with training and other events

Inspection of the product suite by the PDT
• Delivered products
• Framework (implementation code & input content)
Page 51
How Was It Done? Initial Pilot Appraisals

Eight "use cases" prior to release of v1.0
• Covering systems engineering and software disciplines, staged and continuous representations, and varying organizational maturity
• Supported by observer teams using structured observation procedures

Evidence collected includes
• Incident reports and briefings by observer teams
• Incident reports completed by appraisal team members
• End-of-appraisal feedback forms completed by appraisers and organization participants
• Assessment artifacts
Page 52
CMMI Test & Evaluation
[Diagram: a central test & evaluation database collects briefings, time logs, and incident logs from observers; feedback and incident reports from assessors; sponsor feedback and artifacts from sponsors; feedback and incident reports from participants; plus appraisal findings, observation workbooks, and interview and CAQ questions.]
Page 53
What Changes Have Been Made?

High-risk items addressed prior to release of v1.0, e.g.
• Number of model practices reduced by ~15%
  - Combined with changes meant to improve clarity
• Enhanced method guidance, including …
  - Appraisals using the continuous representation
  - Breadth-first appraisal of generic practices
  - Similar change to the CMMI Appraisal Questionnaire
• Automated tool support for consolidation of appraisal observations
• Improvements to the CMMI Appraisal Questionnaire
  - Automated generation of response summaries
  - Guidance on completion, selecting respondents, and handling local organizational terminology
Page 54
Observations from the Pilots -1

Problems understanding intended meaning as written
• PAs
• Generic practices / common features

Apparent redundancy / confusion about linkage
• Among some PAs
• Between some PAs and GPs

Difficulty relating lower capability levels to higher maturity level PAs

Concern that some specific practices may not map well to their respective goals
Page 55
Observations from the Pilots -2
Difficulties encountered …
• Appraising generic practices/common features
• Applying rules of consolidation
• Scripting interview questions, “listen fors,” and“look fors”
• Crafting observations
• Mapping the model to the organization and itspractices
• Integrating systems and software perspectives insome organizations
Page 56
Observations from the Pilots -3
Concerns with model terminology
Inconsistent training delivery, e.g. ...
• Abbreviated time
• Exercises rushed or omitted
• Varying use of reminders and other work aids
Page 57
The Bottom Line About Appraisal Effort
Initial pilots took considerable effort
And were judged worth the effort
Page 58
Proposed Change in Focus for CMMI Test & Evaluation

Phase 1:
• Verification of functional requirements (A-spec) for the CMMI product suite

Phase 2:
• Validation and continuous improvement
  - Model framework, architecture, & content
  - Appraisal method
  - Training
  - Long-term impact & ROI of process improvement
• Measuring performance to ...
  - Aid in refining the CMMI product suite
  - Provide valid, useful results to the community
• Evaluating adoption strategies
Page 59
Phase 2 Approach
Broad based feedback about the product suite
In-depth studies focusing on critical issues
Page 60
Phase 2 Strategy
• Broad-based community feedback on usage of products
• Automated community-based data collection
• Focused, designed studies for identified critical issues
Page 61
Contact

Dennis R. Goldenson
• [email protected]
• 412/268-8506
• http://www.sei.cmu.edu/staff/dg/

Software Engineering Institute
Carnegie Mellon University
Pittsburgh, PA 15213-3890
• 412/268-5758 (fax)
Page 62
Back Pocket Slides
Follow here …
Page 63
Broad Based Feedback

Standard feedback from many more cases …
• Building on current CBA IPI & PAIS feedback & reporting mechanisms
  - Similar also to the International SPICE trials of ISO/IEC 15504
• Provides more confidence in generalizability of results
• Supports more informative comparative analyses
  - E.g., by varying model scope, method activities, appraiser background, and organizational characteristics

CRs from public review
• From document review, test, & deployed use of CMMI v1.0
Page 64
In-Depth Studies -1

In-depth pilot appraisals as necessary ...
• Need detailed cases on early use of v1.0
• Also important to get more detail as new disciplines are added, e.g., acquisition
• With emphasis on fidelity to enactment of the appraisal method
  - And commitment to clearly established processes
Page 65
In-Depth Studies -2

Custom-designed studies and analyses as appropriate, including ...
• Pilot cases of process improvement beyond appraisals
• Ways of improving specific aspects of the product suite, e.g.
  - CMMI Appraisal Questionnaire
  - Other specific appraisal processes, activities, or tool support
  - Training design and delivery
  - Model structure and content
Page 66
Sources of Data
• Participants in CMMI appraisals
• Legacy users
  - 731 and CMM
• Other experts, e.g.
  - iCMM and ISO/IEC 15504
  - SDCE and SCE
• Broader software and systems engineering communities, including non-defense
Page 67
Evolving Policy

The CMMI Steering Group has established the schedule for the sunset period of the legacy models (i.e., SW-CMM V1.1, EIA IS 731) to be three years after release of the CMMI-SE/SW/IPPD Version 1.0 Product Suite.

A subsequent release, Version 1.1, is planned one year later to provide additional refinement and update based on the continuing CMMI pilot program.
Page 68
SW-CMM v1.1 -> CMMI -1
LEVEL 2 (MANAGED)

Requirements Management -> Requirements Management
Software Project Planning -> Project Planning
Software Project Tracking & Oversight -> Project Monitoring and Control
Software Subcontract Management -> Supplier Agreement Management
Software Quality Assurance -> Process & Product Quality Assurance
Software Configuration Management -> Configuration Management
(no SW-CMM equivalent) -> Measurement and Analysis
Page 69
SW-CMM v1.1 -> CMMI -2
LEVEL 3 (DEFINED)

Organization Process Focus -> Organizational Process Focus
Organization Process Definition -> Organizational Process Definition
Training Program -> Organizational Training
Integrated Software Management -> Integrated Project Management
(no SW-CMM equivalent) -> Risk Management
Software Product Engineering -> Requirements Development, Technical Solution, Product Integration
Intergroup Coordination -> Verification
Peer Reviews -> Validation
(no SW-CMM equivalent) -> Decision Analysis and Resolution
Page 70
SW-CMM v1.1 -> CMMI -3
LEVEL 4 (QUANTITATIVELY MANAGED)

Quantitative Process Management -> Organizational Process Performance
Software Quality Management -> Quantitative Project Management

LEVEL 5 (OPTIMIZING)

Defect Prevention -> Causal Analysis and Resolution
Technology Change Management, Process Change Management -> Organizational Innovation and Deployment
Page 71
SW-CMM v1.1 Common Features -> CMMI Common Features

Commitment to Perform -> Commitment to Perform
  Establish an Organizational Policy -> Establish an Organizational Policy

Ability to Perform -> Ability to Perform
  (CMMI adds: Plan the Process)
  Provide Resources -> Provide Resources
  Assign Responsibility -> Assign Responsibility
  Train People -> Train People

Activities Performed -> (Specific Practices)
  Plan the Process, Perform the Process, Monitor and Control the Process

(CMMI adds) Directing Implementation
  Identify & Involve Relevant Stakeholders
  Manage Configurations
  Monitor and Control the Process
  Collect Improvement Information

Measurement & Analysis -> Expanded in the Measurement and Analysis PA
  Measure the Process
  Analyze the Measurements

Verifying Implementation -> Verifying Implementation
  Review with Org. Management -> Review Status w/ Higher-Level Mgt
  Review with Project Management
  Objectively Verify Adherence -> Objectively Evaluate Adherence
Page 72
Comparison of Elements -1: Engineering

SE-CMM (Engineering)
• PA01 Analyze Candidate Solutions
• PA02 Derive and Allocate Requirements
• PA03 Evolve System Architecture
• PA04 Integrate Disciplines
• PA05 Integrate System
• PA06 Understand Customer Needs and Expectations
• PA07 Verify and Validate System

SECM (Technical; also 2.3)
• 1.1 Define Stakeholder and System Level Requirements
• 1.2 Define Technical Requirements
• 1.3 Define Solution
• 1.4 Assess and Select
• 1.5 Integrate System
• 1.6 Verify System
• 1.7 Validate System

CMMI (Engineering)
• Requirements Management
• Requirements Development
• Technical Solution
• Decision Analysis and Resolution
• Product Integration
• Verification
• Validation

This graphic courtesy of Software Productivity Consortium
Page 73
Comparison of Elements -2: Project Management and Support

SE-CMM (Project; also PA04, PA18)
• PA12 Plan Technical Effort
• PA11 Monitor and Control Technical Effort
• PA10 Manage Risk
• PA09 Manage Configurations
• PA08 Ensure Quality

SECM (Management)
• 2.1 Plan and Organize
• 2.2 Monitor and Control
• 2.3 Integrate Disciplines
• 2.4 Coordinate with Suppliers
• 2.5 Manage Risk
• 2.6 Manage Data
• 2.7 Manage Configurations
• 2.8 Ensure Quality

CMMI (Project Management)
• Project Planning
• Project Monitoring and Control
• Supplier Agreement Management
• Integrated Project Management*
• Risk Management
(* Not in SE-CMM)

CMMI (Support; L2 generic practices)
• Configuration Management
• Process and Product QA
• Measurement and Analysis

This graphic courtesy of Software Productivity Consortium
Page 74
Comparison of Elements -3: Process Management

SE-CMM (Organization)
• PA13 Define Organization's SE Process
• PA14 Improve Organization's SE Process
• PA15 Manage Product Line Evolution
• PA16 Manage SE Support Environment
• PA17 Provide Ongoing Knowledge and Skills
• PA18 Coordinate With Suppliers

SECM (Environment; also 2.4)
• 3.1 Define and Improve the SE Process
• 3.2 Manage Competency
• 3.3 Manage Technology
• 3.4 Manage SE Support Environment

CMMI (Process Management)
• Organizational Process Focus
• Organizational Process Definition
• Organizational Training
• Organizational Process Performance (L4 GP)
• Quantitative Project Management (L4 GP)
• Causal Analysis and Resolution (L5 GP)
• Organizational Innovation and Deployment
• (IPPD Extension) Organizational Environment for Integration

This graphic courtesy of Software Productivity Consortium