

The Spiral Model as a Tool for Evolutionary Acquisition

Dr. Barry Boehm, University of Southern California, Center for Software Engineering
Wilfred J. Hansen, Software Engineering Institute

Best Practices

Dr. Barry Boehm will speak at STC luncheon B on April 30

Since its original publication [1], the spiral development model diagrammed in Figure 1 has been used successfully in many defense and commercial projects. To extend this base of success, the Department of Defense (DoD) has recently rewritten the defense acquisition regulations to incorporate "evolutionary acquisition," an acquisition strategy designed to mesh well with spiral development. In particular, DoD Instruction 5000.2 subdivides acquisition [2]:

"There are two ... approaches, evolutionary and single step to full capability. An evolutionary approach is preferred. … [In this] approach, the ultimate capability delivered to the user is divided into two or more blocks, with increasing increments of capability." (p. 20)

Here, a block corresponds to a single product release. The text goes on to specify the use of spiral development within blocks:

"For both the evolutionary and single-step approaches, software development shall follow an iterative spiral development process in which continually expanding software versions are based on learning from earlier development." (p. 20)

Given this reliance on the spiral development model, an in-depth definition is appropriate. Two recent workshops provided one.

The University of Southern California (USC) Center for Software Engineering and the Carnegie Mellon University Software Engineering Institute held two workshops last year to study spiral development and identify a set of critical success factors and recommended approaches. Their results appear in two reports [3, 4] and are available on the workshop Web site at www.sei.cmu.edu/cbs/spiral2000

The first author's presentations at these workshops defined spiral development and are summarized below. The definition was subsequently expanded into a report [5], where details, suggestions, and further references can be found. Additionally, a follow-on article, appearing in a later CrossTalk issue, will address the relationships among spiral development, evolutionary acquisition, and the Integrated Capability Maturity Model.

Spiral Development Definition and Context

We can begin with a high-level definition of the spiral development model:

The spiral development model is a risk-driven process model generator that is used to guide multi-stakeholder concurrent engineering of software-intensive systems. It has two main distinguishing features. One is a cyclic approach for incrementally growing a system's degree of definition and implementation while decreasing its degree of risk. The other is a set of anchor point milestones for ensuring stakeholder commitment to feasible and mutually satisfactory system solutions.

The highlighted terms deserve further explanation:

• Risks are situations or possible events that can cause a project to fail to meet its goals. They range in impact from trivial to fatal and in likelihood from certain to improbable. Since risk considerations dictate the path a development must take, it is important that those risks be cataloged candidly and completely. See the references for a taxonomy of risks [6] and a method for identifying them [7].

• A process model answers two main questions: What should be done next? How long should it continue? Under the spiral model, the answers to these questions are driven by risk considerations and vary from project to project and sometimes from one spiral cycle to the next. Each choice of answers generates a different process model.

Abstract: Recent Department of Defense policy embodied in the new 5000.1 and 5000.2 series of acquisition regulations strongly emphasizes evolutionary acquisition and the use of spiral development for software. Spiral development has been used successfully in many commercial systems and in a good number of defense systems. But some ambiguities in previous spiral model definitions have also led to a good number of unsuccessful projects adopting "hazardous spiral look-alikes." This paper provides clearer definitions of a set of six Spiral Model Essentials, or critical success factors, for spiral development. It illustrates each with examples, identifies the hazardous look-alikes to avoid, and provides guidelines for using the spiral model in support of evolutionary acquisition.

CrossTalk: The Journal of Defense Software Engineering, May 2001

[Figure 1: Original Spiral Development Diagram]

• The cyclic nature of the spiral model is illustrated in Figure 1. Rather than develop the completed product in one step, multiple cycles are performed, with each taking steps calculated to reduce the most significant remaining risks.

• Each anchor point milestone is a specific combination of artifacts and conditions that must be attained at some point. The sequence of three anchor point milestones — "LCO," "LCA," and "IOC" — is defined in Spiral Essential 5 (to be discussed). These milestones impel the project toward completion and offer a means to compare progress between one project and another.

Many aspects of spiral development are omitted in the above definition. The remainder of this paper expands the definition by describing six essential aspects that every proper spiral process must exhibit. The essentials are sketched in Figure 2. Each subsequent section describes a Spiral Essential, the critical success factor, reasons why it is necessary, and the variant process models it allows. Examples are given. Other process models that are precluded by the Spiral Essential are described. Because these may seem to be instances of the spiral model, but lack necessary essentials and thus risk failure, they are called "hazardous spiral look-alikes."

Spiral Essential 1: Concurrent Determination of Key Artifacts (Operational Concept, Requirements, Plans, Design, Code)

For a successful spiral effort, it is vital to determine balanced portions of certain key artifacts concurrently and not sequentially. These key artifacts are the operational concept, the system and software requirements, the plans, the system and software architecture and design, and the code components, including COTS, reused components, prototypes, success-critical components, and algorithms. Ignoring this Essential by sequentially determining the key artifacts will prematurely overconstrain the project, and often extinguish the possibility of developing a product satisfactory to the stakeholders.

Variants: Within the constraints of this Essential, variation is possible in the product and process internals of the concurrent engineering activity. For a low-technology, interoperability-critical system, the initial spiral products will be requirements-intensive. For a high-technology, more stand-alone system, the initial spiral products will be prototype-code intensive. Also, the Essential does not dictate the number of mini-cycles (e.g., individual prototypes for COTS, algorithms, or user-interface risks) within a given spiral cycle.

Example: One-Second Response Time

Examples of failure due to omission of this Essential include premature commitments to hardware platforms, incompatible combinations of COTS components, and requirements whose achievability has not been validated. For instance, in the early 1980s, a large government organization contracted with TRW to develop an ambitious information system for more than a thousand users. This system was to be distributed across a campus and offer powerful query and analysis access to a large and dynamic database. Based largely on user need surveys and an oversimplified high-level performance analysis, TRW and the customer fixed into the contract a requirement for a system response time of less than one second.

Two thousand pages of requirements later, the software architects found that subsecond performance could only be provided via a highly customized design that attempted to cache data and anticipate query patterns to be able to respond to each user within one second. The resulting hardware architecture had more than 25 super-minicomputers caching data according to algorithms whose actual performance defied easy analysis. The estimated cost was $100 million; see the upper arc in Figure 3.

Faced with this exorbitant cost, the customer and developer decided to develop and user-test a prototype. The results showed a four-second response time with 90 percent user satisfaction. This lower performance could be achieved with a modified client-server architecture, cutting development costs to $30 million, as shown by the lower arc in Figure 3 [8]. Concurrent prototyping would have saved two years' time and large-project effort.

[Figure 2: Pictorial Sketch of the Six Spiral Essentials]

Hazardous Spiral Look-Alike: Violation of Waterfall Assumptions

Essential 1 excludes the use of an incremental sequence of waterfall developments in the common case where there is a high risk of violating the assumptions underlying the waterfall model. These assumptions are that the requirements are pre-specifiable, slowly changing, and satisfactory to all stakeholders, and that a well-understood architecture can meet these requirements. These assumptions must be met by a project if the waterfall model is to succeed. If all are true, then it is a project risk not to specify the requirements: the spiral-dictated risk analysis results in a waterfall approach for this project. If any assumption is false, then a waterfall approach will commit the project to troublesome assumptions and requirements mismatches. Here are typical cases that violate waterfall assumptions:

• Requirements are not generally pre-specifiable for new user-interactive systems, because of the IKIWISI syndrome. When asked for their required screen layout for a new decision-support system, users will generally say, "I can't tell you, but I'll know it when I see it (IKIWISI)." In such cases, a concurrent prototyping/requirements/architecture approach is necessary.

• Rapidly changing requirements are well illustrated by electronic commerce projects, where the volatility of technology and the marketplace is high. The time it takes to write detailed requirements is not a good investment of the scarce time-to-market available when it is likely the requirements will change more than once downstream.

• The architecture and its implications were the downfall of the one-second response time example.

Spiral Essential 2: Each Cycle Does Objectives, Constraints, Alternatives, Risks, Review, and Commitment to Proceed

Spiral Essential 2 identifies the activities that need to be done in each spiral cycle. These include consideration of critical stakeholder objectives and constraints; elaboration and evaluation of project and process alternatives for achieving the objectives subject to the constraints; identification and resolution of risks attendant on choices of alternative solutions; and stakeholders' review of, and commitment to proceed based on, satisfaction of their critical objectives and constraints. If all of these are not considered, the project may be prematurely committed to alternatives that are either unacceptable to key stakeholders or overly risky.

Variants: Spiral Essential 2 does not mandate particular generic choices of risk resolution techniques, although guidelines are available [9]. Nor does this Essential mandate particular levels of effort for the activities performed during each cycle. Levels must be balanced between the risks of learning too little and the risks of wasting time and effort gathering marginally useful information.

Example: Windows-Only COTS

Ignoring Essential 2 can lead to wasted effort in elaborating an alternative that could have been shown earlier to be unsatisfactory. One of the current USC digital library projects is developing a Web-based viewer for oversized artifacts (e.g., newspapers, large images). The initial prototype featured a tremendously powerful and high-speed viewing capability, based on a COTS product called ER Mapper. The initial project review approved selection of this COTS product, even though it only ran well on Windows platforms, and the Library had significant Macintosh and UNIX user communities. This decision was based on initial indications that Mac and UNIX versions of ER Mapper would be available "soon." These indications proved unreliable, however, and the anticipated delay became quite lengthy. So after wasting considerable effort on ER Mapper, it was dropped in favor of a less powerful but fully portable COTS product, MrSID. The excess effort could have been avoided had the review team included stakeholders from the Mac and UNIX communities on campus, who would have done the necessary investigations earlier.

Hazardous Spiral Look-Alike: Excluding Key Stakeholders

Essential 2 excludes the process model of organizing the project into sequential phases or cycles in which key stakeholders are excluded. These omissions are likely to cause critical risks to go undetected. Examples are excluding developers from system definition, excluding users from system construction, or excluding system maintainers from either definition or construction. Excluding developer participation in early cycles can lead to project commitments based on unrealistic assumptions about developer capabilities. Excluding users or maintainers from development cycles can lead to win-lose situations, which generally devolve into lose-lose situations.

Spiral Essential 3: Level of Effort Driven by Risk Considerations

Spiral Essential 3 dictates the use of risk considerations to answer the difficult questions concerning how much is enough of a given activity, such as domain engineering, prototyping, testing, configuration management, and so on. The recommended approach is to evaluate Risk Exposure (RE), which is computed as Probability(Loss) × Size(Loss). There is a risk of project error, RE_error, from doing too little effort, and a risk of project delay, RE_delay, from doing too much. Ideally, the effort expended will be that which minimizes the sum RE_error + RE_delay. This approach applies to most activities that are undertaken in a spiral development.

Variants: The variants to be considered include the choice of methods used to pursue activities (e.g., MBASE/WinWin, Rational RUP, JAD, QFD, ESP) and the degree of detail of artifacts produced in each cycle. Another variant is an organization's choice of particular methods for risk assessment and management.

Example: Pre-Ship Testing

Risk considerations can help determine "how much testing is enough" before shipping a product. The more testing that is done, the lower RE_error due to defects becomes, as discovered defects reduce both the size of loss due to defects and the probability that undiscovered defects still remain. However, the more time spent testing, the higher RE_delay becomes, from losses due both to competitors entering the market and to decreased profitability on the remaining market share. As shown in Figure 4, the sum of these risk exposures achieves a minimum at some intermediate level of testing. The location of this minimum-risk point in time will vary by type of organization. For example, it will be considerably shorter for a "dot-com" company than for a safety-critical product such as a nuclear power plant. Calculating the risk exposures also requires an organization to accumulate a fair amount of calibrated experience on the probabilities and sizes of losses as functions of test duration and delay in market entry.

[Figure 3: Two System Designs: Cost vs. Response Time]

Hazardous Spiral Look-Alikes: Risk Insensitivity

Hazardous spiral model look-alikes excluded by Essential 3 are:

• Risk-insensitive evolutionary development (e.g., neglecting scalability risks).

• Risk-insensitive incremental development (e.g., sub-optimizing during increment 1 with an ad hoc architecture that must be dropped or heavily reworked to accommodate future increments).

• Impeccable spiral plans with no commitment to managing the risks identified.
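Once an organization has calibrated its loss curves, the minimum-risk level of testing described above can be computed directly. A minimal sketch, in which the loss curves and every constant are purely hypothetical stand-ins for calibrated data:

```python
# Sketch of Spiral Essential 3: choose the level of testing effort that
# minimizes total risk exposure RE_error + RE_delay.
# The curves and constants below are hypothetical, not calibrated data.

def re_error(t):
    # Defect losses: the chance of shipping a serious defect decays with
    # testing time t (Size(Loss) = 100; Pr(Loss) halves per unit of testing).
    return 100 * (0.5 ** t)

def re_delay(t):
    # Market-share losses grow roughly linearly with time to market.
    return 4 * t

durations = [t / 10 for t in range(0, 101)]  # candidate test durations, 0..10
best = min(durations, key=lambda t: re_error(t) + re_delay(t))
print(f"minimum-risk test duration: about {best:.1f} units")
```

With steeper market-loss growth (a "dot-com" profile), the minimum shifts earlier; with a larger defect-loss size (a safety-critical profile), it shifts later, matching the qualitative argument above.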

Spiral Essential 4: Degree of Detail Driven by Risk Considerations

Where Essential 3 circumscribes efforts, Essential 4 circumscribes the results of those efforts; it dictates that risk considerations determine the degree of detail of artifacts. This means, for example, that the traditional ideal of a complete, consistent, traceable, testable requirements specification is not a good idea for certain product components, such as a graphical user interface (GUI) or COTS interface. Here, the risk of precisely specifying screen layouts in advance of development involves a high probability of locking an awkward user interface into the development contract, while the risk of not specifying screen layouts is low, given the general availability of flexible GUI-builder tools. Even aiming for full consistency and testability can be risky, as it creates pressure to prematurely specify decisions that would better be deferred (e.g., the form and content of exception reports). However, some risk patterns make it very important to have precise specifications, such as the risks of safety-critical interface mismatches between hardware and software components, or between a prime contractor's and a subcontractor's software.

This guideline shows when it is risky to overspecify and to underspecify software features:

• If it's risky to not specify precisely, DO specify (e.g., hardware-software interface, prime-subcontractor interface).

• If it's risky to specify precisely, DO NOT specify (e.g., GUI layout, volatile COTS behavior).
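The guideline can be restated as a tiny decision rule. This is only a paraphrase for illustration; the ratings and return labels are assumptions, not terminology from the paper:

```python
# Paraphrase of the Essential 4 specification guideline as a decision rule.
# The "high"/"low" ratings are judgments a project team supplies.

def specification_detail(risk_of_not_specifying: str, risk_of_specifying: str) -> str:
    """Each risk is rated 'high' or 'low'."""
    if risk_of_not_specifying == "high":
        return "specify precisely"        # e.g., hardware-software interface
    if risk_of_specifying == "high":
        return "defer detail"             # e.g., GUI layout, volatile COTS behavior
    return "use risk balance to decide"   # neither risk dominates

print(specification_detail("high", "low"))   # → specify precisely
print(specification_detail("low", "high"))   # → defer detail
```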

Variants: Unconstrained by Essential 4 are the choices of representations for artifacts (SA/SD, UML, MBASE, formal specs, programming languages, …).

Example: Risk of Precise Specification

One editor specification required that every operation be available through a button on the window. As a result, the space available for viewing and editing became unusably small. The developer was precluded from moving some operations to menus because the GUI layout had been specified precisely at an early step. (Of course, given too much freedom, programmers can develop very bad GUIs. Stakeholder review and prototype exercises are necessary to avoid such problems.)

Hazardous Spiral Look-Alikes: Insistence on Complete Specifications

It is often risky to undertake a spiral development project wherein complete specifications are pre-specified for all aspects. Aspects such as those described in the example should be left to be further defined during project exploratory phases.

Spiral Essential 5: Use Anchor Point Milestones LCO, LCA, IOC

A major difficulty of the original spiral model was its lack of intermediate milestones to serve as commitment points and progress checkpoints. This difficulty has been remedied by the development of a set of anchor point milestones:

LCO - Life Cycle Objectives - what should the system accomplish?

LCA - Life Cycle Architecture - what is the structure of the system?

IOC - Initial Operating Capability - the first released version.

(The artifacts for each are provided by an electronic process guide [10] and are also used by the Rational Unified Process.)

The focus of the LCO review is to ensure that at least one architecture choice is viable from a business perspective. The focus of the LCA review is to commit to a single detailed definition of the project. The project must have either eliminated all significant risks or put in place an acceptable risk-management plan. The LCA milestone is particularly important, as its pass/fail criteria enable stakeholders to hold up projects attempting to proceed into evolutionary or incremental development without a life cycle architecture. Each milestone is a stakeholder commitment point: at LCO the stakeholders commit to support building the architecture; at LCA they commit to support initial deployment; at IOC they commit to support operations. Together the anchor point milestones avoid analysis paralysis, unrealistic expectations, requirements creep, architectural drift, COTS shortfalls and incompatibilities, unsustainable architectures, traumatic cutovers, and useless systems.

[Figure 4: Pre-Ship Test Risk Exposure. RE = Size(Loss) × Pr(Loss); RE_error (defect losses) falls and RE_delay (market-share losses) rises with the amount of testing, so RE(total) reaches its minimum at an intermediate point.]

Variants: One appropriate variant of Essential 5 is the number of spiral cycles between anchor points. Another possible variant is to merge anchor points. In particular, a project using a mature and appropriately scalable fourth-generation language (4GL) or product line framework will have already determined its architecture by its LCO milestone, enabling the LCO and LCA milestones to be merged.

Example: Stud Poker Analogy

A valuable aspect of the spiral model is its ability to support incremental commitment of corporate resources, rather than requiring a large outlay of resources to the project before its success prospects are well understood. Funding a spiral development can thus be likened to the game of stud poker. In that game, you put a couple of chips in the pot and receive two cards, one hidden and one exposed. If your cards don't promise a winning outcome, you can drop out without a great loss. This corresponds to cancelling a project at or before LCO. If your two cards are both aces, you will probably bet on your prospects aggressively (or less so if you see aces among other players' exposed cards). Dropping out of the second or third round of betting corresponds to cancelling at or before LCA. In any case, based on the information available, you can decide during each round whether it's worth putting more chips in the pot to buy more information, or whether it's better not to pursue this particular deal or project.
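The poker analogy can be made concrete with a toy expected-value calculation. Every number here (round cost, review probability, payoff) is invented purely for illustration:

```python
# Toy expected-value model of incremental vs. single-step commitment.
# All numbers are invented for illustration; both options happen to lose
# money here, which is exactly when the option to cancel matters most.

p_good = 0.5     # chance each anchor point review reports "still promising"
payoff = 100.0   # value of a successfully completed development
round_cost = 10.0

# Single-step: commit all three rounds (30 units) up front; the project
# succeeds only if all three reviews would have been favorable.
ev_single_step = p_good ** 3 * payoff - 3 * round_cost

# Incremental: pay one round at a time and cancel at the first bad review,
# so later rounds are paid only if earlier reviews were favorable.
expected_cost = round_cost * (1 + p_good + p_good ** 2)
ev_incremental = p_good ** 3 * payoff - expected_cost

print(ev_single_step, ev_incremental)  # incremental commitment loses less
```

The chance of ultimate success is the same either way; what changes is how many chips are in the pot when bad news arrives.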

Hazardous Look-Alike: Evolutionary Development Without Life Cycle Architecture

The LCO and LCA milestones' pass-fail criteria emphasize that the system's architecture must support not just the initial increment's requirements, but also the system's evolutionary life-cycle requirements. This avoids the hazardous spiral look-alike of an initial increment optimized to provide an impressive early system demonstration or limited release, but without the architecture to support full-system requirements for security, fault tolerance, or scalability to large workloads. Other important considerations for LCA are that the initial release ensure continued key stakeholder participation, that the user organizations be flexible enough to adapt to the pace of system evolution, and that legacy-system replacement be well thought out. Ignoring these aspects leads to other hazardous spiral look-alike processes.

Spiral Essential 6: Emphasis on System and Life Cycle Activities and Artifacts

Spiral Essential 6 emphasizes that spiral development of software-intensive systems needs to focus not just on software construction aspects, but also on overall system and life-cycle concerns. Will the product satisfy stakeholders? Will it meet cost and performance goals? Will it integrate with existing business practices? Will it adapt to organizational changes? Software developers are apt to fall into the oft-cited trap: "If your best tool is a hammer, the world you see is a collection of nails." Writing code may be a developer's forte, but it is as important to the project as nails are to a house.

Variants: The model's use of risk considerations to drive solutions makes it possible to tailor each spiral cycle to whatever mix of software and hardware, choice of capabilities, or degree of productization is appropriate.

Example: Order Processing

An example of failure to consider the whole system occurred with the Scientific American order processing system in Figure 5 [11]. Scientific American had hoped that computerizing the functions being performed on tabulator machines would reduce its subscription processing costs, errors, and delays. Rather than analyze the sources of these problems, the software house focused on the part of the problem having a software solution. The result was a batch-processing computer system whose long delays put extra strain on the clerical portion of the system that had been the major source of costs, errors, and delays in the first place. The software people looked for the part of the problem with a software solution (their "nail"), pounded it in with their software hammer, and left Scientific American worse off than when they started.

This kind of outcome would have resulted even if the software automating the tabulator-machine functions had been developed in a risk-driven cyclic approach. However, its Life Cycle Objectives milestone package would have failed its feasibility review, as it had no system-level business case demonstrating that the development of the software would lead to the desired reduction in costs, errors, and delays. Had a thorough business case analysis been done, it would have identified the need to reengineer the clerical business processes as well as to automate the manual tab runs.

Hazardous Spiral Look-Alikes: Logic-Only OO Designs

Models excluded by Essential 6 include most published object-oriented analysis and design (OOA&D) methods, which are usually presented as abstract logical exercises independent of system performance or economic concerns. For example, in a recent survey of 16 OOA&D books, only six listed the word "performance" in their index, and only two listed "cost."

[Figure 5: Scientific American Order Processing]

Using the Spiral Model for Evolutionary Acquisition

Both the February and September workshops had working groups on the relationships between spiral development and evolutionary acquisition. A primary conclusion was that the relationships differ across two major DoD acquisition sectors:

• Information systems, such as C4ISR systems, logistics systems, and management systems, in which spiral and evolutionary models coincide well.

• Software-intensive embedded hardware/software systems, in which the software aspects best follow a spiral approach, but the hardware aspects need to follow a more sequential approach to accommodate lead times for production facilities, production subcontracts, and long-lead critical component orders.

Even for embedded systems, however, spiral approaches can be helpful for synchronizing hardware and software processes, and for determining when to apply an evolutionary, incremental, or single-step acquisition strategy. For example, Xerox's time-to-market process uses the spiral anchor point milestones as hardware/software synchronization points for its printer business line [12]. Rechtin and Maier adopt a similar approach in their book, The Art of Systems Architecting [13].

A good example of the use of a risk-driven spiral approach to determine a preferred software/system acquisition strategy was originally developed for DoD's MIL-STD-498, and subsequently incorporated in IEEE/EIA 12207 [14]. This approach distinguishes among single-step (once-through or waterfall), incremental, and evolutionary acquisition processes, as shown in Table 1. Thus, evolutionary acquisition avoids defining all requirements first, and proceeds in multiple development cycles, some of which involve distribution and usage of initial and intermediate operational capabilities.

Table 2 shows how a spiral risk analysis can be performed during early integrated product and process definition cycles to select the most appropriate acquisition process for a system. The example shown in Table 2 is for a fairly large, high-technology C4ISR system. For such a system, the high risks of poorly understood requirements and rapid technology changes push the decision away from once-through or incremental acquisition, while the needs for an early capability and for user feedback on full requirements push the decision toward evolutionary acquisition. If the system had been a small, lower-technology embedded system where all capabilities are needed at first delivery, the risk and opportunity factors would push the decision away from evolutionary and incremental acquisition toward once-through or single-step acquisition.
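The risk-driven selection just described can be sketched as a small decision procedure. This is only an illustrative reading of Tables 1 and 2, not part of the MIL-STD-498 or IEEE/EIA 12207 guidance: the risk-factor names, the decision order, and the function itself are our assumptions, and a real spiral risk analysis weighs many more factors and stakeholders.

```python
# Hypothetical sketch of the risk-driven process-selection logic behind
# Table 2. Risk-factor names and decision order are illustrative assumptions,
# not values taken from the article or the standards it cites.

def select_acquisition_process(risks):
    """Pick a development strategy from boolean risk flags (names hypothetical)."""
    if risks["requirements_poorly_understood"] or risks["rapid_technology_change"]:
        # Freezing all requirements up front is high-risk, so avoid
        # once-through and preplanned-increment strategies.
        return "evolutionary"
    if risks["all_capabilities_needed_at_first_delivery"]:
        # Interim deliveries add no value; a single step to full capability fits.
        return "once-through"
    return "incremental"

# Large, high-technology C4ISR system (the article's Table 2 example):
c4isr = {
    "requirements_poorly_understood": True,
    "rapid_technology_change": True,
    "all_capabilities_needed_at_first_delivery": False,
}
print(select_acquisition_process(c4isr))  # prints: evolutionary

# Small, lower-technology embedded system needing everything at first delivery:
embedded = {
    "requirements_poorly_understood": False,
    "rapid_technology_change": False,
    "all_capabilities_needed_at_first_delivery": True,
}
print(select_acquisition_process(embedded))  # prints: once-through
```

The point of the sketch is that the strategy is an output of risk analysis, not a default chosen in advance; changing the risk flags changes the recommended process.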

Conclusion

This paper has defined the spiral development model as a risk-driven process model generator with cyclic process execution and a set of three anchor point milestones. The definition was sharpened by presenting a set of six “essential” attributes; that is, six attributes that every spiral development process must incorporate. These essentials are summarized in Table 3 (see page 10). Omission of any one of them gives rise to process models that are cyclic or iterative, but are not examples of spiral development. These are called “hazardous spiral look-alikes.” Each was described and pilloried as part of describing the Essential it violates.

Spiral development works fairly seamlessly with evolutionary acquisition of information systems. For evolutionary acquisition of software-intensive embedded hardware-software systems, a mix of spiral and sequential processes is often needed. Even here, however, the spiral anchor points and risk-driven process selection approach are useful for determining how best to synchronize the hardware and software processes.◆

References

1. Boehm, B., A Spiral Model of Software Development and Enhancement, Computer, May 1988, pp. 61-72.

2. Department of Defense Instruction 5000.2, Operation of the Defense Acquisition System, September 2000, www.acq.osd.mil/ap/i50002p.doc

May 2001 www.stsc.hill.af.mil 9

Table 1: Evolutionary Acquisition Distinctions [14]

Development strategy                          Define all            Multiple              Distribute
                                              requirements first?   development cycles?   interim software?
Once-Through (Waterfall)                      Yes                   No                    No
Incremental (Preplanned Product Improvement)  Yes                   Yes                   Maybe
Evolutionary                                  No                    Yes                   Yes

Table 2: Spiral Risk Analysis for Process Selection [14]

3. Hansen, W.; Foreman, J.; Carney, D.; Forrester, E.; Graettinger, C.; Peterson, W.; and Place, P., Spiral Development, Building the Culture: A Report on the CSE-SEI Workshop, February 2000, Software Engineering Institute, Carnegie Mellon University, Special Report CMU/SEI-2000-SR-006, July 2000, www.sei.cmu.edu/cbs/spiral2000/february2000/finalreport.html

4. Hansen, W.; Foreman, J.; Albert, C.; Axelband, E.; Brownsword, L.; Forrester, E.; and Place, P., Spiral Development and Evolutionary Acquisition: The SEI-CSE Workshop, September 2000, Software Engineering Institute, Carnegie Mellon University, Special Report, in preparation.

5. Boehm, Barry, edited by Hansen, Wilfred J., Spiral Development: Experience, Principles, and Refinements, Software Engineering Institute, Carnegie Mellon University, Special Report CMU/SEI-00-SR-08, ESC-SR-00-08, June 2000, www.sei.cmu.edu/cbs/spiral2000/february2000/BoehmSR.html

6. Carr, M. J.; Konda, S. L.; Monarch, I.; Ulrich, F. C.; and Walker, C. F., Taxonomy-Based Risk Identification, Software Engineering Institute, Carnegie Mellon University, Technical Report CMU/SEI-93-TR-6, ESC-TR-93-183, June 1993, www.sei.cmu.edu/legacy/risk/kit/tr06.93.pdf

7. Williams, R. C.; Pandelios, G. J.; and Behrens, S. G., Software Risk Evaluation (SRE) Method Description (Version 2.0), Software Engineering Institute, Carnegie Mellon University, Technical Report CMU/SEI-99-TR-029, ESC-TR-99-029, December 1999, www.sei.cmu.edu/pub/documents/99.reports/pdf/99tr029body.pdf

8. Boehm, B., Unifying Software Engineering and Systems Engineering, IEEE Computer, March 2000, pp. 114-116.

9. Boehm, B., Software Risk Management, IEEE Computer Society Press, 1989.

10. Mehta, N., MBASE Electronic Process Guide, USC-CSE, Los Angeles, Calif., October 1999, sunset.usc.edu/research/MBASE/EPG

11. Boehm, B., Software Engineering Economics, New York, N.Y., Prentice Hall, 1981.

12. Hantos, P., From Spiral to Anchored Processes: A Wild Ride in Lifecycle Architecting, Proceedings, USC-SEI Spiral Experience Workshop, Los Angeles, Calif., February 2000, www.sei.cmu.edu/cbs/spiral2000/Hantos

13. Rechtin, E., and Maier, M., The Art of Systems Architecting, CRC Press, 1997.

14. IEEE and EIA, Industry Implementation of ISO/IEC 12207: Software Life Cycle Processes, Implementation Considerations, IEEE/EIA 12207.2-1997, April 1998.

Article Web Site Location: www.sei.cmu.edu/pub/documents/00.reports/pdf/00sr008.pdf

10 CROSSTALK The Journal of Defense Software Engineering May 2001


Table 3: Summary

“It is impossible to make a program foolproof because fools are so ingenious.”
Anonymous

Software Technology Support Center
www.stsc.hill.af.mil

The Software Technology Support Center (STSC) was established in 1987 as the command focus for proactive application of software technology in weapon, command and control, intelligence, and mission-critical systems. The STSC provides hands-on assistance in adopting effective technologies for software-intensive systems. It helps organizations identify, evaluate, and adopt technologies that improve software product quality, production efficiency, and predictability. STSC uses the term technology in its broadest sense to include processes, methods, techniques, and tools that enhance human capability. Its focus is on field-proven technologies that will benefit the Department of Defense mission.

Software Engineering Institute

www.sei.cmu.edu

The Software Engineering Institute (SEI) is a federally funded research and development center sponsored by the Department of Defense to provide leadership in advancing the state of the practice of software engineering to improve the quality of systems that depend on software. SEI helps organizations and individuals to improve their software engineering management practices. The site features a software engineering management practices work area that focuses on the ability of organizations to predict and control quality, schedule, cost, cycle time, and productivity when acquiring, building, or enhancing software systems.

Data & Analysis Center for Software
www.dacs.dtic.mil

The Data and Analysis Center for Software (DACS) is a Department of Defense (DoD) Information Analysis Center. The DACS is the DoD software information clearinghouse serving as an authoritative source for state-of-the-art software information and providing technical support to the software community. The center’s technical area of focus is software technology and software engineering, in its broadest sense. The DACS offers a wide variety of technical services designed to support the development, testing, validation, and transitioning of software engineering technology. The DACS is technically managed by the Air Force Research Laboratory - Information Directorate.

The Institute of Electrical and Electronics Engineers
www.ieee.org

The Institute of Electrical and Electronics Engineers, Inc. (IEEE), helps advance global prosperity by promoting the engineering process of creating, developing, integrating, sharing, and applying knowledge about electrical and information technologies and sciences for the benefit of humanity and the profession. The IEEE is a nonprofit, technical professional association of more than 350,000 individual members in 150 countries. Through its members, the IEEE is a leading authority in technical areas ranging from computer engineering, biomedical technology and telecommunications, to electric power, aerospace, and consumer electronics, among others.

Software Productivity Consortium

www.software.org

The Software Productivity Consortium is a unique, nonprofit partnership of industry, government, and academia. It develops processes, methods, tools, and supporting services to help members and affiliates build high-quality, component-based systems, and continuously advance their systems and software engineering maturity pursuant to the guidelines of all of the major process and quality frameworks. The site features an interactive section to discuss new trends.

About the Authors

Barry Boehm, Ph.D., is the TRW professor of software engineering and director of the Center for Software Engineering at the University of Southern California. He was previously in technical and management positions at General Dynamics, Rand Corp., TRW, and the Office of the Secretary of Defense as the director of the Defense Research and Engineering Software and Computer Technology Office. Boehm originated the spiral model, the Constructive Cost Model, and the stakeholder win-win approach to software management and requirements negotiation.

University of Southern California
Center for Software Engineering
Los Angeles, CA 90089-0781
Voice: 213-740-8163
Fax: 213-740-4927
E-mail: boehm@sunset.usc.edu

Wilfred J. Hansen has been consulting on the tools and processes for utilizing commercial-off-the-shelf (COTS) software within larger systems at the Software Engineering Institute. Previously he was director of the Andrew Consortium, where he led the development and maintenance of the Andrew User Interface System, which was the first modern word processor in that it originated the embedding of arbitrary objects into text and into other objects. One of his contributions to Andrew was a scripting language having a unique integer-free sub-language for processing strings. Earlier in his career, Hansen taught data structures and programming languages.

Software Engineering Institute
4500 Fifth Avenue
Pittsburgh, PA 15213-3890
Voice: 412-268-8247
E-mail: wjh@sei.cmu.edu



Web Sites