

CS 1530 Software Engineering, Fall 2004

Software Engineering Process & Evaluation (wrap-up)

Reading

■ SE chapters 2, 12
■ Eric S. Raymond: The Cathedral and the Bazaar
■ read by 11/15: there will be a short quiz on it on Monday
  http://www.catb.org/~esr/writings/cathedral-bazaar/cathedral-bazaar/cathedral-bazaar.ps

Categories of evaluation

■ Feature analysis: rate and rank attributes
■ Survey: document relationships
■ Case study
■ Formal experiment


Example feature analysis

Table 12.1. Design tool ratings

Feature                  Tool 1: t-OO-l   Tool 2: ObjecTool   Tool 3: EasyDesign   Importance
Good user interface      4                5                   4                    3
Object-oriented design   5                5                   5                    5
Consistency checking     5                3                   1                    3
Use cases                4                4                   4                    2
Runs on Unix             5                4                   5                    5
Score                    85               77                  73
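Each tool's score is the importance-weighted sum of its ratings. A minimal sketch of the computation in Python, using the values from Table 12.1:

    # Importance weight for each feature (Table 12.1, last column).
    importance = {
        "Good user interface": 3,
        "Object-oriented design": 5,
        "Consistency checking": 3,
        "Use cases": 2,
        "Runs on Unix": 5,
    }

    # Ratings of each tool on each feature, in the same row order.
    ratings = {
        "t-OO-l":     [4, 5, 5, 4, 5],
        "ObjecTool":  [5, 5, 3, 4, 4],
        "EasyDesign": [4, 5, 1, 4, 5],
    }

    weights = list(importance.values())
    for tool, rs in ratings.items():
        score = sum(w * r for w, r in zip(weights, rs))
        print(tool, score)   # t-OO-l 85, ObjecTool 77, EasyDesign 73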

Case study types

■ Sister projects: each is typical and has similar values for the independent variables
■ Baseline: compare a single project to the organizational norm
■ Random selection: partition a single project into parts

Formal experiment

■ Controls variables
■ Uses methods to reduce bias and eliminate confounding factors
■ Often replicated
■ Instances are representative: sample over the variables (whereas a case study samples from the variables)


Evaluation steps

■ Setting the hypothesis
■ Maintaining control over variables
■ Making the investigation meaningful

Table 12.2. Common pitfalls in evaluation. Adapted with permission from (Liebman 1994)

Pitfall                Description
1. Confounding         Another factor is causing the effect.
2. Cause or effect?    The factor could be a result, not a cause, of the treatment.
3. Chance              There is always a small possibility that your result happened by chance.
4. Homogeneity         You can find no link because all subjects had the same level of the factor.
5. Misclassification   You can find no link because you cannot accurately classify each subject's level of the factor.
6. Bias                Selection procedures or administration of the study inadvertently bias the result.
7. Too short           The short-term effects are different from the long-term ones.
8. Wrong amount        The factor would have had an effect, but not in the amount used in the study.
9. Wrong situation     The factor has the desired effect, but not in the situation studied.

Assessment vs. prediction

■ Assessment system: examines an existing entity by characterizing it numerically
■ Prediction system: predicts a characteristic of a future entity; involves a model with associated prediction procedures
  ■ deterministic prediction (we always get the same output for a given input)
  ■ stochastic prediction (output varies probabilistically)
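To make the distinction concrete, here is a minimal illustrative sketch in Python. The deterministic predictor below uses basic COCOMO's organic-mode effort equation purely as an example; the stochastic variant and its 20% spread are invented for illustration:

    import random

    def deterministic_effort(size_kloc):
        """Same input always yields the same predicted effort (person-months)."""
        return 2.4 * size_kloc ** 1.05   # basic COCOMO, organic mode

    def stochastic_effort(size_kloc):
        """Prediction varies probabilistically around the deterministic model."""
        mean = deterministic_effort(size_kloc)
        return random.gauss(mean, 0.2 * mean)   # assumed 20% spread

    print(deterministic_effort(10.0))   # identical on every call
    print(stochastic_effort(10.0))      # differs from call to call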


Table 12.3. Results of comparing prediction models. (Lanubile 1996)
(Columns: predictive validity; proportions of false negatives, false positives, and false classifications; completeness; overall inspection; wasted inspection.)

Modeling technique                                      Validity   False neg.   False pos.   False class.   Completeness   Overall   Wasted
Discriminant analysis                                   p=0.621    28%          26%          54%            42%            46%       56%
Principal component analysis + discriminant analysis    p=0.408    15%          41%          56%            68%            74%       55%
Logistic regression                                     p=0.491    28%          28%          56%            42%            49%       58%
Principal component analysis + logistic regression      p=0.184    13%          46%          59%            74%            82%       56%
Logical classification model                            p=0.643    26%          21%          46%            47%            44%       47%
Layered neural network                                  p=0.421    28%          28%          56%            42%            49%       58%
Holographic network                                     p=0.634    26%          28%          54%            47%            51%       55%
Heads or tails?                                         p=1.000    25%          25%          50%            50%            50%       50%
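One common reading of the proportion measures: treat each model as a binary classifier of components as high-risk (inspect) or low-risk, and compute the measures from the resulting confusion matrix. A sketch under that reading (the exact definitions in Lanubile's study may differ):

    def inspection_measures(tp, fp, fn, tn):
        """tp: flagged and actually faulty; fp: flagged but clean;
        fn: missed faulty; tn: correctly left alone."""
        n = tp + fp + fn + tn
        return {
            "false negatives": fn / n,            # faulty components missed
            "false positives": fp / n,            # clean components flagged
            "false classifications": (fn + fp) / n,
            "completeness": tp / (tp + fn),       # share of faulty components caught
            "overall inspection": (tp + fp) / n,  # share of components inspected
            "wasted inspection": fp / (tp + fp),  # inspected components that were clean
        }

    # "Heads or tails": random guessing with half the components faulty
    # reproduces the last row of Table 12.3 (25%/25%/50%/50%/50%/50%).
    print(inspection_measures(tp=25, fp=25, fn=25, tn=25))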

Table 12.4. ISO 9126 quality characteristics.

Quality characteristic   Definition
Functionality     A set of attributes that bear on the existence of a set of functions and their specified properties. The functions are those that satisfy stated or implied needs.
Reliability       A set of attributes that bear on the capability of software to maintain its performance level under stated conditions for a stated period of time.
Usability         A set of attributes that bear on the effort needed for use and on the individual assessment of such use by a stated or implied set of users.
Efficiency        A set of attributes that bear on the relationship between the software's performance and the amount of resources used under stated conditions.
Maintainability   A set of attributes that bear on the effort needed to make specified modifications (which may include corrections, improvements, or adaptations of software to environmental changes and changes in the requirements and functional specifications).
Portability       A set of attributes that bear on the ability of software to be transferred from one environment to another (including the organizational, hardware or software environment).

Dromey quality model

■ Identify a set of high-level quality attributes.
■ Identify product components.
■ Identify and classify the most significant, tangible, quality-carrying properties for each component.
■ Propose a set of axioms for linking product properties to quality attributes.
■ Evaluate the model, identify its weaknesses, and refine or recreate it.


Table 12.5. Quantitative targets for managing US defense projects. (NetFocus 1995)

Item                                                       Target                         Malpractice level
Fault removal efficiency                                   > 95%                          < 70%
Original fault density                                     < 4 per function point         > 7 per function point
Slip or cost overrun in excess of risk reserve             0%                             > 10%
Total requirements creep (function points or equivalent)   < 1% per month average         > 50%
Total program documentation                                < 3 pages per function point   > 6 pages per function point
Staff turnover                                             1 to 3% per year               > 5% per year

Software reuse

■ Producer reuse: creating components for someone else to use
■ Consumer reuse: using components developed for some other product
■ Black-box reuse: using a component without modification
■ Clear- or white-box reuse: modifying a component before reusing it

More on reuse

■ Compositional reuse: uses components as building blocks; development is done from the bottom up
■ Generative reuse: components designed specifically for a domain; design is top-down
■ Domain analysis: identifies areas of commonality that make a domain ripe for reuse


Table 12.6. Aspects of reuse. (adapted from Prieto-Díaz 1993)

Substance:   ideas and concepts; artifacts and components; procedures, skills, experience
Scope:       vertical; horizontal
Mode:        planned and systematic; ad hoc, opportunistic
Technique:   compositional; generative
Intention:   black-box, as is; clear-box, modified
Product:     source code; design; requirements; patterns; objects; architecture; data; processes; documentation; tests

Reuse lessons

■ Reuse goals should be measurable.
■ Management should resolve reuse goals early.
■ Different perspectives may generate different questions about reuse.
■ Every organization must decide at what level to answer reuse questions.
■ Integrate the reuse process into the development process.
■ Let your business goals suggest what to measure.

Postmortem analysis

■ Design and promulgate a project survey to collect relevant data.
■ Collect objective project information.
■ Conduct a debriefing meeting.
■ Conduct a project history day.
■ Publish the results by focusing on lessons learned.


Table 12.9. When post-implementation evaluation is done.

Time period                    Percentage of respondents (of 92 organizations)
Just before delivery           27.8%
At delivery                    4.2%
One month after delivery       22.2%
Two months after delivery      6.9%
Three months after delivery    18.1%
Four months after delivery     1.4%
Five months after delivery     1.4%
Six months after delivery      13.9%
Twelve months after delivery   4.2%

Table 12.10. Required questions for level 1 of process maturity model.

Question number   Question
1.1.3    Does the Software Quality Assurance function have a management reporting channel separate from the software development project management?
1.1.6    Is there a software configuration control function for each project that involves software development?
2.1.3    Is a formal process used in the management review of each software development prior to making contractual commitments?
2.1.14   Is a formal procedure used to make estimates of software size?
2.1.15   Is a formal procedure used to produce software development schedules?
2.1.16   Are formal procedures applied to estimating software development cost?
2.2.2    Are profiles of software size maintained for each software configuration item over time?
2.2.4    Are statistics on software code and test errors gathered?
2.4.1    Does senior management have a mechanism for the regular review of the status of software development projects?
2.4.7    Do software development first-line managers sign off on their schedule and cost estimates?
2.4.9    Is a mechanism used for controlling changes to the software requirements?
2.4.17   Is a mechanism used for controlling changes to the code?

Table 12.11. Key process areas in the CMM. (Paulk et al. 1993)

CMM level    Key process areas
Initial      none
Repeatable   Requirements management; software project planning; software project tracking and oversight; software subcontract management; software quality assurance; software configuration management
Defined      Organization process focus; organization process definition; training program; integrated software management; software product engineering; intergroup coordination; peer reviews
Managed      Quantitative process management; software quality management
Optimizing   Fault prevention; technology change management; process change management


Evaluation of CMM conformance

■ Commitment to perform
■ Ability to perform
■ Activities performed
■ Measurement and analysis
■ Verifying implementation

SPICE

■ Base practices (essential activities) and generic practices (institutionalize or implement the process)
■ Ratings in terms of:
  ■ customer-supplier processes
  ■ engineering processes
  ■ project processes
  ■ support processes
  ■ organizational processes

Table 12.12. ISO 9001 clauses.

Clause number   Subject matter
4.1    Management responsibility
4.2    Quality system
4.3    Contract review
4.4    Design control
4.5    Document and data control
4.6    Purchasing
4.7    Control of customer-supplied product
4.8    Product identification and traceability
4.9    Process control
4.10   Inspection and testing
4.11   Control of inspection, measuring and test equipment
4.12   Inspection and test status
4.13   Control of nonconforming product
4.14   Corrective and preventive action
4.15   Handling, storage, packaging, preservation and delivery
4.16   Control of quality records
4.17   Internal quality audits
4.18   Training
4.19   Servicing
4.20   Statistical techniques


Table 12.13. People capability maturity model. (Curtis, Hefley and Miller 1995)

Level 5 (optimizing). Focus: continuous knowledge and skills improvement. Key practices: continuous workforce innovation; coaching; personal competency development.
Level 4 (managed). Focus: effectiveness measured and managed, high-performance teams developed. Key practices: organizational performance alignment; organizational competency management; team-based practices; team-building; mentoring.
Level 3 (defined). Focus: competency-based workforce practices. Key practices: participatory culture; competency-based practices; career development; competency development; workforce planning; knowledge and skills analysis.
Level 2 (repeatable). Focus: management takes responsibility for managing its people. Key practices: compensation; training; performance management; staffing; communication; work environment.
Level 1 (initial). No key practices.

Process includes:

■ all major process activities
■ resources used, subject to a set of constraints (such as schedule)
■ intermediate and final products
■ subprocesses, with hierarchy or links
■ entry and exit criteria for each activity
■ sequence of activities, so timing is clear
■ guiding principles, including the goals of each activity
■ constraints for each activity, resource or product
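These elements map naturally onto a small data model. A minimal illustrative sketch in Python (the class design and all names are invented for illustration, not taken from the text):

    from dataclasses import dataclass, field

    @dataclass
    class Activity:
        """One major process activity, carrying the elements listed above."""
        name: str
        goal: str                                   # guiding principle
        entry_criteria: list                        # must hold before the activity starts
        exit_criteria: list                         # must hold before it completes
        resources: list = field(default_factory=list)
        constraints: list = field(default_factory=list)    # e.g. schedule
        products: list = field(default_factory=list)       # intermediate and final
        subactivities: list = field(default_factory=list)  # subprocess hierarchy

    @dataclass
    class Process:
        name: str
        activities: list                            # kept in sequence, so timing is clear

    design = Activity(
        name="High-level design",
        goal="Agree on a system architecture",
        entry_criteria=["requirements baselined"],
        exit_criteria=["design reviewed and approved"],
        products=["architecture description"],
    )
    process = Process(name="Waterfall fragment", activities=[design])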

Reasons for modeling a process

■ To form a common understanding
■ To find inconsistencies, redundancies, omissions
■ To find and evaluate appropriate activities for reaching the process goal
■ To tailor a general process for the particular situation in which it will be used


Examples of process models

■ Waterfall model
■ Prototyping
■ V-model
■ Operational specification
■ Transformational model
■ Phased development: increments and iteration
■ Spiral model

Tools and techniques for process modeling

■ Example: Lai notation
  ■ activity
  ■ sequence
  ■ process model
  ■ resource
  ■ control
  ■ policy
  ■ organization

Table 2.1. Artifact definition form for artifact “car” (Lai 1991).

Name              Car
Synopsis          This is the artifact that represents a class of cars.
Complexity type   Composite
Data type         (car_c, user-defined)

Artifact-state list:
parked       ((state_of(car.engine) = off)
              (state_of(car.gear) = park)
              (state_of(car.speed) = stand))
             Car is not moving, and engine is not running.
initiated    ((state_of(car.engine) = on)
              (state_of(car.keyhole) = has-key)
              (state_of(car-driver(car)) = in-car)
              (state_of(car.gear) = drive)
              (state_of(car.speed) = stand))
             Car is not moving, but the engine is running.
moving       ((state_of(car.engine) = on)
              (state_of(car.keyhole) = has-key)
              (state_of(car-driver(car)) = driving)
              ((state_of(car.gear) = drive) or (state_of(car.gear) = reverse))
              ((state_of(car.speed) = stand) or (state_of(car.speed) = slow)
               or (state_of(car.speed) = medium) or (state_of(car.speed) = high)))
             Car is moving forward or backward.

Sub-artifact list:
doors     The four doors of a car.
engine    The engine of a car.
keyhole   The ignition keyhole of a car.
gear      The gear of a car.
speed     The speed of a car.

Relations list:
car-key      This is the relation between a car and a key.
car-driver   This is the relation between a car and a driver.
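The artifact-state predicates translate directly into executable checks. A minimal sketch in Python (the class design is invented for illustration; attribute names follow the sub-artifact list in Table 2.1):

    from dataclasses import dataclass

    @dataclass
    class Car:
        engine: str = "off"      # off | on
        gear: str = "park"       # park | drive | reverse
        speed: str = "stand"     # stand | slow | medium | high
        keyhole: str = "empty"   # empty | has-key
        driver: str = "absent"   # absent | in-car | driving (the car-driver relation)

        # Each artifact state is a predicate over the sub-artifacts' states.
        def is_parked(self):
            return self.engine == "off" and self.gear == "park" and self.speed == "stand"

        def is_initiated(self):
            return (self.engine == "on" and self.keyhole == "has-key"
                    and self.driver == "in-car" and self.gear == "drive"
                    and self.speed == "stand")

        def is_moving(self):
            return (self.engine == "on" and self.keyhole == "has-key"
                    and self.driver == "driving"
                    and self.gear in ("drive", "reverse")
                    and self.speed in ("stand", "slow", "medium", "high"))

    car = Car()
    assert car.is_parked() and not car.is_moving()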


Dynamic process models

■ Enables enaction of the process, to see what happens to resources and artifacts as activities occur
■ Simulate alternatives and make changes to improve the process
■ Example: systems dynamics model
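To give a flavor of what a systems dynamics model enacts, here is a deliberately tiny simulation sketch (the feedback equations and constants are invented for illustration, not taken from any published model): schedule pressure depresses productivity, which stretches the schedule, which raises pressure further.

    # Toy feedback loop: pressure -> lower productivity -> more pressure.
    remaining_tasks = 100.0
    staff = 5.0
    nominal_rate = 1.0       # tasks per person per week when unpressured
    deadline_weeks = 15.0

    week = 0
    while remaining_tasks > 0 and week < 60:
        weeks_left = max(deadline_weeks - week, 1.0)
        needed_rate = remaining_tasks / weeks_left
        pressure = needed_rate / (staff * nominal_rate)
        # Assumed response curve: productivity degrades once pressure exceeds 1.
        productivity = nominal_rate / max(pressure, 1.0)
        remaining_tasks -= staff * productivity
        week += 1

    if remaining_tasks <= 0:
        print(f"finished in {week} weeks (deadline was {deadline_weeks:.0f})")
    else:
        print(f"still {remaining_tasks:.0f} tasks left after {week} weeks")

Rerunning with alternatives (more staff, a later deadline, a gentler response curve) is exactly the kind of what-if exploration a dynamic process model supports.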

Marvel process language

■ Three constructs: classes, rules, tool envelopes
■ Three-part process description:
  ■ rule-based specification of process behavior
  ■ object-oriented definition of the model's information process
  ■ set of envelopes to interface between Marvel and external software tools

Marvel example

Class definition for trouble tickets:

    TICKET :: superclass ENTITY;
        status : (initial, open, referred_out, referral_done, closed, fixed) = initial;
        diagnostics : (terminal, non_terminal, none) = none;
        level : integer;
        description : text;
        referred_to : link WORKCENTER;
        referrals : set_of link TICKET;
        process : link PROC_INST;
    end

Rule for diagnosing a ticket:

    diagnose [?t: TICKET]:
        (exists PROC_INST ?p suchthat (linkto [?t.process ?p])):
        (and (?t.status = open)
             (?t.diagnostics = none))
        {TICKET_UTIL diagnose ?t.Name}
        (and (?t.diagnostics = terminal)
             (?p.last_task = diagnose)
             (?p.next_task = refer_to_WC3));
        (and (?t.diagnostics = non_terminal)
             (?p.last_task = diagnose)
             (?p.next_task = refer_to_WC2));
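Reading the rule: the condition before the braced clause must hold for the rule to fire, the braced clause invokes an external tool through an envelope (here TICKET_UTIL), and the clauses after it are the alternative effects asserted depending on the tool's outcome. A minimal sketch of that condition/activity/effects cycle in Python (the function names and the stand-in tool are invented; this is not Marvel's implementation):

    def diagnose_tool(ticket):
        """Stand-in for the TICKET_UTIL envelope; returns the tool's verdict."""
        return "terminal" if ticket["level"] <= 1 else "non_terminal"

    def fire_diagnose(ticket, proc):
        """Condition -> activity -> one of several effects, as in the rule above."""
        if not (ticket["status"] == "open" and ticket["diagnostics"] == "none"):
            return False                    # condition fails: rule does not fire
        verdict = diagnose_tool(ticket)     # activity: call the external tool
        ticket["diagnostics"] = verdict     # effect: assert the matching clause
        proc["last_task"] = "diagnose"
        proc["next_task"] = "refer_to_WC3" if verdict == "terminal" else "refer_to_WC2"
        return True

    ticket = {"status": "open", "diagnostics": "none", "level": 1}
    proc = {"last_task": None, "next_task": None}
    fire_diagnose(ticket, proc)             # proc["next_task"] becomes "refer_to_WC3"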


Desirable properties of process modeling tools and techniques

■ Facilitates human understanding and communication
■ Supports process improvement
■ Supports process management
■ Provides automated guidance in performing the process
■ Supports automated process execution

Information system example

Table 2.2. Artifact definition form for artifact "risk."

Name              Risk (problem X)
Synopsis          This is the artifact that represents the risk that problem X will occur and have a negative effect on some aspect of the development process.
Complexity type   Composite
Data type         (risk_s, user_defined)

Artifact-state list:
low           ((state_of(probability.x) = low)
               (state_of(severity.x) = small))
              Probability of the problem is low; severity of the problem's impact is small.
high-medium   ((state_of(probability.x) = low)
               (state_of(severity.x) = large))
              Probability of the problem is low; severity of the problem's impact is large.
low-medium    ((state_of(probability.x) = high)
               (state_of(severity.x) = small))
              Probability of the problem is high; severity of the problem's impact is small.
high          ((state_of(probability.x) = high)
               (state_of(severity.x) = large))
              Probability of the problem is high; severity of the problem's impact is large.

Sub-artifact list:
probability.x   The probability that problem X will occur.
severity.x      The severity of the impact on the project should problem X occur.
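As with the car artifact, the risk states are simple predicates over the sub-artifacts. A minimal sketch in Python (the function is invented for illustration; the mapping follows the artifact-state list above):

    def risk_state(probability, severity):
        """Map problem X's (probability, severity) to the risk artifact's state."""
        states = {
            ("low", "small"): "low",
            ("low", "large"): "high-medium",
            ("high", "small"): "low-medium",
            ("high", "large"): "high",
        }
        return states[(probability, severity)]

    assert risk_state("high", "large") == "high"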