Test Process Improvement (Transcript)
© 1998 Gerrard Consulting, Version 2.0
Test Process Improvement
Paul Gerrard
Gerrard Consulting Limited
© 2000 Gerrard Consulting Ltd Slide 1
Gerrard Consulting Limited, http://gerrardconsulting.com
Agenda
Introduction
What Can Be Improved?
Test Methodology
Test Organisation Maturity (TOM) and Testing Assessment
Implementation.
Introduction
Is testing a strategic issue?
Testing, in its broadest sense, comprises more than 50% of your development cost and has influence over all these risks
Software development productivity is increasing; testing productivity is decreasing
Tester psychology is at odds with developer and most management mentality
Demand for, and appreciation of, testing and testers is increasing.
Testing perceptions
Testing is innate - anyone can do it
– everyone tests, but it takes a good tester to find bugs in good code
– anyone can find bugs in poor quality code
Tester psychology is counter-intuitive, destructive, all wrong
Good testers are awkward employees
Some code doesn’t need testing
Testing doesn’t add value.
Testing perceptions (2)
Software is getting easier to test
Software quality is improving
Testing has little value - destructive, pointless
Testing is random and cannot be systematised
– test techniques are prone to mathematical treatment
– development is still a craft, in comparison.
Some of the problems to be addressed…
Too many errors being found in production or being reported by customers
Testing is taking too long, and delaying delivery
Testing is too expensive
Difficulty in finding volunteers to test
Testers don’t see a career path in your company.
What Can Be Improved?
Testing know-how
Training is the quick-win stream of work
Focus on principles, process and techniques
Training and qualification schemes are emerging (ISEB scheme in the UK):
– grades of certification
– industry-relevant syllabus
– accredited training organisations, courses and instructors
– examination schemes.
Standards and procedures
Serve four main purposes:
– a consistent baseline for the quality of testing across the organisation
– alignment with your development and deployment processes
– appropriate use of industry best practices
– a shortcut to using the best, most appropriate techniques.
Organisation
Streamlining testing usually implies fostering closer liaisons between
– projects and software users
– developers, maintainers and support groups
A common improvement is to create an independent test group
– set up to offer testing services to other departments and projects.
Environment
Existing test environments are often inadequate and need improvement
– scale - too small to run some tests
– control - chaos is a barrier to systematic testing
– isolation - some tests too risky to implement
– dedication - shared environments are problematic
Tools can make testers more productive
– the only realistic way to regression test.
Improvement investment
All improvements require investment
Pay back by ensuring resources are focused
Benefits vary with type of improvement and by installation
– e.g. one site found that ‘...[There was] a seven-fold pay-back for the cost of the training over a three month period.’
Test Methodology
Typical test strategy
[Diagram: test stages - Ad hoc, Unit, Integration, System and Acceptance testing]
V model: waterfall and locks
[Diagram: V model - Requirements paired with Acceptance test, Logical Design with System test, Physical Design with Integration test, then Code and test]
Typical test practice
[Diagram: the same test stages - Ad hoc, Unit, Integration, System, Acceptance - as practised, with a stage crossed out]
Economics of errors
[Graph: Time/Cost of errors, from Development through to Live Running]
Front-loading
The principle is to start testing early
Reviews, walkthroughs and inspections of documents during the definition stages are examples of early tests
Start preparing test cases early. Test case preparation “tests” the document on which the cases are based
Preparing the user manual tests the requirements and design.
Front-loading advantages
Requirements, specification and design errors are detected earlier and are therefore less costly
Requirements are more accurately captured
Test cases are a useful input to designers and programmers
Spreads the workload of test preparation over the whole project.
Early test case preparation
[Diagram: V model with ‘Prepare tests’ activities starting at Requirements (acceptance tests), Logical Design (system tests), Physical Design (integration tests) and Code]
Testing throughout the life cycle: the W model
[Diagram: W model - Write Requirements, Logical Design, Physical Design and Code on the left leg; Test the Requirements, Test the Design, Test the Specs in parallel; Unit Test, then Build Software with Integration Test, Build System with System Test, and Install with Acceptance Test on the right leg]
Static test techniques
[Diagram: W model annotated with static techniques - Requirements Animation, Behaviour Analysis, Scenario Walkthroughs, Requirements Reviews, Inspections, Early Test Case Preparation and Static Analysis]
Dynamic test techniques
[Diagram: W model annotated with dynamic techniques - Features Testing; Performance, Volume and Stress Testing; Restart and Recovery Testing; Installation Testing; Security Testing; Multi-user Testing; Path Testing; Loop Testing; Input Validation Testing; Equivalence Partitioning; Boundary Value Analysis; Cause-effect Graphing; Transaction Flows]
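Two of the techniques named above, equivalence partitioning and boundary value analysis, can be sketched in a few lines. The validator and its 18-65 limits below are hypothetical examples chosen for illustration, not anything from the transcript:

```python
# Equivalence partitioning and boundary value analysis, sketched
# against a hypothetical validator that accepts ages 18-65 inclusive.
# The function and its limits are illustrative only.

def validate_age(age: int) -> bool:
    """Accept ages from 18 to 65 inclusive."""
    return 18 <= age <= 65

# Equivalence partitioning: one representative value per partition
# (below the range, inside it, above it).
partition_cases = [(10, False), (40, True), (70, False)]

# Boundary value analysis: values on and just outside each boundary.
boundary_cases = [(17, False), (18, True), (65, True), (66, False)]

for value, expected in partition_cases + boundary_cases:
    assert validate_age(value) == expected, f"unexpected result for {value}"
print("7 cases pass")
```

Seven systematically chosen cases stand in for the whole input space; that economy is what makes formal test design techniques teachable and repeatable.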
Test Process Assessment
Information gathering
Need to determine:
– where you are today
– where you want to go tomorrow
Data gathering using interviews
– managers, developers, testers, users
– checklist-type and open-ended questions
Other research
– examination of project and test plans, test records, fault reports.
The challenges of testing improvement
Major barriers are organisational and personal, not technical
– changing management perceptions so they support testing and improvement
– overcoming management and practitioner resistance to change
– design and implementation of workable processes and management controls
People, not tools, implement test strategies.
Problems or symptoms?
Need to separate problems from symptoms:
– management doesn't understand the objectives of testing
– the cost of testing is high but difficult to pin down
– developers, testers and users may never have been trained
– the quality of the product delivered into testing is poor, so it takes longer to system test.
Improvement mix
A mix of improvements is most likely to be required:
– management awareness
– tester training
– improved definition of the test stages and their objectives
– measurement of the quality of the product at each stage
– and so on.
Staged improvements
Not all improvements are a good idea straight away
– some improvements are expensive
– some save time, but bring dramatic change
– some improve the quality of the testing, but take longer to implement
Very few improvements save time, improve quality, cause minimal change and pay back after two weeks.
TOM™
Recommended improvements must take account of broader organisational objectives, constraints and priorities
Test Organisation Maturity model.
Test Organisation Maturity (TOM) and Testing Assessment
Process maturity
Process maturity is a product-independent measure of an organisation’s capability
Framework for process improvements
What are the most sensible things to improve next?
Assessment: where are we now?
Monitoring: are we there yet?
Capability Maturity Model (CMM)
CMM for software widely adopted
Used to assess capability and identify improvements
Maturity 'levels' define a graduated scale and provide a roadmap of improvements
CMM represents a high-level, process-oriented description of the capability of a development organisation.
CMM (2)
Heritage of the CMM is large, long term, defence projects in the US
SEI agenda remains high-integrity, but has little to say about testing
Relevance to commercial IT organisations is often tenuous
Most organisations are at level 1 (0, -1…?)
Whose process is it anyway?
Several testing maturity models exist
Test Process Improvement (TPI) Model, Koomen and Pol (Holland)
– 20 key areas scored, with improvement suggestions
Testability Maturity Model, Gelperin (USA)
– 20 key areas, scored between 1 and 2
Testing Maturity Model, Burnstein, Suwannasat and Carlson (USA)
– aligned with the CMM
Problems with existing models
Remedy-oriented, not problem-oriented
– here are the pills; which are you taking?
– here’s the solution; does it solve the problem?
Little guidance on priorities or constraints
– hackers and high integrity get the same treatment
– assumes a ‘click and go’ mentality and culture
– no comparison of costs versus benefits
For process hypochondriacs - you didn’t know you had a problem until now.
TOM™
Test Organisation Maturity model - TOM™
Assessment
– self-assessment or consultant assessment
– questions based on symptoms of poor testing
– assessment score from 20 to 100
Built-in improvements model
– improvements selected, based on assessment
– improvements prioritised, based on assessment.
TOM™ assessment to action plan
Document objectives and constraints, and prioritise
Identify the testing-related problems
Which problems cause most concern?
Select improvements which best meet objectives.
[Diagram: Objectives/Constraints and Symptoms (current maturity) feed into Priorities (target maturity); together with Constraints these produce the Action Plan]
Assessment questions
Questions focus on organisational rather than technical issues
– can be answered by management or practitioners (try both and compare!)
Questions relate directly to the symptoms
– how bad is the problem? (the score)
– how much does it hurt? (the priority).
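The transcript gives the score range (20 to 100) and the score/priority pairing, but not the arithmetic that combines them. The sketch below is one plausible reading, assuming 20 symptom questions each scored 1 to 5; the ranking rule is an assumption for illustration, not the published TOM method:

```python
# Assumption: 20 symptom questions scored 1-5 each would sum to the
# 20-100 range the transcript quotes. The ranking rule (worst score
# first, highest priority breaking ties) is illustrative, not TOM's own.

def overall_score(scores):
    """Sum 20 per-symptom scores (1-5 each), giving 20..100."""
    assert len(scores) == 20 and all(1 <= s <= 5 for s in scores)
    return sum(scores)

def rank_symptoms(scores, priorities):
    """Order symptom numbers: lowest score first, then highest priority."""
    ranked = sorted(zip(range(1, 21), scores, priorities),
                    key=lambda t: (t[1], -t[2]))
    return [number for number, _, _ in ranked]

scores = [3] * 18 + [1, 5]       # symptom 19 is weakest, 20 strongest
priorities = [3] * 19 + [5]
print(overall_score(scores))                 # 60
print(rank_symptoms(scores, priorities)[0])  # 19
```

Whatever the exact formula, the point of the two dimensions survives: the score says how bad a symptom is, and the priority says how much it matters to this organisation.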
Improvement objectives and constraints
Decrease time required to test
Decrease cost of testing
Increase quality of testing (and systems)
Minimal change to current practice
Quick payback
Give each a priority of 1-5 (low-high).
Example assessment symptom
There are gaps in the testing - features of the system may be released untested
Score 1: Tests are not based on requirements or design documents; there are no test inventories or means of measuring coverage against requirements or specifications
Score 3: Test inventories are used to define the scope of system and acceptance tests and cross-reference requirements; formal test techniques are sometimes used to design black-box test cases
Score 5: Test inventories are used for all testing and are reviewed against requirements and specifications; formal test techniques are used for test case design; tools are used to measure code coverage
Example assessment symptom (2)
Of the errors that are found, there is a perception (based on evidence) that many should have been found in earlier test stage(s).
Score 1: Errors are found in acceptance tests which should have been found in sub-system and system tests
Score 3: Errors are found in system tests which should have been found in sub-system tests
Score 5: Errors found would not be expected to have been detected earlier
Potential improvements
Train developers and system testers in testing
Improve test design by adopting techniques
Involve users in the definition of system tests
Motivate developers to do better testing
Separate code and test activities in plans.
Improvement: train developers, system testers in testing

Objective/Constraint                Score  Description
Decrease time required to test      0      Increase dev. test, decrease sys. test
Decrease cost of testing            0      No change
Increase quality of testing         +1     Yes
(and systems)
Minimal change to current practice  -1     Change likely
Quick payback                       +1     Yes
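Combining the 1-5 objective priorities from the earlier slide with per-improvement scores like those in the table gives one way to compare candidate improvements. The weighting formula and the example priority values below are assumptions for illustration; the transcript does not prescribe them:

```python
# Illustrative only: objectives carry priorities of 1-5 (per the
# transcript) and each improvement scores -1, 0 or +1 against every
# objective (as in the training example above). Weighting score by
# priority is an assumed comparison rule; the priorities are invented.

priorities = {
    "decrease test time": 4,
    "decrease test cost": 3,
    "increase test quality": 5,
    "minimal change": 2,
    "quick payback": 4,
}

# Scores for "train developers, system testers in testing".
training = {
    "decrease test time": 0,
    "decrease test cost": 0,
    "increase test quality": +1,
    "minimal change": -1,
    "quick payback": +1,
}

def weighted_total(scores, priorities):
    """Sum of score x priority across all objectives/constraints."""
    return sum(scores[name] * priorities[name] for name in priorities)

print(weighted_total(training, priorities))  # 0 + 0 + 5 - 2 + 4 = 7
```

Repeating the calculation for each candidate improvement turns the assessment into a ranked shortlist that reflects this organisation's own priorities.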
Implementation
Pilot projects
Test the improvements and gain experience
Pilot projects should be:
– low risk, in a familiar area, low criticality
– in their earliest stages - budget yet to be set
– project-managed by someone willing and capable of managing additional responsibilities
– staffed by people willing to learn and apply new techniques, and capable of providing objective feedback.
Pilot project objectives
Get consistent and comprehensive incident and defect information for a project
– where are the bugs found?
– where should the bugs have been found?
Identify where costs are incurred in testing
Measure the amount of testing done
Reduce or eliminate duplicated tests
Reduce the frustrations of testing.
Pilot projects - preparation
The project manager needs to be briefed and measurable objectives for the pilot agreed
Project staff
– need to be aware of the purpose of ‘doing things differently’
– need training in the use of unfamiliar processes, techniques and tools
Need to provide support to the project manager and staff involved.
Reviewing the results of the pilot
If the pilot objectives were met and the changes accepted, then the changes are ready for roll-out to other projects
Otherwise, the conduct of the pilot should be investigated:
– were the objectives too ambitious?
– could the changes ever have delivered the benefits anticipated?
Roll-out - refinement and full implementation
Refinement and re-issue of materials based on pilot experiences
Internal publicity of the results of the pilot and the imminent roll-out of the changes
Schedule and conduct training
Nominate members of the pilot project and implementation team as consultants to other projects.
Post-implementation review
Were benefits achieved? If not, why not?
If you exceeded expectations, celebrate!
Are there new opportunities for additional improvements?
Are additional refinements required?
What would you do differently? The same?
Is testing now providing the information needed for an informed decision to release?
TOM assessment forms can be downloaded and completed at gerrardconsulting.com