Software Test Management Overview for Managers
DESCRIPTION
Software test management presentation given to the senior management of several Fortune 100 companies to aid them in planning their software development management efforts.

TRANSCRIPT
Test Concepts and Practices
From Concept to Deliverable
Presented by
T. James LeDoux,
Test Management Consultant
An Executive-level overview of quality assurance and test management practices and considerations
Topics
• Differences between QA and Test
• Goals of Testing
• Phases / Levels of Testing
• Types of Testing
• Basic Concepts of Test Planning and Creation
• Pass/Fail Criteria (Discussion)
• Traps and Pitfalls (Discussion)
• Successful Completion (Discussion)
Differences Between QA and Test
• QA
– Focused on managing the organization-wide process for acceptance of an upgrade
– Targets the affected system
– Performs by developing management processes
– Involves a Quality Manager, Quality Director, or QA Manager

• Test
– Focused on validating the level of confidence in an upgrade
– Targets a Release
– Performs from a planned script
– Involves a Test Engineer or Test Manager
Goals of Testing
• Increase the quality of the upgrade (Quality = Confidence)
• Reduce risks to the system or business
• Stabilize the system
• Meet the needs of the users
• Determine the capability of the system
• Define monitoring requirements in production
Phases / Levels of Testing

[Diagram: Requirements feed the Test Plan and Test Scenarios, which drive the test levels: Unit Tests, String Tests, Integration Tests, System Tests, Performance Tests, UAT Tests, and Staging Tests.]
Types of Testing
• Functional
• Security
• User
• Exception
• Etc.
System Tests

[Diagram: within the level sequence Unit Tests, String Tests, System Tests, Integration Tests, the System Tests break down into Functional Tests, Security Tests, and Exception Tests, spanning both Developer System Testing and Test Group System Testing.]
Performance Tests

[Diagram: within the level sequence Performance Tests, System Tests, UAT Tests, Staging Tests, the Performance Tests break down into Throughput Tests, Benchmark Tests, Stress Tests, and Load Tests.]
User Tests

[Diagram: within the level sequence Performance Tests, System Tests, UAT Tests, Staging Tests, the User Tests sit at the UAT level and break down into User Acceptance Tests and Business Exception Tests.]
Deployment Tests

[Diagram: within the level sequence Performance Tests, System Tests, UAT Tests, Staging Tests, the Deployment Tests sit at the Staging level and break down into Deployment Tests and Release Notes Tests.]
The Test Harness
• Acts as the initiator of the object’s task
• Typically structured with three parts:
– Input data and form
– Setup process
– Output data handler
• Must also be capable of creating and determining exception responses
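The three-part structure above can be sketched in code. This is a minimal illustration, not an implementation from the slides; the class and hook names are invented for the example.

```python
class TestHarness:
    """Initiates the task of the object under test and judges its responses."""

    def __init__(self, target, cases):
        self.target = target    # the callable under test
        self.cases = cases      # part 1: input data and form, as (args, expected) pairs
        self.results = []

    def setup(self):
        """Part 2: setup process, putting the harness in a known starting state."""
        self.results = []

    def handle_output(self, args, verdict):
        """Part 3: output data handler, recording each pass/fail verdict."""
        self.results.append((args, verdict))

    def run(self):
        self.setup()
        for args, expected in self.cases:
            try:
                outcome = self.target(*args)
            except Exception as exc:
                # The harness must also create and detect exception responses;
                # here an expected exception is named by its type.
                outcome = type(exc).__name__
            self.handle_output(args, "P" if outcome == expected else "F")
        return self.results

# Usage: exercise a trivial adder, including a deliberate exception case.
harness = TestHarness(lambda a, b: a + b,
                      [((1, 2), 3), ((2, 2), 5), (("x", 1), "TypeError")])
print(harness.run())   # [((1, 2), 'P'), ((2, 2), 'F'), (('x', 1), 'P')]
```

The same skeleton scales from a single method to a subsystem: only the target callable and the case table change.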
Test Harness Example
Arithmetic Adder: A + B = C
Method: ARADD (A, B, C)

Iteration   A     B     Expected C   P/F
1           1     1     2            P
2           2     3     5            P
3           4     7     11           P
4           5     8     13           P
5           7     12    19           P
Except 6    21    K     Error        F
Except 7    S     8     Error        F
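A harness for the adder example can be driven directly from a table like the one above. The sketch below is illustrative: the adder function and verdict logic are assumptions (here an iteration passes when its expected value, including an expected error, is actually observed).

```python
def aradd(a, b):
    """Stand-in for the ARADD method under test: C = A + B.

    Non-numeric input (the exception iterations) raises ValueError."""
    return int(a) + int(b)

# (iteration, A, B, expected C) rows; "Error" marks iterations whose
# inputs are expected to raise an exception.
TABLE = [
    ("1", "1", "1", 2),
    ("2", "2", "3", 5),
    ("3", "4", "7", 11),
    ("4", "5", "8", 13),
    ("5", "7", "12", 19),
    ("Except 6", "21", "K", "Error"),
    ("Except 7", "S", "8", "Error"),
]

def run_harness():
    results = []
    for iteration, a, b, expected in TABLE:
        try:
            actual = aradd(a, b)
            results.append((iteration, "P" if actual == expected else "F"))
        except ValueError:
            # Exception iterations pass when the expected error occurs.
            results.append((iteration, "P" if expected == "Error" else "F"))
    return results

if __name__ == "__main__":
    for iteration, verdict in run_harness():
        print(iteration, verdict)
```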
Test Plan, Page 1

1. Introduction

1.1 Purpose
Describes the purpose. Example: One of the primary goals of the Web To Compass project at Compassion International is to streamline the process of getting a web transaction from the web site (Compassion.com) to the Compass database. Currently, the web site and Compass are disconnected, and a series of manual and semi-automated processes are in place to facilitate this work.

1.2 Goals
Describes the goals of the testing. Example: This Test Plan for the Compassion W2C Release supports the following goals:
1. Defines the activities required to prepare for and conduct Unit, System, Load, and Performance Testing.
2. Communicates the testing strategy to all responsible parties.

1.3 Test Scope
Describes the scope of testing. Example: Testing encompasses System, Load, and Regression testing under a manual test process.

1.4 System Overview
Describes the system layout, along with a diagram.
Example of a System Overview Diagram
[Diagram: inside the firewall, NLB-fronted IIS/ASP web servers, an App Server running BizTalk Messaging and Tracking with a WorkflowExceptionHandler and Message Broker, and a SQL Server 2000 instance (with MSMQ support) hosting the Exception Workflow, BizTalk Messaging, and BizTalk Tracking databases. A Test Controller (TC) driving Virtual Users (VU) sits external to the PreProd system; the test system runs Functional Test and Database Restore operations against the databases, and System and Load Tests against the web tier.]
Test Plan, Page 2

2.0 Test Strategy Overview (from the Test Strategy document)
2.1 Purpose
2.2 Approach
2.3 Validation
Test Plan, Page 3

3.0 Test Levels

3.1 Unit/String Testing
• Unit/String Testing Environment
• Test Data
• Installation, Verification and Control
• Unit/String Tests to be Performed
• Unit/String Test Logging and Reporting

3.2 Integration Testing
• Integration Environment
• Test Data
• Installation, Verification and Control
• Test Team Logistics
• Logging and Reporting

Test Plan, Page 4

3.3 System Test
• System Test Environment
• Test Data
• Installation, Verification and Control
• Test Team Logistics
• System Tests to be Performed
• System Test Logging and Reporting
• Defect Management
• Quality Requirements and Metrics (see next slide)
• Roles and Responsibilities

3.4 Performance Test
Quality Requirements and Metrics Section
Entrance Criteria
• Test Plan complete
o Test Cases complete
o Test Procedures complete
o Test Scenarios complete
o Execution Schedule complete
o Test data acquired
o User names, passwords and access rights issued
• Test environment ready

Exit Criteria
• Testing completed
• Test results complete
• Screen captures complete
• Test logs up to date
• Defects identified and reported or fixed
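Criteria checklists like these lend themselves to a simple gate check: the phase may start (or end) only when every criterion is satisfied. The sketch below is illustrative; the criterion names and their status values are assumptions for the example.

```python
# Entrance checklist modeled as criterion -> satisfied? (statuses are invented)
ENTRANCE_CRITERIA = {
    "test plan complete": True,
    "test cases complete": True,
    "test procedures complete": True,
    "test scenarios complete": True,
    "execution schedule complete": True,
    "test data acquired": True,
    "access rights issued": False,   # still waiting on user accounts
    "test environment ready": True,
}

def gate(criteria):
    """Return (ready, blockers): ready only when every criterion is met."""
    blockers = sorted(name for name, met in criteria.items() if not met)
    return (not blockers, blockers)

ready, blockers = gate(ENTRANCE_CRITERIA)
print(ready)      # False until every entrance criterion is satisfied
print(blockers)   # ['access rights issued']
```

The same function serves as the exit gate by passing it the exit checklist instead.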
Test Plan, Page 5

4.0 Test Identification and Definition
• Requirements Testability Matrix
• Functional Scenarios
• Exception Scenarios
• End-To-End

4.1 Test Scenarios
• Functional Test Cases
• Test Cases not associated with a Use Case
• Exception Tests
• End-To-End
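A Requirements Testability Matrix can be kept as plain data: each requirement maps to the test cases that cover it, and a gap report falls out for free. The requirement IDs below are hypothetical (only the ABC1-TC100.0 case name appears in this deck).

```python
# Hypothetical requirement IDs mapped to covering test cases.
MATRIX = {
    "REQ-001": ["ABC1-TC100.0"],             # covered by the happy-path case
    "REQ-002": ["ABC1-TC101.0", "ABC1-TC102.0"],
    "REQ-003": [],                           # no coverage yet
}

def uncovered(matrix):
    """Requirements with no associated test case."""
    return sorted(req for req, cases in matrix.items() if not cases)

print(uncovered(MATRIX))   # ['REQ-003']
```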
Test Plan, Page 6

4.2 Performance Test Scenarios
4.3 Regression Test Scenarios
4.4 Test Procedures
• Manual Test Procedures
• Automated Test Procedures

5.0 Test Schedule
• Unit/String Test Schedule
• Integration Test Schedule
• System Test Schedule
• Performance Test Schedule
Test Case Example

Procedure Name: ABC1-TC100.0 Use Case 1 Happy Path
Procedure Overview: This test covers the two steps of the UC1 happy path:
1. Tester confirms there are no web comments.
2. Tester logs into the system, navigates to the constituent screen, and enters the constituent number.
Procedure Dependencies: Records have been properly created for the constituents being pulled up.
Functional Requirements:
1. No web comments are placed in the comments fields in the database.
2. IIS is properly set up.
Procedure Test Data: Test Database Image Suite TDB100.0

Test Procedure
Conducted by __________________________________
Date Conducted ________________________
Build Label/Version

Pre-Amble Instructions: All databases need to be initialized to a known condition. This is typically done by copying the golden database images to their respective databases. Note that for Release 1, the databases will be small and the restoration of the databases will be more manual than is typical with a golden database environment.

Procedure Instructions
Steps | Passed/Fail | Data Input | Results | Comments
Setup steps
Step 1 – Tester confirms no web comments
Step 2 – Tester logs into the system

Post-Amble Instructions: Do not restore the golden database to this environment after this test. Selected constituents in the database are set up for each test scenario to reduce the need to restore the database.
Summary
• QA is organization-wide; Test is Project-wide
• QA requires that testing be performed at the development, validation and deployment phases
• The Test Strategy defines the validation approach (test, acceptance process) for the business need
• The Test Plan defines how testing is managed
• The Test Cases define how testing is executed