
INTERACTIVE SAUDI ARABIA LTD.

Test Plan Template

Test Plan

Ahmed Abdulhamid

12/2/2008

© 2008, Interactive Saudi Arabia Limited. All rights reserved. This document or any part thereof may not, without the written consent of Interactive Saudi Arabia Limited (ISA), be copied, reprinted, or reproduced in any material form, including but not limited to photocopying, transcribing, transmitting, or storing it in any medium or translating it into any language, in any form or by any means, be it electronic, mechanical, xerographic, optical, magnetic or otherwise. The information contained in this document is proprietary and confidential; all copyrights, trademarks, trade names, patents and other intellectual property rights in the documentation are the exclusive property of ISA unless otherwise specified. The information (including but not limited to data, drawings, specifications and documentation) shall not, at any time, be disclosed directly or indirectly to any third party without the prior written consent of ISA. The information contained herein is believed to be accurate and reliable. ISA accepts no responsibility for its use by any means or in any way whatsoever. The information contained herein is subject to change without notice.

<Document Name> <Project Name & Project Code>

Document Name: <Document Name>

Prepared for: <Company Name>

Author: Ahmed Abdulhamid

Pages: 23

Last Updated: 6/1/2009

<Test Plan>

<Company Name> <Project Name>

Project Code: <Client Name – 3 Letters>-<Project Short Name>


TABLE OF CONTENTS

1 Introduction ................................................................................................................ 5

1.1 Purpose .............................................................................................................. 5

1.2 Background ........................................................................................................ 5

1.3 Scope ................................................................................................................. 5

1.4 Project Team ...................................................................................................... 5

1.5 Project Identification ........................................................................................... 5

2 Requirements for Test ............................................................................................... 7

2.1 Use Cases Phase 1 ............................................................................................ 7

2.2 Use Cases Phase 2 ............................................................................................ 7

2.3 Use Case Diagram Phase 1 & 2 ........................................................................ 8

3 Approach ................................................................................................................... 9

3.1 Approach ............................................................................................................ 9

Flow Chart Phase1 ........................................................................................................ 9

Flow Chart Phase 2 ....................................................................................................... 9

3.2 Testing Tools ...................................................................................................... 9

3.3 Measurement and Metrics .................................................................................. 9

3.4 Deliverables ........................................................................................................ 9

4 Item Pass/Fail Criteria ............................................................................................. 10

The following chart defines the Interactive agreed-upon severity levels: ....................... 10

5 Test Strategy ........................................................................................................... 11

6 Test Procedures ...................................................................................................... 12

6.1 Configuration Management .............................................................................. 12

6.2 Change Control Process .................................................................................. 12

6.3 Defect Reporting .............................................................................................. 12

6.4 File Locations ................................................................................................... 13

7 Environmental Needs .............................................................................................. 14

7.1 QC Test Lab ..................................................................................................... 14

7.2 QC Desktop ...................................................................................................... 14

7.3 Staffing and Training Needs ............................................................................. 14

8 Tools ........................................................................................................................ 15

9 Resources ................................................................................................................ 16

10 Test Case Creation/Execution Estimates (Schedule) ......................................... 17

11 Roles & Responsibilities ...................................................................................... 18

12 Issues/Risks and Mitigating Factors .................................................................... 19

13 Project Milestones and Target Dates................................................................... 20

14 Reports ................................................................................................................. 21

15 Approvals ............................................................................................................. 22

16 Appendix A: Release Signoff Form ...................................................................... 23

16.1 Signoffs ............................................................................................................. 23


CHANGE HISTORY

Section | Date | Change Made


1 INTRODUCTION

1.1 Purpose
<Project Purpose Descriptions>

1.2 Background
<Project Background Descriptions>

1.3 Scope
<Project Scope Descriptions>

1.4 Project Team
Name: | Title: | Role: | Phone #:

1.5 Project Identification
The table below identifies the documentation and availability used for developing the test plan:

Document (and version / date) | Created or Available | Received or Reviewed | Author or Resource | Notes
Requirements Specification | Yes / No | Yes / No | |
Functional Specification | Yes / No | Yes / No | |
Use-Case Reports | Yes / No | Yes / No | |
Project Plan | Yes / No | Yes / No | |
Design Specifications | Yes / No | Yes / No | |
Prototype | Yes / No | Yes / No | |
User’s Manuals | Yes / No | Yes / No | |
Business Model or Flow | Yes / No | Yes / No | |
Data Model or Flow | Yes / No | Yes / No | |
Business Functions and Rules | Yes / No | Yes / No | |
Project or Business Risk Assessment | Yes / No | Yes / No | |
Architecture Diagram | Yes / No | Yes / No | |


2 REQUIREMENTS FOR TEST
The information below identifies those items (Use Cases, Functional Requirements, Non-Functional Requirements) that have been identified as targets for testing. At a high level, this represents what will be tested.

2.1 Use Cases Phase 1
1. Login (UC6) <Use Case, Functional Specifications details etc.>

2.2 Use Cases Phase 2
1. <Use Case, Functional Specifications details etc.>


2.3 Use Case Diagram Phase 1 & 2

<Image Use Case Diagram>


3 APPROACH

3.1 Approach

Flow Chart Phase1

<Visio Image>

Flow Chart Phase 2

<Visio Image>

3.2 Testing Tools
o Test Coverage documents for Requirements Traceability management
o Automated Defect Tracking Tool for defect management (BTT)
o Visual Source Safe for all test artifacts (e.g. test cases, execution log etc.)
o QTP for test automation
o <>

3.3 Measurement and Metrics
o UAT, System and Integration Test Cases
o Final Test Execution Report
o Baselined correct test results for all future tests
o Defect Log
o <>

3.4 Deliverables
Deliverables from the test team are:
o Test Plan
o Test Scenarios
o Test Data
o Test Cases
o Weekly Defect Report
o Test Execution Report
o <>


4 ITEM PASS/FAIL CRITERIA

The following chart defines the Interactive agreed-upon severity levels:

Severity Level | Description
1 – Crash | Major functionality defect or loss with no workaround, causing loss of data or application crash.
2 – Block | Any functionality defect or loss with no workaround, causing a failure to meet a specific system requirement.
3 – Major | Any functionality defect or loss causing a failure to meet a specific system requirement, but which has a workaround that is acceptable to the customer for a finite agreed period of time.
4 – Minor | Inconvenience or annoyance issues.

Defect Levels Permitted for Release Promotion from Validation to UAT:
Priority one: zero maximum. Priority two: five maximum. Priority three: ten maximum.

Defect Levels Permitted for Release Promotion from UAT to Production:
Priority one: zero maximum. Priority two: zero maximum (unless an approved Change Request is in place). Priority three: eight maximum.

Test cases developed during the construct phase will be updated by the Project Team during test execution, and test logs will be created. Any deviations from expected results will be documented in the comments section of the log. All defects found during testing will be tracked to closure. Decisions on test case updates will be reserved for the daily or weekly project review meeting, as applicable, during the test period.

The project team plans to complete all test cases. Every failed test case will be recorded. When a single coding error appears in multiple parts of the application, it is recorded as multiple failures. Application development rework will be done as soon as a test case fails. The application cannot move to production with level one, two or three errors, but may go to production with level four errors with customer approval.
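To make the promotion thresholds above concrete, the sketch below checks open defect counts against them. This is an illustration only: the plan defines the limits, and the constant and function names (PROMOTION_LIMITS, can_promote) are hypothetical.

```python
# Illustrative only: Section 4 defines these limits; the names below are hypothetical.
PROMOTION_LIMITS = {
    "validation_to_uat": {1: 0, 2: 5, 3: 10},   # priority: maximum open defects permitted
    "uat_to_production": {1: 0, 2: 0, 3: 8},    # priority 2 requires an approved Change Request
}

def can_promote(stage, open_defects, approved_change_request=False):
    """Return True if open defect counts satisfy the limits for the given promotion stage."""
    for priority, limit in PROMOTION_LIMITS[stage].items():
        count = open_defects.get(priority, 0)
        if stage == "uat_to_production" and priority == 2 and approved_change_request:
            continue  # priority-2 defects may pass to production only under an approved CR
        if count > limit:
            return False
    return True

# Example: 3 open priority-2 and 7 priority-3 defects may promote to UAT, but not to production.
print(can_promote("validation_to_uat", {2: 3, 3: 7}))   # True
print(can_promote("uat_to_production", {2: 3, 3: 7}))   # False
```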


5 TEST STRATEGY
The testing strategy should define the objectives of all test stages and the techniques that apply. The testing strategy also forms the basis for the creation of a standardized documentation set, and facilitates communication of the test process and its implications outside of the test discipline. Any test support tools introduced should be aligned with, and in support of, the test strategy. (Separate document available.)


6 TEST PROCEDURES

6.1 Configuration Management
The goal of configuration management is to control and document the physical and functional configurations of a system to ensure that the integrity of the system is maintained. This is accomplished by implementing version control and change control procedures for all physical and functional configurations of the system.

6.2 Change Control Process
The main objective of change control is to control the changes made to project artifacts (technical documents, code and databases). If change control processes are not followed, projects risk scope creep and a constantly moving target. In order for developers to build software, they need a clear definition of how the software is supposed to function.

6.3 Defect Reporting
The primary objective of a defect tracking system is to define responsibilities and activities necessary to ensure that defects are identified, prioritized and fixed. The key benefits of a good tracking system are the improvements in communication and accountability.

The system must ensure:

Anyone who needs to know about a defect should learn of it soon after it is reported (Timeliness).

No error will go unfixed merely because someone forgot about it (Accountability).

No errors will go unfixed merely because of poor communication (Communication).

The tracking system will be used to document defect information from initial finding through resolution. During the course of software development, the status of each defect will be updated to keep the information current with the development-test-repair cycle.

What is a Defect?
Testing is performed to ensure the application satisfies all requirements. When the application does not perform as expected, the tester first confirms that both the test conditions and the errant result are repeatable. The underlying conditions and test results are then documented as a defect in the defect tracking system. A defect is a variance from a desired product attribute. There are two categories of defect:

Variance from product specifications - The product built varies from the product specified.
o Wrong - the specifications have been implemented incorrectly. The defect is a variance from the customer/user specification.
o Missing - a specified requirement is not in the built product. This can be a variance from the specification, an indication that the specification was not implemented, or a requirement that the customer identified was not in the specification. If a requirement is missing, the new requirement must flow through the change control process.

Variance from customer/user expectation - This variance is something that the user wanted that is not in the built product. The missing piece may be a specification or requirement, or an unsatisfactory implementation of the requirement. If a requirement is missing or the requirement is changed, the new/changed requirement must flow through the change control process.
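As an illustration of the kind of record such a tracking system keeps from initial finding through resolution, the sketch below models a defect using the Section 4 severity levels and the two defect categories described above. The field names and status values are assumptions for illustration, not the schema of the Bug Tracker (BTT) tool named in Section 8.

```python
# A minimal sketch of a tracked defect record; field names and status values are
# illustrative assumptions, not the actual schema of the defect tracking tool.
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class Defect:
    defect_id: str
    summary: str
    severity: int                 # 1 = Crash, 2 = Block, 3 = Major, 4 = Minor (Section 4)
    category: str                 # variance from specification / variance from expectation
    status: str = "Open"          # e.g. Open -> Assigned -> Fixed -> Retested -> Closed
    reported_on: date = field(default_factory=date.today)
    history: List[str] = field(default_factory=list)

    def update_status(self, new_status: str, note: str = "") -> None:
        """Record a status change to keep the log current with the development-test-repair cycle."""
        self.history.append(f"{self.status} -> {new_status}: {note}")
        self.status = new_status

# Example: a login failure (UC6) logged as severity 2 and tracked to closure.
d = Defect("DEF-001", "Login fails for valid credentials", severity=2,
           category="variance from product specifications")
d.update_status("Fixed", "patched in a later build")
d.update_status("Closed", "retested and verified")
```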

6.4 File Locations
All files should be located where everyone on the project team can access them. The following chart will be used to track the locations of the files.

File Name | File Location in VSS | Phases | Test Case Type


7 ENVIRONMENTAL NEEDS
This section specifies the necessary and desired properties of the test environment, including: (TBD)
Each component developed is required to undergo sufficient unit testing. The unit test cases, test data, and test results will be created before the project migrates to the pre-production environment. A testing tool will be used for functional and load testing of the application. Scripts created for testing the application will also be used in the pre-production environment.

7.1 QC Test Lab
Hardware (TBD)
Operating Systems (TBD)
Software (TBD)

7.2 QC Desktop
Hardware (TBD)
Operating Systems (TBD)
Software (TBD)

7.3 Staffing and Training Needs
N/A


8 TOOLS

The following tools will be employed for this project:

Type | Tool | Vendor/In-house | Version
Manage Documents | Visual Source Safe | | 6.0
Bug snapshot | Snagit | | 8.0
Automated Defect Tracking Tool for defect management | Bug Tracker | | 3.0
Flow Charts | Visio | | 10.0
Unit test | MbUnit or NUnit | |
Browsers testing | IETester, Firefox, Google Chrome, Opera | |
Code Documentation | NDoc | | V1.3.1
<> | <> | <> | <>


9 RESOURCES
This section presents the recommended resources for the test effort, their main responsibilities, and their knowledge or skill set.

Name: | Title: | Role: | Phone #:
| | QA Testing |
| | QA Testing |
| | QA Testing |
| | QA Testing |
| | QA Testing |
<> | <> | <> | <>


10 TEST CASE CREATION/EXECUTION ESTIMATES (SCHEDULE)

UseCase# | Use Case Title | Phases | TestCases Needed (est) | TestCases Rework
<> | | | 0 | 0

Test Case Creation / Execution Time Estimate

Test Case Creation
Total TestCases To Be Created: 0
Total TestCases To Be Reworked: 0
Hours To Rework Existing TestCase (per case / total): 0 / 0
Hours To Create New TestCase and Approval (per case / total): 0 / 0.0
Total Hours To Completion (1 Person): 0.0
Total Days To Completion (1 Person): 0.0
Total Days To Completion (Test Team): #DIV/0!

Test Case Execution
Total TestCases To Be Executed: 0
Average Hours to Execute Each Test Case: 0
Total Hours To Completion: 0.0
Total Days To Completion (1 Person): #DIV/0!
Total Days To Completion (Test Team): #DIV/0!

Resources
Productive Hours Per Workday: 0
Resources: 0
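The worksheet's totals follow from a few simple relationships: hours come from case counts multiplied by per-case effort, single-person days divide those hours by productive hours per workday, and team days divide again by the number of testers (which is why the blank template with zero resources shows #DIV/0!). The sketch below spells those relationships out; the exact cell formulas are not visible in the template, so treat the functions and parameter names as assumptions.

```python
# Sketch of the arithmetic the estimate worksheet appears to apply; the exact cell
# formulas are not shown, so treat these relationships and names as assumptions.

def creation_estimate(new_cases, rework_cases, hours_per_new_case, hours_per_rework,
                      productive_hours_per_day, testers):
    total_hours = new_cases * hours_per_new_case + rework_cases * hours_per_rework
    days_one_person = total_hours / productive_hours_per_day   # #DIV/0! in the blank template
    days_team = days_one_person / testers                      # likewise when testers = 0
    return total_hours, days_one_person, days_team

def execution_estimate(cases, hours_per_case, productive_hours_per_day, testers):
    total_hours = cases * hours_per_case
    days_one_person = total_hours / productive_hours_per_day
    return total_hours, days_one_person, days_one_person / testers

# Example: 40 new cases at 1.5 h each, 10 reworked at 0.5 h, 6 productive hours/day, 2 testers.
print(creation_estimate(40, 10, 1.5, 0.5, 6, 2))   # (65.0 h, ~10.8 days, ~5.4 days)
print(execution_estimate(50, 0.5, 6, 2))           # (25.0 h, ~4.2 days, ~2.1 days)
```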


11 ROLES & RESPONSIBILITIES
This table shows the staffing assumptions for the project.

Human Resources

Worker | Resources Recommended | Specific Responsibilities/Comments

Test Team Lead () | Full-time | Responsibilities:
o Problem Domain Specialist
o External Components Acquirer
o End-User Educator
o Technical Lead
o Product Delivery Specialist

Test Analyst/Architect () | Full-time | Responsibilities:
o Testing Methodology
o Automatic Testing Tools
o Technical basics of particular application
o System & Integration Tests
o Performance tests
o User Acceptance Tests
o Delivery & Planning – Estimate

Test Automation Architect () | Full-time | Responsibilities:
o Design and develop manual scenarios from system requirements.
o Design and develop automation scripts. These may be developed from manual scenarios.
o Design and configure test environment. Update automated software.

<> | <> | <>


12 ISSUES/RISKS AND MITIGATING FACTORS
All project Issues/Risks and Mitigating Factors have been documented within a separate Issue and Risk log. All issues are documented, prioritized, tracked and managed through the Issue & Risk Management Process Flow.

Test Dependencies:

Availability of required Real System Data

Unit and System testing must be stable to begin Integration and Regression testing

Integration and Regression testing must be stable to begin Performance and Load testing

Performance and Load Test must be stable to begin UAT testing

Issue / Risk | Assigned To | Proposed Resolution | Proposed Resolution Date

Equipment failure of Testing Environment

Communications/network failure

Fire or other natural disasters

*Note: An Issue is something that has happened. A Risk is something that could happen.


13 PROJECT MILESTONES AND TARGET DATES
This section contains information about the project schedule, test milestones and item delivery events. (Refer to Project Plan)

Task # | Task Name | Duration | Start | Finish | Predecessors

1 Test Scenario Matrix Phase1

2 System Test Cases Phase1

3 UAT Test Cases Phase1

4 Integration Test Cases Phase1

5 Test Scenario Matrix Phase2

6 System Test Cases Phase2

7 UAT Test Cases Phase2

8 Integration Test Cases Phase2

10 Test Data Phase1

11 Test Data Phase2


14 REPORTS
Requirement coverage report – The requirement coverage report will be used to track the development of test cases. This report will contain the total requirements that have test cases developed.
Test case completion report – The test case completion report will be used to track the development of test cases. This report will contain the total test cases developed and the number of test cases remaining to be developed.
Defect Summary report – The defect summary report will be used to track the status of all defects. This report will contain the total number of open defects (by status and priority).
Weekly status report – The weekly status report will be used to track the progress of testing. It will contain detailed information about the development of requirements, test cases, test data, test cases executed and any problems or issues uncovered during testing.
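As a small illustration of how the Defect Summary counts (open defects by status and priority) could be produced from a defect log export, the sketch below aggregates a few sample records. The field names ("status", "priority") are assumptions about the export format, not the reporting tool's actual interface.

```python
# Aggregating a defect log export into the Defect Summary counts; field names are assumptions.
from collections import Counter

defect_log = [
    {"id": "DEF-001", "status": "Open",   "priority": 2},
    {"id": "DEF-002", "status": "Fixed",  "priority": 3},
    {"id": "DEF-003", "status": "Open",   "priority": 1},
    {"id": "DEF-004", "status": "Closed", "priority": 4},
]

# Defects that are not yet closed, grouped by priority, plus an overall status breakdown.
open_by_priority = Counter(d["priority"] for d in defect_log if d["status"] != "Closed")
by_status = Counter(d["status"] for d in defect_log)

print("Open defects by priority:", dict(open_by_priority))   # {2: 1, 1: 1, 3: 1}
print("Defects by status:", dict(by_status))                  # {'Open': 2, 'Fixed': 1, 'Closed': 1}
```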


15 APPROVALS
We, the undersigned, certify that this Test Plan is complete, accurate, and can be used to guide the testing activities.

Name | Signature | Date
<> | <> | <>


16 APPENDIX A: RELEASE SIGNOFF FORM

16.1 Signoffs

Project Manager <>

System Architect <>

Development Team Lead <>

Test Team Lead <>