How to Transform the Manual Testing Process to Incorporate Test Automation
TRANSCRIPT
How to transform the Manual Testing process to incorporate Test Automation
Jim Trentadue, Enterprise Account Manager - [email protected]
Agenda
• Test activities and deliverables within a Waterfall and Agile SDLC
• Test Automation working models and frameworks in the industry
• Incorporating key aspects from Test Automation procedures and fusing them into standard testing processes
• Deep-dive into Testing Artifacts: particular topics to Plan and to Avoid
• Additional areas of impact among the project team members
• Case Study review
• Session recap
Test activities / deliverables within the Agile and Waterfall SDLC
Test Activities within an Agile SDLC
Typical Agile process of activities to test deliverables, by phase:

Project Initiation
• Get an understanding of the project

Release Planning
• Participate in sizing stories; create test plans

Each Iteration (1 … X)
• Participate in sprint planning, estimating tasks
• Write and execute story tests
• Pair-test with other testers and developers
• Business validation (customers)
• Automate new functional test cases; run automated regression test cases
• Run project load tests
• Demo to the stakeholders

The End Game (System Test)
• Release mgmt tests mock deploy on staging; smoke test on staging
• Perform load test (if needed)
• Complete regression test; participate in release readiness
• Business testers perform UAT

Release to Prod/Support
• Participate in release to production
• Participate in retrospectives
Agile Testing – Crispin & Gregory
Test Activities within a Waterfall SDLC
Standard V-Model set of testing activities, pairing each SDLC phase with its test deliverable:
• Project Initiation → Test Strategy
• Analysis → Test Scenarios
• Design → Test Cases
• Develop → Test Scripts
• Testing → Test Results
• Deploy → Test Summary
Test Automation Frameworks
Below is a list showing the evolution of available test automation frameworks:
1. Record/Playback
2. Structured Testing (invokes more conditions)
3. Data-Driven
4. Keyword-Driven
5. Model/Object-Based
6. Actions-Based
7. Hybrid (combines two or more of the previous frameworks)
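Of these, the data-driven style is perhaps the simplest to illustrate: the same test logic is driven over rows of externally held data, so new cases are added as data, not code. Below is a minimal sketch in plain Python; the `authenticate` function and the credential rows are hypothetical stand-ins for a real system under test.

```python
# Data-driven testing sketch: inputs and expected results live apart
# from the test logic. `authenticate` is a hypothetical system under test.

LOGIN_CASES = [
    # (user, password, expected_result)
    ("alice", "correct-pw", True),
    ("alice", "wrong-pw", False),
    ("", "any-pw", False),
]

def authenticate(user, password):
    """Hypothetical system under test: accepts one known credential pair."""
    return user == "alice" and password == "correct-pw"

def run_data_driven(cases):
    """Drive the same check over every data row; return the failing rows."""
    return [(user, pw) for user, pw, expected in cases
            if authenticate(user, pw) != expected]
```

With a framework such as pytest, the same idea is usually expressed with `@pytest.mark.parametrize`; keyword-driven frameworks go one step further and move the actions themselves into the data.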
Test Automation working models and frameworks in the industry
Key points with Test Automation within Agile
Traditional Test Automation methods:
• UI-centric automation: 80%+
• Service or middle-tier automation: 0-10%
• Unit-level automation: 0-10%

Agile Test Automation methods:
• Unit-level automation: 50-60%
• Service or middle-tier automation: 20-40%
• UI-centric automation: 0-10%

Velocity Partners – The Agile Test Automation Pyramid, Mike Cohn
Agile automation
• Smaller units of work pieced together as opposed to end-to-end
• Iterative review of the automation; constant review of backlog
• Automation of a given screen even if not complete
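The base of the agile pyramid, unit-level automation, can be as simple as the sketch below: fast, isolated checks on a single function, cheap to run on every build. The `apply_discount` function is a hypothetical unit under test.

```python
# Unit-level automation sketch: isolated checks on one function.
# `apply_discount` is a hypothetical unit under test.
import unittest

def apply_discount(price, percent):
    """Apply a percentage discount to a price."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (100 - percent) / 100, 2)

class DiscountTests(unittest.TestCase):
    def test_basic_discount(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_rejects_bad_percent(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)
```

Because such tests need no deployed environment or UI, they are the layer that agile teams can afford to run on every commit.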
Mapping an ATLM against a Waterfall SDLC
Automated Software Testing: Introduction, Management and Performance – Dustin, Rashka, Paul
Illustration of how typical activities of an ATLM (Automated Test Life-cycle Methodology) align with SDLC activities:
A. System Life-Cycle Process
B. Business Analysis and Requirements
C. Small Tool Pilot/Prototype
D. System Design & Development
E. Integration and Test
F. Production and Maintenance
Fusing a Test Automation process into an existing
Testing Process
Test Automation-Agile fusion
Agile Testing – Crispin & Gregory
Questions facing each of the quadrants
• How can it be coded test-first?
• Do we know how to unit test our presentation layer; do we need a tool for that?
• How are we going to prototype?
• What tool will we use to create executable business-facing tests to guide development?
• Do we have regression tests that will need updating?
• Requires more advance planning
• Will need to track the users’ activities for further analysis
• Load scripts might use the old UI; time has to be budgeted to update these for the new one
Test Automation-Waterfall fusion
Planning test automation activities into your testing process, mapped onto the V-Model phases and deliverables (Project Initiation/Test Strategy, Analysis/Test Scenarios, Design/Test Cases, Develop/Test Scripts, Testing/Test Results, Deploy/Test Summary).

Test Automation activities:
1. Outline the scope of automation in the Release Test Plan
2. Divide test scenarios: automated vs. manual
3. Prepare the automated test workflow with error handling (new and regression)
4. Record your modules, ensuring all objects have been recognized
5. Replay until the test executes as expected
6. Check in to configuration management (CM)
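The record/replay steps can be sketched as a small driver loop. All names below (`OBJECT_MAP`, the step list, the repair callback) are hypothetical; a real record/playback tool would supply the object repository and the replay engine.

```python
# Sketch of steps 3-5: replay a recorded workflow with error handling,
# repeating until every object is recognized. All names are hypothetical.
OBJECT_MAP = {"username": "id=user", "password": "id=pass"}  # "submit" missing

def replay(steps, object_map, log):
    """Replay recorded steps; log any object the tool cannot resolve."""
    for step in steps:
        if step not in object_map:
            log.append(f"object not recognized: {step}")
            return False
    return True

def replay_until_passing(steps, object_map, repair, max_attempts=3):
    """Replay up to max_attempts times, applying a repair between runs."""
    log = []
    for _ in range(max_attempts):
        if replay(steps, object_map, log):
            return True, log
        repair(object_map)  # e.g. re-record or re-map the missing object
    return False, log
```

Once the workflow replays cleanly, the script is ready for the final step: check-in to configuration management.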
Deep-dive into Testing Artifacts: particular topics to Plan and to Avoid
Test Strategy / Test Plan

Avoid:
• Omitting a section for automation
• Not listing automation in the Master Test Plan as a means of accomplishing the testing on schedule
• Not documenting all of the related variables for automation, such as resources, environments, data, controls & licensing

Plan:
• To obtain stakeholder buy-in for automation initiatives
• To confirm that tool-selection preparation has been done on the technologies being used and how team members will use the tool
• To include automation (developing or executing) in every iteration or build, as part of the overall test approach in the project
Test Scenarios / Test Stories

Avoid:
• Topics that can't be automated
• Writing the scenarios or stories at a level that does not allow for automation as a solution
• Trying to automate areas that focus on usability, reliability, maintainability, or other abstract qualities

Plan:
• To parse out automation candidates
• To define the 'how' along with the 'what' for a given topic, so responsibility is clearly understood
• A second review of any topic tagged for automation, to ensure more precise wording
Test Cases / Test Scripts

Avoid:
• Automation as a final activity
• Assuming that previously written manual test cases can be automated as-is, and so deferring automation until later in the schedule
• Delaying the inclusion of automation development and execution at the start of the schedule or sprint

Plan:
• Continuous Integration
• Frequent component-level runs with Agile development, designed as automated test cases from the onset
• End-to-end regression test runs for those tests that are applicable to the changes deployed
Test Summary / Retrospectives

Avoid:
• Not reworking automation failures
• Considering the automated tests as a simple test-case artifact as opposed to a development activity
• Losing the opportunity to schedule a window for maintenance on the automation outside of project activities

Plan:
• On improving automation at the next chance
• Automation metric reviews each iteration and sprint, to show the time savings and velocity for each tester involved
• Incorporating completed manual tests into the regression test library, either in a maintenance window or during a sprint
Additional areas of impact among project team members
Project team members impact
A look at the different roles that can assist with Test Automation:

Project Manager
• Allocates the right resources and schedules accordingly
• Accounts for budget planning for licenses, environments, or head count
• Works with the Test Lead on scope of effort and separating the manual from automated tests

Business Analysts
• Formulates a requirements process for selecting the right solution
• Publishes business rules to consider in the automation architecture
• Serves as conduit to UAT for automating business processes

Development
• Codes any test functions that a tester may request
• Documents standards for control names and objects
• Assists in any debugging of the test cases

System Testers
• Supplies altered manual test cases to be used as a guide for automating
• Sets direction on testing types for manual vs. automated scenarios
• Guides the test automation team in creating end-to-end test scenarios

DBAs
• Connects data sources (RDBMS) to the automated solution
• Builds related SQL queries for use in data-driven testing
• Maintains data integrity and database optimization processes

IT Management
• Champions the initiative with analysts driving the effort
• Provides capital support for expenditures (licenses, training, resources)
• Reviews and invests in the initiative as the business case warrants
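As a sketch of the DBA contribution, the snippet below pulls test data from an RDBMS to feed a data-driven test run. An in-memory SQLite database stands in for the production data source; the table name and query are illustrative assumptions.

```python
# DBA-supplied SQL feeding a data-driven test: the query result becomes
# the test data set. SQLite here stands in for the production RDBMS.
import sqlite3

def load_test_rows(conn):
    """A DBA-maintained query supplying rows to the data-driven tests."""
    return conn.execute(
        "SELECT username, expected_role FROM test_users ORDER BY username"
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE test_users (username TEXT, expected_role TEXT)")
conn.executemany(
    "INSERT INTO test_users VALUES (?, ?)",
    [("alice", "admin"), ("bob", "viewer")],
)
rows = load_test_rows(conn)
```

Keeping the query in the DBA's hands means the test data can evolve with the schema without the automation scripts changing.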
Case Study review
Case Study: Background & Solution
Titled: Automation Through the Back Door (By Supporting Manual Testing)
Experiences of Test Automation – Graham & Fewster
Background: To improve the rate of test automation in the organization, modifications were made to the test automation framework to support manual testing.
Technical Solution: Develop a framework based on keyword-driven testing, called command-driven testing.

What is Command-Driven Testing?
• Uses keywords that are simple commands (SELECT, BUTTON)
• Interpreter scripts are the same for all products
• Building on the advantages of data-driven testing, navigation was placed into a DRIVER-File and data into a DATA-File
• These two files together form a command script
• The script-runner reads the commands in the DRIVER-File sequentially; DATA-Codes are substituted with data from the DATA-File

With Command-Driven Testing:
• Testers don't necessarily need to learn tool scripting
• The separation of navigation and data sections makes the command scripts flexible and reusable
• DRIVER-Files can be easily ported to different applications
• DRIVER-Files need not be changed when migrating to another tool
• The test tool is needed only to prepare the templates and to run the tests
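A minimal sketch of the DRIVER-File/DATA-File split might look like the following. The command names and the file representations are illustrative assumptions, not the case study's actual syntax.

```python
# Command-driven testing sketch: the DRIVER-File holds navigation
# commands with %DATA-Codes%; the DATA-File supplies the values.
# File contents are modeled as Python literals for illustration.
DRIVER_FILE = [
    ("SELECT", "country_list", "%COUNTRY%"),
    ("INPUT", "city_field", "%CITY%"),
    ("BUTTON", "submit", None),
]
DATA_FILE = {"%COUNTRY%": "Germany", "%CITY%": "Berlin"}

def run_script(driver_rows, data_codes):
    """Read DRIVER-File commands sequentially, substituting DATA-Codes."""
    executed = []
    for command, target, value in driver_rows:
        resolved = data_codes.get(value, value)  # swap %CODE% for its data
        executed.append((command, target, resolved))
    return executed
```

Porting the DRIVER-File to another application or tool then only requires re-mapping the targets, while swapping the DATA-File varies the test data.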
Case Study: Process for implementation
Prerequisites:
1. You cannot start if you don’t know the application and the test cases that are to be automated
2. You need a working engine that can interpret all the commands you are going to need for your test cases
3. You must have registered all the GUI elements that will be used in test execution in the proprietary mapping of the deployed capture/replay tools in order to normalize the names of the GUI controls
Step 3: Developing planned TCs from the template and building test suites
Case Study: Key Points
• Due to limited testing resources for test automation, significant effort still had to be spent on regression testing. However, the command-driven framework was adopted for manual testing as well as automated testing
• This approach helped limit the number of times a tester would simultaneously work on the same template, which was a current weakness
• Defect reporting became much easier. It avoided the testing team having to repeat the same steps to recreate the defect
• Continuous reviews were done to assess what features were available and what was needed to support manual testing
• Implementation included a feature that supports the execution of partially automated tests, which could aid with tedious test-preparation tasks
• Manual tests now focused on customer-specific conditions instead
Session Recap
Recap of the presentation
Reviewing the main points of the presentation
• Outline your testing process by phase and deliverable, with consideration on where automation would apply
• Investigate the various test automation processes and frameworks available in the industry to determine what’s most suitable to your organization or initiative
• Fuse test automation topics into each testing phase or milestone accompanied with a deliverable
• Delve deeper into your testing artifacts to incorporate automation through planned activities and avoidance areas
• Include other project team members in the automation initiative to contribute in their area of expertise
• Present a real-life case study to management, either to emulate its good examples or to ensure its pitfalls do not recur during your research and implementation