The Unexpected Truth About UI Test Automation Pilot Projects
A publication of Telerik
In our daily contact with organizations that aim to deliver better-quality solutions to customers, we often come across teams that struggle to build up their test automation proof of concept or, worse, fail to get off on the right foot with UI test automation for long-term success.
To better understand what it takes to succeed in UI test automation, we surveyed 985 IT professionals in various job positions and with varied technical backgrounds. This report explores the first steps teams take in UI test automation, where they are today with their automation efforts and what helped them get there.
What did we learn? We discovered a recurring theme: software developers are a vital component of successful UI test automation projects. Involving software developers early on, when preparing applications for automation, helps most teams succeed not only with pilot projects but with UI test automation in the long run.
IN BRIEF: WHY WE DID IT AND WHAT WE LEARNED
METHODOLOGY
• Online survey of 985 IT professionals
• Data collected between August 1, 2014 and September 10, 2014
• For the purposes of this study, data was broken down into the following sub-groups*:
  • 239 Testers
  • 604 Developers

Respondent profile

CODING EXPERIENCE
• 40% Experienced
• 21% Medium experience
• 39% Little to no experience

EXPERIENCE IN AUTOMATION PILOT PROJECTS
• 25% Directly involved
• 12% Close observation
• 63% Never participated

TESTING TEAM SIZE
• 57% 1-10 QAs
• 32% 10-100 QAs
• 11% 100-500+ QAs

ORGANIZATIONAL SIZE
• 39% 1-50 employees
• 26% 50-500 employees
• 35% 500-1,000+ employees

MANUAL OR AUTOMATED TESTING
• 45% Manual
• 35% Automated
• 20% Not sure

INDUSTRY
• 37% Computer & Software
• 9% Finance & Banking
• 9% Manufacturing & Retail
• 7% Healthcare
• 5% Telecoms
• 5% Education
• 5% Engineering & Architecture
• 23% Other

TYPES OF APPS AUTOMATED
• 49% JavaScript
• 43% HTML5
• 28% AJAX
• 28% MVC
• 27% Java
• 16% WinForms
• 11% WPF
• 10% PHP
• 9% Silverlight
• 5% Ruby

*An additional 142 respondents hold business positions.
WHY PILOT PROJECTS SUCCEED OR FAIL

1. The top 3 success factors
The top 3 factors that contributed to the success of test automation pilot projects were:
• 33% The development team helped make the application UI automatable
• 30% Communication between the automation lead and management was effective
• 30% Management had realistic expectations

Other factors:
• 29% Project goals were designed to be measurable
• 28% There were teammate(s) with previous experience in test automation
• 22% Test automation tool was easy to use
• 22% Leveraging the know-how of other teams in the company
Further, the factor chosen most often for contributing to the success or failure of teams was whether or not
developers were involved in the automation process.
2. The top 3 failure factors
Teams whose test automation pilot projects failed said the top 3 factors that influenced their efforts negatively were:
• 45% The development team did not help make the application UI automatable
• 42% Test automation tool was not intuitive and easy to use
• 39% There were no teammates with previous automation experience

Other factors:
• 26% Poor communication between management and the automation lead
• 24% Project goals were not designed to be measurable
• 18% Management had unrealistically high expectations
• 16% Manual testers felt threatened
3. Leaving developers out of the UI test automation pilot project increased the risk of failure by 50%.

3.1. UI test automation project teams that involved developers were more successful.

How successful was the pilot project?
• Teams that include devs: 80% successful, 20% unsuccessful
• QA-only teams: 70% successful, 30% unsuccessful

30% of UI test automation pilot projects carried out by QA-only teams failed. By contrast, only 20% of pilot project teams that involved developers were unsuccessful.
3.2. 37% of QA-only team members who indicated they were successful in their UI test automation pilot projects have a somewhat technical profile.

Coding experience of successful QA-only team members:
• 25% No experience at all
• 38% Little experience
• 21% Medium experience
• 13% Very experienced
• 3% I'm proficient

4. Pilot projects determine the future of automated testing within organizations
72% of companies that failed the first time did not try again in the next few years; only 28% gave automation another try.
5. A minority of teams that originally failed eventually succeeded

A minority of teams with failed projects gave test automation a second try and ultimately succeeded. The top factors that helped teams with failed pilots implement test automation were:
• 39% The development team helped make the application UI automatable
• 21% The test automation tool vendor provided timely tech support
• 18% Project goals were designed to be measurable
• 18% Other teammates had automation experience

Other factors:
• 14% Test automation tool was easy to use
• 11% Effective communication between the automation lead and management
• 11% We used good learning resources (books, courses, blogs)
• 7% Management had realistic expectations
WHAT IS SUCCESS IN TEST AUTOMATION?

1. Benefits of UI test automation
Both testers and developers agree the greatest benefit of UI test automation is reducing the time spent on regression testing.

QAs:
• 70% Less time spent on regression testing: 25-50% improvement
• 39% Fewer bugs slip into production: 25-50% improvement

Developers:
• 62% Less time spent on regression testing: 25-50% improvement
• 45% Testers have more time for new testing activities: 25-50% improvement
2. The amount of test automation achieved

On average, it takes teams 8.5 months to automate less than 25% of manual tests and around 11 months to achieve 25% to 50% test automation. Organizations take about 14 months to automate 51-75% of manual tests, and more than 15 months to automate more than 75% of manual tests.

In a 6-month timeframe, 48% of QA-only teams have automated less than a quarter of their manual tests, compared with 43% of teams that involved developers.
3. UI test automation is more widespread among larger teams

3.1. Testing team size
• 41% 1-10 QAs
• 29% 11-50 QAs
• 30% 51-500+ QAs
30% of teams invested in test automation consist of 51 to more than 500 QA professionals. An additional 29% of respondents have 11 to 50 QA professionals.

3.2. Organizational size
• 26% 1-50 employees
• 31% 51-500 employees
• 43% 501-1,000+ employees
43% of organizations invested in automated testing have 500 to more than 1,000 employees. An additional 31% of organizations have 50 to 500 employees.
3.3. Industry
• 37% Computer-related products or services
• 14% Finance / Banking
• 11% Healthcare
• 8% Engineering / Architecture
• 5% Education
• 5% Manufacturing
• 4% Agriculture / Forest / Fishing
• 4% Government / Military
• 4% Retail / Wholesale
• 4% Telecommunication
• 3% Insurance
• 1% Transportation
3.4. In terms of app development, organizations that have implemented test automation focus mostly on web applications (76%), followed by desktop and mobile web applications.
• 76% Web (excluding mobile)
• 49% Desktop
• Mobile: 37% web, 18% native, 17% hybrid
3.5. Development technologies used by organizations invested in test automation include .NET, JavaScript and Java.
• 71% .NET
• 50% JavaScript
• 38% Java
• 13% PHP
• 8% Ruby
THE AUTOMATION TOOLS THEY USE

1. Tools used
• 42% Selenium WebDriver
• 35% Microsoft Visual Studio Test Professional
• 20% Homegrown tool
• 15% HP QTP
• 9% SmartBear TestComplete
• 8% Telerik Test Studio

42% of respondents rely on Selenium WebDriver, whereas 35% use Microsoft Visual Studio Test Professional and 20% use homegrown solutions.
2. Ease of use
• 12% Very easy
• 37% Easy
• 30% Neutral
• 19% Difficult
• 2% Very difficult
21% of respondents admit their solutions are difficult to use.

3. Level of satisfaction
• 13% Very satisfied
• 49% Satisfied
• 24% Neutral
• 12% Dissatisfied
• 2% Very dissatisfied
14% of respondents are dissatisfied with their test automation tool.
4. Tools require different skill levels

Respondents who succeeded with their pilot projects vary in average coding skill level by tool. Some tools may lower the skill requirements.

Coding experience of successful respondents, by tool (No experience at all / Little experience / Medium experience / Very experienced / I'm proficient):
• Microsoft Visual Studio Test Professional: 10% / 13% / 27% / 30% / 20%
• Selenium WebDriver: 22% / 19% / 21% / 23% / 15%
• HP QTP: 25% / 36% / 17% / 8% / 14%
• Telerik Test Studio™: 33% / 28% / 6% / 22% / 11%
• SmartBear TestComplete: 6% / 12% / 35% / 29% / 18%
• Homegrown tools: 10% / 20% / 32% / 22% / 16%
SUCCEEDING WITH AUTOMATION
Closing analysis and recommendations by industry expert Jim Holmes, VP of ALM and Testing at Falafel Software

Why Do Test Automation Pilot Projects Fail?
Pilot projects are a terrific way for teams to ease into larger adoption of UI automation. So why do teams of smart, diligent testers end up in such a quagmire with those projects? They grapple with the same generalized specters that haunt software development projects: complexity, brittleness and, that most fundamental problem plaguing technologists everywhere, poor communication.

In the following sections you'll find theory and practical experience to back up the data presented in this report. You'll learn steps to take to deliver the best value to customers.

The Human Factor: Changing Team Culture
Part of any successful pilot project is identifying the fundamental changes that have to take place in a team's culture. Changing team culture isn't always easy. Culture and collaboration changes have to be supported, and implemented, from the top down. That means stakeholders, management and leadership have to buy in to the large-scale transformations that will likely have to take place. Such changes are especially challenging in environments where testers and developers are organizationally and/or geographically dispersed. Regardless, investing the effort is worthwhile.

Without a doubt, the best payoff comes from involving testers in conversations with developers as early as possible, optimally before developers start developing. This enables developers to get a clear picture of
exactly how testers will be testing their work. Working with a developer to refine and clarify the acceptance
criteria is a crucial step to smoothing out the testing process.
Early conversations between developers and testers also help developers make the system more testable
as they’re building or enhancing it. Testers can point out UI areas that should have concrete locators, such
as ID or name attributes. Developers can let testers know where potentially tricky asynchronous situations
may occur. They can even work to provide configuration switches to shut off features like CAPTCHA or
swap out tricky rich editors for plain-text ones. Testers and developers can also flesh out enhancements to
backing APIs during those early conversations.
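The configuration-switch idea above can be sketched in a few lines. This is a hypothetical illustration, not from the report: the FeatureFlags class, the APP_ENV variable and the flag names are all assumptions.

```python
# Hypothetical sketch: a configuration switch that lets testers shut off
# hard-to-automate features (such as CAPTCHA) in test environments.
# Class, variable and flag names are illustrative assumptions.
import os
from dataclasses import dataclass

@dataclass
class FeatureFlags:
    captcha_enabled: bool = True
    rich_text_editor: bool = True  # swapped for a plain-text editor in tests

def load_flags() -> FeatureFlags:
    """Disable automation-hostile features when running under test."""
    if os.environ.get("APP_ENV") == "test":
        return FeatureFlags(captcha_enabled=False, rich_text_editor=False)
    return FeatureFlags()

flags = load_flags()
if not flags.captcha_enabled:
    print("CAPTCHA disabled for automated UI tests")
```

A switch like this is cheap for a developer to add during those early conversations, and it spares testers from scripting around features that were never meant to be automated.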
It's important to capture all work discussed in these conversations in existing work items (user stories, tasks, etc.), as applicable to your specific workflow. Additionally, you should update estimates to include the effort those details add. It's also important to communicate those changes across the team.
Successful teams approach the actual work of creating or updating UI automation differently. Some teams
have testers and developers pair up through the entire task. Others split automation work between testers
and developers. For example, testers may build the scaffolding of a UI test while developers flesh out calls
to backing APIs, test oracles and so on. The main point is this: each successful team determines what works
best for them, and they adjust over time to make sure the process is as smooth as possible.
Setting Expectations: Test Automation is Software Engineering
Too often, teams lose sight of a critical aspect: test automation is a software engineering effort. Regardless of whether you're scripting 100 percent of your tests using Watir or WebDriver or using a codeless tool like Telerik Test Studio, you still need to approach test automation with the same care you use (hopefully!) in writing system software. Teams have to take the same care in properly estimating test automation tasks, as well as updating and maintaining those tests. Those writing tests should pay close attention to the same software craftsmanship principles great developers heed, such as:
• Readability. If you can’t read it, you don’t know if it’s doing what it should be doing. Tests have to be
readable, which sometimes means going back and editing scripts someone wrote by hand, or objects
that a tool recorded. Tests, like code, are read many, many more times than they’re edited. Readability is
crucial to helping someone quickly decipher a test months after it was written. (And too often it’s a test
you wrote yourself and forgot what it was for!)
• Single Responsibility Principle (SRP). Old UNIX hands understand this concept intrinsically: a test (or a block of code) should do one thing, and only one thing, but it should do that one thing really well. SRP ensures tests are easy to understand, simple to reuse and a snap to update when the system changes.
• Don’t Repeat Yourself (DRY). The DRY principle is critical to test suites that scale with minimal costs.
Duplication leads to increased work and much more brittle tests. Good tests centralize storage of
object locators, enable re-use of tests as modules and don’t force users to update or change the same
information in multiple places.
• Simplicity. Complexity kills systems. It also kills tests. SRP and DRY are a great part of simple tests, but
you should avoid over-using branching, looping and similar constructs, as well.
• Abstraction. Abstraction enables good software and tests to isolate implementation details behind a
façade. With this in mind, it doesn’t matter if the underlying system changes: the test is for discovering
what or why something happens or doesn’t happen, not how something happens.
• Maintainability. As with system software, often the larger cost of test automation code is incurred in
maintenance. Teams building tests without maintainability in mind are doomed to suffer in the long run.
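Several of these principles, DRY and readability in particular, can be illustrated with a short sketch. Everything here is hypothetical: the locator table, the LoginPage class and the driver interface are assumptions for illustration, not part of any tool named in this report.

```python
# Hypothetical sketch of DRY applied to UI tests: object locators live in
# one central table, so a renamed UI element is fixed in one place. All
# names (LOCATORS, LoginPage, the driver interface) are illustrative.

LOCATORS = {
    "login.username": ("id", "username"),
    "login.password": ("id", "password"),
    "login.submit":   ("name", "sign-in"),
}

def locator(key):
    """Single lookup point: a missing or renamed locator fails early, here."""
    return LOCATORS[key]

class LoginPage:
    """Reusable module (SRP): tests call log_in() instead of repeating steps."""
    def __init__(self, driver):
        self.driver = driver

    def log_in(self, user, password):
        self.driver.type(*locator("login.username"), text=user)
        self.driver.type(*locator("login.password"), text=password)
        self.driver.click(*locator("login.submit"))

class FakeDriver:
    """Records UI actions so the sketch runs without a real browser."""
    def __init__(self):
        self.calls = []
    def type(self, by, value, text):
        self.calls.append(("type", by, value, text))
    def click(self, by, value):
        self.calls.append(("click", by, value))

driver = FakeDriver()
LoginPage(driver).log_in("alice", "secret")
```

With a real tool, the same page object would wrap that tool's API; the point is that no test repeats a locator or a login sequence, so a UI change touches one line instead of dozens of tests.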
Getting developers involved early in pilot projects can help mitigate or even eliminate many issues.
Getting Developers' Buy-in
Good developers bring insight to test concepts; however, they provide other valuable insights as well.

Deep knowledge of the system is first and foremost. Developers understand how data flows, where and how errors and exceptions should be handled, and how components should interact. Just as importantly, they understand those extremely brittle, scary parts of the system that need careful testing after someone touches code.

If you're fortunate and have unit or system/integration tests in place, developers can help testers focus UI testing on the high-value, high-risk parts of the system.

Perhaps one of the most critical skills a developer can bring to the table is the ability to leverage the system itself by creating backing APIs for data setup, teardown, system configuration and the like. Using a system's web services to create test data can be easy when a developer writes a quick abstraction layer over the top, then hands the tester a framework that contains simple, helpful methods such as

contact_factory.create_new_randomized_contact()

or

shopping_cart.get_order_from_database_by_order_number(order_number)

Such backing APIs enable testers to focus on writing great tests instead of trying to figure out how to invoke web service endpoints or calls to stored procedures.
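The backing-API idea can be sketched as follows. The report supplies only the method names; this implementation is a hypothetical illustration in which an in-memory stand-in takes the place of the real web-service calls a developer would write.

```python
# Hypothetical sketch of a developer-written abstraction layer over the
# system's web services. An in-memory service stands in for real HTTP
# calls; only the contact_factory method name echoes the report's example.
import random
import string

class ContactFactory:
    """Hides web-service details so testers just ask for test data."""
    def __init__(self, service):
        self.service = service

    def create_new_randomized_contact(self) -> dict:
        name = "".join(random.choices(string.ascii_lowercase, k=8))
        contact = {"name": name, "email": f"{name}@example.com"}
        # In a real framework this call would POST to the contacts endpoint.
        self.service.save(contact)
        return contact

class FakeContactService:
    """Stand-in for the system's web service (an assumption for this sketch)."""
    def __init__(self):
        self.contacts = []
    def save(self, contact):
        self.contacts.append(contact)

contact_factory = ContactFactory(FakeContactService())
new_contact = contact_factory.create_new_randomized_contact()
```

The tester's test script never sees endpoints or payload formats; it simply asks the factory for a fresh contact and gets on with verifying the UI.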
Succeeding with UI Automation is a Team Effort
UI automation is by far the hardest test domain in which to work. Successful teams realize their best chance is strong, honest collaboration among all team members. Teams with strong collaboration empower, and expect, testers and developers to work closely together. The results speak for themselves.

Jim Holmes
Jim has been in various corners of the IT world since joining the US Air Force in 1982. He's spent time in LAN/WAN and server management roles in addition to many years helping teams and customers deliver great systems. Jim has worked with organizations ranging from startups to Fortune 100 companies. He's been in many different environments but greatly prefers those adopting practices from the Lean and Agile communities.

Telerik Test Studio
Test Studio is a powerful test automation tool that helps you create maintainable test suites for a wide range of platforms and browsers. It inspires testers and developers to collaborate on building high-value test automation and increase team velocity.
www.telerik.com/teststudio