The Pathologies of Failed Test Automation Projects

DESCRIPTION

Most test automation projects never die—they just become a mess and are redone. Initial solutions that start well and are full of promise often end up as brittle and unmaintainable monsters consuming more effort than they save. Political feuds can flourish as different automation solutions compete for attention and dominance. Tests become inefficient in both execution time and resource usage. Disillusionment ensues, projects are redefined, and the cycle begins again. Surely we can learn how to avoid such trouble on the next project. Michael Stahl has analyzed automation projects and identified recognizable failure patterns—mushrooming, duplication, going for the numbers, and others. Michael describes these patterns, suggests how to detect them early, and shares ways to avoid or mitigate them. Whether your team is just starting on test automation—or is already in full flight—you’ll take back ideas to improve the chances of achieving success in your test automation efforts.

TRANSCRIPT

W3 Test Automation

5/1/2013 11:30:00 AM

The Pathologies of Failed Test Automation Projects

Presented by:

Michael Stahl

Intel

Brought to you by:

340 Corporate Way, Suite 300, Orange Park, FL 32073

888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com

Michael Stahl

Michael Stahl is a software validation architect at Intel, working with a team that validates Intel's graphics hardware drivers. In this role, Michael defines testing strategies and work methodologies for test teams, and tests part of the product himself, the work he enjoys most. As a twenty-two-year veteran and senior engineer, he has seen what works, and what doesn't, in test strategies, communications, and automation. An avid teacher, Michael enjoys sharing his observations with others. Contact Michael at michael.stahl@intel.com and review previous conference presentations on his website, testprincipia.com.

Based, in part, on work done with Alon Linetzki, Best Testing (www.best-testing.com)

The Pathologies of Failed Test Automation Projects

Michael Stahl, Intel, Apr 2013

© Michael Stahl, 2013, all rights reserved

• Automation failure patterns

• What can we do?

• Summary

On the Menu… 2

Disclaimers

Names and brands referenced herein may be claimed as the property of

third parties

The views expressed in this presentation are solely my own, and do not in any manner represent the views of my employer

Information in this presentation is provided “AS IS” without any warranties or representations of any kind

Once upon a time…

A common automation story… 4

Mr. Otto Mate Cool!

Can you make it…

Otto’s Team

I can automate my work!...

5

Mr. Otto Mate

Perfect!

Can you add…

Otto’s Team

Piece of cake!

Mr. Mann a. Ger

What a wonderful world!...

A common automation story…

6

Mr. Otto Mate

Otto! The last release fails!...

Otto’s Team

Arggghhh! Fix Fix Fix

Mr. Mann a. Ger

Makes sense… Sir! I need time! I need help!

… and mates

A common automation story…

7

Mr. Otto Mate … and mates

… and mates

… and mates

A common automation story…

… and mates

… and mates

8

Mr. Otto Mate … and mates

Let’s REDESIGN!!!

#$&@***!!!

Mr. Mann a. Ger

A common automation story…

A pattern emerges…

9

Pattern #1: Mushrooming

11

Stage 1 – Small & Local

Single User / Developer

Simple Tool

A pattern…

Stage 2 – Generalization

12

Multiple Users / Single Developer

Enhanced Tool

A pattern…

13

A pattern…

Multiple Users & Developers

Complicated Tool

Stage 3 – Staffing

14

A pattern…

Test Case Management

Multiple Users & Developers

Stage 4 – Non-core features

15

A pattern…

Stage 5 – Overload

Arghhh!

Pattern #2: The Competition

16

Team B

Pattern #2 – The Competition (ver. 1) 17

Team A

Team B

Pattern #2 – The Competition (ver. 2) 18

Team A

Team C

Team D !!!

Team B

Pattern #2 – The Competition (ver. 3) 19

Team A

Team C

Team D

!!!

!!!

!!! …

Pattern #3: The Night Run Fallacy

Pattern #3 – The Night Run Fallacy 21

Pattern #3 – The Night Run Fallacy 22

Night Time

Test Time

Pattern #3 – The Night Run Fallacy 23

Corporate Truism:

It’s easier to get budget for machines than for more testers

Pattern #3 – The Night Run Fallacy 24

Pattern #3 – The Night Run Fallacy 25

Test Automation Truism:

Machines create work for more testers

The Tragedy of the Commons

multiple individuals…

will ultimately deplete a shared limited resource…

even when it is not in their long-term interest

Garrett Hardin, Science, 1968

Pattern #4: Going for the Numbers

28

29

Pattern #4 – Going for the Numbers

Robustness is Invisible

30

Pattern #4 – Going for the Numbers

Pattern #5: The Magician Apprentice Syndrome

Pattern #5 – The Magician Apprentice 32

33

Recap: The Patterns

Mushrooming

The Competition

The Night Run Fallacy

Going for the Numbers

The Magician Apprentice

So... 34

What can we do?

Are the patterns… 35

Unavoidable ??

Counter measures 36

Mushrooming 37

How to use this information?

GPS: Locate the stage you are at

Get directions for the way out

Map: Start right and avoid the wrong turns

38

Alert Signals & Counter Measures

Alert Signals 39

40

Alert Signals

Failing the project

41

Alert Signals

Single feature test tool?

The creator is the user?

“Skunk works”?

Key words: “Tool”

“Utility”

42

Alert Signals: Stage 1 Small, local, feature-centered

Ensure the following… and relax:

Code control

Documentation: User Manual (Usage line…)

“Green” in the code

High level design

43

Counter Measures: Stage 1 Small, local, feature-centered
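As a concrete illustration of the Stage 1 counter measures, here is a minimal sketch of what "documentation with a usage line" might look like for a small, single-feature Python tool. The script name, option, and help text are invented for illustration, not taken from the talk.

```python
"""run_smoke.py - run the smoke tests found in a directory.

Usage:
    python run_smoke.py [--verbose] TEST_DIR
"""
import argparse

def parse_args(argv=None):
    # A docstring usage line plus argparse gives even a one-person,
    # single-feature tool a minimal "User Manual" for free.
    parser = argparse.ArgumentParser(
        description="Run the smoke tests found in TEST_DIR.")
    parser.add_argument("test_dir", help="directory containing test scripts")
    parser.add_argument("--verbose", action="store_true",
                        help="print each test name as it runs")
    return parser.parse_args(argv)

args = parse_args(["tests", "--verbose"])
print(args.test_dir, args.verbose)  # prints: tests True
```

Together with version control, this is roughly all the "code control and documentation" a Stage 1 tool needs before anyone else starts using it.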

Additional features?

Multiple users?

Automation web site?

>25% of the tester’s time?

Key words: “Use by other testers”

“Common Libraries”

44

Alert Signals: Stage 2 Generalization

45

Counter Measures: Stage 2 Generalization

“The hardest part of building a software system is deciding precisely what to build... No other part of the work so cripples the resulting system if done wrong. No other part is more difficult to rectify later” - Fred P. Brooks (author of “The Mythical Man-Month”)

Stage 1 measures

Strategy

Architecture

Lightweight PM

Version control

Scope control

Bugs & Requests database

46

Counter Measures: Stage 2 Generalization

Alert Signals: Stage 3

Requests for additional heads?

Automation F2F?

Tool-related delays in execution?

Key words: “Tool Owner”; “Automation team”

“Framework”; “Infrastructure”

“Roll back”

“Bug fix release”

47 Institutionalization and staffing

48

Stage 1, 2 counter measures

Management level decision time:

Programming language

Framework focus

Code and release management

Acceptance tests

Skillset development

Metrics

49 Institutionalization and staffing

Counter Measures: Stage 3

Counter Measures: Stage 3

Metrics suggestions

Automation framework quality: Number of false fails; Framework’s test results; Bug trends

ROI: Number of runs; Invested effort by type (new, maintenance, rewrite); Number of bugs found by Automation (?)

50 Institutionalization and staffing
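The metric suggestions above can be computed from whatever run records the framework already keeps. A sketch with invented record fields (the field names are my assumption, not from the talk):

```python
from collections import Counter

# Hypothetical run records; the field names are invented for illustration.
runs = [
    {"result": "fail", "false_fail": True,  "effort_type": "maintenance"},
    {"result": "pass", "false_fail": False, "effort_type": "new"},
    {"result": "fail", "false_fail": False, "effort_type": "rewrite"},
    {"result": "pass", "false_fail": False, "effort_type": "maintenance"},
]

# Framework quality: how often the framework (not the product) failed.
false_fail_rate = sum(r["false_fail"] for r in runs) / len(runs)

# ROI inputs: run count and where the maintenance effort went.
number_of_runs = len(runs)
effort_by_type = Counter(r["effort_type"] for r in runs)

print(f"false-fail rate: {false_fail_rate:.0%} over {number_of_runs} runs")
print(dict(effort_by_type))
```

Tracking these over releases is what exposes the trend the talk warns about: effort drifting from "new" toward "maintenance" and "rewrite".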

Generic features?

Test log wading?

False fails?

Key words: “test suite / cycle generation”

“robustness enhancement”

“setup issues”

51

Alert Signals: Stage 4 Change of focus: Technology Management

Stage 1, 2, 3 counter measures

Build vs. Buy

Re-architect Core / Non-Core

Solid infrastructure

Influence upstream: Testability hooks

Design for Test (automation)

52

Counter Measures: Stage 4 Change of focus: Technology Management

Stage 4 – Non-core features

Build vs. Buy?

(hint: Buy)

See: http://www.stickyminds.com/s.asp?F=S17601_COL_2

Counter Measures: Stage 4

Build (vs. Buy) if…

Competitive edge

Existing expertise

Core competency

Cheaper; Faster

Good use of resources

Acceptable risk

Long term support

54

Counter Measures: Stage 4 Change of focus: Technology Management

Main source: Allen Eskelin http://www.informit.com/articles/article.aspx?p=21775

Maintenance & logistics overload?

Limitations overplay?

Loss of credibility?

Stage 1 initiatives?

Key words: “Did it fail in manual test?”

“Architecture limitation”

“refactoring”; “redesign”

“…I can write a small program…”

55

Alert Signals: Stage 5 Maintenance overload; Re-design

Loss of credibility

56

Alert Signals: Stage 5 Maintenance overload; Re-design

Options:

Continue…

Give up problematic areas

Partial return to Stage 1

“We value Robustness over New Features”

Prepare for re-design – with a new map...

57

Counter Measures: Stage 5 Maintenance overload; Re-design

GPS: Locate the stage you are at

Get directions for the way out

Be alert for Alerts

Identify your stage

Analyze your situation

Implement Counter Measures

How to Use this information? 58

Plan your trip

Be alert for Alerts

Analyze your situation

Implement Counter Measures

How to Use this information? 59

Map: Start right and avoid the wrong turns

The Competition 60

Pattern #2 – The Competition (ver. 1) 61

Salvageable up to stage 2

Stage 3 and up: Merge the teams

EOL both tools

Start a 3rd

Pattern #2 – The Competition (ver. 2) 62

Plugin Architecture

Pattern #2 – The Competition (ver. 3) 63

Fix the main problem

Accept, encourage Stage 1 stuff…

Maintaining vigil:

Avoid ver.1 pattern

The Night Run Fallacy 64

Pattern #3 – The Night Run Fallacy 65

Never forget test time, test efficiency

Balance test skills vs. automation skills

Allocate machines or machine time
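The fallacy rests on treating the night as free capacity. A back-of-the-envelope check (all numbers invented for illustration, not from the talk) shows that machine time, like any shared resource, runs out:

```python
# All numbers below are illustrative, not from the talk.
machines = 8
night_hours = 10          # overnight window per machine
avg_test_minutes = 6      # average execution time per test
tests_in_suite = 1200

machine_minutes = machines * night_hours * 60
capacity = machine_minutes // avg_test_minutes   # tests per night

print(f"nightly capacity: {capacity} tests; suite size: {tests_in_suite}")
# 4800 machine-minutes buy 800 test runs: the suite no longer fits,
# before even counting the morning effort of triaging the results.
```

This is also why "allocate machines or machine time" appears as a counter measure: without explicit allocation, every team assumes the whole night is theirs.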

Going for the Numbers 66

Pattern #4 – Going for the Numbers 67

Change how you count “Done”

The Magician Apprentice Syndrome 68

Pattern #5 – The Magician Apprentice 69

http://www.youtube.com/watch?v=zlnz1rSJj7Y

Pattern #5 – The Magician Apprentice 70

Automation should not necessarily mimic humans…

… it should get the job done
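A hypothetical contrast of "mimicking the human" versus "getting the job done": instead of scripting menu clicks to read a value off a settings dialog, the automation can verify the persisted state directly. The config format and key names are invented for illustration.

```python
import configparser
import io

# The setting as the application persists it (stand-in for a real file).
saved = io.StringIO("[display]\nresolution = 1920x1080\n")

def resolution_from_config(stream):
    # "Get the job done": read the stored value directly instead of
    # navigating the UI and screen-scraping the dialog like a human would.
    cfg = configparser.ConfigParser()
    cfg.read_file(stream)
    return cfg["display"]["resolution"]

print(resolution_from_config(saved))  # prints: 1920x1080
```

The direct check is faster, far less brittle than UI mimicry, and fails only when the product state is actually wrong.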

Pattern #5 – The Magician Apprentice 71

http://www.youtube.com/watch?v=qDVYIT85ntg

Pattern #5 – The Magician Apprentice 72

Holistic approach vs. “test after test”

Re-think, re-strategize, re-evaluate before automating

Identify Actions, Keywords (KDT)
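The "identify actions and keywords" idea is the core of keyword-driven testing (KDT): a test case becomes data rows of (keyword, arguments) dispatched to a small set of action functions. A minimal sketch, with keywords and actions invented for illustration:

```python
# Action functions: the reusable vocabulary of the test suite.
def open_app(state, name):
    state["app"] = name

def set_option(state, key, value):
    state.setdefault("options", {})[key] = value

def check_option(state, key, expected):
    assert state["options"][key] == expected, f"{key} != {expected}"

KEYWORDS = {"open": open_app, "set": set_option, "check": check_option}

def run_test(rows):
    """Run one keyword-driven test: each row is (keyword, *args)."""
    state = {}
    for keyword, *args in rows:
        KEYWORDS[keyword](state, *args)
    return state

# A test case is now data, writable without touching the framework.
final = run_test([
    ("open", "display settings"),
    ("set", "resolution", "1920x1080"),
    ("check", "resolution", "1920x1080"),
])
print(final["app"])  # prints: display settings
```

Because the test logic lives in a handful of actions rather than in every script, re-thinking the strategy before automating stays cheap.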

Summary 73

74

The Patterns

Mushrooming

The Competition

The Night Run Fallacy

Going for the Numbers

The Magician Apprentice

The patterns are pervasive

Almost inevitable

Awareness is key (engineers; managers)

Driven by organizational and human nature

The solution is only partially technical

75

Summary

Thank You!

Question time…

michael.stahl@intel.com

www.testprincipia.com
