Acceptance Test Driven Development (Lasse Koskela at Scrum Gathering Orlando 2010)

Copyright Reaktor 2009 Acceptance Test Driven Development Scrum Gathering Orlando 2010 Lasse Koskela @lassekoskela lasse@reaktor.fi 1 Tuesday, March 9, 2010


DESCRIPTION

Our industry has pretty much accepted the value of automated developer tests, and the practice of TDD is slowly becoming a mainstream practice among craftsman programmers for ensuring code's correctness as well as aiding in its design. Similar benefits can be delivered with Acceptance Test-Driven Development (ATDD), a test-driven approach to implementing product backlog items.

We begin with an introduction to the core ideas and rationale behind ATDD, creating a baseline for understanding what kind of benefits one might expect from adopting the practice and for recognizing the dynamic that enables those benefits.

We discuss common variations of ATDD, ranging from a completely manual process to a fully automated one with executable acceptance tests, and from a serial process to a parallelized one. Before opening the floor to questions, the presenter walks through an example scenario, illustrating what the discussed artifacts might look like in practice.

TRANSCRIPT

Page 1

Acceptance Test Driven Development
Scrum Gathering Orlando 2010

Lasse Koskela
@lassekoskela
lasse@reaktor.fi

Page 2

Why should I care?

With acceptance test driven development...

...we collaborate between dev, test, biz, etc.
...we build a shared language and vocabulary.
...we understand what we're implementing.
...the scope of our work is clear and understood.
...we get feedback earlier.

Page 3

(image slide)

Page 4

ATDD. What is it?

Page 5

"acceptance testing"

(chart comparing ATDD, STDD, CTDD, and FTDD)

Page 6

Conversations about the product and its features.

Page 7

(image slide)

Page 8

What were they doing?

1. Specification
2. Validation
3. Verification

Page 9

Mary Poppendieck:

"If you can't predict the results of a test, you don't understand the system."

Page 10

Relation to TDD.

Page 11

TDD:  developer activity; code-facing; local, isolated; automated tests.
ATDD: team activity; business-facing; holistic; may be automated.

Page 12

(image slide)

Page 13

What does it look like?

Page 14

(image slide)

Page 15

(image slide)

Page 16

As a competition organizer,
in order to show the judging to the audience on several monitors,
I want to have the scores for the current pair on a web page.

Page 17

"With 3 judges giving scores 7, 11, and 2, the displayed score should be 20."

"When the first 2 judges have given their scores, e.g. 7 and 11, the intermediate score of 18 should be displayed already."

These examples, and any additional business rules we describe, become the acceptance criteria or tests.
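The two scoring examples above are concrete enough to run. A minimal sketch (my own illustration, not code from the talk) of how they could become automated acceptance checks, assuming a hypothetical ScoreDisplay class that shows the sum of the scores given so far:

```python
# Hypothetical score display for the judging scenario: it shows the
# running total of the scores the judges have given so far.
class ScoreDisplay:
    def __init__(self):
        self.scores = []

    def add_score(self, score):
        self.scores.append(score)

    def displayed_score(self):
        # Intermediate totals are shown as soon as any judge has scored.
        return sum(self.scores)

# "With 3 judges giving scores 7, 11, and 2, the displayed score should be 20."
display = ScoreDisplay()
for score in (7, 11, 2):
    display.add_score(score)
assert display.displayed_score() == 20

# "When the first 2 judges have given their scores, e.g. 7 and 11,
# the intermediate score of 18 should be displayed already."
display = ScoreDisplay()
display.add_score(7)
display.add_score(11)
assert display.displayed_score() == 18
```

Each assertion is a direct transcription of one example sentence, which is exactly the point: the customer's concrete example and the acceptance test state the same expectation.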

Page 18

The examples are translated into executable tests, allowing a computer to run them against the product.

Page 19

(image slide)

Page 20

http://bit.ly/atddoverview

To me, the essential characteristics of ATDD, by whatever name you want to call it, are:

1. We use test ideas to elicit details from the business stakeholder(s) about their expectations. By proactively discussing test ideas - like boundary conditions or configurations or varying sequences of user actions, etc. - we can come to a shared understanding of the business stakeholder's real expectations for the system instead of having testers file a whole bunch of bugs late in the process.

2. We distill acceptance criteria into automatable tests expressed in a natural language rather than a programming language. This enables us to completely separate the articulation of expectations from any technical details or dependencies.

3. We write fixtures or libraries to wire the keywords in the tests to the software under development during implementation. That's wiring, not translating. And doing it as part of the implementation effort, not attempting to retrofit automation after the code is written.

Elisabeth Hendrickson (emphasis mine)
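Point 3 above - wiring natural-language keywords to the software rather than translating them - can be sketched roughly as follows. This is my own minimal illustration, not Hendrickson's tooling; the keyword names, the Scoreboard class, and the fixture are all hypothetical:

```python
# The system under development (hypothetical).
class Scoreboard:
    def __init__(self):
        self.scores = []

# Fixture: each method "wires" one natural-language keyword directly
# to the code under development.
class ScoreboardFixture:
    def __init__(self):
        self.board = Scoreboard()

    def judge_gives_score(self, n):
        self.board.scores.append(int(n))

    def displayed_score_should_be(self, n):
        assert sum(self.board.scores) == int(n)

def run_test(lines, fixture):
    # Each line ends with an argument; the rest is the keyword,
    # which maps onto a fixture method name.
    for line in lines:
        keyword, arg = line.rsplit(" ", 1)
        getattr(fixture, keyword.replace(" ", "_"))(arg)

# The test itself stays in natural language, free of technical detail:
test = [
    "judge gives score 7",
    "judge gives score 11",
    "displayed score should be 18",
]
run_test(test, ScoreboardFixture())
```

Tools like Fit or Robot Framework do this dispatching for you; the design choice is the same - the natural-language test never changes when the wiring does.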

Page 21

(image slide)

Page 22

Critique

Page 23

"The Problems With Acceptance Testing"

http://bit.ly/alternatives2at

My experience with Fit and other agile acceptance testing tools is that they cost more than they're worth. There's a lot of value in getting concrete examples from real customers and business experts; not so much value in using "natural language" tools like Fit and similar.

The planned benefit of those tools is that the customers are supposed to write the examples themselves, thus improving communication between customers and programmers. In practice, I found that customers (a) weren't interested in doing that, and (b) often couldn't understand and didn't trust tests that were written by others. Typically, responsibility for the tests gets handed off to testers, which defeats the whole point.

Furthermore, acceptance testing tools are almost invariably used to create end-to-end integration tests, which are slow and brittle. Fit works best for targeted tests that describe the domain, but that's not how it's used. Also, tools like Fit don't work with refactoring tools. Once you have a lot of tests, they become a real maintenance burden.

These two problems - that customers don't participate, which eliminates the purpose of acceptance testing, and that the tests create a significant maintenance burden - mean that acceptance testing isn't worth the cost. I no longer use it or recommend it.

Instead, I involve business experts (on-site customers) closely throughout the iteration. I do have them create concrete examples, but only when faced with particularly complex topics, and never with the intention that they be run by a tool like Fit. Instead, the programmers use those examples to inform their test-driven development, which may or may not involve creating automated tests from the examples, at the programmers' discretion.

I also have the on-site customers conduct ad-hoc reviews of the software as programmers complete it. They pair with a programmer, look at what's been created, and ask for changes that the programmer implements on the spot.

James Shore

Page 24

George Dinwiddie:

"I've had some success using automatable examples as a tool to enhance the conversation between the customer, the tester, and the developer. [To add] precision to the agreement about what needs to be accomplished. The goal is for the customer to look at the test and say 'Yes, that's what I want.'"

http://bit.ly/realityofaat

Page 25

Ron Jeffries:

"I'm concerned about the potential loss of understanding and agreement between on-site customer / product owner and the team. Examples are the best way to do that and while it is more work for the customer, the examples are more likely to be understood, and to be correct."

http://bit.ly/problemswithat

Page 26

Ron Jeffries:

"I'm OK that [acceptance tests] are not customer-understandable - though I would prefer that they were if it were close to free. I am less comfortable with the notion that they are not automated. My concern would be that if they are not automated, doors are opened to regressions."

http://bit.ly/problemswithat

Page 27

Ian Cooper:

"With tests that are evaluated by an impartial tool, 'done' is really 'what everyone agreed on', not 'almost done with just a few things to fill in tomorrow'. I'm not sure whether an on-site review is enough to guard against this completely."

http://bit.ly/areattoolsevil

Page 28

James Shore:

"When it comes to testing, my goal is to eliminate defects. [...] I think of defects as coming from four sources: programmer errors, design errors, requirements errors, and systemic errors. When trying to eliminate defects, I look for practices that address these four causes."

http://bit.ly/alternatives2at

Page 29

For catching requirements errors, Jim uses the following practices:

• Whole Team
• Customer Examples
• Customer Review
• Bring Testers Forward

http://bit.ly/alternatives2at

Page 30

Conversations about the product and its features.

Page 31

How to succeed with ATDD?

Page 32

How to fail with ATDD?

Page 33

1. No collaboration
2. Focusing on how instead of what
3. Tests unusable as living documentation
4. Expecting acceptance tests to be a full regression suite
5. Focusing on tools
6. Not considering acceptance testing as value added
7. Test code not maintained with love
8. Objectives of team members not aligned
9. No management buy-in
10. Underestimating the skill required to do this well

http://bit.ly/atddfail

Page 34

#FollowFriday