david hayman - say what? testing a voice activated system - eurostar 2010
DESCRIPTION
EuroSTAR Software Testing Conference 2010 presentation on Say What? Testing a Voice Activated System by David Hayman. See more at: http://conference.eurostarsoftwaretesting.com/past-presentations/
TRANSCRIPT
Welcome
Say What?
David Hayman
EuroSTAR 2010
Copenhagen, Denmark
Today’s Objectives
Discuss the approaches taken to test a Voice Based system
Consider how test case design techniques were used in anger
Using tools to test functionality and performance
Identify what can be done by people using a process and a positive attitude
What were the objectives of the testing in the first place?
Say Who?
“Speak when you are spoken to”
Proverb
We can also generate an outcome when you don't speak
“What has been understood no longer exists”
Paul Éluard
Can a computer 'understand', and therefore does anything ever exist?
Say Something
Kia Ora
In the Maori alphabet there are only 10 consonants and 5 vowels
It's all about pronunciation
What we say and the way we say it
Kia Ora bro
Can you test this please?
A voice activated internal Directory system
We've got a couple of possible tools
It‟ll need performance testing in conjunction with a DTMF system in production
Any experience?
Any testers with experience?
Any help?
Why are we doing this?
Why me? They left the word 'gullible' out of the dictionary
Sure, now what was the question again?
What do I know?
Who do I know?
Where can I look for help?
What have I done that is similar?
How hard can it be?
Why did I say yes?
“If no one knows what success is then how can I fail?”
Equally
“If no one knows what success is then how can I succeed?”
David Hayman 2010
Health Check
If your DTMF system has the capacity, don't use voice
Voice IVR is more 'personal' and easier for people with some disabilities, e.g. blind or paralysed users
A bad IVR can raise your blood pressure and make you throw things
Sources of inspiration
The Internet
ISTQB
BS7925-2
Users
Customers
BAs
Testing Gurus
Other companies with voice systems
Usability groups
Me – I like a challenge
Follow the yellow brick road
Wizard of Oz Testing
What is it? We tested usability in at the start, rather than the usual at-the-end approach
Gap analysis
Test driven design
A review with a fancy name?
How does it work? Pseudo Scrum includes Marketing
Did it work? Yup
Business Process Model
5 seconds to respond
Go round twice
Bad language
Unrecognised name
Hang Ups
Hand off to operator/hand back
Department vs. individual
Include Use Case
All lines busy – engaged tone
Required line busy – go to voicemail
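The call-flow rules above (five seconds to respond, go round twice, hand off to the operator, engaged tone or voicemail on busy lines) can be sketched as routing logic. This is an illustrative sketch, not the production system; the function name `route_call` and all return values are hypothetical.

```python
def route_call(responses, line_state="free", max_retries=2):
    """Apply the business-process rules from the slides.

    `responses` is the sequence of utterances the IVR recognised on each
    prompt (None = silence or unrecognised within the 5-second window).
    The caller is re-prompted at most `max_retries` times ('go round
    twice'); after that the call is handed off to the operator. A busy
    target line goes to voicemail; all lines busy gives an engaged tone.
    """
    for attempt in range(max_retries + 1):
        name = responses[attempt] if attempt < len(responses) else None
        if name is not None:
            if line_state == "all_busy":
                return "engaged_tone"
            if line_state == "line_busy":
                return "voicemail"
            return f"transfer:{name}"
    return "operator"
```

Each branch of the model becomes a one-line test: a recognised name transfers, repeated silence falls through to the operator, and the two busy states route before any transfer is attempted.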
State Transition Models
Turning requirements into models
Establishing coverage – 100%
Build test cases
Used for impact analysis on change requests
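Turning the requirements into a transition list makes the 100% coverage claim checkable: one test case per modelled transition. The states and events below are a hypothetical illustration, not the actual production model from the talk.

```python
# Hypothetical IVR state model: each transition is (from_state, event, to_state).
TRANSITIONS = [
    ("Greeting", "name_recognised", "Confirm"),
    ("Greeting", "silence",         "Reprompt"),
    ("Greeting", "unrecognised",    "Reprompt"),
    ("Reprompt", "name_recognised", "Confirm"),
    ("Reprompt", "silence",         "Operator"),
    ("Confirm",  "yes",             "Transfer"),
    ("Confirm",  "no",              "Greeting"),
]

def transition_coverage(executed):
    """Fraction of modelled transitions exercised by the executed tests."""
    covered = set(executed) & set(TRANSITIONS)
    return len(covered) / len(TRANSITIONS)

# One test case per transition gives 100% single-transition coverage.
test_cases = [f"In {s}, on '{e}', expect {t}" for s, e, t in TRANSITIONS]
```

The same list supports impact analysis on change requests: a changed state implicates exactly the transitions that enter or leave it.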
State Model Example
Put the Fun in Functional
Types of name
Individuals
Departments
Fun with test data:
Mark Eting
The Reverend Ndabanibgi Sitole
John Smith
Phonetics
Accents [all within IT department]
Phonetic vs. local
Actual Recordings
Wav files
First in NZ to use this tool
UI issues and other
Tool recognition thresholds – levels of correctness - Nuance
Codec changes introduced 'clicks'
Jitter is an issue
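Recognition thresholds like those mentioned above typically follow a two-threshold pattern: accept outright above a high confidence score, ask for confirmation in the middle band, and re-prompt below a low score. The threshold values here are illustrative, not the project's actual Nuance settings.

```python
# Two-threshold recognition decision, common in speech-driven IVRs.
# Values are illustrative assumptions, not the real tuned thresholds.
HIGH_CONFIDENCE = 0.80   # at or above: accept without confirmation
LOW_CONFIDENCE = 0.45    # below: treat as unrecognised

def decide(confidence):
    """Map an engine confidence score (0.0-1.0) to an IVR action."""
    if confidence >= HIGH_CONFIDENCE:
        return "accept"
    if confidence >= LOW_CONFIDENCE:
        return "confirm"   # "To confirm, did you say ...?"
    return "reject"        # re-prompt, or hand off after retries
```

Testing against thresholds means probing scores just either side of each boundary, since tuning the bands is where most of the work went.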
Script #1
Step / Actor / Response
Step 1
IVR: "Hello and welcome to the EuroSTAR Interactive Voice Response system. To help us direct your call please choose one of the following options. Say "Speakers", "Tickets", "Gala Dinner", "Free Gifts", or "Operator"."
Step 2
Caller: Silence
Step 3
IVR: "I'm sorry, I didn't catch that. To help us direct your call please choose one of the following options. Say "Speakers", "Tickets", "Gala Dinner", "Free Gifts", or "Operator"."
Step 4
Caller: "Tickets"
Step 5
IVR: "To confirm, did you say "Tickets"?"
Step 6
Caller: "Yes"
Step 7
IVR: "Thank you. Putting you through to tickets."
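A script like the one above turns naturally into a data-driven test: play each caller line, then assert on a fragment of the IVR's reply. This is a stand-alone sketch of that idea, not the real API of any call-testing tool; `run_script` and the `SCRIPT` data are hypothetical.

```python
# Each step: (caller utterance or None for silence, expected reply fragment).
SCRIPT = [
    (None,      "choose one of the following options"),
    ("Tickets", 'did you say "Tickets"'),
    ("Yes",     "Putting you through"),
]

def run_script(ivr, script):
    """Feed each caller utterance to `ivr` (a callable returning the
    prompt text) and collect any steps where the reply did not contain
    the expected fragment (case-insensitive)."""
    failures = []
    for step, (utterance, expected) in enumerate(script, 1):
        reply = ivr(utterance)
        if expected.lower() not in reply.lower():
            failures.append((step, expected, reply))
    return failures
```

In the real project a tool drove actual calls over the phone network; the value of the script format is the same either way: every prompt, silence, and confirmation is an explicit, checkable step.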
Sample problems
Roger Roff
Learning could be fooled
Bad language
Silence
Background noise
Phonetic rather than accent driven
Platforms
Home
Mobile
Skype
Speaker
Hands Free
Public call boxes
Maybe we should have done a wider usability trial
Background noise and how the IVR copes – this relates to 'noise' in a performance test, but it can also be checked on an individual call
Test Environment
Production switches and network and phone lines
Tool as a service not in-house – more realistic
No control over input devices – a good thing, as they were often 'worst case'
DTMF 'tromboning' issue caused problems with IVR throughput and line availability
Performance Testing
Tools
Tests
Operational Profile
Defects
Cyara
IBM monitoring
Test environments – we had to use external hardware, so control over quality was impossible; we had to test for, and expect, the worst.
Soak test – 4 hours
Other issues to be mentioned
Scripts need some management
Manage scripts to ensure silence is added so that the call is fully answered before the message starts
Need an operational profile
What to do with calls if all lines are engaged – hand-off, engaged tone etc.
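An operational profile, as called for above, is just a weighted mix of call types matching production traffic, used to pick scripts during load and soak runs. The profile shares below are invented for illustration; the real production mix is not stated in the talk.

```python
import random

# Illustrative operational profile: fraction of production calls by type.
# These shares are assumptions, not the project's measured traffic.
PROFILE = {
    "individual_lookup": 0.55,
    "department_lookup": 0.25,
    "operator_handoff":  0.10,
    "silence_timeout":   0.10,
}

def pick_call_type(rng=random.random):
    """Pick a call type with probability proportional to its share."""
    r, cumulative = rng(), 0.0
    for call_type, share in PROFILE.items():
        cumulative += share
        if r < cumulative:
            return call_type
    return list(PROFILE)[-1]   # guard against float rounding

def soak_plan(calls_per_hour, hours=4):
    """Total calls for a soak run, e.g. the 4-hour test mentioned above."""
    return calls_per_hour * hours
```

Driving the load generator from the profile, rather than from a single repeated script, is what makes a soak test representative of a real day on the switch.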
The end result
The quality of the product was overshadowed by the feeling in the market
Voice systems were getting a bad rap
Therefore a good system that could have improved the image was lost to the general public
How brave are you?
Good things that happened Part 1
Voice Talent
Prevention rather than cure
Gap analysis
Fully Documented
Enthused the business to get involved in testing
Good things that happened Part 2
Is this Agile – test driven design or test driven configuration?
Improved the UI on the tool
Made some friends
Learnt a lot about the tool, myself and my team
Not all systems that are bug free will go into production
Tune the system rather than test it
Acknowledgements
Alok Kulkarni, Bonny Malik and Thomas Fejes @ Cyara Solutions www.cyarasolutions.com
Nick Brown and Piers Langridge test team extraordinaire
Akash Jattan @ IBM New Zealand
You the Audience