Experience with a Profile-based Automated Testing Environment


DESCRIPTION

Industry Track, ISSRE 2003, November 18, 2003. Five Levels of Automated Testing; case study of level 4 model-based testing.

TRANSCRIPT

Page 1: Experience with a Profile-based Automated Testing Environment

mVerify A Million Users in a Box ™

www.mVerify.com

Experience with a Profile-based Automated Testing Environment

Presented at ISSRE 2003, November 18, 2003

Robert V. Binder, mVerify Corporation

Page 2: Experience with a Profile-based Automated Testing Environment

Overview

Levels of Automated Testing

System Under Test

Approach

Observations

Page 3: Experience with a Profile-based Automated Testing Environment

Musa’s Observation

Testing driven by an operational profile is very efficient because it identifies failures (and hence the faults causing them) on average, in order of how often they occur.

This approach rapidly increases reliability … because the failures that occur most frequently are caused by the faulty operations used most frequently.

IEEE Software, March 1993

Page 4: Experience with a Profile-based Automated Testing Environment

Promise of Profile-Based Testing

Tester's point of view versus the reliability analyst's

Maximize reliability within fixed budget

Measurement of reliability not primary goal

Profile-Based Testing is optimal when

All available information has already been used

Resources must be allocated to testing a complex SUT

Many significant practical obstacles

Page 5: Experience with a Profile-based Automated Testing Environment

Testing by Poking Around

Manual “exploratory” testing applied directly to the System Under Test

• Not effective
• Low coverage
• Not repeatable
• Can’t scale

Page 6: Experience with a Profile-based Automated Testing Environment

Manual Testing

Manual test design/generation → test setup → manual test input → System Under Test → test results evaluation

• 1 test per hour
• Not repeatable

Page 7: Experience with a Profile-based Automated Testing Environment

Automated Test Script

Manual test design/generation → test setup → test script programming → scripted execution against the System Under Test → test results evaluation

• 10+ tests per hour
• Repeatable
• Brittle

Page 8: Experience with a Profile-based Automated Testing Environment

Automated Generation/Agent

Model-based test design/generation → test setup → automatic test execution against the System Under Test → test results evaluation

• 1000+ tests per hour
• High fidelity
• Evaluation limited
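
As a rough illustration of this level (not the actual mVerify tooling), the sketch below random-walks a small usage model to generate executable test sequences automatically; with no oracle, evaluation is limited to crash/no-crash observations. The model, operations, and execute() stub are hypothetical.

```python
import random

# Hypothetical usage model: state -> {operation: next_state}.
USAGE_MODEL = {
    "logged_out": {"login": "idle"},
    "idle":       {"enter_order": "order_open", "query_quotes": "idle", "logout": "logged_out"},
    "order_open": {"amend_order": "order_open", "cancel_order": "idle", "logout": "logged_out"},
}

def generate_sequence(max_steps=10, seed=None):
    """Random-walk the usage model to produce one test sequence (a list of operations)."""
    rng = random.Random(seed)
    state, steps = "logged_out", []
    for _ in range(max_steps):
        op, state = rng.choice(sorted(USAGE_MODEL[state].items()))
        steps.append(op)
    return steps

def execute(sequence):
    """Placeholder for a test agent driving the SUT; only crash/no-crash is observed."""
    for op in sequence:
        pass  # submit op to the SUT here
    return True  # no exception means "no crash"

if __name__ == "__main__":
    suite = [generate_sequence(seed=i) for i in range(1000)]   # 1,000 generated tests
    passed = sum(execute(seq) for seq in suite)
    print(f"{passed}/{len(suite)} sequences ran without a crash")
```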

Page 9: Experience with a Profile-based Automated Testing Environment

Full Test Automation

Model-based test design/generation → automated test setup → automatic test execution against the System Under Test → automated test results evaluation

• Advanced Mobile App Testing Environment
• Q3 2005

Page 10: Experience with a Profile-based Automated Testing Environment

Application Under Test

E-commerce/securities market, screen-based trading over private network

3 million transactions per hour

15 billion dollars per day

Three-year project; version 1.0 live Q4 2001

Page 11: Experience with a Profile-based Automated Testing Environment

Development Process/Environment

Rational Unified Process

About 90 use-cases, 600 KLOC Java

Java (services and GUI), some XML

Oracle DBMS

Many legacy interfaces

CORBA/IDL distributed object model

High-availability (HA) Sun server farm

Dedicated test environment

Page 12: Experience with a Profile-based Automated Testing Environment

Profile-based Testing Approach

Executable operational profile

Simulator generates realistic unique test suites

Loosely coupled automated test agents

Oracle/Comparator automatically evaluate results

Supports integration, functional, and stress testing
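
A minimal sketch of the executable-operational-profile idea on this slide, not the Prolog simulator itself: operations are sampled with weights matching their expected production frequency, so every generated suite is fresh yet realistic. The operation names and weights below are invented for illustration.

```python
import random

# Hypothetical operational profile: operation -> relative frequency in production use.
OPERATIONAL_PROFILE = {
    "enter_order":  0.55,
    "amend_order":  0.20,
    "cancel_order": 0.15,
    "query_quotes": 0.09,
    "admin_action": 0.01,
}

def generate_suite(n_events, seed=None):
    """Sample n_events operations so the suite mirrors the operational profile."""
    rng = random.Random(seed)
    ops = list(OPERATIONAL_PROFILE)
    weights = [OPERATIONAL_PROFILE[op] for op in ops]
    return rng.choices(ops, weights=weights, k=n_events)

if __name__ == "__main__":
    suite = generate_suite(100_000, seed=42)      # a fresh, unique suite per run
    for op in sorted(OPERATIONAL_PROFILE):
        share = suite.count(op) / len(suite)
        print(f"{op:>12}: {share:.3f}")           # observed share tracks the profile
```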

Page 13: Experience with a Profile-based Automated Testing Environment

Model-based Testing

Profile alone insufficient

Extended Use Case

RBSC test methodology

Defines feature usage profile

Input conditions, output actions

Mode Machine

Invariant Boundaries
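
To make the mode-machine and invariant-boundary terms concrete, here is a hedged sketch (not the RBSC methodology itself): a mode machine restricts which operations are legal in each mode, and invariant boundaries supply the edge-of-legal input values worth testing. All modes, operations, and limits are hypothetical.

```python
# Hypothetical mode machine for an order: (mode, operation) -> next mode.
MODE_MACHINE = {
    ("new", "accept"):    "open",
    ("open", "fill"):     "filled",
    ("open", "cancel"):   "cancelled",
    ("filled", "settle"): "settled",
}

def next_mode(mode, operation):
    """Return the resulting mode, or raise if the operation is illegal in this mode."""
    try:
        return MODE_MACHINE[(mode, operation)]
    except KeyError:
        raise ValueError(f"{operation!r} is not allowed in mode {mode!r}")

# Hypothetical invariant: 0 < quantity <= 10_000. Its boundary values are prime test inputs.
QUANTITY_BOUNDARIES = [0, 1, 10_000, 10_001]     # just outside / on each edge of the invariant

def quantity_invariant_holds(quantity):
    return 0 < quantity <= 10_000

if __name__ == "__main__":
    print(next_mode("new", "accept"))             # open
    for q in QUANTITY_BOUNDARIES:
        print(q, quantity_invariant_holds(q))     # False, True, True, False
    try:
        next_mode("filled", "cancel")             # illegal transition is rejected
    except ValueError as err:
        print("rejected:", err)
```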

Page 14: Experience with a Profile-based Automated Testing Environment

Simulator

Discrete event simulation

Generates any distribution from a pseudo-random number generator

Prolog implementation (50 KLOC)

Rule inversion

Load Profile

Time domain variation

Orthogonal to operational profile

Each event assigned a "port" and submit time
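
A toy discrete-event sketch of what this slide describes, assuming exponential inter-arrival times scaled by a piecewise load profile; the real simulator was 50 KLOC of Prolog, so this only shows how each generated event can carry a submit time and a "port" (test agent). The profile, ports, and rates are invented.

```python
import random

OPERATIONAL_PROFILE = {"enter_order": 0.6, "cancel_order": 0.3, "query_quotes": 0.1}
PORTS = ["agent-1", "agent-2", "agent-3"]                 # hypothetical test-agent ports
LOAD_PROFILE = [(0, 10.0), (1800, 50.0), (3600, 10.0)]    # (start second, events per second)

def rate_at(t):
    """Piecewise-constant load profile: events per second at simulated time t."""
    rate = LOAD_PROFILE[0][1]
    for start, r in LOAD_PROFILE:
        if t >= start:
            rate = r
    return rate

def simulate(duration_s, seed=None):
    """Yield (submit_time, port, operation) events with exponential inter-arrival times."""
    rng = random.Random(seed)
    ops, weights = zip(*OPERATIONAL_PROFILE.items())
    t = 0.0
    while True:
        t += rng.expovariate(rate_at(t))                  # time-domain variation from the load profile
        if t >= duration_s:
            return
        op = rng.choices(ops, weights=weights)[0]         # drawn from the operational profile
        yield round(t, 3), rng.choice(PORTS), op

if __name__ == "__main__":
    events = list(simulate(5400, seed=7))
    print(len(events), "events generated; first three:", events[:3])
```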

Page 15: Experience with a Profile-based Automated Testing Environment

Test Environment

Simulator generates interface-independent content

Adapters for each SUT Interface

Formats events for each test agent API

Generates script code

Test Agents execute independently

Distributed processing/serialization challenges

Loosely coupled, best-effort strategy

Embedded server-side serialization monitor
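
To illustrate the adapter idea above (interface-independent content formatted per test-agent API), here is a small sketch with invented interfaces and field names; the real environment generated script code for COTS and custom agents.

```python
# Interface-independent event as the simulator might emit it (fields are invented).
event = {"submit_time": 12.345, "port": "agent-1", "op": "enter_order",
         "account": "A-100", "symbol": "XYZ", "qty": 500}

def to_gui_script(ev):
    """Adapter for a hypothetical GUI-driving agent: emit script-style commands."""
    return [
        f"at {ev['submit_time']}",
        f"type account {ev['account']}",
        f"type symbol {ev['symbol']}",
        f"type qty {ev['qty']}",
        "click Submit",
    ]

def to_wire_message(ev):
    """Adapter for a hypothetical wire-protocol agent: emit a key=value message (keys invented)."""
    fields = {"op": ev["op"], "acct": ev["account"], "sym": ev["symbol"], "qty": ev["qty"]}
    return "|".join(f"{k}={v}" for k, v in fields.items())

ADAPTERS = {"gui": to_gui_script, "wire": to_wire_message}

if __name__ == "__main__":
    for name, adapt in ADAPTERS.items():
        print(name, "->", adapt(event))
```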

Page 16: Experience with a Profile-based Automated Testing Environment

Automated Run Evaluation

Oracle accepts output of simulator

About 500 unique rules

Verification Splainer – result/rule backtracking tool

Rule/Run coverage analyzer

Comparator

Extracts transaction log

Post-run database state

End-to-end invariant

Stealth requirements engineering
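
A hedged sketch of the oracle/comparator split described on this slide, not the actual roughly 500-rule oracle: the oracle derives expected results from the same event stream the simulator produced, and the comparator diffs them against the transaction log extracted from the SUT. The event fields and the single acceptance rule are invented.

```python
# Simulator output: the same events that were submitted to the SUT.
submitted = [
    {"id": 1, "op": "enter_order",  "qty": 100},
    {"id": 2, "op": "enter_order",  "qty": 0},     # violates qty > 0, should be rejected
    {"id": 3, "op": "cancel_order", "ref": 1},
]

def oracle(events):
    """Rule-based oracle: derive the expected transaction log from submitted events."""
    expected = {}
    for ev in events:
        if ev["op"] == "enter_order":
            expected[ev["id"]] = "ACCEPTED" if ev["qty"] > 0 else "REJECTED"
        elif ev["op"] == "cancel_order":
            expected[ev["id"]] = "CANCELLED" if expected.get(ev["ref"]) == "ACCEPTED" else "REJECTED"
    return expected

def comparator(expected, actual):
    """Report every event whose logged outcome differs from the oracle's expectation."""
    return [(eid, exp, actual.get(eid)) for eid, exp in expected.items() if actual.get(eid) != exp]

if __name__ == "__main__":
    actual_log = {1: "ACCEPTED", 2: "ACCEPTED", 3: "CANCELLED"}   # as if extracted from the SUT
    for eid, exp, act in comparator(oracle(submitted), actual_log):
        print(f"event {eid}: expected {exp}, got {act}")          # event 2 is flagged
```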

Page 17: Experience with a Profile-based Automated Testing Environment

Overall Process

Six development increments

3 to 5 months each

Test design/implementation parallel with app dev

Plan each day's test run

Load profile

Total volume

Configuration/operational scenarios

Page 18: Experience with a Profile-based Automated Testing Environment

Daily Test Process

Run Simulator

100,000 events per hour

FTP event files to test agents

Start SUT

Test agents automatically start at scheduled time

Extract results

Run Oracle/Comparator

Prepare bug reports
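
The daily steps above could be strung together roughly as in the sketch below; this is a hypothetical driver, not the production tooling, and the command names, hosts, and file paths are placeholders.

```python
import subprocess
import time

AGENT_HOSTS = ["agent-1.test.local", "agent-2.test.local"]   # hypothetical agent machines
DRY_RUN = True                                                # print steps instead of executing them

def step(cmd):
    """Run one shell step of the daily process (or just print it in dry-run mode)."""
    print("+", cmd)
    if not DRY_RUN:
        subprocess.run(cmd, shell=True, check=True)

def daily_run(agents_start_epoch):
    step("simulate --events-per-hour 100000 --out events/")        # run the simulator
    for host in AGENT_HOSTS:                                       # push event files to the agents
        step(f"ftp-put events/{host}.evt {host}:/opt/agent/inbox/")
    step("sut-control start")                                      # start the SUT
    while time.time() < agents_start_epoch:                        # agents start at their scheduled time
        time.sleep(1)
    step("extract-results --out results/txn.log")                  # extract results from the SUT
    step("oracle --events events/ --out results/expected.log")     # oracle: expected results
    step("comparator results/expected.log results/txn.log")        # comparator: flag mismatches
    # mismatches reported by the comparator feed the day's bug reports

if __name__ == "__main__":
    daily_run(agents_start_epoch=time.time() + 5)
```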

Page 19: Experience with a Profile-based Automated Testing Environment

Problems and Solutions

One-time sample not effective, but fresh test suites too expensive → Simulator generates a fresh, accurate sample on demand

Too expensive to develop expected results → Oracle generates expected results on demand

Too many test cases to evaluate → Comparator automates checking

Profile/requirements change → Incremental changes to rule base

SUT interfaces change → Common agent interface

Page 20: Experience with a Profile-based Automated Testing Environment

Technical Achievements

AI-based user simulation generates test suites

All inputs generated under operational profile

High volume oracle and evaluation

Every test run unique and realistic (about 200 runs)

Evaluated functionality and load response with fresh tests

Effective control of many different test agents (COTS/custom, Java/4Test/Perl/SQL/proprietary)

Page 21: Experience with a Profile-based Automated Testing Environment

Problems

Stamp coupling

Simulator, Agents, Oracle, Comparator

Re-factoring rule relationships, Prolog limitations

Configuration hassles

Scale-up constraints

Distributed schedule brittleness

Horn Clause Shock Syndrome

Page 22: Experience with a Profile-based Automated Testing Environment

Results

Revealed about 1,500 bugs over two years

5% showstoppers

Five-person team, huge productivity increase

Achieved proven high reliability

Last pre-release test run: 500,000 events in two hours, no failures detected

Bonus: 1M-event run closed big affiliate deal

No production failures