Performance Testing in the Real World: How to Plan for a Successful Performance Test (Sept 24, 2010). Mohit Verma, Tufts Health Plan, Lead Performance Architect


TRANSCRIPT

Page 1: Mohit Verma,  Tufts Health Plan Lead  Performance Architect

Performance Testing in the Real World: How to Plan for a Successful Performance Test
Sept 24, 2010

Mohit Verma, Tufts Health Plan, Lead Performance Architect

Page 2: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Agenda

- Background
- Market State
- Why Performance Test?
- Technical Environment
- Performance Testing
- Performance Testing Benefits
- Performance Testing CSFs
- Performance Testing Synergies
- Questions?

Page 3: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Background:

Tufts Health Plan is a managed care provider (HMO); our applications typically support health care providers (hospitals, etc.), employers, and our members.

Active channels:
- EDI
- Legacy
- Web

Page 4: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Market State

Forrester recently reported that among companies with revenue of more than $1 billion, nearly 85% had experienced incidents of significant application performance degradation. Respondents identified application architecture and deployment as the primary root causes of application performance problems.

Page 5: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Example 1: Amazon.com June 29th Outage*

Amazon.com experienced a widespread outage on Tuesday that lasted, at least for many customers, more than three hours and displayed blank or partial pages instead of product listings.

By mid-afternoon, Amazon's home page was devoid of any product photographs and showed only a list of categories on the left of the screen. Searching for items often didn't work, and customers' shopping carts and saved-item lists were temporarily displayed as empty.

At annual revenue of nearly $27 billion, Amazon faces a potential loss of an average of $51,400 a minute when its site is offline. Amazon shares closed down 7.8 percent, a sharper fall than the Nasdaq index.

A post on an Amazon seller community forum at 12:47 p.m. PDT said: "We are currently experiencing an issue that is impacting customers' ability to place orders on the Amazon.com website." A follow-up announcement an hour later said the problem had not been resolved.

Page 6: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Example 2: Chase Outage Sept 14, 2010

Page 7: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Example 3: Retailer site response times by connection type (seconds)

Retailer               High Broadband   Low Broadband   Dial-Up
Dell.com                     7.48           23.08         49.64
ColdwaterCreek.com           7.50           22.76         55.88
Williams-Sonoma.com          7.78           23.28         56.52
QVC.com                      8.03           24.80         57.69
Amazon.com                   8.24           22.05         49.46
OfficeDepot.com              8.52           25.24         57.46
Scholastic.com               8.59           29.63         66.01
CDW.com                      8.95           25.38         51.15
Netflix.com                  9.48           28.39         65.09
Staples.com                  9.59           29.16         53.17

Dell.com gave shoppers the fastest high-broadband access time among large web retailers in April 2010, according to Gomez.

Page 8: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Why Performance Test?
- Software engineers often build software components/products without being aware of the target load, environment requirements, or service level agreements.
- The complexity and highly distributed nature of the hardware and web hosting servers makes optimal configuration of applications challenging.
- Globalization of users adds further complexity.
- Virtualization of business-critical applications demands performance testing.

Recommendation: performance test proactively and early in the software development lifecycle.

Page 9: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Technical Environment

N-tier Diagram - Simple

Page 10: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Typical Technical Environment

Technologies used (complex and diverse environment):
- Web application servers: WebLogic, WebSphere, JBoss, AquaLogic
- Infrastructure security: CA SiteMinder, IBM Tivoli Access Manager
- Web servers: Apache, IIS
- Middleware: TIBCO BusinessWorks and BusinessConnect
- Reporting: Siebel, Lawson, Actuate
- Midrange/mainframe/legacy: HP/IBM
- Hardware: Itanium and Windows environments
- Databases: Oracle and SQL Server
- EDI interfaces: HIPAA compliance

Page 11: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Technical Environment

N-tier environment: test environments are most often not replicas of production, so we often have to extrapolate or accept the risk of performance testing in this environment.

[Diagram: Enterprise Test Environment Architecture. A load test tool environment (load controller and load generators that send scripts and receive data/results) drives HTTP traffic and screen testing against the web portal environment (web application servers, content management, departmental database) and the EIM environment (messaging bus/messaging layer, EAI servers, file drop-off, enterprise database server, enterprise data storage), which connect to the mainframe, a proprietary application and database, a legacy database, and an external gateway. Batch runs are scheduled via the enterprise scheduler.]

Page 12: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing

What is performance testing?
- Testing that measures a performance goal (e.g., response time)
- Testing that measures application performance under user load
- Testing that measures system performance under user load, across all system variables in the deployment environment
- Testing that stresses the application/system to find its limits
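As an illustration of the first two definitions, below is a minimal load-driving sketch that uses only the Python standard library; it is not the tooling described in this deck, and the target URL, user count, and request count are placeholder assumptions.

```python
import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET_URL = "http://localhost:8080/health"  # placeholder endpoint, not a real system
CONCURRENT_USERS = 10                        # simulated virtual users
REQUESTS_PER_USER = 20

def virtual_user(user_id: int) -> list:
    """Issue a series of requests and return each successful response time in seconds."""
    timings = []
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        try:
            with urllib.request.urlopen(TARGET_URL, timeout=30) as resp:
                resp.read()
        except OSError:
            continue  # a real test would also count and report failures
        timings.append(time.perf_counter() - start)
    return timings

with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
    per_user = list(pool.map(virtual_user, range(CONCURRENT_USERS)))

all_timings = [t for user in per_user for t in user]
if all_timings:
    print(f"samples={len(all_timings)} "
          f"avg={statistics.mean(all_timings):.3f}s "
          f"p90={statistics.quantiles(all_timings, n=10)[-1]:.3f}s")
else:
    print("no successful samples")
```

Commercial and open-source tools (discussed later in the deck) add scripting, correlation, scheduling, and monitoring on top of this basic pattern.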

Page 13: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing

Key variables measured:
- End-user response time
- Resource utilization (CPU, memory, disk, etc.)
- Network utilization and latency
- Throughput (bytes/sec, etc.)
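As a small illustration of how these variables are reported after a run, the sketch below derives throughput and response-time statistics from raw timing samples; the sample values are made up.

```python
import statistics

# Each sample: (wall-clock timestamp in s, response size in bytes, elapsed time in s).
# The values below are illustrative only.
samples = [(0.0, 2048, 0.41), (0.5, 1024, 0.38), (1.2, 4096, 0.95), (1.8, 2048, 0.52)]

duration = (max(t for t, _, _ in samples) - min(t for t, _, _ in samples)) or 1.0
elapsed = [e for _, _, e in samples]

print(f"throughput: {len(samples) / duration:.1f} req/s, "
      f"{sum(b for _, b, _ in samples) / duration:.0f} bytes/s")
print(f"response time: avg={statistics.mean(elapsed):.2f}s "
      f"p95={statistics.quantiles(elapsed, n=20)[-1]:.2f}s max={max(elapsed):.2f}s")
```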

Page 14: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing – Benefits
- Measure response times for applications and enforce SLAs
- Improve the end-user experience
- Proactive load/stress testing of mission-critical applications lets us benchmark applications by concurrent-user support, response times, etc.
- Capacity planning: save costs ($$) by sizing production and non-production environments more accurately
- Help build proven, scalable applications
- Failover capabilities*

Page 15: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing: Critical Success Factors
- Understand the drivers and triggers for performance testing (NFRs)
- Build or identify a production workload model
- Well-defined success criteria (SLAs)
- Identify the business-critical workflows of the application
- Identify/create test data
- Build a test environment that models production*
- Support of all teams: performance testing is a TEAM effort!
- Workflow automation tool (load test tool)
- Load generation environment
- Performance test analysis/reporting
- Management that values performance testing

Keep control of the performance test environment: never let development teams run the performance test for you.

Page 16: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Successful Performance Test LifeCycle

[Flowchart: Successful Performance Test Lifecycle, driven by a load test tool (LoadRunner or homegrown). Performance test triggers/requirements feed a decision on whether a performance test is required; if not, the report simply states that no performance testing is required. If it is, the flow for an existing application/system is: model the existing production workload (from production analytics reports identifying the transactional throughput of business transactions); define the test success criteria and script the workflows; build accurate test scenarios (load, stress, and sociability) with test scripts and test data; compare the performance test environment to production and report any differences; produce the performance test plan; and execute the test scenarios. If the results are acceptable based on the triggers/requirements, sign off and issue a performance report with results and any exceptions; if not, identify the issue, make tweaks (software, configuration, or hardware), and re-execute.]

Page 17: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing CSFs:

Drivers and Triggers

- SLA change
- Hardware change (upgrade/downgrade, virtualization)
- Application software upgrade (new features/enhancements)
- Infrastructure software upgrade/patch (security, database, systems, etc.)
- Compliance patch (DOD)
- Java/.NET version upgrade
- Unexpected growth in the number of users
- Database retention policy change

Typically, the non-functional requirements (NFRs) should dictate the need for performance testing.

Page 18: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing CSFs:

Production Workload Model

What is the existing usage of the application/system?
- Transaction throughput (per hour, per day)
- Number of concurrent users for the average hour and the peak hour
- Most-used transactions
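One hedged way to build this workload model is to mine production access logs. The sketch below assumes a simple, hypothetical log format of "<ISO timestamp> <transaction name>" per line (not the actual Tufts Health Plan logs); it counts transactions per hour and reports the peak hour and the most-used transactions.

```python
import collections

hourly = collections.Counter()
by_transaction = collections.Counter()

with open("access.log") as log:  # hypothetical log file name
    for line in log:
        # Assumed line format: "2010-09-24T10:15:02 ClaimStatusInquiry"
        parts = line.split()
        if len(parts) < 2:
            continue
        timestamp, transaction = parts[0], parts[1]
        hourly[timestamp[:13]] += 1      # bucket by "YYYY-MM-DDTHH"
        by_transaction[transaction] += 1

peak_hour, peak_count = hourly.most_common(1)[0]
print(f"peak hour: {peak_hour} ({peak_count} transactions)")
print(f"average hourly volume: {sum(hourly.values()) / len(hourly):.0f}")
print("most used transactions:", by_transaction.most_common(5))
```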

Page 19: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing CSFs:

Well-Defined Success Criteria

How do we know if the test was a success?
- Documented SLAs (response time, CPU/memory usage thresholds)
- Meets customer goals
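Success criteria are easiest to enforce when documented in a machine-checkable form. A minimal sketch, with purely illustrative threshold values:

```python
# Hypothetical SLA thresholds documented for a test; the values are illustrative only.
SLA = {
    "avg_response_s": 2.0,    # average end-user response time
    "p95_response_s": 5.0,    # 95th percentile response time
    "max_cpu_pct": 75.0,      # server CPU utilization ceiling
    "max_error_rate": 0.01,   # fraction of failed transactions
}

def violations(measured: dict) -> list:
    """Return the SLA keys that the measured results exceed."""
    return [key for key, limit in SLA.items() if measured.get(key, 0.0) > limit]

# Example evaluation of one run's results (made-up numbers)
result = violations({"avg_response_s": 1.4, "p95_response_s": 6.2,
                     "max_cpu_pct": 68.0, "max_error_rate": 0.002})
print("PASS" if not result else f"FAIL: {result}")  # -> FAIL: ['p95_response_s']
```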

Page 20: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing CSFs:

Define Business Critical Workflows

- Identify the business-critical workflows of the application
- Use the 80/20 rule (Pareto principle): 20% of the transactions cause 80% of the defects in production. Performance testing is not typically a full regression test; 20% of the total test cases gives you 80% coverage (see the selection sketch below).
- Include resource-intensive transactions (CPU, database, memory, network)
- Include highly used transactions
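A simple way to apply the 80/20 rule to transaction selection is to sort transactions by production volume and keep the smallest set that covers roughly 80% of traffic. The sketch below uses the provider portal volumes shown later in this deck only as sample data:

```python
# Transaction -> yearly production volume, taken from the provider portal
# profile later in this deck (used here only as sample data).
volumes = {
    "ClaimStatusInquiry": 1_086_223,
    "ProviderLogin": 995_368,
    "EligibilityInquiryCommercial": 822_011,
    "ReferralInquiry": 159_000,
    "ReferralSubmission": 104_802,
}

total = sum(volumes.values())
selected, covered = [], 0
for name, count in sorted(volumes.items(), key=lambda kv: kv[1], reverse=True):
    if covered / total >= 0.80:
        break
    selected.append(name)
    covered += count

print(f"{len(selected)}/{len(volumes)} transactions cover {covered / total:.0%} of traffic")
print(selected)
```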

Page 21: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing CSFs:

Test Data Identification

- Performance testing is data-driven testing
- Choose your test data carefully, in consultation with production workload models or business analysts
- Represent boundary-value conditions (for example, large result sets)
- Represent the required security roles when creating test IDs
- Test with a production-sized database
- Test with the same data setup at least twice for consistency
- Test with a randomized data setup at least once
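A minimal sketch of data-driven test-data preparation, covering boundary rows and a repeatable randomized set; the field names and value ranges are invented for illustration, not actual member data:

```python
import csv
import random

random.seed(42)  # fixed seed gives a repeatable data set; drop it for the randomized run

# Hypothetical search parameters; boundary rows force the smallest and largest result sets.
boundary_rows = [
    {"member_id": "000000001", "last_name": "A",  "date_range_days": 1},     # smallest values
    {"member_id": "999999999", "last_name": "SM", "date_range_days": 3650},  # broad search, large result set
]
random_rows = [
    {"member_id": f"{random.randint(1, 999_999_999):09d}",
     "last_name": random.choice(["SMITH", "JONES", "LEE", "GARCIA"]),
     "date_range_days": random.choice([7, 30, 365])}
    for _ in range(100)
]

with open("test_data.csv", "w", newline="") as out:
    writer = csv.DictWriter(out, fieldnames=["member_id", "last_name", "date_range_days"])
    writer.writeheader()
    writer.writerows(boundary_rows + random_rows)
```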

Page 22: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing CSFs:

Test Environment Considerations*

- Develop and enforce a test readiness checklist
- Pristine performance test environment
- Monitoring tools set up (historical data is mandatory)
- Locked-down environment (including disabling virus scans)
- Production-sized in all respects, if possible
- Document and communicate any deviations from production to stakeholders
- If the environment is shared:
  - Disable builds and deployments during test times
  - Build and communicate the test schedule
  - Communicate, communicate, communicate
  - Shut down environments that are not needed
  - Monitor, monitor, monitor

Page 23: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing CSFs:

Team Support needed

Performance testing is a TEAM effort!
- Developers
- DBAs
- Network engineers
- System engineers
- Business (involve them to run UAT during performance testing execution)

Performance engineers typically do the first/second line of analysis.*
A root cause analysis tool may eliminate the need for a total team effort.

Page 24: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing CSFs:

Load Test Tool

Efficient performance testing needs an automation tool (industry standard or open source) for:
- Quick scripting, correlation, and replay of scripts (see the correlation sketch below)
- Building test models/scenarios
- Executing test scenarios
- Analysis
- Monitoring

Home-grown tools may suffice where the technology platform is less varied, or for proprietary applications.
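For readers unfamiliar with the term, "correlation" means capturing a dynamic value from one response (a session ID or hidden form token) and substituting it into subsequent requests so a recorded script replays correctly. A standard-library-only sketch, with placeholder URLs and an assumed cookie name:

```python
import urllib.request

# 1. Initial request: the server issues a session cookie (cookie name and URLs are assumptions).
with urllib.request.urlopen("http://localhost:8080/login") as login:
    cookie = login.headers.get("Set-Cookie", "")
session = cookie.split(";")[0]  # e.g. "JSESSIONID=abc123"

# 2. Correlated replay: the captured value is injected into the next scripted request.
req = urllib.request.Request("http://localhost:8080/claims/status",
                             headers={"Cookie": session})
with urllib.request.urlopen(req) as resp:
    print(resp.status, len(resp.read()), "bytes")
```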

Page 25: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing CSFs:

Load Generation Environment

Mimic production if possible:
- Firewalls
- Several network locations, or use a WAN emulator

Page 26: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing CSFs:

Performance Test tools

- HP Performance Center
- Compuware QALoad
- Micro Focus SilkPerformer
- IBM Rational PerformanceStudio
- JMeter, OpenSTA, etc.

Page 27: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Load/Performance Test Tool – Benefits

- Identify and resolve performance bottlenecks quickly
- Repeatable tests can be scripted and run quickly
- Real-world user scenarios can be modeled by the tools
- Helps improve the quality and stability of applications
- Provides server monitoring capability for non-production environments
- Provides correlated performance analysis reports with drill-down capability
- Integrates with existing production monitoring tools

Page 28: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing CSFs:

Performance Test Analysis/Reporting

The tool's analysis module provides:
- Real-time monitoring graphs
- Transaction response time reports
- User ramp-up graphs
- Transaction response summary graphs
- Drill-down for root cause analysis
- Correlated graphs and results

Page 29: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing – Analysis/Reporting

Production Profiles

Provider Portal – July 1, 2007 to June 30, 2008

Provider Services Transaction            Year-to-Date Count   Daily Average (251 working days)   Average Hour Count   Performance Test Hour Count
Claim Status Inquiry - Global                     1,086,223                           4,327.58                  541                         1,200
Eligibility Inquiry - Commercial                    822,011                           3,274.94                  409                         1,100
Eligibility Inquiry - Secure Horizons               192,677                             767.64                   96
Provider Logs In                                    995,368                           3,965.61                  496                           650
Referral Inquiry - Global                           159,000                             633.47                   79                           600
Referral Submission                                 104,802                             417.54                   52                           200

Public Portal – July 1, 2007 to June 30, 2008

Public Search Transaction                Year-to-Date Count   Daily Average (251 working days)   Average Hour Count   Performance Test Hour Count
Public SH PCP Location Search                        31,334                            124.837                   16                             x
Public SH PCP Name Search                            35,106                            139.865                   17                             x
Public SH PCP Proximity Search                       20,816                            82.9323                   10                             x
Public Search Location                              274,423                           1,093.32                  137                           700
Public Search Name                                  698,891                           2,784.43                  348                           350
Public Search Proximity                             237,700                            947.012                  118                           175
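The arithmetic behind these columns is straightforward: the daily average is the year-to-date count divided by 251 working days, and the average hour count is the daily average divided by an assumed 8-hour business day (which matches the figures above). For example:

```python
# Reproducing the provider portal columns for Claim Status Inquiry - Global:
ytd_count = 1_086_223             # year-to-date transaction count
daily_avg = ytd_count / 251       # 251 working days -> 4327.58
avg_hour = daily_avg / 8          # assumed 8-hour business day -> ~541
print(round(daily_avg, 2), round(avg_hour))  # 4327.58 541
```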

Page 30: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing – Analysis/Reporting

Sample Report 1

Page 31: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing – Analysis/Reporting

Sample Report 2

Page 32: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing – Analysis/Reporting

Performance Test Reports – Error Rate graph

Page 33: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing – Analysis/Reporting

Non-Compliant SLA Report (MP_Login)

Page 34: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing – Analysis/Reporting

SLA Report after enhancements

Page 35: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing – Analysis/Reporting

Login Test Results (22 and 44 Concurrent Users)


Page 36: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing – Analysis/Reporting

Test Results (66, 88 and 100 Concurrent Users)


Page 37: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing – Analysis/Reporting

Database Monitoring Report

Page 38: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Performance Testing – Synergies

Performance Testing and Application Performance Management (APM) go hand in hand

Performance Testing proactively identifies and resolves issues before production – metrics captured during performance testing can help build and monitor production systems more accurately

Performance testing scripts can be reused for synthetic transaction monitoring in production, for SLA enforcement (a sketch follows below)

Performance Testing Tools can be used for Root Cause Analysis and to replicate production issues
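A minimal sketch of such a synthetic transaction monitor, reusing the idea (not an actual production script): it polls a transaction endpoint on an interval and flags SLA breaches. The URL, interval, and threshold are placeholders.

```python
import time
import urllib.request

URL = "http://localhost:8080/provider/login"  # placeholder transaction endpoint
SLA_SECONDS = 3.0                             # illustrative response-time SLA
INTERVAL_SECONDS = 300                        # run the synthetic transaction every 5 minutes

while True:
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(URL, timeout=30) as resp:
            resp.read()
        elapsed = time.perf_counter() - start
        status = "OK" if elapsed <= SLA_SECONDS else "SLA BREACH"
        print(f"{time.strftime('%Y-%m-%d %H:%M:%S')} {status} {elapsed:.2f}s")
    except OSError as err:
        print(f"{time.strftime('%Y-%m-%d %H:%M:%S')} DOWN {err}")
    time.sleep(INTERVAL_SECONDS)
```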

Page 39: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Application Performance Testing/Monitoring – Magic Quadrant

Page 40: Mohit Verma,  Tufts Health Plan Lead  Performance Architect


Questions/Discussion

?