Decoupled System Interface Testing at FedEx
DESCRIPTION
If you work in a large-scale environment, you know how difficult it is to have all the systems “code complete” and ready for testing at the same time. In order to fully test end-to-end scenarios, you must be able to validate results in numerous systems. But what if all those systems are not available for you to begin testing? Chris Reites describes “decoupled testing,” an enterprise-level solution for managing interface data for capture, injection, simulation, and comparison all along your testing paths. Decoupled testing provides the ability to validate and independently test systems without having to rely on end-to-end testing. This is accomplished by capturing intermediate interface transactions at pre-determined, critical points during processing and comparing them against previously captured or generated expected results. Chris shares a case study on how this approach has benefited FedEx on critical customer-facing systems.
TRANSCRIPT
T20 Concurrent Class
10/3/2013 3:00:00 PM
"Decoupled System Interface
Testing at FedEx"
Presented by:
Chris Reites
FedEx
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 ∙ 904-278-0524 ∙ [email protected] ∙ www.sqe.com
Chris Reites
FedEx Services
As a Technical Principal in software quality assurance for FedEx Services, Chris Reites has
experience providing cutting-edge, best-practice processes for the development and testing of
large-scale, complex, global software systems. Chris has been working for FedEx in IT for
fifteen years. Prior to joining the software testing organization within FedEx, Chris was a
software developer for several key applications within the FedEx billing system.
9/19/2013
1
Decoupled Testing Overview
Chris Reites Technical Principal – FedEx Services
STARWEST Conference 10/03/2013
FedEx Testing – What We’re Dealing With
2
• Application Testing and Certification
• Responsible for Test Planning, Design, Execution, and Validation
• Key Shipping Products (Desktop devices, www.fedex.com, etc…)
• Backend Rating, Revenue, Tracking, and Invoicing Systems
• Major releases annually, as well as weekly exception and emergency loads
• Hundreds of applications involved
Types of Testing We Do
3
Quality Assurance Approaches

Functional Testing
Definition: Evaluates the compliance of a system or component with specified functional requirements. Includes New Features, Regression, Integration, and System tests. Coordinates with Revenue Testing for impacted products.

Performance Testing
Definition: Evaluates a system’s or component’s compliance with specified performance requirements. Includes Volume testing, Load/Stress testing, Failover/Recovery testing, and Disaster Recovery.

Vulnerability Testing
Definition: Utilizes security scanning tools and educated vulnerability testing through manual human-intervention techniques, providing scanning and penetration testing for supported applications during regular releases.

Regional Testing
Definition: Includes functional testing performed on behalf of the Regions by SQA Testing groups and Language Translation Testing performed by Marketing or User groups within the Regions.

Revenue Testing
Definition: Functional tests designed to validate specific outcomes within Revenue systems. Starts with entering shipments (via INET, WSVC, CAFE, FXRS, etc.) and addition of selected scans, flows through servers and intermediate systems, and concludes with validation of results on invoices and in accounts receivable.

Certification Services

Production Checkout
Definition: Validates core software functionality in production after regularly scheduled software loads and regularly scheduled data updates. Supports Corporate Loads, Dotcom Loads, Exception Loads, and some Emergency loads.

CSP Certification
Definition: Consulting with third-party software providers to integrate FedEx technology into their applications and validate that their applications meet FedEx requirements from a brand, revenue, and operational perspective.

Label Certification (ensures operational excellence)
Definition: Validation of Express labels submitted by automation clients, CSP providers, WebServices customers, and FXRS customers who modify labels in production. Automation clients submit all labels for certification.
Potential Hazards in Large System Testing
4

End-to-End Testing
• Dependency to have all code ready at the same time
• Interfaces were critical… yet not well documented or understood
• Interface changes coming late into Code/Test phases
• Interface issues caused a “Ripple Effect” throughout the system

Impacts our Speed to Market and causes a lack of flexibility in testing
Decoupled Testing Concepts and Potential
5
Concept
• Provides the ability to test target systems independently by removing dependencies on other external systems
• Divide and conquer
• Reduce defect fix/validation cycle time
• Mitigate risk when introducing software changes – comprehensive regression test
• Reduce validation dependency on End-to-End cycles

Adoption in Automation Systems
• Reduced dependency on revenue test cycles
• Mini Revenue cycles within automation systems using the Decoupled Test Tool
• Pilot mini Decoupled Testing cycle in one shipping device – FY12 Q1
• Mini Decoupled Testing cycles in multiple shipping devices for the Jan12 corporate load

[Diagram: Corporate Load End-to-End Testing, and the same flow enhanced by Decoupled Testing — Automation (Devices → Front End Process → Back End Process) feeding Revenue (Edit & Rating → Invoicing → Settlement)]
What Makes Up Decoupled Testing?
6
Decoupled Testing provides the ability to test target systems independently by removing dependencies on other external systems.
4 Core Functions:
• Interface Data Capture: supports the collection and storage of interface data
• Interface Data Compare: supports on-demand, field-level comparisons of interface data
• Interface Data Injection: supports the ability to ‘replay’ previously captured interface data into the target system
• Interface Simulation: supports the virtualization of responses from backend system interfaces that are synchronous in nature
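As a rough illustration, the Capture and Compare functions can be sketched in a few lines. Everything here is hypothetical — the store layout, the interface-point name, and the shipment fields are stand-ins for illustration, not the actual DTT implementation:

```python
# Minimal sketch of interface data capture and field-level comparison.
# All names (interface point, fields) are illustrative, not FedEx tooling.

def capture(store, point, txn_id, fields):
    """Record an interface transaction captured at a named interface point."""
    store.setdefault(point, {})[txn_id] = dict(fields)

def compare(store, point, txn_id, actual, ignore=()):
    """Field-level compare of an actual transaction against the captured
    (expected) one; returns a list of (field, expected, actual) mismatches."""
    expected = store.get(point, {}).get(txn_id, {})
    mismatches = []
    for field in sorted(set(expected) | set(actual)):
        if field in ignore:
            continue  # e.g., timestamps that legitimately differ per run
        if expected.get(field) != actual.get(field):
            mismatches.append((field, expected.get(field), actual.get(field)))
    return mismatches

# Capture a rating transaction during a known-good baseline run...
store = {}
capture(store, "edit_and_rating.out", "SHIP-001",
        {"service": "PRIORITY", "weight_lbs": 2.5, "net_charge": 31.20})

# ...then compare a later run's output against the captured baseline.
diffs = compare(store, "edit_and_rating.out", "SHIP-001",
                {"service": "PRIORITY", "weight_lbs": 2.5, "net_charge": 29.95})
print(diffs)  # → [('net_charge', 31.2, 29.95)]
```

The `ignore` parameter hints at a practical necessity: fields such as timestamps or run IDs must be excluded so that only meaningful deviations surface.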
End-to-End Corporate Load Testing (Coupled Testing)
7

[Diagram: shipment inputs enter via Shipment Sys, eCommerce, Corp App, Regions Apps, and FCIS Sys; flow through Express, Ground, and Freight Rev Rating into the Revenue Back End and Accounting; outputs are invoices, GL entries, A/R, and Order Cash; a Testing Researcher compares actual outputs against expected results]
Decoupled Testing
9

[Diagram: the end-to-end flow evolves into independently testable clusters — eCommerce Front End, eCommerce Back End, Express Rev Rating, Freight Rev Rating, Revenue Back End, Accounting, and others — while the Testing Researcher still compares inputs, outputs, and expected results]
Decoupled Testing
10

[Diagram: inputs flow through the eCommerce Front End, Express Rev Rating, and Revenue Back End; at interface points A, B, and C, “actual before” and “actual after” transactions are captured, a simulated interface (S) stands in for a live backend, and the Testing Researcher compares outputs against expected results]
Revenue Systems Example: From Serial to Parallel Processing
12

Common Test Data Design
• New test data created for both system entry points and downstream injection points simultaneously

Automation (Shipping Input)
• Regression test data injected with previously captured results
• Compare outputs against previous results

Revenue (Revenue Input)
• Inject data directly into the backend system
• Compare outputs against previous results

Positive Effects:
• Reduce idle time (waiting for successful end-to-end execution)
• Increase test coverage utilizing data comparison analysis
• Early detection of issues
• Quicker validation of fixes
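The inject-and-compare loop sketched above can be made concrete. In this hedged sketch, `rate_shipment`, its rate constants, and the captured baseline are hypothetical stand-ins for a real backend entry point and its previously captured transactions:

```python
# Sketch: replaying previously captured interface data directly into a
# backend system, then comparing outputs against the captured baseline.
# The backend function and its rate constants are illustrative stand-ins.

def rate_shipment(txn):
    """Hypothetical backend entry point (think: an Edit & Rating step)."""
    base = {"PRIORITY": 25.00, "GROUND": 9.00}[txn["service"]]
    return round(base + 2.48 * txn["weight_lbs"], 2)

# Previously captured inputs and expected outputs from a baseline run.
captured = [
    ({"service": "PRIORITY", "weight_lbs": 2.5}, 31.20),
    ({"service": "GROUND", "weight_lbs": 10.0}, 33.80),
]

def replay(captured, backend):
    """Inject each captured input and flag any deviation from baseline."""
    failures = []
    for txn, expected in captured:
        actual = backend(txn)
        if actual != expected:
            failures.append((txn, expected, actual))
    return failures

print(replay(captured, rate_shipment))  # → [] when behavior is unchanged
```

The point of the pattern is that the regression run needs no upstream shipping devices at all: the captured data is the entry point, so backend validation can start the moment the backend builds.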
Interface Simulation
13
• Interface Simulation is the ability to virtualize responses from backend systems.
• Simulated interfaces remove backend complexity from testing environments and provide stable, predictable behavior to make system testing easier and more available.
• Response times of virtual interfaces can be varied to simulate latency or systems under load.
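A minimal sketch of such a simulated interface follows — canned responses plus a configurable delay. The class, response map, and tracking fields are hypothetical illustrations, not FedEx's actual virtualization tooling:

```python
# Sketch: a simulated (virtualized) backend interface that returns canned
# responses with configurable latency. All names are illustrative only.
import time

class SimulatedInterface:
    def __init__(self, responses, latency_s=0.0):
        self.responses = responses  # canned request-key -> response map
        self.latency_s = latency_s  # simulated backend/network delay

    def call(self, request_key):
        time.sleep(self.latency_s)  # stand-in for a slow or loaded backend
        return self.responses.get(request_key, {"status": "UNKNOWN"})

# Stable, predictable behavior regardless of the real backend's state:
sim = SimulatedInterface(
    {"track:123": {"status": "IN_TRANSIT", "city": "Memphis"}},
    latency_s=0.05,  # dial this up to exercise client timeout handling
)
start = time.monotonic()
resp = sim.call("track:123")
elapsed = time.monotonic() - start
print(resp["status"], elapsed >= 0.05)  # → IN_TRANSIT True
```

Because the simulator always answers, and answers the same way, tests against it are repeatable; raising `latency_s` lets the same harness probe timeout and retry behavior without touching a real backend.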
Test Strategy Objectives
14

Reduce Defect Resolution Time
• From weeks to days or hours

Reduce Validation Time
• By providing data analysis solutions
• By automating validation processes

Improve Code Quality Coming into Integration Testing
• By discovering and resolving defects earlier
• By more quickly fixing and revalidating defects

Increase Quality Assurance Capacity
• By providing environments on demand
• By providing a self-service set of test solutions
Common Goals and Success Metrics
15

Agility and Time-to-Market
The measure of time required to launch a project/feature through newly enabled testing processes.

Defect Removal Cost Reduction
The measure of identifying and fixing defects earlier in the testing process.

Support Resource Cost Reduction / Reallocation
The measure of resource reduction due to functionality being delivered.

Defect Removal – Reduction in Cost
16

The measure of identifying defects earlier in the testing process.
[Charts: number of defects found and closed per test cycle, Normal Dev/Test Phase vs. DTT Dev/Test Phase]
The metrics reveal a definite shift in the number of defects found and closed in earlier cycles of testing.
9/19/2013
9
Decoupled Testing – Speed to Market
17

Testing Process for Certification of a Key Customer Shipping Platform

Duration of testing relative to the traditional method*:
• Traditional: 100% • Decoupled Testing v1: 67% • Decoupled Testing v2: 38%
* Duration includes shipping/execution and validation

Test cases shipped and validated relative to the traditional method:
• Traditional: 8.5% validated / 100.0% baseline
• Decoupled v1: 100.0% validated / 100.0% baseline
• Decoupled v2: 147.0% validated / 147.0% baseline
18
Decoupled Testing – Resource Cost Reduction
Testing Process for Certification of a Key Customer Shipping Platform

Test cases tested per day relative to the traditional method:
• Traditional: 100% • Decoupled v1: 1750% • Decoupled v2: 3268%

• For each test case: field-level validation on two transactions
• Around 200 field comparisons per test case

Duration of testing relative to the traditional method*:
• Traditional: 100.00% • Decoupled Testing v1: 67.00% • Decoupled Testing v2: 38.00%
Case Study: FedEx Delivery Manager
19
FedEx Delivery Manager – Decoupled Testing
20
Business Challenge
Parallel development and simultaneous delivery of multiple FedEx applications impacted by the project prevented integration testing prior to the formal Integration Testing phase.

Goals
Provide development teams access to key backend systems during Unit Testing in order to identify defects early, and provide the ability to inject transactions into various parts of the system to break the dependency of needing all systems ready at the same time.
Results
21

Utilizing DTT, FedEx Delivery Manager discovered 75% of defects early and validated 68% of test cases for the Back-End Systems.

The Delivery Manager Portal started seeing success. Utilizing injection for Back-End testing, FedEx identified:
• 11% of total defects during shakeout
• 73% of total revenue defects
• 82% of total credit card defects
• 63% of total Shipment Event Processing defects
Why All Companies Should Be Thinking About Decoupled Testing
22
• As applications and systems become larger and more complex, traditional “end-to-end” testing becomes unscalable.
• With the increase in third-party service providers and service-oriented architectures, the ability to decouple those dependencies for testing is critical.
• The longer you wait to begin decoupling the testing of your systems, the harder it is to do.
• There is probably a LOT of low-hanging fruit available to start with.
9/19/2013
12
Where to Begin
23
• Involvement and buy-in of internal business counterparts is critical
• Identify testing dependencies that cause the most issues (unavailability, late delivery, high associated costs, etc…)
• Find the quick wins to build confidence and momentum
• Partner with application developers and find ways to share the decoupled testing tools
What’s Next?
• User-driven, integrated data management system
• Minimal development on interface rollout
• High-level reliability & instrumentation
  • Failure tolerant (re-connect)
  • Standard FedEx logging (monitor)
  • Unattended operation
• Scalable without incurring major licensing cost
• Large capacity
• Service-Oriented Architecture for integration with other systems
• Rules-driven business logic
• Synergistic Test Management System
24
25
QUESTIONS?