Fusion Systems Ltd. Company No: 5598231. VAT Number: 873512812
“Agility Server Scalability Testing”
Version Control

Editor      Description                                Date        Version
Mark Rains  First Draft                                5/10/18     V1.0
Mark Rains  Addition of 2nd data set + result graphs   17/10/18    V1.1
Mark Rains  Conclusion                                 18/10/18    V1.2
Mark Rains  Appendices                                 19/10/18    V1.3
Mark Rains  Alterations to Structure and Wording       07/01/2019  V2.6
Mark Rains  Removal of contact email addresses         08/01/2019  V2.9
Mark Rains  Alteration of document title               08/01/2019  V2.91
Resource

Editor       Company               Job Description
Mark Rains   Fusion Systems        Consultant / Lead Performance Tester / Automation Engineer
Mike Naylor  Agility Multichannel  DBA
Lee Tonks    Agility Multichannel  Lead Developer

Resource

Editor       Contact Details
Mark Rains   *******@*******.***
Mike Naylor  *******@*******.***
Lee Tonks    *******@*******.***
Table of Contents

Document Scope
1 Introduction
  1.1 Brief
2 Scope
3 Test Approach
4 Test environment setup
5 Test data used
6 Test Schedule
  6.1 Baseline API users against the server. Dataset 1
    6.1.1 Test 1
    6.1.2 Test 2
    6.1.3 Test 3
    6.1.4 Test 4
  6.2 Baseline API users against the server. Dataset 2
    6.2.1 Test 1
    6.2.2 Test 2
    6.2.3 Test 3
    6.2.4 Test 4
7 Results Summary
  7.1 Dataset 1 End User Summary
  7.2 Dataset 2 End User Summary
8 Conclusion
9 Summary
10 Results - Detailed
  10.1 Dataset 1 – 10 Users
  10.2 Dataset 1 – 30 Users
  10.3 Dataset 1 – 50 Users
  10.4 Dataset 1 – 100 Users
  10.5 Dataset 2 – 10 Users
  10.6 Dataset 2 – 30 Users
  10.7 Dataset 2 – 50 Users
  10.8 Dataset 2 – 100 Users
11 Appendix A
  11.1 Service Monitor Descriptions
Document Scope
This document provides information on scalability testing of the Product Information Management (PIM) software developed by Agility Multichannel. It is written for readers with an intermediate level of IT knowledge who understand the concepts of environments and performance statistics.
The following information is included within this document:
- Introduction – why and what we tested.
- Test Approach – how we performed the tests.
- Test Schedule – timelines of the tests run.
- Results – analysis of the data captured from each test run within the test schedule.
- Conclusion – what we can conclude from the test results.
1 Introduction
1.1 Brief
Agility Multichannel Ltd (a division of Magnitude Software Inc.) is a software company that develops the ‘Agility’ product information management (PIM) system. Agility Multichannel approached Fusion Systems to benchmark a subset of the product's functionality in order to measure its scalability under simulated loads; the methodology is described in section 3. Fusion Systems was engaged as a consultant to provide a third-party independent review of the testing procedure, to execute the tests, and to provide an analysis of the results. Fusion Systems is a technology company providing IT support and consultancy services, and has been involved in full-cycle application testing across a spectrum of business sectors.
2 Scope
The scope of this exercise was limited to scalability testing and did not include:
• Functional Testing
• Non-Functional Testing
• Performance Testing
• Tuning
• Penetration Testing
The scalability testing focused on the Agility Server and its ADAPI API, because the main end user and integration processes of Agility all use the server API in order to function. The tests were limited to a single configuration of the environment, the software and the data, for the following reasons:
1) The Agility application can be configured and used in many different ways for each customer.
2) The IT infrastructure for each customer differs depending on requirements, which introduces too many variables to test against each one.
3) The 3rd party deployment platforms were restricted to the most commonly used option, i.e. the Wildfly application server (AS) and SQL Server database (DB). Agility can be deployed in other permutations, but these were not tested:
   Weblogic AS + SQL Server DB
   Weblogic AS + Oracle DB
   Websphere AS + SQL Server DB
   Websphere AS + Oracle DB
In addition, Agility can be deployed against different subversions of these components, as well as a mixture of operating systems and environments (servers and clients), which again was impractical to test against. Customer deployments will also have different network configurations, including load balancing, server locations, bandwidth and latency. Each customer will likewise have data models, data volumes, integrations and usage patterns which are unique.
With the above in mind, we used a generic set of tests which are NOT representative of any specific customer environment or load, but which still show the scalability of the Agility Server against the data defined in section 5. Because each client has a different profile, it would be impossible to replicate business-as-usual loads; doing so is client-dependent and would require further resources, including a business analyst.
3 Test Approach
One load harness was used during the tests. The harness performed multithreaded, simultaneous API calls via the application server to the database server.
Note: this harness was only designed to test the scalability of the API, not of the web client. As such, performance of the web client is out of the scope of these tests.
The load harness was run to show Agility API performance against the Agility Server. Four runs were used at the following loads:
10 simultaneous users, each performing 10 jobs.
30 simultaneous users, each performing 10 jobs.
50 simultaneous users, each performing 10 jobs.
100 simultaneous users, each performing 10 jobs.
The above tests were executed on each dataset supplied by Agility Multichannel, as defined in the ‘Scope’. Timing points were taken from the load harness. Performance statistics were gathered on the server using Performance Monitor for the following metrics:
CPU
Memory
Bandwidth
Disk usage stats
SQL stats
The load harness reported transaction times and response times for the API calls made. As mentioned in the note, the Agility ‘AMI’ web client is out of scope of this test process, so the extra processing that would occur once an API response is received was not accounted for. However, this processing is specific to each end user's PC and web browser rather than shared processing on the server, so it should not be relevant to scalability testing.
Before each test the environment was rolled back to its original settings and database content, so that each run started against a clean environment and individual results were not skewed.
The load harness was designed to introduce random wait times. These wait times mimic end user delays when completing transactions; without them, the scenarios would be less representative of a production environment. The random wait times were not included in the request/response times.
When each test started, it created the required number of users. These users then simultaneously performed their actions until the final count of actions was complete. Some actions (such as search) are single step and require only one request/response from the server. Others, such as ‘Create and Link’, involve several request/response combinations: an object is created (one request/response), several attributes are created and linked (several requests/responses), and then the object is deleted (one request/response).
In the opinion of Agility Multichannel, one test session in the test harness puts the server under a heavier load than that of a single real-world user, because the test session performs actions more frequently. However, it is hard to put a specific value on the correlation between the two; e.g. one test session may be equivalent to the load from two real-world users.
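The harness behaviour described above can be sketched in Python as follows. This is an illustrative sketch only, not the actual harness used in the tests: the task names and the `run_task` stub are assumptions, and a real implementation would issue ADAPI API calls instead of sleeping.

```python
import random
import threading
import time

# Hypothetical task names standing in for the real predefined task set.
TASKS = [
    "create_attrs_unlink", "create_attrs_delete",
    "create_unlink", "create_delete", "global_search",
]

results = []                     # (task, elapsed_seconds) tuples
results_lock = threading.Lock()  # results list is shared across user threads

def run_task(task):
    """Stand-in for a real ADAPI API call; here it just sleeps briefly."""
    time.sleep(0.01)

def user_session(jobs=10, min_wait=3.0, max_wait=10.0):
    """One simulated user: run `jobs` randomly chosen tasks.

    The random wait between tasks mimics end-user think time and is
    deliberately excluded from the recorded request/response time.
    """
    for _ in range(jobs):
        task = random.choice(TASKS)
        start = time.perf_counter()
        run_task(task)                   # only the API work is timed
        elapsed = time.perf_counter() - start
        with results_lock:
            results.append((task, elapsed))
        time.sleep(random.uniform(min_wait, max_wait))  # untimed think time

def run_load(users=10, jobs=10):
    """Spawn `users` concurrent sessions and wait for them to finish."""
    threads = [threading.Thread(target=user_session, args=(jobs,))
               for _ in range(users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```

A run such as `run_load(users=100, jobs=10)` corresponds to the heaviest load level above; the recorded `results` would then be aggregated into the minimum/average/maximum tables shown later.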
4 Test environment setup
The following information details the environment setup:

Hypervisor Host
Dell PowerEdge R420
Intel Xeon E5-2430 @ 2.2GHz (24 cores)
64GB memory
200GB (C:\), 4.7TB (D:\)
Drive configuration – 7 drives on a single RAID5 array.

Application Server Virtual Machine
2 processor cores
16GB memory
100GB HDD (C:\)

Database Server Virtual Machine
5 processor cores
32GB memory
100GB HDD (C:\), 150GB HDD (D:\), 50GB HDD (E:\), 500GB (F:\)

Both the application and database servers were run within a virtual environment using Microsoft Hyper-V. The virtual machines were configured with specific resource allocations for CPU and memory; these allocations were representative of production instances. The network interface between the application and database servers was a virtual switch, and the external API calls were made through a dedicated interface.
Both systems ran Windows Server 2016 64-bit, including all patching up to 4th Oct 2018.
The application server was Wildfly 10.1.0. The database server was Microsoft SQL Server 2017.
The network path between the environments was a gigabit virtual switch. Each environment was hosted on a dedicated network interface connecting directly to the switch.
5 Test data used
The datasets were supplied by Agility Multichannel.
Data Set 1 consisted of 252,042 records (119 SKU records) with an average of 1187 attributes (8 per SKU record).
Data Set 2 consisted of 3,310,499 records (1,255,071 SKU records) with an average of 21 attributes (40 per SKU record).
No other datasets were supplied to be tested against.
6 Test Schedule
The following information details the test schedule completed during the testing phase. All tests incorporate a random wait time of between 3 and 10 seconds between tasks.
6.1 Baseline API users against the server. Dataset 1
6.1.1 Test 1
10 users run concurrently on a clean environment. The load consisted of each connection running a sequence of 10 tasks randomly selected from a predefined set consisting of:
- Create object and attributes then unlink
- Create object and attributes then delete
- Create object with no attributes then unlink
- Create object with no attributes then delete
- Global search
6.1.2 Test 2
30 users run concurrently on a clean environment. The load consisted of each connection running a sequence of 10 tasks randomly selected from a predefined set consisting of:
- Create object and attributes then unlink
- Create object and attributes then delete
- Create object with no attributes then unlink
- Create object with no attributes then delete
- Global search
6.1.3 Test 3
50 users run concurrently on a clean environment. The load consisted of each connection running a sequence of 10 tasks randomly selected from a predefined set consisting of:
- Create object and attributes then unlink
- Create object and attributes then delete
- Create object with no attributes then unlink
- Create object with no attributes then delete
- Global search
6.1.4 Test 4
100 users run concurrently on a clean environment. The load consisted of each connection running a sequence of 10 tasks randomly selected from a predefined set consisting of:
- Create object and attributes then unlink
- Create object and attributes then delete
- Create object with no attributes then unlink
- Create object with no attributes then delete
- Global search
6.2 Baseline API users against the server. Dataset 2
6.2.1 Test 1
10 users run concurrently on a clean environment. The load consisted of each connection running a sequence of 10 tasks randomly selected from a predefined set consisting of:
- Create object and attributes then unlink
- Create object and attributes then delete
- Create object with no attributes then unlink
- Create object with no attributes then delete
- Global search
6.2.2 Test 2
30 users run concurrently on a clean environment. The load consisted of each connection running a sequence of 10 tasks randomly selected from a predefined set consisting of:
- Create object and attributes then unlink
- Create object and attributes then delete
- Create object with no attributes then unlink
- Create object with no attributes then delete
- Global search
6.2.3 Test 3
50 users run concurrently on a clean environment. The load consisted of each connection running a sequence of 10 tasks randomly selected from a predefined set consisting of:
- Create object and attributes then unlink
- Create object and attributes then delete
- Create object with no attributes then unlink
- Create object with no attributes then delete
- Global search
6.2.4 Test 4
100 users run concurrently on a clean environment. The load consisted of each connection running a sequence of 10 tasks randomly selected from a predefined set consisting of:
- Create object and attributes then unlink
- Create object and attributes then delete
- Create object with no attributes then unlink
- Create object with no attributes then delete
- Global search
7 Results Summary
The results showed the following end user times for each action, using the two sets of data.
7.1 Dataset 1 End User Summary
Average response time in seconds per request:

          10 users  30 users  50 users  100 users
CREATE    0.2414    0.3812    0.4118    0.7169
UNLINK    0.0596    0.0761    0.0755    0.0817
DELETE    0.1668    0.2182    0.2219    0.3201
ADD_ATTR  0.1624    0.1724    0.1814    0.2010
SEARCH    0.0640    0.1123    0.0923    0.1662

(Chart: seconds to run request, Dataset 1)

7.2 Dataset 2 End User Summary
Average response time in seconds per request:

          10 users  30 users  50 users  100 users
CREATE    0.0972    0.2137    0.2339    0.2046
UNLINK    0.0283    0.0296    0.0369    0.0314
DELETE    0.0729    0.0766    0.0788    0.0848
ADD_ATTR  0.1030    0.1106    0.1161    0.1178
SEARCH    0.1953    0.4794    0.5164    0.5656

(Chart: seconds to run request, Dataset 2)
8 Conclusion
All results and statistics have been stored and can be accessed on request to validate the findings. Only a subset of the reports used is shown within this document, to reduce its volume; the full set has been stored and can be provided on request.
It was observed that performance reduced as more load was introduced to the system. This is normal and what we would expect. Looking at the average timings, this change was quite small. Looking at the minimum and maximum timings, the change was more noticeable, indicating an increasing number of ‘spikes’ dependent on the load being put through the test harness.
While the automated tests were running, some single-user manual system actions were performed. The qualitative results of this were consistent with the quantitative results: the user experience was in general only minimally affected, but there were ‘lag spikes’ and short (less than a second) periods of unresponsiveness, and the frequency of these did correlate with the load being put into the API via the test harness.
The specified hardware was more than capable of running the tests, and at no point came under sustained heavy load. RAM usage was constant throughout, with both the Java Runtime Environment and SQL Server claiming their allocated memory and then not needing to exceed it. We observed a peak in CPU usage as each test ramped up, while all test harness ‘users’ were spawned and started working. This then reduced over time, as the random actions and wait times staggered the requests. This is reflected in the raw data: requests at the start of a test took longer than those towards its end. The SQL Server database performed well, consistently serving ~95% of its requests from its cache, as can be seen in the ‘Cache Hit Ratio’ statistics.
The performance statistics also show the impact of the dataset and data model on performance. For 4 of the 5 actions performed, the average times against Dataset 2 were roughly half of those against Dataset 1. The ‘Search’ request does not offer a direct comparison between the two datasets, as the sizes of the returned results are significantly different, and therefore both the processing and response times naturally differ. The test environment remained healthy when running up to 100 users concurrently on the two datasets.
9 Summary
This document covers scalability testing only, on a subset of the environment, focused on the server API. To obtain a more detailed picture, further testing would be required from multiple client machines to more accurately mirror a production environment, including current datasets and forecast generated datasets. From the testing we performed, we can state that on our test environment the Agility Server shows little performance degradation based on the number of concurrent users or the database size. As mentioned previously, the search test is not a comparable benchmark between the datasets, due to the different sizes of the search results returned by each dataset.
10 Results - Detailed
Please see Appendix A for graph descriptions. All statistics for each description are available on request for each test.
10.1 Dataset 1 – 10 Users
The end user timings were as follows (minimum, average and maximum times in seconds):
10 Users
Description Min Average Max Count
CREATE 0.1360 0.2414 1.0690 54
UNLINK 0.0560 0.0596 0.0660 19
DELETE 0.0590 0.1668 0.8300 33
ADD_ATTR 0.1300 0.1624 0.8580 380
SEARCH 0.0320 0.0640 0.1570 26
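The Min/Average/Max/Count figures in these tables are simple aggregations over the per-request timings recorded by the load harness. As an illustrative sketch (the field layout is an assumption, not the harness's actual output format), they could be computed like this:

```python
from collections import defaultdict

def summarise(timings):
    """Aggregate (action, seconds) pairs into per-action statistics."""
    grouped = defaultdict(list)
    for action, seconds in timings:
        grouped[action].append(seconds)
    return {
        action: {
            "min": min(vals),
            "average": sum(vals) / len(vals),
            "max": max(vals),
            "count": len(vals),
        }
        for action, vals in grouped.items()
    }

# Hypothetical sample of recorded timings:
sample = [("SEARCH", 0.032), ("SEARCH", 0.157), ("CREATE", 0.136)]
stats = summarise(sample)
```

Each row of the detailed tables below corresponds to one entry of such a summary for one test run.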
The environment statistics:
Application Server
Bandwidth:
CPU:
Memory:
Disk Time:
Database Server
Bandwidth:
CPU:
Memory:
Disk Time:
SQL Cache hit Ratio:
10.2 Dataset 1 – 30 Users
The end user timings were as follows (minimum, average and maximum times in seconds):
30 Users
Description Min Average Max Count
CREATE 0.1340 0.3812 1.8100 160
UNLINK 0.0550 0.0761 0.6400 90
DELETE 0.0570 0.2182 1.7280 70
ADD_ATTR 0.1280 0.1724 1.5610 1184
SEARCH 0.0320 0.1123 0.5650 44
The environment statistics:
Application Server
Bandwidth:
CPU:
Memory:
Disk Time:
Database Server
Bandwidth:
CPU:
Memory:
Disk Time:
SQL Cache hit Ratio:
10.3 Dataset 1 – 50 Users
The end user timings were as follows (minimum, average and maximum times in seconds):
50 Users
Description Min Average Max Count
CREATE 0.1340 0.4118 3.5470 275
UNLINK 0.0550 0.0755 1.0240 142
DELETE 0.0550 0.2219 1.1260 132
ADD_ATTR 0.1270 0.1814 2.7090 2064
SEARCH 0.0320 0.0923 0.6390 98
The environment statistics:
Application Server
Bandwidth:
Memory:
Disk Time:
Database Server
Bandwidth:
CPU:
Memory:
Disk Time:
SQL Cache hit Ratio:
10.4 Dataset 1 – 100 Users
The end user timings were as follows (minimum, average and maximum times in seconds):
100 Users
Description Min Average Max Count
CREATE 0.1330 0.7169 5.7400 539
UNLINK 0.0550 0.0817 0.6760 270
DELETE 0.0560 0.3201 2.2540 267
ADD_ATTR 0.1280 0.2010 0.9360 4295
SEARCH 0.0320 0.1662 1.8310 154
The environment statistics:
Application Server
Bandwidth:
CPU:
Memory:
Disk Time:
Database Server
Bandwidth:
CPU:
Memory:
Disk Time:
SQL Cache hit Ratio:
10.5 Dataset 2 – 10 Users
The end user timings were as follows (minimum, average and maximum times in seconds):
10 Users
Description Min Average Max Count
CREATE 0.0700 0.0972 0.3190 76
UNLINK 0.0240 0.0283 0.0540 41
DELETE 0.0420 0.0729 0.1030 34
ADD_ATTR 0.0840 0.1030 0.7420 767
SEARCH 0.0220 0.1953 1.9210 19
The environment statistics:
Application Server
Bandwidth:
CPU:
Memory:
Disk Time:
Database Server
Bandwidth:
CPU:
Memory:
Disk Time:
SQL Cache hit Ratio:
10.6 Dataset 2 – 30 Users
The end user timings were as follows (minimum, average and maximum times in seconds):
30 Users
Description Min Average Max Count
CREATE 0.0710 0.2137 1.3560 239
UNLINK 0.0250 0.0296 0.0770 134
DELETE 0.0410 0.0766 0.2610 105
ADD_ATTR 0.0840 0.1106 0.5130 1920
SEARCH 0.0240 0.4794 2.2250 61
The environment statistics:
Application Server
Bandwidth:
CPU:
Memory:
Disk Time:
Database Server
Bandwidth:
CPU:
Memory:
Disk Time:
SQL Cache hit Ratio:
10.7 Dataset 2 – 50 Users
The end user timings were as follows (minimum, average and maximum times in seconds):
50 Users
Description Min Average Max Count
CREATE 0.0680 0.2339 3.2530 406
UNLINK 0.0230 0.0369 1.0220 202
DELETE 0.0400 0.0788 0.3360 204
ADD_ATTR 0.0840 0.1161 1.2990 3152
SEARCH 0.0220 0.5164 4.6880 94
The environment statistics:
Application Server
Bandwidth:
CPU:
Memory:
Disk Time:
Database Server
Bandwidth:
CPU:
Memory:
Disk Time:
SQL Cache hit Ratio:
10.8 Dataset 2 – 100 Users
The end user timings were as follows (minimum, average and maximum times in seconds):
100 Users
Description Min Average Max Count
CREATE 0.0670 0.2046 1.6430 787
UNLINK 0.0230 0.0314 0.3180 387
DELETE 0.0400 0.0848 0.6090 397
ADD_ATTR 0.0820 0.1178 3.0940 6113
SEARCH 0.0220 0.5656 3.2470 210
The environment statistics:
Application Server
Bandwidth:
CPU:
Memory:
Disk Time:
Database Server
Bandwidth:
CPU:
Memory:
Disk Time:
SQL Cache hit Ratio:
11 Appendix A
11.1 Service Monitor Descriptions
Object: System
Counter: % Total Processor Time
Instance: Not applicable
Comment: Less than 80% means the level of processor performance is acceptable. Constant measurements above 95% mean there is cause for concern.

Object: Physical disk
Counter: % Disk Time
Instance: Each disk
Comment: Less than 80% means the level of physical disk performance is acceptable.

Object: Memory
Counter: Committed Bytes
Instance: Not applicable
Comment: If this value is smaller than the available amount of RAM, there is enough memory to support the running processes without excessive paging. If this value is consistently larger than the available RAM, the computer is experiencing an unacceptable level of paging and more physical RAM must be added.

Object: Memory
Counter: Page Reads/sec
Instance: Not applicable
Comment: Constant measurements greater than five indicate a requirement for more memory.

Object: SQL Server
Counter: Cache Hit Ratio
Instance: Not applicable
Comment: 98% or greater is good, because SQL Server queries are not delayed by paging off disk.
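The thresholds above lend themselves to a simple automated health check over sampled counter values. The following is a hypothetical Python sketch: the counter names and thresholds come from the table above, while the sampling itself (how the values are collected from Performance Monitor) is left out as an assumption.

```python
def check_counters(samples):
    """Evaluate sampled counter values against the Appendix A thresholds.

    samples: dict mapping counter name -> list of observed values.
    Returns a list of warning strings (empty means all thresholds are met).
    """
    warnings = []
    cpu = samples.get("% Total Processor Time", [])
    if cpu and all(v > 95 for v in cpu):
        warnings.append("CPU constantly above 95%: cause for concern")
    disk = samples.get("% Disk Time", [])
    if disk and all(v >= 80 for v in disk):
        warnings.append("Disk time at or above 80%: disk performance suspect")
    pages = samples.get("Page Reads/sec", [])
    if pages and all(v > 5 for v in pages):
        warnings.append("Page Reads/sec constantly above 5: more memory needed")
    cache = samples.get("Cache Hit Ratio", [])
    if cache and min(cache) < 98:
        warnings.append("SQL cache hit ratio below 98%: queries paging off disk")
    return warnings
```

Note that the thresholds marked "constant" in the table are checked with `all(...)`, so a single transient spike does not raise a warning, matching the table's intent.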