
  • FIREWALL COMPARATIVE ANALYSIS

    Performance

2013 - Thomas Skybakmoen, Francisco Artes, Bob Walder, Ryan Liles

    Tested Products Barracuda F800, Check Point 12600, Cyberoam CR2500iNG, Dell SonicWALL NSA 4500, Fortinet FortiGate-800c, Juniper SRX550, NETASQ NG1000-A, NETGEAR ProSecure UTM9S, Palo Alto Networks PA-5020, Sophos UTM 425, Stonesoft FW-1301, WatchGuard XTM 1050

  • NSS Labs Firewall Comparative Analysis - Performance

    2013 NSS Labs, Inc. All rights reserved. 2

    Overview Implementation of a firewall can be a complex process with multiple factors affecting the overall performance of the solution.

    Each of these factors should be considered over the course of the useful life of the solution, including:

1. Deployment use cases:
       a. Will the firewall be deployed to protect servers, desktop clients, or both?

    2. What does the traffic look like?
       a. Concurrency and connection rates.
       b. Connections per second and capacity with different traffic profiles.
       c. Latency and application response times.

There is usually a trade-off between security effectiveness and performance; a product's security effectiveness should therefore be evaluated within the context of its performance (and vice versa). This ensures that new security protections do not adversely impact performance and that security shortcuts are not taken to maintain or improve performance.

Sizing considerations are absolutely critical, since vendor performance claims can vary significantly from actual throughput with protection turned on. In Figure 1, farther to the right indicates higher rated throughput, and higher up indicates more connections per second. Products with a low connections-to-throughput ratio run the risk of running out of connections before they reach their maximum potential throughput.

Figure 1: Throughput and Connection Rates (Maximum TCP Connections per Second vs. NSS Rated Throughput in Mbps, per device)


Table of Contents

    Overview ... 2
    Analysis ... 4
        UDP Throughput & Latency ... 5
        Connection Dynamics - Concurrency and Connection Rates ... 7
        HTTP Connections per Second and Capacity ... 9
        HTTP Connections per Second and Capacity (Throughput) ... 9
        Application Average Response Time - HTTP (at 90% Max Capacity) ... 12
        Real-World Traffic Mixes ... 13
    Test Methodology ... 14
    Contact Information ... 14

    Table of Figures

    Figure 1: Throughput and Connection Rates ... 2
    Figure 2: Vendor Claimed vs NSS Rated Throughput in Mbps ... 4
    Figure 3: UDP Throughput by Packet Size (I) ... 5
    Figure 4: UDP Throughput by Packet Size (II) ... 6
    Figure 5: UDP Latency by Packet Size (Microseconds) ... 6
    Figure 6: Concurrency and Connection Rates (I) ... 7
    Figure 7: Concurrency and Connection Rates (II) ... 8
    Figure 8: Maximum Throughput Per Device With 44Kbyte Response ... 9
    Figure 9: Maximum Throughput Per Device With 21Kbyte Response ... 10
    Figure 10: Maximum Throughput Per Device With 10Kbyte Response ... 10
    Figure 11: Maximum Throughput Per Device With 4.5Kbyte Response ... 11
    Figure 12: Maximum Throughput Per Device With 1.7Kbyte Response ... 11
    Figure 13: Maximum Connection Rates Per Device With Various Response Sizes ... 12
    Figure 14: Application Latency (Microseconds) Per Device With Various Response Sizes ... 12
    Figure 15: Real-World Performance by Device ... 13


    Analysis NSS Labs research indicates that the majority of enterprises will deploy traditional firewalls in front of their datacenters and at the core of the network to separate unrelated traffic. Because of this, NSS rates product performance based upon the average of three traffic types: 21KB HTTP response traffic, a mix of perimeter traffic common in enterprises, and a mix of internal core traffic common in enterprises. Details of these traffic mixes are available in the Firewall Test Methodology (www.nsslabs.com).
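The rating described above is a simple average of the three traffic-type measurements. A minimal sketch (the helper name is ours; the example figures are the Barracuda F800 results reported later in this document, assuming the figure ordering shown there):

```python
def nss_rated_throughput(http_21kb_mbps, perimeter_mix_mbps, core_mix_mbps):
    """Average of the three traffic types NSS uses for its throughput rating."""
    return round((http_21kb_mbps + perimeter_mix_mbps + core_mix_mbps) / 3)

# Barracuda F800: 8,780 Mbps (21KB HTTP), 10,000 (perimeter mix), 4,700 (core mix)
print(nss_rated_throughput(8780, 10000, 4700))  # 7827, matching Figure 2
```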

    Every effort is made to deploy policies that ensure the optimal combination of security effectiveness and performance, as would be the aim of a typical customer deploying the device in a live network environment. This provides readers with the most useful information on key firewall security effectiveness and performance capabilities based upon their expected usage.

    Figure 2: Vendor Claimed vs. NSS Rated Throughput in Mbps

The results presented in the chart above show the difference between the NSS performance rating and vendor performance claims, which are often based on ideal or unrealistic conditions. Where vendors quote multiple figures, NSS selects those that relate to TCP performance with protection enabled, rather than the more optimistic UDP-only or large-packet-size figures often quoted.

    Even so, NSS rated throughput is typically lower than that claimed by the vendor, often significantly so, since it is more representative of how devices will perform in real-world deployments.
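As a rough way to quantify that gap, one can compute the ratio of claimed to rated throughput (the helper is ours; the figures are the Fortinet FortiGate-800c entries from Figure 2):

```python
def claim_to_rating_ratio(vendor_claim_mbps, nss_rated_mbps):
    """How many times higher the datasheet figure is than the NSS rating."""
    return vendor_claim_mbps / nss_rated_mbps

# Fortinet FortiGate-800c: 20,000 Mbps claimed vs. 9,667 Mbps NSS rated
print(round(claim_to_rating_ratio(20000, 9667), 2))  # ~2.07x
```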

Product NSS Throughput Rating (Mbps) Vendor Stated Throughput (Mbps)

    Barracuda F800 7,827 9,200

    Check Point 12600 8,400 10,000

    Cyberoam CR2500iNG 8,733 28,000

    Dell SonicWALL NSA 4500 850 990

    Fortinet FortiGate-800c 9,667 20,000

    Juniper SRX550 2,127 5,500

    NETASQ NG1000-A 2,540 7,000

    NETGEAR ProSecure UTM9S 231 850

    Palo Alto Networks PA-5020 4,120 5,000

    Sophos UTM 425 3,000 6,000

    Stonesoft FW-1301 5,147 5,000

    WatchGuard XTM 1050 2,200 10,000


    UDP Throughput & Latency The aim of this test is to determine the raw packet processing capability of each in-line port pair of the device only. This traffic does not attempt to simulate any form of real-world network condition. No TCP sessions are created during this test, and there is very little for the detection engine to do in the way of protocol analysis. However, this test is relevant since vendors are forced to perform inspection on UDP packets as a result of VoIP, video, and other streaming applications.

    Figure 3: UDP Throughput by Packet Size (I)

Fortinet's FortiGate-800c was the only device to demonstrate anything close to line-rate capacity with packet sizes from 1514 bytes all the way down to 64 bytes. In addition, it was the only device to consistently demonstrate latency of less than 10 microseconds. The Juniper SRX550 demonstrated the second-best latency, at 12-16 microseconds.
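For context, the line-rate ceiling being compared against can be computed from standard Ethernet framing (the helper is ours; the 20-byte per-frame overhead is the standard 8-byte preamble plus 12-byte inter-frame gap):

```python
def line_rate_frames_per_sec(link_gbps, frame_bytes):
    """Theoretical maximum Ethernet frame rate for a given link speed."""
    return link_gbps * 1e9 / ((frame_bytes + 20) * 8)

# At 10 Gbps with 64-byte frames: roughly 14.88 million frames per second
print(round(line_rate_frames_per_sec(10, 64)))  # 14880952
```

This is why small-packet throughput is the hardest test: the device must make a forwarding decision almost 15 million times per second to keep a 10 Gbps link full of 64-byte packets.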

(Chart: UDP throughput in Megabits per Second vs. UDP packet size, 64 to 1514 bytes, per device; the underlying values appear in Figure 4.)


Product 64 Byte Packets 128 Byte Packets 256 Byte Packets 512 Byte Packets 1024 Byte Packets 1514 Byte Packets (all values in Mbps)

    Barracuda F800 780 2,000 3,700 6,500 12,600 18,500

    Check Point 12600 1,900 3,400 6,400 11,700 12,100 12,200

    Cyberoam CR2500iNG 3,300 9,300 9,300 11,230 16,200 19,300

    Dell SonicWALL NSA 4500 120 400 400 890 1,700 2,600

    Fortinet FortiGate-800c 18,900 19,500 19,600 19,700 19,800 19,900

    Juniper SRX550 400 1,450 1,450 2,800 5,500 8,000

    NETASQ NG1000-A 350 600 1,200 2,300 4,800 7,300

    NETGEAR ProSecure UTM9S 2 12 12 20 64 102

    Palo Alto Networks PA-5020 5,092 9,400 9,400 9,597 9,785 9,935

    Sophos UTM 425 640 2,450 2,450 4,600 6,000 6,000

    Stonesoft FW-1301 2,600 4,400 8,300 14,900 19,100 19,900

    WatchGuard XTM 1050 346 925 925 2,000 4,100 6,700

    Figure 4: UDP Throughput by Packet Size (II)

    In-line security devices that introduce high levels of latency are unacceptable, especially where multiple security devices are placed in the data path. The chart below reflects the latency (in microseconds) as recorded during the UDP throughput tests at 90% of maximum load. Lower values are preferred.

Product 64 Byte 128 Byte 256 Byte 512 Byte 1024 Byte 1514 Byte Packets - Latency (µs)

    Barracuda F800 273 163 108 104 103 109

    Check Point 12600 75 124 99 82 102 109

    Cyberoam CR2500iNG 1,185 845 452 385 302 270

    Dell SonicWALL NSA 4500 30 31 32 33 37 42

    Fortinet FortiGate-800c 5 6 6 7 8 9

    Juniper SRX550 12 12 12 13 14 16

    NETASQ NG1000-A 36 36 43 46 47 36

    NETGEAR ProSecure UTM9S 232 237 243 255 337 603

    Palo Alto Networks PA-5020 15 19 22 26 33 38

    Sophos UTM 425 59 61 60 63 92 169

    Stonesoft FW-1301 50 84 51 54 82 81

    WatchGuard XTM 1050 136 156 182 269 460 705

    Figure 5: UDP Latency by Packet Size (Microseconds)


    Connection Dynamics Concurrency and Connection Rates These tests stress the detection engine to determine how the sensor copes with increasing rates of TCP connections per second, application layer transactions per second, and concurrent open connections. All packets contain valid payload and address data and these tests provide an excellent representation of a live network at various connection/transaction rates.

Note that in all tests, the following critical breaking points (where the final measurements are taken) are used:

    Excessive concurrent TCP connections - latency within the firewall is causing unacceptable increase in open connections on the server-side.

    Excessive response time for HTTP transactions/SMTP sessions - latency within the firewall is causing excessive delays and increased response time to the client.

Unsuccessful HTTP transactions/SMTP sessions - normally, there should be zero unsuccessful transactions. Once these appear, it is an indication that excessive latency within the firewall is causing connections to time out.

    The following are the key connection dynamics results from the performance tests.

Product Max. Concurrent TCP Connections Max. Concurrent TCP Connections w/Data TCP Connections Per Second HTTP Connections Per Second HTTP Transactions Per Second

    Barracuda F800 1,000,000 1,000,000 134,200 117,000 356,000

    Check Point 12600 1,500,000 1,500,000 52,000 113,000 391,700

    Cyberoam CR2500iNG 3,100,000 2,999,000 215,000 179,000 360,000

    Dell SonicWALL NSA 4500 400,000 400,000 15,600 10,300 25,000

    Fortinet FortiGate-800c 6,000,000 4,400,000 180,000 180,000 397,000

    Juniper SRX550 533,000 542,000 18,500 18,000 80,000

NETASQ NG1000-A 1,280,000 1,200,000 57,000 43,500 102,300

    NETGEAR ProSecure UTM9S 16,700 15,000 760 480 6,400

    Palo Alto Networks PA-5020 1,000,000 1,000,000 36,000 36,000 348,000

    Sophos UTM 425 480,000 2,000,000 12,000 31,800 270,000

    Stonesoft FW-1301 6,000,000 2,900,000 57,000 51,000 328,600

    WatchGuard XTM 1050 2,600,000 2,600,000 19,000 18,000 136,000

    Figure 6: Concurrency and Connection Rates (I)


Beyond overall throughput of the device, connection dynamics can play an important role in sizing a security device that will not unduly impede the performance of a system or an application. Maximum connection and transaction rates help size a device more accurately than simply focusing on throughput. By knowing the maximum connections per second, it is possible to predict maximum throughput based upon the traffic mix in a given enterprise environment. For example: if the device's maximum HTTP CPS is 2,000, and the average traffic size is 44KB such that 2,500 CPS = 1 Gbps, then the tested device will achieve a maximum of (2,000 / 2,500) x 1,000 Mbps = 800 Mbps.
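The sizing arithmetic in that example can be written as a small reusable helper (the function name is ours):

```python
def predicted_max_mbps(device_max_cps, cps_per_gbps):
    """Scale the 1 Gbps reference load by the device's connections-per-second ceiling."""
    return (device_max_cps / cps_per_gbps) * 1000.0

# 2,000 CPS ceiling, where 2,500 CPS of 44KB traffic corresponds to 1 Gbps
print(predicted_max_mbps(2000, 2500))  # 800.0 Mbps
```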

    Maximum concurrent TCP connections and maximum TCP connections per second rates are also useful metrics when attempting to size a device accurately. Products with low connection/throughput ratio run the risk of exhausting connections before they reach their maximum potential throughput. By knowing the maximum connections per second, it is possible to predict when a device will fail in a given enterprise environment.
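One way to estimate that failure point, not from this report but a standard queueing argument (Little's law: open connections = connection rate x average session duration, assuming steady state; the helper and example figures are ours):

```python
def exhaustion_cps(max_concurrent_connections, avg_session_seconds):
    """Connection rate at which the device's session table fills up."""
    return max_concurrent_connections / avg_session_seconds

# A device with a 400,000-entry session table and 30-second average sessions
# saturates near 13,333 new connections per second, regardless of raw throughput.
print(round(exhaustion_cps(400_000, 30)))  # 13333
```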

    Figure 7: Concurrency and Connection Rates (II)

    Higher up indicates increased connections per second capacity. Farther to the right indicates increased concurrent / simultaneous connections. Products with low concurrent connection / connection per second ratio run the risk of exhausting connections (sessions) before they reach their maximum potential connection rate.

(Chart: Maximum TCP Connections per Second vs. Maximum Concurrent / Simultaneous TCP Connections, per device; values as in Figure 6.)


    HTTP Connections per Second and Capacity In-line firewall devices exhibit an inverse correlation between security effectiveness and performance. The more deep-packet inspection is performed, the fewer packets can be forwarded. Furthermore, it is important to consider a real-world mix of traffic that a device will encounter.

    NSS tests aim to stress the HTTP detection engine in order to determine how the sensor copes with detecting and blocking under network loads of varying average packet size and varying connections per second. By creating genuine session-based traffic with varying session lengths, the sensor is forced to track valid TCP sessions, thus ensuring a higher workload than for simple packet-based background traffic.

    Each transaction consists of a single HTTP GET request and there are no transaction delays (i.e. the web server responds immediately to all requests). All packets contain valid payload (a mix of binary and ASCII objects) and address data. This test provides an excellent representation of a live network (albeit one biased towards HTTP traffic) at various network loads.

    HTTP Connections per Second and Capacity (Throughput)

As previously stated, NSS research has found that there is usually a trade-off between security effectiveness and performance. Because of this, it is important to judge a product's security effectiveness within the context of its performance (and vice versa). This ensures that new security protections do not adversely impact performance and that security shortcuts are not taken to maintain or improve performance. The following charts compare maximum connection rate (HTTP CPS), maximum rated throughput (Mbps), and average application latency (average HTTP response time in milliseconds) across a range of HTTP response sizes.

    Figure 8: Maximum Throughput Per Device With 44Kbyte Response

44Kbyte Response (Mbps): Barracuda F800 10,000; Check Point 12600 10,000; Cyberoam CR2500iNG 10,000; Dell SonicWALL NSA 4500 960; Fortinet FortiGate-800c 10,000; Juniper SRX550 2,400; NETASQ NG1000-A 3,080; NETGEAR ProSecure UTM9S 209; Palo Alto Networks PA-5020 4,120; Sophos UTM 425 3,000; Stonesoft FW-1301 3,320; WatchGuard XTM 1050 2,520


    Figure 9: Maximum Throughput Per Device With 21Kbyte Response

    Figure 10: Maximum Throughput Per Device With 10Kbyte Response

21Kbyte Response (Mbps): Barracuda F800 8,780; Check Point 12600 10,000; Cyberoam CR2500iNG 10,000; Dell SonicWALL NSA 4500 820; Fortinet FortiGate-800c 10,000; Juniper SRX550 1,880; NETASQ NG1000-A 2,720; NETGEAR ProSecure UTM9S 112; Palo Alto Networks PA-5020 4,160; Sophos UTM 425 3,000; Stonesoft FW-1301 2,940; WatchGuard XTM 1050 2,200

    10Kbyte Response (Mbps): Barracuda F800 6,490; Check Point 12600 7,460; Cyberoam CR2500iNG 8,491; Dell SonicWALL NSA 4500 520; Fortinet FortiGate-800c 7,700; Juniper SRX550 1,260; NETASQ NG1000-A 2,010; NETGEAR ProSecure UTM9S 58; Palo Alto Networks PA-5020 3,280; Sophos UTM 425 3,000; Stonesoft FW-1301 2,910; WatchGuard XTM 1050 1,610


    Figure 11: Maximum Throughput Per Device With 4.5Kbyte Response

    Figure 12: Maximum Throughput Per Device With 1.7Kbyte Response

4.5Kbyte Response (Mbps): Barracuda F800 4,390; Check Point 12600 4,660; Cyberoam CR2500iNG 5,650; Dell SonicWALL NSA 4500 280; Fortinet FortiGate-800c 6,850; Juniper SRX550 770; NETASQ NG1000-A 1,360; NETGEAR ProSecure UTM9S 32; Palo Alto Networks PA-5020 1,680; Sophos UTM 425 2,600; Stonesoft FW-1301 2,200; WatchGuard XTM 1050 965

    1.7Kbyte Response (Mbps): Barracuda F800 2,603; Check Point 12600 2,710; Cyberoam CR2500iNG 3,675; Dell SonicWALL NSA 4500 240; Fortinet FortiGate-800c 3,925; Juniper SRX550 440; NETASQ NG1000-A 920; NETGEAR ProSecure UTM9S 17; Palo Alto Networks PA-5020 1,150; Sophos UTM 425 1,400; Stonesoft FW-1301 1,050; WatchGuard XTM 1050 525


    The following table shows the number of HTTP connections per second required to achieve the rated throughput.

Product 44Kbyte Response 21Kbyte Response 10Kbyte Response 4.5Kbyte Response 1.7Kbyte Response (HTTP connections per second)

    Barracuda F800 25,000 43,900 64,900 87,800 104,100

    Check Point 12600 25,000 50,000 74,600 93,200 108,400

    Cyberoam CR2500iNG 25,000 50,000 84,908 113,000 147,000

    Dell SonicWALL NSA 4500 2,400 4,100 5,200 5,600 9,600

    Fortinet FortiGate-800c 25,000 50,000 77,000 137,000 157,000

    Juniper SRX550 6,000 9,400 12,600 15,400 17,600

NETASQ NG1000-A 7,700 13,600 20,100 27,200 36,800

    NETGEAR ProSecure UTM9S 522 562 582 642 661

    Palo Alto Networks PA-5020 10,300 20,800 32,800 33,600 46,000

    Sophos UTM 425 7,500 15,000 30,000 52,000 56,000

    Stonesoft FW-1301 8,300 14,700 29,100 44,000 42,000

    WatchGuard XTM 1050 6,300 11,000 16,100 19,300 21,000

    Figure 13: Maximum Connection Rates Per Device With Various Response Sizes
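The relationship between connection rate and throughput in this table can be approximated from payload size alone (the helper is ours; it ignores HTTP, TCP, and IP header overhead, so it slightly understates wire throughput):

```python
def payload_mbps(connections_per_sec, response_kbytes):
    """Payload-only throughput implied by a given HTTP connection rate."""
    return connections_per_sec * response_kbytes * 1024 * 8 / 1e6

# Barracuda F800 at 25,000 CPS with 44KB responses (Figure 13): ~9,011 Mbps of
# payload, consistent with its 10,000 Mbps result once headers are included.
print(round(payload_mbps(25000, 44)))  # 9011
```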

    Application Average Response Time - HTTP (at 90% Max Capacity)

    The following table details the average application response time (application latency) with various traffic sizes at 90% Max Capacity (Throughput). The lower the number the better (improved application response time).

The Juniper SRX550 demonstrated the lowest (best) application response times, while the NETGEAR ProSecure UTM9S introduced the most application latency.

Product 44Kbyte Latency (ms) 21Kbyte Latency (ms) 10Kbyte Latency (ms) 4.5Kbyte Latency (ms) 1.7Kbyte Latency (ms)

    Barracuda F800 1.7 1.1 0.9 0.3 0.3

    Check Point 12600 1.81 1.08 1.1 0.35 0.28

    Cyberoam CR2500iNG 1.61 0.83 0.7 0.22 0.24

    Dell SonicWALL NSA 4500 1.99 1.02 0.04 0.04 0.07

    Fortinet FortiGate-800c 0.94 0.78 0.53 0.4 0.34

    Juniper SRX550 0.14 0.04 0.04 0.01 0.03

NETASQ NG1000-A 1.42 1.25 0.53 0.12 0.16

    NETGEAR ProSecure UTM9S 2.55 1.39 2.43 0.93 0.65

    Palo Alto Networks PA-5020 1.01 0.63 0.27 0.1 0.08

    Sophos UTM 425 1.98 0.99 0.75 0.06 0.03

    Stonesoft FW-1301 1.1 0.9 0.4 0.2 0.08

    WatchGuard XTM 1050 5.1 3.01 2.14 0.85 0.77

Figure 14: Application Latency (Milliseconds) Per Device With Various Response Sizes


Real-World Traffic Mixes The aim of these tests is to measure the performance of the device under test (DUT) in a real-world environment by introducing additional protocols and real content, while still maintaining a precisely repeatable and consistent background traffic load. In order to simulate real use cases, different protocol mixes are utilized to model placement of the DUT at various locations on a corporate network. For details about real-world traffic protocol types and percentages, see the NSS Labs Firewall Test Methodology, available at www.nsslabs.com.

    Figure 15: Real-World Performance by Device

Most products perform better in the real-world protocol mix (perimeter), a protocol mix typically seen at an enterprise perimeter. However, the Fortinet FortiGate-800c was the only device to scale equally well in the real-world protocol mix (core), a protocol mix typical of that seen in a large datacenter or at the core of an enterprise network.

Product Real World Protocol Mix (Perimeter) Real World Protocol Mix (Core) (Mbps)

    Barracuda F800 10,000 4,700

    Check Point 12600 10,000 5,200

    Cyberoam CR2500iNG 10,000 6,200

    Dell SonicWALL NSA 4500 950 780

    Fortinet FortiGate-800c 10,000 9,000

    Juniper SRX550 3,000 1,500

    NETASQ NG1000-A 3,700 1,200

    NETGEAR ProSecure UTM9S 460 121

    Palo Alto Networks PA-5020 4,500 3,700

    Sophos UTM 425 3,000 3,000

    Stonesoft FW-1301 8,700 3,800

    WatchGuard XTM 1050 2,600 1,800


© 2013 NSS Labs, Inc. All rights reserved. No part of this publication may be reproduced, photocopied, stored on a retrieval system, or transmitted without the express written consent of the authors.

    Please note that access to or use of this report is conditioned on the following:

    1. The information in this report is subject to change by NSS Labs without notice.

2. The information in this report is believed by NSS Labs to be accurate and reliable at the time of publication, but is not guaranteed. All use of and reliance on this report are at the reader's sole risk. NSS Labs is not liable or responsible for any damages, losses, or expenses arising from any error or omission in this report.

    3. NO WARRANTIES, EXPRESS OR IMPLIED ARE GIVEN BY NSS LABS. ALL IMPLIED WARRANTIES, INCLUDING IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT ARE DISCLAIMED AND EXCLUDED BY NSS LABS. IN NO EVENT SHALL NSS LABS BE LIABLE FOR ANY CONSEQUENTIAL, INCIDENTAL OR INDIRECT DAMAGES, OR FOR ANY LOSS OF PROFIT, REVENUE, DATA, COMPUTER PROGRAMS, OR OTHER ASSETS, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.

4. This report does not constitute an endorsement, recommendation, or guarantee of any of the products (hardware or software) tested or the hardware and software used in testing the products. The testing does not guarantee that there are no errors or defects in the products or that the products will meet the reader's expectations, requirements, needs, or specifications, or that they will operate without interruption.

    5. This report does not imply any endorsement, sponsorship, affiliation, or verification by or with any organizations mentioned in this report.

    6. All trademarks, service marks, and trade names used in this report are the trademarks, service marks, and trade names of their respective owners.

    Test Methodology Methodology Version: Firewall v4

    A copy of the test methodology is available on the NSS Labs website at www.nsslabs.com

    Contact Information NSS Labs, Inc. 206 Wild Basin Rd, Suite 200A Austin, TX 78746 USA +1 (512) 961-5300 [email protected] www.nsslabs.com v2013.02.07

    This and other related documents available at: www.nsslabs.com. To receive a licensed copy or report misuse, please contact NSS Labs at +1 (512) 961-5300 or [email protected].