
DATA CENTER IPS COMPARATIVE ANALYSIS

Performance 2014

Authors: Jason Pappalexis, Thomas Skybakmoen

Tested Products: Fortinet FortiGate 5140B, Juniper SRX 5800, McAfee NS-9300, Sourcefire 8290-2

Overview

Implementation of intrusion prevention systems (IPS) can be a complex process, with multiple factors affecting the overall performance of the solution. Each of these factors should be considered over the course of the useful life of the solution, including:

1. Where will it be deployed?
2. What is the predominant traffic mix?
3. What security policy is applied?

There is usually a trade-off between security effectiveness and performance; a product's security effectiveness should be evaluated within the context of its performance (and vice versa). This ensures that new security protections do not adversely impact performance and that security shortcuts are not taken to maintain or improve performance. Sizing considerations are absolutely critical, since vendor performance claims can vary significantly from actual throughput with protection enabled. NSS Labs rates throughput based on the average of the following test results: the "Real-World" Protocol Mixes for the data center (financial, virtualization hub, mobile users and applications, web-based applications and services, and Internet service provider mix) and the 21 KB HTTP response-based capacity test.

Figure 1 - Throughput and Connection Rates (chart: NSS-Tested Throughput (Mbps) on the x-axis, Maximum TCP Connections per Second on the y-axis, one point per tested product)

Farther to the right indicates higher tested throughput. Higher up indicates higher maximum connections per second. Products with a low connections/throughput ratio run the risk of exhausting connection tables before they reach their maximum potential throughput. Furthermore, if bypass mode is enabled, the IPS engine could be allowing uninspected traffic to enter the network once system resources are exhausted, and administrators would never be informed of threats within subsequent sessions.

Product                     Maximum TCP Connections/s   Maximum HTTP Connections/s
Fortinet FortiGate 5140B    1,195,000                   980,000
Juniper SRX 5800            107,400                     113,500
McAfee NS-9300              1,045,300                   681,000
Sourcefire 8290-2           2,086,208                   801,984

Figure 2 - Connection Dynamics

Performance is not just about raw throughput. Connection dynamics are also important and will often provide an indication of the effectiveness of the inspection engine. If devices with high throughput capabilities cannot set up and tear down TCP or application-layer connections quickly enough, their maximum throughput figures can rarely be realized in a real-world deployment.

Table of Contents

Analysis
  Connection Dynamics - Concurrency and Connection Rates
  HTTP Connections per Second and Capacity
  HTTP Connections per Second and Capacity (Throughput)
  Application Average Response Time - HTTP (at 90% Maximum Capacity)
  Real-World Traffic Mixes
  UDP Throughput & Latency
Test Methodology
Contact Information

Table of Figures

Figure 1 - Throughput and Connection Rates
Figure 2 - Connection Dynamics
Figure 3 - Vendor-Claimed vs. NSS-Tested Throughput (Mbps)
Figure 4 - Concurrency and Connection Rates (I)
Figure 5 - Concurrency and Connection Rates (II)
Figure 6 - Maximum Throughput per Device with 44 KB Response
Figure 7 - Maximum Throughput per Device with 21 KB Response
Figure 8 - Maximum Throughput per Device with 10 KB Response
Figure 9 - Maximum Throughput per Device with 4.5 KB Response
Figure 10 - Maximum Throughput per Device with 1.7 KB Response
Figure 11 - Maximum Connection Rates per Device with Various Response Sizes
Figure 12 - Application Latency (milliseconds) per Device with Various Response Sizes
Figure 13 - Real-World Protocol Mix (Data Center - Financial)
Figure 14 - Real-World Protocol Mix (Data Center - Virtualization Hub)
Figure 15 - Real-World Protocol Mix (Data Center - Mobile Users and Applications)
Figure 16 - Real-World Protocol Mix (Data Center - Web-Based Applications and Services)
Figure 17 - Real-World Protocol Mix (Data Center - Internet Service Provider Mix)
Figure 18 - UDP Throughput by Packet Size (Mbps)
Figure 19 - UDP Throughput by Packet Size (Mbps)
Figure 20 - UDP Latency by Packet Size (microseconds)

Analysis

NSS research indicates that all enterprises tune their IPS devices when deployed in the data center. Therefore, during NSS testing of IPS products, the devices are deployed with tuned policies (see "Data Center IPS Comparative Analysis - Security"). Every effort is made to deploy policies that ensure the optimal combination of security effectiveness and performance, as would be the aim of a typical customer deploying the device in a live network environment. This provides readers with the most useful information on key IPS security effectiveness and performance capabilities based on their expected usage.

Product                     NSS-Tested Throughput (Mbps)   Vendor-Claimed Throughput (Mbps)
Fortinet FortiGate 5140B    59,340                         120,000
Juniper SRX 5800            31,625                         40,000
McAfee NS-9300              47,533                         40,000
Sourcefire 8290-2           136,033                        80,000

Figure 3 - Vendor-Claimed vs. NSS-Tested Throughput (Mbps)

Figure 3 depicts the difference between the NSS performance rating and the vendor performance claims, which are often based on ideal or unrealistic conditions. Where vendors quote multiple figures in marketing materials, NSS selects those that relate to TCP or to operation with protection enabled, rather than the more optimistic UDP-only or large-packet-size figures often quoted. Even so, NSS-tested throughput typically is lower than that claimed by the vendor, often significantly so, since it is more representative of how devices will perform in real-world deployments. Only McAfee and Sourcefire tested higher than their vendor-claimed performance.
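As stated in the Overview, the NSS throughput rating is the average of the five data center protocol mix results and the 21 KB HTTP capacity test. The short sketch below reproduces the NSS-Tested column of Figure 3 from figures reported later in this document (Figures 7 and 13 through 17); it assumes a simple unweighted mean, which matches the published numbers.

```python
# Reproduce the "NSS-Tested Throughput" column of Figure 3. The rating
# is the average of the five "Real-World" data center protocol mixes
# (Figures 13-17) and the 21 KB HTTP capacity test (Figure 7).
# All values are in Mbps, transcribed from this report.
results = {
    # product: (financial, virtualization, mobile, web apps, ISP, 21 KB HTTP)
    "Fortinet FortiGate 5140B": (75_000, 66_250, 47_426, 28_363, 59_000, 80_000),
    "Juniper SRX 5800": (20_500, 40_000, 30_500, 39_770, 40_000, 18_980),
    "McAfee NS-9300": (58_500, 58_000, 43_700, 45_000, 20_000, 60_000),
    "Sourcefire 8290-2": (160_000, 160_000, 105_000, 118_000, 139_000, 134_200),
}

for product, mbps in results.items():
    print(f"{product}: {round(sum(mbps) / len(mbps)):,} Mbps")
# Prints 59,340 / 31,625 / 47,533 / 136,033 Mbps, matching Figure 3.
```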

Connection Dynamics - Concurrency and Connection Rates

These tests stress the detection engine to determine how the sensor copes with increasing rates of TCP connections per second, application-layer transactions per second, and concurrent open connections. All packets contain valid payload and address data, and these tests provide an excellent representation of a live network at various connection/transaction rates.

Note that in all tests, the following critical breaking points, where the final measurements are taken, are used:

- Excessive concurrent TCP connections: latency within the IPS is causing an unacceptable increase in open connections on the server side.
- Excessive response time for HTTP transactions: latency within the IPS is causing excessive delays and increased response time to the client.
- Unsuccessful HTTP transactions: normally, there should be zero unsuccessful transactions. Once these appear, it is an indication that excessive latency within the IPS is causing connections to time out.

The following are the key connection dynamics results from the performance tests.

Product                     Theoretical Max         Concurrent TCP     Max TCP      Max HTTP     HTTP
                            Concurrent TCP Conns    Conns w/ Data      Conns/s      Conns/s      Transactions/s
Fortinet FortiGate 5140B    120,000,000             120,000,000        1,195,000    980,000      2,777,000
Juniper SRX 5800            60,000,000              60,000,000         107,400      113,500      988,000
McAfee NS-9300              44,000,000              43,276,208         1,045,300    681,000      806,340
Sourcefire 8290-2           240,000,000             238,620,912        2,086,208    801,984      2,359,665

Figure 4 - Concurrency and Connection Rates (I)

Beyond the overall throughput of the device, connection dynamics can play an important role in sizing a security device that will not unduly impede the performance of a system or an application. Maximum connection and transaction rates help size a device more accurately than simply examining throughput. With knowledge of the maximum connections per second, it is possible to predict maximum throughput based on the traffic mix in a given enterprise environment. For example, if the device's maximum HTTP connections per second (CPS) is 2,000 and the average transaction size is 44 KB, such that 2,500 CPS = 1 Gbps, then the tested device will achieve a maximum of 800 Mbps (i.e., (2,000 / 2,500) x 1,000 Mbps = 800 Mbps).

Maximum concurrent TCP connections and maximum TCP connections per second are also useful metrics when attempting to size a device accurately. Products with a low connection/throughput ratio run the risk of exhausting their connection tables before they reach their maximum potential throughput. By determining the maximum connections per second, it is possible to predict when a device will fail in a given enterprise environment.
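The CPS-to-throughput rule of thumb in the example above is simple enough to script. Below is a minimal sketch of that arithmetic; the function name is ours, and the 2,500 CPS-per-Gbps baseline is taken from the report's own example (any other baseline would be substituted to match a given environment's average transaction size).

```python
def predicted_max_mbps(device_max_cps: float, cps_per_gbps: float) -> float:
    """Predict maximum throughput from a device's HTTP connection rate.

    cps_per_gbps is the connection rate equivalent to 1 Gbps for the
    environment's average transaction size (2,500 CPS per Gbps for the
    44 KB transactions in the example above).
    """
    return device_max_cps / cps_per_gbps * 1_000  # Mbps

# The worked example from the text: a 2,000 CPS device against a
# 2,500 CPS-per-Gbps baseline yields 800 Mbps.
print(predicted_max_mbps(2_000, 2_500))  # -> 800.0
```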

Figure 5 - Concurrency and Connection Rates (II) (chart: Maximum Concurrent/Simultaneous TCP Connections on a logarithmic x-axis, Maximum TCP Connections per Second on the y-axis, one point per tested product)

Higher up indicates increased connections-per-second capacity. Farther to the right indicates increased concurrent/simultaneous connections. Products with low concurrent-connection/connections-per-second ratios run the risk of exhausting connections (sessions) before they reach their maximum potential connection rates.
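To make the exhaustion risk concrete, the sketch below estimates the worst-case time for each device's connection table to fill at its maximum connection rate, using the Figure 4 results. The assumption that no connections are torn down during that interval is ours, purely for illustration; real session lifetimes determine the actual steady state.

```python
# Worst-case time to fill the connection table when connections are
# opened at the maximum rate and never torn down (illustrative only).
# Values are (max concurrent TCP connections w/ data,
# max TCP connections/s), both from Figure 4.
devices = {
    "Fortinet FortiGate 5140B": (120_000_000, 1_195_000),
    "Juniper SRX 5800": (60_000_000, 107_400),
    "McAfee NS-9300": (43_276_208, 1_045_300),
    "Sourcefire 8290-2": (238_620_912, 2_086_208),
}

for name, (concurrent, cps) in devices.items():
    print(f"{name}: table full after ~{concurrent / cps:.0f} s")
# Roughly 100 s, 559 s, 41 s, and 114 s respectively.
```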

HTTP Connections per Second and Capacity

In-line IPS devices exhibit an inverse correlation between security effectiveness and performance: the more deep-packet inspection that is performed, the fewer packets that can be forwarded. It is also important to consider the real-world mix of traffic that a device will encounter. NSS tests aim to stress the HTTP detection engine in order to determine how the sensor copes with detecting and blocking exploits under network loads of varying average packet size and varying connections per second. By creating genuine session-based traffic with varying session lengths, the sensor is forced to track valid TCP sessions, ensuring a higher workload than for simple, packet-based background traffic. Each transaction consists of a single HTTP GET request, and there are no transaction delays (i.e., the web server responds immediately to all requests). All packets contain valid payload (a mix of binary and ASCII objects) and address data. This test provides an excellent representation of a live network (albeit one biased toward HTTP traffic) at various network loads.

HTTP Connections per Second and Capacity (Throughput)

As previously stated, NSS research has found that there is usually a trade-off between security effectiveness and performance. Because of this, it is important to judge a product's security effectiveness within the context of its performance (and vice versa). This ensures that new security protections do not adversely impact performance and that security shortcuts are not taken to maintain or improve performance. Figure 6 to Figure 10 depict the maximum throughput achieved across a range of HTTP response sizes that may be encountered in a typical data center.

Product                     44 KB Response (Mbps)
Fortinet FortiGate 5140B    80,000
Juniper SRX 5800            37,360
McAfee NS-9300              60,000
Sourcefire 8290-2           146,880

Figure 6 - Maximum Throughput per Device with 44 KB Response

Product                     21 KB Response (Mbps)
Fortinet FortiGate 5140B    80,000
Juniper SRX 5800            18,980
McAfee NS-9300              60,000
Sourcefire 8290-2           134,200

Figure 7 - Maximum Throughput per Device with 21 KB Response

Product                     10 KB Response (Mbps)
Fortinet FortiGate 5140B    55,028
Juniper SRX 5800            9,840
McAfee NS-9300              41,490
Sourcefire 8290-2           68,000

Figure 8 - Maximum Throughput per Device with 10 KB Response

Product                     4.5 KB Response (Mbps)
Fortinet FortiGate 5140B    36,050
Juniper SRX 5800            4,960
McAfee NS-9300              38,000
Sourcefire 8290-2           38,185

Figure 9 - Maximum Throughput per Device with 4.5 KB Response

Product                     1.7 KB Response (Mbps)
Fortinet FortiGate 5140B    21,200
Juniper SRX 5800            2,800
McAfee NS-9300              13,200
Sourcefire 8290-2           13,455

Figure 10 - Maximum Throughput per Device with 1.7 KB Response

Figure 11 depicts the maximum application-layer connection rates (HTTP connections per second) achieved with different HTTP response sizes (from 44 KB down to 1.7 KB).

Product                     44 KB      21 KB      10 KB      4.5 KB     1.7 KB
Fortinet FortiGate 5140B    200,000    400,000    550,275    721,000    848,000
Juniper SRX 5800            93,400     94,900     98,400     99,200     112,000
McAfee NS-9300              150,000    300,000    414,900    760,000    528,000
Sourcefire 8290-2           367,200    671,000    680,000    763,700    538,200

Figure 11 - Maximum Connection Rates per Device with Various Response Sizes

Application Average Response Time - HTTP (at 90% Maximum Capacity)

Figure 12 depicts the average application response time (application latency, measured in milliseconds) with different response sizes (from 44 KB down to 1.7 KB), recorded at 90% of the measured maximum capacity (throughput). A lower number is better (it denotes improved application response time).

Product                     44 KB (ms)   21 KB (ms)   10 KB (ms)   4.5 KB (ms)   1.7 KB (ms)
Fortinet FortiGate 5140B    0.003        0.030        0.030        0.020         0.040
Juniper SRX 5800            0.010        0.008        0.005        0.001         0.001
McAfee NS-9300              0.010        0.010        0.010        0.020         0.008
Sourcefire 8290-2           0.012        0.010        0.007        0.043         0.018

Figure 12 - Application Latency (milliseconds) per Device with Various Response Sizes
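Figures 6 through 10 and Figure 11 are two views of the same measurements: maximum throughput equals the connection rate multiplied by the bits transferred per connection. The sketch below divides the two to recover the per-connection transfer size, which comes out constant across all four devices for each response size (for example, 50,000 bytes at 44 KB; our assumption is that the bytes beyond the response body are request, header, and TCP/IP overhead).

```python
# Figures 6-10 (Mbps) and Figure 11 (connections/s) describe the same
# runs: throughput = connection rate x bits moved per connection.
# Dividing the two recovers the per-connection transfer size.
tested = {
    # response size: [(Mbps, connections/s)] for Fortinet, Juniper,
    # McAfee, and Sourcefire respectively, from Figures 6, 7, 10, 11.
    "44 KB": [(80_000, 200_000), (37_360, 93_400),
              (60_000, 150_000), (146_880, 367_200)],
    "21 KB": [(80_000, 400_000), (18_980, 94_900),
              (60_000, 300_000), (134_200, 671_000)],
    "1.7 KB": [(21_200, 848_000), (2_800, 112_000),
               (13_200, 528_000), (13_455, 538_200)],
}

for size, rows in tested.items():
    per_conn = {f"{mbps * 1e6 / cps / 8:,.0f} B" for mbps, cps in rows}
    print(f"{size} response: {per_conn} per connection on every device")
# 44 KB -> 50,000 B; 21 KB -> 25,000 B; 1.7 KB -> 3,125 B.
```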

Real-World Traffic Mixes

The aim of these tests is to measure the performance of the device under test in a real-world environment by introducing additional protocols and real content, while still maintaining a precisely repeatable and consistent background traffic load. In order to simulate real use cases, different protocol mixes are utilized to model different data center deployment scenarios. For details about real-world traffic protocol types and percentages, see the Data Center Intrusion Prevention System Methodology, available at www.nsslabs.com.

"Real-World" Protocol Mix, Data Center - Financial (Mbps)
Fortinet FortiGate 5140B    75,000
Juniper SRX 5800            20,500
McAfee NS-9300              58,500
Sourcefire 8290-2           160,000

Figure 13 - Real-World Protocol Mix (Data Center - Financial)

"Real-World" Protocol Mix, Data Center - Virtualization Hub (Mbps)
Fortinet FortiGate 5140B    66,250
Juniper SRX 5800            40,000
McAfee NS-9300              58,000
Sourcefire 8290-2           160,000

Figure 14 - Real-World Protocol Mix (Data Center - Virtualization Hub)

"Real-World" Protocol Mix, Data Center - Mobile Users and Applications (Mbps)
Fortinet FortiGate 5140B    47,426
Juniper SRX 5800            30,500
McAfee NS-9300              43,700
Sourcefire 8290-2           105,000

Figure 15 - Real-World Protocol Mix (Data Center - Mobile Users and Applications)

"Real-World" Protocol Mix, Data Center - Web-Based Applications and Services (Mbps)
Fortinet FortiGate 5140B    28,363
Juniper SRX 5800            39,770
McAfee NS-9300              45,000
Sourcefire 8290-2           118,000

Figure 16 - Real-World Protocol Mix (Data Center - Web-Based Applications and Services)

"Real-World" Protocol Mix, Data Center - Internet Service Provider Mix (Mbps)
Fortinet FortiGate 5140B    59,000
Juniper SRX 5800            40,000
McAfee NS-9300              20,000
Sourcefire 8290-2           139,000

Figure 17 - Real-World Protocol Mix (Data Center - Internet Service Provider Mix)

UDP Throughput & Latency

The aim of this test is to determine the raw packet processing capability of each in-line port pair of the device. This traffic does not attempt to simulate any form of real-world network condition. No TCP sessions are created during this test, and there is very little for the detection engine to do in the way of protocol analysis. However, this test is relevant, since current threat vectors force vendors to perform inspection on UDP packets. Figure 18 and Figure 19 depict the maximum UDP throughput (in megabits per second) achieved by each device using different packet sizes.

Figure 18 - UDP Throughput by Packet Size (Mbps) (chart: UDP packet size from 64 to 1,514 bytes on the x-axis, megabits per second on the y-axis, one line per product; the same data is tabulated in Figure 19)

Product                     64 Byte   128 Byte   256 Byte   512 Byte   1024 Byte   1514 Byte
Fortinet FortiGate 5140B    15,443    41,000     67,400     63,840     76,900      86,740
Juniper SRX 5800            3,753     5,728      8,094      11,052     13,424      14,225
McAfee NS-9300              22,700    36,177     60,218     69,500     70,300      75,000
Sourcefire 8290-2           16,273    25,600     43,614     74,017     113,423     160,000

Figure 19 - UDP Throughput by Packet Size (Mbps)
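Because raw UDP throughput scales with frame size, it can be helpful to restate Figure 19 as a packet rate. The conversion below counts only the bits in the frame itself and ignores Ethernet preamble and inter-frame gap; whether the published Mbps figures include layer-2 framing overhead is an assumption on our part.

```python
def packets_per_second(throughput_mbps: float, frame_bytes: int) -> float:
    """Approximate packet rate implied by a UDP throughput figure.

    Counts only the bits in the frame itself; Ethernet preamble and
    inter-frame gap are ignored (an assumption about the measurement).
    """
    return throughput_mbps * 1_000_000 / (frame_bytes * 8)

# Sourcefire 8290-2 from Figure 19: 160,000 Mbps at 1,514-byte frames
# is ~13.2 million packets/s, while 16,273 Mbps at 64-byte frames is
# ~31.8 million packets/s. Small packets stress the forwarding path
# hardest even at far lower Mbps.
print(f"{packets_per_second(160_000, 1514):,.0f} pps")
print(f"{packets_per_second(16_273, 64):,.0f} pps")
```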

In-line security devices that introduce high levels of latency lead to unacceptable response times for users, particularly where multiple security devices are placed in the data path. Figure 20 reflects the latency (in microseconds) recorded during the UDP throughput tests at 90% of maximum load. Lower values are preferred.

Product                     64 Byte   128 Byte   256 Byte   512 Byte   1024 Byte   1514 Byte
Fortinet FortiGate 5140B    319.0     315.0      8.0        10.0       13.0        16.0
Juniper SRX 5800            236.0     94.0       109.0      143.0      211.0       278.0
McAfee NS-9300              22.5      44.1       26.1       40.2       38.8        42.6
Sourcefire 8290-2           111.0     109.0      109.0      113.0      128.0       139.0

Figure 20 - UDP Latency by Packet Size (microseconds)

Test Methodology

Methodology Version: Data Center IPS Test Methodology v1.1.1

A copy of the test methodology is available on the NSS Labs website at www.nsslabs.com.

Contact Information

NSS Labs, Inc.
206 Wild Basin Rd
Building A, Suite 200
Austin, TX 78746
+1 (512) 961-5300
info@nsslabs.com
www.nsslabs.com

This and other related documents are available at www.nsslabs.com. To receive a licensed copy or report misuse, please contact NSS Labs at +1 (512) 961-5300 or sales@nsslabs.com.

2014 NSS Labs, Inc. All rights reserved. No part of this publication may be reproduced, photocopied, stored on a retrieval system, or transmitted without the express written consent of the authors.

Please note that access to or use of this report is conditioned on the following:

1. The information in this report is subject to change by NSS Labs without notice.
2. The information in this report is believed by NSS Labs to be accurate and reliable at the time of publication, but is not guaranteed. All use of and reliance on this report are at the reader's sole risk. NSS Labs is not liable or responsible for any damages, losses, or expenses arising from any error or omission in this report.
3. NO WARRANTIES, EXPRESS OR IMPLIED, ARE GIVEN BY NSS LABS. ALL IMPLIED WARRANTIES, INCLUDING IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT, ARE DISCLAIMED AND EXCLUDED BY NSS LABS. IN NO EVENT SHALL NSS LABS BE LIABLE FOR ANY CONSEQUENTIAL, INCIDENTAL, OR INDIRECT DAMAGES, OR FOR ANY LOSS OF PROFIT, REVENUE, DATA, COMPUTER PROGRAMS, OR OTHER ASSETS, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
4. This report does not constitute an endorsement, recommendation, or guarantee of any of the products (hardware or software) tested or the hardware and software used in testing the products. The testing does not guarantee that there are no errors or defects in the products, or that the products will meet the reader's expectations, requirements, needs, or specifications, or that they will operate without interruption.
5. This report does not imply any endorsement, sponsorship, affiliation, or verification by or with any organizations mentioned in this report.
6. All trademarks, service marks, and trade names used in this report are the trademarks, service marks, and trade names of their respective owners.