BREACH DETECTION SYSTEMS COMPARATIVE ANALYSIS

Security

Authors: Thomas Skybakmoen, Jason Pappalexis

Tested Products

AhnLab MDS
Fidelis XPS Direct 1000
FireEye Web MPS 4310 and Email MPS 5300
Fortinet FortiSandbox 3000D
Sourcefire (Cisco) Advanced Malware Protection
Trend Micro Deep Discovery Inspector Model 1000

Note: Sourcefire is now part of Cisco.

Environment

Breach Detection Systems: Test Methodology 1.5

Overview

Implementation of breach detection systems (BDS) can be a complex process, with multiple factors affecting the overall security effectiveness of the solution. These factors should be considered over the course of the useful life of the solution, and include:

1. Detection rate
2. Device stability and reliability

In order to determine the relative security effectiveness of devices on the market and facilitate accurate product comparisons, NSS Labs has developed a unique metric:

Security Effectiveness = Detection Rate x Stability & Reliability

Figure 1 Security Effectiveness Formula

Detection rate is defined as the number of malware samples detected under test within the 48-hour window. By focusing on overall security effectiveness instead of the detection rate alone, NSS is able to factor in the ease with which defenses can be bypassed, as well as the reliability of the device.

Product                                            Detection Rate   Stability & Reliability   Security Effectiveness
AhnLab MDS                                         94.7%            100%                      94.7%
Fidelis XPS Direct 1000                            98.4%            100%                      98.4%
FireEye Web MPS 4310 and Email MPS 5300            94.5%            100%                      94.5%
Fortinet FortiSandbox 3000D                        99.0%            100%                      99.0%
Sourcefire Advanced Malware Protection             99.0%            100%                      99.0%
Trend Micro Deep Discovery Inspector Model 1000    99.1%            100%                      99.1%

Figure 2 Security Effectiveness

Because enterprise users consider effective management to be a critical component of any enterprise security deployment, management should also be factored into total cost of ownership (TCO) and overall product selection. This is outside the scope of this report; for more information, refer to the TCO Comparative Analysis Report (CAR). For a complete view of security effectiveness mapped against value, refer to the Security Value Map (SVM) CAR.

As part of the initial BDS test setup, devices are configured and tuned as deemed necessary by the vendor. Every effort is made to deploy policies that ensure the optimal combination of security effectiveness and performance, as would be the aim of a typical customer deploying the device in a live network environment. This provides readers with the most useful information on key BDS security effectiveness and performance capabilities based upon their expected usage.
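To make the Figure 1 arithmetic concrete, the short sketch below applies the formula to the Figure 2 values. It is an illustrative sketch only, not NSS Labs tooling; the function and variable names are our own, and rates are expressed as fractions.

    # Illustrative sketch only (not NSS Labs tooling): applies the Figure 1
    # formula to the Figure 2 values. Rates are expressed as fractions of 1.
    def security_effectiveness(detection_rate, stability_reliability):
        # Security Effectiveness = Detection Rate x Stability & Reliability
        return detection_rate * stability_reliability

    figure_2_inputs = {
        "AhnLab MDS": (0.947, 1.00),
        "Fidelis XPS Direct 1000": (0.984, 1.00),
        "FireEye Web MPS 4310 and Email MPS 5300": (0.945, 1.00),
        "Fortinet FortiSandbox 3000D": (0.990, 1.00),
        "Sourcefire Advanced Malware Protection": (0.990, 1.00),
        "Trend Micro Deep Discovery Inspector Model 1000": (0.991, 1.00),
    }

    for product, (detection, stability) in figure_2_inputs.items():
        print(f"{product}: {security_effectiveness(detection, stability):.1%}")

Because every tested product scored 100% for stability and reliability, each product's security effectiveness equals its detection rate.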

Figure 3 depicts the relationship between protection and performance when tuned policies are used. Farther up indicates better security effectiveness, and farther to the right indicates higher throughput. The maximum throughput shown is the first stage at which one or more attacks are not detected.

[Figure 3 Security Effectiveness and Performance: scatter chart plotting Security Effectiveness (94% to 100%) against NSS-Tested Throughput (500 to 1,200 Mbps) for the six tested products.]

When selecting products, those along the top line of the chart (closer to 100% security effectiveness) should be prioritized. Throughput is a secondary consideration and will be dependent on enterprise-specific deployment requirements.

Table of Contents

Tested Products ................................... 1
Environment ....................................... 1
Overview .......................................... 2
Analysis .......................................... 5
Tuning ............................................ 5
Detection Rate .................................... 5
Malware Delivered over HTTP ....................... 6
Malware Delivered over E-mail ..................... 6
Malware Delivered by Exploits ..................... 7
False Positives ................................... 7
Stability & Reliability ........................... 8
Security Effectiveness ............................ 8
Contact Information ............................... 9

Table of Figures

Figure 1 Security Effectiveness Formula ........... 2
Figure 2 Security Effectiveness ................... 2
Figure 3 Security Effectiveness and Performance ... 3
Figure 4 Malware Delivered over HTTP .............. 6
Figure 5 Malware Delivered over E-mail ............ 6
Figure 6 Malware Delivered using Exploits ......... 7
Figure 7 False Positives .......................... 7
Figure 8 Stability and Reliability ................ 8
Figure 9 Security Effectiveness ................... 8

Analysis

The threat landscape is evolving constantly; attackers are refining their strategies and increasing both the volume and the sophistication of their attacks. Enterprises now must defend against targeted persistent attacks (TPAs). In the past, servers were the main target; attacks against desktop client applications are now mainstream and present a clear danger to organizations. Through constant analysis of suspicious code and identification of communications with malicious hosts, breach detection systems claim to provide enhanced detection of advanced malware, zero-day attacks, and targeted attacks that could bypass traditional defenses.

Tuning

Security products are often complex, and vendors are responding by simplifying the user interface and security policy selection to meet the usability needs of a broadening user base. Indeed, many organizations accept and deploy the default settings, understanding these to be the best recommendations from the vendor. NSS research has found that BDS products often require little or no tuning; in fact, several ship with few or no tuning options as standard. However, where possible, all BDS products are tuned prior to testing to eliminate false positives and provide the most appropriate coverage for the systems to be protected.

Typically, tuning is carried out by experienced system engineers from the vendor company, but where this is not possible, NSS engineers will perform the necessary tuning. NSS engineers may also amend the configuration of a device under test (DUT) where specific characteristics of the DUT or its configuration interfere with the normal operation of any of the tests, or where the results obtained from those tests would, in the opinion of those engineers, misrepresent the true capabilities of the DUT. Every effort is made to ensure the optimal combination of security effectiveness and performance, as would be the aim of a typical customer deploying the DUT in a live network environment.

Detection Rate

NSS security effectiveness testing leverages the deep expertise of our engineers to generate the same types of attacks used by modern cyber criminals, utilizing multiple commercial, open source, and proprietary tools as appropriate. With over 1,800 live exploits and malware samples, this is the industry's most comprehensive test to date. Most notably, all of the live exploits in these tests have been validated such that:

- A reverse shell is returned
- A bind shell is opened on the target, allowing the attacker to execute arbitrary commands
- A malicious payload is installed
- The system is rendered unresponsive

Malware Delivered over HTTP

Figure 4 depicts how each product was able to detect socially engineered malware that uses the HTTP protocol as its transport mechanism, i.e., malware downloaded through a web browser.

[Figure 4 Malware Delivered over HTTP: bar chart of detection rates per product. Legible values: Fidelis XPS Direct 1000 97.7%; FireEye Web MPS 4310 and Email MPS 5300 95.1%; Fortinet FortiSandbox 3000D 98.7%; Sourcefire Advanced Malware Protection 98.7%; Trend Micro Deep Discovery Inspector Model 1000 97.3%. The AhnLab MDS value is not legible in the source.]

Malware Delivered over E-mail

Figure 5 depicts how each product was able to detect socially engineered malware that uses email (SMTP/IMAP) as its transport mechanism; for example, a malicious email attachment.

[Figure 5 Malware Delivered over E-mail: bar chart of detection rates per product. AhnLab MDS 94.0%; Fidelis XPS Direct 1000 97.6%; FireEye Web MPS 4310 and Email MPS 5300 96.0%; Fortinet FortiSandbox 3000D 98.4%; Sourcefire Advanced Malware Protection 98.4%; Trend Micro Deep Discovery Inspector Model 1000 100.0%.]

Malware Delivered by Exploits

Figure 6 depicts how each product was able to detect malware delivered by exploits. Exploits are defined as malicious software that is designed to take advantage of an existing deficiency in a hardware or software system, be it a vulnerability or a bug.

[Figure 6 Malware Delivered using Exploits: bar chart of detection rates per product. Legible values: AhnLab MDS 90.0%; FireEye Web MPS 4310 and Email MPS 5300 92.5%. The remaining values are not legible in the source.]

False Positives

The ability of a product to identify and pass legitimate traffic while maintaining detection of threats and breaches is as important as the detection rate alone. There is commonly a trade-off between detection accuracy and performance; a product's detection accuracy should be evaluated within the context of its performance, and vice versa. For more information, please refer to the Performance Comparative Analysis on www.nsslabs.com. Figure 7 shows the results of this test.

Product                                            % False Positive Samples Detected
AhnLab MDS                                         7%
Fidelis XPS Direct 1000                            0%
FireEye Web MPS 4310 and Email MPS 5300            0%
Fortinet FortiSandbox 3000D                        0%
Sourcefire Advanced Malware Protection             0%
Trend Micro Deep Discovery Inspector Model 1000    0%

Figure 7 False Positives

Stability & Reliability

Long-term stability is particularly important for an in-line device, where failure can produce network outages. These tests verify the stability of the DUT along with its ability to maintain security effectiveness while under normal load and while passing malicious traffic. Products that are not able to sustain legitimate traffic (or that crash) while under hostile attack will not pass.

The DUT is required to remain operational and stable throughout these tests, and it is required to operate at 100% scanning capability, raising an alert for each detection. If any malicious traffic passes undetected, whether because of the volume of traffic or because the BDS fails for any reason, the result is a FAIL.

All six tested products received a PASS on every stability and reliability test:

- Detection Under Extended Attack
- Attack Detection Under Normal Load
- Protocol Fuzzing and Mutation (Detection Ports)
- Protocol Fuzzing and Mutation (Management Port)
- Power Fail
- Redundancy
- Persistence of Data

Figure 8 Stability and Reliability

Security Effectiveness

The security effectiveness of a device is determined by factoring the results of stability and reliability testing into the detection rate. Figure 9 depicts the security effectiveness of each device.

Product                                            Detection Rate   Stability & Reliability   Security Effectiveness
AhnLab MDS                                         94.7%            100%                      94.7%
Fidelis XPS Direct 1000                            98.4%            100%                      98.4%
FireEye Web MPS 4310 and Email MPS 5300            94.5%            100%                      94.5%
Fortinet FortiSandbox 3000D                        99.0%            100%                      99.0%
Sourcefire Advanced Malware Protection             99.0%            100%                      99.0%
Trend Micro Deep Discovery Inspector Model 1000    99.1%            100%                      99.1%

Figure 9 Security Effectiveness

Test Methodology

Breach Detection Systems: Test Methodology 1.5

A copy of the test methodology is available on the NSS Labs website at www.nsslabs.com.

Contact Information

NSS Labs, Inc.
206 Wild Basin Rd
Building A, Suite 200
Austin, TX 78746
+1 (512) 961-5300
info@nsslabs.com
www.nsslabs.com

This and other related documents are available at: www.nsslabs.com. To receive a licensed copy or report misuse, please contact NSS Labs at +1 (512) 961-5300 or sales@nsslabs.com.

© 2014 NSS Labs, Inc. All rights reserved. No part of this publication may be reproduced, photocopied, stored on a retrieval system, or transmitted without the express written consent of the authors.

Please note that access to or use of this report is conditioned on the following:

1. The information in this report is subject to change by NSS Labs without notice.

2. The information in this report is believed by NSS Labs to be accurate and reliable at the time of publication, but is not guaranteed. All use of and reliance on this report are at the reader's sole risk. NSS Labs is not liable or responsible for any damages, losses, or expenses arising from any error or omission in this report.

3. NO WARRANTIES, EXPRESS OR IMPLIED, ARE GIVEN BY NSS LABS. ALL IMPLIED WARRANTIES, INCLUDING IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT ARE DISCLAIMED AND EXCLUDED BY NSS LABS. IN NO EVENT SHALL NSS LABS BE LIABLE FOR ANY CONSEQUENTIAL, INCIDENTAL, OR INDIRECT DAMAGES, OR FOR ANY LOSS OF PROFIT, REVENUE, DATA, COMPUTER PROGRAMS, OR OTHER ASSETS, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.

4. This report does not constitute an endorsement, recommendation, or guarantee of any of the products (hardware or software) tested or the hardware and software used in testing the products. The testing does not guarantee that there are no errors or defects in the products or that the products will meet the reader's expectations, requirements, needs, or specifications, or that they will operate without interruption.

5. This report does not imply any endorsement, sponsorship, affiliation, or verification by or with any organizations mentioned in this report.

6. All trademarks, service marks, and trade names used in this report are the trademarks, service marks, and trade names of their respective owners.