BREACH DETECTION SYSTEMS COMPARATIVE ANALYSIS
Security
Thomas Skybakmoen, Jason Pappalexis

Tested Products
AhnLab MDS
Fidelis XPS Direct 1000
FireEye Web MPS 4310 and Email MPS 5300
Fortinet FortiSandbox 3000D
Sourcefire (Cisco) Advanced Malware Protection 1
Trend Micro Deep Discovery Inspector Model 1000

Environment
Breach Detection Systems: Test Methodology 1.5

1 Sourcefire is now part of Cisco.
Overview

Implementation of breach detection systems (BDS) can be a complex process, with multiple factors affecting the overall security effectiveness of the solution. These factors should be considered over the course of the useful life of the solution, and include:

1. Detection rate
2. Device stability and reliability

In order to determine the relative security effectiveness of devices on the market and facilitate accurate product comparisons, NSS Labs has developed a unique metric:

Security Effectiveness = Detection Rate 2 x Stability & Reliability

Figure 1 Security Effectiveness Formula

By focusing on overall security effectiveness instead of the detection rate alone, NSS is able to factor in the ease with which defenses can be bypassed, as well as the reliability of the device.

Product | Detection Rate | Stability & Reliability | Security Effectiveness
AhnLab MDS | 94.7% | 100% | 94.7%
Fidelis XPS Direct 1000 | 98.4% | 100% | 98.4%
FireEye Web MPS 4310 and Email MPS 5300 | 94.5% | 100% | 94.5%
Fortinet FortiSandbox 3000D | 99.0% | 100% | 99.0%
Sourcefire Advanced Malware Protection | 99.0% | 100% | 99.0%
Trend Micro Deep Discovery Inspector Model 1000 | 99.1% | 100% | 99.1%

Figure 2 Security Effectiveness

Because enterprise users consider effective management to be a critical component of any enterprise security deployment, management should also be factored into total cost of ownership (TCO) and overall product selection. This is outside the scope of this report; for more information, refer to the TCO Comparative Analysis Report (CAR). For a complete view of security effectiveness mapped against value, refer to the Security Value Map (SVM) CAR.

As part of the initial BDS test setup, devices are configured and tuned as deemed necessary by the vendor. Every effort is made to deploy policies that ensure the optimal combination of security effectiveness and performance, as would be the aim of a typical customer deploying the device in a live network environment.
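The Figure 1 metric is a straight multiplication of two fractions: a device that detects 99% of attacks but fails stability testing scores lower than its detection rate alone would suggest. A minimal sketch in Python, using the Figure 2 values (the `security_effectiveness` helper and data layout are illustrative only, not NSS Labs tooling):

```python
def security_effectiveness(detection_rate: float, stability: float) -> float:
    """Security Effectiveness = Detection Rate x Stability & Reliability.

    Both inputs are fractions in [0, 1]; so is the result.
    """
    return detection_rate * stability

# (detection rate, stability & reliability) per product, from Figure 2
results = {
    "AhnLab MDS": (0.947, 1.00),
    "Fidelis XPS Direct 1000": (0.984, 1.00),
    "FireEye Web MPS 4310 and Email MPS 5300": (0.945, 1.00),
    "Fortinet FortiSandbox 3000D": (0.990, 1.00),
    "Sourcefire Advanced Malware Protection": (0.990, 1.00),
    "Trend Micro Deep Discovery Inspector Model 1000": (0.991, 1.00),
}

for product, (detection, stability) in results.items():
    print(f"{product}: {security_effectiveness(detection, stability):.1%}")
```

Because every tested product scored 100% on stability and reliability in this round, each security effectiveness score in Figure 2 equals the detection rate; the stability factor would only lower the score for a device that failed one of the Figure 8 tests.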
This provides readers with the most useful information on key BDS security effectiveness and performance capabilities, based upon their expected usage. The chart in Figure 3 depicts the relationship between protection and performance when tuned policies are used: farther up indicates better security effectiveness, and farther to the right indicates higher throughput. The maximum throughput shown is the first stage at which one or more attacks are not detected.

2 Detection Rate is defined as the number of malware detected under test within the 48-hour window.
[Chart: Security Effectiveness (94%–100%) plotted against NSS-Tested Throughput (500–1,200 Mbps) for each tested product.]

Figure 3 Security Effectiveness and Performance

When selecting products, those along the top of the chart (closer to 100% security effectiveness) should be prioritized. Throughput is a secondary consideration and will depend on enterprise-specific deployment requirements.
Table of Contents

Tested Products ... 1
Environment ... 1
Overview ... 2
Analysis ... 5
Tuning ... 5
Detection Rate ... 5
Malware Delivered over HTTP ... 6
Malware Delivered over E-mail ... 6
Malware Delivered by Exploits ... 7
False Positives ... 7
Stability & Reliability ... 8
Security Effectiveness ... 8
Contact Information ... 9

Table of Figures

Figure 1 Security Effectiveness Formula ... 2
Figure 2 Security Effectiveness ... 2
Figure 3 Security Effectiveness and Performance ... 3
Figure 4 Malware Delivered over HTTP ... 6
Figure 5 Malware Delivered over E-mail ... 6
Figure 6 Malware Delivered using Exploits ... 7
Figure 7 False Positives ... 7
Figure 8 Stability and Reliability ... 8
Figure 9 Security Effectiveness ... 8
Analysis

The threat landscape is evolving constantly; attackers are refining their strategies and increasing both the volume and intelligence of their attacks. Enterprises now must defend against targeted persistent attacks (TPA). In the past, servers were the main target; attacks against desktop client applications are now mainstream and present a clear danger to organizations. Through constant analysis of suspicious code and identification of communications with malicious hosts, breach detection systems claim to provide enhanced detection of advanced malware, zero-day attacks, and targeted attacks that could bypass traditional defenses.

Tuning

Security products are often complex, and vendors are responding by simplifying the user interface and security policy selection to meet the usability needs of a broadening user base. Indeed, many organizations accept and deploy the default settings, understanding these to be the best recommendations from the vendor.

NSS research has found that BDS products often require little or no tuning; in fact, several ship with few or no tuning options as standard. However, where possible, all BDS products are tuned prior to testing to eliminate false positives and provide the most appropriate coverage for the systems to be protected. Typically, tuning is carried out by experienced systems engineers from the vendor company, but where this is not possible, NSS engineers perform the necessary tuning.

NSS engineers may also amend the configuration of a device under test (DUT) where specific characteristics of the DUT or its configuration interfere with the normal operation of any of the tests, or where the results obtained from those tests would, in the opinion of those engineers, misrepresent the true capabilities of the DUT. Every effort is made to ensure the optimal combination of security effectiveness and performance, as would be the aim of a typical customer deploying the DUT in a live network environment.
Detection Rate

NSS security effectiveness testing leverages the deep expertise of our engineers to generate the same types of attacks used by modern cyber criminals, utilizing multiple commercial, open-source, and proprietary tools as appropriate. With over 1,800 live exploits and malware samples, this is the industry's most comprehensive test to date. Most notably, all of the live exploits in these tests have been validated such that:

- A reverse shell is returned
- A bind shell is opened on the target, allowing the attacker to execute arbitrary commands
- A malicious payload is installed
- The system is rendered unresponsive
Malware Delivered over HTTP

Figure 4 depicts how each product was able to detect socially engineered malware using the HTTP protocol as its transport mechanism, i.e., malware downloaded through a web browser.

Product | Malware Delivered over HTTP
AhnLab MDS | (value not legible in source)
Fidelis XPS Direct 1000 | 97.7%
FireEye Web MPS 4310 and Email MPS 5300 | 95.1%
Fortinet FortiSandbox 3000D | 98.7%
Sourcefire Advanced Malware Protection | 98.7%
Trend Micro Deep Discovery Inspector Model 1000 | 97.3%

Figure 4 Malware Delivered over HTTP

Malware Delivered over E-mail

Figure 5 depicts how each product was able to detect socially engineered malware that uses email (SMTP/IMAP) as its transport mechanism; for example, a malicious email attachment.

Product | Malware Delivered over E-mail
AhnLab MDS | 94.0%
Fidelis XPS Direct 1000 | 97.6%
FireEye Web MPS 4310 and Email MPS 5300 | 96.0%
Fortinet FortiSandbox 3000D | 98.4%
Sourcefire Advanced Malware Protection | 98.4%
Trend Micro Deep Discovery Inspector Model 1000 | 100.0%

Figure 5 Malware Delivered over E-mail
Malware Delivered by Exploits

Figure 6 depicts how each product was able to detect malware delivered by exploits. Exploits are defined as malicious software designed to take advantage of an existing deficiency in a hardware or software system, be it a vulnerability or a bug.

[Chart: detection of malware delivered by exploits per product; legible values: AhnLab MDS 90.0%, FireEye Web MPS 4310 and Email MPS 5300 92.5%.]

Figure 6 Malware Delivered using Exploits

False Positives

The ability of a product to identify and pass legitimate traffic while maintaining detection of threats and breaches is as important as the detection rate alone. There is commonly a trade-off between detection accuracy and performance; a product's detection accuracy should be evaluated within the context of its performance, and vice versa. For more information, please refer to the Performance Comparative Analysis on www.nsslabs.com. Figure 7 shows the results of this test.

Product | % False Positive Samples Detected
AhnLab MDS | 7%
Fidelis XPS Direct 1000 | 0%
FireEye Web MPS 4310 and Email MPS 5300 | 0%
Fortinet FortiSandbox 3000D | 0%
Sourcefire Advanced Malware Protection | 0%
Trend Micro Deep Discovery Inspector Model 1000 | 0%

Figure 7 False Positives
Stability & Reliability

Long-term stability is particularly important for an in-line device, where failure can produce network outages. These tests verify the stability of the DUT along with its ability to maintain security effectiveness while under normal load and while passing malicious traffic. Products that are not able to sustain legitimate traffic (or that crash) while under hostile attack will not pass.

The DUT is required to remain operational and stable throughout these tests, and it is required to operate at 100% scanning capability, raising an alert for each detection. If any malicious traffic passes undetected, caused either by the volume of traffic or by the BDS failing for any reason, this will result in a FAIL.

Product | Detection Under Extended Attack | Attack Detection - Normal Load | Protocol Fuzzing and Mutation - Detection Ports | Protocol Fuzzing and Mutation - Management Port | Power Fail | Redundancy | Persistence of Data
AhnLab MDS | PASS | PASS | PASS | PASS | PASS | PASS | PASS
Fidelis XPS Direct 1000 | PASS | PASS | PASS | PASS | PASS | PASS | PASS
FireEye Web MPS 4310 and Email MPS 5300 | PASS | PASS | PASS | PASS | PASS | PASS | PASS
Fortinet FortiSandbox 3000D | PASS | PASS | PASS | PASS | PASS | PASS | PASS
Sourcefire Advanced Malware Protection | PASS | PASS | PASS | PASS | PASS | PASS | PASS
Trend Micro Deep Discovery Inspector Model 1000 | PASS | PASS | PASS | PASS | PASS | PASS | PASS

Figure 8 Stability and Reliability

Security Effectiveness

The security effectiveness of a device is determined by factoring the results of stability & reliability testing into the detection rate. Figure 9 depicts the security effectiveness of each device.

Product | Detection Rate | Stability & Reliability | Security Effectiveness
AhnLab MDS | 94.7% | 100% | 94.7%
Fidelis XPS Direct 1000 | 98.4% | 100% | 98.4%
FireEye Web MPS 4310 and Email MPS 5300 | 94.5% | 100% | 94.5%
Fortinet FortiSandbox 3000D | 99.0% | 100% | 99.0%
Sourcefire Advanced Malware Protection | 99.0% | 100% | 99.0%
Trend Micro Deep Discovery Inspector Model 1000 | 99.1% | 100% | 99.1%

Figure 9 Security Effectiveness
Test Methodology

Breach Detection Systems: Test Methodology 1.5

A copy of the test methodology is available on the NSS Labs website at www.nsslabs.com.

Contact Information

NSS Labs, Inc.
206 Wild Basin Rd
Building A, Suite 200
Austin, TX 78746
+1 (512) 961-5300
info@nsslabs.com
www.nsslabs.com

This and other related documents are available at: www.nsslabs.com. To receive a licensed copy or to report misuse, please contact NSS Labs at +1 (512) 961-5300 or sales@nsslabs.com.

© 2014 NSS Labs, Inc. All rights reserved. No part of this publication may be reproduced, photocopied, stored on a retrieval system, or transmitted without the express written consent of the authors. Please note that access to or use of this report is conditioned on the following:

1. The information in this report is subject to change by NSS Labs without notice.

2. The information in this report is believed by NSS Labs to be accurate and reliable at the time of publication, but is not guaranteed. All use of and reliance on this report are at the reader's sole risk. NSS Labs is not liable or responsible for any damages, losses, or expenses arising from any error or omission in this report.

3. NO WARRANTIES, EXPRESS OR IMPLIED, ARE GIVEN BY NSS LABS. ALL IMPLIED WARRANTIES, INCLUDING IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT ARE DISCLAIMED AND EXCLUDED BY NSS LABS. IN NO EVENT SHALL NSS LABS BE LIABLE FOR ANY CONSEQUENTIAL, INCIDENTAL, OR INDIRECT DAMAGES, OR FOR ANY LOSS OF PROFIT, REVENUE, DATA, COMPUTER PROGRAMS, OR OTHER ASSETS, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.

4. This report does not constitute an endorsement, recommendation, or guarantee of any of the products (hardware or software) tested or of the hardware and software used in testing the products.
The testing does not guarantee that there are no errors or defects in the products, that the products will meet the reader's expectations, requirements, needs, or specifications, or that they will operate without interruption.

5. This report does not imply any endorsement, sponsorship, affiliation, or verification by or with any organizations mentioned in this report.

6. All trademarks, service marks, and trade names used in this report are the trademarks, service marks, and trade names of their respective owners.