Invincea Endpoint Protection Test

A test commissioned by Invincea and performed by AV-TEST GmbH.
Date of the report: May 2nd, 2016

Executive Summary

In April 2016, AV-TEST performed a review of the Invincea endpoint protection solution to determine its malware detection and blocking capabilities. Invincea commissioned AV-TEST to run this independent test. To ensure a fair review, the sponsor did not supply any samples and had no influence on or prior knowledge of the samples being tested. The following test scenarios are standard tests that AV-TEST runs on a regular basis for endpoint anti-malware solutions:

1. Protection: Measuring the effectiveness of the product in protecting the system. One part is the protection against real-world attacks, such as malicious websites or e-mail attachments. The other part is the detection of widespread and prevalent malicious Windows binaries.
2. Usability/False Positives: False detections of legitimate software during a scan or while installing and using popular programs.
3. Performance: The performance impact of the security solution while performing typical tasks on a computer, measured on two different hardware platforms.

Breaking out the data by test shows that Invincea provided a very good protection level and minimal performance impact.

                      Real-World Testing   Detection of Prevalent Malware
Total Test Cases      64                   12,281
Detected Samples      64                   12,210
Detection Rate        100%                 99.42%

Figure 1: Summary of the test results for malware blocking

                      False Positives     False Positives      During Installation and
                      (Microsoft files)   (Popular Programs)   Usage (Popular Programs)
Total Test Cases      397,612             100,565              82
Detected Samples      0                   112                  1
False Positive Rate   0%                   0.11%                1.22%

Figure 2: Summary of the test results for false positives

                                              Old hardware   New hardware
Download files                                0.32%          0.68%
Load websites                                 11.42%         5.37%
Install applications                          4.82%          9.37%
Run applications opening specific documents   4.31%          6.29%
Copy files                                    6.04%          1.80%

Figure 3: Summary of the test results for performance testing
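The five per-task impacts in Figure 3 relate to the "Total Impact" values reported later in Figure 5 (5.38% on old hardware, 4.70% on new hardware). The report does not state how the totals are aggregated, but they match the arithmetic mean of the five categories, so the averaging shown below is our assumption:

```latex
% Assumed aggregation (not stated in the report): total impact as the
% arithmetic mean of the five per-task impacts from Figure 3.
\[
\text{Total}_{\text{old}} = \frac{0.32 + 11.42 + 4.82 + 4.31 + 6.04}{5}\,\% = \frac{26.91}{5}\,\% \approx 5.38\,\%
\]
\[
\text{Total}_{\text{new}} = \frac{0.68 + 5.37 + 9.37 + 6.29 + 1.80}{5}\,\% = \frac{23.51}{5}\,\% \approx 4.70\,\%
\]
```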

Overview

With the increasing volume of malware, targeted attacks and advanced persistent threats spreading through the Internet these days, the danger of getting infected is higher than ever before. In the year 2000, AV-TEST received more than 170,000 new unique samples; in 2015, the number of new samples grew to over 140,000,000. The growth of these numbers is displayed in Figure 4.

[Figure 4: New malware samples per year — new unique samples added to AV-TEST's malware repository, 2005-2016]

To protect an enterprise network against this enormous number of threats, well-working endpoint protection is required. Old approaches such as simple static signatures are no longer enough to protect against mass attacks, let alone targeted attacks. Products need to deliver protection in multiple layers: static signatures, heuristics, dynamic detection, URL blocking, reputation and possibly more. A clever combination of these layers makes it hard for the attacking side to infiltrate the enterprise network.
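As an illustration of this layering idea, the sketch below chains several independent detection layers and blocks a sample as soon as any layer flags it. It is purely illustrative and assumes nothing about Invincea's internals; all names and detection rules are hypothetical.

```python
# Purely illustrative sketch of a multi-layered detection pipeline, as
# described above. This is NOT Invincea's (or any vendor's) actual
# architecture; all names and detection rules here are hypothetical.
from typing import Callable, List, Optional


class Sample:
    """A candidate object to scan: raw bytes plus an optional source URL."""
    def __init__(self, data: bytes, url: Optional[str] = None):
        self.data = data
        self.url = url


# A layer inspects a sample and returns a verdict string if it flags
# the sample, or None if it has no opinion.
Layer = Callable[[Sample], Optional[str]]


def url_reputation(sample: Sample) -> Optional[str]:
    # Stand-in for a cloud reputation lookup against known-bad hosts.
    blocklist = {"malicious.example.com"}
    if sample.url and any(host in sample.url for host in blocklist):
        return "blocked by URL reputation"
    return None


def static_signature(sample: Sample) -> Optional[str]:
    # Stand-in for classic signature scanning: exact byte patterns.
    signatures = [b"X5O!P%@AP[4\\PZX54(P^)7CC)7}$EICAR"]  # EICAR-like fragment
    if any(sig in sample.data for sig in signatures):
        return "matched static signature"
    return None


def heuristic(sample: Sample) -> Optional[str]:
    # Toy heuristic: flag files embedding a suspicious API name.
    if b"CreateRemoteThread" in sample.data:
        return "flagged by heuristic"
    return None


def scan(sample: Sample, layers: List[Layer]) -> Optional[str]:
    # Run the layers in order; the first verdict blocks the sample.
    # A sample missed by one layer can still be caught by the next.
    for layer in layers:
        verdict = layer(sample)
        if verdict is not None:
            return verdict
    return None  # no layer objected; allow (dynamic monitoring would continue)


if __name__ == "__main__":
    pipeline = [url_reputation, static_signature, heuristic]
    benign = Sample(b"hello world", url="https://example.org/tool.exe")
    dropper = Sample(b"...CreateRemoteThread...", url="https://example.org/x.exe")
    print(scan(benign, pipeline))   # None -> allowed
    print(scan(dropper, pipeline))  # "flagged by heuristic"
```

Ordering the layers from cheapest (a reputation lookup) to most expensive (dynamic analysis) keeps the common case fast while still giving later layers a chance to catch what earlier ones miss.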

Product Tested

Invincea Enterprise, version 6.0.0.22576, was tested.

Methodology and Scoring

Platform

The test was performed on several PCs with identical hardware, running Windows 7 Ultimate SP1 (64-bit). The hardware platforms for the tests were as follows:

Older standard business PC: Intel Xeon X3360 @ 2.83 GHz, 4 GB RAM, 500 GB HDD
Recent high-end PC (for performance testing only): Intel Core i7-3770 @ 3.40 GHz, 16 GB RAM, Samsung 512 GB SSD

Testing Methodology

Invincea provided help in setting up and configuring the product. The test itself was performed by AV-TEST engineers according to the methodologies described at https://www.avtest.org/en/test-procedures/test-modules/

Test Results

The following table shows the individual results for Invincea versus the average results from the public tests that AV-TEST performed in Nov/Dec 2015 (11 products)¹ and Jan/Feb 2016 (12 products)² for enterprise protection products. Please note that the test sets and samples were different for all three tests. Also, the performance test was changed in 2016, so there are no performance values from Nov/Dec 2015.

                                                          Invincea   Nov/Dec 2015   Jan/Feb 2016
Protection
  Real-World Attacks                                      100%       97.9%          98.1%
  Prevalent Malware                                       99.4%      99.9%          99.8%
Usability/False Positives
  Critical files from Windows/Office                      0          0              0
  Less critical files from popular programs               112        6              2
  False warnings during execution and usage of programs   0          0              0
  False blockages during execution and usage of programs  1          0              0
Performance
  Total impact, old hardware                              5.38%      -              16.16%
  Total impact, new hardware                              4.70%      -              18.93%

Figure 5: Test results compared to Nov/Dec 2015 and Jan/Feb 2016 average results

Protection Results

In this test the product had to protect against 64 malicious websites. Invincea blocked all of the attacks and no infection occurred. Comparing this to previously published AV-TEST results from Nov/Dec 2015 and Jan/Feb 2016 shows that 100% protection is not yet the standard: the average was 97.9% in Nov/Dec 2015 and 98.1% in Jan/Feb 2016. 100% detection and blocking can therefore be considered very good.

The second part of the protection test involved 12,281 widespread and prevalent malicious files that had to be detected either by a scan or during execution. Invincea detected 99.42% of the files. This is only slightly below the averages of the Nov/Dec 2015 and Jan/Feb 2016 tests, but still a very good result.

False Positive Results

The static false positive testing was carried out against 397,612 critical files from Windows and Office installations. No false positives occurred here. A second test set consisted of 100,565 files from popular programs downloaded from major download sites. Invincea detected 112 of these files as malicious or unwanted. This is the only part of the testing where Invincea showed a weakness. However, in a business environment the risk of such false positives is much smaller, as the average user is usually not allowed to install additional software.

¹ https://www.av-test.org/en/antivirus/business-windows-client/windows-10/december-2015/
² https://www.av-test.org/en/antivirus/business-windows-client/windows-7/february-2016/

A further test involved the installation and usage of 41 well-known products such as 7-Zip, Adobe Reader, the Java JDK and Google Chrome. During this test we checked for warning messages or wrongful blocking of certain actions. Invincea blocked the installation of one product: HD Clone 6.0.5. As mentioned before, in a business environment these kinds of false positives are probably not a big problem. Should they occur, the system administrator can easily whitelist those cases to allow proper usage.

Performance Results

The performance tests are run on older and newer hardware to cover the different hardware platforms in use in different companies. In both cases the performance impact of Invincea was minimal, and in fact one of the best values we have measured in our tests so far. There were no notable problems anywhere, and the impact was much lower than the average impact we measured in the public Jan/Feb 2016 testing of other products.

Summarized Score

When publishing results, AV-TEST translates the detection rates and other measurements into a score from 0 to 6, where 6 is the best possible result. Applying this to the Invincea results yields the following scores:

Protection: 5.5
Usability/False Positives: 4
Performance: 6

This allows a rough comparison with the Nov/Dec 2015 and Jan/Feb 2016 results; see the tables below. Please note that different test sets were used and the tests ran on different dates. Also, the performance test was updated for 2016. The results are therefore not directly comparable, but they give a first indication of how the products would compare.

November and December 2015

                               Protection   Usability   Performance   Total
Bitdefender                    6            5.5         6             17.5
Cylance                        5.5          4           4             13.5
F-Secure                       6            5.5         4.5           16
G Data                         5.5          4.5         5.5           15.5
Kaspersky                      6            6           6             18
Intel Security                 5.5          5           6             16.5
Microsoft                      4.5          4.5         6             15
Seqrite                        4.5          3           5.5           13
Sophos                         6            5           5.5           16.5
Symantec                       6            5.5         5.5           17
Trend Micro                    6            5.5         6             17.5
Invincea (April private test)  5.5          4           6             15.5

Figure 6: Test results compared to Nov/Dec 2015
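The Total column in Figure 6 (and in Figure 7 below) is simply the sum of the three category scores, which can be verified row by row; for Invincea, for example:

```latex
\[
\underbrace{5.5}_{\text{Protection}} + \underbrace{4}_{\text{Usability}} + \underbrace{6}_{\text{Performance}} = 15.5
\]
```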

January and February 2016

                               Protection   Usability   Performance   Total
AVG                            6            5           6             17
Bitdefender                    6            5.5         6             17.5
F-Secure                       6            6           5             17
G Data                         5.5          5           6             16.5
Kaspersky (Endpoint)           6            5.5         6             17.5
Kaspersky (Small Office)       6            6           6             18
Intel Security                 5.5          5           6             16.5
Microsoft                      3            5           6             14
Seqrite                        4            3.5         5.5           13
Sophos                         5            4           6             15
Symantec                       6            5.5         6             17.5
Trend Micro                    6            4.5         6             16.5
Invincea (April private test)  5.5          4           6             15.5

Figure 7: Test results compared to Jan/Feb 2016

Summary

The results illustrate that Invincea is able to provide the same level of protection as other products that are regularly included in AV-TEST's public comparative tests. The performance impact was also minimal and one of the best values we have ever measured. Only the false positive testing revealed a few weaknesses. This is something that can be improved in the future. Even now, however, those false positives should not be a big problem in business environments, as they occurred primarily on consumer software and could easily be whitelisted by a system administrator.

Copyright 2016 by AV-TEST GmbH, Klewitzstr. 7, 39112 Magdeburg, Germany
Phone +49 (0) 391 60754-60, Fax +49 (0) 391 60754-69, Web https://www.av-test.org