Document: Endpoint Protection 2017 - Performance Testing - Enterprise - Win10 - Edition 1.docx
Authors: M. Baquiran, D. Wren
Company: PassMark Software
Date: 8 February 2017
Edition: 1

TABLE OF CONTENTS

REVISION HISTORY
REFERENCES
EXECUTIVE SUMMARY
SCORE AND RANK
PRODUCTS AND VERSIONS
PERFORMANCE METRICS SUMMARY
ENTERPRISE ENDPOINT SECURITY PRODUCTS RESULTS
BENCHMARK 1 - WORD DOCUMENT LAUNCH AND OPEN TIME (MILLISECONDS)
BENCHMARK 2 - EDGE LAUNCH TIME (MILLISECONDS)
BENCHMARK 3 - ON-DEMAND SCAN TIME (SECONDS)
BENCHMARK 4 - CPU USAGE DURING SCAN (PERCENT)
BENCHMARK 5 - BROWSE TIME (SECONDS)
BENCHMARK 6 - FILE COPY, MOVE AND DELETE (SECONDS)
BENCHMARK 7 - NETWORK THROUGHPUT (SECONDS)
BENCHMARK 8 - FILE COMPRESSION AND DECOMPRESSION (SECONDS)
BENCHMARK 9 - FILE WRITE, OPEN AND CLOSE (SECONDS)
BENCHMARK 10 - MEMORY USAGE DURING SYSTEM IDLE (MEGABYTES)
BENCHMARK 11 - MEMORY USAGE DURING SCAN (MEGABYTES)
BENCHMARK 12 - MEMORY USAGE DURING ACTIVITY (MEGABYTES)
BENCHMARK 13 - CPU USAGE DURING SYSTEM IDLE (PERCENT)
BENCHMARK 14 - CPU USAGE DURING ACTIVITY (PERCENT)
BENCHMARK 15 - INSTALLATION TIME (SECONDS)
BENCHMARK 16 - INSTALLATION SIZE (MB)
BENCHMARK 17 - BOOT TIME (SECONDS)
DISCLAIMER AND DISCLOSURE
CONTACT DETAILS
APPENDIX 1 - TEST ENVIRONMENT
APPENDIX 2 - METHODOLOGY DESCRIPTION

REVISION HISTORY

Rev         Revision History                    Date
Edition 1   Initial version of this report.     8 February 2017

REFERENCES

Ref #   Document                                                    Author                  Date
1       What Really Slows Windows Down (http://www.thepcspy.com)   O. Warner, The PC Spy   2001-2017

EXECUTIVE SUMMARY

PassMark Software conducted objective performance testing on five (5) security products during October 2016, with all but one of these products being Enterprise Endpoint Security products. This report presents our results from these performance tests.

Benchmarking was performed using seventeen performance metrics to assess product performance and system impact on the endpoint or client machine. The metrics used in testing are as follows:

- Word Document Launch and Open Time;
- Edge Launch Time;
- On-Demand Scan Time;
- CPU Usage during Scan;
- Browse Time;
- File Copy, Move and Delete;
- Network Throughput;
- File Compression and Decompression;
- File Write, Open and Close;
- Memory Usage during System Idle;
- Memory Usage during Scan;
- Memory Usage during Activity;
- CPU Usage during System Idle;
- CPU Usage during Activity;
- Installation Time;
- Installation Size; and
- Boot Time.

SCORE AND RANK

We assigned every product a score depending on its ranking in each metric. Each product scored points based on its rank out of the number of products tested in each category. The following table shows how rank in a metric relates to the score attained in a category with five (5) products:

Test Rank   Points Scored
1           5
2           4
3           3
4           2
5           1

We added the scores attained from each metric for each product to obtain the overall score and rank. For a hypothetical product which achieves first rank in every metric, the highest possible score attainable in testing is 85. The following chart shows the overall score attained by each product from our testing, in order of rank.

Overall scores, in rank order: 65, 62, 54, 42, 36
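The scoring arithmetic can be expressed as a short Python sketch (our illustration, not a PassMark tool; variable names are hypothetical):

    # Rank-to-points scoring: with 5 products, rank 1 earns 5 points,
    # rank 2 earns 4, and so on down to 1 point for rank 5.
    N_PRODUCTS = 5
    N_METRICS = 17

    def points_for_rank(rank, n_products=N_PRODUCTS):
        """Convert a 1-based rank in a single metric into points scored."""
        return n_products - rank + 1

    # One rank per metric for a single product; a product ranked first
    # in every metric attains the maximum score.
    ranks_per_metric = [1] * N_METRICS
    overall_score = sum(points_for_rank(r) for r in ranks_per_metric)
    print(overall_score)  # 17 metrics x 5 points = 85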

PRODUCTS AND VERSIONS

We tested the following Enterprise Endpoint Security products:

Manufacturer       Product Name   Date Tested   Product Version
Kaspersky Lab                     Oct 2016      10.2.5.3201 (mr3)
McAfee, Inc                       Oct 2016      10.2.0.620
Symantec Corp                     Oct 2016      14.0.1899.0000
Trend Micro, Inc                  Oct 2016      11.0.6054
Microsoft Corp                    Oct 2016      4.3.220.0

PERFORMANCE METRICS SUMMARY

We have selected a set of objective metrics which provide a comprehensive and realistic indication of the areas in which endpoint protection products may impact system performance for end users. Our metrics test the impact of the software on common tasks that end users would perform on a daily basis.

All of PassMark Software's test methods can be replicated by third parties using the same environment to obtain similar benchmark results. Detailed descriptions of the methodologies used in our tests are available in Appendix 2 - Methodology Description of this report.

Word Document Launch and Open Time
This metric measures how much security software impacts on the responsiveness and performance of the endpoint system. Microsoft Word was chosen for this test because office software is commonly found on business computers. To test a product's performance in this metric, we measured the amount of time taken to launch a large, mixed-media document from Microsoft Word. To allow for caching effects by the operating system, both the initial launch time and subsequent launch times were measured. Our final result is an average of these two measurements.

Edge Launch Time
Similar to the Word Document Launch and Open Time metric, this metric is one of many methods to objectively measure how much a product impacts on the responsiveness of the system. This metric measures the amount of time it takes to launch the user interface of Microsoft Edge. To allow for caching effects by the operating system, both the initial launch time and subsequent launch times were measured. Our final result is an average of these two measurements.

On-Demand Scan Time
All endpoint protection solutions have functionality designed to detect viruses and various other forms of malware by scanning files on the system. This metric measured the amount of time required to scan a set of clean files. Our sample file set comprised a total file size of 5.42 GB and was made up of files that would typically be found on end-user machines, such as media files, system files and Microsoft Office documents.

CPU Usage during Scan
The amount of load on the CPU while security software conducts a malware scan may prevent the reasonable use of the endpoint machine until the scan has completed. This metric measured the percentage of CPU used by endpoint protection software when performing a scan.

Browse Time
It is common behaviour for security products to scan data for malware as it is downloaded from the internet or intranet. This behaviour may negatively impact browsing speed as products scan web content for malware. This metric measures the time taken for a set of popular internet sites to consecutively load from a local server in a user's browser window.

File Copy, Move and Delete
This metric measures the amount of time taken to move, copy and delete a sample set of files. The sample file set contains several types of file formats that a Windows user would encounter in daily use. These formats include documents (e.g. Microsoft Office documents, Adobe PDFs, Zip files), media formats (e.g. images, movies and music) and system files (e.g. executables, libraries).

Network Throughput
This metric measures the amount of time taken to download a variety of files from a local server using the HyperText Transfer Protocol (HTTP), which is the main protocol used on the web for browsing, linking and data transfer. Files used in this test include file formats that users would typically download from the web, such as images, archives, music files and movie files.

File Compression and Decompression
This metric measures the amount of time taken to compress and decompress different types of files. File formats used in this test included documents, movies and images.

File Write, Open and Close
This benchmark was derived from Oli Warner's File I/O test at http://www.thepcspy.com (please see Reference #1: What Really Slows Windows Down). This metric measures the amount of time taken to write a file, then open and close that file.

Memory Usage during System Idle
The amount of memory used while the machine is idle provides a good indication of the amount of system resources being consumed by the endpoint protection software on a permanent basis. This metric measures the amount of memory (RAM) used by the product while the machine and endpoint protection software are in an idle state. The total memory usage was calculated by identifying all endpoint protection software processes and the amount of memory used by each process.

Memory Usage during Scan
This metric measures the amount of memory (RAM) used by the product during an antivirus scan. The total memory usage was calculated by identifying all endpoint protection software processes and the amount of memory used by each process during an antivirus scan.

Memory Usage during Activity
This metric measures the amount of memory (RAM) used by the product during user activity on the system. The total memory usage was calculated by identifying all endpoint protection software processes and the amount of memory used by each process during the period of user activity.

CPU Usage during System Idle
This metric measures the average amount of load placed on the CPU during system idle with the security software installed.

CPU Usage during Activity
This metric measures the average amount of load placed on the CPU during user activity with the security software installed.

Installation Time
The speed and ease of the installation process strongly influences the user's first impression of the security software. This test measures the minimum installation time required by the security software to be fully functional and ready for use by the end user. Lower installation times represent security products which are quicker for a user to install.

Installation Size
In offering new features and functionality to users, security software products tend to increase in size with each new release. Although new technologies push the size limits of hard drives each year, the growing disk space requirements of common applications and the increasing popularity of large media files (such as movies, photos and music) ensure that a product's installation size remains of interest to users. This metric aims to measure a product's total installation size, defined as the total disk space consumed by all new files added during a product's installation.

Boot Time
This metric measures the amount of time taken for the machine to boot into the operating system. Security software is generally launched at Windows startup, adding an additional amount of time and delaying the startup of the operating system. Shorter boot times indicate that the application has had less impact on the normal operation of the machine.

ENTERPRISE ENDPOINT SECURITY PRODUCTS RESULTS

In the following charts, we have highlighted the results we obtained for in yellow. The average has also been highlighted in blue for ease of comparison.

Benchmark 1 - Word Document Launch and Open Time (milliseconds)
The following chart compares the average time taken to launch Microsoft Word and open a 10MB document. Products with lower launch times are considered better performing products in this category.
Results (ms): 334, 335, 385, 496, 699, 729

Benchmark 2 - Edge Launch Time (milliseconds)
The following chart compares the average time taken for Microsoft Edge to successively load. Products with lower load times are considered better performing products in this category.
Results (ms): 476, 535, 648, 679, 719, 1256

Benchmark 3 - On-Demand Scan Time (seconds)
The following chart compares the average time taken to scan a set of media files, system files and Microsoft Office documents that totaled 5.42 GB. Our final result is calculated as an average of five scans, with each scan having equal weighting. Products with lower scan times are considered better performing products in this category.
Results (s): 11, 25, 49, 57, 60, 142

Benchmark 4 - CPU Usage during Scan (percent)
The following chart compares the average CPU usage during a scan of a set of media files, system files and Microsoft Office documents that totaled 5.42 GB. Products with lower CPU usage are considered better performing products in this category.
Results (%): 26, 31, 32, 45, 48, 90

Benchmark 5 - Browse Time (seconds)
The following chart compares the average time taken for Microsoft Edge to successively load a set of popular websites through the local area network from a local server machine. Products with lower browse times are considered better performing products in this category.
Results (s): 14.7, 15.2, 16.0, 18.1, 20.6, 24.0

Benchmark 6 - File Copy, Move and Delete (seconds)
The following chart compares the average time taken to copy, move and delete several sets of sample files for each product tested. Products with lower times are considered better performing products in this category.
Results (s): 3.3, 3.5, 4.0, 6.6, 7.1, 14.9

Benchmark 7 - Network Throughput (seconds)
The following chart compares the average time to download a sample set of common file types for each product tested. Products with lower times are considered better performing products in this category.
Results (s): 5.3, 6.0, 6.4, 7.4, 9.3, 9.9

Benchmark 8 - File Compression and Decompression (seconds)
The following chart compares the average time it takes for sample files to be compressed and decompressed for each product tested. Products with lower times are considered better performing products in this category.
Results (s): 35.5, 39.5, 40.3, 40.4, 40.8, 45.8

Benchmark 9 - File Write, Open and Close (seconds)
The following chart compares the average time it takes for a file to be written to the hard drive then opened and closed 180,000 times, for each endpoint security product tested. Products with lower times are considered better performing products in this category.
Results (s): 10.9, 18.9, 30.9, 530.9, 581.5, 2315.8

Benchmark 10 - Memory Usage during System Idle (megabytes)
The following chart compares the average amount of RAM in use by each product during a period of system idle. This average is taken from a sample of ten memory snapshots taken roughly 60 seconds apart after reboot. Products that use less memory during idle are considered better performing products in this category.
Results (MB): 112.5, 113.0, 150.2, 155.2, 156.7, 243.5

Benchmark 11 - Memory Usage during Scan (megabytes)
The following chart compares the average amount of RAM in use by each product during an antivirus scan. This average is taken from a sample of ten memory snapshots taken at five-second intervals during a scan of sample files which have not been previously scanned by the software. Products that use less memory during a scan are considered better performing products in this category.
Results (MB): 129.9, 163.3, 206.6, 304.0, 487.7, 532.8

Benchmark 12 - Memory Usage during Activity (megabytes)
The following chart compares the average amount of RAM in use by each product during a period of user activity on the system. This average is taken from a sample of fifteen memory snapshots taken at 60-second intervals during a period in which various applications are open and being used. Products that use less memory during user activity are considered better performing products in this category.
Results (MB): 96, 115, 144, 171, 174, 324

Benchmark 13 - CPU Usage during System Idle (percent)
The following chart compares the average CPU usage during system idle. Products with lower CPU usage are considered better performing products in this category.
Results (%): 0.81, 0.84, 1.06, 1.13, 1.15, 1.35

Benchmark 14 - CPU Usage during Activity (percent)
The following chart compares the average CPU usage during user activity. Products with lower CPU usage are considered better performing products in this category.
Results (%): 1.3, 1.3, 1.5, 1.6, 1.6, 2.1

Benchmark 15 - Installation Time (seconds)
The following chart compares the time it takes for the endpoint software to be installed on the client machine so that it is fully functional and ready for use by the end user. Products with lower installation times are considered better performing products in this category.
Results (s): 77, 162, 264, 316, 499

Benchmark 16 - Installation Size (MB)
The following chart compares the total size of files added during the installation of the endpoint software. Products with lower installation sizes are considered better performing products in this category.
Results (MB): 565.7, 627.3, 675.9, 691.4, 819.2

Benchmark 17 - Boot Time (seconds)
The following chart compares the average time taken for the system to boot for each product tested. Products with lower boot times are considered better performing products in this category.
Results (s): 7.3, 7.4, 7.5, 7.7, 7.8, 8.5

DISCLAIMER AND DISCLOSURE

This report only covers versions of products that were available at the time of testing. The tested versions are as noted in the Products and Versions section of this report. The products we have tested are not an exhaustive list of all products available in the competitive enterprise security market.

While every effort has been made to ensure that the information presented in this report is accurate, PassMark Software Pty Ltd assumes no responsibility for errors, omissions, or out-of-date information, and shall not be liable in any manner whatsoever for direct, indirect, incidental, consequential, or punitive damages resulting from the availability of, use of, access of, or inability to use this information.

Symantec Corporation funded the production of this report, selected the test metrics, and supplied some of the test scripts used for the tests. All trademarks are the property of their respective owners.

CONTACT DETAILS

PassMark Software Pty Ltd
Level 5, 63 Foveaux St.
Surry Hills, 2010
Sydney, Australia
Phone: +61 (2) 9690 0444
Fax: +61 (2) 9690 0445
Web: www.passmark.com

APPENDIX 1 - TEST ENVIRONMENT

Endpoint Machine
For our testing, PassMark Software used a test environment running Windows 10 Home (64-bit) with the following hardware specifications:
Model: Lenovo H50W-50
CPU: Intel Core i5-4460 @ 3.20GHz
Video Card: NVIDIA GeForce GT 705
RAM: 8GB DDR3 RAM
SSD (Main Boot Drive): Intel SSD 730 Series 240GB
Network: Gigabit (1Gb/s)
O/S: Windows 10 Home (Build 10240)

Web and File Server
The Web and File server was not benchmarked directly, but served the web pages and files to the endpoint machine during performance testing.
Model: Generic hardware
CPU: Intel Xeon E3-1220v2
Motherboard: Intel S1200BTL Server
RAM: Kingston 8GB (2 x 4GB) ECC RAM, 1333MHz
SSD: OCZ 128GB 2.5" Solid State Disk
Network: Gigabit (1Gb/s)

Management Console Server
This server was not benchmarked directly, but was used as the host for the Virtual Machines on which the enterprise components of the software were installed. After installation, the Management Console server was used to deploy endpoint software to clients and to schedule scans.
Model: Generic hardware
CPU: AMD Phenom II x4 940 (Quad Core)
Video Card: ASUS GeForce 9400GT
Motherboard: Gigabyte GA-MA790XT-UD4P
RAM: 16GB PC3-10600 1333MHz DDR3 Memory
HDD: Western Digital Caviar Green WD10EADS 1TB Serial ATA-II
Network: Gigabit (1Gb/s)

APPENDIX 2 - METHODOLOGY DESCRIPTION

Word Document Launch and Open Time
The average launch time of the Microsoft Word interface was measured using AppTimer (v1.0.1008). This includes the time to launch Microsoft Word 2016 and open a 10MB document, and is practically identical to a user-interface launch time test. For each product tested, we obtained a total of fifteen samples from five sets of three Word launches, with a reboot before each set to clear caching effects by the operating system. When compiling the results, the first launch of each set was separated out so that there was one set of values for the initial launch after reboot and another for subsequent launches. We averaged the subsequent launch times to obtain an average subsequent launch time. Our final result for this test is an average of the subsequent launch average and the initial launch time. AppTimer is publicly available from the PassMark website.

Edge Launch Time
The average launch time of the Edge interface was taken using AppTimer. For each product tested, we obtained a total of fifteen samples from five sets of three Edge launches, with a reboot before each set to clear caching effects by the operating system. When compiling the results, the first launch of each set was separated out so that there was one set of values for the initial launch after reboot and another for subsequent launches. For this test, we used Microsoft Edge (Version 38.143930.0) as our test browser. We averaged the subsequent launch times to obtain an average subsequent launch time. Our final result for this test is an average of the subsequent launch average and the initial launch time.

On-Demand Scan Time
On-Demand Scan Time measures the amount of time it took for each endpoint product to scan a set of sample files from the right-click context menu in Windows Explorer. The sample used was identical in all cases and contained a mixture of system files and Office files. In total there were 8,502 files with a combined size of 5.42 GB. Most of these files come from the Windows system folders. As file types can influence scanning speed, the breakdown of the main file types, file counts and total sizes of the files in the sample set is given here:

Extension   Files   Total size
.avi        247     1024MB
.dll        773     25MB
.exe        730     198MB
.gif        681     63MB
.doc        160     60MB
.docx       267     81MB
.jpg        2904    318MB
.mp3        333     2048MB
.png        451     27MB
.ppt        97      148MB
.sys        501     80MB
.wav        430     260MB
.wma        585     925MB
.xls        329     126MB
.zip        14      177MB

Where possible this scan was run without launching the product's user interface, by right-clicking the test folder and choosing the "Scan Now" option, though some products required entering the UI to scan a folder. To record the scan time, we used the product's built-in scan timer or reporting system. Where this was not possible, scan times were taken manually with a stopwatch.
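To make the launch-time averaging concrete, here is a minimal Python sketch of the calculation (the sample timings are invented for illustration; AppTimer itself produces the raw numbers, and we assume the five initial launches are themselves averaged):

    # Five sets of three launches; the first launch in each set follows a reboot.
    sets = [
        [710, 350, 340],  # ms: [initial launch after reboot, subsequent, subsequent]
        [720, 345, 338],
        [705, 352, 341],
        [715, 348, 336],
        [708, 351, 339],
    ]

    initial = [s[0] for s in sets]                 # launches immediately after reboot
    subsequent = [t for s in sets for t in s[1:]]  # cached (subsequent) launches

    avg_initial = sum(initial) / len(initial)
    avg_subsequent = sum(subsequent) / len(subsequent)  # average subsequent launch time

    # Final result: average of the initial launch time and the subsequent launch average.
    final_result = (avg_initial + avg_subsequent) / 2
    print(f"{final_result:.1f} ms")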

For each product, five samples were taken, with the machine rebooted before each sample to clear any caching effects by the operating system. Our final result was calculated as an average of the five scans, with each scan having equal weighting.

CPU Usage during Scan
CPUAvg is a command-line tool which samples the amount of CPU load approximately two times per second. From this, CPUAvg calculates and displays the average CPU load for the interval of time for which it has been active. For this metric, CPUAvg was used to measure the average CPU load (as a percentage) on the system while the Scan Time test was being conducted. The final result was calculated as an average of five sets of thirty CPU load samples.

Browse Time
We used JavaScript to load a list of 103 popular websites consecutively from a local server. On each page in the sample data, a few lines of JavaScript are added to the website's HTML to load the next website in the chain. Once the first website has loaded completely, the script is executed to load the second website in the chain; once this has finished loading, the script is executed to load the third website, and the process is repeated until the final website in the chain has loaded. The start time and end time of this process are recorded, and the difference is calculated in seconds to give the final result. For this test, we used the default Windows browser, Microsoft Edge (Version 38.143930.0). The set of websites used in this test includes the front pages of high-traffic shopping, social, news, finance and reference websites. The Browse Time test is executed five times and our final result is an average of these five samples. The local server is restarted between different products and one initial test run is conducted.

File Copy, Move and Delete
This test measures the amount of time required for the system to copy, move and delete samples of files in various file formats. The sample was made up of 809 files totalling 683,410,115 bytes and can be categorised as documents [28% of total], media files [60% of total] and PE files (i.e. system files) [12% of total]. This test was conducted five times to obtain the average time to copy, move and delete the sample files, with the test machine rebooted between each sample to remove potential caching effects.

Network Throughput
This benchmark measured how much time was required to download a sample set of binary files of various sizes and types over a 100MB/s network connection. The files were hosted on a server machine running Windows Server 2012 and IIS 7. CommandTimer.exe was used in conjunction with GNU Wget (version 1.10.1) to time and conduct the download test.
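CPUAvg is an internal PassMark tool, but the sampling behaviour described above can be approximated in Python using the third-party psutil package (our assumption for illustration; this is not the tool used in testing):

    # Sample system-wide CPU load roughly twice per second and report the
    # average over the sampling period, approximating the CPUAvg behaviour
    # described above.
    import psutil

    def average_cpu_load(duration_s=60.0, interval_s=0.5):
        samples = []
        elapsed = 0.0
        while elapsed < duration_s:
            # cpu_percent(interval=...) blocks for the interval and returns the
            # average CPU load (percent) across all cores over that window.
            samples.append(psutil.cpu_percent(interval=interval_s))
            elapsed += interval_s
        return sum(samples) / len(samples)

    if __name__ == "__main__":
        print(f"Average CPU load: {average_cpu_load(duration_s=30.0):.2f}%")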

The complete sample set of files was made up of 553,638,694 bytes over 484 files and two file type categories: media files [74% of total] and documents [26% of total]. This test was conducted five times to obtain the average time to download this sample of files, with the test machine rebooted between each sample to remove potential caching effects.

File Compression and Decompression
This test measured the amount of time required to compress and decompress a sample set of files. For this test, we used a subset of the media and document files used in the File Copy, Move and Delete benchmark. CommandTimer.exe recorded the amount of time required for 7zip.exe to compress the files into a *.zip and subsequently decompress the created *.zip file. This subset comprised 404 files totalling 277,346,661 bytes. The breakdown of the file types, file counts and total sizes of the files in the sample set is shown in the following table:

File format   Category    Number   Size (bytes)
DOC           Documents   8        30,450,176
DOCX          Documents   4        13,522,409
PPT           Documents   3        5,769,216
PPTX          Documents   3        4,146,421
XLS           Documents   4        2,660,352
XLSX          Documents   4        1,426,054
JPG           Media       351      31,375,259
GIF           Media       6        148,182
MOV           Media       7        57,360,371
RM            Media       1        5,658,646
AVI           Media       8        78,703,408
WMV           Media       5        46,126,167
Total                     404      277,346,661

This test was conducted five times to obtain the average file compression and decompression speed, with the test machine rebooted between each sample to remove potential caching effects.

File Write, Open and Close
This benchmark was derived from Oli Warner's File I/O test at http://www.thepcspy.com (please see Reference #1: What Really Slows Windows Down). For this test, we developed OpenClose.exe, an application that loops writing a small file to disk, then opening and closing that file. CommandTimer.exe was used to time how long the process took to complete 180,000 cycles. This test was conducted five times to obtain the average file write, open and close speed, with the test machine rebooted between each sample to remove potential caching effects.
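OpenClose.exe and CommandTimer.exe are PassMark utilities; the loop they implement and time can be approximated in Python as follows (file name, file size and timing approach are our assumptions):

    # Write a small file, then open and close it, for 180,000 cycles, timing
    # the whole run; a sketch of the OpenClose.exe test described above.
    import os
    import time

    CYCLES = 180_000
    PATH = "openclose_test.tmp"  # hypothetical scratch file name

    start = time.perf_counter()
    for _ in range(CYCLES):
        with open(PATH, "w") as f:   # write a small file to disk
            f.write("x" * 64)
        with open(PATH, "r") as f:   # open, then close, that file
            pass
    elapsed = time.perf_counter() - start
    print(f"{CYCLES} write/open/close cycles took {elapsed:.1f} s")
    os.remove(PATH)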

Memory Usage during System Idle
The PerfLog++ utility was used to record process memory usage on the system at boot, and then every minute for fifteen minutes after. This was done once per product and resulted in a total of 15 samples; the first sample, taken at boot, is discarded. The PerfLog++ utility records memory usage of all processes, not just those of the anti-malware product, so an anti-malware product's processes needed to be isolated from all other running system processes. To isolate the relevant processes, we used Process Explorer, a Microsoft Windows Sysinternals tool which lists the processes currently running on the system and the DLLs they have loaded; it was run immediately upon completion of memory usage logging by PerfLog++. Our final result is calculated as the total sum of Private Bytes used by each process belonging to the endpoint security software.

Memory Usage during Scan
The PerfLog++ utility was used to record memory usage on the system while a malware scan is in progress. Please refer to the Memory Usage during System Idle metric above for a description of the PerfLog++ utility and an explanation of the method by which memory usage is calculated. As some products cache scan locations, we take reasonable precautions to ensure that the antivirus software does not scan the C:\ drive at any point before conducting this test. A manual scan of the C:\ drive is initiated at the same time as the PerfLog++ utility, enabling PerfLog++ to record memory usage for 60 seconds at five-second intervals. Our final result is calculated as the total sum of Private Bytes used by each process belonging to the endpoint security software during the malware scan.

Memory Usage during Activity
The PerfLog++ utility was used to record memory usage on the system during a period of typical user activity. Please refer to the Memory Usage during System Idle metric above for a description of the PerfLog++ utility and an explanation of the method by which memory usage is calculated. To emulate user activity, the following environment was created and memory snapshots were recorded at 60-second intervals over a 15-minute period:

- 10 tabs were left open in a browser;
- Microsoft Outlook was used to send 3 emails and receive 3 emails (1 sent and 1 received every 5 minutes);
- 2 Word documents, 2 Excel documents and 2 PowerPoint documents were left open; and
- an IM chat window (Skype) was open.

Our final result is calculated as the total sum of Private Bytes used by each process belonging to the endpoint security software during the period of activity.
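PerfLog++ is an internal PassMark utility, but the final summation it supports, totalling Private Bytes across the product's processes, can be sketched with psutil on Windows (the process names below are placeholders, not those of any tested product):

    # Total the Private Bytes of every process belonging to the endpoint
    # product, approximating the PerfLog++/Process Explorer procedure above.
    # On Windows, psutil's memory_info() exposes a 'private' field that
    # corresponds to Private Bytes.
    import psutil

    PRODUCT_PROCESSES = {"epp_service.exe", "epp_ui.exe"}  # hypothetical names

    def product_private_bytes(process_names):
        total = 0
        for proc in psutil.process_iter(["name"]):
            try:
                if proc.info["name"] in process_names:
                    total += proc.memory_info().private  # Private Bytes (Windows only)
            except (psutil.NoSuchProcess, psutil.AccessDenied):
                continue  # process exited or is protected; skip it
        return total

    print(f"{product_private_bytes(PRODUCT_PROCESSES) / 2**20:.1f} MB")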

CPU Usage during System Idle
CPUAvg is a command-line tool which samples the amount of CPU load two times per second. From this, CPUAvg calculates and displays the average CPU load for the interval of time for which it has been active. For this metric, CPUAvg was used to measure the average CPU load (as a percentage) during a five-minute period of system idle. This test is conducted after restarting the endpoint machine and after five minutes of machine idle.

CPU Usage during Activity
For this metric, CPUAvg was used to measure the average CPU load (as a percentage) during a period of typical user activity. To emulate user activity, the following environment was created and the CPU load was measured over a 15-minute period:

- 10 tabs were left open in a browser;
- Microsoft Outlook was used to send 3 emails and receive 3 emails (1 sent and 1 received every 5 minutes);
- 2 Word documents, 2 Excel documents and 2 PowerPoint documents were left open; and
- an IM chat window (Skype) was open.

Installation Time
This test measures the time it takes to install the client software on an endpoint so that it is fully functional and ready for use by the end user. A stopwatch was used to manually time the installation in seconds, and the results were recorded in as much detail as possible. Where possible, all requests by products to run a pre-install or post-install scan were declined or skipped; where it was not possible to skip a scan, the scan time was included as part of the installation time. Where an optional component of the installation formed a reasonable part of the functionality of the software, it was also installed. This metric also includes the time taken by the product installer to download components required in the installation, which may include mandatory updates or the delivery of the application itself from a download manager. We have excluded product activation times due to network variability in contacting vendor servers and time taken in account creation.

Installation Size
Using PassMark's OSForensics, we created initial and post-installation disk signatures on the endpoint machine for each product. These disk signatures record the number of files and directories, and complete details of all files on the drive (including file name, file size and checksum), at the time the signature was taken. The initial disk signature was taken immediately prior to installation of the product. A subsequent disk signature was taken immediately following a system reboot after product installation on the endpoint machine; this includes any files added as a result of a manual update carried out on the endpoint immediately after installation. Using OSForensics, we compared the two signatures and calculated the total additional disk space consumed by files that were new or modified during product installation.

Boot Time
For this metric we used tools available in the Windows Performance Toolkit (part of the Microsoft Windows 10 ADK, obtainable from the Microsoft website). The Boot Performance (fast startup) test is run as an individual assessment via the Windows Assessment Console. The network connection is disabled and the login password is removed to avoid interruption to the test. The final result is taken as the total boot duration, excluding BIOS load time.
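OSForensics' disk signatures are proprietary, but the before-and-after comparison used for the Installation Size metric above can be illustrated with a simple walk-and-hash sketch in Python (paths, hash choice and in-memory hashing are our assumptions):

    # Snapshot file sizes and checksums before and after installation, then
    # total the bytes in files that are new or changed; an illustration of
    # the disk-signature comparison, not OSForensics itself.
    import hashlib
    import os

    def snapshot(root):
        sig = {}
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    with open(path, "rb") as f:
                        digest = hashlib.sha256(f.read()).hexdigest()
                    sig[path] = (os.path.getsize(path), digest)
                except OSError:
                    continue  # locked or in-use file; skip it
        return sig

    def installation_size_bytes(before, after):
        # Sum the sizes of files that are new or whose contents changed.
        return sum(size for path, (size, digest) in after.items()
                   if before.get(path, (None, None))[1] != digest)

    # Usage: before = snapshot("C:\\"); install the product and reboot;
    # after = snapshot("C:\\");
    # print(installation_size_bytes(before, after) / 2**20, "MB added")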