Webroot SecureAnywhere Cloud vs. Six
Traditional Security Products
(Dec 2011)

Security Performance Benchmark


Document:   Webroot SecureAnywhere Cloud vs. Six Traditional Security Products (Nov 2011)
Authors:    M. Baquiran, D. Wren
Company:    PassMark Software
Date:       29 November 2011
File:       Webroot_SecureAnywhere_vs_security_competitors_29Nov2011.docx



Table of Contents
TABLE OF CONTENTS
REVISION HISTORY
REFERENCES
EXECUTIVE SUMMARY
OVERALL SCORE
PRODUCTS AND VERSIONS
PERFORMANCE METRICS SUMMARY
TEST RESULTS
   BENCHMARK 1 – INITIAL SCAN TIME
   BENCHMARK 2 – INSTALLATION SIZE
   BENCHMARK 3 – INSTALLATION TIME
   BENCHMARK 4 – REGISTRY KEYS ADDED
   BENCHMARK 5 – BOOT TIME
   BENCHMARK 6 – USER INTERFACE LAUNCH TIME
   BENCHMARK 7 – MEMORY USAGE DURING SYSTEM IDLE
   BENCHMARK 8 – MEMORY USAGE DURING INITIAL SCAN
   BENCHMARK 9 – BROWSE TIME
   BENCHMARK 10 – FILE COPY, MOVE AND DELETE
   BENCHMARK 11 – NETWORK THROUGHPUT
   BENCHMARK 12 – FILE FORMAT CONVERSION
   BENCHMARK 13 – FILE COMPRESSION AND DECOMPRESSION
   BENCHMARK 14 – FILE WRITE, OPEN AND CLOSE
DISCLAIMER AND DISCLOSURE
CONTACT DETAILS
APPENDIX 1 – TEST ENVIRONMENT
APPENDIX 2 – METHODOLOGY DESCRIPTION







Revision History
Rev         Revision History                     Date

Edition 1   Initial version of this report.      09 September 2011




References
Ref #   Document                                 Author                  Date

1       What Really Slows Windows Down (URL)     O. Warner, The PC Spy   2001-2009







Executive Summary
PassMark Software® conducted objective performance testing on eight (8) security software products on
Windows 7 Ultimate Edition (64-bit) in November 2011. This report presents our results and findings from the
performance benchmark testing conducted on these consumer security products.

The aim of this benchmark was to compare the performance impact of Webroot’s SecureAnywhere Cloud
Security products with that of six other traditional security products.

Testing was performed on all products using fourteen (14) performance metrics. These performance metrics are
as follows:

•	Initial Scan Time;
•	Installation Size;
•	Installation Time;
•	Registry Keys Added;
•	Boot Time;
•	User Interface Launch Time;
•	Memory Usage during System Idle;
•	Memory Usage during Initial Scan;
•	Browse Time;
•	File Copy, Move and Delete;
•	Network Throughput (previously named “Binary Download Test”);
•	File Format Conversion;
•	File Compression and Decompression; and
•	File Write, Open and Close.







Overall Score
PassMark Software assigned every product a score depending on its ranking in each metric compared to other
products in the same category. In the following table, the highest possible score attainable is 112: with eight
products and fourteen metrics, first place in a metric is worth eight points, so a product that attained first place
in all 14 metrics would score 14 × 8 = 112. Security products have been ranked by their overall scores:

Product Name                               Overall Score
Webroot SecureAnywhere Complete 2012           102
Webroot SecureAnywhere Essentials 2012         101
Norton 360 v5                                   76
ESET Smart Security 5                           60
Trend Maximum Security 2012                     55
McAfee Total Protection 2012                    51
AVG Premium Security 2012                       48
Kaspersky PURE 9                                46
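
As an illustration of how these overall scores are derived, the sketch below assumes the point allocation implied
by the 112-point maximum (first place in a metric earning eight points, one per product tested); the exact
weighting used in the report is our inference, not published PassMark code.

    # Illustrative sketch of the ranking-based scoring scheme (assumed):
    # first place in a metric earns 8 points, second place 7, and so on
    # down to 1 point for last place among the eight products tested.
    def overall_score(ranks, num_products=8):
        """ranks: a product's rank (1 = best) in each of the 14 metrics."""
        return sum(num_products - rank + 1 for rank in ranks)

    # A product placing first in all 14 metrics attains the maximum score:
    assert overall_score([1] * 14) == 112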







Products and Versions
For each security product, we tested the most current version available at the time of testing.




Manufacturer               Product Name                                  Release Year   Product Version   Date Tested

Webroot Software, Inc.     Webroot SecureAnywhere Complete 2012          2011           8.0.0.66          Nov 2011
Webroot Software, Inc.     Webroot SecureAnywhere Essentials 2012        2011           8.0.0.66          Nov 2011
Kaspersky Lab              Kaspersky PURE                                2011           9.1.0.124         Nov 2011
Trend Micro Inc.           Trend Micro Titanium Maximum Security 2012    2011           5.0.1280          Nov 2011
AVG Technologies           AVG Premium Security 2012                     2011           2012.0.1873       Nov 2011
Symantec Corp              Norton 360                                    2011           5.0.0.125         Nov 2011
ESET, spol. s r.o.         ESET Smart Security 5                         2010           5.0.94.0          Nov 2011
McAfee, Inc.               McAfee Total Protection                       2011           11.0.623          Nov 2011







Performance Metrics Summary
We have selected a set of objective metrics which provide a comprehensive and realistic indication of the areas
in which a security product may impact system performance for end users. Our metrics test the impact of the
security software on common tasks that end users would perform on a daily basis.

All of PassMark Software’s test methods can be replicated by third parties using the same environment to obtain
similar benchmark results. Detailed descriptions of the methodologies used in our tests are available in
“Appendix 2 – Methodology Description” of this report.

Benchmark 1 – Initial Scan Time
All security solutions have functionality designed to detect viruses and various other forms of malware by
scanning files on the system. This metric measured the amount of time required to scan a set of clean files. Our
sample file set comprised a total file size of 982 MB and was made up of files that would typically be found on
end-user machines, such as media files, system files and Microsoft Office documents.

Benchmark 2 – Installation Size
In offering new features and functionality to users, security software products tend to increase in size with each
new release. Although new technologies push the size limits of hard drives each year, the growing disk space
requirements of common applications and the increasing popularity of large media files (such as movies, photos
and music) ensure that a product's installation size will remain of interest to home users.

This metric aims to measure a product’s total installation size. This metric is defined as the total disk space
consumed by all new files added during a product's installation.

Benchmark 3 – Installation Time
The speed and ease of the installation process will strongly influence the user’s first impression of the security
software. This test measures the minimum installation time required by the security software to be fully
functional and ready for use by the end user. Lower installation times represent security products which are
quicker for a user to install.

Benchmark 4 – Registry Keys Added
A large registry increases a machine’s use of resources. This may negatively impact system performance,
especially on much older machines. This test measures the number of keys and values added to the registry
after rebooting the test machine following a successful product installation. Lower numbers mean that a product
has added fewer keys during installation and has had less impact on the registry.

Benchmark 5 – Boot Time
This metric measures the amount of time taken for the machine to boot into the operating system. Security
software is generally launched at Windows startup, adding an additional amount of time and delaying the
startup of the operating system. Shorter boot times indicate that the application has had less impact on the
normal operation of the machine.






Benchmark 6 – User Interface Launch Time
This metric provides an objective indication as to how responsive a security product appears to the user, by
measuring the amount of time it takes for the user interface of the security software to launch from Windows.
To allow for caching effects by the operating system, both the initial launch time and the subsequent launch
times were measured. Our final result is an average of these two measurements.

Benchmark 7 – Memory Usage during System Idle
This metric measures the amount of memory (RAM) used by the product while the machine and security
software are in an idle state. The total memory usage was calculated by identifying all security software
processes and the amount of memory used by each process.

The amount of memory used while the machine is idle provides a good indication of the amount of system
resources being consumed by the security software on a permanent basis. Better performing products occupy
less memory while the machine is idle.

Benchmark 8 – Memory Usage during Initial Scan
This metric measures the amount of memory (RAM) used by the product during a security scan. The total
memory usage was calculated by identifying all security software processes and the amount of memory used by
each process during the scan.

Benchmark 9 – Browse Time
It is common behavior for security products to scan data for malware as it is downloaded from the internet or
intranet. This behavior may negatively impact browsing speed as products scan web content for malware. This
metric measures the time taken for a set of popular internet sites to load consecutively from a local server in a
user’s browser window.

Benchmark 10 – File Copy, Move and Delete
This metric measures the amount of time taken to move, copy and delete a sample set of files. The sample file
set contains several types of file formats that a Windows user would encounter in daily use. These formats
include documents (e.g. Microsoft Office documents, Adobe PDF, Zip files, etc), media formats (e.g. images,
movies and music) and system files (e.g. executables, libraries, etc).

Benchmark 11 – Network Throughput
This metric measures the amount of time taken to download a variety of files from a local server using the
HyperText Transfer Protocol (HTTP), which is the main protocol used on the web for browsing, linking and data
transfer. Files used in this test include file formats that users would typically download from the web, such as
images, archives, music files and movie files.

Benchmark 12 – File Format Conversion
This test measures the amount of time taken to convert an MP3 file to a WAV and subsequently, convert the
same MP3 file to a WMA format.






Benchmark 13 – File Compression and Decompression
This metric measures the amount of time taken to compress and decompress different types of files. File
formats used in this test included documents, movies and images.

Benchmark 14 – File Write, Open and Close
This benchmark was derived from Oli Warner’s File I/O test at http://www.thepcspy.com (please see Reference
#1: What Really Slows Windows Down). This metric measures the amount of time taken to write a file, then
open and close that file.







Test Results
In the following tables, results for each metric are listed from best to worst. Results for Webroot
SecureAnywhere Complete 2012 and Webroot SecureAnywhere Essentials 2012 appear alongside the average of
all products tested for ease of comparison.


Benchmark 1 – Initial Scan Time
The following table compares the average time taken to run an initial scan on a set of 6,159 files (totaling 982
MB) for each security product tested.

Product                                    Initial Scan Time (s)
Webroot SecureAnywhere Complete 2012        20.00
Webroot SecureAnywhere Essentials 2012      21.00
McAfee Total Protection 2012                38.40
Trend Maximum Security 2012                 39.90
Norton 360 v5                               45.00
Average                                     47.66
AVG Premium Security 2012                   55.00
ESET Smart Security 5                       68.00
Kaspersky PURE 9                            94.00



Benchmark 2 – Installation Size
The following table compares the total size of files added during the installation of security products. Products
with lower installation sizes are considered better performing products in this category.

Product                                    Installation Size (MB)
Webroot SecureAnywhere Essentials 2012        4.18
Webroot SecureAnywhere Complete 2012         61.02
Norton 360 v5                               273.06
ESET Smart Security 5                       335.50
AVG Premium Security 2012                   428.43
Trend Maximum Security 2012                 430.43
Average                                     514.67
McAfee Total Protection 2012                596.38
Kaspersky PURE 9                           1988.34



Benchmark 3 – Installation Time
The following table compares the minimum installation time it takes for security products to be fully functional
and ready for use by the end user. Products with lower installation times are considered better performing
products in this category.

Product                                    Installation Time (s)
Webroot SecureAnywhere Complete 2012          5.00
Webroot SecureAnywhere Essentials 2012        6.00
Norton 360 v5                                48.20
Kaspersky PURE 9                             52.50
AVG Premium Security 2012                    52.50
Trend Maximum Security 2012                 145.80
Average                                     169.81
ESET Smart Security 5                       389.40
McAfee Total Protection 2012                659.10




Benchmark 4 – Registry Keys Added
The following table compares the number of registry keys created during product installation for each security
product tested. Products with lower key counts are considered better performing products in this category.

Product                                    Registry Keys Added
Webroot SecureAnywhere Essentials 2012        568
Webroot SecureAnywhere Complete 2012          690
ESET Smart Security 5                        3324
Norton 360 v5                                4472
Trend Maximum Security 2012                  4658
AVG Premium Security 2012                    6048
Average                                      6547.5
Kaspersky PURE 9                            12008
McAfee Total Protection 2012                20612






Benchmark 5 – Boot Time
The following table compares the average time taken for the system to boot (from a sample of five boots) for
each security product tested. Products with lower boot times are considered better performing products in this
category.

Product                                    Boot Time (s)
Webroot SecureAnywhere Complete 2012         26.03
Webroot SecureAnywhere Essentials 2012       26.06
Trend Maximum Security 2012                  28.78
McAfee Total Protection 2012                 32.23
ESET Smart Security 5                        32.60
Norton 360 v5                                35.23
Average                                      48.99
AVG Premium Security 2012                   103.96
Kaspersky PURE 9                            107.00



Benchmark 6 – User Interface Launch Time
The following table compares the average time taken to launch a product’s user interface. Products with lower
launch times are considered better performing products in this category.

Product                                    UI Launch Time (ms)
McAfee Total Protection 2012                 46.63
ESET Smart Security 5                        66.97
Webroot SecureAnywhere Complete 2012        286.90
Webroot SecureAnywhere Essentials 2012      290.77
Trend Maximum Security 2012                 452.24
Average                                     581.91
Norton 360 v5                               949.88
Kaspersky PURE 9                           1247.01
AVG Premium Security 2012                  1314.91





Benchmark 7 – Memory Usage during System Idle
The following table compares the average amount of RAM in use by a security product during a period of
system idle. This average is taken from a sample of ten memory snapshots taken roughly 60 seconds apart
after reboot. Products with lower idle RAM usage are considered better performing products in this category.

Product                                    Memory Usage (MB)
Webroot SecureAnywhere Complete 2012          2.64
Webroot SecureAnywhere Essentials 2012        3.12
Norton 360 v5                                23.04
Kaspersky PURE 9                             23.74
Average                                      48.00
AVG Premium Security 2012                    63.22
Trend Maximum Security 2012                  84.62
McAfee Total Protection 2012                 90.12
ESET Smart Security 5                        93.49



Benchmark 8 – Memory Usage during Initial Scan
The following table compares the average amount of RAM in use by a security product during an initial scan of
the main drive. This average is taken from a sample of ten memory snapshots taken at five-second intervals
during a scan of sample files which have not been previously scanned by the software. Products that use less
memory during a scan are considered better performing products in this category.

Product                                    Memory Usage (MB)
Webroot SecureAnywhere Complete 2012          9.14
Webroot SecureAnywhere Essentials 2012       21.57
ESET Smart Security 5                       100.43
Average                                     121.01
Norton 360 v5                               122.25
Kaspersky PURE 9                            130.78
Trend Maximum Security 2012                 162.65
McAfee Total Protection 2012                184.61
AVG Premium Security 2012                   236.66




Benchmark 9 – Browse Time

The following table compares the average time taken for Internet Explorer to successively load a set of popular
websites through the local area network from a local server machine. Products with lower browse times are
considered better performing products in this category.

Product                                    Browse Time (s)
Webroot SecureAnywhere Complete 2012         33.22
Webroot SecureAnywhere Essentials 2012       36.60
Norton 360 v5                                42.50
Trend Maximum Security 2012                  43.81
Average                                      49.92
AVG Premium Security 2012                    54.91
ESET Smart Security 5                        56.18
Kaspersky PURE 9                             57.71
McAfee Total Protection 2012                 74.45



Benchmark 10 – File Copy, Move and Delete
The following table compares the average time taken to copy, move and delete several sets of sample files for
each security product tested. Products with lower times are considered better performing products in this
category.

Product                                    Time (s)
Webroot SecureAnywhere Essentials 2012       24.08
Webroot SecureAnywhere Complete 2012         24.11
Norton 360 v5                                24.38
Kaspersky PURE 9                             25.42
Trend Maximum Security 2012                  27.22
Average                                      27.41
ESET Smart Security 5                        28.27
McAfee Total Protection 2012                 30.27
AVG Premium Security 2012                    35.52




Benchmark 11 – Network Throughput
The following table compares the average time to download a sample set of common file types for each security
product tested. Products with lower times are considered better performing products in this category.

Product                                    Download Time (s)
Webroot SecureAnywhere Essentials 2012       10.23
Webroot SecureAnywhere Complete 2012         10.25
Trend Maximum Security 2012                  11.45
Norton 360 v5                                14.84
McAfee Total Protection 2012                 14.93
Average                                      15.22
ESET Smart Security 5                        15.29
Kaspersky PURE 9                             18.02
AVG Premium Security 2012                    26.78




Benchmark 12 – File Format Conversion
The following table compares the average time it takes for five sample files to be converted from one file format
to another (MP3 → WMA, MP3 → WAV) for each security product tested. Products with lower times are
considered better performing products in this category.

Product                                    Conversion Time (s)
ESET Smart Security 5                        75.69
McAfee Total Protection 2012                 75.78
Norton 360 v5                                75.90
AVG Premium Security 2012                    75.96
Webroot SecureAnywhere Complete 2012         76.18
Webroot SecureAnywhere Essentials 2012       76.25
Average                                      76.51
Trend Maximum Security 2012                  77.41
Kaspersky PURE 9                             78.91



Benchmark 13 – File Compression and Decompression

The following table compares the average time it takes for sample files to be compressed and decompressed for
each security product tested. Products with lower times are considered better performing products in this
category.

Product                                    Time (s)
Kaspersky PURE 9                             74.68
Webroot SecureAnywhere Essentials 2012       78.58
Webroot SecureAnywhere Complete 2012         78.64
Norton 360 v5                                79.95
Average                                      80.46
Trend Maximum Security 2012                  82.16
McAfee Total Protection 2012                 82.52
AVG Premium Security 2012                    83.18
ESET Smart Security 5                        83.94



Benchmark 14 – File Write, Open and Close
The following table compares the average time it takes for a file to be written to the hard drive then opened and
closed 180,000 times, for each security product tested. Products with lower times are considered better
performing products in this category.

Product                                    Time (s)
Webroot SecureAnywhere Complete 2012         15.80
Webroot SecureAnywhere Essentials 2012       15.98
AVG Premium Security 2012                    17.85
Kaspersky PURE 9                             28.68
Norton 360 v5                                29.39
Average                                      52.11
McAfee Total Protection 2012                 65.08
Trend Maximum Security 2012                  67.81
ESET Smart Security 5                       176.28




Disclaimer and Disclosure
This report only covers versions of products that were available at the time of testing. The tested versions are as
noted in the “Products and Versions” section of this report. The products we have tested are not an exhaustive
list of all products available in these very competitive product categories.

Disclaimer of Liability
While every effort has been made to ensure that the information presented in this report is accurate, PassMark
Software Pty Ltd assumes no responsibility for errors, omissions, or out-of-date information and shall not be
liable in any manner whatsoever for direct, indirect, incidental, consequential, or punitive damages resulting
from the availability of, use of, access of, or inability to use this information.

Disclosure
Webroot Software Inc. funded the production of this report. The list of products tested and the metrics included
in the report were selected by Webroot.

Trademarks
All trademarks are the property of their respective owners.




Contact Details
PassMark Software Pty Ltd
Suite 202, Level 2
35 Buckingham St.
Surry Hills, 2010
Sydney, Australia
Phone + 61 (2) 9690 0444
Fax      + 61 (2) 9690 0445
Web      www.passmark.com







Appendix 1 – Test Environment
For our testing, PassMark Software used a test environment running Windows 7 Ultimate (64-bit) SP1 with the
following hardware specifications:

Windows 7 (64-bit) System
CPU:                      Intel Core i7 920 Quad Core @ 2.67GHz
Video Card:               nVidia GeForce 8800 GT
Motherboard:              Intel x58 Motherboard
RAM:                      6GB DDR3
HDD:                      Western Digital 500GB 7200RPM
Network:                  Gigabit (1 Gbit/s) switch







Appendix 2 – Methodology Description
Windows 7 Image Creation
As with testing on Windows Vista, Norton Ghost was used to create a “clean” baseline image prior to testing.
Our aim is to create a baseline image with the smallest possible footprint and reduce the possibility of variation
caused by external operating system factors.

The baseline image was restored prior to testing of each different product. This process ensures that we install
and test all products on the same, “clean” machine.

The steps taken to create the base Windows 7 image are as follows:

1.     Installed and activated Windows 7 Ultimate Edition.
2.     Disabled Automatic Updates.
3.     Changed User Account Control settings to “Never Notify”.
4.     Disabled Windows Defender automatic scans to avoid unexpected background activity.
5.     Disabled the Windows firewall to avoid interference with security software.
6.     Installed Norton Ghost for imaging purposes.
7.     Disabled Superfetch to ensure consistent results.
8.     Installed HTTP Watch for Browse Time testing.
9.     Installed Windows Performance Toolkit x64 for Boot Time testing.
10.    Installed Active Perl for interpretation of some test scripts.
11.    Installed OSForensics for testing (Installation Size and Registry Key Count tests) purposes.
12.    Disabled updates, accelerators and compatibility view updates in Internet Explorer 8.
13.    Updated to Windows 7 Service Pack 1.
14.    Created a baseline image using Norton Ghost.

Benchmark 1 – Initial Scan Time
Scan Time is the time it took for each product to run an initial scan on a set of sample files. The sample used was
identical in all cases and contained a mixture of system files and Office files. In total there were 6159 files whose
combined size was 982 MB. Most of these files come from the Windows system folders. As the file types can
influence scanning speed, the breakdown of the main file types, file numbers and total sizes of the files in the
sample set is given here:



File Extension   Number of Files   File Size

.dll                  2589          490MB
.exe                   695          102MB
.sys                   332           23MB
.gif                   302            1MB
.doc                   281           64MB
.wmf                   185            2MB
.png                   149            2MB
.html                  126            1MB
.nls                    80            6MB
.jpg                    70            1MB
.ini                    59            2MB
.ico                    58           <1MB
.mof                    43            6MB
.ax                     39            4MB
.xls                    38            3MB
.ime                    35            5MB
.drv                    31            1MB
.txt                    31            1MB
.chm                    30            6MB
.cpl                    29            4MB
.mfl                    29            3MB
.inf                    26            2MB
.hlp                    22            3MB
.imd                    20           18MB
.py                     20           <1MB
.msc                    18            1MB
.vbs                    18            1MB
.xml                    18            1MB
.rtf                    16           62MB
.ocx                    16            4MB
.tsp                    14            1MB
.com                    14           <1MB
.xsl                    14           <1MB
.h                      13           <1MB
.vsd                    12            2MB
.scr                    12            2MB
.aw                     12            2MB
.js                     12            1MB
.zip                    11           25MB
.lex                     9           10MB
.ppt                     9            4MB
.acm                     9            1MB
.wav                     7            5MB
Total                 6159          982MB



This scan was run without launching the product’s user interface, by right-clicking the test folder and choosing
the “Scan Now” option. To record the scan time, we used the product’s built-in scan timer or reporting system.
Where this was not possible, scan times were taken manually with a stopwatch.

Benchmark 2 – Installation Size
A product's Installation Size was previously defined as the difference between the initial snapshot of the Disk
Space (C: drive) before installation and the subsequent snapshot taken after the product is installed on the
system. Although this is a widely used methodology, we noticed that the results it yielded were not always
reproducible in Vista due to random OS operations that may take place between the two snapshots. We
improved the Installation Size methodology by removing as many Operating System and disk space variables as
possible.

Using PassMark’s OSForensics we created initial and post-installation disk signatures for each product. These
disk signatures recorded the number of files and directories, and complete details of all files on that drive
(including file name, file size, checksum, etc.) at the time the signature was taken.

The initial disk signature was taken immediately prior to installation of the product. A subsequent disk signature
was taken immediately following a system reboot after product installation. Using OSForensics, we compared
the two signatures and calculated the total disk space consumed by files that were new, modified, and deleted
during product installation. Our result for this metric reflects the total size of all newly added files during
installation.
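
OSForensics performs this comparison internally; as a simplified illustration of the approach (a stand-in, not the
tool we used), two directory snapshots can be diffed and the sizes of newly added files totalled:

    # Simplified sketch of the disk-signature approach (not OSForensics).
    import os

    def snapshot(root):
        """Map every file path under root to its size in bytes."""
        sizes = {}
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    sizes[path] = os.path.getsize(path)
                except OSError:
                    pass  # skip files that vanish or deny access mid-walk
        return sizes

    before = snapshot("C:\\")
    # ... install the product and reboot, then take the second signature:
    after = snapshot("C:\\")
    new_bytes = sum(size for path, size in after.items() if path not in before)
    print("Installation size: %.2f MB" % (new_bytes / 2**20))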



The scope of this metric includes only an ‘out of the box’ installation size for each product. Our result does not
cover the size of files downloaded by the product after its installation (such as engine or signature updates), or
any files created by system restore points, pre-fetch files and other temporary files.

Benchmark 4 – Registry Keys Added
This test measures the number of keys and values added to the registry after rebooting the test machine
following a successful product installation. The test was conducted using RegistryCounter.exe, an application
which conducts a count of all keys, errors and values under HKEY_LOCAL_MACHINE and HKEY_USERS.

Two Registry Key counts are taken, one prior to installation and a second immediately following a reboot after
installation. To obtain our result, we calculated the difference between these two registry key totals.

Occasionally, an unusually large number of registry keys were removed by the system. This seemed to occur
when a product was installed, but was not necessarily caused directly by the product. This resulted in a large
negative difference in registry keys between the before and after registry key counts obtained using
RegistryCounter.exe. To overcome this problem, we used PassMark’s OSForensics to calculate and obtain a
breakdown of the new, modified and deleted keys separately. The total of the new keys obtained from
OSForensics was used as the result for this test in the case that the former method using RegistryCounter.exe
yielded a negative value.
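
As a simplified stand-in for RegistryCounter.exe, the count of keys and values under the two hives can be taken
with a short recursive script; run it before and after installation and subtract the two totals:

    # Simplified sketch of a registry key/value counter (stand-in for
    # RegistryCounter.exe). Requires Windows; unreadable keys are skipped.
    import winreg

    def count_registry(hive, subkey=""):
        """Recursively count subkeys plus values under hive\\subkey."""
        total = 0
        try:
            with winreg.OpenKey(hive, subkey) as handle:
                num_subkeys, num_values, _ = winreg.QueryInfoKey(handle)
                total += num_values
                for i in range(num_subkeys):
                    child = winreg.EnumKey(handle, i)
                    total += 1 + count_registry(
                        hive, subkey + "\\" + child if subkey else child)
        except OSError:
            pass  # RegistryCounter.exe records these as errors
        return total

    before = sum(count_registry(h)
                 for h in (winreg.HKEY_LOCAL_MACHINE, winreg.HKEY_USERS))
    # ... install the product, reboot, repeat the count, and subtract.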

Benchmark 5 – Boot Time
PassMark Software uses tools available from the Windows Performance Toolkit version 4.6 (as part of the
Microsoft Windows 7 SDK obtainable from the Microsoft Website) with a view to obtaining more precise and
consistent boot time results on the Windows 7 platform.

The boot process is first optimized with xbootmgr.exe using the command “xbootmgr.exe -trace boot
-prepSystem”, which prepares the system for the test over six optimization boots. The boot traces obtained from
the optimization process are discarded.

After boot optimization, the benchmark is conducted using the command “xbootmgr.exe -trace boot -numruns
5”. This command boots the system five times in succession, taking detailed boot traces for each boot cycle.

Finally, a post-processing tool was used to parse the boot traces and obtain the BootTimeViaPostBoot value. This
value reflects the amount of time it takes the system to complete all (and only) boot time processes. Our final
result is an average of five boot traces.

Benchmark 6 – User Interface Launch Time
The launch time of a product’s user interface was taken using AppTimer (v1.0.1006). For each product tested, we
obtained a total of fifteen samples from five sets of three UI launches, with a reboot before each set to clear
caching effects by the operating system. When compiling the results, the first of each set was separated out so
that there was a set of values for the initial launch after reboot and a set for subsequent launches.

We have averaged the subsequent launch times to obtain an average subsequent launch time. Our final result
for this test is an average of the subsequent launch average and the initial launch time.
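
Our reading of this averaging is sketched below with illustrative (not measured) values: the five initial launches
are averaged, the ten subsequent launches are averaged, and the final figure is the mean of those two averages.

    # Illustrative launch times in milliseconds; not measured data.
    initial_launches = [310.0, 295.2, 301.4, 288.9, 299.5]     # first launch per set
    subsequent_launches = [250.1, 248.3, 252.7, 249.0, 251.2,
                           247.8, 250.6, 249.9, 251.5, 248.4]  # remaining two per set

    avg_initial = sum(initial_launches) / len(initial_launches)
    avg_subsequent = sum(subsequent_launches) / len(subsequent_launches)
    final_result = (avg_initial + avg_subsequent) / 2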






In some cases, AppTimer did not correctly record the time taken for UI launch. For instance, some applications
would open their window and look like they were ready, but then continued to be unresponsive. Where the
measurement from AppTimer appeared inaccurate, we have taken the time manually with a stopwatch.

AppTimer is publicly available from the PassMark website.

Benchmark 7 – Memory Usage during System Idle
The PerfLog++ utility was used to record process memory usage on the system at boot, and then every minute
for the following fifteen minutes. This was done once per product and resulted in a total of 15 samples. The
first sample, taken at boot, is discarded.

The PerfLog++ utility records memory usage of all processes, not just those of the anti-malware product. As a
result, the anti-malware product’s processes needed to be isolated from all other running system processes. To
isolate the relevant processes, we used Process Explorer, which was run immediately upon the completion of
memory usage logging by PerfLog++. Process Explorer is a Microsoft Windows Sysinternals tool which lists all
processes currently running on the system and the DLLs they have loaded.
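
A simplified stand-in for this PerfLog++ and Process Explorer workflow (not the tooling we used) sums the
resident memory of the processes identified as belonging to the product; the process name below is a
placeholder:

    # Simplified sketch: sum resident memory of a product's processes.
    # Requires the third-party psutil package; "product.exe" is a placeholder.
    import psutil

    PRODUCT_PROCESSES = {"product.exe"}

    def product_memory_mb():
        total = 0
        for proc in psutil.process_iter(["name", "memory_info"]):
            if proc.info["name"] in PRODUCT_PROCESSES:
                total += proc.info["memory_info"].rss  # resident set size, bytes
        return total / 2**20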

Benchmark 8 – Memory Usage during Initial Scan
The PerfLog++ utility was used to record memory usage on the system while a malware scan is in progress.
Please refer to “Benchmark 7 – Memory Usage during System Idle” above for a description of the PerfLog++
utility and an explanation of the method by which memory usage is calculated.

As some products cache scan locations, we take reasonable precautions to ensure that the security software
does not scan the C: drive at any point before conducting this test. A manual scan on the C: drive is initiated at
the same time as the PerfLog++ utility, enabling PerfLog++ to record memory usage for 60 seconds at five
second intervals.

Benchmark 9 – Browse Time
We used a script in conjunction with HTTPWatch (Basic Edition, version 6.1) to record the amount of time it
takes for a set of 106 ‘popular’ websites to load consecutively from a local server. This script feeds a list of URLs
into HTTPWatch, which instructs the browser to load pages in sequence and monitors the amount of time it
takes for the browser to load all items on one page.

For this test, we have used Internet Explorer 8 (Version 8.0.6001.18783) as our browser.

The set of websites used in this test comprises the front pages of high-traffic sites, including shopping, social,
news, finance and reference websites.

The Browse Time test is executed five times and our final result is an average of these five samples. The local
server is restarted between different products and one initial ‘test’ run is conducted prior to testing to install
Adobe Flash Player, an add-on which is used by many popular websites.
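
As a greatly simplified stand-in for the HTTPWatch-driven test (it does not render pages in a browser or fetch
embedded page resources), sequential page loads from the local server can be timed as below; the URL list is a
placeholder:

    # Simplified sketch: time sequential fetches of the site list.
    import time
    import urllib.request

    urls = ["http://localserver/site%03d/" % i for i in range(1, 107)]  # placeholders

    start = time.perf_counter()
    for url in urls:
        with urllib.request.urlopen(url) as response:
            response.read()
    print("Browse time: %.2f s" % (time.perf_counter() - start))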

Benchmark 3 – Installation Time
This test measures the minimum installation time a product requires to be fully functional and ready for use by
the end user. Installation time can usually be divided into three major phases:

•	The Extraction and Setup phase consists of file extraction, the EULA prompt, product activation and user-
   configurable options for installation.

•	The File Copy phase occurs when the product is being installed; usually this phase is indicated by a progress
   bar.

•	The Post-Installation phase is any part of the installation that occurs after the File Copy phase. This phase
   varies widely between products; the time recorded in this phase may include a required reboot to finalize the
   installation or include the time the program takes to become idle in the system tray.

To reduce the impact of disk drive variables, each product was copied to the Desktop before initiating
installation. Each step of the installation process was manually timed with a stopwatch and recorded in as much
detail as possible. Where input was required by the end user, the stopwatch was paused and the input noted in
the raw results in parentheses after the phase description.

Where possible, all requests by products to pre-scan or post-install scan were declined or skipped. Where it was
not possible to skip a scan, the time to scan was included as part of the installation time. Where an optional
component of the installation formed a reasonable part of the functionality of the software, it was also installed
(e.g. website link checking software as part of a security product).

Installation time includes the time taken by the product installer to download components required in the
installation. This may include mandatory updates or the delivery of the application itself from a download
manager. We have noted in our results where a product has downloaded components for product installation.

We have excluded product activation times due to network variability in contacting vendor servers or time taken
in account creation.

Benchmarks 10–14 – Real-Time Performance
We used a single script in testing Benchmarks 10–14. The script consecutively executes the tests for Benchmarks
10–14, timing each phase using CommandTimer.exe and appending the results to a log file.
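
CommandTimer.exe is a PassMark utility; a hypothetical equivalent of its role in the script, timing one phase and
appending the result to a log, looks like this:

    # Hypothetical equivalent of the CommandTimer.exe wrapper.
    import subprocess
    import time

    def time_command(label, args, logfile="results.log"):
        """Run one benchmark phase, time it, and append the result to a log."""
        start = time.perf_counter()
        subprocess.run(args, check=True)
        elapsed = time.perf_counter() - start
        with open(logfile, "a") as log:
            log.write("%s: %.3f s\n" % (label, elapsed))
        return elapsed

    # e.g. time_command("File Copy", ["xcopy", r"C:\Sample", r"D:\Sample", "/E", "/I"])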

Benchmark 10 – File Copy, Move and Delete
This test measures the amount of time required for the system to copy, move and delete samples of files in
various file formats. This sample was made up of 812 files over 760,867,636 bytes and can be categorized as
documents [26% of total size], media files [54% of total size] and PE files (i.e. system files) [20% of total size].

The breakdown of the main file types, file numbers and total sizes of the files in the sample set is shown in the
following table:

File format   Number   Size (bytes)

DOC              8      30,450,176
DOCX             4      13,522,409
PPT              3       5,769,216
PPTX             3       4,146,421
XLS              4       2,660,352
XLSX             4       1,426,054
PDF             73     136,298,049
ZIP              4       6,295,987
7Z               1          92,238
JPG            351      31,375,259
GIF              6         148,182
MOV              7      57,360,371
RM               1       5,658,646
AVI              8      78,703,408
WMV              5      46,126,167
MP3             28     191,580,387
EXE             19       2,952,914
DLL            104      29,261,568
AX               1          18,432
CPL              2       2,109,440
CPX              2           4,384
DRV             10         154,864
ICO              1         107,620
MSC              1          41,587
NT               1           1,688
ROM              2          36,611
SCR              2       2,250,240
SYS              1      37,528,093
TLB              3         135,580
TSK              1           1,152
UCE              1          22,984
EXE             19       2,952,914
DLL            104      29,261,568
AX               1          18,432
CPL              2       2,109,440
CPX              2           4,384
DRV             10         154,864
ICO              1         107,620
MSC              1          41,587
NT               1           1,688
ROM              2          36,611
SCR              2       2,250,240
SYS              1      37,528,093
TLB              3         135,580
TSK              1           1,152
UCE              1          22,984
Total          812     760,867,636

(The sample set contains two copies of the system-file group, so those rows appear twice; both copies are
included in the 812-file and 760,867,636-byte totals.)




This test was conducted five times to obtain the average time to copy, move and delete the sample files, with
the test machine rebooted between each sample to remove potential caching effects.
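As a rough sketch of what the copy, move and delete phases involve, the following fragment performs the three
operations and times each one; the folder paths are hypothetical, and Python's shutil module stands in for the
actual test script.

    import shutil
    import time
    from pathlib import Path

    SAMPLE = Path(r"C:\Samples")    # hypothetical folder holding the 812 sample files
    WORK   = Path(r"C:\WorkArea")   # hypothetical working area on the same drive

    def timed(label, action):
        # Time a single file-system phase and report the elapsed seconds.
        start = time.perf_counter()
        action()
        print(f"{label}: {time.perf_counter() - start:.2f} s")

    timed("copy",   lambda: shutil.copytree(SAMPLE, WORK / "copied"))
    timed("move",   lambda: shutil.move(str(WORK / "copied"), str(WORK / "moved")))
    timed("delete", lambda: shutil.rmtree(WORK / "moved"))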

Benchmark 11 – Network Throughput
This benchmark measured how much time was required to download a sample set of binary files of various sizes
and types over a 100MB/s network connection. The files were hosted on a server machine running Windows
Server 2008 and IIS 7. CommandTimer.exe was used in conjunction with GNU Wget (version 1.10.1) to time and
conduct the download test.
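The sketch below shows one way such a timed download run could be reproduced; the URL list file is hypothetical,
Python timing stands in for CommandTimer.exe, and -q (quiet) and -i (read URLs from a file) are standard GNU
Wget options.

    import subprocess
    import time

    # Hypothetical text file listing the URLs of the sample files on the server.
    URL_LIST = "download_list.txt"

    start = time.perf_counter()
    # -q suppresses wget's output; -i reads the URLs to fetch from the given file.
    subprocess.run(["wget", "-q", "-i", URL_LIST], check=True)
    print(f"Download time: {time.perf_counter() - start:.2f} s")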

The complete sample set of files was made up of 553,638,694 bytes over 484 files and two file type categories:
media files [74% of total] and documents [26% of total]. The breakdown of the file types, file numbers and total
sizes of the files in the sample set is shown in the following table:

                                         File format   Number      Size (bytes)
                                            JPEG         343        30,668,312
                                            GIF            9           360,349
                                            PNG            5           494,780
                                            MOV            7        57,360,371
                                            RM             1         5,658,646
                                            AVI            8        78,703,408
                                            WMV            5        46,126,167
                                            MP3           28       191,580,387
                                            PDF           73       136,298,049
                                            ZIP            4         6,295,987
                                            7Z             1            92,238
                                            Total        484       553,638,694


This test was conducted five times to obtain the average time to download this sample of files, with the test
machine rebooted between each sample to remove potential caching effects.

Benchmark 12 – File Format Conversion (MP3 → WAV, MP3 → WMA)

This test measured how much time was required to convert five (5) different MP3 files into WAV files and,
subsequently, to convert the same MP3 samples into WMA files. The total size of the five (5) MP3s used was
25,870,899 bytes.






To encode the MP3 files into the other formats, we used an application called ffmpeg.exe. The format conversion
process was timed using CommandTimer.exe.
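A minimal sketch of this conversion loop follows, with hypothetical file names and Python timing in place of
CommandTimer.exe; ffmpeg selects the target codec from the output file's extension.

    import subprocess
    import time

    # Hypothetical names for the five sample MP3 files.
    MP3_FILES = [f"sample{i}.mp3" for i in range(1, 6)]

    start = time.perf_counter()
    for mp3 in MP3_FILES:
        base = mp3.rsplit(".", 1)[0]
        # -y overwrites existing output; the extension picks the target format.
        subprocess.run(["ffmpeg", "-y", "-i", mp3, f"{base}.wav"], check=True)
        subprocess.run(["ffmpeg", "-y", "-i", mp3, f"{base}.wma"], check=True)
    print(f"Conversion time: {time.perf_counter() - start:.2f} s")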

This test was conducted five times to obtain the average conversion speed between these formats, with the test
machine rebooted between each sample to remove potential caching effects.

Benchmark 13 – File Compression and Decompression
This test measured the amount of time required to compress and decompress a sample set of files. For this test,
we used a subset of the media and document files used in the File Copy, Move and Delete benchmark.
CommandTimer.exe recorded the amount of time required for 7zip.exe to compress the files into a *.zip archive
and subsequently decompress the created *.zip file.
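A sketch of the two phases, assuming the standard 7-Zip command-line syntax ('a' adds files to an archive, 'x'
extracts them, -o names the output directory); the folder paths are hypothetical and Python timing again stands
in for CommandTimer.exe.

    import subprocess
    import time

    SOURCE_DIR = r"C:\CompressionSamples"   # hypothetical folder with the file subset
    ARCHIVE    = r"C:\WorkArea\samples.zip"

    def timed(label, command):
        # Time one 7-Zip invocation and report the elapsed seconds.
        start = time.perf_counter()
        subprocess.run(command, check=True)
        print(f"{label}: {time.perf_counter() - start:.2f} s")

    timed("compress",   ["7z.exe", "a", ARCHIVE, SOURCE_DIR])
    timed("decompress", ["7z.exe", "x", ARCHIVE, r"-oC:\WorkArea\extracted"])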

This subset comprised 1,218 files over 783 MB. The breakdown of the file types, file numbers and total sizes of
the files in the sample set is shown in the following table:

                                       File Type   File Number   Total Size
                                         .xls           13        9.23 MB
                                         .xlsx           9        3.51 MB
                                         .ppt            9        7.37 MB
                                         .pptx          11        17.4 MB
                                         .doc           17        35.9 MB
                                         .docx          19        24.5 MB
                                         .gif          177        1.10 MB
                                         .jpg          737        66.2 MB
                                         .png          159        48.9 MB
                                         .mov            7        54.7 MB
                                         .rm             1        5.39 MB
                                         .avi           46         459 MB
                                         .wma           11        48.6 MB
                                         Total        1218         783 MB


This test was conducted five times to obtain the average file compression and decompression speed, with the
test machine rebooted between each sample to remove potential caching effects.

Benchmark 14 – File Write, Open and Close
This benchmark was derived from Oli Warner’s File I/O test at http://www.thepcspy.com (please see Reference
#1: What Really Slows Windows Down).

For this test, we developed OpenClose.exe, an application that repeatedly writes a small file to disk, then opens
and closes that file. CommandTimer.exe was used to time how long the process took to complete 180,000
cycles.
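OpenClose.exe itself is a PassMark tool, but the loop it performs can be sketched as follows (the scratch file
name is hypothetical; the cycle count matches the 180,000 cycles described above).

    import time

    TARGET = "openclose_test.tmp"   # hypothetical scratch file
    CYCLES = 180_000

    start = time.perf_counter()
    for _ in range(CYCLES):
        # Write a small file to disk, then open and close it again.
        with open(TARGET, "w") as f:
            f.write("x" * 32)
        with open(TARGET, "r") as f:
            pass
    print(f"{CYCLES} cycles: {time.perf_counter() - start:.2f} s")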


This test was conducted five times to obtain the average file writing, opening and closing speed, with the test
machine rebooted between each sample to remove potential caching effects.




Performance Benchmark                                                                          Page 27 of 27
                                                                                          29 November 2011

Weitere ähnliche Inhalte

Was ist angesagt?

77848485 upgrade-security-in-your-r12-upgrade
77848485 upgrade-security-in-your-r12-upgrade77848485 upgrade-security-in-your-r12-upgrade
77848485 upgrade-security-in-your-r12-upgraderaghu_sid
 
InstallAnywhere Infragistics Success Story
InstallAnywhere Infragistics Success StoryInstallAnywhere Infragistics Success Story
InstallAnywhere Infragistics Success StoryFlexera
 
Comp tia a+_session_02
Comp tia a+_session_02Comp tia a+_session_02
Comp tia a+_session_02Niit Care
 
Security Lock Down Your Computer Like the National Security Agency (NSA)
Security Lock Down Your Computer Like the National Security Agency (NSA)Security Lock Down Your Computer Like the National Security Agency (NSA)
Security Lock Down Your Computer Like the National Security Agency (NSA)José Ferreiro
 
Take the guesswork out of video surveillance with pre-validated Milestone s...
Take the guesswork out of video  surveillance with pre-validated  Milestone s...Take the guesswork out of video  surveillance with pre-validated  Milestone s...
Take the guesswork out of video surveillance with pre-validated Milestone s...Principled Technologies
 
Comp tia a+_session_13
Comp tia a+_session_13Comp tia a+_session_13
Comp tia a+_session_13Niit Care
 
SIMULATION-BASED APPLICATION SOFTWARE DEVELOPMENT IN TIME-TRIGGERED COMMUNICA...
SIMULATION-BASED APPLICATION SOFTWARE DEVELOPMENT IN TIME-TRIGGERED COMMUNICA...SIMULATION-BASED APPLICATION SOFTWARE DEVELOPMENT IN TIME-TRIGGERED COMMUNICA...
SIMULATION-BASED APPLICATION SOFTWARE DEVELOPMENT IN TIME-TRIGGERED COMMUNICA...IJSEA
 

Was ist angesagt? (10)

77848485 upgrade-security-in-your-r12-upgrade
77848485 upgrade-security-in-your-r12-upgrade77848485 upgrade-security-in-your-r12-upgrade
77848485 upgrade-security-in-your-r12-upgrade
 
Intouch 9.0 v
Intouch 9.0 vIntouch 9.0 v
Intouch 9.0 v
 
InstallAnywhere Infragistics Success Story
InstallAnywhere Infragistics Success StoryInstallAnywhere Infragistics Success Story
InstallAnywhere Infragistics Success Story
 
Comp tia a+_session_02
Comp tia a+_session_02Comp tia a+_session_02
Comp tia a+_session_02
 
Security Lock Down Your Computer Like the National Security Agency (NSA)
Security Lock Down Your Computer Like the National Security Agency (NSA)Security Lock Down Your Computer Like the National Security Agency (NSA)
Security Lock Down Your Computer Like the National Security Agency (NSA)
 
Take the guesswork out of video surveillance with pre-validated Milestone s...
Take the guesswork out of video  surveillance with pre-validated  Milestone s...Take the guesswork out of video  surveillance with pre-validated  Milestone s...
Take the guesswork out of video surveillance with pre-validated Milestone s...
 
Comp tia a+_session_13
Comp tia a+_session_13Comp tia a+_session_13
Comp tia a+_session_13
 
SIMULATION-BASED APPLICATION SOFTWARE DEVELOPMENT IN TIME-TRIGGERED COMMUNICA...
SIMULATION-BASED APPLICATION SOFTWARE DEVELOPMENT IN TIME-TRIGGERED COMMUNICA...SIMULATION-BASED APPLICATION SOFTWARE DEVELOPMENT IN TIME-TRIGGERED COMMUNICA...
SIMULATION-BASED APPLICATION SOFTWARE DEVELOPMENT IN TIME-TRIGGERED COMMUNICA...
 
Incompatible
IncompatibleIncompatible
Incompatible
 
intouch
intouchintouch
intouch
 

Ähnlich wie Webroot SecureAnywhere Cloud vs. Six Traditional Security Products

Stopping the Adobe, Apple and Java Software Updater Insanity
Stopping the Adobe, Apple and Java Software Updater InsanityStopping the Adobe, Apple and Java Software Updater Insanity
Stopping the Adobe, Apple and Java Software Updater InsanityLumension
 
Intel And Big Data: An Open Platform for Next-Gen Analytics
Intel And Big Data: An Open Platform for Next-Gen AnalyticsIntel And Big Data: An Open Platform for Next-Gen Analytics
Intel And Big Data: An Open Platform for Next-Gen AnalyticsIntel IT Center
 
Product Key Finder - Find Product Key, Recovery and Backup
Product Key Finder - Find Product Key, Recovery and BackupProduct Key Finder - Find Product Key, Recovery and Backup
Product Key Finder - Find Product Key, Recovery and BackupNsaSoft US LLC
 
Can consumer av products protect
Can consumer av products protectCan consumer av products protect
Can consumer av products protectAnatoliy Tkachev
 
How to securely manage endpoints using SanerNow
How to securely manage endpoints using SanerNowHow to securely manage endpoints using SanerNow
How to securely manage endpoints using SanerNowSecPod
 
Empowering Application Security Protection in the World of DevOps
Empowering Application Security Protection in the World of DevOpsEmpowering Application Security Protection in the World of DevOps
Empowering Application Security Protection in the World of DevOpsIBM Security
 
Empowering Application Security Protection in the World of DevOps
Empowering Application Security Protection in the World of DevOpsEmpowering Application Security Protection in the World of DevOps
Empowering Application Security Protection in the World of DevOpsBlack Duck by Synopsys
 
The Art of Managing and Securing Endpoints
The Art of Managing and Securing EndpointsThe Art of Managing and Securing Endpoints
The Art of Managing and Securing EndpointsSecPod
 
AGILE SECURITY™ Security for the Real World
AGILE SECURITY™ Security for the Real WorldAGILE SECURITY™ Security for the Real World
AGILE SECURITY™ Security for the Real WorldCisco Russia
 
Fraunhofer Report on Black
Fraunhofer Report on BlackFraunhofer Report on Black
Fraunhofer Report on BlackFraunhofer SIT
 
Novell ZENworks Overview and Futures
Novell ZENworks Overview and FuturesNovell ZENworks Overview and Futures
Novell ZENworks Overview and FuturesNovell
 
Novell ZENworks Overview and Futures
Novell ZENworks Overview and FuturesNovell ZENworks Overview and Futures
Novell ZENworks Overview and FuturesNovell
 
STAF 在自動化測試上的延伸應用 -- TMSTAF (TrendMicro STAF)
STAF 在自動化測試上的延伸應用 -- TMSTAF (TrendMicro STAF)STAF 在自動化測試上的延伸應用 -- TMSTAF (TrendMicro STAF)
STAF 在自動化測試上的延伸應用 -- TMSTAF (TrendMicro STAF)pycontw
 
Session 1: Windows 8 with Gerry Tessier
Session 1: Windows 8 with Gerry TessierSession 1: Windows 8 with Gerry Tessier
Session 1: Windows 8 with Gerry TessierCTE Solutions Inc.
 
Top 5 DevSecOps Tools- You Need to Know About
Top 5 DevSecOps Tools- You Need to Know AboutTop 5 DevSecOps Tools- You Need to Know About
Top 5 DevSecOps Tools- You Need to Know AboutDev Software
 

Ähnlich wie Webroot SecureAnywhere Cloud vs. Six Traditional Security Products (20)

Stopping the Adobe, Apple and Java Software Updater Insanity
Stopping the Adobe, Apple and Java Software Updater InsanityStopping the Adobe, Apple and Java Software Updater Insanity
Stopping the Adobe, Apple and Java Software Updater Insanity
 
Avc prot 2012b_en
Avc prot 2012b_enAvc prot 2012b_en
Avc prot 2012b_en
 
Avc fdt 201209_en
Avc fdt 201209_enAvc fdt 201209_en
Avc fdt 201209_en
 
Intel And Big Data: An Open Platform for Next-Gen Analytics
Intel And Big Data: An Open Platform for Next-Gen AnalyticsIntel And Big Data: An Open Platform for Next-Gen Analytics
Intel And Big Data: An Open Platform for Next-Gen Analytics
 
Product Key Finder - Find Product Key, Recovery and Backup
Product Key Finder - Find Product Key, Recovery and BackupProduct Key Finder - Find Product Key, Recovery and Backup
Product Key Finder - Find Product Key, Recovery and Backup
 
Performance dec 2010
Performance dec 2010Performance dec 2010
Performance dec 2010
 
Dtl 2012 q4_home.1
Dtl 2012 q4_home.1Dtl 2012 q4_home.1
Dtl 2012 q4_home.1
 
Can consumer av products protect
Can consumer av products protectCan consumer av products protect
Can consumer av products protect
 
How to securely manage endpoints using SanerNow
How to securely manage endpoints using SanerNowHow to securely manage endpoints using SanerNow
How to securely manage endpoints using SanerNow
 
Empowering Application Security Protection in the World of DevOps
Empowering Application Security Protection in the World of DevOpsEmpowering Application Security Protection in the World of DevOps
Empowering Application Security Protection in the World of DevOps
 
Empowering Application Security Protection in the World of DevOps
Empowering Application Security Protection in the World of DevOpsEmpowering Application Security Protection in the World of DevOps
Empowering Application Security Protection in the World of DevOps
 
Avc fdt 201303_en
Avc fdt 201303_enAvc fdt 201303_en
Avc fdt 201303_en
 
The Art of Managing and Securing Endpoints
The Art of Managing and Securing EndpointsThe Art of Managing and Securing Endpoints
The Art of Managing and Securing Endpoints
 
AGILE SECURITY™ Security for the Real World
AGILE SECURITY™ Security for the Real WorldAGILE SECURITY™ Security for the Real World
AGILE SECURITY™ Security for the Real World
 
Fraunhofer Report on Black
Fraunhofer Report on BlackFraunhofer Report on Black
Fraunhofer Report on Black
 
Novell ZENworks Overview and Futures
Novell ZENworks Overview and FuturesNovell ZENworks Overview and Futures
Novell ZENworks Overview and Futures
 
Novell ZENworks Overview and Futures
Novell ZENworks Overview and FuturesNovell ZENworks Overview and Futures
Novell ZENworks Overview and Futures
 
STAF 在自動化測試上的延伸應用 -- TMSTAF (TrendMicro STAF)
STAF 在自動化測試上的延伸應用 -- TMSTAF (TrendMicro STAF)STAF 在自動化測試上的延伸應用 -- TMSTAF (TrendMicro STAF)
STAF 在自動化測試上的延伸應用 -- TMSTAF (TrendMicro STAF)
 
Session 1: Windows 8 with Gerry Tessier
Session 1: Windows 8 with Gerry TessierSession 1: Windows 8 with Gerry Tessier
Session 1: Windows 8 with Gerry Tessier
 
Top 5 DevSecOps Tools- You Need to Know About
Top 5 DevSecOps Tools- You Need to Know AboutTop 5 DevSecOps Tools- You Need to Know About
Top 5 DevSecOps Tools- You Need to Know About
 

Kürzlich hochgeladen

Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...apidays
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking MenDelhi Call girls
 
2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...Martijn de Jong
 
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUnderstanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUK Journal
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreternaman860154
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonetsnaman860154
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsEnterprise Knowledge
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountPuma Security, LLC
 
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024The Digital Insurer
 
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptxHampshireHUG
 
Slack Application Development 101 Slides
Slack Application Development 101 SlidesSlack Application Development 101 Slides
Slack Application Development 101 Slidespraypatel2
 
Advantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your BusinessAdvantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your BusinessPixlogix Infotech
 
Histor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slideHistor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slidevu2urc
 
Real Time Object Detection Using Open CV
Real Time Object Detection Using Open CVReal Time Object Detection Using Open CV
Real Time Object Detection Using Open CVKhem
 
What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?Antenna Manufacturer Coco
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfsudhanshuwaghmare1
 
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Igalia
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsMaria Levchenko
 
Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024The Digital Insurer
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processorsdebabhi2
 

Kürzlich hochgeladen (20)

Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
Apidays Singapore 2024 - Building Digital Trust in a Digital Economy by Veron...
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
 
2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...
 
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdfUnderstanding Discord NSFW Servers A Guide for Responsible Users.pdf
Understanding Discord NSFW Servers A Guide for Responsible Users.pdf
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreter
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonets
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI Solutions
 
Breaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path MountBreaking the Kubernetes Kill Chain: Host Path Mount
Breaking the Kubernetes Kill Chain: Host Path Mount
 
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
Bajaj Allianz Life Insurance Company - Insurer Innovation Award 2024
 
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
 
Slack Application Development 101 Slides
Slack Application Development 101 SlidesSlack Application Development 101 Slides
Slack Application Development 101 Slides
 
Advantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your BusinessAdvantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your Business
 
Histor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slideHistor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slide
 
Real Time Object Detection Using Open CV
Real Time Object Detection Using Open CVReal Time Object Detection Using Open CV
Real Time Object Detection Using Open CV
 
What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?What Are The Drone Anti-jamming Systems Technology?
What Are The Drone Anti-jamming Systems Technology?
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdf
 
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
Raspberry Pi 5: Challenges and Solutions in Bringing up an OpenGL/Vulkan Driv...
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed texts
 
Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024
 
Exploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone ProcessorsExploring the Future Potential of AI-Enabled Smartphone Processors
Exploring the Future Potential of AI-Enabled Smartphone Processors
 

Webroot SecureAnywhere Cloud vs. Six Traditional Security Products

  • 1. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products (Dec 2011) Security Performance Benchmark Document: Webroot Secure Anywhere Cloud vs. Six Traditional Security Products (Nov 2011) Authors: M. Baquiran, D. Wren Company: PassMark Software Date: 29 November 2011 File: Webroot_SecureAnywhere_vs_security_competitors_29Nov2011.docx
  • 2. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products Table of Contents TABLE OF CONTENTS......................................................................................................................................... 2 REVISION HISTORY............................................................................................................................................ 3 REFERENCES...................................................................................................................................................... 3 EXECUTIVE SUMMARY ...................................................................................................................................... 4 OVERALL SCORE ................................................................................................................................................ 5 PRODUCTS AND VERSIONS ............................................................................................................................... 6 PERFORMANCE METRICS SUMMARY ................................................................................................................ 7 TEST RESULTS ................................................................................................................................................. 10 BENCHMARK 1 – INITIAL SCAN TIME ......................................................................................................................... 10 BENCHMARK 2 – INSTALLATION SIZE ......................................................................................................................... 10 BENCHMARK 3 – INSTALLATION TIME ........................................................................................................................ 11 BENCHMARK 4 – REGISTRY KEYS ADDED .................................................................................................................... 11 BENCHMARK 5 – BOOT TIME ................................................................................................................................... 12 BENCHMARK 6 – USER INTERFACE LAUNCH TIME ........................................................................................................ 12 BENCHMARK 7 – MEMORY USAGE DURING SYSTEM IDLE .............................................................................................. 13 BENCHMARK 8 – MEMORY USAGE DURING INITIAL SCAN .............................................................................................. 13 BENCHMARK 9 – BROWSE TIME ............................................................................................................................... 14 BENCHMARK 10 – FILE COPY, MOVE AND DELETE ....................................................................................................... 14 BENCHMARK 11 – NETWORK THROUGHPUT ............................................................................................................... 15 BENCHMARK 12 – FILE FORMAT CONVERSION ............................................................................................................ 15 BENCHMARK 13 – FILE COMPRESSION AND DECOMPRESSION ........................................................................................ 16 BENCHMARK 14 – FILE WRITE, OPEN AND CLOSE ........................................................................................................ 
16 CONTACT DETAILS .......................................................................................................................................... 17 APPENDIX 1 – TEST ENVIRONMENT ................................................................................................................ 18 APPENDIX 2 – METHODOLOGY DESCRIPTION ................................................................................................. 19 Performance Benchmark Page 2 of 27 29 November 2011
  • 3. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products Revision History Rev Revision History Date Edition 1 Initial version of this report. 09 September 2011 References Ref # Document Author Date O. Warner, 1 What Really Slows Windows Down (URL) 2001-2009 The PC Spy Performance Benchmark Page 3 of 27 29 November 2011
  • 4. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products Executive Summary PassMark Software® conducted objective performance testing on eight (8) security software products, on Windows 7 Ultimate Edition (64-bit) from in November 2011. This report presents our results and findings as a result of performance benchmark testing conducted for these consumer security products. The aim of this benchmark was to compare the performance impact of Webroot’s SecureAnywhere Cloud Security products with 6 other traditional security products. Testing was performed on all products using fourteen (14) performance metrics. These performance metrics are as follows:  Initial Scan Time;  Installation Size;  Installation Time;  Registry Keys Added;  Boot Time;  User Interface Launch Time;  Memory Usage during System Idle;  Memory Usage during Initial Scan;  Browse Time;  File Copy, Move and Delete;  Network Throughput (previously named “Binary Download Test”)  File Format Conversion;  File Compression and Decompression; and  File Write, Open and Close. Performance Benchmark Page 4 of 27 29 November 2011
  • 5. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products Overall Score PassMark Software assigned every product a score depending on its ranking in each metric compared to other products in the same category. In the following table the highest possible score attainable is 112; in a hypothetical situation where a product has attained first place in all 14 metrics. Security products have been ranked by their overall scores: Product Name Overall Score Webroot SecureAnywhere Complete 2012 102 Webroot SecureAnywhere Essentials 2012 101 Norton 360 v5 76 ESET Smart Security 5 60 Trend Maximum Security 2012 55 McAfee Total Protection 2012 51 AVG Premium Security 2012 48 Kaspersky PURE 9 46 Performance Benchmark Page 5 of 27 29 November 2011
  • 6. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products Products and Versions For each security product, we have tested the most current and available version. Manufacturer Product Name Release Year Product Version Date Tested Webroot SecureAnywhere Webroot Software, Inc. 2011 8.0.0.66 Nov 2011 Complete 2012 Webroot SecureAnywhereEssentials Webroot Software, Inc. 2011 8.0.0.66 Nov 2011 2012 Kaspersky Lab Kaspersky PURE 2011 9.1.0.124 Nov 2011 Trend Micro Titanium Maximum Trend Micro Inc. 2011 5.0.1280 Nov 2011 Security 2012 AVG Technologies AVG Premium Security 2012 2011 2012.0.1873 Nov 2011 Symantec Corp Norton 360 2011 5.0.0.125 Nov 2011 ESET, spol. s r.o. ESET Smart Security 5 2010 5.0.94.0 Nov 2011 McAfee, Inc. McAfee Total Protection 2011 11.0.623 Nov 2011 Performance Benchmark Page 6 of 27 29 November 2011
  • 7. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products Performance Metrics Summary We have selected a set of objective metrics which provide a comprehensive and realistic indication of the areas in which an security may impact system performance for end users. Our metrics test the impact of the security software on common tasks that end-users would perform on a daily basis. All of PassMark Software’s test methods can be replicated by third parties using the same environment to obtain similar benchmark results. Detailed descriptions of the methodologies used in our tests are available as “Appendix 2 – Methodology Description” of this report. Benchmark 1 – Initial Scan Time All security solutions have functionality designed to detect viruses and various other forms of malware by scanning files on the system. This metric measured the amount of time required to scan a set of clean files. Our sample file set comprised a total file size of 982 MB and was made up of files that would typically be found on end-user machines, such as media files, system files and Microsoft Office documents. Benchmark 2 – Installation Size In offering new features and functionality to users, security software products tend to increase in size with each new release. Although new technologies push the size limits of hard drives each year, the growing disk space requirements of common applications and the increasing popularity of large media files (such as movies, photos and music) ensure that a product's installation size will remain of interest to home users. This metric aims to measure a product’s total installation size. This metric is defined as the total disk space consumed by all new files added during a product's installation. Benchmark 3 – Installation Time The speed and ease of the installation process will strongly influence the user’s first impression of the security software. This test measures the minimum installation time required by the security software to be fully functional and ready for use by the end user. Lower installation times represent security products which are quicker for a user to install. Benchmark 4 – Registry Keys Added A large registry increases a machine’s use of resources. This may negatively impact system performance, especially on much older machines. This test measures the amount of keys and values added to registry, after rebooting the test machines, following a successful product installation. Lower numbers mean that a product has added fewer keys during installation and had less impact on the registry. Benchmark 5 – Boot Time This metric measures the amount of time taken for the machine to boot into the operating system. Security software is generally launched at Windows startup, adding an additional amount of time and delaying the startup of the operating system. Shorter boot times indicate that the application has had less impact on the normal operation of the machine. Performance Benchmark Page 7 of 27 29 November 2011
  • 8. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products Benchmark 6 – User Interface Launch Time This metric provides an objective indication as to how responsive a security product appears to the user, by measuring the amount of time it takes for the user interface of the security software to launch from Windows. To allow for caching effects by the operating system, both the initial launch time and the subsequent launch times were measured. Our final result is an average of these two measurements. Benchmark 7 – Memory Usage during System Idle This metric measures the amount of memory (RAM) used by the product while the machine and security software are in an idle state. The total memory usage was calculated by identifying all security software processes and the amount of memory used by each process. The amount of memory used while the machine is idle provides a good indication of the amount of system resources being consumed by the security software on a permanent basis. Better performing products occupy less memory while the machine is idle. Benchmark 8 – Memory Usage – Scan This metric measures the amount of memory (RAM) used by the product during an security scan. The total memory usage was calculated by identifying all endpoint protection software processes and the amount of memory used by each process during an security scan. Benchmark 9 – Browse Time It is common behavior for security products to scan data for malware as it is downloaded from the internet or intranet. This behavior may negatively impact browsing speed as products scan web content for malware. This metric measures the time taken to browse a set of popular internet sites to consecutively load from a local server in a user’s browser window. Benchmark 10 – File Copy, Move and Delete This metric measures the amount of time taken to move, copy and delete a sample set of files. The sample file set contains several types of file formats that a Windows user would encounter in daily use. These formats include documents (e.g. Microsoft Office documents, Adobe PDF, Zip files, etc), media formats (e.g. images, movies and music) and system files (e.g. executables, libraries, etc). Benchmark 11 – Network Throughput The metric measures the amount of time taken to download a variety of files from a local server using the HyperText Transfer Protocol (HTTP), which is the main protocol used on the web for browsing, linking and data transfer. Files used in this test include file formats that users would typically download from the web, such as images, archives, music files and movie files. Benchmark 12 – File Format Conversion This test measures the amount of time taken to convert an MP3 file to a WAV and subsequently, convert the same MP3 file to a WMA format. Performance Benchmark Page 8 of 27 29 November 2011
  • 9. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products Benchmark 13 – File Compression and Decompression This metric measures the amount of time taken to compress and decompress different types of files. Files formats used in this test included documents, movies and images. Benchmark 14 – File Write, Open and Close This benchmark was derived from Oli Warner’s File I/O test at http://www.thepcspy.com (please see Reference #1: What Really Slows Windows Down). This metric measures the amount of time taken to write a file, then open and close that file. Performance Benchmark Page 9 of 27 29 November 2011
  • 10. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products Test Results In the following charts, we have highlighted the results we obtained for Webroot SecureAnywhere Complete 2012 and Webroot SecureAnywhere Essentials in green and orange respectively. The average has also been highlighted in blue for ease of comparison. Benchmark 1 – Initial Scan Time The following chart compares the average time taken to run an initial scan on a set of 6159 files (totaling 982 MB) for each Security product tested. Webroot SecureAnywhere Complete 2012 20.00 Webroot SecureAnywhere Essentials 2012 21.00 McAfee Total Protection 2012 38.40 Trend Maximum Security 2012 39.90 Norton 360 v5 45.00 Average 47.66 AVG Premium Security 2012 55.00 ESET Smart Security 5 68.00 Kaspersky PURE 9 94.00 0s 10 s 20 s 30 s 40 s 50 s 60 s 70 s 80 s 90 s 100 s Benchmark 2 – Installation Size The following chart compares the total size of files added during the installation of Security products. Products with lower installation sizes are considered better performing products in this category. Webroot SecureAnywhere Essentials 2012 4.18 Webroot SecureAnywhere Complete 2012 61.02 Norton 360 v5 273.06 ESET Smart Security 5 335.50 AVG Premium Security 2012 428.43 Trend Maximum Security 2012 430.43 Average 514.67 McAfee Total Protection 2012 596.38 Kaspersky PURE 9 1988.34 0 MB 500 MB 1,000 MB 1,500 MB 2,000 MB 2,500 MB Performance Benchmark Page 10 of 27 29 November 2011
  • 11. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products Benchmark 3 – Installation Time The following chart compares the minimum installation time it takes for Security products to be fully functional and ready for use by the end user. Products with lower installation times are considered better performing products in this category. Webroot SecureAnywhere Complete 2012 5.00 Webroot SecureAnywhere Essentials 2012 6.00 Norton 360 v5 48.20 Kaspersky PURE 9 52.50 AVG Premium Security 2012 52.50 Trend Maximum Security 2012 145.80 Average 169.81 ESET Smart Security 5 389.40 McAfee Total Protection 2012 659.10 0s 100 s 200 s 300 s 400 s 500 s 600 s 700 s Benchmark 4 – Registry Keys Added The following chart compares the amount of Registry Keys created during product installation for each Security product tested. Products with lower key counts are considered better performing products in this category. Webroot SecureAnywhere Essentials 2012 568.00 Webroot SecureAnywhere Complete 2012 690.00 ESET Smart Security 5 3324.00 Norton 360 v5 4472.00 Trend Maximum Security 2012 4658.00 AVG Premium Security 2012 6048.00 Average 6547.50 Kaspersky PURE 9 12008.00 McAfee Total Protection 2012 20612.00 0 5000 10000 15000 20000 25000 Performance Benchmark Page 11 of 27 29 November 2011
  • 12. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products Benchmark 5 – Boot Time The following chart compares the average time taken for the system to boot (from a sample of five boots) for each Security product tested. Products with lower boot times are considered better performing products in this category. Webroot SecureAnywhere Complete 2012 26.03 Webroot SecureAnywhere Essentials 2012 26.06 Trend Maximum Security 2012 28.78 McAfee Total Protection 2012 32.23 ESET Smart Security 5 32.60 Norton 360 v5 35.23 Average 48.99 AVG Premium Security 2012 103.96 Kaspersky PURE 9 107.00 0s 20 s 40 s 60 s 80 s 100 s 120 s Benchmark 6 – User Interface Launch Time The following chart compares the average time taken to launch a product’s user interface. Products with lower launch times are considered better performing products in this category. McAfee Total Protection 2012 46.63 ESET Smart Security 5 66.97 Webroot SecureAnywhere Complete 2012 286.90 Webroot SecureAnywhere Essentials 2012 290.77 Trend Maximum Security 2012 452.24 Average 581.91 Norton 360 v5 949.88 Kaspersky PURE 9 1247.01 AVG Premium Security 2012 1314.91 0 ms 200 ms 400 ms 600 ms 800 ms 1,000 ms 1,200 ms 1,400 ms Performance Benchmark Page 12 of 27 29 November 2011
  • 13. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products Benchmark 7 – Memory Usage during System Idle The following chart compares the average amount of RAM in use by an Security product during a period of system idle. This average is taken from a sample of ten memory snapshots taken at roughly 60 seconds apart after reboot. Products with lower idle RAM usage are considered better performing products in this category. Webroot SecureAnywhere Complete 2012 2.64 Webroot SecureAnywhere Essentials 2012 3.12 Norton 360 v5 23.04 Kaspersky PURE 9 23.74 Average 48.00 AVG Premium Security 2012 63.22 Trend Maximum Security 2012 84.62 McAfee Total Protection 2012 90.12 ESET Smart Security 5 93.49 0 MB 10 MB 20 MB 30 MB 40 MB 50 MB 60 MB 70 MB 80 MB 90 MB 100 MB Benchmark 8 – Memory Usage during Initial Scan The following chart compares the average amount of RAM in use by an Security product during an initial scan on the main drive. This average is taken from a sample of ten memory snapshots taken at five second intervals during a scan of sample files which have not been previously scanned by the software. Products that use less memory during a scan are considered better performing products in this category. Webroot SecureAnywhere Complete 2012 9.14 Webroot SecureAnywhere Essentials 2012 21.57 ESET Smart Security 5 100.43 Average 121.01 Norton 360 v5 122.25 Kaspersky PURE 9 130.78 Trend Maximum Security 2012 162.65 McAfee Total Protection 2012 184.61 AVG Premium Security 2012 236.66 0 MB 50 MB 100 MB 150 MB 200 MB 250 MB Performance Benchmark Page 13 of 27 29 November 2011
  • 14. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products Benchmark 9 – Browse Time The following chart compares the average time taken for Internet Explorer to successively load a set of popular websites through the local area network from a local server machine. Products with lower browse times are considered better performing products in this category. Webroot SecureAnywhere Complete 2012 33.22 Webroot SecureAnywhere Essentials 2012 36.60 Norton 360 v5 42.50 Trend Maximum Security 2012 43.81 Average 49.92 AVG Premium Security 2012 54.91 ESET Smart Security 5 56.18 Kaspersky PURE 9 57.71 McAfee Total Protection 2012 74.45 0s 10 s 20 s 30 s 40 s 50 s 60 s 70 s 80 s Benchmark 10 – File Copy, Move and Delete The following chart compares the average time taken to copy, move and delete several sets of sample files for each Security product tested. Products with lower times are considered better performing products in this category. Webroot SecureAnywhere Essentials 2012 24.08 Webroot SecureAnywhere Complete 2012 24.11 Norton 360 v5 24.38 Kaspersky PURE 9 25.42 Trend Maximum Security 2012 27.22 Average 27.41 ESET Smart Security 5 28.27 McAfee Total Protection 2012 30.27 AVG Premium Security 2012 35.52 0s 5s 10 s 15 s 20 s 25 s 30 s 35 s 40 s Performance Benchmark Page 14 of 27 29 November 2011
  • 15. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products Benchmark 11 – Network Throughput The following chart compares the average time to download a sample set of common file types for each Security product tested. Products with lower times are considered better performing products in this category. Webroot SecureAnywhere Essentials 2012 10.23 Webroot SecureAnywhere Complete 2012 10.25 Trend Maximum Security 2012 11.45 Norton 360 v5 14.84 McAfee Total Protection 2012 14.93 Average 15.22 ESET Smart Security 5 15.29 Kaspersky PURE 9 18.02 AVG Premium Security 2012 26.78 0s 5s 10 s 15 s 20 s 25 s 30 s Benchmark 12 – File Format Conversion The following chart compares the average time it takes for five sample files to be converted from one file format to another (MP3  WMA, MP3  WAV) for each Security product tested. Products with lower times are considered better performing products in this category. ESET Smart Security 5 75.69 McAfee Total Protection 2012 75.78 Norton 360 v5 75.90 AVG Premium Security 2012 75.96 Webroot SecureAnywhere Complete 2012 76.18 Webroot SecureAnywhere Essentials 2012 76.25 Average 76.51 Trend Maximum Security 2012 77.41 Kaspersky PURE 9 78.91 74 s 75 s 76 s 77 s 78 s 79 s 80 s Performance Benchmark Page 15 of 27 29 November 2011
  • 16. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products Benchmark 13 – File Compression and Decompression The following chart compares the average time it takes for sample files to be compressed and decompressed for each Security product tested. Products with lower times are considered better performing products in this category. Kaspersky PURE 9 74.68 Webroot SecureAnywhere Essentials 2012 78.58 Webroot SecureAnywhere Complete 2012 78.64 Norton 360 v5 79.95 Average 80.46 Trend Maximum Security 2012 82.16 McAfee Total Protection 2012 82.52 AVG Premium Security 2012 83.18 ESET Smart Security 5 83.94 70 s 72 s 74 s 76 s 78 s 80 s 82 s 84 s 86 s Benchmark 14 – File Write, Open and Close The following chart compares the average time it takes for a file to be written to the hard drive then opened and closed 180,000 times, for each Security product tested. Products with lower times are considered better performing products in this category. Webroot SecureAnywhere Complete 2012 15.80 Webroot SecureAnywhere Essentials 2012 15.98 AVG Premium Security 2012 17.85 Kaspersky PURE 9 28.68 Norton 360 v5 29.39 Average 52.11 McAfee Total Protection 2012 65.08 Trend Maximum Security 2012 67.81 ESET Smart Security 5 176.28 0s 20 s 40 s 60 s 80 s 100 s 120 s 140 s 160 s 180 s 200 s Performance Benchmark Page 16 of 27 29 November 2011
  • 17. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products Disclaimer and Disclosure This report only covers versions of products that were available at the time of testing. The tested versions are as noted in the “Products and Versions” section of this report. The products we have tested are not an exhaustive list of all products available in these very competitive product categories. Disclaimer of Liability While every effort has been made to ensure that the information presented in this report is accurate, PassMark Software Pty Ltd assumes no responsibility for errors, omissions, or out-of-date information and shall not be liable in any manner whatsoever for direct, indirect, incidental, consequential, or punitive damages resulting from the availability of, use of, access of, or inability to use this information. Disclosure Webroot Software Inc. funded the production of this report. The list of products tested and the metrics included in the report were selected by Webroot. Trademarks All trademarks are the property of their respective owners. Contact Details PassMark Software Pty Ltd Suite 202, Level 2 35 Buckingham St. Surry Hills, 2010 Sydney, Australia Phone + 61 (2) 9690 0444 Fax + 61 (2) 9690 0445 Web www.passmark.com Performance Benchmark Page 17 of 27 29 November 2011
  • 18. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products Appendix 1 – Test Environment For our testing, PassMark Software used a test environment running Windows 7 Ultimate (64-bit) SP1 with the following hardware specifications: Windows 7 (64-bit) System CPU: Intel Core i7 920 Quad Core @ 2.67GHz Video Card: nVidia GeForce 8800 GT Motherboard: Intel x58 Motherboard RAM: 6GB DDR3 RAM HDD: Western Digital 500GB 7200RPM Network: Gigabit (1GB/s) switch Performance Benchmark Page 18 of 27 29 November 2011
  • 19. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products Appendix 2 – Methodology Description Windows 7 Image Creation As with testing on Windows Vista, Norton Ghost was used to create a “clean” baseline image prior to testing. Our aim is to create a baseline image with the smallest possible footprint and reduce the possibility of variation caused by external operating system factors. The baseline image was restored prior to testing of each different product. This process ensures that we install and test all products on the same, “clean” machine. The steps taken to create the base Windows 7 image are as follows: 1. Installation and activation of Windows 7 Ultimate Edition. 2. Disabled Automatic Updates. 3. Changed User Account Control settings to “Never Notify”. 4. Disable Windows Defender automatic scans to avoid unexpected background activity. 5. Disable the Windows firewall to avoid interference with security software. 6. Installed Norton Ghost for imaging purposes. 7. Disabled Superfetch to ensure consistent results. 8. Installed HTTP Watch for Browse Time testing. 9. Installed Windows Performance Toolkit x64 for Boot Time testing. 10. Installed Active Perl for interpretation of some test scripts. 11. Install OSForensics for testing (Installation Size and Registry Key Count tests) purposes. 12. Disabled updates, accelerators and compatibility view updates in Internet Explorer 8. 13. Update to Windows Service Pack 1 14. Created a baseline image using Norton Ghost. Benchmark 1 – Initial Scan Time Scan Time is the time it took for each product to run an initial scan on a set of sample files. The sample used was identical in all cases and contained a mixture of system files and Office files. In total there were 6159 files whose combined size was 982 MB. Most of these files come from the Windows system folders. As the file types can influence scanning speed, the breakdown of the main file types, file numbers and total sizes of the files in the sample set is given here: File Extension Number of Files File Size .dll 2589 490MB .exe 695 102MB .sys 332 23MB .gif 302 1MB .doc 281 64MB .wmf 185 2MB .png 149 2MB .html 126 1MB .nls 80 6MB .jpg 70 1MB .ini 59 2MB .ico 58 <1MB Performance Benchmark Page 19 of 27 29 November 2011
  • 20. Webroot SecureAnywhere Cloud vs. Six Traditional Security Products .mof 43 6MB .ax 39 4MB .xls 38 3MB .ime 35 5MB .drv 31 1MB .txt 31 1MB .chm 30 6MB .cpl 29 4MB .mfl 29 3MB .inf 26 2MB .hlp 22 3MB .imd 20 18MB .py 20 <1MB .msc 18 1MB .vbs 18 1MB .xml 18 1MB .rtf 16 62MB .ocx 16 4MB .tsp 14 1MB .com 14 <1MB .xsl 14 <1MB .h 13 <1MB .vsd 12 2MB .scr 12 2MB .aw 12 2MB .js 12 1MB .zip 11 25MB .lex 9 10MB .ppt 9 4MB .acm 9 1MB .wav 7 5MB Total 6159 982 This scan was run without launching the product’s user interface, by right-clicking the test folder and choosing the “Scan Now” option. To record the scan time, we have used product’s built-in scan timer or reporting system. Where this was not possible, scan times were taken manually with a stopwatch. Benchmark 2 – Installation Size A product's Installation Size was previously defined as the difference between the initial snapshot of the Disk Space (C: drive) before installation and the subsequent snapshot taken after the product is installed on the system. Although this is a widely used methodology, we noticed that the results it yielded were not always reproducible in Vista due to random OS operations that may take place between the two snapshots. We improved the Installation Size methodology by removing as many Operating System and disk space variables as possible. Using PassMark’s OSForensics we created initial and post-installation disk signatures for each product. These disk signatures recorded the amount of files and directories, and complete details of all files on that drive (including file name, file size, checksum, etc) at the time the signature was taken. The initial disk signature was taken immediately prior to installation of the product. A subsequent disk signature was taken immediately following a system reboot after product installation. Using OSForensics, we compared the two signatures and calculated the total disk space consumed by files that were new, modified, and deleted during product installation. Our result for this metric reflects the total size of all newly added files during installation. Performance Benchmark Page 20 of 27 29 November 2011
The scope of this metric includes only the ‘out of the box’ installation size of each product. Our result does not cover the size of files downloaded by the product after its installation (such as engine or signature updates), or any files created by system restore points, prefetch files and other temporary files.

Benchmark 3 – Registry Key Count

This test measures the number of keys and values added to the registry after rebooting the test machine following a successful product installation. The test was conducted using RegistryCounter.exe, an application which counts all keys, values and errors under HKEY_LOCAL_MACHINE and HKEY_USERS. Two registry key counts were taken, one prior to installation and a second immediately following a reboot after installation. To obtain our result, we calculated the difference between these two registry key totals.

Occasionally, an unusually large number of registry keys was removed by the system. This seemed to occur when a product was installed, but was not necessarily caused directly by the product, and it resulted in a large negative difference between the before and after registry key counts obtained with RegistryCounter.exe. To overcome this problem, we used PassMark’s OSForensics to calculate and obtain a separate breakdown of the new, modified and deleted keys. Where the RegistryCounter.exe method yielded a negative value, the total of new keys reported by OSForensics was used as the result for this test instead.

Benchmark 4 – Boot Time

PassMark Software used tools from the Windows Performance Toolkit version 4.6 (part of the Microsoft Windows 7 SDK, obtainable from the Microsoft website) with a view to obtaining more precise and consistent boot time results on the Windows 7 platform.

The boot process is first optimized with xbootmgr.exe using the command “xbootmgr.exe -trace boot -prepSystem”, which prepares the system for the test over six optimization boots. The boot traces obtained from the optimization process are discarded.

After boot optimization, the benchmark is conducted using the command “xbootmgr.exe -trace boot -numruns 5”. This command boots the system five times in succession, taking detailed boot traces for each boot cycle.

Finally, a post-processing tool was used to parse the boot traces and obtain the BootTimeViaPostBoot value. This value reflects the amount of time it takes the system to complete all (and only) boot time processes. Our final result is an average of the five boot traces.

Benchmark 5 – User Interface Launch Time

The launch time of each product’s user interface was measured with AppTimer (v1.0.1006). For each product tested, we obtained a total of fifteen samples from five sets of three UI launches, with a reboot before each set to clear operating system caching effects. When compiling the results, the first launch of each set was separated out, giving one set of values for the initial launch after reboot and another for subsequent launches. We averaged the subsequent launch times to obtain an average subsequent launch time. Our final result for this test is the average of the subsequent launch average and the initial launch time.
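To make this averaging scheme concrete, here is a short Python sketch with hypothetical sample values; it separates the initial launch of each set from the subsequent launches and combines the two averages as described above:

    # Five sets of three UI launch times in seconds; the first value in
    # each set is the initial launch after a reboot. Values are hypothetical.
    sets = [
        [1.92, 0.41, 0.39],
        [1.85, 0.44, 0.40],
        [2.01, 0.38, 0.42],
        [1.88, 0.40, 0.41],
        [1.95, 0.43, 0.39],
    ]

    initial = [s[0] for s in sets]
    subsequent = [t for s in sets for t in s[1:]]

    initial_avg = sum(initial) / len(initial)
    subsequent_avg = sum(subsequent) / len(subsequent)

    # Final result: the average of the initial launch average and the
    # subsequent launch average.
    print(f"UI launch time: {(initial_avg + subsequent_avg) / 2:.2f} s")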
In some cases, AppTimer did not correctly record the time taken for a UI launch. For instance, some applications would open their window and appear ready, but then continue to be unresponsive. Where the measurement from AppTimer appeared inaccurate, we took the time manually with a stopwatch. AppTimer is publicly available from the PassMark website.

Benchmark 6 – Memory Usage during System Idle

The PerfLog++ utility was used to record process memory usage on the system at boot, and then once per minute for the following fifteen minutes. This was done once per product; the first sample, taken at boot, was discarded, leaving fifteen samples per product.

The PerfLog++ utility records the memory usage of all processes, not just those of the anti-malware product. As a result, the anti-malware product’s processes needed to be isolated from all other running system processes. To isolate the relevant processes, we used Process Explorer, run immediately upon the completion of memory usage logging by PerfLog++. Process Explorer is a Microsoft Windows Sysinternals tool which lists all processes currently running on the system, along with the DLLs each process has loaded.

Benchmark 7 – Memory Usage during Initial Scan

The PerfLog++ utility was used to record memory usage on the system while a malware scan was in progress. Please refer to the metric “Memory Usage during System Idle” above for a description of the PerfLog++ utility and an explanation of the method by which memory usage is calculated.

As some products cache scan locations, we took reasonable precautions to ensure that the security software did not scan the C: drive at any point before this test was conducted. A manual scan of the C: drive was initiated at the same time as the PerfLog++ utility, enabling PerfLog++ to record memory usage for 60 seconds at five-second intervals.
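PerfLog++ is PassMark’s own logging utility, and Process Explorer handles the isolation step. As a rough stand-in for that combined sampling-and-isolation process, a Python sketch using the third-party psutil package (the product process names shown are hypothetical) might look like this:

    import time
    import psutil

    # Hypothetical process names belonging to the product under test;
    # in practice these are identified with Process Explorer.
    PRODUCT_PROCESSES = {"ProductService.exe", "ProductTray.exe"}

    samples = []
    for _ in range(12):  # 60 seconds at five-second intervals
        total = 0
        for proc in psutil.process_iter(["name", "memory_info"]):
            if proc.info["name"] in PRODUCT_PROCESSES:
                total += proc.info["memory_info"].rss  # working set, bytes
        samples.append(total)
        time.sleep(5)

    print(f"average product memory usage: "
          f"{sum(samples) / len(samples) / 2**20:.1f} MB")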
Benchmark 8 – Browse Time

We used a script in conjunction with HTTPWatch (Basic Edition, version 6.1) to record the amount of time it takes for a set of 106 ‘popular’ websites to load consecutively from a local server. The script feeds a list of URLs into HTTPWatch, which instructs the browser to load the pages in sequence and monitors the amount of time it takes the browser to load all items on each page. For this test, we used Internet Explorer 8 (version 8.0.6001.18783) as our browser.

The set of websites used in this test comprises the front pages of high-traffic shopping, social, news, finance and reference websites. The Browse Time test was executed five times and our final result is the average of these five samples. The local server was restarted between products, and one initial ‘test’ run was conducted prior to testing to install Adobe Flash Player, an add-on used by many popular websites.

Benchmark 9 – Installation Time

This test measures the minimum Installation Time a product requires to be fully functional and ready for use by the end user. Installation time can usually be divided into three major phases:

• The Extraction and Setup phase consists of file extraction, the EULA prompt, product activation and user-configurable installation options.
• The File Copy phase occurs when the product is being installed; this phase is usually indicated by a progress bar.
• The Post-Installation phase is any part of the installation that occurs after the File Copy phase. This phase varies widely between products; the time recorded in this phase may include a required reboot to finalize the installation, or the time the program takes to become idle in the system tray.

To reduce the impact of disk drive variables, each product was copied to the Desktop before installation was started. Each step of the installation process was manually timed with a stopwatch and recorded in as much detail as possible. Where input was required by the end user, the stopwatch was paused and the input noted in the raw results in parentheses after the phase description.

Where possible, all requests by products to run a pre-install or post-install scan were declined or skipped. Where it was not possible to skip a scan, the time to scan was included as part of the installation time. Where an optional component of the installation formed a reasonable part of the functionality of the software, it was also installed (e.g. website link checking software as part of a security product).

Installation time includes the time taken by the product installer to download components required in the installation. This may include mandatory updates or the delivery of the application itself from a download manager. We have noted in our results where a product downloaded components during installation. We excluded product activation times, owing to network variability in contacting vendor servers and the time taken by account creation.

Benchmarks 10-14 – Real-Time Performance

We used a single script to test Benchmarks 10-14. The script executes the tests for these benchmarks consecutively, timing each phase with CommandTimer.exe and appending the results to a log file.

Benchmark 10 – File Copy, Move and Delete

This test measures the amount of time required for the system to copy, move and delete samples of files in various file formats. The sample was made up of 812 files totalling 760,867,636 bytes, and can be categorized as documents [26% of total], media files [54% of total] and PE files (i.e. system files) [20% of total]. The breakdown of the main file types, file counts and total sizes in the sample set is shown in the following table:
File Format   Number   Size (bytes)
DOC           8        30,450,176
DOCX          4        13,522,409
PPT           3        5,769,216
PPTX          3        4,146,421
XLS           4        2,660,352
XLSX          4        1,426,054
PDF           73       136,298,049
ZIP           4        6,295,987
7Z            1        92,238
JPG           351      31,375,259
GIF           6        148,182
MOV           7        57,360,371
RM            1        5,658,646
AVI           8        78,703,408
WMV           5        46,126,167
MP3           28       191,580,387
EXE           19       2,952,914
DLL           104      29,261,568
AX            1        18,432
CPL           2        2,109,440
CPX           2        4,384
DRV           10       154,864
ICO           1        107,620
MSC           1        41,587
NT            1        1,688
ROM           2        36,611
SCR           2        2,250,240
SYS           1        37,528,093
TLB           3        135,580
TSK           1        1,152
UCE           1        22,984
EXE           19       2,952,914
DLL           104      29,261,568
AX            1        18,432
CPL           2        2,109,440
CPX           2        4,384
DRV           10       154,864
ICO           1        107,620
MSC           1        41,587
NT            1        1,688
ROM           2        36,611
SCR           2        2,250,240
SYS           1        37,528,093
TLB           3        135,580
TSK           1        1,152
UCE           1        22,984
Total         812      760,867,636

Note that the PE file types are listed twice; the file and byte totals above include both listings.
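CommandTimer.exe is PassMark’s in-house timing wrapper. A simplified Python sketch of the timed copy, move and delete phases (the source and destination paths shown are hypothetical) could look like this:

    import shutil
    import time
    from pathlib import Path

    SRC = Path(r"C:\TestFiles")   # hypothetical sample-set location
    DST = Path(r"D:\CopyTarget")  # hypothetical destination

    def timed(label, action):
        """Run an action and print how long it took."""
        start = time.perf_counter()
        action()
        print(f"{label}: {time.perf_counter() - start:.2f} s")

    # Copy, then move, then delete the sample set, timing each phase.
    timed("copy",   lambda: shutil.copytree(SRC, DST))
    timed("move",   lambda: shutil.move(str(DST), str(DST) + "_moved"))
    timed("delete", lambda: shutil.rmtree(str(DST) + "_moved"))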
This test was conducted five times to obtain the average time to copy, move and delete the sample files, with the test machine rebooted between samples to remove potential caching effects.

Benchmark 11 – Network Throughput

This benchmark measured how much time was required to download a sample set of binary files of various sizes and types over a 100MB/s network connection. The files were hosted on a server machine running Windows Server 2008 and IIS 7. CommandTimer.exe was used in conjunction with GNU Wget (version 1.10.1) to conduct and time the download test.

The complete sample set comprised 553,638,694 bytes across 484 files in two file type categories: media files [74% of total] and documents [26% of total]. The breakdown of the file types, file counts and total sizes in the sample set is shown in the following table:

File Format   Number   Size (bytes)
JPEG          343      30,668,312
GIF           9        360,349
PNG           5        494,780
MOV           7        57,360,371
RM            1        5,658,646
AVI           8        78,703,408
WMV           5        46,126,167
MP3           28       191,580,387
PDF           73       136,298,049
ZIP           4        6,295,987
7Z            1        92,238
Total         484      553,638,694

This test was conducted five times to obtain the average time to download this sample of files, with the test machine rebooted between samples to remove potential caching effects.
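As an illustration of this timing harness, the Python sketch below shells out to wget and times the full download; the server address and file list are hypothetical, and wget is assumed to be on the PATH:

    import subprocess
    import time

    # Hypothetical list of files hosted on the local IIS server.
    urls = [f"http://192.168.0.10/samples/file{i:03d}.bin" for i in range(484)]

    start = time.perf_counter()
    for url in urls:
        # -q: quiet; -O: write each download to a scratch file
        subprocess.run(["wget", "-q", "-O", "download.tmp", url], check=True)
    elapsed = time.perf_counter() - start
    print(f"downloaded {len(urls)} files in {elapsed:.1f} s")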
Benchmark 12 – File Format Conversion (MP3 to WAV, MP3 to WMA)

This test measured how much time was required to convert five different MP3 files into WAV files and, subsequently, to convert the same MP3 samples into WMA files. The total size of the five MP3s used was 25,870,899 bytes.

To encode the MP3s into the other formats, we used an application called ffmpeg.exe. The format conversion process was timed using CommandTimer.exe. This test was conducted five times to obtain the average conversion speed between these formats, with the test machine rebooted between samples to remove potential caching effects.

Benchmark 13 – File Compression and Decompression

This test measured the amount of time required to compress and decompress a sample set of files. For this test, we used a subset of the media and document files from the File Copy, Move and Delete benchmark. CommandTimer.exe recorded the amount of time required for 7zip.exe to compress the files into a *.zip archive and subsequently decompress the created *.zip file. This subset comprised 1,218 files totalling 783 MB. The breakdown of the file types, file counts and total sizes in the sample set is shown in the following table:

File Type   File Number   Total Size
.xls        13            9.23 MB
.xlsx       9             3.51 MB
.ppt        9             7.37 MB
.pptx       11            17.4 MB
.doc        17            35.9 MB
.docx       19            24.5 MB
.gif        177           1.10 MB
.jpg        737           66.2 MB
.png        159           48.9 MB
.mov        7             54.7 MB
.rm         1             5.39 MB
.avi        46            459 MB
.wma        11            48.6 MB
Total       1218          783 MB

This test was conducted five times to obtain the average file compression and decompression speed, with the test machine rebooted between samples to remove potential caching effects.

Benchmark 14 – File Write, Open and Close

This benchmark was derived from Oli Warner’s File I/O test at http://www.thepcspy.com (please see Reference #1: What Really Slows Windows Down). For this test, we developed OpenClose.exe, an application that loops writing a small file to disk, then opening and closing that file. CommandTimer.exe was used to time how long the process took to complete 180,000 cycles.
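OpenClose.exe is a purpose-built PassMark utility. A minimal Python rendition of the same write, open and close loop (with an arbitrary small payload) is sketched below:

    import time

    CYCLES = 180_000
    start = time.perf_counter()
    for _ in range(CYCLES):
        # Write a small file, then open and close it again; real-time
        # file scanners typically intercept each of these operations.
        with open("testfile.tmp", "w") as f:
            f.write("x" * 32)  # small, arbitrary payload
        with open("testfile.tmp", "r") as f:
            pass               # open and immediately close
    print(f"{CYCLES} cycles in {time.perf_counter() - start:.1f} s")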
This test was conducted five times to obtain the average file write, open and close speed, with the test machine rebooted between samples to remove potential caching effects.