Windows 8 Malware Protection Test
Report
A test commissioned by Kaspersky Lab and performed by AV-Test GmbH

Date of the report: January 11th, 2013; last update: January 11th, 2013



Executive Summary
In December 2012, AV-Test performed a comparative review of Kaspersky Internet Security and the
built-in Windows 8 security components to determine their capabilities to protect against malware.
Four individual tests were performed: a real-world protection test of 42 malicious URLs and
E-Mails, a static detection test of 111,487 malicious files, a further static detection test of
2,500 prevalent files, and a static false-positive test with 345,900 clean samples. To perform the
test runs, a clean Windows 8 image was used on several identical PCs. On this image, the security
software was installed (or, for the plain Windows 8 configuration, all security-related components
were left in their default state) and the individual tests were then carried out. In the dynamic
test, the URLs were accessed, the downloaded samples were executed, and any detection by the
security software was noted. Additionally, the resulting state of the system was compared with the
original state before the test in order to determine whether the attack was successfully blocked.
In the static detection tests, the products had to scan two sets of files with default settings,
and detections were noted to determine the detection result.

Kaspersky provided near-perfect results, blocking all 42 URL and E-Mail attacks, while Microsoft
failed to block 5 of them. There were also dramatic differences in the static detection of malware:
Kaspersky detected over 99% of the tested samples, while Microsoft only managed to detect about
90%. Regarding static detection of prevalent malware and false positives, there were no problems
for either product: both detected all 2,500 prevalent malware samples, and neither created a
false-positive detection in the set of clean files. The results indicate that Windows Defender and
the other security features in Windows 8 do offer baseline protection. However, this is not enough
to reliably protect against the magnitude of attacks targeting Windows systems these days. A
commercial solution, such as the tested Kaspersky Internet Security, provides far better
protection.




Overview
With the increasing number of threats being released and spread through the Internet these days,
the danger of getting infected is increasing as well. A few years back, new viruses were released
every few days. This has grown to several thousand new threats per hour.




                                   Figure 1: New samples added per year

In early 2013, the AV-TEST collection of malware exceeded 100,000,000 unique samples, and the
growth rate keeps accelerating. In the year 2000, AV-Test received more than 170,000 new samples;
in 2010 and 2011, the number of new samples grew to nearly 20,000,000 each year. The numbers
continued to grow through 2012 and 2013.

Microsoft has reacted to these threats, and each version of Windows released in recent years has
been more secure than its predecessor. Windows 8 now even includes a built-in malware scanner
(Windows Defender) and a URL-blocking component (the SmartScreen filter). This report assesses
whether these features are enough to protect users from today's threats.


Products Tested
The testing occurred in December 2012. AV-Test used the latest releases of the following products
available at the time of the test:

       • Microsoft Windows 8 Pro with built-in Windows Defender 4
       • Kaspersky Internet Security 2013




Methodology and Scoring
Platform
All tests have been performed on identical PCs equipped with the following hardware:

       • Intel Xeon Quad-Core X3360 CPU
       • 4 GB RAM
       • 500 GB HDD (Western Digital)
       • Intel Pro/1000 PL (Gigabit Ethernet) NIC

The operating system was Windows 8 Pro with only those patches that were available on December
1st, 2012.

Testing methodology
General

    1. Clean system for each sample. The test systems should be restored to a clean state before
       being exposed to each malware sample.
    2. Physical Machines. The test systems used should be actual physical machines. No Virtual
       Machines should be used.
    3. Product Cloud/Internet Connection. The Internet should be available to all tested products
       that use the cloud as part of their protection strategy.
    4. Product Configuration. All products were run with their default, out-of-the-box
       configuration.
    5. Sample Cloud/Internet Accessibility. If the malware uses the cloud/Internet connection to
       reach other sites in order to download other files and infect the system, care should be
       taken to ensure that cloud access is available to the malware sample in a safe way, such
       that the testing network is not at risk of infection.
    6. Allow time for sample to run. Each sample should be allowed to run on the target system for
       10 minutes to exhibit autonomous malicious behavior. This may include initiating
       connections to systems on the internet, or installing itself to survive a reboot (as may be the
       case with certain key-logging Trojans that only activate fully when the victim is performing a
       certain task).
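The six conditions above amount to a precondition checklist that every test run must satisfy before a sample is executed. As a hedged illustration (the actual AV-TEST harness is not public; all names and the dictionary encoding below are invented for this sketch):

```python
# Hypothetical encoding of the six general test conditions as a checklist.
# The real AV-TEST harness is not public; names and structure are invented.
GENERAL_RULES = {
    "clean_image_restored": True,    # 1. clean system for each sample
    "physical_machine": True,        # 2. no virtual machines
    "product_cloud_access": True,    # 3. product may reach its cloud
    "default_configuration": True,   # 4. out-of-the-box settings
    "sample_internet_safe": True,    # 5. safe internet access for the sample
    "runtime_minutes": 10,           # 6. time allowed for autonomous behavior
}

def run_allowed(config):
    """A run is valid only if every boolean condition holds and the sample
    is granted the full 10 minutes of runtime."""
    booleans_ok = all(v for v in config.values() if isinstance(v, bool))
    return booleans_ok and config["runtime_minutes"] >= 10
```

For example, a configuration that uses a virtual machine or cuts the runtime short would fail this check and the run would be discarded.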

The procedures below are carried out on all tested programs and on all test cases at the same time,
in order to ensure that all protection programs face exactly the same test conditions. If a test
case no longer works, or its behavior varies between protection programs (which can be clearly
determined using the Sunshine analyses), the test case is deleted. This ensures that all products
are tested in exactly the same test scenarios. All test cases are obtained solely from internal
AV-TEST sources and are always fully analysed by AV-TEST; test cases or analyses provided by
manufacturers or other external sources are never used.

Real-World Test

    1. The products are installed, updated and started up using standard/default settings. The
       protection program has complete Internet access at all times.
    2. AV-TEST uses the analysis program Sunshine, which it developed itself, to produce a map of
       the non-infected system.



    3. It then attempts to access the URL or receive the E-Mail.
    4. If access is blocked, this is documented.
    5. If access is not blocked, the malicious sample is downloaded from the URL or the malicious
       attachment is stored to disk.
    6. If this is blocked, this is documented.
   7. If this is not blocked, the malicious file will be executed.
   8. If execution of the sample is blocked with static or dynamic detection mechanisms by the
       program, this is documented.
   9. If execution is not blocked, an on-demand scan (Full computer scan) will be carried out. Any
       detections will be noted and any removal actions will be allowed.
    10. Given that the detection of malicious components or actions is not always synonymous with
        successful blockage, Sunshine constantly monitors all actions on the computer in order to
        determine whether the attack was completely blocked, partially blocked, or not blocked at all.
   11. A result for the test case is then determined based on the documented detection according
       to the protection program and the actions on the system recorded by Sunshine.
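Steps 3–11 amount to a fall-through decision procedure: the earliest successful block ends the case, and otherwise the Sunshine system comparison decides the final verdict. A minimal sketch of that logic (the boolean inputs and the function itself are our simplification, not AV-TEST code):

```python
def real_world_verdict(url_blocked, download_blocked, exec_blocked,
                       on_demand_detected, system_modified):
    """Simplified reconstruction of steps 3-11 of the real-world test.

    Returns the verdict for one test case. All parameters are invented
    booleans standing in for the documented observations.
    """
    # Steps 4, 6, 8: any block before or at execution stops the attack.
    if url_blocked or download_blocked or exec_blocked:
        return "blocked"
    # Steps 9-10: an on-demand detection only counts as a full block if
    # removal restored the system, as judged by the Sunshine comparison.
    if on_demand_detected and not system_modified:
        return "blocked"
    if on_demand_detected and system_modified:
        return "partially blocked"
    return "not blocked"
```

Under this reading, Microsoft's 5 failed cases correspond to inputs where no layer blocked the sample and the system state was modified.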

Static Scanning Test

    1. The test ran in the timeframe from December 3rd to December 10th, 2012.
    2. A first scan over the whole collection was carried out and all non-detected files were
       identified.
    3. Finally, a rescan of all non-detected samples was performed on December 10th, 2012.
    4. The combined result of these scans was then noted.
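The scan/rescan combination can be expressed as simple set arithmetic: a sample counts as detected if either pass flagged it. A sketch under that assumption (function and variable names are ours, not AV-TEST's):

```python
def combined_result(collection, first_scan_hits, rescan_hits):
    """Combine the initial scan with the rescan of previously missed files.

    A sample is counted as detected if either pass flagged it; only the
    misses from the first pass were rescanned.
    """
    missed_after_first = set(collection) - set(first_scan_hits)
    # Anything the rescan flags moves back into the detected set.
    still_missed = missed_after_first - set(rescan_hits)
    detected = set(collection) - still_missed
    return detected, still_missed
```

For example, `combined_result({"a", "b", "c", "d"}, {"a", "b"}, {"c"})` reports three detections and one final miss.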

The malware set for the real-world test contained 42 samples (39 URLs, 3 E-Mails), collected
between December 11th and December 18th, 2012. The malware set for the static scanning test
contained 2,500 prevalent samples and 111,487 further samples, collected between November 1st and
November 30th, 2012.




Test Results

The results of the first test (Figure 2) show that Kaspersky was able to protect against all 42
real-world threats used in the test, while Microsoft failed to block 5 of them, which would have
resulted in an infection of the system.




                                      Figure 2: Real-World Protection

As most infections these days take place via malicious websites or malicious e-mail attachments,
it is especially important to provide a solid level of protection in this area.




Figure 3: Static Malware Detection

The next test shows the capability of statically detecting malware. Two different test sets have
been used to determine the results of the two products. One set consisted of highly prevalent
malware that should be detected by all products, and, as can be seen in Figure 3, both tested
products detected all samples of this set. The other test set, however, revealed differences
between the products. This set contains malware that may not be as widespread but is still a risk
to the user. Microsoft detected just under 90% of the 111,487 samples (11,228 missed samples),
while Kaspersky stopped over 99% of them (1,031 missed samples).
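The percentages follow directly from the missed-sample counts reported above; a quick check of the arithmetic:

```python
def detection_rate(total, missed):
    """Share of samples detected, in percent."""
    return 100.0 * (total - missed) / total

# Figures from Figure 3 and the paragraph above
microsoft = detection_rate(111487, 11228)  # ~89.9 %
kaspersky = detection_rate(111487, 1031)   # ~99.1 %
```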

While static detection is only one of a set of protection features in modern security software, it
is still a very important one, and many products rely mainly on this technique to detect malware.
As such, these results indicate that the built-in Windows Defender in Windows 8 is not good enough
to reliably protect the user.

The final test was a test for false positives. 345,900 files collected from clean applications
were scanned with both products to see whether any of them would be wrongly flagged as malware.
Not a single false positive occurred with either product. This shows that even a product with a
very good detection and protection rate (Kaspersky) can operate without any false positives.




Appendix
Version information of the tested software
Developer/Distributor   Product name                          Program version                     Engine/signature version
Kaspersky Lab           Kaspersky Internet Security 2013      13.0.1.4190 (c)                     16.5.2.1
Microsoft               Windows 8 Pro with Windows Defender   4.0.9200.16384 (Windows Defender)   1.1.9002.0 / 1.141.952.0 (Windows Defender)

List of used URLs, Mails and Samples
URLs
http://www.fauziyahospital.com/mego/OMG.exe
http://204.15.124.183/bX2XdF.exe
http://shaiyaner.de/downloads/svchost.exe
http://restterem.ru/media/admin26.exe
http://troykaakademi.net/media/firsale.exe
http://citrus.doolr.com/appr1.exe
http://kamalbose.com/uY3VZr.exe
http://ballyhooindia.com/brlaikza/9qf.exe
http://cateringumbria.eu/ox1.exe
http://1337.kz/vsocks-1.3.0.0.exe
http://tkprinter.com/Expl0it/forjd/warp.exe
http://89.248.166.21/prx.exe
http://74.63.196.69/bee/panel/panel/uploads/bzeusnet.exe
http://211.147.15.3/dls/qqtodta.exe
http://weebly.com/uploads/1/2/1/1/12113165/dvxplayer.scr
http://update.qmxsoft.com/1.3.exe
http://www.yerlipornoizle.org/porno-video.exe
http://mytightride.info/missing.exe
http://solarpanelinstallers.org.uk/media/firsale.exe
http://www.genckizlarr.com/videolar/super_sikis_videosu.exe
http://www.stellardayproducts.com/expensekeeper/OperaPMupdate/OperaPMupdate.exe
http://ge.tt/api/1/files/9dMi6OQ/0/blob
http://s3-sa-east-1.amazonaws.com/vonoloa/Contrato_61277.pdf.exe
http://37.59.66.1/Server.exe
http://nause.com/LjRjU.exe
http://ds.searchstar.co.kr/filepop/bacon.exe
http://hack3d.3dom.fr/shouky/rcv.exe
http://beyondcreative.com.au/flashupdate.exe
http://206.253.165.71/phpmyadmin/5566.exe
http://chat.ru/~dhoffmanfan/syscfg.exe
http://demo.ovh.net/download/c356f845248362888ed36ead5a5280ae/c99199151.exe
http://videoiindir.weebly.com/uploads/1/2/7/6/12763800/video2.exe
http://tanvirassociates.pk/ENTEL/formulario.exe
http://baniganm.net/uploads/files/baniganm-a8a31351d0.exe
http://dyneema.co.uk/components/com_ag_google_analytics2/dir/Core777.exe
http://empuriaonline24.com/iJZWa.exe
http://file.findlock.co.kr/ndn/r.exe
http://u.websuprt.co.kr/NewSidebar/Angel/AngelSupporter.exe
http://carroceriasfejoben.com/components/com_ag_google_analytics2/dir/p.exe
Mails
Subject: “aussage”
Subject: “uberweisung”
Subject: “Incoming Wire Notification”
Static Detection
The list of samples is not given here because of its size, but it is available on request.




Copyright © 2013 by AV-Test GmbH, Klewitzstr. 7, 39112 Magdeburg, Germany
Phone +49 (0) 391 60754-60, Fax +49 (0) 391 60754-69, Web http://www.av-test.org




                                                                                                                                    7

Weitere ähnliche Inhalte

Was ist angesagt?

Cyber Defense Forensic Analyst - Real World Hands-on Examples
Cyber Defense Forensic Analyst - Real World Hands-on ExamplesCyber Defense Forensic Analyst - Real World Hands-on Examples
Cyber Defense Forensic Analyst - Real World Hands-on ExamplesSandeep Kumar Seeram
 
White Paper - Are antivirus solutions enough to protect industrial plants?
White Paper - Are antivirus solutions enough to protect industrial plants?White Paper - Are antivirus solutions enough to protect industrial plants?
White Paper - Are antivirus solutions enough to protect industrial plants?TI Safe
 
APPBACS: AN APPLICATION BEHAVIOR ANALYSIS AND CLASSIFICATION SYSTEM
APPBACS: AN APPLICATION BEHAVIOR ANALYSIS AND CLASSIFICATION SYSTEMAPPBACS: AN APPLICATION BEHAVIOR ANALYSIS AND CLASSIFICATION SYSTEM
APPBACS: AN APPLICATION BEHAVIOR ANALYSIS AND CLASSIFICATION SYSTEMijcsit
 
Automatically generated win32 heuristic
Automatically generated win32 heuristicAutomatically generated win32 heuristic
Automatically generated win32 heuristicpoojamancharkar
 
slides
slidesslides
slidesbutest
 
Behavioral and performance analysis model for malware detection techniques
Behavioral and performance analysis model for malware detection techniquesBehavioral and performance analysis model for malware detection techniques
Behavioral and performance analysis model for malware detection techniquesIAEME Publication
 
Application of data mining based malicious code detection techniques for dete...
Application of data mining based malicious code detection techniques for dete...Application of data mining based malicious code detection techniques for dete...
Application of data mining based malicious code detection techniques for dete...UltraUploader
 
Final Project _Smart Utilities
Final Project _Smart UtilitiesFinal Project _Smart Utilities
Final Project _Smart UtilitiesPasan Alagiyawanna
 
Accuracy and time_costs_of_web_app_scanners
Accuracy and time_costs_of_web_app_scannersAccuracy and time_costs_of_web_app_scanners
Accuracy and time_costs_of_web_app_scannersLarry Suto
 
Report on forensics tools
Report on forensics toolsReport on forensics tools
Report on forensics toolsVishnuPratap7
 
Understand How Machine Learning Defends Against Zero-Day Threats
Understand How Machine Learning Defends Against Zero-Day ThreatsUnderstand How Machine Learning Defends Against Zero-Day Threats
Understand How Machine Learning Defends Against Zero-Day ThreatsRahul Mohandas
 
IRJET- A Study on Penetration Testing using Metasploit Framework
IRJET- A Study on Penetration Testing using Metasploit FrameworkIRJET- A Study on Penetration Testing using Metasploit Framework
IRJET- A Study on Penetration Testing using Metasploit FrameworkIRJET Journal
 

Was ist angesagt? (17)

Cyber Defense Forensic Analyst - Real World Hands-on Examples
Cyber Defense Forensic Analyst - Real World Hands-on ExamplesCyber Defense Forensic Analyst - Real World Hands-on Examples
Cyber Defense Forensic Analyst - Real World Hands-on Examples
 
Avc fd mar2012_intl_en
Avc fd mar2012_intl_enAvc fd mar2012_intl_en
Avc fd mar2012_intl_en
 
White Paper - Are antivirus solutions enough to protect industrial plants?
White Paper - Are antivirus solutions enough to protect industrial plants?White Paper - Are antivirus solutions enough to protect industrial plants?
White Paper - Are antivirus solutions enough to protect industrial plants?
 
Avc beh 201207_en
Avc beh 201207_enAvc beh 201207_en
Avc beh 201207_en
 
APPBACS: AN APPLICATION BEHAVIOR ANALYSIS AND CLASSIFICATION SYSTEM
APPBACS: AN APPLICATION BEHAVIOR ANALYSIS AND CLASSIFICATION SYSTEMAPPBACS: AN APPLICATION BEHAVIOR ANALYSIS AND CLASSIFICATION SYSTEM
APPBACS: AN APPLICATION BEHAVIOR ANALYSIS AND CLASSIFICATION SYSTEM
 
Automatically generated win32 heuristic
Automatically generated win32 heuristicAutomatically generated win32 heuristic
Automatically generated win32 heuristic
 
Avc fd mar2012_intl_en
Avc fd mar2012_intl_enAvc fd mar2012_intl_en
Avc fd mar2012_intl_en
 
slides
slidesslides
slides
 
Behavioral and performance analysis model for malware detection techniques
Behavioral and performance analysis model for malware detection techniquesBehavioral and performance analysis model for malware detection techniques
Behavioral and performance analysis model for malware detection techniques
 
Application of data mining based malicious code detection techniques for dete...
Application of data mining based malicious code detection techniques for dete...Application of data mining based malicious code detection techniques for dete...
Application of data mining based malicious code detection techniques for dete...
 
Final Project _Smart Utilities
Final Project _Smart UtilitiesFinal Project _Smart Utilities
Final Project _Smart Utilities
 
Accuracy and time_costs_of_web_app_scanners
Accuracy and time_costs_of_web_app_scannersAccuracy and time_costs_of_web_app_scanners
Accuracy and time_costs_of_web_app_scanners
 
Report on forensics tools
Report on forensics toolsReport on forensics tools
Report on forensics tools
 
Understand How Machine Learning Defends Against Zero-Day Threats
Understand How Machine Learning Defends Against Zero-Day ThreatsUnderstand How Machine Learning Defends Against Zero-Day Threats
Understand How Machine Learning Defends Against Zero-Day Threats
 
IRJET- A Study on Penetration Testing using Metasploit Framework
IRJET- A Study on Penetration Testing using Metasploit FrameworkIRJET- A Study on Penetration Testing using Metasploit Framework
IRJET- A Study on Penetration Testing using Metasploit Framework
 
App locker
App lockerApp locker
App locker
 
Avc Report25
Avc Report25Avc Report25
Avc Report25
 

Ähnlich wie Windows 8 kasp1248

Colby_Sawyer_white_paper final 2
Colby_Sawyer_white_paper final 2Colby_Sawyer_white_paper final 2
Colby_Sawyer_white_paper final 2Scott Brown
 
AV-Comparatives Performance Test
AV-Comparatives Performance TestAV-Comparatives Performance Test
AV-Comparatives Performance TestHerbert Rodriguez
 
Behavioral and performance analysis model for malware detection techniques
Behavioral and performance analysis model for malware detection techniquesBehavioral and performance analysis model for malware detection techniques
Behavioral and performance analysis model for malware detection techniquesIAEME Publication
 
AV Comparatives 2013 (Comparación de Antivirus)
AV Comparatives 2013 (Comparación de Antivirus)AV Comparatives 2013 (Comparación de Antivirus)
AV Comparatives 2013 (Comparación de Antivirus)Doryan Mathos
 
Functional and Behavioral Analysis of Different Type of Ransomware.pptx
Functional and Behavioral Analysis of Different Type of Ransomware.pptxFunctional and Behavioral Analysis of Different Type of Ransomware.pptx
Functional and Behavioral Analysis of Different Type of Ransomware.pptxtarkovtarkovski
 
A STATIC MALWARE DETECTION SYSTEM USING DATA MINING METHODS
A STATIC MALWARE DETECTION SYSTEM USING DATA MINING METHODSA STATIC MALWARE DETECTION SYSTEM USING DATA MINING METHODS
A STATIC MALWARE DETECTION SYSTEM USING DATA MINING METHODSijaia
 
Zero day malware detection
Zero day malware detectionZero day malware detection
Zero day malware detectionsujeeshkumarj
 
Learning area 1 information and communication technology and society
Learning area 1   information and communication technology and societyLearning area 1   information and communication technology and society
Learning area 1 information and communication technology and societyShuren Lew
 
Learning area 1_-_information_and_communication_technology_and_society
Learning area 1_-_information_and_communication_technology_and_societyLearning area 1_-_information_and_communication_technology_and_society
Learning area 1_-_information_and_communication_technology_and_societySaktis Kesavan
 
La1 information and communication technology and society
La1   information and communication technology and societyLa1   information and communication technology and society
La1 information and communication technology and societyAzmiah Mahmud
 
Enchaning system effiency through process scanning
Enchaning system effiency through process scanningEnchaning system effiency through process scanning
Enchaning system effiency through process scanningsai kiran
 
Avtest 2012 02-android_anti-malware_report_english
Avtest 2012 02-android_anti-malware_report_englishAvtest 2012 02-android_anti-malware_report_english
Avtest 2012 02-android_anti-malware_report_englishDaniel zhao
 
Avtest 2012 02-android_anti-malware_report_english
Avtest 2012 02-android_anti-malware_report_englishAvtest 2012 02-android_anti-malware_report_english
Avtest 2012 02-android_anti-malware_report_englishКомсс Файквэе
 
Avtest 2012 02-android_anti-malware_report_english
Avtest 2012 02-android_anti-malware_report_englishAvtest 2012 02-android_anti-malware_report_english
Avtest 2012 02-android_anti-malware_report_englishКомсс Файквэе
 
ANTIVIRUS
ANTIVIRUSANTIVIRUS
ANTIVIRUSfauscha
 

Ähnlich wie Windows 8 kasp1248 (20)

Colby_Sawyer_white_paper final 2
Colby_Sawyer_white_paper final 2Colby_Sawyer_white_paper final 2
Colby_Sawyer_white_paper final 2
 
Avc fdt 201209_en
Avc fdt 201209_enAvc fdt 201209_en
Avc fdt 201209_en
 
Performance dec 2010
Performance dec 2010Performance dec 2010
Performance dec 2010
 
AV-Comparatives Performance Test
AV-Comparatives Performance TestAV-Comparatives Performance Test
AV-Comparatives Performance Test
 
Avc per 201206_en
Avc per 201206_enAvc per 201206_en
Avc per 201206_en
 
Behavioral and performance analysis model for malware detection techniques
Behavioral and performance analysis model for malware detection techniquesBehavioral and performance analysis model for malware detection techniques
Behavioral and performance analysis model for malware detection techniques
 
Avc per 201304_en
Avc per 201304_enAvc per 201304_en
Avc per 201304_en
 
AV Comparatives 2013 (Comparación de Antivirus)
AV Comparatives 2013 (Comparación de Antivirus)AV Comparatives 2013 (Comparación de Antivirus)
AV Comparatives 2013 (Comparación de Antivirus)
 
Functional and Behavioral Analysis of Different Type of Ransomware.pptx
Functional and Behavioral Analysis of Different Type of Ransomware.pptxFunctional and Behavioral Analysis of Different Type of Ransomware.pptx
Functional and Behavioral Analysis of Different Type of Ransomware.pptx
 
A STATIC MALWARE DETECTION SYSTEM USING DATA MINING METHODS
A STATIC MALWARE DETECTION SYSTEM USING DATA MINING METHODSA STATIC MALWARE DETECTION SYSTEM USING DATA MINING METHODS
A STATIC MALWARE DETECTION SYSTEM USING DATA MINING METHODS
 
Technology auto protection_from_exploit
Technology auto protection_from_exploitTechnology auto protection_from_exploit
Technology auto protection_from_exploit
 
Zero day malware detection
Zero day malware detectionZero day malware detection
Zero day malware detection
 
Learning area 1 information and communication technology and society
Learning area 1   information and communication technology and societyLearning area 1   information and communication technology and society
Learning area 1 information and communication technology and society
 
Learning area 1_-_information_and_communication_technology_and_society
Learning area 1_-_information_and_communication_technology_and_societyLearning area 1_-_information_and_communication_technology_and_society
Learning area 1_-_information_and_communication_technology_and_society
 
La1 information and communication technology and society
La1   information and communication technology and societyLa1   information and communication technology and society
La1 information and communication technology and society
 
Enchaning system effiency through process scanning
Enchaning system effiency through process scanningEnchaning system effiency through process scanning
Enchaning system effiency through process scanning
 
Avtest 2012 02-android_anti-malware_report_english
Avtest 2012 02-android_anti-malware_report_englishAvtest 2012 02-android_anti-malware_report_english
Avtest 2012 02-android_anti-malware_report_english
 
Avtest 2012 02-android_anti-malware_report_english
Avtest 2012 02-android_anti-malware_report_englishAvtest 2012 02-android_anti-malware_report_english
Avtest 2012 02-android_anti-malware_report_english
 
Avtest 2012 02-android_anti-malware_report_english
Avtest 2012 02-android_anti-malware_report_englishAvtest 2012 02-android_anti-malware_report_english
Avtest 2012 02-android_anti-malware_report_english
 
ANTIVIRUS
ANTIVIRUSANTIVIRUS
ANTIVIRUS
 

Windows 8 kasp1248

  • 1. Windows 8 Malware Protection Test Report A test commissioned by Kaspersky Lab and performed by AV-Test GmbH th th Date of the report: January 11 , 2013, last update: January 11 , 2013 Executive Summary In December 2012, AV-Test performed a comparative review of Kaspersky Internet Security and the built-in Windows 8 security components to determine their capabilities to protect against malware. Four individual tests have been performed. The first was a real-world protection test of malicious URLs and E-Mails with 42 samples, the second was a static detection test of 111487 malicious files, the third test was another static detection test of 2500 prevalent files and the final test was a static false positive test with 345900 samples. To perform the test runs, a clean Windows 8 image was used on several identical PCs. On this image, the security software was installed resp. for the plain Windows 8 image all security related components were left in default state and then the individual tests have been carried out. In case of the dynamic test, the URLs have been accessed and the downloaded samples have been executed and any detection by the security software was noted. Additionally the resulting state of the system was compared with the original state before the test in order to determine whether the attack was successfully blocked or not. In case of the static detection test, the products had to scan two sets of files in default settings. Detections have been noted to determine the detection result. Kaspersky provided near perfect results, blocking all 42 URL and E-Mail attacks, while Microsoft failed to block 5 attacks. There were also dramatic differences in the static detection of malware. Kaspersky detected 99% of the tested samples, Microsoft only managed to detect 90%. Regarding static detection of prevalent malware and false positives there were no problems for either product. 
Both of them detected all 2500 prevalent malware samples and neither created a false positive detection in the set of clean files. The results indicate that Windows Defender and other security features in Windows 8 do offer a baseline protection. However, this is not enough to reliably protect against the magnitude of attacks seen to Windows systems these days. A commercial solution, such as the tested Kaspersky Internet Security provides a far better protection. 1
  • 2. Overview With the increasing number of threats that is being released and spreading through the Internet these days, the danger of getting infected is increasing as well. A few years back there were new viruses released every few days. This has grown to several thousand new threats per hour. Figure 1: New samples added per year In early 2013 the AV-TEST collection of malware exceeded 100.000.000 unique samples and the growth rates are getting worse and worse. In the year 2000, AV-Test received more than 170,000 new samples, and in 2010 and 2011, the number of new samples grew to nearly 20,000,000 new samples each. The numbers continue to grow in the year 2012 and 2013. Microsoft reacted to these threats and each version of Windows released in the last years was more secure than its predecessor. Windows 8 now even included a built-in malware scanner (Windows Defender) and URL blocking components (SmartScreen filter). This report will assess whether these features are enough to protect the users from today’s threats. Products Tested The testing occurred in December 2012. AV-Test used the latest releases available at the time of the test of the following products:  Microsoft Windows 8 Pro with built-in Windows Defender 4  Kaspersky Internet Security 2013 2
Methodology and Scoring

Platform

All tests were performed on identical PCs equipped with the following hardware:

- Intel Xeon Quad-Core X3360 CPU
- 4 GB RAM
- 500 GB HDD (Western Digital)
- Intel Pro/1000 PL (Gigabit Ethernet) NIC

The operating system was Windows 8 Pro with only those patches that were available on December 1st, 2012.

Testing Methodology

General

1. Clean system for each sample. The test systems were restored to a clean state before being exposed to each malware sample.
2. Physical machines. The test systems were actual physical machines; no virtual machines were used.
3. Product cloud/Internet connection. The Internet was available to all tested products that use the cloud as part of their protection strategy.
4. Product configuration. All products were run with their default, out-of-the-box configuration.
5. Sample cloud/Internet accessibility. If the malware uses a cloud/Internet connection to reach other sites in order to download further files and infect the system, care was taken to make sure that this access was available to the malware sample in a safe way, so that the testing network was not at risk of infection.
6. Allow time for the sample to run. Each sample was allowed to run on the target system for 10 minutes to exhibit autonomous malicious behavior. This may include initiating connections to systems on the Internet, or installing itself to survive a reboot (as may be the case with certain key-logging Trojans that only activate fully when the victim is performing a certain task).

The procedures below were carried out on all tested programs and all test cases at the same time in order to ensure that all protection programs faced the exact same test conditions. If a test case no longer worked, or its behavior varied between protection programs (which can be clearly determined using the Sunshine analyses), the test case was deleted. This ensures that all products were tested in the exact same test scenarios. All test cases are solely obtained from internal AV-TEST sources and are always fully analysed by AV-TEST; test cases or analyses provided by manufacturers or other external sources are never used.

Real-World Test

1. The products are installed, updated and started up using standard/default settings. The protection program has complete Internet access at all times.
2. AV-TEST uses the analysis program Sunshine, which it developed itself, to produce a map of the non-infected system.
3. It then attempts to access the URL or to receive the e-mail.
4. If access is blocked, this is documented.
5. If access is not blocked, the malicious sample is downloaded from the URL, or the malicious attachment is stored to disk.
6. If this is blocked, this is documented.
7. If this is not blocked, the malicious file is executed.
8. If execution of the sample is blocked by the program's static or dynamic detection mechanisms, this is documented.
9. If execution is not blocked, an on-demand scan (full computer scan) is carried out. Any detections are noted and any removal actions are allowed.
10. Given that the detection of malicious components or actions is not always synonymous with a successful block, Sunshine constantly monitors all actions on the computer in order to determine whether the attack was completely blocked, partially blocked, or not blocked at all.
11. A result for the test case is then determined based on the detection documented by the protection program and the actions on the system recorded by Sunshine.

Static Scanning Test

1. The test ran in the timeframe from December 3rd to December 10th, 2012.
2. A first scan over the whole collection was carried out and all non-detected files were identified.
3. Finally, a rescan of all non-detected samples was performed on December 10th, 2012.
4. The combined result of these scans was then noted.

The malware set for the real-world test contains 42 samples (39 URLs, 3 e-mails). These samples were collected between December 11th and December 18th, 2012. The malware set for the static scanning test contains 2500 prevalent samples and 111487 further samples. These files were collected between November 1st and November 30th, 2012.
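The real-world test steps above form a decision cascade: each stage either blocks the attack or hands the case to the next stage, and the final verdict also depends on the system map recorded by Sunshine. The following is an illustrative sketch of that logic only; the function and verdict names are hypothetical and do not represent AV-TEST's actual harness.

```python
# Illustrative sketch of the real-world test decision cascade (steps 3-11).
# All names here are hypothetical stand-ins, not AV-TEST's real tooling.

def classify_case(access_blocked: bool, download_blocked: bool,
                  exec_blocked: bool, scan_detected: bool,
                  system_changed: bool) -> str:
    """Map the per-stage observations for one test case to a final verdict."""
    if access_blocked:            # steps 3-4: URL/e-mail access blocked
        return "blocked at access"
    if download_blocked:          # steps 5-6: download/attachment blocked
        return "blocked at download"
    if exec_blocked:              # steps 7-8: static/dynamic block on execution
        return "blocked at execution"
    # Steps 9-11: after the on-demand scan, a detection alone is not enough;
    # the comparison with the clean system map decides the outcome.
    if not system_changed:
        return "blocked by on-demand scan" if scan_detected else "no infection"
    return "partially blocked" if scan_detected else "not blocked"

print(classify_case(True, False, False, False, False))   # blocked at access
print(classify_case(False, False, False, True, True))    # partially blocked
```

The key point mirrored from step 10 is that an alert without a clean post-test system state still counts as only a partial block.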
Test Results

The results of the first test (Figure 2) show that Kaspersky was able to protect against all 42 real-world threats used in the test, while Microsoft failed to block 5 of them, which would have resulted in an infection of the system.

Figure 2: Real-World Protection

As most infections these days take place via malicious websites or malicious e-mail attachments, it is especially important to provide a solid level of protection in this area.
Figure 3: Static Malware Detection

The next test measures the capability of statically detecting malware. Two different test sets were used to determine the results for the two products. One set consisted of highly prevalent malware that should be detected by all products, and as can be seen in Figure 3, both tested products detected all samples of this set. The other test set, however, showed differences between the two products. This set contains malware that may not be as widespread but is still a risk to the user. Microsoft detected less than 90% of the 111487 samples (11228 missed samples), while Kaspersky stopped over 99% of the samples (1031 missed samples).

While static detection is only one of the protection features in modern security software, it is still a very important one, and many products mainly rely on this technique to detect malware. As such, these results indicate that the built-in Windows Defender in Windows 8 is not good enough to reliably protect the user.

The final test was a test for false positives. 345900 files collected from clean applications were scanned with both products to see whether any of these files would be wrongly flagged as malware. Not a single false positive occurred with either product. This shows that even a product with a very good detection and protection rate (Kaspersky) can achieve it without any false positives.
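The detection percentages above can be reproduced directly from the reported miss counts; a minimal sanity check:

```python
# Recomputing the static-detection rates from the miss counts reported above.
TOTAL = 111487  # samples in the larger static test set

missed = {"Microsoft": 11228, "Kaspersky": 1031}

for product, miss in missed.items():
    rate = 100.0 * (TOTAL - miss) / TOTAL
    print(f"{product}: {rate:.2f}% detected")
# prints:
# Microsoft: 89.93% detected
# Kaspersky: 99.08% detected
```

This matches the report's wording: Microsoft below 90%, Kaspersky above 99%.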
Appendix

Version information of the tested software

Developer/Distributor: Kaspersky Lab
Product name: Kaspersky Internet Security 2013
Program version: 13.0.1.4190 (c)
Engine/signature version: 16.5.2.1

Developer/Distributor: Microsoft
Product name: Windows 8 Pro with Windows Defender
Program version: 4.0.9200.16384 (Windows Defender version)
Engine/signature version: 1.1.9002.0 / 1.141.952.0 (Windows Defender version)

List of used URLs, Mails and Samples

URLs

http://www.stellardayproducts.com/expensekeeper/OperaPMupdate/OperaPMupdate.exe
http://www.fauziyahospital.com/mego/OMG.exe
http://204.15.124.183/bX2XdF.exe
http://ge.tt/api/1/files/9dMi6OQ/0/blob
http://s3-sa-east-1.amazonaws.com/vonoloa/Contrato_61277.pdf.exe
http://shaiyaner.de/downloads/svchost.exe
http://restterem.ru/media/admin26.exe
http://37.59.66.1/Server.exe
http://troykaakademi.net/media/firsale.exe
http://nause.com/LjRjU.exe
http://citrus.doolr.com/appr1.exe
http://ds.searchstar.co.kr/filepop/bacon.exe
http://kamalbose.com/uY3VZr.exe
http://hack3d.3dom.fr/shouky/rcv.exe
http://ballyhooindia.com/brlaikza/9qf.exe
http://beyondcreative.com.au/flashupdate.exe
http://cateringumbria.eu/ox1.exe
http://206.253.165.71/phpmyadmin/5566.exe
http://1337.kz/vsocks-1.3.0.0.exe
http://chat.ru/~dhoffmanfan/syscfg.exe
http://demo.ovh.net/download/c356f845248362888ed36ead5a5280ae/c99199151.exe
http://tkprinter.com/Expl0it/forjd/warp.exe
http://videoiindir.weebly.com/uploads/1/2/7/6/12763800/video2.exe
http://89.248.166.21/prx.exe
http://74.63.196.69/bee/panel/panel/uploads/bzeusnet.exe
http://tanvirassociates.pk/ENTEL/formulario.exe
http://211.147.15.3/dls/qqtodta.exe
http://baniganm.net/uploads/files/baniganm-a8a31351d0.exe
http://dyneema.co.uk/components/com_ag_google_analytics2/dir/Core777.exe
http://weebly.com/uploads/1/2/1/1/12113165/dvxplayer.scr
http://update.qmxsoft.com/1.3.exe
http://empuriaonline24.com/iJZWa.exe
http://www.yerlipornoizle.org/porno-video.exe
http://file.findlock.co.kr/ndn/r.exe
http://mytightride.info/missing.exe
http://u.websuprt.co.kr/NewSidebar/Angel/AngelSupporter.exe
http://carroceriasfejoben.com/components/com_ag_google_analytics2/dir/p.exe
http://solarpanelinstallers.org.uk/media/firsale.exe
http://www.genckizlarr.com/videolar/super_sikis_videosu.exe

Mails

Subject: "aussage"
Subject: "Incoming Wire Notification"
Subject: "uberweisung"

Static Detection

The list of samples is not given because of its size, but it is available on request.

Copyright © 2013 by AV-Test GmbH, Klewitzstr. 7, 39112 Magdeburg, Germany
Phone +49 (0) 391 60754-60, Fax +49 (0) 391 60754-69, Web http://www.av-test.org