VB100 comparative review on Windows 8.1


    • #44872
      hackerman1
      Moderator

        “2014-11-14 John Hawes
        Virus Bulletin

        The VB test team put 14 corporate products and 34 consumer products through their paces on Windows 8.1.”

        VB100 comparative review on Windows 8.1: https://www.virusbtn.com/virusbulletin/archive/2014/08/vb201408-comparative#id4258595

        Direct link to the report (PDF): https://www.virusbtn.com/pdf/magazine/2014/201408-vb100-comparative.pdf

      • #61128
        Anonymous

          Look at the RAP quadrant, near the bottom of the report.
          Kaspersky is low on proactive detection.
          Outpost is low on both reactive and proactive detection.
          Both of them usually perform better.

          http://www.virusbtn.com/virusbulletin/archive/2014/08/figures/RAP-consumer-large.jpg

          Summary information about RAP testing

          “The unique RAP (Reactive and Proactive) tests measure detection rates using the freshest samples available at the time products are submitted to the test, as well as samples not seen until after product databases are frozen. This provides a measure of both the vendors’ ability to handle newly emerging malware, and their accuracy in detecting previously unknown malware.

          The four-test RAP averages quadrant allows at-a-glance comparison of detection rates by these criteria.”

          “The RAP tests are run according to the following procedures:

          RAP samples are split into four sets. The set known as ‘week +1’ is gathered in the period from one to seven days after the product submission deadline. The ‘week -1’ set covers the deadline day itself and the six previous days. The ‘week -2’ set includes samples gathered eight to 14 days before the deadline, and the ‘week -3’ set consists of samples gathered 15 to 21 days before the deadline.”
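
          To make those four windows concrete, here is a minimal Python sketch (my own illustration, not Virus Bulletin code; the example deadline date is hypothetical) that computes the inclusive date range of each RAP sample set from a given submission deadline:

            from datetime import date, timedelta

            def rap_sets(deadline):
                """Return the inclusive (start, end) date window of each
                RAP sample set, relative to the submission deadline."""
                day = timedelta(days=1)
                return {
                    "week +1": (deadline + 1 * day, deadline + 7 * day),
                    "week -1": (deadline - 6 * day, deadline),
                    "week -2": (deadline - 14 * day, deadline - 8 * day),
                    "week -3": (deadline - 21 * day, deadline - 15 * day),
                }

            # Hypothetical deadline: 2014-06-16
            for name, (start, end) in rap_sets(date(2014, 6, 16)).items():
                print(name, start, end)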

          “For each product entered for a review, we measure detection using our standard on-demand scanning procedure; this uses default product settings and ignores detections labelled as ‘suspicious’ only. Scores used in the per-test RAP quadrants are labelled ‘Proactive’ (the ‘week +1’ score), and ‘Reactive’ (the average of the scores for weeks -1, -2 and -3). Scores used in the four-test RAP averages quadrant are the averages of each score over the last four tests. In the per-test quadrants, products with false positives in the test in question are marked by striking through the product identifier. For the four-test RAP averages quadrant, such scores are excluded when calculating averages.”

          Full details of the RAP scheme: https://www.virusbtn.com/vb100/rap-index
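
          As a rough sketch of how those scores combine (again my own illustration; the detection rates are made up, not taken from any report):

            from statistics import mean

            def per_test_scores(week_scores):
                """Per-test RAP scores: 'Reactive' averages the three
                pre-deadline sets; 'Proactive' is the week +1 score alone."""
                reactive = mean(week_scores[w] for w in ("week -1", "week -2", "week -3"))
                proactive = week_scores["week +1"]
                return reactive, proactive

            def four_test_average(tests):
                """Average (reactive, proactive) over the last four tests,
                excluding any test in which the product had false positives."""
                clean = [(r, p) for r, p, had_fp in tests if not had_fp]
                if not clean:
                    return None
                return (mean(r for r, _ in clean), mean(p for _, p in clean))

            # Hypothetical detection rates (%) from a single test:
            print(per_test_scores({"week -1": 92.0, "week -2": 90.5,
                                   "week -3": 89.0, "week +1": 78.5}))
            # -> (90.5, 78.5)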

          In these quadrants the X-axis (horizontal) is reactive detection of known malware, and the Y-axis (vertical) is proactive detection of new, previously unseen malware (“0-day”).
          So a good anti-malware program should sit in the upper-right corner.
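
          If you want to reproduce such a quadrant yourself, a minimal matplotlib sketch (product names and numbers are hypothetical) looks like this:

            import matplotlib.pyplot as plt

            # Hypothetical (reactive, proactive) four-test averages:
            products = {"Product A": (91.2, 80.4), "Product B": (84.7, 62.1)}

            fig, ax = plt.subplots()
            for name, (reactive, proactive) in products.items():
                ax.scatter(reactive, proactive)
                ax.annotate(name, (reactive, proactive))
            ax.set_xlabel("Reactive average (%)")      # known samples
            ax.set_ylabel("Proactive, week +1 (%)")    # unseen samples
            plt.show()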

        • #61129
          Anonymous

            You don’t have to read the report to find out which anti-malware programs achieved a VB100 award.
            For a quick overview, take a look at the summary: https://www.virusbtn.com/vb100/archive/summary
