11 out of 26 anti-virus products fail VB100 certification


VB reveals which products failed to meet the VB100 certification criteria, and presents an updated ‘RAP quadrant’ showing products’ Reactive And Proactive detection abilities.


Virus Bulletin has revealed the results of its latest VB100 certification test, conducted on Windows Server 2008.

Of the 26 products tested, 11 failed to achieve VB100 certification; all of the failures were at least in part due to incomplete detection of one or both of a pair of highly complex polymorphic file-infecting viruses.

The results of the RAP (‘Reactive And Proactive’) tests conducted at the same time showed a continuation of the trends and patterns seen in recent tests, with dual-engine products from Trustport and G Data showing particularly remarkable scores.


Virus Bulletin’s Test Director John Hawes said: “This month’s test was a real challenge for the products, with two separate variants of a particularly tricky polymorphic virus included in our core WildList set. We used large numbers of samples of each to thoroughly measure accuracy of detection, and showed that many products continue to have trouble with these nasties.”

Hawes continued: “On a brighter note, there were some quite impressive scores in our RAP test, showing that some vendors are doing a good job handling the large volumes of new malware appearing every day. Looking at the long-term picture, we can also see some products achieving high levels of consistency month on month, which is also a good indicator of a solid, well-run lab. We’re looking forward to seeing if these trends continue with a wider range of products in our first comparative on Windows 7, due soon.”


VB’s cumulative RAP quadrant gives a quick visual reference to products’ reactive and proactive detection rates – with the better-performing products placed in the top right-hand corner:


Virus Bulletin’s RAP testing measures products’ detection rates across four distinct sets of malware samples. The first three test sets comprise malware first seen in each of the three weeks prior to product submission; these measure how quickly product developers and labs react to the steady flood of new malware. A fourth test set consists of malware samples first seen in the week after product submission. This set is used to gauge products’ ability to detect new and unknown samples proactively, using heuristic and generic techniques.

The results of the October 2009 VB100 certification review can be seen here.

The full review, including detailed results tables, is available to Virus Bulletin subscribers here, or in PDF format here.

A full description of the RAP testing methodology can be seen here.

Posted on 09 October 2009 by Virus Bulletin
