Thursday 9 May 2013

Anti-virus testing conspiracy theories

Every time an anti-virus tester publishes a report the Wilders Security internet forum buzzes with responses ranging from conspiracy theory to logical reasoning (via understandable suspicion, disappointment and useful questioning).

A recent line of discussion following the latest Dennis Technology Labs report is no exception and includes some pretty direct accusations of corruption, as well as an increasing number of sensible replies.

I have posted my response on the forum, but here is a copy of my statement:

Hello everyone.

As usual I have enjoyed the discussion and would like to provide some feedback.

I appreciate that some people reading this thread have preconceived ideas, or even conspiracy theories, that will never be changed by anything that I write here. However, for the record, I would like to answer some of the questions, concerns and both direct and implied accusations that appear above.

Prior associations with Symantec, “money talks”. Dennis is a Symantec affiliate. Mock surprise that Norton comes top.

We have associations with most vendors through our involvement in AMTSO. We have provided services to companies including, but not limited to, McAfee, Trend Micro, Kaspersky Lab, ESET, AVG and Symantec.

We pride ourselves on our transparency and the ethical way in which we conduct our business. We have no secret affiliations with any customer.

One of the reasons the vendors like working with us is because our testing can provide a number of benefits. For example, if a product performs very well then the vendor’s marketing team are happy. However, if we discover a problem then the engineers are able to use the very detailed information that we log to fix that problem.

It would be of very limited use to everyone if we fixed the results to favour one vendor. Such cheating would be fairly obvious to the other vendors, and the favoured vendors would be unaware of problems with their products.

Assumption that most tests used internet malware

Absolutely, and this fact was made clear throughout the report, starting with the introduction. We expose the products to live (internet) web-based threats. Products that block malicious URLs thoroughly will do well, whereas those that rely solely on signatures and heuristics face a greater chance of being compromised.

DTL tests are not on a par with the major test labs

All test labs have different methodologies. It would be fairly pointless if we all worked in exactly the same way as we would be duplicating each other’s work.

Some testers concentrate on running vast numbers of samples through AV engines. We focus on investigating cases in forensic detail. This means that our sample numbers are much smaller than those found in on-demand tests. The workload is heavy and manual. It takes us six weeks to run a 100-sample test with 20 or fewer products.

Having said that, when other testers perform ‘real-world’ testing such as we specialise in, they also use small sample sizes that are comparable with ours.

The note that the test is unsponsored was an attempt to knock Wilders' arguments up front. Also, 'unsponsored' is a matter of definition.

Important though your opinions are to us, we don't word our reports to pre-empt predictable controversies on Wilders. And as you can see, it would not work anyway. We include such details to ensure that everyone knows when a test is and is not sponsored. We try to be as transparent as possible.

Sometimes a customer will ask us to perform a test. This often happens at certain times of the year, such as in the summer when new products are launched. If we agree that the products in the list are fairly comparable we will consider running such tests. These, which are commissioned by the customer, are what we call ‘sponsored’ tests.

The regular quarterly testing is financed by subscribers, which we refer to as partner vendors. These companies support the test by paying a quarterly subscription fee. In return they are allowed certain privileges.

For example, if their product performs well then they are able to use our logos to promote their success. They also have an opportunity to review our logs to discover where problems may lie with their products.

Finally they have an opportunity to challenge our findings (for their own products). This means that, should they wish to argue with our findings, they can do so before we publish the report. This is why you sometimes see comments from a vendor in the report itself – perhaps explaining why they believe the product suffered some issues.

Neutralisation definition is vague: “What constitutes a running threat? A malicious script that is terminated/blocked as soon as it starts, or some malware that is already active on the computer (e.g. a running Trojan)?”

A neutralisation occurs when a threat runs on the system but is unable to effect a significant change. As we log every process that runs on the test systems we know if a malicious process ran or not.

A script or Trojan that executes, but which is terminated soon after it starts, sounds like a classic example of a neutralisation in our test.

There is also a further granularity to this type of result. We have the possibility of partial or full remediation.

So if a script ran, was terminated and no changes were made to the system then that would be a neutralisation with full remediation. This means extra points.

However, imagine that a Trojan runs and sets itself up to survive a reboot using a Registry Run entry. The AV product detects the Trojan and removes it, but leaves the Registry entry in place. Or removes the Registry entry but leaves the Trojan file on the disk. That would be a neutralisation without full remediation.
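To make that concrete, here is a rough sketch of the logic in Python, purely for illustration. The function and field names are invented for this post and are not our actual scoring code; they just capture the distinctions described above.

def classify(threat_ran, threat_terminated, artifacts_remaining):
    # threat_ran: did a malicious process execute at all (per our process logs)?
    # threat_terminated: was that process stopped by the security product?
    # artifacts_remaining: leftover files, Registry Run entries and so on.
    if not threat_ran:
        return "blocked"  # stopped outright, e.g. at the URL or download stage
    if threat_terminated:
        if not artifacts_remaining:
            return "neutralised (full remediation)"  # scores extra points
        return "neutralised (partial remediation)"
    return "compromised"  # the threat ran and was never stopped

# Example: a Trojan ran, the product removed its file but left the Run entry behind.
print(classify(True, True, ["HKCU\\Software\\Microsoft\\Windows\\CurrentVersion\\Run\\badapp"]))
# -> neutralised (partial remediation)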

The testing platform is outdated (Win XP) and runs vulnerable software. Hopes that Dennis will switch to Windows 7.

As we noted in the report, there are a lot of people still running Windows XP. According to the research that we’ve seen, just less than half of all Windows users are using XP.

As XP’s life is coming to an end, and Windows 7 has recently become dominant, we will switch to Windows 7 before the end of this year. You can expect one more XP-based report (2013 Q2) before we upgrade.

Regarding the use of vulnerable software, we pre-install it to give the security software a chance to protect the system.

To use an analogy: if we were testing car tyre safety we would not do so in optimum conditions. We'd have windy, wet roads. Similarly we use systems that are vulnerable to attack so we can judge how well the security software protects them.

Who are the partner vendors?

As I mention above, we work with all of the main AV vendors but I am not going to name the specific companies that form the partner vendor line-up.

In some cases you can easily tell, because you’ll see our logos on their marketing material. But in other cases they may either choose not to show the logos or, more likely, not have achieved an award at all.

Desire for paid AVG.

We have tested the paid version of AVG in two earlier tests. There was a strong desire from readers to see free products and, as the paid-for and free versions of AVG *should* receive similar results it seemed sensible to test the free version only.

Why do I think that the free and paid-for versions should receive similar results? Because we are not currently testing email threats or sending malware and exploits over IM. Ours is a web-based test.

Default settings used?

With consumer products, default settings are used. However, we disable each product’s ability to send statistics and other telemetry home to the vendor. Also, we may increase the logging level if that helps us discover issues with the product.

In some cases we may extend the number of TCP ports that the product is allowed to scan for HTTP-based threats. This is because we use a range of ports when replaying the threat sessions.
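Purely as an illustration of the kinds of tweaks described above, they might be written down like this (the setting names below are invented for this post, not any product's real options):

# Hypothetical settings, invented for illustration only.
product_overrides = {
    "send_telemetry": False,      # don't report statistics back to the vendor
    "logging_level": "verbose",   # extra detail helps us pin down issues
    # Threat sessions are replayed over a range of ports, so the product's web
    # scanner needs to inspect HTTP traffic on all of them, not just port 80.
    "http_scan_ports": list(range(8080, 8180)),
}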

More programs should have been tested

This consumer test includes nine very well-known products. At the same time we ran a small business and enterprise test, bringing the total number of products to 19.

This involves a vast amount of work and, while we could possibly squeeze a few more products in, it’s not a trivial matter. We hope that as more vendors support the test we'll be able to extend the number of products tested.

Adjusted testing procedures to skew the test in favour of one or more products/vendors

We do not adjust our methodology in favour of any vendor or product. We will make small configuration changes if a vendor notes that its product is behaving differently in our environment than it would in the real world.

Our test setup is very close to that of many home and business users, but there are always compromises we have to make in order to expose the products to the same threats in the same way. Malicious sites don’t like being tested!

So when we encounter issues we address them to remain fair to all.

DTL tests are on a par with basic magazine tests and are not comparable with AV test labs’ work

Dennis Technology Labs is a testing business that resides within the media company called Dennis Publishing Ltd. We provide test data to magazines within Dennis and to magazines published by other companies.

We test anti-malware products constantly, using state-of-the-art techniques and equipment. I don’t know of any magazine that could summon the resources to work even vaguely the way we do.

As I mentioned above, we are focussed on realistic testing and ensuring that all results are accurately verified and recorded. We don’t run on-demand scans on millions of files like some testers do. We simply take a different approach.

That said, our results are not that different, in terms of rankings, from those produced by the other major testers. This is pretty interesting considering our different testing approaches.

No regulation of testing groups

There is no regulation of anti-malware testing, just as there is no regulation of many industries. There is the Anti-Malware Testing Standards Organization, though. AMTSO brings together vendors and testers into a forum for (usually very robust!) discussion.

Dennis Technology Labs is proud to be a member of AMTSO, alongside almost all of the other major testing groups.

From my own, personal point of view, one of AMTSO’s great strengths is that it enables discussion between vendors, testers and anyone else interested enough to engage. It’s not a regulator but it is very influential.
