
cr  01.04.2012      r+  22.10.2024      r-  22.10.2024      Pierre Pinard.         (Day-to-day security alerts and advisories)

Archive document from Assiste.com

The following document is preserved for the record. It may serve as a landmark for an archaeology of the Web and the Net.

In its day (1988 - 1991), much was made of a test named « Rosenthal's Antivirus Test » (and of its program called « Virus Simulator », or « Rosenthal Virus Simulator software suite (VirSim) »), distributed by « Rosenthal Engineering ».

The « Rosenthal's Antivirus Test », or « Virus Simulator », was an antivirus testing tool and an enormous piece of foolishness that had its hour of glory.

Retrieved from the Web archives (the Wayback Machine) on 18.10.2015

A compilation dated 1996 of earlier texts (November 94 in the text?)

VIRSIM2C.ZIP MAIN

65K 11/94 Virus Simulator Ver 2C<ASADxASP> Audit and demonstrate anti-virus protection. Rosenthal Engineering's absolute necessity for anyone serious about virus defense, security and training. "Unreservedly recommended!" by Computer Virus Developments Quarterly. Used in tests conducted by National Software Testing Labs for Software Digest and PC Digest. Written about in Computerworld, Virus Bulletin, Virus News Int., Telecomputing, etc.

Retrieved from the Web archives (the Wayback Machine) on 18.10.2015

Undated

VIRUS SIMULATOR

Those of you who have suffered the consequences of having a computer virus on your computer's hard drive will have torn your hair out at the sight of the destruction it wrought on the data stored there. What is more, the process of getting rid of these viruses is in some cases quite complicated and maddening. «Virus Simulator» generates a series of simulated viruses that behave like real ones but disappear without a trace when the computer is reset. With this program we have two possibilities: randomly generating viruses on a floppy drive in the form of .EXE and .COM files, or emulating a virus in memory, which makes any antivirus you run spew warning messages everywhere. Without a doubt, the first option is the more interesting one: just imagine how a friend would react if you planted a virus on his computer without him knowing it is a fake. To get the program, you only have to write a letter to: Rosenthal Engineering, 3737 Sequoia, San Luis Obispo, CA 93401 USA

The last known commercial version of the « Rosenthal's Antivirus Test », « Virus Simulator v.2c », is dated 06.08.1991.

After it was abandoned as a testing tool, its author, Doren Rosenthal, moved closer to the ASP (Association of Shareware Professionals) and released, free of charge, a version 3.0 mentioned in this message (although on 06.07.1998, more than 6 years after the abandonment, he complained in this message that the free version was circulating too widely while he was still earning revenue from the commercial version).

The « Rosenthal's Antivirus Test » was used massively, by some, to test and compare antivirus products. The results of such tests were still being published in the early 2000s (the last tests and comparisons seem to date from 2002, and their results can still be found on the Web, for example those of CNET (C|Net) from 2002, where they state that they used the « Rosenthal's Antivirus Test »).

Yet an incendiary open letter, signed by 19 antivirus professionals, had been sent to C|Net as early as October 9, 2000.

The incendiary letter, co-signed by 19 major players in the antivirus world, about the « Rosenthal's Antivirus Test »

Credibility and Ethics in Antivirus Product Reviewing
An Open Letter to CNET
October 9, 2000

To:

Molly Wood, Associate Editor, CNET
Erik Johnson, Lab Manager, CNET
Eric Franklin, Project Leader, CNET

Cc:

Richard Ford, Ombudsman for WildList Organization International
Sarah Gordon, European Institute of Computer Antivirus Research
Paul Robinson, SC Magazine, Secure Computing
Roger Thompson, International Computer Security Association
Francesca Thorneloe, Virus Bulletin
Matthew Zintel, Aladdin Knowledge Systems
Chengi Jimmy Kuo, Network Associates
Charles Renert, Symantec
Susan Orbuch, Trend Micro

From:

Joe Wells, CEO Wells Antivirus Research Laboratory.

To whom it may concern,

CNET’s September 21, 2000, review of antivirus products betrayed their readers' trust. Moreover, it did antivirus product users a major disservice. Although this review was presented as being fair and professional, the evidence demonstrates that it was neither.

Consider the following facts.

Quote: "Viruses"

First, it should be noted that the review leveled charges about missing "viruses" against all four products in the test. Note these examples:

Aladdin eSafe Desktop 2.2

  • Virus defense leaks like a sieve.
  • Missed half of our test viruses.
  • Let our test PC get more infections than there are in a hospital ward.

McAfee VirusScan 5.1

  • Missed more viruses in our tests than [Norton] AntiVirus.
  • It let more file-carried and email-borne viruses through in our tests.
  • It missed three of our nine test viruses.

Norton AntiVirus 2001

  • Doesn't protect against Internet-borne viruses as well as McAfee.
  • One test virus got through.
  • AntiVirus blithely let [a downloaded, virus-infected file] through.

Trend PC-cillin 2001

  • Three in nine viruses got through.
  • Fails to finger every incoming virus.
  • It lets too many viruses through to be considered safe.

Unfortunately, neither the methodology nor result reports indicate exactly what "viruses" were used or which "viruses" were not detected by each product. However, what the methodology does indicate is that some of the "viruses" used in the test were not actually viruses. The section on "How We Tested" states:

CNET Labs used Rosenthal Utilities, a program that simulates viruses, to test for virus detection in main memory, in the file sector of floppy disks in A: drive, on the hard drive, and in the boot sector of floppy disks in A: drive.

In addition to these simulated viruses, CNET did use some real viruses, but the number, ratio and identity of the real and simulated viruses are not disclosed. Only a couple were identified by name. Credible antivirus tests include detailed information on the viruses used and which viruses were missed by each product.

Yet more important than numbers and names is the fact that simulated viruses are not real viruses and using them will skew testing beyond the point of credibility.

To demonstrate this claim, please consider the following information.

Today’s antivirus products use a variety of sophisticated methods to detect viruses. Such methods include execution analysis, code and data mapping, virtual machine emulation, cryptographic analysis of file sections, etc.

Such advanced antivirus systems make virus simulation for testing virtually impossible. This is because there is no way to know what sections of viral code and/or data are targeted by any given product. That being the case, all of the virus code and data must be in the file and in the correct order for the product to detect it as that virus. If a simulator did create a file with everything possibly needed in place, it would have to create the virus exactly. It would no longer be a simulator and the virus would be real, not simulated. Therefore a virus cannot be reliably simulated.
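To make the letter's argument concrete, here is a minimal editorial sketch in Python (the byte pattern is an invented, harmless placeholder; no real product's engine works from a toy dictionary like this): an exact-sequence scanner matches only a sample that carries the complete target bytes in order, and a file that carries them all is no longer a simulation.

# Minimal sketch of exact-sequence signature matching (illustrative only;
# the pattern below is a harmless DOS "exit program" opcode sequence).

SIGNATURES = {
    # name -> the complete byte sequence the scanner requires, in order
    "Example.Virus.A": bytes.fromhex("b44cb000cd21"),
}

def scan(sample):
    """Report every signature whose full byte sequence appears in the sample."""
    return [name for name, sig in SIGNATURES.items() if sig in sample]

complete = b"\x90" * 4 + bytes.fromhex("b44cb000cd21") + b"\x90" * 4
fragment = b"\x90" * 4 + bytes.fromhex("b44c") + b"\x90" * 4  # partial bytes only

print(scan(complete))  # ['Example.Virus.A'] - but this file carries the real code
print(scan(fragment))  # [] - a partial simulation simply does not match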

So simulated viruses cannot reliably take the place of real viruses. This in turn means they are not a measure of an antivirus product’s worth. Think about it. If a product does not report a simulated virus as being infected, it’s right. And if a program does report a simulated virus as being infected, it’s wrong. Thus, using simulated viruses in a product review inverts the test results. It grossly misrepresents the truth of the matter because:

  • It rewards the product that incorrectly reports a non-virus as infected.
  • It penalizes a product that correctly recognizes the non-virus as not infected.
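The arithmetic of this inversion is easy to verify with a toy scoring sketch (hypothetical products and counts, not CNET's data): a naive "viruses detected" score ranks the over-reporting product above the accurate one.

# Toy scoring sketch with hypothetical products and samples.
corpus = [
    ("real_virus_1", True),   # (sample name, is it actually a virus?)
    ("real_virus_2", True),
    ("simulated_1",  False),  # simulator output: not a real virus
    ("simulated_2",  False),
]

reports = {
    "OverEagerAV": {"real_virus_1": True, "real_virus_2": True,
                    "simulated_1": True,  "simulated_2": True},   # flags everything
    "AccurateAV":  {"real_virus_1": True, "real_virus_2": True,
                    "simulated_1": False, "simulated_2": False},  # flags only real viruses
}

for product, verdicts in reports.items():
    naive    = sum(verdicts[s] for s, _ in corpus) / len(corpus)       # "detection rate"
    accuracy = sum(verdicts[s] == t for s, t in corpus) / len(corpus)  # correct verdicts
    print(f"{product}: naive score {naive:.0%}, accuracy {accuracy:.0%}")

# OverEagerAV: naive score 100%, accuracy 50%
# AccurateAV: naive score 50%, accuracy 100%
# The naive metric rewards false positives and penalizes the accurate product.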

Competent, credible antivirus product reviewers today recognize the need to reflect the real world in their testing. To do so, they focus detection testing on the real-world threat, using real viruses. They focus on viruses reported by the WildList Organization International. True, some may also include other viruses in testing, but they still use real viruses, not simulated ones.

In addition, the documentation provided with Rosenthal Engineering’s Virus Simulator clearly states, "These test virus simulations are not intended to replace the comprehensive collection of real virus samples."

Finally, CNET’s own answer to the question "How can I test my antivirus software?" includes the statement, "Most people in the antivirus community consider a "virus simulator" unnecessary and unsuitable for this task." (This is found on CNET’s help.com site at
https://www.help.com/cat/2/285/287/tip/3920.html?tag=st.hp.ht.txt.tip.) (This link no longer works - checked on 22.04.2014 - no trace in the Web archives.)

Furthermore, the methodology does not state exactly what viruses were simulated. Did the simulated viruses represent viruses that would be an actual threat to the reader?

In light of these facts, it becomes evident that a highly questionable review has been published and CNET’s credibility has suffered. Yet their credibility has suffered, not just because they used simulated viruses, but also because the reviewer refers to "test viruses" throughout most of the article. As seen in the quotations above, the review continually refers to the "viruses" that were used, whereas the methodology states that CNET Labs used "a program that simulates viruses."

What happens if a reader doesn’t read the "How We Tested" page? What will they assume? They would assume that the viruses are real, wouldn’t they? Moreover, they’ll probably suppose that these "viruses" are a real threat to them.

But beyond that, what happens when the review actually tells them that the testing represents real-world performance, will they believe it? Why wouldn’t they?

Consider as an example the case of Aladdin’s eSafe Desktop 2.2. CNET reported the following in their review of eSafe under the subhead Horrible Virus Handling.

eSafe's real-world performance stinks. It failed to sniff out half of our test viruses -- the worst score of any virus hunter we examined.

How exactly does the CNET reviewer define real-world performance? The context here implies that it’s based on "test viruses" being missed.

The review says they used "nine real-world viruses on each app, from KakWorm to this year's latest global threat, the I Love You virus." Where then do the simulated viruses fit in? Were simulated versions of "real world" viruses used? What were the other seven "real-world" viruses?

This uncertainty leads us to more questions. Exactly what "viruses" made eSafe "stink" so much? Were they actually viruses, or were they simulated?

Let’s illustrate the extent of this problem by indulging in a conjectural scenario.

Suppose the "viruses" that eSafe missed were all simulated, and therefore not real viruses. If that were the case, then eSafe was correct in not reporting them, wasn’t it? Further, if all the other products mistakenly reported simulated viruses as being real viruses, they would be wrong, wouldn't they? Where does this lead us?

Well, if our conjectural suppositions were true then that would mean CNET’s reviewer had slurred a product and declared it the worst, because it was the most accurate one tested. This shows why testing with simulated viruses is, at best, misleading.

In light of the above facts, one thing should be quite obvious. Testing antivirus products with simulated viruses is a gross misrepresentation of reality. So, in doing such testing, and thereby publishing a misleading review, CNET has violated the trust of their readers. In addition, CNET’s review has done antivirus users a major disservice.

What does this say about CNET?

If on the one hand, the reviewers mistakenly assumed that testing with simulated viruses was OK, then they are evidently not very well informed. In that case, are they actually qualified to do valid testing of antivirus products?

If, on the other hand, they were informed and did know what they were doing, then misrepresenting simulated viruses as "viruses" throughout the review was a deception and products were knowingly misrepresented.

It is quite doubtful that the reviewer had malicious intent. Still, whichever case is true, CNET’s credibility as a product testing body has been called into serious question.

Having said that, it must also be pointed out that there is another major failing in this review.

An Ethical Quandary

Most antivirus companies are under some form of self-imposed restrictions that prevent them from knowingly creating new viruses or virus variants. In addition, competent testing and certification bodies such as ICSA, Virus Bulletin, Secure Computing, and AV-Test.org, do not create new viruses or virus variants for testing.

Indeed, the consensus throughout the antivirus development and testing community is that creating a new virus or variant for product testing would be very bad — and totally unnecessary. To do so would undoubtedly raise questions about their ethics.

Whether or not CNET knew this fact is unknown, but they did in fact create two new virus variants for their testing. Please note this fact as described in the "How We Tested" section.

We scanned for the I Love You virus in three different ways. In the first test, we left the code as is. In the second test, we changed every reference to love in the code. In the third test, we changed the size of the file by inserting a comment that did not affect the code.

Changing an existing virus results in a new virus. If a testing body does this, they brand themselves with, as it were, a scarlet "V" (as has CNET at this point). They mark themselves as a virus creating organization in the eyes of antivirus experts worldwide.
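Why even a trivial edit yields a new variant can be seen in a short editorial sketch (the script body below is an invented stand-in, not ILOVEYOU or any real virus): each byte-level change produces a distinct file, so a signature keyed to the original bytes no longer matches.

import hashlib

# Invented stand-in for a script body (NOT real virus code).
original = b'rem example script\r\nMsgBox "love"\r\n'
renamed  = original.replace(b"love", b"xxxx")                  # edit every "love" reference
padded   = original + b"' a comment that changes nothing\r\n"  # insert an inert comment

for name, body in [("original", original), ("renamed", renamed), ("padded", padded)]:
    print(name, hashlib.sha256(body).hexdigest()[:16])

# All three digests differ: byte for byte, each edited file is a new object,
# which is exactly why a modified virus is a new variant, not "the same virus".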

More importantly, producing new virus variants creates an incredibly complex quandary. It places the tester in a very difficult position, which can quickly escalate the problem.

When a tester claims that a product should not be purchased because it misses viruses, that tester takes on the burden of proof. Their claim can be challenged. Antivirus companies have every right to demand proof that the testing was fair. If in turn the proof cannot be given, they have the right to advertise that fact and demand a retraction.

Proof generally involves either having an independent body repeat the testing, or providing copies of the viruses missed to antivirus companies. In either case, where the virus was created by the testing body, they would need to send the new virus to someone else. If they send it to an antivirus company, other companies could rightly demand copies, too. But what happens if they send their new virus to someone else?

Creating a virus for testing is one thing, distributing it is quite another. Doing so escalates the problem and virtually destroys the testing body’s reputation. This is because they then become a virus creation and distribution organization and, once the virus has left their control, there is the possibility that their new virus might escape into the wild and spread.

True, CNET, or some other testing body, could conceivably attempt to sidestep this issue by saying they will not send the viruses they created. They could offer to explain how the antivirus company or independent tester can create the virus themselves, to see why a product missed it. This ploy is obviously not a solution, because ethical tester organizations and researchers at antivirus companies will refuse to create a new virus. In fact, many would also refuse to accept a newly created sample as well.

Contradictory Combination

We've discussed two factors, the use of simulated viruses, and the creation of new variants. If we combine these factors the results produce a contradiction in the logic upon which CNET’s methodology is based.

We can ask, why didn't they just use real, common viruses in testing?

The common reason given to justify the use of simulated viruses is the possibility that real viruses might escape from the test environment and spread. But if this was CNET's reason for using simulated viruses, wouldn't the same possibility of escape have existed for the two viruses they created? Or is the opposite true? They might have had a good, secure environment in which to test their new viruses. But if that's the case, it only brings us back to asking why they didn't use real viruses in that same, safe environment.

Each of these two factors (using simulated viruses and using modified viruses) has been demonstrated as an invalid basis for testing. When we juxtapose these two factors we evidence our claim that the logic underlying CNET’s methodology is contradictory, further weakening the already-crumbling foundation upon which their "virus testing" was based.

Summary

The use of simulated viruses in CNET’s review is bad. Representing them to readers as "test viruses" is worse. But creating new virus variants is the worst transgression of all -- especially as such tactics in testing are totally unjustifiable. There are better ways to test.

Well-documented methods to effectively test various antivirus solutions are available. Several excellent papers exist on antivirus product testing. There are also competent antivirus testing labs that can provide metrics testing, which can be fully documented and easily reproduced.

Moreover, it cannot be claimed as a matter of cost. Some antivirus labs test a variety of products on a regular basis and permit the publication of their most recent test results at little or no cost. Why do they do this? Because, first and foremost, they desire to see the publication of high quality, incontrovertible test results, rather than misleading results based on questionable methodology.

As a result, there is absolutely no justification for the use of simulated viruses, which do not represent reality. There is never a reason to create new viruses to test products, especially when there is not a secure, dedicated virus testing facility.

If CNET does not have a secure virus test lab then they should use a competent outside lab. Other news organizations do so. Indeed, there are highly qualified testing labs that can do accurate testing against viruses and under conditions that reflect reality.

Therefore, we must conclude that, when reviewing antivirus products, statistical metrics involving viruses should be delegated to antivirus experts who do it all the time. At the same time, other product facets such as usability, intuitive interface, update issues, support factors, and so forth should by all means be done by the experts at CNET who regularly test a variety of software. This methodology will solve the problems encountered in CNET’s antivirus product review of September 21, 2000.

It is therefore hoped that CNET, as the responsible news source it is, will retract the entire test, renounce forever the flawed methodology, and provide fair, factual, and beneficial antivirus product reviews in the future.

Maybe the four products tested by CNET would score exactly as they did in the fallacious testing or maybe they wouldn’t. But in either case, the product review would be based on facts rather than falsehood. Thus CNET's readers would benefit, instead of having their trust betrayed.

Sincerely,

Joe Wells

CEO and Director of Wells Antivirus Research Laboratory, Inc. USA
CEO, Founder and Director of WildList Organization International
Former Senior Editor of IBM’s antivirus online magazine, USA
Advisory board member Virus Bulletin, UK

The following individuals have asked to have their names attached to this open letter to indicate their agreement and support in this matter.

Francesca Thorneloe
Editor of Virus Bulletin. UK

Pavel Baudis
Vice President of ALWIL Software, Czech Republic
Advisory board member Virus Bulletin, UK

Kenneth L. Bechtel
Founder of Team Anti-Virus, North America

Dr. Vesselin Vladimirov Bontchev
Antivirus Researcher at FRISK Software International, Iceland
Founding member of CARO (Computer Antivirus Research Organization)
Founding member of VSI (the Virus Security Institute)

Shane Coursen
Manager of WarLab virus and antivirus testing facility. USA
Vice President of Wells Antivirus Research Laboratory, Inc. USA
Director of WildList Organization International

Joost De Raeymaeker
Owner of RSVP Consultores Associados, Lda. Portugal

Allan Dyer
Chief Consultant, Yui Kee Computing Ltd. Hong Kong

Nick FitzGerald
Director, Computer Virus Consulting Ltd, New Zealand
Advisor to the WildList Organization International
Former editor and antivirus product tester, Virus Bulletin, UK

David Harley
Security Analyst, Imperial Cancer Research Fund, UK
Consultant, SherpaSoft Anti-Virus UK & Mac Virus UK

Dr. Jan Hruska
Chief Executive Officer, Sophos Anti-Virus, UK

Jose Martinez
Manager of Hacksoft S.R.L. Perú

Andreas Marx
University Otto-von-Guericke Magdeburg
Head of the Virus and Anti-Virus Test Lab "AV-Test.org", Germany

Petr Odehnal
Head of Virus Lab at GRISOFT(c) SOFTWARE, Czech Republic

David Phillips
Project Officer, The Open University, Technology, UK

Peter V. Radatti
President and CEO of CyberSoft, Inc., USA

Stuart Taylor
Head of Virus Laboratory, Sophos Plc. UK

Robert Vibert
Anti-Virus Researcher and Solution Architect at Segura Solutions Inc. Canada

Eddy Willems
Technical Director of Data Alert International, Benelux
News Editor for EICAR

Righard J. Zwienenberg
Anti-Virus Researcher at Norman ASA, Norway
Founding member of VSG (the Virus Strategy Group)

Representatives of Aladdin Knowledge Systems, Network Associates, and Trend Micro regret their inability to be signatories, as their products (eSafe Desktop, McAfee VirusScan and PC-cillin 2000) were included in the review.

The « Rosenthal's Antivirus Test », or « Virus Simulator », developed by Doren Rosenthal of « Rosenthal Engineering », was meant to probe the heuristic capabilities (detection of new, unknown viruses) of antivirus products. As its name indicates, it is a virus simulator.

The « Rosenthal's Antivirus Test », or « Virus Simulator », generates files containing instruction sequences whose behavior antivirus products are expected to flag as suspicious. It also generates fake signatures that antivirus products are supposedly meant to intercept.
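To picture what such a "fake signature" amounts to, here is a hypothetical editorial sketch (the marker string and file layout are invented; this is neither Rosenthal's actual generator nor the EICAR standard test file): an inert file carrying a recognizable byte pattern and nothing that infects, replicates or damages anything.

MARKER = b"FAKE-TEST-SIGNATURE-NOT-A-VIRUS"  # invented, inert identification bytes

def write_simulated_sample(path):
    """Write a harmless .COM-style file: one return instruction plus a marker."""
    with open(path, "wb") as f:
        f.write(b"\xc3")   # x86 RET: the program exits immediately
        f.write(MARKER)    # no infection, replication or payload logic follows

write_simulated_sample("simulated.com")
# A scanner that reports this file as a virus is raising a false positive
# on inert bytes; one that stays silent is behaving correctly.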

None of these test objects submitted to the antivirus products are viruses or malware. They are merely contraptions that simulate, that make believe.

The results of tests run with the « Rosenthal's Antivirus Test », or « Virus Simulator », led to praise for antivirus products that reacted to all these fake behaviors and fake viruses. In other words, the antivirus products producing the most false positives (false alerts) were acclaimed, while those that made no mistake at all were shot down in flames!

Controversies flew in every direction across the Web, as, for example, in the famous discussion group alt.comp.virus: Doren Rosenthal or Symantec answer Me !!!. The Norton antivirus, routinely accused at the time of producing numerous false positives (false alerts), is accused there of producing no false positives at all.

This test is no longer available and, should you ever find it for download, write it off and do not use it: it is idiocy.

Doren Rosenthal in the alt.comp.virus Hall of Fame, 2002 (in harsh terms)

Doren Rosenthal:
A pain in the ass whose virus simulations are still a pain in the ass.

Professor Timo Salmi's best friend.

https://groups.google.com/forum/#!msg/alt.comp.virus/fVE9DrDSOLk/o8kU5Sq4kxQJ


Presentation of the Virus Simulator from Rosenthal Engineering

* * * Virus Simulator from Rosenthal Engineering * * *

Safely Validate Your Anti-Virus Protection

Virus Simulator, selected as one of the year's best programs in "PC Magazine's Guide to Shareware", was employed as a standard of comparison for all anti-virus products reviewed by "Software Digest", "PC Digest" and "LAN Reporter", with independent tests conducted by the "National Software Testing Laboratories".

"Unreservedly recommended!" by "Computer Virus Developments Quarterly", to test anti-virus measures or demonstrate how they work.

Additional articles have appeared in "Computerworld", "Telecomputing", "Mobile Office", "PC Novice", "Info-Security", among others. This latest version of Virus Simulator is an absolute necessity for anyone seriously interested in defending against viruses. Government agencies, businesses, security consultants, law enforcement, institutions and system administrators employ Virus Simulator when conducting internal security audits and training.

(VIRSIM##)




Analysis of version 3.0 of the « Rosenthal's Antivirus Test » (« Virus Simulator »)

Antivirus              Result                            Update
Avast                  Virussim [Tool]                   06.10.2014
Baidu-International    Hacktool.DOS.VirusSim.ATF         06.10.2014
Kaspersky              VirTool.DOS.VirusSim.a            06.10.2014

Analysis of version 3.0 of the « Rosenthal's Antivirus Test » (« Virus Simulator »)
A test the antivirus products truly took a dislike to.

Antivirus              Result                            Update
AVG                    Generic_c.BOXN                    06.10.2014
Ad-Aware               Virtool.Dos.Virussim.B            06.10.2014
Agnitum                Tool.VirusSimulator.A             06.10.2014
Antiy-AVL              HackTool[VirTool]/DOS.VirusSim    06.10.2014
Avast                  Virussim [Tool]                   06.10.2014
Avira                  KIT/PolyDOS.Virus.A               06.10.2014
Baidu-International    HackTool.DOS.VirusSim.aK          06.10.2014
BitDefender            Virtool.Dos.Virussim.B            06.10.2014
Bkav                   Dos.Clodf75.Trojan.d868           06.10.2014
ClamAV                 DOS.VirusSim.a                    06.10.2014
Comodo                 Application.VirTool.VirusSim.A    06.10.2014
Cyren                  Tool!dc37                         06.10.2014
ESET-NOD32             VirTool.VirusSim.A                06.10.2014
Emsisoft               Virtool.Dos.Virussim.B (B)        06.10.2014
F-Prot                 Tool!dc37                         06.10.2014
F-Secure               Virtool.Dos.Virussim.B            06.10.2014
Fortinet               W32/VIRUSSIM.A!tr                 06.10.2014
GData                  Virtool.Dos.Virussim.B            06.10.2014
Ikarus                 VirTool.DOS.VirusSim              06.10.2014
Jiangmin               PolyEngineSGen.VirusSim.a         06.10.2014
Kaspersky              VirTool.DOS.VirusSim.a            06.10.2014
McAfee                 Demo-VirSim.kit                   06.10.2014
McAfee-GW-Edition      Demo-VirSim.kit                   06.10.2014
MicroWorld-eScan       Virtool.Dos.Virussim.B            06.10.2014
NANO-Antivirus         Riskware.Dos.VirusSim.gsvb        06.10.2014
Panda                  Univ.AP.H                         06.10.2014
Qihoo-360              Win32/Virus.DoS.804               07.10.2014
TrendMicro             VIRUSSIM.A                        06.10.2014
TrendMicro-HouseCall   VIRUSSIM.A                        06.10.2014
VBA32                  PolyEngineSGen.VirusSim.a         06.10.2014
ViRobot                VirTool.VirusSim.53246            06.10.2014
Zillya                 Tool.DOS.0B19F60D                 06.10.2014
nProtect               Virtool.Dos.Virussim.B            06.10.2014