Choosing Anti-Anti-Virus Software

On a more development-related note, it is important to keep at least one copy of Vista in a VM with all those options left at their defaults. I don’t know how many times I’ve heard fellow developers say, “but it works on my machine”. The two biggest issues I see are:

  1. Developers never testing their software with a non-admin account.

  2. Developers never testing with a software firewall that has an “ask first” policy of whitelisting applications. I don’t know how many times I’ve seen an app that is supposed to hit the internet fail because a firewall is prompting the user to allow or block. No “Retry” button, just an ugly, unhelpful error.
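That second failure mode is avoidable if the app offers a retry instead of a hard error. Here is a minimal Python sketch of the idea; the prompt wording is invented, and the `ask`/`opener` hooks are mine purely so the loop can be exercised without a network:

```python
import urllib.request
import urllib.error

def fetch_with_retry(url, ask=input, opener=urllib.request.urlopen):
    """Fetch a URL, offering the user a retry instead of a hard failure.

    A software firewall's allow/block prompt often makes the first
    attempt fail or time out; looping on a retry prompt lets the user
    answer the firewall and try again. `ask` and `opener` are
    injectable only to make the logic testable.
    """
    while True:
        try:
            with opener(url, timeout=10) as resp:
                return resp.read()
        except (urllib.error.URLError, TimeoutError):
            answer = ask("Connection failed (firewall prompt?). Retry? [y/N] ")
            if answer.strip().lower() != "y":
                return None
```

The point is not the specific loop, just that the network error is surfaced as a user decision rather than a dead end.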

One more thing I always disabled on XP was System Restore. All it did was take up time and space, and I never once had an occasion to use it.

Again, after having read the article Jeff referenced for those test results, all the test really shows is that you can’t really multi-task in XP with only 512MB RAM and one CPU. Especially if one of the tasks is heavily hitting the hardware. Double-especially when the hardware is virtualized.

David,

512mb is a TON of memory for XP, even for heavy multitasking (several apps active) and anti-virus. Don’t forget that XP was released in 2001, when 512mb was a pretty substantial amount of memory to have in a PC.

I use Visual Studio 2005 in virtual machines running XP all the time and it’s fine even with 128mb, though 192mb is a bit roomier.

But yes, virtualization is particularly brutal on disk performance. Which means running anti-virus (one of the most disk-intensive apps out there) under a VM would be quite painful.

At any rate, I have total confidence in the scientific method used in Oli’s tests. He has a baseline number reflecting benchmark results without any software installed, and then differential numbers reflecting benchmark results with (x) software running. The absolute difference might be smaller on physical hardware, but the relative scale of the perf difference should be the same.
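That baseline-vs-differential methodology boils down to a one-line ratio. A sketch in Python, using made-up placeholder numbers rather than Oli’s actual figures:

```python
def relative_overhead(baseline, with_software):
    """Fractional slowdown of a benchmark relative to its baseline.

    baseline:      benchmark time with nothing installed (lower is better)
    with_software: benchmark time with (x) software running
    Returns e.g. 1.15 for a 115% slowdown.
    """
    return (with_software - baseline) / baseline

# Illustrative placeholder numbers only: a disk benchmark taking
# 40 s bare and 86 s with on-access scanning enabled.
slowdown = relative_overhead(40.0, 86.0)
print(f"{slowdown:.0%} slower")
```

Because the result is a ratio, the claim above holds: even if the VM inflates both numbers, the relative overhead stays comparable.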

Thanks for the link, Jeff, and thanks for standing up for my figures.

I did try to perform them in the best possible way, but without having to wipe and reinstall Windows 100-odd times. The VM was the only way to do that.

There are obviously going to be bottlenecks that rear their heads in the VM that a native OS wouldn’t see – but the comparisons are just a guide. If anything, this should really be read as an AV-vs-AV rather than an AV-vs-nothing study if you want direct comparisons, but either way, I stick with what I say and “go commando”.

A sharp eye and some common sense is my antivirus.

Great post Jeff.

I run Avast Home and was not aware of the 115% I/O delay. I will definitely look over my options again now that my eyes are open. In the case of Avast, it should be easy to turn off the “on access scan” when I’m doing heavy tasks like gaming (and leave it on when the rest of the family is using the computer).

To clarify one point: I am not proposing that everyone immediately stop running anti-virus software just because I said so. If only I had that kind of power…

I just want users to understand exactly how severe the performance penalty is when you do choose to run anti-virus software.

As David Sokol so aptly pointed out, sometimes this is a reasonable tradeoff. But I’d still like to see it get fixed in the OS first; there’s a reason Linux and Mac users almost never need to run anti-virus software.

It is extremely important to point out that your two benchmark sources, TomsHardware and Anandtech, publish benchmarks with an eye toward gaming (rightly so, since this is where any performance impact will be felt most severely).

What people are forgetting is that we experienced the exact same sort of performance penalty with WinXP when it was released. People were disappointed to find their games ran slower than on Win98. But as better drivers emerged and (more importantly) applications were OPTIMIZED for XP instead of Windows 98, XP became known as a fantastic-performing platform in a very short amount of time.

In 9 months, nobody will be complaining about Vista’s performance.

Jeff, the difference between backups and VMs is that a VM is typically a copy of the current state of your machine, whereas a backup is a collective copy of the individual files that make up your machine. If you are infected and you don’t know when you got infected, you need the ability to incrementally go back and find a VM image that isn’t infected, and that copy may not have a file you want that was in another copy.

VMs are great sandbox tools: you can do stuff, get things so messed up, and then easily restore a safe image and start again.

As for not running any anti-virus software at all: that assumes “you” will be the only one allowing your machine to get infected. There is this little thing called the OS that, on its own, allows things to exploit it and get infected.

Take this, albeit far-fetched, thought for example. Let’s say I was able to hack into or work at a DVR company (like Tivo) and insert a virus/trojan into the next OS of the DVR. Your DVR connects up to the mothership, downloads its marching orders, and then spends all day behind your firewall trying to hack your PC or futuristic toaster, since you leave both plugged in and/or on all day. So you come home and the only thing you immediately notice is that only half the heating coils in your toaster work. Then weeks or months later, your system eats itself. Then what?

I totally agree that anti-virus software is the biggest blight on performance that I have ever seen. However, when I am doing things that require all my system resources (like gaming), I just temporarily disable the anti-virus software… And the sidebar, and Windows Defender… and the firewall… and Windows Update… etc.

But when an OS is not a security threat, and it’s left solely up to the user to be a good citizen, then I will be right there with you with my ‘user’ security, no anti-virus, and my firewire5000 backups.

Post rant.

I agree with Jeff -

  1. Disable those 4 things; they are either slow or annoying as hell.
  2. Set yourself up as a ‘user’ by default and use the ‘administrator’ account when you need it.
  3. Get a good anti-virus like Kaspersky.
  4. Pay attention to your “performance score”, as you could be running the wrong speed of RAM, running on a 4500 RPM hard drive, etc.

Do those things and you should have a very good Vista experience.

That’s why I install AVG on my home computer… and used to on any family member’s computer that didn’t have anything. It wasn’t the best, but it would usually just work and didn’t slow them down so much that they would complain about it…

Also, Kaspersky is light, and it is the most effective:

http://www.virus.gr/english/fullxml/default.asp?id=82

Best anti-virus ever? Installing Windows on D: instead of C: :wink:

@EnricoG: What about %winroot%\system32\

Ryan posted the following link
http://www.virus.gr/english/fullxml/default.asp?id=82

AOL uses Kaspersky, so I am glad to see they got the same score. But it’s funny they tested different versions.

Rank

  1. Kaspersky version 6.0.0.303 - 99.62%
  2. Active Virus Shield by AOL version 6.0.0.299 - 99.62%

I think we need a different approach to virus testing.

We need a reliable record of what files have been modified. Any file which is modified (or is new, thus never scanned) gets scanned before being allowed to run. Once a file has been approved then it runs the next time without being scanned.

The AV program would keep its own list of approved files. This list would have to be somehow tamper-proof, as would the detection of files that have been written. Support for this would probably have to be built into the heart of the OS.
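The bookkeeping behind this scheme can be sketched as a digest cache: a file is rescanned only when its contents no longer match the digest recorded when it was approved. This is only an illustration (the class and function names are mine), and it deliberately leaves out the tamper-proofing and OS-level hooks described above:

```python
import hashlib
from pathlib import Path

def file_digest(path):
    """Content digest of a file; any modification changes it."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

class ApprovedList:
    """Scan-once cache: approved files skip rescanning until they change.

    A real AV would need this store to be tamper-proof and backed by
    OS-level change tracking; a plain in-memory dict is just a sketch.
    """

    def __init__(self):
        self._approved = {}  # path -> digest recorded at approval time

    def needs_scan(self, path):
        # New files (never approved) and modified files must be scanned.
        return self._approved.get(path) != file_digest(path)

    def approve(self, path):
        # Record the digest so the current contents are trusted.
        self._approved[path] = file_digest(path)
```

Usage would be: on execute, call `needs_scan`; if true, run the full scan and, if clean, call `approve` so the next launch is free.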

Loren, instead of that, some software, such as Avira Antivir, can scan files only on write. (And only those with a given format, normally.)

And I’m pretty sure there is actually something like the tripwire+av that you mention, I just can’t think of any offhand.

As for only marking files as executable, that wouldn’t defend against the worst current threats: Exploiting buffer overflows in running software (fortunately partly fixed by NX), and spyware installed alongside legit software, at which point you gave it full permissions to own you and mark whatever it wants executable for later. Cute!

Jeff, you speak a lot about virtualization and recently mentioned in your reply that Microsoft should embrace it. I could’ve sworn they quietly bought out a company that produced emulators a while back. I could be mistaken.

But what does amuse me is that what you’re essentially talking about is what OS X had in its earlier implementations: the Classic Layer. I’m not going to go into exhaustive detail, but it strikes me as amusing that the solution you see was already seen by Apple a while back.

Classic Layer is still available for OS X, but if you know what you’re doing you’ve already either found the OS X version of the app or an OS X implementation by a later vendor.

I think without the Classic Layer, OS X would’ve fallen flat on its face. OS X abandoned OS 9’s conventions in favor of a more intelligent design, but even Apple knew that if they didn’t do it, they’d be hosed.

It’s not the first time Apple has shown themselves to be cunning in this regard. A while back they converted from Motorola 680x0 processors to PowerPC processors. But they kept compatibility for 680x0 apps in PowerPC implementations and even made it possible to compile a single app into a Fat Binary, which would let you compile once and deploy to both. It was woefully bloated if you made them into one, but it was a better solution than trying to run two separate downloads.

Frankly, if Vista does dump so much backwards support, you’re barking up the right tree.

I don’t see how running the entire OS in a virtual machine could have zero performance cost compared to anti-virus, even if it provided some sort of data protection. Sounds like magical thinking.

Also, I don’t think that trashing Windows and starting anew makes sense; that’s like throwing your desk away, drawers and all, because you dropped ink on one book on top of it. You never know what you’re going to lose, or what you’re going to roll back to.

Oh, and to the poster who thinks the Apple Classic Layer is such an amazing invention: Microsoft did the same thing years before with the WOW subsystem for 16-bit Windows applications, and of course, what do you think the DOS virtual machine in 16-bit Windows was? In fact, NT has had support for subsystems in general, including POSIX and an OS/2 1.x compatibility subsystem. In Win64 there is a break in compatibility with Win32 stuff to some extent; that’s the jump Microsoft is using to start a few things anew.

Rick Strahl, some small percentage of the world is capable of not running AV software and getting by, but most people should as they do not have your skill level. That said, most AV software is not so good. Certainly Norton is a pile of crap.

As to speaking against “American society” and its paranoia, please let us native Americans handle that, and you can deal with your country’s paranoia, e.g., making it an illegal and jailable offense to question certain aspects of your country’s history in the 1940s.

Great article! Like you, I don’t run ANY anti-virus or anti-malware software other than the non-resource-hogging app SpywareBlaster. That said, the hordes of ignorant computer-using idiots out there must sacrifice performance for protection from themselves. It’s a fact of life for the vast majority of users. XP should never have granted admin rights to new user accounts, but I do understand why. How would you (Microsoft) like to field the support calls from millions of morons when they can’t install a program or defragment their systems? UAC is a decent ‘work-around’ for this problem.