Randall Kennedy's take on the W7 Benchmark Tests

"Sorting through the latest Windows 7 benchmark results
I take a closer look at ZDNet's latest numbers and how they compare to my own test results.

Windows 7 is not faster than Windows Vista.

Yes, it is! No it's not! Is too!

Looking at all of the benchmarks now surfacing for the "pre-beta" Windows 7 releases is enough to make your head spin. It brings to mind the age-old adage "there are lies, damned lies, and statistics." Or in this case, benchmarks.

ZDNet's Adrian Kingsley-Hughes ran four benchmarks on Windows 7 that show, in three cases at least, that the pre-beta is faster than Vista SP1 -- and notably slower in a fourth instance. My own DMS Clarity Studio benchmarks show that Windows 7 pre-beta performs essentially the same as Vista SP1.

So why the apparent contradiction? That gets us back to lies, damned lies, and benchmarks. Benchmarks are designed to test specific things, so you can't compare them blindly.

For example, one of Kingsley-Hughes' tests looked at boot time, where Windows 7 was about 39 percent faster than Vista SP1 and 17 percent faster than XP SP3. That's great, but my benchmarks test runtime performance, not boot time. So Kingsley-Hughes' boot tests don't contradict mine because they test something else. (And it's great that Windows 7 pre-beta boots faster.)
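
For readers who want to check that kind of arithmetic, here is a minimal sketch of how a "percent faster" figure can be derived from two timings. The boot times below are hypothetical, not Kingsley-Hughes' measurements, and the formula shown is just one common convention:

```python
# Hypothetical boot times, not actual measurements.
# One common convention: a drop from t_old to t_new seconds is
# 100 * (t_old - t_new) / t_old percent faster.
def percent_faster(t_old, t_new):
    return 100 * (t_old - t_new) / t_old

vista_boot, win7_boot = 60.0, 36.6  # seconds (made up for illustration)
print(f"Windows 7 boots {percent_faster(vista_boot, win7_boot):.0f}% faster")  # ~39%
```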

The PassMark benchmark that Kingsley-Hughes used seems to contradict my benchmarks more directly, until you examine the results closely. First, his results show that the Windows 7 pre-beta is just 1.5 percent faster than XP SP3 and 2.1 percent faster than Vista SP1 -- basically a tie, as my own results showed.

Second, PassMark results vary wildly, so you can't take the results from just one system, as Kingsley-Hughes did, and extrapolate them to the universe of PCs. You need to look at multiple systems and then figure out a rough average. On my test PC (Intel Core 2 Duo T7200 CPU, 2GB of RAM, 7200-rpm drive), the PassMark results showed that XP SP3 was 24 percent faster than the pre-beta Windows 7. And those PassMark subtests can vary by as much as 20 percent -- from pass to pass -- on the same system, so you need to give yourself a good margin of error when comparing results.
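
To make that margin-of-error reasoning concrete, here is a minimal sketch of averaging repeated passes and comparing the gap between two means against the pass-to-pass spread. The per-pass scores are invented for illustration, not actual PassMark output:

```python
import statistics

def summarize(scores):
    """Mean score and pass-to-pass spread as a percentage of the mean."""
    mean = statistics.mean(scores)
    spread_pct = 100 * (max(scores) - min(scores)) / mean
    return mean, spread_pct

# Hypothetical per-pass composite scores on one machine; real PassMark
# subtests can swing by as much as 20 percent between passes.
vista_sp1 = [812, 745, 798, 760, 829]
win7_prebeta = [805, 841, 772, 810, 793]

for name, scores in (("Vista SP1", vista_sp1), ("Win7 pre-beta", win7_prebeta)):
    mean, spread = summarize(scores)
    print(f"{name}: mean {mean:.0f}, pass-to-pass spread {spread:.0f}% of mean")

# If the difference between the two means is smaller than the spread,
# the honest call is a tie, not a win for either OS.
```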

Finally, the PassMark tests are synthetic: they directly test hardware components and then make assumptions about how those component results affect actual users. My tests run real applications and services (Office, Workflow, Database, and Windows Media) -- what business people actually do. The debate between synthetic and real-world tests has gone on as long as there have been PC benchmarks: publications that originated with Ziff-Davis have tended to rely on synthetic tests, while publications such as InfoWorld, published by IDG, have tended to rely on real-world tests. We won't resolve that 20-year-old debate here.

The PCMark Vantage tests that he ran show that the Windows 7 pre-beta is about 6 percent faster than Vista SP1 -- not bad, but not the same as what the PassMark tests show. So, what to make of that apparent contradiction? PCMark Vantage was designed for Vista testing in consumer usage scenarios, so it relies a lot on graphics-processing applications. Neither PassMark nor my tests are consumer-oriented, so that "contradiction" is easy to explain.

Note: PCMark Vantage does not run on XP, so there's no comparison to be made there. Although Kingsley-Hughes published PCMark Vantage results ostensibly for XP, he later pulled those numbers.

That leaves the CineBench numbers, which focus on multimedia apps (unlike PassMark or my DMS Clarity Studio benchmarks). Kingsley-Hughes' tests show that Windows 7 pre-beta is 3 percent faster at running CineBench than Vista SP1. Great!

So, it's clear from the tests that are the closest in intent -- DMS Clarity Studio and PassMark -- that Windows 7 and Windows Vista perform essentially the same, even if individual PassMark tests vary from that overall conclusion. The other tests focus on very specific performance measures, so they can't be used for any general performance conclusions.

At the end of the day, what matters is performance in your own context. By all means, run tests like PassMark, PCMark Vantage, and CineBench to get a sense of general-user (PassMark) and consumer-application (PCMark Vantage, CineBench) performance differences between Vista and Windows 7. If you're in IT -- as most InfoWorld readers are -- the DMS Clarity Studio tests focus on what you and your business users do, so they remain the most useful of these benchmarks for your purposes.

Of course, you can easily verify my results by downloading the free (for non-commercial use) DMS Clarity Studio suite from the exo.performance.network Web site and running the OfficeBench test script for yourself. This Office-specific benchmarking tool, which I wrote while working for Intel's Desktop Architecture Labs (DAL), drives actual Office applications through its test script. It runs unmodified across four generations of Windows and Office and typically shows less than 1 percent variability between test runs (contrast that with the synthetic PassMark).
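
For illustration only, here is a rough sketch of the scripted-workload idea behind application-level tools like OfficeBench: time the same end-to-end task several times and report run-to-run variability. This is not the actual OfficeBench code, and run_office_script is a hypothetical placeholder for driving real applications:

```python
import statistics
import time

def run_office_script():
    """Hypothetical stand-in for scripting real applications (open, edit, save)."""
    time.sleep(0.1)  # placeholder for the actual scripted work

def benchmark(task, runs=5):
    """Time a scripted task repeatedly; report mean and run-to-run variability."""
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        task()
        times.append(time.perf_counter() - start)
    mean = statistics.mean(times)
    variability_pct = 100 * statistics.pstdev(times) / mean
    return mean, variability_pct

mean_s, var_pct = benchmark(run_office_script)
print(f"mean {mean_s:.3f}s, run-to-run variability {var_pct:.2f}%")
# A well-behaved application-level benchmark stays near 1 percent variability;
# anything higher means widening the margin of error before comparing OSes.
```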

Or you can just take these latest results out of context, as some in the blogosphere have chosen to do. To his credit, Kingsley-Hughes simply ran the numbers, without commenting on what they meant or how they compared to others. We should all follow his example, or at least cite the relevant context before drawing conclusions."

Sorting through the latest Windows 7 benchmark results | Enterprise Desktop | Randall C. Kennedy | InfoWorld