In the months leading up to Valve’s official Half-Life 2 unveiling, we were given two tools with which to test performance. One was the beta version of Counter-Strike: Source; the other was an eye-candy demonstration called the Video Stress Test, intended to help evaluate system readiness before the actual game’s arrival. When we first ran a slew of processors through both tests and made our initial comparisons, each test gave us a different result. The Video Stress Test, it seemed, was bound by graphics processing power, while the Counter-Strike demos we ran scaled with processor performance. Naturally, we were left to guess which metric would better represent the actual game once it emerged.
And now it’s out, available to the gaming enthusiasts who’ve waited so long for Half-Life 2 to be finished. It should come as no surprise that we’re revisiting CPU scaling; after all, if processor performance takes a backseat to graphics, there’s still time to fine-tune the upgrades on your holiday wish list.
We received plenty of feedback on the last processor scaling piece, and we tried to take all of it into account for this story. We’ve added an intermediate resolution, so we now present results at 800x600, 1280x1024, and 1600x1200. Each setting corresponds to a different set of quality options, reflecting a wide range of visual experiences.
The tests at 800x600 employ low model detail, low texture detail, no anti-aliasing, bilinear filtering, simple water reflections, low shader detail, low shadow detail, and the DirectX 9.0 code path.
At 1280x1024, those settings rise to medium model detail, medium texture detail, 2x anti-aliasing, 2x anisotropic filtering, world reflective water detail, high shader detail, high shadow detail, and again, the DirectX 9.0 code path.
Then, 1600x1200 steps in to represent Half-Life 2 at its pinnacle. Here you’ll find high model detail, high texture detail, 4x anti-aliasing, 16x anisotropic filtering, world reflective water detail, high shader detail, high shadow detail, and naturally, DirectX 9.0.
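For readers who want to pin down comparable settings themselves, presets like these can be approximated through Source engine console variables in an autoexec.cfg file. The mapping below is our sketch of how the menu options correspond to console variables; the variable names are assumptions based on the retail Half-Life 2 build and are not necessarily the exact configuration method we used.

```
// Rough mapping of the 1280x1024 "medium" preset to Source console
// variables (autoexec.cfg). Names assumed from the retail HL2 build;
// they may differ between engine versions.
mat_dxlevel 90            // force the DirectX 9.0 code path
mat_antialias 2           // 2x anti-aliasing
mat_forceaniso 2          // 2x anisotropic filtering
mat_picmip 1              // medium texture detail (0 = high, 2 = low)
r_rootlod 1               // medium model detail (0 = high, 2 = low)
r_waterforceexpensive 1   // world-reflective water detail
mat_reducefillrate 0      // high shader detail
r_shadowrendertotexture 1 // high (render-to-texture) shadow detail
```

The 800x600 and 1600x1200 presets follow the same pattern, with the relevant values stepped down or up accordingly.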
We’ve also added more processors, 17 in total this time around, to cast more light on the variety of systems out there. And before you start wondering who’d pair an Athlon XP 2100+ with NVIDIA’s GeForce 6800 Ultra, bear in mind that we’re looking at processor performance here. By using a high-end graphics card, we minimize that component’s effect on the overall scores to the best of our ability. We used NVIDIA graphics cards because ATI still doesn’t have matching PCI Express and AGP products, which we need in order to normalize graphics performance between the Intel and AMD platforms.