Summary: Are you an AMD user who was planning on upgrading to a GeForce 8800 GTX or 8800 GTS sometime in the near future? If so, you'll want to check out today's article. We've gathered AMD CPUs ranging from the X2 3800+ all the way up to the FX-62 and paired them with the GeForce 8800 GTX and 8800 GTS, as well as ATI's Radeon X1950 XTX. See which CPU speeds deliver the best performance with these GPUs in this article!
NVIDIA has incorporated new features into the GeForce 8800 GTX, such as a unified shader architecture with 128 stream processors and a 384-bit memory interface running at 900MHz with 768MB of memory. The previous-generation GeForce 7900 GTX pales in comparison, with its 24 pixel shaders and 8 vertex shaders, as well as a narrower 256-bit memory interface.
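To put that wider memory interface in perspective, here's the standard peak-bandwidth arithmetic as a quick sketch (Python; the 7900 GTX's 800MHz memory clock is our recollection of its published spec, not a figure from this article):

```python
# Peak memory bandwidth = (bus width in bytes) x effective data rate.
# GDDR3 is double data rate, so effective rate = 2 x memory clock.
def bandwidth_gb_s(bus_width_bits, mem_clock_mhz):
    return (bus_width_bits / 8) * (2 * mem_clock_mhz) / 1000  # GB/s

print(bandwidth_gb_s(384, 900))  # 86.4 GB/s -> GeForce 8800 GTX
print(bandwidth_gb_s(256, 800))  # 51.2 GB/s -> GeForce 7900 GTX (assumed 800MHz memory)
```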
With so much graphics horsepower on board the GeForce 8800, however, some potential owners may run into cases where the card isn't running at its full potential and won't see these huge performance gains; the card isn't able to "stretch its legs," so to speak. This occurs frequently in games based on aging engines. Sports titles, for instance, are often built on the same rehashed engine year after year. In some cases, performance can even be held back by a poorly coded game.
But software isn't the only thing that can hold back a next-gen card at launch; another culprit is often the CPU.
Quite simply, with next-gen graphics processors delivering 1.5-2X the performance of their predecessors at launch, the CPU can become a bottleneck in many of the titles we test with. After all, the clock speed of new CPUs tends to increase only in increments of 100-200MHz, delivering just 10% or so more performance with each new processor release. You can spot a CPU bottleneck in benchmarks where you hit the same frame rate regardless of graphics settings (such as increasing the screen resolution from 1280x1024 to 1600x1200). As soon as a faster CPU is inserted, your frame rates increase.
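To make that diagnostic concrete, here's a minimal sketch of the check we apply when reading our benchmark charts (Python, with hypothetical frame rates purely for illustration): if the frame rate barely moves as the resolution climbs, the GPU is waiting on the CPU.

```python
# Flag a likely CPU bottleneck: fps that stays flat across resolutions.
def is_cpu_bound(fps_by_resolution, tolerance=0.05):
    """True if frame rates vary by no more than `tolerance` (5% by default)."""
    rates = list(fps_by_resolution.values())
    return (max(rates) - min(rates)) / max(rates) <= tolerance

# Hypothetical results for a GeForce 8800 GTX on a mid-range CPU:
gtx_fps = {"1280x1024": 121.0, "1600x1200": 118.5, "1920x1200": 116.8}
print(is_cpu_bound(gtx_fps))  # True -> raising the resolution is effectively "free"
```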
This is a situation you’d like to avoid if you’ve just plunked down $700 on a brand new 8800 GTX graphics card, as you’re not getting the most from your money (although being CPU-limited does allow you to crank up the image quality settings for “free”, as you won’t get the performance hit usually associated with these setting changes).
Intel’s latest Core 2 CPUs are based on an entirely new micro-architecture, with a wider execution core, 1,066MHz FSB and unified L2 cache architecture with up to 4MB on-chip. As a result, Core 2 delivers substantially more performance than any previous processor from AMD or Intel, including AMD’s flagship Athlon 64 FX-62.
This makes the Core 2 Extreme X6800 we tested with in our GeForce 8800 Performance Preview article an excellent companion for NVIDIA’s latest GeForce cards, but what about the millions of AMD users out there? Is the GeForce 8800 GTX still capable of outrunning the Radeon X1950 XTX by a factor of 2X on an Athlon 64 X2 4200+? What about the 5000+?
Many of you have asked us this very question, and today we’re here to provide some answers. We’ve included AMD CPUs ranging from the Athlon 64 X2 3800+ all the way up to the Athlon 64 FX-62 to see how the GeForce 8800 GTX and 8800 GTS, as well as ATI’s Radeon X1950 XTX scale with these CPUs…
While we're limiting our testing to AMD's AM2-based CPUs, which predominantly ship with a 512K L2 cache, we did want to test the performance impact of a larger 1MB L2 cache for Socket 939, Athlon 64 X2 5200+, and Opteron 1xx users. To accomplish this, we lowered the multiplier on our FX-62 from its factory default of 14.0x to 13.0x, dropping it from 2.8GHz to 2.6GHz to simulate a 5200+ (same 2.6GHz clock, but with 1MB of L2 per core). This chip was then compared to our stock AMD Athlon 64 X2 5000+, which also runs at 2.6GHz but with 512K of L2 per core.
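For reference, here's the arithmetic behind that simulation as a quick sketch (Python; AM2 processors derive their clock speed from a 200MHz reference clock multiplied by the CPU multiplier):

```python
# Effective CPU clock = 200MHz reference clock x multiplier (AM2).
REF_CLOCK_MHZ = 200

def effective_clock_mhz(multiplier):
    return REF_CLOCK_MHZ * multiplier

print(effective_clock_mhz(14.0))  # 2800 MHz -> stock FX-62
print(effective_clock_mhz(13.0))  # 2600 MHz -> matches the X2 5200+ and 5000+
```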
3DMark 06 – Direct3D
As expected, 3DMark scales nicely with faster CPUs, thanks in large part to its dedicated CPU tests. If you look at the game test results, however, you'll see that both the GeForce 8800 GTX and GTS scale best with the 5000+ and FX-62; the GeForce 8800 GTX/X2 3800+ combination is often outperformed by the GTS paired with faster X2 CPUs. This test illustrates why it's important to have a fast CPU if you plan on going with a GeForce 8800 GTX card.
Half-Life 2: Lost Coast – Direct3D
The GeForce 8800 GTX is largely CPU-bound in Half-Life 2: Lost Coast. As you can see, performance barely changes going from 1280x1024 to 1600x1200 with this GPU; even with the FX-62 we only saw a 3% performance hit. In comparison, the X1950 XTX's performance dropped by 19% with the FX-62.
Dark Messiah of Might and Magic – Direct3D
Since it's a little more demanding than Lost Coast, GTX performance scales a little better across the various CPUs in Dark Messiah of Might and Magic, although we're clearly still CPU-limited with the 4600+, 4200+, and 3800+, and to a slightly lesser extent with the 5000+. With its fewer shaders, slower clocks, and narrower memory interface, the GeForce 8800 GTS isn't nearly as CPU-bound as the GeForce 8800 GTX was in Dark Messiah; performance scales appropriately with each resolution change, and the CPU doesn't really become a bottleneck unless you're dealing with a 3800+.
Battlefield 2 – Direct3D
Quake 4 – OpenGL
The GeForce 8800 GTX is largely CPU-bound with all the processors we tested in Quake 4; even at 1920x1200, the GTX's performance only drops 4% from 1280x1024 on the FX-62. It looks like AMD's CPUs just aren't fast enough to keep the GPU fed with data in this particular game. Note, however, that the GeForce 8800 GTX still enjoys a comfortable performance advantage over the GeForce 8800 GTS and Radeon X1950 XTX at 1600x1200 and up.
Lock On: Modern Air Combat – Direct3D
Even in a flight sim like LOMAC, the GeForce 8800 GTX is CPU-bound with pretty much all of AMD's CPUs, including the FX-62 at lower resolutions. Even at 1920x1200, performance is within 7% of the frame rate we saw at 1280x1024 with the GeForce 8800 GTX/FX-62 combination.
F.E.A.R. – Direct3D
The GeForce 8800 cards scale much better under F.E.A.R., even with a slower CPU like the X2 3800+. This bodes well for the GTX under the more shader-intensive titles we're likely to see in 2007, such as Crysis and Unreal Tournament 2007. The GeForce 8800 GTX and Radeon X1950 XTX perform neck-and-neck in this game, as we also saw earlier this month with the Core 2 Extreme CPU.
Oblivion – Direct3D
As with our F.E.A.R. results, Oblivion scales well with the GeForce 8800 GTX, even with slower CPUs like the X2 3800+.
Oblivion (Foliage Area) – Direct3D
Under the greater demands of our foliage area, we’re entirely GPU-bound. This means that regardless of the CPU used, the GeForce 8800 GTX and GeForce 8800 GTS delivered the same performance at the various resolutions. This probably comes as good news to those of you with slower AMD CPUs.
Call of Duty 2 – Direct3D
Once again we're GPU-bound rather than CPU-bound in Call of Duty 2. Both the GeForce 8800 GTX and GeForce 8800 GTS delivered the same performance regardless of the CPU used.
Far Cry – Direct3D
Company of Heroes – Direct3D
Even with a slower CPU like AMD's Athlon 64 X2 3800+, performance scales well, though we still saw higher numbers with the faster CPUs at every resolution except 1920x1200. At that resolution we're totally GPU-bound.
The GeForce 8800 GTS is a cut-down derivative of the GeForce 8800 GTX, with fewer stream processors, slower clocks, and a narrower 320-bit memory interface with less onboard memory. Since it doesn't have quite the graphics horsepower of the GeForce 8800 GTX, it isn't CPU-limited nearly as often. That said, those of you with X2 3800+ and X2 4200+ chips in particular will want to look closely at our results, as there are definitely cases where the GTS is CPU-bound with these CPUs.
The bottom line for those of you with slower AMD X2 CPUs is that you're going to want to overclock your processor a little to get the best performance out of the GeForce 8800. The exact amount you should shoot for depends on which graphics card you plan on getting.
If you were thinking about upgrading to a GeForce 8800 GTS, you won't need to hit speeds quite as high; 2.4GHz or more is a good starting point when overclocking. But if you were planning on picking up a GeForce 8800 GTX, even our 2.6GHz 5000+ was CPU-bound in some cases, so you'll probably want to shoot for higher speeds still.
Of course, another thing we noticed is that the degree to which you're CPU-bound varies from game to game. In games based on older engines like Valve's Source (Half-Life 2 and Dark Messiah) and id's Doom 3 engine (Quake 4), we often found ourselves CPU-bound with both the GeForce 8800 GTX and GTS. Flight sims also draw heavily on CPU performance, so it was no surprise to see performance held back in Lock On: Modern Air Combat.
In fact, if you've got an X2 3800+ running at stock speeds, there's no point in upgrading to a GeForce 8800 GTX, as there were often cases where the 3800+/8800 GTX combination was outrun by the 8800 GTS paired with a faster CPU like the 4200+ or 4600+.
In shader-heavy titles like Oblivion and F.E.A.R., however, the GeForce 8800 GTX and 8800 GTS were never CPU-bound. In fact, in our foliage test in Oblivion we were 100% GPU-bound: whether we were testing with an X2 3800+ or the FX-62, performance was the same across all resolutions, including 1280x1024! Keep in mind that we were testing Oblivion with HDR+AA, which is harder on the GPU; had we turned off AA, the results may have been a little different, but we have a strong feeling that anyone buying a GeForce 8800 card is going to want to run with HDR+AA.
In any case, this suggests that with next-gen Shader Model 3.0 and DX10 games, the CPU may not play as large a role in overall performance, but we'll have to wait and see whether that's truly the case when those titles ship.
Surprisingly enough, Battlefield 2142, which is based on the same engine used in BF2, scaled well regardless of the CPU used. The Battlefield engine isn't known for its use of shaders, so this definitely surprised us.
Of course, if you planned on upgrading to GeForce 8800 SLI, you'll need an even faster CPU to keep both graphics cards fed with data. Otherwise you won't see the roughly 2X performance improvement NVIDIA's SLI technology typically provides.
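If you do go the SLI route, here's a minimal sketch of how you could gauge scaling yourself (Python, with hypothetical frame rates); a result well below 2X suggests the CPU is holding the pair back:

```python
# SLI scaling factor: 2.0 would be a perfect doubling over a single card.
def sli_scaling(fps_single_gpu, fps_sli):
    return fps_sli / fps_single_gpu

print(sli_scaling(60.0, 112.0))  # ~1.87x -> healthy scaling
print(sli_scaling(60.0, 66.0))   # ~1.10x -> likely CPU-bound
```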
So there you have it: our take on how things stand with the GeForce 8800 line once it's paired with slower processors. Hopefully you found this article helpful and can plan your next upgrade accordingly. These articles take a lot of time to put together, but if they help you guys, that's what it's all about.