GeForce FX 5900 XT core
GeForce FX 5900 Value gets a name
When the GeForce FX 5900 Value was first announced, rumors were swirling over what features this chip would support. Probably the most popular report was that the memory interface was being sliced in half, to 128 bits total. This would have starved the GeForce FX 5900 core, leaving it with only half its memory bandwidth (13.6GB/sec, assuming an 850MHz effective memory frequency). As a result, the 5900 Value would have had less memory bandwidth than NVIDIA’s GeForce FX 5700 Ultra – clearly this wouldn’t have made much sense.
The 5700U cooler looks daunting
The 5900 XT is slightly longer
Slim single-slot design
Therefore, the 256-bit memory interface present on more senior GeForce FX 5900 variants remains on the GeForce FX 5900 XT (Value). To reduce manufacturing costs, however, NVIDIA has replaced the 2.2 nanosecond memory with 2.8 nanosecond modules. This memory is rated for operation at 700MHz, which is the speed NVIDIA has chosen for the 5900 XT. That reduces peak memory bandwidth to 22.4GB/sec, 4.8GB/sec shy of the GeForce FX 5900 128MB. Surprisingly enough, this figure matches the RADEON 9800 PRO 256MB, and bests the 128MB RADEON 9800 PRO by roughly 600MB/sec. In other words, this figure still isn’t bad for a “value” card.
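All of these bandwidth figures fall out of the same formula: bus width (in bytes) multiplied by the effective memory clock. A quick back-of-the-envelope sketch (Python is used here purely for the arithmetic; the clocks and bus widths are the figures quoted above):

```python
def peak_bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: int) -> float:
    """Peak theoretical memory bandwidth in GB/sec (1GB = 1000MB here)."""
    bytes_per_transfer = bus_width_bits // 8   # a 256-bit bus moves 32 bytes per transfer
    return bytes_per_transfer * effective_clock_mhz / 1000.0

cards = {
    "GeForce FX 5900 XT (256-bit @ 700MHz)":    (256, 700),
    "GeForce FX 5900 128MB (256-bit @ 850MHz)": (256, 850),
    "Rumored 128-bit 5900 Value (@ 850MHz)":    (128, 850),
}

for name, (bus, clock) in cards.items():
    print(f"{name}: {peak_bandwidth_gb_s(bus, clock):.1f}GB/sec")
# 22.4GB/sec, 27.2GB/sec and 13.6GB/sec respectively
```

Halving the bus while keeping the 850MHz clock is exactly what produces the 13.6GB/sec figure the rumored 128-bit design implied.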
In addition, to help boost performance, the 2.8ns memory modules NVIDIA’s 5900 XT board partners utilize run at lower latencies than the 2.2ns modules used on the GeForce FX 5900 and 5900 Ultra. If you’ve ever lowered the latency on your system RAM (via your motherboard’s BIOS), you know that this change increases performance. As you’ll see in our performance results, this allows the 5900 XT to run neck-and-neck with the GeForce FX 5900 128MB in many of our benchmarks.
The GeForce FX 5900 XT’s core remains unchanged from the GeForce FX 5900 128MB. The core clock frequency is still 400MHz, equating to a peak texel fill rate of 3.2 gigatexels/second, which is higher than the RADEON 9800 PRO and just shy of the RADEON 9800 XT. Obviously, traditional metrics like memory bandwidth and fill rate don’t tell the whole story anymore, but it’s an impressive figure nonetheless.
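The fill-rate arithmetic works the same way: core clock multiplied by the number of texture units. A minimal sketch, assuming eight texture units per GPU (the exact pipeline arrangement of these chips varies by workload, so treat that unit count as an assumption; the 380MHz and 412MHz ATI core clocks are the commonly cited figures, not numbers from this article):

```python
def texel_fill_rate_gtexels(core_clock_mhz: int, texture_units: int) -> float:
    """Peak texel fill rate in gigatexels/second: clock * texture units."""
    return core_clock_mhz * texture_units / 1000.0

# GeForce FX 5900 XT: 400MHz core, assumed 8 texture units
print(texel_fill_rate_gtexels(400, 8))   # 3.2 Gtexels/sec
# RADEON 9800 PRO (380MHz) and 9800 XT (412MHz), assuming 8 units each
print(texel_fill_rate_gtexels(380, 8))   # 3.04 Gtexels/sec
print(texel_fill_rate_gtexels(412, 8))   # 3.296 Gtexels/sec
```

This is where the “higher than the 9800 PRO, just shy of the 9800 XT” comparison comes from.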
It’s also interesting to note that while NVIDIA borrows ATI’s naming conventions for the GeForce FX 5900 XT core and the GeForce FX 5900 SE cards based on it, this is not a crippled graphics card like ATI’s “SE” series, which are often stripped-down models with narrower memory interfaces or half their pixel pipelines disabled. The GeForce FX 5900 XT boasts all the features of the other 5900 GPUs, including such goodies as UltraShadow. This is what we like to call “a good thing”.
To coincide with the release of the GeForce FX 5900 XT, NVIDIA officially launched its ForceWare 53.03 driver set last week. In truth, NVIDIA’s board partners had been providing 53.03 for a month prior for use with their GeForce FX 5900 XT cards, which were already shipping to retail channels.
From what we have seen so far, 53.03 builds largely on its predecessor, 52.16, which means it still has the problems we mentioned in our ForceWare 52.16 report with Splinter Cell and pseudo-trilinear filtering in UT2003. In fact, end users have been playing with this driver for weeks and have found that it circumvents Futuremark’s recent patch for 3DMark03. We’ll be providing a more detailed analysis of 53.03 (which enables support for the 5900 XT) soon, but we can tell you that we have witnessed a performance increase in the OpenGL titles Quake 3 and Call of Duty, and a performance decline in UT2003.