Summary: Today we're taking a look at NVIDIA's flagship GeForce FX 5800 Ultra. Armed with its 500MHz core clock, 1,000MHz DDR2 and one attention-grabbing cooler, the GeForce FX 5800 Ultra has been one of the most anticipated products of 2003. But how does it stack up against RADEON 9700 PRO? In this article we'll examine the performance and visual quality. We've also thrown in a few MP3s of the revised cooling system, DVD playback screenshots and more. Check it all out in this article!
While 2003 is barely three months old, no product has generated more discussion than NVIDIA’s GeForce FX. In fact, earlier this year we conducted a poll asking which hardware technology you were most looking forward to. AMD’s Athlon 64, Hyper-Threading, and ATI’s R350 were among the choices, yet GeForce FX took the top spot with 28% of 11,441 votes. Clearly NVIDIA has some pretty big shoes to fill.
Unlike its recent launches, with GeForce FX NVIDIA is playing catch-up to ATI. ATI was first to market with a DirectX 9 accelerator, the RADEON 9700 PRO, and followed that up with its mainstream solution, the RADEON 9500 series. All NVIDIA had to counter with was its line of AGP 8X-enabled GeForce4 products, which were based on an architecture over six months old. Complicating matters further is GeForce FX’s next-generation architecture: whenever a product is perceived as being “all new,” public anticipation grows even higher. And when the industry is as highly visible as 3D graphics (no pun intended), you’ve got a lot to live up to.
Therefore, when GeForce FX was officially unveiled at Comdex last fall, we, as well as several other members of the press, were all over every morsel of information NVIDIA was divulging. We knew NVIDIA would be pushing its CineFX architecture and the enhanced programmability it brings, but important details such as clock frequencies and price were still being determined. We were able to spend some time with a prerelease GeForce FX engineering sample before the show began, but obviously the board we played with was very different from the board we’re testing today.
GeForce FX Variants
Which brings us to March 2003, just over three months since GeForce FX was announced. We now know that two GeForce FX models will be available: the GeForce FX 5800 Ultra, which ships at 500MHz core/500MHz memory (1,000MHz effective) and features NVIDIA’s controversial FX Flow cooling system, and the GeForce FX 5800. Unlike the Ultra, the GeForce FX 5800 is clocked at a more conservative 400MHz core/400MHz memory (800MHz effective) and doesn’t require FX Flow, although we’ve learned that many third-party manufacturers will be using cooling units that consume the PCI slot adjacent to the AGP slot. The MSRP of the 5800 Ultra is $399, while the GeForce FX 5800 is priced at $299. In terms of availability, the only manufacturer to ship so far has been BFG Technologies, which began shipping its 5800 Ultra cards last week. From what we’ve heard from our sources, samples of Ultra cards are extremely limited right now, while the 5800 won’t be available for a few more weeks. This certainly isn’t good for NVIDIA, as ATI’s follow-up to the RADEON 9700 PRO is confirmed to be less than a month away from release.
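Those memory clocks translate directly into peak bandwidth, which is worth keeping in mind as you read the benchmark results. Here’s a rough back-of-the-envelope sketch; note that the bus widths aren’t quoted in this article, so the widely reported 128-bit bus for the GeForce FX 5800 series, the 256-bit bus for RADEON 9700 PRO, and a 620MHz-effective RADEON memory clock are assumptions on our part:

```python
def bandwidth_gb_s(effective_mhz, bus_bits):
    """Peak memory bandwidth in GB/s: transfers per second x bytes per transfer."""
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

# GeForce FX 5800 Ultra: 1,000MHz-effective DDR2 on an (assumed) 128-bit bus
print(f"GeForce FX 5800 Ultra: {bandwidth_gb_s(1000, 128):.1f} GB/s")
# RADEON 9700 PRO: an assumed 620MHz-effective DDR clock on a 256-bit bus
print(f"RADEON 9700 PRO:       {bandwidth_gb_s(620, 256):.1f} GB/s")
```

On these assumptions, even the Ultra’s aggressively clocked DDR2 trails the 9700 PRO’s wider bus in raw bandwidth.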
NVIDIA’s GeForce FX is the company’s first foray into TSMC’s 0.13-micron process, which is often fingered as the source of GeForce FX’s numerous delays. It’s not so much the use of 0.13-micron (with copper interconnects) that’s the culprit; rather, it’s believed that TSMC’s difficulties with low-k dielectric are the primary cause. Low-k dielectric is a material used to insulate the wiring within a chip, reducing electrical crosstalk.
8-Pixel Pipeline or 4?
Another, more recent source of controversy surrounding GeForce FX has been the number of pixel pipelines it contains. At launch NVIDIA was proud to boast of GeForce FX’s “8 pixels/clock rendering pipeline.” Everyone assumed NVIDIA was referring to an 8-pixel pipeline with one texture unit per pipe, similar to the configuration ATI has employed in its RADEON 9700 architecture. However, when GeForce FX cards made their way into reviewers’ hands, it was discovered that in some situations GeForce FX performed similarly to the 4-pixel pipeline architecture used in GeForce4.
SIDEBAR: TSMC also manufactures ATI’s graphics chips
In the whitepaper, NVIDIA raises a valid point: the first game test does use a significant amount of single texturing. But with GeForce FX 5800 Ultra’s 500MHz core clock and its claimed 8-pixel pipeline, NVIDIA theoretically has an advantage over RADEON 9700 PRO in exactly these situations (4,000 Mpixels/sec for GeForce FX versus 2,600 for RADEON 9700), so why should they complain? Let’s use 3DMark to see just what kind of fill rate GeForce FX actually delivers:
Fill Rate Performance
So essentially, GeForce FX has been designed for more forward-looking titles at the expense of older single-textured applications. As you can see, its 4x2 architecture gives it a multi-textured fill rate advantage over RADEON 9700 PRO: 4,000Mtexels/sec versus RADEON 9700 PRO’s 2,600Mtexels/sec. Is this as big a deal as some are making it out to be? Not according to Serious Sam programmer Dean Sekulic:
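The arithmetic behind these figures is straightforward: pixel fill rate is core clock times pixel pipelines, and texel fill rate multiplies that by the texture units per pipe. A quick sketch using the clocks and pipeline configurations discussed above (the RADEON 9700 PRO’s 325MHz core clock is implied by its 2,600Mtexels/sec figure):

```python
def fill_rates(core_mhz, pipelines, tmus_per_pipe):
    """Return theoretical (Mpixels/sec, Mtexels/sec) for a pipeline configuration."""
    mpixels = core_mhz * pipelines       # pixels written per clock x clock speed
    mtexels = mpixels * tmus_per_pipe    # texture samples per pixel per clock
    return mpixels, mtexels

# GeForce FX 5800 Ultra: 500MHz core, 4 pipes x 2 texture units (4x2)
print("GeForce FX 5800 Ultra:", fill_rates(500, 4, 2))
# RADEON 9700 PRO: 325MHz core, 8 pipes x 1 texture unit (8x1)
print("RADEON 9700 PRO:      ", fill_rates(325, 8, 1))
```

This is the trade-off in a nutshell: the 4x2 design pulls ahead whenever two or more textures are applied per pixel, but gives up raw single-texture pixel throughput to the 8x1 RADEON.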
When push comes to shove, Dean hits the nail on the head. Both the GeForce FX and RADEON 9700 are more than capable of handling today’s latest games and applications. Until the first wave of DirectX 9 games is released, we won’t truly know which design is better. What we do know is that ATI’s RADEON 9700 has been available for months now, so NVIDIA has a lot of catching up to do if it wishes to recapture the high-end segment.
SIDEBAR: The GeForce FX 5800 Ultra model’s codename is NV30U.
Physically, the GeForce FX 5800 Ultra board is only slightly shorter than the GeForce4 Ti 4600, which means GeForce FX still dwarfs RADEON 9700. Like GeForce4, the power circuitry is located on the back of the board, while the Silicon Image Sil 164 DVI transmitter (also used on GeForce4) is placed on the underside of the card.
Like RADEON 9700, GeForce FX requires an external power source; the AGP interface just isn’t capable of supplying enough power for the GeForce FX core. NVIDIA opted for a standard 4-pin Molex power connector, similar to what you’d use to power a DVD-ROM or hard drive, rather than the floppy connector ATI utilizes with its RADEON 9500/9700 series.
Frankly, we prefer this implementation over ATI’s as it requires fewer, more reliable parts. There have been numerous times where we had to apply a considerable amount of force to disconnect the 9700’s floppy connector; we’ve gotten to the point where we just leave the adapter on, even when the card isn’t in use. We wouldn’t be surprised if more than a few 9700 and 9500 owners have physically ripped the floppy power connector off of their cards. If you don’t attach the power connector, the card still boots up and runs Windows fine; the only drawback is that it operates at a reduced clock rate of 250/500. NVIDIA should probably implement a more specific warning message: the current 42.69 driver only gives you a general warning that your card is running at reduced clock frequencies, and never tells you that it’s doing so because the power connector isn’t plugged in.
In the rush to get their game on, some gamers may forget to plug in the power connector and then skip right over the clock frequency message (as we said, it’s far too general).
To keep the GeForce FX GPU cool, NVIDIA has implemented its FX Flow cooling system. Like ABIT’s OTES (which uses a similar Y.S. Tech cooling fan) and Sapphire’s Ultimate Edition, FX Flow uses heat pipe technology to keep the core cool. As in the OTES system, a copper heat pipe is soldered to a copper plate, which is in turn attached to the GeForce FX core. You’ll also see that NVIDIA uses a rolled copper fin heatsink (with much larger fins than OTES) and adds a second duct for cool air to help the process along. The assembly is housed in a plastic covering, creating a sealed environment that also ensures better cooling.
The other key difference between GeForce FX and ABIT OTES is memory cooling. NVIDIA uses a copper heatsink to cool the memory on the bottom of the card (with heatsink paste) while the same copper plate used to cool the GeForce FX core is also used to cool the DDR2 memory on the top of the card. The heatsink on the back of the card gets hot to the touch after extended use, suggesting that not only is the GeForce FX 5800 Ultra core a bit on the hot side, but so is its accompanying 1,000MHz DDR2 memory.
Since the first GeForce FX previews went out last month, NVIDIA has implemented a new revision of its FX Flow cooling system. Originally the card’s fan ran at a low speed in 2D mode before cranking up to full speed in 3D applications. With the revised cooler, the fan is shut off entirely in 2D mode, rendering the card completely silent. Once a 3D application is started, NVIDIA tells us, the fan operates either at full speed or at a slightly slower setting. As far as we can tell the card never dropped out of full-speed mode during gaming, so we can’t comment on the intermediate setting, but as it stands now the GeForce FX 5800 Ultra is still pretty loud in 3D.
We’ve taken a few audio clips of the GeForce FX and RADEON 9700 PRO. The microphone was placed on the fifth PCI slot of the P4G8X, roughly 3.5” away from the video card. The zip file includes two GeForce FX samples: one taken at system startup (startupclose.mp3), which is what you’ll hear every time you boot up your GeForce FX 5800 Ultra-equipped system, and one of the FX Flow cooler spinning up from the Windows desktop and then shutting down. (If you listen closely you’ll also hear the IBM Deskstar hard drive and a few mouse clicks.) The third file is the RADEON 9700 PRO running a few tests in Quake 3. You will probably need to turn up the volume before listening, as the recordings are a little faint.
SIDEBAR: The codename for GeForce FX was NV30.
NVIDIA has implemented a 12-layer board design with GeForce FX 5800 Ultra. In comparison, GeForce4 Ti 4200 utilized a 6-layer design while Ti 4400 and Ti 4600 were built on eight layers; ATI’s RADEON 9700 PRO is a 10-layer design. As you add additional layers you get cleaner signals, but manufacturing costs also increase. To put it simply, the GeForce FX 5800 Ultra board design is just as complicated as the graphics core itself.
In light of this, NVIDIA is handling GeForce FX 5800 Ultra board production in-house. NVIDIA CEO Jen-Hsun Huang recently went on record discussing GeForce FX board production: “At the initial stage, Nvidia will ship 1.5 million NV30 chips and will also make cards based on the top-end GeForce FX 5800 Ultra (NV30 core) itself. The lower-end GeForce FX 5800 chips, however, will be released to downstream card manufacturers for production.”
Gainward has stated that it is producing a GeForce FX 5800 Ultra board of its own that will be up to 7 decibels quieter than NVIDIA’s 5800 Ultra reference board, but since this board hasn’t shipped we can’t verify these claims. With NVIDIA’s CEO stating that the company is handling all Ultra board production, we’re not certain whether Gainward has to submit its cooling solution to NVIDIA, which then produces Gainward’s boards for it, or whether Gainward is able to purchase its Ultra boards without the FX Flow cooling system.
NVIDIA’s GeForce FX is now up to driver version 42.69, the version we used for this test. NVIDIA has devoted the later drivers (42.67 and up) to 3DMark 03 optimizations, and in the process we ran into a small bug involving the dynamic clock frequency feature. During testing with Serious Sam we noticed our GeForce FX 5800 Ultra board stutter, and once the test was complete the score was much lower than we expected.
We proceeded to repeat the test, and about 20 seconds in, the fan on the board shut off, as if the card were running in 2D mode. Keep in mind that we were in the middle of a benchmark run when this occurred! We immediately restarted the system and never ran into the problem again, or at least the fan never shut off in the middle of a 3D application again. From time to time we still ran into performance problems, so we’re fairly certain the drivers were dynamically lowering the clock frequencies in the middle of testing. This occurred quite a few times in Serious Sam when anisotropic filtering and anti-aliasing were enabled, and even happened a few times in Quake 3. Fortunately the problem was always solved with a quick repeat of the test, but you can’t expect a gamer to restart his game or reboot his machine every time his performance is a little off.
We were told by NVIDIA that GeForce FX 5800 Ultra had been extensively tested in ovens, so the company doesn’t believe this is a heat issue, but rather a bug in the newer drivers (it hadn’t run into these problems with 42.63). Quite honestly, we tend to believe them, as we were able to run countless tests with 3DMark 03 while the GeForce FX 5800 Ultra board was overclocked, so we’re hoping this will be addressed soon.
Other than this snafu, we also ran into some visual quality issues with the GeForce4 and the 42.69 driver in 3DMark 03, but of course this isn’t a GeForce4 preview.
SIDEBAR: NVIDIA will be announcing the rest of its products for the first half of 2003 at GDC this week.
DVD Playback Quality
First off, visual quality is highly subjective; what looks flawless to one person may be ugly to another. In addition, NVIDIA-based card manufacturers have traditionally manufactured their own boards (and thus may implement lower quality filters), so any visual quality observations made with NVIDIA’s reference card could be entirely different from real, shipping hardware. However, since NVIDIA is handling all GeForce FX 5800 Ultra board production, we feel we can make some comments regarding the visual quality of the Ultra reference card we received from NVIDIA.
In DVD playback, the GeForce FX 5800 Ultra definitely looks good, but it doesn’t appear to be much, if any, of an improvement over GeForce4. Skin tones just don’t look quite as natural with either NVIDIA product, though this is something that may or may not bother you if you’re not a hardcore videophile. What will annoy you, however, is the fact that the FX Flow cooling system remains on during DVD playback, just as if you were running a 3D application.
This is one issue that definitely needs to be addressed in a future driver update, as we can’t imagine many of you want to hear the FX Flow cooling system in action while you’re watching a movie. We’ve attached several screenshots of the GeForce FX and GeForce4 as well as ATI’s RADEON 9700, but the images aren’t of identical frames, so it isn’t a perfect comparison. All images were produced with PowerDVD’s screen capture feature.
ATI’s 2D display still outshines NVIDIA’s in our eyes. ATI’s output appears warmer, resulting in a very soft image that’s pleasing to the eye. Text on both cards is razor sharp and colors are vibrant, so we don’t think you’ll be disappointed with either card’s display, but ATI’s desktop just looks a little better.
ATI’s adaptive anisotropic filtering engine has always been one of its strongest suits; both the RADEON 8500 and RADEON 9700 were highly regarded for their anisotropic filtering quality. As a result, NVIDIA has put more of an emphasis on its anisotropic engine, and contained within the latest Detonator driver is NVIDIA’s response.
While older Detonator drivers weren’t very flexible, the Detonator 42.6x series incorporates three anisotropic filtering settings: balanced, application, and aggressive. The balanced setting is the default and, according to NVIDIA, is meant to compete with ATI’s “quality” setting. As its name suggests, the aggressive setting uses lower quality samples that in turn result in higher performance at the cost of image quality. The application setting uses whatever aniso level is requested by the application.
To compare the visual quality of ATI and NVIDIA’s latest offerings (as well as GeForce4), we tested with Unreal Tournament 2003’s DM-Insidious level. We’ve included screenshots of our testing below, but if you really want to get an accurate representation of what was displayed on the screen, we highly suggest you download the original bitmaps. We’ve zipped them up into one file for your convenience.
First, we have our scene with anisotropic filtering disabled (taken with GeForce FX 5800 Ultra):
GeForce FX didn’t quite get the scene right in the image above (and below for that matter). You can clearly see the huge black “box” between the two television screens in the center of the image. This isn’t so much a glitch or a bug, as this is a series of columns hanging from the ceiling. What appears to be happening is that GeForce FX is overapplying the shadows -- instead of soft shadows we actually get an outline of the columns below the ceiling. This happens in a few other areas of the map, so hopefully NVIDIA will address this with a driver update. (If you download the bitmaps, you’ll see that GeForce4 doesn’t exhibit this problem with the same driver.) Now that we’ve got that out of the way, we have NVIDIA’s aggressive implementation on the left, and ATI’s performance on the right, both taken with the 4x anisotropic setting:
Both solutions look pretty good, but if you look a little closer you’ll see that ATI’s textures look better. The key areas to look at are the stones in and around the crosshairs as well as the area where the metal grating meets the stones (at the tip of the minigun). Again, we highly suggest you download the bitmaps and zoom in around these areas to 500% to get a real good look at what we’re describing, as you may miss the details upon first glance.
Next we have both solutions’ quality settings. Again, NVIDIA’s balanced setting is on the left, and ATI’s quality setting is on the right:
In 4x mode, it’s really tough for us to pick a winner; even when zoomed in, both cards output very similar displays. This really demonstrates how far NVIDIA has come with anisotropic filtering in its latest drivers, as GeForce4 used to output a muddier image than ATI’s RADEON 9700. Armed with Detonator 42.69, GeForce4 (which also has the balanced and aggressive settings available) looks just as good as GeForce FX. It’s also worth noting that RADEON 9700’s performance mode looks practically identical to its quality mode, at least in our testing. Again, just download the original bitmaps and see for yourself.
Before we move on to AA, let’s take a look at both cards’ quality modes at 8x anisotropic. NVIDIA on the left, ATI on the right:
At first glance we thought ATI’s textures were a bit sharper, but then we noticed that we’re a little closer to the center of the metal platform in the ATI shot. Look further and you’ll see that both cards look excellent.
NVIDIA has really come a long way with its latest Detonator drivers; anisotropic filtering quality is drastically improved over older driver revisions. NVIDIA deserves tons of credit for stepping up to the challenge, and as you can see it has delivered a very good package. However, we still have to give the nod to ATI: not only does ATI’s engine go one step further than NVIDIA’s with its 16x setting, its performance mode also delivers better image quality. ATI had better not rest on its laurels for too much longer, though, as NVIDIA has made some great strides in anisotropic filtering.
To test anti-aliasing image quality, we used 3DMark 03’s image quality test. If you’ve followed some of our other preview articles, you’ve probably noticed that we prefer to use 3DMark for all anti-aliasing image quality tests. The reason is simple: jaggies can creep up at a moment’s notice, and one slight twitch of the mouse can make a difference. The only way to ensure that we’re getting the exact same image is to use 3DMark’s image quality test, which grabs a screen capture of a static display.
We chose game test 1 because it’s a bright map (and thus jaggies are easier to see), and also because flight simulations are notorious for jagged edges. The long, straight lines of a plane’s wing are a great place to spot them, while the edge of the horizon is a perfect place to spot shimmering. In this case all you have to do is look along the leading edge of the B-17’s wing and tail to see the jagged edges. We used the default screen resolution of 1024x768 for testing.
First, let’s look at a shot with anti-aliasing turned off (taken with GeForce FX 5800 Ultra):
It doesn’t take 20/20 vision to see all the jaggies in this picture; every plane has lots of them.
Next, 2x AA:
First, to the naked eye it’s very hard to see the difference between GeForce FX 5800 Ultra and RADEON 9700 PRO; both cards look very similar. However, once you zoom in 500% you can see that the leading edges of the B-17’s wing appear to blend slightly better on the ATI card. Download this zip file to see all of the images.
It doesn’t take a 500% zoom to see the GeForce FX 5800 Ultra’s jaggies (though you will need to download the larger images); 4x mode just isn’t a dramatic improvement over 2x anti-aliasing. We asked NVIDIA if 3DMark 03’s image quality test was an accurate representation of GeForce FX 5800 Ultra’s image quality and were told that it works as long as the AA level is set in the application, so we double-checked to ensure that our images were accurate and got similar results. If we learn otherwise, we’ll update this article accordingly, but as it stands now we’ve got to give ATI the AA crown.
SIDEBAR: Please keep in mind that screenshots can only paint a partial picture of AA quality; the best way to judge it is to see it in action.
We decided to include test results for both NVIDIA modes (aggressive as well as balanced) in order to provide a more complete picture of the card’s performance. To keep things as similar as possible, we did the same with the RADEON 9700 PRO. However, in some cases we had to manually adjust the quality sliders for the ATI card in order to maintain adequate visual quality in OpenGL applications. Therefore, we’re keeping all of our official comments solely to the default settings used by both drivers.
While RADEON 9700 PRO is only an evolutionary step above GeForce4, GeForce FX 5800 Ultra demonstrates remarkable performance in polygon testing with 3DMark 2001 SE. GeForce FX 5800 Ultra delivers twice the performance of RADEON 9700 PRO in more intensive tests with eight light sources, while GeForce FX 5800 Ultra delivers 1.5 times the performance of RADEON 9700 PRO in testing with one light.
In bump mapping tests, GeForce FX 5800 trails RADEON 9700 PRO by 3% in environment mapped bump mapping, but finishes seven percentage points ahead of the ATI card in dot3 bump mapping.
Pixel Shader Performance
Vertex Shader Performance
We ran pixel shader tests with both 3DMark 2001 SE (depicted as “01” in the graphs) and 3DMark 03 (the pixel shader 2.0 figures in the graphs); the advanced pixel shader test comes from 3DMark 2001 SE. In the simple pixel shader test we see that the GeForce FX 5800 Ultra outperforms RADEON 9700 PRO by 3%, but once 2.0 pixel shaders are used RADEON 9700 PRO delivers nearly 1.5 times the performance of GeForce FX 5800 Ultra. In advanced pixel shader testing we see similar results; the margin is actually slightly greater.
PowerVR’s FableMark benchmark essentially takes the company’s popular VillageMark test to the next level by incorporating an updated graphics engine with soft shadows; the end result is a more stressful benchmark. GeForce FX 5800 Ultra and RADEON 9700 PRO run neck-and-neck in this test, with both cards offering up to 2.5 times the performance of GeForce4. At 1600x1200 GeForce FX 5800 Ultra is able to take a substantial 16% lead; we ran a few quick tests with VillageMark and noticed an even larger performance advantage for GeForce FX 5800 Ultra.
From here on you’ll see the use of NVIDIA’s balanced (“bal”) and aggressive (“aggr”) modes, as well as ATI’s performance (“perf”) and quality (“qual”) settings, to provide as complete a picture as possible of the performance of these cards. We’ll leave our official comments to the default settings, however, as we had to make some custom driver adjustments to the 9700 PRO in OpenGL titles to maintain adequate visual quality. Besides, the default is the setting most gamers are likely to use.
3D Mark 2001 SE v.330
The margin between RADEON 9700 PRO and GeForce FX 5800 Ultra is just 2% at lower resolutions in 3DMark 2001 SE, but once we crank up the screen resolution we see that GeForce FX 5800 Ultra is able to pull ahead. Let’s take a closer look at the scores, though.
3DMark 2001 - Car Chase
3DMark 2001 - Dragothic
3DMark 2001 - Lobby
3DMark 2001 - Nature
FutureMark’s car chase test uses physics calculations for the car’s suspension and movements. The truck itself has three texture layers, while the wrecked cars and some houses in the scene use two textures. In low detail mode, two texture layers are used, while high detail adds an additional texture layer (as well as additional objects with more detail) and dynamic shadowing. In low detail mode GeForce FX 5800 Ultra outpaces RADEON 9700 PRO by 7%, while RADEON 9700 PRO takes a 16% lead in high detail mode. Dragothic also makes extensive use of multiple textures, and vertex shaders are used for morphing the animations of the dragon, bowmen, and the people in the village.
FutureMark’s controversial 3DMark03 gives the nod to GeForce FX 5800 Ultra. We’ve seen how NVIDIA can play the numbers in the overall score with a quick driver update, so let’s head straight to the frame rate results.
3DMark03 – Wings of Fury
3DMark03 – Battle of Proxycon
3DMark03 – Troll’s Lair
3DMark03 – Mother Nature
While NVIDIA isn’t a big fan of game test 1 (Wings of Fury) due to its simple nature, GeForce FX 5800 Ultra actually performed well in this benchmark, outperforming RADEON 9700 PRO by 2%. Again, we’re not commenting on the “performance” and “aggressive” settings; we want to stick with the defaults to keep the analysis simple, but we still wanted to provide the numbers for you to compare. Also keep in mind that ATI’s performance mode offers visual quality that is actually closer to NVIDIA’s balanced mode than to NVIDIA’s performance-oriented “aggressive” mode.
Serious Sam SE (Elephant Atrium) – OpenGL
The RADEON 9700 PRO has historically had a more difficult time with the Serious Sam engine, and that continues to hold true today. At 1600x1200x32 we’re getting very playable frame rates out of both GeForce FX 5800 Ultra and RADEON 9700 PRO, although the GeForce FX card does hold a thin 1% lead at that resolution.
Quake III - OpenGL
We eagerly await the arrival of Doom III, if only so we can finally retire Quake 3 from our stable of benchmarks: we’re seeing 200+ frames per second at 1600x1200! It’s truly a testament to the huge gap between the hardware and software out there today. At 1600x1200 GeForce FX holds a pretty substantial 17% lead, although with the performance we’re seeing in Quake 3 today you’d be crazy to play this game without AA and anisotropic filtering turned on.
Comanche 4 – DirectX 8
While the Comanche 4 benchmark test uses pixel shaders extensively, it’s unfortunately a better processor benchmark than a graphics-related test. However, at 1600x1200 we do see that RADEON 9700 PRO finishes 5% ahead of GeForce FX 5800 Ultra.
Unreal Tournament 2003 Flyby – DirectX 8
Unreal Tournament 2003 Botmatch – DirectX 8
We really like the Unreal Tournament 2003 flyby test for video card testing; botmatch (with its extensive use of AI), on the other hand, is good for testing processor performance. With that said, we see that GeForce FX 5800 Ultra outperforms RADEON 9700 PRO at the higher resolutions where we’re likely to play (by up to 18%), while RADEON 9700 PRO dominates the lower resolutions. While this is definitely a win for NVIDIA, both cards still have enough horsepower to spare that we can turn on AA and anisotropic filtering; we’ll get to those results in a minute.
Quake III – High Quality, 4x Anti-Aliasing
Under the additional stress of 4x AA mode, we see that the GeForce FX 5800 Ultra loses the performance advantage it enjoyed earlier, trailing RADEON 9700 PRO by 5% at 1600x1200.
Quake III – High Quality, 4x Anti-Aliasing + 8x Anisotropic Filtering
With 4xAA and 8x anisotropic filtering, GeForce FX 5800 Ultra is able to put up a better fight than it did with just 4xAA enabled, but it still isn’t enough to overcome RADEON 9700 PRO, which bests GeForce FX 5800 Ultra by 4% at 1600x1200.
Serious Sam SE (Elephant Atrium) – OpenGL
Even with AA and anisotropic filtering enabled, Serious Sam SE continues to favor GeForce FX 5800 Ultra. In fact, with its 9% performance advantage at 1600x1200, the Ultra board was able to open up a little breathing room in this test.
Unreal Tournament 2003 Flyby
Unreal Tournament 2003 Botmatch
While the scores remain close, ATI’s RADEON 9700 PRO is able to maintain a lead over GeForce FX 5800 Ultra in Unreal Tournament 2003 once anisotropic filtering and anti-aliasing are turned on. At 1600x1200 the margin between the two cards is at its greatest: 7%.
From a pure performance perspective, GeForce FX 5800 Ultra is not the performance champion we were led to believe it would be back in November. Sure, GeForce FX 5800 Ultra wins its fair share of benchmarks (especially in non-AA situations), but for the most part ATI’s RADEON 9700 PRO was able to keep up with, or outperform, it once the visuals were cranked up a few notches.
NVIDIA deserves lots of credit for improving the flexibility of its anisotropic filtering. We’ve seen that in balanced mode it offers image quality dramatically improved over NVIDIA’s prior efforts. Fortunately this feature is present in the newer Detonator drivers, so GeForce4 owners can also take advantage of it if they choose to do so. With a little more fine-tuning NVIDIA could have a real advantage here.
Overall though, we can’t help but feel slightly disappointed with GeForce FX. While the 5800 Ultra model was announced over three months ago, these boards still can’t be found on store shelves and rumors are running rampant that you never will. When you consider that ATI’s RADEON 9700 PRO has been available for over six months now, it makes the situation even worse for NVIDIA.
Complicating matters are the heat output of the card and the noise from the FX Flow cooling unit; both of these issues are turnoffs to many gamers. In addition, the scalable clock frequency “feature” can sometimes underclock your GeForce FX 5800 Ultra card right in the middle of gaming. We had to repeat multiple runs of Serious Sam and Quake 3 with 4xAA/8xAniso enabled to get our final numbers; in some cases the margin between the scores was as high as 30%!
Of course, if you look back at NVIDIA’s last architecture change, the GeForce2 to GeForce3 transition, GeForce FX certainly looks better from NVIDIA’s historical perspective. While GeForce3 offered few performance gains over GeForce2 Ultra, GeForce FX 5800 Ultra demonstrates a remarkable improvement over GeForce4 Ti 4600. The key difference, however, is that two years ago GeForce3’s closest competitor was an NVIDIA product. This time around ATI beat NVIDIA to the punch by several months and built a highly competitive part.
Ultimately consumers will decide which product they prefer. But in order for NVIDIA to even have a chance of winning back the high-end segment, it has to ship more GeForce FX 5800 Ultra parts. And ATI isn’t standing still; R350 is right around the corner. Clearly NVIDIA’s days of unchallenged dominance are over.
The GeForce FX 5800 Ultra is a great performer with today’s games and applications, but we’re going to reserve final judgment on the GeForce FX versus RADEON 9700 debate until more DirectX 9 titles arrive. Doom 3 (and the games built on its engine) is going to sell lots of graphics cards for ATI and NVIDIA, and right now it’s still unclear whether the DX8/DX9 hybrid approach ATI has taken with RADEON 9700 is best, or whether GeForce FX’s forward-looking design will ultimately prove to be the winning strategy. In any case, we’ll be here to report along the way; 2003 should prove to be a very interesting year for 3D graphics!
SIDEBAR: Are you waiting to pick up GeForce FX or do you plan on going with ATI for your next upgrade? Voice your thoughts on the 3D market, the GeForce FX 5800 Ultra, and this article in general in the news comments!
© Copyright 2003 FS Media, Inc.