Summary: NVIDIA's GeForce FX 5200 is poised to bring DirectX 9 gaming to the gamer on a budget. How does a $79 DX9 video card sound to you? In today's article we examine the performance of the GeForce FX 5200 Ultra. We've included scores from the GeForce2/3/4 as well as ATI's latest value solutions, the RADEON 8500 and RADEON 9000 PRO. See how the FX 5200 Ultra stacks up!
Prior to summer 2000, the value segment of the graphics market was fairly boring. The formula for graphics manufacturers was simple: release your latest graphics product at the high end, and cut prices on your existing product lineup, just as CPU manufacturers had been doing for decades. Eventually the blueprint was refined a bit: rather than older or lower-clocked products at the low end, we got graphics cores that were stripped-down variants of their high-end siblings. These crippled products left much to be desired, leaving a bitter taste in the mouths of consumers.
Redefining Value: GeForce2 MX
Everything changed with the debut of NVIDIA's NV11 graphics core, better known today as GeForce2 MX. Unlike previous graphics architectures, GeForce2 MX was designed from the ground up for the value segment of the 3D graphics market. Fusing DirectX 7 features such as hardware transform and lighting with a dual-pixel pipeline (capable of processing two textures per clock), GeForce2 MX was one of the most significant releases that year. Not only did GeForce2 MX bring DX7 to the masses, it also propelled NVIDIA to #1 in desktop market share, a position it has held tenaciously to this day.
Last year NVIDIA unveiled its GeForce4 MX family. While NVIDIA integrated the Accuview anti-aliasing engine from GeForce4 into GeForce4 MX (which allowed it to perform very competitively with GeForce3 in Quincunx AA mode), the part was essentially nothing more than June 2000's GeForce2 MX on steroids. The hardware pixel and vertex shaders introduced roughly a year earlier with GeForce3 were notably absent.
The GeForce FX 5200 is built on TSMC's 0.15-micron manufacturing process, just like the older GeForce4 graphics processors. Transistor count increases from 31 million in GeForce4 MX-8X to 45 million in GeForce FX 5200. This relatively modest figure isn't surprising for a value part, as a complex chip design like the 120 million transistor GeForce FX 5800 isn't cheap to manufacture. For comparison, NVIDIA's GeForce4 and GeForce FX 5600 weigh in at 65 million and 80 million transistors respectively. So where did all the transistors go?
NVIDIA's Intellisample anti-aliasing engine has been removed from the core, which from a hardware perspective means we lose the lossless color and z-compression present in the GeForce FX 5600 and GeForce FX 5800 series. With color compression gone, anti-aliasing performance will be hampered in comparison to the other GeForce FX models, while stripping z-compression will hurt performance in both AA and regular use. This also means we lose the new 6XS and 8XS anti-aliasing modes, but GeForce FX 5200 really doesn't have the horsepower to run at these settings anyway.
The GeForce FX 5200 models
Two products have been announced within the GeForce FX 5200 family: GeForce FX 5200 and GeForce FX 5200 Ultra. Like previous NVIDIA products, the only difference between the variants is clock speed. NVIDIA hasn’t announced the final clock frequencies for GeForce FX 5200, but GeForce FX 5200 Ultra features a 325MHz core clock with 325MHz (650MHz effective) memory.
Like previous GeForce architectures, the GeForce FX 5200 family utilizes DDR memory with a 128-bit memory interface. Up to 128MB of memory is supported, although we will see 64MB boards from some manufacturers.
Looking at pure fill rate and bandwidth figures, GeForce FX 5200 Ultra offers up to 10.4GB/sec of peak memory bandwidth. That's a 2.2GB/sec increase over its predecessor, GeForce4 MX 440-8X, and 1.6GB/sec more than RADEON 9000. The Ultra's 325MHz core yields a 1.3 billion texels/sec fill rate, 200 million more than GeForce4 MX 440-8X and RADEON 9000, an improvement of roughly 18%.
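For those who want to check the math, here's a quick sketch of where those peak numbers come from. Note that the 4 pixels-per-clock figure is our assumption working backward from the quoted fill rate; NVIDIA hasn't detailed the NV34 pipeline configuration.

```python
# Back-of-the-envelope peak figures for GeForce FX 5200 Ultra.
mem_clock_mhz = 325        # physical memory clock (650MHz effective DDR)
bus_width_bytes = 128 / 8  # 128-bit memory interface

# DDR transfers data twice per clock
bandwidth_gb = mem_clock_mhz * 2 * bus_width_bytes / 1000
print(f"Peak memory bandwidth: {bandwidth_gb:.1f} GB/sec")  # 10.4

core_clock_mhz = 325
pixels_per_clock = 4       # assumed pipeline configuration
fill_mtexels = core_clock_mhz * pixels_per_clock
print(f"Peak fill rate: {fill_mtexels} Mtexels/sec")        # 1300
```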
While it remains to be seen whether GeForce FX 5200 Ultra provides enough power to run next-generation DX9 games at high frame rates, keep in mind that this is still more pixel-pushing muscle than any other video card announced for the value market. We will have to wait and see how ATI responds with RADEON 9200, whose core and memory frequencies have not been announced yet.
SIDEBAR: You can view the entire NVIDIA GDC launch presentation at their website.
A quick inspection of the GeForce FX 5200 Ultra reference board reveals a design that more closely resembles GeForce4 than GeForce FX. To cool the graphics core, a single-slot cooling solution is used that doesn't require a 7200RPM fan a la GeForce FX 5800 Ultra's FX Flow cooling system. In fact, the heatsink assembly resembles the unit used on the GeForce4 Ti 4400 and Ti 4600 reference cards, although this heatsink is quite a bit larger.
Physically, GeForce FX 5200 Ultra is slightly shorter than GeForce4 Ti 4600, but larger than ATI's RADEON 9700/9500 series. To keep the GeForce FX 5200 Ultra fed with juice, an external power connection is required for optimum performance; the connector is located on the end of the board with the rest of the Ultra board's power circuitry. If the end user forgets to connect their GeForce FX 5200 Ultra to the system power supply, the board operates at a reduced 250/650MHz clock frequency, and a warning message is displayed once the operating system has loaded informing the user of the problem.
Our reference board shipped with 128MB of 400MHz Hynix BGA memory. Keep in mind that this can change with retail boards, as the final decision is up to the card manufacturers. The first-generation GeForce FX 5200 Ultra boards will most likely closely resemble NVIDIA's reference design, with second-generation boards supporting all kinds of exotic solutions for cooling and memory.
Like the other GeForce FX cards, the fan on GeForce FX 5200 Ultra operates dynamically: under heavier loads the fan kicks in, keeping core temperatures in check. Note however that this doesn't necessarily mean the fan automatically operates while running 3D applications such as games; quite often the fan would remain off during gaming, kick in for a few minutes, and then turn itself off again. We also got a chance to witness the core's automatic clock-throttling during our overclocking testing.
On the video side, we’ve got dual 350MHz RAMDACs for driving two displays at resolutions up to 2048x1536 at 75Hz, just like GeForce4 MX. The video processing engine also appears unchanged, which means we’ve got an integrated TV encoder and an onboard MPEG2 decoder. Basically, GeForce FX 5200 is geared up for NVIDIA’s Personal Cinema right out of the box.
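As a quick sanity check on that display spec, here's a rough sketch of the pixel clock a 2048x1536@75Hz mode demands; the ~1.32 blanking factor is our assumption for typical CRT timing overhead, not an NVIDIA figure.

```python
# Rough pixel clock estimate for 2048x1536 at 75Hz on a CRT.
h, v, refresh = 2048, 1536, 75
active_mhz = h * v * refresh / 1e6   # ~236 MHz of visible pixels
blanking = 1.32                      # assumed CRT blanking overhead
print(f"Required pixel clock: ~{active_mhz * blanking:.0f} MHz "
      f"(within the 350MHz RAMDAC limit)")
```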
From an image quality perspective, we must keep in mind that RADEON 9000 is fundamentally based on RADEON 8500, and therefore inherits some of its drawbacks. For starters, ATI's RADEON 9000 utilizes rotated-grid super-sampling for its anti-aliasing. While super-sampling looks good visually, it consumes a significant chunk of the graphics core's fill rate. The end result is a severe performance hit.
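To illustrate just how severe, here's a minimal sketch of the fill rate math behind super-sampling; the 1100 Mpixel/sec peak is an assumed RADEON 9000-class figure used for illustration.

```python
# Super-sampling renders the scene at a multiple of the display resolution,
# then downsamples, so effective fill rate drops by the AA factor.
peak_fill_mpixels = 1100  # assumed RADEON 9000-class peak fill rate
for aa_factor in (1, 2, 4):
    effective = peak_fill_mpixels // aa_factor
    print(f"{aa_factor}x SSAA -> {effective} Mpixels/sec effective")
```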
Anisotropic filtering quality
As with our GeForce FX 5800 Ultra article, we ask that you download the anisotropic filtering images we took and see the cards in action yourself. Image quality is difficult enough to judge as it is; downloading the jpegs and viewing them in as uncompressed a form as possible is the closest you can get to what we saw on our screen when taking the screenshots, and it allows you to judge image quality for yourself.
We’ve also included a quick shot of GeForce FX 5200 Ultra with anisotropic filtering disabled as well as 2x and 8x settings:
Again, we highly recommend that you download the screenshots so you can see the rest of the images. As you can see in the GeForce FX 5200 Ultra screenshots above, NVIDIA still hasn't addressed the issues with UT 2003's DM-Insidious level that we noted with 42.69.
SIDEBAR: GeForce4 MX’s codename was NV17, the AGP 8X variant NV18.
Fill Rate Performance
Pixel Shader Performance
Vertex Shader Performance
This is probably the synthetic test you're most eager to see: does GeForce FX 5200 Ultra have what it takes to perform with 2.0 pixel shaders? Based on our 3DMark 03 results, it looks like a definite "no". Although they're not depicted in the graph, GeForce FX 5200 Ultra is considerably behind ATI's earlier DX9 cards as well as GeForce FX 5800 Ultra in PS 2.0 testing. In 3DMark 2001 testing, however, we see that it performs very well with older pixel shaders, although it's behind RADEON 9000 PRO in both the advanced and the simpler pixel shader tests. RADEON 9000 PRO actually doubles GeForce FX 5200 Ultra's performance in the advanced pixel shader test.
VillageMark is as much a fill rate test as it is an occlusion culling test, so it's no huge surprise to see the results we obtained.
Codecreatures was a bit too demanding on some of the older graphics cards at 1600x1200, so we omitted those results. But as you can see, GeForce FX 5200 Ultra performs favorably in comparison to RADEON 9000 PRO in this benchmark, despite the synthetic shader results we obtained in 3DMark 2001, 3DMark 03, and ChameleonMark. Let's take a look at 3DMark 2001 SE next.
3DMark 2001 SE v.330
We see results similar to the ones we obtained with ChameleonMark in 3DMark 2001; the biggest difference is the RADEON 8500's performance. It runs neck and neck with GeForce FX 5200 Ultra, while RADEON 9000 PRO is off their pace by as much as 20%.
3DMark 2001 - Car Chase
3DMark 2001 - Dragothic
3DMark 2001 - Lobby
3DMark 2001 - Nature
Before you get all excited for GeForce FX 5200 Ultra, remember that the 3DMark 03 overall result is an average of all four tests, and GeForce FX 5200 Ultra is the only card tested that is capable of running all four of them, which inflates its overall score. That's why we provide the individual test results, folks.
3DMark03 – Wings of Fury
3DMark03 – Battle of Proxycon
3DMark03 – Troll’s Lair
3DMark03 – Mother Nature
The GeForce FX 5200 Ultra performs well in the individual tests, garnering a clean sweep over both ATI cards. Still, the Mother Nature results in particular are pretty disappointing considering that the card is only running at 640x480x32. Also keep in mind that this demo is a mix of 1.1, 1.4, and 2.0 pixel shaders. The GeForce FX 5800 Ultra and RADEON 9800 PRO offered much better performance in our testing at 1024x768x32, but that's also why you pay $400 for those graphics cards.
Serious Sam SE (Elephant Atrium) – OpenGL
With vanilla settings (no AA or anisotropic filtering), the RADEON 9000 PRO and RADEON 8500 both finish ahead of GeForce FX 5200 Ultra. The margin between the 9000 PRO and GeForce FX 5200 Ultra is 8% at 1600x1200.
Quake III - OpenGL
Even among the latest value cards, it's gotten to the point where they're all more than capable of handling Quake 3. GeForce FX 5200 Ultra is able to garner a win over the ATI cards here, and offers a substantial improvement over GeForce4 MX 440, but the AA scores will be the ultimate judge of GeForce FX 5200 Ultra's performance, and we'll get to those soon.
Comanche 4 – DirectX 8
Unreal Tournament 2003 Flyby – DirectX 8
Unreal Tournament 2003 Botmatch – DirectX 8
Remember guys, the "RADEON 9000" in the graphs is actually the RADEON 9000 PRO. We'll have this issue resolved shortly. Now that that's out of the way, let's get on to the results. With GeForce FX 5200 Ultra's memory bandwidth advantage over GeForce3, we expected it to perform better in the flyby tests; instead, we don't really see it outshine GeForce3 until we turn on 2x AA (those results are coming shortly). We have a feeling that drivers are likely holding the card back somewhat, although it is still able to outperform RADEON 9000 PRO. GeForce3's superior multi-textured fill rate may also be playing a role here.
Quake III – High Quality
First off, we can't compare the RADEON 8500/9000 PRO scores to GeForce FX 5200 Ultra, as both ATI cards utilize super-sampling, which cripples their performance with AA enabled. This is an edge NVIDIA will always have over ATI until the Canadian company comes up with an entirely new part for the value segment. Instead we'll compare the card's performance to the 64MB NVIDIA GeForce4 Ti 4200 reference board we received nearly a year ago.
Quake III – High Quality
GeForce4's fill rate advantage gives it the edge at lower resolutions, but as the screen resolution is kicked up, memory bandwidth begins to hobble the Ti 4200, allowing the GeForce FX 5200 Ultra to essentially keep up with it at 1600x1200.
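A rough sketch of the frame-buffer traffic involved shows why 1600x1200 with AA leans so hard on memory bandwidth; the overdraw and per-pixel traffic figures below are illustrative assumptions, not measured numbers.

```python
# Approximate frame-buffer traffic at 1600x1200 with 2xAA.
width, height, fps = 1600, 1200, 60
bytes_per_pixel = 8   # assumed 32-bit color plus 24/8 Z/stencil
aa_factor = 2         # 2xAA roughly doubles buffer traffic
overdraw = 3          # assumed average scene overdraw

traffic_gb = width * height * fps * bytes_per_pixel * aa_factor * overdraw / 1e9
print(f"~{traffic_gb:.1f} GB/sec of buffer traffic, before texture reads")
```

Set against the 64MB Ti 4200's roughly 8GB/sec of peak bandwidth, that leaves precious little headroom for texture reads.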
Quake III – High Quality
The aniso/AA mix allows the GeForce FX 5200 Ultra to outperform GeForce4 Ti 4200 in Quake 3 outright; the Ti 4200 just doesn't have the memory bandwidth to keep up.
Serious Sam SE (Elephant Atrium) – OpenGL
Based on our results with Serious Sam, it looks like fill rate plays a more important role in performance with this engine than it did with Quake 3. GeForce FX 5200 Ultra also lacks z-compression, a technology first introduced with NVIDIA's GeForce3 (which also offers a higher multi-textured fill rate than GeForce FX 5200 Ultra). GeForce FX 5200 Ultra trails the Ti 4200 by nearly 30% at 1600x1200.
Unreal Tournament 2003 Flyby
Unreal Tournament 2003 Botmatch
In vanilla testing with UT2003, we found that GeForce3 and GeForce FX 5200 Ultra were very closely matched. Under the greater demands of 2xAA, however, this changes completely: the FX 5200 Ultra nearly doubles GeForce3's performance at 1600x1200. We also see Ti 4200's memory bandwidth put a cap on its frame rate at the same resolution.
Based on our preliminary results, NVIDIA's GeForce FX 5200 Ultra is the new performance champion in the value segment, dethroning ATI's RADEON 9000 PRO. On top of its performance you also get 2.0 pixel and vertex shaders, making it feature-ready for the next generation of DX9 games that we'll see coming out over the course of the next year.
While some will correctly argue that GeForce FX 5200 doesn't provide enough horsepower to play these next-generation games at adequate frame rates, keep in mind that GeForce FX 5200 Ultra is more capable of powering today's games than any other graphics card currently available in the value segment. We will also see considerably more DX8 titles ship this year than titles that really take advantage of DX9. So fine, don't call it a DX9 card, but you can't argue with its performance in today's latest games; just look at Unreal Tournament 2003 for an example.
The real beauty of the GeForce FX 5200 family is that it brings DX9 not just to the mainstream, but to the value buyer. This grows the installed base of DX9 hardware considerably: not only are OEMs going to be buying these cards by the truckload, but you'll probably see these cards have a much greater impact at the retail level than GeForce4 MX as well. Here's a marketing tip for NVIDIA's manufacturing partners: be sure to put plenty of screenshots of NVIDIA's Dawn demo on your GeForce FX 5200 packaging; surely guys will be all over that.
As the installed base of DX9 hardware grows, the case for DX9 becomes more compelling to game developers. Remember how long it took us to see DX8 titles ship? That's largely because DX8 hardware was so cost-prohibitive when it launched. Here we are just a few months removed from DX9's release, and we've already got cards that can take full advantage of this technology for under $100! Hopefully the end result is that we'll see DX9 games arrive sooner than DX8 games did. Even you cynics out there must like that.
In many ways GeForce FX 5200 reminds us of GeForce2 MX. While the performance delta between the high and low end of DX9 hardware is greater now than it was with GeForce2 MX a few years ago, we're getting a feature-complete card at a very attractive price. Once all the third-party manufacturers get into full swing on their boards, prices will quickly fall, making these cards even more tempting to the gamer on a budget.
It will be interesting to see what other models NVIDIA announces for GeForce FX 5200. Remember how the GeForce4 MX 460 bombed because it was priced so closely to GeForce4 Ti 4200? We may see the same situation play out here, as the 128MB GeForce FX 5200 Ultra board featured in this article will officially retail for $149. We were told that NVIDIA came up with the $79 GeForce FX 5200 literally within days of its surprise announcement at GDC; prior to the conference, media were only aware of two variants. In addition, that $99 GeForce FX 5200 could be quite a contender with the right clock frequencies. We'll just have to wait until cards ship in April to see how things play out. NVIDIA may have something up its sleeve just in case ATI makes some surprising countermoves with RADEON 9200.
SIDEBAR: Questions or comments? Is the GeForce FX 5200 Ultra the best thing since sliced bread or are you looking for a little bit more? Voice your thoughts in the news comments!
© Copyright 2003 FS Media, Inc.