Summary: Wondering which mainstream graphics cards perform best with S.T.A.L.K.E.R.? If so, you're in luck, as today we've rounded up over a dozen different ATI and NVIDIA graphics cards. Along the way we also discuss topics such as 256MB vs. 512MB of memory, SLI/CrossFire performance, and the game's dynamic lighting model. If you're planning on a graphics upgrade for S.T.A.L.K.E.R. you won't want to miss this article!
But as beautiful as S.T.A.L.K.E.R. the tech demo looked, no one really knew how the game would play; the storyline was vague and the demo was so open-ended you could tell that the game was very early in development. Who would have thought it would take developer GSC Game World another 4+ years to finish the game? Certainly not us. In any case, S.T.A.L.K.E.R. is here now and so far the early reviews from the press and end users alike are quite positive.
The game is set in a post-apocalyptic alternate reality where a reactor at the Chernobyl nuclear power plant explodes, spewing radiation throughout the area. You play the role of a “stalker”, a person who scavenges the area within the Chernobyl exclusion zone, searching for valuable artifacts, fighting mutants, and dealing with other hostile stalkers. At the start of the game you’re suffering from amnesia and can’t remember anything. Your ultimate goal is to figure out who you are and what’s going on inside the exclusion zone.
The game is a first-person shooter, but GSC Game World also mixes in role-playing elements. The game has a storyline, but this isn’t a linear game like Half-Life with a set path you must follow to reach the game’s conclusion. Along the way you’ll be assigned the obligatory story missions that are required to complete the storyline, as well as secondary quests that aren’t tied to the game’s story. Like an RPG, your character starts off inexperienced, but as you complete missions and quests you can equip yourself with more powerful weapons. Already S.T.A.L.K.E.R. has been compared to games like Deus Ex and System Shock – fine company indeed.
Graphically S.T.A.L.K.E.R.’s X-Ray game engine is quite impressive. According to the developer, up to a million polygons can be on-screen at any one time, and the game sports the latest eye candy effects, including HDR lighting, parallax and normal maps, Shader Model 3.0 pixel/vertex shaders, per-pixel lighting and soft dynamic shadows. In fact, the game is so demanding that many owners of DX9 cards like the GeForce 6800 and Radeon X800 have been forced to play the game in DirectX 8 mode. This is because the game’s dynamic lighting model performs so many calculations it can bring many DX9 cards to their knees: get too aggressive with the graphics settings, and you can bring a modern GPU like the GeForce 7900 or Radeon X1900 to a sluggish crawl.
With this in mind, in today’s article we’ll be evaluating the performance of ATI and NVIDIA’s latest and greatest mainstream cards, as well as a few older cards from generations past. We’ll also be covering topics like graphics memory (i.e. does 512MB of graphics RAM impact performance or is 256MB enough?), SLI/CrossFire, and the game’s various lighting modes. We’re going to start off first though by comparing ATI and NVIDIA’s image quality with the game.
Another thing about S.T.A.L.K.E.R. that you’ll soon see in the screenshots is that anti-aliasing doesn’t play as critical a role as it does in most other contemporary games. The developers have done a good job of keeping the jaggies in check without AA turned on. This is a good thing, because turning on AA, whether via the game’s settings menu (or the user.ltx file) or the graphics driver control panel, has little or no effect on image quality. Considering the game’s usage of HDR lighting, we weren’t too surprised to see the GeForce cards’ lack of AA, but we did expect the Radeon X1K cards to work properly. Oh well, here are the screenshots as proof:
The shots above were taken with a Radeon X1950 Pro 256MB PCIe and a GeForce 7950 GT 512MB with AA disabled. You can clearly see the jaggies on the bottom of the window sill in the first test image, and along the front edge of the house (just above the ladder) in the second. Now let’s see what happens when we turn on AA in the game’s menu:
In our first test area, the jaggies seem slightly smoother on both the ATI and NVIDIA cards under 4xAA (to our surprise in the case of the GeForce 7950 GT), but they’re still notably present. We should note that forcing 4xAA in the driver control panel didn’t improve things in this test case. Let’s look at our second test area:
Whoa, the jaggies on the front edge of the house are still there. In fact, it doesn’t look like they’ve been touched at all by either card! Let’s see what happens when you force AA via the driver control panel.
Normally when you force AA via the driver control panel, the driver automatically applies AA to everything in the scene, overriding the game’s AA mode by default. But forcing AA doesn’t appear to affect anything at this point in S.T.A.L.K.E.R., as the jaggies on the front edge of the house are still present.
Again, considering that the GeForce 7 cards lack support for HDR+AA, we weren’t surprised to see that AA isn’t functioning 100% in S.T.A.L.K.E.R., but we did expect the Radeon X1K cards to run AA with HDR. For further analysis we’ve run performance numbers with the game running 0xAA, 4xAA via the game’s menu, and 4xAA forced via the driver control panel. In theory, we should get the highest performance numbers with 0xAA and the slowest fps with 4xAA forced via the control panel; let’s see what happens.
UPDATE 3/28/07: We received the following from NVIDIA: "The trouble is not HDR+AA but rather the game's deferred shading engine which is fundamentally incompatible with hardware MSAA. Essentially it's the same issue as Ghost Recon Advanced Warfighter's engine. This is why the developer implemented a shader based AA (the slider in the game) but as you noticed, its effect is quite subtle. We have a new driver coming which gives some perf improvements and SLI for G80. Will keep you posted!"
S.T.A.L.K.E.R. – Direct3D
With the exception of the GeForce card, performance doesn’t scale for us when 4xAA is forced via the control panel. Note that we set AA both via the game’s menu system and by adding the commands “r2_aa on” and “r_supersample 4” to the user.ltx text file. Performance was tested using FRAPS.
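For reference, the AA-related portion of the user.ltx file we used looks like this (reproduced from the commands quoted above; this is only a fragment, not a complete config file):

```
r2_aa on
r_supersample 4
```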
As anyone who has played S.T.A.L.K.E.R. on an older graphics card will tell you, one of the biggest performance hits comes from turning on the game’s full dynamic lighting mode. According to the game’s manual, the “static lighting” mode runs the game in DX8 mode, while the two dynamic lighting modes run under DX9. We took the following screenshots on a Radeon X1300 XT to compare static to full dynamic lighting:
To demonstrate the profound impact full dynamic lighting can make on performance, we also ran benchmarks with the Radeon X1300 XT and GeForce 6800 GS under both static lighting and full dynamic lighting. We used the following user.ltx text file for our static lighting testing, and this user.ltx file for our dynamic lighting testing.
S.T.A.L.K.E.R. – Direct3D
To test the performance improvement (if any) that 512MB of graphics memory brings, we tested ATI’s Radeon X1900 XT 256MB against a 512MB X1900 XT board. On the NVIDIA side, we underclocked a GeForce 7900 GTX to GeForce 7900 GT speeds, as we don’t have a GeForce 7900 GT 512MB card on hand. Let’s see if the added graphics memory plays a role, shall we?
As you can see, the extra 256MB of graphics memory improved performance at 1600x1200 by 7% on our simulated GeForce 7900 GT 512MB and by 11% on the Radeon X1900 XT 512MB. The X1900 XT 512MB pulls away even further from its 256MB equivalent at 1920x1200. Clearly, S.T.A.L.K.E.R. stands to benefit from the extra graphics memory provided by the 512MB cards.
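For those curious how we arrive at the percentage figures, the calculation is straightforward. The FPS numbers below are hypothetical stand-ins, not our measured results; they simply illustrate the math:

```python
def percent_gain(fps_256, fps_512):
    """Return the percentage improvement of the 512MB card over the 256MB card."""
    return (fps_512 - fps_256) / fps_256 * 100

# Hypothetical example: a 256MB card averaging 45 fps vs. a 512MB card at 50 fps
print(round(percent_gain(45.0, 50.0), 1))  # prints 11.1
```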
Intel Core 2 Duo E6600 (2.4GHz)
ASUS P5W DH Deluxe (for ATI cards)
EVGA nForce 680i motherboard (for NVIDIA cards)
2GB Corsair TWIN2X2048-6400C4
ATI Radeon X1800 XT 512MB
ATI Radeon X1800 GTO 256MB
ATI Radeon X1300 XT 256MB
ATI Radeon X1600 Pro 256MB
ATI Radeon X1650 XT 256MB
ATI Radeon X1900 GT 256MB
Sapphire Radeon X1950 GT 256MB
ATI Radeon X1900 XT 256MB
NVIDIA GeForce 6600 GT
NVIDIA GeForce 6800 GS
NVIDIA GeForce 7600 GS 256MB
NVIDIA GeForce 7600 GT 256MB
NVIDIA GeForce 7900 GS 256MB
NVIDIA GeForce 7900 GT 256MB
EVGA e-GeForce 7900 GTO 512MB
NVIDIA GeForce 7950 GT 512MB
ASUS GeForce 8800 GTS 320MB
300GB Western Digital Caviar SE
Windows XP Professional SP2
We ran all of our tests with S.T.A.L.K.E.R. using the same methodology we’ve used previously in games like Oblivion: we load up a predefined area (from a saved game) and use FRAPS to record performance. We repeat this test three times and take the average for our final result, rounding up when necessary. Rather than test a variety of areas, for this article we’re testing in an area that mixes a fair amount of foliage with wide open space. Basically, our test involves running down a set path where the environment transitions from a densely vegetated forest (which hurts frame rate) into a more open environment. Once again we’re using this user.ltx text file for our testing. We like this config because it allows our mainstream cards to deliver very playable performance without severely affecting IQ; for instance, full dynamic lighting is enabled and the game looks great.
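The scoring method above can be sketched in a few lines. The three run values here are hypothetical examples, not actual benchmark data; the point is simply the three-run average with round-up:

```python
import math

def final_score(runs):
    """Average a list of per-run FPS figures (one per FRAPS pass) and round up."""
    return math.ceil(sum(runs) / len(runs))

# Hypothetical example: three FRAPS passes through the same saved-game area
print(final_score([42.3, 41.8, 43.1]))  # prints 43
```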
S.T.A.L.K.E.R. – Direct3D
S.T.A.L.K.E.R. – Direct3D
Unfortunately, Catalyst 7.2 doesn’t appear to have a CrossFire profile for S.T.A.L.K.E.R.; performance was unchanged under dual-GPU CrossFire configs. We even tried forcing CrossFire’s AFR mode by changing the Catalyst A.I. slider to Advanced, but this didn’t help. Hopefully ATI will issue a hotfix to address this, but for now CrossFire users are out of luck.
If you can afford to spend $150, the Radeon X1950 GT is pretty hard to beat at that price point for S.T.A.L.K.E.R. The X1950 GT isn’t quite as fast as its predecessor, the X1900 GT, but as we discussed in our Sapphire X1950 GT review, these cards are based on the exact same GPU used in the X1950 Pro, so it’s not hard to get more performance out of these chips with a little bit of overclocking. In the US, Sapphire is the only board partner bringing the X1950 GT to market at this point, while in Europe and Asia, Palit and TUL are the sole card manufacturers. NVIDIA’s GeForce 7900 GS also performs well, but at stock speeds it isn’t quite as fast as the Radeon X1950 GT. A factory-overclocked 7900 GS card would no doubt fare better in S.T.A.L.K.E.R.
Moving higher up the price bracket, the Radeon X1900 XT 256MB and GeForce 7900 GTO really stand out. Unfortunately, neither one of these cards can be found very easily anymore, either online or at retail; they’ve basically been replaced by the Radeon X1950 XT 256MB and the GeForce 7950 GT. We don’t have a Radeon X1950 XT 256MB for testing, but considering that the X1900 XT 256MB outperformed the GeForce 7950 GT, the X1950 XT 256MB should be the faster card. Of course, keep in mind that the GeForce 7950 GT we tested was running at stock speeds, and many of NVIDIA’s board partners have chosen to overclock their 7950 GT cards from the factory for enhanced performance, so it’s quite possible that one of these OC’ed 7950 GT boards could give the ATI card a run for its money.
Sitting alone at the very upper echelon of the mainstream graphics cards we tested is NVIDIA’s GeForce 8800 GTS 320MB. Clearly the GeForce 8800 GTS 320MB is the fastest sub-$300 graphics card you can buy for S.T.A.L.K.E.R. today, delivering performance over 25% greater than the next closest competitor at 1600x1200. It really isn’t even close. Besides performance, the GeForce 8800 GTS 320MB has other intangibles going for it, such as its added AA modes and low noise. It really is a great card for the gamer on a strict sub-$300 budget.
Surprisingly enough, AA quality is pretty much a wash between the GeForce 7 and equivalent Radeon cards right now in S.T.A.L.K.E.R. With their support for HDR+AA, we expected the Radeon X1K cards to have an obvious advantage here, but it looks like GSC isn’t enabling AA for anyone in HDR mode at this point. Considering that Unreal Engine 3 games with HDR like Rainbow Six: Vegas also lack support for HDR+AA, GSC isn’t alone here, but we’d like to see more game devs take advantage of this feature, particularly as more powerful DX10 cards become more readily available.
However, one area where the Radeon cards still hold the advantage over GeForce 7 is AF quality. As we’ve noted in the past, the GeForce cards shimmer under NVIDIA’s default “Quality” image quality mode. To reduce the shimmering, you’ll want to change the image quality setting to “High Quality”, which turns off the filtering optimizations that cause the texture shimmering. Fortunately, the shimmering isn’t that noticeable in most environments; it really only stands out on long, flat objects like roads. So if you haven’t noticed it in the past, you probably won’t see it in S.T.A.L.K.E.R. either.
So that does it for Part 1 of our S.T.A.L.K.E.R. performance eval. In Part 2 we’ll be taking a look at high-end cards, and in Part 3 we’ll examine CPU performance. Supposedly S.T.A.L.K.E.R. takes advantage of quad-core CPUs. We’ll be putting this to the test shortly!
If you’ve taken some cool screenshots within S.T.A.L.K.E.R., you should definitely check out the FiringSquad S.T.A.L.K.E.R.: Shadow of Chernobyl Screenshot Contest. The winner will receive a copy of Supreme Commander or Company of Heroes, courtesy of THQ! For more details, click here.
© Copyright 2003 FS Media, Inc.