Summary: World in Conflict is the latest game to sport DX10 graphics. In this article, Brandon chats with the developer about DX10 and also examines the game's performance. Check out the results inside!
The number of DX10 titles is slowly but steadily increasing. Relic was first out of the gate with a DX10 patch for Company of Heroes, followed quickly by Lost Planet. Now another RTS, from developer Massive Entertainment (the minds behind the Ground Control series), is set to offer DX10 support. The game, World in Conflict, debuts in a little over a month and, with the highest graphics settings, it looks like it’s really going to push today’s latest systems to the max. We recently tested the public beta with a variety of high-end graphics cards to see how they’d perform. We were particularly interested in the game’s DX10 path, as the game has been hyped to really deliver in this area; more on that later though.
In case you haven’t heard of World in Conflict, we’ll provide a quick summary. The game is set in an alternate reality where the Berlin Wall never fell and the Cold War is still going strong. The year is 1988 and tensions between East and West are at an all-time high. Ultimately the Soviet Union invades western Europe, compelling NATO to invoke Article V, the collective self-defense clause which states that an attack against one ally is an attack against them all. The war front quickly spreads from the fields of Europe to the shores of America itself – before America can send reinforcements to Europe, a Soviet invasion force hiding in unmarked container ships attacks Seattle.
The storyline of the game was created by New York Times best-selling author Larry Bond and uses real-world air and ground forces: if you’ve ever dreamed of an RTS with Hind and Cobra helicopters, and M-1 Abrams and T-80 main battle tanks, this is your game.
Unlike many RTS games, World In Conflict skips the resource gathering and base management tasks, focusing instead on combat. You must also choose from one of four different classes to play (air, armor, infantry, or support); this should make teamwork critical in multiplayer matches. The game even supports dual monitors if you run DX10 under Vista.
We recently had the chance to chat with Christian Seger, Massive’s lead engineer. As you probably know by now there are two basic routes game devs can choose from when designing their games for DX10: improving performance or improving graphics. We were curious to see what route Massive chose and how that choice affected the game in comparison to the DX9 version.
Firingsquad: Which new features in DirectX 10 excite you the most and how are you taking advantage of that in World in Conflict?
Christian Seger: Texture arrays, stream-out, and lower draw call overhead are the features that first come to mind as the most appealing. However, for World in Conflict, we had limited time to get DX10 features in before we hit our beta date, so we had to choose a feature that would give us high visual impact for the time we had available. The feature we chose was having the depth of the scene always available. We do this in DX10 by simultaneously rendering to multiple render targets; this can also be done in DX9, but not without ditching FSAA (if I remember correctly), which we weren't prepared to do. The scene depth buffer gave us the possibility to implement three new features: Soft Particles, Cloud Shadows and Light Shafts.
Massive provided a good description of these features (and a video) on nzone.com. We’re reposting it here:
Soft Particles: One of the great features of DX10 is the Soft Particle effect, which ensures that all the different particle effects (like smoke, explosions and fire) look as realistic as possible. In many games that don’t have this feature, effects like these can sometimes look striped, as if a number of two-dimensional objects were stacked next to each other. Not so with Soft Particles. The soft particles, or z-feather, make sure that these particle effects behave realistically, as they actually exist in full 3D within the game level. This means they look and act exactly like smoke, explosions, etc. ought to in the real world.
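In essence, the z-feather compares the depth of each particle fragment against the scene depth buffer Seger mentioned, and fades the particle's opacity as the two depths converge. Here's a minimal illustrative sketch of that idea (not Massive's actual shader code; the function name, the linear fade, and the default fade distance are our own assumptions):

```python
def soft_particle_alpha(base_alpha, scene_depth, particle_depth, fade_distance=1.0):
    """Fade a particle's alpha as it nears scene geometry (z-feather).

    scene_depth:    depth of the opaque scene at this pixel (from the depth buffer)
    particle_depth: depth of the particle fragment being shaded
    fade_distance:  distance over which the particle fades out (illustrative value)
    """
    # How far the particle sits in front of the geometry behind it.
    separation = scene_depth - particle_depth
    # Clamp to [0, 1]: fully transparent right at the intersection,
    # fully opaque once the particle is fade_distance or more in front.
    fade = max(0.0, min(1.0, separation / fade_distance))
    return base_alpha * fade

# A particle fragment right at the geometry is invisible...
print(soft_particle_alpha(0.8, scene_depth=10.0, particle_depth=10.0))  # 0.0
# ...while one well in front of it keeps its full alpha.
print(soft_particle_alpha(0.8, scene_depth=10.0, particle_depth=8.5))   # 0.8
```

Without this fade, the particle quad is simply clipped by the depth test where it crosses world geometry, which produces the hard "striped" edges described above.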
Global Cloud Shadows: Another very interesting graphical feature in DX10 is the possibility to make Global Cloud Shadows. This means that we can have our volumetric clouds moving across the level in actual 3D and cast shadows on buildings, hills and forest in a completely realistic way. The shadows are affected by the shape, volume and thickness of the clouds and as the sky gets filled with more smoke and clouds, the shadows reflect the harshness of the battlefield.
Volumetric Lighting, “God rays”: The volumetric lighting makes sure that we can have some excellent lighting to go on the side of our shadow features. This is very noticeable as we see rays of light, or “god rays”, coming through the clouds over the battlefield. It’s a very atmospheric effect that can make even the most intense war scenery look beautiful.
Firingsquad: Is there anything in DirectX 10 that you couldn’t do in DirectX 9.0?
Christian Seger: The scene depth buffer support couldn't be implemented in DX9 at the same quality and with the same performance impact. I'm sure there are plenty of other things, but further DX10 implementation will be a focus of our future projects, where DX9 is likely to be the low-end render path.
Firingsquad: What kind of differences should gamers expect between the DX9 and DX10 version of World in Conflict? Is Massive mainly using DX10 shaders to improve performance, or to enhance graphics over DX9? Some combination of both?
Christian Seger: Our focus for World in Conflict was on improving graphical quality in the DX10 version. Improving performance is a more long-term commitment best suited for our future titles, where DX10 will likely be the main renderer, and not a late high-end addition as with World in Conflict. The DX10 version can also have the Mega Map enabled at all times on a secondary screen, which we don’t have in DX9.
Firingsquad: In his interview on nzone.com, Stefan Johansson said that “we have optimized the DX10 implementation for high-end cards, so that low-end cards might actually be better off with running DX9.” Could you elaborate on this a bit more? It sounds like the visual improvements you’ve implemented for the DX10 path are too demanding for lower-end cards? If so do you think this is a dilemma other game developers could be facing?
Christian Seger: The DX10-only features we've implemented consume fill rate and video memory on top of rendering all other things in World in Conflict. So I think Stefan's comment is simply about total workload for the GPU. The lower-end cards have less speed and less memory available, so running World in Conflict on lower settings on those machines may be a good idea to get a smoother gaming experience. However, we have worked a lot on getting the lower settings as beautiful as possible, so it should still look really good, but without all the cool DX10 only features. For people who want the full DX10 gaming coolness, I'd recommend getting the higher-end DX10 cards, with half a gig or more of video memory.
Firingsquad: So far in our own testing, we’ve seen little visual difference between the DX9 and DX10 code path running the current (public) World in Conflict beta, and we’ve also witnessed slower performance running under DX10. We’ve heard newer builds are showing substantial performance improvements and better image quality under DX10. Can you shed more light on this?
Christian Seger: The visual impact of particularly the cloud shadows and the light shafts is very different on different maps in the game. The level designers decided on specific settings that are appropriate for the environment on each map. Soft particles will always make particles look real, instead of having clipped polygons when they intersect with world geometry. Performance will depend on drivers and how well we utilize the new API. Our goal has been to at least match DX9. For future titles we think that the improvements in the DX10 API will be more visible.
Firingsquad: Based on what you’ve seen with DirectX 10, do you think it will be easier for game developers to program for than DirectX 9 was? How easy was it porting World in Conflict to DX10?
Christian Seger: DX9 is actually a wonderful API, and it hasn't been that difficult to program for. I think DX10 is at the same level, but it will let us do better and cooler things. I don't think DX10 will have an impact on development times – there are much more challenging things in game development than the graphics programming. Our primary DX10 engineer Stefan Johansson did the DX10 implementation of our renderer in about two months, then he and Iain Cantlay (nVidia) worked on the DX10 features for an extra month or so. So the porting and the new features were on the whole a very smooth experience for us.
Firingsquad: What is it like being a part of the NVIDIA Way It’s Meant To Be Played Program? What kind of assistance do you receive from NVIDIA and how much time have you spent tuning World in Conflict to run on Radeon GPUs from AMD? The current beta doesn’t run very well with AMD’s latest DX10 Radeon hardware, should gamers with these cards be concerned with compatibility?
Christian Seger: Nvidia's program is fantastic! We had great support from them starting as early as last year. We had support from their engineers, and most importantly, help with performance measurements and identifying bottlenecks in our use of the GPU. AMD helped us identify some problems with using their GPUs, and those will be addressed in a future patch, hopefully as close to release as possible. I don't think gamers should be concerned about anything, as long as they get the latest drivers for their graphics card and download the latest patches for the game.
Firingsquad: How long do you think it will take before games require DirectX 10? When should gamers who already have high-end DX9 cards really care about this new API, when will it really begin to affect them?
Christian Seger: I think DX9 will live on for a couple of years, at least as a low-end renderer. I think that from now on, if it is time for an upgrade of your system, you should get one of the DX10 cards on the high-end side. They have the video memory needed by modern games as well as the DX10 support.
Firingsquad: While graphics is definitely important, World in Conflict also benefits from having a fast processor – we also noted a slight performance improvement when going from two to four cores. Is the game code multithreaded, and if so, what kind of performance improvement should gamers with quad-core CPUs expect in World in Conflict?
Christian Seger: Yes, World in Conflict has multithreaded code that will improve performance on multi-core systems. You can expect an average improvement of 20-40%. But perhaps more importantly, you get large improvements in minimum FPS: for example, when a nuke has been dropped or other heavy battle action occurs, you get rid of those short drops in performance and get a smoother experience.
On behalf of FiringSquad, we’d like to thank Massive’s Christian Seger for answering our questions on the tech inside World in Conflict. Now that we’ve got a little more background on the game’s technology, how does it perform? Time to find out, don’t you think?
World In Conflict Public Beta DX9 Codepath
We’re testing with all NVIDIA GeForce graphics cards for this article. AMD’s Radeon HD 2900 XT card doesn’t run the game properly, resulting in issues such as garbled text and error messages in some cases. When we contacted AMD about this, they stated that they’re aware of the issues with the public beta and that they would have an optimized driver that would address these problems once the game is released.
As you can see, the game is highly CPU-bound: cranking up the screen resolution hardly shows any performance difference, even at 2560x1600! As a result, performance between the GTX and Ultra cards isn’t all that different, even including the SLI setups; the GeForce 8800 GTS 320MB does lag behind the other cards though. The 320MB board matches the 640MB GTS at lower resolutions, but runs 9% slower at 1600x1200, and this margin increases substantially at 2560x1600.
We’re still CPU-bound under 4xAA. The game also has a very high graphics setting; perhaps we should have tried that to push the GPUs harder. Turning on 4xAA didn’t result in the huge hit in performance you normally expect to see: at 16x12 the GeForce 8800 Ultra, for instance, delivered 39 fps without AA and 36 fps with 4xAA, a performance hit of just 8%.
The margins were about the same under 4xAA/16xAF for the GeForce 8800 GTX and Ultra, but performance declined further for the GeForce 8800 GTS cards, with the 640MB GeForce 8800 GTS running 25% slower under DX10 and the 320MB a whopping 53% slower.
Hopefully these figures will improve with final game code, as well as driver optimizations from AMD and NVIDIA. Obviously though, if you want to enjoy the game with high quality settings, you’re going to want to follow Christian’s advice and get a high-end card with 512MB of memory or more. GeForce 8800 GTS 320MB owners will likely want to turn down the settings if they wish to game at high screen resolutions, particularly with AA turned on. (And if you were thinking 2xAA might help the situation somewhat, we ran those numbers too and saw a nice double-digit performance gain over 4xAA under both DX10 and DX9 for the 320MB card.)
The other point to take away from all this is that the game is highly CPU-limited with today’s high-end cards, even when running with a Core 2 Extreme X6800. You’ll want to get your hands on the fastest processor possible to really keep the frame rate high; overclocking probably wouldn’t be a bad idea either. Because the game is so CPU-bound, dual-card SLI and CrossFire setups won’t see the performance gains we’re used to seeing; on the other hand, gamers with these setups might as well crank the AA up to extreme levels, since it will cost them little.
We also hope AMD and Massive can get the issues resolved for Radeon HD card owners in time for the game’s retail release in September (a demo will also be made available on Aug 20th).
In case you’re worried about how the game will play with older cards, don’t be. We quickly checked out performance with older cards like the GeForce 7600 GS/GT at lower graphics settings under DX9, and the game still looked good and played well. In all honesty, it was very tough to spot differences between the DX9 and DX10 code paths during the public beta; the game looked just as great under both rendering modes. Like Company of Heroes DX10, the differences between World in Conflict DX9 and DX10 were very subtle – in fact, even more subtle than in CoH. We couldn’t take identical screenshots with the beta code, but you literally have to zoom in 200% to really spot any differences. We’ve been told to expect changes to the final game that should hopefully highlight the differences a bit better in favor of DX10.
With Supreme Commander, Command and Conquer 3, and soon, World in Conflict, 2007 could go down as the year of the RTS. Of course, there are lots of first-person shooters yet to be unveiled, including the granddaddy of them all – Crysis. It’s also great to see so much DX10 content out there even though so far nothing has really wowed us. We’ll see if that changes in the coming months though…
© Copyright 2003 FS Media, Inc.