Firingsquad: What is it like being a part of the NVIDIA Way It’s Meant To Be Played Program? What kind of assistance do you receive from NVIDIA, and how much time have you spent tuning World in Conflict to run on Radeon GPUs from AMD? The current beta doesn’t run very well with AMD’s latest DX10 Radeon hardware – should gamers with these cards be concerned about compatibility?
Firingsquad: What kind of differences should gamers expect between the DX9 and DX10 versions of World in Conflict? Is Massive mainly using DX10 shaders to improve performance, or to enhance graphics over DX9? Some combination of both?
Christian Seger: Our focus for World in Conflict was on improving graphical quality in the DX10 version. Improving performance is a more long-term commitment best suited for our future titles, where DX10 will likely be the main renderer, and not a late high-end addition as with World in Conflict. The DX10 version can also have the Mega Map enabled at all times on a secondary screen, which we don’t have in DX9.
Firingsquad: In his interview on nzone.com, Stefan Johansson said that “we have optimized the DX10 implementation for high-end cards, so that low-end cards might actually be better off with running DX9.” Could you elaborate on this a bit more? It sounds like the visual improvements you’ve implemented for the DX10 path are too demanding for lower-end cards. If so, do you think this is a dilemma other game developers could be facing?
Christian Seger: The DX10-only features we've implemented consume fill rate and video memory on top of rendering all other things in World in Conflict. So I think Stefan's comment is simply about total workload for the GPU. The lower-end cards have less speed and less memory available, so running World in Conflict on lower settings on those machines may be a good idea to get a smoother gaming experience. However, we have worked a lot on getting the lower settings as beautiful as possible, so it should still look really good, but without all the cool DX10 only features. For people who want the full DX10 gaming coolness, I'd recommend getting the higher-end DX10 cards, with half a gig or more of video memory.
Firingsquad: So far in our own testing, we’ve seen little visual difference between the DX9 and DX10 code path running the current (public) World in Conflict beta, and we’ve also witnessed slower performance running under DX10. We’ve heard newer builds are showing substantial performance improvements and better image quality under DX10. Can you shed more light on this?
Christian Seger: The visual impact of particularly the cloud shadows and the light shafts is very different on different maps in the game. The level designers decided on specific settings that are appropriate for the environment on each map. Soft particles make particles look more realistic, eliminating the hard clipped-polygon edges that appear where they intersect world geometry. Performance will depend on drivers and how well we utilize the new API. Our goal has been to at least match DX9. For future titles we think that the improvements in the DX10 API will be more visible.
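The soft-particles technique Seger mentions is commonly implemented by fading a particle's alpha based on how close it is to the depth buffer behind it. A minimal sketch of the general idea (illustrative only – this is not Massive's shader code, and the `fade_distance` parameter is an assumption):

```python
def soft_particle_alpha(scene_depth, particle_depth, fade_distance=1.0):
    # Depth gap between the opaque scene and the particle fragment;
    # a small gap means the particle is about to intersect geometry.
    diff = scene_depth - particle_depth
    # Fade alpha linearly over fade_distance, clamped to [0, 1], so the
    # particle dissolves smoothly instead of showing a hard clip line.
    return max(0.0, min(1.0, diff / fade_distance))
```

A particle fragment at the same depth as the scene gets alpha 0 (fully faded), while one a full `fade_distance` in front of the geometry renders at full opacity, which is why intersections no longer produce visible clipping.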
Firingsquad: Based on what you’ve seen with DirectX 10, do you think it will be easier for game developers to program for than DirectX 9 was? How easy was it porting World in Conflict to DX10?
Christian Seger: DX9 is actually a wonderful API, and it hasn't been that difficult to program for. I think DX10 is at the same level, but it will let us do better and cooler things. I don't think DX10 will have an impact on development times – there are far more challenging things in game development than graphics programming. Our primary DX10 engineer Stefan Johansson did the DX10 implementation of our renderer in about two months, then he and Iain Cantlay (NVIDIA) worked on the DX10 features for an extra month or so. So the porting and the new features were, on the whole, a very smooth experience for us.
Christian Seger: NVIDIA's program is fantastic! We had great support from them starting as early as last year. We had support from their engineers, and most importantly, help with performance measurements and identifying bottlenecks in our use of the GPU. AMD helped us identify some problems with using their GPUs, and those will be addressed in a future patch, hopefully as close to release as possible. I don't think gamers should be concerned about anything, provided they get the latest drivers for their graphics card and download the latest patches for the game.
Firingsquad: How long do you think it will take before games require DirectX 10? When should gamers who already have high-end DX9 cards really care about this new API, when will it really begin to affect them?
Christian Seger: I think DX9 will live on for a couple of years, at least as a low-end renderer. I think that from now on, if it is time for an upgrade of your system, you should get one of the high-end DX10 cards. They have the video memory needed by modern games as well as the DX10 support.
Firingsquad: While graphics performance is definitely important, World in Conflict also benefits from having a fast processor – we also noted a slight performance improvement when going from two to four cores. Is the game code multithreaded, and if so, what kind of performance improvement should gamers with quad-core CPUs expect in World in Conflict?
Christian Seger: Yes, World in Conflict has multithreaded code that will improve performance on multi-core systems. You can expect an average improvement of 20-40%. But perhaps more importantly, you get large improvements in minimum FPS: for example, when a nuke is dropped or other heavy battle events occur, those short dips in performance disappear and you get a smoother experience.
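The pattern Seger describes – spreading per-frame simulation work across cores so heavy moments don't stall a single thread – can be sketched as splitting independent per-unit updates over a worker pool. This is an illustrative sketch only, not Massive's engine code; the `update` step is a placeholder:

```python
from concurrent.futures import ThreadPoolExecutor

def simulate_units(units, num_workers=4):
    """Update each unit in parallel across a pool of worker threads.
    Each unit is a (position, velocity) pair; the per-unit work here
    is a trivial placeholder for real simulation cost."""
    def update(unit):
        # Placeholder per-unit work: advance position by velocity.
        pos, vel = unit
        return (pos + vel, vel)

    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        # map preserves input order, so results line up with units.
        return list(pool.map(update, units))
```

In a real engine this kind of split helps most exactly when the frame's workload spikes (many units, big explosions), which matches the minimum-FPS improvement Seger highlights.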
On behalf of FiringSquad, we’d like to thank Massive’s Christian Seger for answering our questions on the tech inside World in Conflict. Now that we’ve got a little more background on the tech inside the game, how does it perform? Time to find out, don’t you think?