256MB vs 512MB testing
To determine the performance impact of the added video memory, we split our testing in two. One batch of tests uses a mainstream card to see whether the extra memory makes a difference under the settings most users will actually play the game at, and a second batch is designed to stress the graphics card as much as possible. For the mainstream test we used ATI's Radeon X800 XL GPU, as it was a popular upgrade choice until it was replaced by more recent Radeon offerings; on the high end, we took a GeForce 7900 GTX board and underclocked it to the same speeds as the GeForce 7900 GT. We're running the X800 XL with the same graphics settings we outlined on the previous page, while the 7900 GTX runs at the game's highest settings to stress the card as much as possible.
Our results for the high-end setup at max settings are mixed. The indoor and foliage areas showed no performance improvement, but in the large, open region where our mountains demo takes place, the GeForce card with 512MB of memory posted a noticeable gain, up to 13% at 1024x768. Running the Radeon X800 XL at low graphics settings, however, we saw no improvement at all; in fact, based on our results, the added latency of the extra memory usually hurt performance slightly.
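A gain figure like the 13% above is simply the relative change in average frame rate between the two cards. A minimal sketch of that calculation (the fps values below are hypothetical placeholders, not our measured results):

```python
# Hypothetical example: the frame rates here are placeholders chosen to
# illustrate the math, not the article's measured numbers.
def percent_improvement(baseline_fps: float, upgraded_fps: float) -> float:
    """Return the percentage gain of the upgraded card over the baseline."""
    return (upgraded_fps - baseline_fps) / baseline_fps * 100.0

# e.g. a 256MB card averaging 40 fps vs a 512MB card averaging 45.2 fps:
gain = percent_improvement(40.0, 45.2)
print(f"{gain:.0f}% improvement")  # prints "13% improvement"
```

The same formula also shows a small regression as a negative percentage, which is how a result like the X800 XL's slight slowdown would appear.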
Based on all this, it looks like you'll only see improved performance from 512MB cards if you're running Oblivion at the highest graphics settings, and even then, only under limited conditions.