BioShock Performance Analysis
So, one of the most eagerly anticipated games of the year has finally hit the shops' websites. With the PC demo available to download, netizens everywhere have endeavoured to max out their bandwidth and usage policies just to catch a glimpse of BioShock.
Our full review of the game is underway and will be posted once we've completed the game, perhaps more than once, in order to fully appreciate exactly what 2K Games have served up for us to feast on. What we'll be looking at in this article is how well today's high-end graphics cards handle the latest DirectX 10 instalment available.
Without further ado, let’s get down to the nitty-gritty.
Here on Digital Report, we've got an ATI HD2900XT and an nVidia 8800GTX available to test the performance of any game at our disposal. ATI's offering comes in the form of a GeCube model with 512MB of GDDR3 memory and reference clocks of 742MHz core and 828MHz (1656MHz DDR) memory. The nVidia offering has a slight advantage, being a pre-overclocked ACS3 model from EVGA. It has 768MB of GDDR3 memory and clocks of 626MHz core and 1000MHz (2000MHz DDR) memory. The cards also sit in significantly different price brackets, with the ATI card currently retailing for around £240 and the nVidia card for £413 (both scan.co.uk).
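For readers curious how the quoted memory clocks translate into raw throughput, here's a minimal sketch of the arithmetic: DDR memory transfers on both clock edges, so the effective rate is double the base clock, and peak bandwidth follows from multiplying by the bus width. The bus widths used below (512-bit for the HD2900XT, 384-bit for the 8800GTX) come from the cards' public specifications, not from this article.

```python
# Sketch: effective DDR clock and theoretical peak memory bandwidth.
# Bus widths are assumed from the cards' published specs.

def effective_ddr_clock_mhz(base_mhz):
    """DDR transfers data on both clock edges, doubling the effective rate."""
    return base_mhz * 2

def bandwidth_gb_s(effective_mhz, bus_width_bits):
    """Theoretical peak bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

hd2900xt = bandwidth_gb_s(effective_ddr_clock_mhz(828), 512)   # ~106.0 GB/s
gtx_acs3 = bandwidth_gb_s(effective_ddr_clock_mhz(1000), 384)  # ~96.0 GB/s
print(f"HD2900XT: {hd2900xt:.1f} GB/s, 8800GTX ACS3: {gtx_acs3:.1f} GB/s")
```

On paper, then, the ATI card's wider bus gives it the bandwidth edge despite the EVGA card's factory overclock; whether that shows up in-game is what the benchmarks will tell us.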
Testing was performed on Rich's PC: a Xeon 3060 (Core 2 Duo E6600) overclocked to 3.6GHz with 800MHz memory at 3-3-3-8 timings on an Intel D975XBX 'Bad Axe' motherboard with a Western Digital Raptor X hard drive.
In testing, we ran through the first few minutes of the demo after gaining control of Jack, your character. The framerate was recorded using RivaTuner, and each test was repeated and the results averaged. Every test was carried out at the same visual settings and the same resolution. We were able to test a few driver revisions for each card: the Catalyst 7.7 and 7.8 releases for the ATI card, and the official ForceWare 162.22 and the 163.44 beta releases for the nVidia.
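The "repeated and averaged" step above is straightforward, but worth spelling out: each run is averaged on its own first, then the run averages are averaged together, so a run with more logged samples doesn't dominate the result. The sample numbers below are hypothetical; RivaTuner logs framerates, but its exact log format isn't shown in this article.

```python
# Sketch: averaging repeated benchmark runs, assuming each run's
# per-second FPS samples have already been extracted into a list.

def average_fps(runs):
    """Mean FPS across several runs; each run is averaged first so
    runs of different lengths carry equal weight."""
    per_run = [sum(r) / len(r) for r in runs]
    return sum(per_run) / len(per_run)

# Hypothetical samples from three passes through the demo's opening minutes:
runs = [
    [58, 61, 55, 60],
    [57, 59, 56, 62],
    [60, 63, 54, 59],
]
print(f"Average FPS: {average_fps(runs):.1f}")  # 58.7 for these samples
```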