PC Upgrade

Posted: 18 Apr 2008, 19:21
by Gator
Not sure if you recall, but last year (Jan 2007) I built my current gaming PC. Basically: Core 2 Duo at 2.4 GHz, 3 GB RAM, a single 8800GTX (stock clocks).

Last week, I performed a midlife upgrade on the beast: clocked that same processor up to 3.02 GHz (the RAM is still stock, though) and installed a second 8800GTX.

Note: The second 8800GTX is not the same brand (BFG vs. eVGA), but they're clocked the same, etc., so according to Nvidia this is OK.

To see if I was getting a reasonable benefit, I benchmarked the machine before and after the SLI upgrade (all tests performed with the CPU clocked at 3.02 GHz).

I used real game benchmarks for reality's sake. Astonishing results:

Test 1: Crysis: 25 FPS before, 37 FPS after -- 48% increase
Test 2: Supreme Commander: 21 FPS before, 44 FPS after -- 110% increase (more than double!)
Test 3: World in Conflict: 35 FPS before, 39 FPS after -- 11% increase
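The percent gains are just (after - before) / before; a quick Python sketch of that arithmetic, using the FPS numbers from my runs above:

```python
# Percent FPS increase: (after - before) / before * 100
benchmarks = {
    "Crysis": (25, 37),
    "Supreme Commander": (21, 44),
    "World in Conflict": (35, 39),
}

for game, (before, after) in benchmarks.items():
    gain = (after - before) / before * 100
    print(f"{game}: {before} -> {after} FPS, +{gain:.0f}%")
```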

Interesting results. It appears that Crysis and SupCom are both heavily dependent on GPU power, whereas WiC saw almost no benefit (then again, 35 FPS before the upgrade was already good enough).

Those games had their graphics sliders set so they were enjoyable before the upgrade (not their highest settings), so it stands to reason I should be able to crank the settings up now.

I will update this post with results of the benchmarks after the settings are readjusted.

FWIW: the 8800GTX was priced at $290 after rebates ... not a bad upgrade for a more-than-a-year-old PC. I plan to swap in a 3 GHz quad-core CPU, probably at the end of the year, as the final configuration of this machine, which should give it quite a bit more useful service.

Issues: During the upgrade, the GPUs installed and configured easily, no hassle. The only trouble I had was due to the mobo layout: with two double-height GPU cards there is only one PCI slot left, so I had to move my X-Fi card to a different slot. Windows did not handle that well at all. No amount of uninstalling and reinstalling the drivers would work (it kept locking up and BSOD-ing) until I completely removed the X-Fi card, booted into safe mode to uninstall every vestige of the Creative drivers/software, and then reinstalled the card as if I'd just bought it.

Posted: 18 Apr 2008, 20:09
by Gator
Manually edited the config files to allow the game to be played in DX9 with the "Very High" settings that are normally reserved for DX10, so it's a level above the maxed DX9 settings. PLUS, I set it to 16xQ antialiasing (the highest setting in the GUI).

24 FPS - totally playable. (36 FPS with all settings on "High")

Mission accomplished: Crysis on Maxed settings playable!

Posted: 18 Apr 2008, 23:09
by VEGETA
Sweet upgrade, man.
And yeah, you can even have separate clock speeds as long as both are 8800GTX (G80) cards; the higher-end one becomes primary and it will work fine.

Posted: 19 Apr 2008, 05:34
by Gator
Yep, it would clock one down to match the slower one, according to their forums. But why pay more for 600 MHz instead of 575 MHz?

Posted: 19 Apr 2008, 09:42
by VEGETA
My understanding is that the system takes care of that for you -- the clock-speed matching.

Posted: 19 Apr 2008, 11:03
by Gator
Yep, that's what I said: "it would clock one down" ... not "I would clock one down."

Posted: 19 Apr 2008, 11:16
by VEGETA
Oh, my bad.

Also, I know you can even use cards with different memory sizes, as long as it's again, say, a GTS paired with a GTS on the same chip. The GTS, though, came in both the G80 and now the G92 chipset, whereas the GTX was always G80.

Fun part is my GTS cards are G92, so a much newer chipset, but the GTX has bigger memory bandwidth, so at large screen resolutions above 1650 the GTX smokes mine lol. At or below that, it's a close match.

Now, I have found that LockOn will work, but the clouds and the camera screen flicker with SLI enabled. Disable it and no issues, but that is an older game. Though it may also be for another reason; I've got to look into that one more. I've heard some games don't like it, though. But open the Nvidia control panel, click "disable SLI," and it's ready to go.