PC Upgrade

Gator
Posts: 1412
Joined: 29 Jun 2002, 15:55
Location: Reston, VA

Post by Gator »

Not sure if you recall, but last year (Jan 2007) I built my current gaming PC. Basically: a Core 2 Duo at 2.4 GHz, 3 GB of RAM, and a single 8800GTX (stock clocks).

Last week, I performed a midlife upgrade on the beast: clocked that same processor up to 3.02 GHz (not the RAM, though; it's still stock) and installed a second 8800GTX.

Note: The second 8800GTX is not the same brand (BFG vs. eVGA), but they are clocked the same, etc., so this is OK according to Nvidia.

To see if I was getting a reasonable benefit, I benchmarked the machine before and after the graphics SLI upgrade (all tests performed with the CPU clocked at 3.02 GHz).

I used real game benchmarks for reality's sake. Astonishing results:

Test 1: Crysis: 25 FPS before, 37 FPS after -- 48% increase
Test 2: Supreme Commander: 21 FPS before, 44 FPS after -- 110% increase (more than double!)
Test 3: World in Conflict: 35 FPS before, 39 FPS after -- 11% increase
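
(Sanity check on those percentages -- a quick throwaway Python sketch, using only the FPS numbers from the table above:)

# percent gain from the second card; FPS pairs copied from the results above
results = {"Crysis": (25, 37), "Supreme Commander": (21, 44), "World in Conflict": (35, 39)}
for game, (before, after) in results.items():
    print(f"{game}: {(after - before) / before * 100:.0f}% increase")
# Crysis: 48%, Supreme Commander: 110%, World in Conflict: 11%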

Interesting results. It appears that Crysis and SupCom are both heavily dependent on GPU power, whereas WiC saw almost no benefit (after all, 35 FPS before the upgrade was already good enough).

Those games had their graphics sliders set so that they were enjoyable before the upgrade (not at their highest settings), so it stands to reason I should be able to crank the settings up now.

I will update this post with results of the benchmarks after the settings are readjusted.

FWIW: the 8800GTX was priced at $290 after rebates ... not a bad upgrade for a more-than-a-year-old PC. I plan to swap in a 3 GHz quad-core CPU, probably at the end of the year, for the final configuration of this machine, which should give it quite a bit of useful service.

Issues: During the upgrade, the GPUs installed and configured easily with no hassle. The only trouble I had was due to the mobo layout: with two double-height GPU cards, there is only one available PCI slot, so I had to move my X-Fi card to a new slot. Windows did not handle that well at all. I had to completely remove the X-Fi card, boot into safe mode to uninstall every vestige of Creative drivers/software, and then reinstall the card like I'd just bought it. No amount of uninstalling and reinstalling the drivers would work (it kept locking up and BSOD-ing) until I completely removed the card.
Silence is golden - Duct Tape is silver
Gator
Posts: 1412
Joined: 29 Jun 2002, 15:55
Location: Reston, VA

Post by Gator »

Manually edited the config files to allow the game to run in DX9 with the "Very High" settings that are normally reserved for DX10. So it's a level above the maxed DX9 settings. PLUS, I set it to 16xQ antialiasing (the highest setting in the GUI).
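
Roughly what my edits boil down to (a sketch from memory, so treat it as approximate; the sys_spec cvar names come from the game's own Config\CVarGroups files, 4 is the Very High tier, and con_restricted just unlocks the tweaks):

-- autoexec.cfg in the Crysis install folder (CryEngine configs take Lua-style -- comments)
con_restricted = 0
sys_spec_ObjectDetail = 4
sys_spec_Shading = 4
sys_spec_Shadows = 4
sys_spec_VolumetricEffects = 4
sys_spec_PostProcessing = 4
sys_spec_Particles = 4
sys_spec_Water = 4
sys_spec_Texture = 4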

24 FPS - totally playable. (36 FPS with all settings on "High")

Mission accomplished: Crysis on Maxed settings playable!
Silence is golden - Duct Tape is silver
VEGETA
Posts: 2592
Joined: 13 Mar 2002, 15:00
Location: Brampton, Ontario, Canada Eh

Post by VEGETA »

Sweet upgrade, man.
And yeah, you can even have separate clock speeds as long as both are 8800 GTX (G80) cards; with the higher-end one as primary, it will work fine.
Gator
Posts: 1412
Joined: 29 Jun 2002, 15:55
Location: Reston, VA

Post by Gator »

yep, it would clock one down to match the slower one, according to their forums. But why pay more for 600 MHz when it's going to run at 575 MHz?
Silence is golden - Duct Tape is silver
VEGETA
Posts: 2592
Joined: 13 Mar 2002, 15:00
Location: Brampton, Ontario, Canada Eh

Post by VEGETA »

My understanding is that the system will take care of the clock-speed matching for you.
Gator
Posts: 1412
Joined: 29 Jun 2002, 15:55
Location: Reston, VA

Post by Gator »

yep, that's what I said: "it would clock one down" ... not "I would clock one down."
Silence is golden - Duct Tape is silver
VEGETA
Posts: 2592
Joined: 13 Mar 2002, 15:00
Location: Brampton, Ontario, Canada Eh

Post by VEGETA »

Oh, my bad.

Also, I know you can even use different memory sizes as long as it's again the same model, say a GTS for example, and the same chip. The GTS, though, came in the G80 and now the G92 chipset, while the GTX was always G80.

Fun part is, my GTS cards are G92, so a much newer chipset, but the GTX has bigger memory bandwidth, so on larger screens above 1650 resolution the GTX smokes mine lol. At or below that, it's a close match.

Now, I have found that LockOn will work, but the clouds and the camera screen flicker with SLI enabled. Disable it and there are no issues, but that is an older game. Though it may also be for another reason; got to look into that one more. Heard some games don't like it, tho. But open the Nvidia Control Panel, click disable SLI, and it's ready to go.