Official GIGABYTE Forum

Graphics Cards in SLI

Stu_UK

Graphics Cards in SLI
« on: April 27, 2014, 04:26:38 pm »
More of a curiosity rather than a problem:

When I set up my new build late last year/early this year, I was very interested in getting a couple of decent graphics cards working together in SLI. During my research I wanted to identify the benefits of having two GPUs working together over one really good GPU running the show. The conclusions I found were much of a muchness, the significant exception being that a single more expensive card would certainly incorporate newer hardware standards.

I was also interested in what PSU would be required to handle two cards, information that again wasn't handed to me on a plate; I had to spend quite a lot of time on Google to dig it out. A lot of forum threads seemed to push for PSUs of 800W and up! For the record, the Corsair CS650M can run a fully loaded machine hosting two Gigabyte nVidia GTX 660 Windforce OC cards with no problem, despite the Gigabyte website quoting a 450W power requirement for these cards. Newegg TV benchmarked a pair of 660s in SLI: 310W under load.
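For anyone who wants to sanity-check the sums themselves, here's a rough back-of-the-envelope estimate (a minimal Python sketch; the wattage figures are assumed nominal TDPs from public spec sheets, not measurements from my build, so adjust them for your own components):

    # Rough PSU headroom estimate for a 2x GTX 660 SLI build.
    # All figures below are assumed nominal TDPs, not measured draw.
    components = {
        "GTX 660 (x2)": 2 * 140,   # ~140 W TDP per card
        "CPU (LGA2011)": 130,      # e.g. a 130 W Sandy Bridge-E chip
        "Motherboard + RAM": 60,
        "Drives + fans": 40,
    }

    total = sum(components.values())  # worst-case sum of TDPs
    psu_watts = 650                   # e.g. Corsair CS650M

    print(f"Estimated load: {total} W")
    print(f"PSU headroom:   {psu_watts - total} W "
          f"({100 * (1 - total / psu_watts):.0f}% spare)")

Even adding up worst-case TDPs only gets you to about 510W, comfortably under 650W, and real measured load (like that 310W Newegg TV figure) comes in well below the TDP sum.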

And I was on shaky ground as to whether I could justify an SLI configuration; I'm not what you'd call a gamer. My day-to-day interests are more in the areas of computer graphics and 3D modelling, so I needed a machine built for number crunching, with a ballsy processor and lots of memory, more than one able to fire lots of frames per second at me, which would seem to be the big advantage of SLI setups: framerates.

So yeah, 2x GTX 660s in SLI... £150 each... And the most current Need for Speed game (well, current in December 2013!) runs great with all the effects, screen resolution and quality settings set to maximum... All good!

But the Gigabyte GA-X79-UD3 is capable of running up to four cards in SLI. Has anyone around here actually tried this? The physical dimensions of my cards are pretty big; they take up two rear-panel slots each. I'm not sure you could squeeze four 660s in there!

To anyone who may be running 4 cards in SLI, do tell: What cards are they? What kind of PSU are you using?? How much did those cards cost??? And what have the real life day to day benefits actually been???? - Are your games really that amazing now?
« Last Edit: April 27, 2014, 04:30:34 pm by Stu_UK »
“...but we're flexible”
“But not about the jugs!”
“Yes we have to be firm about the jugs!”
“And the jugs have to be very firm!!”

Richie and Eddie at the dating agency

Romuls

Re: Graphics Cards in SLI
« Reply #1 on: June 07, 2014, 01:58:39 pm »
Hello gentlemen! Writing to you is one of your best aficionados. I have used Gigabyte products for a long time, in particular the GA-EP43T-UD3L motherboard. The chipset performance is super! (The computer was bought for $1,500 in 2010.) The whole system worked fine, but life has its disappointments.

I bought your GV-R489OC-1GD video card for $300 (I decided to upgrade the system). After a few days the OS started crashing. At first I thought the motherboard couldn't cope, but no! Then I decided Windows 7 was short on memory, so I bought additional RAM for the motherboard: $150. It didn't help! I bought a new cooler for my Intel Core 2 Quad Q9400 (I had been using the boxed one): another $50. It didn't help! (The shamanism had begun!) By 2012 I thought the problem was with temperature control. But no! It turns out the GV-R489OC-1GD is not designed correctly!!

Some designer who considered himself a true engineer decided to make fun of the buyers. I spent around $1,000 identifying the cause of the system failures before I decided to investigate the PC myself. It turns out the heatsink on the GV-R489OC-1GD is not installed correctly! The heatsink sits over the RAM chips just for show; it doesn't actually make contact with the hottest components! Whoever designed it should have his hands slapped!!!! Management, take a look! You're amateurs, not developers!!

P.S. I will never buy Gigabyte consumer goods again! My remarks will go to Microsoft too!