Windows 7 Forums
Welcome to Windows 7 Forums. Our forum is dedicated to helping you find support and solutions for any problems regarding your Windows 7 PC, be it Dell, HP, Acer, Asus, or a custom build. We also provide an extensive Windows 7 tutorial section that covers a wide range of tips and tricks.


Windows 7: I Want to SLI GTX580, need help on how to.


24 Oct 2011   #1

64bit Windows 7 Ultimate
 
 
I Want to SLI GTX580, need help on how to.

Firstly, this may be a stupid question for 90% of the people here, but I honestly don't know, so please don't judge me on this.

Now I would like to SLI two Nvidia GTX 580s. They will both be stock cards of the same make and so on. My question is how do you SLI them, but more precisely: can I put them in different slots on the motherboard so there is a gap between them? I.e. first card in the 1st PCI-E slot, second in the 3rd, so they are not touching each other.

Secondly, will my motherboard be able to do that slot arrangement I just mentioned, and how many slots does your case need for it?

And lastly, will there be a lot more heat when doing SLI, and will I need a lot of power for them?

PS: my new AX1200 PSU from Corsair arrives tomorrow.


24 Oct 2011   #2

Windows 7 Pro X64 SP1
 
 

You may find this site worth a look:

FAQ

The motherboard must have two PCI-E X16 slots, spaced widely enough to take the two GTX580s. You'll also need an SLI bridge that's the right length.

A 1200 W power supply is, er, adequate. There's a listing of SLI-certified PSUs at the same web site; one Corsair 1200 W model is certified for three 580s.

If you max out the 580 cards, they'll each draw roughly 300 W. The fan noise from the cards will be fairly loud at that, if they're like my GTX 480.
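To put rough numbers on it, here's a back-of-envelope power budget. The ~300 W per-card figure is from above; the allowance for the rest of the system (CPU, board, drives, fans) is my own assumption, not a measured value:

```python
# Back-of-envelope PSU headroom check for two GTX 580s in SLI.
# watts_per_card (~300 W at full load) comes from the discussion above;
# rest_of_system (~250 W) is an assumed allowance for CPU, board, drives, fans.
cards = 2
watts_per_card = 300
rest_of_system = 250

total = cards * watts_per_card + rest_of_system
psu = 1200                      # Corsair AX1200 rated output
headroom = psu - total
load_pct = 100 * total / psu

print(f"Estimated draw: {total} W, headroom: {headroom} W ({load_pct:.0f}% load)")
```

Even with generous estimates, the AX1200 sits around 70% load, which is comfortable territory for a PSU.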

If you wish to exercise the cards, I suggest Furmark:

FurMark: VGA Stress Test, Graphics Card and GPU Stability Test, Burn-in Test, OpenGL Benchmark and GPU Temperature | oZone3D.Net

It's difficult to find anything that brutalizes graphics cards more than Furmark.
24 Oct 2011   #3

64bit Windows 7 Ultimate
 
 

Cool, thanks. I will check if my motherboard is wide enough for that PCI-E separation.
24 Oct 2011   #4

64bit Windows 7 Ultimate
 
 

It seems my mobo slots are: 1st x16, 2nd x16, 3rd x8, and 4th x8. So will it be a problem if I put the second card in an x8 PCI-E slot? And what about heat and so on?
24 Oct 2011   #5

Windows 7 Pro X64 SP1
 
 

If the Maximus IV Extreme-Z in your system list is the one you're using, the manual (ASUSTeK Computer Inc. -Support- Drivers and Download Maximus IV Extreme-Z) lists the preferred configuration on p. 21-15. The recommended slots are PCI-E x16 #1 and #3, which appear to be the ones driven by the Intel controller. (They'll run at x8 with two cards in place.) Apparently this is preferred over using the NF200 switch.

Heat isn't that big an issue, as the GTX 580s exhaust outside the case. You want to make sure there's enough airflow in the case that the inlets to the cards see air that's not too hot.
25 Oct 2011   #6

64bit Windows 7 Ultimate
 
 

Bobkn, you seem to be the only one answering, so thanks for that. I would rep again, but can only do it once. That's all I needed to know, so I will get another 580 and do it in slots 1 and 3. I got an email about the new Mars II from Asus, which will be a super-clocked GTX 590. I was thinking of getting that, but most reviews say SLI 580s is faster... SLI 590s is the ultimate... but since I already have a 580, I think it would be easier to SLI two of them. And yeah, everything in my system specs is what I am going to use.
25 Oct 2011   #7

Windows 7 Home Premium 64bit
 
 

Do not worry about the GPUs running in x8 mode each; the difference between x16/x16 and x8/x8 is very, very little.

This may come out confusing, but I'll try.

On P67/Z68 boards, the Sandy Bridge CPU only has 16 PCI Express lanes. So for two cards, there can only be 16 lanes going to the CPU, meaning the bandwidth must split to x8/x8 for two GPUs.

Your best bet is to keep the 580s spaced out as much as you can.

Some higher-end boards make use of the NF200 chip (which your Maximus should have), which makes running 3-way possible.

Every board implements this chip differently, but if I recall correctly, the Maximus bypasses it if only one or two cards are being used, because overall x8/x8 directly to the CPU produces better results than going through the NF200. If you run three GPUs, you have no choice but to engage the NF200 chip.

In some setups, the NF200 will take x8 of bandwidth from the CPU, meaning the first GPU runs at x8 off the CPU. Additional cards can run at either x16 or x8 to the NF200, but have only a total of x8 from the NF200 to the CPU. So for two cards running off the NF200 it would be something like:

Card 1: x8 directly to the CPU.
Card 2: x8 or x16 to the NF200 chip --> x8 to the CPU.

Total: x16 of bandwidth for two GPUs to the CPU's PCIe controller.

So you are better off just splitting it at the CPU directly and avoiding the added latency the NF200 brings, if you can.

In the end, it really doesn't cause that big of a performance hit either way. Not that you'll notice.
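The lane arithmetic above is easy to sanity-check. A quick sketch, assuming PCIe 2.0's standard per-lane figures (5 GT/s with 8b/10b encoding leaves about 0.5 GB/s of usable bandwidth per lane, per direction):

```python
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding overhead
# leaves roughly 0.5 GB/s of usable bandwidth per lane, per direction.
GB_PER_LANE = 0.5

def link_bandwidth(lanes):
    """Usable one-way bandwidth in GB/s for a PCIe 2.0 link of the given width."""
    return lanes * GB_PER_LANE

for lanes in (8, 16):
    print(f"x{lanes}: {link_bandwidth(lanes):.1f} GB/s per direction")

# Two cards at x8/x8 still get 4 GB/s each, straight to the CPU --
# rarely a bottleneck for a single-monitor gaming load in this era of cards.
```

This is why x8/x8 costs so little in practice: each card keeps half the link, but games seldom saturate even that.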
25 Oct 2011   #8

 
 

Quote: Originally Posted by Alsisgevat
....but SLI 590's is the ultimate....
And you wouldn't use it all. Not with a single monitor.


Quote: Originally Posted by Wishmaster
In the end, it really doesn't cause that big of a performance hit either way. Not that you'll notice.
Not at 1920x1080 at any rate.
25 Oct 2011   #9

64bit Windows 7 Ultimate
 
 

Quote: Originally Posted by Wishmaster
Do not worry about the GPUs running in x8 mode each; the difference between x16/x16 and x8/x8 is very, very little.... In the end, it really doesn't cause that big of a performance hit either way. Not that you'll notice.

Wow, that is a lot to think about... okay, now I have a major new bunch of questions, too bad I marked this as solved. With the Asus Maximus IV Extreme-Z, which does support the NF200, I went on Google and typed in that board name and NF200 to see what I would get (first time I ever heard of NF200).

Now in my next post I will add pieces from the website, and it would be cool to see your thoughts on this, please.
25 Oct 2011   #10

64bit Windows 7 Ultimate
 
 

"The specifications for multi-GPU setups on the Maximus IV Extreme(-Z) boards are 4 x PCIe 2.0 x16 (x16, or dual x8, or x8, x16, x16). So in other words, if you're using one video card, it runs at x16; if you use two or run "dual", it's x8 / x8, and if you run three it's x8 / x16 / x16. What people asked was, why does ASUS, who offers the NF200 chip on these boards, opt for an x8 / x8 setup in a dual card situation, when any other company with a dual GPU setup would make use of the NF200 chip and go for an x16 / x16 setup?"

"Well, ASUS say the way they do it is faster. Huh!?!? x8 / x8 is slower than x16 / x16, though? - Yes, but because the NF200 chip piggybacks onto the Intel chipset, they say a certain amount of lag is added because another chip is being thrown into the mix. A good way to look at it is we've got two runners in a 100m race. The x8 / x8 Intel runner is slower overall in straight-out speed, but the x16 / x16 NF200 runner has hurdles in his lane, and while he can run faster, because he has to jump hurdles (i.e. go through the Intel chip) he's overall actually a little slower."


"Sliding our cards into Slot 1 and Slot 3, what ASUS recommend, we have our setup running at x8 / x8 via the "Native" Intel P67 chip... Once we've tested with that setup, we put our cards in Slot 2 and Slot 4, the setup that ASUS don't recommend, and the setup according to them which will run slower. Looking in the BIOS again, you can now see we're running at x16 / x16 via the NF200 chip."


"So let's find out, is running via the Native Intel chip at x8 / x8 faster than running through the higher speed x16 / x16 NF200? Or is it just a bunch of marketing mumbo jumbo?"


"Under 3DMark Vantage we can see that the x8 / x8 setup shows a little extra speed, but we're dealing with quite large numbers here that we probably wouldn't put past fluctuation."

"It's interesting to see under Lost Planet 2 that the x16 / x16 NF200 setup comes out ahead, especially at the lowest resolution. At the highest there's only an FPS between the two setups."

AND THEIR FINAL THOUGHTS WERE:

"So what's faster? You know, there's probably not a clear winner when it comes to overall speed. The better question would be: so what's better? Well, the x8 / x8 setup that ASUS chose to implement seems to be. Yes, it's not always faster, but when we're all about the video card power, it is the faster setup. The times we see the NF200 setup come out ahead are when we're looking at benchmarks with really high FPS. We can see under intensive situations like Aliens vs. Predator and Unigine Heaven, the x8 / x8 via the native P67 chip is the better option. When it all comes down to it, there's little difference between the two setups. The decision for ASUS to go down the x8 / x8 path via the Intel chip instead of the better-looking x16 / x16 NF200 path seems to be the right decision."


So from what I understand, and looking at the pics, there is no major difference to it... as long as they work... I even thought about dropping my 580 and picking up two 560s in SLI... but will read up on that first.


Reply

 I Want to SLI GTX580, need help on how to.




Thread Tools



Similar help and support threads for2: I Want to SLI GTX580, need help on how to.
Thread Forum
Solved GTX 690 4GB + GTX580 3GB(PhysX) Graphic Cards
Need Help with System Interrupts - GTX580 Drivers
Solved ati 5850 vs. GTX580 Graphic Cards
GTX580's in SLI causes BCCode 116 Graphic Cards
Solved XFX ATI Radeon 6970 Vs NVIDIA GTX580 Hardware & Devices
gtx580 vs hd6970 Graphic Cards
Solved Any reason to get GTX580 yet? Graphic Cards

Our Sites

Site Links

About Us

Find Us

Windows 7 Forums is an independent web site and has not been authorized, sponsored, or otherwise approved by Microsoft Corporation. "Windows 7" and related materials are trademarks of Microsoft Corp.

Designer Media Ltd

All times are GMT -5. The time now is 12:58 AM.
Twitter Facebook Google+



Windows 7 Forums

Seven Forums Android App Seven Forums IOS App
  

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33