I Want to SLI GTX580, need help on how to.


  1. Posts : 1,438
    64bit Windows 10
       #1

    I Want to SLI GTX580, need help on how to.


    Firstly, this may be a stupid question for 90% of the people here, but I honestly don't know, so please don't judge me on this.

    Now I would like to SLI two Nvidia GTX 580s. They will both be stock cards of the same make. My question is: how do you SLI them? More precisely, can I put them in different slots on the motherboard so there is a gap between them, i.e. the first card in the first PCI-E slot and the second in the third, so they are not touching each other?

    Secondly, will my motherboard be able to handle that first-and-third slot arrangement I just mentioned, and how many expansion slots does your case need for it?

    And lastly, will there be a lot more heat when running SLI, and will I need a lot of power for them?

    PS: my new AX1200 PSU from Corsair arrives tomorrow.


  2. Posts : 2,606
    Windows 7 Pro X64 SP1
       #2

    You may find this site worth a look:

    FAQ

    The motherboard must have two PCI-E X16 slots, spaced widely enough to take the two GTX580s. You'll also need an SLI bridge that's the right length.

    A 1200W power supply is, er, adequate. There's a listing of SLI-certified PSUs at the same web site; one Corsair 1200W model is certified for three 580s.

    If you max out the 580 cards, they'll each draw roughly 300W. The fan noise from the cards will be fairly loud at that level, if they're anything like my GTX 480.
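    As a sanity check on those numbers, here's a back-of-the-envelope sketch. The 300W per-card figure comes from the post above; the rest-of-system figure is an assumption for illustration, not a measurement:

```python
# Rough PSU headroom estimate for two GTX 580s on an AX1200.
GPU_PEAK_W = 300         # approximate per-card draw for a maxed-out GTX 580
NUM_GPUS = 2
REST_OF_SYSTEM_W = 250   # assumed: CPU, motherboard, drives, fans
PSU_W = 1200             # Corsair AX1200

total_draw = GPU_PEAK_W * NUM_GPUS + REST_OF_SYSTEM_W
headroom = PSU_W - total_draw
print(f"Estimated draw: {total_draw} W, PSU headroom: {headroom} W")
```

    Even with generous assumptions, a 1200W unit leaves plenty of margin for two cards.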

    If you wish to exercise the cards, I suggest Furmark:

    FurMark: VGA Stress Test, Graphics Card and GPU Stability Test, Burn-in Test, OpenGL Benchmark and GPU Temperature | oZone3D.Net

    It's difficult to find anything that brutalizes graphics cards more than Furmark.


  3. Posts : 1,438
    64bit Windows 10
    Thread Starter
       #3

    Cool, thanks, I will check whether my motherboard is wide enough for that PCI-E separation.


  4. Posts : 1,438
    64bit Windows 10
    Thread Starter
       #4

    It seems my mobo's slots are: first x16, second x16, third x8, and fourth x8. So will it be a problem if I put the second card in an x8 slot? And what about heat and so on?


  5. Posts : 2,606
    Windows 7 Pro X64 SP1
       #5

    If the Maximus IV Extreme-Z in your system list is the one you're using, the manual (ASUSTeK Computer Inc. -Support- Drivers and Download Maximus IV Extreme-Z) lists the preferred configuration on p. 21-15. The recommended slots are PCI-E x16 #1 and #3, which appear to be the ones driven by the Intel controller. (They'll run at x8 with two cards in place.) Apparently this is preferable to using the NF200 switch.

    Heat isn't that big an issue, as the GTX 580s exhaust outside the case. You want to make sure there's enough airflow in the case that the inlets to the cards see air that isn't too hot.


  6. Posts : 1,438
    64bit Windows 10
    Thread Starter
       #6

    Bobkn, you seem to be the only one answering, so thanks for that. I would rep you again, but can only do it once. That's all I needed to know, so I will get another 580 and run them in slots 1 and 3. I got an email about the new Mars II from ASUS, which will be a super-clocked GTX 590. I was thinking of getting that, but most reviews say SLI 580s are faster... SLI 590s would be the ultimate... but since I already have a 580 I think it would be easier to SLI two of them. And yeah, everything in my system specs is what I am going to use.


  7. Posts : 4,517
    Windows 7 Home Premium 64bit
       #7

    Do not worry about the GPUs running in x8 mode each; the difference between x16/x16 and x8/x8 is very, very small.

    This may come out confusing, but I'll try.

    On P67/Z68 boards, the Sandy Bridge CPU only has 16 PCI Express lanes. So for two cards there can only be 16 lanes going to the CPU, meaning for two GPUs it must split that bandwidth as x8/x8.
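    To put numbers on that split, here's a quick sketch. It assumes the usual PCIe 2.0 figures: 5 GT/s per lane with 8b/10b encoding, giving roughly 500 MB/s of usable bandwidth per lane per direction:

```python
# Usable one-direction bandwidth of a PCIe 2.0 link, in GB/s.
# 5 GT/s per lane, 8b/10b encoding -> ~500 MB/s usable per lane.
MB_PER_LANE = 500

def pcie2_bandwidth_gbs(lanes):
    return lanes * MB_PER_LANE / 1000

print(pcie2_bandwidth_gbs(16))  # one card at x16: 8.0 GB/s
print(pcie2_bandwidth_gbs(8))   # each card at x8/x8: 4.0 GB/s
```

    Each card at x8 still gets around 4 GB/s each way, which is more than a GTX 580 typically saturates in games, which is why the x8/x8 penalty is so small in practice.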


    Your best bet is to keep the two 580s spaced out as much as you can.


    Some higher-end boards make use of the NF200 chip (which your Maximus should have), which makes running 3-way SLI possible.


    Every board implements this chip differently. But if I recall correctly, the Maximus bypasses it if only one or two cards are being used, because x8/x8 directly to the CPU produces better results overall than going through the NF200. If you run three GPUs, you have no choice but to engage the NF200 chip.



    In some setups, the NF200 will take x8 of bandwidth from the CPU, meaning the first GPU runs at x8 off the CPU. Additional cards can run at either x16 or x8 to the NF200, but only have a total of x8 from the NF200 to the CPU. So for two cards running off the NF200 it would be something like:
    Card 1: x8 directly to the CPU.
    Card 2: x8 or x16 to the NF200 chip --> x8 to the CPU.

    Total: x16 of bandwidth for two GPUs to the CPU's PCIe controller.
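    The two topologies above can be sketched as lane counts. This is purely an illustrative model of the description in this post, not board documentation:

```python
# Illustrative lane-count model of the two topologies described above.

def direct_split(num_cards, cpu_lanes=16):
    """All cards share the CPU's lanes evenly (x8/x8 for two cards)."""
    return {f"card{i + 1}": cpu_lanes // num_cards for i in range(num_cards)}

def via_nf200():
    """One card direct to the CPU at x8; the second hangs off the NF200,
    which itself has only an x8 uplink back to the CPU."""
    return {
        "card1_to_cpu": 8,
        "card2_to_nf200": 16,      # the slot may negotiate x16...
        "nf200_uplink_to_cpu": 8,  # ...but the uplink is still x8
    }

print(direct_split(2))  # two cards at x8 each, no extra hop
print(via_nf200())      # same total CPU bandwidth, plus NF200 latency
```

    Either way the CPU's PCIe controller only ever sees 16 lanes of traffic; the NF200 path just adds a switch hop.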



    So you are better off just splitting the lanes to the CPU directly and avoiding the added latency the NF200 brings, if you can.

    In the end, it really doesn't cause that big of a performance hit either way. Not that you'll notice.


  8. Posts : 12,364
    8 Pro x64
       #8

    Alsisgevat said:
    ...but SLI 590s would be the ultimate...
    And you wouldn't use it all. Not with a single monitor.


    Wishmaster said:

    In the end, it really doesn't cause that big of a performance hit either way. Not that you'll notice.
    Not at 1920x1080 at any rate.


  9. Posts : 1,438
    64bit Windows 10
    Thread Starter
       #9

    Wishmaster said:
    Do not worry about the GPUs running in x8 mode each; the difference between x16/x16 and x8/x8 is very, very small.

    This may come out confusing, but I'll try.

    On P67/Z68 boards, the Sandy Bridge CPU only has 16 PCI Express lanes. So for two cards there can only be 16 lanes going to the CPU, meaning for two GPUs it must split that bandwidth as x8/x8.

    Your best bet is to keep the two 580s spaced out as much as you can.

    Some higher-end boards make use of the NF200 chip (which your Maximus should have), which makes running 3-way SLI possible.

    Every board implements this chip differently. But if I recall correctly, the Maximus bypasses it if only one or two cards are being used, because x8/x8 directly to the CPU produces better results overall than going through the NF200. If you run three GPUs, you have no choice but to engage the NF200 chip.

    In some setups, the NF200 will take x8 of bandwidth from the CPU, meaning the first GPU runs at x8 off the CPU. Additional cards can run at either x16 or x8 to the NF200, but only have a total of x8 from the NF200 to the CPU. So for two cards running off the NF200 it would be something like:
    Card 1: x8 directly to the CPU.
    Card 2: x8 or x16 to the NF200 chip --> x8 to the CPU.

    Total: x16 of bandwidth for two GPUs to the CPU's PCIe controller.

    So you are better off just splitting the lanes to the CPU directly and avoiding the added latency the NF200 brings, if you can.

    In the end, it really doesn't cause that big of a performance hit either way. Not that you'll notice.

    Wow, that is a lot to think about... okay, now I have a major new bunch of questions; too bad I marked this as solved. Since the ASUS Maximus IV Extreme-Z does support the NF200, I went on Google and typed in the board name and NF200 to see what I'd get (first time I've ever heard of the NF200).

    Now in my next post I will add pieces from the website, and it would be cool to see your thoughts on this, please.


  10. Posts : 1,438
    64bit Windows 10
    Thread Starter
       #10

    "The specifications for multi-GPU setups on the Maximus IV Extreme(-Z) boards are 4 x PCIe 2.0 x16 (x16, or dual x8, or x8/x16/x16). So in other words, if you're using one video card, it runs at x16; if you use two, it's x8/x8; and if you run three, it's x8/x16/x16. What people asked was: why does ASUS, who offer the NF200 chip on these boards, opt for an x8/x8 setup in a dual-card situation, when any other company with a dual-GPU setup would make use of the NF200 chip and go for an x16/x16 setup?"

    "Well, ASUS say the way they do it is faster. Huh?! x8/x8 is slower than x16/x16, though? Yes, but because the NF200 chip piggybacks onto the Intel chipset, they say a certain amount of lag is added because another chip is being thrown into the mix. A good way to look at it is two runners in a 100m race. The x8/x8 Intel runner is slower in straight-out speed, but the x16/x16 NF200 runner has hurdles in his lane, and while he can run faster, because he has to jump hurdles (i.e. go through the extra NF200 chip) he's actually a little slower overall."


    "Sliding our cards into Slot 1 and Slot 3 gives us what ASUS recommend: our setup running at x8/x8 via the 'native' Intel P67 chip. Once we've tested with that setup, we put our cards in Slot 2 and Slot 4, the setup that ASUS don't recommend, and which according to them will run slower. Looking in the BIOS again, you can now see we're running at x16/x16 via the NF200 chip."


    "So let's find out: is running via the native Intel chip at x8/x8 faster than running through the higher-speed x16/x16 NF200? Or is it just a bunch of marketing mumbo jumbo?"


    "Under 3DMark Vantage we can see that the x8 / x8 shows a little extra speed, but we're dealing with quite large numbers here that we probably wouldn't put past fluctuation."

    "It's interesting to see under Lost Planet 2 that the x16/x16 NF200 setup comes out ahead, especially at the lowest resolution. At the highest, there's only about one FPS between the two setups."

    And their final thoughts were:

    "So what's faster? You know, there's probably not a clear winner when it comes to overall speed. The better question would be: what's better? Well, the x8/x8 setup that ASUS chose to implement seems to be. Yes, it's not always faster, but when we're all about video card power, it is the faster setup. The times we see the NF200 setup come out ahead are when we're looking at benchmarks with really high FPS. We can see that under intensive situations like Aliens vs. Predator and Unigine Heaven, x8/x8 via the native P67 chip is the better option. When it all comes down to it, there's little difference between the two setups. The decision for ASUS to go down the x8/x8 path via the Intel chip instead of the better-looking x16/x16 NF200 path seems to be the right one."


    So from what I understand, and looking at the pics, there is no major difference... as long as they work. I even thought about dropping my 580 and picking up two 560s in SLI... but I will read up on that first.
    Attached Thumbnail: NVIDIA NF200 x16/x16 vs. Intel x8/x8 P67 performance analysis chart


 
Windows 7 Forums is an independent web site and has not been authorized, sponsored, or otherwise approved by Microsoft Corporation. "Windows 7" and related materials are trademarks of Microsoft Corp.

© Designer Media Ltd