Building a dedicated rig with 3 GPUs

A forum for discussing FAH-related hardware choices and info on actual products (not speculation).

Moderator: Site Moderators

Forum rules
Please read the forum rules before posting.
markdotgooley
Posts: 101
Joined: Tue Apr 21, 2020 11:46 am

Building a dedicated rig with 3 GPUs

Post by markdotgooley »

This is largely idle speculation, since hardware (and, for many people, cash) isn't in good supply right now, but suppose one wanted to build a dedicated machine with three fairly high-end GPUs and get the most out of them. What CPU and motherboard would suffice?

Right now I have an Ubuntu Linux box with a vanilla RTX 2060 (TU106) and an RTX 2060 KO (TU104) on an MSI B450 Gaming Plus Max with (probably excessive) 16 GB of memory and a Ryzen 5 1600 AF (6 cores). The BIOS setup I've chosen has the 2060 KO on the faster PCIe 3.0 connector running at x8, and the regular 2060 on the slower PCIe 2.0 connector running at x4: that seems to be all this motherboard can manage. GPU utilization on the KO is usually over 95%; on the regular 2060 it's usually rather lower, but occasionally as high as 98%. The slower PCIe connection does seem to reduce performance on the second card at times, but not as much as when I had it running for a while at x1 (GPU utilization was then often 75% or less). Probably PCIe 2.0 at x4 usually provides enough bandwidth.

I'm guessing that the more powerful a GPU is, the more a low-bandwidth PCIe connection will tend to throttle it. So maybe an x1 connection won't cripple a not-too-powerful card mining Ethereum or the like, but Folding@home seems to need a fair amount of bandwidth between even a somewhat fast GPU and the CPU. PCIe 3.0 at x8 seems to be plenty for the 2060 KO; perhaps a more powerful card would need more bandwidth to stay at high utilization.
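If you want to see what link each card has actually negotiated, nvidia-smi can report it directly. A quick sketch (the `pcie.link.*` query fields are standard nvidia-smi ones; the link often downshifts at idle, so read it while the cards are folding):

```shell
# Report each GPU's current PCIe generation and lane width.
# Check while under load: the link downshifts to save power at idle.
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi --query-gpu=index,name,pcie.link.gen.current,pcie.link.width.current \
               --format=csv,noheader
fi
```

That makes it easy to confirm the x8/x4 split the BIOS is giving you without rebooting into setup.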

So what sort of CPU/motherboard combination would get the most out of three fairly powerful (RTX 2060 or better) GPU cards, assuming that I'm right about bandwidth? (I might well be mistaken, I admit: I'm just judging by my limited experience.)
MeeLee
Posts: 1375
Joined: Tue Feb 19, 2019 10:16 pm

Re: Building a dedicated rig with 3 GPUs

Post by MeeLee »

My experience with Core 22 is that a PCIe 3.0 x4 slot is okay even for an RTX 2080 Ti in Linux, but an x8 slot is preferred if you're running Windows.
For your setup (RTX 2060 and 2060 KO), most budget motherboards out nowadays will suffice, as almost all of them have a PCIe x16 slot plus either a second x16 slot at x4 speed or higher, and/or an M.2 slot at x4 speed.
The CPU you have is good. Disable SMT (hyperthreading) for slightly better CPU-to-GPU feeding, but even with SMT enabled it's still a very good CPU that in theory should be able to feed a 2080 Ti, and perhaps even GPUs some 25% faster, over PCIe 3.0.
If you're not into CPU folding, you might as well enable only 2 or 4 CPU cores in the BIOS; that saves a few watts.
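On Linux you can at least check the SMT state without a trip into the BIOS. A sketch (the sysfs control file exists on reasonably recent kernels; your distro's paths may vary):

```shell
# "Thread(s) per core: 2" means SMT/hyperthreading is currently on.
lscpu | grep -E 'Thread\(s\) per core|Core\(s\) per socket'

# On recent kernels SMT can even be toggled at runtime (write "on" to revert):
#   echo off | sudo tee /sys/devices/system/cpu/smt/control
```

Handy for A/B-testing whether SMT-off actually helps your PPD before committing to a BIOS change.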

If you plan on upgrading, I'd say don't.
Instead, I'd probably recommend buying an entirely new system.
The CPU is good, but built on 12nm. 7nm is the norm now for AMD, and 7nm++ will be by the end of the year.
By the end of next year, 5nm should be trickling in, which will be great if you're just planning on folding on a system.
A 5nm 2C/4T part could run below 30 watts at full load, and will probably run around 4 GHz with 5 GHz boost frequencies.
These CPUs could feed four GPUs easily.
It would also be great if they support PCIe 4.0, which we hope both AMD and Nvidia will release GPUs for by the end of this year.
PCIe 4.0 isn't necessarily needed for its speed, but it'll let you run a very fast GPU (faster than a 2080 Ti) on an x4 slot in Windows, and possibly an x1 slot in Linux.

I don't think it's wise to upgrade right now.
By the end of the year, upgrading the GPUs, motherboard, and CPU to a 7nm++ part might be wise;
however, if you can wait until the end of next year, when 5nm CPUs come out, it might be an even better time.
JimboPalmer
Posts: 2573
Joined: Mon Feb 16, 2009 4:12 am
Location: Greenwood MS USA

Re: Building a dedicated rig with 3 GPUs

Post by JimboPalmer »

"1 x PCIe 3.0 x16 slot (PCI_E1)
1st, 2nd and 3rd Gen AMD Ryzen™ support x16 mode
Ryzen™ with Radeon™ Vega Graphics and 2nd Gen AMD Ryzen™ with Radeon™ Graphics support x8 mode
Athlon™ with Radeon™ Vega Graphics support x4 mode
1 x PCIe 2.0 x16 slot (PCI_E4, supports x4 mode)
4 x PCIe 2.0 x1 slots

PCI_E4 will run x1 speed when installing devices in PCI_E2/ PCI_E3/ PCI_E5 slot." - https://www.msi.com/Motherboard/B450-GA ... cification

So you need a 1st, 2nd, or 3rd Gen AMD Ryzen just to get one GPU to x16. (Third is obviously better; it needs at least 2 threads, but that is easy.)
You also have a slot (E4) that is physically x16 but has x4 performance if the other slots are empty.
If you use any x1 slots, then E4 drops to x1 as well.

So do NOT use an APU with internal graphics, put your best card in E1, your second best card in E4 and any other cards in some other PC.

It is possible that the 1600 AF only supports 16 lanes in total. So you are at x8 and x4.
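You can verify the lane split from a running system rather than guessing. A sketch using standard pciutils (root is needed on some systems to read full link status; LnkCap is what the card supports, LnkSta is what it actually negotiated):

```shell
# Compare each NVIDIA card's supported link (LnkCap) with the one it
# actually negotiated (LnkSta), e.g. "Width x8" vs "Width x4".
if command -v lspci >/dev/null 2>&1; then
    for dev in $(lspci | awk '/NVIDIA/ {print $1}'); do
        echo "== $dev =="
        sudo lspci -vv -s "$dev" | grep -E 'LnkCap:|LnkSta:'
    done
fi
```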
Tsar of all the Rushers
I tried to remain childlike, all I achieved was childish.
A friend to those who want no friends
kiore
Posts: 931
Joined: Fri Jan 16, 2009 5:45 pm
Location: USA

Re: Building a dedicated rig with 3 GPUs

Post by kiore »

If you want to speculate, a motherboard with three x16 slots would be nice, or at least one x16 and two x8, but these kinds of boards tend to be premium priced.
i7 7800x RTX 3070 OS= win10. AMD 3700x RTX 2080ti OS= win10 .

Team page: http://www.rationalskepticism.org
markdotgooley
Posts: 101
Joined: Tue Apr 21, 2020 11:46 am

Re: Building a dedicated rig with 3 GPUs

Post by markdotgooley »

On reflection I think it’s an especially bad time to upgrade. Lots of parts are in short supply, and new hardware is due in September. Both 2060s are running a bit hot (usually near 80C for the one mounted higher in the case, in the better PCIe slot), and if I could get something like a hybrid-cooled 2080 Super at a good price I might replace one. Not likely to happen. The case or its fans or something just isn’t keeping things especially cool, but nothing seems as if it’s about to fail.

Need to be patient... and do my research.
Ichbin3
Posts: 96
Joined: Thu May 28, 2020 8:06 am
Hardware configuration: MSI H81M, G3240, RTX 2080Ti_Rev-A@220W, Ubuntu 18.04
Location: Germany

Re: Building a dedicated rig with 3 GPUs

Post by Ichbin3 »

kiore wrote:If you want to speculate, a motherboard with three x16 slots would be nice, or at least one x16 and two x8, but these kinds of boards tend to be premium priced.
Here I found some affordable boards:
https://www.idealo.de/preisvergleich/Pr ... y=minPrice
MSI H81M, G3240, RTX 2080Ti_Rev-A@220W, Ubuntu 18.04
ajm
Posts: 754
Joined: Sat Mar 21, 2020 5:22 am
Location: Lucerne, Switzerland

Re: Building a dedicated rig with 3 GPUs

Post by ajm »

As for the motherboard, I think I would look for a used server MB with dual Xeons. You can find them for under $200, including both CPUs, on eBay.
This one for example https://www.supermicro.com/products/mot ... -LN4F_.cfm is close to perfect, with four x16 PCIe slots. You just need a large chassis.

Here on ebay: https://www.ebay.com/itm/Supermicro-Ser ... cf15f01599
v00d00
Posts: 396
Joined: Sun Dec 02, 2007 4:53 am
Hardware configuration: FX8320e (6 cores enabled) @ stock,
- 16GB DDR3,
- Zotac GTX 1050Ti @ Stock.
- Gigabyte GTX 970 @ Stock
Debian 9.

Running GPU since it came out, CPU since client version 3.
Folding since Folding began (~2000) and ran Genome@Home for a while too.
Ran Seti@Home prior to that.
Location: UK
Contact:

Re: Building a dedicated rig with 3 GPUs

Post by v00d00 »

I would probably get all the parts and install them on a wooden board if you have issues with heat. Throw a fine net over it to keep the dust out. Then concentrate your cooling efforts on the room it's in, not the actual system itself.

As for the system itself, I wouldn't buy a brand-new system. I would look on eBay for an older Intel system: Skylake/Coffee Lake with an i3, some cheap RAM, and a cheap SSD. Aim to spend as little as possible on the board/CPU/RAM, and you obviously want a board that has at least 2 full-size PCIe 3.0 slots. A Z170 SLI board from Gigabyte or MSI, an i3 6100, 4 GB of RAM, a cheap SSD (16 GB+), and a decent PSU that can drive two high-end cards. Run it headless using ssh.
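Once it's headless, checking on it from another machine is a one-liner. A sketch (`fold-rig` is a placeholder hostname; the query fields are standard nvidia-smi ones):

```shell
# One-shot health check of a headless folding box over ssh.
# "fold-rig" is a placeholder; substitute your own host.
HOST=fold-rig
check_rig() {
    ssh "$HOST" 'nvidia-smi --query-gpu=index,utilization.gpu,temperature.gpu,power.draw --format=csv'
}
# Or watch it live, refreshing every 5 seconds:
#   ssh -t "$HOST" 'watch -n 5 nvidia-smi'
```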

That's the way I would do it, and have done it in the past.
bruce
Posts: 20910
Joined: Thu Nov 29, 2007 10:13 pm
Location: So. Cal.

Re: Building a dedicated rig with 3 GPUs

Post by bruce »

v00d00 wrote:Throw a fine net over it to keep the dust out.
When I was a kid we used to have a fine net, preinstalled on a collapsible frame, designed to keep the flies off our food when we went on a picnic. It wasn't big enough for a good-sized M/B, but it would be ideal for a small one... if they still make such a thing.
JimboPalmer
Posts: 2573
Joined: Mon Feb 16, 2009 4:12 am
Location: Greenwood MS USA

Re: Building a dedicated rig with 3 GPUs

Post by JimboPalmer »

bruce, like this?

https://www.amazon.com/Lauon-Umbrella-O ... B07W81XXLQ
Tsar of all the Rushers
I tried to remain childlike, all I achieved was childish.
A friend to those who want no friends
MeeLee
Posts: 1375
Joined: Tue Feb 19, 2019 10:16 pm

Re: Building a dedicated rig with 3 GPUs

Post by MeeLee »

markdotgooley wrote:On reflection I think it’s an especially bad time to upgrade. Lots of parts are in short supply, and new hardware is due in September. Both 2060s are running a bit hot (usually near 80C for the one mounted higher in the case, in the better PCIe slot), and if I could get something like a hybrid-cooled 2080 Super at a good price I might replace one. Not likely to happen. The case or its fans or something just isn’t keeping things especially cool, but nothing seems as if it’s about to fail.

Need to be patient... and do my research.
My experience with the RTX 2060 (I've had five, across three different models; two were identical) is that the single-fan designs don't have sufficient cooling.
Dual fans work fine.
I do cap their power to 125W, because there's no real reason they should be running at 170W to begin with.
A 2080 Super is guaranteed to run even hotter.
I also never use auto fan control, as on most GPUs it's tuned to be as low-noise as possible, not to keep your card cool.
The fans usually only ramp up once the GPU reaches the high 70s or 80s C.
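On Linux both of these are scriptable. A sketch (125W and the GPU/fan indices are examples, not recommendations for every card; check your card's allowed range with `nvidia-smi -q -d POWER` first):

```shell
# Cap each NVIDIA GPU at 125 W (value and indices are examples).
if command -v nvidia-smi >/dev/null 2>&1; then
    sudo nvidia-smi -pm 1          # persistence mode, so the limit sticks
    sudo nvidia-smi -i 0 -pl 125   # GPU 0
    sudo nvidia-smi -i 1 -pl 125   # GPU 1
fi

# Override the quiet default fan curve with a fixed 80% speed
# (needs a running X server; gpu/fan indices are examples):
#   nvidia-settings -a '[gpu:0]/GPUFanControlState=1' \
#                   -a '[fan:0]/GPUTargetFanSpeed=80'
```

The power limit resets on reboot, so on a dedicated rig it's worth putting in a startup script.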

For folding, you'd preferably run an open bench, have plenty of fans in your case, or open the case's side panel, for improved cooling.
Regular PCs won't offer sufficient cooling for folding on a dual-GPU system at stock power settings.
bruce
Posts: 20910
Joined: Thu Nov 29, 2007 10:13 pm
Location: So. Cal.

Re: Building a dedicated rig with 3 GPUs

Post by bruce »

Close enough. @Jimbo.
v00d00
Posts: 396
Joined: Sun Dec 02, 2007 4:53 am
Hardware configuration: FX8320e (6 cores enabled) @ stock,
- 16GB DDR3,
- Zotac GTX 1050Ti @ Stock.
- Gigabyte GTX 970 @ Stock
Debian 9.

Running GPU since it came out, CPU since client version 3.
Folding since Folding began (~2000) and ran Genome@Home for a while too.
Ran Seti@Home prior to that.
Location: UK
Contact:

Re: Building a dedicated rig with 3 GPUs

Post by v00d00 »

I was kind of thinking something like this.

https://www.amazon.com/Yikai-Stainless- ... B082L5NGG6

Then build a wooden framework out of cheap wood, glue the mesh to it, and put it over the top of your folder like you would with the umbrella types. As long as the air can circulate freely and any heat can escape, you should be fine.

Another possibility is to create a tower from whatever material you choose or have to hand (for me, a wooden frame with Perspex or similar acrylic sheeting is ideal), mount the board(s) vertically, leave an air gap at the bottom and another at the top, and seal everything else. The heat rising in the tower and out of the top creates a passive loop that draws cool air in from floor level. I did that in a limited way long ago when I used to fold on AMD X2s: I had two boards on opposing sides of a homebuilt Perspex tower that used convection to passively get rid of the hot air. It worked really well for moving heat away from the boards, helped by the CPU fans that blew the air upwards within the tower/column. It also made a brilliant space heater in winter. :)
Mxyzptlk
Posts: 72
Joined: Wed Apr 08, 2020 8:55 pm
Hardware configuration: Lots... Look at my website: www.mxyzptlk.us
Location: California
Contact:

Re: Building a dedicated rig with 3 GPUs

Post by Mxyzptlk »

With your goal in mind, I went a different route from what others have suggested. However, I also wanted to do CPU folding to maximize my folding commitment.
- Gigabyte X570 Aorus Master with PCIe 4.0 (yup, a bit overkill, but I wanted headroom for future expansion of CPUs and GPUs)
- AMD Ryzen 9 3900X - planning on moving this into another computer once the AMD 4000 series comes out and is readily available.
- Phanteks P400A - there are a million computer cases, but you need to find one that is all about cooling, with a full front mesh to allow plenty of air to enter. (It will probably need to be a full tower to allow for the third GPU.)
- EVGA CLC280 AIO - this is keeping the 3900X nice and cool with no overclocking (I also turned the standard 'boost' off in the BIOS) and it's folding 24/7. (It actually does ~4-5x the PPD of my 2700X at 40W less power.)
- For now I have a single EVGA 2070 Super in the case; as money and supply allow, I'll add another GPU. (I'll probably wait until this fall, when Nvidia is supposed to release the new 3000 series of cards...)

On another note, my first system (and everyday computer at home) is running a 2060 Super in the upper GPU slot and a 2060 in the lower slot (both EVGAs with twin fans). The upper card runs at 64C and the lower at 52C. I have them both power-limited using MSI Afterburner. I discovered that this particular case (Be Quiet 600) is terrible at airflow, so I cut a hole in the front of it and fitted one of the magnetic 120 x 280 mesh screens for air filtering. Now the two 140mm Noctua case fans aren't starving for air and the system runs much cooler.
I fold..... look at my folding setups here: www.mxyzptlk.us
markdotgooley
Posts: 101
Joined: Tue Apr 21, 2020 11:46 am

Re: Building a dedicated rig with 3 GPUs

Post by markdotgooley »

How do I power-limit Nvidia GPUs on Ubuntu Linux? The "NVIDIA X Server Settings" app shows that both my 2060s are being limited a bit automagically (graphics clocks about 300 MHz lower than maximum, memory transfer rate about 400 MHz lower). They're running hot, one usually near 70C and the other near 80C. It's a Phanteks Enthoo Pro full-tower case with extra fans added and most mounting cages removed to try to get more airflow from the 200mm front fan. There's no shroud for the 750W power supply because the case doesn't come with one. So I've got this big near-empty box (it can hold an eATX motherboard) with one empty PCIe x1 slot between two scalding-hot cards. The Phanteks extension cable and mounting bracket blocks almost all the PCIe slots, so it's useless for relocating one GPU well away from the other and the PSU, and I haven't seen any useful products that would let me do that...
Post Reply