PPD by GPU - Advice for Dedicated F@H Machine

Aurum
Posts: 296
Joined: Sat Oct 03, 2015 3:15 pm
Location: The Great Basin

Re: PPD by GPU - Advice for Dedicated F@H Machine

Post by Aurum »

SteveWillis wrote:...KVM switch to switch your keyboard, video, and mouse between your primary PC and folding rigs.
I run my rigs headless using TightVNC. Others like Team Viewer. I just move a monitor & keyboard around for those times when you just have to. FAHControl lets me see them all at a glance.
In Science We Trust
Aurum
Posts: 296
Joined: Sat Oct 03, 2015 3:15 pm
Location: The Great Basin

Re: PPD by GPU - Advice for Dedicated F@H Machine

Post by Aurum »

davidcoton wrote: I don't think (from a quick scan) anyone mentioned the CPU requirement for GPU folding.
Officially, as of now, it does require one CPU core per GPU.
HOWEVER, since most of the time is spent in a wait state, I speculate that leaving one core for two or three GPUs will have very little detrimental effect. I don't have a multi-GPU rig to test this with.
foldy wrote:If you want to build one rig with several GPUs keep in mind you need at least pcie x4 speed for each GPU if you use Windows and one CPU core for each GPU to feed it.
In rereading what foldy said, I realize he was referring to the bus, not the CPU cores. I just kept wondering if I should switch to Linux from Win7.
BTW, the fast GPUs all use a CPU core at 100%, and even an HD 7970 will use around 90%.
In Science We Trust
bruce
Posts: 20910
Joined: Thu Nov 29, 2007 10:13 pm
Location: So. Cal.

Re: PPD by GPU - Advice for Dedicated F@H Machine

Post by bruce »

The GPU drivers for AMD/NV, for Windows/Linux, and for OpenCL/CUDA are all different. Most are written to use a spin-wait, although there may be one or two that rely on a wait/interrupt style of resuming processing when the conditions of the WU require it. Having a spin-wait compete with FahCore_a* is a bad idea if you want to use your CPU for FAH.
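The difference between the two waiting styles can be sketched in Python. This is only an illustration of the concept, not actual driver code: a spin-wait polls in a tight loop and pegs a core at 100%, while a wait/interrupt style sleeps until signalled.

```python
import threading

done = threading.Event()

def spin_wait():
    # Spin-wait: poll in a tight loop. Lowest wake-up latency, but the
    # core shows 100% utilisation the whole time -- this is the style
    # that competes with FahCore_a* for CPU.
    while not done.is_set():
        pass

def blocking_wait():
    # Wait/interrupt style: the thread sleeps until the event is
    # signalled, using essentially no CPU, at the cost of some
    # scheduler wake-up latency.
    done.wait()
```

A driver built the first way is why one CPU core per GPU shows up pinned at 100% even though the real work is happening on the card.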

I have an old dual-core machine and I recently added a GPU. The default configuration changed from CPU:2 to CPU:1, but in examining what Windows is using the CPUs for, I discovered there's a task called Host Process for Windows Services which is using 50% of my CPU resources (100% of one CPU). I think this is a newly revised MS task, but I have not researched it yet. It appears in both Win7 and Win10.

Anyway, I've been folding with a CPU:2 allocation even though there was only one mostly-unused CPU; it would have been slightly more efficient to reconfigure for CPU:1. Now that I've added the GPU, both of my CPUs will be busy ... one doing Microsoft's overhead and one doing NVidia's overhead, leaving exactly zero for FahCore_a*.

Has anybody else concluded that FAHClient should now default to CPU:(N-2) when you have one GPU?

----------

I guess the MS programmers have decided that the "Windows Experience" gives us even more reasons to switch to Linux.
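bruce's arithmetic can be written as a small helper: total cores, minus one per GPU for the spin-waiting driver thread, minus a reserve for OS overhead. The `os_reserve` parameter is my own label for the Windows-service core he describes, not an actual FAHClient setting.

```python
def cpu_slot_size(total_cores: int, gpu_count: int, os_reserve: int = 1) -> int:
    """Cores left for the CPU folding slot after reserving one core per
    GPU (for its spin-waiting driver) and os_reserve cores for the OS."""
    return max(0, total_cores - gpu_count - os_reserve)

# bruce's dual-core box with one GPU: nothing left for FahCore_a*
print(cpu_slot_size(2, 1))   # 0
# an 8-core rig with two GPUs would still have 5 cores to fold on
print(cpu_slot_size(8, 2))   # 5
```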
foldy
Posts: 2061
Joined: Sat Dec 01, 2012 3:43 pm
Hardware configuration: Folding@Home Client 7.6.13 (1 GPU slots)
Windows 7 64bit
Intel Core i5 2500k@4Ghz
Nvidia gtx 1080ti driver 441

Re: PPD by GPU - Advice for Dedicated F@H Machine

Post by foldy »

@Aurum: Your mainboard has 5 PCIe x16 slots. If you put 3 GPUs in the 3 fastest slots, they will run at x16/x8/x8. This will give full performance in Windows or Linux.

@bruce: I once had Windows doing background work that used one CPU core. I fixed it by disabling some telemetry service tasks. There is a tool for Windows 10 that can do this and may solve your issue: https://www.oo-software.com/en/shutup10.
FldngForGrandparents
Posts: 70
Joined: Sun Feb 28, 2016 10:06 pm

Re: PPD by GPU - Advice for Dedicated F@H Machine

Post by FldngForGrandparents »

ryoungblood wrote:Depending on the performance of this set-up, I might just wait for the Voltra-based titan in '18
365 days × say 2M PPD = 730M points you could be missing out on. That's a good contribution. There will always be something new coming out to wait on.

Dedicated to my grandparents who have passed away from Alzheimer's

Dedicated folding rig on Linux Mint 19.1:
2 - GTX 980 OC +200
1 - GTX 980 Ti OC +20
4 - GTX 1070 FE OC +200
3 - GTX 1080 OC +140
1 - GTX 1080Ti OC +120
FldngForGrandparents
Posts: 70
Joined: Sun Feb 28, 2016 10:06 pm

Re: PPD by GPU - Advice for Dedicated F@H Machine

Post by FldngForGrandparents »

SteveWillis wrote: I went with an open air mining case off eBay, $62, or around $30 if you build it from scratch; probably not too hard if you are handy and have the tools.
PCIe header cables are $22/GPU, so say $66.
$20 to build your own:
viewtopic.php?f=38&t=28869

Dedicated to my grandparents who have passed away from Alzheimer's

Dedicated folding rig on Linux Mint 19.1:
2 - GTX 980 OC +200
1 - GTX 980 Ti OC +20
4 - GTX 1070 FE OC +200
3 - GTX 1080 OC +140
1 - GTX 1080Ti OC +120
SteveWillis
Posts: 409
Joined: Fri Apr 15, 2016 12:42 am
Hardware configuration: PC 1:
Linux Mint 17.3
three gtx 1080 GPUs One on a powered header
Motherboard = [MB-AM3-AS-SB-990FXR2] qty 1 Asus Sabertooth 990FX(+59.99)
CPU = [CPU-AM3-FX-8320BR] qty 1 AMD FX 8320 Eight Core 3.5GHz(+41.99)

PC2:
Linux Mint 18
Open air case
Motherboard: ASUS Crosshair V Formula-Z AM3+ AMD 990FX SATA 6Gb/s USB 3.0 ATX AMD
AMD FD6300WMHKBOX FX-6300 6-Core Processor Black Edition with Cooler Master Hyper 212 EVO - CPU Cooler with 120mm PWM Fan
three gtx 1080,
one gtx 1080 TI on a powered header

Re: PPD by GPU - Advice for Dedicated F@H Machine

Post by SteveWillis »

Aurum wrote:
SteveWillis wrote:...KVM switch to switch your keyboard, video, and mouse between your primary PC and folding rigs.
I run my rigs headless using TightVNC. Others like Team Viewer. I just move a monitor & keyboard around for those times when you just have to. FAHControl lets me see them all at a glance.
Routinely I remote in to my folding rig from my primary PC but there is no way I'm going to switch cables around whenever I need to access it directly.

1080 and 1080TI GPUs on Linux Mint
ryoungblood
Posts: 39
Joined: Wed Jan 18, 2017 10:13 pm
Location: Boulder, CO

Re: PPD by GPU - Advice for Dedicated F@H Machine

Post by ryoungblood »

So I ended up building a second machine out of an older desktop I had and a third machine from strictly new parts. They've both been running for roughly a week, but I'm averaging fairly low PPD on the second and third machines, roughly 580k PPD per 1070. I have another 1070 putting out 720K+ consistently, and I'd like to get the other four 1070s to the same level. All machines are on the same driver, 372.90, and are set to prioritize performance within the NVIDIA control panel. Does anyone have any recommendations for increasing PPD on the 4 underperforming cards? What is the current recommended driver for NVIDIA on Win10?

I may also be interested in putting Linux on the two additional boxes.

ComputerGenie
Posts: 236
Joined: Mon Dec 12, 2016 4:06 am

Re: PPD by GPU - Advice for Dedicated F@H Machine

Post by ComputerGenie »

ryoungblood wrote:...I'd like to get the other 4 1070 to the same level. All machines are on the same driver 372.90 and are set to prioritize performance within the NVIDIA control panel. Does anyone have any recommendations for increasing PPD on the 4 underperforming cards?...
Are all cards the same brand/model? Are all slots the same (i.e., x16, x4, etc)? Are all cards running the same clocks?
Aurum
Posts: 296
Joined: Sat Oct 03, 2015 3:15 pm
Location: The Great Basin

Re: PPD by GPU - Advice for Dedicated F@H Machine

Post by Aurum »

ryoungblood wrote:I have another 1070 putting out 720K+ consistently...
I've never seen any of my 1070s run that fast. How do you do it?
In Science We Trust
ryoungblood
Posts: 39
Joined: Wed Jan 18, 2017 10:13 pm
Location: Boulder, CO

Re: PPD by GPU - Advice for Dedicated F@H Machine

Post by ryoungblood »

Yes/no.

Turing has 2 x ASUS 1070 Strix, Zuse has 2 x EVGA 1070 SC Black, and CMD has a 1080 FTW Hybrid and an EVGA 1070 SC. All cards are in an x16 slot.

All the 1070 have the same (forced through OC tool) core/memory clock.
Last edited by ryoungblood on Sat Feb 11, 2017 3:31 pm, edited 1 time in total.
ryoungblood
Posts: 39
Joined: Wed Jan 18, 2017 10:13 pm
Location: Boulder, CO

Re: PPD by GPU - Advice for Dedicated F@H Machine

Post by ryoungblood »

Aurum wrote:
ryoungblood wrote:I have another 1070 putting out 720K+ consistently...
I've never seen any of my 1070s run that fast. How do you do it?
Is this not common?

https://docs.google.com/spreadsheets/d/ ... 8U2bf0/pub has a large sample size with 1070 at an average of 700k.

My 1070 running at 720k+ (it runs pretty close in PPD to my 1080 Hybrid) is at a 2012 core clock and a 3893 memory clock. The CMD machine has an i7 6700K, a 512 GB M.2 drive, and 64 GB RAM @ 2800MHz. Could it be a hardware issue? The other two have slower RAM and marginally slower CPUs.
ComputerGenie
Posts: 236
Joined: Mon Dec 12, 2016 4:06 am

Re: PPD by GPU - Advice for Dedicated F@H Machine

Post by ComputerGenie »

Edited to strike misread
ryoungblood wrote:... has a large sample size with 1070 at an average of 700k...
And that sheet is meant to compare cardx against cardx on a specific project, run, clone, and generation.
Example:
ra_alfaomega recorded that his GTX 1070, clocked at 2100, got a 739,740 PPD estimate for Project 10490, R256, C0, G410 (which is a core18 project).
Last edited by ComputerGenie on Sat Feb 11, 2017 3:35 pm, edited 3 times in total.
ryoungblood
Posts: 39
Joined: Wed Jan 18, 2017 10:13 pm
Location: Boulder, CO

Re: PPD by GPU - Advice for Dedicated F@H Machine

Post by ryoungblood »

ComputerGenie wrote:
ryoungblood wrote:Yes/no.

Turing has 2x ASUS 1070 Strix, Zuse has 2x EVGA 1070 SC Black, CMD has 1x 1080 FTW Hybrid and 1x EVGA SC. All cards are in an x16 slot.

All the 1070 have the same (forced through OC tool) core/memory clock.
You're, literally, never going to get a 1070 to run the same on 1x as on 2x.
viewtopic.php?f=38&t=28847

And that sheet is meant to compare cardx against cardx on a specific project, run, clone, and generation.
You misread. When I say 1x, I mean 1 physical GPU unit; 2x is 2 units. All cards are in an x16 slot.
ComputerGenie
Posts: 236
Joined: Mon Dec 12, 2016 4:06 am

Re: PPD by GPU - Advice for Dedicated F@H Machine

Post by ComputerGenie »

Fixed my misread, but that sheet is incorrectly applied by almost everyone who sees it.
Everything on that sheet is an estimated PPD based on an entire day of folding a specific WU (which will never happen in real-world usage).
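For reference, those estimates extrapolate a single WU's time per frame to a full day under FAH's quick-return bonus (QRB), where credit scales with the square root of how far ahead of the timeout the WU is returned. A rough sketch of that extrapolation; the base credit, k factor, timeout, and TPF below are made-up illustrative numbers, not any real project's constants:

```python
import math

def estimated_ppd(base_credit, k, deadline_days, tpf_seconds, frames=100):
    """Estimated points per day under the quick-return bonus.

    base_credit, k, and deadline_days are per-project constants;
    tpf_seconds is the observed time per frame.
    """
    wu_days = tpf_seconds * frames / 86400.0            # days per WU
    bonus = max(1.0, math.sqrt(k * deadline_days / wu_days))
    return base_credit * bonus / wu_days                # credit/WU * WUs/day

# hypothetical project: 9405 base points, k = 0.75, 3-day timeout,
# folding at 2 minutes per frame
print(round(estimated_ppd(9405, 0.75, 3, 120)))         # roughly 270k PPD
```

Because a real rig loses time to uploads, downloads, and pauses, the actual daily total always lands below this idealized figure, which is ComputerGenie's point.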