
GPU folding

Posted: Tue Dec 11, 2018 4:06 am
by ProDigit
If I were to install a gaming graphics card like the NVidia GTX 1050, does the folding app fold using both the CPU and the GPU, or only the GPU?

Re: GPU folding

Posted: Tue Dec 11, 2018 5:04 am
by JimboPalmer
Both.

The graphics card will reserve one CPU thread to feed it data, so if you have a single-threaded CPU, no threads will be left for CPU folding.

Otherwise, expect the CPU to earn about 1/20 the number of points that the GPU does.

Re: GPU folding

Posted: Tue Dec 11, 2018 4:29 pm
by ProDigit
I just bought a Xeon system with 12 cores, 24 threads, and the 1050 graphics card. Will my CPU be nearly idle, just feeding the graphics card data, or will it aid the graphics card?
I can't imagine a Xeon CPU getting only 1/10th the results?

Re: GPU folding

Posted: Wed Dec 12, 2018 4:51 am
by JimboPalmer
The graphics card will need one thread, leaving you 23.

There is a software 'feature' that hates thread counts with large prime factors. 23 is prime; 22 is 2 × 11, and 11 is prime; 21 is 3 × 7, and 7 is prime; but 20 is 2 × 2 × 5, and those factors are all small. So you want to set the CPU slot to 20.
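The rule of thumb above (drop down from your free thread count until the largest prime factor is small) can be sketched as follows. This is a hypothetical illustration of the selection logic, not code from the folding client; the cutoff of 5 for the "largest acceptable prime factor" is an assumption based on the examples in this post.

```python
def largest_prime_factor(n: int) -> int:
    """Return the largest prime factor of n (n >= 2) by trial division."""
    factor = 1
    d = 2
    while d * d <= n:
        while n % d == 0:
            factor = d
            n //= d
        d += 1
    if n > 1:          # whatever remains is itself prime
        factor = n
    return factor

def pick_cpu_threads(available: int, max_prime: int = 5) -> int:
    """Largest count <= available whose prime factors are all <= max_prime."""
    for n in range(available, 1, -1):
        if largest_prime_factor(n) <= max_prime:
            return n
    return 1
```

Starting from 23 free threads, the sketch rejects 23 (prime), 22 (factor 11), and 21 (factor 7), and settles on 20 = 2 × 2 × 5, matching the recommendation above.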

So: one GPU doing about 70,000 PPD, and 20 CPU threads doing about 40,000 PPD (my guess). The GPU will make more points than the CPU, but not a lot more.

Re: GPU folding

Posted: Fri Dec 14, 2018 10:05 am
by ProDigit
A PCIe 1.0 x1 slot transfers data at 250 MB/s in each direction.
This should be plenty for folding apps, albeit at speeds closer to integrated graphics.
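The 250 MB/s figure for a Gen1 x1 slot falls out of the raw transfer rate and the line encoding. A minimal sketch of that arithmetic, assuming the standard PCIe 1.0 parameters (2.5 GT/s per lane, 8b/10b encoding):

```python
# PCIe 1.0 per-lane parameters (Gen1/Gen2 use 8b/10b line encoding,
# so only 8 of every 10 transferred bits carry payload).
GT_PER_S = 2.5e9        # 2.5 gigatransfers per second per lane
ENCODING = 8 / 10       # usable fraction after 8b/10b overhead

def lane_bandwidth_mb_s(gt_per_s: float = GT_PER_S,
                        encoding: float = ENCODING) -> float:
    """Usable one-direction bandwidth of a single lane, in MB/s."""
    bits_per_s = gt_per_s * encoding   # payload bits per second
    return bits_per_s / 8 / 1e6       # -> bytes/s -> MB/s
```

With those parameters the function returns 250.0, i.e. 250 MB/s per direction for an x1 slot; wider slots scale linearly with the lane count.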

As far as the GPU goes, I don't know how it'll work in the browser, but it appears the client allows me to assign a certain number of CPU cores, and I presume that when I install a GPU, it'll also automatically add the GPU to the list?
It'd be a pity if 11 cores / 11 threads of the CPU sat idle.
If that were the case, I'd probably do some bitcoin mining on the side.
I suppose, though, that CPU + GPU folding simultaneously should be possible.

Re: GPU folding

Posted: Fri Dec 14, 2018 1:44 pm
by bruce
Your GPU provides a large number of "shaders" which can compute in parallel, once the data dependencies are unraveled. Your CPU can calculate up to 12 streams of calculations in parallel. My GTX 1080Ti can process up to 3584 streams of calculations. You cannot assign the number of GPU shaders ... that's done by the process that runs in the CPU's main RAM when it composes the kernel.

See my other post here: viewtopic.php?f=38&t=31189&p=304120#p304120