Low PPD on some WUs

A forum for discussing FAH-related hardware choices and info on actual products (not speculation).

Bryman
Posts: 95
Joined: Mon Mar 14, 2016 12:01 pm

Re: Low PPD on some WUs

Post by Bryman »

Yup, issue solved

Image

Thanks everyone
bruce
Posts: 20910
Joined: Thu Nov 29, 2007 10:13 pm
Location: So. Cal.

Re: Low PPD on some WUs

Post by bruce »

Did you read my post and fail to understand it?

You can fold with the CPU at some number LESS THAN 7 provided you leave a little bit of idle time rather than forcing too many threads to compete with each other.

Post Task manager again.
Bryman
Posts: 95
Joined: Mon Mar 14, 2016 12:01 pm

Re: Low PPD on some WUs

Post by Bryman »

bruce wrote:Did you read my post and fail to understand it?

You can fold with the CPU at some number LESS THAN 7 provided you leave a little bit of idle time rather than forcing too many threads to compete with each other.

Post Task manager again.
I know... but compared to the PPD I get from the video card WU, it's only about a tenth less

Plus... a 13xxx project is not even 3 hours... I can just re-enable cpu folding afterwards


I still don't get why having the cpu folding set to low priority and the gpu folding set to normal priority didn't work... it shouldn't be affecting the gpu usage that much... a 75% decrease is pretty drastic

Even when I had 7 png optimization processes and cpu folding going the gpu usage dropped down to like 90%.. but not 25%

It dropped down to about 40% at low priority and 90% at normal priority



I had 5% cpu free.. it was only using 6-7 even at a higher priority than the cpu folding and uses 1-2 without cpu folding

It actually uses less without cpu folding... about 3x less

And it ONLY happens with 13xxx projects...
Bryman
Posts: 95
Joined: Mon Mar 14, 2016 12:01 pm

Re: Low PPD on some WUs

Post by Bryman »

bruce wrote:Post Task manager again.
Without cpu folding

Image

With cpu folding

Image

And with cpu folding it drops the gpu usage from 98 to 25%



Okay, with 7 threads on the cpu folding the gpu usage drops from 98 to 40%

With 6 threads on the cpu folding the gpu usage drops from 98 to 94%

With 4 threads 98 to 96%

With 8 threads the gpu usage actually goes UP and is at about 70%

Why would going from 7 to 8 threads on the cpu WU cause the gpu usage to go from 40 to 70%?

https://www.youtube.com/watch?v=9yMwsiVHqWs

The video is slightly laggy because I turned the frame rate down to cut down on cpu usage
bruce
Posts: 20910
Joined: Thu Nov 29, 2007 10:13 pm
Location: So. Cal.

Re: Low PPD on some WUs

Post by bruce »

I can't answer all your questions -- I don't have a detailed enough knowledge about how GPU drivers work when there's contention for the CPU. I do know it's important, though.

A lot depends on timing. The FahCore for the CPU keeps the CPU as busy as possible doing heavy calculations. Most of the time, the FahCores for the GPUs are busy moving data across the PCIe bus or they're in a spin-wait so they're always available to process the next I/O without waiting for the high-latency overhead of interrupting another task.

FahCore_21 also occasionally does several seconds of heavy-compute processing in support of the analysis itself.

Considering the CPU separately, it should be noted that FahCore_a4 is totally different. It will start as many compute-bound threads as you let it, but if they get interrupted frequently, FahCore_a4's performance will suffer. In other words, 7 CPUs will accomplish less folding than, say, 4, if we assume that the sum of all non-FAH processing exceeds an average of a couple of CPUs. i.e. using 2 CPUs for something else plus 7 CPUs for FahCore_a4 will guarantee that one of the 7 will lag behind the processing done by the other 6.

This also gets distorted somewhat when you consider HyperThreading.

In your previous example, you apparently had 7 threads which were getting a total of 23% which would have been inferior to setting 2 threads each getting 12% but mostly staying in sync.
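bruce's barrier argument can be sketched with a toy model (all numbers are hypothetical, not FAH measurements): a FahCore_a4-style workload synchronises all threads at each step, so the slowest thread sets the pace for everyone.

```python
# Toy model of an oversubscribed, barrier-synchronised workload like
# FahCore_a4.  All numbers are illustrative, not FAH measurements.

def steps_per_unit_time(n_threads, free_cpus, base_rate=1.0):
    """Every thread must reach a barrier before the next step starts,
    so the per-step rate is set by the SLOWEST thread.

    If n_threads fits within the truly-idle logical CPUs, each thread
    runs at base_rate.  Otherwise at least one thread must time-share
    a busy CPU; assume it only gets half a CPU, and the barrier drags
    the whole team down to that laggard's pace."""
    slowest = base_rate if n_threads <= free_cpus else base_rate / 2
    return n_threads * slowest

# With ~6 logical CPUs genuinely free (GPU driver + OS take the rest):
for n in (4, 6, 7, 8):
    print(n, "threads ->", steps_per_unit_time(n, 6))
```

In this crude model, 7 threads (3.5) get less done than 6 (6.0), and 8 (4.0) oddly beat 7, which at least rhymes with the GPU-usage numbers reported earlier in the thread.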
Bryman
Posts: 95
Joined: Mon Mar 14, 2016 12:01 pm

Re: Low PPD on some WUs

Post by Bryman »

bruce wrote:I can't answer all your questions -- I don't have a detailed enough knowledge about how GPU drivers work when there's contention for the CPU. I do know it's important, though.

A lot depends on timing. The FahCore for the CPU keeps the CPU as busy as possible doing heavy calculations. Most of the time, the FahCores for the GPUs are busy moving data across the PCIe bus or they're in a spin-wait so they're always available to process the next I/O without waiting for the high-latency overhead of interrupting another task.

FahCore_21 also occasionally does several seconds of heavy-compute processing in support of the analysis itself.

Considering the CPU separately, it should be noted that FahCore_a4 is totally different. It will start as many compute-bound threads as you let it, but if they get interrupted frequently, FahCore_a4's performance will suffer. In other words, 7 CPUs will accomplish less folding than, say, 4, if we assume that the sum of all non-FAH processing exceeds an average of a couple of CPUs. i.e. using 2 CPUs for something else plus 7 CPUs for FahCore_a4 will guarantee that one of the 7 will lag behind the processing done by the other 6.

This also gets distorted somewhat when you consider HyperThreading.

In your previous example, you apparently had 7 threads which were getting a total of 23% which would have been inferior to setting 2 threads each getting 12% but mostly staying in sync.
Well, I think I'll just keep the threads at 6 from now on

My PPD is at least close to the same if not better... and it won't conflict with 13xxx projects

The possible benefit from folding with 7 threads instead of 6 is small... probably 5%, and it could just cause problems



This way it won't overload all the cores... I see all 8 threads getting a pretty equal load right now with 6 threads going... that's good
Bryman
Posts: 95
Joined: Mon Mar 14, 2016 12:01 pm

Re: Low PPD on some WUs

Post by Bryman »

Gah... I was comparing the PPD with different numbers of threads on the CPU, and I got a bad work unit and it dumped the WU... was that probably because I kept changing the number of threads?

I figured out 6 threads gives about 23,000 PPD, 4 threads gives 18,000 PPD, and I never figured out what 7 threads gave...

Image
Bryman
Posts: 95
Joined: Mon Mar 14, 2016 12:01 pm

Re: Low PPD on some WUs

Post by Bryman »

On a different WU, 7 threads gave 16,000 PPD and 6 threads gave 21,000?

So 6 threads is better, and uses fewer computer resources

Hmm...

That doesn't seem right though.. I would think 7 would be better than 4... I could understand how 6 would be better than 7 though



Yup, it's still calculating... I'll wait longer next time
Bryman
Posts: 95
Joined: Mon Mar 14, 2016 12:01 pm

Re: Low PPD on some WUs

Post by Bryman »

4 threads 11,000, 6 threads 15,300, 7 threads 14,900

Now I'm getting 6 threads 14,500

But still, 6 seems the way to go

It'll give the video card WU more PPD, use fewer computer resources, and not cause conflicts with 13xxx projects
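A quick way to compare the numbers reported above (same WU, different thread counts):

```python
# Bryman's measured CPU-slot PPD for the same WU at different thread counts.
measurements = {4: 11_000, 6: 15_300, 7: 14_900}

for threads, ppd in sorted(measurements.items()):
    print(f"{threads} threads: {ppd:6d} PPD  ({ppd / threads:7.1f} PPD/thread)")
```

Total PPD peaks at 6 threads, and PPD per thread drops sharply at 7, which matches the decision to stay at 6.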
toTOW
Site Moderator
Posts: 6309
Joined: Sun Dec 02, 2007 10:38 am
Location: Bordeaux, France

Re: Low PPD on some WUs

Post by toTOW »

toTOW wrote:Can you try to pause the CPU slot while running p130xx ? Does it help ?

If it helps, reduce your CPU slot to 6 threads ... it should help too as a long term solution ...
So who was right? :roll:

Folding@Home beta tester since 2002. Folding Forum moderator since July 2008.
foldy
Posts: 2061
Joined: Sat Dec 01, 2012 3:43 pm
Hardware configuration: Folding@Home Client 7.6.13 (1 GPU slots)
Windows 7 64bit
Intel Core i5 2500k@4Ghz
Nvidia gtx 1080ti driver 441

Re: Low PPD on some WUs

Post by foldy »

This may lead to a general question:
A 4 core cpu with hyperthreading has 8 threads or logical cores where each of 4 core has 2 threads.
But when one thread is free and 7 threads are busy, all 4 physical cores must be busy.

FAH by default leaves one CPU core free for each GPU slot.
Shouldn't that translate to 2 threads/logical cores per GPU slot on a hyperthreading CPU?
Why does FAH count threads/logical cores instead of physical cores?

This issue may be surfacing now because Core_21 takes more CPU resources on the heavy 13xxx projects.
Could it be hitting other users too?
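foldy's proposed rule can be written out explicitly. The function name and flag below are hypothetical, for illustration only; this is NOT actual FAHClient code.

```python
# Illustration of foldy's suggestion: reserve a whole physical core
# per GPU slot instead of a single logical CPU.

def cpu_slot_threads(logical_cpus, threads_per_core, gpu_slots,
                     reserve_physical_core):
    """Threads to give the CPU slot after reserving CPUs for GPU slots.

    Current behaviour: reserve 1 *logical* CPU per GPU slot.
    foldy's proposal: reserve 1 *physical* core (i.e. threads_per_core
    logical CPUs) per GPU slot."""
    per_gpu = threads_per_core if reserve_physical_core else 1
    return max(1, logical_cpus - gpu_slots * per_gpu)

# 4-core/8-thread CPU, one GPU slot:
print(cpu_slot_threads(8, 2, 1, False))  # current behaviour: 7 threads
print(cpu_slot_threads(8, 2, 1, True))   # proposed behaviour: 6 threads
```

Notably, 7 is exactly the setting that tanked the GPU usage earlier in the thread, and 6 is the one that fixed it.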
bruce
Posts: 20910
Joined: Thu Nov 29, 2007 10:13 pm
Location: So. Cal.

Re: Low PPD on some WUs

Post by bruce »

foldy wrote:Shouldn't this calculate to 2 threads/logical cores on a hyperthreading CPU?
Why does FAH not check for physical cores but threads/logical cores?
This was requested during early development of FAHClient, but they wasted a lot of time on it and it never worked as required. They backed off to counting threads.

There's one redeeming factor, though. Take an 8-threaded HT CPU as an example: you'll have 6 threads sharing the four SSE units, so they'll work at about 60% of the speed of a dedicated CPU. The seventh thread will have effectively exclusive use of its SSE unit, since it shares only with the GPU driver thread, which uses virtually no FP/SSE operations, so that CPU thread will get 100% of the dedicated CPU speed. [The only exception is Core_21, which uses a few seconds of floating point at every checkpoint, but that's not really significant.]

Adding the first four threads contributes more to Core_a4 than adding any of the last four, and adding a GPU thread is almost the same as leaving that thread idle, from the perspective of any app that's using mostly floating point.

For CPUs from either AMD or Intel, SSE/FPU operations use shared hardware, but most other operations have access to unshared resources. That makes sense, since almost nothing outside scientific apps and the more sophisticated 3D games uses floating point.
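bruce's SSE-sharing arithmetic can be captured in a toy model. The 60% figure is his estimate; everything else here (thread placement, core count) is an illustrative assumption.

```python
# Toy model of hyperthreaded siblings competing for shared SSE/FP units.

def fp_throughput(n_threads, cores=4, shared_speed=0.6):
    """FP-heavy threads paired on one core share its SSE unit and each
    run at shared_speed; an unpaired thread gets the unit to itself
    and runs at full speed (1.0)."""
    n = min(n_threads, 2 * cores)
    paired = max(0, n - cores) * 2   # threads that share an FP unit
    solo = n - paired                # threads with a unit to themselves
    return solo * 1.0 + paired * shared_speed

# Threads 1-4 each add 1.0 to the total; threads 5-8 each add only 0.2,
# matching "the first four threads contribute more than the last four".
```

It also shows why a GPU driver thread is nearly free to pair with: it does almost no FP work, so its sibling effectively stays "solo".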
Bryman
Posts: 95
Joined: Mon Mar 14, 2016 12:01 pm

Folding@home causes my video card to do weird things

Post by Bryman »

So.. if I run a stress test my video card starts throttling when the VRM temp hits 110, but if I'm running folding@home the VRM temp goes all the way up to 135 and shuts my video card off...

I originally thought my video card started throttling 60 seconds after hitting the power limit.. but it turns out it just takes about that long for the VRM temp to hit 110

And when I run folding@home I can't underclock my video card



I just ran a stress test while running folding@home and my video card didn't throttle even when the VRM temp hit 120+

Why does folding@home disable throttling when the VRM temp goes up... it'll go all the way to 135 and shut my video card off...




(My video card doesn't have a power limit, that's why it got so hot)

https://www.youtube.com/watch?v=MEPNzrBoyBM

Without folding@home running, my video card will start throttling when the VRM temp hits 110... but with folding@home running it never throttles with VRM temp...



At least I haven't gotten any folding@home errors in a while.. heh

And thank god I fixed that PPD issue on 13xxx projects...




At least my video card shut itself off.. heh... the VRM temp would have easily gone 160+ if it didn't



It also causes my GPU to not throttle when the core temp goes up... normally my video card starts throttling if the gpu temp stays above 97 for more than a few seconds (first it tries to turn fan speed to 100% to cool the gpu down)

But it went all the way up to 101 and shut my video card off..

My video card shuts itself off at only 101°C... that's pretty sad... my 7850 doesn't even force fans to 100% until 102

https://www.youtube.com/watch?v=l_D0ezitJbY

And that was only 1050mhz... I had it at 1100mhz with the VRM temp one




So... now my video card fan seems to have a mind of its own

My core temp keeps going from 84 to 90... it keeps turning the fan speed down and back up... I have it set to 40% but it seems to not be keeping it at that



Well... my fan seems to be stable now.. maybe I had something running that I didn't realize was causing it. Dunno

Actually.. the fan speed percentage stayed the same, but the rpm was changing...

No idea :/
toTOW
Site Moderator
Posts: 6309
Joined: Sun Dec 02, 2007 10:38 am
Location: Bordeaux, France

Re: Low PPD on some WUs

Post by toTOW »

You definitely need to improve your card and/or case cooling :(

Folding@Home beta tester since 2002. Folding Forum moderator since July 2008.
foldy
Posts: 2061
Joined: Sat Dec 01, 2012 3:43 pm
Hardware configuration: Folding@Home Client 7.6.13 (1 GPU slots)
Windows 7 64bit
Intel Core i5 2500k@4Ghz
Nvidia gtx 1080ti driver 441

Re: Low PPD on some WUs

Post by foldy »

Bryman wrote:I do have some case fans I could snap onto my case... but it would make more noise... I prefer just to have my video card about 5°C hotter and have the side of my case open with no case fans
I think your GPU fans make more noise at 100% than some case fans, which you can even throttle from 12V to 7V.
You can also put a case fan directly near the GPU with some wire ties and still keep your case open.

You can also change your GPU's fan curve using a software tool, so it increases fan speed earlier.
I think MSI Afterburner allows that.
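For reference, a custom fan curve of the kind Afterburner lets you draw is just piecewise-linear interpolation over (temperature, fan%) points. The curve points below are made up for illustration:

```python
# Hypothetical fan curve: (temperature in degrees C, fan %) points,
# linearly interpolated between neighbouring points.

CURVE = [(40, 30), (60, 40), (80, 70), (90, 100)]

def fan_percent(temp_c, curve=CURVE):
    """Return the fan duty cycle for a given temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]           # below the curve: minimum fan speed
    if temp_c >= curve[-1][0]:
        return curve[-1][1]          # above the curve: maximum fan speed
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation within this segment.
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

# fan_percent(70) -> 55.0 (halfway between the 60C and 80C points)
```

Shifting the points left (ramping the fan up at lower temperatures) is what "increase fan speed earlier" means in practice.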