Monitor GPU power consumption values?

Moderators: Site Moderators, FAHC Science Team

Alex_Atkin
Posts: 38
Joined: Mon Oct 24, 2022 4:32 am

Monitor GPU power consumption values?

Post by Alex_Atkin »

I was just wondering if anyone has considered adding support for reporting current power consumption of GPU jobs?

Would it even be possible to include CPU too?

I've been able to hack together stats myself by looping a script that calls nvidia-smi and writes to a network share, but it would be great if FAHClient exposed this information directly, to help keep the cost of contributing our hardware under control.
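Something like this rough sketch is all my loop really amounts to (shown in Python here; the share path and poll interval are placeholders, not my actual setup):

[code]
#!/usr/bin/env python3
"""Rough sketch of a polling loop: query nvidia-smi for power draw and
append the readings to a CSV on a network share. Path and interval are
placeholders."""
import csv
import subprocess
import time
from datetime import datetime, timezone

OUTPUT_CSV = "/mnt/share/gpu_power.csv"   # placeholder network-share path
POLL_SECONDS = 10                         # placeholder poll interval

def read_power_draw_watts():
    """Return one power.draw reading per GPU, in watts."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return [float(line) for line in out.strip().splitlines()]

while True:
    readings = read_power_draw_watts()
    with open(OUTPUT_CSV, "a", newline="") as f:
        writer = csv.writer(f)
        for gpu_index, watts in enumerate(readings):
            writer.writerow(
                [datetime.now(timezone.utc).isoformat(), gpu_index, watts])
    time.sleep(POLL_SECONDS)
[/code]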
JimboPalmer
Posts: 2573
Joined: Mon Feb 16, 2009 4:12 am
Location: Greenwood MS USA

Re: Monitor GPU power consumption values?

Post by JimboPalmer »

bikeaddict
Posts: 192
Joined: Sun May 03, 2020 1:20 am

Re: Monitor GPU power consumption values?

Post by bikeaddict »

My systems are plugged into a Kasa/TP-Link KP115 energy monitoring smart plug. It pairs with a phone app that shows realtime power use and also keeps a total of kWh used over the last 7 or 30 days.
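If you'd rather get the readings into your own scripts than the phone app, the KP115 can also be polled locally. Here's a rough sketch against the third-party python-kasa library's SmartPlug interface (the IP address is a placeholder, and this assumes the emeter readings are exposed the way that library presents them):

[code]
#!/usr/bin/env python3
"""Rough sketch: poll a Kasa KP115 locally via the third-party python-kasa
library. The plug's IP address is a placeholder."""
import asyncio
from kasa import SmartPlug

PLUG_ADDRESS = "192.168.1.50"   # placeholder; use your plug's local IP

async def main():
    plug = SmartPlug(PLUG_ADDRESS)
    await plug.update()                        # fetch current state from the plug
    print(plug.alias, plug.emeter_realtime)    # realtime energy readings

asyncio.run(main())
[/code]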

Before that I used a Kill A Watt, which failed catastrophically and melted after less than a year, and then a Gardner Bender PM3000 power monitor.

There are online kWh cost calculators (https://www.rapidtables.com/calc/electr ... lator.html) that can be used to estimate your monthly cost. I work out my real kWh rate by taking the monthly cost on my electric bill and dividing it by the kWh used; it comes out a bit higher than the rate the company quotes, probably because of fees and taxes.
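As a made-up worked example of that calculation (none of these numbers are from my actual bill):

[code]
# Effective rate = monthly bill / kWh used; all figures below are invented.
monthly_bill = 142.50          # total monthly bill (placeholder)
kwh_used = 950                 # kWh used that month (placeholder)

effective_rate = monthly_bill / kwh_used   # cost per kWh including fees and taxes
print(f"Effective rate: {effective_rate:.3f} per kWh")

# Estimated monthly cost of a rig drawing a steady 350 W (placeholder figure)
rig_kwh = 350 / 1000 * 24 * 30
print(f"Rig: {rig_kwh:.0f} kWh/month, roughly {rig_kwh * effective_rate:.2f}/month")
[/code]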
Alex_Atkin
Posts: 38
Joined: Mon Oct 24, 2022 4:32 am

Re: Monitor GPU power consumption values?

Post by Alex_Atkin »

You misunderstand my point: this is so I can easily compare performance between different hardware in an OS-agnostic manner.

I already have multiple scripts that partially achieve this by pulling the v7 client data from each client, alongside nvidia-smi output written to a shared folder, then building a single JSON file with all the combined information that I can display on my website. I also pull the maximum CPU clock reported by the CPU on Linux; I have no clue how I'd do that on Windows.
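The Linux CPU clock piece is roughly this kind of thing (a simplified sketch; the sysfs path is standard cpufreq, but the output path is a placeholder, not my real shared folder):

[code]
#!/usr/bin/env python3
"""Sketch: take the highest current core clock from sysfs and write it as
JSON so it can be merged with the nvidia-smi and v7 client data."""
import glob
import json

def max_cpu_clock_mhz():
    """Highest scaling_cur_freq across all cores, converted from kHz to MHz."""
    freqs_khz = [
        int(open(path).read().strip())
        for path in glob.glob(
            "/sys/devices/system/cpu/cpu*/cpufreq/scaling_cur_freq")
    ]
    return max(freqs_khz) / 1000 if freqs_khz else None

OUTPUT_JSON = "/mnt/share/cpu_stats.json"   # placeholder shared-folder path

with open(OUTPUT_JSON, "w") as f:
    json.dump({"max_cpu_clock_mhz": max_cpu_clock_mhz()}, f, indent=2)
[/code]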

With v8 using websockets, this becomes a lot more problematic.
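I'd expect reading the v8 client to end up looking something like the sketch below, using Python's third-party websockets package, but the endpoint URL is only a guess at a default local install, not something I've confirmed:

[code]
#!/usr/bin/env python3
"""Rough sketch of connecting to a v8 client over a WebSocket using the
third-party 'websockets' package. The URL is a guess, not a verified,
documented endpoint."""
import asyncio
import websockets

FAH_WS_URL = "ws://127.0.0.1:7396/api/websocket"   # assumed local v8 endpoint

async def dump_messages(count=5):
    async with websockets.connect(FAH_WS_URL) as ws:
        for _ in range(count):
            message = await ws.recv()
            print(str(message)[:500])   # print whatever state the client pushes

asyncio.run(dump_messages())
[/code]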