Guide for fan speed control for Nvidia cards in Linux

FAH provides a V7 client installer for Debian / Mint / Ubuntu / RedHat / CentOS / Fedora. Installation on other distros may or may not be easy, but if you can offer help to others, they would appreciate it.

Moderators: Site Moderators, FAHC Science Team

Jesse_V
Site Moderator
Posts: 2851
Joined: Mon Jul 18, 2011 4:44 am
Hardware configuration: OS: Windows 10, Kubuntu 19.04
CPU: i7-6700k
GPU: GTX 970, GTX 1080 TI
RAM: 24 GB DDR4
Location: Western Washington

Guide for fan speed control for Nvidia cards in Linux

Post by Jesse_V »

Note: this is for enabling fan control settings in Linux for more than one Nvidia GPU. If you only have one card, then look at viewtopic.php?p=246099#p246099

MSI Afterburner is a pretty useful program for Windows, and it works well even if you've got more than one graphics card. It was one of the main reasons I was still running Windows on my desktop, so I decided to see what was out there for Linux. It turns out there are multiple guides and tutorials around, and it's entirely possible to set this up on Linux, at least for Nvidia. I have a GTX 560 Ti and a GTX 480, so I set mine up around those in Mint 15 (raring package base). If this guide is useful to you, feel free to adapt it to your own setup as needed. Others on FF also have experience doing this sort of thing, so post below if you need help.

1) In a fully updated environment, first install the Nvidia drivers. I chose the 310 drivers from the official repos as they seem to work pretty well. The drivers should also come with nvidia-settings, which is a GUI for changing these settings. You should also be able to edit the ~/.nvidia-settings.rc file manually if your build is headless or if you prefer the CLI.
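
For reference, on a Mint/Ubuntu-type setup that boils down to something like the following (the package names here are only an example and vary by distro and driver version):

Code: Select all

# illustrative only -- pick whichever driver package your distro/repo actually offers
sudo apt-get install nvidia-310 nvidia-settings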

2) Restart.

3) Open nvidia-settings and then save the current configuration to the xorg.conf file. Make a backup of this file just in case things go downhill.
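
If you prefer the terminal, the backup is just a copy (and running "sudo nvidia-xconfig" can generate a basic xorg.conf for you if one doesn't exist yet):

Code: Select all

# keep a known-good copy in case things go downhill
sudo cp /etc/X11/xorg.conf /etc/X11/xorg.conf.backup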

4) Coolbits needs to be enabled for the Nvidia drivers to display the advanced settings. This is really easy to do if you've only got one GPU (follow this guide by bollix47), but the problem with multiple GPUs is that Coolbits only exposes the controls for GPUs that have an X screen attached. If you only have a single monitor, you have to trick X/Nvidia into thinking that you've got a second one so you can define another X screen. Fortunately, this is not hard to do. First you need to know how to address the other GPU(s). Run the command "lspci | grep VGA" and you should see all the GPUs that are connected to your motherboard. There are three numbers at the beginning of each entry; you'll need those later.
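
The output will look something like this (the cards and numbers shown here are only an example; yours will differ):

Code: Select all

$ lspci | grep VGA
01:00.0 VGA compatible controller: NVIDIA Corporation GF114 [GeForce GTX 560 Ti] (rev a1)
02:00.0 VGA compatible controller: NVIDIA Corporation GF100 [GeForce GTX 480] (rev a3)
The "01:00.0" part is the bus, device, and function number for that card.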

5) Open xorg.conf in your text editor of choice. The file path is /etc/X11/xorg.conf. Under the ServerLayout section at the top, you'll need a Screen entry for every card you have. You should see "Screen 0 "Screen0" 0 0" already listed, so just add another entry below it (see the sketch below). In my case, I put the coordinates at 1600 0 because my actual monitor is 1600 pixels wide. Other values would likely work too, but that's the one I went with.
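
As a sketch, a two-card ServerLayout ends up looking something like this (the identifiers and coordinates are the ones used in this guide; adjust them to match your own config):

Code: Select all

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0" 0 0
    Screen      1  "Screen1" 1600 0
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
EndSection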

6) Create a Monitor section for the other GPUs. You should already see one listed, so under it you can write

Code: Select all

Section "Monitor"
    Identifier     "Monitor1"
    VendorName     "Unknown"
    ModelName      "CRT-0"
    HorizSync       0.0 - 0.0
    VertRefresh     0.0
    Option         "DPMS"
EndSection
For a third card, add another section with Identifier "Monitor2", and so on. Basically you're just defining the Monitor identifiers that the Screen sections will reference later (step 9).

7) Still in xorg.conf, create a Device section for each of the other GPUs that you have. You should already have one listed, so just copy-paste it and use it as a template (see the sketch below). Change the BoardName entry accordingly. Take the three numbers you obtained in step 4 and apply them to the BusID. Make sure the formatting is consistent.
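
As a sketch, the second Device section would look something like this (the BoardName and BusID are examples; use your own card name and the numbers from step 4, keeping in mind that lspci prints them in hexadecimal while the BusID string is written in decimal):

Code: Select all

Section "Device"
    Identifier     "Device1"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce GTX 480"
    BusID          "PCI:2:0:0"
EndSection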

8) Ensure that every Device section has the line Option "Coolbits" "5". This enables the extra controls in nvidia-settings (the 4 bit unlocks manual fan control; the 1 bit additionally unlocks clock controls on older cards).

9) Create a second Screen section, using the existing one as a template. Here's my entry; it should work for you:

Code: Select all

Section "Screen"
    Identifier     "Screen1"
    Device         "Device1"
    Monitor        "Monitor1"
    DefaultDepth    24
    Option         "ConnectedMonitor" "CRT"
    Option         "Coolbits" "5"
    Option         "Stereo" "0"
    Option         "nvidiaXineramaInfoOrder" "CRT-0"
    Option         "metamodes" "nvidia-auto-select +0+0"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection
Notice that the Screen section references the Device and Monitor variables, so you'll need to ensure that these all match accordingly.

10) Ensure that every Screen section has a "Coolbits" "5" entry. In the end, your xorg.conf should look very similar to mine: http://pastebin.com/EmCNt5GX

11) Restart.
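
Once X is back up, a quick sanity check is to confirm that the driver actually read the Coolbits option from each section (the log path may vary by distro):

Code: Select all

grep -i coolbits /var/log/Xorg.0.log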

If all goes well, your desktop should boot properly, and in nvidia-settings you should see a fake CRT-0 display under "X Server Display Configuration", more than one X Screen listed on the left, and the advanced features such as fan settings under the "Thermal Settings" subsection for each GPU. On my machine, the Cinnamon desktop under Mint 15 didn't like this hack and reverted to Fallback Mode on startup; if anyone encounters this and has figured out how to fix it, please let me know. KDE seems to work fine, though, and it's easy to switch desktops.

Enjoy!
Last edited by Jesse_V on Sun Oct 20, 2013 11:06 pm, edited 2 times in total.
jimerickson
Posts: 533
Joined: Tue May 27, 2008 11:56 pm
Hardware configuration: Parts:
Asus H370 Mining Master motherboard (X2)
Patriot Viper DDR4 memory 16gb stick (X4)
Nvidia GeForce GTX 1080 gpu (X16)
Intel Core i7 8700 cpu (X2)
Silverstone 1000 watt psu (X4)
Veddha 8 gpu miner case (X2)
Thermaltake hsf (X2)
Ubit riser card (X16)
Location: ames, iowa

Re: Guide for Nvidia multi-card fan control in Linux

Post by jimerickson »

thank you for posting this Jesse_V!! excellent guide.
bollix47
Posts: 2941
Joined: Sun Dec 02, 2007 5:04 am
Location: Canada

Re: Guide for Nvidia multi-card fan control in Linux

Post by bollix47 »

Thanks Jesse ... bookmarked for future reference. :wink:
Jesse_V
Site Moderator
Posts: 2851
Joined: Mon Jul 18, 2011 4:44 am
Hardware configuration: OS: Windows 10, Kubuntu 19.04
CPU: i7-6700k
GPU: GTX 970, GTX 1080 TI
RAM: 24 GB DDR4
Location: Western Washington

Re: Guide for Nvidia multi-card fan control in Linux

Post by Jesse_V »

Thanks, guys. Thanks to jimerickson for the help in setting this up in the first place, and to bollix47 for the tips, links, and guide for single-card boxes. :D
F@h is now the top computing platform on the planet and nothing unites people like a dedicated fight against a common enemy. This virus affects all of us. Let's end it together.
Jesse_V
Site Moderator
Posts: 2851
Joined: Mon Jul 18, 2011 4:44 am
Hardware configuration: OS: Windows 10, Kubuntu 19.04
CPU: i7-6700k
GPU: GTX 970, GTX 1080 TI
RAM: 24 GB DDR4
Location: Western Washington

Re: Guide for fan speed control for Nvidia cards in Linux

Post by Jesse_V »

I just confirmed that this works on a fresh install of Mint 15 KDE edition. Last time I just installed KDE as a replacement for Cinnamon, so it's good to know that KDE can handle this hack out of the box. I copy-pasted xorg.conf from the old install, restarted, and was good to go.
F@h is now the top computing platform on the planet and nothing unites people like a dedicated fight against a common enemy. This virus affects all of us. Let's end it together.
Jesse_V
Site Moderator
Posts: 2851
Joined: Mon Jul 18, 2011 4:44 am
Hardware configuration: OS: Windows 10, Kubuntu 19.04
CPU: i7-6700k
GPU: GTX 970, GTX 1080 TI
RAM: 24 GB DDR4
Location: Western Washington

Re: Guide for fan speed control for Nvidia cards in Linux

Post by Jesse_V »

I recently swapped out my 560 Ti for a second 480. I changed xorg.conf to reflect the name change, but other than that I've left it alone. Yet I'm no longer seeing the thermal settings for the first 480, only for the second one. Is Nvidia getting confused? Should I reinstall the driver?

Edit: I reinstalled the driver. The thermal settings are still not showing up.
F@h is now the top computing platform on the planet and nothing unites people like a dedicated fight against a common enemy. This virus affects all of us. Let's end it together.
7im
Posts: 10189
Joined: Thu Nov 29, 2007 4:30 pm
Hardware configuration: Intel i7-4770K @ 4.5 GHz, 16 GB DDR3-2133 Corsair Vengence (black/red), EVGA GTX 760 @ 1200 MHz, on an Asus Maximus VI Hero MB (black/red), in a blacked out Antec P280 Tower, with a Xigmatek Night Hawk (black) HSF, Seasonic 760w Platinum (black case, sleeves, wires), 4 SilenX 120mm Case fans with silicon fan gaskets and silicon mounts (all black), a 512GB Samsung SSD (black), and a 2TB Black Western Digital HD (silver/black).
Location: Arizona
Contact:

Re: Guide for fan speed control for Nvidia cards in Linux

Post by 7im »

Might be related to why the new core 17 temp feature only works with one GPU?
How to provide enough information to get helpful support
Tell me and I forget. Teach me and I remember. Involve me and I learn.
PantherX
Site Moderator
Posts: 7020
Joined: Wed Dec 23, 2009 9:33 am
Hardware configuration: V7.6.21 -> Multi-purpose 24/7
Windows 10 64-bit
CPU:2/3/4/6 -> Intel i7-6700K
GPU:1 -> Nvidia GTX 1080 Ti
§
Retired:
2x Nvidia GTX 1070
Nvidia GTX 675M
Nvidia GTX 660 Ti
Nvidia GTX 650 SC
Nvidia GTX 260 896 MB SOC
Nvidia 9600GT 1 GB OC
Nvidia 9500M GS
Nvidia 8800GTS 320 MB

Intel Core i7-860
Intel Core i7-3840QM
Intel i3-3240
Intel Core 2 Duo E8200
Intel Core 2 Duo E6550
Intel Core 2 Duo T8300
Intel Pentium E5500
Intel Pentium E5400
Location: Land Of The Long White Cloud
Contact:

Re: Guide for fan speed control for Nvidia cards in Linux

Post by PantherX »

The reason is that the Linux version hasn't been updated; it is still on version 0.0.46 (IIRC):

00:59 < bollix47> deleting the core in linux does not get the new core ... just downloads older version again
00:59 < bollix47> location: cores/www.stanford.edu/~pande/Linux/AMD64/NVIDIA/Fermi/
01:11 <@proteneer> yes
01:11 <@proteneer> linux hasnt been updated yet
01:11 <@proteneer> (mainly because i accidentally my linux build system)
01:11 <@proteneer> and things like temperature control etc.
01:11 <@proteneer> doesn't do anything new in linux
01:12 < bollix47> k
ETA:
Now ↞ Very Soon ↔ Soon ↔ Soon-ish ↔ Not Soon ↠ End Of Time

Welcome To The F@H Support Forum Ӂ Troubleshooting Bad WUs Ӂ Troubleshooting Server Connectivity Issues
Jesse_V
Site Moderator
Posts: 2851
Joined: Mon Jul 18, 2011 4:44 am
Hardware configuration: OS: Windows 10, Kubuntu 19.04
CPU: i7-6700k
GPU: GTX 970, GTX 1080 TI
RAM: 24 GB DDR4
Location: Western Washington

Re: Guide for fan speed control for Nvidia cards in Linux

Post by Jesse_V »

Thanks, but that's not what I'm referring to. I'm not seeing the thermal sensors and fan controls in nvidia-settings. I would have expected them to be there because of the Coolbits setting, which I did not change from when it was working with the 560TI and the 480. Now with two 480s, I simply updated the GPU label in xorg.conf, but nothing else has changed.

For those that have a multi-GPU setup using the same card, what did you do to get Coolbits activated for both?
F@h is now the top computing platform on the planet and nothing unites people like a dedicated fight against a common enemy. This virus affects all of us. Let's end it together.
jimerickson
Posts: 533
Joined: Tue May 27, 2008 11:56 pm
Hardware configuration: Parts:
Asus H370 Mining Master motherboard (X2)
Patriot Viper DDR4 memory 16gb stick (X4)
Nvidia GeForce GTX 1080 gpu (X16)
Intel Core i7 8700 cpu (X2)
Silverstone 1000 watt psu (X4)
Veddha 8 gpu miner case (X2)
Thermaltake hsf (X2)
Ubit riser card (X16)
Location: ames, iowa

Re: Guide for fan speed control for Nvidia cards in Linux

Post by jimerickson »

Jesse_V: did you try changing the board name in xorg.conf or did the pci address change?

Edit: I have to go to work right now; I'll be back this evening.
Jesse_V
Site Moderator
Posts: 2851
Joined: Mon Jul 18, 2011 4:44 am
Hardware configuration: OS: Windows 10, Kubuntu 19.04
CPU: i7-6700k
GPU: GTX 970, GTX 1080 TI
RAM: 24 GB DDR4
Location: Western Washington

Re: Guide for fan speed control for Nvidia cards in Linux

Post by Jesse_V »

Jim, nope, I double-checked the PCI address. It's still the same as before.
F@h is now the top computing platform on the planet and nothing unites people like a dedicated fight against a common enemy. This virus affects all of us. Let's end it together.
ChristianVirtual
Posts: 1596
Joined: Tue May 28, 2013 12:14 pm
Location: Tokyo

Re: Guide for fan speed control for Nvidia cards in Linux

Post by ChristianVirtual »

Here's my little success story.

It's on an Asus board with the iGPU connected via HDMI as the primary/only screen, and a GTX 660 Ti running headless for folding (driver 319.x).

The following xorg.conf lets me set the fan speed and fold while the iGPU drives the UI. Until now I had only had luck with attached screens. Based on hints from other sources, what I added is a default screen resolution and skipping the monitor detection.

Code: Select all

# nvidia-xconfig: X configuration file generated by nvidia-xconfig
# nvidia-xconfig:  version 319.49  (buildmeister@swio-display-x64-rhel04-03)  Tue Aug 13 20:42:18 PDT 2013


Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0" 0 0
    Screen      1  "Screen1" RightOf "Screen0"
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
    Option         "Xinerama" "0"
EndSection

Section "Files"
EndSection

Section "InputDevice"

    # generated from default
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/psaux"
    Option         "Emulate3Buttons" "no"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"

    # generated from default
    Identifier     "Keyboard0"
    Driver         "kbd"
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "Unknown"
    HorizSync       28.0 - 33.0
    VertRefresh     43.0 - 72.0
    Option         "DPMS"
EndSection

Section "Monitor"
    Identifier     "Monitor1"
    VendorName     "Headless"
    ModelName      "CRT1"
    HorizSync       28.0 - 33.0
    VertRefresh     43.0 - 72.0
    Option         "Enable" "true"
    Option         "DPMS"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "intel"
    VendorName     "Intel Corporation"
    BoardName      "iGPU"
    BusID          "PCI:0:2:0"
EndSection

Section "Device"
    Identifier     "Device1"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce GTX 660 Ti"
    BusID          "PCI:1:0:0"
    Option         "ConnectedMonitor" "CRT-0"
    Option         "UseEDID" "false"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    SubSection     "Display"
        Depth       24
        Modes      "1920x1080"
    EndSubSection
EndSection

Section "Screen"
    Identifier     "Screen1"
    Device         "Device1"
    Monitor        "Monitor1"
    Option         "Coolbits" "5" 
    Option         "UseEDID" "false"
    DefaultDepth    24
    SubSection     "Display"
        Depth       24
        Modes      "800x600"
    EndSubSection
EndSection
and a little script for the fan (as referred to earlier):

Code: Select all

# take manual control of the fan on GPU 0
nvidia-settings -a [gpu:0]/GPUFanControlState=1
# set fan 0 to 60% speed
nvidia-settings -a [fan:0]/GPUCurrentFanSpeed=60
The only little downside, not yet solved but also not really relevant: on my HDMI screen I don't see any Unity UI elements, only a naked desktop. I need to start a terminal with Ctrl-Alt-T, which is perfectly fine for a dedicated folding rig; I can even start everything over ssh this way. Happy camper.
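
By the way, if you want to double-check that the settings stuck, you can query the same attributes back (attribute names as used above; they may differ on newer drivers):

Code: Select all

nvidia-settings -q [gpu:0]/GPUCoreTemp
nvidia-settings -q [fan:0]/GPUCurrentFanSpeed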
Please contribute your logs to http://ppd.fahmm.net
Nicolas_orleans
Posts: 106
Joined: Wed Aug 08, 2012 3:08 am

Re: Guide for fan speed control for Nvidia cards in Linux

Post by Nicolas_orleans »

Thanks Jesse_V, it works, and now both GPUs are correctly cooled!

Full file:

Code: Select all

# nvidia-xconfig: X configuration file generated by nvidia-xconfig
# nvidia-xconfig:  version 304.117  (buildmeister@swio-display-x86-rhel47-01)  Tue Nov 26 22:29:40 PST 2013

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0" 0 0
    Screen      1  "Screen1" 1920 0
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
EndSection

Section "Files"
EndSection

Section "InputDevice"

    # generated from default
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/psaux"
    Option         "Emulate3Buttons" "no"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"

    # generated from default
    Identifier     "Keyboard0"
    Driver         "kbd"
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "Unknown"
    HorizSync       28.0 - 33.0
    VertRefresh     43.0 - 72.0
    Option         "DPMS"
EndSection

Section "Monitor"
    Identifier     "Monitor1"
    VendorName     "Unknown"
    ModelName      "Unknown"
    HorizSync       28.0 - 33.0
    VertRefresh     43.0 - 72.0
    Option         "DPMS"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BusID          "PCI:1:0:0"
EndSection

Section "Device"
    Identifier     "Device1"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BusID          "PCI:2:0:0"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "Coolbits" "4"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection

Section "Screen"
    Identifier     "Screen1"
    Device         "Device1"
    Monitor        "Monitor1"
    DefaultDepth    24
    Option         "ConnectedMonitor" "CRT"
    Option         "Coolbits" "4"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection
MSI Z77A-GD55 - i5-3550 - 16 GB RAM - GTX 980 Ti Hybrid @1461 MHz + GTX 770 @ 1124 MHz + GTX 750 Ti @ 1306 MHz - Ubuntu 16.10
Bead_Hand
Posts: 3
Joined: Fri Mar 21, 2014 6:21 am

Re: Guide for fan speed control for Nvidia cards in Linux

Post by Bead_Hand »

The info in this thread is a tad dated. It's not that it's incorrect; it's just that there is an easier way to configure X to recognize all of your nVidia cards and allow manual fan control. You simply run the nvidia-xconfig command with the appropriate switches in a terminal. nvidia-xconfig is installed when you install the nVidia driver. I don't know exactly when they began shipping it with the driver, but if you have a reasonably recent version you will definitely have nvidia-xconfig.
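
If you want to confirm it's there, a quick check from a terminal (nothing distro-specific about this):

Code: Select all

which nvidia-xconfig
nvidia-xconfig --version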

The tips/methods/commands below will work if you have just installed Linux and the nVidia drivers for the first time. They should also work if you had a working setup but broke it by changing cards or rearranging cards in your PCIe slots. Unlike that other OS, it's not necessary to do the herky-jerky dance of uninstalling the driver, config files and registry (gag) entries just to reinstall and get going again.

I use Ubuntu. The stuff below works for me on Ubuntu 12.04 and 14.04. I don't know about 13.04, but it probably works there too.

If you have the nVidia driver installed and a card(s) in the slot(s) then open a terminal. On Ubuntu with Unity the easy way to open a terminal is to bring the desktop into focus (click on an open area of the desktop) and type CTRL-ALT-T.

In the terminal, to enable (recognize) all cards present on the mainboard type

Code: Select all

nvidia-xconfig --enable-all-gpus


To allow manual fan control on all enabled cards type

Code: Select all

nvidia-xconfig --cool-bits=4
Please note that in both of the above commands there is a space character before the "--" sequence. Note also that in the second command there is no space before or after the "=".

The nvidia-xconfig command has a plethora of options. Do man nvidia-xconfig in a terminal to read the user manual.
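
For what it's worth, the two switches above can also be combined into a single run; depending on your setup you may need sudo, since the file it writes lives under /etc/X11. For example:

Code: Select all

sudo nvidia-xconfig --enable-all-gpus --cool-bits=4
Restart X (or just reboot) afterwards so the new xorg.conf gets picked up.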

I have a nifty Python script that monitors the temperature of each of my 3 nVidia cards and increases/decreases each card's fan speed independently as required to keep the temperature at a user-selectable target. In practice, due to hysteresis and dynamic load on the GPU, the actual temps hover about 2 degrees Celsius above or below the target. I have my system configured to autostart the script at boot time. Anybody want a copy? Remember it's a Python script, so you need Python installed to use it. Your distro might have installed Python automatically for you.
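
The core idea is roughly the sketch below (an outline only, not my actual script; it assumes the nvidia-settings attributes used earlier in this thread, and note that newer drivers call the writable attribute GPUTargetFanSpeed rather than GPUCurrentFanSpeed):

Code: Select all

#!/usr/bin/env python
# Rough sketch of a temperature-targeting fan controller driven by
# nvidia-settings. Illustrative only; adjust the target, step and GPU
# indices to suit your own cards.
import subprocess
import time

TARGET_C = 70      # temperature to aim for, in degrees Celsius
STEP = 5           # fan speed change per adjustment, in percent
GPUS = [0, 1, 2]   # GPU/fan indices to manage

def query(attr):
    # read an integer attribute, e.g. [gpu:0]/GPUCoreTemp
    out = subprocess.check_output(["nvidia-settings", "-q", attr, "-t"])
    return int(out.strip())

def assign(attr, value):
    # write an attribute, e.g. [fan:0]/GPUCurrentFanSpeed=60
    subprocess.check_call(["nvidia-settings", "-a", "%s=%d" % (attr, value)])

# take manual control of every fan once at startup
for i in GPUS:
    assign("[gpu:%d]/GPUFanControlState" % i, 1)

speeds = dict((i, 50) for i in GPUS)   # starting fan speeds, in percent

while True:
    for i in GPUS:
        temp = query("[gpu:%d]/GPUCoreTemp" % i)
        if temp > TARGET_C:
            speeds[i] = min(100, speeds[i] + STEP)
        elif temp < TARGET_C:
            speeds[i] = max(30, speeds[i] - STEP)
        assign("[fan:%d]/GPUCurrentFanSpeed" % i, speeds[i])
    time.sleep(10)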
Hawkeye
Posts: 9
Joined: Mon Jan 19, 2009 3:07 am

Re: Guide for fan speed control for Nvidia cards in Linux

Post by Hawkeye »

Sorry to thread necro, but could I get a copy of your script? I'm having a hard time getting my two 960s to stay at 60% fan speed. GPU0 does, but GPU1 won't.
Post Reply