RTX 2080ti vs 2060, which one to buy?

A forum for discussing FAH-related hardware choices and info on actual products (not speculation).

Moderator: Site Moderators

Forum rules
Please read the forum rules before posting.

RTX 2080ti vs 2060, which one to buy?

Postby Theodore » Sun Feb 17, 2019 6:06 pm

Trying to choose between the RTX 2080ti and the RTX 2060.

I'm thinking of either getting the 2060 and selling it later on, when better cards are on the market, or getting the 2080 Ti now to keep for a year or two.
I'm thinking of buying from EVGA, because I can get their cards at a better price.
Do you think an EVGA 2080 Ti could realistically run nearly 24/7 for one to two years straight?

I did read an article about contamination in the cards, and I'm a bit hesitant to spend a larger amount of money on a card that might end up being bad.
https://www.extremetech.com/computing/2 ... gpu-wafers
Theodore
 
Posts: 118
Joined: Sun Feb 10, 2019 3:07 pm

Re: RTX 2080ti vs 2060, which one to buy?

Postby foldy » Sun Feb 17, 2019 6:23 pm

Have a good power supply and enough cooling in the case, and clean the dust out every year. I guess we will not see better graphics cards this year, and prices will stay high; in late 2020 Nvidia will release the RTX on 7 nm with 16 GB, and AMD will release its Navi GPUs. If noise is a problem for you, look into quiet hardware parts. If you run 24/7, also calculate your power usage and its cost. If you can afford it, then go for the biggest GPU.
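To illustrate the 24/7 power-cost point above, here is a rough estimate sketch. The wattage and electricity price are made-up example numbers, not figures from this thread:

```python
# Rough 24/7 running-cost estimate for a GPU.
# Example inputs are hypothetical: 250 W constant draw, $0.12 per kWh.
def annual_cost(watts, price_per_kwh):
    """Return (kWh per year, cost per year) for a constant 24/7 load."""
    kwh = watts / 1000 * 24 * 365
    return kwh, kwh * price_per_kwh

kwh, cost = annual_cost(250, 0.12)
print(f"{kwh:.0f} kWh/year, ${cost:.2f}/year")  # 2190 kWh/year, $262.80/year
```

Plugging in your own card's power limit and local electricity rate gives the yearly cost to weigh against the purchase price.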
foldy
 
Posts: 1939
Joined: Sat Dec 01, 2012 4:43 pm

Re: RTX 2080ti vs 2060, which one to buy?

Postby Nathan_P » Sun Feb 17, 2019 6:35 pm

I've got a Galax/KFA2 1070 purchased in August 2016 that is still going strong. It's had some downtime, but it routinely runs for 3-4 months between shutdowns. Out of the 30 months I've had the card, it's been on for 24 of them.

Here's an outlier for you: an RTX 2060 can get 800k-1M PPD depending on the project and costs ~$350; a pair of 2060s costs ~$700 and will net 1.6-2M PPD; a 2080 Ti can't be found for less than $1000 and is reported to earn 2.2M PPD. Personally, if I had the space for the two cards I would get a pair of 2060s, or, if power cost wasn't an issue, a 2060 and a 2070 - nearly the same points for around $200 less.
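Putting the figures quoted above into points-per-dollar terms (using the midpoint where a range is given; simple arithmetic on this post's numbers, not new measurements):

```python
# PPD per purchase dollar, from the figures quoted in this post.
cards = {
    "RTX 2060":    (900_000, 350),     # ~800k-1M PPD, ~$350 (midpoint used)
    "2x RTX 2060": (1_800_000, 700),   # ~1.6-2M PPD, ~$700
    "RTX 2080 Ti": (2_200_000, 1000),  # ~2.2M PPD, >= $1000
}
for name, (ppd, usd) in cards.items():
    print(f"{name}: {ppd / usd:.0f} PPD per dollar")
```

By this crude measure the 2060s deliver roughly 2,570 PPD per dollar versus about 2,200 for the 2080 Ti, which is the gist of the comparison above.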
Nathan_P
 
Posts: 1170
Joined: Wed Apr 01, 2009 10:22 pm
Location: Jersey, Channel islands

Re: RTX 2080ti vs 2060, which one to buy?

Postby Theodore » Sun Feb 17, 2019 7:00 pm

Thanks,

After some consideration, I went with the 2060.
It's the card for the average Joe, like myself.

Had my income been bigger, I would have gone with a pair of 2080s.
Theodore
 
Posts: 118
Joined: Sun Feb 10, 2019 3:07 pm

Re: RTX 2080ti vs 2060, which one to buy?

Postby gordonbb » Sun Feb 17, 2019 8:45 pm

Nathan_P wrote:I've got a Galax/KFA2 1070 purchased in August 2016 that is still going strong. It's had some downtime, but it routinely runs for 3-4 months between shutdowns. Out of the 30 months I've had the card, it's been on for 24 of them.

Here's an outlier for you: an RTX 2060 can get 800k-1M PPD depending on the project and costs ~$350; a pair of 2060s costs ~$700 and will net 1.6-2M PPD; a 2080 Ti can't be found for less than $1000 and is reported to earn 2.2M PPD. Personally, if I had the space for the two cards I would get a pair of 2060s, or, if power cost wasn't an issue, a 2060 and a 2070 - nearly the same points for around $200 less.

I have both an EVGA 2060 XC and a 2070 XC and can confirm that they run well together. I'm getting 2.3M PPD with a mild overclock while dropping the power limits to 160 W and 140 W respectively, which initial testing shows to be much more efficient for folding than running at stock power.
User avatar
gordonbb
 
Posts: 476
Joined: Mon May 21, 2018 5:12 pm
Location: Great White North

Re: RTX 2080ti vs 2060, which one to buy?

Postby Theodore » Mon Feb 18, 2019 5:19 pm

gordonbb wrote:I have both an EVGA 2060 XC and a 2070 XC and can confirm that they run well together. I'm getting 2.3M PPD with a mild overclock while dropping the power limits to 160 W and 140 W respectively, which initial testing shows to be much more efficient for folding than running at stock power.


I know Nvidia cards don't fully use the power they're rated for while folding.
When a card is rated for 160 W, I presume it only uses ~140 W while folding.
Reducing the power limit from 160 to 140 W (~12.5% in that scenario) won't really lower the card's power consumption.

Which program do you use to undervolt the card (in Linux)?
Nvidia X-server doesn't give me that option.
Theodore
 
Posts: 118
Joined: Sun Feb 10, 2019 3:07 pm

Re: RTX 2080ti vs 2060, which one to buy?

Postby gordonbb » Mon Feb 18, 2019 7:26 pm

Theodore wrote:I know Nvidia cards don't fully use the power they're rated for while folding.
When a card is rated for 160 W, I presume it only uses ~140 W while folding.
Reducing the power limit from 160 to 140 W (~12.5% in that scenario) won't really lower the card's power consumption ...
Yes, you can lower the Power Usage of the card directly and easily.

Recent Nvidia cards have 5-milliohm current-shunt resistors on the input legs from the PCIe bus and each power connector. Boost 3 uses these to monitor the total card power and keep it under a defined threshold, which is usually the default card power limit (217 W in the case of the EVGA RTX 2060 XC Ultra, I believe).

The actual power draw and the limit can be viewed by running the nvidia-smi utility at a command prompt (terminal):
Code: Select all
nvidia-smi

To change this power-limit:
Code: Select all
nvidia-smi -i <GPU_id> --power-limit=<PWR_Limit>
where <GPU_id> is the ID of the GPU (starting at 0 for the first GPU) and <PWR_Limit> is the new desired power limit in watts.

You can query the default Power limits using:
Code: Select all
nvidia-smi -i <GPU_id> -q | grep Power
but I usually just enter a bogus power limit such as 0 or 1000 and then the command will throw an error and tell you the allowed range.

But this user-defined Power Limit will not persist across work units so you first should set:
Code: Select all
nvidia-smi -i <GPU_id> -pm 1

Once you change the Target Power Limit and set Persistence mode the GPU will honor the new limit and Boost will adjust the voltages (and hence the GPU Frequencies) to keep the GPU at the new Target.
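For a machine with several GPUs, the two-step sequence above (persistence mode first, then the power limit) could be wrapped in a small script. This is only a sketch, with helper names of my own; it assumes nvidia-smi is on the PATH and that the script runs with root privileges:

```python
# Sketch: apply persistent power limits to several GPUs via nvidia-smi.
# Assumes nvidia-smi is on the PATH and root privileges (my assumptions,
# not something stated in this thread).
import subprocess

def power_limit_cmds(gpu_id, watts):
    """Build the two nvidia-smi calls: persistence mode first, then the limit."""
    return [
        ["nvidia-smi", "-i", str(gpu_id), "-pm", "1"],
        ["nvidia-smi", "-i", str(gpu_id), "-pl", str(watts)],
    ]

def apply_limits(limits):
    """limits: dict of GPU id -> watts, e.g. {0: 140, 1: 160}."""
    for gpu_id, watts in limits.items():
        for cmd in power_limit_cmds(gpu_id, watts):
            subprocess.run(cmd, check=True)
```

Note that the power limits themselves still reset on reboot, so a script like this would typically be run from a startup job.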

Theodore wrote:... Which program do you use to undervolt the card (in Linux)?
Nvidia X-server doesn't give me that option.

Undervolting in Windows is usually done by setting a custom voltage curve in EVGA Precision or MSI Afterburner, neither of which is available in Linux. Using the Nvidia Control Panel under X you can adjust the graphics (shader) and memory clocks, as well as force the fan speed to a specific percentage if a card is running a little hotter than you'd like.

To enable these settings in the PowerMizer and Thermal tabs you first need to add:
Code: Select all
Option "Coolbits" "12"
in your X server config. This can be a bit of a black art, but the best way I've found so far is to edit /etc/X11/xorg.conf.d/nvidia.conf, add the option to the "Device" section, and restart your X server.
User avatar
gordonbb
 
Posts: 476
Joined: Mon May 21, 2018 5:12 pm
Location: Great White North

Re: RTX 2080ti vs 2060, which one to buy?

Postby Theodore » Mon Feb 18, 2019 11:34 pm

gordonbb wrote:Yes, you can lower the Power Usage of the card directly and easily. ...
Once you change the Target Power Limit and set Persistence mode the GPU will honor the new limit and Boost will adjust the voltages (and hence the GPU Frequencies) to keep the GPU at the new Target. ...
Undervolting in Windows is usually done by setting a custom Voltage Curve in Precision Pro or Afterburner neither of which are available in Linux. ...


What I meant to say:

I see people online posting that they've lowered their card's power consumption by 10% with no noticeable performance loss, not realizing that they've only set the power limit to what the card was already drawing.
Setting the limit ~10-15% below the rated power often results in no change in performance or actual power usage.

One can set the power limit below what the card actually uses, like you mentioned, but that comes with a performance penalty.
Lowering the GPU voltage can reduce power draw (and heat) without a performance penalty, when set correctly.
I couldn't find much information online yet about the right balance between core-voltage reduction and performance (PPD) on GPUs.
Most of the information about voltage adjustment online is about overclocking and raising the voltage for higher performance, mostly with Windows applications; that makes sense if you want to play a game at maximum graphics settings, or earn maximum PPD in the shortest amount of time.

It makes more sense to find a way to lower the power consumption (and heat) for continuous use, like folding.
I would need a way to lower the GPU voltage in Linux while monitoring its frequency for any drop caused by the undervolting.
Nvidiux currently does not support my Nvidia cards.

Using Coolbits in Linux, nvidia-settings only lets me change the core clock, memory clock, and fan speed on some of my cards, not on all of them.
The cards without fan control in nvidia-settings also have no GPU/VRAM frequency adjustment available.
My MSI and Asus cards seem to be supported; the PNY and Zotac are not.

For the issue of some Nvidia cards not being accepted, one of their forums suggests installing a newer driver (410 or 415).
I have been unsuccessful in my attempts to update the driver beyond 390 on my system.
Theodore
 
Posts: 118
Joined: Sun Feb 10, 2019 3:07 pm

Re: RTX 2080ti vs 2060, which one to buy?

Postby gordonbb » Tue Feb 19, 2019 1:20 am

Theodore wrote:For the issue of some Nvidia cards not being accepted, one of their forums suggests installing a newer driver (410 or 415).
I have been unsuccessful in my attempts to update the driver beyond 390 on my system.
410 is the first version to support the RTX 2070, and 415 the first to support the 2060.
To install 410, as it was not yet available in the PPA repository, I ended up installing the latest CUDA toolkit, which included it.

415 is available for Ubuntu in the ppa repository now.

Which distribution are you using?

Worst case, you could install the binary (.run) files from Nvidia directly, but I find that doing so often leaves a mess that takes some work to clean up once the desired driver version finally appears in the distribution's repository.
User avatar
gordonbb
 
Posts: 476
Joined: Mon May 21, 2018 5:12 pm
Location: Great White North

Re: RTX 2080ti vs 2060, which one to buy?

Postby katakaio » Tue Feb 19, 2019 1:26 am

+1 to what gordonbb shared. I've installed binaries directly from Nvidia before when I needed a driver with no OpenGL libs, but ppa:graphics-drivers/ppa is a godsend if you're on a Debian-based distro and you just want the latest and greatest stable driver.
Code: Select all
sudo add-apt-repository ppa:graphics-drivers/ppa
sudo apt-get update
katakaio
 
Posts: 27
Joined: Wed Oct 28, 2009 8:31 pm
Location: Florida

Re: RTX 2080ti vs 2060, which one to buy?

Postby gordonbb » Tue Feb 19, 2019 8:22 am

There are also issues with fan control from the command line: it doesn't work in the 410 drivers for the 2070 but is fixed in the 415 driver, while command-line fan control for the 2060 doesn't work in the 415 driver, so they seem to be off by one driver release.

Similarly, GPU and memory overclocking via nvidia-settings does not work from the command line, but with Coolbits properly set in 415 the GUI can be used to adjust the fans and clocks.

I get what you're saying about the power limits. My 2070 has a default limit of 205 W but rarely exceeds 190 W when pushed in a case with good airflow and a large GPU clock offset applied.

I usually take this 190 W as the practical power ceiling, and so set my operating power limit to 170 W, which Boost then enforces.

I did a fair bit of testing with FAHBench using the default WUs and a then-common WU in the 117xx series, and observed a knee at the top of the performance curve where the last 10-20 W provided only marginal performance gains. Of course, the Quick Return Bonus (QRB) will partly offset the diminishing efficiency returns.

Here’s the results for an EVGA 1060 6GB Card:
Image

As you can see, this card caps out at 130 W, so I usually run these at a 120 W limit, averaging 450 kPPD; dropping down to 110 W nets even higher efficiency while still averaging 436 kPPD.
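The efficiency gain from the two settings above works out as follows (simple arithmetic on the quoted averages, not new measurements):

```python
# Efficiency (PPD per watt) at the two power limits quoted above.
settings = {120: 450_000, 110: 436_000}  # power limit (W) -> average PPD
for watts, ppd in settings.items():
    print(f"{watts} W: {ppd / watts:.0f} PPD/W")
```

So the 110 W limit yields roughly 3,960 PPD/W versus about 3,750 PPD/W at 120 W: a ~3% drop in output for an ~8% drop in power.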

I have now achieved the goal of my stats testing: my per-card efficiency (PPD/W) is plotted on a Zabbix server, which calculates the efficiency dynamically, so I'm running a baseline at:
    GTX 1060 110W
    GTX 1070 Ti 150W
    RTX 2060 140W
    RTX 2070 160W

and after a week I should have a pretty stable average efficiency then I can change the Power Limits and monitor the changes.
User avatar
gordonbb
 
Posts: 476
Joined: Mon May 21, 2018 5:12 pm
Location: Great White North

Re: RTX 2080ti vs 2060, which one to buy?

Postby Theodore » Tue Feb 19, 2019 3:31 pm

nvidia-settings doesn't offer overclocking on all cards, but on all of them it offers the options "auto", "balanced", and "performance".
Has anyone tested whether these settings affect performance or efficiency?
Mine are currently set to performance, but I might switch to balanced if that gives a slightly lower power draw at slightly lower performance.
I've played around with it, but didn't notice any immediate difference.
Theodore
 
Posts: 118
Joined: Sun Feb 10, 2019 3:07 pm

Re: RTX 2080ti vs 2060, which one to buy?

Postby foldy » Tue Feb 19, 2019 4:33 pm

"Performance" keeps the GPU clock and power usage high even under low 3D load, e.g. watching a YouTube video in the Chrome browser. Since FAH puts a high load on the GPU, the setting doesn't matter: you always get the highest GPU clock and power usage.
foldy
 
Posts: 1939
Joined: Sat Dec 01, 2012 4:43 pm

Re: RTX 2080ti vs 2060, which one to buy?

Postby HaloJones » Fri Feb 22, 2019 9:17 pm

Changing the power limit on Linux is as easy as:

nvidia-smi -i 0 -pl xxx

where 0 is the GPU ID and xxx is the power limit in W that you want.
1x Titan X, 5x 1070, 1x 970, 1 x Ryzen 3600

HaloJones
 
Posts: 816
Joined: Thu Jul 24, 2008 11:16 am

Re: RTX 2080ti vs 2060, which one to buy?

Postby MeeLee » Sat Feb 23, 2019 7:49 am

If it is any help,

Running two RTX 2060 cards costs you $700 for ~330 W of power.
One RTX 2080 Ti card costs you $1175 for ~225 W of power.
Both setups net you ~2.2M PPD.

On the 2060 cards, you can lower the power limit to 130 W each (260 W total) and still get 2M PPD.
That brings the performance per watt of the two setups much closer.
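The performance-per-watt comparison above can be checked with a few lines (arithmetic on the quoted figures, not new measurements):

```python
# PPD per watt for the three configurations quoted above.
configs = {
    "2x RTX 2060 (stock)":  (2_200_000, 330),
    "1x RTX 2080 Ti":       (2_200_000, 225),
    "2x RTX 2060 @ 130 W":  (2_000_000, 260),
}
for name, (ppd, watts) in configs.items():
    print(f"{name}: {ppd / watts:.0f} PPD/W")
```

At stock the 2080 Ti leads clearly (~9,780 vs. ~6,670 PPD/W), while power-limiting the 2060s closes part of the gap (~7,690 PPD/W) at a much lower purchase price.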
MeeLee
 
Posts: 931
Joined: Tue Feb 19, 2019 11:16 pm
