Tesla V100-SXM2-16GB

Post requests to add new GPUs to the official whitelist here.

Moderators: Site Moderators, PandeGroup

Re: Tesla V100-SXM2-16GB

Postby csvanefalk » Fri Oct 27, 2017 4:05 pm

foldy wrote:Oh my god it's Nvidia Volta: 5120 shaders, 15 TFlops, ~1700k PPD


That's really impressive. My 1080 Ti maxes out at 1.5M PPD.
csvanefalk
 
Posts: 172
Joined: Mon May 21, 2012 10:28 am

Re: Tesla V100-SXM2-16GB

Postby Luscious » Tue Nov 07, 2017 3:54 am

Thinkmate is selling V100 rackmount systems for purchase right now, including a 4U 2P 10 GPU variant. 17 million PPD out of a single box :eo :eo :eo That's more than what most TEAMS make.

http://www.thinkmate.com/system/gpx-xt24-24s1-10gpu
Luscious
 
Posts: 27
Joined: Sat Oct 13, 2012 6:38 am

Re: Tesla V100-SXM2-16GB

Postby 84036980 » Thu Nov 09, 2017 10:42 pm

FAH should work, but I'm getting errors when I add it to the GPU list file manually.
I just want it to be officially supported ASAP.


FAHBench works, for your reference:

Loading plugins from plugin directory
Number of registered plugins: 3
Deserializing input files: system
Deserializing input files: state
Deserializing input files: integrator
Creating context (may take several minutes)
Checking accuracy against reference code
Creating reference context (may take several minutes)
Comparing forces and energy
Starting Benchmark

Benchmarking finished
Final score: 230.1101
Scaled score: 230.1101 (23558 atoms)
84036980
 
Posts: 16
Joined: Fri Feb 06, 2015 7:18 pm

Re: Tesla V100-SXM2-16GB

Postby bruce » Thu Nov 09, 2017 11:13 pm

You cannot add to GPUs.txt manually -- the server's copy must match.

Run fahclient --lspci or obtain the lspci identifiers elsewhere and post them here.
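The IDs bruce is asking for can be pulled from the numeric `[vendor:device]` pair that `lspci -nn` prints at the end of each line. A minimal Python sketch of that extraction; the sample line is illustrative only, not captured from a real V100 host:

```python
import re

def extract_ids(lspci_line):
    """Return the (vendor, device) hex ID pair from one `lspci -nn` line."""
    # `lspci -nn` ends each device line with "[vendor:device]" in hex.
    m = re.search(r"\[([0-9a-f]{4}):([0-9a-f]{4})\]\s*$", lspci_line)
    return (m.group(1), m.group(2)) if m else None

# Hypothetical sample line for a V100 SXM2 (NVIDIA vendor ID is 10de).
sample = "01:00.0 3D controller [0302]: NVIDIA Corporation GV100 [10de:1db1]"
print(extract_ids(sample))  # ('10de', '1db1')
```

The device ID (here `1db1`) is what distinguishes the SXM2 and PCIe variants discussed later in the thread.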

At the present time, FAH uses OpenCL, so you're going to be limited to what can be done with OpenCL. OpenMM is not written to support tensor math so performance is going to be reduced to whatever can be done with the CUDA cores. My guess is that FAH won't load up that many CUDA cores simultaneously, either.

What version of CUDA is installed and what version of OpenCL is supported?

Please describe your hardware.
bruce
 
Posts: 21534
Joined: Thu Nov 29, 2007 10:13 pm
Location: So. Cal.

Re: Tesla V100-SXM2-16GB

Postby foldy » Fri Nov 10, 2017 7:47 am

@Luscious: Price for the rack with 10 NVIDIA Teslas: only $100,000
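Putting foldy's price together with the 17M PPD figure from Luscious's post gives a rough hardware cost per unit of folding output. A back-of-envelope sketch using only the numbers in this thread (ignoring power and hosting):

```python
# ~$100,000 for the 10x V100 rack, reported earlier at ~17M PPD.
rack_price_usd = 100_000
rack_ppd = 17_000_000

# Hardware cost per 1M PPD of capacity.
usd_per_million_ppd = rack_price_usd / (rack_ppd / 1_000_000)
print(round(usd_per_million_ppd, 2))  # ~5882.35 USD per 1M PPD
```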
foldy
 
Posts: 1148
Joined: Sat Dec 01, 2012 3:43 pm

Re: Tesla V100-SXM2-16GB

Postby toTOW » Sun Nov 12, 2017 1:21 pm

I added 0x1db1 / GV100 [Tesla V100 SXM2] and 0x1db4 / GV100 [Tesla V100 PCIe] to the GPUs.txt file ... let us know if something goes wrong ...
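Conceptually, the whitelist maps PCI device IDs to GPU descriptions. A minimal sketch of that lookup using the two entries toTOW added; this illustrates the matching idea only and is not the real GPUs.txt file format:

```python
# The two Volta device IDs whitelisted in this thread (NVIDIA vendor 0x10de).
VOLTA_WHITELIST = {
    0x1DB1: "GV100 [Tesla V100 SXM2]",
    0x1DB4: "GV100 [Tesla V100 PCIe]",
}

def describe(device_id):
    """Return the whitelist description for a device ID, or None if absent."""
    return VOLTA_WHITELIST.get(device_id)

print(describe(0x1DB1))  # GV100 [Tesla V100 SXM2]
print(describe(0x1DB5))  # None -> not yet whitelisted
```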
Folding@Home beta tester since 2002. Folding Forum moderator since July 2008.

FAH-Addict : latest news, tests and reviews about Folding@Home project.

toTOW
Site Moderator
 
Posts: 8454
Joined: Sun Dec 02, 2007 10:38 am
Location: Bordeaux, France

Re: Tesla V100-SXM2-16GB

Postby 84036980 » Sun Nov 12, 2017 8:45 pm

It's working now :)

Thank you, guys.
84036980
 
Posts: 16
Joined: Fri Feb 06, 2015 7:18 pm

Re: Tesla V100-SXM2-16GB

Postby foldy » Mon Nov 13, 2017 11:37 am

Can you post some PPD numbers? I guess current work units are too small for a Tesla, so you may not get more than 1000k PPD at the moment.
foldy
 
Posts: 1148
Joined: Sat Dec 01, 2012 3:43 pm

Re: Tesla V100-SXM2-16GB

Postby 84036980 » Mon Nov 13, 2017 5:56 pm

84036980
 
Posts: 16
Joined: Fri Feb 06, 2015 7:18 pm

Re: Tesla V100-SXM2-16GB

Postby icemanncsu » Thu Jun 28, 2018 9:31 pm

Burning off some AWS EC2 credit since it's the end of the month; it would have expired otherwise :) . Right now AWS EC2 spot instances in USE1 are $7.80/hour, while on-demand is normally $24.

Forgot to mention this is a single p3.16xlarge instance.

57 CPUs at 2.7 GHz & 8 x Tesla V100 16GB

15.5M PPD

Full size image here -> https://ibb.co/dTWduo
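The spot price and PPD in this post work out to a points-per-dollar figure for renting rather than buying. A back-of-envelope sketch; the prices are the poster's June 2018 snapshot, not current AWS rates:

```python
# p3.16xlarge numbers from this post: $7.80/hour spot, ~15.5M PPD.
spot_usd_per_hour = 7.80
ppd = 15_500_000

usd_per_day = spot_usd_per_hour * 24   # cost of 24h of spot time
points_per_usd = ppd / usd_per_day
print(round(points_per_usd))           # roughly 83k points per dollar
```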
icemanncsu
 
Posts: 1
Joined: Thu Jun 28, 2018 9:25 pm

Re: Tesla V100-SXM2-16GB

Postby foldy » Sat Jun 30, 2018 4:03 pm

That's some folding power! Be sure to have enough CPUs left to feed the GPUs.
foldy
 
Posts: 1148
Joined: Sat Dec 01, 2012 3:43 pm
