Message boards : Graphics cards (GPUs) : 1660 Ti

Zalster (Joined: 26 Feb 14 | Posts: 211 | Credit: 4,496,324,562 | RAC: 0)
Message 51591 - Posted: 4 Mar 2019 | 1:04:54 UTC

Anyone running a 1660 Ti? Want to see how it does here and what your temps are like.


Thanks,

Zalster

PappaLitto (Joined: 21 Mar 16 | Posts: 511 | Credit: 4,617,042,755 | RAC: 0)
Message 51592 - Posted: 4 Mar 2019 | 2:16:39 UTC - in response to Message 51591.

> Anyone running a 1660 Ti? Want to see how it does here and what your temps are like.
>
> Thanks,
>
> Zalster

Pretty sure it's Turing architecture, so it won't run until the 2000 series is supported on GPUGrid.

Azmodes (Joined: 7 Jan 17 | Posts: 34 | Credit: 1,371,429,518 | RAC: 0)
Message 51596 - Posted: 5 Mar 2019 | 10:15:27 UTC
Last modified: 5 Mar 2019 | 10:24:12 UTC

Yes, it's Turing. You'll get immediate computation errors; there's no support for those cards yet, unfortunately.

For those interested in the 1660's compute performance, I posted some runtimes from mine over at PrimeGrid. FWIW, I remember temps were <70C most of the time.

AuxRx (Joined: 3 Jul 18 | Posts: 22 | Credit: 2,758,801 | RAC: 0)
Message 51597 - Posted: 5 Mar 2019 | 11:08:05 UTC - in response to Message 51596.

Thank you, Azmodes!

davidBAM (Joined: 17 Sep 18 | Posts: 11 | Credit: 695,185,729 | RAC: 0)
Message 51688 - Posted: 5 Apr 2019 | 9:56:19 UTC

Another 1660 Ti user eagerly awaiting Turing support on GPUGrid :-)

Jozef J (Joined: 7 Jun 12 | Posts: 112 | Credit: 1,118,845,172 | RAC: 0)
Message 51690 - Posted: 8 Apr 2019 | 21:30:07 UTC
Last modified: 8 Apr 2019 | 21:31:29 UTC

They will skip the Turing generation.
https://www.tweaktown.com/news/65178/nvidia-tease-next-gen-ampere-gpu-7nm-gtc-19/index.html
Ampere is nearly here.

tullio (Joined: 8 May 18 | Posts: 190 | Credit: 104,426,808 | RAC: 0)
Message 51694 - Posted: 11 Apr 2019 | 13:07:36 UTC

I just went from Maxwell to Pascal, using a VGA-to-DisplayPort link connector, and upgraded from a GTX 750 to a GTX 1050 Ti. The speed increase on GPUGRID was notable; now I am testing Einstein.
Tullio

Dingo (Joined: 1 Nov 07 | Posts: 20 | Credit: 122,646,317 | RAC: 0)
Message 51704 - Posted: 18 Apr 2019 | 9:05:16 UTC

I guess this error is because the GTX 1660 Ti can't run this project. If so, when can an update be expected? The card runs fine on a number of projects such as SETI, PrimeGrid, Amicable Numbers and Moo. As these new cards become more popular, can an update be expected soon? Here is the output:

Stderr output
<core_client_version>7.14.2</core_client_version>
<![CDATA[
<message>
(unknown error) - exit code -59 (0xffffffc5)</message>
<stderr_txt>
# GPU [GeForce GTX 1660 Ti] Platform [Windows] Rev [3212] VERSION [80]
# SWAN Device 0 :
# Name : GeForce GTX 1660 Ti
# ECC : Disabled
# Global mem : 6144MB
# Capability : 7.5
# PCI ID : 0000:01:00.0
# Device clock : 1830MHz
# Memory clock : 6001MHz
# Memory width : 192bit
# Driver version : r419_50 : 42531
#SWAN: FATAL: cannot find image for module [.nonbonded.cu.] for device version 750

</stderr_txt>
]]>


Jim1348 (Joined: 28 Jul 12 | Posts: 819 | Credit: 1,591,285,971 | RAC: 0)
Message 51705 - Posted: 18 Apr 2019 | 14:09:11 UTC - in response to Message 51704.

> I guess this is the error because GTX 1660 Ti can't run this project. If it is when can an update be expected. It can be run on a number of projects like SETI, Primegrid, Amicable Numbers and Moo. As these new cards are becoming more popular can an update be seen soon:

There have been a large number of unsent "longs" for some time, so I am putting an extra GTX 1060 and 1070 on it; there is usually not enough work for them. At some point, the project will need to get another CUDA expert, or whatever it takes.

PappaLitto (Joined: 21 Mar 16 | Posts: 511 | Credit: 4,617,042,755 | RAC: 0)
Message 51707 - Posted: 20 Apr 2019 | 12:40:50 UTC

Pretty sure the next GPU architecture will be out before they support Turing.

Keith Myers (Joined: 13 Dec 17 | Posts: 1284 | Credit: 4,919,131,959 | RAC: 6,253,368)
Message 51708 - Posted: 21 Apr 2019 | 0:06:08 UTC - in response to Message 51705.

> > I guess this is the error because GTX 1660 Ti can't run this project. If it is when can an update be expected. It can be run on a number of projects like SETI, Primegrid, Amicable Numbers and Moo. As these new cards are becoming more popular can an update be seen soon:
>
> There has been a large number of unsent "longs" for some time. I am putting an extra GTX 1060 and 1070 on it. There is usually not enough work for them. At some point, they will need to get another CUDA expert, or whatever it takes.

I'm no expert, but I think all it likely takes is updating the .nonbonded.cu module to support device version 750 (compute capability 7.5).

I needed to update a device list in a C module for my temp sensor driver because it only had models listed for the older generation of AMD processors. I just added the latest second generation Threadripper models to the list and it compiled fine with the ability to recognize the new processors.

I think the same thing is going on here: that .nonbonded.cu file probably just needs a revised list of known device versions.

Erich56 (Joined: 1 Jan 15 | Posts: 1090 | Credit: 6,603,906,926 | RAC: 21,893,126)
Message 51714 - Posted: 22 Apr 2019 | 6:03:06 UTC

It would be great if someone from the GPUGRID team could tell us what their plans are. Is Turing support coming soon, or are they skipping Turing and heading straight for Ampere support?

I guess for some crunchers here (myself included) such information would be valuable when planning replacements for older cards.

ExtraTerrestrial Apes (Volunteer moderator and tester | Joined: 17 Aug 08 | Posts: 2705 | Credit: 1,311,122,549 | RAC: 0)
Message 51728 - Posted: 26 Apr 2019 | 21:11:30 UTC - in response to Message 51714.

As far as I know that's the status: link

MrS
____________
Scanning for our furry friends since Jan 2002

Erich56 (Joined: 1 Jan 15 | Posts: 1090 | Credit: 6,603,906,926 | RAC: 21,893,126)
Message 51730 - Posted: 28 Apr 2019 | 15:54:24 UTC - in response to Message 51728.

> As far as I know that's the status: link

Thanks for the link, although the schedule and feasibility seem rather vague :-(

Since four months have passed since that statement from the end of December 2018, perhaps GDF could give us an update?

mmonnin (Joined: 2 Jul 16 | Posts: 332 | Credit: 3,772,896,065 | RAC: 4,765,302)
Message 51731 - Posted: 28 Apr 2019 | 22:46:09 UTC

I read that post as being about supporting the new RTX features, as opposed to supporting the same CUDA functions on newer cards; only the new features would require fresh coding. I would think a Turing card could still run the same compute code, if it were recognized.

koschi (Joined: 14 Aug 08 | Posts: 124 | Credit: 466,579,198 | RAC: 362,756)
Message 51734 - Posted: 1 May 2019 | 0:15:55 UTC

Yes, supporting the current app on Turing should be rather low effort; making use of exclusive Turing features or newer CUDA levels would require more work.

I'd really like to upgrade my GTX 1060 to an RTX 2060, but knowing it's unsupported by GPUGrid, I'm not going to do it. It's a pity for the project itself; it could tap into so much more compute power by supporting Turing. :-(
