Hacker News

> I'm dreading Nvidia crippling deep learning for all those not paying X more dollars (as is already the situation for server farms).

This already changed years ago, when NVIDIA removed the last GPUs with uncrippled double- and half-precision performance from its consumer lineup. The cheapest GPU you can buy for ML now is the Titan V, which was $3,000 at launch.



I'm only planning at this point, so I don't know, but I'm very interested. I've seen the RTX 3080 reviewed as the most cost-effective chip you can get for deep learning. I also have the impression that a lot of research is moving to lower precision.

https://timdettmers.com/2020/09/07/which-gpu-for-deep-learni...


They have been going the other way recently: Titan V ($3,000) -> Titan RTX ($2,500) -> RTX 3090 ($1,500). The 3090 beats the Titan RTX in double precision and is close to 2x faster in single precision.


Scientific computing needs double precision, but ML doesn't, right?
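Roughly, yes. FP16 keeps only 10 mantissa bits (about 3 decimal digits), which is generally tolerable for network weights and activations but not for accumulating scientific results. A minimal sketch of the precision loss, using Python's stdlib `struct` with the `'e'` (IEEE 754 half-precision) format; the helper name `to_fp16` is my own:

```python
import struct

def to_fp16(x: float) -> float:
    """Round-trip a Python float (double) through IEEE 754 half precision."""
    return struct.unpack('e', struct.pack('e', x))[0]

# Near 1.0, fp16 values are spaced 2**-10 apart (~0.001),
# so small increments are silently rounded away:
print(to_fp16(1.0001))   # rounds back to 1.0

# 0.1 is not exactly representable; the nearest fp16 value
# differs from 0.1 by ~2.4e-5 (vs ~5.6e-18 for double):
print(to_fp16(0.1))

# The largest finite fp16 value is 65504 -- anything bigger overflows:
print(to_fp16(65504.0))
```

Gradient descent tolerates this kind of rounding noise (and mixed-precision training keeps an FP32 master copy of the weights anyway), whereas a long scientific accumulation would drift badly.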



