superkuh | 64 days ago | on: Apple M5 chip
I know it's only shared system RAM and not VRAM, but the M5's 150 GB/s isn't going to be very fast for AI inference. A fairly old RTX 3060 12GB does 360 GB/s. But I guess quantity is a quality all its own when it comes to RAM and inference.
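A rough back-of-envelope sketch of why bandwidth sets the ceiling here, assuming single-stream LLM decoding is memory-bandwidth-bound (each generated token streams the full weight set from memory once); the 8 GB model size is a hypothetical, and real throughput lands below this ceiling:

    # Upper bound on decode tokens/sec when each token must read all weights once:
    # bytes/sec of bandwidth divided by bytes read per token.
    def est_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
        return bandwidth_gb_s / model_size_gb

    # Hypothetical 8 GB quantized model on each memory system.
    for name, bw in [("M5 unified memory", 150.0), ("RTX 3060 12GB", 360.0)]:
        print(f"{name}: ~{est_tokens_per_sec(bw, 8.0):.0f} tok/s ceiling")

So the 3060 wins on speed (~45 vs. ~19 tok/s ceiling for this hypothetical model), but its 12 GB of VRAM caps which models fit at all, which is the quantity-over-quality tradeoff.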