Hacker News

This seems like a major issue, given that more and more machine learning inference is being done on-device and Android is trying to standardize on a framework for on-device ML inference.

That’s assuming the Pixel Visual Core is a generic tensor processing unit similar to what the iPhone has. It’s possible I’ve got some things confused here. The important question is: does the 3a have a hardware-accelerated ML inference chip, or does it just have the GPU?

No and no. The 3a doesn’t contain a dedicated hardware-accelerated ML inference chip like the Pixel Visual Core, but the Qualcomm Snapdragon 670 chipset it uses includes the Hexagon 685 DSP, which provides vector-processing acceleration for ML workloads, so it’s also not just the GPU.
