
Typically eye trackers work by illuminating the eyes with near-infrared light and imaging them with infrared cameras. This creates a higher-contrast image of the pupils and other eye features. I assume Apple is doing this in the Vision Pro. Eye tracking can also be done with just visible light, though. Apple has the benefit of knowing where all the user interface elements are on screen, so eye tracking for this feature on the iPhone or iPad doesn't need to be high precision: knowing the positions of the items helps reduce the uncertainty about what is being fixated on.
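That layout prior is easy to sketch. Below is a hypothetical Swift snippet (none of these names are Apple API, and the noise scale is made up) that snaps a noisy gaze estimate to the most plausible on-screen element by scoring each one with a Gaussian likelihood centered on the gaze point:

```swift
import Foundation

// Hypothetical target type: a labeled on-screen rectangle.
struct UITarget {
    let label: String
    let frame: CGRect
}

/// Pick the target that best explains a noisy gaze estimate,
/// assuming Gaussian tracker noise with scale `sigma` (in points).
func mostLikelyTarget(gaze: CGPoint,
                      targets: [UITarget],
                      sigma: Double = 40) -> UITarget? {
    func likelihood(_ t: UITarget) -> Double {
        // Distance from the gaze point to the nearest edge of the
        // target's frame (zero if the gaze already falls inside it).
        let dx = max(t.frame.minX - gaze.x, 0, gaze.x - t.frame.maxX)
        let dy = max(t.frame.minY - gaze.y, 0, gaze.y - t.frame.maxY)
        let d2 = Double(dx * dx + dy * dy)
        // Gaussian likelihood: nearby targets score close to 1.
        return exp(-d2 / (2 * sigma * sigma))
    }
    return targets.max(by: { likelihood($0) < likelihood($1) })
}

let targets = [
    UITarget(label: "Back",  frame: CGRect(x: 20,  y: 40, width: 80, height: 44)),
    UITarget(label: "Share", frame: CGRect(x: 300, y: 40, width: 80, height: 44)),
]
// A gaze estimate 25 points right of "Back" still resolves to it,
// because no other target is anywhere near.
if let hit = mostLikelyTarget(gaze: CGPoint(x: 125, y: 60), targets: targets) {
    print(hit.label)  // "Back"
}
```

The point is that the tracker only has to be accurate relative to the spacing of the targets, not to the pixel.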


So there isn't much more to it than getting a good-resolution image of the eye, from whatever distance the device sits at, every few milliseconds.

Precision is the issue. We mostly move our eyes in about eight directions, and there's no finer signal because we don't yet know how to measure the focusing of the eye's lens with a camera (unless that, too, is just a matter of getting a good enough picture).

Squinting would be the closest thing to physically expressing focus. So the camera would need to see that I'm looking left, followed by a squint, to achieve precision. Seems stressful, though.

Gonna need AI just to noise-cancel the involuntary things your eyes do, like pupil dilation and blinking.
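For blinks and jitter, at least, classic filtering goes a long way before any "AI" is needed. A hedged Swift sketch (the field names and thresholds are invented for illustration): drop frames where the eye is mostly closed, and run an exponential moving average over the rest so momentary noise doesn't move the fixation estimate:

```swift
import Foundation

// Hypothetical per-frame gaze sample from the tracker.
struct GazeSample {
    let x: Double
    let y: Double
    let eyeOpenness: Double  // 0 = closed, 1 = fully open (assumed signal)
}

/// Reject blink frames, then apply an exponential moving average.
func denoise(_ samples: [GazeSample],
             blinkThreshold: Double = 0.3,
             alpha: Double = 0.2) -> [(x: Double, y: Double)] {
    var smoothed: [(x: Double, y: Double)] = []
    var state: (x: Double, y: Double)? = nil
    for s in samples {
        // During a blink, hold the last good estimate instead of
        // tracking the eyelid-distorted pupil image.
        guard s.eyeOpenness >= blinkThreshold else {
            if let held = state { smoothed.append(held) }
            continue
        }
        if let prev = state {
            state = (prev.x + alpha * (s.x - prev.x),
                     prev.y + alpha * (s.y - prev.y))
        } else {
            state = (s.x, s.y)
        }
        smoothed.append(state!)
    }
    return smoothed
}
```

Pupil dilation is harder than this, though: the pupil's apparent center shifts slightly as it dilates, which is a bias a smoothing filter alone won't remove.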



