
Good point - I probably should have shown the direct interaction too (touching the actual fabric in space) - there's a gif here: https://jmp.sh/s/lHqJm6NEvqMqkXMPyUpZ. It was a little bit laggy while also screen recording on device.

In the video I am looking at where I want to interact and then using a pinch gesture; the sounds are mapped to different cells of the cloth. By either looking and tapping, or playing directly, you can hopefully play your intended sound.
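To illustrate the "cells mapped to sounds" idea, here's a minimal sketch of how a cloth grid cell might be mapped to a pitch. All names and choices here (`cellToMIDINote`, the pentatonic scale, the grid layout) are illustrative assumptions, not taken from the actual app:

```swift
// Hypothetical mapping from a cloth cell (row, col) to a MIDI note.
// Assumption: cells are laid out row-major and pitches climb a
// pentatonic scale, one octave per full cycle through the scale.
let pentatonic = [0, 2, 4, 7, 9]  // scale degrees in semitones

func cellToMIDINote(row: Int, col: Int, cols: Int, base: Int = 60) -> Int {
    let index = row * cols + col             // flatten the grid
    let octave = index / pentatonic.count    // one octave per scale cycle
    let degree = index % pentatonic.count
    return base + 12 * octave + pentatonic[degree]
}
```

Tapping cell (0, 0) would then play middle C (`60`), and notes rise as you move across and down the fabric.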



So it looks for gesture events and maps them to locations where your eyes are looking rather than where the gesture was done?

Is that how most Vision Pro interfaces work?


Yes, that's basically it. If the gesture is targeted at something, either by looking at it or by touching it directly in space, then it responds. Direct touch takes priority, so I think you could look away but still touch in space.
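For anyone curious how this looks in code: on visionOS, SwiftUI gestures targeted at RealityKit entities already combine gaze and pinch for you - the system delivers the tap to whichever entity the user was looking at when they pinched. A minimal sketch (entity names and the sound-playing call are placeholder assumptions):

```swift
// Sketch of gaze-targeted pinch handling on visionOS.
// SpatialTapGesture + targetedToAnyEntity() resolves the pinch to the
// entity under the user's gaze (or the one touched directly in space).
import SwiftUI
import RealityKit

struct ClothView: View {
    var body: some View {
        RealityView { content in
            // Load/attach the cloth entity here (placeholder).
        }
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // value.entity is the entity the gesture was targeted at.
                    playSound(for: value.entity)  // hypothetical helper
                }
        )
    }

    func playSound(for entity: Entity) {
        // Map the entity (e.g. a cloth cell) to its sound (placeholder).
    }
}
```

The app itself only sees the resolved target, which is why a gesture "lands" wherever the eyes are looking rather than where the hand is.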



