
In "The Talk Show" interview after WWDC [1], Mike Rockwell from Apple explained how they specifically engineered the Vision Pro to work well for watching movies on planes. Here's a copy from the YouTube-generated transcript:

[1] https://daringfireball.net/2023/06/the_talk_show_live_from_w...

---

...all these cameras to do that and um and the other thing is interesting and you may or may not have seen it in the in the keynote because we can also do this on a plane which uh when you for those who are you know in this kind of space like that's not an easy thing to do because planes kind of like turn and fly and move around and you know the IMU on the system how does it know what to do in that case because it doesn't have a fixed inertial space so there's some magic going on there but you can get that same level of stability when you use it on a plane which means that you know when you're at 40 000 feet and you know the baby's crying next to you and you really kind of want to be somewhere else you can be uh



As pointed out by others, both Microsoft and Meta have faced the same problem.

MS had to solve this first, for boat use in the IVAS military program, and partnered with Volkswagen [1]. Meta just announced a program last month [2] and is partnering with BMW.

So, like most of the Vision Pro, Apple is making technology choices with the same trade-offs as everyone else in the market. Karl, the TFA author, is an opinionated expert in those trade-offs; if anyone offended by his red-team style is calling him an Apple basher, go read his teardown of the Quest Pro.

But I think he might be missing the tightly coupled OS layer, built with an attention to detail not typical in the industry to date. In the same YouTube video quoted above, they mention (50:57) developing a whole new glyph rendering engine just to get better text. That and a hundred other fine-tunings are what I hope to see.

[1] https://mspoweruser.com/microsoft-and-volkswagen-collaborati...

[2] https://about.fb.com/news/2023/05/meta-bmw-ar-vr-experiences...


I agree. I think as an expert he might be blind to what Apple is capable of doing. He has already updated some of his comments based on new shit coming to light.


It's a very complicated field. You know, a lot of ins, a lot of outs, a lot of what-have-yous.


many moving pieces.


Wow, when written out like that it's really hard to follow what they're saying.


We utilize an array of cameras for stabilization, which you may have seen during our keynote presentation. Another fascinating aspect is our ability to replicate this process on a plane. For those familiar with the field, you'll understand that this is not a trivial task given the constant motion and changing orientations of an airplane.

The system's Inertial Measurement Unit (IMU) faces a challenge in these scenarios due to the absence of a fixed inertial space. However, we've developed some innovative solutions to ensure a similar level of stability, even when used on an airplane.

This means that, even at 40,000 feet, amidst distractions such as a crying baby nearby, our system can transport you to a more pleasant environment.

(courtesy of you-know-what)


Inertial guidance systems have been using visual references to verify gyroscopic readouts since the 1950s.

I see that quote as saying the Vision Pro cross-checks what the cameras see against what the IMU is reporting to arrive at a realistic interpretation.

Forgive me if that wasn't a question.
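The cross-checking idea above is often implemented as a complementary filter: integrate the fast-but-drifting gyro, and continuously nudge it toward a slower, drift-free camera-derived orientation. Here's a minimal single-axis sketch; the gain, function names, and numbers are purely illustrative assumptions, not anything Apple has described:

```python
# Hypothetical complementary filter sketch (single rotation axis, radians).
# Gyro integration is fast but drifts; a camera-derived angle is noisy/slow
# but drift-free. Blending the two keeps the estimate pinned to what the
# cameras see -- on a plane, that reference is the cabin, not the Earth.

ALPHA = 0.98  # illustrative gain: trust the gyro short-term, vision long-term

def fuse(prev_angle, gyro_rate, dt, vision_angle):
    """One filter step: integrate the gyro, then blend in the vision angle."""
    gyro_angle = prev_angle + gyro_rate * dt              # fast, drifts
    return ALPHA * gyro_angle + (1 - ALPHA) * vision_angle  # vision corrects

# Toy run: the gyro reports zero rotation while the cameras consistently
# see a 0.1 rad offset relative to the cabin; the estimate converges to
# the visual reference instead of drifting with the (silent) gyro.
angle = 0.0
for _ in range(1000):
    angle = fuse(angle, gyro_rate=0.0, dt=0.01, vision_angle=0.1)
```

Real systems (visual-inertial odometry) do this in 6 degrees of freedom with a Kalman-style estimator rather than a fixed gain, but the principle is the same: the cameras define the frame, the IMU fills in between frames.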


immediately recognizable as you-know-what


The thing that must not be named.


There’s literally no punctuation of any kind. Maybe it was generated.


> Here's a copy from the YouTube-generated transcript

...



