
Foveated streaming is wild to me. Saccades are commonly as short as 20-30 ms when reading text, so guaranteeing that latency over 2.4 GHz seems Sisyphean.

I wonder if they have an ML model doing partial upscaling until the eye-tracking state is propagated and the full-resolution image under the new fovea position is available. It also makes me wonder if there's some way to do neural compression of the peripheral vision, optimized for a nice balance between peripheral quality and hints in the embedding that allow for nicer upscaling.



I worked on a foveated video streaming system for 3D video back in 2008, and we used eye tracking and extrapolated a pretty simple motion vector for eyes and ignored saccades entirely. It worked well: you really don't notice the lower detail in the periphery, and with a slightly oversized high-resolution focal area you can detect a change in gaze direction before the user's focus exits the high-resolution area.

Anyway that was ages ago and we did it with like three people, some duct tape and a GPU, so I expect that it should work really well on modern equipment if they've put the effort into it.
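Roughly, the "simple motion vector" approach can be sketched like this in Python (the names and the constant-velocity assumption are mine, not from the actual 2008 system):

```python
# Constant-velocity gaze extrapolation: given the last two gaze samples,
# predict where the fovea will be after `latency_s` seconds, so the
# high-res region can be centered ahead of the eye's motion.

def extrapolate_gaze(prev, curr, dt, latency_s):
    """prev, curr: (x, y) gaze points in screen pixels, sampled dt seconds apart."""
    vx = (curr[0] - prev[0]) / dt
    vy = (curr[1] - prev[1]) / dt
    return (curr[0] + vx * latency_s, curr[1] + vy * latency_s)

# Eye moving right 100 px per 10 ms sample; predict 30 ms ahead.
print(extrapolate_gaze((400, 300), (500, 300), 0.010, 0.030))
# -> (800.0, 300.0)
```

A saccade breaks the constant-velocity assumption, which is presumably why the oversized focal area matters: the buffer absorbs the prediction error.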


It is amazing how many inventions duct tape has found its way into.


Foveated rendering very clearly works well with a dedicated connection, with predictable latency. My question was more about the latency spikes inherent in an ISM general-use band combined with foveated rendering, which would make the effects of the latency spikes even worse.


They're doing it over 6GHz, if I understand correctly, which with a dedicated router gets you to a reasonable latency with reasonable quality even without foveated rendering (with e.g. a Quest 3).

With foveated rendering I expect this to be a breeze.


Even 5.8 GHz is getting congested. There's a dedicated router in this case (a USB fob), but you still have to share spectrum with the other devices. And at the 160 MHz symbol rate mode on Wi-Fi 6, you only have one channel in the 5.8 GHz spectrum that needs to be shared.


You're talking about "Wi-Fi 6" not "6 GHz Wi-Fi".

"6 GHz Wi-Fi" means Wi-Fi 6E (or newer) with a frequency range of 5.925–7.125 GHz, giving 7 non-overlapping 160 MHz channels (160 MHz being the channel bandwidth, which is not the same thing as the symbol rate). As another bonus, these frequencies penetrate walls even less than 5 GHz does.
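As a sanity check on that channel count (just arithmetic on the band edges quoted above, ignoring guard bands):

```python
# 6 GHz Wi-Fi spans 5925-7125 MHz, i.e. 1200 MHz of spectrum.
band_mhz = 7125 - 5925
print(band_mhz)          # 1200 MHz total
print(band_mhz // 160)   # 7 non-overlapping 160 MHz channels
```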

I live on the 3rd floor of a large apartment complex. 5 GHz Wi-Fi is so congested that I can get better performance on 2.4 in a rural area, especially accounting for DFS troubles in 5 GHz. 6 GHz is open enough I have a non-conflicting 160 MHz channel assigned to my AP (and has no DFS troubles).

Interestingly, the headset supports Wi-Fi 7 but the adapter only supports Wi-Fi 6E.


Not so much of an issue when the neighbors behind paper-thin walls see that 6 GHz network as a -87 dBm signal.

That said, in the US it is 1200 MHz, i.e. 5.925 GHz to 7.125 GHz.


The One Big Beautiful Bill fixed that. Now a large part of this spectrum will be sold out for non-WiFi use.


Different spectrum. They're grabbing old radar ranges.

They're also talking about adding more spectrum to the existing 6 GHz ISM band.



This is part of my job, dealing with spectrum and Washington.

I communicate with the FCC and NTIA fairly often at this point.

You need to pay attention to Arielle Roth, Assistant Secretary of Commerce for Communications and Information and Administrator of the National Telecommunications and Information Administration (NTIA).

https://policy.charter.com/2025-ntia-spectrum-policy-symposi...

From the article, about the November event:

"... administration’s investment in unlicensed access in 6 GHz ensures the benefits of the entire spectrum band are delivered directly to American families and businesses in the form of more innovation and faster and more reliable connectivity at home and on the go, which will continue to transform and deliver long-lasting impact for communities of all sizes across the country.

Charter applauds Administrator Roth's leadership, and her recognition of the critical role unlicensed spectrum plays today and in the future, both in the U.S. and across the globe."

---

Now here: https://www.ntia.gov/speech/testimony/2025/remarks-assistant...

"... To identify the remainder, NTIA plans to assess four targeted spectrum bands in the range set by Congress: 7125-7400 MHz; 1680-1695 MHz; 2700-2900 MHz; and 4400-4940 MHz."

"On the topic of on-the-ground realities, let’s also not forget what powers our networks today. While licensed spectrum is critical, the majority of mobile traffic is actually offloaded onto Wi-Fi. Born in America, led by America, Wi-Fi remains an area where we dominate, and we must continue to invest in this important technology. With Wi-Fi, the race has already been won. China knows it cannot compete and for that reason looks for ways to sabotage the very ingenuity that made Wi-Fi a global standard."

Roth is not going to take away 6 GHz from the current ISM allocation.


If Wi-Fi 6E goes upto 7125 and the targeted spectrum band includes 7125 (onwards), what will happen exactly at 7125?


The same thing that happens with every frequency range?

Depending on the spectrum and technology there can be a small slice of guard band between usable portions, which is what we have today.

Nothing there today as provisioned is going to change.


Oh goody! I hope some of it can be used for DRM encrypted TV broadcasts too.


I know you're attempting humor here, but I am not aware of anyone investing in broadcast tv.


I'm just amazed you can do bidirectional ATSC 3.0 with two PlutoSDRs, a minor investment to hack on.

https://www.reddit.com/r/sdr/comments/1ow80n5/help_needed_ho...


More of an issue when your phone's Wi-Fi, or your partner watching a show while you game, is eating into that one channel in bursts, particularly since the dedicated fob means it's essentially another network conflicting with the regular Wi-Fi rather than deeply collaborating for better real-time guarantees (not that arbitrary Wi-Fi routers would even support real-time scheduling).

MIMO helps here to separate the spectrum use by targeted physical location, but it's not perfect by any means.


IMO there is not much reason to use Wi-Fi 6 for almost anything else. I have a Wi-Fi 6 router set up for my Quest 3 for PC streaming, and everything else sits on its 5 GHz network. And since it doesn't really go through walls, I think this is a non-issue?

The Frame itself here is a good example, actually: using 6 GHz for video streaming and 5 GHz for Wi-Fi, on separate radios.

My main issue with the Quest in practice was that when I started moving my head quickly (which happens when playing faster-paced games) I would get lag spikes. I did some tuning on the bitrate / beam-forming / router positioning to get to an acceptable place, but I expect / hope that here the foveated streaming will solve these issues easily.


The thing is that I'd expect foveated rendering to worsen latency issues, not help them the way it does for bandwidth concerns. During a lag spike you're now looking at an extremely downsampled image, instead of the uniformly high-quality one you'd have had without foveation.

Now I also wonder if an ML model could help predict fovea location based on screen content and recent eye-tracking data. If the eyes are reading a paragraph, you have a pretty good idea where they're going to go next, for instance. That way a latency spike that delays eye-tracking updates can be hidden too.


My understanding is that the foveated rendering would reduce bandwidth requirements enough that latency spikes become effectively non-existent.

We’ll see in practice - so far all hands-on reviewers said the foveated rendering worked great, with one trying to break it (move eyes quickly left right up down from edge to edge) and not being able to - the foveated rendering always being faster.

I agree latency spikes would be really annoying if they end up being like you suggest.


Enough bandwidth to absorb any latency issues over a wireless connection is not really a thing for a low-latency use case like foveated rendering.

What do you do when another device on the main Wi-Fi network decides to eat 50 ms of airtime in the channel you use for the eye-tracking data return path?


I believe all communication with the dongle is on 6GHz - both the video and the return metadata.

So again, you just make sure the 6GHz band in the room is dedicated to the Frame and its dongle.

The 5GHz is for WiFi.


On the LTT video he also said that Valve had claimed to have tested with a small number of devices in the same room, but hadn’t tried out larger scenarios like tens of devices.

My guess based on that is you likely don't need to totally clear 6 GHz in the room the Frame is in, but rather just make sure it's relatively clear.

We'll know more once it ships and people can try it out and abuse the radio a bit.


Pretty funny to me that you're backseat engineering Valve on this one. If it didn't have a net benefit they wouldn't have announced it as a feature yet lmao


I'm not saying it doesn't work; I'm asking what special sauce they've added to make it work, and noting that despite the replies I've gotten, foveated streaming doesn't help latency, and in fact makes the effects of latency spikes worse.


Why are you assuming the fob would use the same WiFi channel as your regular 6GHz network? That would be extremely poor channel selection.


MU-MIMO is very nice.


The real trick is not overcomplicating things. The goal is to have high-fidelity rendering where the eye is currently focusing, so to solve for saccades you just build a small buffer area around the idealized minimum high-res center; the saccades will safely stay inside that area, within the ability of the system to react to the larger overall movements.

Picture demonstrating the large area that foveated rendering actually covers as high or mid res: https://www.reddit.com/r/oculus/comments/66nfap/made_a_pic_t...
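The buffer-area idea boils down to a simple containment test: only recenter the high-res region when the gaze actually leaves the padded zone. A minimal sketch (the radii below are made-up illustrative numbers, not Valve's):

```python
import math

def needs_recenter(gaze, center, fovea_radius, buffer_px):
    """True if gaze has left the high-res circle (fovea + buffer)."""
    return math.dist(gaze, center) > fovea_radius + buffer_px

center = (960, 540)  # current center of the high-res region, in pixels
print(needs_recenter((1000, 560), center, 200, 100))  # small saccade -> False
print(needs_recenter((1500, 540), center, 200, 100))  # large gaze jump -> True
```

Small saccades land inside the buffer and cost nothing; only the larger, slower gaze shifts force the streamed region to move.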


It was hard for me to believe as well, but streaming games wirelessly on a Quest 2 was totally possible and surprisingly latency-free once I upgraded to Wi-Fi 6 (a few years ago).

It works a lot better than you’d expect at face value.


At 100 fps (the middle of the framerate range), you need to deliver a new frame every 10 ms anyway, so a 20 ms saccade doesn't seem like it would be a problem. If you can't get new frames to users in 30 ms, blur will be the least of your problems: when they turn their head, they'll be on the floor vomiting.
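For context, the per-frame budgets at common VR refresh rates, next to a 20-30 ms saccade (plain arithmetic, no assumptions about the actual pipeline):

```python
# Frame interval in milliseconds at common VR refresh rates.
for fps in (72, 100, 144):
    print(fps, "fps ->", round(1000 / fps, 1), "ms per frame")
# 72 fps -> 13.9, 100 fps -> 10.0, 144 fps -> 6.9
```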


> Saccades are commonly as low as 20-30ms when reading text

What sort of resolution are one's eyes actually resolving during saccades? I seem to recall that there is at the very least a frequency-reduction mechanism in play during saccades.


During a saccade you are effectively blind: saccadic masking suppresses the visual input. The problem is measuring/predicting where the eye will aim next and getting a sharp enough image in place over there by the time the movement ends and the gaze stabilizes.


Yeah. I’d love to understand how they tackle saccades. To be fair, they do mention they’re on 6 GHz; not sure if they support 2.4, although I doubt the frequency of the data radio matters here.


I would guess that the “foveated” region that they stream is larger than the human fovea, large enough to contain the saccade movement (with some good-enough probability).


Saccades afaik can move the gaze to an arbitrary position, which adds to the latency of finding the iris; basically the software ends up having to search the entire image to reacquire the iris, whereas normally it’s doing it incrementally relative to the previous position.

Are you really sure overrendering the fovea region would work?


Not sure, probably depends on the content too. When you read text, the eye definitely isn’t jumping “arbitrarily”, it’s clustered around what you’re focusing on. Might be different for a FPS game where you’re looking out for ambushes.

I’m not sure what you mean by “look through the entire image to reacquire the iris”? You’re talking about the image from the eye tracking camera?


> You’re talking about the image from the eye tracking camera?

Yes. A normal trick is to search just a bit outside the last known position to make eye tracking cheap computationally and to reduce latency in the common case.
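That trick amounts to cropping a region of interest around the last known pupil position before running the (expensive) detector, with a full-frame fallback when tracking is lost. A sketch, with made-up sizes and function names rather than any real tracker's API:

```python
def search_window(last_pos, margin, frame_w, frame_h):
    """Clamped region of interest around the last known pupil position."""
    x, y = last_pos
    x0, y0 = max(0, x - margin), max(0, y - margin)
    x1, y1 = min(frame_w, x + margin), min(frame_h, y + margin)
    return (x0, y0, x1, y1)

# 640x480 eye camera, pupil last seen near (320, 240), search +/- 40 px.
print(search_window((320, 240), 40, 640, 480))  # (280, 200, 360, 280)
# After a large saccade the pupil may land outside this window, and the
# tracker falls back to scanning the whole (0, 0, 640, 480) frame.
```

That fallback is exactly the reacquisition cost the parent comment is describing.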


They use a 6 GHz dongle.



