
> If you're interested in video, you might be interested to know that AV2 is in development.

Oh, interesting to know! What would be the differences between AV1 and AV2?

Found a website (https://www.geekextreme.com/av1-vs-av2-video-codec/) that gave me some interesting claims:

- AV2 delivers 30% better compression efficiency than AV1, which already compresses 30% better than HEVC (H.265).

- AV2 encoding demands 2-3 times more computational power than AV1, requiring advanced hardware like an RTX 5090 for practical use.

- AV2 will officially release by the end of 2025, with widespread hardware support expected around 2027 or later.

- AV2 introduces advanced features like split-screen delivery, enhanced AR/VR support, and dynamic bitrate switching for adaptive streaming.

- 88% of AOMedia members plan to implement AV2 within two years, despite infrastructure and hardware compatibility challenges.
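Taking those figures at face value (and assuming "30% better compression" means 30% lower bitrate at the same quality, which the article doesn't define), the two 30% gains compound, so AV2 would need roughly half the bitrate of HEVC:

```python
# Back-of-envelope math from the quoted "30% better" figures.
# Assumption (mine, not the article's): "30% better compression"
# means 30% lower bitrate at equal visual quality.

hevc_bitrate = 1.0                       # normalize HEVC bitrate to 1.0
av1_bitrate = hevc_bitrate * (1 - 0.30)  # AV1: 30% smaller than HEVC -> 0.70
av2_bitrate = av1_bitrate * (1 - 0.30)   # AV2: 30% smaller than AV1 -> 0.49

savings_vs_hevc = 1 - av2_bitrate        # ~51% lower bitrate than HEVC

print(f"AV1 vs HEVC: {av1_bitrate:.2f}x bitrate")
print(f"AV2 vs HEVC: {av2_bitrate:.2f}x bitrate ({savings_vs_hevc:.0%} savings)")
```

So "30% better than AV1" translates to roughly 51% savings against HEVC, not 60%, because the percentages multiply rather than add.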

If there are any other differences, let me know too. Honestly, I'm a bit curious that it mentions requiring an RTX 5090.

Wouldn't this be a little bad for the market too? Sure, it compresses 30% more, but not everybody has an RTX 5090.

Are we gonna see multi-codec setups in things like, say, Netflix, where devices that don't support AV2 get sent AV1, but AV2 is preferred when the hardware supports it?



Just in case you missed it, your quote was referring to encoding requirements. Decoding (e.g. Netflix users) will have a different set of requirements. The situation will also improve over time as dedicated hardware encoders and decoders become available.

For the moment, I don't really mind if it requires more GPU power to encode media, since it only needs to happen once. I expect it will still be possible on a weaker card, but it would just take longer.



