AV2 video codec delivers 30% lower bitrate than AV1, final spec due in late 2025
How AV2 Achieves Its Gains
- Commenters are impressed by another ~30% bitrate reduction over AV1 and discuss how the gain mostly comes from new coding tools, not magic.
- One example: more flexible block (“superblock”) partitioning and larger maximum block sizes better match the actual motion and detail in a frame and reduce the overhead of describing block shapes (see the partitioning sketch after this list).
- Modern codecs add many more prediction modes (intra, inter, global/warped motion, chroma-from-luma, etc.), all of which expand the encoder’s search space.
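A minimal, codec-agnostic sketch of the partitioning idea, assuming a toy DC predictor and made-up rate costs (this is not AV2’s actual partition logic): a quadtree recursion keeps a block whole or splits it four ways, whichever has the lower rate-distortion cost.

```python
import numpy as np

def rd_cost(block: np.ndarray, lam: float) -> float:
    """Toy rate-distortion cost: distortion of a flat (DC) prediction
    plus a fixed per-block signalling cost. Purely illustrative numbers."""
    distortion = float(np.sum((block - block.mean()) ** 2))
    rate = 32.0  # pretend every coded block costs ~32 bits of overhead
    return distortion + lam * rate

def partition(block: np.ndarray, min_size: int = 8, lam: float = 10.0):
    """Recursively decide whether to code `block` whole or split it into
    four quadrants, keeping whichever choice has the lower total cost."""
    whole_cost = rd_cost(block, lam)
    h, w = block.shape
    if h <= min_size or w <= min_size:
        return {"size": (h, w), "split": None}, whole_cost

    hh, hw = h // 2, w // 2
    children, split_cost = [], lam * 8.0  # extra bits to signal the split
    for i in (0, 1):
        for j in (0, 1):
            sub = block[i * hh:(i + 1) * hh, j * hw:(j + 1) * hw]
            node, cost = partition(sub, min_size, lam)
            children.append(node)
            split_cost += cost

    if split_cost < whole_cost:
        return {"size": (h, w), "split": children}, split_cost
    return {"size": (h, w), "split": None}, whole_cost

# Left half: flat, so it stays as large blocks. Right half: a brightness
# ramp, which keeps splitting because DC prediction fits small slices better.
frame = np.zeros((128, 128))
frame[:, 64:] = np.tile(np.linspace(0, 255, 64), (128, 1))
tree, total_cost = partition(frame)
```

The same trade-off (more partition shapes and more prediction modes mean a bigger search space but a better fit) is why encoder complexity grows much faster than decoder complexity.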
Compute Cost, Encoding vs Decoding, and Hardware
- Several note complexity is highly asymmetric: encoding gets much harder; decoding is comparatively cheap but still needs hardware acceleration on mobiles/TVs.
- AV2 work reportedly included “rigorous scrutiny” of hardware complexity with input from chip vendors, raising hopes for faster hardware support than AV1.
- Others worry about device obsolescence and power use; some older laptops already struggle with newer codecs in software.
- There’s debate over whether newer codecs actually increase end-user power usage: some argue AV1 hits a “sweet spot” where better compression offsets extra compute.
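The sweet-spot argument is back-of-the-envelope energy accounting. A rough sketch with openly invented power and throughput figures (nothing here is measured from real hardware or a real codec) shows why the answer hinges on whether decode happens in hardware:

```python
# Back-of-the-envelope energy comparison for a 1-hour stream.
# ALL power/bitrate figures below are illustrative assumptions,
# not measurements of any real device or codec.

SECONDS = 3600

def stream_energy_j(bitrate_mbps: float, decode_w: float,
                    radio_w: float = 1.2, radio_throughput_mbps: float = 50.0) -> float:
    """Joules spent decoding plus joules spent keeping the radio awake
    long enough to pull the bits down."""
    decode_j = decode_w * SECONDS
    radio_seconds = (bitrate_mbps * SECONDS) / radio_throughput_mbps
    radio_j = radio_w * radio_seconds
    return decode_j + radio_j

older_codec = stream_energy_j(bitrate_mbps=8.0, decode_w=0.8)   # cheap decode, fat stream
newer_codec = stream_energy_j(bitrate_mbps=5.5, decode_w=1.1)   # pricier decode, smaller stream

print(f"older codec: {older_codec/1000:.1f} kJ, newer codec: {newer_codec/1000:.1f} kJ")
# Which codec "wins" depends entirely on these numbers -- which is why
# hardware decode (a low decode_w) is what really settles the debate.
```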
Patents and IP
- The thread notes that many foundational video-compression patents (e.g., on older transforms) have expired, reducing risk, but patent trolls and litigation around AV1 remain a concern.
- Some argue the trend toward more “fitted” codec designs reduces overlap with legacy MPEG patents; others see software/compression patents as harmful and counterproductive.
Limits of Compression & Neural / Generative Codecs
- Multiple comments speculate we’re approaching a point where further gains require synthesis (hallucinating details), as is already common in phone cameras and AI upscalers.
- Some mention experimental neural codecs and model-based audio (e.g., sending text/parameters plus a local generative model) and extrapolate to faces, scenes, or even entire movies personalized on-device.
- Others are wary, citing JBIG2-style failures where pattern-matching compression silently substituted wrong digits in scanned documents, and raising artistic concerns if grain/noise and other “imperfections” are regenerated client-side.
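The JBIG2 worry refers to lossy symbol-dictionary coding: if two glyphs differ by less than the matcher’s threshold, one silently replaces the other. A toy illustration of that failure mode, with made-up 3×3 “glyphs” and an arbitrary threshold:

```python
import numpy as np

# Toy 3x3 "glyphs". The sloppy 6 and the 8 differ in a single pixel, so a
# lossy pattern matcher with too loose a threshold treats them as the same
# symbol -- the JBIG2-style failure where digits change silently.
GLYPHS = {
    "6": np.array([[1, 1, 1],
                   [1, 1, 0],
                   [1, 1, 1]]),
    "8": np.array([[1, 1, 1],
                   [1, 1, 1],
                   [1, 1, 1]]),
}

def compress(symbols, threshold=1):
    """Symbol-dictionary compression: reuse an existing dictionary entry
    whenever it differs from the new glyph by <= `threshold` pixels."""
    dictionary, indices = [], []
    for name in symbols:
        glyph = GLYPHS[name]
        for i, entry in enumerate(dictionary):
            if int(np.sum(entry != glyph)) <= threshold:
                indices.append(i)          # reuse: may be the WRONG digit
                break
        else:
            dictionary.append(glyph)
            indices.append(len(dictionary) - 1)
    return dictionary, indices

page = ["8", "6", "8"]                     # original digits on the "scan"
dictionary, indices = compress(page)
print("dictionary entries:", len(dictionary))   # 1 -- the 6 was folded into the 8
```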
Streaming Quality and Over-Compression
- A long subthread complains that major streaming services still over-compress, especially dark scenes and smooth gradients (the banding effect is sketched after this list), even on high-end 4K setups and gigabit links.
- Economic incentives push services to cut bitrate; better codecs often get “spent” on lower costs rather than visibly higher quality.
- Some point to OTA broadcast and Blu-ray as still delivering superior image quality; piracy and high-end niche systems are mentioned as ways to escape over-compressed streams.
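The dark-scene complaint is largely about banding: starved bitrates quantize smooth ramps into a handful of flat steps. A minimal numpy sketch, where the quantization step sizes are arbitrary stand-ins for bitrate starvation:

```python
import numpy as np

# A smooth dark-to-darker gradient, 0..32 out of 255 (a shadowy scene).
gradient = np.linspace(0, 32, 1920)

def quantize(signal: np.ndarray, step: float) -> np.ndarray:
    """Uniform quantization, a crude stand-in for a starved bitrate."""
    return np.round(signal / step) * step

light_compression = quantize(gradient, step=1)   # steps stay sub-visible
heavy_compression = quantize(gradient, step=8)   # only ~5 distinct levels left

print("levels kept (light):", len(np.unique(light_compression)))
print("levels kept (heavy):", len(np.unique(heavy_compression)))
# The heavy case is where viewers see banding: a few flat bands where the
# source had a continuous ramp, worst in dark, low-texture areas.
```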
Containers, Extensions, and Adoption Friction
- There’s confusion about AV1/AV2 as codecs vs. containers; raw elementary streams might use extensions like .av1/.av2, but most content will remain in MKV/MP4/WebM containers carrying a codec identifier (see the probing sketch after this list).
- Rapid codec iteration without backward-compatible hardware acceleration forces services to store multiple encodings or fall back to CPU decode, which slows adoption and can hurt batteries.
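A quick way to see the codec-vs-container distinction in practice: the container (MKV/MP4/WebM) is only the wrapper, and probing the stream inside reports the actual codec. A sketch assuming ffprobe is installed; today an AV1 stream reports as "av1", and an AV2 stream would need its own identifier once tooling supports it.

```python
import subprocess

def video_codec(path: str) -> str:
    """Ask ffprobe for the codec of the first video stream inside a
    container (MKV/MP4/WebM/...). The file extension names the container;
    only the stream metadata tells you which codec is actually inside."""
    out = subprocess.run(
        ["ffprobe", "-v", "error",
         "-select_streams", "v:0",
         "-show_entries", "stream=codec_name",
         "-of", "default=noprint_wrappers=1:nokey=1",
         path],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

# e.g. video_codec("movie.mkv") -> "av1" for an AV1 stream in Matroska;
# the same .mkv extension could just as well hold H.264, HEVC, or VP9.
```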