AV1 Is Supposed to Make Streaming Better, So Why Isn’t Everyone Using It?
Despite its promise of more efficient streaming, the AV1 video codec has seen only slow adoption since its 2018 release, even with backing from industry giants like Netflix, Microsoft, Google, Amazon, and Meta. The Alliance for Open Media, which developed the codec, claims AV1 offers up to 30% better compression efficiency than older codecs such as HEVC: a stream that needs 5 Mbps with an older codec could, in principle, deliver similar quality at roughly 3.5 Mbps in AV1. It is also royalty-free.
While platforms such as YouTube and Netflix have begun using AV1 (Netflix reportedly encodes around 95% of its catalog with it), many others have held back because of hardware limitations. End users need devices with AV1 decoders, and although newer chips from Apple, Nvidia, AMD, and Intel include hardware decoding, support is far from universal.
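Whether a given device can actually decode AV1 is something a web player can probe at runtime. Here is a minimal sketch using the browser's standard MediaCapabilities API; the codec string, resolution, and bitrate are illustrative assumptions, not values from the article:

```typescript
// Sketch: ask the browser whether it can decode a 1080p30 AV1 stream.
// The codec string "av01.0.08M.08" (Main profile, level 4.0, 8-bit) and
// the 3.5 Mbps bitrate are illustrative assumptions.
async function canPlayAv1(): Promise<boolean> {
  const info = await navigator.mediaCapabilities.decodingInfo({
    type: "media-source",
    video: {
      contentType: 'video/mp4; codecs="av01.0.08M.08"',
      width: 1920,
      height: 1080,
      bitrate: 3_500_000,
      framerate: 30,
    },
  });
  // "powerEfficient" usually indicates hardware decoding; software decode
  // may still work but drains batteries on mobile devices.
  return info.supported && info.powerEfficient;
}

canPlayAv1().then((ok) =>
  console.log(ok ? "Serve AV1" : "Fall back to HEVC/H.264")
);
```

A player might run a check like this before selecting a rendition, falling back to HEVC or H.264 when AV1 decoding is absent or software-only.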
AV1 is also considerably more expensive to encode, and to decode in software, than older codecs, which adds compute costs for streaming services and suggests that widespread adoption will take time.
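To make that encoding cost concrete, here is a minimal sketch of driving an AV1 encode from Node.js, assuming an ffmpeg build with the reference libaom-av1 encoder on the PATH; the file names and settings are placeholders:

```typescript
import { spawn } from "node:child_process";

// Sketch: transcode input.mp4 to AV1, assuming ffmpeg with libaom-av1
// is available. -crf 30 with -b:v 0 selects constant-quality mode;
// -cpu-used trades speed for compression (0 = slowest/best, 8 = fastest),
// which is where AV1's encoding cost shows up most clearly.
const args = [
  "-i", "input.mp4",     // placeholder input file
  "-c:v", "libaom-av1",  // reference AV1 encoder
  "-crf", "30",          // constant-quality target
  "-b:v", "0",           // required alongside -crf for libaom-av1
  "-cpu-used", "4",      // mid-range speed/quality trade-off
  "output.mkv",          // placeholder output file
];

const ffmpeg = spawn("ffmpeg", args, { stdio: "inherit" });
ffmpeg.on("close", (code) => {
  console.log(`ffmpeg exited with code ${code}`);
});
```

Lowering cpu-used improves compression but can slow the encode by an order of magnitude, which is exactly the trade-off that makes bulk AV1 transcoding costly.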
Key Points
- AV1 promises up to 30% better compression efficiency than older standards like HEVC.
- Major platforms are adopting it; Netflix reportedly encodes around 95% of its content in AV1.
- Hardware limitations hinder broader adoption, as not all devices support AV1 decoding yet.
- High encoding complexity may deter streaming services from fully transitioning to AV1.
- Adoption is gradual, with support likely to grow as compatible hardware spreads.
Why should I read this?
This article explains the challenges facing AV1 and its potential in the streaming industry. Understanding the current state of video codecs helps tech enthusiasts and industry professionals grasp the forces shaping video quality and bandwidth use in digital streaming. As more devices gain AV1 support, the codec could eventually reshape the streaming landscape.