Everyone has a different set of rules for how they use their phone, but one thing almost everyone has in common is enjoying audio and video on the go. The bright, vibrant display on your phone is perfect for watching a movie or a show, and there's no better music or podcast player than the one that stays in your pocket. While hardware manufacturers do everything they can to improve device capabilities with each generation, Android needs to keep pace on the software side.
Android 10 is doing its part. On the video side, the inclusion of HDR10+ (or HDR10 Plus, if you prefer) is awesome. It's a quality-of-life change that can make a big difference in how a video looks on your screen because of its underlying tech. It may only be a short blurb amongst all the other Android 10 features, but it deserves to be called out.
Every display has a dynamic range. The key is matching the content to it.
Your phone's screen is designed and manufactured specifically to be a portable display. There are a lot of differences between building a phone-sized screen and a living room-sized screen, and when it comes to brightness and dynamic range these differences can be really noticeable. Everything about the display was done with one thing in mind: conserving battery. A display shows color using small points of light (pixels) and these are powered in different ways depending on whether you have an AMOLED screen or an LCD, but both will use more battery if they are brighter.
We understand screen brightness when talking about the overall display, but the dynamic range of the display was set with battery power in mind, too. Dynamic range is the scope of colors your screen can show, as defined by a low point and a high point. Colors outside those points can't be displayed accurately, no matter how wide the range itself may be. That can lead to crushed dark areas or blown-out bright areas when you watch a video. You'll likely never notice, but it's relatively easy to fix using what's called an HDR profile. It tells the software that renders the video which colors should be used, and it's adjustable by the content creator.
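To picture what "crushed" and "blown-out" mean numerically, here's a minimal sketch: brightness values outside the display's low and high points simply get clamped. The nit values and the function name are invented for illustration, not taken from any real display.

```kotlin
// Hypothetical sketch: content brightness outside the display's range is
// clamped to its low and high points. The 0.05 and 600.0 nit limits here
// are made-up numbers, not real panel specs.
fun clampToDisplayRange(nits: Double, low: Double = 0.05, high: Double = 600.0): Double =
    nits.coerceIn(low, high)

// Two distinct shadow tones in the source, 0.01 and 0.04 nits, both land
// on 0.05: the detail between them is "crushed" into one level. Highlights
// above 600 nits are all flattened to 600 in the same way.
```
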
With HDR10+, these profiles are dynamic. That means they can adjust for each individual frame of a video instead of making one adjustment for the whole thing. Using metadata that's streamed along with the video information, content can be adjusted so that dark colors never crush, blues never look black, stop signs aren't the same color as firetrucks, and so on. It makes a big difference on a large display; on mobile, it means you can have both a bright sky and a dark alleyway on the screen at the same time and both look great.
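The difference between static and dynamic metadata can be sketched with some simple arithmetic. This is not the real HDR10+ pipeline, just an illustration: with one static peak for the whole video, every frame is dimmed to fit the brightest frame anywhere; with per-frame peaks, a dark frame isn't dimmed at all. The display peak and frame values are invented.

```kotlin
// Illustrative only: scale brightness by the ratio of display peak to
// content peak, capped at 1.0 (no boosting). Not the actual HDR10+ math.
fun toneMapScale(displayPeakNits: Double, contentPeakNits: Double): Double =
    minOf(1.0, displayPeakNits / contentPeakNits)

fun main() {
    val displayPeak = 600.0                       // assumed phone display peak
    val framePeaks = listOf(120.0, 950.0, 4000.0) // dark alley, daylight, sun glint

    // Static metadata: one scale for everything, set by the brightest frame.
    val staticScale = toneMapScale(displayPeak, framePeaks.maxOrNull()!!)

    // Dynamic metadata: each frame gets its own scale, so the 120-nit dark
    // frame keeps full brightness instead of being crushed along with the rest.
    val dynamicScales = framePeaks.map { toneMapScale(displayPeak, it) }
    println("static: $staticScale, per-frame: $dynamicScales")
}
```
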
That's not the only feature that will make watching video on your phone even better. Android 10 supports the AV1 codec. Designed by a long list of companies (including Intel, Microsoft, Netflix, Amazon, and Google) to be an open and royalty-free alternative to standards like H.264, AV1 can deliver the same or better video quality with up to 20% higher data compression than VP9 or HEVC/H.265. That means streaming the same quality uses less data, and downloads of the same content can be smaller.
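To put that "up to 20%" figure in perspective, here's some back-of-the-envelope arithmetic. The 3 GB baseline is an invented example, not a measured encode.

```kotlin
// Rough sketch of what "up to 20% higher compression" means for file size.
// The baseline size is hypothetical; real savings vary by content.
fun av1SizeMb(baselineMb: Double, savingsPercent: Double = 20.0): Double =
    baselineMb * (100.0 - savingsPercent) / 100.0

// A movie that takes roughly 3000 MB in HEVC or VP9 would come in around
// 2400 MB in AV1 at the same quality: 600 MB less to download or stream.
```
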
HDR video can use a lot of bandwidth; AV1 might be the fix for that.
On the audio side, support for the Opus codec has been included. Like AV1, Opus is open and royalty-free, so anyone can use it in any project, and it's better for streaming than the still-widely-used MP3 codec. You can see all the capabilities of the Opus codec at its website, but two things jump right out: bitrates from 6 kb/s to 510 kb/s and sampling rates between 8 kHz and 48 kHz. That means the audio streamed to you can be excellent quality (or scaled down to save bandwidth) just like with other streaming codecs, source files can be stored at high quality to preserve all the sound, and companies that use Opus don't have to pay licensing fees for it. Everyone wins.
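What that bitrate range means in practice is easy to work out: kilobits per second times seconds, divided into megabytes. The 128 kb/s middle value is an assumed "typical music" setting, not something Opus mandates.

```kotlin
// Simple arithmetic: how many megabytes one hour of streaming audio uses
// at a given bitrate (kb/s * 3600 s, bits -> bytes, kB -> MB).
fun megabytesPerHour(kilobitsPerSecond: Int): Double =
    kilobitsPerSecond * 3600.0 / 8.0 / 1000.0

// At Opus's 6 kb/s floor, an hour of speech is about 2.7 MB; at an assumed
// 128 kb/s music setting, about 57.6 MB; at the 510 kb/s ceiling, 229.5 MB.
```
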
These changes mean nothing unless developers have an easy way to make sure the quality a player is trying to deliver matches the capabilities of the device it's being played on. The new MediaCodecInfo API can gather every resolution and frame rate that a particular video codec can render, so the correct size and frame rate can be chosen. Beyond querying the device to decide what to use during playback, it enables things like user-selectable quality in a settings menu or switching resolution based on network conditions. Companies making Android apps for media playback have one less thing to hard-code into their app, so they can focus on making the rest of it great.
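Here's a framework-free sketch of the selection step an app might perform with that kind of capability data. On a real device this information would come from MediaCodecList and MediaCodecInfo; the VideoTrack class, the track list, and the capability limits below are invented stand-ins so the logic can be shown on its own.

```kotlin
// Hypothetical stand-in for the track choices a streaming app might have.
// On Android, the limits would come from MediaCodecInfo's capability
// queries rather than being passed in as plain numbers.
data class VideoTrack(val width: Int, val height: Int, val frameRate: Int)

// Keep only tracks the decoder can handle, then pick the one with the
// most pixels per second.
fun pickBestTrack(
    tracks: List<VideoTrack>,
    maxWidth: Int,
    maxHeight: Int,
    maxFrameRate: Int
): VideoTrack? =
    tracks
        .filter { it.width <= maxWidth && it.height <= maxHeight && it.frameRate <= maxFrameRate }
        .maxByOrNull { it.width * it.height * it.frameRate }

// With a decoder that tops out at 1920x1080 @ 60 fps, a 4K60 track is
// rejected and 1080p60 wins over 1080p30 and 720p60.
```
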
The smartphone has replaced the TV or traditional computer for many hours of our viewing time. For some of us, it's completely replaced both. That means it's important for your phone to make media both look and sound great, and device makers know it. New software tools and support for new streaming technologies mean our content can look and sound better, and because those technologies are royalty-free, more companies will be willing and able to use them. Google has a vested interest in making sure you love using your Android phone; making video look better and audio sound better is a great place to start.