Smartphones have seen nothing short of explosive growth over the last decade, and they've quickly become the most important communications tool in the world. They're also incredible media devices, both for creation and consumption, and as someone who makes a living with their camera, my favorite technology in mobile over the last ten years is without a doubt the advancement of HDR+ processing.
High Dynamic Range explained
In the simplest terms, shooting in HDR lets you capture more detail in your photos. The whole process is based on capturing multiple photos in rapid succession, with each shot set at a different exposure, then stitching them all together into one final image. This lets you pull information from the brightest points of an image and the darkest points without clipping in either direction.
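To make that stitching step concrete, here's a toy exposure-fusion sketch in Python with NumPy. It weights each pixel by how close it sits to mid-gray, so the underexposed frame contributes the highlights and the overexposed frame contributes the shadows. This is a simplified stand-in for illustration, not any phone's actual HDR pipeline.

```python
import numpy as np

def merge_exposures(frames):
    """Blend bracketed exposures, weighting each pixel by how
    well-exposed it is (closest to mid-gray) -- a simplified
    take on exposure fusion."""
    frames = [f.astype(np.float64) for f in frames]
    # Well-exposedness weight: Gaussian centered on 0.5 (mid-gray)
    weights = [np.exp(-((f - 0.5) ** 2) / (2 * 0.2 ** 2)) for f in frames]
    total = np.sum(weights, axis=0) + 1e-12
    fused = np.sum([w * f for w, f in zip(weights, frames)], axis=0) / total
    return np.clip(fused, 0.0, 1.0)

# Simulated bracket of the same scene: a dark, a mid, and a bright region
scene = np.array([0.05, 0.4, 0.95])
under = np.clip(scene * 0.5, 0, 1)   # underexposed frame: keeps highlight detail
normal = scene                       # normal exposure
over = np.clip(scene * 2.0, 0, 1)    # overexposed frame: keeps shadow detail
result = merge_exposures([under, normal, over])
print(result)
```

In the fused result, the dark region comes out brighter than the normal exposure and the bright region comes out darker than its nearly blown value, which is exactly the "detail at both ends" effect described above.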
Landscape photographers have always loved shooting in HDR, since the sky is often brighter than the rest of the shot and would otherwise blow out the photo. With HDR, you're able to retain the highlight detail of the clouds in the sky while also capturing the shadow detail in the mountains, plains, or whatever other scenery you might be shooting.
It's also great for portrait photography, particularly in broad daylight where the direct lighting from the sun can cause harsh shadows and overexposed highlights.
So why wouldn't you just always shoot in HDR then? Since you're essentially capturing multiple photos in a short burst, any fast motion in HDR shots can lead to motion blur. You'll definitely notice this when taking photos of people, especially at night when your camera's shutter speed is slower to allow it to take in more light.
You might also not want to give up the contrast between highlights and shadows that HDR works against depending on the type of shot you're after. That contrast can be used in all sorts of artistic ways, from highlighting focal points of your photo with direct lighting to carefully hiding details away in the shadows.
Just like any other photography mode, HDR is a tool that, when used in the right context, can lead to some extraordinary results.
HDR in the making
HDR has become a table stakes feature in just about every smartphone these days, but it wasn't always this way. The first phone I remember touting HDR photography was the Galaxy S III, all the way back in 2012. Having the ability to shoot in HDR was great, but it was slow to process, and the effect made a lot of photos look almost fake because, well, even the human eye doesn't see full highlight and shadow detail at once.
Phones as far back as the HTC One M7 were even able to shoot HDR video, which was technically impressive at the time even though the results didn't always look great.
Fast forward to today, and HDR+ imaging has gotten a lot more advanced. Photos look far more natural, and phones are able to capture the necessary exposures so quickly that HDR is usually enabled automatically in the appropriate conditions. Many phones, like the Galaxy S10 and OnePlus 7T, also support HDR video playback just like on modern TVs, offering deeper blacks and brighter highlights with properly shot footage.
Using HDR+ for low light photography
If you ask me, the biggest leap forward with HDR imaging has come from Google's work with its Pixel line of phones and the revolutionary Night Sight photography mode.
Originally introduced with last year's Pixel 3 and retroactively ported to older devices, Night Sight is essentially a souped-up version of HDR+ that takes anywhere from 9 to 15 frames, each at a fairly long exposure, within a few seconds of hitting the shutter button, then combines them all using advanced algorithms to pull off miraculously bright images in even the worst conditions.
It's not quite the same as taking a typical long exposure shot, which would require the camera to remain perfectly still to avoid blurry shots. In fact, one of the most incredible parts of Night Sight is that it works even when shooting handheld. It's a remarkable experience that's only made possible thanks to HDR+ technology … but it doesn't stop there.
With the Pixel 4 this year, Google debuted another shocking low light feature, astrophotography mode. It takes all the technical prowess of Night Sight to the next level, letting you shoot photos of the stars that a tiny phone sensor would never typically be able to pull off.
We have a full explainer on astrophotography mode that dives into the specifics, but here are the basics: Generally speaking, the longer the exposure, the more light you'll take in, but too long of an exposure can lead to trailing in the sky due to the Earth's rotation. To combat this, the Pixel 4 takes four minutes' worth of 15-second exposures, which are just short enough to avoid star trails, then uses HDR+ processing to create a stunning end result out of them all.
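The arithmetic behind that frame budget, and the reason stacking works at all, can be sketched in a few lines of Python. The noise model below is a toy assumption for illustration; it isn't Google's actual processing, but it shows why averaging many short exposures beats one long one on a tiny sensor.

```python
import numpy as np

# Frame budget from the article: four minutes of 15-second exposures
total_s, exposure_s = 4 * 60, 15
n_frames = total_s // exposure_s   # 16 sub-exposures, each short enough to avoid star trails

# Toy model: each short exposure is a faint star signal buried in
# sensor noise; averaging the stack shrinks the noise by roughly
# sqrt(n_frames) while the signal stays put.
rng = np.random.default_rng(0)
signal = 0.02                                            # faint star brightness
frames = signal + rng.normal(0, 0.05, size=(n_frames, 10_000))
single_noise = frames[0].std()                           # noise in one exposure
stacked_noise = frames.mean(axis=0).std()                # noise after stacking
print(n_frames, round(single_noise / stacked_noise, 1))
```

With 16 frames, the stack's noise drops to roughly a quarter of a single exposure's, which is how a phone sensor can pull stars out of what would otherwise be a grainy black frame.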
This type of computational photography using HDR technology really feels like a new generational step for photography as a whole: it's practically a cheat to get around the limitations of small sensors, and even a way to solve some of the problems of traditional cameras.
It's made it not only possible, but at times preferable for me to use the camera that's in my pocket rather than the more expensive and elaborate camera in my bag. That's pretty incredible, and I can't wait to see what the next decade of mobile photography tech brings.
The smartest camera
Google Pixel 4
My favorite camera that fits in my pocket
The Pixel 4 may not have the best battery life around, but it has the smartest camera. A combination of HDR+ processing for nearly every image captured, plus more advanced features like Night Sight and astrophotography, make it my favorite photography tool.