There's been a lot of hype surrounding the Pixel 4's new astrophotography mode, with plenty of users (including myself!) posting jaw-dropping shots of the stars in the sky all over social media. Astrophotography, the capturing of stars and other celestial objects, isn't new to smartphones; Huawei's P30 Pro featured this ability all the way back in March, but as per usual, Google's approach is a bit different.
The Pixel 4 uses a combination of long exposures, HDR+, and Semantic Segmentation to pull off miraculous results.
If you've ever tried to shoot the night sky with a dedicated camera like a DSLR, you're undoubtedly familiar with the process of shooting long exposures, where you allow the camera's shutter to stay open for longer than usual to capture more light. This works well with a large sensor, but the comparatively tiny sensor in a phone like the Pixel 4 naturally pulls in less light, so it needs to expose for even longer.
The problem with exposing for too long, though, is that the Earth's natural rotation starts to affect how the sky is captured, and you'll quickly end up with star trails, or white lines in the sky as a result of the stars' different positions at the beginning and end of the exposure. So how does Google compensate for this while still allowing the Pixel 4 to take in enough light for a clean image?
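The scale of the problem is easy to estimate: the Earth completes one rotation per sidereal day (about 86,164 seconds), so a star near the celestial equator drifts roughly 15 arcseconds per second across the sky. A rough back-of-the-envelope sketch of what that does on a phone-sized sensor (the focal length and pixel pitch below are illustrative assumptions, not the Pixel 4's actual specs):

```python
import math

SIDEREAL_DAY_S = 86164.1  # one full rotation of the Earth, in seconds
DRIFT_ARCSEC_PER_S = 360 * 3600 / SIDEREAL_DAY_S  # ~15 arcsec/s at the celestial equator

def star_drift_pixels(exposure_s, focal_length_mm=4.4, pixel_pitch_um=1.4):
    """Approximate drift of an equatorial star on the sensor, in pixels.

    focal_length_mm and pixel_pitch_um are illustrative values for a
    phone-sized camera module, not official Pixel 4 specifications.
    """
    drift_rad = math.radians(DRIFT_ARCSEC_PER_S * exposure_s / 3600)
    drift_mm = focal_length_mm * math.tan(drift_rad)
    return drift_mm * 1000 / pixel_pitch_um

# A 15-second frame keeps the drift to a few pixels; a single
# 4-minute exposure would smear each star into a long trail.
print(star_drift_pixels(15))
print(star_drift_pixels(240))
```

Even with these rough numbers, the gap is clear: a few pixels of drift in 15 seconds versus tens of pixels over four minutes, which is exactly why one long exposure isn't an option.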
Astrophotography is built into the Night Sight camera mode in the Pixel's stock camera app, and kicks in automatically when it detects dark enough conditions. A capture can take as long as four minutes to complete, but it isn't a four-minute exposure; instead, the Pixel takes a succession of 15-second exposures and stitches them together in real time, using the phone's gyroscope and computational data to align each new exposure with the last. Think of it like an advanced form of HDR+, with some exposures prioritizing highlights and others shadows.
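The merge step can be loosely sketched as aligning a burst of short frames and averaging them: random sensor noise shrinks by roughly the square root of the number of frames, while the total light gathered approaches that of one long exposure. This is a toy illustration of frame stacking, not Google's actual pipeline (which layers HDR+ style merging and gyro-driven subpixel alignment on top):

```python
import numpy as np

def stack_frames(frames, shifts):
    """Align each frame by a known integer (dy, dx) shift, then average.

    frames: list of 2-D numpy arrays (grayscale exposures)
    shifts: per-frame offsets, standing in for the alignment a real
    pipeline would derive from gyroscope and image data.
    """
    aligned = [np.roll(f, (-dy, -dx), axis=(0, 1))
               for f, (dy, dx) in zip(frames, shifts)]
    return np.mean(aligned, axis=0)

# Simulate 16 noisy 15-second exposures of the same static scene.
rng = np.random.default_rng(0)
scene = rng.uniform(0, 1, (64, 64))
frames = [scene + rng.normal(0, 0.2, scene.shape) for _ in range(16)]

stacked = stack_frames(frames, [(0, 0)] * 16)

single_noise = np.std(frames[0] - scene)
stacked_noise = np.std(stacked - scene)
print(single_noise / stacked_noise)  # roughly a 4x noise reduction for 16 frames
```

Averaging 16 frames cuts the random noise by about a factor of four, which is why a stack of short exposures can look as clean as one impossibly long one.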
Google also has a new feature called Semantic Segmentation that, similar to Apple's Deep Fusion, identifies different objects in the shot and processes them differently. This lets the Pixel 4 sharpen the stars and brighten the foreground, for instance, independently of each other. In addition, there are new enhancements to the noise reduction processing that help fight against random white pixels that would otherwise appear in shots.
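The idea behind per-segment processing can be sketched with a simple mask: given a sky/foreground segmentation, each region gets its own adjustment. The mask and gain values here are illustrative stand-ins for the learned segmentation and tuning Google actually uses:

```python
import numpy as np

def process_by_segment(image, sky_mask, sky_gain=0.9, foreground_gain=1.6):
    """Apply a different tone adjustment to sky and foreground pixels.

    image: float array with values in [0, 1]
    sky_mask: boolean array, True where the segmenter labeled sky
    The gain values are arbitrary illustrative choices, not Google's tuning.
    """
    out = np.where(sky_mask, image * sky_gain, image * foreground_gain)
    return np.clip(out, 0.0, 1.0)

# Toy image: top half is sky, bottom half is a dark foreground.
image = np.full((4, 4), 0.2)
sky_mask = np.zeros((4, 4), dtype=bool)
sky_mask[:2] = True

result = process_by_segment(image, sky_mask)
print(result[0, 0], result[3, 0])  # sky pixel vs. brightened foreground pixel
```

The real pipeline does far more per region (sharpening stars rather than just scaling brightness), but the principle is the same: the segmentation mask decides which processing path each pixel takes.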
Combine all of this tech together, and the Pixel 4 is able to pull off miraculous photos in even the darkest of conditions, provided you keep it in one place for long enough. While one of the merits of Night Sight is the ability to capture impressive low light photos even while shooting handheld, the nature of any long exposure means you'll need to prop the Pixel 4 against a nearby object or mount it on a tripod.
Whether or not you understand how it works, the important thing with astrophotography is that you go out and take some new shots of your own! Grab a tripod and your Pixel 4, go out somewhere with as little light pollution as you can manage, and get to shooting.