Astrophotography mode: Here's how the Pixel 4's killer new camera feature works

Pixel 4 in-hand (Image credit: Hayato Huseman / Android Central)

There's been a lot of hype surrounding the Pixel 4's new astrophotography mode, with plenty of users (myself included!) posting jaw-dropping shots of the stars all over social media. Astrophotography, capturing the stars and other celestial objects, isn't new to smartphones; Huawei's P30 Pro offered this ability all the way back in March 2019, but as usual, Google's approach is a bit different.

The Pixel 4 uses a combination of long exposures, HDR+, and Semantic Segmentation to pull off miraculous results.

If you've ever tried to shoot the night sky with a dedicated camera like a DSLR, you're undoubtedly familiar with the process of shooting long exposures, where you allow the camera's shutter to stay open for longer than usual to capture more light. This works well with a large sensor, but the comparatively tiny sensor in a phone like the Pixel 4 naturally pulls in less light, so it needs to expose for even longer.
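For a sense of scale, here's a quick back-of-the-envelope calculation in Python. The sensor dimensions are rough approximations I'm supplying for illustration, and real light gathering also depends on aperture and lens speed, but it shows the gap a phone has to close with exposure time:

```python
# Rough illustration: total light collected scales (roughly) with
# sensor area x exposure time, ignoring aperture and lens differences.
FULL_FRAME_AREA_MM2 = 36 * 24   # typical full-frame DSLR sensor: ~864 mm^2
PHONE_AREA_MM2 = 5.6 * 4.2      # approx. 1/2.55" phone sensor: ~23.5 mm^2

area_ratio = FULL_FRAME_AREA_MM2 / PHONE_AREA_MM2
print(f"Full-frame sensor area advantage: ~{area_ratio:.0f}x")
print(f"All else equal, the phone needs ~{area_ratio:.0f}x "
      "the exposure time to collect the same total light.")
```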

The problem with exposing for too long, though, is that the Earth's natural rotation starts to affect how the sky is captured, and you'll quickly end up with star trails, or white lines in the sky as a result of the stars' different positions at the beginning and end of the exposure. So how does Google compensate for this while still allowing the Pixel 4 to take in enough light for a clean image?
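To see why exposures have to stay short, you can run the numbers yourself. The Python sketch below is purely illustrative; the focal length and pixel pitch are approximate published specs for the Pixel 4's main camera, not anything Google has confirmed for this mode:

```python
import math

# How far a star near the celestial equator drifts across the sensor
# during one exposure, given the Earth's sidereal rotation rate.
SIDEREAL_RATE_RAD_S = 2 * math.pi / 86164  # one rotation per ~23h56m: ~7.29e-5 rad/s
FOCAL_LENGTH_MM = 4.38                     # approx. Pixel 4 main-camera focal length
PIXEL_PITCH_MM = 0.0014                    # approx. 1.4-micron pixels

def star_drift_pixels(exposure_s: float) -> float:
    """Star trail length in pixels for a given exposure time."""
    drift_mm = FOCAL_LENGTH_MM * SIDEREAL_RATE_RAD_S * exposure_s
    return drift_mm / PIXEL_PITCH_MM

for t in (15, 60, 240):
    print(f"{t:>3}s exposure -> ~{star_drift_pixels(t):.1f} px of trailing")

# ~3 px of drift over 15 seconds still looks like a point of light;
# ~55 px over four minutes would be an obvious streak. Hence many
# short exposures instead of one long one.
```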

Astrophotography is built into the Night Sight camera mode in the Pixel's stock camera app, and kicks in automatically when it detects dark enough conditions. A capture can take as long as four minutes to complete, but it isn't a single four-minute exposure; instead, the Pixel takes a succession of 15-second exposures and stitches them together in real time, using the phone's gyroscope and computational image alignment to match each new exposure to the last. Think of it like an advanced form of HDR+, with some exposures prioritizing highlights and others shadows.
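Google hasn't published the code behind this, but if you're curious what "align and stitch" looks like in practice, here's a bare-bones Python sketch of the general technique: frame stacking, with a simple translation-only alignment (phase correlation) standing in for the Pixel's gyro-assisted version. The real pipeline also deals with rotation, lens distortion, and the HDR bracketing mentioned above.

```python
import numpy as np

def phase_correlate(ref: np.ndarray, img: np.ndarray) -> tuple[int, int]:
    """Estimate the integer (dy, dx) shift that maps img onto ref."""
    cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(img))
    cross /= np.abs(cross) + 1e-12            # keep only the phase
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape
    # Wrap peak coordinates into signed shifts.
    return (dy - h if dy > h // 2 else dy, dx - w if dx > w // 2 else dx)

def stack_frames(frames: list[np.ndarray]) -> np.ndarray:
    """Align each grayscale frame to the first one, then average the stack."""
    ref = frames[0].astype(np.float64)
    aligned = [ref]
    for frame in frames[1:]:
        dy, dx = phase_correlate(ref, frame.astype(np.float64))
        # np.roll wraps pixels around the edges -- fine for a toy example.
        aligned.append(np.roll(frame.astype(np.float64), (dy, dx), axis=(0, 1)))
    return np.mean(aligned, axis=0)
```

Averaging is the key trick here: stack 16 aligned frames and random sensor noise drops by roughly a factor of four, without the star trails a single long exposure would produce.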

Google also has a new feature called Semantic Segmentation that, similar to Apple's Deep Fusion, identifies different objects in the shot and processes them differently. This lets the Pixel 4 sharpen the stars and brighten the foreground, for instance, independently of each other. In addition, there are new enhancements to the noise reduction processing that help fight against hot pixels, the random white dots that would otherwise appear in shots.
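Google hasn't shared how this works under the hood, but conceptually it amounts to something like the Python sketch below: given a sky mask (produced by a machine-learning segmenter in the real pipeline; here it's just an input), tone-map each region independently, and suppress any pixel that's suspiciously brighter than its neighbors. Every threshold and gain value here is an arbitrary placeholder:

```python
import numpy as np
from scipy import ndimage

def remove_hot_pixels(img: np.ndarray, threshold: float = 0.2) -> np.ndarray:
    """Replace pixels far brighter than their 3x3 neighborhood median."""
    local_median = ndimage.median_filter(img, size=3)
    out = img.copy()
    hot = (img - local_median) > threshold
    out[hot] = local_median[hot]
    return out

def process_by_region(img: np.ndarray, sky_mask: np.ndarray) -> np.ndarray:
    """Tone-map sky and foreground separately.

    img: grayscale float image in [0, 1]; sky_mask: True where sky is.
    """
    out = img.copy()
    out[~sky_mask] = np.clip(out[~sky_mask] * 1.5, 0.0, 1.0)          # lift the foreground
    out[sky_mask] = np.clip((out[sky_mask] - 0.05) * 1.2, 0.0, 1.0)   # boost sky contrast so stars pop
    return out
```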

Put all of this tech together, and the Pixel 4 is able to pull off stunning photos in even the darkest of conditions, provided you keep it in one place for long enough. While one of the merits of Night Sight is the ability to capture impressive low-light photos even while shooting handheld, the nature of any long exposure means you'll need to prop the Pixel 4 against a nearby object or mount it on a tripod.

Whether or not you understand how it works, the important thing with astrophotography is that you go out and take some new shots of your own! Grab a tripod and your Pixel 4, head somewhere with as little light pollution as you can manage, and get to shooting.

Hayato Huseman

Hayato was a product reviewer and video editor for Android Central.