The Pixel 2 camera's secret weapon: A Google-designed SoC, the 'Pixel Visual Core'

We've been using the Pixel 2 and its bigger sibling, the Pixel 2 XL, for a while. Once again, Google's phones have some fantastic photo capabilities. What we've seen from both the 12.2MP rear camera and the 8MP front-facing camera is just so much better than any other phone we've ever used. And we've used a lot of them.

Read the Google Pixel 2 review

And that's before Google enables its secret weapon: a custom imaging-focused SoC (system on a chip) Google designed into the Pixel 2, called the Pixel Visual Core.

We don't have all the details; Google isn't ready to share them, and maybe hasn't even fully explored what this custom chip is capable of yet. What we do know is that the Pixel Visual Core is built around a Google-designed eight-core Image Processing Unit (IPU). This IPU can perform three trillion operations per second while running from the tiny battery inside a mobile phone.

Interestingly, the Pixel Visual Core wasn't even enabled at launch on the Pixel 2 and 2 XL; we're only now seeing an "early version" of it with the Android 8.1 Developer Preview 2. With the Pixel Visual Core finally enabled, Google's HDR+ routines are processed on this IPU, running five times faster while using less than one-tenth of the energy they would if they ran on the standard image signal processor in the Snapdragon 835.
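
It's worth unpacking what "five times faster at one-tenth the energy" actually implies. The absolute numbers below are invented purely for illustration (Google hasn't published any), but the arithmetic shows the IPU finishes each shot sooner, draws less instantaneous power while working, and gets ten times as many HDR+ shots out of every joule of battery:

```python
# Back-of-the-envelope check of Google's claim: HDR+ on the IPU runs
# 5x faster using less than 1/10 the energy. Baseline figures below
# are hypothetical, chosen only to make the ratios easy to see.

baseline_time_s = 1.0     # hypothetical time per HDR+ shot on the 835's ISP
baseline_energy_j = 10.0  # hypothetical energy per shot, in joules

ipu_time_s = baseline_time_s / 5        # "five times faster"
ipu_energy_j = baseline_energy_j / 10   # "one-tenth of the energy"

# Average power draw while processing = energy / time
baseline_power_w = baseline_energy_j / baseline_time_s  # 10.0 W
ipu_power_w = ipu_energy_j / ipu_time_s                 # 1.0 J / 0.2 s = 5.0 W

# Battery efficiency: HDR+ shots processed per joule
baseline_shots_per_j = 1 / baseline_energy_j  # 0.1 shots/J
ipu_shots_per_j = 1 / ipu_energy_j            # 1.0 shots/J, a 10x improvement

print(baseline_power_w, ipu_power_w)           # 10.0 5.0
print(ipu_shots_per_j / baseline_shots_per_j)  # 10.0
```

Note that the speedup and the energy saving are separate wins: even if the IPU were no faster at all, cutting energy per shot to one-tenth would still be a tenfold battery-life improvement for HDR+ processing.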

Google says this is possible because of how closely the software and hardware are matched to each other. The software on the Pixel 2 controls "many more" details of the hardware than you would find in a typical processor-to-software arrangement. By handing control off to software, the hardware can be made much simpler and more efficient.

Google is a software company first and foremost. It's no wonder that its first custom mobile SoC leverages software the way other companies use hardware.

Of course, this means the software becomes more and more complex. Rather than writing code the standard way, building it into a finished product and then trying to manage everything after the fact, Google turned to specialized languages: Halide, a domain-specific language for image processing, and TensorFlow for the machine-learning components themselves. On top of those, Google built its own software compiler that optimizes the finished production code specifically for the hardware involved.
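
The core idea behind Halide, and the reason a compiler can retarget the same camera code to a brand-new chip like the IPU, is that it separates *what* a pipeline computes (the algorithm) from *how* it runs on the hardware (the schedule). Halide itself is a C++-embedded language, but the concept can be sketched in plain Python as a loose analogy (this is not real Halide code):

```python
# Conceptual analogy to Halide's algorithm/schedule separation.
# The "algorithm" defines what each output pixel is; two different
# "schedules" traverse the image in different orders, trading off
# things like cache locality, yet produce identical results.

def brighten(img, x, y, gain=1.5):
    """Algorithm: what to compute for pixel (x, y)."""
    return min(255, int(img[y][x] * gain))

def schedule_row_major(img, w, h):
    """Schedule 1: simple row-by-row traversal."""
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            out[y][x] = brighten(img, x, y)
    return out

def schedule_tiled(img, w, h, tile=2):
    """Schedule 2: tile-by-tile traversal (better locality on real hardware)."""
    out = [[0] * w for _ in range(h)]
    for ty in range(0, h, tile):
        for tx in range(0, w, tile):
            for y in range(ty, min(ty + tile, h)):
                for x in range(tx, min(tx + tile, w)):
                    out[y][x] = brighten(img, x, y)
    return out

img = [[10, 200, 30, 40],
       [50, 60, 170, 80],
       [90, 100, 110, 120],
       [130, 140, 150, 160]]

# Different execution strategies, same pixels out.
assert schedule_row_major(img, 4, 4) == schedule_tiled(img, 4, 4)
```

The payoff is that a compiler can freely search over schedules (tiling, vectorization, parallelism across cores) without ever risking the correctness of the algorithm, which is the kind of hardware-specific optimization Google's custom compiler performs for the IPU.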

Even though it wasn't ready at launch and took extra time to enable, right now the only part of the camera experience using the Pixel Visual Core is HDR+. It's already very good, and here's what comes next.

HDR+ is only the beginning for the Pixel Visual Core.

With the Android 8.1 Developer Preview 2, the Pixel Visual Core will be opened up as a developer option. The goal is to give all third-party apps access through the Android Camera API. This will give every developer a way to use Google's HDR+ and the Pixel Visual Core, and we expect to see some really big things.

As for the one last thing we always love to hear about: Google reminds us that the Pixel Visual Core is programmable, and it's already building the next set of applications that can harness its power. As Google adds more abilities to its new SoC, the Pixel 2 and 2 XL will continue to get better and be able to do more. New imaging and machine-learning applications are coming throughout the life of the Pixel 2, and we're ready for them.

Jerry Hildenbrand
Senior Editor — Google Ecosystem

Jerry is an amateur woodworker and struggling shade tree mechanic. There's nothing he can't take apart, but many things he can't reassemble. You'll find him writing and speaking his loud opinion on Android Central and occasionally on Twitter.

  • Now if they could couple this insane software with better hardware, such as larger sensors and layered, custom lenses, they could do even better! But good job anyway, Google!
  • This isn't software though. This is an additional 8 core processor specifically to combine camera images. It's hardware that wasn't turned on because the Android updates weren't there yet. Bear in mind ANY 8 core processor is a chip that will be as big as your CPU... and that the different image core is the primary reason ANYONE ever said that Apple cameras were better... and everyone has said that they produce better images than technically comparable Android images from same-specced phones, even from Samsung - who make the camera in the iPhone. The reason you need to use the built in camera is because it is now programmed to actually use the image core, and not the one built into the Snapdragon processor. The Pixel and Pixel 2 both had disabled internal hardware to make the pictures much better and it's being enabled now.
  • I thought maybe we'd see the AR Stickers Google showed off during the Pixel event and subsequent hands-on videos there. Not a word about them other than coming with future update...
  • I'm on Developer Preview 2 and I can say for sure that the processing time for Portrait Mode has gone down significantly. Haven't had much time to play with the camera today but looking forward to it over the next few days.
  • I don't see the gain you saw, processing time hasn't changed for me.
  • Can we get a comparison of the photo quality of the stock Pixel 2 camera app before and after the Visual Core activation? I'm not sold by seeing a third-party app comparison.
  • Yes where are the AR stickers?
  • Does anyone know or remember where Google sourced this chip from? It seems like most of the parts for their Pixel products are still "off the shelf" so to speak, so I'm wondering who they used to custom manufacture this thing...
  • In a Podcast a month or so ago, Jerry speculated that it was Intel based, but wasn't sure. I seem to remember another article claiming this also, but it wasn't an AC article. I'll see if I can find it. EDIT: Google confirmed to CNBC that it was Intel:
  • There's not a lot of info out about Pixel Visual Core + Google Camera HDR+ integration. This interview with a few Pixel execs at Google (Brian Rakowski & Tim Knight) leads one to believe that the Google Camera on the Pixel doesn't use PVC.
    Q. Is the Visual Core chip going to be used for anything other than faster HDR and for giving other apps access to similar HDR processing?
    Brian: The Visual Core which we will be turning on in the coming apps will primarily be for 3rd party apps. The cool thing about it is that it gives pretty good performance in a default capture scenario. So when 3rd parties use the camera APIs they’ll be able to get those high quality HDR processed images. So we’re really looking forward to see what they do with it. Turns out we do pretty sophisticated processing, optimising and tuning in the camera app itself to get the maximum performance possible. We do ZSL and fast buffering to get fast HDR capture. So we don’t take advantage of the Pixel Visual Core, we don’t need to take advantage of it. So you won’t see changes in the pictures captured from the default camera app in the coming weeks. What you’ll see is that pictures taken in 3rd party apps will get significantly better as they’ll start taking benefit from some of the HDR processing. We’re pretty excited about what it’ll deliver and we’re looking forward to seeing all your favourite apps that use the camera will get much better pictures as a result.
  • I've seen similar posts stating the same thing - I just find it hard to believe that Google would create a custom chip and software - strictly for 3rd party developers... If this new core - has no direct value to the camera or phone - as is - then maybe that's why Google took on all of those engineers from HTC.... I truly believe Google is still finding ways to create value in this new chip...
  • I think it will be more interesting to see what else this chip can be used for besides photos.
  • Looks like this will be the phone for people that like Snapchat!
  • Nope. The iPhone will always be the phone for Snapchat. I know people who switched from Android to iPhone simply because of Snapchat.
  • Yeah especially the X with truedepth, but even the Pluses with the dual back cameras. I think the fact that Apple releases ready tech that's ready for developers to jump on it helps. No dormant chips for months in their phones. Not having dual sensors on the Pixel lessens the value of this chip. The fact that they are so sparse with information is just another reason why I went with the 8 Plus over the 2 XL. I don't need HDR+. I need better video and higher frame rates, among other things - like HEVC/HEIF support. The Live Photos implementation on Android phones is still not that good, and neither is the slow mo recording. Photography is a solved problem. It's all about the video and AR, now. I don't even consider picture quality when buying a new phone now - in the premium flagship bracket. HDR+ is not a huge selling point when other camera phones are so good and actually produce more true to life images. This phone is still 2 generations behind Apple for video recording and a generation behind for AR stuff. Doesn't help that they're using the comparatively crappy QC SoCs, either.
  • You do know that Live Photos was licensed from HTC, correct?
  • It got me all hyped up and now it's a big letdown. It's pretty worthless in my world. I don't use many apps to share pictures with.
  • « Google isn't ready to share them and maybe isn't even aware of just what this custom chip is capable of yet » Really? They don't know what a chip they designed is capable of?! They just randomly design something to check what to do after, mmm
  • « Google says this is possible because of how well the software and hardware have been matched with each other. » Sure, with the same hardware as basically everyone else...