Google is bringing its bandwidth-saving RAISR image processing to your Android phone

Google is constantly searching for ways to make it easier for us to share and use technology. Back in November 2016, the company showcased RAISR, a new method for image processing that takes a low-res version of a photo and uses machine learning to fill in the gaps between the pixels, so that the end user gets the full resolution of the photo while using a fraction of the bandwidth and download time.

Google has been testing and refining this technology on its own social media platform, Google+, which has allowed them to take a low-res version of a photo, process it through RAISR, and deliver the photos to people on a subset of Android devices with near full-resolution quality while keeping the file sizes low. This cuts down on bandwidth constraints and load times while maintaining the integrity of the original images.
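To make that data flow concrete, here is a minimal sketch of the downsample-before-transfer, upscale-on-device idea, assuming a Python environment with the Pillow library. Plain bicubic interpolation stands in for RAISR's machine-learned filters, and the file names and the 2x scaling factor are illustrative assumptions, not anything Google has published.

```python
from PIL import Image

FACTOR = 2  # halving each dimension keeps roughly 25% of the pixels

# "Server" side: shrink the image before it is sent over the network.
original = Image.open("photo.jpg")          # hypothetical input file
small = original.resize(
    (original.width // FACTOR, original.height // FACTOR),
    resample=Image.BICUBIC,
)
small.save("photo_small.jpg", quality=85)   # far fewer bytes to transfer

# "Client" side: restore the original dimensions after download.
# This is where RAISR would apply its learned filters to recover detail;
# bicubic upscaling is used here only to show the shape of the pipeline.
received = Image.open("photo_small.jpg")
restored = received.resize(
    (received.width * FACTOR, received.height * FACTOR),
    resample=Image.BICUBIC,
)
restored.save("photo_restored.jpg")
```

Halving each dimension means only about a quarter of the original pixels travel over the network, which is where the bandwidth and load-time savings come from; RAISR's contribution is making the upscaling step smart enough that the result still looks close to full resolution.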

After a soft launch on Google+, it appears Google may be getting ready to unleash this image processing beast across more of its services and devices. In a recent blog post, Google+ Product Manager John Nack states that RAISR is now processing over a billion photos a week, an astonishing number when you consider its limited implementation. But of note is what Nack says in closing:

"In the coming weeks we plan to roll this technology out more broadly — and we're excited to see what further time and data savings we can offer."

The phrasing here is deliciously vague: it's unclear whether that means the RAISR technology will simply be rolled out to a wider subset of Android devices while remaining self-contained within Google+, or spread across the wide spectrum of Google services and products.

Whatever happens, it's great to see Google continually developing and implementing these sorts of processes that allow us to share our favorite photos more efficiently. Once this technology is ubiquitous, we'll all reap the benefits of saving more data and time.

Marc Lagace was an Apps and Games Editor at Android Central between 2016 and 2020. You can reach out to him on Twitter @spacelagace.

9 Comments
  • That's interesting.
  • Sounds promising....
  • Nice to see it on other phones than just the Pixel
  • It's not on the Pixel. They have been testing it on Google+. There is not one mention of the Pixel phone in this entire article.
  • I really hope they aren't going to do this to Google Photos and that this is now what happens to all your free unlimited storage photos. 25% of the pixels does not a "near full resolution" photo make!! lol
  • The process removes 75% of the pixels before transferring, but when you view the image, 100% will be present. The difference is that 75% of those pixels won't be the "original" pixels, so to speak; they will be an intelligent guess as to what the original pixels were. But it will still be a full-resolution photo.
  • "This cuts down on bandwidth constraints and load times while maintaining the integrity of the original images" Well no, it doesn't. The displayed images may be close, to the naked eye, to the original images, but data has been lost, so the integrity of the original images is not maintained.
  • No offense to Marc, but I don't think the article is super clear on how bandwidth is saved without compromising image quality. Either that or Google is being a little vague in describing how RAISR works. "...which has allowed them to take a low-res version of a photo, process it through RAISR, and deliver the photos to people on a subset of Android devices with near full-resolution quality while keeping the file sizes low." I think a step is missing here. So, if I'm understanding it correctly, the process takes a full-resolution photo and removes 75% of the pixels. When a user wants to view the photo, the new, reduced photo (with only 25% of the pixels) is transmitted to the user's device along with metadata about the 75% of the pixels that were removed. Then, the RAISR process uses the metadata to do its best job of recreating the missing 75% of pixels and presents a full-resolution photo to the user. This last step happens on the device, though, which is where the bandwidth savings come in. Basically, it's device- or client-side rendering. RAISR uses fancy image analysis in order to render a full-resolution photo on the device even though only 25% of the original photo was downloaded to the device.
  • This sounds great.