Google is helping organizations preserve user privacy with its new open-source tools

Google Campus Logo (Image credit: Android Central)

What you need to know

  • Google has announced the open-source version of its differential privacy library for developers and organizations.
  • Developers can use Google's open-source differential privacy library to develop tools that can help gain insights from large data sets while preserving the privacy of individuals.
  • Some of the other open-source privacy technologies announced by Google this year include Tensorflow Privacy, Tensorflow Federated, Private Join and Compute.

Google today announced the rollout of an open-source version of its differential privacy library, aiming to make it easier for developers and organizations to build tools that can help protect private information when gathering insights from big data sets.

Here's what Miguel Guevara, Product Manager in Google's Privacy and Data Protection Office, wrote in a blog post explaining differential privacy:

Differentially-private data analysis is a principled approach that enables organizations to learn from the majority of their data while simultaneously ensuring that those results do not allow any individual's data to be distinguished or re-identified. This type of analysis can be implemented in a wide variety of ways and for many different purposes. For example, if you are a health researcher, you may want to compare the average amount of time patients remain admitted across various hospitals in order to determine if there are differences in care. Differential privacy is a high-assurance, analytic means of ensuring that use cases like this are addressed in a privacy-preserving manner.
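To make the idea concrete: a common way to achieve differential privacy is the Laplace mechanism, where calibrated random noise masks any single individual's contribution. The sketch below (plain Python, not the API of Google's library, whose actual interface is not shown in this article) computes a differentially private mean of hospital stay durations, matching the health-research example above. The bounds, epsilon value, and function names here are illustrative assumptions.

```python
import math
import random

def laplace_noise(scale, rng=random):
    # Sample Laplace(0, scale) noise via the inverse CDF.
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_mean(values, lower, upper, epsilon):
    """Differentially private mean via the Laplace mechanism (sketch).

    Each value is clamped to [lower, upper] so that one record can shift
    the sum by at most (upper - lower); Laplace noise scaled to
    sensitivity / epsilon then hides any individual's contribution.
    For simplicity, the record count is assumed to be public.
    """
    clamped = [min(max(v, lower), upper) for v in values]
    sensitivity = upper - lower
    noisy_sum = sum(clamped) + laplace_noise(sensitivity / epsilon)
    return noisy_sum / len(values)

# Hypothetical hospital-stay durations in days:
stays = [2.0, 4.5, 3.0, 7.0, 5.5]
print(dp_mean(stays, lower=0.0, upper=14.0, epsilon=1.0))
```

Smaller epsilon values add more noise and give stronger privacy; larger values give more accurate answers at the cost of weaker guarantees.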

The open-source version of the differential privacy library rolled out by Google today supports most common data science operations, allowing developers to compute counts, sums, averages, medians, and percentiles with relative ease. The library also offers a PostgreSQL extension and an extensible 'Stochastic Differential Privacy Model Checker library' to help developers catch mistakes.
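Of those operations, a count is the simplest to make differentially private: since adding or removing one person changes a count by at most 1, the sensitivity is 1, and Laplace noise with scale 1/epsilon suffices. The following is a minimal sketch of that idea in Python; Google's actual library is not written in Python, and the names below are illustrative assumptions.

```python
import math
import random

def laplace_noise(scale, rng=random):
    # Sample Laplace(0, scale) noise via the inverse CDF.
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(true_count, epsilon):
    # A count has sensitivity 1 (one person changes it by at most 1),
    # so Laplace noise with scale 1/epsilon gives epsilon-DP.
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical query: how many patients stayed longer than 5 days?
print(dp_count(true_count=42, epsilon=0.5))
```

Sums, averages, medians, and percentiles follow the same pattern but require bounding each individual's possible contribution before calibrating the noise.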

As noted by The Verge, Google began using Randomized Aggregatable Privacy-Preserving Ordinal Response (RAPPOR) to improve Chrome in 2014. The technique lets Google gather aggregate usage insights from the Chrome browser while ensuring sensitive information such as individual browsing histories cannot be traced back to users. In March this year, Google introduced the open-source TensorFlow Privacy module to help developers train machine-learning models with privacy.
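The core idea underlying RAPPOR is randomized response: each client randomly perturbs its own answer before reporting it, so no single report is trustworthy, yet the aggregate remains accurate. The sketch below shows only this basic building block, not RAPPOR itself, which additionally uses Bloom filters and two stages of randomization.

```python
import random

def randomized_response(truth, rng=random):
    # Flip a fair coin: heads, report the truth;
    # tails, report the outcome of a second fair coin.
    if rng.random() < 0.5:
        return truth
    return 1 if rng.random() < 0.5 else 0

def estimate_rate(reports):
    # Each report is "yes" with probability 0.5 * true_rate + 0.25,
    # so invert that relation to recover the population rate.
    p_yes = sum(reports) / len(reports)
    return (p_yes - 0.25) / 0.5

# Hypothetical survey of a sensitive yes/no attribute:
rng = random.Random(7)
population = [1] * 300 + [0] * 700   # true rate: 0.3
reports = [randomized_response(x, rng) for x in population]
print(estimate_rate(reports))
```

Because any individual report may be a coin flip rather than the truth, the collector learns the overall rate without learning any one person's answer.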

Babu Mohan
News Writer