Step by Step Towards Differential Privacy: Randomized Response

“Privacy is obtained by the process; there are no ‘good’ or ‘bad’ responses.” [3] (Image caption: Allegory of Grammar and Logic/Dialectic, fountain in Perugia)
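
As a taste of the mechanism this post opens with, here is a minimal randomized-response sketch in plain Python (the function names are illustrative, not from the post): each reported bit satisfies epsilon-DP with epsilon = ln(p / (1 - p)), and the population proportion can be debiased afterwards.

```python
import random

def randomized_response(truth: bool, p: float = 0.75) -> bool:
    """Report the true bit with probability p, the opposite otherwise.
    Satisfies epsilon-DP with epsilon = ln(p / (1 - p))."""
    return truth if random.random() < p else not truth

def debias(responses, p: float = 0.75) -> float:
    """Recover the true proportion pi from noisy reports, using
    E[reported] = p * pi + (1 - p) * (1 - pi)."""
    reported = sum(responses) / len(responses)
    return (reported - (1 - p)) / (2 * p - 1)
```

With p = 0.75, each individual report is protected (epsilon = ln 3), yet the aggregate estimate stays accurate as the number of respondents grows.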

Local sensitivity for differential privacy from scratch

In this code tutorial, I show the difference between local and global sensitivity, and program from scratch the calculation of local sensitivity for both the bounded and unbounded definitions of differential privacy.
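
A rough sketch of the idea the tutorial builds (the mean query and the variable names are my own illustration): local sensitivity fixes the actual dataset D and asks how much the query can change on any neighbor of D.

```python
from statistics import mean

def local_sensitivity_unbounded(data, query):
    """Unbounded-DP neighbors differ by adding or removing one record.
    Here we check only the removal direction, which needs no domain knowledge."""
    base = query(data)
    return max(abs(base - query(data[:i] + data[i + 1:])) for i in range(len(data)))

def local_sensitivity_bounded(data, query, domain):
    """Bounded-DP neighbors differ by replacing one record, so we try
    swapping each record for every candidate value in a finite domain."""
    base = query(data)
    return max(
        abs(base - query(data[:i] + [v] + data[i + 1:]))
        for i in range(len(data))
        for v in domain
    )
```

For the mean of [10, 20, 30], removing one record shifts the answer by at most 5, while replacing a record with any value in {0, 100} can shift it by 30.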

Global sensitivity for differential privacy from scratch

In this code tutorial, I show how to empirically calculate the global sensitivity for the bounded and unbounded definitions of differential privacy.
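
A brute-force sketch of the empirical approach (my own illustration, assuming a small finite domain so full enumeration is feasible): unlike local sensitivity, global sensitivity takes the worst case over every possible dataset, not just the one at hand.

```python
from itertools import product
from statistics import mean

def global_sensitivity_bounded(query, domain, n):
    """Enumerate every length-n dataset over a finite domain, then every
    single-record replacement, and keep the worst-case change in the query."""
    worst = 0.0
    for d in product(domain, repeat=n):
        base = query(list(d))
        for i in range(n):
            for v in domain:
                neighbor = list(d[:i]) + [v] + list(d[i + 1:])
                worst = max(worst, abs(base - query(neighbor)))
    return worst
```

For the mean over datasets of n records bounded in [lo, hi], this recovers the analytic answer (hi - lo) / n.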

Differential Identifiability

In this code tutorial, we implement differential identifiability, a differential privacy definition proposed by Jaewoo Lee et al. This definition helps practitioners decide in a more intuitive manner what the value of epsilon should be, a major open problem in the field.

Choosing Epsilon for Differential Privacy

In the paper behind this code tutorial, Jaewoo Lee et al. propose bounds on epsilon so that a randomized query result cannot lead to a posterior belief greater than the acceptable disclosure risk. In this post, we code their solution.

Local vs Global Differential Privacy

Summary: We learn best with toy code we can play with and by diving into the documentation. This beginner-friendly blog post covers a quick and easy introduction…

Privacy Series Basics: Definition

Summary: Sometimes knowing the basics is important! This beginner-friendly blog post covers a quick and easy introduction to Differential Privacy…

Differential Privacy using PyDP

Differential Privacy using PyDP: an introductory tutorial. Here's an outline: What does differential privacy try to address? Why doesn't anonymization suffice? A PyDP example walkthrough.
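
PyDP wraps Google's C++ differential privacy library; under the hood, its simple numeric queries come down to the Laplace mechanism. A plain-Python sketch of that core idea (this is a teaching sketch, not PyDP's actual API):

```python
import random

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Release true_value + Laplace(0, sensitivity / epsilon) noise.
    The difference of two Exp(1) draws is a standard Laplace sample."""
    scale = sensitivity / epsilon
    noise = (random.expovariate(1.0) - random.expovariate(1.0)) * scale
    return true_value + noise
```

A counting query has sensitivity 1, so `laplace_mechanism(count, 1.0, 0.5)` yields an epsilon = 0.5 release of the count.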

Tempered Sigmoid Activations for Deep Learning with Differential Privacy

Read to find out how tempered sigmoid activations help overcome the problem of exploding gradients and yield better accuracy under differentially private model training.

PySyft + Opacus: Federated Learning with Differential Privacy

We use Opacus from PyTorch and PySyft from OpenMined to combine Federated Learning with Differential Privacy.

Private Deep Learning of Medical Data for Hospitals using Federated Learning and Differential privacy

Featuring Dmitrii Usynin, speaker at #PriCon2020 (Sept 26 & 27). With the OpenMined Private Conference 2020 around the corner…

Build PATE Differential Privacy in PyTorch

Summary: In this blog we’re going to discuss PATE, "Private Aggregation of Teacher Ensembles". PATE is a private machine learning technique…
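
The aggregation step at PATE's heart is easy to sketch: each teacher model votes a label, Laplace noise is added to the vote counts, and the noisy winner becomes the student's training label. A toy version in plain Python (the function name is illustrative; real PATE also tracks the privacy budget spent per query):

```python
import random
from collections import Counter

def noisy_max_vote(teacher_labels, epsilon: float):
    """Add Laplace(1/epsilon) noise to each label's vote count and
    return the label with the largest noisy count."""
    scale = 1.0 / epsilon
    noisy = {
        label: count + (random.expovariate(1.0) - random.expovariate(1.0)) * scale
        for label, count in Counter(teacher_labels).items()
    }
    return max(noisy, key=noisy.get)
```

When the teachers strongly agree, the noise almost never flips the outcome, so the student learns the consensus label while no single teacher's data decides the answer.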

Differentially Private Deep Learning in 20 lines of code: How to use the PyTorch Opacus Library

I learn best from toy code I can play with. This tutorial teaches Differentially Private Deep Learning using a recently released library called Opacus.
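
What Opacus changes in an ordinary training loop is surprisingly small: clip each per-sample gradient, sum, add Gaussian noise. A dependency-free sketch of that single step (not the Opacus API; on real PyTorch models the library's privacy engine does this for you):

```python
import math
import random

def dp_sgd_step(per_sample_grads, max_norm, noise_mult, rng):
    """Clip each per-sample gradient to L2 norm max_norm, sum them, add
    N(0, (noise_mult * max_norm)^2) noise per coordinate, then average."""
    dim = len(per_sample_grads[0])
    summed = [0.0] * dim
    for g in per_sample_grads:
        norm = math.sqrt(sum(x * x for x in g))
        factor = min(1.0, max_norm / norm) if norm > 0 else 1.0
        for j in range(dim):
            summed[j] += g[j] * factor
    sigma = noise_mult * max_norm
    return [(s + rng.gauss(0.0, sigma)) / len(per_sample_grads) for s in summed]
```

The clipping bounds any one example's influence on the update, and the noise converts that bound into a formal privacy guarantee tracked across epochs.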

What is Differential Privacy by Shuffling?

This post is part of our Privacy-Preserving Data Science, Explained series. Differential privacy has been established as the gold standard…
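
The shuffle model sits between local and central DP: each client randomizes its own data, then a shuffler strips identities and ordering before the analyst sees anything, which amplifies the weak local guarantee into a much stronger central one. A toy sketch (the function names are illustrative):

```python
import random

def local_report(bit: bool, p: float = 0.75) -> bool:
    """Each client runs randomized response on its own bit."""
    return bit if random.random() < p else not bit

def shuffled_reports(client_bits, p: float = 0.75):
    """The shuffler collects everyone's noisy report and discards the
    order, so a report can no longer be linked back to its client."""
    reports = [local_report(b, p) for b in client_bits]
    random.shuffle(reports)
    return reports
```

The analyst still sees the same multiset of noisy bits, so aggregate statistics survive, but the per-client link that local DP alone would leak through side channels is gone.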

Dev Diaries: Wrapping Differential Privacy for Python

The compelling use cases for differential privacy are growing each day. Engineers at OpenMined have been busy building libraries to improve…

Privacy-Preserving Data Science, Explained

In this blog series, we’ll explain common topics in privacy-preserving data science, from a single sentence to code examples. We hope these posts serve as a useful resource for you to figure out the best techniques for your organization.

Maintaining Privacy in Medical Data with Differential Privacy

How can you make use of these datasets without accessing them directly? How can you assure these hospitals that their patients’ data will be protected? Is it even a possibility?

Dev Diaries: Bringing Google's differential privacy to iOS - the struggle, the pain and the glory!

Madhava Jay, iOS ecosystem lead for OpenMined's Differential Privacy team, recently released the alpha version of SwiftDP. Here he shares his…

Use Cases of Differential Privacy

In this blog post, we will cover a few use cases of differential privacy (DP) ranging from biomedical dataset analysis to geolocation.

Roadmap to Differential Privacy for All

Team DP has been tasked with delivering differential privacy as a simple-to-use package for app developers, data engineers and machine learning…

OpenMined is wrapping Google's differential privacy into your app (and we need you!)

Building an easy-to-use wrapper around a robust cryptography library for use in mobile apps and browsers. COVID-19 is not the first pandemic…