OpenMined Featured Contributor: July 2021

Interview with Irina Bejan, OpenMined's Featured Contributor for July 2021!

Install PySyft using Conda

PySyft is a Python library for secure and private Deep Learning. It uses Federated Learning, Differential Privacy, and Encrypted Computation.

Yes, privacy is worth the effort. Here’s why

When we advocate for privacy, we tend to concentrate on the negative consequences of privacy violations [56; 32; 19; 50]. These portrayals …

OpenMined Featured Contributor: June 2021

Interview with George Kaissis, OpenMined's Featured Contributor for June 2021!

Interview with Chinmay Shah: Differential Privacy Team Lead at OpenMined

In this interview, Chinmay Shah, who leads the Differential Privacy team at OpenMined, talks about his interests, the team's journey and the exciting road ahead!

The Next Generation of Data Ethics Tools

In early 2021, Consequential, alongside the Open Data Institute (ODI), conducted a short, qualitative research project to explore the impact of data ethics tools in supporting a vision of a world where tech and data work for everyone.

Private AI: Machine Learning on Encrypted Data

Protect the privacy of your data by encrypting it: outsource computations on the encrypted data, then decrypt on your end to view the results.
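
As a rough illustration of that encrypt-compute-decrypt workflow (not necessarily the library or parameters used in the post), here is a minimal sketch using OpenMined's TenSEAL library with the CKKS scheme; the encryption parameters, features, and weights below are purely illustrative assumptions:

```python
import tenseal as ts  # OpenMined's homomorphic encryption library

# Illustrative CKKS parameters (assumed for this sketch, not taken from the post).
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2**40
context.generate_galois_keys()

features = [0.5, 1.5, -2.0]                       # private data, encrypted locally
enc_features = ts.ckks_vector(context, features)

# "Outsourced" computation: a linear model evaluated directly on the ciphertext.
weights = [0.25, -0.5, 1.0]
enc_score = enc_features.dot(weights)

# Only the holder of the secret key can decrypt and view the result.
print(enc_score.decrypt())
```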

Local sensitivity for differential privacy from scratch

In this code tutorial, I show the difference between local and global sensitivity, and calculate local sensitivity from scratch for both the bounded and unbounded definitions of differential privacy.
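
For a flavour of the bounded/unbounded distinction, here is a minimal sketch (not the tutorial's own code) that computes local sensitivity for a mean query, assuming records lie in a known range [lower, upper]; the dataset and bounds are illustrative:

```python
import numpy as np

def local_sensitivity_unbounded(data, query, lower, upper):
    """Local sensitivity under the unbounded definition:
    neighbours are obtained by adding or removing one record."""
    data = np.asarray(data, dtype=float)
    base = query(data)
    diffs = []
    # Removing each existing record in turn.
    for i in range(len(data)):
        diffs.append(abs(query(np.delete(data, i)) - base))
    # Adding one record: for the mean, a domain extreme maximises the change.
    for value in (lower, upper):
        diffs.append(abs(query(np.append(data, value)) - base))
    return max(diffs)

def local_sensitivity_bounded(data, query, lower, upper):
    """Local sensitivity under the bounded definition:
    neighbours are obtained by replacing one record with another domain value."""
    data = np.asarray(data, dtype=float)
    base = query(data)
    diffs = []
    for i in range(len(data)):
        # For the mean, it suffices to check the domain extremes as replacements.
        for value in (lower, upper):
            neighbour = data.copy()
            neighbour[i] = value
            diffs.append(abs(query(neighbour) - base))
    return max(diffs)

ages = [23, 35, 45, 52, 60]
print(local_sensitivity_unbounded(ages, np.mean, lower=0, upper=100))
print(local_sensitivity_bounded(ages, np.mean, lower=0, upper=100))
```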

Structured Transparency Continued: Verifying Inputs and Outputs

More on Structured Transparency: Input and Output Verification, and Flow Governance

Private Deep Learning of Medical Data for Hospitals using Federated Learning and Differential Privacy

Speakers: Kritika Prakash, Lucile Saulnier, Dmitrii Usynin, Zarreen Naowal Reza. This talk provides an example of private deep learning on medical data using federated learning and differential privacy.

Global sensitivity for differential privacy from scratch

In this code tutorial, I show how to empirically calculate the global sensitivity for both the bounded and unbounded definitions of differential privacy.
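
As a companion sketch (again, not the tutorial's own code), global sensitivity can be estimated empirically by sampling many pairs of neighbouring datasets and keeping the worst observed change; the query, dataset size, and bounds below are illustrative, and the sampled estimate is only a lower bound on the true global sensitivity:

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_global_sensitivity(query, n, lower, upper, trials=10_000, bounded=True):
    """Estimate the global sensitivity of `query` over datasets of size n with
    records in [lower, upper] by sampling random neighbouring datasets.
    Random sampling only lower-bounds the true worst case."""
    worst = 0.0
    for _ in range(trials):
        data = rng.uniform(lower, upper, size=n)
        i = rng.integers(n)
        if bounded:
            # Bounded DP: a neighbour replaces one record with another domain value.
            neighbour = data.copy()
            neighbour[i] = rng.uniform(lower, upper)
        else:
            # Unbounded DP: a neighbour removes one record.
            neighbour = np.delete(data, i)
        worst = max(worst, abs(query(data) - query(neighbour)))
    return worst

# For the mean of 100 records in [0, 1] the bounded global sensitivity is 1/100,
# so the estimate below approaches 0.01 from below as `trials` grows.
print(empirical_global_sensitivity(np.mean, n=100, lower=0.0, upper=1.0))
```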

OpenMined Featured Contributor: May 2021

Interview with Thiago Costa Porto, OpenMined's Featured Contributor for May 2021!

Defining Privacy Tech

An attempt to avoid talking past each other so we can instead direct our efforts towards moving the needle on privacy through privacy innovation

Neural Networks as Steganographic Vehicles

Steganography is the art of hiding information. In this blog post we’ll see how neural networks can be used as a vehicle for covert communication.

Differential Privacy from Program Analysis Perspective

An interview with Yuxin Wang, a rising star in the program analysis of differential privacy.

Confidential Computing Explained. Part 2: Attestation

This post introduces the concept of attestation with Intel SGX enclaves

Differential Identifiability

In this code tutorial, we implement differential identifiability, a differential privacy definition proposed by Jaewoo Lee et al. This definition helps practitioners decide, in a more intuitive manner, what the value of epsilon should be, a major problem in the field.

OpenMined Featured Contributor: April 2021

Interview with Fatemeh Mireshghallah, OpenMined's Featured Contributor for April 2021!

Confidential computing explained. Part 1: introduction

This post is a first introduction to the basic principles of Confidential Computing.

Installing PySyft v0.5.0 with PyGrid on a Raspberry Pi 4

A step-by-step walkthrough: install PyTorch v1.8.1 (1 min), install the other dependencies (6 min), install syft 0.5.0 (6 min), then test the environment.

Choosing Epsilon for Differential Privacy

The authors of the paper behind this code tutorial (Jaewoo Lee et al.) proposed a bound on epsilon so that the randomized query output does not lead to a posterior belief greater than the disclosure risk. In this post, we code their solution.

Building Differential Privacy at Scale

Observe how a differential privacy infrastructure is built from the ground up. You'll see use cases showing that making the infrastructure “user-centric” is crucial, and we discuss why people are reluctant to use such infrastructure.

PRIMAL: a framework for secure evaluation of neural networks

This is a summary of the talk by Daniel Escudero at the OpenMined Privacy Conference 2020. Daniel Escudero talked about PRIMAL, a framework for the secure evaluation of neural networks.

Adaptive Federated Optimization

In non-federated settings, adaptive optimization methods have desirable convergence properties. Can federated versions of these adaptive optimizers, including Adagrad, Adam, and Yogi, facilitate better convergence in the presence of heterogeneous data?
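
As a rough sketch of the idea behind one of these methods, FedAdam from the paper applies a server-side Adam-style step to the averaged client update, treated as a pseudo-gradient; the hyperparameters and toy client models below are illustrative assumptions:

```python
import numpy as np

def fedadam_round(global_w, client_ws, state, lr=0.1, b1=0.9, b2=0.99, tau=1e-3):
    """One server round of FedAdam: the averaged client update acts as a
    pseudo-gradient for an Adam-style step (FedYogi differs only in how v is updated)."""
    delta = np.mean([w - global_w for w in client_ws], axis=0)      # pseudo-gradient
    state["m"] = b1 * state["m"] + (1 - b1) * delta
    state["v"] = b2 * state["v"] + (1 - b2) * delta ** 2
    new_w = global_w + lr * state["m"] / (np.sqrt(state["v"]) + tau)
    return new_w, state

# Toy example: two clients return locally trained weights for a 3-parameter model.
w = np.zeros(3)
state = {"m": np.zeros(3), "v": np.zeros(3)}
clients = [np.array([0.2, -0.1, 0.4]), np.array([0.3, 0.0, 0.1])]
w, state = fedadam_round(w, clients, state)
print(w)
```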