Looking for a Kotlin/Java/Android developer to join our PSI team!

Our efforts in PSI: We recently published an article shining light on private set intersection (PSI) and its use in the COVID-19 crisis. We are now looking for a Kotlin/Java/Android developer to join the team.

PySyft Tutorial in Bengali

Welcome! OpenMined is an open-source community whose goal is to make the world more privacy-preserving by lowering the barrier-to-entry to private AI technologies.

OpenMined Featured Contributor: May 2020

Interview with Sourav Das, OpenMined's Featured Contributor for May 2020!

Dev Diaries: Wrapping Differential Privacy for Python

The compelling use cases for differential privacy are growing each day. Engineers at OpenMined have been busy building libraries to make differential privacy easier to use from Python.

Sentiment Analysis on Multiple Datasets With SyferText - Demo

How can you do pre-processing if you are not allowed to access the plaintext data? SyferText can help: it lets you define pre-processing components that run remotely and blindly, in a completely secure fashion.

Federated Learning for Credit Scoring

Want bureaus to score your credit without hoarding your data? Find out how FL can enable privacy-preserving, cross-border credit assessment.

A Federated Learning Approach for Pill Identification

Alright, so you’ve built an MNIST classifier using Federated Learning. Now it’s time to build something a little cooler. Let’s build a pill classifier with a federated learning approach.

Announcing the OpenMined Operations Team

As OpenMined continues to grow, we are excited to announce the OpenMined Operations Team! The mission of the Operations Team is to empower the other teams across the community.

What is Secure Multi-Party Computation?

This post is part of our Privacy-Preserving Data Science, Explained series.
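
To give a flavour of the core idea before the full post, below is a minimal additive secret-sharing sketch in plain Python; this is not OpenMined's implementation, and the modulus, party count, and example values are illustrative assumptions only.

```python
import random

# A toy additive secret-sharing scheme: the secret is split into random
# shares that individually look uniform, yet sum back to the secret mod Q.
Q = 2**61 - 1

def share(secret, n_parties=3):
    shares = [random.randrange(Q) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % Q)
    return shares

def reconstruct(shares):
    return sum(shares) % Q

x_shares = share(25)
y_shares = share(17)

# Each party adds its own shares locally; no party ever sees 25 or 17.
z_shares = [(a + b) % Q for a, b in zip(x_shares, y_shares)]
print(reconstruct(z_shares))  # 42
```

Addition of shared values is the simplest case; real protocols add extra machinery (e.g. multiplication triples) for products and comparisons.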

Privacy-Preserving Data Science, Explained

In this blog series, we’ll explain common topics in privacy-preserving data science, from a single sentence to code examples. We hope these posts serve as a useful resource for you to figure out the best techniques for your organization.

What is Federated Learning?

This post is part of our Privacy-Preserving Data Science, Explained series. In this article of the introductory series on Private ML, we will explain what federated learning is and how it works.
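
As a quick taste of the mechanics discussed there, here is a minimal federated-averaging sketch in NumPy rather than PySyft; the linear model, single-step local update, and equal-weight averaging are simplifying assumptions, not the article's code.

```python
import numpy as np

# Minimal federated-averaging sketch: each client trains locally and only
# model weights (never raw data) are sent back and averaged by the server.
def local_update(weights, X, y, lr=0.1):
    # One step of linear-regression gradient descent on the client's own data.
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def federated_average(client_weights):
    # The server combines the clients' updates without ever seeing their data.
    return np.mean(client_weights, axis=0)

rng = np.random.default_rng(0)
global_w = np.zeros(3)
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(4)]

for _ in range(10):  # a few federated rounds
    updates = [local_update(global_w.copy(), X, y) for X, y in clients]
    global_w = federated_average(updates)
```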

Maintaining Privacy in Medical Data with Differential Privacy

How can you make use of these datasets without accessing them directly? How can you assure these hospitals that their patients’ data will be protected? Is it even a possibility?

Announcing the OpenMined Learning Team

We are incredibly pleased to announce the OpenMined Learning Team, formed in response to the rising usage of Artificial Intelligence techniques to solve a wide range of problems.

Dev Diaries: Bringing Google's differential privacy to iOS - the struggle, the pain and the glory!

Madhava Jay, iOS ecosystem lead for OpenMined's Differential Privacy team, recently released the alpha version of SwiftDP. Here he shares his experience.

Towards privacy with RStudio: Encrypted deep learning with Syft and Keras

In this post, we introduce Syft, an open-source framework that integrates with PyTorch as well as TensorFlow, and show how to use it from R. In an example use case, we obtain private predictions from an R Keras model.

Privacy-Preserving AI Summary: MIT Deep Learning Series

This article briefly discusses the key concepts covered in Part 1 of this lecture by Andrew Trask, which is part of the MIT Deep Learning Series.

Looking for a Go/C++ developer to join our PSI team!

Our efforts in PSI: We recently published an article shining light on private set intersection (PSI) and its use in the COVID-19 crisis. We are now looking for a Go/C++ developer to join the team.

Use Cases of Differential Privacy

In this blog post, we will cover a few use cases of differential privacy (DP) ranging from biomedical dataset analysis to geolocation.
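
The mechanism underlying many of these use cases can be sketched in a few lines; the following Laplace-mechanism example in Python is a simplified illustration (not an OpenMined library), with the sensitivity of 1 assuming a plain counting query and epsilon chosen arbitrarily.

```python
import numpy as np

def private_count(values, predicate, epsilon=0.5):
    """Differentially private count via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one person changes
    the count by at most 1), so noise is drawn from Laplace(0, 1/epsilon).
    """
    true_count = sum(predicate(v) for v in values)
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

ages = [34, 41, 29, 62, 55, 47, 38]
print(private_count(ages, lambda a: a > 40, epsilon=0.5))
```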

A privacy-preserving way to find the intersection of two datasets

This post is part of our Privacy-Preserving Data Science, Explained series. Private set intersection (PSI) is a powerful cryptographic technique that lets two parties compute the intersection of their datasets without revealing anything else about them.
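
As a rough illustration of the idea (not the construction used by OpenMined's PSI library), here is a toy Diffie-Hellman-style PSI sketch in Python; the modulus, hashing, and key handling are simplifications for readability and are not secure.

```python
import hashlib
import secrets

# Each party blinds hashed items with its secret exponent; once both
# exponents have been applied, the blinded values match only on common items.
P = 2**127 - 1  # a prime modulus, chosen purely for illustration

def h(item):
    return int.from_bytes(hashlib.sha256(item.encode()).digest(), "big") % P

def blind(items, key):
    return {pow(h(x), key, P) for x in items}

def reblind(blinded, key):
    return {pow(v, key, P) for v in blinded}

a_key, b_key = secrets.randbelow(P - 2) + 1, secrets.randbelow(P - 2) + 1
alice = ["alice@mail.com", "bob@mail.com"]
bob = ["bob@mail.com", "carol@mail.com"]

alice_double = reblind(blind(alice, a_key), b_key)  # H(x)^(a*b) for Alice's set
bob_double = reblind(blind(bob, b_key), a_key)      # H(x)^(a*b) for Bob's set
print(len(alice_double & bob_double))               # size of the intersection: 1
```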

OpenMined provides open-source privacy for COVID-19 apps

Many apps are being built everywhere in response to the COVID-19 pandemic. They aim to help reduce the threats communities face, both economic and health-related. Data privacy is critical for these apps: not only is privacy a human right, it is also essential for earning trust in these apps and, in turn, their adoption.

Extracting Private Data from a Neural Network

Neural networks are not infallible. In this post we use a model inversion attack to discover data which was input to a target model.
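
For intuition, here is a minimal sketch of such an attack in PyTorch; `target_model`, the input shape, and the optimisation settings are hypothetical placeholders rather than the exact attack described in the post.

```python
import torch

def invert(target_model, target_class, shape=(1, 1, 28, 28), steps=500, lr=0.1):
    """Gradient-based reconstruction of a class prototype from a trained model."""
    x = torch.zeros(shape, requires_grad=True)
    opt = torch.optim.Adam([x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        # Maximise the target-class logit by minimising its negative.
        loss = -target_model(x)[0, target_class]
        loss.backward()
        opt.step()
        with torch.no_grad():
            x.clamp_(0.0, 1.0)  # keep the reconstruction in a valid pixel range
    return x.detach()
```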

Build a Homomorphic Encryption Scheme from Scratch with Python

This blog post explains the basic mathematical concepts behind most of today's homomorphic encryption schemes, then builds on them to implement our own scheme (similar to BFV) from scratch in Python.
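
To preview what "computing on ciphertexts" means, here is a toy symmetric scheme over the integers, in the spirit of DGHV rather than the BFV-like scheme the post actually builds; the parameters are illustrative and the construction is not secure.

```python
import random

def keygen(bits=256):
    # Secret key: a large odd integer p (high bit set so p is full-size).
    return random.getrandbits(bits) | (1 << (bits - 1)) | 1

def encrypt(p, m, noise_bits=16, q_bits=128):
    # c = m + 2*r + p*q; decryption works while the noise m + 2*r stays below p.
    r = random.getrandbits(noise_bits)
    q = random.getrandbits(q_bits)
    return m + 2 * r + p * q

def decrypt(p, c):
    return (c % p) % 2

p = keygen()
c0, c1 = encrypt(p, 0), encrypt(p, 1)
assert decrypt(p, c0 + c1) == 1  # adding ciphertexts XORs the plaintext bits
assert decrypt(p, c0 * c1) == 0  # multiplying ciphertexts ANDs the plaintext bits
```

Each homomorphic operation grows the hidden noise term, which is exactly why schemes like BFV need careful parameter choices and noise management.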

OpenMined Featured Contributor: April 2020

Interview with Benjamin Szymkow, OpenMined's Featured Contributor for April 2020!

Genetic data privacy in the dawn of big data forensics

“You would see hundreds and hundreds of unsolved crimes solved overnight,” says Detective Michael Fields of the Orlando Police Department.