Meet OpenMined's new UCSF-OpenMined Fellows

We’re very excited to announce the recipients of the latest round of open-source software development grants in the OpenMined community, generously sponsored by the University of California San Francisco!

Announcing the OpenMined-UCSF Data-Centric Federated Learning Fellowship

We’re very excited to announce the next round of open-source software development grants in the OpenMined community, generously sponsored by the University of California San Francisco! These grants will focus on bringing data-centric federated learning with differential privacy budgeting to PyGrid.

OpenMined + Genesis Cloud Partnership For Cloud Infrastructure and Applications

The mission at OpenMined is to make the world more privacy-preserving by lowering the barrier-to-entry to privacy-preserving technologies th

Looking for a Kotlin/Java/Android developer to join our PSI team!

Our efforts in PSI: We recently published an article shining light on private set intersection (PSI) and its use in the COVID-19 crisis. We a

PySyft Tutorial in Bengali

Welcome! OpenMined is an open-source community whose goal is to make the world more privacy-preserving by lowering the barrier-to-entry to privacy-preserving technologies

OpenMined Featured Contributor: May 2020

Interview with Sourav Das, OpenMined's Featured Contributor for May 2020!

Dev Diaries: Wrapping Differential Privacy for Python

The compelling use cases for differential privacy are growing each day. Engineers at OpenMined have been busy building libraries to improve

Sentiment Analysis on Multiple Datasets With SyferText - Demo

How can you do pre-processing when you are not allowed to access the plaintext data? SyferText can help! With SyferText, you can define pre-processing components that run remotely, blindly, and in a completely secure fashion.

Federated Learning for Credit Scoring

Want bureaus to score your credit without hoarding your data? Find out how FL can enable privacy-preserving, cross-border credit assessment.

A Federated Learning Approach for Pill Identification

Alright, so you’ve built an MNIST classifier using Federated Learning. Now it’s time to build something a little cooler. Let’s build a

Announcing the OpenMined Operations Team

As OpenMined continues to grow we are excited to announce the OpenMined Operations Team! The mission of the Operations Team is to empower ot

What is Secure Multi-Party Computation?

This post is part of our Privacy-Preserving Data Science, Explained Simply series.
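One core SMPC building block is additive secret sharing. As an illustrative sketch (toy field modulus and values chosen here for demonstration, not code from the post), two private numbers can be summed without any single party ever seeing either input:

```python
# Toy additive secret sharing over a prime field (illustrative parameters).
import secrets

Q = 2**61 - 1  # Mersenne prime modulus; a stand-in, not a vetted choice

def share(x, n=3):
    """Split x into n random shares that sum to x mod Q."""
    shares = [secrets.randbelow(Q) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % Q)
    return shares

def reconstruct(shares):
    return sum(shares) % Q

a, b = 25, 17
sa, sb = share(a), share(b)
# Each party adds the shares it holds locally; nobody ever sees a or b.
sum_shares = [(x + y) % Q for x, y in zip(sa, sb)]
print(reconstruct(sum_shares))  # 42
```

Because addition distributes over the shares, the parties can compute the sum jointly while each individual share remains statistically random.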

Privacy-Preserving Data Science, Explained

In this blog series, we’ll explain common topics in privacy-preserving data science, from a single sentence to code examples. We hope these posts serve as a useful resource for you to figure out the best techniques for your organization.

What is Federated Learning?

This post is part of our Privacy-Preserving Data Science, Explained series. Update as of November 18, 2021: The version of PySyft mentioned
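To give a taste of the idea, here is a minimal federated averaging (FedAvg) sketch on a toy linear-regression task. The two simulated clients, learning rate, and round counts are all illustrative assumptions, not PySyft code:

```python
# Minimal FedAvg sketch on toy linear regression; clients and hyperparameters
# are illustrative assumptions, not PySyft code.
import numpy as np

rng = np.random.default_rng(1)
true_w = np.array([2.0, -1.0])

def make_client(n):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    return X, y

clients = [make_client(50), make_client(80)]  # two simulated data owners
w = np.zeros(2)                               # global model on the server
for _ in range(50):                           # communication rounds
    updates, sizes = [], []
    for X, y in clients:
        local = w.copy()
        for _ in range(5):                    # local gradient steps on private data
            local -= 0.05 * 2 * X.T @ (X @ local - y) / len(y)
        updates.append(local)
        sizes.append(len(y))
    # Server aggregates models weighted by dataset size; raw data never moves.
    w = np.average(updates, axis=0, weights=sizes)

print(np.round(w, 2))  # should land near true_w without centralizing the data
```

Only model parameters cross the network; each client's raw examples stay on that client, which is the essence of federated learning.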

Maintaining Privacy in Medical Data with Differential Privacy

How can you make use of these datasets without accessing them directly? How can you assure these hospitals that their patients’ data will be protected? Is it even a possibility?
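One standard answer is the Laplace mechanism from differential privacy. A minimal sketch follows, with a hypothetical patient list and an assumed epsilon; it shows the shape of the technique, not a production implementation:

```python
# Toy Laplace mechanism: answer a counting query about (hypothetical) patient
# records with epsilon-differential privacy.
import numpy as np

rng = np.random.default_rng()

def dp_count(records, predicate, epsilon):
    true_count = sum(predicate(r) for r in records)
    # One person joining or leaving changes a count by at most 1
    # (sensitivity 1), so Laplace noise with scale 1/epsilon suffices.
    return true_count + rng.laplace(scale=1.0 / epsilon)

patients = [{"age": a} for a in (34, 67, 71, 45, 80, 52)]  # made-up records
noisy = dp_count(patients, lambda r: r["age"] > 60, epsilon=0.5)
print(noisy)  # near the true count of 3, perturbed by calibrated noise
```

The hospital releases only the noisy answer, so no individual patient's presence can be confidently inferred from the output.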

Announcing the OpenMined Learning Team

We are incredibly pleased to announce the OpenMined Learning Team. With the rising usage of Artificial Intelligence techniques to solve a wi

Dev Diaries: Bringing Google's differential privacy to iOS - the struggle, the pain and the glory!

Madhava Jay, iOS ecosystem lead for OpenMined's Differential Privacy team, recently released the alpha version of SwiftDP. Here he shares hi

Towards privacy with RStudio: Encrypted deep learning with Syft and Keras

In this post, we introduce Syft, an open-source framework that integrates with PyTorch as well as TensorFlow, and show how to use it from R. In an example use case, we obtain private predictions from an R Keras model.

Privacy-Preserving AI Summary: MIT Deep Learning Series

Update as of November 18, 2021: The version of PySyft mentioned in this post has been deprecated. Any implementations using this older versi

Looking for a Go/C++ developer to join our PSI team!

Our efforts in PSI: We recently published an article shining light on private set intersection (PSI) and its use in the COVID-19 crisis. We ar

Use Cases of Differential Privacy

In this blog post, we will cover a few use cases of differential privacy (DP) ranging from biomedical dataset analysis to geolocation.

A privacy-preserving way to find the intersection of two datasets

This post is part of our Privacy-Preserving Data Science, Explained series. Private set intersection (PSI) is a powerful cryptographic technique
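To make the idea concrete, here is a toy Diffie-Hellman-style PSI sketch. The prime, hashing scheme, and example emails are illustrative assumptions, and the parameters are far too small for real security; production PSI should use a vetted library:

```python
# Toy Diffie-Hellman-style PSI (illustrative only: the modulus is far too
# small and the protocol is unhardened).
import hashlib
import secrets

P = 2**127 - 1  # Mersenne prime group modulus (toy parameter)

def h(item):
    # Hash an item into [1, P-1] so it can be exponentiated mod P.
    digest = hashlib.sha256(item.encode()).digest()
    return int.from_bytes(digest, "big") % (P - 1) + 1

def blind(items, key):
    return {pow(h(x), key, P) for x in items}

def reblind(blinded, key):
    return {pow(v, key, P) for v in blinded}

a_key = secrets.randbelow(P - 2) + 1  # Alice's secret exponent
b_key = secrets.randbelow(P - 2) + 1  # Bob's secret exponent

alice = ["alice@example.com", "bob@example.com", "carol@example.com"]
bob = ["bob@example.com", "dave@example.com"]

# Each side blinds its own set, then the other side re-blinds it:
# H(x)^(ab) == H(y)^(ba), so doubly-blinded values match only on shared items.
alice_double = reblind(blind(alice, a_key), b_key)
bob_double = reblind(blind(bob, b_key), a_key)

overlap = alice_double & bob_double
print(len(overlap))  # 1 -> only "bob@example.com" is in both sets
```

Neither party ever sees the other's raw items; only the doubly-blinded values are compared, revealing nothing beyond the overlap.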

OpenMined Provides Open-Source Privacy for COVID-19 Apps

Many apps are being built everywhere for the COVID-19 pandemic. They aim to help reduce the threats facing communities, both economic and health-related. Data privacy is critical for these apps: not only is privacy a human right, it is also essential for earning trust and, in turn, driving adoption of these apps.

Extracting Private Data from a Neural Network

Neural networks are not infallible. In this post we use a model inversion attack to discover data which was input to a target model.
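The attack can be sketched by gradient-ascending the *input* of a frozen model until it strongly activates a chosen class. The random "trained" weights below are stand-ins for a real target model, and the loop uses a simple numerical gradient for clarity:

```python
# Model-inversion sketch: optimize an input so a frozen toy classifier grows
# confident in one class; W is a random stand-in for trained weights.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(10, 4))  # pretend: 10 classes, 4 input features

def confidence(x, target=0):
    logits = W @ x
    e = np.exp(logits - logits.max())  # numerically stable softmax
    return (e / e.sum())[target]

x = rng.normal(size=4) * 0.01  # the attacker starts from near-zero noise
start = confidence(x)
for _ in range(300):
    grad = np.zeros_like(x)  # numerical gradient of confidence w.r.t. input
    for i in range(x.size):
        d = np.zeros_like(x)
        d[i] = 1e-5
        grad[i] = (confidence(x + d) - confidence(x - d)) / 2e-5
    step = x + 0.5 * grad
    if confidence(step) > confidence(x):  # only keep steps that help
        x = step

print(f"target-class confidence: {start:.3f} -> {confidence(x):.3f}")
```

The recovered input approximates what the model "thinks" the target class looks like, which is why a deployed model can leak information about its training data.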