Weekly Digs #5
Good mix of approaches this time, including custom secure computation protocols, secure enclaves, and differential privacy.
Papers
- Private Nearest Neighbors Classification in Federated Databases
Great work on custom MPC protocols for k-NN classification of a sample against a distributed data set, without leaking either the sample or the data set (the concrete use-case is document classification with cosine similarity).
- Chiron: Privacy-preserving Machine Learning as a Service
Interesting look at using secure enclaves to protect both model specifics and the privacy of the training data. The technology avoids the use of heavy cryptography and remains promising despite a few recent issues.
- Hiding in the Crowd: A Massively Distributed Algorithm for Private Averaging with Malicious Adversaries
Gossip-based peer-to-peer protocol for privately computing the exact average of a distributed data set directly between peers. No heavy cryptography is needed when peers are honest, and a PHE-based extension allows detecting malicious cheating.
- Locally Private Bayesian Inference for Count Models
When applying differential privacy one may either ignore the fact that noise has been added to the data or try to take it into account; the latter is done here, with good illustrations of the improvements this can give.
- Comparing Population Means under Local Differential Privacy
Another use-case of differential privacy, here looking at how to account for the extra noise added under local DP during privacy-preserving A/B testing (and more).
- Cloud-based MPC with Encrypted Data
Gives two schemes for private Model Predictive Control by a central authority (who might have a better understanding of the environment than individual sensors), one based on PHE and another on MPC.
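As a point of reference for the first paper, the computation its MPC protocols protect — k-NN classification by cosine similarity — can be sketched in the clear in a few lines of plain Python. The function names and the majority-vote tie-breaking here are illustrative, not taken from the paper; the whole point of the paper is running this without revealing `sample` or `dataset` to anyone.

```python
import math
from collections import Counter

def cosine_similarity(a, b):
    # Dot product divided by the product of the norms.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def knn_classify(sample, dataset, labels, k=3):
    # Rank data points by cosine similarity to the sample and take a
    # majority vote among the k most similar ones.
    scored = sorted(zip(dataset, labels),
                    key=lambda dl: cosine_similarity(sample, dl[0]),
                    reverse=True)
    top_labels = [label for _, label in scored[:k]]
    return Counter(top_labels).most_common(1)[0][0]
```

In the federated setting each data holder only contributes secret-shared vectors, and both the similarity scores and the intermediate rankings stay hidden inside the protocol.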
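The reason the private-averaging paper can compute the *exact* average is that additive masks cancel in a sum. A much-simplified, non-gossip sketch of that idea: every pair of peers agrees on a random mask that one adds and the other subtracts, so each individual report is blinded while the total is unchanged. This shows only the masking trick, not the paper's actual asynchronous gossip protocol or its malicious-adversary defenses.

```python
import random

def masked_average(values, seed=None):
    # Each pair of peers agrees on a random mask; one adds it to its
    # value, the other subtracts it. Individual reported values are
    # blinded, but the masks cancel so the sum (and hence the average)
    # is exact up to floating-point error.
    rng = random.Random(seed)
    n = len(values)
    reported = list(values)
    for i in range(n):
        for j in range(i + 1, n):
            mask = rng.uniform(-100, 100)
            reported[i] += mask
            reported[j] -= mask
    return sum(reported) / n
```

In the gossip setting the same cancellation happens incrementally across many pairwise exchanges rather than in one pass over all pairs.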
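For the two local-DP papers, a minimal sketch of what "taking the noise into account" can mean, assuming Laplace perturbation on each user's value and a simple variance correction at the aggregator. The estimators here are illustrative, not the methods of the papers: the noise is zero-mean, so the sample mean is already unbiased, but any downstream test must remember that the sample variance is inflated by the known noise variance.

```python
import random
import statistics

def laplace_noise(scale):
    # The difference of two i.i.d. exponentials is Laplace-distributed.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def local_dp_release(value, epsilon, sensitivity=1.0):
    # Each user perturbs their own value before sharing it;
    # no raw value ever leaves the device.
    return value + laplace_noise(sensitivity / epsilon)

def debiased_stats(reports, epsilon, sensitivity=1.0):
    # The mean of the reports is an unbiased estimate of the true mean.
    # The sample variance, however, includes the Laplace noise variance
    # 2 * (sensitivity / epsilon)^2, which we subtract back out.
    scale = sensitivity / epsilon
    mean = statistics.fmean(reports)
    var = statistics.variance(reports) - 2 * scale ** 2
    return mean, max(var, 0.0)
```

Ignoring the correction would make two populations look more spread out (and harder to distinguish) than they really are, which is exactly the effect the mean-comparison paper quantifies.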