Vitaly Shmatikov (with Adam Smith of Penn State): a system for privacy-preserving deep learning. Deep learning based on artificial neural networks has led to dramatic improvements in speech and image recognition, language translation, and other AI tasks, yet centralized collection of training data from millions of users presents serious privacy risks. This project aims to design and implement a practical system that will enable data holders to learn accurate neural-network models without sharing their training datasets, while still benefiting from other participants who are concurrently learning similar models.
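One way to learn jointly without pooling raw data is for each participant to train locally and share only model updates, which a coordinator averages. The sketch below illustrates that idea for a toy one-parameter linear model; the function names and the plain gradient-averaging scheme are illustrative assumptions, not the project's actual protocol.

```python
# Illustrative sketch: collaborative training where raw data never
# leaves a participant; only locally computed gradients are shared.

def local_gradient(weights, data):
    """One squared-loss gradient for a 1-D linear model y = w * x."""
    w = weights[0]
    g = 0.0
    for x, y in data:
        g += 2 * (w * x - y) * x
    return [g / len(data)]

def collaborative_round(weights, participants, lr=0.05):
    """Each holder computes a gradient on its private data;
    the coordinator averages the gradients and updates the model."""
    grads = [local_gradient(weights, d) for d in participants]
    avg = [sum(gs) / len(grads) for gs in zip(*grads)]
    return [w - lr * g for w, g in zip(weights, avg)]

# Two data holders with private datasets, both drawn from y = 2x.
holders = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
w = [0.0]
for _ in range(200):
    w = collaborative_round(w, holders)
# w[0] converges to 2.0 even though neither holder saw the other's data.
```

Real systems of this kind share only selected parameters, add noise for differential privacy, or encrypt the updates; this sketch shows only the data-locality structure.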
Nate Foster and Dexter Kozen: a new probabilistic framework for programming software-defined networks (SDNs). Most existing SDN languages are deterministic, which makes it difficult to give satisfactory treatments of phenomena such as congestion under uncertain traffic patterns or reliability in the presence of failures. This project will develop a new programming language based on probabilistic semantics, along with tools for compiling programs and automatically verifying formal properties.
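To see why a probabilistic semantics helps, consider modeling a network policy as a function from a packet to a distribution over outcomes. The sketch below (my own illustration, not the project's language) composes two lossy hops and computes the end-to-end delivery probability exactly, which is the kind of quantity a probabilistic verifier could check automatically.

```python
# Illustrative sketch: policies map a packet (here, its current port,
# or None if dropped) to a dict {outcome: probability}.

DROP = None

def drop(pkt):
    return {DROP: 1.0}

def fwd(port):
    """Deterministically forward the packet to `port`."""
    def pol(pkt):
        return {DROP: 1.0} if pkt is DROP else {port: 1.0}
    return pol

def choice(p, a, b):
    """Run policy a with probability p, otherwise policy b."""
    def pol(pkt):
        out = {}
        for o, pr in a(pkt).items():
            out[o] = out.get(o, 0.0) + p * pr
        for o, pr in b(pkt).items():
            out[o] = out.get(o, 0.0) + (1 - p) * pr
        return out
    return pol

def seq(a, b):
    """Sequential composition: push a's output distribution through b."""
    def pol(pkt):
        out = {}
        for mid, pr1 in a(pkt).items():
            for o, pr2 in b(mid).items():
                out[o] = out.get(o, 0.0) + pr1 * pr2
        return out
    return pol

# Each hop delivers with probability 0.9 and drops otherwise.
hop = lambda port: choice(0.9, fwd(port), drop)
path = seq(hop(1), hop(2))
result = path(0)
# result: reaches port 2 with probability 0.81, dropped with 0.19.
```

Computing the full output distribution, rather than sampling runs, is what makes exact verification of properties like "delivery probability at least 0.8" possible.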
Ari Juels and Tom Ristenpart: security property enforcement (SPE) for cloud services. SPE applies in settings in which a cloud provider such as Google wishes to apply proprietary algorithms to customers' sensitive data. SPE provides strong evidence to customers (or auditors working on their behalf) that the provider is not abusing its access to sensitive data, and to the provider that malicious customers cannot exploit sensitive information about proprietary provider algorithms.
Rafael Pass: large-scale privacy-preserving computation. Cryptographic methods such as secure multi-party computation enable securely and privately performing any computation on individuals' private inputs. Such methods, however, do not scale to the modern regime of large-scale, distributed, parallel data processing. This project will develop methods to securely and privately process large amounts of data using parallel distributed algorithms.
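A standard building block behind such methods is additive secret sharing: each party splits its input into random shares so that no single server learns anything, yet sums can still be computed. The sketch below is a minimal illustration of that primitive (the modulus and the three-server setup are illustrative choices, not the project's design).

```python
# Illustrative sketch: additive secret sharing mod a prime.
# No server sees any individual input, yet the sum is recoverable.
import random

P = 2**61 - 1  # a Mersenne prime modulus (illustrative choice)

def share(x, n):
    """Split x into n random additive shares that sum to x mod P."""
    parts = [random.randrange(P) for _ in range(n - 1)]
    parts.append((x - sum(parts)) % P)
    return parts

def secure_sum(inputs, n_servers=3):
    """Each party sends one share to each server; each server adds up
    the shares it holds; recombining the server totals yields the sum."""
    shares = [share(x, n_servers) for x in inputs]
    server_totals = [sum(col) % P for col in zip(*shares)]
    return sum(server_totals) % P

total = secure_sum([5, 7, 11])  # 23, with no server seeing 5, 7, or 11
```

The per-server work here is embarrassingly parallel, which hints at why marrying such primitives with distributed data-processing frameworks is the natural direction for scaling them up.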