Rachel Cummings, Assistant Professor, Department of Industrial Engineering and Operations Research
Differential privacy is a mathematically rigorous definition of privacy that has become a leading algorithmic technique for meeting the increasing consumer demand for digital privacy. Despite its recent widespread deployment, relatively little is known about what users think of differential privacy. In this work, we conducted a series of user studies (n=2424) to explore users' privacy expectations related to differential privacy. We find that users care about the kinds of information leaks against which differential privacy protects, and are more willing to share their private information when these leaks are less likely to occur. Additionally, we find that the ways in which differential privacy is described in the wild haphazardly set users' privacy expectations, which can be misleading depending on the deployment.
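To give a concrete sense of the guarantee being studied, the sketch below implements randomized response, one of the simplest mechanisms satisfying differential privacy (this is an illustrative example, not the mechanism used in any particular deployment discussed in the work). Each user reports their true bit with probability e^ε/(e^ε+1) and flips it otherwise, so no single report reveals much about that user, yet an analyst can still debias the aggregate.

```python
import math
import random

def randomized_response(true_bit: bool, epsilon: float) -> bool:
    """Report the true bit with probability e^eps / (e^eps + 1),
    otherwise report its flip. Satisfies epsilon-differential privacy:
    the ratio of report probabilities for the two inputs is e^eps."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return true_bit if random.random() < p_truth else not true_bit

def estimate_proportion(reports: list, epsilon: float) -> float:
    """Debias the noisy reports to estimate the true fraction of 1s."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    # Invert E[observed] = p * true + (1 - p) * (1 - true)
    return (observed - (1.0 - p)) / (2.0 * p - 1.0)
```

Smaller ε means each report is closer to a coin flip (stronger privacy, noisier estimates); larger ε means reports are more often truthful. This trade-off is exactly the kind of risk that the user studies probe.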