Differential Privacy and Private Bayesian Inference

We consider a Bayesian statistician (B) communicating with an untrusted third party (A). B wants to give useful answers to A's queries without revealing private information. For example, we may wish to report how many people suffer from a disease without revealing whether any particular person has it. This requires striking a good balance between utility and privacy. In this extended abstract, we summarise our results on the inherent privacy and robustness properties of Bayesian inference [1]. We formalise, and answer, the question of whether B can select a prior distribution such that even a computationally unbounded A cannot obtain private information from B's answers to its queries.
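As a toy illustration of this tension (our own sketch, not the construction analysed in [1]), one natural way for B to answer a prevalence query privately is to release a single sample from the posterior rather than the exact count. The Beta-Bernoulli model, the hyperparameters, and the function name below are illustrative assumptions, not part of the original setting.

```python
import random

def posterior_sample(data, alpha=1.0, beta=1.0):
    """Release one draw of the disease rate from the Beta(alpha + k, beta + n - k) posterior."""
    n, k = len(data), sum(data)
    return random.betavariate(alpha + k, beta + (n - k))

# Toy data: 100 individuals, 23 of whom have the disease.
records = [1] * 23 + [0] * 77
print(posterior_sample(records))  # randomised answer to "what fraction is ill?"
```

The randomness of the posterior draw is what blunts A's inference about any single record; how much protection it affords depends on the choice of prior, which is exactly the question formalised above. Our setting is as follows: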