Differential Privacy Research Agenda

From Simson Garfinkel
Revision as of 05:25, 20 January 2020 by Simson (talk | contribs)

Thoughts on a differential privacy research agenda:

  • Anonymous set intersection made differentially private.
  • Practical applications for DP secure multiparty computations.
  • Tools for deciding on the privacy/accuracy tradeoff.
  • Tools for building DP models.
  • Approaches for analyzing programs that provide DP. Analyzing programs that implement randomized algorithms is hard: how do you find bugs?
  • More mechanisms.
  • DP for text, or a new theory for text de-identification.
  • DP theories that provide empirical privacy loss measures by taking into account background information. Would this produce different mechanisms? Could these mechanisms offer more accuracy without a practical impact on privacy loss?
  • Improved methods for teaching differential privacy.
  • Integrating the concepts of differential privacy into the high school curriculum.
  • DP theories that take into account computational complexity or numeric precision, so that we can release more aggregate data and know that we aren't increasing the privacy loss of individuals.
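
The privacy/accuracy tradeoff item above can be made concrete with a minimal sketch of the classic Laplace mechanism. This is illustrative only, not a proposal from the agenda: the function names (`laplace_mechanism`, `mean_abs_error`) and the inverse-CDF sampling approach are my own choices, and the experiment just shows how expected error grows as epsilon shrinks.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Release true_value with Laplace noise of scale sensitivity/epsilon.

    Samples Laplace noise via the inverse CDF: for u uniform on (-0.5, 0.5),
    X = -b * sign(u) * ln(1 - 2|u|) is Laplace(0, b).
    """
    u = random.random() - 0.5
    scale = sensitivity / epsilon
    return true_value - scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def mean_abs_error(epsilon, trials=10_000):
    """Estimate expected absolute error of the mechanism for a sensitivity-1 query.

    For Laplace noise of scale b, E|noise| = b = 1/epsilon, so halving
    epsilon (more privacy) doubles the expected error (less accuracy).
    """
    return sum(abs(laplace_mechanism(0.0, 1.0, epsilon))
               for _ in range(trials)) / trials
```

A tradeoff-exploration tool could plot `mean_abs_error` across a grid of epsilon values so a data custodian can see, before release, roughly how much accuracy each unit of privacy budget buys.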
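
On the bug-finding question for randomized algorithms: one pragmatic (if incomplete) approach is statistical testing of the DP inequality itself. The sketch below, with hypothetical names (`dp_count`, `empirical_ratio_check`), runs a noisy-count mechanism many times on two neighboring datasets and checks that no observed output's empirical log-probability ratio greatly exceeds epsilon. A buggy mechanism (e.g., noise scaled incorrectly) would tend to violate the bound; this is a heuristic check, not a proof of privacy.

```python
import math
import random
from collections import Counter

def dp_count(data, epsilon):
    """Epsilon-DP count via Laplace noise, rounded for histogramming.

    Rounding is post-processing, so it preserves the DP guarantee.
    """
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(math.log(1 - 2 * abs(u)), u)
    return round(len(data) + noise)

def empirical_ratio_check(epsilon, trials=100_000, min_count=500):
    """Return the worst observed |log(P1(o)/P2(o))| over well-sampled outputs.

    For a correct epsilon-DP mechanism this should not exceed epsilon by
    much more than sampling noise; a large excess suggests a bug.
    """
    d1 = [1] * 10              # neighboring datasets: differ in one record
    d2 = [1] * 11
    h1 = Counter(dp_count(d1, epsilon) for _ in range(trials))
    h2 = Counter(dp_count(d2, epsilon) for _ in range(trials))
    worst = 0.0
    for out in h1.keys() & h2.keys():
        if min(h1[out], h2[out]) < min_count:
            continue           # skip outputs too rare to estimate reliably
        ratio = (h1[out] / trials) / (h2[out] / trials)
        worst = max(worst, abs(math.log(ratio)))
    return worst
```

A test harness built on this idea only catches violations on the datasets and outputs it samples, which is exactly why the agenda item calls for better program-analysis approaches.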