Differential privacy is a promising approach to the privacy-preserving release of data: it offers a strong, provable bound on the increase in harm that a user incurs as a result of participating in a differentially private data analysis. Several mechanisms and software tools have been developed to ensure differential privacy for a wide range of data analysis tasks, including combinatorial optimization, machine learning, and answering distributed queries.
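For concreteness, the standard epsilon-differential-privacy guarantee referred to above can be stated as follows, where M is a randomized mechanism, D and D' are any two datasets differing in a single individual's record, and S is any set of outputs:

\[
  \Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S].
\]

Intuitively, the parameter \(\varepsilon\) bounds how much any one individual's data can change the distribution of the analysis's output, which is the source of the guaranteed bound on harm mentioned above.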
Researchers in differential privacy come from several areas of computer science, such as algorithms, programming languages, security, databases, and machine learning, as well as from several areas of statistics and data analysis. The workshop is intended to be an occasion for researchers from these different research areas to discuss recent developments in the theory and practice of differential privacy.
The overall goal of TPDP is to stimulate discussion of the relevance of differentially private data analyses in practice. For this reason, we seek contributions from different research areas of computer science and statistics.
Authors are invited to submit a short abstract (4-5 pages maximum) of their work by January 23, 2015. Abstracts must be written in English and be submitted as a single PDF file via the EasyChair page for TPDP.
Submissions will be judged on originality, relevance, interest, and clarity. Submissions should describe novel work, or work that has already appeared elsewhere but that can stimulate discussion between the different communities at the workshop. Accepted abstracts will be presented at the workshop.
The workshop will not have formal proceedings and is not intended to preclude later publication at another venue. Additionally, if the number and quality of submissions are sufficiently high, we plan to organize a journal special issue.
Submissions of abstracts by PC members are allowed and encouraged.
Specific topics of interest for the workshop include (but are not limited to):