Program (tentative)

TPDP 2023 will be held in the Metcalf Trustee Center at the Boston University Questrom School of Business, 1 Silber Way, 9th floor, Boston. (All listed times are Eastern Time.)

Wednesday, September 27

8:00-9:00 Breakfast (provided)
9:00-9:05 Opening Remarks
9:05-9:50 Keynote: Differentially Private Measurements in Advertising (Video)
Pasin Manurangsi (Invited Speaker)

Digital advertising is a critical component of the internet and is powered by large-scale data analytics and machine learning models; privacy concerns around these are on the rise. In the past few years, numerous proposals have been put forth to address such concerns. Many of these proposals aim to achieve formal guarantees of differential privacy. In this talk, we will review the setting, discuss high-level ideas of proposed solutions, and describe the main research challenges in the space, with a focus on measurements.


9:50-10:35 (Combined Talk) (Video)
Robustness Implies Privacy in Statistical Estimation
Samuel B. Hopkins, Gautam Kamath, Mahbod Majid, Shyam Narayanan
From Robustness to Privacy and Back
Hilal Asi, Jonathan Ullman, Lydia Zakynthinou

Stability is Stable: Connections between Replicability, Privacy, and Adaptive Generalization (Video)
Mark Bun, Marco Gaboardi, Max Hopkins, Russell Impagliazzo, Rex Lei, Toniann Pitassi, Satchit Sivakumar, Jessica Sorrell

Better Differentially Private Approximate Histograms and Heavy Hitters using the Misra-Gries Sketch (Video)
Christian Janos Lebeda, Jakub Tětek
10:35-11:00 Break
11:00-12:30 Poster Session 1
12:30-2:00 Lunch Break (not provided, please find a list of recommendations here)
2:00-2:30 Private Everlasting Prediction (Video)
Moni Naor, Kobbi Nissim, Uri Stemmer, Chao Yan

(Combined Talk) (Video)
Models Matter: Setting Accurate Privacy Expectations for Local and Central Differential Privacy
Mary Anne Smart, Priyanka Nanayakkara, Rachel Cummings, Gabriel Kaptchuk, Elissa M. Redmiles
What Are the Chances? Explaining the Epsilon Parameter in Differential Privacy
Priyanka Nanayakkara, Mary Anne Smart, Rachel Cummings, Gabriel Kaptchuk, Elissa M. Redmiles
2:30-3:45 Panel on Machine Learning, Memorization, and Privacy
Vitaly Feldman, Woodrow Hartzog, Katherine Lee, Hideyuki ("Yuki") Matsumi, Adam Smith
3:45-4:15 Break
4:15-5:45 Poster Session 2

Thursday, September 28

8:00-9:00 Breakfast (provided)
9:00-9:45 Keynote: The Ever-evolving Intersection of Statistics and Privacy (Video)
Jeremy Seeman (Invited Speaker)

Differential privacy (DP) is the latest evolution in the history of disclosure avoidance research; with its many innovations come new ways of addressing foundational questions in private data publishing. However, research on DP at the interface of statistics and computer science remains relatively sparse. In this talk, I'll focus on uncertainty quantification as the natural intersection of traditional and modern methods. By unpacking how different literatures frame and address uncertainty quantification problems, I'll show how building effective DP systems will require tighter coupling, even entangling, of methods between statistics and CS. Doing so will help data users leverage DP's hallmark methodological transparency for more reproducible social science.

9:45-10:15 Privacy Auditing with One (1) Training Run
Thomas Steinke, Milad Nasr, Matthew Jagielski

Private Online Prediction from Experts: Separations and Faster Rates
Hilal Asi, Vitaly Feldman, Tomer Koren, Kunal Talwar
10:15-10:45 Break
10:45-12:15 Poster Session 3
12:15-1:45 Lunch Break (not provided, please find a list of recommendations here)
1:45-2:30 Keynote: Concurrent Composition of Interactive Differential Privacy
Wanrong Zhang (Invited Speaker)

We initiate a study of the composition properties of interactive differentially private mechanisms. An interactive differentially private mechanism is a possibly stateful algorithm that allows an analyst to adaptively ask queries about a sensitive dataset. We focus on *concurrent* composition, where an adversary can arbitrarily interleave its queries to several interactive differentially private mechanisms, as may happen when differentially private query systems are deployed in practice. We prove that the main composition theorems for non-interactive differentially private mechanisms extend to the concurrent composition of interactive differentially private mechanisms. Our theorems cover the cases when differential privacy is measured using the hypothesis testing framework of f-DP, which captures standard (ε, δ)-DP as a special case, or using Rényi divergence of fixed order (Rényi DP). Our proof for the case of f-DP works by showing that every interactive f-DP mechanism can be simulated, on any pair of adjacent datasets, by an interactive post-processing of a non-interactive f-DP mechanism.
We then extend these results to the concurrent composition of interactive mechanisms with adaptively chosen privacy-loss parameters. We prove that every valid privacy filter and odometer for non-interactive mechanisms extends to the concurrent composition of interactive mechanisms if privacy loss is measured using (ε, δ)-DP, f-DP, or Rényi DP of fixed order. Our results offer strong theoretical foundations for enabling full adaptivity in composing differentially private interactive mechanisms, showing that concurrency does not affect privacy guarantees.
This talk is based on joint work with Salil Vadhan, Tianhao Wang, Xin Lyu, Vicki Xu, Samuel Haney, Michael Shoemate, Grace Tian, and Andrew Vyrros.
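For orientation, the non-interactive basic composition theorem that these results generalize to the concurrent, interactive setting can be stated as follows (a standard textbook formulation, not specific to this talk):

```latex
% Basic sequential composition for (epsilon, delta)-differential privacy:
% running k mechanisms on the same dataset adds up their privacy budgets.
\text{If } M_i \text{ is } (\varepsilon_i, \delta_i)\text{-DP for } i = 1, \dots, k,
\text{ then } (M_1, \dots, M_k) \text{ is }
\Big( \textstyle\sum_{i=1}^{k} \varepsilon_i,\; \sum_{i=1}^{k} \delta_i \Big)\text{-DP}.
```

The talk's results show, among other things, that guarantees of this kind continue to hold even when the mechanisms are interactive and their queries are arbitrarily interleaved by the adversary.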

2:30-3:00 Confidence-Ranked Reconstruction of Census Microdata from Published Statistics (Video)
Travis Dick, Cynthia Dwork, Michael Kearns, Terrance Liu, Aaron Roth, Giuseppe Vietri, Zhiwei Steven Wu

Publishing Wikipedia usage data with strong privacy guarantees (Video)
Temilola Adeleye, Damien Desfontaines, Michael Hay, Isaac Johnson, Cléo Lemoisson, Ashwin Machanavajjhala, Tom Magerlein, Gabriele Modena, David Pujol, Daniel Simmons-Marengo, Hal Triedman

Context

Differential privacy (DP) is the leading framework for data analysis with rigorous privacy guarantees. In the last 15 years, it has transitioned from the realm of pure theory to large-scale, real-world deployments. Its use by the U.S. Census Bureau and growing industry adoption have raised its public profile and present new questions for social scientists and policymakers.

Differential privacy is an inherently interdisciplinary field, drawing researchers from a variety of academic communities including machine learning, statistics, security, theoretical computer science, databases, and law. The combined effort across a broad spectrum of computer science is essential for differential privacy to realize its full potential. To this end, this workshop aims to stimulate discussion among participants about both the state-of-the-art in differential privacy and the future challenges that must be addressed to make differential privacy more practical.

Specific topics of interest for the workshop include (but are not limited to):

The OpenDP Community Meeting 2023 will be taking place the same week as TPDP in Boston, on Friday, September 29. OpenDP will also be hosting a reception at 4pm on Thursday, September 28, after the conclusion of TPDP. We encourage participants to consider attending the community meeting!

Accepted Papers

Poster Session 1

Poster Session 2

Poster Session 3

Virtual Posters

Submission

The goal of TPDP is to stimulate discussion on the relevance of differentially private data analyses in practice. For this reason, we seek contributions from different research areas of computer science and statistics.

Authors are invited to submit a short abstract (4 pages maximum, with unlimited references and appendices, the latter read only at the reviewers' discretion) of their work. Submissions are single-blind (non-anonymized), and there is no prescribed style file (though authors should be considerate of reviewers in their selection).

Submissions will undergo a lightweight review process and will be judged on originality, relevance, interest, and clarity. Submissions should describe novel work, or work that has already appeared elsewhere but that can stimulate discussion between different communities at the workshop. Accepted abstracts will be presented at the workshop either as a talk or a poster.

The workshop will not have formal proceedings and is not intended to preclude later publication at another venue. In-person attendance is encouraged, though there will be some limited means for work by non-attending authors to be featured at TPDP 2023.

Authors of selected papers from the workshop will be invited to submit a full version of their work for publication in a special issue of the Journal of Privacy and Confidentiality.

Call for Papers: PDF

Invited Speakers

Corporate Sponsors

We are very grateful to our sponsors whose generosity has been critical to the continued success of the workshop. For information about sponsorship opportunities, please contact us at tpdp2023@gmail.com.

Gold Tier Sponsors

Apple


Google


Silver Tier Sponsors

Boston University Faculty of Computing & Data Sciences


Rafik B. Hariri Institute for Computing and Computational Science & Engineering

Bronze Tier Sponsors

Tumult Labs


DPella


Registration

Register for TPDP 2023 here!

Important Dates

Abstract Submission
July 7, 2023 (AoE)
Notification
August 11, 2023
Workshop
September 27-28, 2023

Submission website

https://hcrp.cs.uchicago.edu

For concerns regarding submissions, please contact tpdp2023@gmail.com.

Organizing and Program Committee