Schedule for: 23w5106 - Contextual Integrity for Differential Privacy
Beginning on Sunday, July 30 and ending Friday, August 4, 2023
All times are in UBC Okanagan, Canada time, PDT (UTC-7).
Sunday, July 30 | |
---|---|
16:00 - 23:00 | Check-in begins at 16:00 on Sunday and is open 24 hours (Front Desk Nechako Residence) |
18:30 - 20:30 | Dinner (for those around) ↓ We will meet at the Nechako front desk and walk somewhere together. (Front Desk Nechako) |
Monday, July 31 | |
---|---|
08:00 - 08:45 | Breakfast (Sunshine/ADM) |
08:45 - 09:00 | Introduction and Welcome by BIRS-UBCO Staff (TBA) |
09:00 - 10:00 | Whole Group Introductions (ARTS 114) ↓ Everyone gives a short introduction to their research area and interests. (Main Meeting Room) |
10:00 - 10:30 | Coffee Break (ARTS 112) (Main Meeting Room) |
10:30 - 11:30 | Differential Privacy Primer, part 1 (ARTS 114) (Main Meeting Room) |
11:30 - 13:00 | Lunch (Sunshine/ADM) |
13:00 - 14:00 | Contextual Integrity Primer, part 1 (ARTS 114) (Main Meeting Room) |
14:00 - 15:00 | Differential Privacy Primer, part 2 (ARTS 114) (Main Meeting Room) |
15:00 - 15:30 | Coffee Break (ARTS 112) (Main Meeting Room) |
15:30 - 16:30 | Contextual Integrity Primer, part 2 (ARTS 114) (Main Meeting Room) |
16:30 - 17:00 | Wrap-up & brainstorming discussion topics (ARTS 114) ↓ Open group discussion reflecting on the day, and brainstorming of discussion topics for breakout groups throughout the week. (Main Meeting Room) |
17:30 - 19:00 | Dinner (Sunshine/ADM) |
Tuesday, August 1 | |
---|---|
08:00 - 09:00 | Breakfast (Sunshine/ADM) |
09:00 - 09:20 | Priyanka Nanayakkara: Explaining the epsilon parameter in differential privacy ↓ Differential privacy (DP) is a mathematical privacy notion increasingly deployed across government and industry. With DP, privacy protections are probabilistic: they are bounded by the privacy budget parameter, epsilon. Prior work in health and computational science finds that people struggle to reason about probabilistic risks. Yet communicating the implications of epsilon to people contributing their data is vital to avoiding privacy theater (presenting meaningless privacy protection as meaningful) and to empowering more informed data-sharing decisions. Drawing on best practices in risk communication and usability, we develop three methods to convey probabilistic DP guarantees to end users: two that communicate odds and one offering concrete examples of DP outputs. We quantitatively evaluate these explanation methods in a vignette survey study (n=963) via three metrics: objective risk comprehension, subjective privacy understanding of DP guarantees, and self-efficacy. We find that odds-based explanation methods are more effective than (1) output-based methods and (2) state-of-the-art approaches that gloss over information about epsilon. Further, when offered information about epsilon, respondents are more willing to share their data than when presented with a state-of-the-art DP explanation. This willingness to share is sensitive to epsilon values: as privacy protections weaken, respondents are less likely to share data. (Main Meeting Room) |
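For readers following along with the primers, the guarantee that epsilon parameterizes can be stated in one line. This is the standard ε-DP bound, included here for reference rather than taken from the talk itself:

```latex
% \varepsilon-DP: for a mechanism M, any two neighboring datasets D, D'
% (differing in one person's record), and any set S of possible outputs,
\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S]
```

Read as odds: with ε = ln 3 ≈ 1.1, no output can be more than 3 times as likely with a person's record present than absent, which is the kind of odds-based framing the explanation methods above convey.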
09:20 - 09:40 | Sebastian Benthall: Differential Privacy in Causal Context Models ↓ We consider the problem of determining appropriate information flow in a setting with privacy mechanisms. We formalize contextual integrity's definition of a context (which includes purposes, roles, ends, and norms) in terms of a Multi-Agent Causal Influence Diagram (MACID). This representation adds to contextual integrity a rigorous way of accounting for agents' strategic responses to information flow mechanisms. We then demonstrate how these models can illuminate how assumptions about context can inform the most appropriate setting of the parameters of a privacy-preserving mechanism. (Main Meeting Room) |
09:40 - 10:00 | Ero Balsa: Through the lens of CI: uses and misuses of DP ↓ Differential privacy (DP) promises impressive privacy guarantees: generalizable and provable, composable, and independent of auxiliary knowledge. At the same time, its guarantees are often unintuitive and inscrutable for most non-experts, rendering them vulnerable to misinterpretation and misuse. An (already classic) case in point is the inadequate selection of epsilon and other DP parameters. While DP provides robust mechanisms to quantify information disclosure, it is mute when it comes to informing normative determinations about just how much disclosure we should allow, if any at all. In contrast, Nissenbaum's theory of Contextual Integrity (CI) provides a conceptual framework to reason about privacy violations, providing the normative guidance that DP lacks. Yet, surprisingly, work combining these two cornerstone notions in contemporary privacy scholarship has so far remained scarce. Our work contributes to closing this gap. We rely on CI to reason about the (mis)applications of DP. We seek to test whether and how CI may guide us in identifying settings that call for the use of DP, as well as scenarios where using DP may be misguided, causing more harm than good. As an example, in this talk we revisit the popular claim that "statistical inference is not a privacy violation," showing how a CI analysis suggests otherwise. (Main Meeting Room) |
10:00 - 10:30 | Coffee Break (TBA) |
10:30 - 10:50 | Leanne Wu: Indigenous Data Sovereignty: Privacy, Governance and Reconciliation ↓ A key aspect of decolonization and reconciliation is returning sovereignty over data collected from and about Indigenous communities and individuals around the world by colonial powers, to support these individuals' and communities' right to self-determination and to ensure they can employ their data in beneficial ways that prioritize their own well-being. As Canadian society moves toward reconciliation with the Indigenous peoples on whose traditional territories we reside, making provisions to protect the privacy and data sovereignty of Indigenous communities and individuals will be an increasing priority for government, institutions, and enterprise. Harmonizing Indigenous data sovereignty with extant privacy legislation and practices, however, can be a complex undertaking, and contextual integrity will be an important frame through which to understand the interplay of legislation, jurisdiction, tradition, alternate ways of knowing, and individual experience. (Main Meeting Room) |
10:50 - 11:10 | Bailey Kacsmar: Features of Privacy Context in Multiparty Data Sharing ↓ Private computation, which includes techniques like multi-party computation and differentially private query execution, holds great promise for enabling organizations to analyze data they and their partners hold while maintaining data subjects' privacy. However, what users find acceptable varies with the nature of the data sharing; that is, with the features of the privacy context. In this talk, I will discuss survey results on how different structures of sharing influence the perceived acceptability of data usage practices. For instance, the number of participating companies and who receives the data after the computation is complete both impact acceptability. To dig deeper into this phenomenon, I also include the results of an interview study that asks participants to consider both a privacy-preserving approach to data sharing and conventional non-privacy-preserving data sharing practices. While the privacy-preserving technique, such as differential privacy or multi-party computation, was discussed, participants consistently emphasized other attributes as relevant to their decisions. That is, participants emphasized non-technical features of the privacy context, such as the purpose of the computation and the type of data, as factors influencing their decisions regardless of the privacy technique employed. This leads to the question of where techniques such as differential privacy fit among the other features of privacy context for the population. (Main Meeting Room) |
11:10 - 11:30 | Jeremy Seeman: The Role of Framing Effects in Differential Privacy and Contextual Integrity ↓ Differential Privacy (DP) has a complex relationship with social and legal theories of privacy, including but not limited to contextual integrity (CI). Like all technologies, the way DP defines privacy problems affects how we view particular solutions and the evidence for or against their adoption, known in science and technology studies (STS) as framing effects. In this talk, I'll discuss work on these framing effects and how they impact efforts to align DP and CI. While CI posits that appropriate information flows are essentially social, DP applications introduce new socio-technical dimensions to how data processors might justify new information flows as ethically legitimate. Examining such processes invites strategies and interventions for governing DP applications, some of which extend DP's mechanics for better CI alignment and others of which may require interventions beyond DP's scope. (Main Meeting Room) |
12:00 - 12:20 | Group Photo (Main Meeting Room) |
12:45 - 14:00 | Lunch ↓ Blind Tiger Vineyards, 11014 Bond Rd, Lake Country, BC V4V 1J6, Canada (Other - See Description) |
14:00 - 15:00 | Breakout groups ↓ Break into groups around the four working-group themes. (Other - See Description) |
15:00 - 15:30 | Coffee Break (TBA) |
15:30 - 16:00 | Share back from breakout groups (Other - See Description) |
16:00 - 16:30 | Guidance for regulation ↓ Introduce the theme of guidance for privacy regulation. (Main Meeting Room) |
16:30 - 17:00 | Helen Nissenbaum: Contextual Integrity Primer, part 3 (Main Meeting Room) |
17:30 - 19:00 | Dinner (Sunshine/ADM) |
Wednesday, August 2 | |
---|---|
08:00 - 08:45 | Breakfast (Sunshine/ADM) |
09:00 - 09:20 | Wanrong Zhang: Online Local Differential Private Quantile Inference via Self-normalization (Main Meeting Room) |
09:20 - 09:40 | Joel Reardon: Changing Norms in Computing (Main Meeting Room) |
09:40 - 10:00 | Yan Shvartzshnaider: Overview of CI Projects (Main Meeting Room) |
10:00 - 10:30 | Coffee Break (TBA) |
10:30 - 11:30 | Working groups (Main Meeting Room) |
11:30 - 13:00 | Lunch (Sunshine/ADM) |
13:00 - 17:30 | Group outing - Wine Tasting and afternoon at the lake (Main Meeting Room) |
17:30 - 19:00 | Dinner (Sunshine/ADM) |
Thursday, August 3 | |
---|---|
08:00 - 09:00 | Breakfast (Sunshine/ADM) |
09:00 - 09:20 | Thomas Steinke: Algorithms with More Granular Differential Privacy Guarantees ↓ Differential privacy is often applied with a privacy parameter that is larger than the theory suggests is ideal; various informal justifications for tolerating large privacy parameters have been proposed. In this work, we consider partial differential privacy (DP), which allows quantifying the privacy guarantee on a per-attribute basis. In this framework, we study several basic data analysis and learning tasks and design algorithms whose per-attribute privacy parameter is smaller than the best possible privacy parameter for the entire record of a person (i.e., all the attributes). (Main Meeting Room) |
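A rough sketch of the per-attribute relaxation the abstract describes; the exact formalization is in the accompanying paper, so treat this as an illustrative reading rather than the authors' definition:

```latex
% Whole-record DP: D \sim D' when they differ in one person's entire
% record (all attributes at once), and for every output set S,
%   \Pr[M(D) \in S] \le e^{\varepsilon} \Pr[M(D') \in S].
% Per-attribute (partial) DP: D \sim_j D' when they differ only in
% attribute j of one person's record, and the bound holds with a
% per-attribute parameter \varepsilon_j:
\Pr[M(D) \in S] \;\le\; e^{\varepsilon_j} \cdot \Pr[M(D') \in S]
```

The talk's claim is then that each ε_j can be made smaller than the best achievable whole-record ε for several basic tasks.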
09:20 - 09:40 | Gautam Kamath: Considerations for Differentially Private Learning with Large-Scale Public Pretraining ↓ The performance of differentially private machine learning can be boosted significantly by leveraging the transfer learning capabilities of non-private models pretrained on large public datasets. We critically review this approach. We primarily question whether the use of large Web-scraped datasets should be viewed as differential-privacy-preserving. We caution that publicizing these models pretrained on Web data as "private" could lead to harm and erode the public's trust in differential privacy as a meaningful definition of privacy. Beyond the privacy considerations of using public data, we further question the utility of this paradigm. We scrutinize whether existing machine learning benchmarks are appropriate for measuring the ability of pretrained models to generalize to sensitive domains, which may be poorly represented in public Web data. Finally, we note that pretraining has been especially impactful for the largest available models, models sufficiently large to prohibit end users from running them on their own devices. Thus, deploying such models today could be a net loss for privacy, as it would require (private) data to be outsourced to a more compute-powerful third party. We conclude by discussing potential paths forward for the field of private learning, as public pretraining becomes more popular and powerful. (Main Meeting Room) |
09:40 - 10:00 | Mark Bun: Unifying Replicability, Privacy, and Adaptive Generalization in Learning ↓ Replicability is the principle that the findings of an empirical study should remain the same when it is repeated on new data. Motivated by the difficulty of ensuring replicability in today's complex data generation and analysis processes, Impagliazzo, Lei, Pitassi, and Sorrell recently put forth an algorithmic definition that serves as a general sufficient condition for replicability. This definition isn't the first notion of algorithmic stability aimed at ensuring the utility and safety of modern data analysis; others are central to relatively mature areas such as differential privacy and adaptive data analysis. In this talk, I'll describe algorithmic transformations that together paint a clear picture of the relationships between replicability, differential privacy, and generalization in adaptive data analysis for statistical learning problems. I'll also describe some (surprising!) implications of this picture for differentially private algorithm design. (Main Meeting Room) |
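For context, the algorithmic definition referred to above is, roughly (an informal sketch of the Impagliazzo, Lei, Pitassi, and Sorrell notion, not a quote from the talk):

```latex
% An algorithm A is \rho-replicable if, on two independent samples
% S_1, S_2 from the same distribution, run with the SAME shared
% internal randomness r, it returns the same output with high probability:
\Pr_{S_1, S_2,\, r}\big[\, A(S_1; r) = A(S_2; r) \,\big] \;\ge\; 1 - \rho
```

The shared randomness r is what makes the definition achievable: deterministic outputs like an empirical mean would almost never agree exactly across two samples.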
10:00 - 10:30 | Coffee Break (TBA) |
10:30 - 11:30 | Working groups (Main Meeting Room) |
11:30 - 13:00 | Lunch (ARTS 112) |
13:00 - 13:15 | Wanrong Zhang (Main Meeting Room) |
13:15 - 13:30 | Shlomi Hod (Main Meeting Room) |
13:30 - 13:45 | Nidhi Hegde (Main Meeting Room) |
13:45 - 14:00 | Rei Safavi-Naini (Main Meeting Room) |
14:00 - 15:30 | Working groups (Main Meeting Room) |
15:30 - 16:00 | Coffee Break (Main Meeting Room) |
15:30 - 17:00 | Working groups (Main Meeting Room) |
17:00 - 17:30 | Reporting back from working groups (Main Meeting Room) |
17:30 - 18:00 | Workshop wrap-up (Main Meeting Room) |
18:00 - 19:30 | Dinner (Main Meeting Room) |
Friday, August 4 | |
---|---|
08:00 - 09:00 | Breakfast (Sunshine/ADM) |
09:00 - 10:00 | Event wrap-up ↓ Reporting from the working groups and open discussion. (Main Meeting Room) |
10:00 - 10:30 | Coffee Break (TBA) |
10:30 - 11:00 | Checkout by 11 AM (Front Desk Nechako Residence) |
11:30 - 13:00 | Lunch (Sunshine/ADM) |