Table of Links
3. Theoretical Lenses
4. Applying the Theoretical Lenses and 4.1 Handoff Triggers: New tech, new threats, new hype
4.2. Handoff Components: Shifting experts, techniques, and data
4.3. Handoff Modes: Abstraction and constrained expertise
5. Uncovering the Stakes of the Handoff
5.1. Confidentiality is the tip of the iceberg
6.2 Lesson 2: Beware objects without experts
6.3 Lesson 3: Transparency and participation should center values and policy
8. Research Ethics and Social Impact
Acknowledgments and References
5.2 Data Utility
The DAS attempts to balance confidentiality with data utility. The goal is to implement confidentiality protections that, in theory, allow a wide range of stakeholders to access and use census data, while ensuring that census respondents trust these confidentiality protections enough to disclose their information. But in response to the Bureau’s decision to update the DAS, many stakeholders expressed concerns that the noise added under DP would render Census data unusable for many use cases [130, 137]. Through the handoff lens, we note that the adoption of DP shifted other components of the DAS: in particular, in order to minimize privacy loss under DP, the Bureau reduced the number of published statistics and the number of counts that were held invariant (i.e., not affected by confidentiality protections; recall §4.2.2). The decision to report specific invariant statistics reflects policy decisions about which use cases are most important and where data utility should be preserved above confidentiality: which statistics are understood to be essential for democratic representation, and which are malleable. Notably, the Bureau rejected a request from the National Congress of American Indians to keep state-level data for tribal areas invariant [122].
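\ The tension between noise and invariants can be illustrated with a minimal sketch. This is not the Bureau’s actual TopDown algorithm (which uses a variant of DP with optimization-based post-processing); it simply shows the basic Laplace mechanism on hypothetical block-level counts, with the state total held invariant by rescaling. All names and numbers here are illustrative assumptions.

```python
import numpy as np

def laplace_mechanism(counts, epsilon, sensitivity=1.0):
    """Add Laplace noise with scale sensitivity/epsilon to each count.

    Smaller epsilon -> larger noise -> stronger privacy, lower utility.
    Seeded here only so the sketch is reproducible.
    """
    rng = np.random.default_rng(seed=0)
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon, size=len(counts))
    return [c + n for c, n in zip(counts, noise)]

# Hypothetical block-level populations within one state.
block_counts = [120, 85, 40, 310]

# Noisy counts released under the privacy protection.
noisy = laplace_mechanism(block_counts, epsilon=0.5)

# Holding the state total invariant: rescale the noisy counts so they
# sum exactly to the true total, exempting that one statistic from
# confidentiality protection (a crude stand-in for the DAS's
# optimization-based post-processing).
true_total = sum(block_counts)
invariant_counts = [x * true_total / sum(noisy) for x in noisy]

print(round(sum(invariant_counts)))  # state total preserved exactly: 555
```

Each invariant removes uncertainty from some statistic, which is precisely why designating invariants is a policy decision about whose uses of the data are protected from noise.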
\ Because Census data are closely tied to the allocation of political and economic resources, decisions about data utility impact the pursuit of values like equity and justice. When the Bureau solicited feedback from data users in the Federal Register, the solicitation and the responses revealed unspoken agreements about data access and utility across a wide range of applications, including state and local government, public health, anti-discrimination efforts, research, and education. Stakeholders had differing epistemic perspectives about what makes data “good enough” to be useful [19]. These ethical and epistemic questions underlie a number of important policy decisions about how the DAS should operationalize and prioritize data utility across different settings. Notably, our analysis highlights a key reframing around utility: the switch to DP and its focus on formalism (§5.3) meant that utility was largely operationalized as accuracy, thus collapsing this epistemic debate.
5.3 Formalism
The shift to DP introduced a formal definition of both privacy and data utility. While SDL methods could be formalized as a series of rules, they did not allow the Bureau to quantify the resulting confidentiality protections. The Bureau highlighted the advantages of formalism, citing provable and externally verifiable guarantees [5] as well as precision in balancing competing values [9].
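\ As a sketch of what “provable guarantees” means here, the standard definition of ε-differential privacy (the deployed DAS used a related variant) requires that a randomized mechanism $\mathcal{M}$ satisfy, for all datasets $D$, $D'$ differing in one individual’s record and all sets of outputs $S$,

$$\Pr[\mathcal{M}(D) \in S] \;\le\; e^{\varepsilon}\,\Pr[\mathcal{M}(D') \in S].$$

The parameter $\varepsilon$ quantifies privacy loss: it bounds how much any one person’s data can change the distribution of published statistics, which is what permits the external verification and explicit value-balancing the Bureau cited.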
\ An extensive literature on quantification in the history and sociology of science examines why the call to numbers has been so powerful, particularly in bureaucratic contexts [35, 37, 46, 101]. Quantitative approaches to ethical questions promise to make political decisions visible [46, 118] and facilitate debate in a common language [35, 37], creating an avenue toward accountability by facilitating participation in democratic deliberation.
\
\ The quantification literature cautions that numbers can hide policy decisions beneath a veneer of scientific objectivity, producing legitimacy in highly contested decision-making settings and, at times, foreclosing external intervention [101]. Moreover, the choice to quantify privileges that which is easily measurable [46]. In the context of the DAS, the Bureau called for “pre-specified, objective criteria” [65, p. 2] to compare privacy methodologies. We argue that privileging formally quantifiable confidentiality guarantees led to a sociologically unintuitive conceptualization of privacy [109], one which does not capture, for example, notions of data privacy that depend on context or social relations [96, 132]. Thus, the Bureau’s decision to formalize privacy harms as reidentification risk, and data utility as (lack of) statistical uncertainty, in accordance with DP was not neutral; it reflects particular assumptions about the values at the center of the DAS.
\
:::info Authors:
(1) AMINA A. ABDU, University of Michigan, USA;
(2) LAUREN M. CHAMBERS, University of California, Berkeley, USA;
(3) DEIRDRE K. MULLIGAN, University of California, Berkeley, USA;
(4) ABIGAIL Z. JACOBS, University of Michigan, USA.
:::
:::info This paper is available on arxiv under CC BY-NC-SA 4.0 DEED license.
:::
\