This project is about the discourses of privacy and privacy law. It
maps the landscape of privacy discourse: where it has been, where
it is going, and who it empowers along the way. Drawing on primary-source
research, the project argues that the changing discourse around privacy
is shifting power over our data from the field of law to the terrain of
technology, thereby weakening substantive privacy protections for
individuals.
The dominant discourse of privacy today, often called “notice and
consent”, is explicitly neoliberal. This regime has been roundly
criticized by privacy scholars as a failure. And yet, for all its
faults, notice-and-consent always made sense from a sociological or
phenomenological perspective. That is, it was inadequate yet scrutable,
and it was precisely its scrutability that allowed us to establish its
inadequacy. Neoliberal privacy law is ineffective, but it has always
been accessible and open to interrogation from the ground up. That
inadequate yet relatable discourse, however,
is now losing ground to the inscrutable, unaccountable discourse of
technology designers. The same neoliberal social, political, and legal
forces, superpowered by more advanced technology and a more powerful
technologist profession, are shifting privacy law discourse from
accessible concepts like choice to inaccessible computer code, from
something regulators could interrogate to the “black box” language of
technology. The discourse of privacy law, and thus the power over its
translation into practice, now resides in the design team, where
engineers, supervised by other engineers, make consequential choices
about whether and how to interpret the requirements of privacy law and
integrate them into the code of the technologies they create. Based on
primary-source research, this project argues that the code-based
discourse of engineers is gaining hegemonic power in privacy law,
thereby defining what privacy law means in practice, stacking the deck
against robust privacy protections, and undermining the promise of
privacy laws already passed.