WOODROW HARTZOG
Privacy's Blueprint
The Battle to Control the Design of New Technologies
Cambridge, Massachusetts, and London, England 2018
Copyright © 2018 by the President and Fellows of Harvard College
All rights reserved
Many of the designations used by manufacturers, sellers, and internet software applications and services to distinguish their products are claimed as trademarks. Where those designations appear in this book and Harvard University Press was aware of a trademark claim, the designations have been printed in initial capital letters.
Jacket design: Jill Breitbarth
Jacket art: Thinkstock/Getty Images
978-0-674-97600-9 (hardcover : alk. paper)
978-0-674-98510-0 (EPUB)
978-0-674-98511-7 (MOBI)
978-0-674-98512-4 (PDF)
The Library of Congress has cataloged the printed edition as follows:
Names: Hartzog, Woodrow, 1978– author.
Title: Privacy's blueprint : the battle to control the design of new technologies / Woodrow Hartzog.
Description: Cambridge, Massachusetts : Harvard University Press, 2018. | Includes bibliographical references and index.
Identifiers: LCCN 2017039954
Subjects: LCSH: Privacy, Right of--United States. | Design and technology--United States. | Data protection--Law and legislation--United States.
Classification: LCC KF1262 .H37 2018 | DDC 342.7308/58--dc23
LC record available at https://lccn.loc.gov/2017039954
For Jen, Will, and Romy,
with love and gratitude
Modern discussions of privacy can give rise to worry, but I did not write this book as a privacy and technology fatalist. Quite the opposite. This book is the culmination of both the first part of my academic career and my lived experience of the important role that privacy plays in our lives. I have worked for a library, a television station, a restaurant, a university newspaper, a law firm, an advocacy group, and several educational institutions. At every step I have seen how necessary privacy is and how challenging it can be to balance privacy with other important, competing values. And, if I'm being honest, this book was at least partially motivated by the blessed obscurity of the awkward growing pains of my youth.
While my experiences and prior work informed the values I highlight and the theory I propose in this book, I do not mean for this theory to be the final or only word at the intersection of privacy and design. Theories are meant to evolve; they are meant to interact with other theories, to be criticized, reinterpreted, and, with any luck, eventually contribute to a momentum that improves our world.
Portions of this book were adapted from parts of the following articles: "The Indispensable, Inadequate Fair Information Practices," Maryland Law Review 76 (2017): 952–982; "The Feds Are Wrong to Warn of Warrant-Proof Phones," MIT Technology Review, March 17, 2016; "The Internet of Heirlooms and Disposable Things," North Carolina Journal of Law and Technology 17 (2016): 581–598 (coauthored with Evan Selinger); "Taking Trust Seriously in Privacy Law," Stanford Technology Law Review 19 (2016): 431–472 (coauthored with Neil Richards); "Increasing the Transaction Costs of Harassment," Boston University Law Review Annex, November 4, 2015 (coauthored with Evan Selinger); "Social Media Needs More Limitations, Not Choices," Wired, April 2015; "Surveillance as Loss of Obscurity," Washington and Lee Law Review 72 (2015): 1343–1387 (coauthored with Evan Selinger); "The FTC and the New Common Law of Privacy," Columbia Law Review 114 (2014): 583–676 (coauthored with Daniel J. Solove); "Reviving Implied Confidentiality," Indiana Law Journal 89 (2014): 763–806; "The Value of Modest Privacy Protections in a Hyper Social World," Colorado Technology Law Journal 12 (2014): 332–350; "The Case for Online Obscurity," California Law Review 101 (2013): 1–50 (coauthored with Fred Stutzman); "The Fight to Frame Privacy," Michigan Law Review 111 (2013): 1021–1043; "Obscurity by Design," Washington Law Review 88 (2013): 385–418 (coauthored with Fred Stutzman); "Social Data," Ohio State Law Journal 74 (2013): 995–1028; "Website Design as Contract," American University Law Review 60 (2011): 1635–1671; "The Privacy Box: A Software Proposal," First Monday 14 (2009); and "Promises and Privacy: Promissory Estoppel and Confidential Disclosure in Online Communities," Temple Law Review 82 (2009): 891–928. In some cases, I have used only selected passages from the articles. In many cases, the text and argument of the articles have been significantly reworked.
I give special thanks to Neil Richards, Evan Selinger, Daniel Solove, and Fred Stutzman for allowing me to adapt some of our coauthored material for this book.
SOMETIMES even your best efforts are no match for technology. In the fall of 2012, Bobbi Duncan, then a student at the University of Texas at Austin, was outed as a lesbian to her father in part by the design choices of the social network site Facebook. Duncan did her best to keep posts about her sexuality hidden from her father by adjusting her privacy settings. But Facebook's discussion groups were not designed to ensure that the intent of an individual's privacy settings was respected. So when the creator of a Facebook group for UT's Queer Chorus added Duncan to the group, Facebook automatically posted a note to all her Facebook friends, including her father, without checking with her first. The design default allowed her to be added to the group in a public way without her permission.
Facebook's decision to design its site this way had real consequences. Duncan told the Wall Street Journal that a few hours after her friends were notified that she had been added to the group, her father began leaving her angry voicemails. "No no no no no no no," Duncan remembers telling a friend. "I have him hidden from my updates, but he saw this. He saw it." Duncan became estranged from her father and fell into depression for weeks. "I couldn't function," she said. "I would be in class and not hear a word anyone was saying. I remember I was miserable and said, 'Facebook decided to tell my dad that I was gay.'"
Bobbi Duncan is hardly the only person whose secrets have been betrayed by design. Hackers exploited the design of the social media service Snapchat. The service, which set time limits on how long posts by users were visible to recipients, was configured in such a way that third-party applications could access the service (despite third-party access being prohibited in Snapchat's terms of use). Predictably, hundreds of thousands of pictures and videos taken by users were intercepted by hackers through a third-party application and, after a few days of bragging and bluster, were finally posted online in a thirteen-gigabyte dump. Snapchat blamed its users for using insecure applications, but it was Snapchat that let its users down; the company was the entity in the best position to protect its users. There are ways to ensure that only authorized software can interact with Snapchat's servers, like rigorous client authentication in addition to standard user authentication. Yet Snapchat left access to its servers insecure and merely buried a warning about third-party add-on apps in the dense, hidden, and confusing terms of use that it knew no user would read.
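To make the idea of client authentication concrete, here is a minimal sketch (not Snapchat's actual API; all names and the secret are hypothetical) of one common approach: the official client signs each request with a shared secret, and the server rejects any request whose signature does not check out, which screens out third-party apps that lack the secret.

```python
import hmac
import hashlib

# Hypothetical secret embedded in the official client and known to the server.
CLIENT_SECRET = b"secret-shared-with-official-client"

def sign_request(path: str, body: str) -> str:
    """Official client: compute an HMAC-SHA256 signature over the request."""
    message = (path + body).encode()
    return hmac.new(CLIENT_SECRET, message, hashlib.sha256).hexdigest()

def server_accepts(path: str, body: str, signature: str) -> bool:
    """Server: recompute the signature and compare in constant time."""
    expected = sign_request(path, body)
    return hmac.compare_digest(expected, signature)

# The official client's signed request is accepted;
# an unauthorized client's forged signature is rejected.
good = sign_request("/upload", "photo-bytes")
print(server_accepts("/upload", "photo-bytes", good))      # True
print(server_accepts("/upload", "photo-bytes", "forged"))  # False
```

A known weakness of this particular design, and part of why "rigorous" client authentication matters, is that a secret shipped inside a client app can be extracted by reverse engineering, after which third-party apps can sign requests too; stronger schemes rotate secrets or attest to the client software itself.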