Sara Wachter-Boettcher - Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech

  • Book:
    Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
  • Author:
    Sara Wachter-Boettcher
  • Publisher:
    W. W. Norton & Company
  • Genre:
    Politics
  • Year:
    2017

Thank you to my editorial team: Alane Salierno Mason, for emailing me out of the blue and encouraging me to do this; and Ashley Patrick, for patiently answering my endless questions. Many thanks to my copy editor, Stephanie Hiebert, for bringing clarity and thoroughness to the manuscript (and for coining the delightful phrase "tenuous legitimacy at best," which is how I plan to describe myself from now on).

Endless gratitude to those who read early drafts: my love, William Bolton; and my friends Marie Connelly, Katel LeDû, Ethan Marcotte, and Mary Rohrdanz. I owe all of you wine, doughnuts, and hugs.

This book wouldn't have been possible without Eric Meyer. Collaborating with you on Design for Real Life changed the course of my career.

Thanks to those I spoke with during the writing process: Erin Abler, Jacky Alciné, Libby Bawcombe, Sally Jane Black, Anil Dash, Maggie Delano, Veronica Erb, Sorelle Friedler, Aimee Gonzalez-Cameron, Lena Groeger, Sydette Harry, Dan Hon, Kate Kiefer Lee, Safiya Noble, Sally Rooney, Grace Sparapani, Kaya Thomas, Indi Young, and a whole host of wonderful people who shared their stories with me in confidence. I am also incredibly grateful to my friends Steve Fisher, Jason Santa Maria, and Matt Sutter for providing design help.

Thank you to the friends who gave feedback on all kinds of details, and who were there for me as I moaned and griped through this process, especially all the members of Camp Contentment, the Male Tears Club, Pizza Club, and the Ladies Anti-Fascist Friends Society. Cat-heart-eyes emoji for days.

Thank you to anyone I missed. I hope you forgive my terrible memory.

And, finally, thank you to everyone striving to make tech fairer, kinder, and more humane. I know we can do it.

ALSO BY SARA WACHTER-BOETTCHER

Content Everywhere

Design for Real Life (with Eric Meyer)

Technically Wrong

Sexist Apps, Biased
Algorithms, and Other
Threats of Toxic Tech

Sara Wachter-Boettcher

W. W. NORTON & COMPANY

INDEPENDENT PUBLISHERS SINCE 1923

NEW YORK LONDON

Copyright © 2017 by Wachter-Boettcher Consulting

p. 41: © 2016 National Public Radio, Inc. NPR news report titled "Designing New Products With Empathy: 50 Stress Cases To Consider" by Libby Bawcombe was originally published on https://npr.design/ on August 16, 2016, and is used with the permission of NPR. Any unauthorized duplication is strictly prohibited.

All rights reserved
First Edition

For information about permission to reproduce selections from this book,
write to Permissions, W. W. Norton & Company, Inc., 500 Fifth Avenue,
New York, NY 10110

For information about special discounts for bulk purchases, please contact
W. W. Norton Special Sales at specialsales@wwnorton.com or 800-233-4830

Book design by Daniel Lagin
Production manager: Anna Oler
Jacket Design: Steve Fisher

The Library of Congress has cataloged the printed edition as follows:

Names: Wachter-Boettcher, Sara, author.

Title: Technically wrong : sexist apps, biased algorithms, and other threats of toxic tech / Sara Wachter-Boettcher.

Description: First edition. | New York, NY : W.W. Norton & Company, independent publishers since 1923, [2017] | Includes bibliographical references and index.

Identifiers: LCCN 2017031829 | ISBN 9780393634631 (hardcover)

Subjects: LCSH: System failures (Engineering) | Business failures. | Technology--Social aspects. | New products--Moral and ethical aspects.

Classification: LCC TA169.5 .W33 2017 | DDC 303.48/34--dc23

LC record available at https://lccn.loc.gov/2017031829

ISBN: 978-0-393-63464-8 (e-book)

W. W. Norton & Company, Inc.
500 Fifth Avenue, New York, N.Y. 10110
www.wwnorton.com

W. W. Norton & Company Ltd.
15 Carlisle Street, London W1D 3BS

For Elena and Audrey, who remind me
that wrongs are always worth righting.

Page numbers listed correspond to the print edition of this book. You can use your device's search function to locate particular terms in the text.

Note: Italic page numbers refer to illustrations.

Abler, Erin, 32-33

Acxiom data brokers, 104

advertising

and collection of gender information, 65-66

Facebook's selections for users, 10

and filtering, 65

and proxy data, 110-112

and Reddit, 162

and value of user data, 96

Airbnb, 20

Alciné, Jacky, 129-130, 132-133, 135, 137-138

alcohol use, 17-18

algorithms

biases in, 144-145, 176

and clean design aesthetic, 143

and COMPAS, 120-121, 125-129, 145

and debiasing word-embedding systems, 140

described, 121-123

and edge cases, 137

and Facebook's use of proxy data, 112

and Friends Day Facebook feature, 84

and Google, 123, 136, 144

and neural networks, 131-133

and News Feed Facebook feature, 168

and social media trends, 10

and training data, 145-146, 171

and Trending Facebook feature, 149, 166-167, 169

and Yelp, 123-125

Allen, Paul, 182

AltaVista, 2

alt-right movement, 153, 164

Apple

and emoji suggestions, 80

iPhone location settings, 105-108

and Siri's female voice, 36

and Siri's responses to crises, 6-7, 7

and Siri's teasing humor, 88-89

smartwatches from, 13

and use of personas, 27

and workforce diversity, 19-20

artificial intelligence

and failure to understand crises, 6-7

and loss of jobs, 192

Siri as, 88-89

word-embedding systems, 139-140

Automattic, 183

average users, 38-44, 47

Barron, Jesse, 114-115

Batman, Miranda, 57

Bawcombe, Libby, 40-42

Beyoncé, 55

bias. See also gender bias; political bias; racial bias

in algorithms, 144-145, 176

in default settings, 35-38, 61

of Facebook's creators, 168-172

of Twitter's creators, 150, 158-160

binary choices, 62

Black Lives Matter movement, 81

Bouie, Jamelle, 61

Brown, Mike, 163

Brown Eyes, Lance, 54

Butterfield, Stewart, 190-191

BuzzFeed, 157, 165-166

cares about us (CAU) metric, 97

caretaker speech, 114-115

celebrations. See misplaced celebrations and humor

COMPAS (Correctional Offender Management Profiling for Alternative Sanctions), 119-121, 125-129, 136, 145

computer science, and tech industry pipeline, 21-26, 181-182

Cook, Tim, 19

Cooper, Sarah, 24

Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), 119-121, 125-129, 136, 145

Costolo, Dick, 148

Cramer, Jim, 158

Creepingbear, Shane, 53-56

Criado-Perez, Caroline, 156

criminal justice

and COMPAS, 119-121, 125-129, 136, 145

predictive policing software, 102

sentencing algorithms for, 10

culture fit, 24-25, 25, 189

curators, of Trending Facebook feature, 165-169, 172

daily active users (DAUs) metric, 74, 97-98

Daniels, Gilbert S., 39

Dash, Anil, 9, 187

data. See personal data; proxy data; training data

data brokers, 101-104

Data Detox Kit, 102-103

DAUs (daily active users) metric, 74, 97-98

default settings

and average users, 38-39

bias in, 35-38, 61

and cultural norms, 198

default effect, 34, 65

defined, 34-35

and Facebook, 108-109

and gender of game avatars, 35-36

and marginalized populations, 37, 66

and Uber's location tracking, 106, 108

Delano, Maggie, 28-31, 33

delight, 8, 79, 90, 93-94, 96
