Meredith Broussard - More than a Glitch: Confronting Race, Gender, and Ability Bias in Tech
Confronting Race, Gender, and Ability Bias in Tech
Meredith Broussard
The MIT Press
Cambridge, Massachusetts | London, England
© 2023 Massachusetts Institute of Technology
All rights reserved. No part of this book may be reproduced in any form by any electronic or mechanical means (including photocopying, recording, or information storage and retrieval) without permission in writing from the publisher.
An earlier version of chapter 5, "Real Students, Imaginary Grades," was published as "When Algorithms Give Real Students Imaginary Grades" in the New York Times, September 9, 2020. https://www.nytimes.com/2020/09/08/opinion/international-baccalaureate-algorithm-grades.html .
An earlier version of chapter 7, "Gender Rights and Databases," was published as "The Next Frontier for Gender Rights Is Inside Databases," in You Are Not Expected to Understand This: How 26 Lines of Code Changed the World, ed. Torie Bosch (Princeton, NJ: Princeton University Press, 2022). All rights reserved.
The MIT Press would like to thank the anonymous peer reviewers who provided comments on drafts of this book. The generous work of academic experts is essential for establishing the authority and quality of our publications. We acknowledge with gratitude the contributions of these otherwise uncredited readers.
Library of Congress Cataloging-in-Publication Data
Names: Broussard, Meredith, author.
Title: More than a glitch : confronting race, gender, and ability bias in tech / Meredith Broussard.
Description: Cambridge, Massachusetts : The MIT Press, [2023] | Includes bibliographical references and index. | Summary: "Broussard argues that the structural inequalities reproduced in algorithmic systems are no glitch. They are part of the system design. This book shows how everyday technologies embody racist, sexist, and ableist ideas; how they produce discriminatory and harmful outcomes; and how this can be challenged and changed." Provided by publisher.
Identifiers: LCCN 2022019913 (print) | LCCN 2022019914 (ebook) | ISBN 9780262047654 | ISBN 9780262373067 (epub) | ISBN 9780262373050 (pdf)
Subjects: LCSH: Technology--Social aspects. | Data processing--Social aspects. | Artificial intelligence--Social aspects. | Discrimination. | Software failures.
Classification: LCC T14.5 .B765 2023 (print) | LCC T14.5 (ebook) | DDC 303.48/3--dc23/eng/20221006
LC record available at https://lccn.loc.gov/2022019913
LC ebook record available at https://lccn.loc.gov/2022019914
For my family
List of Figures
Linear relationship between variables.
Source: Solon Barocas and Chandler May.
Nonlinear relationship between variables.
Source: Solon Barocas and Chandler May.
Nonmonotonic relationship between variables.
Source: Solon Barocas and Chandler May.
Multidimensional linear relationship between variables.
Source: Solon Barocas and Chandler May.
Multidimensional nonlinear relationship between variables.
Source: Solon Barocas and Chandler May.
Multidimensional nonmonotonic relationship between variables.
Source: Solon Barocas and Chandler May.
Depiction of selected federal, state, and nongovernment systems with facial recognition technology used by federal agencies that employ law enforcement officers, and the number of photos in them, as of March 31, 2020.
Source: GAO-21-105309, https://docs.house.gov/meetings/JU/JU08/20210713/113906/HMTG-117-JU08-Wstate-GoodwinG-20210713.PDF ; GAO analysis of information provided by system users or owners.
Often, when people talk about making more equitable technology, they start with fairness. This is a step in the right direction. Unfortunately, it is not a big enough step in the right direction. Understanding why starts with a cookie. (A sweet and crunchy one, like you would eat, not like the cookie that you have to accept when visiting a web page.)
When I think of a cookie, I think of the jar my mother kept on our yellow Formica kitchen counter throughout my childhood. It was a large porcelain jar with a wide mouth, and often it was filled with homemade cookies. The porcelain lid clanked loudly every time a kid opened the jar for a snack. If I heard my little brother opening the jar, I wandered into the kitchen to get a cookie too. My brother did the same if he heard me. It was a mutually beneficial system, until we got to the last cookie.
When there was only one cookie left in the jar, my brother and I bickered about who got it. It was inevitable. My brother and I squabbled about everything as kids. (As adults, we work in adjacent fields, and there's still a fair bit of good-natured back and forth.) At the time, our cookie conflicts seemed high-stakes. There were often tears. Today, as a parent myself, I admire my mother for stepping in hundreds of times to resolve these kinds of kid disputes. I admire her more for the times she didn't step in and let us work out the problem on our own.
If this story were a word problem in an elementary school math workbook, the answer would be obvious. Each kid would get half of the cookie, or 50 percent. End of story. This mathematical solution is how a computer would solve the dispute as well. Computers are machines that do math. Everything computers do is quite literally a computation. Mathematically, giving each kid 50 percent is fair.
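The computational version of this dispute can be sketched in a few lines. This is a hypothetical illustration, not code from the book: the function name and structure are invented to show how the machine's answer is pure arithmetic, with no room for negotiation over big halves and TV shows.

```python
def split_cookie(num_kids: int) -> list[float]:
    """Divide one cookie into mathematically equal shares.

    The computer's notion of fairness: every share is identical,
    and the shares sum to the whole cookie. Nothing else is modeled.
    """
    share = 1.0 / num_kids
    return [share] * num_kids

# Two kids, one cookie: each gets exactly 0.5.
print(split_cookie(2))
```

The machine's answer is always the same equal split; the social negotiation that actually resolves the dispute is nowhere in the computation.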
In the real world, when kids divide a cookie in half, there is normally a big half and a little half. Anyone who has had a kid or been a kid can tell you what happens next: there's a negotiation over who gets which half. Often, there is more arguing at this point, and tears. If I was in the mood for peaceful resolution when I was a kid, I would strike a deal with my little brother. "If you let me have the big half, I'll let you pick the TV show that we watch after dinner," I'd offer. He would think for a moment and decide that sounded fair. We would both walk away content. That's an example of a socially fair decision. My brother and I each got something we wanted, even though the division was not mathematically equal.
Social fairness and mathematical fairness are different.
Computers can only calculate mathematical fairness.
This difference explains why we have so many problems when we try to use computers to judge and mediate social decisions. Mathematical truth and social truth are fundamentally different systems of logic. Ultimately, it's impossible to use a computer to solve every social problem. So, why do so many people believe that using more technology, more computing power, will lead us to a better world?
The reason is technochauvinism. Technochauvinism is a kind of bias that considers computational solutions to be superior to all other solutions. Embedded in this bias is an a priori assumption that computers are better than humans, which is actually a claim that the people who make and program computers are better than other humans. Technochauvinism is what led to the thousands of abandoned apps and defunct websites and failed platforms that litter our collective digital history. Technochauvinist optimism led to companies spending millions of dollars on technology and platforms that marketers promised would revolutionize and digitize everything from rug-buying to interstellar travel. Rarely have those promises come to pass, and often the digital reality is not much better than the original. In many cases, it is worse. Behind technochauvinism are very human factors like self-delusion, racism, bias, privilege, and greed. Many of the people who try to convince you that computers are superior are people trying to sell you a computer or a software package. Others are people who feel like they will gain some kind of status from getting you to adopt technology; people who are simply enthusiastic; or people who are in charge of IT or enterprise technology. Technochauvinism is usually accompanied by equally bogus notions like "algorithms are unbiased" or "computers make neutral decisions because their decisions are based on math." Computers are excellent at doing math, yes, but time and time again, we've seen algorithmic systems fail at making social decisions. Algorithms can't sufficiently monitor or detect hate speech, can't replace social workers in public assistance programs, can't predict crime, can't determine which job applicants are more suited than others, can't do effective facial recognition, can't grade essays or replace teachers, and yet technochauvinists keep selling us snake oil and pretending that technology is the solution to every social problem.