Patrick K. Lin - Machine See, Machine Do: How Technology Mirrors Bias in Our Criminal Justice System


When today's technology relies on yesterday's data, it will simply mirror our past mistakes and biases.

AI and other high-tech tools embed and reinforce America's history of prejudice and exclusion, even when they are used with the best intentions. Patrick K. Lin's Machine See, Machine Do: How Technology Mirrors Bias in Our Criminal Justice System takes a deep and thorough look into the use of technology in the criminal justice system, and investigates the instances of coded bias present at every level.

In this book, you'll learn how algorithms and high-tech tools are used in unexpected ways: suggesting which neighborhoods to police, predicting whether someone is more or less likely to commit a crime, and determining how long someone's prison sentence should be.

Machine See, Machine Do takes you on an eye-opening journey of discovery, encouraging you to think twice about our current system of justice and the technology that supposedly makes it more objective and fair. If you are someone who cares deeply about criminal justice reform, is curious about the role of technology in our day-to-day lives, and ultimately believes we should aspire to make both of these spaces more ethical and safe, this book is for you.


Machine See, Machine Do

How Technology Mirrors Bias in Our Criminal Justice System

Patrick K. Lin

New Degree Press

Copyright 2021 Patrick K. Lin

All rights reserved.

Machine See, Machine Do

How Technology Mirrors Bias in Our Criminal Justice System

ISBN 978-1-63730-821-9 (Paperback)

ISBN 978-1-63730-883-7 (Kindle Ebook)

ISBN 978-1-63730-967-4 (Ebook)

"Whether AI will help us reach our aspirations or reinforce the unjust inequalities is ultimately up to us."

Joy Buolamwini

For every person who has ever been shortchanged, excluded, and underestimated by machines.

For the public interest technologists improving the world with their hope, vision, and perseverance.

For the activists and advocates protecting civil liberties through organizing, education, and dedication.

Introduction

Much of what New York City looks like today is attributed to a man who never held elected office or received any formal training in architecture or urban planning.

Robert Moses has been called the "master builder" of mid-twentieth-century New York and its surrounding suburbs. He shaped much of the infrastructure of modern New York City, Long Island, Rockland County, and Westchester County (Caro, 1974).

Over the course of his forty-four-year career, Moses built nearly seven hundred miles of road, including massive highways that stretched out of Manhattan into Long Island and Upstate New York; twenty thousand acres of parkland and public beaches; 658 playgrounds; seven new bridges; the UN Headquarters; the Central Park Zoo; and the Lincoln Center for the Performing Arts (Burkeman, 2015). It would be an understatement to say Moses left a lasting mark on New York. "In the twentieth century, the influence of Robert Moses on the cities of America was greater than that of any other person," wrote American historian Lewis Mumford.

However, new, large-scale developments come with a price, and not everyone pays the same amount.

To build hundreds of miles of highways and dozens of housing and urban renewal projects, Moses had more than five hundred thousand people evicted (Gratz, 2007). Black and Brown people comprised 40 percent of the evicted population at a time when those demographics made up only about 10 percent of New York City's overall population (Census Bureau, 2021). The construction of Lincoln Center alone displaced more than seven thousand working-class families and eight hundred businesses. Many of these evicted New Yorkers ended up in Harlem and the Bronx, further segregating the city (Williams, 2017). Moses also avoided building public pools in Black neighborhoods and instead designed those same neighborhoods to be prone to traffic congestion, not only withholding public goods from Black neighborhoods, but also forcing them to bear the brunt of the social costs (Schindler, 2015).

Robert Moses with a model of the proposed Battery Bridge. Source: The Library of Congress.

Moses infamously hated the idea of poor people, particularly poor people of color, using the new public parks and beaches he was building on Long Island (Burkeman, 2015). To that end, Moses used his influence and connections to pass a law forbidding public buses on highways, but he knew laws could someday be repealed. "Legislation can always be changed," Moses said. "It's very hard to tear down a bridge once it's up." So Moses built scores of bridges that were too low to let public buses pass, literally concretizing discrimination (Bornstein, 2017). The effect of these decisions has been profound and enduring. Decades later, the bus laws Moses fought for were overturned. Still, the towns he built along the highways remain as segregated as ever.

People often do not want to believe seemingly innocuous objects, like bridges or highways, can be racial or political, but as Moses's buildings and plans show, human history is inherently racial and political. Moses's racist views played out in what he built, how he built, and where he built.

But Moses was not alone. He wielded tremendous power and influence throughout his career, but he was still just an individual operating within a system built on bias and racism. For example, the Federal Housing Administration's Underwriting Manual stated "incompatible racial groups should not be permitted to live in the same communities," recommending highways be built as a way to separate Black neighborhoods from white neighborhoods (Gross, 2017). Rooting out bias isn't only about powerful individuals; it isn't even just about you or me. It's about history and systems that continue to exist, bridges that are too difficult to tear down.

Discriminatory decisions and policies of the past impact the present. Racial and social inequity affect the very fabric of our reality. Everything has costs and benefits, and these are not evenly distributed. The decision, whether conscious or unconscious, to advance or burden some members of society over others is fundamentally racial and political.

Artificial intelligence is no different. The technology is relentlessly improving and increasingly pervasive, yet despite well-documented biases, AI developed in the private and public sectors alike consistently fails to account for them. Somehow, in the past two decades, we got the idea that machines make better decisions than humans. We began saying things like, "People are biased, but AI will be more objective." We have forgotten that humans design and deploy AI to serve their purposes. Humans, even those with the best intentions, can introduce bias into the AI they develop. Technology is not inherently objective or fair.

Today's technology, built from yesterday's data, will reflect the biased environment from which that data came. Bias often appears in AI systems through factors like race and gender, which generally are not directly inputted into the system but still have a strong influence over the system's decisions. The system is especially prone to bias when one of these factors is strongly correlated with information directly used by the system.

For example, suppose a system that makes determinations about someone's level of education uses zip code as a factor to make its decisions. Direct information about race is never given to the system, so how can a system like that be biased?

"Zip code is correlated with race since a lot of neighborhoods in America are still segregated," senior staff technologist Daniel Kahn Gillmor of the ACLU's Speech, Privacy, and Technology Project said to me. Gillmor's work focuses on the way our technical infrastructure shapes society and impacts civil liberties. "The data you're using to make these guesses is ultimately going to be pulled from a society that has a bunch of other problems, and the system is going to just reflect those problems back."

By using zip code as a factor, the AI system is indirectly making decisions based on race. In other words, zip code is a proxy for race. Therefore, even if the system's math and logic are correct, an underlying ethical question reveals itself: is it appropriate to make these decisions based on these inputs?
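The proxy effect described above can be demonstrated with a small simulation. The sketch below is entirely hypothetical: the group names, zip codes, and scores are invented for illustration, and the "model" is just a lookup table standing in for anything learned from biased historical data. The point is only that a system which never sees race can still produce race-correlated outputs when zip code and race are correlated.

```python
import random

random.seed(0)

# Hypothetical synthetic population: race is never given to the "model",
# but residential segregation makes zip code strongly correlated with it.
def make_person():
    race = random.choice(["group_a", "group_b"])
    if race == "group_a":
        zip_code = random.choices(["10001", "10002"], weights=[9, 1])[0]
    else:
        zip_code = random.choices(["10001", "10002"], weights=[1, 9])[0]
    return race, zip_code

people = [make_person() for _ in range(10_000)]

# A model trained on biased historical data might, in effect, just score
# zip codes by past outcomes. We hard-code such "learned" scores here.
risk_by_zip = {"10001": 0.2, "10002": 0.7}

# The model's input is zip code alone; race never enters the computation.
scores = [(race, risk_by_zip[zip_code]) for race, zip_code in people]

# ...yet the average score differs sharply by race, because zip code
# acts as a proxy for it.
avg_a = sum(s for r, s in scores if r == "group_a") / sum(1 for r, _ in scores if r == "group_a")
avg_b = sum(s for r, s in scores if r == "group_b") / sum(1 for r, _ in scores if r == "group_b")
print(f"average score, group_a: {avg_a:.2f}")
print(f"average score, group_b: {avg_b:.2f}")
```

Running this shows the two groups receiving very different average scores even though race is never an input, which is exactly the ethical question the passage raises: the math is correct, but the choice of inputs smuggles the correlation in.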

A magical machine offering the promise of objectivity and fairness is extremely appealing. The public can be tricked into accepting an imperfect or even incompetent algorithm, particularly when the current state of an institution has historically been plagued by prejudice and bias, like the judicial system. We know things need to change and we want to believe technology can be that change. However, unlike humans, an algorithm cannot dissent, disobey, or make exceptions. People, on the other hand, can learn to account for ways in which data is a representation of the past.
