More Than a Glitch : Confronting Race, Gender, and Ability Bias in Tech.

By: Broussard, Meredith
Publisher: Cambridge, Massachusetts : The MIT Press, 2023
Copyright date: ©2023
Description: 234 pages : illustrations ; 24 cm
Content type:
  • text
Media type:
  • unmediated
Carrier type:
  • volume
ISBN:
  • 9780262047654
  • 0262047659
Additional physical formats: Online version: More than a glitch
DDC classification:
  • 303.48/3 (23/eng/20221006)
LOC classification:
  • T14.5 .B765 2023
Contents:
Introduction -- Understanding machine bias -- Recognizing bias in facial recognition -- Machine fairness and the justice system -- Real students, imaginary grades -- Ability and technology -- Gender rights and databases -- Diagnosing racism -- An AI told me I had cancer -- Creating public interest technology -- Potential reboot.
Summary: In this book, the author argues that the structural inequalities reproduced in algorithmic systems are no glitch; they are part of the system design. The book shows how everyday technologies embody racist, sexist, and ableist ideas; how they produce discriminatory and harmful outcomes; and how this can be challenged and changed. -- Provided by publisher.

Summary: When technology reinforces inequality, it's not just a glitch - it's a signal that we need to redesign our systems to create a more equitable world. The word "glitch" implies an incidental error, as easy to patch up as it is to identify. But what if racism, sexism, and ableism aren't just bugs in mostly functional machinery - what if they're coded into the system itself? In the vein of heavy hitters such as Safiya Umoja Noble, Cathy O'Neil, and Ruha Benjamin, the author demonstrates how neutrality in tech is a myth and why algorithms need to be held accountable. The author, a data scientist and one of the few Black female researchers in artificial intelligence, masterfully synthesizes concepts from computer science and sociology. She explores a range of examples: from facial recognition technology trained only to recognize lighter skin tones, to mortgage-approval algorithms that encourage discriminatory lending, to the dangerous feedback loops that arise when medical diagnostic algorithms are trained on insufficiently diverse data. Even when such technologies are designed with good intentions, the book shows, fallible humans develop programs that can result in devastating consequences. The author argues that the solution isn't to make omnipresent tech more inclusive, but to root out the algorithms that target certain demographics as "other" to begin with. With sweeping implications for fields ranging from jurisprudence to medicine, the groundbreaking insights of this book are essential reading for anyone invested in building a more equitable future. -- Publisher's description.
List(s) this item appears in: 2024 New Titles
Holdings
Item type: BOOK
Current library: NCAR Library Mesa Lab
Call number: T14.5 .B765 2023
Copy number: 1
Status: Checked out (date due 06/03/2025)
Barcode: 50583020016899
Total holds: 0

Includes bibliographical references (pages [193]-221) and index.

Questions? Email library@ucar.edu.

Not finding what you are looking for? Request it through InterLibrary Loan.