More Than a Glitch : Confronting Race, Gender, and Ability Bias in Tech.
Publisher: Cambridge, Massachusetts : The MIT Press, 2023
Copyright date: ©2023
Description: 234 pages : illustrations ; 24 cm
Content type: text
Media type: unmediated
Carrier type: volume
ISBN:
- 9780262047654
- 0262047659
Subject(s):
- Technology -- Social aspects
- Electronic data processing -- Social aspects
- Artificial intelligence -- Social aspects
- Discrimination
- Software failures
- Intelligence artificielle -- Aspect social
- Bogues (Informatique)
- TECHNOLOGY & ENGINEERING / Social Aspects
- SOCIAL SCIENCE / Discrimination
- SOCIAL SCIENCE / Gender Studies
DDC classification: 303.48/3 (23rd ed.)
LOC classification: T14.5 .B765 2023
Current library | Call number | Copy number | Status | Date due | Barcode
---|---|---|---|---|---
NCAR Library Mesa Lab | T14.5 .B765 2023 | 1 | Checked out | 06/03/2025 | 50583020016899
Includes bibliographical references (pages [193]-221) and index.
Introduction -- Understanding machine bias -- Recognizing bias in facial recognition -- Machine fairness and the justice system -- Real students, imaginary grades -- Ability and technology -- Gender rights and databases -- Diagnosing racism -- An AI told me I had cancer -- Creating public interest technology -- Potential reboot.
In this book, the author argues that the structural inequalities reproduced in algorithmic systems are not a glitch; they are part of the system's design. This book shows how everyday technologies embody racist, sexist, and ableist ideas; how they produce discriminatory and harmful outcomes; and how this can be challenged and changed. -- Provided by publisher.
When technology reinforces inequality, it's not just a glitch - it's a signal that we need to redesign our systems to create a more equitable world. The word "glitch" implies an incidental error, as easy to patch up as it is to identify. But what if racism, sexism, and ableism aren't just bugs in mostly functional machinery - what if they're coded into the system itself? In the vein of heavy hitters such as Safiya Umoja Noble, Cathy O'Neil, and Ruha Benjamin, the author demonstrates how neutrality in tech is a myth and why algorithms need to be held accountable. The author, a data scientist and one of the few Black female researchers in artificial intelligence, masterfully synthesizes concepts from computer science and sociology. She explores a range of examples: from facial recognition technology trained only to recognize lighter skin tones, to mortgage-approval algorithms that encourage discriminatory lending, to the dangerous feedback loops that arise when medical diagnostic algorithms are trained on insufficiently diverse data. Even when such technologies are designed with good intentions, this book shows, fallible humans develop programs that can result in devastating consequences. The author argues that the solution isn't to make omnipresent tech more inclusive, but to root out the algorithms that target certain demographics as "other" to begin with. With sweeping implications for fields ranging from jurisprudence to medicine, the ground-breaking insights of this book are essential reading for anyone invested in building a more equitable future. -- Publisher's description.