Article published in: Information Visualization
Edited by Marian Dörk and Isabel Meirelles
[Information Design Journal 25:1] 2019
► pp. 56–70
Gaps between the digits
On the fleshy unknowns of the human
Available under the Creative Commons Attribution-NonCommercial (CC BY-NC) 4.0 license.
For any use beyond this license, please contact the publisher at rights@benjamins.nl.
Published online: 16 March 2020
https://doi.org/10.1075/idj.25.1.05mor
Abstract
Artificially intelligent (AI) systems are increasingly becoming the ubiquitous, unseen arbiters of our social, civic, and familial lives. Ever-increasing computational power, combined with almost limitless data, has led to a turning point in the way artificial intelligence assists, judges, and cares for humans. In the wake of such power, we must ask ourselves what we are making inherently unknowable as the world becomes more predictable, managed, and discrete. Building on the work of black feminists Sylvia Wynter and Hortense Spillers, I perform a reading of the “flesh”. I aim to gesture toward a different field of relations and a knowledge politic premised on unknowability and the radical potential of the subjugated to foster new imaginaries of the human fluid enough to weather instability. This piece troubles the boundaries inscribed between things. Settled in the flesh of blackness, we are reminded of the ways that blackness floods the landscape of productive reason while holding outlier ways of being beyond Western Man. This paper seeks to return to the pulse found within the flesh as a critical site for thinking through alternate ways of being, within the messiness, the unstable, the precarious; finding life born of transition, the pulse within discord.
Keywords: artificial intelligence, machine bias, black studies
Article outline
- 1. Introduction
- 2. As human as an acronym
- 3. Fixing logics: Building the set
- 4. Racializing assemblages: Making through discretion
- 5. Building from detritus: Towards fleshy ways of unknowing
- Notes
- References

References
AI Now Institute (2018). Litigating algorithms: Challenging government use of algorithmic decision systems (Report, in collaboration with Center on Race, Inequality, and the Law, Electronic Frontier Foundation).
Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2016). Machine Bias. ProPublica. Retrieved from [URL]
Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. New York, NY: St. Martin’s Press.
Frequently Asked Questions (2017, December 15). Retrieved from [URL]
Kurgan, L. (2013). Close up at a distance: Mapping, technology, and politics. Brooklyn, NY: Zone Books.
Mckittrick, K. (2011). On plantations, prisons, and a black sense of place. Social & Cultural Geography, 12(8), 947–963.
Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York, NY: New York University Press.
Oliver, M. L., & Shapiro, T. M. (1995). Black wealth/white wealth: A new perspective on racial inequality. New York, NY: Routledge.
O’Neil, C. (2018). Weapons of math destruction: How big data increases inequality and threatens democracy. London: Penguin Books.
Rothstein, R. (2017). The color of law: A forgotten history of how our government segregated America. New York, NY: Liveright Publishing Corporation.
Spillers, H. J. (1987). Mama’s baby, Papa’s maybe: An American grammar book. Diacritics, 17(2), 64–81.
Terry, J., & Urla, J. (Eds.). (1999). Deviant bodies: Critical perspectives on difference in science and popular culture. Bloomington, IN: Indiana University Press.
The Human Project (2017, November 20). Retrieved from [URL]
Weheliye, A. G. (2014). Habeas viscus: Racializing assemblages, biopolitics, and black feminist theories of the human. Durham, NC: Duke University Press.
