AI Errors and the Human Body

In the last six months, a series of errors has made headlines accusing algorithms of being racist, sexist or harmful in their ‘reading of the human body’. If we want to really appreciate what is happening today, and why AI systems and algorithms seem to be getting it so wrong when it comes to reading the human body, we need to look at the history of scientific racism in Western thought.

Automated preview-image cropping on Twitter favors lighter-skinned faces over darker ones. Facebook’s ad-delivery algorithm shows different job offers depending on gender. A farmer in Canada promoting his onion seeds on Facebook finds his ad deleted because it was automatically flagged as sexual content… Whether funny or infuriating, we could almost be grateful for these errors, as they shed light on how our current technologies are the expression of a long history of mismeasurement of the human body, one that finds its roots in the history of scientific racism.
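To see how such a bias can be baked into a pipeline, here is a minimal sketch of saliency-based automatic cropping in Python. It is not Twitter’s actual model: where a production system uses a network trained on human gaze data, this toy deliberately scores raw brightness as “saliency”, which makes the failure mode obvious.

```python
import numpy as np

def toy_saliency(image: np.ndarray) -> np.ndarray:
    """Deliberately crude proxy: treat pixel brightness as saliency.
    A real model is trained on human gaze data, but the pipeline's
    structure is the same: a single score map decides the crop."""
    return image.mean(axis=2)  # average over RGB channels -> (H, W)

def auto_crop(image: np.ndarray, crop_h: int, crop_w: int) -> np.ndarray:
    """Return the (crop_h, crop_w) window centered on the most
    'salient' point, clamped to the image bounds."""
    h, w = image.shape[:2]
    assert crop_h <= h and crop_w <= w
    saliency = toy_saliency(image)
    y, x = np.unravel_index(np.argmax(saliency), saliency.shape)
    top = min(max(y - crop_h // 2, 0), h - crop_h)
    left = min(max(x - crop_w // 2, 0), w - crop_w)
    return image[top:top + crop_h, left:left + crop_w]

# Demo: a dark image with one bright patch -- the crop homes in on it.
img = np.zeros((100, 100, 3))
img[15:25, 65:75] = 1.0
crop = auto_crop(img, 40, 40)
# With this proxy, whichever region is brighter wins the crop:
# the bias lives in the scoring function, not in a later step.
```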

In 1981, the paleontologist Stephen Jay Gould published The Mismeasure of Man, a groundbreaking (and heartbreaking) overview of scientific racism and statistical methods. Gould (1981) was particularly interested in IQ rankings, and he argued that Western scientific thought had rested on the mistaken idea that intelligence could be measured in a unitary and linear way, and that there were clear biological markers (e.g. the measure of the cranium) that determined whether one was intelligent or not. Gould’s work is groundbreaking because it confronts us with the biases and misconstructions of scientific racism.

We need to look into this history if we want to understand what is happening today and how our society is shaped by misconstrued and racist understandings of the human body. In her 2019 book Fearing the Black Body, the sociologist Sabrina Strings argues that current calculations of the Body Mass Index, for instance, are not rooted in scientific evidence but in cultural and racist understandings of what the ordinary human should be. She shows how, throughout Western history, the body has been used to validate race, class and gender prejudice.
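For reference, the calculation Strings is questioning is remarkably thin. A minimal sketch in Python, using the standard formula and the WHO cut-offs: one ratio and four fixed thresholds, applied identically to every body.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight divided by height squared."""
    return weight_kg / height_m ** 2

def who_category(value: float) -> str:
    """Standard WHO cut-offs: the same four thresholds are applied
    to every body, regardless of sex, age, ethnicity or build."""
    if value < 18.5:
        return "underweight"
    if value < 25.0:
        return "normal"
    if value < 30.0:
        return "overweight"
    return "obese"

print(who_category(bmi(70, 1.75)))  # -> "normal" (BMI ~22.9)
```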

If we understand technologies for what they are, cultural artefacts, then it is not surprising that the technologies we design are full of the race, class, and gender prejudices of our culture. In the last few years, for instance, there has been a growing interest in how social networks like Instagram are designed for and contribute to building an idea of “standard beauty”. When the Black plus-size model and influencer Nyome Nicholas-Williams launched a campaign to change Instagram’s nudity policy, she addressed the issue of “photos of semi-naked skinny white women” being allowed while “those posted by black women in similar poses” were either automatically deleted or less promoted, as the Guardian reports.

Standardized beauty and its associated features are not a new subject, but digital technologies amplify and automate discrimination against other forms of physical beauty and their ways of expression. To fight such discrimination, which tends to be fiercest towards people who have historically been discriminated against (Eubanks, 2018), political action like this campaign to force tech giants to adjust their policies is a first step. Making sure that the design teams working for these tech giants include more diverse members is a second, as Sara Wachter-Boettcher (2017) advocates in Technically Wrong.

Yet we need to acknowledge that scientific racism creeps into different levels of technological life, and the COVID-19 pandemic is only amplifying and reinforcing this problem.

A key example, close to home at the time of writing, can be found in COVID-19 contact-tracing apps. In her fascinating work, Milan (2020) has shown that most of these apps are based on a “standard” experimental subject that hardly allows for exploring the role of variables such as gender, ethnicity, race, or low income. Milan shows how this reductionism stems from design practice itself. For this reason, she draws on the anthropologist Arturo Escobar, who has advanced a new vision for design theory, one that takes into account the complex and intersectional pluriverse we live in (Escobar, 2018).
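A simplified sketch can make Milan’s point concrete. The toy risk score below is loosely modeled on Bluetooth exposure-notification designs (the thresholds are assumptions, not any real app’s parameters); the “standard subject” is visible in what the function is never asked about.

```python
def exposure_risk(attenuation_db: float, duration_min: float) -> float:
    """Toy exposure-risk score in the spirit of Bluetooth
    contact-tracing designs (hypothetical, not any app's real code).

    Note what the signature leaves out: occupation, household
    density, access to testing, ability to isolate -- the variables
    Milan (2020) argues the 'standard' experimental subject erases.
    """
    # Weaker Bluetooth signal (higher attenuation) is read as distance.
    if attenuation_db > 73:          # assumed "too far" cut-off
        return 0.0
    proximity_weight = 1.0 if attenuation_db < 55 else 0.5
    return proximity_weight * duration_min

ALERT_THRESHOLD = 15.0               # assumed "15 min close contact" rule
print(exposure_risk(50, 20) >= ALERT_THRESHOLD)  # -> True
```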

The truth is that we could almost be grateful for these algorithmic errors in reading the human body, because they underline what we have forgotten in the race for technological innovation. Each human being is unique and evolves through time, interactions and experiences. No matter how fast and powerful AI systems become, human complexity will never be sorted by computational logic or reduced to numerical data sets, and AI systems will always be shaped by the histories and prejudices of their designers. These algorithmic errors might be interpreted as a weak signal of a common necessity to reconsider how we read and understand the human body.

by Marie Poux-Berthe and Veronica Barassi