
‘Racist’ Technology Is a Bug—Not a Crime

John McWhorter is an associate professor of English and comparative literature at Columbia University.

We are told of late that we must entertain whether technology can be a racist. Like when Google Photos, in 2015, algorithmically identified black people as gorillas. Or earlier this year when Microsoft’s Twitterbot Tay, designed to emulate human conversation by trawling tweets, sucked up racist nonsense along with everything else and started spouting some of its own. Or when, in August, Snapchat offered a selfie-altering filter that rendered users as an offensive Asian caricature.

Of course these things should not, once noticed, stay as they are. (All of the examples above were altered or taken down.) But are they racist—i.e., evidence that contempt for racial minorities is the warp and woof of our society still?

The fact that we are trained to approach such things from that perspective calls for some words from James Baldwin, someone few consider to have ever gotten much wrong on race. Here he is in 1962: “I do not know many Negroes who are eager to be ‘accepted’ by white people, still less to be loved by them; they, the blacks, simply don’t wish to be beaten over the head by the whites every instant of our brief passage on this planet.”

To Baldwin, the issue was getting rid of segregation and police brutality, not cleansing whites’ hearts of all racist sentiment, which blacks of his generation considered beside the point, not to mention impossible. I suspect many today concur—Baldwin’s quote is very Black Lives Matter—but too often we get our heads turned in unproductive directions. This leads to an obsession not with racism as an obstacle to achievement, but with racism as a social stain to rub out; it’s like trying to shame people who don’t recycle or floss.

Machines cannot, themselves, be racists. Even equipped with artificial intelligence, they have neither brains nor intention. The question worth asking is whether the people who created a given technology qualify as racists. We can dismiss the idea that wonks dreaming up these mechanisms deliberately intend to offend people. No one at Google giggled while intentionally programming its software to mislabel black people. Microsoft’s engineers were horrified by their Frankenstein Twitterbot.

All three of these flubs were just that: unintentional outcomes that their creators were quick to regret and correct. Should we expect, for example, that these creators would have anticipated that software coding gorillas as black in color, and perhaps as having fullish lips, might apply the same label to black people? They may well have assumed that the recognition software was programmed richly enough to pick out specifically human traits, and that they therefore need not worry about this. To instead take the occasion to flay Silicon Valley for not hiring enough black people is hasty: Can we really be certain that a design team with more black programmers would not have made the same flub?

Tay’s programmers, meanwhile, would hardly be alone in underestimating the degree of vicious idiocy on Twitter, and may have assumed that normal, neutral questions put to Tay would not steer it toward noxious vitriol. These were, in a word, bugs. Bugs in programs involved chiefly in labeling and language are bound, at some point, to create offense.

The Snapchat filter stands out. Clearly somebody in creative there is on the clueless side (and Snapchat is one of the tech companies that refuses to report on the racial composition of its staff). However, cluelessness is not bigotry. Eyes like the ones used by Snapchat are legion in anime-derived emojis, for example. A sane person could well assume, although perhaps in haste, that such a facelet was within the bounds of decency.

Our culture has gotten to the point that we consider it our job to say these mistakes indicate “racism,” with the implication that the designers and the people who hire them are therefore “racists.” This disproportionate disgust is a touch medieval in two ways. Imputing bigotry to a computer program is like imputing a spirit to a tree. And calling a Silicon Valley computer programmer a racist is like deeming one’s innocent next-door neighbor a witch for being less than perfect. In a healthier moment there would be more room for saying that these people simply made a mistake.

