Abiodun Ogunyemi has been an Uber Eats delivery driver since February 2020. But since March he has been unable to work due to what a union supporting drivers claims is a racially-biased algorithm. Ogunyemi, who is Black, had submitted a photograph of himself to confirm his identity on the app, but when the software failed to recognize him, he was blocked from accessing his account for “improper use of the Uber application.”
Ogunyemi is one of dozens of Uber drivers who have been prevented from working due to what they say is “racist” facial verification technology. Uber uses Microsoft Face API software on its app to verify drivers’ identification, asking drivers to submit new photos on a regular basis. According to trade union the Independent Workers’ Union of Great Britain (IWGB) and Uber drivers, the software has difficulty accurately recognizing people with darker skin tones.
In 2018, a similar version of the Microsoft software was found to misidentify one in five darker-skinned female faces and one in 17 darker-skinned male faces. In London, nine out of 10 private hire drivers identify as Black or Black British, Asian or Asian British, or mixed race, according to Transport for London data. This poses a potential issue for those who work for Uber.
In an email to TIME, an Uber spokesperson said that its facial verification software is “designed to protect the safety and security of everyone who uses the Uber app by helping ensure the correct driver is behind the wheel.” A Microsoft spokesperson said in an emailed statement: “Microsoft is committed to testing and improving Face API, paying special attention to fairness and its accuracy across demographic groups. We also provide our customers with detailed guidance for getting the best results and tools that help them to assess fairness in their system.”
Last week around 80 Uber drivers and protestors gathered outside the ride-hailing app’s London headquarters in Aldgate, waving placards reading “Scrap the racist algorithm” and “Stop unfair terminations,” to protest the software’s role in disproportionately leading to terminations of drivers of color, among other concerns.
Ogunyemi—who was unable to attend the protest because he is based in Manchester—has three children, and since March he says his wife has taken on full-time work to support the family. Even so, he has fallen into arrears on loan and mortgage payments, he says.
The delivery driver, who until recently had a 96% customer rating, had run into difficulties with the automatic facial identification software before. Drivers are given the option of submitting their pictures to a computer or an Uber employee for review and Ogunyemi often had to wait for additional human verification after submitting his photos. When Uber rejected his picture in March, he says, the situation turned into “a nightmare.”
After his appeal of Uber’s decision was rejected, Ogunyemi asked to speak to someone more senior, but his request was denied, he says. IWGB has since stepped in for Ogunyemi, sending evidence to Uber on his behalf. Last month, he received a message from Uber saying his account had been reactivated and that his photo had initially been rejected by a member of staff due to “human error.” Yet, when Ogunyemi tried to access his account, he was asked to upload another picture for verification. He immediately submitted a new photo, which was denied. His account remains blocked.
“Every single day that I cannot work has a negative impact on my family,” he told TIME in a phone call. “My kids need to go to school, I need to give them pocket money. I need to pay for their bus pass.”
Uber’s spokesperson said that its system “includes robust human review to make sure that this algorithm is not making decisions about someone’s livelihood in a vacuum, without oversight,” but did not address Ogunyemi’s case.
Ogunyemi says he knows of five other drivers, all of whom are Black, who have had their accounts terminated because of issues with facial identification. IWGB says that 35 drivers have reported similar incidents to the union.
Driver identity concerns
Uber began using the problematic software after it was stripped of its license to operate in London in November 2019 amid safety concerns. Authorities found that more than 14,000 trips had been taken with 43 drivers who had used false identities. There were 45,000 Uber drivers licensed in London at the time. A year later, Uber won an appeal to have its license reinstated, but promised to root out unverified drivers by using regular facial identification procedures.
Last week it was reported that an unnamed Black British Uber driver is taking the company to court alleging indirect race discrimination because the facial recognition software was preventing him from working. According to the driver’s claim, he submitted two photos of himself, which were rejected by the platform. The IWGB, which is supporting his claim alongside Black Lives Matter U.K., said his account was later deactivated and he received a message saying: “Our team conducted a thorough investigation and the decision to end the partnership has been made on a permanent basis.” The message also said that the matter was “not subject to further review.” The App Drivers and Couriers Union (ADCU) is also taking legal action against Uber over the dismissal of a driver and a courier due to the software failing to recognize them.
In the U.S., a similar case was taken to a Missouri court in 2019, filed under civil rights law. The plaintiff, William Fambrough, claimed he was forced to lighten the photos he submitted for immediate verification, since he worked “late nights” for Uber and the software could not identify his face in “pitch darkness.” The company said the photos were fraudulent and his account was suspended. Fambrough’s claim was ultimately unsuccessful.
According to Professor Toby Breckon, an engineer and computer scientist at Durham University, England, facial recognition software is designed for well-lit photos. He says that people with lighter skin tones tend to be more easily recognized by the software, even in badly-lit environments. The data on racial bias in Uber’s software is “particularly bad,” although there is currently no software without a racial bias, Breckon says. His team of researchers, who are working to reduce racial bias in facial recognition algorithms, has found that skin tone is not the only factor: the technology also struggles to identify a range of facial features and hair types.
At the London protest, drivers expressed anger about the dismissal of their colleagues, which some believed was a symptom of systemic racism within the company. George Ibekwe, an Uber driver whose account was suspended after a customer complained that he had argued with another driver during the trip, told TIME that he believed racism was at play when his account was suspended without further investigation. Uber’s spokesperson did not comment on Ibekwe’s case.
“I haven’t had any criminal record in my life,” he said. “It is totally devastating. It affects me personally, financially, and mentally.” Without an income, he says he has been forced to claim unemployment benefits.
Another driver at the protest, who asked not to be named, claimed he was terminated after a customer complained he was “staring” at them. He said there was “no evidence, no investigation, and no interview” before his account was suspended.
Uber’s spokesperson did not comment about these allegations when asked by TIME.
Uber drivers’ rights
Uber drivers have long fought against worsening pay (despite rising fares) due to higher service fees, and unsafe working conditions. In February, the U.K. Supreme Court ruled that Uber drivers must be treated as workers, rather than self-employed, entitling them to earn a minimum wage and take paid vacation leave. The ruling was the culmination of a long-running legal battle over the company’s responsibility to its drivers. Similar efforts are underway in other countries around the world, including Spain, the Netherlands, and South Africa, while in California, legal wrangling over ride-sharing drivers’ rights is ongoing.
According to Alex Marshall, president of IWGB, the U.K. Supreme Court ruling has opened the door to drivers suing Uber on the basis that the company has failed to protect them from discrimination. He says that since the tribunal alleging indirect race discrimination against a driver was launched, “Uber seem to be slightly on the backfoot.”
“We’re sending off emails [about facial identification errors], and we’re hearing decisions getting overturned a lot quicker than in the past,” he says.
The outcome of the upcoming court case may have major implications for Uber’s facial identification processes, and could set a precedent for use of the technology. “We’re seeing this movement growing,” Marshall says. “We’re seeing the power switch back to the drivers and we’re going to keep fighting.”
Ogunyemi will be watching the other drivers’ tribunals closely and says he is considering whether to approach a lawyer himself. “It’s been six months since I’ve been out of work,” he says. “I have tried everything humanly possible to reason with Uber. I am not going to sit around any longer waiting for them.”