Update (2:00 PM) Statement from Uber spokesperson: "The driver in question did not lose access to the Uber app due to a change in physical appearance. In fact, he had visited an Uber Partner Seva Kendra when he was unable to log in and it was communicated to him that his access to the app had been previously removed due to violations of our community guidelines, which set the highest possible safety standards on our platform. Uber's facial comparison tool helps ensure that only the registered driver is using the account and is capable of detecting natural changes in a person's appearance such as long or cropped hair. In case drivers face a problem logging in due to any technical issue with the selfie verification process, they have the option to visit the nearest Uber Partner Seva Kendra for a manual review of their profile."
Original Article (10:42 AM) An Uber driver has been locked out of the Uber partner app for more than a month after its facial recognition identity check failed to recognise his face with his shaved head, Telangana Gig and Platform Workers Union (TGPW) said in a statement.
Neradi Srikanth has been driving with Uber for over 1.5 years now and holds a 4.67-star rating for the 1,428 trips he completed, the Union said. He got his head shaved during a recent trip to Tirupati, and when he returned, he found that he could not log in to his Uber partner app, which is his source of daily income.
The union said that the ride-hailing giant's facial recognition tech failed to identify Srikanth with his shaved head after he uploaded a routine selfie. That was 34 days ago. For each of those 34 days, Srikanth has visited the Uber office and been met with no response, all while going without his daily income.
"Like Srikanth, many have been doing rounds of Uber offices in their cities but there is no manager or any staff that can take decisions to solve their complaints. The drivers are left stranded for days without work and money and no one to resolve their problems," Shaikh Salauddin, the president of TGPW, claimed.
Meanwhile, we have reached out to Uber India representatives for comment on these allegations and will update this article as soon as they respond.
In March 2017, Uber rolled out a 'Real Time ID Check' feature in India to verify that driver accounts aren't being used by anyone other than the licensed individuals registered with the ride-hailing platform.
"Drivers are asked periodically to take a selfie in the Uber app before they accept rides. We then use Microsoft's Cognitive Services to instantly compare this photo to the one corresponding with the account on file. If the two photos don't match, the account is temporarily blocked while we look into the situation," Uber explained in a blog post.
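The flow Uber describes can be sketched in a toy form. This is an illustrative sketch only, assuming an embedding-based face comparison with a made-up `MATCH_THRESHOLD`; Uber and Microsoft do not publish their actual models or thresholds, and real systems score a match with a trained model rather than hand-made vectors:

```python
# Toy sketch of a "selfie check" flow: compare an embedding from a new
# selfie against the embedding on file; no match -> temporary block and
# manual review, per Uber's description. Embeddings here are fake vectors.
import numpy as np

MATCH_THRESHOLD = 0.8  # hypothetical cut-off; real thresholds are tuned

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def selfie_check(selfie_embedding: np.ndarray, on_file_embedding: np.ndarray) -> str:
    """Return 'allow' on a machine match, else 'block_pending_review'."""
    score = cosine_similarity(selfie_embedding, on_file_embedding)
    if score >= MATCH_THRESHOLD:
        return "allow"
    # No machine match: account is temporarily blocked while the case
    # is looked into by human reviewers.
    return "block_pending_review"

# Toy embeddings: a similar vector (same person) vs. a dissimilar one
on_file = np.array([0.9, 0.1, 0.4])
same_person = np.array([0.88, 0.12, 0.38])
changed = np.array([0.1, 0.9, 0.2])

print(selfie_check(same_person, on_file))  # similar vectors -> "allow"
print(selfie_check(changed, on_file))      # dissimilar -> "block_pending_review"
```

The point the sketch makes is structural: everything hinges on a single similarity score crossing a threshold, so any systematic drop in that score for a group of faces, or after a change in appearance, translates directly into blocked accounts.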
This, according to Uber, prevents fraud and protects drivers' accounts from being compromised. It also protects riders by building another layer of accountability into the app to ensure the right person is behind the wheel.
But Uber's use of facial recognition for a driver identity system is also being challenged in the UK, where a union has called for Microsoft to suspend the ride-hailing giant's use of the technology.
The union said it has identified seven cases of "failed facial recognition and other identity checks" in the UK leading to drivers losing their jobs and licence revocation action by Transport for London (TfL).
Uber says if there's no machine match, the system sends the query to a three-person human review panel to conduct a manual check. In one instance, Uber blamed human error on the part of its manual review team, which failed to identify a bearded man's selfie as the same person in the clean-shaven photo Uber held on file.
However, it didn't say what happened in the other five identity check failures, nor did it reveal the ethnicities of the seven drivers who were misidentified. Why ethnicity? Read on to know what I'm getting at.
In June 2020, Microsoft banned the sale of its facial recognition technology to US police in the wake of the Black Lives Matter protests against law enforcement brutality and bias. Research found that face analysis was less accurate for people with darker skin tones and false matches could lead to wrongful arrests.
According to a 2018 MIT study, Microsoft's system can have an error rate as high as 20 per cent for dark-skinned women. And although it's more accurate than the competition, the error rate on darker subjects (12.9 per cent) is roughly 18 times higher than that on lighter individuals (0.7 per cent).
Illumination is of particular importance when doing an evaluation based on skin type. Default camera settings are often optimized to expose lighter skin better than darker skin.
In September 2020, there was a lot of anger surrounding Twitter's image cropping algorithm that seemingly had a bias towards fair-skinned people.
When users submit pictures that are too tall or too wide for the layout, Twitter automatically crops them to roughly a square. Turns out, when such images include two or more people and they've got different coloured skin, the crop picks the lighter face.
Twitter explained that its machine learning algorithm relies on saliency (a measure that predicts where people might look first in an image) and added that it's exploring ways to give users "control over what their images will look like in a Tweet".
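The mechanics of a saliency-based crop can be sketched with a toy example. This is an illustrative sketch only: the hand-made saliency map below stands in for the output of Twitter's actual machine learning model, and the cropping logic is a simplified assumption, not Twitter's implementation:

```python
# Toy sketch of saliency-based auto-cropping: the crop window is centred
# on the highest-scoring pixel of a saliency map. If a model scores one
# face higher than another, the other face is simply cropped out.
import numpy as np

def saliency_crop(image: np.ndarray, saliency: np.ndarray, size: int) -> np.ndarray:
    """Crop a size x size window centred on the most salient pixel."""
    y, x = np.unravel_index(np.argmax(saliency), saliency.shape)
    half = size // 2
    # Clamp the window so it stays inside the image bounds
    top = min(max(y - half, 0), image.shape[0] - size)
    left = min(max(x - half, 0), image.shape[1] - size)
    return image[top:top + size, left:left + size]

# Toy 10x10 "image" with two regions of interest; the saliency map
# scores the top-right region higher, so the crop lands there and the
# bottom-left region is discarded.
image = np.arange(100).reshape(10, 10)
saliency = np.zeros((10, 10))
saliency[1, 8] = 0.9  # higher score wins the crop
saliency[8, 1] = 0.6
crop = saliency_crop(image, saliency, size=4)
print(crop.shape)  # (4, 4)
```

The design choice this exposes is the source of the controversy: a winner-takes-all crop around the saliency peak means any group the model systematically scores lower gets systematically cropped out.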
A 2019 analysis published in Nature found significant racial bias in decision-making software used by US hospitals. The study concluded that the algorithm was less likely to refer black people than equally sick white people to programmes that aim to improve care for patients with complex medical needs.