Challenge Uber's discriminatory use of facial recognition systems
This case is raising funds for its stretch target. Your pledge will be collected within the next 24-48 hours (and it only takes two minutes to pledge!)
Uber's facial recognition system turns lives upside down
Pa Edrissa Manjang and Imran Javaid Raja worked for Uber as a courier and a private hire driver respectively. Both were dismissed unfairly after Uber's facial recognition identification system, known as Real Time ID (RTID), failed to identify them.
Drivers and couriers working for Uber are asked to submit real-time selfies, which the system matches against a reference photo. If the system fails to make a match, workers are dismissed and accused of fraudulent activity on the app, on the assumption that they have allowed other people to access their accounts.
Facial recognition software is known to be highly inaccurate in identifying people of colour, yet Uber is deploying this technology on a vulnerable workforce largely made up of individuals from ethnic minority backgrounds.
Workers lose livelihood without human review
Uber reported the dismissal of Mr Raja to Transport for London (TfL), which immediately revoked his private hire driver and vehicle licences without investigation. This meant he was banned from working for any private hire operator and could not rent out his vehicle either.
After three months, and with the assistance of the App Drivers & Couriers Union (ADCU), Uber and TfL finally accepted that they had made a mistake in his case. TfL restored his licence and Uber reactivated his account. Neither Uber nor TfL has ever apologised or offered to pay compensation for lost wages or legal costs.
Uber also dismissed Mr Manjang after a series of RTID checks that it claimed it could not verify. He was not given a chance to appeal, and his request for a human review of the photos he submitted was denied.
Companies know facial recognition fails people of colour
Uber uses Microsoft's Face API facial recognition software as part of its RTID checks. The system is known to have serious accuracy problems, particularly with people of colour.
A 2018 study by the Massachusetts Institute of Technology found that three facial recognition programs (including the Microsoft software used by Uber) produced identification errors at a rate of 0.8% for men with light skin. For women with dark skin, however, the error rate rose to between 20% and 34%.
Microsoft, IBM and Amazon all discontinued or withdrew sales of their facial recognition products to US police departments last year, following widespread protests over police brutality and racial discrimination. Microsoft President Brad Smith said at the time: "we will not sell facial-recognition technology to police departments in the United States until we have a national law in place, grounded in human rights, that will govern this technology".
A vulnerable workforce
According to TfL reports, 94% of private hire drivers in London are from a BAME background, and the picture is likely similar across the UK.
Uber has a record of consistently denying basic rights to its workers stretching back almost a decade. Given that record, we say Uber should not be allowed to use facial recognition software in the UK against a vulnerable workforce already at risk of exploitation and human rights violations. Its use is entirely disproportionate and unnecessary.
The App Drivers and Couriers Union has written to Microsoft to raise concerns about the use of Microsoft's Face API. In response, Microsoft stressed that parties deploying such technologies have responsibilities, including: "incorporating meaningful human review to detect and resolve cases of misidentification or other failure; to provide support to people who believe their results were incorrect; and to identify and address fluctuations in accuracy due to variation in conditions."
It is clear to us that Uber has not implemented any of these measures.
What case are we bringing?
Claims have been initiated against Uber in the Central London Employment Tribunal for harassment related to race, victimisation and indirect race discrimination. Pa and Imran (and indeed all couriers and drivers) are entitled to protection and remedy against Uber under the Equality Act 2010.
We believe Uber has knowingly used automated facial recognition systems known to have a high error rate for people of colour, yet has relied on the results of these systems to dismiss us from our jobs without an effective right of appeal before taking that decision. Uber's governance of these systems is wholly inadequate and its use of them disproportionate. We believe our right to freedom from discrimination has been violated, and the continued use of these systems poses a threat to the many more workers who are subjected to them.
Call to action
Any donations you make will contribute to the legal costs of these cases. Pa and Imran are grateful to the Equality and Human Rights Commission for providing financial support. The Commission has the ability to fund and intervene in important cases that align with its objectives, often providing invaluable support in cases such as these. The Commission's funding has helped pay for legal representation to get these claims off the ground, and it may make additional contributions as the cases progress.
This case is also supported by the App Drivers and Couriers Union (ADCU) and Worker Info Exchange, a digital rights NGO. The ADCU is a grassroots, craft trade union for drivers and couriers, run by drivers and couriers. The experience of Pa and Imran featured in this case is unfortunately shared by many, many people the union has represented over the last year, particularly in licensing appeals at the Magistrates' Court and against Uber in the Netherlands under data protection law. The ADCU now seeks to tackle the root of the problem: racial discrimination in the reckless use of error-prone technology against a minority workforce.
What are we trying to achieve?
The gig economy is the thin end of the wedge for a future world of work where management decisions are automated and discrimination is embedded in algorithmic function. Today it is marginalised workers in the gig economy. But tomorrow it could be you, if we allow the mainstreaming of such harmful technologies to proceed unchecked.
What is the next step in the case?
We are represented by Bates Wells, with Chris Milsom of Cloisters instructed as Counsel. The claims have been filed with the Central London Employment Tribunal.