In a recent incident, Amazon’s face recognition technology received a lot of flak after it erroneously matched 28 US members of Congress with criminal mugshots. Rekognition, Amazon’s facial recognition tool introduced in 2016, delivered disappointing results as it identified these public figures as people who had been arrested for a crime.
The test, conducted by the ACLU of Northern California to check the tool’s accuracy in facial recognition, covered a wide range of faces: men and women, Republicans and Democrats, legislators of all ages from across the country. The ACLU said in a statement that it had used the exact same facial recognition system that Amazon offers to the public, which anyone can use to scan for matches between images of faces. Running the entire test cost them a meager $12.33.
What drew curious eyes to the case was that, despite being one of the biggest names in the tech industry, Amazon’s Rekognition failed to deliver. The result has raised serious concerns about the legitimacy and accuracy of facial recognition as a technology.
What Is Rekognition?
Introduced in November 2016, the image recognition software by Amazon promised to analyse billions of images and videos daily to recognise objects, people, text and inappropriate content. The website also claimed to provide highly accurate facial analysis and facial recognition on images and videos supplied by the user, and to be capable of detecting, analysing and comparing faces for a wide variety of user verification and public safety use cases.
Amazon Rekognition is based on highly scalable deep learning technology developed by Amazon’s computer vision scientists. Its claimed benefits include simple integration through easy-to-use APIs that require no machine learning expertise, continuous learning, and low cost.
According to the company, Rekognition can identify people in real time by instantaneously searching databases containing millions of faces. It also claimed that the system can identify faces in group photos, crowds and public places such as airports.
What Went Wrong When ACLU Used Rekognition For Surveillance And Recognition
In a recent instance, the ACLU scanned the faces of all 535 members of Congress against a database of 25,000 publicly available mugshots using Amazon’s open Rekognition API, essentially to test the system’s accuracy. Each sitting member of Congress was searched against this database using the default match settings that ship with Amazon Rekognition.
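The shape of the test can be sketched in a few lines of Python. Note that the `similarity` function and the sample scores below are hypothetical stand-ins, not Rekognition’s actual API; only the overall one-to-many search procedure and the 80 percent default threshold (mentioned later by Amazon) mirror the test.

```python
# Illustrative sketch of an ACLU-style one-to-many face search.
# The similarity scores here are invented for illustration.

DEFAULT_THRESHOLD = 80.0  # Rekognition's default match setting

def best_match(probe_face, mugshot_db, similarity, threshold=DEFAULT_THRESHOLD):
    """Return (score, mugshot) for the highest-scoring mugshot at or
    above the threshold, or None if nothing clears it."""
    scored = [(similarity(probe_face, m), m) for m in mugshot_db]
    score, match = max(scored)
    return (score, match) if score >= threshold else None

# Toy data: similarity is a lookup in a precomputed score table.
scores = {("member_a", "mug_1"): 83.0, ("member_a", "mug_2"): 41.0,
          ("member_b", "mug_1"): 62.0, ("member_b", "mug_2"): 55.0}
sim = lambda face, mug: scores[(face, mug)]

for member in ("member_a", "member_b"):
    print(member, "->", best_match(member, ["mug_1", "mug_2"], sim))
# member_a -> (83.0, 'mug_1')   <- a "match" at the default threshold
# member_b -> None
```

Under the default settings, any mugshot scoring above 80 would be surfaced as a candidate match, which is how legislators who had never been arrested could end up paired with mugshots.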
Unfortunately, the ACLU’s test produced incorrect identifications, along with indications of racial bias, an underlying problem of many facial recognition systems that exist today. It was reported that 11 of the 28 falsely matched people were people of colour, which amounts to 39 percent of the total, even though people of colour make up only about 20 percent of Congress. Facial recognition systems in general also show especially high error rates for women and African-Americans.
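The disparity above is straightforward arithmetic on the figures reported in the test (the 20 percent figure is the approximate share of people of colour in Congress):

```python
# Arithmetic behind the reported disparity in the ACLU test.
false_matches = 28            # members of Congress falsely matched
false_matches_of_colour = 11

share_of_false_matches = false_matches_of_colour / false_matches
share_of_congress = 0.20      # people of colour: ~20% of Congress

print(round(share_of_false_matches * 100, 1))               # 39.3
print(round(share_of_false_matches / share_of_congress, 1)) # 2.0
```

In other words, people of colour were roughly twice as likely to appear among the false matches as their share of Congress would predict.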
“Our test reinforces that face surveillance is not safe for government use,” Jacob Snow, a technology and civil liberties attorney at the ACLU Foundation of Northern California, said in a statement. “Face surveillance will be used to power discriminatory surveillance and policing that targets communities of colour, immigrants, and activists. Once unleashed, that damage can’t be undone.”
Rekognition had earlier delivered results for the Washington County Sheriff’s Department in Oregon, where it was used to match images against a database of 300,000 mugshots.
In a letter addressed to CEO Jeff Bezos, the ACLU clearly expressed the need to protect civil rights and liberties and to safeguard communities. Citing probable harm to people of colour, undocumented immigrants and protesters, the letter demanded that Amazon stop powering a government surveillance infrastructure that poses a grave threat to customers and communities across the country. “Amazon should not be in the business of providing surveillance systems like Rekognition to the government,” the letter said.
It noted that Amazon’s Rekognition product runs counter to the values the company has previously expressed, such as supporting First Amendment freedoms and opposing discriminatory bans on Muslims. In the letter, the ACLU also urged Amazon to act swiftly to stand up for civil rights and civil liberties, including those of its own customers, and to take Rekognition off the table for governments.
The ACLU strongly argues that people should be able to walk freely on the streets without being watched by the government, and that technologies like faulty facial recognition systems threaten this freedom.
Keeping these concerns at the focal point, as many as 70 civil rights groups, 400 members of the academic community and more than 150,000 members of the public have already spoken up to demand that Amazon stop providing face surveillance to the government.
The company reportedly said that poor calibration was the culprit behind the chaos, while also suggesting that the technology has many useful purposes, including finding abducted people or lost children.
Amazon also said that the ACLU’s test was performed using Rekognition’s default confidence threshold of 80 percent, which, according to the company, is suited to identifying objects such as animals or chairs, not human faces. The company said the recommended threshold for law enforcement applications is around 95 percent.
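Amazon’s point about thresholds can be illustrated with a short sketch. The candidate scores below are invented for illustration; the two thresholds are the 80 percent default and the roughly 95 percent level Amazon recommends for law enforcement:

```python
# How the confidence threshold changes what counts as a "match".
# Hypothetical similarity scores for one probe face against a mugshot set.
candidate_scores = [97.2, 93.5, 86.0, 81.4, 79.9]

def matches_at(threshold, scores):
    """Keep only candidates scoring at or above the threshold."""
    return [s for s in scores if s >= threshold]

print(matches_at(80.0, candidate_scores))  # default: [97.2, 93.5, 86.0, 81.4]
print(matches_at(95.0, candidate_scores))  # law-enforcement level: [97.2]
```

At the 80 percent default, four borderline candidates surface as matches; raising the bar to 95 percent discards all but the strongest, which is why the choice of threshold matters so much for false positives.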
Whether or not the company is at fault, the bottom line is that the episode has stoked fears of racial bias. It is high time that such errors are corrected and laws are enacted to govern the use of technologies such as face recognition. The technology should not be deployed until its potential harms are considered and the necessary steps are taken to prevent it from harming vulnerable communities. Companies bringing these technologies to market must act responsibly and ensure they are correctly deployed in the communities they affect.