Professionalism/Amazon, Rekognition, and Law Enforcement
Amazon’s Rekognition software belongs to a family of AI technologies known as computer vision, in which computers are trained to interpret the visual world around them. These systems compare a target image or video against databases of labeled reference images or videos, a process powered by deep learning: models trained on such databases learn to identify objects and respond accordingly. Computer vision is a recent field, made possible by the proliferation of image-capturing devices and by increases in computing power and data capacity. It can also be used to detect and identify people, including their movement paths and other details, and it has widespread applications in the medical, manufacturing, and security industries.
Amazon’s facial recognition service is Amazon Rekognition. Running on Amazon’s own AWS cloud computing platform, it offers a scalable suite of computer vision features to both large and small customers. Its capabilities include object, scene, and activity detection; facial recognition; facial analysis; pathing; unsafe (NSFW) content detection; and text extraction from images. It can analyze and filter huge amounts of information in short periods of time. Its computer vision algorithms can be run against either user-created or Amazon-generated databases, letting customers tune the system’s accuracy to their particular use cases. Among Rekognition’s biggest customers are law enforcement agencies: fast recognition algorithms can help identify individuals in cases such as kidnappings and track criminal suspects, and computers are more reliable for repeat offenders, for whom the database of images is larger.
Washington County Sheriff's Office
Amazon has been working to modify the Rekognition technology to better suit police forces, and has sold it to the Washington County Sheriff’s Office in Oregon. The police use the technology to identify persons of interest. Before its adoption, persons of interest were identified by on-duty police officers; if a particular officer was off-duty when a facial recognition breakthrough occurred, the process was slowed. Rekognition allows faces to be run through a database to find a match much faster than before.
Amazon’s Rekognition technology is not always accurate. Amazon recommends that police forces obtain at least a 95% confidence match before proceeding with an arrest. The databases used by Rekognition must be expanded to increase the likelihood that the software will make a correct match.
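The 95% rule above amounts to a simple filter over the matches a face search returns. The sketch below is a hypothetical helper, not Amazon's implementation; the response shape mirrors the `FaceMatches` list returned by the AWS SDK's `search_faces_by_image` call against a face collection, and the sample values are invented for illustration.

```python
# Hypothetical helper: keep only face matches that meet a minimum
# similarity threshold before treating a match as actionable.
# MIN_SIMILARITY follows the 95% figure Amazon recommends to police.

MIN_SIMILARITY = 95.0

def actionable_matches(response, threshold=MIN_SIMILARITY):
    """Return the face matches at or above the similarity threshold."""
    return [m for m in response.get("FaceMatches", [])
            if m["Similarity"] >= threshold]

# Abbreviated sample in the shape of a search_faces_by_image response;
# the IDs and similarities are illustrative, not real data.
sample_response = {
    "FaceMatches": [
        {"Similarity": 99.1,
         "Face": {"FaceId": "a1", "ExternalImageId": "mugshot-104"}},
        {"Similarity": 87.4,
         "Face": {"FaceId": "b2", "ExternalImageId": "mugshot-238"}},
    ],
}

print([m["Face"]["ExternalImageId"] for m in actionable_matches(sample_response)])
# prints ['mugshot-104']
```

Lowering the threshold returns more candidates at the cost of more false matches, which is precisely the trade-off behind the accuracy criticisms that follow.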
BodyWorn is a facial recognition technology that utilizes the body cameras worn by police officers. The body cameras take footage that is immediately sent to the precinct. The precinct can then run the footage through the facial recognition software in hopes of finding a match. This decreases the length of time that it would take for a police investigation to find a person of interest.
Amazon’s Rekognition online dashboard offers a Celebrity Recognition feature. Users may upload photos of celebrities and have them matched with a degree of certainty. Uploaded photos that are closer to a portrait have a higher degree of certainty. A portrait of Tony Bennett had a higher degree of certainty (91%) than a photo of him after winning the Final Four game against Purdue (67%).
Some celebrities look somewhat alike, but Rekognition can still distinguish them. It correctly labeled an uploaded photo of Katy Perry and Zooey Deschanel side by side with certainty at or above 80%. Although the two look similar, there are thousands of pictures of each on the internet. Actor Will Ferrell and drummer Chad Smith also resemble one another, yet Rekognition distinguished them with 100% certainty. Like Perry and Deschanel, the two are pictured widely across the web. Rekognition compares uploaded photos to databases full of pre-labeled celebrity images, so it is no surprise that these images were labeled correctly: the uploads were taken directly from Google, and some of the images in the comparison databases may be the exact ones that were uploaded. For an exact match, 100% certainty makes sense.
Rekognition may succeed at comparing popular images, but that success may depend on the celebrity’s popularity. The algorithm has promising applications, but it may rely on features that are not always present. Ferrell and Smith were distinguished correctly, yet only small features differentiate them: the two have different hairlines, tend to dress differently, and often have different facial hair, and Smith often wears backwards hats whereas Ferrell does not. In an uploaded image of Ferrell and Smith wearing the same clothes, Rekognition labeled both as Chad Smith. When two people share a similar facial structure, wearing the same clothes and hats can erase the remaining differences between them and produce inaccurate results.
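The celebrity experiments above can be summarized programmatically. The sketch below is a hypothetical summary function, not Amazon's code; the response shape mirrors the `CelebrityFaces` list returned by the AWS SDK's `recognize_celebrities` call, and the names and confidence values are illustrative stand-ins for the examples discussed above.

```python
# Hypothetical helper: summarize a RecognizeCelebrities-style response
# as (name, confidence) pairs, dropping labels below a confidence floor.

def confident_labels(response, floor=80.0):
    """Return (Name, MatchConfidence) for celebrities at or above the floor."""
    return [(c["Name"], c["MatchConfidence"])
            for c in response.get("CelebrityFaces", [])
            if c["MatchConfidence"] >= floor]

# Abbreviated sample in the shape of a recognize_celebrities response;
# the confidences are invented for illustration, not measured values.
sample_response = {
    "CelebrityFaces": [
        {"Name": "Katy Perry", "MatchConfidence": 86.0},
        {"Name": "Zooey Deschanel", "MatchConfidence": 81.0},
        {"Name": "Chad Smith", "MatchConfidence": 62.0},
    ],
    "UnrecognizedFaces": [],
}

print(confident_labels(sample_response))
# prints [('Katy Perry', 86.0), ('Zooey Deschanel', 81.0)]
```

As the Ferrell/Smith example shows, a label that clears the floor can still be wrong; the floor only filters low-confidence guesses, it does not guarantee correctness.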
The American Civil Liberties Union (ACLU) conducted a study in which images of lawmakers were run through the Rekognition software. They found that the software misidentified 28 lawmakers. Of these 28 misidentifications, the majority of them were African-American and Latino. In some of the cases, the lawmakers were matched with images of people who had previously been arrested.
Three of the lawmakers who were misidentified contacted Jeff Bezos to voice their concerns. They emphasized that the 5% error rate among known lawmakers indicates that the Rekognition software has issues and should not be sold to law enforcement any time soon. They requested additional information on how Amazon tests the Rekognition technology and who Amazon's government customers are to ensure that injustices are not occurring due to the inaccuracies of the technology.
The American Civil Liberties Union (ACLU) has written a letter to Amazon’s CEO, Jeff Bezos, expressing their concerns regarding Rekognition. In the letter, the ACLU discussed the frequent misidentification of people of color by the Rekognition software. They fear that the government could use the software to remove freedom from already over-policed communities of color. They discuss that the Rekognition software would allow the government to “continuously track immigrants as they embark on new lives” and “identify political protesters captured by officer body cameras.”
One major issue that the ACLU had with Amazon was their connection to U.S. Immigration and Customs Enforcement (ICE). Amazon has been in contact with ICE regarding Rekognition. The ACLU argues that if ICE has the Rekognition technology in their toolbelt, it will make it much easier for them to target and separate families living in the U.S. The Rekognition software could theoretically have a database of all illegal immigrants in the U.S. and compare any individual that comes into contact with the software to the database.
The ACLU used other companies to show that facial recognition technology is highly flawed. In the letter, they discussed how both Google and Microsoft have acknowledged the risks with facial recognition software and do not intend to market it until these risks are mitigated. Other organizations signed the letter to demonstrate solidarity against Rekognition being sold to law enforcement. Two of the organizations include the Muslim Justice League and the Center on Policy Initiatives.
Failure to Comply with Standards
The National Institute of Standards and Technology (NIST) is a government organization that sets standards for technology in the US, including for facial recognition and gender classification algorithms. Its ongoing Face Recognition Vendor Test (FRVT) evaluates 127 algorithms from 45 vendors to set facial recognition accuracy standards and verify that all are performing accordingly. Although the Rekognition service has been available since 2016, Amazon is still not a NIST vendor. Given Amazon’s reputable computing infrastructure, users had no reason to suspect the service was inaccurate; the company’s brand name drove user trust in the system even though it went unverified against NIST standards for three years.
Free Comparison Databases
Social media content-sharing sites (Facebook, Instagram, Twitter, etc.) allow free data to be harvested to populate recognition comparison databases. Nothing stops someone from saving Facebook photos and building their own database of human images against which to compare other faces. Camera footage run against some of these databases may be skewed: comparing footage to databases of mugshots produces results only for those previously convicted, while another approach is to compare footage against images of random people in hopes of finding a match. Georgetown researchers found that about half of all American adults have their face in a comparison database. These databases may produce inaccurate results. Security camera footage is often blurry, which prevents facial features from being sharpened; when such footage is compared with these databases, a match may occur regardless of its validity. Innocent people may be drawn into an investigation or convicted of a crime they did not commit, tarnishing their reputation.
Algorithmic Justice League
Letter to Bezos
The Algorithmic Justice League (AJL) was started by Joy Buolamwini in response to flaws in the Rekognition service. The group wrote to Bezos in 2018 to explain that the service has significant trouble recognizing minorities: some of the features used to identify people (skin color, hair color, etc.) did not serve minority groups well. In gender classification, even random guessing yields a 50% chance of labeling an image correctly; facial recognition faces much longer odds, since there is only a small chance that a compared face will even be in a database. Law enforcement would be using this technology not to classify males and females but to establish identities. The AJL urged Bezos to stop working with law enforcement, as the technology can do more harm than good when people are misidentified.
Safe Face Pledge
The AJL started the Safe Face Pledge, which forbids selling facial recognition for police use and requires that any government use be fully transparent, meaning non-skewed databases that represent all races and genders equally. The AJL urged Amazon to sign the pledge, stop working with law enforcement, and work with NIST to produce accurate results. As of May 2019, Amazon continues to work with law enforcement. It is not a NIST vendor, claiming that the Rekognition infrastructure cannot be downloaded because it is part of AWS, but it is open to working with NIST to develop better benchmarks and testing of external APIs.
- https://aws.amazon.com/rekognition/
- https://aws.amazon.com/blogs/machine-learning/thoughts-on-recent-research-paper-and-associated-article-on-amazon-rekognition/