Amazon’s Rekognition could be the path toward authoritarianism
George Orwell’s “Big Brother” has arrived, courtesy of Amazon. The e-commerce tech giant is selling facial recognition technology to police departments — something not well known, and for good reason. This image analysis service, called Rekognition, is deeply flawed and deeply dangerous.
While Rekognition has been used for harmless indulgence in celebrity culture, like identifying celebrity wedding guests as they arrived at Windsor Castle for the royal wedding, the program has several functions and can be adjusted by developers to suit a given company's needs. Beyond facial recognition, these functions include pathing (tracking an object across frames), facial analysis (inferring attributes such as emotion from a face) and deciphering text in images otherwise illegible to the naked eye. This technology's adaptability, combined with the absence of government regulation, means companies and law enforcement can use Rekognition however they choose.
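To make the facial-analysis function concrete: Rekognition's DetectFaces API returns per-face emotion scores that developers can consume however they like. The sketch below is a hypothetical illustration, not Amazon's code; the helper function and sample data are invented, and only the response shape follows AWS's published documentation (a real call would go through `boto3.client("rekognition").detect_faces(...)`).

```python
# Sketch: pulling the highest-scoring emotion from a Rekognition
# DetectFaces response. The sample data below is hypothetical but
# mirrors the documented response shape.

def top_emotion(face_detail):
    """Return the (type, confidence) pair Rekognition scored highest,
    or None if no emotions were returned."""
    emotions = face_detail.get("Emotions", [])
    if not emotions:
        return None
    best = max(emotions, key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

# Hypothetical fragment in the documented DetectFaces shape.
sample_face = {
    "Emotions": [
        {"Type": "CALM", "Confidence": 62.1},
        {"Type": "HAPPY", "Confidence": 30.4},
        {"Type": "CONFUSED", "Confidence": 7.5},
    ]
}

print(top_emotion(sample_face))  # ('CALM', 62.1)
```

The point is how little friction there is: a few lines turn a camera feed into structured judgments about a person's face, which is exactly the adaptability the column describes.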
A study published in August by MIT found that Rekognition identified white men with 100 percent accuracy, which dropped to 68.6 percent when identifying women of color. In January, researchers at MIT conducted another study, which found that Rekognition mistook women for men 19 percent of the time and mistook darker-skinned women for men 31 percent of the time. A study conducted by the ACLU of Northern California found Rekognition incorrectly matched the faces of 28 members of Congress with individuals arrested for a crime; among the misidentified were six members of the Congressional Black Caucus.
In the face of these studies and numerous calls from prominent artificial-intelligence researchers, civil rights groups and its own shareholders, Amazon simply chose to brush the findings off as “false” and “misleading” while refusing to present its own studies or to submit the system to the National Institute of Standards and Technology for evaluation.
Rekognition has the capacity to automate the identification and tracking of any individual; if outfitted for body camera usage, police could effectively become walking surveillance cameras. But you don’t have to take my word for it: one need only look to Orlando’s Rekognition pilot program, which has installed cameras that scan hundreds of thousands of faces daily across the city. More concerning, we don’t know how this technology is being used.
Although Orlando may be testing the pilot program on its own officers for now, the city will eventually broaden its functions while remaining free from legal restrictions. Other law enforcement departments implementing Rekognition haven't detailed how they employ the software, and the absence of laws in this area may mean they're not obliged to disclose anything.
While a deputy for the Washington County Sheriff's Office stated officers were trained not to rely solely on the software, misidentifying persons of color as a threat has long been an issue, even without the racial biases of Rekognition. This technology risks the false imprisonment of minorities and women while simultaneously violating our right to privacy.
Amazon insists there have been no reports of misuse, but one wouldn't expect law enforcement agencies to self-report their civil rights violations, especially when no laws govern facial-recognition technology. When requested by the ACLU, neither Orlando nor Washington County could produce records showing their communities were provided a forum to discuss Rekognition before its implementation or rules outlining how Rekognition could be used while ensuring the protection of rights.
In accordance with its business model — sell cheap and eliminate competitors — Amazon is pushing Rekognition, expanding its marketing of the product to organizations from police departments to the Immigration and Customs Enforcement (ICE) agency. Considering the current administration’s approach to migrants at the southern border, it’s not far-fetched to say Rekognition may be utilized by ICE for nefarious purposes.
Without government regulation, we won’t know how Rekognition is being used, when we’re being watched or when our privacy is being violated. The ambiguity of it all makes it increasingly likely that these things are already happening. Big Brother could already be, and likely is, watching.
Written by: Hanadi Jordan — email@example.com
Disclaimer: The views and opinions expressed by individual columnists belong to the columnists alone and do not necessarily indicate the views and opinions held by The California Aggie.