Citing potential civil and human rights risks, Amazon shareholders on Thursday drafted a letter pushing Amazon to stop selling its facial recognition software, Rekognition, to government agencies. The shareholder group, which owns over $1.32 billion worth of Amazon assets, asks the company to halt sales at least until the board has determined that the technology doesn’t pose risks to civil and human rights.
The resolution, backed by the shareholders, was initiated by Open MIC – a non-profit organization focused on corporate accountability – and was filed by the Sisters of St. Joseph of Brentwood. “It’s a familiar pattern: a leading tech company marketing what is hailed as breakthrough technology without understanding or assessing the many real and potential harms of that product,” said Open MIC Executive Director Michael Connor in a statement. “Sales of Rekognition to government represent a considerable risk for the company and investors. That’s why it’s imperative those sales be halted immediately.”
The American Civil Liberties Union first raised concerns about Rekognition, reporting that the software showed racial bias in a test the organization conducted last year. Later that year, in November, Amazon employees came forward questioning the use of Rekognition at a company staff meeting. Now that shareholders have joined in voicing the same concerns, the pressure could have a significant impact on the company’s sales of facial recognition software.
The letter says the investors are concerned about Amazon’s software being used “to justify the surveillance, exploitation, and detention of individuals seeking to enter the U.S., posing human rights risk.” It also points to Amazon’s plans to expand the software’s reach amid these controversies: according to the letter, Amazon is in the process of pitching Rekognition to U.S. Immigration and Customs Enforcement as well as the FBI.
Amazon, however, doesn’t appear to agree. Matt Wood, General Manager of Artificial Intelligence for Amazon Web Services, has taken the opposite position, defending the software’s benefits in blog posts. “We have seen customers use the image and video analysis capabilities of Amazon Rekognition in ways that materially benefit both society (e.g. preventing human trafficking, inhibiting child exploitation, reuniting missing children with their families, and building educational apps for children), and organizations (enhancing security through multi-factor authentication, finding images more easily, or preventing package theft),” he wrote in a June post.
He continued the argument in a more recent post, acknowledging that “there have always been and will always be risks with new technology capabilities,” but arguing, “… it is the wrong approach to impose a ban on promising new technologies because they might be used by bad actors for nefarious purposes in the future.”