Amazon and ICE: Recognizing Corporate Bias

Three weeks ago, the story broke that Amazon employees had met with Immigration and Customs Enforcement (ICE) officials to discuss the agency's possible use of Amazon's facial recognition software. This isn't the first time law enforcement has shown interest in Amazon's Rekognition program: documents obtained by the ACLU showed that several city police departments have already bought and deployed the system. As Amazon employees have begun to protest the company's actions, the episode joins a growing list of controversies over tech giants working with law enforcement and the military. Google recently banned the use of its AI in any military capacity following a series of resignations over a contract with the Department of Defense.

Understandably, this presents a major concern to the public. Americans have long feared the implications of increased police surveillance, and those concerns are not unwarranted. An investigation by the Associated Press found 665 instances of officers misusing confidential database information between 2013 and 2015. That record, combined with media coverage of heavily armed police suppressing the Ferguson protests, can easily stir fears of an Orwellian police state.

However, critics often fail to recognize the positive aspects of this technology. A more accurate means of identifying suspects in video could lead to a massive decrease in racial profiling and false convictions. If AI can diagnose patients more accurately and prevent misdiagnoses, it isn't hard to believe that technologies like Rekognition could identify the right suspect and prevent unnecessary arrests. The fear of police forces abusing these technologies is valid, but the possibility of reducing racial profiling in a nation with a long history of discriminatory law enforcement presents a strong counterargument.

Despite the concerns over these proposals, one issue seems to go unrecognized: should a private company like Amazon be allowed to lease these technologies to government agencies at all? In an era when the data of millions has been shown to be misused, from the Cambridge Analytica scandal over Facebook profiles to the recently revealed security hole that exposed the personal information of Google+ users, should a private company be entrusted to host the biometric data of Americans? The potential for corporate misuse and leaks presents an unnecessary danger for a technology that does not need to be privately sourced to the government.

Nor is the corporation the only concern; so is the man who owns it. Jeff Bezos is considered the wealthiest man in the world. Already well connected to government agencies through Amazon Web Services, a cloud computing platform leased to organizations such as the CIA and the Department of Defense, Bezos shows how easily billionaires can become influential in a nation without ever running for office. One can only hope that people like Bezos hold apolitical intentions when working with government, but in recent years one can look to countries like Russia and Brazil to see how the 1% of the 1% use such ties for their own monetary and political ambitions. When looking at the talks between Amazon and ICE, the question is not whether the ends justify the means, but who owns the means in the first place.


Sam Carroll is a Lowcountry native and Sociology major in his sophomore year. Besides being a staff writer for CisternYard Media, he is the current vice president of Delta Tau Delta fraternity as well as the vice president for College of Charleston’s nonexistent shag club.
