Google continues to show us why it chose to abandon its old motto of “Don’t Be Evil,” as it becomes more and more enmeshed with the military-industrial complex. Most recently, Google removed four key commitments from its AI principles. The principles previously stated that the company would not pursue AI applications involving (1) weapons, (2) surveillance, (3) technologies that “cause or are likely to cause overall harm,” and (4) technologies whose purpose contravenes widely accepted principles of international law and human rights.
Those commitments are gone now.
In their place, the company has written that “democracies” should lead in AI development and that companies should work with governments “to create AI that protects people, promotes global growth, and supports national security.” This suggests that the provider of the world’s largest search engine (the tool most people use to find the best apple pie recipes and to check what time their favorite coffee shop closes) could be in the business of creating AI-based weapons systems and lending its considerable computing power to surveillance.
Privacy First: A Better Way to Address Online Harms
The truth is that many of the ills of today’s internet have one thing in common: they are built on a system of corporate surveillance. Companies large and small collect data about where we go, what we do, what we read, who we communicate with, and so on. They use this data in multiple ways and, if it suits their business model, may sell it to anyone who wants it, including law enforcement. Addressing this shared reality will promote human rights and civil liberties better than many of the issue-specific bills we’ve seen over the past decade, while still holding space for free expression, creativity, and innovation.
In other words, whatever online harm you want to alleviate, you can address it more effectively, and with broader impact, if you do privacy first.