Google does a lot of stupid things. All big companies are the same in this regard. But doing something really scary takes special effort. That’s where Google’s Project Nimbus comes in.
Project Nimbus is a joint effort between Google, Amazon, and the Israeli government to provide futuristic surveillance capabilities through the use of advanced machine learning models. Like it or not, this is part of the future of national security, and it's no more dire than plenty of other similar projects. Many of us even use similar technology in and around our homes.
Where things get dark and ugly is what Google says Project Nimbus can do with the company's technology:
The Nimbus training documents emphasize "the 'faces, facial landmarks, emotions' detection capabilities of Google's Cloud Vision API," and in one Nimbus training webinar, a Google engineer confirmed to an Israeli customer that "it is possible to process data through Nimbus in order to determine if someone is lying."
Yes, the company that gave us the spectacularly bad YouTube algorithm now wants to sell algorithms that determine whether someone is lying to the police. Let that sink in. It's a science Microsoft has abandoned because of its inherent problems.
Unfortunately, Google is so committed to the project that it will reportedly retaliate against employees inside the company who openly oppose it.
I won't get into politics here, but the whole project was designed so that the Israeli government can hide what it's doing. According to Jack Poulson, the former head of security for Google Enterprise quoted by The Intercept, one of the main goals of Project Nimbus is "preventing the German government from requesting data relating to the Israel Defense Forces for the International Criminal Court." (Under some interpretations of the law, Israel is said to be committing crimes against humanity against Palestinians.)
But really, it doesn't matter what you think about the conflict between Israel and Palestine. There is no good reason to put this technology in the hands of any government, of any size. Doing so makes Google evil.
Even if Google's Cloud Vision API were right 100% of the time, Nimbus's supposed capabilities would be horrifying. Imagine a police body camera that uses AI to help decide whether you should be arrested and jailed. And it gets far scarier when you consider how often machine-learning vision systems get things wrong.
This isn't just a Google problem, either. All you need to do is look at content moderation on YouTube, Facebook, or Twitter. Most of the initial work is done by computers running moderation algorithms that make bad decisions far too often. But Project Nimbus wouldn't just delete your snarky comment; it could cost you your life.
No company should be in the business of supplying this kind of AI until the technology has matured to a state where it is never wrong, and that is never going to happen.
Look, I'm all for catching the bad guys and doing something about them, like most people are. I understand that law enforcement, whether it's the local police department or the Israel Defense Forces, is a necessary evil. Using AI to do it is an unnecessary one.
I'm not saying Google should just stick to writing the software that powers the phone you love and never branch out. I'm just saying there's a right way and a wrong way, and Google chose the wrong way here. Now it's stuck, because the terms of the agreement don't allow Google to stop participating.
You should form your own opinions and never just listen to someone with a soapbox on the internet. But you should also recognize when a company founded on the principle of "don't be evil" turns around and becomes the very evil it warned us about.