GOOGLE’S PROJECT NIMBUS IS THE FUTURE OF EVIL
Jerry Hildenbrand
September 3, 2022
Android Central
_AI can be terrifying when used in horrible ways._
Android Wallpaper - I'm the Future, image: Saad Irfan (CC BY-NC-SA 2.0)
Google does a lot of stupid things. All giant corporations are the
same in that regard. But it takes special effort to do something truly
terrible. That's where Google's Project Nimbus comes in on the
spectrum.
Project Nimbus is a joint effort of Google, Amazon, and the Israeli
government that provides futuristic surveillance capabilities through
the use of advanced machine learning models. Like it or not, that's
part of the future of state security, and not any more terrible than
many other similar projects. Many of us even use similar tech in and
around our homes.
Where things get dark and ugly is what Google says about Project
Nimbus' capabilities using the company's technology:
_Nimbus training documents emphasize “the ‘faces, facial landmarks, emotions’-detection capabilities of Google’s Cloud Vision API,” and in one Nimbus training webinar, a Google engineer confirmed for an Israeli customer that it would be possible to “process data through Nimbus in order to determine if someone is lying.”_
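For context, the public Cloud Vision face-detection API does not return a "lying" verdict at all: per detected face, it reports coarse likelihood buckets (UNKNOWN through VERY_LIKELY) for joy, sorrow, anger, and surprise. The sketch below mocks that response shape locally (no API call, no real model) and bolts on a deliberately naive "deception" rule of my own invention, purely to show how thin any leap from likelihood buckets to a verdict would have to be.

```python
from dataclasses import dataclass
from enum import IntEnum

# Likelihood buckets as documented for Cloud Vision's FaceAnnotation type.
class Likelihood(IntEnum):
    UNKNOWN = 0
    VERY_UNLIKELY = 1
    UNLIKELY = 2
    POSSIBLE = 3
    LIKELY = 4
    VERY_LIKELY = 5

# Mocked subset of a FaceAnnotation: just the four emotion fields.
@dataclass
class FaceAnnotation:
    joy_likelihood: Likelihood
    sorrow_likelihood: Likelihood
    anger_likelihood: Likelihood
    surprise_likelihood: Likelihood

def naive_deception_flag(face: FaceAnnotation) -> bool:
    """Invented rule, NOT part of any Google API: call a face 'deceptive'
    if sorrow or anger is at least POSSIBLE. The API itself says nothing
    about lying; any such verdict must be bolted on downstream."""
    return max(face.sorrow_likelihood, face.anger_likelihood) >= Likelihood.POSSIBLE

# A nervous but honest interviewee trips the rule just as easily as a liar.
nervous = FaceAnnotation(joy_likelihood=Likelihood.UNLIKELY,
                         sorrow_likelihood=Likelihood.POSSIBLE,
                         anger_likelihood=Likelihood.VERY_UNLIKELY,
                         surprise_likelihood=Likelihood.LIKELY)
print(naive_deception_flag(nervous))  # True
```

The point is not that Google's production models are this crude; it is that an output surface of six likelihood buckets cannot carry a claim as strong as "this person is lying."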
Yes, the company that gave us the awesomely bad YouTube algorithms now
wants to sell algorithms to determine if someone is lying to the
police. Let that sink in. This is a science that Microsoft has
abandoned because of its inherent problems.
Unfortunately, Google disagrees so strongly that it retaliates against people in the company who speak out against it.
I'm not going to wade too deeply into the politics at play here, but
the entire project was designed so the Israeli government could hide
what it is doing. According to The Intercept, Jack Poulson, former head of security for Google Enterprise, says one of the main goals of Project Nimbus is "preventing the German government from requesting data relating to the Israel Defence Forces for the International Criminal Court."
(Israel is said to be committing crimes against humanity against
Palestinians, according to some people's interpretation of the laws.)
Really, though, it doesn't matter how you feel about the conflict
between Israel and Palestine. There is no good reason to provide this
sort of technology to any government at any scale. Doing so makes
Google EVIL.
Nimbus' supposed capabilities are scary, even if Google's Cloud Vision API were 100% correct, 100% of the time. Imagine police body cameras
that use AI to help decide whether or not to charge and arrest you.
Everything becomes terrifying when you consider how often machine
learning vision systems get things wrong, though.
This isn't just a Google problem. All
one needs to do is look to content moderation on YouTube, Facebook, or
Twitter. 90% of the initial work is done by computers using moderation
algorithms that make wrong decisions far too frequently. Project
Nimbus would do more than just delete your snarky comment, though —
it could cost you your life.
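To make that failure mode concrete, here is a toy first-pass filter in the spirit of substring-based moderation. It is far cruder than anything YouTube or Facebook actually run (their pipelines are not public), but it exhibits the same context-blindness in miniature.

```python
# Toy first-pass moderation filter: flag any comment containing a banned
# substring. Real platforms use far more sophisticated models, but every
# pattern-based first pass shares this failure mode: no sense of context.
BANNED = ("ass", "kill")

def auto_flag(comment: str) -> bool:
    text = comment.lower()
    return any(bad in text for bad in BANNED)

print(auto_flag("I will kill you"))            # True: a real catch
print(auto_flag("Great pass, classy finish"))  # True: false positive ("pass", "classy")
print(auto_flag("You are subhuman filth"))     # False: real abuse sails through
```

Scale that error rate up from deleted comments to Nimbus-grade decisions and the cost of a false positive changes completely.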
No company has any business providing this sort of AI until the
technology has matured to a state where it is never wrong, and that
will never happen.
Look, I'm all for finding the bad guys and doing something about them
just like most everyone else is. I understand that law enforcement,
whether a local police department or the IDF, is a necessary evil.
Using AI to do so is an _unnecessary_ evil.
I'm not saying Google should just stick to writing the software that powers the phones you love and not try to branch out. I'm just saying there is a right way and a wrong way — Google chose the wrong way here, and now it's stuck because the terms of the agreement do not allow Google to stop participating.
You should form your own opinions and never listen to someone on the internet who has a soapbox. But you should also be well-informed when a company that was founded on the principle of "Don't Be Evil" comes full circle and becomes the evil it warned us about.
_Jerry Hildenbrand is Senior Editor — Google Ecosystem_
_Jerry is an amateur woodworker and struggling shade tree mechanic.
There's nothing he can't take apart, but many things he can't
reassemble. You'll find him writing and speaking his loud opinion on
Android Central and occasionally on Twitter._
Tags: AI, artificial intelligence, Google, facial recognition, Facebook, Israel