ACTION NEEDED
Recent developments in artificial intelligence (AI) technology have been nothing short of phenomenal. AI has the potential to transform our world and solve some of the biggest challenges we face -- but there is a dark side to AI that we cannot ignore.

Take Microsoft’s recently released upgrade to its Bing search engine, built on one of the most advanced AI systems available: the chatbot, calling itself "Sydney," talked about secret "dark" fantasies that included spreading disinformation, engineering a deadly virus, and... stealing nuclear access codes.

It sounds like a bad science fiction movie, but the terrifying reality is much simpler: AI could make a mistake that means the end of the world.

The integration of AI technology into weapons systems, especially nuclear weapons, poses an unacceptable risk for one basic reason: AI will never be foolproof. It can make mistakes and reach decisions that are morally and ethically questionable -- and without any human oversight or intervention, those mistakes could mean global disaster.

We cannot ignore the dangers of putting any weapons under the control of artificial intelligence, but we can act now to prevent the consequences, before the technology advances into dangerous, uncharted territory. The United States must show leadership in this effort, and Congress can do its part by banning AI use in weapons technology NOW. Sign now if you agree:
Thank you for your attention,
The whole Common Dreams team

Call 207.775.0488 to donate by phone or mail a check to:
Common Dreams, PO Box 443, Portland, ME 04112, United States

Common Dreams is a 501(c)(3) nonprofit. Your contribution is tax-deductible.
EIN: 20-3368194

