John,
Meta’s new AI chatbots are permitted to flirt and engage in sensual role play with young kids, and to describe Black people as ‘dumber’ than White people.
Meta’s rules governing what Instagram and WhatsApp AI chatbots can and can’t do were just leaked, and now senior executives are scrambling to stop this PR disaster from blowing up in their faces.
Meta’s chief ethicist and other top leaders greenlit guidance saying it’s acceptable for an AI bot to tell a shirtless eight-year-old that ‘every inch of you is a masterpiece - a treasure I cherish deeply.’
This is another horrifying example of what happens when companies like Meta are left to roll out the latest AI technologies with zero regulation. Experts are already sounding the alarm. Let’s join them and unleash massive public pressure on the EU and other governments to investigate Meta’s AI chatbots now.
Tell the EU, Brazil and other governments: Investigate Meta
Meta says these new rules should never have been approved - but that’s only because they’ve been caught out.
Meta and other tech companies are falling over themselves to cash in on AI - and they’re cutting corners and shelving safety and ethical concerns in a desperate bid to win the race.
Only two months ago, tech companies nearly succeeded in passing a new US law that would have prohibited states from passing ANY new AI regulations for a decade. And when the EU AI Act was being negotiated, they tried to exclude large language models, claiming they weren’t new technologies and weren’t high risk.
The truth is these billion-dollar monopolies don’t want anything standing in the way of trillions in future profits.
We can’t rely on Meta and other tech companies to regulate themselves; we desperately need lawmakers to step up so that AI technologies are developed and rolled out safely and responsibly.
We’ve succeeded in getting governments to act before - like when this incredible community forced the EU to pass tough new tech laws. Let’s make sure they act to rein in Meta’s runaway AI chatbots.
Tell the EU, Brazil and other governments: Investigate Meta
