Dear Friend,
Last December, several students walked the halls of their Florida middle school, hanging their heads in humiliation, after two of their male peers shared nude images of them without their consent.
The twist: the boys generated these images with AI.
At the time, the classmates who were depicted in the forged images were between the ages of 12 and 13, and the boys who exchanged them were 13 and 14.
This is not an isolated incident. In fact, Thorn research reported that 1 in 10 kids say their friends or classmates use AI to generate nude images of other kids.
With AI growing more accessible and sophisticated every day, everyone is vulnerable to this kind of abuse.
Someone you love could be next, which is why we need your help to combat this.
Turn the Tide by Becoming a Monthly Donor!
It may seem like there's nothing we can do about this devastating new threat. But that isn't true.
In the past few months, we have already seen several heartening victories in the fight against AI-generated child sexual abuse material (CSAM) and image-based sexual abuse (IBSA):
👉Apple, LinkedIn, and Google all removed “nudifying apps” we brought to their attention, or ads that promote these apps
👉Google made it easier to remove or downrank AI-generated sexually explicit images in search results
👉The CEO of Telegram, an app notorious for CSAM and IBSA, was arrested in France and the platform is being investigated
👉Three critical bills fighting IBSA have all taken big steps forward: The DEFIANCE Act and SHIELD Act recently passed the Senate, and the TAKE IT DOWN Act unanimously passed Committee!
Imagine it is your child who falls victim to this cruel behavior. If we do not act, that possibility moves closer to reality. Let’s stop it before it’s too late.
Please join us!