Dear Friend,
1 in 10 kids say their friends or classmates use AI to generate nudes of other kids.
This is an alarming statistic reported in recent research from Thorn.
Think of 10 children in your life.
Odds are, one of them knows another kid who is violating their peers in this way. And the true number is likely even higher, as the research only accounts for children who know their peers are doing this!
We have to admit, this statistic has us shook.
This is not a “someone else” problem—our own children and loved ones are at risk.
With AI developing at a rapid rate, we have to wonder: if so many kids are doing this now, how much worse is the problem going to get?
It is imperative that we ACT NOW, while we still have a chance to put a stop to this. If we don’t, it will soon be too late.
Will you become a monthly donor to support this fight?
NCOSE is fighting hard against AI-generated child sexual abuse material and other similar forms of exploitation. We have seen many victories in the past few months:
👉 Google made it easier to remove or downrank these images in search results
👉 Three critical bills fighting AI-generated and non-consensually distributed sexual images have taken big steps forward: the DEFIANCE Act and SHIELD Act recently passed the Senate, and the TAKE IT DOWN Act unanimously passed Committee!
👉 Apple, LinkedIn, and Google all removed “nudifying apps” we brought to their attention, or ads promoting these apps
This fight is winnable! But we need your help. Please become a monthly donor today!
Become a Monthly Donor
Gratefully,