Supreme Court Considers Most Important Online Child Protection Case in 20 Years

 

TRIGGER WARNING: The following content contains descriptions of sexual assault and may be upsetting to some readers.

 

Rachel was a sweet, innocent seven-year-old girl. Every night she would line her stuffed animals along her bed and say goodnight to each one of them, making sure none of them felt left out.

 

But at that tender age, something happened that would change Rachel’s life forever. She was exposed to Internet pornography.

 

This led to thirteen years of pornography addiction, in which Rachel’s developing brain was polluted with scene after scene of abuse and degradation.

 

At the age of sixteen, Rachel was sexually assaulted. Reflecting back on this horrifying moment, Rachel says, “I remember making it home and running into my bedroom afterwards, looking at myself in the mirror and pulling my lip down, the inside of my mouth was purple with bruises. I was shaking and crying. And I thought to myself, ‘But this is just like porn, that experience was just like porn. I’m supposed to want and like this. I said no, but what does that really mean?’”

 

Pornography had taught Rachel that sexual violence was normal. So much so that she stayed in a relationship with the perpetrator and continued to be sexually assaulted and abused by him for two full years.

 

Now, Rachel has one message for the world: Pornography websites must be required to verify the age of their users. They must prevent children from accessing their content.

Read More

📣ACTION: Ask Your State Legislators to Protect Children from Exposure to Online Pornography!

Take Action!

TAKE IT DOWN Act Reintroduced in Senate to Confront AI-Generated Image-Based Sexual Abuse

 

The National Center on Sexual Exploitation (NCOSE) commends Senators Ted Cruz (R-TX) and Amy Klobuchar (D-MN) for reintroducing the TAKE IT DOWN Act in the U.S. Senate, emphasizing the urgent need for this legislation to provide solutions for those harmed by the creation of AI-generated sexually explicit material (“deepfake pornography”).

“Nobody should endure sexual exploitation through the creation of deepfake pornography. The TAKE IT DOWN Act mercilessly confronts this rampant criminality by requiring tech platforms to remove AI-generated Image-Based Sexual Abuse within 48 hours of receiving a removal request. Right now, it is nearly impossible for victims of image-based sexual abuse to have sexually abusive content removed from websites. IBSA is a horrific assault on one’s privacy and dignity; we must act quickly to combat this abhorrent abuse,” said Marcel van der Watt, President, National Center on Sexual Exploitation. 

Read More

Ads for apps that use AI to create fake videos of people kissing anyone they want are flooding social media platforms like TikTok and Instagram. 

 

Meta and TikTok have run thousands of ads for apps that let users upload photos of any two people and have AI convert them into a fake video of the pair kissing. The apps are being marketed as tools that let you instantly “kiss anyone you want”— no consent required...

While the ads are not sexually explicit like the deluge of AI-generated pornographic content that has engulfed social media platforms like Instagram, Reddit, and YouTube, they can be equally dangerous, Haley McNamara, an executive at the National Center on Sexual Exploitation, told Forbes.

 

“It does not have to be explicit to be exploitative,” McNamara said. “If it's crossing boundaries to do something offline to someone without their consent, kissing, undressing, et cetera, then it's also crossing boundaries to do that online.” 

Read More

VICTORY! Ohio Governor Mike DeWine Signs Braden's Law to Make Sextortion a Felony

 

On Wednesday, Gov. Mike DeWine signed legislation named after Braden Markus that criminalizes sexual extortion, which occurs when someone blackmails another person by threatening to release private images of them.

 

Fifteen-year-old Braden Markus, like so many other teenagers, tragically took his own life after falling victim to sextortion. Braden believed he was texting with a teenage girl, but he was actually being manipulated by nefarious actors attempting to scam him out of thousands of dollars.

 

Sextortion is an extremely pervasive problem faced by teens on social media. Predators pose as teens and chat with real teenagers, coercing them into sending sexual photos of themselves. The predators then threaten to release the photos unless the victim pays exorbitant amounts of money. Tragically, the distress can become so overwhelming that some victims take their own lives.

 

"We can't bring Braden back, but what we can do is something in his name today and say we’re going to make a difference," DeWine said during a signing ceremony at the Ohio Statehouse, surrounded by Braden's family and friends.

Read More

Sincerely,
