Section 230: How a Law Meant to Protect Children Became Their Biggest Threat

In a special episode of the Ending Sexploitation Podcast, NCOSE Senior Vice Presidents Dani Pinter and Haley McNamara discuss why, this year, NCOSE's annual Dirty Dozen List campaign is focusing on one single goal: repealing Section 230 of the Communications Decency Act.
Section 230 was ironically created to protect children, but has had the opposite effect. Pinter explains:
“The irony is, it's called the Communications Decency Act, because Congress was moved to actually account for making kids safe online. Already in 1996 [when the law was enacted], Congress was hearing from constituents that kids are being exposed to explicit content, being attacked, contacted by adults in chat rooms. So, Congress was actually taking action then at the very beginning of the Internet.”

So what happened? How did we get to this point, where a law intended to protect children has instead become the biggest threat to their safety?
TRIGGER WARNING: Disturbing discussions of sexual violence and incest
Victory! Steam Removes Harmful "No Mercy" Video Game that Glorifies Sexual Violence

A massive win in the fight against exploitation online: a disturbing pornographic video game known as "No Mercy," in which users rape their wives and family members, has been removed from the Steam gaming distribution platform. The game's description reads:
“’No Mercy’ is a 3D choice-driven adult Visual Novel with a huge focus on Incest and Male Domination. After your mother’s affair shatters your family, you take on a new role: not to fix what’s broken, but to claim her for yourself. Unveil her deepest secrets, subdue her, and make all women yours.”
The content warning even states that the game contains "unavoidable non-consensual sex."
The campaign to get this game removed was initiated by Collective Shout, an Australia-based nonprofit organization working to end sexual exploitation. NCOSE joined Collective Shout in its quest to bring down this deeply troubling and repulsive video game.
It is thanks to collective advocacy from supporters like YOU that this game was swiftly removed. Your action is vital to ensuring that sexually exploitative digital content is immediately rejected by our society.
Thank you for your support in achieving this victory!
The Guardian: ‘I didn’t start out wanting to see kids’: are porn algorithms feeding a generation of paedophiles – or creating one?
"Andy was enjoying a weekend away with his wife when it happened. 'My neighbour phoned me and said "The police are in your house. They’re looking for you.” He didn’t need to wonder why. 'You know. You know the reason. I was petrified when I got that call. It wasn’t just the thought of other people knowing what I had done; I also had to face myself, and that is a sick feeling – it is guilt, shame.' Andy had been watching and sharing images of children being sexually abused for several months before the police appeared at his door. He tried at first to keep it from his wife: 'I was afraid she would ask me to leave. I wouldn’t have blamed her if she had.'
When they got home, he told her his story: that a spiralling porn addiction had led him to ever darker places, chatrooms where people talked about sex and porn, and shared images and videos. 'That was where someone sent me a picture of a child, in exchange for some porn I sent them.'"
Andy's story speaks to the ways pornography websites feed an addiction that, in some cases, can lead to prison time. Pornography depicts every type of disturbing and harmful sexual situation imaginable, normalizing this behavior in the mind of the user. This can lead consumers down a dark path of sharing child sexual abuse material and even committing sexual abuse in person.
Victory! TAKE IT DOWN Act Clears House Committee!

The TAKE IT DOWN Act has just advanced out of the House Energy and Commerce Committee!
If passed, the TAKE IT DOWN Act would criminalize the publication of image-based sexual abuse (IBSA), including non-consensually shared sexual images and AI-generated IBSA (commonly referred to as "deepfake pornography"). Currently, there are no federal laws that make this act a crime, and very few state laws that criminalize AI-generated IBSA.
Passing this bill is a key step toward combating IBSA, as it would force platforms to take down content reported by the victim within 48 hours. Survivors of IBSA often face irreparable damage to their reputations, interpersonal relationships, and mental health, which is why this federal law is vital to curbing the harms of IBSA.
The TAKE IT DOWN Act passed the Senate in the last Congressional session, but was never brought up for a vote in the House, despite its bipartisan support. We are grateful for YOUR advocacy, which has resulted in Congress taking swift action to advance this bill in the current session. This is a MAJOR step toward getting this bill to the desk of the President, who has stated his eagerness to sign it into law.