From Haley McNamara, NCOSE <[email protected]>
Subject Deep Dive into Supreme Court Victory🎉
Date June 28, 2025 10:51 AM
 VICTORY! Supreme Court Votes to Protect Children, Upholding Age Verification Law

Damien was a self-described “happy child with big dreams for the future.”

He was the top student in his class.

That was before he discovered pornography.

Like so many others, Damien came across pornography as a child. He was only 12 years old when he began using it regularly. After that, his grades dropped dramatically. He became anxious, depressed, and lethargic. He had difficulty interacting with others or even performing basic daily tasks.

Damien tried to quit pornography. For years, he employed every strategy he could think of. But he continually failed.

Now consider: What if we rewound the clock? What if Damien never became addicted to pornography to begin with?

What if there were protective measures in place, so that Damien was never exposed to pornography as a child? Never found this content at a time when his developing brain was so susceptible to addiction?

These are important questions to consider, and thankfully, the Supreme Court agrees!

This morning, our country’s highest court upheld a Texas law that requires age verification on pornography websites! The Court ruled that protecting children from explicit content does not violate free speech, as opponents have argued.

Read More

Is The Proposed “A.I. Moratorium” Another CDA Section 230?

As Congress considers whether to implement a proposed 10-year moratorium on state regulation of A.I., Haley and Dani discuss the potentially devastating effects this could have. Some have even referred to the moratorium as “Section 230 on steroids,” so this episode is essential for understanding what it is and what its impacts would be. Big Tech doesn’t need another layer of immunity to hide behind while it profits off the exploitation of human beings in the name of advancing technology.

Watch on YouTube or listen on Apple Podcasts, Spotify, or your favorite podcast platform!

📣 ACTION: Call on Congress to Oppose the AI Moratorium!

Take Action!

A Trojan Horse in Congress: 10-Year Moratorium Would Block A.I. Safety Bills

An AI chatbot tells a struggling 14-year-old boy to end his own life—and he complies.

Teen girls are humiliated when their male peers use AI to generate “deepfake” nudes of them and distribute the images around the school.

Predators use A.I. to exploit children at scale, resulting in 7,000 reports of AI-generated child exploitation in just two years.

These are only some of the horrifying ways in which AI already facilitates sexual exploitation and harm on a massive scale. In recent years, NCOSE and other experts have been urgently pressing Congress to get ahead of this problem while they still can. AI is moving fast, and the government must move just as fast to ensure that this explosively growing industry is held to reasonable safety standards. But now, a Trojan horse has snuck into Congress that would make this impossible.

Read More

📣 ACTION: Call on Congress to Oppose the AI Moratorium!

Take Action!

Time's Running Out to Make a DOUBLE-Impact Gift!

There are only a couple of days left to take advantage of the matching grant provided by one of our generous donors.

If you donate before June 30th, every dollar will be automatically DOUBLED!

That means your gift will go TWICE as far in uprooting sexual exploitation. 

Don't delay. Donate now before this opportunity passes! 

Make a DOUBLE-Impact Gift

X Must Confront Child Sexual Abuse Material on Platform

The National Center on Sexual Exploitation (NCOSE) calls on X (formerly Twitter) to increase efforts to detect and report child sexual abuse material (CSAM) on the platform, in light of a new NBC News report indicating CSAM is increasing.

NBC reported that, “A review of many hashtags with terms known to be associated with CSAM shows that the problem is, if anything, worse than when Musk initially took over. What was previously a trickle of posts of fewer than a dozen per hour is now a torrent propelled by accounts that appear to be automated — some posting several times a minute.” 

“It is beyond disturbing that child sexual abuse material is flooding X, but even more egregious is that X is not doing enough to confront it. NBC News reveals that not only is CSAM proliferating from certain accounts, but that hashtags ‘identified in 2023…as hosting child exploitation advertisements are still being used for the same purpose today.’  X must be held to account for perpetuating child sexual abuse and seemingly turning a blind eye to this devastating abuse,” said Haley McNamara, Senior Vice President of Strategic Initiatives and Programs, National Center on Sexual Exploitation. 

Read More

📣 ACTION: Show your support for John Doe, who was exploited on X as a teen!

Take Action!

Sincerely,

Haley McNamara
