Intersections of Vulnerability in the LGBT Community (+Resources)

He was only 14, but he was alone. 

After he came out to his family about his sexual orientation, his stepfather beat him and threw him out on the street. His stomach in knots of fear, the young boy walked into a shelter for the first time. Within 24 hours, he met an older man who promised to take care of him, and it felt like a lifeline.

But the man didn’t take care of him. He began exploiting him, coercing him into sex trafficking. 


Tragically, stories like this are not uncommon. 

Sexual exploitation impacts every demographic, but it’s a tragic reality that abusers and sex traffickers often disproportionately target vulnerable populations, such as LGBTQ+ communities. 

Read more about the unique vulnerabilities facing LGBTQ+ persons and find resources here. 

The Washington Post: AI-generated child sex images spawn new nightmare for the web

The revolution in artificial intelligence has sparked an explosion of disturbingly lifelike images showing child sexual exploitation, fueling concerns among child-safety investigators that they will undermine efforts to find victims and combat real-world abuse...

Thousands of AI-generated child-sex images have been found on forums across the dark web, a layer of the internet visible only with special browsers, with some participants sharing detailed guides for how other pedophiles can make their own creations.

“Children’s images, including the content of known victims, are being repurposed for this really evil output,” said Rebecca Portnoff, the director of data science at Thorn, a nonprofit child-safety group that has seen month-over-month growth of the images’ prevalence since last fall.

“Victim identification is already a needle in a haystack problem, where law enforcement is trying to find a child in harm’s way,” she said. “The ease of using these tools is a significant shift, as well as the realism. It just makes everything more of a challenge.”

Read the full Washington Post article here.

📣 ACTION: Call on GitHub to combat AI-generated child sexual abuse material and image-based sexual abuse!

GitHub is arguably the most prolific space for artificial intelligence development, and AI-generated child sexual abuse material, image-based sexual abuse, and pornography largely originate on this platform. This is why NCOSE placed GitHub on the 2023 Dirty Dozen List.

Please join us in urging GitHub to get rid of sexually exploitative technologies!

NCOSE Warns Potential Reddit Investors About Image-Based Sexual Abuse, Child Sexual Abuse Material

The National Center on Sexual Exploitation is warning potential Reddit investors, ahead of the company's anticipated IPO, that Reddit needs to do even more to combat sexual exploitation.

Reddit rightly decided to restrict NSFW (i.e. sexually explicit) content from third-party apps, a move that NCOSE had called on the company to make. But before Reddit can offer an IPO, the company ultimately needs to confront the pervasive problem of image-based sexual abuse, child sexual abuse material, and non-consensual content that can be found on its platform.

To do this, Reddit must remove sexually explicit content unless it can enact meaningful age and consent verification of those depicted in pornography.

📣 ACTION: Call on Reddit to rid its platform of sexual exploitation!


Sincerely,
