
VICTORY! Meta to Blur Nudity on Instagram DMs for Minors 

 

Meta recently announced that it will automatically blur nude images in Instagram direct messages for all users under the age of 18.

 

This announcement came just days after Meta was placed on the 2024 Dirty Dozen List, and it is a direct result of YOU joining us in advocating for this change!

In January 2024, Meta announced that it would blur nudity for 13- to 15-year-olds. While this was a much-needed fix, we questioned why the company didn’t extend those same protections to 16- and 17-year-olds, given the rise in sextortion, including sextortion of older teens.

 

Alongside grassroots advocates like you, NCOSE has been pressing Meta for years to proactively prevent sexually explicit content from being sent to or by minors, most recently in a letter informing Meta of its inclusion on the 2024 Dirty Dozen List.

 

We are thrilled that Meta finally listened! Friends, your voice truly does have power!

 

We continue to press Meta to extend this safety feature to all its platforms, including Facebook and WhatsApp.

Read More

Reddit is Riddled with Sexploitation

 

“Boys for Sale.”  

 

This was the text displayed on a Reddit post found by an NCOSE researcher this month. The post also featured an image … An image of an adult man covering the mouth and grabbing the genital region of an underage boy. The underage boy was wearing only his underwear. Across his bare chest were clearly visible red marks of abuse.

 

This is the kind of horrific content that can be found easily on Reddit.

 

It is why we named Reddit to the Dirty Dozen List—an annual campaign calling out 12 mainstream contributors to sexual exploitation—for the fourth year in a row. 

 

It is why we’re asking YOU to once again join us in demanding Reddit reform.  

Read More

📣ACTION: Urge Reddit to Reform! 

Take Action!

Unmasking the Shadows: Identity Disclosure as a Tactic to Deter Sex Buyers

 

He* hovered the mouse over the “Book a Girl” button. A salacious grin stretched across his face as he envisioned what awaited him after clicking that button.

But just then, another image flashed through his mind. It was the memory of that one page in last week’s e-paper … The page that was filled with mugshots of local men … The page that read “Eight men arrested for patronizing brothels.”

Promptly, he jerked his hand back from the mouse and shut his laptop.

 

He couldn’t risk that. No way could he risk that...

 

 

Sex buyers don't want to be exposed. That is why disclosing the identity of sex buyers, as is often done for other crimes, acts as a powerful deterrence tactic.

Read More

*Composite story based on common experiences

📣ACTION: Ask Your Local Elected Officials to Combat Sex Buying! 

Take Action!

Forbes: Google Hosted Over 100 YouTube Videos Promoting AI Deepfake Porn

 

 

For people interested in using AI to generate artificial nudes, YouTube has been a good place to start. Though it doesn’t host “nudifier” tools, the video-sharing platform used by 2.5 billion people was hosting in excess of 100 videos with millions of views that advertised how quickly these AI apps and websites can remove clothes from images of women, a review by Forbes found...

 

Tori Rousay, corporate advocacy program manager and analyst at NCOSE, said that Google has created “a continuous profit cycle” from nudify apps by accepting advertising money from developers and taking cuts of ad revenue and one-off payments when the app is hosted in the Google Play Store. Rousay said that Apple, by comparison, had been quick to remove nudifier apps when NCOSE highlighted a number hosted on the App Store.

 

“Apple listened… Google must do the same,” Rousay added. “Google needs to create responsible practices and policies around the spread of image-based sexual abuse.”

Read More

Sincerely,
