
Why We Can’t Ignore OnlyFans During Human Trafficking Prevention Month

She thought Austin loved her. But when she moved in, everything changed.


Austin held her captive in a room and forced her to make sexually explicit videos for OnlyFans. She had to create a minimum of $1,000 worth of sexually explicit content every day, and she was never given any of the money.


Austin sexually assaulted her repeatedly. He physically abused her, pouring hot oil down her neck and back because he “thought it was funny.” He threatened to shoot her family if they came to help her.


She was powerless.


The above true story is only one recent example of a woman being sex trafficked on OnlyFans. OnlyFans has convinced the world that it is the safe, empowering side of the pornography industry. It’s easy to understand how this deception has gained traction when the women appear to be creating content from the “safety” of their own homes.


But the reality is: we never know who’s behind the camera. We never know who’s forcing them into that life.

This is a truth we must declare louder than ever during Human Trafficking Prevention Month.

Read More

📣 ACTION: Ask the U.S. Attorney General to Investigate OnlyFans!*

*This action is only available to those with a U.S. ZIP code

Take Action!

Bark's 2023 Annual Report

Every year, Bark releases a report that analyzes the rates at which children have harmful experiences online. This year, their analysis was based on 5.6 billion activities on family accounts across texts, email, YouTube, and 30+ apps and social media platforms.


A couple of highlights from the report: 

  • 58% of tweens and 75% of teens encountered sexual content 
  • 8% of tweens and 10% of teens encountered online predators 

Read the full report here.

Read More

Over 90% of Child Sexual Abuse Material is Self-Generated ... Talk to Your Kids About Sexting!

Various media outlets have been sounding the alarm this week about data from the Internet Watch Foundation showing that over 90% of child sexual abuse material (CSAM, the more apt term for "child pornography") is self-generated. 


A major contributor to self-generated CSAM is the normalization of "sexting" among youth, in which kids send sexually explicit images of themselves to others, often to a romantic interest. 


Please talk to your kids about the harms of sexting! Learn more here: The Phenomenon of “Sexting” and Its Risks to Youth.

Learn More

Snapchat Wants Feedback from Teens!

Snap, the owner of Snapchat, has launched its inaugural Council for Digital Well-Being, an 18-month pilot program for roughly 15 teens in the U.S. between the ages of 13 and 16. This is an opportunity for young people to share their ideas directly with Snap on how to make Snapchat a safer, healthier, and more enjoyable place.


You can see the announcement post and application form here. Interested teens between 13 and 16 living in the United States should complete and submit the online application by 8:00 pm EST / 5:00 pm PST on Friday, March 22.

The program will feature monthly calls, project work, and engagement with Snap's global Safety Advisory Board, as well as a two-day summit at Snap’s headquarters in Santa Monica, California. 

Apply Now

Gratefully,
