Dear Friend,

I have a fake account on Snapchat in which I pose as a 14-year-old, for the purpose of research. Here are some examples of videos I was recommended on this account, while perusing the Spotlight and Stories sections of the app: 

👉 A joke about slapping women during sex acts

👉 Simulation of sexual fluids

👉 Simulation / insinuation of other sex acts

NCOSE has been asking Snap (the owner of Snapchat) to clean up the Stories (also referred to as Discover) section since 2016…and believe it or not, it has improved greatly since then.

However, the reality is that Snapchat continues to push very sexualized videos (among other disturbing, inappropriate content) to minor accounts.

Snap announced new Content Controls last week that “will allow parents to filter out Stories from publishers or creators that may have been identified as sensitive or suggestive.”

Sounds good. We certainly endorse tools to moderate content on platforms.

BUT HERE’S THE FINE PRINT: This feature is only available through the recently released Family Center and only the parent can turn it on. It’s not on by default and is not available to any teen directly.

The result of this?

The only kids who will benefit from this new safety tool are those who have the privilege of involved, informed, tech-savvy parents…or any parents at all. In other words, protections are limited to the kids who are likely least at risk for exploitation and other harms in the first place.

So is it an improvement? Suuuuuuure. But one that will likely affect a very small number of kids. Most minors on Snap will likely be served what I shared above.

Further: Snap's new Content Controls could effectively be likened to putting a band-aid on a gushing head wound. 

“Which do you think is the most dangerous app?” I constantly ask law enforcement, child safety experts, lawyers, survivors, youth. Without fail, “Snap” is in the top two, and usually #1.

Even while attempting to garner good press for its supposed safety improvements, Snap continues to roll out dangerous new features—the latest being its new "My AI" chatbot friend, which has been shown to chat with kids about booze and sex.

For a thorough analysis of how Snap is largely failing to protect kids, read this blog.

📣 Take Action! 

Join us in demanding Snap make fundamental changes to its design in order to truly protect its young users, including making Content Controls available to all teens, blocking My AI for minors, and using existing technology to blur sexually explicit content.

It will only take you 30 seconds to fill out this quick action form!

Sincerely,

Lina Nealon
Vice President and Director of Corporate Advocacy
National Center on Sexual Exploitation
