Light a Virtual Candle TODAY for Child Sexual Abuse Healing & Prevention!

Today marks the first-ever World Day for the Prevention of and Healing from Child Sexual Exploitation, Abuse and Violence!

The United Nations recently established this World Day after dedicated advocacy from the Global Collaborative, a survivor-led network of NGOs, faith-based institutions, survivor networks, and governments, of which NCOSE is a member.

You are invited to participate in this new World Day by lighting a virtual candle! You can do this easily by visiting this page and typing in your zip code or postal code to signify where the virtual candle has been lit. This initiative will show that, all around the world, people are lighting candles to raise awareness and “shine a light” on the need for prevention of, and healing from, child sexual exploitation, abuse, and violence.


Pornography & Prostitution: Connecting the Dots

Join our valued ally Culture Reframed for a virtual event on Dec. 6 exploring the link between pornography and prostitution. At Pornography and Prostitution: Connecting the Dots, we will learn from experts in the field and special guest speakers who are committed to ending pornography's harmful societal impact on women and girls. Together, we will explore how we as a community can leverage our collective knowledge and activism to break the chain of harm of the commercial sex industry. RSVP here.

Image-Based Sexual Abuse: A Little-Known Term, but a Pervasive Problem

“Life-ruining.” “Hell on earth.” “A nightmare . . . which destroyed everything.”  

These are the words survivors used to describe their experiences of image-based sexual abuse (IBSA).  

If you don’t know what “image-based sexual abuse” is, you’re not the only one. But even if you haven’t heard the term before, chances are you’ve heard of people experiencing it. In fact, chances are quite high that it’s affected someone you know – or even you yourself. Because while “image-based sexual abuse” is a new and emerging term, it is an all-too-common phenomenon.  

So let’s start with more commonly known terms: 

Leaked nudes.  

Revenge pornography.  

Upskirting. 

Downblousing. 

Deepfake pornography.  

Do any of those sound more familiar? The various types of abuses these terms describe are all encompassed under the umbrella term of “image-based sexual abuse” (IBSA). Learn more about IBSA and what can be done to stop it here.

Time’s Up for Telegram Ignoring Image-Based Sexual Abuse

Telegram is a mobile and desktop messaging app, popular due to its promise of heightened user privacy. By number of downloads, Telegram ranks among the top ten most popular social networking platforms in the world.

The privacy features Telegram offers include encryption for all chats and end-to-end encryption for calls and “secret chats.” Unfortunately, despite its declared focus on user privacy, Telegram is facilitating a particularly egregious violation of privacy known as image-based sexual abuse.

Learn more about how Telegram is facilitating image-based sexual abuse and call on them to change!

Sincerely,
