The Misuse of Artificial Intelligence in Producing Child Sexual Abuse Material
From the convenience of voice-to-text converters and digital assistants like Siri to the unsettling realm of 'deepfake' technology and image generation, artificial intelligence has seamlessly merged into our daily lives, improving our efficiency while also enabling new forms of child sexual exploitation. Before artificial intelligence (AI) became widely accessible, one method of producing child sexual abuse material (CSAM, the more apt term for 'child pornography') involved cutting out pictures of children and pasting them onto pornographic images to create CSAM collages.
Today, predators can download readily available text-to-image AI software to generate CSAM depicting fictitious children, or to produce AI-manipulated CSAM, in which real pictures of any child are digitally superimposed onto existing CSAM or other sexually explicit material.
#CESESummit: Tech for Good
The evolution of sexual exploitation issues that come with AI and emerging technology can feel overwhelming. The #CESESummit is a great place to get equipped!
This year, the theme of the Coalition to End Sexual Exploitation (CESE) Summit is "The Great Collision: Emerging Tech, Sexual Exploitation, and the Ongoing Pursuit of Dignity."
WHEN: August 5-8, 2024
WHERE: Washington, DC
Secure your spot now!
Give Now to Prevent Sexual Exploitation
Your tax-deductible gift stands for those who cannot stand for themselves. Every dollar you give helps expose, combat, and prevent sexual abuse and exploitation at mass scale.
Together with you, we are pursuing the highest-impact tactics to dismantle the systems that allow sexual exploitation to proliferate unchecked. We are demanding change from the world's biggest corporate abusers and profiteers of child sexual abuse, sex trafficking, prostitution, and pornography.
Thank you for giving generously today!