The revolution in artificial intelligence has sparked an explosion of disturbingly lifelike images showing child sexual exploitation, fueling concerns among child-safety investigators that the images will undermine efforts to find victims and combat real-world abuse... Thousands of AI-generated child-sex images have been found on forums across the dark web, a layer of the internet visible only with special browsers, with some participants sharing detailed guides on how other pedophiles can make their own creations. “Children’s images, including the content of known victims, are being repurposed for this really evil output,” said Rebecca Portnoff, the director of data science at Thorn, a nonprofit child-safety group that has seen the prevalence of such images grow month over month since last fall. “Victim identification is already a needle in a haystack problem, where law enforcement is trying to find a child in harm’s way,” she said. “The ease of using these tools is a significant shift, as well as the realism. It just makes everything more of a challenge.” Read the full Washington Post article here.
The National Center on Sexual Exploitation is warning prospective Reddit investors, ahead of the company’s potential IPO, that the platform needs to do even more to combat sexual exploitation. Reddit rightly decided to restrict NSFW (i.e., sexually explicit) content from third-party apps, a move NCOSE had called on the company to make. But before Reddit goes public, the company must confront the pervasive problem of image-based sexual abuse, child sexual abuse material, and other non-consensual content found on its platform. To do this, Reddit must remove sexually explicit content unless it can implement meaningful age and consent verification for those depicted in pornography.