Reddit, a three-peat Dirty Dozen List target, has released a mixed bag of policy changes. Reddit has expanded its child safety policy to prohibit comments, pictures, cartoon images (anime, etc.), and poses that sexualize minors, as well as predatory or inappropriate behaviors such as "sexual role-play where an adult might assume the role of a minor." These are positive changes that correspond to many concerns NCOSE raised! Unfortunately, so far these policies appear to be poorly enforced, as NCOSE researchers are still able to easily find indicators of child sexualization and/or child sexual abuse material on Reddit. Further, in a disappointing move, Reddit clarified that its policies on "non-consensual intimate images" (the less apt term for "image-based sexual abuse") do NOT extend to AI-generated sexually explicit content if the content does not depict an identifiable individual. This permissiveness overlooks the fact that AI-generated pornography is often built on non-consensually created, shared, or scraped sexually explicit content, regardless of whether the individual in the generated image is identifiable. Read more here.