The Children of CDA 230: Story Spotlight
Jasmine (a pseudonym), a minor, frequently used the messaging app Kik to text with her friends. It seemed innocent at first: they made plans to hang out and shared funny memes. But things quickly took a dark turn.
Numerous adult men began messaging Jasmine on Kik, sending her sexually explicit photographs and pressuring her to send her own. Confused and too young to know better, she eventually gave in and sent them many explicit pictures of herself.
Jasmine's family filed a lawsuit against Kik for allowing this grooming to occur, but the court dismissed the case under Section 230 of the Communications Decency Act.
Unfortunately, this law shields tech companies like Kik from liability for harmful conduct that occurs on their platforms. Even when preventable tragedies like this one occur, it forecloses virtually any legal accountability for Big Tech.
Kik was well known for harboring predators seeking to exploit child users. After intense public scrutiny, the app was shut down and has since been relaunched with security improvements. But the fact remains that until CDA Section 230 is reformed, tech companies will continue to enjoy legal protection even when sexual exploitation runs rampant on their platforms.