Hate speech has no place on Facebook. Accountable Tech

Friend,

Facebook can be a great tool for keeping in touch with friends and family (and remembering their birthdays!). Unfortunately, it is often also a tool for bad actors to distribute deliberate disinformation to large groups of people. This can be for a political or ideological agenda, or simply to disseminate hate speech. Will you help us push back against it?

QAnon is one of those bad actors. This extremist group has attracted increased attention recently, as a number of its members have begun to emerge as Republican candidates (some receiving personal endorsements from President Donald Trump) and as the FBI has identified it as a potential domestic terrorism threat.

Not only has QAnon spread baseless conspiracy theories about Democrats and certain Hollywood celebrities, it has spread disinformation about COVID-19 and even helped fuel a viral anti-mask phenomenon at the height of a pandemic. And much of this has transpired on Facebook, where the platform’s algorithm has helped elevate QAnon from fringe communities into groups with millions of members. Tell Mark Zuckerberg and Facebook to do better.

This week, Facebook announced that it is removing 790 QAnon groups from its platform. While removing these groups is a positive development, it's unfortunately just a Band-Aid on a broader problem that Facebook created and has refused to address at its core: an algorithm that promotes dangerous and divisive content, elevating extremist groups into the mainstream and prioritizing clicks and profit over morality.

Failure to address this head-on is not only irresponsible on Facebook's part, it is downright dangerous. With millions of people getting their news from the platform every day, Facebook has an ethical responsibility to monitor the information and content shared there. And the fact that we're in the middle of a full-blown national public health crisis and an election season makes that responsibility all the more urgent.

Will you sign on to join us in demanding that Facebook stop boosting disinformation and hate speech from extremist groups on its platform?

Thank you for your support of our work!

Accountable Tech


Copyright Accountable Tech 2020
Unsubscribe