From Stop QAnon (via Accountable Tech) <[email protected]>
Subject Tell Mark Zuckerberg: STOP boosting extremist groups on your platform! πŸ›‘
Date August 20, 2020 9:21 PM
  Links have been removed from this email. Learn more in the FAQ.
Tell Facebook: STOP boosting extremist groups like QAnon on your platform! Add your name >>> [link removed]

Friend,

Facebook can be a great tool for keeping in touch with friends and family (and remembering their birthdays!). Unfortunately, it is often also a tool for bad actors to distribute deliberate disinformation to large groups of people. This can be for a political or ideological agenda or simply to disseminate hate speech. Will you help us push back against it?

>>> [link removed]

QAnon is one of those bad actors. This extremist group has attracted increased attention recently, as a number of its members have begun to emerge as Republican candidates (some receiving personal endorsements from President Donald Trump) and as the FBI has identified it as a potential domestic terrorism threat.

Not only has QAnon spread baseless conspiracy theories about Democrats and certain Hollywood celebrities, it has also spread disinformation about COVID-19 and even helped fuel a viral anti-mask movement at the height of a pandemic. And much of this has transpired on Facebook, where the platform's algorithm has helped elevate QAnon groups from the fringe to millions of members. Tell Mark Zuckerberg and Facebook to do better.

>>> [link removed]

This week, Facebook announced that they're removing 790 QAnon groups from their platform. While removing these groups is a positive development, it's unfortunately just a Band-Aid for a broader issue that Facebook created and has refused to address at its core: an algorithm that promotes dangerous and divisive content, elevating extremist groups into the mainstream and putting clicks and profit ahead of morality.

Failure to address this head-on is not only irresponsible on Facebook's part -- it is downright dangerous. With millions of people getting their news via this medium on a daily basis, Facebook has an ethical responsibility to monitor the information and content that is shared on its platform. And the fact that we're in the middle of a full-blown national public health crisis and an election season makes that responsibility all the more imperative and urgent.

Will you sign on to join us in demanding that Facebook stop boosting disinformation and hate speech from extremist groups on its platform?

>>> [link removed]

Thank you for your support of our work!

Accountable Tech
----

This email was sent to [email protected].

To unsubscribe, go to:
[link removed]

Message Analysis

  • Sender: Accountable Tech
  • Political Party: n/a
  • Country: United States
  • State/Locality: n/a
  • Office: n/a
  • Email Providers:
    • Blue State Digital
    • EveryAction