Tech & Terrorism: Facebook’s Groups Feature Shown To Provide Platform For Hate Speech
Despite Its Own Problems, Tech Giant Casts Blame On Other Sites
(New York, N.Y.) — For years, users have been spreading
<[link removed]>
“blatant misinformation and calls to violence” on Facebook’s Groups feature.
In August, data scientists warned company executives that this type of content
had evolved to comprise the majority of content posted in Groups. Facebook,
however, responded with limited action, banning some of the problem Groups
ahead of the U.S. presidential election, a measure it reportedly viewed as
temporary. After the January 6 riot at the U.S. Capitol, Facebook took down
more Groups and imposed new rules in an “emergency response.”
This episode marks yet another instance of Facebook’s reactive policy-making
<[link removed]>
process and another failure to uphold its commitment to eliminating hateful
and harmful content from its platform. Further, despite the internal warnings,
Facebook COO Sheryl Sandberg attempted to shift blame to smaller social media
sites for providing a platform that allowed the January 6 attack to occur.
Sandberg’s readiness to fault others, regardless of Facebook’s own problems
with Groups, suggests that the company is still all too eager to embrace its
deny, delay, and deflect
<[link removed]>
mantra.
Extremists have a history of using Facebook’s own features to organize and
promote their propaganda. In its 2018 report, Spiders of the Caliphate: Mapping
the Islamic State’s Global Support Network on Facebook
<[link removed]>
, the Counter Extremism Project (CEP) observed ISIS activities on Facebook that
included recruitment, posting propaganda, hacking, spamming, and discussing
terrorist activity over Facebook Live. Pro-ISIS Facebook users also worked to
hack non-ISIS accounts and then used those accounts to share ISIS propaganda and
post hateful and threatening messages. A group of American ISIS supporters held
weekly “meetings” on Facebook Live to discuss topics ranging from ideology to
how to avoid detection by the FBI. Facebook’s suggested friends algorithm
even recommended ISIS supporters, propagandists, and fighters, connecting
extremist individuals and helping to expand ISIS networks.
In a recent op-ed
<[link removed]>
for Morning Consult, CEP Executive Director David Ibsen reiterated the need for
Facebook and the tech community at large to adopt better standards for removing
extremist content and individuals, a call that is particularly notable given that
Facebook has reactively changed its rules for Groups multiple times. “CEP has long argued
for tech industry removal policies that are transparent and based on
established standards and laws. For example, we have called for social media
platforms to ban participation from U.S.-designated Foreign Terrorist
Organizations and Specially Designated Nationals. Such a commonsense approach
will help ensure that the tech industry can focus on a clear and defined set of
targets and be held accountable when companies fail to take effective and
permanent action against actual extremists with a history of advocating for
violence or carrying out terrorist attacks.”
To read CEP’s report, Spiders of the Caliphate: Mapping the Islamic State’s
Global Support Network on Facebook, please click here
<[link removed]>
.
To read CEP Executive Director David Ibsen’s January 2021 Morning Consult
op-ed, please click here
<[link removed]>
.
###