From Counter Extremism Project <[email protected]>
Subject Tech & Terrorism: Facebook Launches Content Moderation Oversight Board
Date May 13, 2020 3:00 PM
  Links have been removed from this email. Learn more in the FAQ.


Tech & Terrorism: Facebook Launches Content Moderation Oversight Board 

Company Continues To Pursue “Delay, Deny & Deflect” PR Strategy Instead Of
Enforcing Own Community Standards

(New York, N.Y.) – Last week, Facebook announced the formation of a 20-member
independent oversight body tasked with making decisions on contentious content
issues, such as online extremism, hate speech, harassment, and user privacy and
security. Facebook’s Oversight Board comes nearly two years after a damaging New
York Times exposé detailed how Facebook COO Sheryl Sandberg and other top
company executives worked to “delay, deny and deflect” bad news following
public relations crises. The process that led to the Oversight Board’s creation
and the subsequent media announcements raise concerns that this is yet another
public relations-driven measure in lieu of substantive change.

Even though Facebook CEO Mark Zuckerberg first outlined the board’s concept in
November 2018, the Oversight Board is still only in its initial phase. Users
will only be able to appeal to the board in cases where Facebook has removed
their content. The board vaguely claims that “over the next few months” it will
add the ability to review appeals from users who want Facebook to remove
content. Additionally, despite emphasizing a commitment to creating a system
that is “accessible to the people,” the board’s co-chairs have noted that they
“will not be able to offer a ruling on every one of the many thousands of cases
that we expect to be shared with us each year,” further watering down the
board’s effectiveness.

“Rather than work to enforce its own content policies, Facebook has chosen to
make another cosmetic policy announcement in an attempt to appease critics.
Facebook clearly outlines removal policies for extremist content in its
Community Standards, but it inconsistently enforces those terms,” said Counter
Extremism Project (CEP) Executive Director David Ibsen. “Moreover, when the
company inevitably fails to remove dangerous content, it often shields itself
under the umbrella of free speech concerns. Facebook’s Oversight Board is
already making similar caveats and claiming that it is ‘committed to freedom of
expression within the framework of international norms of human rights.’
Curiously, neither Facebook nor the Oversight Board cites other fundamental
rights (e.g., the right to privacy or the right to security), as those rights
may require the multi-billion-dollar company to alter its behavior
dramatically.”

Clearly, Facebook’s latest measure is a reaction to widespread scrutiny from
the public and policymakers. Lawmakers in Germany, for instance, are discussing
ways to amend the country’s NetzDG online content moderation law to improve its
effectiveness. In the U.S., there remains bipartisan support to amend Section
230 of the Communications Decency Act to remove blanket liability protection
for content. Rather than focus its energy on efforts to repair its public
image, Facebook can support specific legislative and regulatory proposals, and
end its significant lobbying efforts against government regulation, a move that
CEP has previously recommended.

CEP has also previously called on Facebook to support amending Section 230 of
the Communications Decency Act (CDA). Section 230 must be amended to remove
companies’ blanket protections from liability for content posted by third
parties on their platforms when that content is incontrovertibly known to be
extremist in nature or otherwise harmful. CEP also called on Facebook to
voluntarily release transparency reports about its efforts to monitor and
remove extremist or otherwise harmful content, and to support amending the
securities laws governing the Securities and Exchange Commission (SEC).

The CEP resource Tracking Facebook’s Policy Changes outlines the instances in
which Facebook has made reactive policy changes and failed to uphold its
commitment to eliminating hateful, harmful, and deceitful content from its
platform.

 ###

Unsubscribe

Message Analysis

  • Sender: Counter Extremism Project
  • Political Party: n/a
  • Country: n/a
  • State/Locality: n/a
  • Office: n/a
  • Email Providers:
    • Iterable