Tech & Terrorism: Facebook Fails To Provide Full, Transparent Information On
Content Moderation To Oversight Board
(New York, N.Y.) — On Thursday, Facebook’s Oversight Board released its first
transparency report, in which it criticized
<[link removed]>
Facebook for failing to be “fully forthcoming” about its controversial XCheck
program, which exempts high-profile users such as politicians and celebrities
from its rules.
Tensions have mounted between the Board and tech company since the publication
of the Wall Street Journal’s Facebook Files
<[link removed]>. These articles
detailed the XCheck program
<[link removed]>, which created a
workaround to the company’s content moderation policies and granted
exemptions to at least 5.8 million VIP users. These VIPs were allowed to post
“rule-violating material” that harassed others and incited violence. In its
transparency report, the Board found that the company was not “fully
forthcoming” about XCheck and that its behavior was “not acceptable.” The
Board will conduct a full review of the XCheck program and recommend how it
can be improved and made more transparent.
“Facebook’s failure to provide complete information to its own Oversight Board
exposes how the tech giant still does not apply its Community Standards
policies ‘consistently and fairly
<[link removed]>’—despite claiming
otherwise. Facebook’s Oversight Board must utilize this opportunity to spur
major changes within the company for its content moderation policies, including
enforcing them in a manner that is transparent and equal among all of its
users,” said Counter Extremism Project (CEP) Executive Director David Ibsen.
In the past, Facebook has stated that the Oversight Board’s decisions would be
“binding <[link removed]>” and committed to implementing them.
However, recent actions by the company indicate a retreat from that policy.
From October 2020 to June 2021, the Board selected 21 cases to rule on and
made 52 recommendations. However, as the Washington Post
<[link removed]>
reports, Facebook “has not agreed to implement all of them. In many instances,
it has told the Board it is ‘assessing the feasibility’ of the
recommendations.” As part of the rulings, the Board has also sent Facebook 156
questions and the company “declined to answer in 14 instances, and only
partially answered in 12.”
For over a decade, Facebook has faced criticism for the misuse of its platform
on issues ranging from the publication of inappropriate content to user privacy
and safety. Rather than taking preventative measures, Facebook has too often
jumped to make policy changes after damage has already been done. CEP has
documented instances in which Facebook has made express policy changes
following public accusations, a scandal, or pressure from lawmakers.
To read CEP’s resource Updated: Tracking Facebook’s Policy Changes, please
click here
<[link removed]>.
###