From Counter Extremism Project <[email protected]>
Subject Tech & Terrorism: Facebook’s Oversight Board Releases Annual Report Calling For More Transparency From Company
Date June 29, 2022 9:32 PM
  Links have been removed from this email. Learn more in the FAQ.



Tech & Terrorism: Facebook’s Oversight Board Releases Annual Report Calling
For More Transparency From Company

Board Has No Authority And Recommendations Are Not Binding



(New York, N.Y.) — Last week, Facebook’s Oversight Board released its first
annual report since its formation in 2020. In it, the 20-member
independent body, tasked with making decisions on contentious content issues,
such as extremism and terrorism, stated that it “continues to have significant
concerns, including around Meta’s [Facebook’s parent company] transparency and
provision of information related to certain cases and policy recommendations.”
Before the creation of the Oversight Board, Facebook faced years of criticism
concerning its content moderation policies, including the platform’s failure to
remove extremist and terrorist content in a consistent and transparent manner.



Rather than taking preventative measures, Facebook has too often jumped to
make cosmetic policy changes to limit reputational damage. One day after the
New York Times published a November 2018 exposé detailing how COO Sheryl Sandberg
and other Facebook executives worked to downplay and spin bad news, CEO Mark
Zuckerberg announced that the company would establish an independent body to
oversee its content moderation systems, which eventually became the Oversight
Board.



Counter Extremism Project (CEP) Executive Director David Ibsen stated, “In an
attempt to appease critics and lawmakers, Facebook created the Oversight Board,
specifically using the terms ‘oversight’ and ‘board’ to imply some sort of
authority. However, this is not a Board of Directors with management or legal
authority. Rather, the Oversight Board is more akin to an internal working
group that relies solely on the company it is supposed to be overseeing for
information and funding. Clearly, the Board is part of Meta’s public relations
campaign aimed at obscuring the failures of self-regulation and muddying the
calls for major reform and regulatory oversight.”



Ibsen continued, “If Meta is serious about its pledge to improve online
safety, then it should ensure the Oversight Board is capable of true
independent oversight with the right to access any and all information related
to its inquiries. The company should also bring external experts who have core
computer science skills, a proven track record of working for online safety,
and a healthy skepticism of tech companies onto the Oversight Board, such as
U.C. Berkeley professor and CEP Senior Advisor Dr. Hany Farid, a digital forensics
expert who pioneered the leading technology that combats child sexual abuse
material online. Meta should also integrate members of the Oversight Board onto
its corporate board. Doing so would give teeth to any policy recommendations
and demonstrate a true commitment to ensuring the platform is free from
terrorism content and safe for users.”



Since its creation, the Oversight Board has ruled against Facebook in 14 of
20 content removal cases, and Facebook has implemented about two-thirds of
the Board’s 86 policy recommendations. Tensions between the Board and
Facebook came to a head last year, when Facebook’s controversial XCheck
content moderation program came to light. The program exempted at least 5.8
million VIP users from the company’s content moderation policies. These VIPs
were allowed to post “rule-violating material” that harassed others and
incited violence. At the time, the Board found that the company was not
“fully forthcoming” about XCheck and that its behavior was “not acceptable.”



To read CEP’s resource Updated: Tracking Facebook’s Policy Changes, please
click here.



###





Unsubscribe
