From Counter Extremism Project <[email protected]>
Subject Tech & Terrorism: Facebook Whistleblower Testifies Before U.S. Congress
Date October 5, 2021 11:01 PM
  Links have been removed from this email. Learn more in the FAQ.
Tech & Terrorism: Facebook Whistleblower Testifies Before U.S. Congress

 

(New York, N.Y.) — On Tuesday, Facebook whistleblower Frances Haugen testified
at a hearing
<[link removed]>
before the U.S. Senate’s Subcommittee on Consumer Protection, Product Safety,
and Data Security. She explained how Facebook’s products and algorithms have
harmed users’ mental health and political discourse and facilitated bullying
and human trafficking, and how the company has failed to take appropriate
action to reduce or eliminate these harms.

 

In her opening statement, Haugen called
<[link removed]>
on Congress to act, saying, “I believe that Facebook’s products harm children,
stoke division, weaken our democracy and much more. The company’s leadership
knows ways to make Facebook and Instagram safer and won’t make the necessary
changes because they have put their immense profits before people.
Congressional action is needed.”

 

“Facebook’s actions—whether it be selectively applying its content moderation
rules, withholding information on the harmful effects the platform has on young
users, or doing nothing when it comes to content that incites violence and
hatred—have one common thread: putting profits ahead of public safety,” said
Counter Extremism Project (CEP) Executive Director David Ibsen. “Like other
for-profit tech firms, Facebook’s business model is very clear and relies on
algorithms to drive engagement to make money—including pushing content that is
divisive, conspiratorial, and extremist. Facebook’s behavior proves that its
repeated promises to do better cannot be trusted. In the face of inaction and
even defiance
<[link removed]>
from the tech giant, government regulators must step in.”

 

Haugen’s testimony comes a month after the Wall Street Journal published The
Facebook Files <[link removed]>,
a series based on the internal documents Haugen provided to the paper. Haugen
has also been invited to testify
<[link removed]>
before the European Parliament. The findings from the Journal investigation
include:

 

* Facebook created a previously unknown program called “cross check” (XCheck)
that allowed high-profile accounts and VIPs to circumvent the company’s terms
of service and post “rule-violating material” that “contain[ed] harassment or
incitement to violence.”
* Researchers from Facebook-owned Instagram found that Instagram was “harmful
for a sizable percentage of [young users], most notably teenage girls, more so
than other social media platforms.” In response, Facebook executives have
“consistently played down the app’s negative effects.”
* Changes Facebook made to its algorithm in 2018 to increase user engagement
on its platforms were making “those who used it, angrier.” Facebook CEO Mark
Zuckerberg himself allegedly resisted making further changes because he
“worried they would lead people to interact with Facebook less.”
* Facebook employees warned that the site was being used for heinous crimes,
including facilitating human trafficking in the Middle East and inciting ethnic
violence in Ethiopia. The company’s response was described “in many instances”
as “inadequate or nothing at all.”
* Despite Facebook’s efforts to promote vaccinations, its platform instead
became flooded with “barrier to vaccination” content.
 

Earlier this year, CEP hosted a webinar
<[link removed]>
with Dr. Hany Farid, senior advisor to CEP and a professor at UC Berkeley, to
explore the nature and extent of the global phenomenon of misinformation as
well as the role of algorithmic amplification in promoting misinformation and
divisive content online. Dr. Farid said: “Algorithmic amplification is the root
cause of the unprecedented dissemination of hate speech, misinformation,
conspiracy theories, and harmful content online. Platforms have learned that
divisive content attracts the highest number of users and as such, the real
power lies with these recommendation algorithms. Until thorough regulation is
put in place, controversial content will continue to be promoted and amplified
online.”

 

Facebook has long faced criticism for the misuse of its platform on issues
ranging from the publication of inappropriate content to user privacy and
safety. Rather than taking preventative measures, however, Facebook has
typically made policy changes only after damage has already been done. CEP has
documented instances in which Facebook made express policy changes following
public accusations, a scandal, or pressure from lawmakers. While one would hope
that Facebook is continuously working to improve security on its platform,
there is no excuse for so many of its policy changes being reactive, and it
raises the question of what other scandals are in the making due to
still-undiscovered lapses in Facebook’s current policies.

 

To read CEP’s resource Tracking Facebook’s Policy Changes, please click here
<[link removed]>.

 

###

Unsubscribe
<[link removed]>
