Tech & Terrorism: Industry Faces Criticism On Capitol Hill For Promoting
Divisive Content
CEP Senior Advisor Dr. Hany Farid Testifies On Algorithms That Drive Revenues
& Misinformation
(New York, N.Y.) – Today, Counter Extremism Project (CEP) Senior Advisor Dr.
Hany Farid, a professor at the University of California, Berkeley, with a joint
appointment in Electrical Engineering & Computer Sciences and the School of
Information, testified before a joint subcommittee
<[link removed]>
of the U.S. House Committee on Energy & Commerce on the effect online
disinformation has had on the country. In his testimony
<[link removed]>
, Dr. Farid criticized tech firms, including Facebook and Google-owned YouTube,
for their unwillingness to moderate harmful content on their respective
platforms. Because much of the content consumption on social media platforms is
determined by algorithms, tech firms have an incentive to amplify divisive
content, which increases user engagement and drives revenue.
Facebook CEO Mark Zuckerberg has claimed that he does not want the company to
be “the arbiter of truth
<[link removed]>
.” But as Dr. Farid notes, this rhetoric ignores the reality that social media
unilaterally and algorithmically decides what content is relevant and promoted
every day to billions of users worldwide.
Dr. Farid told members of the subcommittee: “The point is not about truth or
falsehood, but about algorithmic amplification. The point is that social media
decides every day what is relevant by recommending it to their billions of
users. The point is that social media has learned that outrageous, divisive,
and conspiratorial content increases engagement … The vast majority of
delivered content is actively promoted by content providers based on their
algorithms that are designed in large part to maximize engagement and revenue …
Many want to frame the issue of content moderation as an issue of freedom of
speech. It is not.”
The role algorithmic amplification plays in content consumption is an issue
that must be confronted. In March, Dr. Farid co-authored
<[link removed]> a report
analyzing YouTube’s policies and efforts toward curbing its algorithm’s
tendency to spread conspiracy theories. After reviewing eight million
recommendations over 15 months, researchers determined that the progress
YouTube claimed
<[link removed]>
in June 2019, when it said it had reduced by 50 percent the amount of time its
users spent watching recommended videos that include conspiracies, and in
December 2019
<[link removed]>
, when it claimed a 70 percent reduction, did not make the “problem of
radicalization on YouTube obsolete nor fictional.” The study, A Longitudinal
Analysis of YouTube’s Promotion of Conspiracy Videos, found that a more
complete analysis of YouTube’s algorithmic recommendations showed the
proportion of conspiratorial recommendations is “now only 40 percent less
common than when YouTube’s measures were first announced.”
In order to address the effect mis- and disinformation have had on the
Internet and society as a whole, all stakeholders must come together to do
better. Dr. Farid concluded today: “If online content providers prioritized
their algorithms to value trusted information over untrusted information,
respectful over hateful, and unifying over divisive, we could move from a
divisiveness-fueling and misinformation-distributing machine that is social
media today, to a healthier and more respectful online ecosystem. If
advertisers, who are the fuel behind social media, took a stand against online
abuses, they could withhold their advertising dollars to insist on real change.
Standing in the way of this much needed change is a lack of corporate
leadership, a lack of competition, a lack of regulatory oversight, and a lack
of education among the general public. Responsibility, therefore, falls on the
private sector, government regulators, and we the general public.”
###