From Counter Extremism Project <[email protected]>
Subject CEP Spotlight: Dr. Hany Farid
Date November 30, 2021 7:40 PM


CEP Spotlight: Dr. Hany Farid

Analysis From Counter Extremism Project Experts

 

(New York, N.Y.) – CEP Senior Advisor Dr. Hany Farid
<[link removed]>, an internationally recognized expert in digital
forensics, misinformation, algorithmic amplification, image analysis, and
hashing technology, is a professor at UC Berkeley with a joint appointment in
Electrical Engineering & Computer Sciences and the School of Information.

 

Dr. Farid’s research has repeatedly demonstrated the hollowness of tech
giants’ promises to moderate content and their role in exacerbating a host of
problems, from radicalizing terrorist and extremist content to child
exploitation and illegal drug and gun sales. Dr. Farid has appeared before
Congress on several occasions, testifying about the dangers of deep fakes,
algorithmic amplification, and the economic incentives of the big tech
companies that lead them to promote the most sensational, divisive, and harmful
content. Dr. Farid received his undergraduate degree in Computer Science and
Applied Mathematics from the University of Rochester in 1989, his M.S. in
Computer Science from SUNY Albany in 1992, and his Ph.D. in Computer Science
from the University of Pennsylvania in 1997.

He is the recipient of an Alfred P. Sloan Fellowship and a John Simon Guggenheim
Fellowship, and is a Fellow of the National Academy of Inventors.

 

Dr. Farid can be reached directly via email at [email protected]
to discuss digital forensics, misinformation, algorithmic amplification,
image analysis, and hashing technology.

 

Policy, Research, and Analysis

 

Section 230 Reform and the Justice Against Malicious Algorithms Act of 2021
<[link removed]>

On October 14, Dr. Farid released comments in support of the Justice Against
Malicious Algorithms Act of 2021. The bill would narrowly amend Section 230(c)
of the Communications Decency Act (CDA) and eliminate blanket liability
protection for online platforms that knowingly or recklessly deploy
recommendation algorithms to promote content that materially contributes to
physical or severe emotional injury. “Algorithmic amplification is a key
driving force for spreading problematic, harmful content online. As it stands,
for-profit tech companies have a business incentive to increase engagement on
their platforms—including by promoting divisive, hateful, and extremist
content—in order to increase revenues. Legislation is clearly needed to shift
that corporate calculation. The proposed bill is a sensible legislative
solution that holds the tech industry accountable for their reckless behavior
in proliferating content ranging from child sex abuse, terrorism, the sale of
illegal narcotics and weapons, to misinformation.”

 

How Algorithmic Amplification Pushes Users Towards Divisive Content on Social
Media Platforms
<[link removed]>

In June, CEP hosted the first in a series of webinars
<[link removed]>
with Dr. Farid that explored the nature and extent of the global phenomenon of
misinformation, its consequences, the role of algorithmic amplification in
spreading it, and possible technological and regulatory interventions for
stopping it. Dr. Farid was joined by German MEP Tiemo Wölken, the coordinator for
the Committee on Legal Affairs (JURI) at the European Parliament, and Prabhat
Agarwal, the head of the Digital Services and Platforms Unit for DG Connect at
the European Commission. Dr. Farid said: “Algorithmic amplification is the root
cause of the unprecedented dissemination of hate speech, misinformation,
conspiracy theories, and harmful content online. Platforms have learned that
divisive content attracts the highest number of users and as such, the real
power lies with these recommendation algorithms. Until thorough regulation is
put in place, controversial content will continue to be promoted and amplified
online.” Click here <[link removed]> to watch Dr.
Farid explain how to combat fake news.

 

On the Threat of Deep Fakes to Democracy and Society
<[link removed]>
The influence of fake news and the manipulation of public and political
perception have threatened political systems for years. Today, fake news
is often supported by so-called deep fakes—seemingly real but synthesized
videos of various kinds. Due to advances in machine learning, producing deep
fakes no longer requires significant technical skill, vastly increasing the
risk of their misuse. CEP, in cooperation with the Konrad-Adenauer-Stiftung
(KAS), released a study, On the Threat of Deep Fakes to Democracy and Society
<[link removed]>. The authors, Dr. Farid and CEP Senior Director
Dr. Hans-Jakob Schindler, discussed the study and ways to confront the problem
during a CEP webinar <[link removed]>. Media coverage: Knowable Magazine
<[link removed]> and The Washington Post <[link removed]>.

 

Op-eds and Selected Media

 

Newsweek: “Should we Celebrate or Condemn Apple's New Child Protection
Measures?”
<[link removed]>
While child advocates were supportive of Apple’s announced move to deploy
hashing technologies to protect children from sexual abuse and exploitation,
Dr. Farid argues that Apple should not be celebrated for the modest step of
extending the reach of a decade-old technology, which applies only to images.
“For the past two decades, the technology industry as a whole has been
lethargic, even negligent, in responding to the threats posed by the global
trade of child sexual abuse material (CSAM), live-streaming of child sexual
abuse, predatory grooming and sexual extortion. Videos constitute nearly half
of the more than 65 million pieces of content reported to NCMEC (National
Center for Missing and Exploited Children) last year. Sadly, Apple's technology
will be blind to this content. Apple has come late to the game and is tackling,
at best, half of the problem. This is hardly a reason to celebrate.”

 

Industry Faces Criticism On Capitol Hill For Promoting Divisive Content
<[link removed]>
Dr. Farid testified before a joint subcommittee 
<[link removed]>
of the U.S. House Committee on Energy & Commerce on the effect online
disinformation has had on the country. In his testimony
<[link removed]>, Dr. Farid criticized tech firms, including Facebook and
Google-owned YouTube.
Because much of the content consumption on social media platforms is determined
by algorithms, tech firms have an incentive to amplify divisive content, which
increases user engagement and drives revenue. Dr. Farid said: “The point is not
about truth or falsehood, but about algorithmic amplification. The point is
that social media decides every day what is relevant by recommending it to
their billions of users. The point is that social media has learned that
outrageous, divisive, and conspiratorial content increases engagement … The
vast majority of delivered content is actively promoted by content providers
based on their algorithms that are designed in large part to maximize
engagement and revenue … Many want to frame the issue of content moderation as
an issue of freedom of speech. It is not.” Media coverage: The Sociable
<[link removed]>, Quartz <[link removed]>.

 

Newsweek: “Google Is Not Cracking Down on the Most Dangerous Drug in America” 
<[link removed]>
Dr. Farid and Mathea Falco, a leading expert in drug abuse prevention and
treatment, warn that Google is failing to obstruct the online purchase of pure
fentanyl and fentanyl-laced drugs. “According to the Centers for Disease
Control and Prevention <[link removed]>,
two-thirds of the nearly 47,000 opioid-related deaths in the U.S. in 2018 were
caused by powerful synthetic opioids—primarily fentanyl and its analogs. Google
is failing to obstruct the online purchase of pure fentanyl and fentanyl-laced
drugs. Both can now be purchased with a simple search and a click of a mouse.
The fentanyl threat in America is too great to allow Google to continue to
stand on the sidelines, denying its complicity in making fentanyl widely
available through online sales.”

Wired: “Congress Needs to Make Silicon Valley EARN IT”
<[link removed]>
Dr. Farid argues in favor of legislation that would force tech giants to stop
prioritizing profits over safety. “Frustratingly, for the past decade, the
technology sector has been largely obstructionist and full of naysayers when it
comes to deploying new technologies to protect us. As a result of this
deliberate neglect, the internet is overrun
<[link removed]> with
child sexual abuse material, illegal sex trade, nonconsensual pornography, hate
and terrorism, illegal drugs, illegal weapons, and rampant misinformation
designed to sow civil unrest and interfere with democratic elections.”

The Lawfare Podcast: “Hany Farid on Deep Fakes, Doctored Photos and
Disinformation”
<[link removed]>
Dr. Farid, interviewed as part of Lawfare’s Arbiters of Truth series on
disinformation, talks about the explosion of deep fakes, realistic synthetic
media in which a person’s likeness is altered to show them doing or saying
something they never did or said. Dr. Farid discusses the many problems caused
by deep fakes, how much of the problem is inherent in the technology itself,
how much harm is caused by the way big technology platforms amplify incendiary
content, and how aggressive those same companies should be in monitoring and
removing disinformation.

 

###

Unsubscribe
<[link removed]>