From Counter Extremism Project <[email protected]>
Subject Tech & Terrorism: Reuploads Of Extremist Content Online Are A Solvable Problem
Date May 26, 2022 5:30 PM
  Links have been removed from this email. Learn more in the FAQ.

Tech & Terrorism: Reuploads Of Extremist Content Online Are A Solvable Problem

Dr. Hany Farid: “It’s Not As Hard A Problem As The Technology Sector Will Have
You Believe”

 

(New York, N.Y.) — Recently, Counter Extremism Project (CEP) Senior Advisor
and University of California, Berkeley professor Dr. Hany Farid
<[link removed]> spoke to the Guardian
<[link removed]>
about the aftermath of the May 14, 2022 shooting at a Buffalo, New York
grocery store. The gunman—who allegedly released a manifesto identifying
himself as a white supremacist, antisemite, and believer in the Great
Replacement Theory
<[link removed]>
—livestreamed part of his attack on the Amazon-owned platform Twitch. Twitch
took down the stream in less than two minutes, but not before the footage was
captured, allowing users to reshare it across multiple sites and platforms.

 

Speaking about the technology that tech companies need to combat the video’s
redistribution, Dr. Farid states:

 

“The core technology to stop redistribution is called ‘hashing’ or ‘robust
hashing’ or ‘perceptual hashing’. The basic idea is quite simple: you have a
piece of content that is not allowed on your service, either because it
violated terms of service, it’s illegal, or for whatever reason. You reach into
that content and extract a digital signature, or a hash as it’s called… And
then every time a video is uploaded, the hash, the signature, is compared
against this database, which is being updated almost instantaneously. And then
you stop the redistribution.”
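
To make Dr. Farid’s description concrete, the sketch below shows one simple
form of perceptual hashing, an “average hash” computed over an image or video
frame, written in Python with the Pillow library. It illustrates the general
workflow he describes (extract a compact signature, store it, and compare every
new upload against a database of banned hashes), not the proprietary systems
such as PhotoDNA or PDQ that platforms actually deploy; the file paths and the
match threshold below are hypothetical.

    # Minimal "average hash" sketch, one simple form of perceptual hashing.
    # Production systems (e.g., PhotoDNA, PDQ) are far more robust, but the
    # workflow is the same: extract a signature, store it, and compare every
    # new upload against the database of banned hashes.
    from PIL import Image

    HASH_SIZE = 8  # an 8x8 grid yields a 64-bit hash

    def average_hash(image_path: str) -> int:
        """Reduce an image (or video frame) to a 64-bit signature that
        survives small edits such as resizing or recompression."""
        img = Image.open(image_path).convert("L").resize((HASH_SIZE, HASH_SIZE))
        pixels = list(img.getdata())
        average = sum(pixels) / len(pixels)
        bits = 0
        for pixel in pixels:
            bits = (bits << 1) | (1 if pixel > average else 0)
        return bits

    def hamming_distance(a: int, b: int) -> int:
        """Count the bits on which two hashes differ."""
        return bin(a ^ b).count("1")

    def matches_banned(upload_hash: int, banned_hashes: set, threshold: int = 5) -> bool:
        """Flag an upload whose hash is within `threshold` bits of a banned hash."""
        return any(hamming_distance(upload_hash, h) <= threshold for h in banned_hashes)

    # Hypothetical usage: hash a frame from known banned footage, then screen uploads.
    # banned = {average_hash("banned_frame.png")}
    # if matches_banned(average_hash("uploaded_frame.png"), banned):
    #     block_the_upload()

A small Hamming-distance threshold, rather than an exact match, is what makes
the hash “robust”: lightly edited copies of the same footage still land within
a few bits of the original signature.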

 

The problem that online platforms face with reuploads and redistribution,
then, is not as complex as tech companies make it out to be. What is lacking is
motivation and investment on the part of Big Tech. Dr. Farid points to the
absence of a financial incentive to act:

 

“It’s not as hard a problem as the technology sector will have you believe …
But the companies are not motivated to fix the problem. And we should stop
pretending that these companies [care] about anything other than making
money … They are doing a calculation. What’s the cost of fixing it? What’s the
cost of not fixing it? And it turns out that the cost of not fixing is less.
And so they don’t fix it.”

 

Finally, Dr. Farid states that if tech companies are not willing to invest the
time and money to fix the issue, the task of preventing reuploads of extremist
content will fall to public policy, such as the EU’s Digital Services Act
<[link removed]>.

 

“The EU announced the Digital Services Act that will put a duty of care
[standard on tech companies]. That will start saying, if you do not start
reining in the most horrific abuses on your platform, we are going to fine you
billions and billions of dollars… The hope is that between [regulatory moves
in] Australia, the EU, UK and Canada, maybe there could be some movement that
would put pressure on the tech companies to adopt some broader policies that
satisfy the duty here.”

 

To read Dr. Hany Farid’s full interview with the Guardian, please click here
<[link removed]>
.

 

###





Unsubscribe
<[link removed]>