(New York, N.Y.) – The
European Union has recently been debating various content
moderation laws and policies aimed at curbing the spread of online
extremist content. Instead of allowing lawmakers to do their jobs,
Facebook has spent significant resources lobbying tech industry
regulators. As part of those efforts, Facebook CEO Mark Zuckerberg
published an opinion
piece this week in the Financial Times and the company
issued an accompanying white
paper—both of which seek to demonstrate Facebook’s support for
increased content moderation regulations for the tech industry.
These documents tout Facebook’s achievements while seeking to
stifle action by lawmakers through half-truths and fearmongering.
Facebook also continues to fail to acknowledge the tech industry’s
historical inability or refusal to remove extremist and terrorist
material. For example, Facebook’s white paper suggests that “specific
product design mandates – ‘put this button here’, ‘use this wording’,
etc. – will make it harder for internet companies to test what options
work best for users.” This argument is nonsensical. Numerous
industries are mandated by regulators to place specific warnings or
certifications on their products. Facebook also claims that by
imposing mandatory windows for removal of content, such as removing
hate speech within 24 hours, “companies would have a strong incentive
to turn a blind eye to content that is older than 24 hours…” This is a
bizarre argument that contradicts Facebook’s earlier boasts that it
removes 99% of harmful content and that user safety is its primary
concern. Given the tech industry’s strident claims to have tackled the
problem of harmful content, it is hard to see how companies could be
strongly incentivized to “turn a blind eye” to any content, no matter
how old.
It is almost as if Facebook is implicitly threatening to do
less to remove objectionable content if disagreeable mandates
like 24-hour removal requirements are imposed on the company.
“Facebook continues to push specious arguments to create confusion
and fear among lawmakers in an effort to maintain the status quo and
limit the tech industry’s liability and responsibility,” said Counter
Extremism Project (CEP) Executive Director David Ibsen. “Rather than
dictating to public officials how to keep the public safe,
Zuckerberg and his company should instead halt their lobbying efforts
and focus on keeping extremist and terrorist content off their
platforms. Facebook’s pressure on legislators and regulators is ironic
because the company has been unable even to live up to its prior
pledge to eliminate extremism from its site.”
Facebook clearly outlines removal policies for extremist content in
its Community
Standards, but it enforces those terms inconsistently and shelters
under the umbrella of free speech concerns whenever content
regulation is proposed. Worse yet, even when the company removes such
content, it has demonstrated an inability to stop repeated uploads,
as shown by the Christchurch
attack video. Ultimately, the question is not whether certain
speech should be allowed; it is whether Facebook will
provide services to extremists and whether it will consistently
enforce its own policies.
CEP has seen this pattern of behavior play out before, when Germany
passed its NetzDG
law, which went into effect in 2018 and fines online platforms for
failing to delete illegal hate speech. Tech companies criticized the
German legislation, arguing that smaller firms could not afford to
comply and that content would be over-blocked as a result. However,
an investigation
into the NetzDG’s impact conducted by CEP and the Centre for European
Policy Studies (CEPS) found no evidence of over-blocking, false
positives, or burdensome compliance costs related to the law.
Facebook also expends a significant
amount of resources on lobbying regulators; instead of
lobbying, it should let regulators do their jobs. If Facebook is
sincere about making the Internet safer and more secure, then it
must be prepared not only to support specific legislative and
regulatory proposals, but also to pledge not to spend any more money or
time lobbying against government regulation. Rather than simply
issuing platitudes, these tech companies must lead by example and
allow lawmakers to discuss and negotiate—without industry
interference—the most appropriate laws to promulgate in the interest
of keeping the public safe.