Tech & Terrorism: European Policymakers Reach Agreement On Regulating Big Tech
(New York, N.Y.) — Over the weekend, European policymakers reached a provisional political agreement to adopt the Digital Services Act (DSA), first proposed in 2020, which aims to upgrade liability and safety rules for digital platforms, services, and products. Although the updated DSA proposal has yet to be published, it reportedly requires online platforms to take certain measures to protect users from illegal content. These measures include, among others, strengthened transparency obligations—in particular, requiring major companies to provide information on their algorithms—and a clearer “notice and action” procedure to encourage the swift removal of illegal content.
The Counter Extremism Project (CEP) contributed to the first round of consultations between policymakers, calling for the DSA to, among other items, mandate clear, effective, and understandable transparency policies and to encourage proactive searches for illegal terrorist and extremist content rather than reliance on “notice and take down” mechanisms.
“While the final text of the DSA has not been published, initial reports have indicated improvements to transparency requirements and ‘notice and action’ processes. This is a positive step that will allow users to be better informed about how content is recommended to them and empower them to report illegal content online. More importantly, companies can be fined up to 6 percent of their global turnover for violations under the regulation—forcing the tech industry to take these new rules seriously,” said CEP Executive Director David Ibsen. “Nonetheless, CEP continues to advocate for more proactive measures to prevent the spread of terrorist content.”
The new regulations must now be approved by the co-legislators, the European Parliament and the Council. Once formally adopted, the DSA will apply across the EU and is expected to enter into force as early as 2024. However, very large online platforms and very large online search engines will become subject to the DSA four months after their designation.
In May 2021, CEP Senior Director Dr. Hans-Jakob Schindler and Senior Advisors Alexander Ritzmann and Lucinda Creighton published a policy paper detailing recommendations for the DSA. Among the challenges it identified, the initial 2020 draft legislation did not address the continued failure of the existing “notice and action” moderation systems (Article 14) of major tech platforms (or gatekeepers). Instead, the draft DSA retained the “notice and action” mechanism as the main content moderation system, expecting millions of Internet users in the EU first to be exposed to illegal and possibly harmful content and then to notify the platforms about it. This outsources safety and security functions to users, rather than requiring platforms to proactively ensure the safety of their customers or to prevent harmful effects on the societies in which they conduct their commercial activities.
Furthermore, under the draft DSA, very large online platforms will have to conduct internal risk assessments and are to be audited. However, the systems envisioned by the draft regulation to implement this process do not take into account important lessons learned from large-scale audit failures in other industries. While the passage of the DSA by the European Union is therefore a significant first step—demonstrating that Internet regulation within free and democratic societies is possible—continued improvement of the DSA will remain important to increase the effectiveness of this crucial regulation.
To read CEP’s policy paper EU Commission Consultation: Digital Services Act Package – Ex Ante Regulatory Instrument Of Very Large Online Platforms Acting As Gatekeepers, please click here.
To read CEP’s policy paper The EU Digital Services Act (DSA): Recommendations For An Effective Regulation Against Terrorist Content Online, please click here.
###