The European Union’s Digital Services Act (DSA), seen as a regulatory “game changer,” is coming into force this year. A suite of regulatory requirements intended to curb online mis- and disinformation, the DSA will require online platforms such as Google, Meta, TikTok, and YouTube to annually assess the risks posed by content they host and circulate; track and provide data concerning their algorithms; improve transparency regarding their advertising recommendations; give users more options concerning the parameters of ad targeting; and provide avenues for redress on content-moderation decisions. Penalties for violations under the DSA include fines totaling as much as 6 percent of annual revenue—roughly $11 billion for Google, for instance, or $7 billion for Meta, both of which lobbied hard against aspects of the act.
“Ever since the nineties, we’ve had the outlook of not regulating the tech platforms,” Christel Schaldemose, a Danish member of the European Parliament who was instrumental in getting the bill through, said during a recent discussion hosted by Columbia University and the German Marshall Fund. “With the DSA, we are taking back control and making rules for the tech industry.” Dominique Cardon, a sociology professor at Sciences Po, praised the DSA’s design “because it emphasizes process over outcomes.” The act has also been cheered by researchers, who are particularly enthusiastic about gaining automated access to nonpersonal and anonymized data. (Algorithm Watch has published a helpful guide to the DSA’s components.)
Implementation and enforcement of the act have not been finalized. During the year ahead, layers of EU officials and national agencies will work out several open questions, including what data the platforms must provide to regulators and researchers; how the platforms will carry out risk assessments; who will audit the platforms; and, since the platforms will be paying the auditors, how to make sure the auditors are not captured by the companies they assess—a potential conflict of interest that some analysts have flagged. Regulators will also need to settle evidentiary requirements and thresholds for launching investigations, and a new generation of auditors and researchers will need to be trained.
A much-discussed media exemption, which would have safeguarded content from news outlets against moderation by the platforms, was not included in the DSA. Some media organizations had argued that, since journalism is already covered by libel and other laws, there was no reason for them to be covered by the DSA. Others worried that a media exemption would open the door for more disinformation emanating from outlets like RT and Sputnik.
Differences between EU member states represent another significant challenge to finalizing an enforcement approach, noted Mathias Vermeulen, director of policy for AWO, a consultancy that researches technology policy. European countries had decades of data protection experience but still found it difficult to enforce the General Data Protection Regulation (GDPR), Vermeulen said, adding that it will be “challenging to develop a common language for different enforcement authorities with different backgrounds.”
For now, regulators around Europe are staffing up. The EU is hiring seventy staff members to implement and enforce the DSA. Each member country will decide how many staffers to hire; Arcom, France’s audiovisual regulator, is expected to hire between twenty and thirty staff members and is looking for candidates with data skills and the ability to work across boundaries. “DSA creates a network of regulators,” Benoît Loutrel, a member of Arcom’s board, said during the recent Columbia panel. “We need a mix with the commission at the center and regulators in member states as well.”
Julia Trehu from the German Marshall Fund contributed to this article. Isaiah Glick assisted with research.

Dr. Anya Schiffrin is the director of the Technology Media and Communications specialization at Columbia University’s School of International and Public Affairs.