Europe’s Digital Services Act is threatening to establish major liabilities for online platforms. That has many people concerned.
Between the Canadian government's impending war on the open Internet and the US's war on Section 230, you might be tempted to believe that most of the bad Internet policies are coming out of North America. The thing is, this isn't happening in a vacuum by any means. Other places around the world are also moving forward with legislation that threatens people's fundamental freedoms, such as the right to free speech.
A big example of this is Europe's Digital Services Act. Initially, it had forward-thinking ideas that many were quite OK with. Then, somewhere along the line, things started changing, with MEPs (Members of the European Parliament) wanting to tack on additional liability for website owners and platforms. To give you a sense of the magnitude of the changes being called for, think of the upload filter laws, but worse.
Last week, the Electronic Frontier Foundation (EFF) raised the alarm about some of the proposals being pushed in the Digital Services Act. From the EFF:
The Digital Services Act is the mother of all platform laws. Unlike Article 17, the draft law is intended to regulate not only liability for copyright infringement on selected commercial platforms – but liability for all illegal activities of users on all types of hosting providers, from Facebook to non-commercial hobby discussion forums. Even if platforms block content on the basis of their general terms and conditions, the Digital Services Act is intended to define basic rules for this in order to strengthen users’ rights against arbitrary decisions. In view of the balanced draft by the EU Commission, it is all the more astonishing what drastic restrictions on fundamental rights are now becoming acceptable in the European Parliament. The following three proposals are among the most dangerous.
The European Parliament’s co-advisory Legal Affairs Committee, which has already adopted its position on the Digital Services Act, goes even further and wants to give the entertainment industry in particular a free pass to block uploads. Livestreams of sports or entertainment events are to be blocked within 30 minutes; sports associations had already lobbied for similar special regulations during the copyright reform. Such short deletion periods can only be achieved by automated filters – it is hardly possible for humans to check whether a blocking request is justified at all within such a short time.
The Legal Affairs Committee envisages that organizations in the entertainment industry can be recognized as so-called “trusted flaggers”, which should be able to independently obtain the immediate blocking of content on platforms and only have to account for which content was affected once a year. This regulation opens the door to abuse. Even platforms that are not yet forced to use upload filters under the copyright reform would then automatically implement the blocking requests of the “trusted flaggers,” who in turn would almost certainly resort to error-prone filtering systems to track down alleged copyright infringements.
On top of that, the Legal Affairs Committee’s position on redefining the exclusion of liability is absurd. Hosting providers should only be able to benefit from liability exclusion if they behave completely neutral towards the uploaded content, i.e. do not even intervene in the presentation of the content by using search functions or recommendation algorithms. If this position prevails, only pure web hosters would be covered by the liability safe harbor. All modern platforms would be directly liable for infringements by their users – including Wikipedia, GitHub or Dropbox, which were exempted from Article 17 during the copyright reform after loud protests from the Internet community. The Legal Affairs Committee’s proposal would simply make it impossible to operate online platforms in the EU.
Moreover, the ancillary copyright for press publishers, the prime example of patronage politics in copyright law, is once again playing a role in the debate about the Digital Services Act. The Legal Affairs Committee has been receptive to the press publishers’ demand for special treatment of press content on social media. The committee is demanding that major platforms like Facebook no longer be allowed to block the content of press publishers in the future – even if it contains obvious disinformation or violates terms and conditions.
The purpose of this regulation is clear – even if the Legal Affairs Committee claims that it serves to protect freedom of expression and media pluralism, this is yet another attempt to enforce the ancillary copyright. If the use of press articles by platforms is subject to a fee under the ancillary copyright, but at the same time the platforms are prohibited by the Digital Services Act from blocking press articles, then they have no choice but to display the articles and pay for them.
For publishers, this is a license to print money. In their crusade for the ancillary copyright, however, the publishers ensure that platforms can no longer counter disinformation as long as it only takes place in a press publication. That such a regulation is dangerous is not only revealed by the disclosures around fake news offers used for propaganda around autocratic regimes. A glance at the tabloid press is enough to understand that the publication of an article by a press publisher is no guarantee for quality, truth or even compliance with basic interpersonal rules of conduct.
To say that such regulations would create a dystopian era for the Internet is not an exaggeration. The very idea of pushing for requirements that offer only a 30-minute window for takedowns is insane. Already, the overwhelming conclusion from experts in many different fields is that 24-hour takedown windows lead to automated filtering and "shoot first, ask questions later" policies. Such windows are considered hair-trigger response times for a very good reason: if you want to account for nuances such as commentary, criticism, or other things Americans might consider "fair use", that much time is simply not sufficient. Tightening the window to 30 minutes all but guarantees that this reality comes to pass, regardless of whether the content is an upload or a livestream.
Furthermore, the concept of "trusted flaggers" isn't really going to solve anything. For one, large organizations have been known for years to issue takedown notices against completely legal content. This sort of thing goes back well over a decade, with a particularly famous example being the "dancing baby" video case of 2007. Little has changed since then on the front of copyright takedowns gone wrong. Legal sites like IMDb (the Internet Movie Database) have received no shortage of DMCA complaints over pages that do not host copyrighted material, thanks to poorly made bots that scan for infringing material and blindly fire off notices.
What's more, copyright complaints have long been used as a bludgeon for censorship. Criticism of newly released content is nothing new. That same critical content being ordered taken down under the guise of copyright infringement isn't anything new either. It's one of the many forms of copyright abuse known to exist today, and it's a highly effective form of censorship because the copyright systems in place today assume that the complaining party is correct and that the recipient of the complaint is automatically guilty. Such abuse has been permitted to proliferate because there are little to no consequences for anyone who engages in it.
The third point being raised is the concern surrounding the demand that platforms must host press content. This appears to be a follow-up to the link tax debate, where publishers have been pushing to freeload off of various platforms. In short, if a platform hosts a link to a publisher's material, the publisher gets free traffic and, thus, a benefit. What Big Publishing pushed for is that platforms must also pay for the privilege of sending publishers that free traffic. Basically, they are trying to double dip.
So, combine that with the floated requirement that platforms host said material in the first place, and you legally force platforms to become, as the EFF points out, money-printing machines for the Big Publishing organizations. You have to pay for the privilege of linking to press organizations while, at the same time, being legally obliged to link to that material in the first place. How this is even remotely constitutional is beyond us, but these are the laws being pushed today.
The EFF rightfully points out that if platforms are legally compelled to link to news organizations, this only encourages the proliferation of disinformation. If a news organization of any kind publishes an article making the false claim that the recent US election was stolen, that content could no longer be taken down even if there is a compelling case that it should be removed. The same goes for false information about vaccines and pretty much anything else for that matter.
This also opens platforms up to liability in countries that have laws forbidding the continued hosting of disinformation. If Country A has a law that blocks disinformation from being disseminated, with provisions stating that access must be removed globally, then a platform could receive a complaint ordering the content taken down. However, if the law in Country B states that any published press material must remain up, then that platform would be in violation of Country B's law. The question then becomes: which country's law do you violate, A or B? This really is a legal catch-22 where, no matter what decision you make, you break the law.
The Wikimedia Foundation is raising alarm bells over this proposal as well. It says that the law and the proposals surrounding it put its very operation at risk. From Wikimedia:
The Wikimedia Foundation, the nonprofit that operates Wikipedia, applauds European policymakers’ efforts to make content moderation more accountable and transparent. However, some of the DSA’s current provisions and proposed amendments also include requirements that could put Wikipedia’s collaborative and not-for-profit model at risk.
We are calling on European lawmakers to take a more nuanced approach to internet regulation. There is more to the internet than Big Tech platforms run by multinational corporations. We ask lawmakers to protect and support nonprofit, community-governed, public interest projects like Wikipedia as the DSA proceeds through the European Parliament and Council.
Here are four things policymakers should know before finalizing the DSA legislation:
1. The DSA needs to address the algorithmic systems and business models that drive the harms caused by illegal content.
2. Terms of service should be transparent and equitable, but regulators should not be overly-prescriptive in determining how they are created and enforced.
3. The process for identifying and removing “illegal content” must include user communities.
4. People cannot be replaced with algorithms when it comes to moderating content.
We urge policymakers to think about how new rules can help reshape our digital spaces so that collaborative platforms like ours are no longer the exception. Regulation should empower people to take control of their digital public spaces, instead of confining them to act as passive receivers of content moderation practices. We need policy and legal frameworks that enable and empower citizens to shape the internet’s future, rather than forcing platforms to exclude them further.
Our public interest community is here to engage with lawmakers to help design regulations that empower citizens to improve our online spaces together.
The Wikimedia Foundation isn't alone. eBay is also fighting against this proposed law. From eseller365:
eBay in Europe is asking sellers to help them fight new EU rules being considered under the Digital Services Act that could significantly impede small and medium-sized businesses (SMEs).
According to eBay, the proposed rule changes could mean the following for SMEs on eBay and other marketplaces:
- Risk of seeing marketplace seller accounts or listings blocked without reason or at the slightest involuntary mistake, as well as huge bottlenecks ahead of the publication of new listings.
- A more bureaucratic and slower marketplace registration process that may put seller privacy at risk.
- Less affordable and diverse online services, dominated by only the largest players.
It's unfortunate to see yet another attempt by lawmakers to crack down on digital rights on this front. It's clear that such rules are likely coming from those who think that a passing understanding of how Facebook works means they understand how the entire Internet works. As we pointed out in the past, there are billions of websites in existence today, with many different uses and ways of operating.
Such lawmaking is basically the equivalent of someone walking into a hardware store one day, seeing products sold at various prices, and concluding that they now know everything there is to know about supply chains, commerce, and economics without any other background knowledge. It doesn't work that way, yet this seems to be the common approach lawmakers are taking these days. They see some people post something bad on Facebook and suddenly feel they are experts on how the Internet works and, what's worse, are willing to dismiss expert commentary and experts in the field in order to pass laws.
Now we are in a situation that many long-standing observers are all too familiar with these days: fighting against a bad proposed law that threatens to unleash a huge amount of damage on the free and open Internet. As some have already learned in the past, you can't count on someone else putting a stop to these bad laws through some vague notion of "checks and balances" either. After all, Europeans had a very vivid example of this back in 2019 when lawmakers passed Article 13/17. Hopefully, the people of Europe will find an effective way to fight this latest round of bad lawmaking.
Drew Wilson on Twitter: @icecube85 and Facebook.