from the sounds-like-it dept
I’ve been so focused of late on issues related to Section 230 in the US that I’ve had barely any time to devote to the Digital Services Act, the EU’s ongoing effort to rewrite its intermediary liability law. The reports I have followed have been a mix of concerns, along with the acknowledgment that EU politicians at least appeared to be trying to get a good grasp on the issues and trade-offs, rather than rushing in with a totally horrible plan. That doesn’t mean the end result will be good, but so far it does not appear to be totally disconnected from reality, unlike many similar proposals we’ve seen around the globe.
Joan Barata has a good report looking at the current state of intermediary liability in the latest DSA proposal and notes that it’s… kind of a mess. Basically, as is often the case with intermediary liability laws, very few policymakers have experience with the actual issues, and thus they can’t take into account how various provisions will actually work in practice. Frequently that means proposals are worded vaguely, and no one will really know what they mean until after a series of lengthy, confusing, and expensive court decisions.
As Barata notes, the DSA appears to retain the basic liability protections that have existed in the EU for the last two decades in the form of the E-commerce Directive (which is weaker than Section 230 in the US, but roughly equivalent in saying that websites should not be held liable for third-party content). The big difference from Section 230 is that, under the E-commerce Directive, websites do need to remove content “upon obtaining actual knowledge or awareness” of “illegal activities.” Of course, what exactly counts as “obtaining actual knowledge or awareness” has been a tricky question at times, and has led to some lawsuits.
The DSA, though, moves the liability framework further away from Section 230 and closer to a DMCA-style “safe harbor” regime, by establishing that knowledge can be obtained through “notices”:
Apart from the provisions included in Article 5, Article 14(3) establishes that notice and action mechanisms fulfilling certain criteria give rise to actual knowledge or awareness.
The DSA tries to avoid the classic “moderator’s dilemma” issue by saying that even though knowledge or awareness could make you liable, you don’t necessarily lose your liability protections if you carry out your own investigations:
Article 6 clarifies that intermediaries may not lose their liability protections “solely because they carry out voluntary own initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation”.
But what does that mean in practice? You can lose the protections if you know about illegal stuff on your website, but you don’t lose them if you are doing your own investigation. But at what point does finding stuff during your own investigation magically morph into knowledge or awareness? Well, the answer seems… contradictory. Basically, it sounds like the current draft of the DSA hangs an awful lot of weight on the word “solely.” If illegal content is discovered “solely” via an investigation, then a site might retain its protections — but if that same content is discovered in any other way, then the site may have knowledge and face liability. This… is going to be confusing.
And, again, this tends to be the problem with all of these proposals. Policymakers want to encourage sites to moderate and clean up bad stuff, but by mandating liability for having knowledge of bad stuff, they incentivize less searching. And now you’re left in a weird, twisted position, where the rule effectively says “okay, you can search, and if you find stuff that way you’re not liable for it… unless you also have separate knowledge of it.” And how do you manage that?
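To make the oddity concrete, here is a purely hypothetical sketch (in Python, with invented names like KnowledgeSource and loses_protection, none of which come from the DSA text) of the kind of bookkeeping a site might end up doing under a literal reading of that “solely” language. It simply illustrates how the same piece of content can flip a site’s liability status depending on how the site learned about it.

```python
from enum import Enum, auto

class KnowledgeSource(Enum):
    # Hypothetical ways a platform might learn about a piece of content
    VOLUNTARY_INVESTIGATION = auto()   # the site's own proactive moderation
    NOTICE = auto()                    # a notice under the DSA's notice-and-action rules
    OTHER = auto()                     # press reports, user complaints, etc.

def loses_protection(sources: set[KnowledgeSource]) -> bool:
    """A literal reading of the 'solely' language: protections survive only if
    every way the site learned of the content was its own voluntary investigation."""
    if not sources:
        return False  # no knowledge at all, so no knowledge-based liability question
    return any(s is not KnowledgeSource.VOLUNTARY_INVESTIGATION for s in sources)

# The same illegal post, discovered two different ways:
print(loses_protection({KnowledgeSource.VOLUNTARY_INVESTIGATION}))   # False: found "solely" in-house
print(loses_protection({KnowledgeSource.VOLUNTARY_INVESTIGATION,
                        KnowledgeSource.NOTICE}))                     # True: a notice also arrived
```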
This is actually one of the many reasons why, despite all the criticism it receives, Section 230 gets the balance right. It gives websites a free hand in moderating content, while leaving plenty of room for other forces to pressure companies into being good actors in the space. Public pressure, user concerns, media coverage, and advertiser pressure all serve to push websites to improve their moderation practices. Too many people, however, think that without a law mandating such things, nothing happens. That’s wrong. We’re seeing every website continually work to create better policies not because of the risk of some confusing and potentially very costly law, but because they don’t want their site to be a dumpster fire of awful.
Filed Under: confusion, digital services act, dsa, e-commerce directive, eu, intermediary liability, knowledge, monitoring, notices, section 230