Article 13: should you worry?

The copyright directive, proposed by the European Commission on September 14, 2016, was approved in a modified and final version by the European Parliament on March 26, 2019. We would like to review the fears aroused by its "article 13" (now article 17) since the issuance of the proposal more than two and a half years ago.
 
Like any directive, this text will have to be transposed into national law by each of the 28 (or 27?) EU Member States and will not be applicable as such (as opposed to regulations, such as the GDPR, which did not have to be transposed). The directive only sets goals to be reached and lets the Member States decide on the means to reach them (for example, whether or not to impose automatic filtering systems).
 
Throughout the negotiation process, this directive raised many concerns, and we wish to clarify each of the fears and questions that spread across the web, in particular about its "article 13", which became article 17 in the final version but will forever remain "Article 13"…

What’s that Article 13?

Article 13 consists of:

  • Ending the "safe harbor" protection of host providers, which case law had extended to platforms: host providers' liability was shielded as long as they promptly withdrew litigious content; Article 13 replaces this regime with an a priori liability;
  • Paying the authors through license agreements, which the beneficiaries are not obliged to enter into (point 1 of the article);
  • When the beneficiaries do not want to enter into a global license agreement, compelling the platforms to cooperate and thus to withdraw the litigious content (point 4);
  • When no authorization was granted by the beneficiaries, making the host providers liable for litigious content unless they show that they did everything they could to obtain an authorization, made their best efforts to make the copyrighted content unavailable, and promptly withdrew or disabled access to the content after receiving a notification from the beneficiaries. A principle of proportionality will determine whether the host providers complied with their duty to cooperate, taking into account notably the number of visitors, the size of the platform, the type of uploaded material, and the efficiency of the available means and their cost for the host provider;
  • Allowing users to challenge the withdrawal of their content through an internal appeals system (point 8);
  • Compelling platforms to install automatic filtering systems, except for platforms that are less than 3 years old and have an annual turnover of less than 10 million euros. When such platforms gather more than 5 million users, they must also show that they made their best efforts to prevent the uploading of copyrighted content for which the authors had made specific requests (point 4aa), as sketched below.
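
To make the layered thresholds above easier to read, here is a minimal, purely illustrative sketch of the small-platform carve-out as summarized in this article. The function and field names are our own and carry no legal meaning; the numbers simply restate the thresholds listed above.

```python
# Illustrative sketch only: a hypothetical check of the small-platform
# exemption summarized above (age < 3 years, turnover < 10 million euros,
# extra best-efforts duty above 5 million users). All names are invented
# for the example and do not come from the directive itself.

from dataclasses import dataclass

@dataclass
class Platform:
    age_in_years: float          # time since the service was made available
    annual_turnover_eur: float   # annual turnover in euros
    monthly_users: int           # audience size

def filtering_obligations(p: Platform) -> str:
    """Return a rough description of the obligations the platform would face."""
    if p.age_in_years < 3 and p.annual_turnover_eur < 10_000_000:
        if p.monthly_users > 5_000_000:
            return ("Exempt from general filtering, but must show best efforts "
                    "to prevent uploads of specifically notified works")
        return "Exempt from general filtering; must act on notifications"
    return "Must show best efforts to keep notified copyrighted works unavailable"

# Example: a two-year-old platform with 6 million users and 4M EUR turnover
print(filtering_obligations(Platform(2, 4_000_000, 6_000_000)))
```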

Is it going to make SMEs' lives more complex, as the GDPR did?

The European Parliament reduced the burden that the Commission had initially put on SMEs' shoulders. They will not have to install automatic filtering systems, even when license agreements are negotiated with the beneficiaries (point 7). However, the smallest platforms will have no choice but to sign such license agreements.

Will I be taking risks when showing trademarks in my videos?

No, nothing changes for trademarks: the directive only affects copyright, not trademark law. It remains possible, without incurring liability, to show, intentionally or not, an item bearing a protected trademark in uploaded content. This is, for example, the case of this famous spirits trademark in a video of Liza Koshy. The trademark is not used "in the course of trade", so showing it does not infringe the attached rights.

What about copyrighted works? Is it going to be forbidden to show them, even in a corner of the screen and for less than one second?

Even though the European Commission had not included it in its proposal, the European Parliament added to the adopted version of the directive that the copyright exceptions would prevent the directive from applying to the uses they cover. Consequently, it will still be possible, as it is now, to:

  • Reproduce a short extract of a copyrighted work to illustrate a broader point (quoting a sentence from a novel, showing an excerpt of a theater show, using a few seconds of a song…), as done for example by the youtuber Mark Edward Fischbach in this video showing a few seconds of The Simpsons;
  • Include a protected work as part of a larger whole, such as the Louvre Pyramid by Ieoh Ming Pei (an architectural work still under protection) in a video about Paris, or a Walt Disney character on the wall of a youtuber's bedroom, as long as these works are not the principal subject of the video (the "incidental inclusion" or "panorama" exceptions);
  • Parody a work to make fun of it, use it to make fun of something else, or produce humorous content, as does this video combining an excerpt from the movie "Downfall" with a speech of Barack Obama before Congress;
  • And, of course, freely use works belonging to the public domain.

These exceptions depend on national legislation and might differ from one Member State to another.

I heard that article 13 means the return of censorship…

As long as the copyright exceptions are preserved, the directive will not have the effect of "censoring" uploaders. Monitoring systems for copyrighted works might become more efficient and give rise to more automatic withdrawals, but these withdrawals will be justified by copyright infringements. Copyright is, admittedly, a limitation on freedom of speech, since it allows, for example, the prohibition of uploading a copyrighted movie without the authorization of its beneficiaries. But it is not accurate to call this "censorship", because this limitation on freedom of speech is grounded in the law and in fundamental and constitutional rights in all EU Member States, notably in Article 10 of the European Convention on Human Rights. The word "censorship" implies an arbitrary interference, which makes it inaccurate, unless the fears aroused by "Content ID" prove to be justified (see the last question below).

Is it true that memes will be banned?

Memes are parodic images that may include copyrighted material, such as this 2016 parody of Game of Thrones about the transfer of power from Barack Obama to Donald Trump:

These pictures use a copyrighted work (Game of Thrones) and parody it by adding the presidents' faces. On the face of it, this meme infringes the moral rights of the authors (no mention of the author's name, unauthorized alteration of the work) as well as their economic rights (unauthorized reproduction of the work, no remuneration of the authors). However, this meme is not an infringement, because the copyright exceptions protect it from a withdrawal request: the image is a parody, used to make fun of Donald Trump by depicting him as a terrifying and pitiless ruler.

Memes are not jeopardized by the directive, de jure.

But if the exceptions still apply, is anything going to change?

First, since the copyright exceptions are included in the directive, host providers will not face the overwhelming task of withdrawing every existing piece of content that uses a protected work: content covered by these exceptions can stay online.

What will change, above all, is the way the beneficiaries are paid. They will be paid, through license agreements or on a case-by-case basis, by both the host provider and the uploader, whereas until now only the latter paid the authors for the works used in the uploaded content, out of the share of advertising revenues that the host provider paid to uploaders. This is one of the reasons why host providers lobbied against article 13.

Second, it is true that the changes might not be visible to all users. Host providers have, in recent years and notably in France, already set up filtering systems to prevent the upload of protected content, for example through keyword monitoring but also through robots. They also withdraw content notified as infringing, outside of any license agreement. Finally, they have already started creating appeal procedures against withdrawals. For example, YouTube (owned by Google) offers an online copyright takedown notice procedure as well as a procedure to appeal a withdrawal. YouTube has also set up a form to dispute a "Content ID" claim, and even a way to appeal the confirmation of a withdrawal after the Content ID dispute (same link). In other words, YouTube has anticipated the implementation of article 13.

Experienced users may already be circumventing the automatic filtering systems when uploading their content: the technological battle is not fought by the bulk of users, and, as always, it is way ahead of the legal norms.

How does this automatic filtering robot “Content ID” work?

This robot is able to detect that a YouTube user is trying to upload a movie, or part of a movie, for which the beneficiaries have requested protection. Thus, if a YouTube user tries to upload the latest Warner or Universal release to his or her channel and these companies have made a Content ID request, the upload is automatically blocked. The same happens if the uploader only includes an excerpt of the movie in the video; the simplified sketch below illustrates the principle.
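
For readers curious about the mechanics, here is a purely conceptual sketch of how a filter of this kind might compare an upload against a registry of claimed works. It is not YouTube's actual system, and every name in it is invented for the illustration.

```python
# Conceptual sketch only: matching uploads against a registry of claimed works.
# NOT YouTube's actual Content ID implementation; all names are hypothetical.

from typing import Set

def fingerprint(media_chunks: list[bytes]) -> Set[int]:
    """Reduce a media file to a set of hashes of short segments."""
    return {hash(chunk) for chunk in media_chunks}

class ClaimRegistry:
    """Stores fingerprints of works for which beneficiaries requested protection."""
    def __init__(self) -> None:
        self._claims: dict[str, Set[int]] = {}

    def register(self, work_title: str, media_chunks: list[bytes]) -> None:
        self._claims[work_title] = fingerprint(media_chunks)

    def match(self, media_chunks: list[bytes], threshold: float = 0.1) -> list[str]:
        """Return claimed works sharing at least `threshold` of the upload's segments."""
        upload_fp = fingerprint(media_chunks)
        return [
            title for title, claimed_fp in self._claims.items()
            if len(upload_fp & claimed_fp) >= threshold * max(len(upload_fp), 1)
        ]

# Example: an upload containing an excerpt of a claimed movie is flagged,
# because even a partial overlap exceeds the matching threshold.
registry = ClaimRegistry()
registry.register("Claimed Movie", [b"scene1", b"scene2", b"scene3", b"scene4"])
upload = [b"intro", b"scene2", b"outro"]   # includes one excerpt of the movie
print(registry.match(upload))              # ['Claimed Movie'] -> upload blocked or claimed
```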

The problem is the coexistence of this automatic filtering system with the copyright exceptions (parody and quotation). Indeed, the beneficiaries can set a default authorized duration for excerpts. This is precisely why appeal procedures against Content ID claims were set up. But since these procedures are most of the time resolved in favor of the beneficiaries, the uploader has no option other than going to court to obtain a decision upholding his or her freedom of speech.

It is of course impossible for this robot to monitor all musical and audiovisual copyrighted works (movies, TV shows, video clips, etc.); the automatic filtering will therefore only apply to a small selection of copyrighted works. The host providers' major fear is that automatic filtering could be imposed, within license agreements, as a performance obligation (an obligation of result) rather than as a best-efforts obligation: in the first case, they would be liable whenever the robots fail to detect the unauthorized use of a copyrighted work; in the second, they would not be liable as long as they show that they made their best efforts to enforce the automatic filtering with their technological tools and human resources.

 

Gaëlle Loinger-Benamran
Partner

European Trademark and Design Attorney

and

Jérémie Leroy-Ringuet
Attorney at Law