7 June 2021 was the implementation deadline for the Copyright in the Digital Single Market Directive (EU) 2019/790 (the Copyright Directive), yet just four EU Member States (including Germany and the Netherlands) have fully transposed it, whilst four others (including France and Denmark) have transposed only parts of it. The delay in implementation is perhaps unsurprising given the controversial nature of certain of the Copyright Directive’s provisions, in particular Article 17.
Recent developments have started to add colour to how Article 17 may work in practice, and how it might align with the broader regulation of platform liability for infringing content. This blog post will discuss these developments and analyse the implications for platforms and rights holders.
Article 17 requires online content-sharing service providers (OCSSPs) to obtain authorisation from the relevant rights holders prior to making copyright-protected content available on the platform. Article 17 further provides that OCSSPs will be liable for any content shared without prior authorisation, unless the OCSSP has made best efforts to (i) obtain an authorisation, (ii) block unauthorised content and ensure that it remains blocked, and (iii) promptly block or remove unauthorised content once notified.
The European Commission (the Commission) published Article 17 guidance three days prior to the Copyright Directive implementation date. The guidance sets out how Member States can correctly transpose Article 17, and provides direction on protecting fundamental rights and transparency, amongst other elements.
The positions of the Member States are emerging as they implement the Copyright Directive: in general terms, national implementation ranges from pro-user positions (which Germany is leaning towards) to pro-rights holder positions (as seen in the French and Italian implementing laws).
In addition, developments in two recent cases at the Court of Justice of the European Union (the CJEU), discussed in more detail below, shine a light both on Article 17 specifically and on the broader context of platform liability for infringing content outside of the Article 17 regime.
Polish Challenge to Article 17
In CJEU case C-401/19, the Polish government sought the annulment of certain aspects of Article 17, namely the best efforts obligation to ensure that infringing content remains blocked. The government claimed that this obligation strongly incentivises OCSSPs to use overly strict and automated content filters (in order to prevent unauthorised material from being re-uploaded and thereby minimise the OCSSP’s liability risk), and that it therefore risks violating users’ rights to freedom of expression by necessitating excessive restrictions.
On 15 July 2021, Advocate General Henrik Saugmandsgaard Øe (the AG) released his much-anticipated opinion on the Polish government’s challenge (the Opinion), rejecting Poland’s claim. The Opinion acknowledges that: (i) Article 17 may, in effect, oblige OCSSPs to implement automated filtering tools (in order to adequately block unauthorised content) and (ii) if such tools are required, then Article 17 would constitute a limitation on the right to freedom of expression. However, the Opinion found that Article 17 and the wider Copyright Directive contain sufficient legal safeguards to minimise the risk to individual rights, including the prohibition against a general monitoring obligation on OCSSPs, the protection of users’ rights to legitimate use of protected content, and the complaints and redress mechanism requirements. Therefore, the Opinion concluded, Poland’s challenge should be rejected.
The Opinion interprets Article 17 as requiring national law to oblige OCSSPs to:
- Respect exemptions from copyright protection (e.g., quotation, criticism, review, or parody) at the point of implementing blocking technologies rather than only after receiving a user complaint;
- Limit use of blocking technologies to manifestly illegal content (i.e., where content is “identical” or “equivalent” to the content provided by rights holders); and
- Abstain from implementing any blocking/preventive measures in ambiguous situations in which exceptions to copyright are reasonably conceivable (as those uses may well be lawful).

The Opinion noted that, in equivocal situations, the content must be presumed to be lawful and its uploading should not be hindered.
The CJEU will rule on this case later this year. Whilst the Opinion is not binding on the CJEU, the CJEU does not often deviate from an AG’s opinion. The Commission has acknowledged that it may need to revise its recently published Article 17 guidance to reflect the CJEU’s judgment in this case.
Notwithstanding the considerable uncertainty around the practical implications of Article 17, the direction of travel indicated by the Opinion will give some certainty to OCSSPs seeking to understand their content liability risk and implement compliant filtering tools, and also to rights holders considering how best to protect and extract value from their works online.
OCSSP Liability for Infringing Content
On 22 June 2021, the CJEU handed down its judgment on the request for a preliminary ruling made by the German Federal Court of Justice in relation to two proceedings concerning unauthorised uploading of copyrighted content to YouTube and to Uploaded (a file-hosting and sharing platform) (the YouTube/Cyando Case). Whilst the CJEU’s judgment relates to the Copyright Directive’s predecessor, the Information Society Directive 2001/29/EC (the InfoSoc Directive), and to the Electronic Commerce Directive 2000/31/EC (the e-Commerce Directive), the decision nonetheless highlights the CJEU’s position on platform liability for content more broadly, and its approach to balancing protection of rights holders and their content against platform users’ fundamental rights.
In line with the Opinion, the CJEU held that platform providers do not make a relevant “communication to the public” of copyright-infringing content shared by users on the platform (resulting in direct liability), unless the provider deliberately intervenes in the content sharing beyond merely making the platform available. The CJEU stated that a platform provider’s intervention in the communication may tip such communication over into an “act of communication” if, for example, the provider:
- Had specific knowledge of the illegal sharing on its platform and failed to act expeditiously to delete or block it;
- Had implied/general knowledge that content was made available illegally and failed to put in place technical measures (expected from a reasonably diligent operator) to credibly and effectively counter that copyright infringement; or
- Knowingly promoted or facilitated illegal sharing, including providing tools specifically intended for illegal sharing of content or encouraging users to illegally share content with the public on the platform.
In relation to the hosting liability exemption / safe harbour under the e-Commerce Directive (which, broadly, exempts platforms from liability for infringing content if they are merely hosting that content), the CJEU specified that a provider’s knowledge, in a general and abstract sense, that its platform is being used to illegally share protected content is not sufficient to remove the platform’s safe harbour. The CJEU further indicated that for a provider to be precluded from relying on the e-Commerce Directive safe harbour, it must have knowledge of, or control over, the specific illegal acts committed by its users relating to the protected content. In this case, the CJEU found that indexing content uploaded to the platform, providing a search function, or providing recommended content based on user profiles and preferences was insufficient to conclude that the platform had specific knowledge of, or control over, illegal activities.
The CJEU’s decision will have direct repercussions for websites, services, and platforms that host content but fall outside the scope of OCSSPs regulated by Article 17. The CJEU’s approach to liability for infringing content is also important to the ongoing development of the EU Digital Services Act (for more information, see Latham’s blog post Proposal for the Digital Services Act).
Further, the decision has relevance for UK platform providers. While the UK courts are not obliged to follow the decision and will not directly implement the Copyright Directive, the principles under the InfoSoc Directive and the e-Commerce Directive are embedded in UK law, and the UK courts may well follow the CJEU’s approach on these issues (at least in the short and medium terms).
As with the Opinion, the CJEU’s judgment in this case places a strong focus on fundamental rights, specifying that the InfoSoc Directive requires a fair balance between the interests of copyright holders and the rights of users (in particular the right to freedom of expression and information), and placing particular weight on freedom of expression in that balancing exercise. In a postscript to the Opinion, the AG acknowledged the YouTube/Cyando decision (which was delivered after the Opinion was drafted), and stated that the CJEU’s reasoning in that decision did not call into question the considerations set out in the Opinion.
How the EU national courts and the UK courts will apply these CJEU developments to the question of platform liability for content in practice remains to be seen. Nevertheless, the developments provide helpful guidance to platforms seeking to understand the extent of their potential liability for infringing content, in light of the particular operation and functionality of their services.
The developments set out above add colour to the emerging landscape of OCSSP obligations and liability for infringing content under Article 17 of the Copyright Directive, and to the broader platform liability regime under the InfoSoc Directive. There is still some way to go, however, before a more complete picture emerges of the practical implications for platform providers and rights holders. The majority of Member States have not yet implemented the Copyright Directive, and the CJEU has yet to deliver its decision in the Polish challenge to Article 17.
Given the divergence in the national laws implemented thus far, platforms and rights holders may face varying obligations and levels of protection in practice across the EU and the UK (e.g., in relation to use of, and thresholds for, automated upload filters, and the practical mechanics of content complaints processes). Platforms and rights holders should be aware of the associated risks and stay abreast of updates as their rights and obligations become clearer.
Latham & Watkins will continue to monitor and report on developments in this area.
 CJEU case C-401/19.
 YouTube/Cyando, CJEU joined cases C-682/18 and C-683/18.