The third day of Copyright Week 2020 is focusing on the topic of remedies – the compensation that those found guilty of infringement are expected to pay to ‘make things right’. In the United States, for example, these sums can be very high, with the intention of having a dissuasive effect, although final amounts are often the result of out-of-court settlements.
However, in Europe, through the Copyright Directive, the goal appears to be to prevent copyright-infringing content from being made available in the first place, and to remove it permanently from circulation, at least on internet platforms. This is the logic behind Article 17, which looks to increase pressure on platforms (primarily YouTube) to do more to stop videos containing copyrighted content from getting online.
The Article contains various flaws, not least the fact that, despite the original goal being to strengthen the position of rightholders in their negotiations with YouTube, it covers all types of work and all but the smallest and newest platforms. It also includes a fundamental contradiction between the obligation to make, ‘in accordance with high industry standards of professional diligence, best efforts to ensure the unavailability of specific works and other subject matter’ (Article 17(4)(b)) – almost universally assumed to mean the use of upload filters – and the prohibition of ‘general monitoring obligations’ (Article 17(8)). Added to this are the demands in Articles 17(7) and 17(9) to protect uses under exceptions and limitations.
In order to try to resolve some of these issues, Article 17(10) of the Directive calls for a stakeholder dialogue to provide guidance on the Article’s application. Five meetings in, it remains hard to see what sort of document will emerge. If nothing else, this can hopefully serve to underline the need to resist any similar legislation elsewhere.
The following sets out a number of key points so far, based on IFLA’s engagement in the discussions alongside partner organisations such as Communia:
There’s a difference between detecting content and detecting infringement: the dialogue has heard a number of presentations from companies selling software which aims to detect copyrighted content as it is uploaded to internet platforms.
While there is strong variation in the claims made (from being able to detect a film in a fraction of a second to needing somewhat longer, for example), it has been consistently clear that such tools cannot determine whether an exception or limitation to copyright is at work.
Therefore, while some involved in the dialogues have, in effect, wrongly sought to deny the existence of exceptions and limitations, it seems clear that for a determination to be made as to whether the use of a work is infringing or not, filters are not enough – human moderation is needed.
There’s no consensus about how much of an answer collective management can provide: one of the options for internet platforms presented in the Directive is to obtain licences for uploaded content. Clearly, when it is working well, collective management has the potential to provide at least a partial answer, making it easier for platforms to clear at least some rights when needed.
At the same time, representatives of many rightholders underlined their discomfort with an excessive focus on collective licensing as a solution. In many countries, collecting societies do not cover every sector or type of right, and there may be questions about governance and representativeness, especially given the diversity of content uploaded onto platforms. Attitudes can also differ significantly between sectors and between types of right.
Finally, there was strong doubt expressed – including by the Finnish government representative – that extended collective licensing could work across borders.
There is a lack of data about what is really going on: the Directive underlines the need for transparency about the operation of platforms in order to inform the Dialogue.
As a result, there have been calls for figures to be shared on the volume of uploads containing copyrighted content, what share is then determined to be infringing, what share is then taken down or monetised, and how often take-downs or decisions to monetise are challenged, successfully or otherwise. However, this information is rarely if ever forthcoming, meaning that it can be difficult to understand the scale of the challenge faced.
Size matters: a point that has repeatedly emerged during the discussions is the difference between bigger and smaller players, be they creators, platforms or uploaders. For creators, there is a fair amount of evidence that exposure is essential for newer entrants as a means of building a following. However, established creators understandably do not want to lose revenues.
For platforms, the larger ones are generally able to pay for – or create – their own tools (including armies of content moderators) for trying to determine infringements. For smaller ones, it remains unclear if meaningful technologies are available at an affordable price.
Finally, for uploaders, broadcasters underlined concern about their own content – for which, a priori, they have cleared all rights – being blocked. This raises questions about whether some uploaders might enjoy privileges, something that could prove controversial amongst others.
There is a risk of abuse: a telling intervention from Facebook in the 4th meeting underlined their concerns about mistaken claims of copyright, and the degree to which they only make filtering tools available for use by rightholders who can be trusted to use them fairly. It has also become clear that filtering companies themselves do not verify the legitimacy of copyright claims before providing the tools to enforce them.
Clearly this is a difficult area, as there are different ways of demonstrating ownership of copyright from one sector to another. As some rightholders underlined, this can slow down assessments, and so cause harm. At the same time, without any effort to show ownership, the system is wide open to abuse.
There is a need to remind everyone what’s at stake: finally, but importantly, a key issue remaining to be discussed is how to ensure that whatever system is chosen protects fundamental rights, including those safeguarded through exceptions and limitations such as those for quotation, criticism, satire, parody, and pastiche.
The risk, in a situation where there is no firm guidance, but rather decisions are left up to national authorities and courts, is that this aspect is forgotten amid all the efforts to tailor rules to the specific situation of the sector, type of right and country involved.
A key task for libraries and others, both in the discussions in Brussels and in national implementation, will be to ensure that this is not the case.