FSM Position on the EU Assessment
Measures to further improve the effectiveness of the fight against illegal content online
It is of utmost importance to distinguish between different types of illegal content. Trademark and copyright infringements continue to be an economic and commercial challenge in various fields, and related actions of individuals may constitute criminal offences. However, their motivation, dissemination context and societal effects differ immensely from actions related to child sexual abuse material and terrorism. The issue of illegal hate speech online is different again, since the assessment of such content always requires careful evaluation of its context, and the legal frameworks of the EU Member States often diverge considerably. Any legislative act dealing with questions of responsibility, removal obligations, reporting duties or other compulsory actions must carefully take into account that tools which have proven effective and appropriate in fighting, for example, child sexual abuse material may have negative side effects when applied to hate speech.
In recent years, the FSM has been very active in the latter fields. Determining the illegality of an image or video depicting the sexual abuse of a minor almost never requires consideration of context. Links or reports regarding such content can easily be shared within INHOPE, the international network of hotlines, to be dealt with in cooperation with national authorities, and since there is broad consensus on the criteria for determining baseline child abuse material, such content can be removed expeditiously and without friction. Because context is not an issue here, technology can be used to support the automatic detection of such material. Cooperation between hotlines and industry is fruitful and innovative, and takedown times continue to decrease.
The situation is completely different when we look at illegal hate speech. In many cases, there is no universal understanding of the criteria that make a piece of content or a comment illegal. A short paragraph posted by a social network's user may be a criminal offence in Germany but may be just barely acceptable in other European countries. In addition, the assessment of such content always requires careful consideration of its context. This is why such content is very often not clearly illegal, and its deletion might harm the fundamental rights of the author. This is also why strict timeframes will almost certainly encourage platforms to delete content when in doubt, rather than invest time and resources in finding the best possible solution.
Acquiring the full picture of an issue like illegal hate speech online continues to be a challenge. As the monitoring exercises the Commission has undertaken with the help of several NGOs and other organisations in the context of the Code of Conduct on Countering Illegal Hate Speech Online have shown once again, the size of the issue is hard to determine. Even with combined efforts, the participating organisations managed to find and flag only a rather small number of objectionable items, clearly disproportionate to the huge volume of user contributions in general. We do not know whether the encouraging removal times are representative of what the platforms achieve in general, nor do we know how much illegal content is uploaded, reported and taken down outside of our naturally very limited view.
We therefore welcome the objective to continue monitoring not only the efforts of the platforms but also the different legal frameworks across the EU and the societal issue of hate speech in general. Any legislative approach must be evidence-based and proportionate. As outlined above, the completely different nature of the several kinds of illegal content requires careful attention in order to prevent unintended negative side effects.
Voluntary actions by companies, for example within self-regulatory frameworks, have proved successful in the past, even if they require patience and strong dedication. Such actions should be encouraged in the long run, and industry-wide (sector-wide) commitments should be fostered.
We agree that it is important for public authorities as well as hotlines and other trusted flaggers to be able to contact responsible staff at internet platforms easily in order to make them aware of illegal content, thus enabling the platforms to take all the actions they are obliged to under the E-Commerce Directive. Platforms should also be encouraged to improve the ways in which the police in particular can contact them and court orders can be delivered.
We are, however, concerned that the concept of specific time frames for the removal of content will almost certainly be detrimental to certain fundamental rights of EU citizens (and indeed of companies that have established their place of business in one of the Member States). The obligations under the E-Commerce Directive are already strict yet proportionate, allowing platform providers to assess content carefully when its legality is unclear while demanding the swiftest reaction in urgent (and clear) cases.