NetzDG
The Network Enforcement Act (Netzwerkdurchsetzungsgesetz, NetzDG) is a law to improve the enforcement of the law in social networks; it came into force on 1 October 2017. Its aim is the prompt removal of hate crime and other objectively punishable content that is no longer covered by freedom of expression and therefore poses a great danger to peaceful coexistence in a democratic society. To this end, providers of social networks above a size defined in the Network Enforcement Act are held more strictly accountable: they are obliged to deal swiftly with complaints about illegal content and to report publicly on their handling of such complaints every six months. In addition, the Act requires the appointment of an authorised recipient for service in Germany and gives victims of violations of personal rights on the Internet a right to information on inventory (subscriber) data. The effectiveness of these measures is to be ensured by the threat of substantial fines.
Act
The Network Enforcement Act was passed by the German Bundestag on 30 June 2017 despite considerable resistance and came into force on 1 October 2017. The law is primarily intended to combat hate speech and fake news in social networks, but its catalogue of unlawful content refers to a total of 21 criminal offences, some of which go far beyond this purpose. The law also amended the Telemedia Act to the effect that anyone whose "absolutely protected rights" have been violated by illegal content within the meaning of § 1 (3) Network Enforcement Act is entitled to information. Social networks must delete obviously illegal content within 24 hours of receiving a complaint. Illegal content that is not obviously illegal must be deleted within 7 days, or the decision must be forwarded to a self-regulatory body recognised by the Federal Office of Justice. Systematic violations can be punished with fines of up to 50 million euros. The Federal Office of Justice, which is responsible for imposing fines under the law, has drawn up fining guidelines.
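The handling deadlines described above amount to a simple decision rule. The following sketch, for illustration only, models them in Python; the names (Assessment, removal_deadline) are hypothetical and do not correspond to any real API, and statutory edge cases such as extensions of the seven-day period are omitted.

    from datetime import datetime, timedelta
    from enum import Enum, auto
    from typing import Optional

    class Assessment(Enum):
        # Possible outcomes of a provider's internal review (illustrative categories only)
        MANIFESTLY_UNLAWFUL = auto()   # "obviously illegal" content
        UNLAWFUL = auto()              # illegal, but not obviously so
        LAWFUL = auto()                # no removal obligation

    def removal_deadline(complaint_received: datetime,
                         assessment: Assessment) -> Optional[datetime]:
        # Obviously illegal content: remove or block within 24 hours of the complaint.
        if assessment is Assessment.MANIFESTLY_UNLAWFUL:
            return complaint_received + timedelta(hours=24)
        # Other illegal content: within 7 days, unless the decision is referred
        # to a recognised self-regulation body (e.g. the FSM).
        if assessment is Assessment.UNLAWFUL:
            return complaint_received + timedelta(days=7)
        # Lawful content: no statutory deadline applies.
        return None

    # Example: a complaint about obviously illegal content received at noon on
    # 1 March 2021 would have to be acted on by noon on 2 March 2021.
    print(removal_deadline(datetime(2021, 3, 1, 12, 0), Assessment.MANIFESTLY_UNLAWFUL))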
Self-Regulation
The Network Enforcement Act allows social networks to transfer the decision on the illegality of content to a state-recognised institution of regulated self-regulation (§ 3 (2) no. 3b, (6) Network Enforcement Act). In the field of youth media protection, such self-regulatory bodies have been established and recognised for many years, and the Network Enforcement Act provides for a similar system. Providers of social networks may, in more difficult cases, consult an external body that decides on the illegality of the reported content. If providers use this option, they are bound by the decisions of the self-regulatory body and must then take the appropriate measures. However, this referral is only possible to recognised institutions of regulated self-regulation. The conditions under which the supervisory authority, in this case the Federal Office of Justice, recognises such an institution are laid down in § 3 (6) Network Enforcement Act. Among other things, the independence of the examiners, transparent rules of procedure, adequate resources for the examiners and a speedy examination within seven days must be guaranteed. Since 13 January 2020, the FSM (Freiwillige Selbstkontrolle Multimedia-Diensteanbieter) has been recognised as a self-regulatory body under the Network Enforcement Act. Providers who wish to commission the FSM as a self-regulatory body (FSM statutes § 5 paras. 1, 3) must be full members of the FSM. The FSM board decides whether such a company's membership can be extended to include the tasks under the Network Enforcement Act (FSM statutes § 8 paras. 1a, 2).
Review Panel
As an institution of regulated self-regulation under the Network Enforcement Act, the FSM takes its decisions through an external expert committee, the Review Panel (FSM statutes § 13a). The Review Panel currently consists of 50 lawyers. A schedule of responsibilities specifies which examiners are called upon for a given decision. Decisions are generally taken by written circulation, after consultation in writing or by telephone. Each three-member committee appoints a chairperson from among its members, who issues the decision together with a statement of reasons. The decisions are published in anonymised form on the FSM website.