The FSM is involved in numerous projects and is cooperating in many ways with institutions that engage in online protection of minors. Likewise, members and employees of the FSM participate in numerous national committees and initiatives to discuss contemporary youth media protection and media literacy education.
MIRACLE - Machine-readable and Interoperable Age Classification Labels in Europe
Since February 2014, the FSM has been working with seven different partners under the leadership of the Hans-Bredow-Institut on a pilot project to create interoperable age labels. The project is co-financed by the EU Commission within the framework of the ICT Policy Support Programme. The project website can be found here.
With a scheduled duration of two and a half years, this project aims to make electronic data from various classification systems technically compatible – i.e. interoperable – and hence universally usable, thereby improving youth protection on an international level. This is needed because, for the most part, the systems in place for evaluating, approving and labelling media content for youth protection purposes vary greatly from one country to another. These evaluations tend to be country-specific, which severely limits their international relevance. The systems differ in terms of potential gradings (e.g. age-based peer groupings), and also in relation to the content classified and the classification information provided. In addition to information about age brackets, other information may be provided, such as content descriptors (e.g. whether the material contains sex or violence). Equally, these classifications are made available in a variety of ways – technical metadata, for example, or explanatory visual information. The amount of classified and/or age-rated content is constantly growing – content that is available not just in its country of origin but worldwide. The MIRACLE project aims to create a common data structure for this information, allowing youth protection programs, for example, to process it in a standardised manner. In short, the aim is not to create a new age rating system, but rather to make it possible to use the existing systems, and the rating knowledge and experience bound up in them, across all national borders.
The project partners come from five European countries. In addition to the FSM and the Hans Bredow Institute for Media Research, the other participating bodies are the BBFC (British Board of Film Classification, UK), NICAM (Nederlands Instituut voor de Classificatie van Audiovisuele Media, NL), PEGI (Pan European Game Information, BE), NCBI (Národní centrum bezpečnějšího internetu, CZ), JusProg (Verein zur Förderung des Kinder- und Jugendschutzes in den Telemedien e.V., DE) and Optenet (ES).
At the heart of the MIRACLE system is a common data model designed to be open and technologically neutral. In addition to age brackets, this model is also capable of recording content descriptors and other information. Existing classification information can then be mapped into the new model. Practical ways of using MIRACLE in differing national contexts will need to be developed and implemented. Wherever possible, the age-de labelling system already in use in Germany is to be retained or expanded, as should other national systems; technical compatibility will be ensured by the use of a common reference model and syntax. Ultimately, all the information from the various different systems will be mapped into a standardised data model, thus allowing the automatic processing of age classification information. It will, for example, be easier for filter program producers to use the information from a wide range of different systems once it is available in the MIRACLE data model.
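The mapping described above can be illustrated with a minimal sketch. This is purely hypothetical: the actual MIRACLE schema is defined by the project itself, and the class name, field names and rating strings below are illustrative assumptions, not the real data model.

```python
from dataclasses import dataclass, field

# Hypothetical common-model sketch: the real MIRACLE schema differs;
# all names and example ratings here are illustrative only.
@dataclass
class CommonLabel:
    min_age: int                     # lowest age bracket the content is suitable for
    descriptors: list = field(default_factory=list)  # e.g. "violence", "sex"
    source_system: str = ""          # e.g. "FSK", "BBFC", "PEGI"

# Illustrative mappings from national rating strings into the common model.
NATIONAL_TO_COMMON = {
    ("FSK", "ab 12"): CommonLabel(min_age=12, source_system="FSK"),
    ("BBFC", "15"): CommonLabel(min_age=15, source_system="BBFC"),
    ("PEGI", "16"): CommonLabel(min_age=16, descriptors=["violence"],
                                source_system="PEGI"),
}

def to_common(system: str, rating: str) -> CommonLabel:
    """Map a national rating into the shared reference model."""
    return NATIONAL_TO_COMMON[(system, rating)]

label = to_common("BBFC", "15")
print(label.min_age)  # 15
```

Once every national rating is expressed in one such structure, a filter program only needs to understand the common model rather than each national format, which is the interoperability gain the project describes.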
Making classification information interoperable will help people use the existing classification information and disseminate available classification data. Thanks to the convergence of electronic age labels initiated by MIRACLE, it will be simpler for providers of convergent end-devices (PCs, tablets, smartphones, smart TVs, etc.) to use age data coming from different systems and, to date, provided and transferred in different formats, for processing in youth protection instruments on their devices and platforms. In addition, proprietary systems belonging to providers can also be transferred into the data model. This means there will no longer be any need to adapt them to a new system, which in turn means secure investments and less risk of misspent money. Providers of cross-border services will have the option of making sure their offers correspond to particular national specifics and meet any potential legal requirements in multiple member states. Existing labels can be retained, since MIRACLE is not a new classification system, merely a way of making it possible for existing systems to talk to each other in a common language.
What would the web be without social media? In terms of youth protection, social networks and user-generated content present a major challenge. In the framework of ChiPSoM, a sub-group of the MIRACLE project, we are looking at current approaches used by providers, the requirements of users and parents, and possible ways of making the wide range of available protective measures technically interoperable. We aim to identify best practices and develop recommended guidelines for providers. More about ChiPSoM (pdf)
The consortium of the EU-cofinanced MIRACLE project plans to support providers wishing to integrate the MIRACLE data format into their system. This might, for example, take the form of developing application programming interfaces or providing application data in a compatible format. If you are interested in MIRACLE, please do not hesitate to contact us at any time at email@example.com.
MIRACLE facilitates international child protection
Video 2:48 min
The Internet hotline was set up in 2004 by the FSM and the Association of the Internet Industry (eco) as a way of ensuring a common complaints facility on the Internet. The aim of the Internet service available at www.internet-beschwerdestelle.de is to offer Internet users a contact point that can be used for complaints about content associated with a wide range of online services that is relevant for youth protection purposes.

Incoming complaints are first subjected to a thorough legal check. If the preliminary check shows that the reported content is in breach of the relevant youth media protection laws and/or criminal laws, the content provider is requested directly to modify it, or the hosting provider is asked to remove it. In serious cases, the content complained of can be passed on to the responsible state authority without involving the content publisher or hosting provider. Additionally, the “Guide” section offers a diverse range of information on the topic of youth protection on the Internet and hints on how to use the medium proficiently.

The Internet hotline has been available since 2005 as one of the “Handlungsversprechen” (promises to act) offered by the registered association “Deutschland sicher im Netz” (Keeping Germany safe online). Financial support for the activities of the Internet hotline is provided by the EU’s “Safer Internet Programme”. Under the “Saferinternet.de” umbrella, the Internet hotline is linked in a network with its partners Klicksafe, jugendschutz.net and the children’s and adolescents’ emergency phone line “Nummer gegen Kummer” (Number against distress). The European Union has for some years been supporting both Internet hotlines and projects for encouraging media skills in individual European countries.
INHOPE is the worldwide network of Internet hotlines for combating illegal content on the Internet, particularly depictions of child abuse, and now includes 51 members from 45 different countries. In 1999, the FSM and seven other hotlines founded the International Association of Internet Hotlines. INHOPE is the main forum for the coordination of hotlines in Europe and worldwide. Both the work of INHOPE and the activities of the FSM hotline are supported by the EU Safer Internet Action Plan. The INHOPE network offers its members the opportunity to forward complaints to the responsible INHOPE partners in each case. INHOPE thus enables these complaints to be dealt with in the respective country of origin and ensures that illegal content can no longer be accessed. Each hotline also collaborates with the national police in its own country. INHOPE has working relationships with both INTERPOL and Europol.
INHOPE has five specific primary objectives:
- to develop more effective, more secure and coordinated mechanisms for exchanging reports between international hotlines
- to create guidelines and best-practice standards for hotlines (code of practice)
- to promote the exchange of expertise between members
- to provide new hotlines around the world with advice and training
- to promote collaboration on an international level and to act as an intermediary between policy makers, law enforcement agencies and other organisations
The work of INHOPE is financed by subsidies from the EU and membership fees.
Report it, don’t ignore it! - how hotlines help to reduce Child Sexual Abuse Material online
Video 1:47 min
FSM/FSF: Youth Media Protection in Germany
In addition to depictions of violence, sexuality and frightening content, content capable of causing socioethical disorientation in adolescents (e.g. depictions of alcohol and drug use, antisocial behaviour, self-endangerment, suicide) is also considered relevant in Germany.
Unlike in other countries, what counts when evaluating material for youth protection purposes is not the depiction in itself but the context. It all comes down not to prohibited themes but to weighing up the potential risks. Such risks are presumed to be present in content which, for example (this list is not exhaustive):
- Strongly aestheticises depictions of violence, tending towards glorification
- Portrays torture or vigilante justice as legitimate
- Presents prejudice and discrimination uncritically
- Shows intolerance, social marginalisation, harassment, bullying or other forms of violence as normal or worthwhile behavior
- Uses stereotypes to denigrate people or whole milieux or places extreme limitations upon gender roles
- Fails to provide sufficient explanation when sexuality is portrayed in conjunction with violence
- Portrays accidents, illness or blows dealt by fate in a sensationalistic and voyeuristic manner
- Incites viewers to subject themselves to extreme risks or extreme body modification
The reasoning found in youth protection evaluations concentrates on potential risks and can provide an objective perspective when dealing with public controversies concerning particular formats or topics.
In cases involving sensitive topics, accompanying measures can also make sense, such as providing additional information, background facts and expert knowledge concerning a given programme or referring readers to services offering help and support or to contact persons.
The age brackets are as follows:
- No age restriction
- Aged 6 and over
- Aged 12 and over
- Aged 16 and over
- Not suitable for persons aged under 18
Television transmission times correspond to the age ratings (cf. Age brackets and time schedule)
Age ratings for cinema showings and material carrier media such as films and computer games on DVD or CD are set out in the German Protection of Young Persons Act (Sect. 14 JuSchG). For broadcasting and telemedia (Internet), they can be found in the German Interstate Treaty on the Protection of Minors in the Media (Sect. 5 JMStV).
Germany has a finely meshed network of legal provisions for child protection in the media. The German Interstate Treaty on the Protection of Minors in the Media (JMStV) classifies content according to its potential harmfulness, and the category determines which technical measures a provider is legally obliged to adopt:
- Illegal content, e.g. child pornography or violent pornography, may not be offered under any circumstance
- Content illegal for minors (persons under 18) may only be made accessible to adults
- Content that is harmful for young people’s development may only be offered if the provider takes care to ensure that minors will not normally be able to access it
For that, there are different possibilities:
- Programming for a recognised youth protection program
- Installing technical means
- Using watersheds (content for persons aged 18 and over may not be provided until after 11 p.m.; for persons aged 16 and over not until after 10 p.m.; for persons aged 12 and over not until after 8 p.m.)
Content that is suitable for children of all ages can be provided without any further measures.
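The watershed rule above maps each age rating to an earliest permitted broadcast hour. A minimal sketch of that lookup, simplified to ignore the end of the late-night window and using the times quoted in the text:

```python
# Earliest permitted broadcast hour (24h clock) per minimum age,
# per the watershed times quoted above. 0 = no restriction.
# Simplified sketch: it ignores when the late-night window ends.
WATERSHED = {18: 23, 16: 22, 12: 20, 0: 0}

def may_broadcast(min_age: int, hour: int) -> bool:
    """True if content with the given age rating may air at `hour`."""
    # pick the strictest threshold that applies to this rating
    threshold = max(t for t in WATERSHED if min_age >= t)
    return hour >= WATERSHED[threshold]

print(may_broadcast(16, 21))  # False: 16+ content only after 10 p.m.
print(may_broadcast(16, 22))  # True
```

Content with no age restriction always passes, since its threshold hour is 0.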
Since the revision of the Interstate Treaty on the Protection of Minors in the Media (JMStV) in 2016, it has been easier for media providers to obtain a youth protection rating (i.e. an age classification) for their products.
In principle, providers can rate their products themselves or use the free FSM age classification system to create a technical classification. However, the way to obtain the maximum possible protection from sanctions by the authorities is to have your content evaluated by the FSF and to observe the corresponding age rating when disseminating this content, thereby avoiding fines.
Once issued, an FSF age rating can also be used for other marketing channels. For providers, the adoption of television ratings for DVD marketing is both unbureaucratic and cost-effective.
The examination process can also incorporate a variety of ratings for different versions of a product. It is becoming increasingly common to produce an edited version for prime-time TV broadcast (for viewers aged 12 and over) while also releasing the original version with a 16 rating on DVD or on Internet platforms.
To sum up, whether your content is going out via television, DVD, VOD or streaming, the FSF and the FSM have the right youth protection solution for your product.
Violations of youth media protection laws can result in serious sanctions and fines. At the same time, the German legal framework for youth media protection is rather complicated and can be hard to navigate.
Becoming a member of the FSM or the FSF reduces the scope for direct intervention by state authorities against telemedia providers. In the event of disputes with the supervisory authorities, the FSM or the FSF is engaged first as an intermediary, which legally rules out direct sanctions (“legal privilege”).
Recognised self-regulation associations therefore function as a buffer between the state authorities and their member companies (Sect. 20 Para. 5 Sentence 2 JMStV).
See FAQ "Applicants’ questions about the FSF examination of programmes" on FSF website and “What we offer" on FSM website.
Hate speech is usually understood as verbal attacks on persons or groups due to certain attributes such as skin color, origin, sex, sexuality or religion. In social media hate speech often appears in comments, posts, memes or videos. Learn more
Hate speech violates the integrity and dignity of individuals, defames them and ostracises them from society. If such contemptuous comments appear on the internet in growing numbers, hate messages can go viral and create a public climate in which discrimination and violence against certain groups appear acceptable. Hate speech thus also becomes a breeding ground for real acts of violence. You can find more information on page 149 of "Bookmarks – A manual for combating hate speech online through human rights education".
The EU Framework Decision 2008/913/JHA requires the member states to combat certain forms and expressions of racism and xenophobia by means of criminal law. Member States must ensure that public incitement to violence or hatred directed against a group of persons, or a member of such a group, defined by reference to race, colour, religion, descent or national or ethnic origin is punishable. The criminal codes of most Member States contain provisions dealing with hate speech as incitement to violence or hatred, although they may differ in terminology and detail. Learn more
Whether a symbol is illegal depends largely on the country to which the content relates. In Germany, numerous Nazi symbols such as the swastika are prohibited, and the flag of the "Islamic State", as a symbol of an unconstitutional organisation, may not be distributed in public either. In addition, ciphers and codes are used to disguise hate messages. A closer look at the content is therefore recommended, even if the symbol or sign is not in itself illegal in your country. Some companies do not accept such symbols if they are used to incite hatred. Learn more
Content that violates legal provisions or the terms of service can be reported to the platform on which the content has been provided. You can find a list of collected information on how to report content on the different platforms here.
In addition, there are campaigns and hotlines that can provide help and take measures against illegal hate speech in their countries. For a report to succeed, it is always helpful to make it as precise as possible: ideally, name the specific content, or give the URL of a particular video, comment or user profile.
Administrators have tools that other users do not. You can block or delete posts and users, set netiquette rules for the discussion on your site, and use campaigns to encourage your users to stand up against hate speech. Most of the bigger platforms explain these special functions on their services and give tips for administrators.
On the internet there are many good examples of creative measures against racism, xenophobia and extremism. It is important not to let hate speech go unchallenged. Counter speech is a good way to disprove racially motivated misinformation with arguments. For example, you can use memes or create campaigns to ridicule hate speech. Such actions are important signals for democracy and show solidarity with the victims of hate speech. Learn more
To work effectively in the field of youth media protection and achieve tangible results, it is essential to be able to work closely with other dedicated organisations and initiatives. Because of this, the FSM cooperates with a highly diverse assortment of partners, especially on the project level.
The Federal Criminal Police Office is Germany’s centralised criminal police office, and as such also coordinates crime fighting on an international level. Its basic responsibilities include communication on police matters with police and judicial authorities and other public authorities outside Germany. International cooperation in pursuit of the fight against crime is a major priority – the BKA has strong relationships with virtually every police headquarters around the world. The FSM hotline works closely with the BKA.
The functions of the Federal Review Board for Media Harmful to Young Persons are to index media harmful to young persons if so requested or suggested (“statutory youth media protection”) and to promote value-based media education and public awareness of youth media protection issues (“youth media protection: media education”).
The FSM has been one of the partners of the “Keeping Germany safe online” (DsiN) initiative since January 2005. DsiN e. V. is a central contact for consumers and medium-sized companies. It is supported by companies and industry associations, whose concrete “promises to act” contribute in practical ways to increased IT safety and enhanced safety awareness among Internet users. DsiN has been sponsored by the German Federal Ministry of the Interior since June 2007.
The Freiwillige Selbstkontrolle Fernsehen (Voluntary Self-Regulation of Television, FSF) is an association comprising Germany’s largest commercial television broadcasters. Three times a year, the FSF and the FSM jointly organise the series of events known as “medien impuls”, covering a variety of topics and engaging with every aspect of the telemedia world and with the broad-based range of issues dealt with by the two self-regulation organisations.