Online content that is problematic for children and young people of a specific age group may, by law, only be offered if the provider secures it with a technical safeguard.

The following measures can be considered for content with a harmful effect on development: technical age labels that recognised youth protection programmes can read and interpret, technical means that create access barriers, or watersheds, as are already customary in television programming. Content that is illegal for minors but legal for adults, e.g. pornography, may only be made accessible in so-called closed groups: access is restricted to adults by means of identification and authentication. Content that is illegal outright may not be distributed at all.

Technical age labels

By means of technical age labelling, content providers can offer content that is harmful to development in compliance with the law.

For web content, this requires, among other things, the age-de.xml file. It contains the age data that youth protection programmes can read; the tagging is not visible to the users of a website. Individual sub-pages and sections of a website, or indeed an entire (sub-)domain, can be tagged with individual age brackets, ensuring maximum availability of the content even when a youth protection programme is active.

Using the age-de.xml standard, content providers fulfil their legal obligations under the JMStV. Details are defined in Article 5 Para. 3 No. 1 and Article 11 JMStV.
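To illustrate the principle, the following Python sketch shows how a youth protection programme might fetch such a label file and determine the age bracket for a given page. The file layout used here is a simplified placeholder, not the official age-de.xml schema, and the domain is invented:

import urllib.request
import xml.etree.ElementTree as ET

# Hypothetical, simplified label file (NOT the official age-de.xml schema):
# <labels default-age="0">
#   <label scope="/adults/" age="18"/>
#   <label scope="/teens/" age="16"/>
# </labels>

def fetch_age_labels(domain: str) -> ET.Element:
    # By convention, the label file sits at the domain root.
    with urllib.request.urlopen(f"https://{domain}/age-de.xml") as response:
        return ET.fromstring(response.read())

def required_age(labels: ET.Element, path: str) -> int:
    # Return the age bracket of the most specific scope matching the path,
    # falling back to the site-wide default.
    best_scope, age = "", int(labels.get("default-age", "0"))
    for label in labels.findall("label"):
        scope = label.get("scope", "")
        if path.startswith(scope) and len(scope) > len(best_scope):
            best_scope, age = scope, int(label.get("age", "0"))
    return age

A programme configured for age level 12 would then block any path whose required age exceeds 12, while the rest of the site stays available.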

Youth protection programmes

Youth protection programmes allow internet content to be enabled or blocked depending on a specific age group.

Such software consists of independent filters that are active only on the end device (PC, tablet, mobile phone) or in the home network (via the router). These programmes generally consist of multiple components: blocklists (lists of generally inadmissible websites), passlists (lists of always permissible websites) and an extensive list of age-differentiated content (permissible depending on the age level set in the software). Youth protection programmes are also able to read technical age labels that conform to the common standard (age-de.xml).
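As a rough illustration, the decision logic of such a filter might combine these components as in the following Python sketch; the hosts, list contents and default policy are invented for the example:

# Invented example data -- real programmes ship curated, regularly updated lists.
BLOCKLIST = {"inadmissible.example"}            # generally inadmissible sites
PASSLIST = {"kids.example", "school.example"}   # always permissible sites
AGE_LIST = {"news.example": 12, "movies.example": 16}  # age-differentiated content

def is_allowed(host: str, configured_age: int, labelled_age: int | None = None) -> bool:
    # Decide whether a site may be shown at the age level set in the software.
    if host in BLOCKLIST:
        return False
    if host in PASSLIST:
        return True
    # A technical age label (e.g. read from age-de.xml) takes precedence.
    if labelled_age is not None:
        return labelled_age <= configured_age
    # Otherwise consult the age-differentiated list; unknown sites are treated
    # according to the programme's default policy (here: allow).
    return AGE_LIST.get(host, 0) <= configured_age

assert is_allowed("movies.example", configured_age=12) is False
assert is_allowed("movies.example", configured_age=16) is True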

In addition, there are also youth protection programmes for so-called closed systems. A system is closed, for example, if all content within it is bound to the manufacturer’s own standards and users cannot normally leave the system. Examples of this are video-on-demand services or cloud gaming offers.

As a recognised self-regulation body, the FSM has been examining and approving such programmes since 2016 on the basis of the law (Article 11 Para. 1 JMStV), following the criteria of the Commission for the Protection of Minors in the Media (KJM).

[Video: Youth protection programmes simply explained]

What are the recognised youth protection programmes?
  • JusProgNET
  • Netflix
  • Prime Video
  • RTL+
  • MagentaGaming
  • JusProg Android and iOS
  • Disney+
  • WOW
  • GigaTV
  • Paramount+

Requirements for youth protection programmes under the JMStV include, for example:

  • Enabling access to age-differentiated content
  • Reading technical age labels
  • Identifying content that impairs development

Further technical means

Another way for content providers to legally offer content that is harmful to development is to use technical means.

Technical means are access barriers that the user must overcome in order to reach the corresponding content. The JMStV does not provide any concrete specifications as to how this hurdle should be designed, but limits itself to setting out the required level of protection: access to the content in question must be made impossible or very difficult for children and adolescents.

In practice, various approaches have become established:
  • Using a PIN (see the sketch after this list)
  • Provision of details from the user’s national ID
  • Age verification via webcam
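As a minimal sketch of the PIN approach, an age gate could look like this in Python. The handling is deliberately simplified; a real implementation would additionally need secure storage, rate limiting and a reliable process for issuing the PIN to verified adults only:

import hmac
import hashlib
import secrets

# Hypothetical: the provider issues the PIN only after an adult has been
# verified once, e.g. during contract signup. Only a salted hash is stored.
_SALT = secrets.token_bytes(16)
_PIN_HASH = hashlib.pbkdf2_hmac("sha256", b"4821", _SALT, 100_000)

def unlock_adult_content(entered_pin: str) -> bool:
    # Grant access to the protected content only if the correct PIN is entered.
    candidate = hashlib.pbkdf2_hmac("sha256", entered_pin.encode(), _SALT, 100_000)
    # Constant-time comparison to avoid leaking information via timing.
    return hmac.compare_digest(candidate, _PIN_HASH)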

Watersheds

By setting time limits, content that impairs development can also be offered legally.

The rule here is that content for persons aged 18 and over may not be provided until after 11:00 p.m., and for persons aged 16 and over, not until after 10:00 p.m. Already customary in television programming, this method is also used online for services such as media libraries. In the case of content for persons aged 12 and over, the law stipulates that the welfare of younger children must be taken into account in choosing the transmission time.
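A time-based gate can be expressed very compactly. The Python sketch below assumes that both windows end at 6:00 a.m., as is customary under the JMStV, and ignores time zones for brevity:

from datetime import datetime, time

# Watershed start times from the rule above; the end of the window (6:00 a.m.)
# is an assumption based on customary JMStV practice.
WATERSHEDS = {18: time(23, 0), 16: time(22, 0)}
WINDOW_END = time(6, 0)

def may_be_shown(required_age: int, now: datetime) -> bool:
    # Check whether content with the given age rating may currently be offered.
    start = WATERSHEDS.get(required_age)
    if start is None:
        # No fixed watershed (e.g. 12+); the welfare of younger children must
        # still be considered when choosing the transmission time.
        return True
    t = now.time()
    # The window wraps past midnight: from `start` until 6:00 the next morning.
    return t >= start or t < WINDOW_END

assert may_be_shown(18, datetime(2024, 1, 1, 23, 30)) is True
assert may_be_shown(18, datetime(2024, 1, 1, 21, 0)) is False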


Age verification systems (AVS)

Using an age verification system, providers may offer content that is illegal for minors but legal for adults (e.g. pornography).

Such a system will generally need to consist of two components. First, the user must be reliably identified, at which point it is determined whether the person in question is of legal age. Given the current state of technology, such a determination can as a rule only be made through personal contact (“face-to-face monitoring”). Age verification by purely technical means is also conceivable if it achieves the same level of reliability as a personal age check.

The second component is authentication at each individual use: it must be ensured at all times that the content is only being accessed by the person who was identified as being of legal age in the first step. The risk of access data being passed on to minors must also be reduced.
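Conceptually, the two components might be wired together as in the following Python sketch. The identification step is represented only by a stored flag, since the actual check (e.g. face-to-face verification) happens outside the software; the user IDs are invented:

import secrets

# Component 1: one-time identification. An entry exists only after a reliable
# age check, e.g. face-to-face verification -- modelled here as given data.
verified_adults = {"user-123"}

# Component 2: authentication at each individual use, via short-lived,
# non-transferable session tokens issued only to verified adults.
_sessions: dict[str, str] = {}

def start_session(user_id: str) -> str | None:
    # Issue a session token only if the user was identified as an adult.
    if user_id not in verified_adults:
        return None
    token = secrets.token_urlsafe(32)
    _sessions[token] = user_id
    return token

def may_access(token: str) -> bool:
    # Every single access must present a valid token for a verified adult.
    return _sessions.get(token) in verified_adults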

Whether these requirements are met will always depend on the individual case. In the event of infringements, the state media authorities can impose sanctions upon the provider in question via the Commission for the Protection of Minors in the Media (KJM).

The FSM has also assessed a large number of age verification systems. Member companies can apply for such an assessment to be carried out by the FSM Expert Evaluator Commission.

Companies’ Obligations

Youth protection presents companies with numerous issues to be considered. The regulations laid down by youth media protection law are complex, and sanctions for infringements are severe. Companies and providers have to meet these requirements.


Services for companies

The FSM offers a wide range of services concerning the protection of minors from harmful media content. We support companies by providing custom-tailored advice and by evaluating legal, technical and educational issues.
