Frequently asked questions
Tasks and objectives of the FSM
The German Association for Voluntary Self-Regulation of Digital Media Service Providers (FSM) is a non-profit association that aims to protect minors in online media. In addition, the FSM promotes digital media education to ensure safe and positive online experiences for children and teenagers: parents and families are empowered to protect and support their children, and teachers are encouraged to integrate media education into their lessons.
The FSM also serves as a point of contact for companies and policymakers.
The FSM offers a wide range of information and educational services for parents and teachers as well as children and young people. The aim is to realise the potential of digital media while at the same time protecting young people from content that is harmful to them.
We have tips for parents and families so that they can guide and accompany children in the use of apps, games, websites and social networks in an age-appropriate and individual way. Here you will find a selection.
For teachers, the FSM offers numerous suggestions, materials and training opportunities to implement media education in the classroom and in education. You can find more information here.
The FSM is also involved in numerous partnerships, expert committees and panel discussions on media education.
Numerous companies and associations from the telecommunications and online industry have joined the FSM as members. The FSM supports them in complying with the legal provisions for the protection of minors and advises them on all legal, technical and pedagogical issues relating to the protection of minors in the media.
If you notice content on the internet that may be problematic, you can report it. Our hotline reviews all reports of criminal and youth-endangering online content and, with the support of the police and other partners, ensures that this content is quickly removed from the internet. If you come across such content, please report it directly to us using our online form.
Online youth protection in practice
In order to implement the protection of minors on the internet, companies must comply with a number of legal requirements. These are laid down in the German Interstate Treaty on the Protection of Minors in the Media (JMStV).
Online providers who offer content that is detrimental to minors' development or harmful to young people, as well as providers of search engines, must appoint a youth protection officer.
In addition, each provider must assess its own content and decide whether it may be problematic for children or young people. If the content is indeed relevant to the protection of minors and only permissible from the age of 16 or from the age of 18, the law allows various protective measures. Depending on the content, the following are possible:
- Technical age labels that allow youth protection programmes to filter out content that is unsuitable for younger users.
- Technical access barriers that allow access to online content only by PIN entry, ID request or age verification via webcam.
- Broadcasting time restrictions, by which content is only offered at times when adolescents of the affected age groups are unlikely to encounter it.
- Age verification systems that make certain online content (e.g. pornography) accessible only after personal identification and authentication.
Youth protection officers advise companies and providers on legal and media-pedagogical issues, point out potential dangers and make sure that youth protection is taken into account in all decisions. Externally, youth protection officers are the contact persons for users, legal guardians and supervisory authorities. It must be quickly recognisable on a website who the youth protection officer is and how this person can be contacted.
Youth protection programmes allow parents to control what content their children can and cannot see. There are many different types of such programmes. Some are installed on the end device (PC, tablet, smartphone), others work via the router for the entire home network or at the level of the internet provider. Youth protection programmes can be configured individually for each user so that only suitable content is accessible: via blocklists (lists of generally inadmissible websites), passlists (lists of explicitly permitted websites) and an extensive list of age-differentiated content (permissible depending on the age level set in the software).
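The decision logic just described (passlist, blocklist, age-differentiated labels) can be sketched in a few lines. This is a deliberately simplified illustration; real youth protection programmes are far more sophisticated, and all hostnames, data and function names here are assumptions made for the example.

```python
# Simplified sketch of a youth protection programme's filtering decision.
# Hypothetical example data:
PASSLIST = {"kids.example"}                   # always-permitted websites
BLOCKLIST = {"inadmissible.example"}          # generally blocked websites
AGE_LABELS = {"news.example": 12, "forum.example": 16}  # labelled content

def is_allowed(host: str, user_age: int, default_label: int = 18) -> bool:
    """Decide whether a site may be shown for the configured age level."""
    if host in PASSLIST:      # explicit permission wins
        return True
    if host in BLOCKLIST:     # explicit block comes next
        return False
    # Otherwise compare the site's age label against the user's age level;
    # unlabelled sites are treated conservatively (assumed 18+ here).
    return AGE_LABELS.get(host, default_label) <= user_age

print(is_allowed("kids.example", 6))            # True (passlist)
print(is_allowed("inadmissible.example", 17))   # False (blocklist)
print(is_allowed("forum.example", 12))          # False (labelled 16)
```

The precedence order (passlist before blocklist before age label) and the conservative default for unlabelled sites are design choices assumed for this sketch; actual products may resolve conflicts differently.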
Other youth protection programmes function within closed systems, such as game consoles, cloud gaming offerings or video-on-demand services. A system is closed, for example, if all content within the system is bound to the manufacturer's own standards and the system cannot normally be left by the user.
The FSM tests and evaluates youth protection programmes on the basis of the legal foundation, the German Interstate Treaty on the Protection of Minors in the Media. Youth protection programmes must be able to do the following, for example:
- allow access to content differentiated according to age groups
- read age labels of internet offers
- recognise content that impairs development, in line with the state of the art
After its examination, the FSM has approved the following youth protection programmes:
- JusProg for Android and iOS
- Prime Video
Laws and content relating to online youth protection in Germany
In Germany, there are numerous laws that regulate the protection of children and young people from unsuitable content and other online risks. These laws define obligations and restrictions for providers of digital media content. Penalties for providers who do not comply with the rules are also listed there.
The most important laws in the area of online youth protection are:
- German Interstate Treaty on the Protection of Minors in the Media (JMStV)
- Media State Treaty (MStV)
- Youth Protection Act (JuSchG)
- German Criminal Code (StGB)
Even if some content on the internet is unpleasant, it is not automatically forbidden. In all laws and measures that serve to protect children and young people, the basic values of an enlightened society must be upheld: freedom of expression and information and the prohibition of censorship are very important to us.
Whether pornography, depictions of violence or hate speech: there is a great deal of content on the internet from which children and young people must be protected. The legislator defines three categories that are important for the protection of minors from harmful media:
- Content with a harmful effect on their development
- Content illegal for minors but legal for adults
- Illegal content
When is online content harmful to children and young people? Harmful content is content that can emotionally overwhelm, unsettle or frighten minors. This includes, for example, erotic depictions and violent content, but also other frightening material, such as images of war and violence presented without context. Such content may be distributed on the internet, but the respective providers are legally obliged to offer it in such a way that children and young people do not usually encounter it.
What is content illegal for minors but legal for adults? Such content is permitted for adults but prohibited for children and young people. In addition to pornographic content, this includes content that is obviously seriously harmful to minors' development. One example is pro-ana content, i.e. blogs and forums in which people suffering from anorexia (anorexia nervosa) exchange views and declare eating disorders to be a lifestyle. The Review Board for Media Harmful to Young People at the Federal Centre for the Protection of Children and Young People in the Media (BzKJ) also publishes a list of content that is illegal for minors but legal for adults.
This content must be offered in such a way that only adults can access it. Age verification systems are a suitable technical option for this purpose.
What is illegal content? Illegal content is generally prohibited, even for adults. For the most part, this is content whose dissemination is already prohibited under the German Criminal Code (StGB), e.g. content glorifying violence and inciting hatred, instructions for committing criminal offences, the dissemination of propaganda material, sexual abuse material depicting minors, violent pornography and pornography involving animals.
The Network Enforcement Act (NetzDG) is intended to combat the spread of hate crime and other punishable content on social networks more effectively. This includes, for example, insults, public incitement to commit crimes, incitement to hatred (Volksverhetzung), depictions of violence and threats. The NetzDG requires providers such as Facebook, YouTube or Twitter to deal with complaints about illegal content as quickly as possible and to delete it where there are justified indications of illegality. In addition, the law requires effective complaint management, the appointment of a domestic authorised representative for service of process, and a right to information about inventory data for victims of violations of personality rights online. Violations of these obligations can be punished with fines.
In difficult cases, social networks can transfer the decision on the illegality of content to a state-recognised institution of regulated self-regulation. The FSM has performed this role since 2020, as the first and so far only institution of this kind.