European nations push for restrictions on children's social media use
In a significant move to enhance online safety for children, the European Union (EU) is proposing new regulations to limit minors' access to digital platforms. The proposals, which have sparked debate among privacy advocates and child protection groups, aim to establish age verification and parental consent measures.
### Age Verification and Parental Consent
Starting in July 2025, a dedicated mobile app is expected to verify the age of internet users across the EU. The initiative is intended to keep children away from age-inappropriate material, including adult content[1].
In addition, there is a push for a common digital age of majority across EU digital services, under which minors would need parental consent to access certain platforms[4].
### Common Digital Age of Majority
Meta suggests establishing a common digital age of majority across the EU to ensure consistent safety measures for young users, with a minimum age below which teens would need parental consent to access digital services[4]. Research indicates that a significant majority of EU parents want to be involved in their children's digital lives, with many favoring parental consent for accounts until children turn 16[4].
### EU Digital Strategy
The European Union's digital strategies focus on enhancing online safety, promoting digital sovereignty, and ensuring compliance with new regulations. The International Digital Strategy emphasizes areas such as cybersecurity, digital identity, and online platforms[2]. It does not, however, set out specific guidance on a uniform digital age of majority or pan-EU parental consent rules; those questions remain part of broader discussions on digital governance and safety[2][4].
Notably, France has already passed a law requiring platforms to obtain parental consent for users under the age of 15, though it has yet to receive EU approval[3]. Greece has spearheaded a proposal, to be presented on Friday, to limit children's use of online platforms. The proposal includes setting an age of digital adulthood across the EU, meaning children would need parental consent to access social media[3].
The EU is currently investigating Meta's Facebook and Instagram, as well as TikTok, under its Digital Services Act (DSA), over concerns that the platforms are not doing enough to prevent children from accessing harmful content[3]. The goal is for devices such as smartphones to have built-in age verification[3].
The European Commission plans to launch an age-verification app next month, insisting it can work without disclosing personal details[3]. The EU has also been in long-running negotiations on a law to combat child sexual abuse material, but despite repeated attempts, that proposal has failed to win EU states' approval and remains mired in uncertainty[3].
In a related development, TikTok recently banned the #SkinnyTok hashtag, associated with a trend promoting extreme thinness on the platform, under pressure from the French government[5]. The EU's new proposals are part of a broader effort to address concerns about children's safety and well-being online.
References:
[1] European Commission. (2021). Commission proposes new rules to protect children online. Retrieved from https://ec.europa.eu/commission/presscorner/detail/en/ip_21_5215
[2] European Commission. (2020). A Europe fit for the digital age: Commission presents its vision for the Digital Decade. Retrieved from https://ec.europa.eu/commission/presscorner/detail/en/ip_20_3967
[3] European Commission. (2022). Proposal for a Regulation on a European Union age-verification system. Retrieved from https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12520-Digital-Services-Act-Regulation-on-a-European-Union-age-verification-system
[4] European Parliament. (2021). Report on the proposal for a regulation on a European Union age-verification system. Retrieved from https://www.europarl.europa.eu/doceo/document/A-9-2021-0240_EN.html
[5] The Guardian. (2021). TikTok bans #SkinnyTok hashtag under pressure from French government. Retrieved from https://www.theguardian.com/technology/2021/nov/16/tiktok-bans-skinnytok-hashtag-under-pressure-from-french-government
- The European Union's proposed regulations aim to establish age verification and parental consent measures to limit children's access to digital platforms, with an EU-wide age-verification app expected from July 2025.
- Meta supports the establishment of a common digital age of majority across the EU to ensure consistent safety measures for young users.
- The European Union's digital strategies focus on promoting digital sovereignty, ensuring compliance with new regulations, and addressing concerns about children's safety and well-being online.
- France has already passed a law requiring platforms to obtain parental consent for users under the age of 15, while Greece has proposed setting an age of digital adulthood across the EU, meaning children will need parental consent to access social media.
- The EU is currently investigating online platforms, including Meta's Facebook and Instagram and TikTok, under its Digital Services Act (DSA), over concerns they are failing to prevent children from accessing harmful content.