On 21 April 2026, the Florence Observatory on Digital Regulation (FLODIR) hosted a workshop on the tension between the Digital Services Act’s (DSA) risk-based framework and the growing global trend toward age-based social media bans. Participants discussed how effectively that framework protects minors and whether the spread of national age-based bans reveals a fundamental gap in existing regulatory tools.

The event was particularly timely, as the EU works to translate the DSA’s risk-based approach into concrete enforcement practices. Speakers in the first session, moderated by Elda Brogi (CMPF), focused on the evidence of harm and the sociological context of online risks. Giovanna Mascheroni (Università Cattolica) and Katarzyna Szymielewicz (Panoptykon Foundation) analysed the human rights implications and minors’ perceptions of risk, while Amanda Third (Western Sydney University) and Marta Vega Bayo (Spanish Ministry for Digital Transformation) provided comparative insights into the ambitious national bans being implemented in Australia and Spain. The session concluded with an intervention from Greta Faieta (DSA Enforcement Team), who outlined the Commission’s perspective on the enforcement of DSA provisions for the protection of minors.

The second session, moderated by the President of AGCOM, Giacomo Lasorella, turned to the enforcement architecture and the practical challenges faced by media authorities. Peter Chapman (Knight-Georgetown Institute) set the stage by discussing how US litigation against major platforms for addictive design could inform the EU’s outcome-oriented approach to risk mitigation. This was followed by a roundtable with senior representatives from key national regulators, including Jeremy Godfrey (CnaM, Ireland), Frédéric Bokobza (ARCOM, France), Susanne Lackner (KommAustria), and Michael Terhörst (KidD, Germany).

The workshop, organised under the scientific direction of Pier Luigi Parcu, Elda Brogi, Marco Botta, Andrea Simoncini, and Urbano Reviglio, brought together industry experts, legal scholars, and regulators to identify pathways for turning the protection of minors into a definitive benchmark for the success of the Digital Services Act.

Key takeaways from Session 1: Evidence and policy approaches to the protection of minors on social media

Evidence on minors’ online exposure, risks, and harms

  • While risks are real and increasing, research also shows that online participation offers significant benefits for minors, including socialisation, access to information, civic engagement, and the development of digital skills and resilience. Effective regulation should prevent harm without undermining these opportunities.
  • Recent longitudinal data indicates that the prevalence of cyberbullying has tripled over the last seven years, making it the most pervasive harm.
  • Available evidence from Australia’s implementation suggests that age-based bans face significant compliance challenges: seven in ten children continue to hold a social media account despite the legislation, and one in three parents would help their children circumvent the ban.
  • A key conceptual distinction emerged between risk (the probability of exposure to potentially harmful content or interactions) and harm (the actual negative impact on the minor). Not all children exposed to risks experience harm, but those who are already vulnerable offline, for instance through social exclusion or personal difficulties, face amplified risks in digital environments. This supports differentiated regulatory approaches rather than universal bans.
  • The emergence of an “artificial sociality by design” through generative AI is creating new challenges (e.g., minors increasingly use AI for companionship or information but struggle to distinguish between authentic and synthetic content).

Limitations and challenges of national age-based bans

  • Age-based bans may trigger a “migration effect” toward unregulated and more dangerous platforms. 
  • Such prohibitions often foster a culture of technical circumvention, frequently supported by parents, rather than addressing the root causes of harm.
  • Banning minors from holding social media accounts does not necessarily prevent them from accessing social media content without an account. 
  • As platforms typically offer their safety features through the account layer, removing account access may strip away one of the main mechanisms through which protections are currently provided.
  • Age-based bans risk becoming simplistic, election-driven policy responses to genuine public concern, as illustrated by the Australian experience.
  • Conversely, restrictive measures can be politically instrumentalised and framed as surveillance or censorship, echoing narratives already deployed against the DSA itself.

Policy approaches beyond the ban

  • There is a consensus that the regulatory focus must shift from blocking access to regulating platform features. 
  • The objective should move beyond ensuring a “safe” experience toward fostering a digital environment that prioritises the well-being of the minor.
  • Rather than banning platforms, regulation could apply by default to all platforms, granting exemptions only to those that demonstrate compliance with child-centred design standards. This would incentivise competition around safety rather than force regulators to chase minors’ migration across services.
  • Parental controls and age verification tools are insufficient if underlying business models remain unchanged. Effective protection requires investment in redesigning platform architectures to prioritise safety by design over engagement metrics.
  • The DSA empowers the Commission to prohibit harmful practices but does not provide a clear mandate to prescribe how platforms should be designed. This asymmetry limits the regulatory toolkit. The forthcoming Digital Fairness Act may represent the key opportunity to introduce positive design obligations.

Key takeaways from Session 2: DSA enforcement and protection of minors

Lessons from international enforcement

  • International litigation as enforcement evidence: recent judicial developments in the United States, where platforms have been held liable for “addictive design,” offer a relevant precedent for Europe. These cases surface internal platform data that can transform DSA risk assessments from purely descriptive documents into evidence-based enforcement tools.
  • Compliance driven by credible threat: enforcement experience with pornographic platforms has shown that they implement age-assurance measures only when faced with the concrete and imminent threat of being blocked; absent that pressure, they do not act. This finding has broader implications: effective enforcement requires not only sound legal provisions but also operational tools that allow regulators to act swiftly and visibly.

Challenges in DSA implementation

  • Institutional architecture challenges: effective enforcement requires seamless coordination between national regulators and the European Commission to translate high-level guidelines into daily supervision. A unified European front is essential to prevent national fragmentation while allowing for specific safety interventions.
  • The “small player” regulatory gap: applying the DSA’s minor protection provisions to smaller platforms poses significant resource challenges. Despite having lower mitigation capacities, these smaller services are increasingly becoming the destination for minors seeking to avoid restrictions on major platforms.
  • Measuring age-assurance effectiveness: a critical gap lies not in the existence of age-assurance systems but in how their performance is evaluated. There is insufficient transparency on how platforms measure accuracy, false positives, and false negatives. Regulators need standardised metrics and verifiable evidence to assess whether these systems genuinely prevent underage access (a minimal illustration follows this list).
  • The investment asymmetry: platforms dedicate enormous resources and innovation capacity to maximising user engagement, while investment in solving child-safety challenges remains marginal by comparison. This structural imbalance means that even well-designed regulatory obligations may struggle to produce results unless accompanied by pressure to redirect platform resources toward safety.
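
To make the metrics gap concrete, the sketch below (in Python) illustrates the kind of standardised reporting regulators could ask for. It is a minimal illustration only: the function name and all figures are invented for the example and are not drawn from any platform or from the workshop.

    # Hypothetical sketch: headline metrics for an age-assurance system.
    # "Positive" means the system classified the user as a minor and blocked access.
    # All names and numbers here are illustrative assumptions, not real platform data.
    def age_assurance_metrics(minors_blocked, minors_admitted,
                              adults_blocked, adults_admitted):
        total = minors_blocked + minors_admitted + adults_blocked + adults_admitted
        return {
            # Share of all users classified correctly.
            "accuracy": (minors_blocked + adults_admitted) / total,
            # False negative rate: minors who slip past the gate.
            "underage_pass_rate": minors_admitted / (minors_blocked + minors_admitted),
            # False positive rate: adults wrongly locked out.
            "adult_lockout_rate": adults_blocked / (adults_blocked + adults_admitted),
        }

    # Invented example: 900 of 1,000 minors blocked; 50 of 1,000 adults wrongly blocked.
    print(age_assurance_metrics(900, 100, 50, 950))
    # {'accuracy': 0.925, 'underage_pass_rate': 0.1, 'adult_lockout_rate': 0.05}

Even two such rates, computed on audited evaluation data, would let regulators compare systems across platforms instead of relying on unverifiable accuracy claims.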

Emerging standards for age assurance

  • Proportionality in age assurance: the principle of proportionality remains central. While strict age verification is necessary for high-risk services (e.g., adult content), “age estimation” techniques are often more appropriate for general-purpose platforms to balance safety with privacy and data minimisation.
  • The high-risk/low-age-threshold paradox: a recurring enforcement question is what standard should apply when a platform is identified as high-risk for minors but its terms of service set a minimum age below 18 (e.g., 13+). The Commission’s position, as reflected in the Article 28 Guidelines, is that where a platform’s own risk assessment identifies high risks, age verification may be appropriate and proportionate regardless of the age threshold in the terms of service.

Future outlook

  • Evolution of risk assessments: the next phase of enforcement will leverage data from international court discovery to challenge platforms’ risk assessments, demanding more granular and verifiable mitigation measures.
  • EU identity wallet: the European digital identity wallet is viewed as a potential long-term solution for privacy-friendly age verification, provided it does not evolve into a permanent surveillance infrastructure.
  • Shift toward “well-being models”: the regulatory benchmark is evolving. It is no longer sufficient for environments to be “safe”; regulators must incentivise architectures that actively discourage harmful patterns such as auto-play and toxic algorithmic amplification.
  • The role of education and social infrastructure: age verification and regulatory enforcement alone are insufficient. School-based digital literacy programmes, family support initiatives, and safer internet centres play a critical role in preparing minors for the digital environment. Evidence suggests that combining regulatory measures with educational and social interventions produces stronger outcomes.
  • From prohibition to positive design obligations: the regulatory horizon is shifting from merely prohibiting harmful practices toward defining what child-appropriate digital environments should look like. The forthcoming Digital Fairness Act may provide the legal basis to complement the current prohibitory framework with positive design obligations.

Find out more about FLODIR’s work on this topic

FLODIR is the result of a collaboration between the Centre for a Digital Society (CDS) and the Centre for Media Pluralism and Media Freedom (CMPF) at the EUI Robert Schuman Centre, and the Department of Legal Studies of the University of Florence. The initiative is endorsed by the Autorità per le Garanzie nelle Comunicazioni (AGCOM), the Italian communications regulatory authority.

Research: This workshop contributes to FLODIR’s ongoing research on the coherent application of the emerging EU digital acquis. The Observatory specifically focuses on the enforcement challenges faced by national regulatory authorities in balancing the risk-based logic of the Digital Services Act with fundamental rights.

Executive Education: The themes discussed during the workshop are central to FLODIR’s training mission. The 2026 edition of the executive course New Trends in Digital Regulation (scheduled for 25-28 May) will further explore the practical implementation of the EU digital acquis, providing regulators with the technical expertise needed to address platform design and dark patterns.

Upcoming Activities: Building on insights from the “Beyond the Ban” debate, FLODIR will continue to foster coordination between competition, data protection, and audiovisual authorities. The upcoming Third Florence Forum on Digital Regulation (12-13 November 2026) will provide a dedicated platform for national regulators and EU officials to discuss the long-term impact of the Digital Omnibus on the enforcement of child safety provisions.
