This content is for informational and educational purposes only and does not constitute legal advice.
On 12 February 2026, Parliament approved the Bill on the Protection of Children in Digital Environments (PL 398/XVII/1) in a general vote. Article 6 requires platforms, services, games, and applications accessible to children to provide a dashboard available to both the child and the holders of parental responsibilities. The dashboard must enable the setting of time limits, the monitoring of contacts and interactions flagged as risky, the adjustment of enhanced privacy settings, the activation of usage-time controls by the child, and access to regular information on daily and weekly usage time. Article 10 establishes safe default settings for accounts of children under 16: by default, such accounts must be private and non-searchable, limit algorithmic recommendations to appropriate, non-addictive content, and hide social metrics. Article 11 prohibits certain functionalities on accounts of children under 16, including autoplay, infinite scroll, gamification designed to prolong use, non-essential notifications (particularly during nighttime hours), systems for creating fake images or videos, and loot boxes and similar mechanisms. Essential functionalities may be maintained, provided they are configured to minimise exposure to inappropriate content and the risk of digital addiction.
On 2 February 2026, the Bill on the Protection of Children in Digital Environments (PL 398/XVII/1) was introduced in Parliament. Its key provisions on parental dashboards (Article 6), safe default settings (Article 10), and prohibited functionalities (Article 11) are summarised in the entry above.
On 8 October 2025, Portugal’s National Cybersecurity Centre (CNCS) adopted the Cybersecurity Services Certification Scheme to identify and recognise the value of cybersecurity services across the country. The voluntary scheme is open to all organisations established in Portugal and covers services including incident monitoring and response, vulnerability management, threat intelligence, and penetration testing. Administered by CNCS as the National Cybersecurity Certification Authority and implemented through IPAC-accredited independent bodies, it offers two certification levels, Basic and Substantial, and requires periodic assessments to ensure ongoing compliance. The scheme aims to create a national catalogue of reliable providers, enhancing trust, competitiveness, and integration into European cybersecurity markets.
Last updated: 12 February 2026