Europe · CoE Framework Convention signatory
This content is for informational and educational purposes only and does not constitute legal advice.
On 7 February 2026, Decree No. 2026-60 on the experimentation of games with monetisable digital objects entered into force. The decree was issued pursuant to Articles 40 and 41 of Law No. 2024-449 of 21 May 2024 on securing and regulating the digital space. The decree requires enterprises offering games with monetisable digital objects to prompt players, from the moment they open an account, to set a play-time limit, prohibiting any game activity until the limit is set. The limit, which cannot be pre-set by the enterprise, applies immediately to cumulative play over a seven-day period. Enterprises must continuously provide players with easily accessible information on time spent playing, display a warning and a visible counter when 75% of the allotted time has passed (or at least 30 minutes before expiry), and issue a further warning 10 minutes before the limit is reached. A self-exclusion function, effective immediately, must also be available at all times for periods ranging from 24 hours to 12 months.
On 4 February 2026, the Minister of the Economy, Finance, and Industrial, Energy, and Digital Sovereignty adopted Decree No. 2026-60 on the experimentation of games with monetisable digital objects. The decree was issued pursuant to Articles 40 and 41 of Law No. 2024-449 of 21 May 2024 on securing and regulating the digital space and enters into force the day after its publication. It requires an enterprise offering a game with monetisable digital objects, from the opening of the game account, to ask the player to set a play-time limit, and it prohibits any game action until the player has set that limit, which cannot be pre-set by the enterprise. The limit applies immediately to cumulative play time over a seven-day period. The enterprise must inform the player at all times, in an easily accessible manner, of the time spent playing; display a warning and a continuously visible counter when 75% of the player’s time has elapsed, or at the latest 30 minutes before expiry; and issue a further warning 10 minutes before expiry. A self-exclusion function, effective immediately, must be available at all times for periods of 24 hours to 12 months.
On 3 February 2026, the French Data Protection Authority (CNIL) published guidance on deepfakes explaining the associated privacy and security risks and setting out steps for individuals to protect themselves and report illicit content. The guidance defines deepfakes as audio, image, or video content created or modified using artificial intelligence, including techniques such as face swapping and lip-syncing, and notes that such content can be used for conduct including identity misuse, online harassment, fraud, and disinformation. The guidance highlights that creating or sharing certain deepfakes can lead to criminal liability under French law, including that creating an image montage of a person without consent may be punishable by one year of imprisonment and a EUR 15,000 fine, and that fraud involving deception to obtain money, property, or a service may be punishable by five years of imprisonment and a EUR 375,000 fine. The guidance also describes the CNIL’s role in informing the public, supervising compliance with data protection rules, supporting research and detection work, including through the GenFakes project, and contributing to European initiatives, including work on codes of practice for AI-generated content.
On 5 January 2026, the President adopted Order No. 2026-2 relating to the remote marketing of financial services to consumers, implementing EU Directive 2023/2673 on financial services contracts concluded at a distance by introducing binding design requirements for online interfaces used to conclude distance contracts. The Order amends the Consumer Code by inserting Article L. 222-16-3, which prohibits financial service professionals from designing online interfaces in a way that includes dark patterns, repeated choice prompts, asymmetrical subscription and cancellation processes, and other interface designs that alter or hinder consumer decision-making. The Order thus implements the Directive's requirement for member states to enact additional protections in relation to design features of online interfaces. The Order enters into force on 19 June 2026.
On 21 October 2025, the French Data Protection Authority (CNIL) issued guidance on political communication tools under the Political Advertising Transparency Regulation (PAR). The rules apply to online targeting and dissemination of political messages using personal data. The guidance provides that political actors must collect data directly and obtain explicit consent. Profiling using sensitive data is restricted. It also provides that targeting children under 17 is prohibited. A register documenting targeting tools, including Artificial Intelligence use, must be maintained. Non-targeted outreach, including newsletters or postal mail, may rely on consent or legitimate interest under the General Data Protection Regulation.
On 14 October 2025, the French National Commission on Informatics and Liberty (CNIL) adopted guidance regarding the application of the right to data portability to loyalty programs, following inquiries from the distribution sector. The CNIL specified that Global Trade Item Number (GTIN) barcodes of purchased products, when linked to customer identification information, constitute portable data that must be transmitted upon request. Similarly, the financial amount of promotions obtained through a loyalty program, if attributable to the beneficiary customer, is also considered portable. Conversely, the methodology used to calculate and generate targeted promotions is not classified as portable data.
On 6 October 2025, Decree No. 2025-767 on the presentation of the warning message concerning the illegal nature of behaviours represented in pornographic content entered into force. The decree implements Articles 1-3 of Law No. 2004-575 of 21 June 2004 on confidence in the digital economy, as inserted by Article 12 of Law No. 2024-449 of 21 May 2024 on securing and regulating the digital space. The decree establishes the exact wording of the mandatory warning for content simulating rape, aggravated rape, or incestuous rape, specifying that the message must be displayed in white text on a black background for a minimum of 12 seconds in full screen before access to the content, and at the bottom of the screen throughout its viewing, without any alteration, concealment, or separation from other indications or images. The decree applies to producers of pornographic audiovisual works, providers of online content hosting services containing pornographic material, and state services responsible for combating offences against minors and violent offences against persons, and is applicable in New Caledonia, French Polynesia, and the Wallis and Futuna Islands.
On 30 September 2025, the French Data Protection Authority (CNIL) closed the consultation on guidelines on the deployment of web filtering. The guidelines apply to public and private sector employers implementing web filtering tools for employees, contractors, or visitors using professional internet or Wi-Fi networks, excluding open public Wi-Fi providers. They outline General Data Protection Regulation (GDPR)-compliant obligations, including limiting data collection to the user ID, IP address, and domain name; ensuring a legal basis of legitimate interest or legal obligation; conducting Data Protection Impact Assessments where required; consulting employee representatives; and informing affected individuals. The guidelines also address deployment risks across on-site, Software as a Service (SaaS), and hybrid models, and recommend strong security and pseudonymisation measures to safeguard logs.
On 25 September 2025, the Regulatory Authority for Audiovisual and Digital Communication (Arcom) released the results of its study on online risks for minors. The study was conducted over twelve months and involved 2,000 minors aged 11 to 17, 2,000 parents, 28 in-depth interviews with young people, preparatory hearings with eight platforms and four experts, and a semiological analysis of platform design and terms of service. It found that more than four out of five minors use at least one very large online platform every day. The average age of first social network use is 12 years, and 44% of children access these services before the age of 13 by misrepresenting their age. The study reported that 83% of minors are regularly exposed to at least one of six risks, including hyperconnection, harmful or shocking content, dangerous challenges, cyberbullying, malicious adult contact, and online scams, with hyperconnection and exposure to shocking content being the most frequent. It also highlighted that 45% of young people consider current protective tools insufficient. Arcom identified two regulatory priorities, which are enforcing the minimum age of 13 years for access to platforms and requiring the deployment of services adapted specifically for minors.
On 18 September 2025, the French Data Protection Authority (CNIL) issued guidance on inactive accounts for digital content users. It applies to the audio-visual and video game sectors. Organisations must set a retention period and not keep inactive accounts indefinitely. Accounts inactive for two years should generally be deleted, with prior user notification. Data needed to access purchased content, like email, name, pseudonym, or game saves, may be kept longer. Other commercial or statistical data must be limited in retention. Organisations must communicate retention periods clearly and protect stored data. After two years of inactivity, accounts should be deactivated but remain reactivatable.
On 9 September 2025, the Directorate-General for Enterprise (DGE) and the Directorate-General for Competition, Consumer Affairs and Fraud Control (DGCCRF) published the draft law establishing the scheme for designating the national authorities responsible for implementing Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence (AI Act), to be submitted to Parliament for adoption. The framework designates the DGCCRF as coordinator and single point of contact under Article 70(2), while the DGE represents France in the European AI Committee. Prohibited practices under Article 5 are monitored by the Regulatory Authority for Audiovisual and Digital Communication (Arcom) and the DGCCRF for manipulative techniques and exploitation of vulnerabilities, by the National Commission for Information Technology and Civil Liberties (CNIL) and the DGCCRF for social scoring, and solely by the CNIL for predictive policing, biometric categorisation, and untargeted facial recognition databases.
Oversight of high-risk AI under Annex I is assigned to existing market surveillance and notifying authorities, while Annex III extends responsibilities to the Prudential Supervision and Resolution Authority (ACPR) for credit and insurance, to the Council of State, Court of Cassation, and Court of Auditors for judicial AI, and to the CNIL, the DGCCRF, and Arcom for areas including education, biometrics, democratic processes, employment, migration, and border control. Transparency duties under Article 50 are divided between the CNIL, the DGCCRF, and Arcom, with technical support from the National Cybersecurity Agency of France (ANSSI) and the Digital Regulation Expertise Centre (PEReN).
On 1 September 2025, the National Commission on Informatics and Liberty (CNIL) fined Infinite Styles Services Limited, the Irish subsidiary of the Shein group, EUR 150 million for breaches of rules on cookies. The fine was imposed following an inspection in August 2023, which determined that cookies, particularly for advertising, were being placed on users’ devices without consent, that information provided through banners was incomplete or misleading, and that mechanisms for refusing or withdrawing consent were ineffective. The CNIL noted that around 12 million people in France visited the website each month, amplifying the scale of the infringement, and stressed that Shein failed to respect choices made by users and did not adequately inform them of the identity of third parties placing cookies. While Shein made changes to its website during the proceedings, the CNIL’s committee considered the violations serious enough to warrant a substantial fine but did not issue compliance orders.
On 1 September 2025, the National Commission on Informatics and Liberty (CNIL) fined Google EUR 325 million for displaying advertisements between Gmail users’ emails without consent and for placing cookies during the creation of Google accounts without valid consent from French users. The case originated from a 2022 complaint by privacy group NOYB and subsequent inspections into Gmail and Google account creation processes. The CNIL found that Gmail displayed promotional emails in the “Promotions” and “Social” tabs in a way that amounted to direct marketing, which required prior user consent under French communications law. It also concluded that Google had made it harder for users to refuse advertising cookies than to accept them, and that users were not properly informed that access to services depended on such cookies, making their consent invalid under the French Data Protection Act. As a result, Google LLC was fined EUR 200 million and Google Ireland Limited EUR 125 million. The companies were also ordered to stop inserting advertisements between emails without prior consent and to ensure valid cookie consent within six months or face additional daily penalties.
On 1 September 2025, the Commission Implementing Decision (EU) 2025/1760 of 19 August 2025 entered into force following its publication in the Official Journal of the European Union (OJ L 2025/1760). The Decision, adopted by the European Commission under Article 41(1) of Directive 2014/53/EU, confirmed that the safeguard measure taken by France through the French National Frequency Agency (ANFR) requiring the withdrawal of the Apple iPhone 12 A2403 was justified. The national measure, first notified on 5 October 2023 via the Information and Communication System for Market Surveillance (ICSMS), was based on findings that the device exceeded the limb specific absorption rate (SAR) limit of 4 W/kg prescribed by harmonised standard EN 50566:2017, which implements Council Recommendation 1999/519/EC. Tests by the accredited laboratory CETECOM confirmed retained SAR values of 5.615 W/kg and 5.740 W/kg, resulting in a withdrawal decision of 12 September 2023. Although Apple issued a corrective software update (iOS 17.1) to activate the “on-body” state of the Body Detect function permanently, its availability was limited to France. The Commission determined that corrective measures must be Union-wide under Article 40(3) of Directive 2014/53/EU, thereby validating the French safeguard measure.
On 28 August 2025, the Regulatory Authority for Audiovisual and Digital Communication (Arcom) announced that six pornographic sites designated under the 26 February 2025 ministerial order had complied with the requirement to implement age verification systems, following formal notices issued under the law on securing and regulating the digital space. As a result, Arcom’s Board decided not to pursue blocking or delisting measures, noting that one site had reactivated its system after receiving observations earlier in August. Arcom welcomed the deployment of existing age verification tools, stressed that it would continue monitoring compliance with its technical framework, and warned of possible sanctions in the event of breaches.
On 26 August 2025, the Ministry for Artificial Intelligence and Digital Affairs announced a lawsuit against the livestreaming platform Kick under Article 6-3 of the Law on Confidence in the Digital Economy, which provides for the State’s ability to act against harmful or illegal content online. Article 6-3 of the Law empowers the judicial court president to order measures to prevent or stop harm from online public communication content. The decision followed the death of an individual, which was broadcast live on the platform between 17 and 18 August.
On 25 August 2025, the Paris Prosecutor’s Office opened an investigation into the Kick platform over the alleged proliferation of violent content online. The investigation was instituted under Article 323-3-2 of the Penal Code, following the death of an individual, which was broadcast live on the platform. The investigation seeks to determine whether the platform knowingly provided illegal services by broadcasting violent content and whether it complied with obligations under the European Digital Services Act (DSA), including notifying authorities of risks to life or personal safety. Article 323-3-2, amended by the law of 13 June 2025, criminalises breaches of DSA obligations and carries penalties of up to 10 years’ imprisonment and a EUR 1 million fine when committed in an organised manner.
On 22 August 2025, the Regulatory Authority for Audiovisual and Digital Communication (Arcom) condemned the unblocking of the “Jeanpormanove” channel on the Kick platform. The channel had previously broadcast violent content linked to the death of an individual. Arcom stressed that sharing recordings with authorities does not justify restoring public access to the channel. It demanded that Kick reinstate the block without delay and warned of possible enforcement measures if the platform fails to comply.
On 20 August 2025, the Regulatory Authority for Audiovisual and Digital Communication (Arcom) opened an investigation into the platform Kick and initiated exchanges with the Maltese regulator responsible for its supervision, following the designation of a legal representative of the service in Malta as required under the Digital Services Act (DSA). Arcom requested detailed information on resources allocated to French-language content moderation and the handling of the channel “Jeanpormanove”, including reports received and measures taken against potentially illegal content. In parallel, Arcom received an initial response from the platform committing to full cooperation, with further exchanges scheduled. Arcom confirmed it will pursue actions to ensure compliance with internet actors’ obligations, particularly concerning the protection of minors, and will intensify its cooperation within the framework of the Observatory of Online Hate with entities such as OFAC/Pharos, DILCRAH, PNLH, CNCDH, and trusted flaggers. Arcom highlighted that the involvement of the Human Rights League (LDH) had led to engagement with German counterparts and the European Commission and reiterated that both the DSA and national legislation provide a framework to safeguard fundamental rights in the digital sphere while balancing freedom of expression, human dignity, and innovation.
On 7 August 2025, Decree No. 2025-768 on the establishment of a connection threshold from which online platform operators are required to temporarily retain illegal content that has been reported and removed or made inaccessible entered into force. Online platforms are defined under point (i) of Article 3 of Regulation (EU) 2022/2065 on a Single Market for Digital Services (Digital Services Act). The decree, applicable in mainland France, New Caledonia, French Polynesia, and the Wallis and Futuna Islands, sets this threshold at 10 million unique monthly visitors from French territory, calculated on the basis of the previous calendar year, pursuant to Article 6(VI) of Law No. 2004-575 of 21 June 2004 on confidence in the digital economy, as amended by Article 48 of Law No. 2024-449 of 21 May 2024 on securing and regulating the digital space. It repeals Decree No. 2022-32 of 14 January 2022 adopted under Article 42 of Law No. 2021-1109 of 24 August 2021 on upholding respect for the principles of the Republic and concerning the establishment of a connection threshold for online platform operators in combating the public dissemination of illegal content.
On 4 August 2025, the Regulatory Authority for Audiovisual and Digital Communication (Arcom) issued a formal notification to five pornographic websites based in Cyprus and the Czech Republic to comply with the Law on Securing and Regulating the Digital Space (SREN Law, Law No. 2024-449). Under the notification, the sites have three weeks to comply with the age verification requirement in the SREN law before Arcom will initiate delisting and blocking proceedings against them. The notification follows the issuance of formal observation letters by Arcom to the operators of these websites on 11 June 2025, and their subsequent failure to comply. These websites had been designated by ministerial order as subject to legally mandated age verification requirements under the SREN law. Arcom also sent a letter of observation to a supplier who had withdrawn its age verification system following the suspension of the order. The letters represent the next step toward potential blocking or delisting if non-compliance continues. Arcom also notified the relevant national authorities in accordance with the European Union’s cooperation mechanisms. The Aylo group, operator of Pornhub, YouPorn, and RedTube, was not affected, as it had withdrawn from the French market.
On 4 August 2025, the Government adopted Decree No. 2025-767 on the presentation of the warning message concerning the illegal nature of behaviours represented in pornographic content. The decree implements Articles 1-3 of Law No. 2004-575 of 21 June 2004 on confidence in the digital economy, as inserted by Article 12 of Law No. 2024-449 of 21 May 2024 on securing and regulating the digital space. The decree establishes the exact wording of the mandatory warning for content simulating rape, aggravated rape, or incestuous rape, specifying that the message must be displayed in white text on a black background for a minimum of 12 seconds in full screen before access to the content, and at the bottom of the screen throughout its viewing, without any alteration, concealment, or separation from other indications or images. The decree applies to producers of pornographic audiovisual works, providers of online content hosting services containing pornographic material, and state services responsible for combating offences against minors and violent offences against persons, and is applicable in New Caledonia, French Polynesia, and the Wallis and Futuna Islands. It enters into force two months after its publication in the Official Journal of the French Republic.
On 4 August 2025, the Government of France adopted Decree No. 2025-768 on the establishment of a connection threshold from which online platform operators, as defined in point (i) of Article 3 of Regulation (EU) 2022/2065 on a Single Market for Digital Services (Digital Services Act), are required to temporarily retain illegal content that has been reported and removed or made inaccessible. The decree, applicable in mainland France, New Caledonia, French Polynesia, and the Wallis and Futuna Islands, sets this threshold at 10 million unique monthly visitors from French territory, calculated on the basis of the previous calendar year, pursuant to Article 6(VI) of Law No. 2004-575 of 21 June 2004 on confidence in the digital economy, as amended by Article 48 of Law No. 2024-449 of 21 May 2024 on securing and regulating the digital space. It repeals Decree No. 2022-32 of 14 January 2022 adopted under Article 42 of Law No. 2021-1109 of 24 August 2021 on upholding respect for the principles of the Republic and concerning the establishment of a connection threshold for online platform operators in combating the public dissemination of illegal content and enters into force the day after its publication in the Official Journal.
On 28 July 2025, the French Data Protection Authority (CNIL) opened a consultation, running until 30 September 2025, on guidelines on the deployment of web filtering. The guidelines apply to public and private sector employers implementing web filtering tools for employees, contractors, or visitors using professional internet or Wi-Fi networks, excluding open public Wi-Fi providers. They outline General Data Protection Regulation (GDPR)-compliant obligations, including limiting data collection to the user ID, IP address, and domain name; ensuring a legal basis of legitimate interest or legal obligation; conducting Data Protection Impact Assessments where required; consulting employee representatives; and informing affected individuals. The guidelines also address deployment risks across on-site, Software as a Service (SaaS), and hybrid models, and recommend strong security and pseudonymisation measures to safeguard logs.
On 25 July 2025, the Regulatory Authority for Electronic Communications, Postal Services and Print Media Distribution (Arcep) closed the consultation on the draft recommendation concerning the interoperability and portability of cloud computing services, aimed at supporting the implementation of Regulation (EU) 2023/2854 on harmonised rules for fair access to and use of data and Law No. 2024-449 on securing and regulating the digital space (Digital Space Regulation Law). The consultation sought input from stakeholders on non-binding good practices for cloud service providers to facilitate the change of provider and the simultaneous use of services from multiple providers (multi-cloud), in alignment with the Data Act’s requirements for contractual, technical, and organisational measures and the Digital Space Regulation Law’s obligations on interoperability, portability, and the provision of application programming interfaces (APIs). The draft recommendation detailed transparency measures, including the publication of procedural, technical, and security information on data migration, API documentation, compatibility with standards, and advance notice for non-backwards-compatible updates, as well as the adoption of the most recent OpenAPI specification for API description and documentation.
On 24 July 2025, the National Commission on Informatics and Liberty (CNIL) closed its consultation on its draft recommendation on the use of tracking pixels in emails. The draft recommendation applies to all public and private organisations using tracking pixels in emails and their service providers, focusing specifically on email tracking. It aims to clarify obligations, particularly regarding user consent under Article 82 of the French Data Protection Act, which transposes the ePrivacy Directive. Any subsequent processing of personal data collected must also comply with the General Data Protection Regulation (GDPR). Generally, prior consent is required for using tracking pixels, except for strictly necessary purposes like user authentication or aggregate, anonymised open rate measurement for deliverability. Purposes requiring consent include individual open rate analysis for campaign performance, tailoring communications, profiling for targeting, and fraud detection. The recommendation provides guidance on informing users clearly about tracker purposes and obtaining valid consent. Consent should ideally be obtained when collecting the email address or via a tracking-free email. Consent must be freely given, allowing granular choices. Users must also be able to withdraw consent easily, typically through an email footer link. Organisations must maintain proof of valid consent.
On 24 July 2025, the National Commission on Informatics and Liberty (CNIL) closed its consultation conducted in the form of a questionnaire. The consultation sought feedback on the direct and indirect economic impact of a draft recommendation concerning tracking pixels in emails. Stakeholders were invited to provide specific, objective, and quantified information, particularly on competitive impacts and alternative market solutions. The CNIL aims to understand the business models and economic consequences of its proposed regulatory approach. Information collected will help quantify the potential impact of the recommendation compared to the current situation.
On 22 July 2025, the National Commission on Informatics and Liberty (CNIL) adopted its recommendations on the application of the General Data Protection Regulation (GDPR) to the development of artificial intelligence (AI) systems. The recommendations clarify that AI models trained on personal data are often subject to data protection rules due to their memorisation risks. The guidance applies to AI developers, providers, and deployers across sectors, including health, education, and workplaces, requiring clear purpose definition, the assignment of responsibilities as controller or processor, and a lawful basis for data processing, including consent, contract, or legitimate interest. It emphasises data minimisation, retention limits, and privacy-by-design approaches, including federated learning, homomorphic encryption, and robust re-identification risk testing, especially for web-scraped or sensitive data. The CNIL urges transparent communication and mechanisms to enforce rights of access, rectification, erasure, and objection, along with secure data handling, compliant annotation, and retraining or output filtering to mitigate memorisation. Data Protection Impact Assessments (DPIAs) must be performed for high-risk processing, addressing AI-specific risks like bias and data leakage.
On 19 June 2025, the French Data Protection Authority (CNIL) adopted guidance on using legitimate interest as a legal basis for developing Artificial Intelligence (AI) systems under the General Data Protection Regulation. The guidance applies to private organisations that process data without relying on consent and to public bodies engaging in activities beyond their core public missions, including human resources management. It clarifies that legitimate interest can only be used if three conditions are met: the interest pursued is lawful, the processing is necessary, and it does not override individuals’ rights, which requires a balancing test. Data controllers must assess and document compliance with these conditions and implement safeguards, particularly when a data protection impact assessment is required.
On 19 June 2025, the French Data Protection Authority (CNIL) adopted a guideline outlining the obligations for data controllers collecting personal data through web scraping, particularly when relying on legitimate interest as a legal basis to develop Artificial Intelligence (AI) systems. The guidance applies to organisations engaging in harvesting publicly accessible data and emphasises the need to minimise harm to individuals’ rights under the General Data Protection Regulation. It highlights mandatory safeguards, including defining collection criteria, excluding sensitive or unnecessary data, respecting technical and legal opposition to scraping, avoiding data from vulnerable populations or private contexts, and ensuring transparency and avenues for objection. Additional measures include pseudonymisation, anonymisation, and preventing inappropriate cross-referencing of identifiers. The guideline also urges controllers to assess whether such processing aligns with individuals’ reasonable expectations and to ensure compliance with other applicable laws, including copyright.
Last updated: 7 February 2026