Deep Lex

Netherlands AI Regulation

Law(s) enacted · Treaty

Europe · CoE Framework Convention signatory

Overview

EU AI Act
  • The EU AI Act (Regulation 2024/1689) applies directly across all member states. Prohibitions on unacceptable-risk AI systems have been in force since 2 February 2025; GPAI model rules since 2 August 2025. High-risk AI obligations are due from 2 August 2026, subject to the Digital Omnibus proposal which may defer enforcement. For the full implementation timeline, governance structure, and current status, see the European Union overview.
  • The Rijksinspectie Digitale Infrastructuur (RDI) has been designated as market surveillance authority. The Dutch Data Protection Authority (AP) is active on AI and data governance. The Netherlands is among the more advanced member states on implementation readiness.

Key Sources

EU AI Act (Regulation 2024/1689)
Rijksinspectie Digitale Infrastructuur (RDI)
EU AI Act National Implementation Tracker
Dutch AI Coalition (NL AIC)
Council of Europe Framework Convention on AI (CETS 225)

This content is for informational and educational purposes only and does not constitute legal advice.

AI Regulation Timeline

  1. 14/04/2026
    ruling

    Amsterdam Court of Appeal ordered X to disclose user data and clarified access rights in automated moderation decisions

    On 14 April 2026, the Amsterdam Court of Appeal ordered X to provide a user with extensive access to their personal data under the General Data Protection Regulation, following an appeal concerning a temporary, automated account restriction that reduced the user’s visibility. The judgment clarifies the obligations of social media platforms and other platform intermediaries that process personal data and deploy automated decision-making systems. The Court held that internal system records, including moderation labels, advertising suitability classifications, spam and authenticity indicators, and account security logs, constitute personal data and must be disclosed, as they affect user visibility and treatment on the platform. It rejected X’s argument that these records are protected trade secrets, finding that such claims require a proportional balancing test and cannot justify a broad refusal of access, particularly where the data subject seeks to verify lawfulness and challenge automated decisions. The Court permitted only narrow redactions for employee identities and precise timestamps, while requiring disclosure of all other relevant data, including links between actions and specific posts. The judgment confirms that automated content moderation decisions fall within transparency and access obligations under the General Data Protection Regulation and upheld uncapped penalty payments to ensure compliance.

  2. 24/03/2026
    interim ruling

Authority for Consumers and Markets reported findings on digital accessibility

    On 24 March 2026, the Authority for Consumers and Markets (ACM) reported findings from inspections under the European Accessibility Act, highlighting that 61% of the largest Dutch online stores are not digitally accessible. The Act applies to e-commerce and digital service providers, particularly companies with at least 10 employees or an annual turnover exceeding EUR 2 million. The findings indicate that many websites prevent users with disabilities, including those using assistive technologies, from completing purchases, with issues including inaccessible buttons and Completely Automated Public Turing test to tell Computers and Humans Apart (CAPTCHA) systems. The ACM identified areas for improvement and cautioned that companies failing to address shortcomings risk enforcement action. The Act imposes obligations to ensure the accessibility of digital services and encourages companies to implement technical and design changes.

  3. 05/03/2026
    closure

    Data Protection Authority adopted report on systemic risks and discriminatory use of artificial intelligence and algorithmic systems

    On 5 March 2026, the Data Protection Authority published a report on AI algorithms addressing algorithmic governance, systemic risks, and the use of AI in recruitment and selection in the Netherlands. The report applies to providers and deployers of AI systems, employers, online job platforms, and public sector organisations, with reference to high-risk AI systems under the EU AI Act. It describes the use of AI across recruitment stages, including job posting, candidate matching, CV screening, and candidate assessment, and notes that risks related to bias may arise across these stages. It also reports differences in the level of information provided to employers and candidates in assessment tools, with reference to requirements under the General Data Protection Regulation and transparency-related provisions under the AI Act. The report refers to expectations for providers regarding AI Act requirements on accuracy, non-discrimination, and explainability, and notes that employers should assess the use of such tools and ensure AI literacy among personnel. It further identifies observations related to algorithm registration by public bodies, public sector use of AI systems, protections for children in AI applications, and risks associated with agentic AI. It also outlines planned work on Digital Services Act transparency compliance for job platforms and ongoing guidance on AI Act implementation.

  4. 21/10/2025
    inquiry

    Data Protection Authority issued a report examining risks of using AI chatbots as voting aids for elections

    On 21 October 2025, the Netherlands Data Protection Authority (DPA) issued a report examining the risks of using AI chatbots as voting aids for elections. The DPA found that chatbots provide strongly biased and polarised political advice, misrepresenting the Dutch fifteen-party system. In an experiment testing four chatbots with fictional voter profiles based on Kieskompas and StemWijzer data, two parties, GroenLinks-PvdA and PVV, dominated first-place recommendations, while parties such as CDA, ChristenUnie, DENK, Forum voor Democratie, and SGP were rarely suggested. The study found that the chatbots funnel left-leaning voters toward GroenLinks-PvdA and right-leaning voters toward PVV, oversimplifying and polarising the political landscape. The DPA noted that, unlike traditional voting aids, chatbots lack transparency, neutrality, and verifiability, with underlying models containing opaque biases. Despite safeguards claimed by providers, the DPA found chatbots offered voting advice in over 99.9% of 21,000 test queries. The DPA also warns citizens against using chatbots for voting advice and urges developers to implement effective safeguards, noting that AI systems influencing elections are classified as high-risk under the EU AI Act.

  5. 16/10/2025
    investigation

    Dutch Data Protection Authority issued its decision maintaining EUR 2.7 million fine imposed on Experian for General Data Protection Regulation violations

    On 16 October 2025, the Dutch Data Protection Authority (AP) issued its decision maintaining the EUR 2.7 million fine it had imposed on Experian on 6 December 2023. In addition to the fine, the original decision imposed two orders on Experian for infringements of Article 5(1)(a) in conjunction with Article 6(1), and Article 12(1) in conjunction with Article 14(1) and (2), of the General Data Protection Regulation (GDPR). These infringements related to Experian's failure to adequately inform data subjects and its processing of personal data for its “Credit Check” service without a valid legal basis. Experian submitted its arguments in objection to the interim findings. After reviewing Experian’s arguments, the AP maintained its conclusion that Experian infringed the GDPR's transparency and lawfulness principles. The AP found that Experian did not take sufficient active steps to provide data subjects with crucial information regarding the processing, its purposes, legal bases, legitimate interests involved, and their rights concerning access, rectification, erasure, and restriction. Furthermore, Experian's reliance on “legitimate interests” as a legal basis for processing personal data for creditworthiness assessments was deemed unlawful. Experian failed to demonstrate the necessity of processing certain personal data for these assessments, and the AP determined that the interests or fundamental rights and freedoms of data subjects overrode Experian's legitimate interests. The processing of financial data, particularly from non-public sources, was considered a serious interference with fundamental rights, potentially harming data subjects by hindering access to basic needs due to negative credit scores. The safeguards implemented by Experian, primarily focused on data accuracy, were deemed insufficient to mitigate these consequences.

  6. 09/10/2025
    outline

    Dutch Data Protection Authority opened consultation on guidelines on targeting and delivery of political online advertising

    On 9 October 2025, the Dutch Data Protection Authority (AP) opened a public consultation, running until 21 November 2025, on guidelines on the targeting and delivery of political online advertising. The consultation concerns the AP’s guidance clarifying the rules on targeting and delivery of political online advertising under the Regulation on Transparency and Targeted Political Advertising (VPR). The VPR applies from 10 October 2025. The guidance explains requirements for controllers using targeting and delivery techniques involving personal data, including the need for explicit consent, the prohibition on profiling with special categories of personal data, and the prohibition on targeting minors under 18 years of age. It further clarifies transparency requirements, such as an internal policy, registers, disclosures on mechanisms and parameters, and annual risk assessments, and describes the rights of journalists, researchers, political actors, and election observers to request information. The AP is responsible for the supervision of these provisions, while the Media Authority oversees the other VPR transparency rules.

  7. 02/10/2025
    adoption

    Dutch Data Protection Authority and Netherlands Authority for Consumers and Markets issued joint statement on obligations for chatbot use in customer service

    On 2 October 2025, the Dutch Data Protection Authority (AP) and the Netherlands Authority for Consumers and Markets (ACM) issued a joint statement requiring organisations that use chatbots in customer service to ensure that individuals are always able to communicate with a human representative. Organisations are required to clearly disclose when a chatbot is being used, and to guarantee that the chatbot does not provide incorrect, evasive, or misleading information. The regulators emphasised that consumer law already obliges businesses to communicate directly, effectively, and accurately, a requirement that is reinforced for intermediary services such as social media, marketplaces, and online platforms under the Digital Services Act (DSA), which mandates that customers must have access to a non-automated communication method. The statement noted that forthcoming amendments to consumer law under the Digital Fairness Act will provide additional clarity for other businesses, while new transparency obligations under the Artificial Intelligence Regulation (AI Act), applicable from 2 August 2026, will require organisations to inform users when they are interacting with an automated system. The AP and ACM indicated they will intensify supervision due to increasing complaints, highlighting risks related to poor responses, lack of human access, unclear identification of chatbots, and information security and privacy vulnerabilities.

  8. 02/10/2025
    ruling

    Amsterdam District Court rules that automatic reversion to profiling-based recommendation system violates Digital Services Act

    On 2 October 2025, the Amsterdam District Court ruled that Meta Ireland, the operator of Facebook and Instagram in Europe, violated the Digital Services Act (DSA). The court found that Meta's practice of automatically reverting to a profiling-based recommendation system, even after a user selected a non-profiling alternative, constitutes an illegal "dark pattern." The automatic switchback, which occurs when users navigate the app or close and reopen it, disrupts user autonomy and causes "choice fatigue." Consequently, the court ordered Meta Ireland to make users' choice for a non-profiling recommendation system "persistent," meaning it must be retained until the user actively changes it. The court also ruled that the option to select a non-profiling system was not "directly and easily accessible" in specific parts of the Instagram and Facebook apps and websites. Meta Ireland was ordered to fix these accessibility issues within two weeks. The court imposed a penalty of EUR 100,000 per day, with a maximum of EUR 5 million, for non-compliance with the orders.

  9. 09/09/2025
    investigation

    Netherlands Authority for Consumers and Markets opened investigation into Snapchat over alleged violations of Digital Services Act concerning illegal vape trade to minors

    On 9 September 2025, the Netherlands Authority for Consumers and Markets (ACM) opened an investigation into the online platform Snapchat in connection with the illegal trade of vapourisers to minors, following an enforcement request submitted at the end of August by the Youth Smoking Prevention Foundation. The investigation focuses on whether Snapchat complies with the requirements of the Digital Services Act (DSA), in particular its obligations to take appropriate and proportionate measures to protect minors from illegal and harmful content. As Snapchat is designated a Very Large Online Platform (VLOP), the ACM is coordinating closely with the European Commission, which exercises direct supervision over VLOPs under the DSA. The ACM stated it would not provide further details during the proceedings, the duration of which depends on the progress of the case, but confirmed that non-compliance could lead to enforcement measures such as a binding instruction, fines, or periodic penalty payments. The ACM clarified that while the DSA regulates processes for handling reports of illegal content and the safeguarding of minors, substantive illegality, such as the ban on online vape sales under the Tobacco and Tobacco Products Act, remains subject to oversight by the Netherlands Food and Consumer Product Safety Authority (NVWA).

  10. 24/07/2025
    investigation

    Authority for Consumers and Markets announced postponement of enforcement order against Apple over application store commission fees

    On 24 July 2025, the Netherlands Authority for Consumers and Markets (ACM) deferred enforcing its order against Apple concerning commissions imposed on dating application (app) providers in the App Store, due to ongoing discussions between Apple and the European Commission addressing the same concerns. The order applies to Apple’s conduct in the mobile app distribution market, where it was found to have abused its dominant position by requiring dating apps to use its payment system and pay commissions of 30%, or 15% for smaller providers. Although a court ruling in June 2025 upheld the ACM’s decision and lifted the suspension of enforcement, the ACM has delayed requiring Apple’s compliance with the commission-related aspect of the order until 1 April 2026.

  11. 23/07/2025
    adoption

    Dutch Data Protection Authority released whitepaper on meaningful human intervention in automated decision making

    On 23 July 2025, the Dutch Data Protection Authority published guidance on implementing meaningful human intervention in automated decision-making systems under Article 22 of the General Data Protection Regulation. The guidance applies to data controllers across all sectors who deploy algorithms for individual decision-making that produce legal effects or significantly affect data subjects. The Authority establishes a four-pillar framework requiring human assessors to consider all relevant factors beyond algorithmic inputs and possess adequate Artificial Intelligence (AI) literacy. It mandates clear technology interfaces that prevent automation bias and present data contextually. The guidance requires appropriate process timing, manageable workloads, and genuine authority to override decisions. It also demands documented governance policies, comprehensive training programmes, and ongoing monitoring through metrics, including override rates.

  12. 15/07/2025
    closure

    Data Protection Authority published report on artificial intelligence and algorithms with a focus on emotion recognition systems

    On 15 July 2025, the Dutch Data Protection Authority (AP) published the fifth edition of its report on Artificial Intelligence (AI) and algorithms, highlighting the growing use yet contested effectiveness of AI emotion recognition systems across sectors, including customer service, healthcare, and wearables. The report warns that these systems risk infringing fundamental rights, including privacy and autonomy, and stresses that organisations must critically assess such systems, deploy them transparently, and secure consent when using them. The report highlighted that since February 2025, AI emotion recognition has been banned in education and workplaces in the Netherlands, with broader regulation and political debate ongoing. The AP also emphasises the need for mature AI governance through mandatory algorithm registration, audits, and bias reduction. It also noted progress on harmonised AI standards and the increasing role of AI in national strategies amid evolving European AI regulation frameworks.

  13. 05/06/2025
    adoption

Authority for Consumers and Markets issued notice requiring digital accessibility compliance under European Accessibility Act

    On 5 June 2025, the Authority for Consumers and Markets (ACM) issued a notice requiring digital accessibility compliance under the European Accessibility Act. The Act applies to e-commerce services and electronic communication service providers, particularly companies with at least 10 employees or an annual turnover above EUR 2 million. It requires websites and applications offering products or services to be accessible to people with disabilities, ensuring independent access to digital services. The ACM encouraged firms to adopt compliance plans, assign responsibility, and implement accessibility measures. The notice highlights both legal obligations and commercial benefits, noting that non-compliant firms risk enforcement and may lose customers. The ACM stated that it would supervise compliance from 28 June 2025, with initial enforcement focusing on critical accessibility issues, and that users would be able to report non-compliant services.

Last updated: 14/04/2026