This content is for informational and educational purposes only and does not constitute legal advice.
On 11 December 2025, the Council of States adopted the amended Motion 24.4596 after the National Council passed the Motion in its amended form on 16 September 2025. The Motion instructs the Federal Council to create the necessary legal conditions to ensure that journalistic content and other works protected under copyright law receive comprehensive protection when used by artificial intelligence providers. The Motion requires clarification in the Copyright Act (URG) that copyright holders’ consent is necessary when creative works are processed or reused for generative AI services, and that exceptions or limitations under copyright law cannot be invoked by such providers. The amended text further mandates that protection be designed without weakening Switzerland’s position in AI research, development, and commercialisation, ensuring compatibility with international frameworks. The Federal Council supports the Motion and is tasked with drafting implementation measures.
On 29 October 2025, the Federal Council opened a consultation, running until 16 February 2026, on the draft Federal Law on Communications Platforms and Search Engines, which includes provisions on supervisory authority over content moderation. The Law would apply to search engine providers and user-generated content platforms whose services are used by at least 10% of the Swiss population within a six-month period. The draft Law empowers the Federal Office of Communications (BAKOM) to supervise compliance, requiring providers to supply information and data for oversight. BAKOM identifies which platforms fall under the Law, publishes a list of supervised providers, and levies administrative fees as well as an annual supervisory fee based on its administrative costs and provider size, capped at 0.05% of worldwide profit. BAKOM may require providers to remedy violations and, if necessary, order service restrictions, which can be renewed until the violation is resolved. It can impose administrative sanctions of up to 6% of worldwide turnover for serious breaches and lower amounts for less severe violations, taking into account severity, prior breaches, and financial circumstances. The authority to sanction expires seven years after major violations and four years after others. The Federal Council issues implementing provisions, while BAKOM informs the public of its activities without disclosing confidential business information. BAKOM may process data of legal persons and personal data, including sensitive data, to perform its supervisory, evaluation, and reporting tasks. The Federal Council may conclude international agreements or delegate technical agreements to BAKOM. Finally, the Federal Council will review the Act's effectiveness within five years, report to the Federal Assembly, and determine its entry into force.
On 29 October 2025, the Federal Council opened a consultation, running until 16 February 2026, on the draft Federal Law on Communications Platforms and Search Engines, which includes content moderation rules. The Law would apply to search engine providers and user-generated content platforms whose services are used by at least 10% of the Swiss population within a six-month period. The draft Law mandates that providers establish a procedure for users to report content they believe to be unlawful, covering offences such as depictions of violence, defamation, threats, discrimination, and incitement to hatred. Providers must process these reports, decide on action in a timely manner, and promptly inform reporting users of their decisions. Additionally, if providers take restrictive measures, such as content removal, visibility restriction, or account suspension or deletion, they must notify the affected user, unless the measure concerns misleading commercial content or the user's contact details are unknown. Providers must clearly explain in their terms and conditions which user content may face restrictions and how such measures are applied. They are required to include details on reporting procedures, the handling of reports, and internal complaint mechanisms, all written in plain German, French, and Italian. Significant changes to the terms must be communicated appropriately, and both the full terms and a summary must be publicly accessible. When applying restrictive measures or conducting related procedures, providers must act carefully, fairly, and without discrimination. Procedures must be user-friendly, available electronically, and understandable in a language chosen by the affected user. Notifications of decisions must explain the specific reasons, indicate whether automated tools were used, and inform users of their right to lodge complaints internally or through out-of-court dispute resolution.
On 29 October 2025, the Federal Council opened a consultation, running until 16 February 2026, on the draft Federal Law on Communications Platforms and Search Engines. The Law would apply to search engine providers and user-generated content platforms whose services are used by at least 10% of the Swiss population within a six-month period. The Law would require providers to establish a designated contact point through which users and the Federal Office of Communications (BAKOM) can quickly reach them electronically in an official language. Furthermore, providers not domiciled in Switzerland would need to designate a Swiss legal representative. Providers must publish this contact information, keep it current, and ensure it is easily accessible. Providers would further be required to submit an annual transparency report to BAKOM specifying the average number of monthly Swiss users over a six-month period (to be updated every six months), the automated and non-automated content moderation measures employed, quality assurance measures in content moderation, and the number of user reports of unlawful content, restrictive measures taken against user content, internal complaints procedures, and dispute resolution processes. Providers would also be obligated to conduct an annual assessment of systemic risks created or amplified by search engines or communications platforms, including the dissemination of unlawful content, adverse effects on users' fundamental rights, and negative consequences for public opinion formation, elections, public security and order, and public health. Providers would also be required to undergo an annual audit of their compliance with the requirements of the Law.
On 29 October 2025, the Federal Council opened a consultation, running until 16 February 2026, on the draft Federal Law on Communications Platforms and Search Engines, which includes measures concerning users' rights. The Law would apply to search engine providers and user-generated content platforms whose services are used by at least 10% of the Swiss population within a six-month period, and would require them to set up a free internal complaint procedure allowing users to challenge decisions or restrictive measures affecting their content. Complaints must be accepted for at least six months after notification, handled promptly, and processed under qualified supervision rather than by solely automated means, with users informed of the outcome. The Law would allow users to bring disputes before authorised out-of-court bodies at any time, with providers obliged to participate unless the matter is already before a Swiss court or arbitral tribunal. Procedures must generally be completed within 90 days, extendable by up to 90 days in complex cases, and the body must issue a report but cannot impose binding settlements. Users may be charged a protective fee, which the provider reimburses if the user prevails, while providers bear all other procedural costs. The Federal Office of Communications (BAKOM) would authorise dispute resolution bodies and oversee their independence, procedures, and expertise requirements, and the bodies must report annually on their activity, outcomes, and procedure durations.
On 29 October 2025, the Federal Council opened a consultation, running until 16 February 2026, on the draft Federal Law on Communications Platforms and Search Engines, which includes data access rules. The Law would enable research bodies and independent civil society organisations to apply to the Federal Office of Communications (BAKOM) for access to data from communications platforms or search engines where the data serve to identify and understand systemic risks. Applications must specify the provider and the data requested, guarantee data security and limited access, restrict use to the stated research purposes, and commit to making the results freely available. BAKOM would approve qualifying applications and instruct providers to grant access within a reasonable period, with the Federal Council setting the application and processing procedures.
On 29 October 2025, the Federal Council opened a consultation, running until 16 February 2026, on the draft Federal Law on Communications Platforms and Search Engines, which includes fair marketing and advertising practice requirements. The Law would apply to search engine providers and user-generated content platforms whose services are used by at least 10% of the Swiss population within a six-month period. It would require these providers to clearly label paid advertising and give users easy access to the parameters determining how advertisements are displayed. It would also require the establishment of a publicly accessible and searchable advertising archive, excluding unlawful or incompatible content, with advertising retained in the archive for one year after its last display. Providers would also be required to publish the main parameters of their recommendation systems and to offer at least one recommendation system not based on profiling.
On 6 October 2025, the Federal Data Protection and Information Commissioner (FDPIC) released version 1.1 of the guidelines on data processing using cookies and similar technologies, updating the first finalised version of 22 January 2025. The revised guidelines introduce several additions and clarifications, including an expanded footnote 5, a new sentence in section 3.1.2 on personal identifiability through cookies, and a clarification in the final sentence of section 3.2.2 regarding data collection by third parties. Further updates include additions in section 3.5.2 on the principle of proportionality, clarifications in section 3.6 on the permissibility of non-essential cookies, and an addition in section 3.8.1 on contractual use. Section 3.9 now contains a new reference on the right to reject cookies and default settings, while section 3.10.1 includes clarifications on qualified use of cookies involving high-risk profiling. The FDPIC also added new material in section 3.11.1 and clarifications in section 3.11.3 on embedded third parties, a new paragraph in section 3.12.3 on specific consent, and further clarification in section 3.12.4 on voluntary consent, particularly concerning free services and cookie paywalls. The updated guidelines continue to set out data protection requirements under the Federal Data Protection Act (FADP) and Article 45c of the Telecommunications Act (TCA), drawing on judicial precedent, supervisory practice, and academic opinion.
On 1 October 2025, enforcement of fines under the amended Information Security Act began. Since 1 April 2025, operators including cloud computing providers, search engines, digital security and trust services, and data centres in Switzerland have been required to report cyber-attacks within 24 hours of discovery via the Federal Office for Cyber Security (BACS) platform. During the initial six-month period, reporting was mandatory but not subject to penalties. With the introduction of fines, operators that fail to report incidents affecting infrastructure functionality, causing data leaks or manipulation, or involving threats or coercion are now subject to enforcement measures.
On 18 September 2025, the Swiss Federal Data Protection and Information Commissioner (FDPIC) and the Information Commissioner’s Office (ICO) of the United Kingdom signed a Memorandum of Understanding (MOU) in Seoul for the facilitation of international cooperation in the field of data protection. The MOU establishes a framework for cooperation in areas such as the exchange of best practices, joint research projects, collaboration in international fora, and potential joint investigations into cross-border data protection incidents, while explicitly excluding the sharing of personal data without a legal basis. It reaffirms applicable legal frameworks, including Article 58(1)(b) of the Swiss Federal Data Protection Act 2020 and Article 50 of the United Kingdom General Data Protection Regulation, and references international instruments such as the Organisation for Economic Co-operation and Development (OECD) Recommendation on Cross-Border Co-operation in the Enforcement of Laws Protecting Privacy and the Global Privacy Assembly’s coordination frameworks.
On 16 September 2025, the National Council adopted with amendments Motion 24.4596, following prior approval by the Council of States on 20 March 2025. The Motion instructs the Federal Council to create the necessary legal conditions to ensure that journalistic content and other works protected under copyright law receive comprehensive protection when used by artificial intelligence providers. The Motion requires clarification in the Copyright Act (URG) that copyright holders’ consent is necessary when creative works are processed or reused for generative AI services, and that exceptions or limitations under copyright law cannot be invoked by such providers. The amended text further mandates that protection be designed without weakening Switzerland’s position in AI research, development, and commercialisation, ensuring compatibility with international frameworks. The Federal Council supports the Motion and is tasked with drafting implementation measures.
On 8 September 2025, the Swiss Federal Council presented its priorities for 2026 at the National Council, including the introduction of a framework law for the secondary use of data. The proposal seeks to establish infrastructures for data reuse under regulated conditions, aiming to balance innovation with the protection of individual rights.
On 8 September 2025, the Swiss Federal Council presented its priorities for 2026 at the National Council, including artificial intelligence (AI) governance. The priorities include examining the expansion of a federal AI coordination office, strengthening oversight capacities, promoting coherent governance, and aligning Switzerland more closely with evolving international standards on AI.
On 20 August 2025, the Swiss Federal Council mandated the Federal Department of Defence, Civil Protection and Sport (DDPS), together with the Federal Department of the Environment, Transport, Energy and Communications (DETEC) and the Federal Department of Economic Affairs, Education and Research (EAER), to draft legislation aimed at strengthening the cyber resilience of digital products in line with Motion 24.3810 of the Security Policy Committee of the Council of States. The drafting was assigned to the National Cyber Security Centre (NCSC), in collaboration with the Federal Office of Communications (BAKOM) and the State Secretariat for Economic Affairs (SECO), with a draft Bill to be submitted for consultation by autumn 2026. The proposed legislation will set cybersecurity requirements for the development and commercialisation of products with digital components, introduce provisions for market surveillance, and establish the legal basis for banning the import and sale of insecure devices. It will also reflect international developments, including the European Union Cyber Resilience Act of 11 December 2024, while being tailored to Switzerland's economic environment and minimising administrative impact on companies.
On 8 May 2025, the Federal Data Protection and Information Commissioner (FDPIC) of Switzerland released guidance affirming that the current Federal Act on Data Protection (FADP), in force since 1 September 2023, applies directly to data processing operations involving artificial intelligence (AI). The guidance was issued following Switzerland's signing of the Council of Europe Convention on Artificial Intelligence, Human Rights, Democracy and the Rule of Law in March 2025. The FDPIC noted that the FADP, formulated in a technology-neutral manner, mandates transparency in AI operations, requiring disclosure of the purpose, operation, and data sources. Additionally, the FADP stipulates the right of data subjects to object to automated processing and to request human review of automated decisions. High-risk AI applications necessitate data protection impact assessments, while applications that undermine privacy, such as large-scale real-time facial recognition or the global surveillance and assessment of individuals' lifestyles, are prohibited.
On 27 March 2025, the Federal Council of Switzerland signed the Council of Europe Convention on Artificial Intelligence, Human Rights, Democracy and the Rule of Law. The Federal Council highlighted that Switzerland is committed to international collaboration in setting AI governance standards while ensuring the protection of fundamental rights. The Convention aims to establish rules for transparency and non-discrimination in machine learning and AI development. Following the signing, the authorities, including the Federal Department of Justice and Police, will prepare the necessary legislative amendments and devise implementation measures by the end of 2026.
On 20 March 2025, the Federal Data Protection and Information Commissioner (FDPIC) concluded its preliminary investigation into X, formerly Twitter, concerning the use of personal data from the platform for training its artificial intelligence system, Grok. Following reports in early summer 2024 about these data processing practices, the investigation aimed to assess the transparency of the operations and the opt-out options available to users. During the enquiry, X designated a local representative in Switzerland and disclosed details about the use of public posts for training machine learning and artificial intelligence models. The FDPIC noted that X introduced an opt-out mechanism on 16 July 2024, allowing users to exclude their public contributions from the training datasets. This feature aligns with the requirements of the Federal Act on Data Protection (FADP), offering control to users in Switzerland and the EU. The FDPIC concluded that, with the establishment of the opt-out option, X is in compliance with the relevant data protection regulations, ensuring users can manage their data usage preferences.
On 12 February 2025, the Swiss Federal Council announced its decision to sign the Council of Europe Convention on Artificial Intelligence, Human Rights, Democracy and the Rule of Law. As part of its outlined regulatory strategy, Switzerland plans to align domestic law with the Convention's requirements and participate in the international development of AI governance standards. The Convention establishes requirements in areas such as transparency and non-discrimination in the use of AI and machine learning. Swiss authorities will prepare the necessary legislative amendments and formulate implementation measures by the end of 2026.
Last updated: 11/11/2025