This content is for informational and educational purposes only and does not constitute legal advice.
On 28 January 2026, the Information and Privacy Commissioner of Ontario released guidance on AI scribes in the health sector. The guidance sets out considerations on the development, procurement, and use of artificial intelligence scribes in Ontario’s health sector. It addresses privacy-related obligations under the Personal Health Information Protection Act and refers to the Enhancing Digital Security and Trust Act where applicable. The guidance defines artificial intelligence systems and AI scribes and outlines fundamental principles for the responsible use of artificial intelligence developed jointly with the Ontario Human Rights Commission. It specifies expectations on governance and accountability frameworks, risk management, data minimisation and purpose limitation, privacy impact assessments, security safeguards, breach notification, training and awareness, transparency, and human oversight. It further sets out obligations for developers of AI scribes, custodians who procure AI scribes, and custodians who use AI scribes.
On 28 January 2026, the Office of the Information and Privacy Commissioner for British Columbia released the guidance titled PIPA and AI scribes: best practices for healthcare organizations in BC. The guidance addresses healthcare organisations subject to the British Columbia Personal Information Protection Act (PIPA) that are considering the use of artificial intelligence scribes in clinical settings. It sets out how PIPA applies to the collection, use and disclosure of personal information through AI scribes, including requirements on appropriate purposes under the reasonable person test, express consent from patients and, where applicable, employees, accuracy obligations, security arrangements, access and correction rights, retention and destruction duties, and accountability when personal information is handled by vendors.
On 14 January 2026, the Office of the Privacy Commissioner of Canada expanded an ongoing investigation into X Corp pertaining to the use of the Grok chatbot to create sexualised Artificial Intelligence (AI)-generated deepfake images. The investigation examines compliance with the Personal Information Protection and Electronic Documents Act (PIPEDA), including whether valid consent was obtained for the collection, use, and disclosure of Canadians’ personal information to create deepfakes and whether such processing complies with federal privacy law. The expanded investigation builds on an investigation opened on 27 February 2025 into X's use of personal data to train AI models.
On 9 December 2025, the G7 Industry, Digital, and Technology Ministers issued a Ministerial Declaration outlining their intention to support economic competitiveness and prosperity by advancing initiatives related to artificial intelligence (AI), quantum technologies, the digital economy, and supply chain security. In relation to AI for growth, the Ministers set out plans to promote a human-centric approach, develop conditions for the broader adoption of secure and responsible AI, and address talent shortages and skills gaps. These plans build on the G7 AI Adoption Roadmap, including the SME AI Adoption Blueprint and the Toolkit for SMEs Deploying Artificial Intelligence. For quantum technologies, the Ministers agreed to establish a pilot G7 Joint Working Group to support cooperation on research, development, commercialisation, and related policy discussions. To support a resilient and competitive digital economy, the declaration referenced trusted cross-border Data Free Flow with Trust through the use of Privacy Enhancing Technologies, highlighted efforts to promote competition, fairness, and contestability in digital markets, and reaffirmed the relevance of robust intellectual property frameworks. The Ministers also expressed their intention to strengthen resilient and secure supply chains, particularly in relation to semiconductors and Internet of Things security, and to enhance health security by supporting reliable medical countermeasure supply chains.
On 16 October 2025, the Commission of Access to Information issued guidance to businesses and public organisations concerning the Scribd platform, a global digital library hosting over 200 million user-uploaded documents. The Commission advised that uploading personal or sensitive information to the platform may pose privacy risks, as Scribd is not a personal cloud service and uploaded content is public unless privacy settings are applied.
On 26 September 2025, the Government of Canada, through Innovation, Science and Economic Development Canada (ISED), launched the Artificial Intelligence (AI) Strategy Task Force with a mandate to provide targeted recommendations for the development of Canada’s next AI strategy. The Task Force is composed of experts drawn from academia, industry, finance, and civil society, each consulting their networks on specific themes including research and talent, adoption of AI across industry and government, commercialisation, scaling Canadian AI champions and attracting investment, building safe AI systems and strengthening public trust, education and skills, enabling infrastructure, and the security of Canadian infrastructure and capacity. The Task Force has been established with a mission-focused, time-limited mandate, complementing the ongoing role of the Advisory Council on Artificial Intelligence, and its work will be informed by public engagement through the Consulting Canadians portal. The initiative builds on earlier national measures such as the Pan-Canadian Artificial Intelligence Strategy (PCAIS), with approximately CAD 742 million invested since 2017, and the Canadian Sovereign AI Compute Strategy launched in 2024 with CAD 2 billion in funding to secure access to advanced compute infrastructure.
On 25 September 2025, the Financial Transactions and Reports Analysis Centre (FINTRAC) imposed an administrative monetary penalty of CAD 19.552 million on Peken Global Limited, also operating as KuCoin, for non-compliance with Part 1 of the Proceeds of Crime (Money Laundering) and Terrorist Financing Act and associated Regulations. The Seychelles-incorporated foreign money services business operating in Canada failed to register with FINTRAC. It also failed to report large virtual currency transactions of CAD 10,000 or more with the required information. Additionally, it did not submit suspicious transaction reports despite reasonable grounds to suspect involvement in money laundering or terrorist financing.
On 23 September 2025, the privacy commissioners of Canada, Québec, British Columbia and Alberta published a joint investigation report finding TikTok violated federal and provincial privacy laws through inappropriate collection of children's data and inadequate user consent practices. The investigation found TikTok removed approximately 500,000 underage Canadian accounts annually but failed to implement effective age verification beyond voluntary age gates, allowing extensive tracking and profiling of children under 13 (under 14 in Quebec) for targeted advertising. TikTok's consent mechanisms were deemed inadequate for both adult and youth users due to unclear privacy policies, inaccessible supplementary documents, and a lack of prominent up-front disclosure about data collection practices, including biometric facial analysis for age estimation. TikTok also violated Quebec's transparency requirements by failing to provide clear information about tracking technologies and not ensuring privacy-by-default settings. TikTok agreed to implement enhanced age assurance models using facial analysis and natural language processing, cease targeted advertising to under-18 users except for generic categories, enhance privacy communications with prominent notices about data transfers to China, create youth-specific privacy resources, and establish a privacy settings check-up mechanism. Most commitments must be completed within six months, with privacy impact assessments due within one to four months and monthly progress reports required until full compliance.
On 21 September 2025, the Group of Seven (G7) Cyber Expert Group (CEG) issued a statement on Artificial Intelligence (AI) and cybersecurity. It applies to financial institutions, regulators, technology firms, and AI developers. The statement covers generative AI, agentic AI, and advanced systems, highlighting both opportunities and risks for cyber resilience. It urges organisations to monitor AI developments and to integrate AI into anomaly detection, fraud prevention, and predictive maintenance. Identified risks include AI-powered phishing, malware, data poisoning, prompt injection, and third-party dependencies. The statement also stresses governance, human oversight, model risk management, and AI literacy, and encourages collaboration with academia and industry. It aims to guide risk management as AI adoption grows.
On 18 September 2025, the Canadian Digital Regulators Forum (CDRF), including the Canadian Radio-television and Telecommunications Commission (CRTC), the Competition Bureau Canada, the Copyright Board of Canada, and the Office of the Privacy Commissioner of Canada, released a paper on synthetic media in the digital landscape. The paper addressed the opportunities and risks of generative artificial intelligence and synthetic media across the mandates of the four institutions, including issues of copyright protection, deceptive marketing practices, privacy rights, and broadcasting regulation. It examined challenges related to copyright infringement, authorship, remuneration of creators, consumer protection, transparency of AI-generated content, and the application of Canada’s Anti-Spam Legislation. The paper also provided context on international frameworks such as the European Union’s Artificial Intelligence Act and regulatory initiatives in the United States and the United Kingdom, alongside Canadian initiatives, including the Voluntary Code of Conduct on the Responsible Development and Management of Advanced Generative AI Systems and the AI Strategy for the Federal Public Service 2025–2027. The paper emphasised the implications of synthetic media for Canadian content production, competition, privacy, and copyright enforcement, as well as the broader regulatory and legal uncertainties domestically and internationally.
On 3 September 2025, the Office of the Information and Privacy Commissioner of Alberta (OIPC) issued guidance titled Artificial Intelligence (AI) Scribe Privacy Impact Assessment Guidance. It sets out requirements under Section 64 of the Health Information Act (HIA) for custodians acquiring and implementing AI scribe tools in health care delivery. The guidance covers project descriptions, data flow diagrams, contractual obligations, the limitation principle under Section 58, the duty of accuracy under Section 61, security obligations under Section 60, breach reporting under Section 60.1, and patients’ rights of access and correction under Sections 7 and 13. Custodians are advised to structure vendor contracts so that the AI scribe vendor, acting as an affiliate or information manager under Section 66, collects, uses, discloses, retains, and destroys health information strictly in accordance with the HIA and its Regulation. Contracts must prohibit unauthorised uses such as AI training, require the secure destruction of information at termination, and mandate policies, procedures, and staff training pursuant to Section 63. Vendors are also expected to provide detailed technical information, including tool architecture, training data, hosting arrangements, integration with electronic medical records, accuracy and session controls, logging and retention capabilities, privacy and security governance, and employee training. This information enables the OIPC to evaluate compliance with the HIA.
On 28 August 2025, the Office of the Privacy Commissioner (OPC) announced seven research projects assessing personal information collected by smart devices. Following the completion of the projects, the OPC will issue a report on their findings. The first project, to be completed by the University of Windsor, aims to develop an encryption system for data collected by autonomous vehicles that keeps the data machine-readable but meaningless to potential hackers. The second project, by Toronto Metropolitan University, aims to map the array of applications for health-related machine listening with a privacy focus. The third project, by the Automobile Protection Association, aims to produce a compilation and analysis of the privacy permissions and releases automakers must require of their Canadian customers in exchange for access to features in their vehicles. The fourth project, by Vancouver Island University, aims to develop a privacy-by-design toolkit on smart devices tailored to Canadians aged 16 to 24. The fifth project, by the University of Sherbrooke, aims to assess the technical mechanisms in place to ensure the confidentiality of sensitive information. The sixth project, by the Centre for Addiction and Mental Health, aims to engage the public in discussions of privacy, autonomy, and rights relating to AI in smart devices. The seventh project, by the University of Ottawa, aims to produce recommendations to the legislator with a view to reforming PIPEDA as it applies to female health technology (femtech) mobile applications. It will also launch education and awareness-raising initiatives targeting young women concerning the protection of their data in the digital environment. All projects funded under the Contributions Program are to be completed by 31 March 2026.
On 27 August 2025, the Office of the Privacy Commissioner (OPC) found that Google violated the Personal Information Protection and Electronic Documents Act (PIPEDA). The OPC concluded that Google should de-list certain media articles from its search results because the harm caused to the complainant’s safety and dignity outweighed the public interest in access. Google did not agree to remove the articles and argued that further direction from the courts is required before it would be appropriate for news articles to be de-listed. The OPC, therefore, considered this part of the complaint to be well-founded and unresolved. The finding followed an examination of whether Google had met the accuracy requirements under PIPEDA and whether it collected, used or disclosed personal information only for a reasonable purpose by continuing to display the search results. The OPC determined that Google’s accuracy obligations under PIPEDA do not extend to the content of the articles themselves. It also found that there are limited situations in which it would be inappropriate for a search engine to return results containing personal information about an individual. The case began in June 2017 when an anonymous individual filed a complaint alleging that Google had contravened PIPEDA by including certain media articles in the list of results displayed when their name was searched. Google challenged the OPC’s jurisdiction to investigate. In July 2021, the Federal Court confirmed that the OPC had jurisdiction, and in September 2023, the Federal Court of Appeal upheld that decision.
On 23 August 2025, Global Affairs Canada closed its consultation on the development of a Digital Trade Agreement (DTA) with the European Union. The consultation aimed to inform exploratory discussions on digital trade rules, complement existing agreements such as the Comprehensive Economic and Trade Agreement (CETA), and align with evolving technological developments. A Canada-EU DTA could support inclusive trade, promote digital economy objectives, and enhance cooperation on issues such as data transfers, cybersecurity, AI standards, privacy, and misinformation. The consultation also sought input on data governance, digital authentication, online consumer protection, interoperability standards, and digital inclusion.
On 11 August 2025, the Office of the Privacy Commissioner of Canada (OPC) released the Guidance for processing biometrics for businesses under the Personal Information Protection and Electronic Documents Act (PIPEDA). The guidance provides private-sector organisations with requirements for the collection, use, disclosure, retention, and safeguarding of biometric information. The guidance defines biometric technology, including physiological and behavioural biometrics, and outlines privacy obligations relating to identifying an appropriate purpose, obtaining valid consent, limiting collection, ensuring proportionality, and implementing safeguards proportionate to the sensitivity of biometric data. It specifies measures for accuracy, accountability, and openness, requiring organisations to use privacy-protective systems by design, adopt technical and organisational controls, limit retention periods, maintain audit trails, and ensure compliance by third-party service providers. The guidance emphasises the sensitivity of biometric information capable of uniquely identifying individuals, sets conditions for mandatory breach reporting under section 10.1 of PIPEDA, and mandates transparency on data handling, cross-border transfers, and automated decision-making involving biometric systems.
On 5 August 2025, the Office of the Privacy Commissioner of Canada (OPC) closed its consultation to support the development of a children's privacy code under the Personal Information Protection and Electronic Documents Act (PIPEDA). An element of the consultation was the potential inclusion of age assurance mechanisms, which would support organisations in determining whether children are accessing their products or services. This would enable organisations to apply age-appropriate privacy measures, adjust user interfaces for younger individuals, and obtain parental or guardian consent where applicable. The OPC sought input on how such mechanisms can be implemented in a privacy-sensitive manner, balancing the need for reliable age estimation with limiting data collection and processing.
On 5 August 2025, the Office of the Privacy Commissioner of Canada (OPC) closed its consultation on the development of a children’s privacy code under the Personal Information Protection and Electronic Documents Act (PIPEDA). The consultation focused on data protection regulation concerning the collection, use, disclosure, retention, and safeguarding of children’s personal information. The proposed code is intended to clarify organisational obligations related to meaningful consent, data minimisation, privacy by default, and restrictions on certain data practices. The OPC indicated that the initiative draws on international regulatory developments and is intended to support the exercise of privacy rights by individuals under the age of 18 in digital environments.
On 5 August 2025, the Office of the Privacy Commissioner of Canada (OPC) closed its consultation to support the development of a children's privacy code under the Personal Information Protection and Electronic Documents Act (PIPEDA). The consultation included proposed obligations for the design of digital systems accessed by individuals under 18. These design obligations address interface features, behavioural defaults, and system-level controls. The proposed code would require organisations to implement privacy-protective default settings, restrict the use of tracking technologies, and prohibit the use of deceptive or manipulative design patterns that may influence children's privacy-related decisions. Organisations would also be expected to incorporate the best interests of the child into product and service development through privacy impact assessments and child-informed design processes. The consultation sought feedback on how such obligations should be operationalised across age groups and digital environments.
On 1 August 2025, the Canadian Radio-television and Telecommunications Commission (CRTC) issued a decision conditionally approving the plan submitted by the Canadian Association of Broadcasters (CAB) for the establishment of the temporary Commercial Radio News Fund (CRNF), as part of the implementation of the modernised Broadcasting Act. The decision follows a public consultation that was held from 4 November to 4 December 2024, which invited comments on the CAB’s proposed administration of the CRNF. The fund is designed to provide targeted financial support to commercial radio stations operating outside of Montréal, Toronto, Vancouver, Calgary, Edmonton, and Ottawa-Gatineau, where access to local news is considered more limited. The CRNF initiative complements earlier decisions on base contributions, including Broadcasting Regulatory Policy CRTC 2024-121, CRTC 2024-121-1, and Broadcasting Order CRTC 2024-194, which collectively form part of the CRTC’s broader regulatory framework to ensure online streaming services contribute to Canadian and Indigenous content. The regulatory policy 2024-121 requires these services to contribute a minimum of 5% of their Canadian revenues to the Canadian broadcasting system. The requirement applies to online streaming services with annual revenue exceeding CAD 25 million.
On 22 July 2025, the Competition Bureau closed its consultation on algorithmic pricing. The consultation sought feedback on the prevalence of algorithmic pricing in Canada, the source of the data used in pricing algorithms, the potential impact on markets and consumers, and the challenges it may present for data protection authorities. It also aimed to understand the potential impacts on competition, including issues related to price-fixing, competitor collaborations, anti-competitive acts such as predatory pricing and tying and bundling, and deceptive marketing practices concerning consumer data. Finally, it considered the role AI plays in reshaping pricing strategies and altering the dynamics of competition.
On 17 June 2025, the Leaders of the Group of Seven (G7) adopted a statement on artificial intelligence (AI) for prosperity. The statement recognises the potential of a human-centric approach to AI for economic growth, societal benefit, and addressing global challenges. It aims to drive innovation and adoption of secure, responsible, and trustworthy AI, while working with emerging market and developing country partners to close digital divides, aligning with the United Nations Global Digital Compact. The Leaders committed to accelerating AI adoption in the public sector to enhance public services and increase government efficiency, respecting human rights, privacy, transparency, fairness, and accountability. Canada, as G7 presidency, is launching the G7 GovAI Grand Challenge and will host "Rapid Solution Labs" to address barriers in public sector AI adoption, supported by a new G7 AI Network (GAIN). Relevant Ministers are tasked to explore strategic investments. The statement promotes economic prosperity by supporting small and medium-sized enterprises (SMEs) to adopt and develop AI, respecting personal data and intellectual property rights. The G7 launched the AI Adoption Roadmap with actionable pathways for companies, committing to sustain investments in SME adoption programs, publishing a blueprint, deepening talent exchange, and developing tools for trust, collaborating with partners. Further, the statement addresses the energy challenges of increased AI adoption while harnessing its potential for energy efficiency and innovation. The G7 noted their commitment to cooperate on innovative solutions for energy challenges related to AI and data centres and support innovation that improves energy efficiency and optimises operations. They will advance AI solutions to build secure, resilient, and affordable energy systems. Relevant Ministers are tasked with delivering a work plan on AI and energy before the end of the year. Finally, the statement commits to expanding partnerships with emerging markets and developing country partners to increase access to trusted and secure AI technology.
Last updated: 28/01/2026