AI governance in Canada: AIDA and what is happening at the provincial level

TL;DR

Canada has no federal AI statute after Bill C-27 died on prorogation in January 2025, with successor legislation expected in 2026. Quebec's Law 25 already applies to any organisation processing data on Quebec residents, with no business-size threshold and penalties up to CAD 25 million or four per cent of worldwide revenue. Ontario's Trustworthy AI Framework and Bill 194 FIPPA amendments cover the public sector and set a private-sector best-practice direction. PIPEDA is the federal baseline. For non-Canadian SMEs with Canadian customers, Quebec Law 25 is the binding line, not federal AI law.

Key takeaways

- The federal Artificial Intelligence and Data Act (AIDA) inside Bill C-27 died on the order paper when Parliament was prorogued on 6 January 2025. Canada currently has no dedicated federal AI statute. Successor legislation is widely expected in 2026.
- Quebec's Law 25 already applies extraterritorially to any organisation processing personal data of Quebec residents. There is no business-size threshold. A single visit by a Quebec resident can trigger compliance obligations.
- Section 12.1 of Law 25 governs automated decision-making. If a decision about a Quebec resident is made exclusively by automated processing, the organisation must inform the person at or before the decision is communicated and provide access, factors, correction, and human review on request.
- Ontario's Trustworthy AI Framework and the IPC-OHRC Principles bind the public sector and are increasingly adopted as a private-sector best-practice standard. Bill 194 FIPPA amendments came into force on 1 July 2025.
- The Personal Information Protection and Electronic Documents Act (PIPEDA) is the federal baseline for private-sector handling of personal data. Joint OPC guidance from December 2023 covers generative AI specifically. A non-Canadian SME with Canadian customers should treat PIPEDA as the floor and Quebec Law 25 as the binding line.

The owner of a fifteen-person UK consultancy has three clients with Canadian operations, one of which is headquartered in Montreal. She has read in passing that Canada’s federal AI bill stalled, taken that as a sign nothing in Canada really applies to her, and moved on. The reality is the opposite of what she concluded. Federal AI legislation is on hold, but Quebec’s Law 25 already applies to her firm, the automated decision-making provisions cover the AI screening she runs in client work, and the penalties are not theoretical.

This is the position a lot of non-Canadian owners are sitting in. The headline news from Canada in 2025 was the death of AIDA. The unreported news is that provincial rules already cover much of what a federal AI Act would have covered, and they apply extraterritorially. The point of this post is to give you the precise picture, so you can work out which Canadian rules attach to your firm and which do not.

This is not Canadian legal advice. Cross-border specifics warrant a Canadian privacy lawyer for anything that matters. What follows is the orientation map.

What is AI governance in Canada right now?

Canada has no dedicated federal AI statute as of May 2026. The Artificial Intelligence and Data Act, introduced inside Bill C-27 in June 2022, died on the order paper when Parliament was prorogued on 6 January 2025. Federal AI rules are currently a layered mix of PIPEDA, sector-specific guidance from financial regulators, voluntary codes, and active provincial laws. A successor bill is widely expected in 2026 but has not been tabled.

The federal layer rests on the Personal Information Protection and Electronic Documents Act (PIPEDA), which sets ten Fair Information Principles for any private-sector organisation handling personal data for commercial purposes. The Office of the Privacy Commissioner of Canada issued joint federal, provincial and territorial principles for generative AI in December 2023, which interpret PIPEDA for AI use cases. A 2023 Voluntary Code of Conduct for advanced generative AI signed by around thirty firms sits alongside as guidance, not law. OSFI Guideline E-23 covers model risk management in federally regulated financial institutions and takes effect in May 2027 after an 18-month transition.

The provincial layer is where the real obligations sit. Quebec’s Law 25 is the most stringent privacy regime in North America, fully in force since September 2024. Ontario’s Trustworthy AI Framework and the IPC-OHRC Principles bind the public sector and set a private-sector best-practice direction. Alberta has signalled an intention to legislate AI specifically, with British Columbia and other provinces watching closely.

Why does it matter for your business?

It matters because Quebec Law 25 applies extraterritorially with no business-size threshold, and the penalty regime has real teeth. The law covers any organisation, regardless of location, that collects or uses personal information of Quebec residents. Administrative penalties reach CAD 10 million or two per cent of worldwide turnover. Court fines for serious violations reach CAD 25 million or four per cent. Individuals also have a private right of action with statutory damages.
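Both penalty ceilings are "greater of" formulas, which matters once worldwide turnover passes the fixed threshold. A minimal sketch of the arithmetic, assuming turnover is expressed in Canadian dollars (the function name is illustrative, not from any official source):

```python
def law25_penalty_ceilings(worldwide_turnover_cad: float) -> dict[str, float]:
    """Maximum exposure under Quebec Law 25, using the greater-of formulas.

    Administrative monetary penalties: up to CAD 10M or 2% of worldwide
    turnover, whichever is greater. Court fines for serious violations:
    up to CAD 25M or 4%, whichever is greater.
    """
    return {
        "administrative_penalty": max(10_000_000.0, 0.02 * worldwide_turnover_cad),
        "court_fine": max(25_000_000.0, 0.04 * worldwide_turnover_cad),
    }
```

For a firm with CAD 1 billion in worldwide turnover, the percentage branch dominates (CAD 20 million administrative, CAD 40 million court). For a typical SME, the fixed ceilings of CAD 10 million and CAD 25 million are the operative numbers.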

Section 12.1 is the AI-specific provision that catches many common SME use cases. If a decision about a Quebec resident is made exclusively by automated processing, the organisation must inform the person at or before the decision is communicated. On request, the organisation must disclose the personal information used, the factors that led to the decision, the right to correct information, and the opportunity to submit observations to personnel who can review the decision. Resume-screening tools, automated credit-scoring, and AI-driven customer-tier decisions all sit inside this scope unless a human is meaningfully involved, not a rubber-stamp review.
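One practical way to operationalise Section 12.1 is to keep a structured record per automated decision of everything the law obliges you to disclose. A hedged sketch follows; the class and field names are illustrative, not statutory language, and a real implementation would be shaped with Canadian counsel:

```python
from dataclasses import dataclass


@dataclass
class Section121Disclosure:
    """Illustrative record of a Law 25 s.12.1 disclosure bundle."""
    notified_at_or_before_decision: bool   # upfront notice when the decision is communicated
    personal_information_used: list[str]   # data disclosed to the individual on request
    decision_factors: list[str]            # factors that led to the automated decision
    correction_contact: str                # route to correct inaccurate personal information
    human_reviewer_contact: str            # personnel who can review the decision

    def is_complete(self) -> bool:
        # A decision communicated without upfront notice, with no stated
        # factors, or with no human review route falls short of s.12.1.
        return (self.notified_at_or_before_decision
                and bool(self.decision_factors)
                and bool(self.human_reviewer_contact))
```

The point of the structure is the audit trail: if a Quebec resident exercises their Section 12.1 rights, you should be able to produce this bundle rather than reconstruct it after the fact.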

The financial cost of getting this wrong is rarely catastrophic for a small services firm, and the typical first enforcement action is a warning and an order, not a maximum fine. The reputational cost with Canadian clients is harder to quantify and arguably worse, particularly with public-sector or quasi-public-sector customers in Ontario and Quebec.

Where will you actually meet Canadian AI rules?

You will meet them in four places. First, in PIPEDA-level handling of any Canadian customer’s personal data, the federal baseline regardless of province. You need a designated privacy officer, identified purposes before collection, valid consent, limited use, appropriate safeguards, individual access and correction rights, and mandatory breach notification at the real-risk-of-significant-harm threshold. The OPC’s December 2023 generative AI principles add specific guidance for AI-driven processing.

Second, in Quebec Law 25 if any of your customers are Quebec residents. The compliance load is closer to GDPR than to other North American privacy regimes. You need a designated privacy officer, a plain-language public privacy policy, opt-in consent for collection, mandatory privacy impact assessments before new digital systems involving personal data are launched, written agreements covering cross-border transfers with documented equivalent-protection assessments, and Section 12.1 disclosures for automated decision-making.

Third, in Ontario if you serve public-sector customers there. Bill 194 amendments to FIPPA came into force on 1 July 2025 and impose mandatory written privacy impact assessments, mandatory breach reporting at the real-risk-of-significant-harm threshold, and expanded IPC oversight including binding orders. The IPC-OHRC Principles for responsible AI use are increasingly being treated as the private-sector best-practice standard, particularly by organisations bidding into Ontario public-sector procurement.

Fourth, in financial-services-specific obligations if you are a vendor to a federally regulated Canadian bank, insurer or trust company. OSFI Guideline E-23 makes the institution responsible for model risk management across the whole enterprise, including vendor-supplied models. As the vendor, you carry the documentation, audit-trail, and pre-deployment testing load that lets the institution demonstrate compliance to OSFI.

When to ask versus when to ignore

Ask when you have any Quebec-resident customers, when you use AI to make or substantially influence decisions about Canadian individuals, when you serve Ontario public-sector customers, or when you supply AI to a federally regulated Canadian financial institution. Ignore the temptation to wait for federal AI legislation before treating the Canadian landscape as live. Provincial rules already cover the ground a federal act would cover, and they have been enforced through 2025 and into 2026.

A useful filter is the four-question test. One, do you handle personal information of any Canadian resident in a commercial capacity? If yes, PIPEDA applies. Two, do any of those Canadians live in Quebec? If yes, Law 25 applies in full. Three, does your AI make or substantially influence decisions about Canadian individuals without meaningful human review? If yes, Section 12.1 attaches for Quebec residents and the OPC generative AI principles attach federally. Four, do you transfer Canadian personal data outside Canada? If yes, Law 25 imposes adequacy assessment, written safeguard agreements, and individual notification for Quebec-resident data.
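The four-question test is really a small decision tree, and it can be sketched as one. The following is a simplified illustration of that filter, not legal logic; the field and regime names are hypothetical labels for the rules discussed above:

```python
from dataclasses import dataclass


@dataclass
class CanadianExposure:
    handles_canadian_personal_info: bool        # Q1: commercial handling of Canadian data
    has_quebec_residents: bool                  # Q2: any customer lives in Quebec
    automated_decisions_no_human_review: bool   # Q3: AI decides without meaningful review
    transfers_data_outside_canada: bool         # Q4: cross-border data transfers


def applicable_regimes(e: CanadianExposure) -> list[str]:
    """Map the four yes/no answers to the Canadian rules that attach."""
    regimes: list[str] = []
    if not e.handles_canadian_personal_info:
        return regimes                           # no Canadian personal data, nothing attaches
    regimes.append("PIPEDA baseline")
    if e.automated_decisions_no_human_review:
        regimes.append("OPC generative AI principles")
    if e.has_quebec_residents:
        regimes.append("Quebec Law 25 (full)")
        if e.automated_decisions_no_human_review:
            regimes.append("Law 25 s.12.1 ADM disclosures")
        if e.transfers_data_outside_canada:
            regimes.append("Law 25 cross-border transfer rules")
    return regimes
```

Running it for a firm that answers yes to all four questions returns all five regime labels; a firm with no Canadian personal data gets an empty list, which is the PIPEDA-only-or-nothing split the next paragraph works from.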

If the answers are PIPEDA-only with no Quebec customers, the workload is real but proportionate. Appoint a privacy officer, document purposes and consent, implement breach notification, and follow the December 2023 generative AI principles when AI is in the picture. If Quebec is in scope, treat Law 25 as a GDPR-equivalent project and bring in a Canadian privacy lawyer before you finalise the Section 12.1 disclosure wording or the cross-border transfer agreement.

This post sits inside a wider cluster on AI risk, trust and governance for SMEs. The pillar post is AI risk and governance for owner-operated businesses, the structural read on what governance looks like at this scale. For the surrounding regulatory picture, the EU AI Act for UK and EU SMEs, the UK pro-innovation pivot, and the US patchwork cover the other three jurisdictions Canadian-facing firms commonly serve.

For firms operating across more than one jurisdiction, multi-jurisdiction AI compliance for SMEs is the integration view. For internal practice, read the minimum viable AI policy for a small business and the audit trail an SME actually needs. For the Section 12.1 disclosure conversation, disclosing AI use to customers covers the practical wording.

The cluster does not replace specialist legal advice on your specific situation. Quebec Law 25 has edge cases worth a Canadian privacy lawyer’s time, particularly on cross-border transfers and Section 12.1 disclosures for automated decision-making. If you want to talk through where your firm sits in the Canadian picture and what a proportionate response looks like at your scale, book a conversation.

Sources

- Innovation, Science and Economic Development Canada. Artificial Intelligence and Data Act (AIDA) companion document, the official ISED reference for the original AIDA risk-based framework and intended regulatory architecture. https://ised-isde.canada.ca/site/innovation-better-canada/en/artificial-intelligence-and-data-act-aida-companion-document
- DataGuidance (2025). Canada, Bill C-27 dies after Parliament prorogued, the news record confirming AIDA's legislative death on 6 January 2025. https://www.dataguidance.com/news/canada-bill-c-27-dies-after-parliament-prorogued
- Borden Ladner Gervais (2026). A turning point for AI in Canada in 2026, a Canadian law firm analysis of the post-AIDA federal direction and expected 2026 successor legislation. https://www.blg.com/insights/2026/03/a-turning-point-for-ai-in-canada-in-2026
- Commission d'accès à l'information du Québec. The CAI's English portal, the official regulator page for Law 25 with guidance documents and enforcement actions. https://www.cai.gouv.qc.ca/english
- Vidcruiter (2025). Quebec's Law 25 and AI in hiring, a practical guide to Section 12.1 automated decision-making obligations with worked employment examples. https://vidcruiter.com/blog/quebecs-law-25-and-ai-in-hiring/
- Cheq.ai. Quebec Law 25 privacy explainer, a structured walkthrough of the three implementation phases, the extraterritorial scope with no business-size threshold, and the penalty regime up to CAD 25 million or four per cent of worldwide revenue. https://cheq.ai/blog/quebec-law-25-privacy/
- Government of Ontario. Ontario's Trustworthy Artificial Intelligence Framework, the official Ontario page setting out the principles and the Responsible Use of AI Directive that took effect on 1 December 2024. https://www.ontario.ca/page/ontarios-trustworthy-artificial-intelligence-ai-framework
- Office of the Information and Privacy Commissioner of Ontario. Bill 194, Strengthening Cyber Security and Building Trust in the Public Sector Act, the IPC's reference for the FIPPA amendments that came into force on 1 July 2025. https://www.ipc.on.ca/en/resources/bill-194-strengthening-cyber-security-and-building-trust-public-sector-act
- Office of the Privacy Commissioner of Canada (2023). Principles for responsible, trustworthy and privacy-protective generative AI technologies, the joint federal, provincial and territorial principles for applying privacy law to generative AI. https://www.priv.gc.ca/en/privacy-topics/technology/artificial-intelligence/gd_principles_ai/
- Office of the Superintendent of Financial Institutions. Guideline E-23, Model Risk Management, the federal financial regulator's AI and model risk framework, effective May 2027 following an 18-month transition. https://www.osfi-bsif.gc.ca/en/guidance/guidance-library/guideline-e-23-model-risk-management-2027

Frequently asked questions

I am a UK firm with a handful of Canadian customers. Does any of this apply to me?

Yes, in two layers. PIPEDA applies to any private-sector organisation handling personal information of Canadian customers for commercial purposes. Quebec Law 25 applies extraterritorially to any organisation processing personal data on Quebec residents, with no minimum business size and no threshold of customer numbers. If even one of your Canadian customers is a Quebec resident, Law 25 obligations attach, including the automated decision-making rules in Section 12.1 if you use AI to make or substantially influence decisions about them.

Is AIDA actually dead, or is it just paused?

Dead in its specific form. Bill C-27, which contained AIDA, died on the order paper when Parliament was prorogued on 6 January 2025 following the Prime Minister's resignation. No successor bill has been tabled at the date of writing. The federal government has signalled that a renewed AI bill is expected in 2026, with the policy direction shifting somewhat toward infrastructure investment and talent. Treat any future AI legislation as a fresh bill, not a continuation of AIDA, and watch the consultation process when it opens.

What is the practical difference between PIPEDA and Quebec Law 25 for an AI use case?

PIPEDA is principles-based, with ten Fair Information Principles around accountability, consent, limited collection, and individual access. It does not have AI-specific automated decision-making provisions, though the Privacy Commissioner of Canada issued joint generative AI principles in December 2023 that interpret PIPEDA for AI use. Law 25 has explicit automated decision-making rules in Section 12.1, mandatory privacy impact assessments for new digital systems involving personal data, strict cross-border transfer requirements, and administrative penalties up to CAD 10 million or two per cent of worldwide turnover, with court fines reaching CAD 25 million or four per cent for serious violations. Law 25 is the harder line.

This post is general information and education only, not legal, regulatory, financial, or other professional advice. Regulations evolve, fee benchmarks shift, and every situation is different, so please take qualified professional advice before acting on anything you read here. See the Terms of Use for the full position.

Ready to talk it through?

Book a free 30-minute conversation. No pitch, no pressure, just a useful chat about where AI fits in your business.

Book a conversation
