FEATURE
National Guidance on Artificial Intelligence
in Mental and Substance Use Health Care
Co-led by the Canadian Centre on Substance Use and Addiction (CCSA)
Artificial intelligence (AI) is increasingly reshaping how we access health care. In mental health and substance use health, where needs can be urgent and complex, AI holds both promise and risk. Until now, Canada has had no dedicated framework to guide its safe and effective use in the field. There is an immediate need for guidance to ensure that AI-enabled digital mental health and substance use health tools available to consumers and health care practitioners are safe and follow ethical standards. In a landmark initiative, the Canadian Centre on Substance Use and Addiction (CCSA) and the Mental Health Commission of Canada have partnered to develop Canada's first National Guidance for Artificial Intelligence Use in Mental Health and Substance Use Health Care. This guidance, expected in 2026, will set a new standard for balancing innovation with safety, equity and compassion.
Why the guidance matters
In Canada, mental health and substance use health needs are common, yet many people continue to face significant barriers to care, including limited access, stigma, financial costs and a lack of tailored treatment. There is excitement about AI's potential: it can support triage, service navigation and even elements of therapy delivery. But without clear principles, the risks are significant. Bias in algorithms, privacy concerns and a lack of user safeguards can undermine trust and even cause harm.
Grounded in evidence and experience
CCSA is a national leader on substance use health in Canada, providing guidance to decision makers across the country by harnessing research, curating knowledge and bringing together diverse perspectives. The new national guidance will be developed collaboratively with an advisory group representing health, technology and lived-experience communities. It draws on a literature review and an environmental scan of existing global guidelines. Preliminary findings highlight critical themes shaping this work, including trust and explainability of AI systems, human-centred care, and equity and data governance. The guidance will ultimately provide tools and resources to help clinicians and organizations assess AI solutions, empower people with lived and living experience to make informed decisions about their care, and guide technology developers in creating safe, ethical and culturally appropriate systems.