Every day, thousands of people turn to the Government of Canada (GC) seeking help with essential services. Many cannot call or visit government offices during business hours, making effective online self-service critical.

How existing tools support users

Many GC teams use the GC Feedback tool to improve their Canada.ca content and services. The tool captures over 3,000 questions and answers each day from people navigating immigration applications, employment benefits, tax account management, and more. While teams continue to improve Canada.ca, those improvements alone cannot fully address the challenge of helping people navigate the vast array of GC web content and services.

How AI Answers works

This summer, the Canadian Digital Service (CDS) tested a new approach: AI Answers. It's an AI-based service that gives people short, plain-language responses sourced exclusively from GC websites, with authoritative links to guide them to their next steps. This blog post highlights the AI Answers trial results, including key findings and what we learned.

What we tested

On the Canada.ca Sign in page, we ran a 19-day trial (June 18 to 26 and July 15 to 25, 2025) inviting users to try the new AI Answers service.

Answers were sourced exclusively from Canada.ca, gc.ca, and authorized federal government URLs. Any question that contained personal identifying information was blocked to protect the user's privacy.

The concept was straightforward: let users ask questions in their own words and receive accurate, government-sourced answers instantly.

Screenshot of the Government of Canada online account sign-in page, showing the login options and a pop-up inviting users to try the new AI Answers service.

The technical approach

Our model-independent architecture currently uses GPT-4.1 hosted in Azure Canada, with department-specific prompts for 10 institutions: CIRNAC, CRA, ESDC, FIN, HC, IRCC, ISC, PHAC, PSPC, and TBS.
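
As an illustration of the department-specific prompt approach, here is a minimal sketch of how such a configuration could be wired up. The department codes mirror the institutions listed above, but the file paths, layout, and routing function are hypothetical, not the actual AI Answers implementation.

    from pathlib import Path

    # Hypothetical layout: one shared base prompt plus optional department add-ons.
    BASE_PROMPT = Path("prompts/base_system_prompt.txt").read_text(encoding="utf-8")

    DEPARTMENT_PROMPTS = {
        "IRCC": Path("prompts/ircc.txt"),  # immigration, work permits, visas
        "ESDC": Path("prompts/esdc.txt"),  # EI, CPP, My Service Canada Account
        "CRA":  Path("prompts/cra.txt"),   # tax and business account questions
        # ... CIRNAC, FIN, HC, ISC, PHAC, PSPC and TBS would follow the same pattern
    }

    def build_system_prompt(department: str) -> str:
        """Combine the shared base prompt with a department-specific addendum, if any."""
        path = DEPARTMENT_PROMPTS.get(department)
        if path is None:
            return BASE_PROMPT
        return BASE_PROMPT + "\n\n" + path.read_text(encoding="utf-8")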

Since web content changes frequently, the system searches for and downloads specific pages at the time a question is asked, rather than pre-scraping web content. The AI relies only on information provided on GC websites (learn more about AI Answers).
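
Below is a minimal sketch of that search-then-fetch pattern. It assumes a generic upstream search step and uses Python's standard HTTP client; the function names and the domain allow-list are illustrative, not the service's actual code.

    import urllib.request
    from urllib.parse import urlparse

    ALLOWED_DOMAINS = ("canada.ca", "gc.ca")  # answers come from GC sites only

    def is_allowed(url: str) -> bool:
        """Keep only pages on Government of Canada domains."""
        host = urlparse(url).hostname or ""
        return any(host == d or host.endswith("." + d) for d in ALLOWED_DOMAINS)

    def fetch_context(search_result_urls: list[str], limit: int = 3) -> list[str]:
        """Download the current text of the top allowed pages at question time,
        rather than relying on a pre-scraped copy that may be stale."""
        pages = []
        for url in [u for u in search_result_urls if is_allowed(u)][:limit]:
            with urllib.request.urlopen(url, timeout=10) as response:
                pages.append(response.read().decode("utf-8", errors="replace"))
        return pages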

The AI Answers product team built a detailed system prompt for the agentic system (users interact with a specialized agent) to make sure that answers are clear, concise, and helpful. We built in both human expert evaluation and AI scoring, along with blocking of personal identifying information to protect user privacy.
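
To illustrate the privacy safeguard, here is a hedged sketch of how personal identifying information could be detected and blocked before a question reaches the model. The patterns and function names below are examples only, not the actual rules AI Answers uses.

    import re

    # Illustrative patterns only: email addresses, phone numbers, and 9-digit
    # identifiers (such as a Social Insurance Number with or without separators).
    PII_PATTERNS = [
        re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),          # email address
        re.compile(r"\b\d{3}[\s-]?\d{3}[\s-]?\d{4}\b"),  # 10-digit phone number
        re.compile(r"\b\d{3}[\s-]?\d{3}[\s-]?\d{3}\b"),  # 9-digit identifier (e.g. SIN)
    ]

    def contains_pii(question: str) -> bool:
        """Return True if the question appears to contain personal identifying information."""
        return any(pattern.search(question) for pattern in PII_PATTERNS)

    def screen_question(question: str) -> str | None:
        """Block questions with apparent PII; otherwise pass them through unchanged."""
        if contains_pii(question):
            return None  # caller shows a "please remove personal information" message
        return question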

Trial participation

Trial participation exceeded our expectations: 1,227 user sessions spanning 32 departments, with questions covering 120+ government tasks. Tasks are what people come to the content to do: that can mean getting answers, like learning about a subject, or performing a transaction, like applying for a program.

The question distribution revealed user priorities:

  • 42% (635 questions) on IRCC services: immigration, work permits, and visas
  • 25% (377 questions) on ESDC services: My Service Canada Account sign-in and registration, employment insurance, CPP, and job searches
  • 22% (332 questions) on CRA services: account access, and tax and business questions

Even less-commonly used government services generated multiple questions per topic across many departments, suggesting broad user interest in AI-assisted government interactions.

Screenshot of the AI Answers service page, explaining that the tool can help users find answers to their questions about Canada.ca services and information.

Four key findings

Accuracy that meets enterprise standards: We achieved a 95% accuracy rate on English questions and 94% on French. This was verified through an expert evaluation of 800 trial questions, completed with GC departments and agency partners. This consistency across both official languages is rare in AI applications, where French typically shows 20% higher error rates.

Users found genuine value: 88% of user feedback was positive. Importantly, 22% of feedback respondents said they no longer needed to call government offices or visit in person, while 52% reported saving time on searching and reading.

Unexpected breadth of use: Even though the trial launched from a sign-in page, 50% of questions related to the top 20 government services identified in the GC Task Success Survey. Users asked about a wide range of subjects, from work permits to Canada Pension Plan (CPP) contributions and benefits to tax account access. This revealed strong demand for AI help across the full spectrum of government services.

Multilingual capability: Beyond English and French, the system handled questions in 19 other languages, representing 8.4% of all queries. While accuracy was lower for these languages, we've since implemented translation and search improvements.

What we learned

Short queries need different handling: In June, 12% of questions were just 1-2 words, requiring clarification. We addressed this in July by blocking very short queries, providing instructions on effective use, and offering a search alternative.

Length limits matter: We reduced the maximum question length from 400 to 260 characters after finding longer questions were often confusing or included attempts to manipulate the system.
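
Together, the two adjustments above amount to simple input validation before the model is ever called. Here is a sketch under assumed thresholds (a three-word minimum and the 260-character maximum mentioned above); the messages and function name are illustrative.

    MIN_WORDS = 3    # questions of 1-2 words are redirected rather than answered
    MAX_CHARS = 260  # reduced from 400 during the trial

    def validate_question(question: str) -> tuple[bool, str]:
        """Return (ok, message). Illustrative checks mirroring the trial's adjustments."""
        text = question.strip()
        if len(text.split()) < MIN_WORDS:
            return False, ("Please ask a full question in your own words, "
                           "or use Canada.ca search for single keywords.")
        if len(text) > MAX_CHARS:
            return False, f"Please shorten your question to {MAX_CHARS} characters or fewer."
        return True, ""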

Content gaps become visible: GC partners are now addressing online content gaps and errors that user questions had revealed, turning AI interactions into a diagnostic tool for service improvement.

Safety measures work: Despite concerns about AI risks, expert reviewers found no harmful responses were given, and no attempts to manipulate the system were successful during the trial.

Why this matters

This trial demonstrates that enterprise-scale AI assistance can work across the full breadth of GC services, while maintaining accuracy and safety standards.

The product's flexible, chat agent-based design makes it scalable and ready for future AI needs.

More importantly, it shows we can meet users where they are by providing immediate, accurate assistance right on Canada.ca.

  • This saves time and reduces the need for phone calls and office visits, allowing those service channels to focus on people's more complex service needs.
  • For a digital government, that's not just a technical upgrade, but a meaningful service improvement for people using government services every day.

Next steps

We are making improvements to AI-crafted search queries and translation, and expanding testing across additional Canada.ca pages. The goal is to make sure that people receive the latest, most accurate information possible, no matter which language they use.

Improvement examples (a code sketch follows the list):

  • Translating questions that are not in English or French into English before searching (the AI uses the search results to help it find up-to-date information).
  • Transforming user questions into shorter search queries when needed.
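
Here is a minimal sketch of that query-preparation step. It assumes hypothetical translate and summarize helpers, not the actual AI Answers codebase.

    def prepare_search_query(question: str, language: str, translate, summarize) -> str:
        """Turn a user question into a short search query.
        `translate` and `summarize` are injected helpers (hypothetical): a translation
        service call and a model call that condenses text, respectively."""
        query = question
        # Translate questions that arrive in languages other than English or French,
        # so the search runs against the most complete GC content.
        if language not in ("en", "fr"):
            query = translate(query, target="en")
        # Long, conversational questions often search poorly; condense them
        # into a few key terms when they are well beyond typical query length.
        if len(query.split()) > 12:
            query = summarize(query, max_words=8)
        return query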

We're also exploring integrating custom departmental tools and evaluating potential applications for call-centre agents and social media teams.

Learn more

In the near future, the AI Answers team will publish blog posts about the design, safety, and operations of AI Answers to give deeper insight into this experimental service. We hope that our work helps AI development teams across the public sector deploy safe and effective applications.

Visit the AI Answers trial site.