Artificial intelligence (AI) is no longer science fiction. We now live in an AI-powered world, and AI tools touch nearly every part of our lives. Immigration, Refugees and Citizenship Canada (IRCC), too, is relying on these tools more widely and more frequently.
While AI and automation have the potential to enhance Canada’s immigration system by improving processing times, decreasing backlogs, and identifying fraudulent applications, they also create challenges, including a lack of transparency, weakened oversight, and the risk of bias and discrimination within the system.
The Unknowns of Chinook: Is AI Making the Decisions?
Some of you may be familiar with Chinook, one of the most widely known tools used by IRCC. It frequently comes up in Global Case Management System (GCMS) notes, which include the Officer’s internal notes explaining their decisions. If an application is assessed with the assistance of Chinook, this is usually noted in the GCMS.
What do we know about Chinook? We know that Officers use Chinook to evaluate applications for temporary resident visas, work permits, and study permits. The tool provides a visual representation of an applicant’s details, and it is intended to simplify information and limit the time required to review applications.
It is important to note that IRCC consistently reassures applicants that Chinook does not make decisions and is not an automated decision-making tool. According to IRCC officials, Chinook is only a processing aid that helps Officers assess applications; the final decision always rests with the Officer.
Rethinking Efficiency: Should We Be Concerned?
Although Chinook is intended to be a processing aid rather than a decision-making tool, its role in filtering and presenting application information raises significant concerns. It acts as an interface that brings certain aspects of an application to an Officer’s attention. This interface-driven process may oversimplify complex situations by failing to capture context that a more detailed human analysis would reveal. These concerns extend beyond Chinook to IRCC’s broader use of automation and advanced analytics.
While automation and advanced analytics can streamline IRCC processing, these tools are developed using information from past applications. Relying on historical data risks baking existing stereotypes or biases into the model: an AI system trained on skewed past decisions can continue to make, and reinforce, similar decisions. AI also raises accountability and transparency concerns. It can be hard to understand how an AI system reached a decision, and difficult to challenge which criteria were considered.
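To see this mechanism in the abstract, consider the purely hypothetical sketch below. The data, regions, and threshold are invented for illustration; nothing here describes IRCC’s actual tools, data, or processes. It simply shows that a model fitted to skewed historical outcomes reproduces that skew when scoring new, otherwise identical applications.

```python
# Purely illustrative sketch with invented data -- not IRCC's tools or records.
# It shows how a model that learns from historically skewed decisions
# reproduces that skew when flagging new applications.

from collections import defaultdict

# Hypothetical historical decisions: (applicant_region, outcome).
# Region "A" was historically refused far more often than region "B".
historical_decisions = (
    [("A", "refused")] * 80 + [("A", "approved")] * 20
    + [("B", "refused")] * 20 + [("B", "approved")] * 80
)

# "Training": learn the historical refusal rate for each region.
counts = defaultdict(lambda: {"refused": 0, "total": 0})
for region, outcome in historical_decisions:
    counts[region]["total"] += 1
    if outcome == "refused":
        counts[region]["refused"] += 1

refusal_rate = {r: c["refused"] / c["total"] for r, c in counts.items()}

# "Prediction": flag any new application whose learned refusal rate
# exceeds a threshold -- the historical bias carries straight through.
def flag_for_refusal(region: str, threshold: float = 0.5) -> bool:
    return refusal_rate.get(region, 0.0) > threshold

for region in ("A", "B"):
    print(region, refusal_rate[region], flag_for_refusal(region))
# Prints: A 0.8 True / B 0.2 False -- identical applications, different flags.
```

The point is not that any IRCC tool works this way, but that any model fitted to skewed historical outcomes will, absent deliberate corrective measures, carry that skew forward into new decisions.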
A Refusal Is Not the End
Even when Officers rely on an AI tool to help evaluate an application, they are still expected to meaningfully engage with the supporting documentation and provide the applicant with a reasonable decision. If you believe the use of AI negatively affected your application, or that the Officer overlooked your evidence and did not give you meaningful reasons for the refusal, there are recourses available to challenge and overturn the decision, including a request for reconsideration or judicial review at the Federal Court. Contact us for a consultation to discuss your options.
Thank you to our Summer Student David Mucz for contributing to this blog.