
CBSA to use facial recognition for people facing deportation

Photo: Darryl Dyck, The Canadian Press — The application will provide the CBSA with relevant information that can be used to contact and monitor the client to detect any early indicators of non-compliance.

Anja Karadeglija – The Canadian Press in Ottawa

Published at 9:53 am


The Canada Border Services Agency (CBSA) plans to implement an application that uses facial recognition technology to track people who have been ordered to be removed from the country.

The mobile reporting app would use biometrics to confirm a person's identity and record their location data when they use the app to check in. Documents obtained through the Access to Information Act indicate that the CBSA has proposed such an app as early as 2021.

A spokesperson confirmed that an app called ReportIn will launch this fall.

The CBSA said in a follow-up comment that the app could also be used for permanent residents and foreign nationals who are under investigation to determine whether they are inadmissible to Canada.

Experts have raised numerous concerns, questioning the validity of user consent and the potential secrecy surrounding how the technology makes its decisions.

Every year, about 2,000 people who have been ordered to leave the country fail to show up, meaning the CBSA “must devote significant resources to investigating, locating and, in some cases, detaining these clients,” a 2021 document states.

The agency touted a smartphone app as an “ideal solution.”

Regular updates

The app allows the CBSA to receive regular updates on a person’s residential address, employment, family status and more, giving the agency relevant information that can be used to contact and monitor the client for early indicators of non-compliance.

“In addition, through automation, the client is more likely to feel engaged and recognize the level of visibility the CBSA has into their file,” it added.

The document also states: “If a client fails to show up for their removal, the information collected on the app will provide good investigative leads to locate the client.”

An algorithmic impact assessment of the project, which has not yet been published on the federal government's website, says the voice biometric technology the CBSA tried to use is being phased out due to “failing technology,” and that it has developed the ReportIn app to replace it.

It says that “facial biometrics and a person’s location, provided by sensors and/or the mobile device/smartphone’s GPS” are recorded on the ReportIn app and then sent to the CBSA system. Once individuals submit photos, a “facial comparison algorithm” generates a similarity score with a reference photo.

If the system doesn’t confirm a facial match, it triggers a process for officers to investigate the matter.
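The flow described in the documents — a similarity score compared against a reference photo, with failed matches routed to an officer — can be sketched as follows. This is a hypothetical illustration only: the function names, the threshold value, and the returned fields are assumptions, not details disclosed by the CBSA.

```python
# Hypothetical sketch of the ReportIn check-in flow described above.
# The threshold and all names are illustrative assumptions; the actual
# algorithm and its cut-off are trade secrets per the CBSA documents.

MATCH_THRESHOLD = 0.9  # assumed similarity cut-off; real value not public

def process_checkin(similarity_score, location):
    """Record a check-in: capture location and flag non-matches for review."""
    matched = similarity_score >= MATCH_THRESHOLD
    return {
        "matched": matched,
        "location": location,                 # GPS fix captured at check-in
        "needs_officer_review": not matched,  # failed match triggers investigation
    }

# Example: a low similarity score is flagged for a human officer.
result = process_checkin(0.72, (45.4215, -75.6972))
```

Note that the location is recorded on every check-in regardless of whether the face matches, consistent with the document's statement that location is collected "each time they present themselves."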

“The location of individuals is also collected each time they present themselves and if the individual fails to comply with their conditions,” it says. The document specifies that individuals will not be “constantly followed.”


Amazon seeks to reassure

The application uses technology from Amazon Web Services. It's a choice that caught the attention of Brenda McPhail, director of the Public Policy in the Digital Society program at McMaster University.

She said that while many facial recognition companies submit their algorithms to the US National Institute of Standards and Technology for testing, Amazon has never done so voluntarily.

An Amazon Web Services spokesperson said its Amazon Rekognition technology is “extensively tested, including by third parties like Credo AI, a responsible AI company, and iBeta Quality Assurance.”

The spokesperson added that Amazon Rekognition is a “large-scale cloud system and therefore not downloadable as described in the (National Institute of Standards and Technology) participation guidelines.”

“That’s why our Rekognition Face Liveness was instead tested against industry standards at the iBeta Lab,” which is accredited by the institute as an independent testing lab, the spokesperson said.

A secret algorithm

The CBSA document states that the algorithm used will be a trade secret. In a situation that could have life-changing consequences, McPhail questioned whether it was “appropriate to use a tool that is protected by trade secrets or proprietary secrets and that denies people the right to understand how decisions about them are actually being made.”

Kristen Thomasen, associate professor and chair of law, robotics and society at the University of Windsor, said the reference to trade secrets is a signal that there could be legal barriers blocking information about the system.

There have been concerns for years that people harmed by errors in such systems could be legally barred from obtaining more information because of intellectual property rights, she explained.

Maria Ladouceur, spokesperson for the CBSA, said the agency “developed this smartphone application to allow foreign nationals and permanent residents subject to immigration enforcement conditions to appear without visiting a CBSA office in person.”

She said the agency has “worked very closely” with the Office of the Privacy Commissioner on the app.

“Registration for ReportIn will be voluntary and users will have to consent to both the use of the app and the use of their image to verify their identity.”

Petra Molnar, associate director of York University’s Refugee Law Lab, said there is a power imbalance between the agency implementing the app and the people receiving it.

“Can someone really consent in this situation where there is a huge power differential?” she asked.

If someone does not consent to participate, they can report in person as an alternative, Ms. Ladouceur said.

Risks of error

Kristen Thomasen also warned that there is a risk of error with facial recognition technology, and that this risk is higher for racialized people and people with darker skin. Molnar said it was “very troubling that there was virtually no discussion of human rights impacts in the documents.”

The CBSA spokesperson noted that Credo AI had reviewed the software for bias against demographic groups and found a 99.9% facial match rate across six different demographic groups, adding that the app “will be continually tested after launch to assess its accuracy and performance.”

The final decision will be made by a human, with agents overseeing all submissions, but experts have noted that humans tend to trust judgments made by technology.

Ms. Thomasen said there is a “fairly widely recognized psychological tendency for people to defer to the expertise of the computer system,” which is perceived as less biased or more accurate.
