Is Privacy Paramount? Facial Recognition Under Investigation in Canada

By: Lee Rickwood

March 23, 2020

Halifax, Ottawa, Oshawa, Toronto, Calgary, Edmonton.

These are among the municipal police forces that now acknowledge using facial recognition technology in Canada. That acknowledgement came only after initial denials, follow-up internal investigations, and revelations of what appears to be unauthorized use of facial recognition systems by individuals within those police services.


Biometric identification technologies like facial recognition are widely used around the world for everything from supermarket checkouts to social security to stopping pandemics. Image credit: The Hans.

Regional, provincial and national forces are now facing the facts, acknowledging that they have used biometric identification technologies like facial recognition.

And while some police services have mentioned a specific facial recognition tool, one developed by U.S.-based start-up Clearview AI, that is clearly not the only such tool out there, nor is it the only one used by Canadian law enforcement agencies.

The complex story that’s emerging about the use of facial recognition technology in Canada has now triggered an investigation by the country’s Office of the Privacy Commissioner.

Three provincial privacy agencies (Quebec, British Columbia, and Alberta) are joining the federal investigation, examining Clearview AI in particular and facial recognition in general, and whether such practices comply with Canadian privacy legislation.

Clearview reportedly scraped millions of websites and social media networks (where a lot of us have posted pictures) so it could build a huge database of faces that its software can search, match and identify. If the images were obtained without permission, Canada’s privacy office says Clearview AI potentially violated Canadian law.
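Face-search systems of this general kind typically reduce each photo to a numeric "embedding" and then match a probe face against the database by vector similarity. A minimal sketch of that generic approach follows; the embeddings, names, and threshold here are invented for illustration, and this is in no way Clearview's actual code or method:

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors (1.0 = identical direction).
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(probe, database, threshold=0.8):
    """Return the database label whose embedding is most similar to the
    probe, or None if no candidate clears the similarity threshold."""
    best_label, best_score = None, -1.0
    for label, emb in database.items():
        score = cosine_similarity(probe, emb)
        if score > best_score:
            best_label, best_score = label, score
    if best_score >= threshold:
        return best_label, best_score
    return None, best_score

# Toy 4-dimensional "embeddings" standing in for real face vectors.
db = {
    "person_a": np.array([0.9, 0.1, 0.0, 0.4]),
    "person_b": np.array([0.1, 0.8, 0.5, 0.0]),
}
probe = np.array([0.88, 0.12, 0.05, 0.38])  # deliberately close to person_a
label, score = best_match(probe, db)
```

In a real deployment the embeddings would come from a trained neural network and the database would hold millions of entries, which is precisely why the scale of the scraping matters.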

Clearview, meanwhile, says it’s been providing its technology to hundreds of law enforcement agencies across Canada and the U.S., as well as private sector institutions like banks and financial organizations.

(Some of the companies on Clearview’s client list have been unceremoniously outed as a result of Clearview itself being hacked and an unauthorized user gaining access to its customer list.)

But even as the Clearview controversy swirls, privacy advocates are learning more about Japanese tech giant NEC Corp: it is one of many companies providing facial recognition software (even as some companies try to distance themselves from the controversial technology).

So NEC, not Clearview, may be the world’s leading, if least known, provider; NEC itself says it is working with police forces, private sector companies, and public agencies in 70 different countries, including Canada. One of our leading telecom companies may be a user.

The technology developed by NEC is called NeoFace; the company recently announced that its widely deployed biometric identification system, NEC WideNet, would be used during the upcoming Summer Olympic Games in Japan (should they actually be held) to monitor every accredited person — including athletes, officials, staff and media — at more than 40 venues, games villages and media centres.

Coincidentally, not only are the Olympics in doubt, so too, apparently, are plans at one of Canada’s leading telecom companies to roll out a facial recognition service based on the NEC technology.

Unpublished Bell web pages, cited by and reproduced in a Quebec newspaper, indicate that Bell is the first Canadian carrier to use NEC facial recognition technology. The webpages indicate that the system can be used to “identify facial images in a database to verify customer identities in real-time, helping prevent ATM theft, identify fraud and more.”

The webpages were apparently created as part of a test or pilot project for a proposed new business offering that has not yet come to market.

All this apparent usage is causing a lot of real concern among privacy advocates, who point out, among other glaring issues with facial recognition, that the technology is highly error-prone.


Facial recognition or face detection technologies can be used in a wide range of settings. Creative Commons image (CC BY 3.0): Jimmy answering questions.jpg.

Of course, all technology has a bleeding-edge; new products are often error-ridden.

But just as concerning to some are the errors made not only in using the software, but in simply reporting its use, particularly when many of those errors are made by the police.

The Toronto Police Service initially denied using Clearview’s program, but the chief has now acknowledged that some officers had been using Clearview AI since last October, apparently without his knowledge. He says he’s ordered them to stop.

(Toronto has been dubbed ‘Canada’s most surveilled city’, based on the number of known video cameras in use, but the efficacy of such installations is questionable. Coupling the cameras with facial recognition software, with its known shortcomings, has not changed that evaluation significantly.)

And when the Ontario Provincial Police confirmed its use of facial recognition, it said it had learned officers were using the software only through an internal review launched following a media inquiry. “Upon learning that some members had commenced testing of the tool through a free trial (!), the OPP ordered the immediate cessation of testing and use of Clearview AI,” the police force said.

The RCMP acknowledged it was using Clearview AI’s facial recognition technology in more than a dozen situations involving investigations into child abuse. The Mounties’ statement acknowledged other uses without adding detail, noting that “[w]hile we recognize that privacy is paramount and a reasonable expectation for Canadians, this must be balanced with the ability of law enforcement to conduct investigations and protect the safety and security of Canadians, including our most vulnerable.”

It sounds reasonable, to use a word from the statement, but look at other words therein:

Paramount, for example. The dictionary meaning is clear: more important than anything else; supreme.

You really can’t balance that out – either something is more important than anything else, or it isn’t. It is paramount, or it isn’t. Canada’s legislators and privacy commissioners need to determine which.

In fact, several elected MPs, members of the House Committee on Access to Information, Privacy, and Ethics, have voted to conduct an investigation of their own into how facial recognition tools are used by governments, police, business, and individuals.


Google Vision face detection: bounding boxes for all faces detected, landmarks on each face (eyes, nose, mouth, etc.), and confidence ratings for face and image properties (joy, sorrow, anger, surprise, etc.). Image credit: Himanshu Singh Gurjar on Unsplash.
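For readers curious what that kind of output actually looks like, here is a minimal sketch that parses a response shaped like the Cloud Vision API's `faceAnnotations` JSON. The response below is fabricated for illustration (the field names follow the public API documentation, but the values are invented):

```python
import json

# A fabricated response in the shape of the Cloud Vision API's
# faceAnnotations field; the numbers and likelihoods are invented.
RESPONSE = json.loads("""
{
  "faceAnnotations": [
    {
      "boundingPoly": {"vertices": [{"x": 10, "y": 10}, {"x": 110, "y": 10},
                                    {"x": 110, "y": 120}, {"x": 10, "y": 120}]},
      "detectionConfidence": 0.97,
      "joyLikelihood": "VERY_LIKELY",
      "sorrowLikelihood": "VERY_UNLIKELY",
      "angerLikelihood": "UNLIKELY",
      "surpriseLikelihood": "UNLIKELY"
    }
  ]
}
""")

def summarize_faces(response):
    """Pull out the bounding box, detection confidence, and joy
    likelihood for each detected face."""
    summaries = []
    for face in response.get("faceAnnotations", []):
        xs = [v.get("x", 0) for v in face["boundingPoly"]["vertices"]]
        ys = [v.get("y", 0) for v in face["boundingPoly"]["vertices"]]
        summaries.append({
            "box": (min(xs), min(ys), max(xs), max(ys)),
            "confidence": face["detectionConfidence"],
            "joy": face["joyLikelihood"],
        })
    return summaries

faces = summarize_faces(RESPONSE)
```

Note that even the API's own output is probabilistic — "likelihoods" and "confidence" scores, not certainties — which underlines the error-prone nature of the technology the privacy advocates are flagging.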

The investigations underway in Canada are examples of what appears to be a global pushback against corporate and government use of facial recognition technologies.

A list of the activities and activists taking action against facial recognition in countries all over the world has been compiled by the Electronic Frontier Foundation; here in Canada, the public advocacy group OpenMedia has assembled information and other resources as part of its Stop Facial Recognition initiative.



  1. Lee Rickwood says:

    It has been some time in coming, but three provincial privacy protection authorities have now (12/21) ordered facial recognition company Clearview AI to comply with recommendations flowing from a joint investigation with the Office of the Privacy Commissioner of Canada. Other countries have announced intent to fine the company tens of millions of dollars, while demanding it delete data on citizens.

  2. Moeugene A says:

    If Clearview violates the Canadian privacy law, then it should be banned – there is no question about it.

    Of course, law enforcement would continue using this tool for efficiency.
