From many perspectives, it’s a good year for covering your face.
Public health professionals are urging people to don a face mask or facial covering as a way to mitigate the spread of the novel coronavirus.
But covering your face could also mean protecting yourself against the non-consensual use of your face in one of the largest databases of facial images ever – a collection of pictures assembled by Clearview AI and said to number over three billion.
Under enormous social and legal pressure, the facial recognition technology development company Clearview AI has now instituted a mechanism whereby people can request that images of their face be removed from its collection.
(Ironically, one of the key steps in getting Clearview to delete any images it may have of you is to send it an image of you. The company says it cannot search its collection by name, place, or date, so image matching, via face recognition, is the only way to locate your pictures.)
That’s doubly ironic, in fact, because several studies and government reports have found that commercial facial recognition systems (FRS) can be biased, and at times simply unreliable.
So, in what was first seen as a disruptive blow against the unmanaged spread of facial recognition applications, several leading practitioners and proponents of the technology have reversed course (some might say hidden their faces). With Microsoft, IBM and Amazon leading the way, they have halted development and even withdrawn previously pledged financial support for other facial recognition initiatives.
They are but one step ahead of the public policy advisors and advocacy groups calling for an outright ban on the use of facial recognition, whether by police and security forces or businesses and corporations.
As well, privacy commissioners and government agencies in several countries and provinces are conducting their own investigations into the development and use of facial recognition, whether by various police services here in Canada or by international companies doing business here, such as Clearview.
Nevertheless, the technology is still in widespread use, and certain companies are still committed to developing even greater capabilities for facial recognition systems.
Such as recognizing people who are wearing a mask: because more and more people are covering their faces, facial recognition developers are devising ways to recognize and identify people even with a mask on.
A Chinese company called Hanwang Technology (also known as Hanvon) said it has come up with technology that can successfully recognize people even when they are wearing masks.
That capability pushes facial recognition to (some would say past) its limits: training an FRS to identify a face involves a careful algorithmic mapping of a complicated balance of facial reference points, measurements, comparisons, and equations.
When a facial covering masks those reference points, the confidence or success rate of a facial recognition system drops sharply. But system programmers and AI coders are fighting back, training their systems to match on whatever identifying facial elements remain visible (though false identifications are still possible).
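The matching step described above can be sketched in miniature. In broad terms, an FRS maps a face to a numeric feature vector and compares that vector against an enrolled one using a similarity score and a confidence threshold; when a mask hides reference points, part of the vector becomes unreliable and the score drops. The toy six-dimensional "embeddings" and the threshold below are illustrative assumptions, not any vendor's actual algorithm.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face-feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(probe, enrolled, threshold=0.9):
    """Declare a match only when similarity clears the confidence threshold."""
    return cosine_similarity(probe, enrolled) >= threshold

# Toy vectors: the last three components stand in for reference points
# around the nose and mouth, which a mask hides.
enrolled       = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4]
unmasked_probe = [0.88, 0.79, 0.72, 0.61, 0.48, 0.41]  # full face visible
masked_probe   = [0.88, 0.79, 0.72, 0.0, 0.0, 0.0]     # lower face occluded

print(is_match(unmasked_probe, enrolled))  # True: whole face, high score
print(is_match(masked_probe, enrolled))    # False: score falls below threshold
```

The masked probe still agrees with the enrolled vector on the visible upper-face components, which is exactly why researchers can retrain systems to lean harder on those features, and why false matches become more likely when they do.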
A company called Trueface has unveiled a combination system with facial recognition, mask, and temperature detection based on sophisticated AI-based training and image processing capabilities (as explained and demonstrated by computer vision researchers like Mosalam Ebrahimi at Trueface).
So just covering your face these days may not be enough.
Signal, developer of the popular cross-platform encrypted messaging service, has introduced an interesting new blur feature in its image editor. The company says it can help protect the privacy of people captured in a personal photo, say fellow participants in ongoing public demonstrations against anti-Black racism and police malfeasance, by obscuring the recognizable details of their faces before the photo gets posted or shared too widely.
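Signal has not published the internals of its blur tool, but the general idea, destroying recognizable detail in a selected region of pixels, can be sketched with a simple pixelation pass. This toy version replaces each small tile of a grayscale image with its average value; the image size, region coordinates, and block size are illustrative assumptions.

```python
def pixelate_region(image, top, left, height, width, block=4):
    """Obscure a rectangular region of a grayscale image (a list of lists of
    0-255 ints) by replacing each block-by-block tile with its average value.
    Averaging discards the fine detail a recognizer would rely on."""
    for by in range(top, top + height, block):
        for bx in range(left, left + width, block):
            ys = range(by, min(by + block, top + height))
            xs = range(bx, min(bx + block, left + width))
            tile = [image[y][x] for y in ys for x in xs]
            avg = sum(tile) // len(tile)
            for y in ys:
                for x in xs:
                    image[y][x] = avg
    return image

# A tiny 8x8 "image" with a high-contrast checkerboard "face" region.
img = [[(255 if (x + y) % 2 else 0) for x in range(8)] for y in range(8)]
pixelate_region(img, top=0, left=0, height=8, width=8, block=4)
# Every 4x4 tile is now a flat average (127); the checkerboard detail is gone.
```

A real tool would first locate the face (automatically or by a user's tap) and then apply a much stronger blur or pixelation to that region only, but the privacy principle is the same: once the averaging is done, the original detail cannot be recovered from the shared image.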
Interestingly, the company is also making available cloth face coverings — masks — that can be distributed to the community, as Signal puts it, “as one small offline way to help support everyone self-organizing for change in the streets.”