New techniques and technologies for facial recognition, or face-matching, can be used as a kind of password to unlock digital products and services, but they can also threaten personal privacy and be used to manipulate photos posted on social media networks.
In a kind of James Bond-esque naming protocol, government officials and law enforcement agents now refer to “the capability” when describing facial and biometric matching systems.
Such systems capture facial images with a video camera, analyze them and compare them with images in an existing database. Faces can be compared and recognized using various physical features, such as the distance between the eyes, cheekbones and eyebrows, as well as other characteristics. There are many such points of comparison and analysis; designations like B10 or A16 give a glimpse of just how many.
Researchers say their multi-point computational algorithms can identify and evaluate faces very accurately, and won't be thrown off by severe camera angles, eyeglasses or facial hair. Age, sex and cultural background are easily determined by analyzing such data points.
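The matching described above can be illustrated with a toy sketch. This is not any vendor's actual algorithm; it simply assumes each face has already been reduced to a small vector of normalized landmark measurements (eye spacing, cheekbone width and so on) and matches a probe face against a database by nearest Euclidean distance, with a threshold for "no match":

```python
import math

# Hypothetical database: each identity maps to a vector of normalized
# landmark measurements (e.g. eye spacing, cheekbone width, eyebrow
# height, jaw width). Real systems use many more points.
DATABASE = {
    "alice": (0.42, 0.31, 0.18, 0.27),
    "bob":   (0.39, 0.35, 0.22, 0.25),
}

def distance(a, b):
    """Euclidean distance between two measurement vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match(probe, database, threshold=0.05):
    """Return the closest identity, or None if nobody is near enough."""
    name, vector = min(database.items(), key=lambda kv: distance(probe, kv[1]))
    return name if distance(probe, vector) <= threshold else None

print(match((0.41, 0.32, 0.19, 0.26), DATABASE))  # → alice
print(match((0.90, 0.90, 0.90, 0.90), DATABASE))  # → None
```

Because the comparison is geometric rather than pixel-based, this kind of matching is what lets real systems shrug off glasses, hats and camera angles: the underlying measurements barely change.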
Beyond simple facial recognition, there’s facial analytics: an emerging technique that can be used to determine not only ages and genders, but also attitudes or emotional states. A patent filed by one development company touted the ability to link identified people with their social networking profiles and determine their relationship status, intelligence, education and income.
By the nature of the technology and in many of its deployments, this kind of recognition, identification and analysis goes on without the subject’s awareness, much less their approval or permission.
Facial recognition software can be easily deployed with new and existing video cameras and monitoring devices, like on a street lamppost, behind a restaurant bar or retail checkout counter, or even integrated into someone’s eyeglasses.
Facial recognition programs also work on existing images, and can tap into millions of photos previously posted to social media networks like Facebook or Instagram, or already stored by companies and government agencies that issue shopper cards or driving licences.
Advanced collection, analytic and storage systems bring tremendous facial recognition and matching capabilities, and may well be yet another example of technology far outpacing social convention if not independent regulation and protection.
Privacy violations, data breaches and surreptitious use of consumer technologies have happened to and been committed by both private and public sector players, and the potential for misuse of personally identifying data and imagery is just as great.
Of course, the facial recognition technologies themselves face potential misuse, and tech products that protect people or spoof systems are also being developed.
Rather than try to hide or obscure one's face (such systems can see through glasses or a big floppy hat), one new product lets users present a different face, and an alternative identity, to the camera: that of the product developer!
With their somewhat cheeky strategy, the folks at URME Surveillance are developing not just ways to combat ubiquitous surveillance, but also tools to help us all realize the impact that misuse of data collected through surveillance, recognition and identification may have on our identity. Anyone who has felt the sting of discrimination due to race, gender, sexual orientation, disability or other physical characteristic will understand.
What the industry understands is a marketplace that continues to grow despite the apprehensions, concerns and objections voiced by privacy officers, technology analysts and citizen lobby groups.
Noting the participation of companies both large (NEC, Fujitsu, Cognitec, among them) and small (FaceFirst, Aware, ZKTeco), a recent report on the facial recognition and biometric tech sector describes the business opportunity plainly, if not with a disconcertingly broad brush: “These players are focused on developing efficient technologies that will ensure better surveillance or monitoring of people.”
That’s some capability, people.