Some of the largest data-using companies in Canada have come together in a bid to bring greater personal privacy and data protection to the digital world, including popular social media platforms, mobile applications and other online activities.
Self-described as “data custodians,” several private- and public-sector organizations, including a number of health-care institutions, say that data anonymization is a way to conduct data-intensive activities in economically responsible and socially beneficial ways.
Those organizations have formed the Canadian Anonymization Network (dubbed CANON) in order to widely promote data anonymization as a privacy-respecting model for business and government activities.
Data anonymization is a technical and administrative process in which the links and connections between gathered data and the sources of that data are removed. The process implies that the data can no longer be traced back to an individual user because all the personal identifying information has been removed.
A further assumption behind the anonymization process is that people do not, or will not, object to having data about them collected so long as it cannot be used to trace or track them without their consent, or even their awareness.
But technology analysts and privacy practitioners are skeptical: data anonymization is not foolproof, they say, pointing to several situations where, whether by accident or malfeasance, anonymization efforts have been defeated. Some individuals, they say, can still be identified even after anonymization has been carried out.
It is a well-known principle of anonymization that all personally identifying information must be removed from data before it is disclosed or released. But even the Canadian government has reportedly acknowledged that data purposefully (and legally) released can unintentionally include personal information, such as income, education or occupation. The government has released a list of the number and type of its programs potentially holding personally identifying data.
One of the problems in protecting all that gathered data is not just the number of programs collecting it: it is the number of anonymization strategies and techniques that have evolved. Specific identifying data can be removed outright, or it may be replaced by placeholder values in order to conceal its source. Some anonymization techniques also involve concealing the routes data travels in transit: VPNs and multi-router solutions can fall into this category.
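To make the first two techniques concrete, here is a minimal sketch in Python of suppression (removing an identifying field outright) and pseudonymization (replacing it with a placeholder token). The record, field names and salt are invented for illustration; a real system would use a properly managed secret key and a formal de-identification process.

```python
import hashlib

# Invented sample record for illustration only.
record = {"name": "Jane Doe", "postal_code": "K1A 0B1", "diagnosis": "asthma"}

# Technique 1: suppression -- remove the identifying field outright.
suppressed = {k: v for k, v in record.items() if k != "name"}

# Technique 2: pseudonymization -- replace the identifier with a placeholder,
# here a salted hash, so the original value cannot be read back directly.
SALT = b"example-salt"  # in practice, a secret, properly managed key
pseudonym = hashlib.sha256(SALT + record["name"].encode()).hexdigest()[:12]
pseudonymized = dict(record, name=pseudonym)

print(suppressed)      # no 'name' field at all
print(pseudonymized)   # 'name' replaced by an opaque token
```

Note that pseudonymization, unlike outright removal, keeps a stable token per person, which is useful for linking a person's records together but, as the next paragraph suggests, is also what makes re-identification attacks possible.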
But unless personally identifying data is never collected in the first place, or is completely stripped out at the source before any processing, collating or aggregating takes place, it is entirely possible to reidentify data by combining two or more existing data sets and cross-referencing those (even apparently anonymous and disparate) bits into a much more complete and revealing picture.
Tech researchers at MIT have found that the growing practice of compiling massive amounts of data about people’s movement patterns in and around the so-called “smart city” (ostensibly gathered for urban planning and development research) can put people’s private data at risk — even if that data is anonymized.
Nevertheless, the organizations behind CANON (co-founders include AccessPrivacy, a legal team working at the Osler law practice in Toronto, and Privacy Analytics, an American multinational health information technology company) are pushing forward. Other partners and sponsors of the new Canadian Anonymization Network include Bell, Rogers, Symcor, TELUS, TD and TransUnion.
Their efforts come none too soon, and in concert with other government initiatives in the digital space: as part of its recently announced Digital Charter, the federal government will hold consultations about various concepts of de-identification, the risks of reidentification and the need for new standards to control data safely and effectively.
If and when those standards are established, we will need a private entity or a government organization to monitor compliance and certify that those standards are appropriate and carefully followed.
Organizations like ISO, the International Organization for Standardization, already do a lot of that kind of work: pretty much any industrial or commercial activity is covered, and ISO-issued certificates can show whether or not a company or process is meeting the safety standards established for it.
Be it for food or electrical equipment or personal identifying data, once standards for safe usage are set, the authority to lay down monetary and regulatory penalties for non-compliance must also be in place.
In an opinion piece about data collection and consent for same, Konrad von Finckenstein, the former head of the Canadian Radio-television and Telecommunications Commission, wrote that the “authority to levy administrative monetary penalties against data gatherers that misrepresent that they properly anonymize according to the prescribed standards” would be a needed and necessary component of any plan to protect personal data and information.
His call echoes that of the country’s Privacy Commissioner, Daniel Therrien, who wants more muscle for his office’s efforts to police privacy protection for Canadians.
Yes, standards are useful and regulation appropriate. But serious penalties are needed to bring greater personal privacy and data protection to the digital world.