It was a rather disconcerting omission: among the criteria used to determine successful applicants to Canada’s Smart Cities Challenge, there were no requirements for data protection or a privacy impact assessment.
With more than $50 million in prize money to award to communities that want to be smart, the judges seemed to forget about personal – and public – data privacy.
It was, in the words of Brian Beamish, Ontario’s Information and Privacy Commissioner, “a very noticeable gap.”
Noticeable gap or dangerous chasm?
The list of promises and potentials offered by a smart city is almost endless: from better traffic flow and more effective garbage disposal to offices that heat and cool themselves, fridges that order our food for us and so much more. The list of threats or concerns is somewhat shorter: whither data privacy in a surveillance city?
The smarts in a smart city more often than not come from technology: data-collecting sensors and meters and cameras and gauges. The collection and analysis of all that information can make our municipal systems much more efficient and effective, smart city proponents say.
As such, the collected public data has great value for cities and for private companies, like for-profit service providers that are very successful at processing other people’s data. And like any valuable commodity, the collected data is a target for criminal activity or market malfeasance. Not considering its protection was certainly a “gap”.
So Beamish worked with other provincial privacy commissioners and the federal privacy office to have the judging criteria amended. Finalists in the Infrastructure Canada Smart Cities Challenge must now conduct a Privacy Impact Assessment on their project, and they must consult with a local privacy commissioner to address privacy considerations raised by its implementation.
Beamish’s office is also involved with Sidewalk Labs Toronto, the project led by Sidewalk Labs, a subsidiary of Google’s parent company Alphabet, and the largest smart city project in North America.
Ontario’s Information and Privacy Commissioner raised the smart city issue on a day known rather optimistically as Data Privacy Day. His office hosted a Privacy Day Symposium, during which a panel examined the promises and perils of smart cities.
Contributing to the discussion were Lawrence Eta, Toronto’s Deputy CIO, Information & Technology; Teresa Scassa, the Canada Research Chair in Information Law and Policy at the University of Ottawa; and Oriana Sharp, the Manager of Information Management and Archives for the Region of Waterloo, one of the competitors in the Smart City Challenge.
In his introductory remarks, Beamish posed some of the questions he hoped would be answered, or addressed, by the panel: “How do we protect our personal privacy?”, “Who owns, controls and has access to our data?” and “How much power do we have over how our information is used?”
These are obviously important questions that need complete answers (and another reason why the lack of privacy considerations in the Challenge judging criteria was so, well, notable).
But notable too is the notion of personal privacy in the public realm: it’s one thing to protect personal privacy at home, where it should be expected, but it’s another thing altogether to be concerned about privacy in the public arena.
If the smart city conversation needed any further controversy, it came when Beamish noted that some believe data collected by the smart city is not personally identifiable, so collection and analysis raises no particular data protection issues.
But the power of big data analysis now means that group identification, as much as individual identity, is a data risk: groups of people or communities of interest can be identified and tracked just as an individual can, creating a new class of privacy concerns.
Data analysis of a group (such as the social goals, economic abilities or political objectives it might have, the information it has about itself or its members, the activities it holds or plans) can lead to privacy threats on a wider level than previously imagined: the use of data in behavioural modification has tremendous utility at the group as well as the individual level.
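The re-identification risk described above can be made concrete with a small sketch. The records, field names and threshold below are all hypothetical, but the pattern is the standard k-anonymity check: even after names and IDs are stripped, a handful of shared attributes (a neighbourhood, an age band) can narrow a sensitive data point down to a very small, trackable group.

```python
from collections import defaultdict

# Hypothetical de-identified records: no names or IDs, yet each row still
# carries quasi-identifiers (neighbourhood, age band) and a sensitive attribute.
records = [
    {"neighbourhood": "Ward 3", "age_band": "60-69", "clinic": "addiction"},
    {"neighbourhood": "Ward 3", "age_band": "60-69", "clinic": "addiction"},
    {"neighbourhood": "Ward 3", "age_band": "20-29", "clinic": "general"},
    {"neighbourhood": "Ward 7", "age_band": "20-29", "clinic": "general"},
    {"neighbourhood": "Ward 7", "age_band": "20-29", "clinic": "general"},
    {"neighbourhood": "Ward 7", "age_band": "20-29", "clinic": "general"},
]

def risky_groups(rows, k=3):
    """Return quasi-identifier groups with fewer than k members.

    Groups this small fail k-anonymity: anyone who already knows a
    person's neighbourhood and age band can infer the sensitive
    attribute attached to that group.
    """
    groups = defaultdict(list)
    for row in rows:
        groups[(row["neighbourhood"], row["age_band"])].append(row["clinic"])
    return {key: visits for key, visits in groups.items() if len(visits) < k}

# Both Ward 3 groups fall below the k=3 threshold, so the "addiction"
# visits there are attributable to a very small, identifiable group.
print(risky_groups(records))
```

This is only an illustration of the principle, not any city's actual pipeline; real smart city datasets involve far more attributes, which makes the small-group problem worse, not better.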
As Beamish rightly pointed out, there are existing privacy laws and framework legislation that define how personal information (under PIPEDA) and municipal data (under MFIPPA) must be protected.
Waterloo’s information management manager Oriana Sharp described the administrative challenges facing municipalities as they struggle to ensure that data is used appropriately, with all proper consents in place. In her region, as many as seven different municipalities have collected data over the years, and her team must now identify where that data came from and what consent for what use was obtained. Not surprisingly, Sharp said she expects to face even more questions about data and its safe use in the future.
Lawrence Eta spoke to the worries he has about the data-enabled tracking of vulnerable communities or groups. At the individual level certainly, and especially in the aggregate, data collection and analysis should not create additional issues for the originator of that data. There are data laws to be followed, certainly, but also human rights legislation, consumer protection laws and other rules that must be part of the equation.
How to ensure that the group is protected along with the individual? Eta also spoke to the need for data review boards, made up of people looking at the use of data through a technical, legal and ethical lens.
Scassa said that she’s also thinking about not only individual but group privacy: de-identified data may offer some protection to the individual, but when used in the aggregate to profile a segment of society, unwanted manipulation or illegal surveillance can still be an issue.
She also made mention of some of the privacy laws now in place in Canada, but she said they are out of date and need revision – a point made by other Canadian privacy officials in other contexts.
There was no question on the panel that privacy issues are being raised by the emergence of smart city systems, technologies and policies.
The idea of “function creep” was also raised as a risk in the smart city: data ostensibly gathered for one purpose could be migrated into a project with completely different goals and objectives (and no consent form can cover that evolution).
How and where the data that a smart city and its residents generate is stored is clearly an issue as well (particularly for a Canadian smart city with U.S.-based service providers).
Along with the identified challenges smart cities face, some solutions were shared at the Privacy Day Symposium, including an interesting recommendation for the safe storage and supervision of collected data.
Smart city policymakers in Toronto, it was suggested, should make the public library the developer and overseer of a public realm policy for data and digital issues.
In fact, the Toronto Region Board of Trade released a report called Bibliotech, which argues that the Toronto Public Library, with its experience and expertise in data policy and information management, would be the most appropriate public defender of digital privacy.
For its part, TPL noted in response that “We have long played a role in city building and welcome the opportunity to discuss how we can continue to evolve this role in the civic data realm.”
Libraries are public corporations, after all, and they already manage important publicly owned assets, run by appointed authorities who are accountable to government and ultimately to the voting public. Libraries are operated in the public interest and they are repositories of valuable intellectual property; as such, the librarian is likely among those who believe that understanding and following best online privacy and data security practices is a moral, if not legal, responsibility these days.