Some 20 years in the making, long-needed reforms to Canada’s privacy laws are finally in the works.
The federal government has introduced new privacy legislation that, if passed as tabled, would dramatically reboot the country’s rules about how private sector companies use the data they collect.
There are new privacy protections for consumers, new fines for violators, and some shortcomings still to be addressed.
The proposed Digital Charter Implementation Act, introduced by Minister of Innovation, Science and Economic Development Navdeep Bains, “would significantly increase protections to Canadians’ personal information by giving Canadians more control and greater transparency when companies handle their personal information.” The Act’s title references the Digital Charter, the broad strategy previously introduced by the federal government to address important developments affecting our digital life, work and play.
Under the new Act, Canadians could demand that their information on social media platforms, such as Facebook or Twitter, be permanently deleted, the government says. And the country’s privacy commissioner would have the ability to order, if necessary, a social media company to comply.
The current commissioner, Daniel Therrien, would also have the power to audit companies and organizations under the new Act, also known as Bill C-11.
(Strictly speaking, the Digital Charter Implementation Act, 2020 (DCIA) will enact the Consumer Privacy Protection Act (CPPA) and the Personal Information and Data Protection Tribunal Act (PIDPTA), and make amendments to other related acts. The CPPA will effectively replace the current federal legislative scheme governing the collection, use, and disclosure of personal information by private sector organizations under the Personal Information Protection and Electronic Documents Act (PIPEDA). Got that?!?)
Regardless, the Act provides for potentially significant fines against companies that fail to meet its privacy requirements: as much as five per cent of an organization’s annual gross global revenue, or $25 million, whichever figure is higher. In announcing the legislation, Minister Bains said the fines would be the highest among G7 countries. Even so, the penalties pale in comparison to certain big tech companies’ balance sheets, much less their quarterly earnings.
Overall, the privacy bill is a good step towards creating a legislative environment in which Canadians’ personally identifying information is better protected.
Yet among the privacy bill’s shortcomings is the fact that privacy is not recognized as a fundamental human right.
Although the privacy commissioner has long called for our privacy laws to be reconstituted as rights-based requirements, Bill C-11 seems to have ignored his call. This despite the fact that international organizations such as the United Nations recognize privacy as a fundamental right, as stated in the U.N. Secretary-General’s blueprint for human rights.
Data loss can be a real nightmare, there is no doubt. But privacy violations can lead to loss of freedom, democracy, equality, and even physical security.
As Therrien has said, “consumer privacy legislation that does not acknowledge the right to privacy is a glaring absence.” He spoke of “the elephant in the room” in terms of privacy threats and the uphill climb many individual consumers face in a legal dispute about data protection.
By moving data protection issues into a collective space, a rights-based privacy environment could mean that individuals are no longer alone in their fight against improper, illegal, or non-consensual data collection.
In lieu of such definitive protections, Canada’s new privacy legislation calls for the establishment of a Personal Information and Data Protection Tribunal, almost a rights arbiter, which will have jurisdiction over all appeals relating to various findings, orders, or decisions (and the imposition of certain penalties) under the new Act. Who will sit on the Tribunal has not yet been determined.
While the new Act may have some shortcomings, it is trying to make up for almost 20 years of neglect.
In that regard, perhaps to avoid another 20-year standstill, the legislation will require businesses to be upfront about their use of automated decision-making systems, like artificial intelligence platforms and machine-learned algorithms that may be used to make significant predictions or decisions about people.
A business will need to be transparent about the purposes of its data collection and how it may use or disclose such information. Companies will need to specify the type of information that will be collected, as well as the names of any third parties to which they may disclose that collected information.
With what we have seen so far about how data is used today, the imagination runs wild when contemplating how our data may be used in the future.
Twenty days in the future, much less 20 years!
# # #