For years, we’ve been warned about third parties.
They pose one of the first risks we face when we go online: risks straight out of today’s high-tech headlines.
Third-party website service providers offer value and utility, they say: improved website performance and more relevant advertising for the website visitor. For the website owner or publisher, third-party services often amount to a revenue stream, exchanging value for data.
Third-party data brokers are those companies whose main business model is the collection, trading, and analysis of personal information. They do not have a direct relationship with the consumer (and that sometimes is the point), but rather they gather data from multiple sources (some public, some not so) and make the results of their activity available for sale or rent.
It really doesn’t matter what website you are visiting: its owners, operators, and publishers regularly work with third-party data brokers.
There are some 4,000 data brokers worldwide, according to the digital marketing company WebFX, operating in an industry valued at more than US$200 billion a year.
These companies can assemble a comprehensive history of your online behaviour: what websites you visit, for how long and in what order, what purchases you made or content you viewed while there, what device you used, and where you were at the time. They can then combine that action history with more static demographic attributes such as education, income, sex, and age, merging multiple online sources to build an ever more detailed picture of each consumer. Data brokers become repositories of behavioural profiles.
As a result, concerns about online privacy and data security continue to grow over the third-party monetization of personal data.
Disconcertingly, the risks of those third-party data broker practices were recently laid bare when a breach exposed some 60 million customer records and associated information.
According to reports, data collected by a third-party vendor called GetHealth were left exposed or unsecured (the breach was repaired shortly after security researchers notified the company about it).
As a third-party data broker, GetHealth worked with the fitness tracking data from most of the major wearable device makers: Apple, Google, Fitbit, Microsoft, Strava, and more. So the breach potentially affected the personal data of some 60 million fitness device users.
That’s one of the risks associated with third-party data brokers: they have to safeguard huge amounts of valuable data, much of it personal information.
When combined with machine learning and artificial intelligence techniques, big data collections open the door to data mining, often described as a technique for knowledge discovery: it finds or uncovers knowledge that was not previously known. That’s the big promise (for business analysts and marketing specialists) and the big challenge (for privacy advocates and legislative bodies): it is hard for consumers to give prior consent when they may not even know the information or knowledge exists, much less how it might be used.
But another third-party issue has arisen lately: not only might we not know how our data is used, we might not even be able to stop its collection in the first place, even with precautions.
That’s one of the fall-out points from research and testing by a security team at Lockdown Privacy, an application developer.
Using the open-source Lockdown Privacy app and manual testing, they determined that even “Do Not Track” requests sent by users may not be honoured by the apps that receive them.
In the case of Apple, the security researchers at Lockdown found (and The Washington Post, working with them, reported) that “App Tracking Transparency made no difference in the total number of active third-party trackers, and had a minimal impact on the total number of third-party tracking connection attempts. We further confirmed that detailed personal or device data was being sent to trackers in almost all cases.”
While personal information is protected in Canada by legislation such as PIPEDA, the Personal Information Protection and Electronic Documents Act, the protection is weakened somewhat by legal definitions of personal information as opposed to aggregated, anonymized data – the kind third-party brokers say they deal with. Jurisdictional boundaries also impact the ability of Canadian laws to regulate foreign companies.
That’s why the Office of the Privacy Commissioner of Canada, in a report about data brokers, identifies several issues and serious concerns when attempting to balance commercial needs against the privacy rights of Canadians.
That report is now eight years old.
# # #