Should We Recognize Privacy as a Human Right?

  • December 14, 2020
  • Agnese Smith

Data-driven innovation is critical to economic growth. But at what cost? As a society, we are easily persuaded to trade away our privacy to use apps that track our every move. Opting out is nearly impossible. In an ideal world, privacy laws would fully protect consumers, who have limited bargaining power, while still allowing the economy to thrive.

With this tricky balance in mind, Canada, along with other jurisdictions, is updating its consumer privacy legislation, which governs how firms are allowed to collect, process and use data. Last month, the federal government introduced Bill C-11, which would enact a new Consumer Privacy Protection Act (CPPA). If passed, it will replace the current regime, the Personal Information Protection and Electronic Documents Act (PIPEDA), which most agree is no longer fit for purpose after 20 years of cobbled-together rules.

Will the new framework measure up? The answer is not yet clear. Privacy experts are still considering the potential consequences of the bill. In any event, much will be up for revision as it runs its course through the legislative process. 

Among the bill's notable changes, the Office of the Privacy Commissioner gains much more firepower, and penalties for transgressions will rise substantially, addressing gaps privacy advocates have complained about for years. The language of consent, the critical component of privacy legislation, will also become clearer.

But there remains one glaring omission: Bill C-11 does not explicitly recognize privacy as a human right, nor does it give precedence to privacy rights over commercial considerations. "I think they should just say it," says University of Ottawa law professor Teresa Scassa, adding that it would be best situated in a preamble to the legislation or a purpose statement that "talks about the human right to privacy that links to other human rights because that's really what's important. It's not just about the individual's rights to privacy but it's also an acknowledgement that abuse of personal data can affect society as a whole."

Emily Laidlaw, associate professor at the University of Calgary, called it "a missed opportunity for more profound law reform."

A human rights approach could serve as an effective check on technology's potential dangers while ensuring businesses can function and thrive. Federal Privacy Commissioner Daniel Therrien has long advocated for a rights-based regime that puts consumers at the centre. "Generally, it is possible to concurrently achieve both commercial objectives and privacy protection," he said in a statement after the bill was introduced. "However, where there is a conflict, we think that rights should prevail."

"Good lawmaking balances legitimate interests with rights all the time," said Assistant Professor Ignacio Cofone at McGill University, in an email. "It is certainly possible to legislate in a way that makes legitimate commercial interests compatible with fundamental rights such as privacy."

So why the reluctance to include human rights? One theory is that Canadian policymakers feel their hands are tied by constitutional constraints that assign provincial jurisdiction over civil rights and consumer protection. How hamstrung the government really is, though, is debatable.

Another reason for industry's top billing is that Canada, which punches above its weight in the global tech sector, needs to get this legislation right for the sake of prosperity. Digital economic activity reached $109.7 billion in 2017, or 5.5% of total economic activity, a larger share of the economy than mining, according to StatsCan. Artificial intelligence, where Canada is a world leader, is a particular shining light.

To be sure, keeping the trust of consumers is key to success, say industry observers, and strong privacy legislation is essential.

"Our government is aware that the digital economy is a huge driver to GDP and will be so for the future," said Gillian Stacey, a partner at Davies Ward Phillips & Vineberg. Since the privacy review started, policymakers "stated right from the beginning that it's a balance between promoting innovation and protecting privacy. The government believes you can have both a thriving tech industry and privacy protection."

But the potential for harm from this technology is significant. Current guidelines—and likely any new legislation—specify that targeted advertising must not use sensitive personal data, such as health information or details like race or age. Indeed, rules governing online behavioural targeting have been in place in Canada for nearly a decade. But so far, they have been difficult to enforce, and it's unclear how the technology might evolve to skirt them.

A lot of data gathering happens in the ordinary course of business. Firms rely on data to better communicate and serve customers, or to detect fraud and financial crime. But it's no secret that privacy invasion is baked into much of the current business model, which depends on collecting as many personal details as possible to train algorithms and pinpoint potential customers. It has propelled Big Tech firms into the billion-dollar stratosphere.

Many Canadian companies are now involved in some aspect of this data harvesting, from the devices themselves, to the game makers and porn sites that keep eyeballs glued to screens, to the data brokers that sell information. Indeed, Montreal-based MindGeek, owner of Pornhub, is one of the world's biggest adult entertainment companies.

As described to Parliament and many media outlets, today's technology has become so refined that firms can craft messages that zoom in on individuals' core psychological profiles and particular life circumstances, like becoming new parents. Systems are designed to predict what consumers may want next based on their behaviour online, from movies to funeral plots. Some worry that this process can easily tip into mood modification and manipulation, with serious potential consequences for democracy. Regulators are struggling to keep up.

Strengthening Canadian privacy laws is also necessary for trade. Aligning rules with other influential jurisdictions, like California's Consumer Privacy Act and Europe's strict "privacy first" General Data Protection Regulation, is crucial for the flow of international data. The latter requires privacy rules in both the commercial and government spheres to be equally robust in order to maintain "adequacy." Indeed, countries everywhere are rushing to revamp policies so as not to fall out of line with big markets.

GDPR, which is rooted in human rights legislation, imposes stricter requirements and obligations when it comes to collecting and processing personal data. Among its important goals is data minimization: organizations cannot store or reuse personal data beyond its original purpose, said Mariano delli Santi, Legal and Policy Officer at London-based Open Rights Group, in an email. Moreover, under GDPR, people's rights trump economic interests.

And GDPR enforcement doesn't exist only on paper, a shortcoming some privacy advocates have complained about in Canada. As of October 2020, EU regulators had dished out over 220 fines since the regulation came into force in 2018, with more expected. "GDPR is working, slowly but inexorably," added delli Santi. "The adtech industry is starting to realise that their practices are untenable."

Even the UK, which has stated unequivocally that it wants to enact more pro-business, pro-sharing legislation to build up its flourishing tech industry, has to keep its enthusiasm in check post-Brexit. "The UK has to have its own approach, but can it diverge substantially? It can't," says Herbert Swaniker, a tech lawyer at Clifford Chance in London. "It's crucial if the UK is to gain an adequacy decision from Europe."

So how does the proposed Canadian legislation stack up in terms of protecting citizens without the explicit human rights-based approach? It's too early to say definitively.

Scassa believes there are elements in the Canadian legislation that mirror GDPR in limiting the amount of personal information shared. "It's not GDPR, because it can't be, but it's got quite a lot of GDPR features," she said, pointing to Sections 12 and 13, which create a necessity and proportionality framework.

"Even if the bill doesn't take an explicit human rights approach, it still protects the human right of privacy through its different provisions," added McGill's Coffone. "It would just have an opportunity to do it better if this were made explicit."

But according to Scassa, there are troubling aspects to the changes, as they remove key consent requirements: consent is no longer needed when data is used for a 'business activity,' when it has been 'de-identified,' or when it serves a 'socially beneficial purpose.' These open-ended terms invite varying interpretations of the law. "It's infinitely worsening things," said John Lawford, executive director of the Public Interest Advocacy Centre, in an email.

Some computer scientists are also skeptical. "As known by anyone working in the field, de-identification does not fully secure the underlying information," said Mohamed Abdalla, a Ph.D. candidate in the Natural Language Processing Group at the University of Toronto. "While the risk of re-identification seems remote, when you consider the amount of data being processed by companies, the actual risk is not insignificant."
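
To see why, consider a minimal, hypothetical sketch in Python (all names and records below are invented for illustration): even after direct identifiers like names are stripped from a dataset, a handful of remaining quasi-identifiers, such as postal code, birth year and gender, can often be joined against a public dataset to put names back on supposedly anonymous records.

    # Hypothetical illustration of linkage re-identification. All data invented.
    # A "de-identified" dataset: names removed, quasi-identifiers retained.
    deidentified = [
        {"postal": "M5V", "birth_year": 1984, "gender": "F", "diagnosis": "asthma"},
        {"postal": "K1A", "birth_year": 1991, "gender": "M", "diagnosis": "diabetes"},
    ]

    # A public dataset (say, a voter roll) sharing the same quasi-identifiers.
    public_roll = [
        {"name": "A. Tremblay", "postal": "M5V", "birth_year": 1984, "gender": "F"},
        {"name": "B. Singh", "postal": "K1A", "birth_year": 1991, "gender": "M"},
    ]

    QUASI = ("postal", "birth_year", "gender")

    def reidentify(records, roll):
        # Join the two datasets on the quasi-identifiers; a unique match
        # puts a name back on a supposedly anonymous record.
        for rec in records:
            matches = [p for p in roll if all(p[k] == rec[k] for k in QUASI)]
            if len(matches) == 1:
                yield matches[0]["name"], rec["diagnosis"]

    for name, diagnosis in reidentify(deidentified, public_roll):
        print(name, "->", diagnosis)  # both "anonymous" diagnoses, now named

The more attributes a dataset retains, and the more outside datasets exist to join against, the more often that match is unique, which is the scale concern Abdalla describes.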

In terms of what could be termed 'socially beneficial,' "anything can be framed as such," Abdalla added. 

The vast majority of AI projects in particular, and technology in general, aim to increase profits at some stage of their existence. Who is to say a 'socially beneficial purpose' can't include the pursuit of profit, even if it's really bad for society?

Agnese Smith is a regular contributor to CBA National and is based in London, England. This article was originally published by CBA National.