Ontario seeking input on provincial privacy law, lawyers say Canada’s privacy regime needs update

Better data protection necessary to regulate AI and spur innovation: Éloïse Gratton

Denouncing the federal Liberal government’s proposed privacy legislation as insufficient, Ontario is planning to develop its own provincial privacy law.

Ontario would join Quebec, Alberta and British Columbia, which have also created their own privacy regimes. The Progressive Conservative government said it plans to fill the gaps left by the outdated Personal Information Protection and Electronic Documents Act (PIPEDA), which applies to federally regulated businesses.

Ontario’s proposals include introducing a “rights-based” privacy approach, fostering safe use of artificial intelligence (AI) and automated decision-making, enhancing consent and lawful uses of personal data, and ensuring stronger data transparency, said Lisa Thompson, Minister of Government and Consumer Services. The province has released a white paper detailing its plans and is seeking input from the private sector and the public.

Last November, the federal Liberals introduced Bill C-11, the Digital Charter Implementation Act. The bill was intended to modernize PIPEDA, but Queen’s Park said Ottawa fell short. In her announcement, Thompson called Bill C-11 “fundamentally flawed” and said it “stripped away key protections.”

Thompson said her government hopes Ottawa will consider its proposals and correct Bill C-11, but if that does not happen, a made-in-Ontario privacy law will be the only option.

Bill C-11 is currently stalled, says Kirsten Thompson, national lead of the Transformative Technologies and Data Strategy group at Dentons. The bill has only passed first reading and is unlikely to become law before an election is called.

“Now, I expect that regardless of which party prevails in any elections, some sort of federal privacy bill is going to be introduced,” she says.

Otherwise, Canada risks losing its adequacy status under Europe’s General Data Protection Regulation (GDPR), says Thompson. This would mean that for personal information to flow freely between Europe and Canada, each company would have to negotiate contracts, adopt binding corporate rules or use “other more cumbersome and expensive mechanisms,” she says.

“Both parties would be interested in getting the federal privacy legislation passed.”

PIPEDA has not kept up with AI over the last couple of decades, says Éloïse Gratton, national co-leader, privacy and data, at Borden Ladner Gervais LLP.

“It has become increasingly clear that this regime is not sufficiently adapted to address challenges related to the development and deployment of AI-based technologies, while also enabling organizations and individuals to benefit from the numerous opportunities that these technologies create,” Gratton says.

Because AI-based tools are shaped by processing vast amounts of data, data protection legislation is at the forefront of this concern, she says.

“For Canada, inefficient data protection laws and regulations could significantly hamper its efforts to retain its competitive advantage in the field of AI,” says Gratton. “Especially as other jurisdictions… look to adopt policies that seek to enable the ethical and responsible development and deployment of AI-based systems.”

PIPEDA also gives the Office of the Privacy Commissioner of Canada (OPC) neither order-making power nor the power to issue fines, and the legislation is missing components now common to the GDPR and other privacy laws, she says. While Bill C-11 includes new enforcement tools and some new rights, it has been criticized by the OPC, and it does not solve the problem that PIPEDA does not apply to the majority of Ontario employers, says Gratton.

Ontario’s proposals take advantage of areas where the province has jurisdiction and the federal government does not, says Thompson. One example of provincial turf is Ontario’s proposal to establish a “fundamental right to privacy.”

Despite the language, the idea is not carte blanche recognition of privacy as a fundamental human right but something more constrained, she says. Using the term “fundamental right” raises issues around the ordering and priority of laws and compliance obligations.

“So, if you're going to announce something as a right, you generally want to scope it fairly narrowly,” says Thompson. “And that's sort of telegraphed here, because the white paper says that recognizing such a right would then require a clear definition of personal information. And that, to me, suggests that there will be limits on this.”

Ontario is indicating its privacy legislation would be consent-based. Thompson says Canada is unusual among countries in relying on consent. While the GDPR sets out a number of lawful bases for processing personal information, with user consent as a last resort, Canada “up-ends that” by requiring consent first, she says.

“The problem with that is, while this may have been a great idea 20 years ago, practically, right now, consent is kind of illusory,” she says. “Everybody knows they just click on the box so they can get to the next screen, and nobody actually reads what's happening.”

“So, it's a bit unusual to see a brand-new piece of legislation that's still going to be consent-based.”

But Ontario’s white paper is on target when it comes to de-identified and anonymized personal information, an area where Bill C-11 got it wrong, says Thompson.

The federal legislation proposes that all data – including de-identified and anonymized – be classified as personal information and caught within the regulatory regime. Because big-data analytics, machine learning and AI are all mostly driven by anonymized and de-identified data sets, this “caused a lot of concern,” she says.

“Losing the ability to get out from under the regulatory burden by appropriately anonymizing information, it really impacts innovation.”

There is also a contrast between the two approaches on enforcement, says Thompson. Bill C-11 proposed a two-stage process in which the federal privacy commissioner recommends a penalty and a new tribunal adjudicates it, deciding whether to levy a fine. Ontario is proposing that its privacy commissioner have the power to make a penalty order, which could then be appealed to the Divisional Court.

There is alignment between the two governments when it comes to automated decision-making, says Thompson.

“One of the concerns about automated decision-making is that when you teach the machine how to make decisions, you incorporate all sorts of biases, either intentionally or unintentionally,” she says.

Thompson says Canadians encounter automated decision-making every day: in the ads we see or do not see, when we apply for credit cards, and in pre-screening for employment opportunities and university and college applications.

Both Ontario’s proposals and Bill C-11 would contain a transparency obligation to disclose when automated decision-making is being used.