The chairman of a federal justice committee examining online hate speech says legal changes may be among the committee's recommendations for combating it.
Anthony Housefather, chairman of the federal standing committee on justice and human rights and MP for Mount Royal, told Law Times that part of the recommendations the committee will make to the federal government may extend to potential legal changes.
The federal committee has been holding meetings as part of its study on online hate speech, with witnesses coming before the committee to share their views.
For example, Housefather says s. 318, 319, 320 and 430 of the Criminal Code deal with hate crimes.
“While nobody who has come before committee has necessarily recommended that those provisions be changed, a couple of witnesses have mentioned that because the [provincial] attorney general needs to consent to prosecutions [of 318, 319 and 320], it makes it less interesting or likely for police to follow through and recommend to prosecutors to proceed,” says Housefather.
“So, one of the things we might look at is whether or not it’s necessary for the attorney general to consent to the prosecution [or] should it be delegated to a different authority.”
Housefather says the committee might also look at recommendations related to s. 13 of the Canadian Human Rights Act.
The section was repealed in 2013 and dealt with hate messages.
Housefather says the committee will be examining if it “should recommend that a modified section be inserted” into the act.
“[On] the issue of s. 13, certainly the committee will be considering whether we recommend some type of revamped s. 13, or not,” he says.
“We’ve had witnesses on both sides of the issue with most saying that something similar to s. 13 should be reinserted, but we also are aware of the problems and the issues that could be caused by such a section, and we would have to propose something that was very carefully drafted and tailored to today’s times.”
Issues around online content have come to prominent attention in recent weeks, including an announcement by Prime Minister Justin Trudeau regarding a digital charter.
David Elmaleh, founder of RE-LAW LLP in Toronto, says the law often moves at an incredibly slow pace compared with the rapid evolution of technology.
“[G]overnmental efforts to police hate speech have been subject to intense, justifiable criticism. One of the criticisms of the repealed s. 13 of the Canadian Human Rights Act was that it was overly broad,” he says. “Legislatures must exercise extreme caution in any attempts to limit expression and may instead, or in addition, be wise to consider existing, available mechanisms to deter hate speech.”
Justin Safayeni, a partner at Stockwoods LLP in Toronto, says the easiest way to remove problematic content online is to try to convince the platform provider that the content violates its terms of use, so that legal action isn’t required.
“Most major platforms will do this for particularly extreme hateful, abusive or racist comments. But for content that doesn’t fit into that category or if the provider does not have such rules, then the only real current alternative is a remedy through the civil court system,” he says.
“This can be extremely challenging on several levels. First, the legal tools available, chiefly, defamation and privacy torts, can be an awkward fit for so-called hate speech. And even if the elements of one of these torts can be made out, obtaining and enforcing a judgment can be a long, expensive and ultimately uncertain road.”
Gil Zvulony, founder of Zvulony & Co., says the laws do need updating.
“Many of the online platforms of today take an active role in determining the content that the public is exposed to. If a user interacts with certain content, an algorithm is likely to feed that person more of the same. There is an echo chamber effect,” he says.
“Some of the platforms have recognized hate speech as a problem and have stated that they are working on solutions, but these solutions are not very transparent. Furthermore, what is considered hateful, racist and dangerous language in Canada is being decided by people outside of Canada. They might not recognize hate speech if they see it.”
Housefather says there are “multiple dimensions” to a response to online hate.
“One would be defining what online hate is, number two [is] tracking online hate, three would be educating people to recognize what is fake news or online hate and then fourth would be how do you best work with internet providers and community groups. What is government’s role in trying to prevent, as much as possible, online hate?” he says.
A spokeswoman for federal Justice Minister David Lametti said the department is “closely monitoring” the study and the minister will be reviewing its conclusions.
“We continue to work closely with partners and allies on other measures to restore trust in the digital space and ensure that Canadians feel safe both online and offline,” she says.
Richard Moon, a law professor at the University of Windsor, says he’s not sure there is an ideal set of legal rules that will effectively address the problem of online hate speech.
“I do think there is a place for law in the most extreme forms of this speech, but it’s a real challenge. And I don’t know if there is a way of getting the law right and somehow fixing the problem and adequately addressing it,” he says.