A federally funded panel is recommending the creation of a powerful new government regulator to oversee social media companies such as Facebook and Google, require them to maintain strong content-moderation practices and hold them to a new legal duty to act responsibly.

The report by the Public Policy Forum's (PPF) Canadian Commission on Democratic Expression, to be released on Wednesday, also calls for the creation of a federal “e-tribunal” to hear complaints about specific social media posts.

The federal Liberal government plans to introduce legislation early this year to regulate social media companies, with a focus on online hate and harassment. The report’s recommendations are aimed at influencing that legislation.

“It’s become pretty clear over the last few years that the major platform companies’ business models are causing democratic harms,” Jameel Jaffer, executive director of Columbia University’s Knight First Amendment Institute, said in an interview. Mr. Jaffer is one of seven commissioners who worked on the report. He grew up in Canada and his career has focused on civil liberties and freedom of speech in Canada and the United States. Mr. Jaffer said the companies’ algorithms, which automatically determine which social media posts get priority, can highlight “sensational and extreme” views.

“I think it’s also become evident that self-regulation isn’t sufficient here because the companies’ incentives aren’t aligned with the public’s,” he said, adding that a new regulatory framework that better aligns the companies’ incentives with the public interest is needed. “What that framework should look like is a really difficult question, because inevitably, it will require us to make difficult trade-offs between multiple important values.”

The challenge is underscored by the fact that Mr. Jaffer attached a statement to the report that said he could not fully endorse the panel’s call for a duty-to-act-responsibly law and the proposed e-tribunal process.

While the commissioners say the era of self-regulation by internet giants must end, the report cautions against the kind of “reactive takedown laws” that European Union nations such as Germany have adopted that require companies to remove objectionable content in as little as 24 hours or face heavy fines. The report suggests a new Canadian regulator have quick takedown power, however, for matters involving a credible and imminent threat to safety.

The report recommends the regulator focus on ensuring social media companies have strong and transparent policies for moderating content. It says companies should be required to disclose details such as how algorithms are used to detect problematic content, the number and location of human content moderators and their guidelines for Canada. Another proposed transparency measure is a requirement that bots – computer-generated social media accounts that can appear to be run by a human – be registered and labelled.

“Citizens should know when they are engaging with an agent, bot or other form of [artificial intelligence] impersonating a human,” the report states.

The report says that, to be effective, the regulator must have the power to impose penalties such as massive fines and possible jail time for executives.

The commission’s work was led by Public Policy Forum president and chief executive officer Edward Greenspon, a former editor-in-chief of The Globe and Mail.

The study also relied on a Citizens’ Assembly on Democratic Expression, a gathering of 42 randomly selected Canadians who reviewed the topic of social media regulation and issued recommendations.

The PPF said its work was funded in part by a $625,000 contribution from Canadian Heritage through its Digital Citizens Initiative.

The commissioners’ report says the focus should be on regulating how social media companies enforce their own content rules and how they deal with complaints about content that is already illegal, such as hate speech. It argues against banning additional types of speech through the Criminal Code.

“We have clearly emerged in the regulatory camp, but with a bias toward regulating the system rather than the content. Given the nature and rapid evolution of the medium, an attempt to tick off an exhaustive list of harms, deal with them individually and move on would be fanciful, partial and temporary,” the report states.

The proposed e-tribunal does open the door to government regulation of specific posts. The report said it could be modelled on the B.C. Civil Resolution Tribunal, an online body that resolves issues such as small claims and motor vehicle matters.

Mr. Jaffer said the panel did not define the precise role of the e-tribunal, and left many questions unanswered. In his statement, he said he is not convinced a tribunal is preferable to requiring large platforms, at their own expense, to have an efficient and transparent review and appeals process for specific posts.

He wrote that before he could endorse an e-tribunal, he would want to know more about its mandate, and what relationship it would have to the processes some platforms already use. Mr. Jaffer also cited lack of detail as the reason he could not support the call for a legislated duty to act responsibly.

BILL CURRY
The Globe and Mail, January 27, 2021