Canadian Commission on Democratic Expression calls for new federal body to regulate social media

In depth

The Public Policy Forum (PPF) is a well-known independent, non-profit Canadian think tank dedicated to public-private dialogue. In April 2020, the PPF established the Canadian Commission on Democratic Expression (the "Commission") to study and provide advice on how to reduce harmful speech on the Internet. The Commission has published a report recommending six practical steps to place responsibility for hateful and harmful content on the shoulders of technology platforms and those who operate them. The recommendations are summarized below.

Recommendation No. 1: A new legal duty for platforms to act responsibly

The Commission believes that platform companies must serve the greater public interest by taking responsibility for harmful content appearing on their services. The creation of a new legal duty to act responsibly would impose a positive obligation on platforms such as social media companies, large messaging groups, search engines and other internet operators involved in the distribution of user-generated and third-party content.

Recommendation No. 2: A new regulator to monitor and enforce the duty to act responsibly

To monitor and enforce this new duty, the Commission calls for the creation of a new regulatory authority (the "Regulator") that would move oversight of content moderation out of the exclusive control of private-sector companies. Regulatory decisions would be grounded in the rule of law and subject to judicial review. The Regulator would also be responsible for publishing and enforcing a code of conduct for regulated parties that underpins the duty to act responsibly.

Recommendation No. 3: A Social Media Council as an accessible forum to reduce harm and improve democratic expression on the Internet

The creation of an independent, multi-stakeholder Social Media Council would provide an institutional forum for platforms, civil society, citizens and other interested parties to engage in an inclusive dialogue on ongoing platform governance strategies and practices. Importantly, the Council would play an advisory role to the Regulator, providing broad-based input into the code of conduct and into the policy implications of changing technology, business models and user experience.

Recommendation No. 4: A world-leading transparency regime to provide the Regulator and the Social Media Council with the information they need

One of the key challenges facing researchers, journalists, policy communities, social media users and, soon, regulators is that the platform ecosystem is opaque. Embedding meaningful transparency mechanisms at the core of the mandates of the Regulator and the Social Media Council would improve access to information and create a more publicly accountable system.

Recommendation No. 5: An e-tribunal to facilitate and expedite dispute resolution, and a quick, accessible process for resolving complaints about harmful content before they become disputes

The Commission believes that the creation of a new e-tribunal for disputes over online content could counter the power asymmetry in the digital domain by shifting dispute resolution from private-sector processes within platform companies to a public body dedicated to due process and transparency. An e-tribunal would provide quick and accessible resolution of content-related disputes.

Recommendation No. 6: A mechanism for the rapid removal of content that poses an imminent threat to an individual

Given the immediacy of the Internet, the Commission recommends that the Regulator be empowered to issue orders for the removal of content within 24 hours where a "credible and imminent security threat" is identified. These orders could be challenged in court and would be an exception to the Commission's general approach, under which the Regulator addresses systemic issues rather than making substantive decisions on individual pieces of content.

Other considerations

The Commission considered imposing reactive takedown requirements on platforms, which would require companies to remove "offensive categories" of content in as little as 24 hours or face heavy fines. Despite the existence of such mechanisms in other jurisdictions, the Commission rejected this approach for fear of over-censorship.

Overall, the Commission believes that, to be effective, the Regulator must have the power to impose penalties such as substantial fines and, potentially, prison sentences for executives. The Commission contemplates that the positive requirements on platforms would be developed through legislation and regulation.

Further reading

For further details, please consult the Canadian Commission on Democratic Expression's final report and the report of the Citizens' Assembly on Democratic Expression.