Deliberate social media rules set a harmful precedent
As the federal government prepares legislation to regulate social media content, Canadians have cause for concern about the effectiveness of this approach and the bad example it will set for countries that do not share our commitment to human rights.
Canadian Heritage Minister Steven Guilbeault has hinted in recent weeks that Canada’s upcoming legislation will be modeled on Germany’s NetzDG law. That law allows social media companies to be fined up to 50 million euros for failing to remove content deemed “manifestly illegal” from their platforms within 24 hours of being notified.
The details of the government’s approach remain unknown, as no meaningful public consultations have taken place on the development of this legislation. But what we do know about the upcoming bill should concern all Canadians, for at least two reasons.
“Legal but awful”
The first is that this approach will not be effective in dealing with most of the malicious content found online today.
Social media companies are not perfect at removing content that violates Canadian law, such as child sexual exploitation material or terrorist propaganda, but they have improved significantly in recent years. Where they struggle is with “lawful but awful” content: material that is legal under the laws of most democracies, including Canada, but that is known to cause harm in the real world.
Consider the flood of pandemic-related misinformation on YouTube and Facebook, or the casually racist or misogynistic memes that populate many Instagram feeds.
The broad protection that the Canadian Charter of Rights and Freedoms provides for freedom of expression makes it difficult for governments to ban such content outright, or even to restrict the expression of such harmful and unpleasant ideas in public spaces. Accordingly, a new law penalizing tech companies for failing to promptly remove illegal material will only scratch the surface of the problem of harmful online content.
More worrying, however, is the example that the forthcoming legislation will set for countries that do not share our respect for human rights.
Authoritarian governments around the world are enacting social media laws similar to the one being contemplated here in Canada. These laws impose draconian penalties on social media companies that fail to remove content that is illegal under national law.
The problem is that the laws of many authoritarian countries criminalize expression protected by international human rights law, from dissent against the ruling regime to the cultural and religious expression of minority communities.
Pakistan is a good example of this trend. Last year the country passed a law strikingly similar to what Ottawa is considering, but in the context of a legal system where blasphemy can be punished with death and where it is a crime to “violate religious, cultural or ethnic sensitivities.”
In Poland, the increasingly authoritarian government of Andrzej Duda introduced similar legislation in parliament last month, while Viktor Orbán’s government in Hungary is reportedly considering a similar move.
Internet at risk
Canadians should be concerned about such laws being passed in far-flung places not just because we value human rights, but because this type of legislation threatens the future of a global internet.
As governments try to regulate the online sphere according to their own national laws – regardless of whether those laws comply with international human rights standards – there is a risk that the internet will fragment into a series of national networks. Such a fragmentation would have profound effects on all of us.
Against this bleak international backdrop, Canada needs to think carefully about its approach to regulating online expression. Rather than simply enforcing national laws against social media companies, Canada should work with other rights-respecting democracies to develop a multilateral approach to tackling harmful online content.
This is exactly what was done to address terrorist and violent extremist content online after the Christchurch massacre in 2019, when a coalition of governments led by New Zealand and France worked with industry and civil society stakeholders to develop the Christchurch Call to Action.
A multilateral approach grounded in the common language of human rights can help keep the internet free and open while mitigating its worst excesses. It would also deny authoritarians around the world the argument that what is good for Canada is good for them too.