Taming the Hydra: Regulatory options that would truly curb online falsehoods

French President Emmanuel Macron, speaking to the United States Congress in 2018, warned of the scourge of fake news, which undermines democracy by eroding real choices and rational decisions. Earlier this year, in a virtual discussion organized by the Atlantic Council, he expressed his shock that the very networks that had “sometimes helped” Donald Trump reach the masses so efficiently turned around and suspended him the second they were sure it was over.
Macron isn’t the only one speaking out against the social media giants. German Chancellor Angela Merkel and European Commission President Ursula von der Leyen have made similar statements. Political leaders around the world are seeking to curtail the power of “private actors” who act as judge, jury and executioner on matters such as speech, misinformation, antitrust and data.
It’s the Wild West out there in the tech world, and the cowboys insist they don’t need a sheriff!

NEW SHERIFFS IN TOWN
The Covid-19 pandemic, however, may have accelerated regulation. The pandemic has been accompanied by an infodemic of misinformation, and research suggests that during times of stress and unpredictability people are more susceptible to untruths. In response, governments in many countries have put in place a range of legislative, administrative and judicial measures aimed at tackling falsehoods online. Some recent examples are:
France passed two anti-fake-news laws in 2018, after Russia allegedly interfered in its presidential election. Germany’s NetzDG law, passed in 2017, has been described as the most ambitious attempt by a Western democracy to police online content; its implementation ran into difficulties, however, as too much content was blocked. Across Europe, the Digital Services Act includes proposals such as external audits of how companies stop the spread of misinformation and revamped national regulators to monitor potentially bad behavior, with multi-million-dollar fines tied to annual turnover also a feature. Russia’s fake-news laws require platforms to post corrections and remove content classified as false, and Singapore has a similar provision in its online-falsehoods legislation. Taiwan’s laws are narrower, targeting disinformation spread as part of interference by foreign powers; enforcing them is difficult, however, because such content can rarely be attributed to foreign actors with certainty.
Tech companies, for their part, are caught in the platform-versus-publisher debate and have only a limited ability to police user-generated content, especially live or instantaneous content. Their challenge grows even greater when facing well-funded, synchronized, pre-planned misinformation campaigns run by a multitude of anonymous players. For every fake account and bot they appear to shut down, a dozen more spring up in its place.
Twitter, Google, YouTube, Facebook and Apple have repeatedly pleaded helplessness in formal Congressional hearings while insisting that they be left to self-regulate.
So here is the million-dollar question: if self-regulation isn’t working and government regulation isn’t the answer, what then?

THE HYDRA IS BORN
Greek mythology speaks of the Hydra, a serpent-like monster with many heads: cut one off and two more would grow in its place. The Hydra finally met its match in Hercules, who, together with Iolaus, cut off one head after another, cauterized each stump so no new heads could grow, and buried the last head under a rock.
Big Tech wants to regulate itself: partly by metaphorically cutting off the Hydra’s heads as they appear, and partly through educational and advocacy campaigns aimed at users. Western governments at one point preferred such non-coercive measures to top-down regulation. Faced with serious allegations such as election interference and cyberattacks, however, many are rethinking this approach, concluding that technology companies cannot be left to regulate themselves. Yet the effectiveness of blocking measures in containing willful online falsehoods is increasingly the subject of scientific study. Research shows that deliberate online falsehoods are better addressed by providing large amounts of accurate information than by controlling information or blocking communication altogether. Couple this with freedom-of-expression (FOE) activism, and coercive regulation could actually prove counterproductive.
Regulators, however, are not taking any chances. Many impose economic and technical penalties, such as banning companies from operating in a jurisdiction, throttling access speeds to offending platforms, and levying fines to enforce compliance with judgments and guidelines.
In exercising sovereign power, regulators increasingly find themselves in conflict with dominant technology actors, a relationship often shaped by reactive rather than proactive thinking in the corridors of power. Proactive thinking and action require vigilance in updating policies and laws. Regulators may also need to “move fast and break things,” especially where the regulatory framework for the internet dates back to the 1990s. This should be balanced through strategic use of their toolkit and a multi-pronged approach combining coercive and non-coercive measures.

TAME THE HYDRA
Very few states have considered institutional and proactive solutions: for example, setting up an official fact-checking agency with constitutional authority, an expert-led regulatory oversight body, public hearings on matters of public interest, advisory mechanisms that allow Big Tech to bring its perspective to the table, or public-private partnership (PPP) models to prevent foreign interference in domestic matters and elections.
In short, curbing fake news and untruths is a complex and evolving challenge, and framing it as a “sovereign vs. platform” problem diminishes it. In reality, it is the internet platforms, if anyone, that can act at scale and across borders. The UN-sponsored Joint Declaration on Freedom of Expression recognizes that the regulation of fake news should be a sovereign-led, multi-stakeholder effort involving tech companies, journalists, media companies and a variety of non-social-media tech intermediaries.
The challenge is to find a common ground between the different interest groups, which mostly seem to have opposing goals. Results-oriented legislation and enforcement, based on a clear understanding of the problem and possible solutions, can provide the framework for multiple stakeholders to work together to curb the travesty of online falsehoods.
Taming the Hydra is partly about strategy, but mostly about finding your inner strength and your allies. After all, combat is always also a matter of tactics.

Based in Singapore, Anuraag Saxena is a board advisor and public affairs expert.
Ankur Gupta is a member of the Singapore Chapter of the Internet Society and teaches media and technology law.
