The Guardian view on regulating social media: necessary but hazardous | Editorial

At the end of another difficult week, Mark Zuckerberg fled into the techno-utopian environment of his new growth vehicle – the “Metaverse”. Surrounded by avatars of happy coworkers, 3D street art and colorful flying fish, Facebook’s CEO played tour guide in a short promotional video, released on Thursday, showcasing the company’s plans for future virtual reality experiences. Released alongside the announcement that Facebook was changing its company name to Meta, the sugar-sweet video and the ominous rebranding were promptly panned across all platforms.

The hostile reception should come as no surprise. In the real world, Facebook has become an emblem of the negative, polarizing effects of social media on politics and society. Following the release of the leaked Facebook papers – showing how the company prioritized profit over mitigating the social harms it knew some of its tools were causing – its reputation is shattered. As the parliamentary testimony of former employee turned whistleblower Frances Haugen made clear, Mr. Zuckerberg and his small circle of trusted advisers ignored the ethical red flags waved by internal “integrity teams”. There was culpable reluctance to act on evidence that key engagement mechanisms amplified extreme content and disinformation, fueling discord around the world. After hearing from Ms. Haugen earlier this week, MPs interrogated Facebook’s global head of safety, Antigone Davis, and highlighted research suggesting the company’s Instagram app is damaging the mental health of one in three teenage girls. Representatives from Twitter, Google and TikTok were questioned at the same session.

Change is almost certain to come – above all, the end of the era of big tech self-regulation, in which private platforms like Facebook and Twitter failed to put their houses in order. The desire to detoxify social media is legitimate and understandable. However, developing a coherent system of external regulation is fraught with difficulties and dilemmas. The government’s draft online safety bill – still in the early stages of its journey through parliament – would introduce the most far-reaching web regulation of any liberal democracy. As it stands, it would also create significant risks of its own.

The bill designates an expanded Ofcom as the regulator for large social networks, with the power to impose fines of up to 10% of global turnover on companies that fail to comply with its codes of conduct. Services judged to cause significant harm to citizens could be blocked in the UK. The culture secretary of the day would have the power to set and change the strategic priorities enforced by Ofcom. This is a tremendous amount of power and discretion to hand to a minister and a watchdog led by unelected officials. The vagueness of the bill’s definition of “legal but harmful” online content makes the problem worse, creating what one expert has called a “muddy in-between” field of interpretation. By what criteria does the unpleasant become the intolerable? In an age of polarization, the scope for aggressively pursuing contentious agendas at the expense of freedom of expression is evident.

Following the killing of Sir David Amess, Sir Keir Starmer called on the government to fast-track legislation to clean up the “septic tank” of online extremism. A regulatory system that grants the current culture secretary, Nadine Dorries, and a future chairman of Ofcom (Paul Dacre?) far-reaching and loosely defined powers is not the right solution. Self-regulation by the social media giants is not working. But what replaces it needs to be thought through more carefully, and its categories clearly defined. Facebook’s failings do not justify a new era of censorship from above.
