Opinion: We need social media governance that stops violence and hate


We know social media content can lead to violence, but is there a plan to stop it? Facebook has tried to improve content moderation after its platform was used to spread online incitement to violence in Myanmar. But the recent violence in France and Ethiopia, which has drawn the attention of the United States Congress, shows that much remains to be done.

Meanwhile, YouTube, WhatsApp, Twitter and TikTok are facing increasingly dangerous posts on their platforms. We need social media governance solutions, and we need them now. And independent oversight is the only way to make real change.

In many parts of the global south, Facebook relies heavily on content flags from third parties, such as civil society organizations and other local groups. Take Myanmar as an example. There, Facebook uses third-party CSOs to identify dangerous content, which in some countries is a more effective method than its own detection mechanisms: CSOs are staffed with locals who speak the local languages, have a deep understanding of the country's social and political context, and are better placed to pick up on linguistic and cultural nuances in online posts and comments.

Facebook has made progress in consulting civil society: group consultations are being held with several civil society organizations around high-profile events in high-profile countries, such as Myanmar's 2020 elections.

However, Facebook is reportedly not always transparent with these CSOs. The company might cite legal concerns about sharing user information, but the structural reality is that Facebook, as a for-profit company, has no real transparency obligations to these organizations.


Either way, the CSOs that are deeply involved in content moderation may not be able to see what is happening inside these systems. Once they report content to Facebook, it can take time for the company to respond. In some cases, CSOs may never learn what action was taken on flagged posts, how such decisions were made, or where that data is stored.

This lack of transparency is not confined to the global south: internal policy processes at Facebook are not always clear either. A 2020 review of Facebook's internal processes found ad hoc decision-making and an environment in which employees describe themselves as "making rules."

In short, no one really knows how Facebook – or any other social media company – makes content decisions, and given the damage it can do, that has to change.

Unfortunately, these companies cannot repair themselves. They have no reason to be transparent and struggle to hold themselves accountable. The most sensible attempt at real change so far is the Facebook Oversight Board, which has just started taking cases.

However, its powers are too limited: to the best of our knowledge, its only real authority is to make one-off decisions on content that comes up through the appeals process. The board cannot review Facebook's data practices, its procedures for setting and enforcing rules at scale, its community standards, or its content detection algorithms. As others have noted, it is not set up to address the relevant issues.

If self-regulation is off the table, what about government oversight? This is not a good idea either, as there is a huge conflict of interest: several government actions this year alone have put pressure on these private companies. For example, the Thai government ordered Facebook to block a group deemed critical of the country's monarchy, and a US House committee pressured Twitter to disclose information about a data breach relating to alleged Saudi spies.

Although the merits of these cases vary, they illustrate the same point: government interests, be it a politician's reputation or national security, collide with good governance. We cannot and should not expect social media companies to be completely transparent to states.

We propose a form of oversight that is independent, cooperative, and accountable. Social media businesses should be overseen by a body made up of civil society, multilateral organizations, and researchers, rather than one filled on the recommendation of the companies themselves.

Its powers should be broad, including the ability to scrutinize language standards, algorithms, human reviewers, privacy practices, and internal policy processes through platform audits.

This ideal oversight body should have a range of specialist knowledge, from international law to software expertise to the local socio-political context in different countries. It should be able to tap into global networks of civil society and grassroots organizations. It should be centered on a human rights approach, free from competing government interests. And of course it cannot be a profit-maximizing initiative: to hold social media accountable, its first responsibility must be good governance.

It would be crucial that this body enables coordination. When multiple platforms face threats in the same regions, they could communicate about risk mitigation, communication that is badly needed in countries like Myanmar.

It would also harmonize communication among civil society groups within and across countries. When it comes to content decisions, there is not always a single right answer as to whether a post is a violation.

We are not suggesting that Facebook and Twitter must always make the same decisions when faced with identical content. But under coordinated independent review, the platforms would be guided by similar frameworks and assessed using similar metrics, all drawing on one source of local knowledge: civil society.

Just because governments would not monitor platforms directly does not mean they should stay out entirely. An independent body could still set standards that inform legislation.

If the body works well, regulators could require social media companies to submit to its scrutiny, adding another level of coordination to social media oversight. We have not taken enough steps toward transparency and accountability, and the stakes are too high to wait.
