Social Platforms Are Facing More Regulation in More Areas – But Is It Progress?

Will 2022 be a milestone for the regulation of social media platforms?

It is still difficult to determine how the various proposed approaches to social media legislation will actually work and what implications they will have, but as the UK outlines its latest attempt to make social platforms take more responsibility for the content they host, the broader movement for significant regulatory change is picking up momentum, and it is likely to become a key discussion point over the next year.

The UK’s Online Safety Bill, announced today, outlines new protections for young users and stricter rules on scam ads and fraud to better protect online consumers.

As reported by the BBC:

“The report also recommends that a range of new offenses, based on proposals by the Law Commission, be included in the bill, including promoting or ‘stirring up’ violence against women, or based on gender or disability, and knowingly spreading seriously harmful misinformation.”

Essentially, the bill aims to introduce tougher penalties for social platforms to ensure they are held more accountable for enforcement, in order to address growing concerns about the impacts of digital communication and connectivity. However, questions remain as to how exactly such regulations can be effectively enforced, with much depending on what is considered a “reasonable” response time in handling such complaints.

Various regulatory groups have tried to enforce similar rules and penalties by imposing clearer parameters for what social platforms are expected to do in response to official complaints. However, Meta has generally been able to argue that it cannot reasonably be expected to remove content within, say, 24 hours unless it has been advised of it. When an official complaint is filed, such a response window can be enforced, but often the harm is caused by content that does not initially raise concern, which makes such rules difficult to enforce effectively.

For its part, Meta has repeatedly pointed to its ongoing improvements in its regular Community Standards Enforcement Reports, but gaps remain between community and government expectations and what is realistically actionable. All users can post whatever they want, in real time, and while automated detection systems are improving, they cannot catch everything before someone sees it.

The arguments then come down to what is reasonable, what is possible in enforcement and action, and, in turn, the remaining gap between regulators’ expectations and what social platforms can realistically offer given their real-time nature.

Is it possible to ever bridge such gaps – and, most importantly, would harsher penalties actually improve the situation?

It’s hard to say, but there are other elements for which Meta can be held accountable, and it will face even more pressure over the next year as governments look for more ways to take control where they can.

A key element on this front is the sharing of user data and its accessibility to law enforcement agencies. At the moment, Meta is in the process of implementing end-to-end encryption as the standard across all of its messaging apps (Messenger, WhatsApp and Instagram Direct), which various authorities argue will provide cover for criminal activity by blocking potential avenues of detection and surveillance.

Meta says it is working to meet rising privacy expectations, but various governments are now scrambling to introduce new measures to block Meta’s encryption plans, or to establish new methods for extracting user data from social platforms.

For example, the Australian government recently announced new laws that would essentially force social media companies to divulge the identities of anonymous troll accounts, opening a path for legal action against those users.

According to The Guardian:

“Under the proposed law, social media companies would be required to collect personal information from current and new users, and give courts access to users’ identities in order to initiate defamation suits.”

This in itself is flawed, as social platforms do not currently verify user identities or attach real contact information to accounts as such. If the law takes effect, it would essentially force the platforms to validate real-world information for millions of users, a huge undertaking in itself, even before you consider the implications for free speech and enforcement.

Australia’s High Court has also upheld an interpretation of the law that places more onus on media companies in relation to defamatory comments posted on their Facebook pages. Some have suggested this means media companies can be held legally liable for any comments on their social media profiles, but the actual details of the case are much more nuanced, as a direct link between the prompting post and the resulting comments is required for liability to apply.

This is where all of these legislative and regulatory approaches really blur – in the interpretation of actual cause and effect, and how that works in a legal sense when applied to online speech. Social platforms have changed the paradigms of communication by giving everyone a platform to be heard, and the immediacy of the format makes preemptive enforcement essentially impossible, as there is no moderation layer between the user and what gets published.

And with billions of users, it is simply not possible for any platform to moderate every comment at scale, which leaves much room for debate.

So while it may seem like the regulatory walls are closing in around social platforms, the reality is that there are many gray areas within every approach. And while governments are keen to tout their “solutions,” especially in the run-up to their respective elections, given the broader focus on misinformation and abuse on social media, it still feels like we are a long way from real, solid progress.

Different approaches will yield some results, but a more uniform, international regulatory approach to digital speech and its enforcement needs to be established to set clear parameters and expectations across all regions – ideally including parameters around algorithmic amplification and the role it plays in reinforcing certain content.

The gap between posturing for political gain and actual, effective action continues to hold back real progress on these key issues.
