Study: Social media can learn from gaming how to regulate speech

LAWRENCE – As social media giants like Facebook and Twitter come under increasing criticism for how they handle the speech allowed on their platforms, another type of online community went through similar struggles more than a decade ago. Two researchers from the University of Kansas have written a new article suggesting that social platforms model their approaches to speech regulation on those the gaming community developed.

Harrison Rosenthal, doctoral student in journalism and mass communications, and Genelle Belmas, associate professor of journalism and mass communications, are co-authors of an article that traces the parallel evolution of social media and gaming and recommends that the former pursue moderation approaches similar to the latter’s. It was published in Jurimetrics, the American Bar Association’s journal of law, science and technology.

The authors point out that social media evolved from games as places for people to communicate, and while such sites have no explicit gameplay, they are effectively games in their own right, with users seeking likes, retweets and other forms of engagement. The gaming world eventually developed a community-based approach in which users set the standards and police what is acceptable, but social media still struggles with top-down approaches in which executives decide what is allowed.

“Over time, the gaming world evolved from people concerned mainly with the rules and outcomes of the game to more online settings and interaction among people. Our argument is that your representation on social media, whether you like it or not, is an avatar,” said Rosenthal, a lawyer who earned his juris doctor from KU. “Speech is regulated in many contexts, but the way in which it is regulated is widely misunderstood. People come to social media with a fundamental misunderstanding of their rights.”

Belmas, an avid gamer, shares one such example of successful community self-regulation, in which a friend served as a “guardian” in an online game. A trusted player and community member, the guardian was not an official of the game company but was empowered to intervene when other players were abusive.

“He was empowered to remove people from the game and talk to them about how they played and how they treated other players,” Belmas said of the guardian. “He was empowered to make regulatory decisions, and this kind of system, in which guardians, guilds or other users regulate from the bottom up, works well. Social media could benefit from the same approach.”

Rosenthal and Belmas point out that some corners of the internet have already adopted the approach successfully. Wikipedia and Reddit are two examples: trusted users who have earned “certification” through the quality and quantity of their posts, edits and corrections are given authority to regulate what is allowed on the platform. The authors argue this approach would work better, for several reasons, than having chief executives such as Facebook’s Mark Zuckerberg or Twitter’s Jack Dorsey impose their own guidelines.

First, no individual could foresee all the controversies that might arise on a given platform. The authors cite two examples from Facebook in which the policy of disallowing nudity backfired. The famous “Napalm Girl” photo from the Vietnam War, in which a naked young girl flees a napalm attack, and the “brelfie” movement, in which mothers shared photos of themselves breastfeeding, were both initially ruled inadmissible on Facebook. After criticism, both decisions were eventually reversed. In gaming, regulation is easy when the rules are fixed, as in the board game Monopoly or in basketball. But if there were suddenly 10 baskets or 200 properties, new officiating problems would arise. That, the researchers said, is the situation social media is in.

Similarly, the authors argue that a bottom-up approach would work better because of economies of scale and cultural differences. Social media companies employ thousands of people to review potentially problematic posts and decide whether they are acceptable. While many of these reviewers are based outside the United States, social media executives and lawyers are largely based in Silicon Valley, so misunderstandings about what is acceptable in one culture but not another are inevitable. Users are better able to judge what is acceptable and what is hateful, discriminatory or otherwise problematic within their own culture, Rosenthal and Belmas said. In addition, users, unlike the companies, have no financial incentive coloring their decisions.

“Social media companies will always cave if it serves their profits,” Belmas said. “The question is to what extent speech gives way to money, and the answer is ‘always,’ unless you use the model in which users have the power.”

The authors also point out how speech is naturally regulated in various professions. In law and medicine, to name two examples, professionals can lose their licenses or face disciplinary action for speech that harms those they serve. In the same way, different social media communities could determine what is acceptable for their own members, whether a community of professionals, gamers, hobbyists, people with particular political views, or other groups of people with shared interests or connections.

Criticism of social media’s current approach is nearly ubiquitous, and lawmakers from across the political spectrum have called for change. Rosenthal and Belmas said social media would be well served by empowering trusted users and communities to regulate the speech they will tolerate, rather than letting the government dictate online speech guidelines. Online gaming went through similar struggles in the past and developed an effective way of dealing with problematic speech.

“Like it or not, social media companies are becoming more powerful, and the political will is that something must be done,” Belmas said. “One of the best approaches we can see is a bottom-up, user-generated approach. In such a model, social media companies don’t give up power. They redistribute it.”

“It’s in the companies’ economic interest, in part because it can help prevent incidents like ‘Napalm Girl’ or the ‘brelfies’ from blowing up,” Rosenthal said. “It would work better if the people were the buffer.”

Research method

Systematic review

Research subject

People

Article title

Cyber Recapitulation? What Online Gaming Can Teach Social Media About Content Moderation

Article publication date

August 1, 2021

