Likes and protein spikes – Facebook tries to pre-empt regulation by squeezing anti-vaxxers

The social-media platform does and does not want to be the arbiter of truth at the same time

February 13, 2021

“I just have a firm belief that Facebook shouldn’t be the arbiter of truth of everything that people say online,” said Mark Zuckerberg, the social-media firm’s boss, last year. Despite Mr Zuckerberg’s hopes, Facebook has become just that. Like a utility company, Facebook can illuminate certain voices and take others off the grid. The best-known example of the firm wielding its power came in January, when Facebook suspended Mr Trump over the riot at the Capitol. (Twitter suspended him too.) Facebook’s decision is currently under review by the company’s internal jury, the Oversight Board, which advises it on sensitive content questions, of which there are ever more.

On February 8th Facebook announced that it was taking another stand on what cannot appear on its platform: falsehoods about vaccinations. The company will now remove posts and block groups that claim vaccines make people sick or cause autism. Previously it had merely downgraded such claims, making them less prominent in users’ feeds and search results. Facebook and other internet companies have been under pressure from politicians and the press to do more to police anti-vax content since 2019, when measles outbreaks in New York drew attention to such nonsense. Covid-19 has given the issue new urgency and attention.

In America, social-media platforms are tools not just for spreading misinformation but also for co-ordination. The anti-vax activists who briefly halted vaccinations at Dodger Stadium in Los Angeles used a Facebook page to organise, says Allison Winnike, head of the Immunization Partnership, a non-profit that raises awareness about vaccination. Activists also use social media to push anti-vax bills in many American states, says Joe Smyser, who runs Public Good Projects, a non-profit focused on public health. One bill circulating in Kentucky, for example, seeks to eliminate all vaccine requirements for employees. Another aim is to create a pre-emptive opt-out in case covid-19 vaccinations are ever required.

How much Facebook can actually curb misinformation about vaccines is an open question. Removing posts helps surface truthful content in users’ feeds and searches. But removing problematic content can also push users to other platforms. After Facebook and Twitter cracked down on accounts promoting QAnon conspiracies, those users simply went elsewhere, says Renée DiResta of the Stanford Internet Observatory. “Social-media companies run the risk of turning what could easily be addressed with a label into forbidden knowledge,” she says. Nor is everyone happy that staff, contractors, the Oversight Board and some 80 outside fact-checking organisations working with Facebook decide what should be flagged or deleted. “It’s really problematic to censor speech and pretend you can make it disappear,” says Matt Perault, a former director of public policy at Facebook who now heads Duke University’s science and technology policy centre.

The decision to fight anti-vax propaganda may have as much to do with Facebook’s own PR problems as with any desire to cleanse the platform of life-threatening fabrications. In Washington, Democratic leaders are pushing platforms to do more to police content. In January three Democratic senators, including Amy Klobuchar, sent a letter to internet companies calling for action against vaccine misinformation.

Facebook may be reading today’s political sentiment correctly, with Democrats controlling the House, the Senate and the White House. But its actions could further alienate conservatives worried about censorship of free speech, and reinvigorate the debate over antitrust enforcement among the many politicians concerned about the dominance of big tech.

Politicians and regulators are already grappling with whether and how to change the liability protection that internet companies enjoy for hosting user content under Section 230 of the Communications Decency Act, an internet law passed in 1996. Facebook may hope to steer the discussion towards tweaking Section 230 rather than eliminating it altogether.

That is optimistic. Support for repealing Section 230 has been a rare point of agreement between Donald Trump and Joe Biden, though they favour it for very different reasons. Short of full repeal, however, several proposals to reform it are circulating in the Senate. Facebook and other internet companies fear that tweaking Section 230 could unleash a flurry of lawsuits from people holding them responsible for material posted on their sites. Having once declared that Facebook should not be an “arbiter of truth”, Mr Zuckerberg is now acting more like one, and hoping it works in his favour.

This article appeared in the United States section of the print edition under the headline “Likes and protein spikes”.
