Facebook’s Whistleblower Said the Company Does Too Little to Protect Users. Much of the Public Agrees

Most of the public agrees with former Facebook Inc. employee and whistleblower Frances Haugen: in a recent Morning Consult opinion poll, a majority said the company is not doing enough to protect user safety and voiced support for both congressional and corporate action to correct the situation.

Sixty percent of adults in the United States said Facebook isn’t doing enough to keep users safe, compared to 19 percent who said the social media company is doing enough.

Public on board with stricter content controls

Problems Facebook has encountered in the past few weeks include reports that the company’s platforms, such as Instagram, have negatively impacted the mental health of younger users, as well as claims that the company has allowed misinformation on topics like COVID-19 vaccines to spread on its pages. The poll shows that the public would support actions Facebook could take to address some of these concerns.

One of the most popular measures Facebook could take was introducing stricter rules and standards to ban content that promotes misinformation, hate speech, illegal activity or acts of violence, which was supported by 69 percent of respondents overall. Receiving an equal share of support was raising the age of eligibility for an account on the company’s platforms.

Despite public support for increasing the barrier to entry, some experts are not convinced. Sean Blair, assistant professor of marketing at the McDonough School of Business at Georgetown University, said such restrictions “just don’t work.”

“People will always find ways around them so the problem doesn’t just go away,” Blair said. “That doesn’t mean we shouldn’t have any barriers or age requirements at all, but it does mean we probably shouldn’t rely on them to solve the problem. Ultimately, I think everyone – companies, users, parents, children, regulators – has a role to play in this process.”

Most respondents also support expanding Facebook’s ability to censor and remove certain types of content, as well as a measure to display the news feed in chronological order rather than using algorithms to customize what is shown.

Majority support for congressional action

The public is also in favor of congressional intervention and tighter regulation of the social media giant.

The most popular suggestion was stronger protection for children on social media platforms, which received 77 percent support. That came after Facebook announced it was pausing its Instagram Kids initiative, a move supported by 52 percent of US adults in a Morning Consult poll conducted shortly after the announcement.

A plan for Congress to create an independent government agency staffed by former tech workers to investigate Facebook’s use of algorithms and the risks they pose to the public received widespread support, as did regulations that would require social media companies to be more transparent about the algorithms they use.

And 64 percent said they support making social media companies at least somewhat liable for the actions of their users, something a group of House Democrats is seeking through a new bill that would hold platforms responsible if the personalized recommendations made by their content algorithms promote harmful content that causes emotional or physical harm.

This would represent a major challenge to the liability protections companies have under Section 230 of the Communications Decency Act, although some in the tech industry have said it’s not the best way to go.

“Rather than blaming the algorithm, Congress should work with platforms to develop best practices for quickly identifying and removing malicious content, and for giving users the skills and tools they need to stay safe online,” Daniel Castro, vice president of the Information Technology and Innovation Foundation, said in a statement.

Adam Kovacevich, chief executive of the tech policy group Chamber of Progress, warned in a statement that the bill would exacerbate “the problem” of harmful content. “By banning companies from using personally identifiable information to recommend relevant content to users, platforms could be forced to rely more on metrics like viral engagement that lead to the spread of bad content,” he said.

What else could be done

Others have suggested a different approach. A group of more than 40 human rights organizations has called for a federal data protection law that would make it illegal for social media companies to collect data and use it in their personalized recommendation algorithms. The groups said the law should be “strong enough to end Facebook’s current business model.”

A law requiring Facebook to disclose its internal research was also heavily supported, with 68 percent saying they were in favor of such a move. Much of the recent controversy surrounding the company has centered on the “Facebook Files,” a series of internal documents showing how the company downplayed various negative findings related to its platforms, including Instagram’s impact on children’s mental health.

The public seems to support regulation of social media companies in general, with 52 percent saying they are in favor of such a move by lawmakers. And 43 percent said Facebook is not adequately regulated, compared to 19 percent who said it has the right level of oversight and 17 percent who said it has too much.

A Facebook spokesman declined to comment on the results and pointed to an opinion piece by Nick Clegg, the company’s vice president for global affairs, calling for new internet regulations, including reform of Section 230.
