5 fixes for Facebook

One of the most disturbing revelations in the cache of internal documents leaked by former Facebook employee Frances Haugen was how little we know about Facebook and how unprepared our political culture is to do something about whatever it is.

That’s the first problem with fixing Facebook: there isn’t much consensus on what exactly the problem with Facebook is. The left says it is Facebook’s amplification of hatred, extremism and misinformation about vaccines and the last presidential election, among other things. President Joe Biden put it bluntly this summer: “They’re killing people.”

Former President Donald Trump and others on the right say the opposite: social media giants are run by liberals eager to silence opposing views. In a recent statement, Trump described Facebook founder Mark Zuckerberg as “a criminal” who “changed the course of a presidential election.”

Aside from concerns about the distortion of domestic politics, there are a number of other questions about Facebook, Instagram and WhatsApp – all of which, as Zuckerberg recently announced, are now grouped under a new corporate umbrella called Meta.

Does Instagram contribute to anxiety and body-image issues among teenagers? Are Facebook’s outrage-stoking algorithms destabilizing developing countries, where the company devotes fewer resources to monitoring its platform than it does in its largest markets? Is Facebook perpetuating racism through biased algorithms? Is it driving global polarization, splintering societies into uncooperative in-groups?

Underlying these concerns is a broader one: Facebook’s alarming power. The company is among humankind’s most prolific collectors of private information, one of the most widely used news sources in the world, and it appears to have the ability to shape public discourse to some extent.

Worse still, essentially all of Facebook’s power rests with Zuckerberg alone. That feels intolerable; as the philosopher Kanye West put it: “No one man should have all that power.”

So what should we do? I’ve put this question to more than a dozen experts over the past few days. Here are some of their top ideas, and what I think about them.

1. Break it up

Under the tech-friendly Obama administration, the Department of Justice and the Federal Trade Commission allowed Facebook to devour fast-growing potential rivals. Splitting Facebook into three or more independent companies would reverse this regulatory misstep and immediately reduce Zuckerberg’s power over global discourse.

It could also improve the tenor of social media, as the new independent networks “would compete with each other by differentiating themselves as better and safer products,” said Matt Stoller, director of research at the American Economic Liberties Project, an anti-monopoly advocacy group.

Still, as Stoller notes, a breakup may be a necessary measure, but it is hardly sufficient; even with new competition, a split would leave us with three networks that each inherit Facebook’s vast troves of data and many of its corporate pathologies.

The breakup plan also faces steep hurdles. Over the past few decades, American antitrust law has grown ruthlessly friendly to corporations, and it is unclear how to reverse that. In June, a federal judge dismissed sweeping antitrust suits against Facebook filed by the FTC and 40 states, on the grounds that they had failed to show Facebook holds a monopoly in social media.

2. Regulate the content

Imposing rules on what Facebook can and cannot publish or amplify is a hot topic among politicians. Democrats in Congress have put forward proposals to police misinformation on Facebook, while lawmakers in Texas and Florida have tried to bar social media companies from kicking people off their platforms, including Trump, over what they say.

These policies give me the creeps, since they inevitably involve the government imposing rules on speech. Almost all of them appear to violate the First Amendment.

Oddly, though, content rules have become the leading proposals for fixing Facebook; repealing Section 230 of the Communications Decency Act, which limits technology platforms’ liability for harm caused by user-posted content, is often held up as a panacea. Of the many ways to combat Facebook’s ills, speech rules strike me as the least palatable.

3. Regulate “surveillance capitalism”

Here’s a seemingly obvious way to bring Facebook to its knees: bar it from collecting and storing the data it holds on us, which would severely hamper its main business, targeted advertising.

The reasoning is simple. Suppose we determine that the social damage caused by “surveillance capitalism,” the Harvard professor Shoshana Zuboff’s aptly creepy term for the ad-tech business, poses a collective threat to public safety. In other industries whose products pose such threats – automobiles, pharmaceuticals, financial products – we mitigate the damage through strict regulation; the digital advertising industry faces few limits on its behavior.

So let’s change that. Congress could enact far-reaching rules on how advertising giants like Facebook and Google collect, store and use personal data. Perhaps more important, it could establish a regulator with the resources to investigate violations and enforce those rules.

“At a minimum,” said Roger McNamee, an early Facebook investor who is now one of the company’s most vocal critics, “regulators should prohibit the use of the most intimate data, such as health, location, browsing history, and third-party application data.”

Privacy rules like these are among the main ways European regulators have sought to curb the impact of social media. Why don’t we hear more about this approach in America?

I suspect it’s because this is a bigger solution than Facebook. All of the tech giants – even Apple, which has criticized the digital advertising business’s hunger for private data – make billions of dollars off ads, and there are plenty of other companies that have become addicted to ad targeting. When California tried to improve consumer privacy, corporate lobbyists pushed for the rules to be watered down. I’m afraid Congress wouldn’t fare much better.

4. Force internal data to be shared

Nathaniel Persily, a professor at Stanford Law School, neatly describes the most fundamental problem in policing Facebook. At present, he has written of social media’s impact on the world, “we don’t even know what we don’t know.”

Persily suggests piercing the black box before we do anything else. He has drafted a bill that would force large technology platforms to provide outside researchers with a range of data about what users see on the service, how they interact with it, and what information the platform provides to advertisers and governments.

Rashad Robinson, president of the civil rights group Color of Change, endorsed another bill, the Algorithmic Justice and Online Platform Transparency Act, which would also require platforms to publish data on how they collect and use personal information across demographic categories, including users’ race, ethnicity, gender, religion, gender identity, sexual orientation and disability status, to show whether their systems operate in a discriminatory way.

Tech companies cherish their secrecy, but aside from their objections, it’s hard to imagine many drawbacks to transparency mandates. Even if we don’t change the way Facebook works, we should at least find out what it is doing.

5. Improve digital literacy

Renee DiResta, the technical research manager at the Stanford Internet Observatory and a longtime scholar of the anti-vaccine movement’s online presence, described one idea as “unsexy but important”: educating the public so that people don’t believe everything they see online.

This is not just a matter for schools; some of the most egregious amplifiers of online mendacity are the elderly.

So what we need is something like a society-wide effort to teach people how to evaluate digital information. For example, Mike Caulfield, a digital literacy expert at the University of Washington, developed a four-step process called SIFT (stop; investigate the source; find better coverage; trace claims to their original context) for assessing the accuracy of information. Once students have internalized the method, Caulfield said, “We see students make better judgments about sources and claims in 90 seconds than they did in 20 minutes.”

Or … do nothing

In his new book, Tech Panic: Why We Shouldn’t Fear Facebook and the Future, Robby Soave, a senior editor at Reason magazine, argues that the media and lawmakers have grown too alarmed about the dangers of Facebook.

He doesn’t deny that the company’s rise has had some terrible effects, but he worries that some of the proposed fixes could further entrench Facebook’s dominance, a point I agree with.

The best cure for Facebook, Soave told me in an email, is “to do nothing and watch Facebook gradually collapse on its own.”

Soave’s argument is not unreasonable. Once-indomitable tech companies have fallen before. Facebook still makes a lot of money, but it has lost consumers’ trust, its employees are restive and leaking documents left and right, and since most of its popular products were acquired through takeovers – deals regulators are likely to block in the future – it seems unlikely to innovate its way out of its troubles.

I disagree with Soave that we should do absolutely nothing about Facebook; I would prefer strict privacy and transparency rules.

But Soave will likely get what he wants. As long as there is major disagreement among politicians about how to deal with the evils of Facebook, doing nothing could be the most likely outcome.
