How web platform accountability is changing under Biden
Two people were dead; one was injured; and Jason Flores-Williams wanted to hold Facebook accountable.
Flores-Williams filed a lawsuit in September, after 17-year-old Kyle Rittenhouse allegedly killed two protesters in Kenosha, Wisconsin, that summer, then withdrew it in January. His push for accountability had collided with what the civil rights lawyer described as a “wall.”
“You have no levers of control, no leverage,” he told the Times. “You are dealing with Section 230.”
Section 230, a provision of the Telecommunications Act of 1996, is the law that leaves websites largely free to decide whether and how to moderate user-generated content. Flores-Williams claimed that a Facebook post by the Kenosha Guard militia calling armed civilians into town laid the groundwork for Rittenhouse’s violence there. But under Section 230 as written, Facebook and its peers are rarely liable for what their users post – even when it results in death.
Flores-Williams isn’t the only one who thinks the law is out of date. President Biden, former President Donald Trump, and a long list of Democrats and Republicans have all pushed for the law to be restructured or scrapped altogether amid increasingly bipartisan criticism of big tech.
But while liberals and conservatives agree in their calls for reform, they disagree on what that reform should look like. Internet companies remain in a limbo in which a massive, forced change to their business model is constantly discussed but never fully materializes.
Meanwhile, those trying to hold platforms accountable for harms caused by the content they spread are searching for new approaches that might offer better odds of success than their current ones – which is to say, virtually none at all.
Section 230 takes a two-pronged approach to content moderation: it not only shields websites from liability for unmoderated user content, but also states that they can moderate user content as they see fit. This allows social networks, chat forums, and review websites to host millions of users without being dragged into court every time they leave up an objectionable post or take one down.
Online platforms typically, though not uniformly, support keeping Section 230 in place. In a congressional hearing last fall, Alphabet CEO Sundar Pichai and Twitter CEO Jack Dorsey warned that the internet functions only thanks to the law’s protections. Facebook CEO Mark Zuckerberg broke ranks to say the law should be updated, highlighting the need for more transparency about moderation practices.
Among the law’s critics, conservatives tend to object to how platforms moderate. A Trump executive order attempted to reinterpret the law to let users sue platforms for restricting content that wasn’t violent, obscene, or harassing, though legal experts said the order was unlikely to hold up in court, and it appears to have had little effect on how platforms behaved.
On the left, critics have called for a version of Section 230 that would encourage more stringent moderation. Reforms aimed at sex trafficking and child abuse have also found bipartisan support in the past.
Both sides have only gotten louder in recent weeks: the January 6 siege of the U.S. Capitol stoked concerns on the left about the role unregulated social media can play in organizing real-world violence, while the subsequent bans on Trump’s Facebook and Twitter accounts gave the right a striking example of how easily tech platforms can silence their users.
With Democrats now in control of the presidency and both houses of Congress, the party has the power to rewrite Section 230 but has yet to reach consensus; members have floated several differently calibrated proposals over the past year.
The latest of these is the SAFE TECH Act, proposed last month by Sens. Mazie Hirono (D-Hawaii), Amy Klobuchar (D-Minn.), and Mark R. Warner (D-Va.). The bill would increase platforms’ liability for paid content and in cases of discrimination, cyberstalking, targeted harassment, and wrongful death.
Flores-Williams said the last item in particular, which sponsors say would “allow a deceased’s family to bring lawsuits against platforms where they may have directly contributed to loss of life,” could open the door to future suits along the lines of his withdrawn one.
It could also bolster claims over deaths like that of Brian Sicknick, the Capitol police officer who died after defending the Capitol on Jan. 6. The official cause of Sicknick’s death has yet to be determined, but the bill’s sponsors cite his case in their argument for the provision.
The impact could extend well beyond high-profile deaths.
“You’re talking about floodgates, right?” said Daniel Powell, an attorney at internet law firm Minc Law. “Floodgates of lawsuits from millions of people who have lost someone for any reason that’s tangential to social media in any way.”
It’s not clear how expansively courts would interpret SAFE TECH’s provisions, but if passed, the law could force tech companies to rethink how they handle user-generated content.
Nadav Shoval, CEO and co-founder of OpenWeb – a platform that manages comment sections for online media outlets such as TechCrunch and Salon – said changes to Section 230 could stifle innovation by opening up too broad a liability.
“I have more questions than answers on this particular proposal, but I remain confident that changing the law at all is a mistake,” Shoval said of the SAFE TECH Act via email. “We have other laws outside of [Section] 230 to ensure that the communities we host are safe and free from violence, hate speech, discrimination, etc.”
Still, clearer guidelines for moderating and distributing user content would be helpful, Shoval said. These are areas “that should be lightly regulated, or at least clarified, because at the moment there are a lot of gray areas … where guidance would definitely help.”
Other social media platforms that would be affected by the SAFE TECH Act’s passage – including Facebook, Twitter, Google, Reddit, and Snapchat – either declined to comment on the bill or did not respond to requests for comment.
The legislation faces a rocky road ahead. Opposition to content moderation became a major Republican rallying cry under Trump, and the party retains significant power to block Senate legislation through the filibuster. With Democrats occupied by the COVID-19 pandemic and the accompanying economic crisis, liberal leaders may be reluctant to spend time and energy on abstruse social media policy.
In the absence of imminent reform, some attorneys have adopted a different strategy: searching for novel legal theories that can hold platforms accountable for user content while Section 230 remains on the books.
“For as long as [Section 230] has existed, there have been plaintiffs’ lawyers trying to get around the immunity it offers,” said Jeffrey Neuburger, a partner at Proskauer who co-leads the firm’s technology, media and telecommunications group.
But courts have, with few exceptions, shot down these efforts, Neuburger added. For example, he wrote via email, courts have “routinely and consistently rejected” arguments that websites become liable for user content when they perform editorial functions such as removing content or deleting accounts, and have likewise rejected arguments that websites’ “acceptable use” guidelines amount to legally binding promises. And in the few cases where plaintiffs managed to defeat a Section 230 defense, the judgments were generally overturned on appeal.
“There are no easy answers,” said Neuburger. “It’s difficult to regulate content online.”
One approach that could make it easier to regulate the places where content lives is to change the legal status of large internet platforms so that they are subject to stronger government oversight.
“Instead of trying to change Section 230 – because I’m not sure that will work … maybe [try] treating these providers like public utilities,” said Daniel Warner, founding partner of online defamation law firm RM Warner. “You can’t refuse to give someone electricity because they support Joe Biden or Donald Trump. It just doesn’t, and shouldn’t, work that way. I think the same goes for social media.”
While the push to use antitrust law to break up the biggest tech companies has gained momentum in recent years, the public utility approach cuts in the opposite direction: it treats networks like Facebook, Amazon, and Google as “natural monopolies” and allows them to dominate their respective markets – but only under strict government regulation.
Proponents of this approach argue that social networks are central to their users’ lives and prohibitively difficult to leave. Critics counter that, compared with traditional utilities like railways and sewer systems, social networks are less essential to consumers and easier for upstart businesses to compete with.
For Warner, the public utility approach remains largely theoretical at this point: “We haven’t yet had a chance to make that argument and really examine it in detail.”
And going down that path could create new legal problems, Neuburger said, such as forcing the government to determine which platforms count as public utilities and which don’t, or to clarify how Section 230 should interact with conflicting state laws.
For now, everyone involved – from stymied lawyers to gridlocked politicians to embattled tech executives – is caught between an unpopular present and an uncertain future.