Addressing Big Tech’s Power Over Speech
Antitrust law could play a role in combating the considerable power social media companies hold over speech.
At many points during the 2020 US presidential election, social media platforms demonstrated their power over speech. Twitter decided in October 2019 to permanently ban political advertising, sparking a heated debate over freedom of expression and so-called “paid disinformation.” A year later, shortly after polling stations closed, Facebook and Google imposed temporary restrictions on political advertising. In May 2020, Twitter applied fact-check labels to two misleading tweets from then-President Donald J. Trump about mail-in ballots; Facebook initially declined to follow suit, but later adopted its own fact-checking policy for politicians.
In June 2020, Twitter for the first time hid one of President Trump’s tweets behind a warning label for apparently inciting violence against Black Lives Matter protesters. Facebook decided to leave the post up. Ultimately, all three platforms suspended Trump’s account after the attack on the U.S. Capitol on January 6, 2021. In the days following the suspension, online misinformation about election fraud across multiple platforms fell by nearly 75 percent.
These events demonstrate the ability of Facebook, YouTube, Twitter, and others to amplify or limit the dissemination of information to their hundreds of millions of users. While we applaud the steps these companies eventually took to tackle political misinformation and extremism during the electoral cycle, their actions are also a sobering reminder of their power over our access to information. Raw power harbors the possibility of abuse: without guardrails, there is no guarantee that dominant platforms will always use that power to promote public discourse.
Some leaders have suggested using antitrust law to curb the power of social media companies. US Representative David Cicilline (D-R.I.) reiterated this view at a hearing of the House Antitrust Subcommittee last summer, accusing Facebook of “getting away with” spreading misinformation because it is “the only game in town.” He went on to say that for social media giants, “there is no competition forcing you to police your own platform.”
The focus on competition is understandable, because the political power of social media companies grows out of their economic power. Facebook, Instagram, and YouTube benefit from network effects, whereby their value to both users and advertisers increases with the number of active accounts. Large social media platforms also collect significant amounts of personally identifiable information about individuals, allowing them to monetize advertising and target it to users more effectively. In addition, some companies have engaged in conduct, such as Facebook’s acquisitions of Instagram and WhatsApp and Google’s pre-installation agreements for YouTube and other apps, that has cemented their market power. The Federal Trade Commission, the US Department of Justice, and numerous state attorneys general recently filed lawsuits against Google (which owns YouTube) and Facebook for violating the Sherman Act and harming consumers and economic competition.
These pending lawsuits reflect the current state of antitrust law in focusing on the economic impact Facebook and Google have on consumers and competition, not the political or other societal impact. The Chicago School’s jurisprudence, which has guided antitrust enforcement for the past four decades, deals primarily with price effects on consumers, not with the political harms or other risks associated with content moderation by powerful platforms. And since most social media platforms offer their services to consumers at no monetary cost, US antitrust laws, as currently interpreted, do not account for the full range of non-monetary effects that result from a lack of competition.
Antitrust doctrine does not address how social media companies collect large and detailed troves of personal data, handle misinformation, combat extremism, demonstrate transparency and accountability, and generally influence democratic institutions. However, as former Federal Trade Commission chairman Robert Pitofsky wrote in 1979, the underlying congressional intent of US antitrust law was not focused solely on economics: “It is bad history, bad policy, and bad law to exclude certain political values in interpreting the antitrust laws.”
It is possible that the antitrust lawsuits against Facebook and Google could limit the companies’ control over the content we access, but such a change would be neither easy nor quick. For example, if these lawsuits lead to the breakup of either company, they could create a more competitive environment with a wider distribution of power over political information. However, it will take years for these cases to be litigated, and government enforcers face a heavy burden of proof in court.
Additionally, courts have traditionally taken a conservative stance in antitrust enforcement, interpreting the Clayton and Sherman Acts over the past 40 years to require strong proof that anticompetitive behavior would cause financial harm to consumers and competition, which leaves the outcome of these cases uncertain.
While current antitrust laws do not adequately capture the power of social media to influence democratic processes, members of Congress have expressed interest in re-evaluating or updating these laws. US Senator Amy Klobuchar (D-Minn.) recently proposed a bill to amend the Clayton and Sherman Acts. In addition, the House Antitrust Subcommittee released a majority staff report last year, and US Representative Ken Buck (R-Colo.) released a separate report. Both called for reforms, suggesting a bipartisan interest in reducing the raw power of a few dominant companies and thereby helping new social media platforms compete.
There are alternative routes as well: Congress could address the potential for platforms to abuse their power over information and hate speech by updating Section 230 of the Communications Act of 1934, which limits the liability of social media platforms for user-generated content.
Whichever direction Congress takes, the limitations of current antitrust law in addressing the modern issues surrounding dominant social media platforms require a fresh look at how the United States deals with the political and social consequences of economic power. As the role of social media in the 2020 elections shows, dominant technology platforms can limit the spread of dangerous disinformation. But the same power can be used irresponsibly, either by inappropriately restricting access to important information or by perpetuating the “Big Lie.” Harm in this sense is not limited to direct price effects. It is becoming increasingly difficult to overlook the reality that some changes may be needed to address the power and risks that come with the dominance of social media platforms.
Bill Baer is a Visiting Fellow in Governance Studies at the Brookings Institution.
Caitlin Chin is a research analyst at the Brookings Institution’s Center for Technology Innovation.