Free Speech and Corporate Accountability Can Coexist Online
Growing up, every time I wrote a letter to my grandfather, I worried that it would be censored. My father had fled communist Poland to the United States, but my grandfather could not escape and still lived behind the Iron Curtain. I learned very early on that it can be dangerous when governments go too far.
As the CEO of YouTube, I deal with questions of free expression and responsibility every day. Businesses, civil society and governments face unprecedented challenges and must resolve complex questions about where to draw the lines around speech in the 21st century. Policymakers around the world are putting forward regulatory proposals – some argue that platforms leave up too much content, while others say they take down too much. At YouTube, we work to protect our community while enabling new and diverse voices to break through. Three principles should guide discussions about regulating online speech.
First, the open internet has transformed society in incredible ways. The leaders of the Group of Seven recently issued a statement reaffirming the fundamental value of openness. YouTube makes information available to anyone with an internet connection. People around the world come to YouTube to find information, learn and build community. But creating a space that is open to everyone also means that bad actors sometimes cross the line.
YouTube has always had community guidelines that set the rules of the road. We remove content that could cause real-world harm, such as violent extremism, copyright infringement and dangerous pranks. Some of our decisions are controversial, but we apply our guidelines equally, regardless of who posts the content or what political viewpoint it expresses. At the same time, we embrace the messiness inherent in an open internet. Removing everything controversial could silence important voices and ideas.
The second principle: Democratic governments must give companies clear guidelines on illegal speech. This helps us remove illegal content faster and more efficiently. Such laws must be grounded in international standards as officials weigh the right to information against the risk of harm. Rules for the internet are regularly updated, from copyright to elections to political campaigns, and YouTube stands ready to work with governments on these and other issues.
But governments don't oversee every decision about content, which is why I firmly believe in the third principle: Companies should have the flexibility to develop responsible practices for dealing with legal but potentially harmful speech. Some policymakers are debating what legal speech should be allowed on platforms, but such regulation could have serious consequences.
Suppose officials decide to regulate legal content that they consider graphic. That could lead to the removal of protest footage, videogames and music videos. Evidence on YouTube helped prosecutors in Sweden hold the Syrian regime and rebel fighters accountable for war crimes. What if those videos had been removed because they were deemed too graphic?
Companies also need to be able to react quickly when new threats emerge. When cell towers were set on fire in the U.K. last year after a conspiracy theory blamed Covid-19 on 5G networks, we updated our guidelines within a day to remove the harmful content. Our community relies on us to take action, and we must retain the ability to move quickly.
Some might say that governments should police online speech, but companies need flexibility to strike the right balance between openness and responsibility. When we get something wrong or lean too far in one direction, it affects our business and the millions of creators and small businesses built on YouTube. Advertisers have pulled their spending from YouTube when their ads ran alongside problematic content.
We work hard every day to act responsibly, and our advertisers, users and creators hold us accountable for it. We are working with the Global Alliance for Responsible Media to develop industry-wide definitions of content that isn't suitable for advertising. We are also a founding member of the Global Internet Forum to Counter Terrorism, an organization that works to prevent violent extremists from exploiting digital platforms. And we provide tools and controls that let users manage their YouTube experience.
Operating our platform responsibly is good for business, and we are working to be more transparent about our efforts. We recently released our Violative View Rate, an estimate of how often viewers see content that violates our guidelines. The rate is down more than 70% since 2017, thanks largely to our investments in machine learning, which helps flag potentially violative content. In the first quarter the rate was between 0.16% and 0.18%, meaning that of every 10,000 views on YouTube, 16 to 18 were of content that violates our guidelines.
Much is at stake as we update our approach to online speech. Overregulating legal content would have a chilling effect on speech and could stifle the next big idea or discovery. I am confident there is a path forward that both keeps our community safe and preserves freedom of expression.
Ms. Wojcicki is the CEO of YouTube.
Copyright © 2021 Dow Jones & Company, Inc. All rights reserved.