Web science pioneer says we need a better definition of misinformation if we want social media to weed it out
Even companies like Facebook, which has run prominent advertisements calling for internet regulation, have an important role to play in creating effective policies. For example, social media companies need a better definition of misinformation in order to root it out, says web science pioneer James Hendler.
Without a set of socially agreed-upon rules on how to define and handle misinformation, companies are currently creating and enforcing their own policies – and the results are mixed at best.
“If Trump wants to sue to get his Twitter account back, there is no obvious legal framework. There’s nothing that says to the platform, ‘If he does X, Y, or Z, he’s breaking the law,’” said Hendler, director of the Institute for Data Exploration and Applications at Rensselaer Polytechnic Institute. “If there were, Trump would have to prove in court that he didn’t do X, Y, or Z, or Twitter would have to prove that he did, and we would have a way to decide.”
As demonstrated in disputes over the 2020 presidential election results, political polarization is fueled by the spread of misinformation online. Hendler, co-author of the landmark 2006 Science article that launched the field of web science, said, “As society grapples with the social, ethical, and legal issues surrounding misinformation and the regulation of social media, it needs technologists to help inform this debate.”
“People say artificial intelligence will handle it, but computers and AI are very bad at ‘I know it when I see it,’” said Hendler, whose latest book is entitled “Social Machines: The Coming Collision of Artificial Intelligence, Social Networking, and Humanity.” “What we need is a framework that makes it much clearer: What are we looking for? What do we do if we find it? And who is responsible?”
The legal restrictions on social media businesses are largely dictated by a single sentence in the Communications Decency Act of 1996 known as Section 230, which states that internet providers and services are not treated as traditional publishers and therefore are not legally responsible for much of the content to which they link. According to Hendler, this clause no longer adequately accounts for the scale and scope of the power these companies currently wield.
“Social media companies offer a podium with an international reach of hundreds of millions of people. Just because social media companies are legally considered content providers rather than publishers doesn’t mean they aren’t responsible for anything on their site,” said Hendler. “What counts as harmful misinformation? With individuals and publishers, we are constantly answering that question through libel and defamation laws. What we don’t have, however, are equivalent principles for assessing harm from social media.”
Hendler has extensive experience in policy and advisory positions addressing artificial intelligence, cybersecurity, and internet and web technologies as they bear on issues such as social media regulation and powerful technologies like facial recognition. He is available to discuss various aspects of social media, information technology, and AI policy.