‘A law unto themselves’: the Australian battle to curb Facebook and Twitter’s power
Nationals MP Anne Webster and Labor MP Sharon Claydon are less concerned with why Donald Trump has been removed from social media and more concerned with what platforms like Facebook are doing to stop online defamation and abuse.
Webster and Claydon are the co-chairs of the Parliamentary Friends for Social Media Safety, a group that aims to “highlight the social media environment and the risks involved” and make platforms more accountable. It now has more than 50 members, thanks in part to the response from Twitter and Facebook to last week’s attack on the US Capitol.
For Webster, it’s personal. After winning a defamation case against a conspiracy theorist who falsely accused her of “being a member of a secret pedophile network,” she wants Facebook to be treated as a publisher.
The decision by Twitter and other social media platforms to first remove posts and then suspend Trump’s account sparked outrage among some conservatives, including Nationals MP George Christensen and Liberal MP Craig Kelly.
The outspoken pair favour changes to prevent social media platforms from censoring legitimate content created by their users – a push towards greater freedom of speech and less responsibility on the platforms for the content they carry.
Webster tells Guardian Australia that while she is glad the Trump controversy and the Chinese foreign ministry’s tweet accusing Australia of war crimes in Afghanistan “set fire to the debate”, there is now a wider discussion about regulating social media.
According to Webster, social media companies are “for the most part a law unto themselves”. Her defamation case “cost me dearly both financially and emotionally”, and Webster says most of those affected could not afford to fight defamatory posts in court.
The legal position on defamation via social media is unclear. David Rolph, a professor at the University of Sydney and an expert in defamation law, says that “in principle” the social media companies can be held liable.
Just as media companies were held liable for comments on their Facebook pages in the Dylan Voller case because they were “responsible for everything that resulted from the establishment of a public page”, “this analysis could extend to the social media platforms themselves”, Rolph says.
He says there are also “jurisdiction and enforcement issues” in suing overseas-based companies, which is why plaintiffs rarely pursue the internet giants, as well as possible immunity under the Broadcasting Services Act if social media companies can argue they are “internet content hosts”.
Webster says that in her case Facebook’s response was “appalling – it took months” and was only prompted by her legal action.
“Freedom of speech must be valued, but it should not give people the right to stir up an uproar or lie about people.
“Social media companies have benefited from online conversations, but there are rights and responsibilities … if they are not held accountable, falsehoods multiply at a rate of knots.”
MP Anne Webster won a defamation case against a conspiracy theorist who falsely accused her of “being a member of a secret pedophile network”. Photo: Mike Bowers / The Guardian
Mia Garlick, Facebook’s director of policy for Australia and New Zealand, told a parliamentary committee that the company had geoblocked some posts from Webster’s accuser and removed the account after repeated violations of community standards. She blamed “additional legal complexity in this case” for the delay.
Claydon got involved because of the experiences of her constituents with “online harassment, posting of intimate photos, cyberstalking and women being found via social media platforms by violent family members”.
“I had a growing interest because there were posts and sites that allowed abuse of women – and when people complained, the complaints fell somewhere into a deep dark void and didn’t really go anywhere.”
According to Claydon, users agree not to spread hate speech, incite violence or deliberately spread dangerous misinformation, so the platforms do no wrong by removing users who violate those rules, such as Trump.
For Claydon, Trump’s de-platforming raises the question of “why it took four years” for the platforms to act on him clearly breaking their rules – and the fact that they only found the courage on the eve of a new presidency shows the limits of self-regulation.
“They see themselves as large global entities and are not particularly accountable to anyone,” she says.
According to the eSafety commissioner, 14% of Australians are exposed to online hate speech. Claydon wants to build bipartisan support to prevent social media from becoming “a dangerous weapon” against half of our citizens, rather than “letting those with the biggest mouths” charge out and shape the debate.
Despite Christensen’s calls in the name of freedom of speech, creating a safer space is also the direction the government is heading.
In December, the communications minister, Paul Fletcher, published a draft online safety bill proposing to give the eSafety commissioner the power to order the removal of harmful content.
The eSafety commissioner, Julie Inman Grant, said the bill would help ensure that social media moderation is applied “fairly and consistently”, but it does not address the concerns of some Coalition members about de-platforming.
The legislation would be the first of its kind to tackle not only illegal content “but also serious online harms, including image-based abuse, cyberbullying of young people, and … serious cyber-abuse of adults with the intent to cause harm”.
There is also a voluntary code on disinformation being drawn up by the social media giants, to be overseen by the Australian Communications and Media Authority and expected to be completed by mid-year.
While senior Coalition figures, including the acting prime minister, Michael McCormack, and the deputy Liberal leader, Josh Frydenberg, expressed concern over Trump’s removal, there is no suggestion the government will change course to support Christensen’s call to prevent platforms removing anything other than unlawful content.
Fletcher has signalled he is cool on the idea of going beyond the existing package, arguing that it already “creates a statutory framework within which decisions about removal of content from social media platforms are made (and, if necessary, can be overridden by government)”.
Common to the calls for reform is a demand for more transparency about decisions to block posts or remove users.
Labor MP Sharon Claydon says the de-platforming of Donald Trump on the eve of a new presidency shows the limits of self-regulation. Photo: Andre M. Chang / ZUMA Wire / REX / Shutterstock
The Australian Competition and Consumer Commission chairman, Rod Sims, who led the regulator’s digital platforms inquiry, says that given the level of control the platforms have over what we see and read, “we definitely need government to get a grip on this. We can’t just leave it to the digital platforms.”
The eSafety commissioner says the platforms “are not always transparent about how they enforce and apply these guidelines, and it is not always clear why they remove one piece of content and not another”.
Transparency would be enhanced by the basic online safety expectations in the online safety bill, which “set out expectations for platforms that reflect community standards, and fair and consistent implementation of appropriate reporting and moderation on their sites”, she told Guardian Australia.
“This could include, for example, the rules that platforms currently apply to ensure the safety of their users online, including against the threat of violence.”
Liberal MP Trent Zimmerman backed the platforms’ decision to remove Trump, whom he accused of fanning the flames of a threat to the peaceful transfer of power in the US.
However, he says the episode showed “inconsistent standards” being applied, with Trump removed while “many authoritarian leaders can continue to use these platforms for their propaganda”.
“We need clear, transparent rules. And it would be helpful to clarify what options there are to seek explanations of, or to appeal against, these decisions.”
Despite high-level discomfort within the Australian government over de-platforming, the mood is still for more, not less, regulation.
For people like Webster and Claydon’s constituents, basic enforcement of existing standards would be an improvement.