Australia’s attorneys-general are in the process of updating the country’s defamation laws so that they take into account the role of digital platforms in distributing and displaying online content.
While appearing before the Parliamentary Joint Committee on Intelligence and Security as part of its inquiry into extremist movements and radicalism in Australia, representatives from Facebook, Google, and Twitter were asked for their opinions on the work under way by the attorneys-general.
Specifically, they were asked about the implications if, for the purposes of defamation law, digital platforms were to be treated as publishers and therefore be legally liable in the same way a media outlet would be for publishing defamatory content in Australia.
“One of the policy principles that I think is really important when it comes to defamation, though, is to ensure that the responsibility primarily sits with the speaker, with the person who has control over what they are saying,” Facebook Australia’s head of policy Josh Machin said.
“Because if you have any misalignment between — if you make another party responsible for something that someone is saying, then you have potentially the wrong incentives.”
Machin was asked to ponder the ramifications of defamatory comments being left on Facebook in a scenario where the company was legally liable for such remarks.
“I’ll tell you what we currently do,” he said instead.
“Firstly, we review them against our community standards. And if they violate our community standards, we’ll remove that content. So if someone is saying something that represents hate speech, if it violates our bullying policies, we’re able to remove it straight away, because that already violates the rules that we’ve set for the discussions that we have on Facebook.”
If the content does not violate Facebook’s community standards, the company then undertakes a legal review process that considers matters such as defamation law.
“We consider if it is potentially defamatory, what the defences might be, and then the potential liability that could sit with us as well,” he continued. “And certainly if we receive a clear court order that something is defamatory … we’ll take steps to geo-block that content.”
Google Australia’s head of government affairs and public policy Samantha Yorke argued that the question from committee chair Senator James Paterson was not as hypothetical as it might seem.
“There are a number of legislative frameworks in place in Australia today that do impose legal liability, both criminal and civil, on digital platforms in relation to content that could be harmful,” she said. “And in fact, Google has been found by several courts, in different jurisdictions across Australia, to be a publisher for the purposes of defamation law, just by linking to a website that contains defamatory material.”
She said Google “craves legal certainty” and wants the law to clarify the roles and responsibilities of digital platforms.
“If indeed the outcome of that was to determine that actually, digital platforms are equally as responsible for content that they host as the person who actually created the content in the first place — it’s a bit hard to talk about how we might have to change our business models. But you could see a scenario whereby we would have to behave more like a traditional media company, pre-vet all content, and have to make editorial decisions about what is published.”
Yorke said this would clearly be a “significant deviation” from the way Google currently operates.
“It would be challenging, frankly, given the sheer volume of content that is uploaded onto our services every day. But it would, of course, logically require us to rethink the way that our businesses operate here,” she said.
Twitter’s senior director of public policy and philanthropy for the APAC region, Kathleen Reen, said that with more than a billion tweets posted every two days, the platform “couldn’t possibly review every tweet, or litigate, or support the litigation” given the sheer volume involved.
She considered the policies and procedures Twitter imposes on users of the platform to be sufficient, particularly given that they are constantly evolving.
“Twitter is very well known for its commitments to freedom of expression, and I wouldn’t be doing my job or really expressing that sentiment very well if I didn’t say that we do have a concern around protected speech and freedom of expression,” Reen said.
“And we do have a concern about the costs and how easy it would be to suppress speech, and uncomfortable or unpopular speech or debate. There’s a lot of different views about what should and shouldn’t be in there. So we’re looking forward to the conversation.”
Consultation on the attorneys-general’s discussion paper on changes to defamation law is open until May 18.