The Select Committee on Foreign Interference through Social Media has been tasked with probing the risk posed to the nation's democracy by foreign actors online, but it's been warned against ignoring the power of domestic influence in spreading misinformation.
It's also been cautioned against simply enforcing content blocking and leaving the responsibility to a handful of mostly US-based tech companies.
The committee on Monday heard from evelyn douek from the Berkman Klein Center for Internet & Society and Alex Stamos from the Stanford Internet Observatory, who both agreed it's a battle best fought with transparency, not one of prescribing what counts as right or wrong information.
Stamos, who in a previous life was Facebook's chief security officer and is now contracting to controversial video conferencing platform Zoom, told the committee they should start by looking at advertising from a policy or regulatory perspective.
"If you're thinking about policy responses here I would think not about whether something is true or false; I'd [also] be careful of wading into the authenticity of [social media] accounts as that becomes a very difficult, complicated issue," Stamos said.
"I would think about amplification. It is totally appropriate for Chinese state media to have their position, just as ABC has a position, just as the BBC has a position, just as Voice of America has its position. What you don't want is countries [being] able to amplify those messages beyond people seeking them out and seeing them organically."
Stamos believes the most dangerous part of these platforms from that perspective is their advertising systems.
"What advertising does is allow you to put content in front of people who did not ask to see it … we don't talk enough about the freedom of people to find information they're looking for," he said.
"In situations where people want to read the output of CCTV, I think in democracies we allow them to do that; what we don't have to allow them to do is spend millions of dollars to put that content in front of people who are never looking for it."
Meanwhile, douek said a centrepiece for any regulation or policy is getting greater transparency from platforms, telling the committee on Monday, "we cannot fix problems that we don't understand".
She said the idea that platforms can and should do more is oversimplifying the problem.
Touching on what Australia in particular is facing, douek said overt influence campaigns and homegrown conspiracy theories often receive far higher levels of engagement than covert ones from overseas actors.
"Overhyping and securitising the discourse around disinformation campaigns only furthers the aim of such campaigns by increasing the levels of distrust in and apathy towards public discourse more generally," she said. "These second order effects will be, in the long term, far more [damaging] than any individual information operation."
To that end, douek said the Australian government's response must be grounded in democratic values, including respect for free speech.
"Disinformation is a truly elusive concept to define in law and responses rooted in censorship can interfere with individuals' freedom of expression and undermine the project of democratic self-governance," douek said, quoting UN Special Rapporteur on freedom of expression David Kaye from his report on COVID-19.
Douek said asking private platforms to adjudicate claims made on their platforms, despite having neither democratic legitimacy nor democratic responsiveness, is a "fraught exercise".
"And then it becomes all the more fraught when we're talking about matters of politics, where one person's propaganda is another person's truth. And so asking these platforms -- American companies, what's more -- to make those calls in the context of Australian democracy, there's a severe democratic deficit there," she said.
"I think it's really important as well that we get out of the sort of 'take down, leave up' paradigm of content moderation; the idea that the most effective response to misinformation or disinformation is always to simply remove it.
"Actually, platforms have a lot more tools available at their disposal than that. And that's why when we're talking about things on platforms we have a lot more tools like labelling information, fact-checking, reducing circulation, adding frictions so things can't go viral as quickly or as fast."
She believes building up the fact-checking capabilities in Australia will be more effective than outsourcing the role of arbitrating public debate to private American companies.
"It makes dealing with this from a policy perspective really difficult because you're not really talking about true or false," Stamos added. "You're talking about who gets to decide the context in which these facts are interpreted, which interpretation becomes the dominant one, and how much amplification you're allowed to get for your interpretation by pushing it via advertising or via the creation of lots of accounts that pretend to believe in the same kind of idea."
COVID-19 has seen social media platforms take a "more aggressive" approach toward policing misinformation and boosting authoritative information, such as when it is related to the World Health Organization, douek said.
This has not fixed the problem, however, as those considered authoritative can conflict with one another, or simply publish incorrect information themselves.
"It really comes down to the fact -- the notion of objective truth is very difficult to define, and we've seen that even in the context of the pandemic, which you would think would be one of the easier cases for us to work out what is harmful misinformation and what is not," douek added.