Facebook whistleblower Frances Haugen calls for new powers to stop the spread of disinformation on social media and warns of election risks
She has been invited by the Morrison government to give evidence in its inquiry into social media giants and the “toxic material” they allow to be posted and circulated online.
“Self-regulation doesn’t work”
Ms Haugen, a data scientist, commended Australia for taking on Big Tech, including by introducing the News Media Bargaining Code to support public interest journalism.
But she urged the government to go further, as UK and European officials have done, and regulate “the heart of the platforms’ business model” – the algorithms and design features that enable the spread of disinformation and hate, as well as the exploitation of people’s data.
“In all parts of the world it has become clear in recent years that self-regulation does not work. Platforms cannot be trusted to act in the public interest,” she said.
“General elections are coming up in Australia in a few months. As we have seen in many recent elections around the world, particularly in the United States, threats to electoral integrity are unpredictable and can quickly escalate.
“Australia, with its unique geopolitical location, is no exception. Social media is an important vector for coordinated disinformation campaigns. It adds fuel to the fire of confusion and fake news. It reinforces the most sensational content at the expense of balance and facts.
“China is a very, very sophisticated operator that has invested heavily in weaponizing social media platforms in open societies because they see our strengths as our weaknesses.”
Ms Haugen said societies could no longer rely on platforms to moderate harmful and illegal content downstream, given the sheer volume of material produced, and that Facebook often “blames the victim” when criticised.
She said many things could be done to make platforms safer for children and reduce the spread of dangerous content and disinformation.
She said social media companies should be required to assess the risks of their products and make those assessments public, so that regulators, parents, doctors and others in the community could hold them accountable.
Academics should also be given a safe harbour to analyze Facebook’s data, she said, noting that the company had threatened to sue US academics and banned their accounts after they tried to investigate how Facebook advertising manipulated US elections.
One policy change, she said, would be to require human moderators to approve posts in groups with more than 10,000 members.