This Is What Meta Says It’s Doing to Prevent Misinformation in the Lead up to the 2022 Federal Election


Meta (the former Facebook) is adding RMIT FactLab to its cohort of third-party fact-checking organisations in the lead-up to the 2022 Australian Federal Election.

Meta says that it’s combatting misinformation, election interference and forms of abuse through a “comprehensive strategy” that it’s deploying for the coming election. The big news today is that the social media giant is adding RMIT FactLab to its third-party fact-checking program.

Supposedly, Meta has learned some key lessons over the more than 200 elections it has been a part of since 2017 (including the massively volatile 2020 U.S. presidential election that resulted in a platform-wide ban of Donald Trump), which it’s taking into consideration for the Australian Federal Election.

In case you haven’t been paying attention, we don’t have a locked-in date for the Australian election just yet, but we know it’s coming in the first half of this year (we’re expecting May).

This year is different, however, at least as far as Meta is concerned. Meta says that across 2021 it invested the equivalent of $7 billion (in Australian dollars) internationally in addressing misinformation, election interference and online harms on its platforms. Meta says it wants to stop abuses before they occur, not clean up after them.

As a part of this, Meta is adding another official fact-checker to its third-party fact-checking program in Australia – RMIT FactLab. RMIT FactLab is a research division at RMIT University that debunks misinformation online. It also produces its own research on digital news.

It’s the third Australia-based fact-checking organisation to join Meta’s third-party program, after the Australian Associated Press (AAP) and Agence France-Presse (AFP). Internationally, the program has 80 members.

Meta, however, told reporters that fact-checking politicians' claims will not be part of its measures for preventing the spread of misinformation.

“The speech of politicians are already very highly scrutinised,” Josh Machin, the head of policy at Meta Australia, is quoted by ZDNet as saying. “It’s scrutinised by [journalists], but also by academics, experts and their political opponents who are pretty well-positioned to push back or indicate they don’t believe something’s right if they think they’re being mischaracterised.”

Meta says it will remind political candidates to enable two-factor authentication for their profiles and will be holding social media information sessions for candidates, however.

That aside, Russell Skelton, the director of RMIT FactLab, said he sees the overall project as a "really important public service".

“If we can play a role in preventing the dissemination of misinformation on social media that has the potential to mislead or harm, then we see that as providing a really critical service,” he added.

“A continuing focus of our work is to identify the super spreaders of misinformation and the ecosystems in which they operate. High impact misinformation disrupts evidence-based public policy and debate and so it is crucial we gain a better understanding of what drives this.”

This appointment to Meta’s group of third-party fact-checkers gives RMIT FactLab a fair amount of power when it comes to verifying claims on social media. For instance, when RMIT FactLab rates and reviews a social media post in Australia that has been identified as false, Meta reduces the reach of the content so that fewer people can see it. Warning labels are also applied to false or misleading content, which you might be familiar with.

RMIT FactLab will begin fact-checking in partnership with Meta on March 21, in the lead-up to the 2022 Federal Election.