Meta removed more than 180,000 pieces of content from Australian-specific Facebook and Instagram Pages and accounts last year for violating its Community Standards on harmful health misinformation.
That 180,000 figure was up from 110,000 in 2020. Meta said Australians also benefitted from content it removed in other countries; globally, the number tipped 11 million.
Meta also runs a COVID-19 information hub on its platforms, which it said received over 3.5 million visits from Australian users in the fourth quarter of 2021 (October to December). From the beginning of the pandemic to June 2021, Meta removed over 3,000 accounts, Pages and groups for repeatedly violating its rules against spreading COVID-19 and vaccine misinformation.
The figures were revealed in a transparency report from Meta, focused on Australia and published by the Digital Industry Group (DIGI) as part of its guardianship of the Australian Code of Practice on Disinformation and Misinformation.
In February 2021, Google, Microsoft, TikTok, Twitter, Facebook and Redbubble signed on to DIGI’s voluntary code of practice, which is aimed at combating the spread of misinformation and disinformation in Australia. Since launch, the code has gained two further signatories, Adobe and Apple.
Under the Australian Code of Practice on Disinformation and Misinformation, signatories have committed to safeguards to protect against online disinformation and misinformation, including publishing and implementing policies on their approach, and providing a way for their users to report content that may violate those policies.
Part of the commitment to the code is publishing transparency reports on the work done by each company on their respective platforms.
“If we can increase understanding of these complex challenges over time, then industry, government, civil society and academics can all continuously improve their policies and approaches,” DIGI said.
Despite revelations earlier this month that Meta knew exactly what it was doing when it removed emergency services from Facebook in Australia during its pulling of news from our feeds, the report touted the launch of Local Alerts, a feature that lets first responders communicate directly with Facebook users about urgent announcements such as COVID-19 outbreaks. The report also praised Meta’s efforts in launching a climate science information hub, sponsoring campaigns and surveys, giving cash to NGOs, and partnering with Australian government agencies and other local policy institutes to tackle inauthentic behaviour on its platforms. Meta also works with 80 fact-checking organisations globally, which is good to know.
Meta also said it will fund training for Australian journalists on “how to identify and prevent amplifying mis- and disinformation” and pursue new areas of research into misinformation and disinformation, including the media literacy of First Nations peoples.
Meta’s report ran to 49 pages, mostly listing the initiatives it has in place around the 45 commitments it made under DIGI’s code. If you use Facebook or Instagram, chances are you already know about a lot of these, including the notice that links to its COVID-19 information hub and the warning labels on content it hasn’t removed but has fact-checked. There are also notices in Messenger, for example, that tell you when something has been shared around quite a few times.