The Select Committee on Social Media and Online Safety opened for submissions in December. It’s the committee meant to look into the range of online harms Australians may face on social media and other online platforms. And it’s the inquiry Craig Kelly has been appointed to.
While the inquiry has the opportunity to dive into the potential impacts of online harms on the mental health and wellbeing of Australians, and the extent to which algorithms are used by these platforms to permit, increase or reduce those harms, Thursday’s hearing was overshadowed by Kelly questioning why he, and those who agree with him, are being banned.
Thursday’s hearing saw representatives from Google, Meta and TikTok, three of the largest, if not the largest, social media platforms and information organisers used by Australians. A number of MPs put questions to the representatives around misinformation, mostly COVID-related, because obviously, but Kelly’s questioning had a loud agenda to it.
Facing Kelly’s questioning first was Lucinda Longcroft, head of government affairs and public policy for Google in Australia and New Zealand. She was asked to define ‘COVID medical misinformation’. Kelly was particularly interested in where the line lies between misinformation and a genuine, valid opinion.
She told Kelly that Google’s COVID-19 misinformation policy was developed at the start of the pandemic and that it is quite detailed.
“It takes into account very clear guidelines from global authorities, health organisations, as well as trusted medical and scientific authorities and governmental authorities,” she said. “It is constantly reviewed and revised as scientific and medical information is updated.”
On what constitutes misinformation, Longcroft said it’s the likes of COVID denial or claims such as ivermectin being a treatment. But Kelly interjected with words from a statement that said something or other about the regular use of ivermectin being effective. He read from it and held it up as his example of Google still blocking a study despite it being peer-reviewed.
“I wouldn’t speak to a particular example,” Longcroft said in response, her frustration clearly growing. She added that the context in which certain information is published on Google is taken into account, too.
Kelly declared Google’s standards deny people who have ‘alternate views’ and ‘diverse opinions’; Longcroft pointed again to official sources being, well, the trusted source.
He kept going, though, saying Google was causing harm by denying people access to information on ivermectin being ‘good’. He asked, “who checks your fact-checkers?” Longcroft then rattled off the organisations Google partners with.
Next up was Meta, with Mia Garlick, director of policy for the company formerly known as Facebook in Australia and New Zealand.
Kelly wasted no time reminding everyone his Facebook account was removed and that he previously accused the company of engaging in grossly “improper conduct”. He tried to trap Garlick into saying that members of Parliament need to have a Facebook account to do their civic duty; basically, his argument was that Facebook offers members of Parliament an important tool in the kit bag for performing their duties as an MP.
“I think people use our services in a range of different ways,” a similarly frustrated Garlick replied. “Certainly it is a tool they can use.”
She clarified that rules surrounding harmful information and misinformation are applied regardless of who the individual is, but Kelly argued that holding opinions contrary to “government medical bureaucrats” doesn’t warrant being thrown off the platform.
But like Longcroft, Garlick listed the likes of the WHO and TGA as trusted sources of information.
Kelly went on to discuss boosters and kids, but Josh Machin, Meta’s head of public policy for Australia, said claims about a potential treatment that discourage people from being vaccinated, or from taking the globally accepted steps to protect themselves from the virus, are removed from Facebook (and Instagram, ‘cause let’s not forget that’s a Zuckerberg-Meta platform, too).
“We don’t allow people to make definitive claims about alternate treatments or cures,” he added.
But Kelly isn’t convinced that this isn’t “silencing one side of the debate”.
Kelly denies he was spreading misinformation about COVID-19, previously saying, “It is not misinformation if you have a difference of opinion”.
With TikTok also on the bill for Thursday’s inquiry, Kelly wanted to know why his video was removed from their platform. That video, by the way, promoted protesting against lockdowns.
Julie de Bailliencourt, global head of public policy for TikTok, essentially told Kelly that the guidelines dictate what is and isn’t allowed, that TikTok is conservative with its banhammer and that people can dispute a takedown.
Anyway, that’s what you missed by not tuning into the Craig Kelly show, I mean the Select Committee on Social Media and Online Safety’s January 20 inquiry.