Australian regulator demands social media firms disclose anti-terrorism efforts
By Byron Kaye
SYDNEY (Reuters) – An Australian regulator has sent legal letters to social media platforms from YouTube, X and Facebook to Telegram and Reddit, demanding they hand over information about their efforts to stamp out terrorism content.
The e-Safety Commission said it was concerned the platforms were not doing enough to stop extremists from using live-streaming features, algorithms and recommendation systems to recruit users.
Since 2022, the regulator has had the power to demand big tech firms give information about the prevalence of illegal content and their measures to prevent that content from spreading. Failure to do so can result in hefty fines.
e-Safety Commissioner Julie Inman Grant said Telegram was the platform most used by violent extremist groups to radicalise and recruit.
The Dubai-based messaging service, which a 2022 OECD report ranked No. 1 for frequency of what it described as terrorist and violent extremist content, did not immediately respond to a Reuters request for comment.
“We don’t know if they actually have the people and resources in place to even be able to respond to these notices, but we will see,” Inman Grant told Reuters in an interview. “We’re not going to be afraid to take it as far as we need to, to get the answers we need or to find them out of compliance and fine them.”
YouTube, ranked second for violent extremist content, “has the power through their clever algorithms to spread propaganda broadly … in a very overt way or a very subtle way that sends people down rabbit holes,” she added.
Subjects considered terrorism ranged from responses to the wars in Ukraine and Gaza to violent conspiracy theories to “misogynistic tropes that spill over into real-world violence against women”, Inman Grant said.
The regulator has previously sent legal letters to platforms seeking information about the handling of child abuse material and hate speech, but its anti-terrorism blitz has been the most complex because of the wide range of content and methods of amplifying content, she said.
Elon Musk’s X was handed the first e-Safety fine in 2023 over its response to questions about its handling of child abuse content. It is challenging the $386,000 fine in court.
The commission targeted Telegram and chat forum website Reddit for the first time in this round of legal letters, which were sent on Monday. A white supremacist who killed 10 Black people in Buffalo, New York, in 2022 has said the platform contributed to his radicalisation, according to the commissioner.
X and YouTube owner Alphabet were not immediately available for comment.
A spokesperson for Facebook owner Meta said the company was reviewing the commission’s notices and that “there is no place on our platforms for terrorism, extremism, or any promotion of violence or hate”.
A Reddit spokesperson also said terrorism content had no place on the platform and “we look forward to sharing more information on how we detect, remove, and prevent this and other forms of harmful content directly with the e-Safety Commissioner”.