Australia’s eSafety Commissioner Julie Inman Grant said Elon Musk has created a “perfect storm” by significantly reducing safety and public policy staff for his social media platform X.
A transparency report released by the eSafety Commissioner has revealed massive cuts to staff at X Corp., both in Australia and worldwide, whose work was dedicated to online safety and regulation on the platform.
Musk acquired the social media platform Twitter in October 2022 and renamed it X in July last year. Since the acquisition, Inman Grant’s report found, X Corp.’s global Trust and Safety staff have been reduced by 30 per cent.
In the Asia Pacific region, including Australia, Trust and Safety staff were nearly halved, with 45 per cent let go.
The public policy teams for X Corp. around the world have also been heavily reduced since Musk took over in October 2022. The entirety of X’s public policy team in Australia was let go.
More than three quarters (78 per cent) of public policy staff globally were made redundant, including 73 per cent in the Asia Pacific region.
The eSafety Commissioner’s report revealed that from November 2022 to May 2023, 6,103 previously banned accounts were reinstated by Twitter. According to the Commissioner, these accounts were Australia-based.
Of those reinstated accounts, 194 had previously been suspended for hateful conduct violations. X Corp. told the eSafety Commissioner these accounts were not subject to any additional scrutiny after being reinstated on the platform.
Our latest transparency report shows information provided by X Corp., owner of social media platform Twitter/@X, revealing the extent of deep cuts the company has made to safety and public policy personnel, and gaps in its measures to tackle online hate: https://t.co/7knm2xwZyf pic.twitter.com/CHxAqBt2P8
— eSafety Commissioner (@eSafetyOffice) January 10, 2024
Julie Inman Grant, Australia’s eSafety Commissioner, said the “toxic” results of combining safety staff reductions with the return of previously banned accounts were almost inevitable.
“It’s almost inevitable that any social media platform will become more toxic and less safe for users if you combine significant reductions to safety and local public policy personnel with thousands of account reinstatements of previously banned users,” Inman Grant said.
“You’re really creating a bit of a perfect storm.”
Inman Grant raised particular concerns about the thousands of previously suspended accounts that had been reinstated as of May last year.
“If you let the worst offenders back on while at the same time significantly reducing trust and safety personnel whose job it is to protect users from harm, there are clear concerns about the implications for the safety of users.”
The eSafety Commissioner’s report said the reduction in staff globally has drastically slowed response times for users when they report online hate on X.
“Response times to hateful tweets have slowed by 20 per cent since the acquisition and response times to hateful direct messages have slowed by 75 per cent, with users not receiving a response for up to 28 hours,” Inman Grant said.
“We know that online abuse is frequently targeted at victims via services’ direct message features, with clear intent to cause harm.”
First Nations hate speech
According to a recent eSafety study, First Nations youth experience online hate speech at three times the rate of their non-Indigenous peers.
Online hate speech targeting Australia’s First Peoples increased markedly during last year’s Voice to Parliament referendum, in which more than 60 per cent of Australian voters voted No to constitutional recognition of Aboriginal and Torres Strait Islander peoples.
In June 2023, eSafety issued a legal notice to X Corp. under Australia’s Online Safety Act, requiring the company to explain what steps it was taking to meet the Australian government’s Basic Online Safety Expectations.
In its response to the legal notice, X Corp. said it had not formally engaged with any First Nations organisations between the start of the staff reductions in 2022 and May 2023, though it noted it had engaged with First Nations peoples over many years in the past.
Inman Grant said the lack of dialogue between X Corp. and First Nations peoples and organisations was a concern for online safety in Australia.
“Understanding nuance and the unique cultural context of Australian communities is important to ensure platforms can tackle the online harms that can manifest and damage local communities,” she said.
Islamophobia and anti-Semitism
Anti-Semitism and Islamophobia have seen a significant uptick in Australia, especially online, since October 7, 2023.
According to a report from the Islamophobia Register Australia, the Israel-Gaza war, dubbed “the Instagram war”, has incited dangerous hate speech online. Since Hamas attacked Israel on October 7, 2023, killing about 1,200 people, the Register has recorded a 1,300 per cent increase in incidents of Islamophobia.
A similar surge in anti-Semitism has occurred online since October 7. According to the US-based Anti-Defamation League, anti-Semitic content on social media showed a 919 per cent week-over-week increase.