Since the 7 October attack on Israel, the following claims have gone viral on X, fuelled by verified accounts: the Israeli embassy in Bahrain is on fire; Ukraine is providing weapons to Hamas; Palestinian children are faking injuries.
None of them are true.
So what? Today marks a year since Elon Musk finalised a $44 billion deal to buy X, formerly known as Twitter. During that time Musk has:
- dismantled his trust and safety team and fired content moderators;
- removed the verification processes that signified authentic accounts and allowed any user to become “verified” for $8 a month, which also boosts their posts; and
- rolled out a “creator” scheme that allows verified users to make money from their posts.
This has enabled self-styled “news” accounts and partisan influencers to spread disinformation and graphic videos during the Israel-Hamas conflict. Giancarlo Fiorella, director for research and training at Bellingcat, the online investigations group, says changes made by Musk have created “a race to the bottom”.
The Great X Pivot. In January 2023 @censoredmen was a new X account with a singular focus: defend the self-proclaimed misogynist Andrew Tate after he was arrested in Romania. The account now has 657,000 followers and is a popular information source for the Israel-Hamas conflict (Tortoise is not linking to these accounts due to graphic content).
With a moodboard of highly emotive and graphic videos, the account has gained more than 200,000 followers in the past three weeks. Tortoise analysis found that its last ten posts about the conflict, as of Wednesday evening, have a total of 142,000 “likes”, a signal of user engagement.
The last ten posts from CNN’s breaking news account, which has 64 million followers, have just 9,600.
The problem. The anonymous account regularly shares false information under the guise of being a verified source of “alternative media”. It has recycled footage from the Syrian civil war, falsely claimed that MSNBC has suspended Muslim anchors from its network and incorrectly announced that there would be a five-hour ceasefire at the Rafah crossing.
Tip of the iceberg. It is part of a network of verified users that share each other’s content and appear to prioritise engagement above the truth. @ShaykhSulaiman, who has 185,000 followers and describes himself as an investigative journalist, made a similar pivot from Andrew Tate to the Israel-Hamas conflict. He posted a message about Palestinian children killed in the current conflict using an image of dead children from Syria – he later issued a clarification saying his intent was “to present a general depiction of child mortality”.
And the rest. Vocal pro-Russian accounts have also pivoted to the Israel-Hamas conflict with posts that have repeatedly shared misinformation. They include @MyLordBebo (229k followers), who has previously posted a deepfake video of Ukraine's Volodymyr Zelensky, and @JacksonHinklle (1.6m followers), a former climate activist who wished Vladimir Putin a happy birthday on the day of the Hamas assault against Israel.
It’s not possible to calculate how much money these accounts could be making through X’s revenue-sharing scheme:
- Payouts are based on verified impressions of ads displayed in post replies – rather than direct engagement.
- In July, Andrew Tate and far-right user Ian Miles Cheong claimed they were paid tens of thousands of dollars by X for their posts.
- Verified users can also make money through subscriptions on X, or by encouraging their followers to join them on other platforms or buy merchandise.
The response. X CEO Linda Yaccarino defended the platform’s response to the Israel-Hamas conflict in a letter to the European Commission. She said that X was committed to transparency and safety and highlighted the use of community notes, the platform’s crowdsourced fact-checking tool, which she said were seen tens of millions of times in the first days of the conflict.
But… these notes can fall short. According to NewsGuard, which monitors online information, community notes didn’t appear on 68 per cent of posts containing misinformation about the conflict. “A community-based fact-checking tool is only as good as how fast it can react in real-time to a war,” says Jack Brewster, an editor at NewsGuard. “In this case, it’s just a game of whack-a-mole.”
X responded to a request for comment with an automated message: busy now, please check back later.
Also, in the nibs
New from Tortoise
What does Saudi Arabia want?
Matt Chorley joins the Tortoise team to discuss Saudi Arabia’s role in the Israel-Hamas war and Keir Starmer’s position on the conflict. Plus: bankers’ bonuses and a look ahead to the general election.