Fake accounts, old videos and rumors fuel chaos around Gaza hospital explosion

An aerial view of the complex housing the Ahli Arab hospital in Gaza City after an explosion on the hospital grounds that killed hundreds, according to Palestinian officials. Unraveling the facts behind the explosion has been made difficult by swarms of social media accounts spreading false information. (Shadi Al-Tabatibi / AFP via Getty Images)

Journalists and researchers are still piecing together a full picture of what caused a massive explosion at the Al Ahli Arab Hospital in Gaza on Tuesday. The blast killed hundreds of people, many of whom were reportedly sheltering from bombardment elsewhere.

Almost immediately, claims and counterclaims flew about who was responsible. Many initial news stories reported it as an Israeli airstrike, citing the Palestinian health ministry. Israel denied the accusation and said it was caused by a misfired rocket launched by a Palestinian militant group. On Wednesday, the U.S. backed up Israel's claim, based on its own analysis of "overhead imagery, intercepts and open source information."

As more evidence has emerged, including photos of the blast site and videos from the time of the explosion, the majority of independent analysts say the damage is not consistent with a standard Israeli airstrike.

But in the immediate aftermath, the shifting accounts in news outlets and the rapid spread on social media of unverified information, old videos and bogus eyewitness accounts fueled speculation, suspicion and outrage. Experts say the confusion is also making it more difficult to establish accountability for the tragedy.

Even before evidence was available and fully assessed, many people had already made up their minds about whether Israel or Palestinians were to blame for the carnage. Protests broke out across the Middle East and a planned summit between President Biden and Palestinian, Egyptian and Jordanian leaders was canceled.

"There really so far does appear to be a flood of misinformation in a very short time, and in a way that's having a material impact on the diplomacy around the conflict, on the mass mobilization and protests, some of which have the ability to lead to violence," said Daniel Silverman, a political science professor at Carnegie Mellon University who studies war and misinformation. "It's hard to argue misinformation isn't a central story here, and a really consequential one."

Recycled videos, fake accounts

Soon after news of the explosion broke, videos began circulating online — but some did not actually show the incident. Israel's official account on X, the platform formerly known as Twitter, posted a video it said showed the explosion was caused by a Palestinian rocket — but the post was edited to remove the video after a New York Times journalist noted its timestamp was well after the blast.

Another much-viewed video claiming to show the hospital blast was first posted in 2022, in what's become a common tactic of recycling and misrepresenting conflict footage.

Amid the chaos, some social media accounts seized the opportunity to push their own narratives and gain followings.

"In the time between something happening and us having a really good assessment of what happened, there are a lot of people who will seek to use this situation — if they can make you believe something about it — that is absolutely to their benefit," said E. Rosalie Li, a researcher and founder of the Information Epidemiology Lab.

That has been particularly pronounced on X, where owner Elon Musk has made changes that have made it harder to identify credible sources and that favor engaging posts regardless of whether they are accurate.

On Tuesday, an X account purporting to be a journalist at Al Jazeera claimed to have seen eyewitness evidence that the hospital was hit by a Hamas rocket. Al Jazeera disavowed the account, saying it had no journalist by that name. A quick perusal of the account's posts showed that until very recently it had been posting about Indian politics and trolling Pakistan's cricket team.

The account was eventually taken down, but not before it rapidly gained followers and was shared by other large accounts, including by a conservative national security group in the U.S. (X responded to a request for comment with an email auto-reply saying "Busy now, please check back later.")

Musk's changes to X fuel rush to monetize false information

Many of the unverified or bogus claims about the hospital explosion, as well as other misleading narratives about the Israel-Hamas war, are being made and amplified by X accounts carrying checkmarks. The checkmarks used to signal that an account was who it said it was, but under Musk, anyone can pay an $8 monthly subscription fee to get one. Accounts with checkmarks are boosted on the platform and are eligible to earn advertising money if their posts get enough views.

Critics say the arrangement incentivizes posting regardless of truth and, at worst, enables bad actors to monetize false information.

"The way that the platform has been shifted just rewards, encourages, incentivizes and amplifies the bulls***," said John Scott-Railton, a senior researcher at the University of Toronto's Citizen Lab.

One post from a checkmarked account contained a screenshot of a fake Facebook page appearing to show Israel's military claiming credit for the attack. It received more than a million views. The post has since been deleted, but many others using the same language and the same screenshot remain on X.

According to NewsGuard, a company that rates the reliability of online news sources, nearly three-quarters of the 250 most-engaged posts on X promoting false or unsubstantiated narratives about the conflict were made by accounts carrying subscription checkmarks.

Other changes under Musk, such as no longer displaying headlines on links posted to X, along with the exodus of many users, including journalists, experts and some media outlets (NPR among them), are also making it harder to vet much of the information on the platform.

Kolina Koltai, of the open source investigations group Bellingcat, is among the researchers sifting through video and images for clues about what caused the hospital explosion. She and her colleagues have been cautious about making any declarative statements before they have a fuller understanding. She emphasizes that this work takes time and patience.

"We want answers right away. And sometimes we don't have answers right away," she said. "In times like this, where you can't take the slow, methodical work that usually [open-source investigation] requires, it could have really dangerous repercussions."

That hasn't stopped some accounts on X that claim to do open-source investigations from rapidly pushing out definitive takes that later turn out to be wrong.

"Right now [X] is an absolute misinformation and disinformation crisis, and that environment is uniquely unhelpful to getting to a shared understanding of what happened and trying to make sure there's accountability around what looks like a pretty clear disaster," Scott-Railton said.

Copyright 2023 NPR. To see more, visit https://www.npr.org.

Shannon Bond is a business correspondent at NPR, covering technology and how Silicon Valley's biggest companies are transforming how we live, work and communicate.