AI fakes and recycled footage fuel disinformation surge in US-Israel war on Iran

The Daily Star looks into fake war stories thriving on social media
Abir Ayon
Tarek Hosen

On March 19, 2026, a fabricated photocard circulated the claim that Yariv Levin had been appointed Israel’s new Prime Minister following the alleged death of Benjamin Netanyahu. The post falsely asserted that Netanyahu was killed in an Iranian attack and that the US and Israeli governments were using AI to fake his survival to maintain military morale.

Despite being entirely baseless, the misinformation gained significant traction among users, garnering 6,800 likes, 735 comments, and 806 shares by the time of this report.

Just a day earlier, an AI-generated video circulated with the caption 'Severe Iranian attack on America,' showing missiles raining down on a city as people scrambled for their lives. By the time of this report, the post had been viewed 86,000 times, and many users were actively sharing it.

Over the past week, at least 38 pieces of misinformation related to the Iran–Israel conflict were identified by fact-checking organisations Rumor Scanner, Dismislab, and FactWatch.

The problem is not confined to the past week: misinformation has been circulating on social media since the onset of the Iran–Israel conflict. Between February 28 and March 6, The Daily Star documented a total of 64 disinformation posts related to the Iran–Israel war circulating across various Facebook profiles and pages. Of these, 22 were identified as AI-generated or deepfake content, where artificial intelligence was used to create fabricated or misleading visuals and narratives.

On March 5, a video began circulating, claiming to show an Iranian soldier launching a missile at the residence of Donald Trump. The footage depicted the US President fleeing the strike in panic. The video concluded with a narrator urging viewers to ‘like, comment, and share’ to help fund the missile costs, specifically asking for the comment ‘Victory for Iran’.

Despite the outlandish nature of the claim and the evident use of AI to generate the imagery, the post achieved explosive reach, drawing over 66,000 engagements, with many users following its call-to-action in the comments.

Of the remaining content, 31 were video posts, a significant portion of which were misleading, presenting old or previously circulated footage as if it depicted current events. The list also included six photocards and five text-only posts.

These deceptive narratives were propagated through 51 unique profiles and pages, of which 10 accounts explicitly identified themselves in their bio as 'Media' entities, lending a false sense of credibility to the disinformation.

A video was posted on March 3 from the Facebook page of a broadcasting channel named “Channel One News,” claiming in its caption that “Iranian drones are raining down on Tel Aviv’s sky.” The video cited “Source X” as its source. At the time of writing this report, the video had garnered around 60,000 views.

However, our investigation found that the video was generated using artificial intelligence (AI). Closer observation shows several visual inconsistencies commonly found in AI-generated videos. These include unrealistic depictions of missile movements, the density of smoke, and exaggerated explosions.

Using Google’s reverse image search, The Daily Star traced the footage to its original source, an Instagram account named “Paralelverse net.” The caption of the original post clearly states that the video was created using AI purely for entertainment purposes.
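Reverse image search tools of this kind generally work by comparing compact "perceptual hashes" of images rather than exact pixels, so that a re-encoded or slightly altered copy of old footage still matches the original. The following is a minimal, self-contained sketch of the general idea using a simple average hash on toy grayscale frames; it is an illustration of the technique, not the actual algorithm behind Google's service.

```python
from statistics import mean

def average_hash(frame):
    """Perceptual 'average hash': one bit per pixel, set when the
    pixel is brighter than the frame's mean brightness."""
    pixels = [p for row in frame for p in row]
    avg = mean(pixels)
    return [1 if p > avg else 0 for p in pixels]

def hamming(h1, h2):
    """Count of differing bits; a small distance suggests the
    two frames show the same image."""
    return sum(a != b for a, b in zip(h1, h2))

# Toy 4x4 grayscale "frames" (values 0-255). A re-uploaded copy of
# old footage is usually re-encoded, so pixel values drift slightly,
# but the bright/dark pattern (and hence the hash) stays the same.
original  = [[10, 20, 200, 210],
             [15, 25, 205, 215],
             [12, 22, 198, 208],
             [18, 28, 202, 212]]
reupload  = [[12, 18, 202, 208],   # same scene, re-encoded
             [17, 23, 207, 213],
             [14, 20, 196, 210],
             [20, 26, 204, 214]]
unrelated = [[200, 10, 200, 10],   # a different image entirely
             [10, 200, 10, 200],
             [200, 10, 200, 10],
             [10, 200, 10, 200]]

print(hamming(average_hash(original), average_hash(reupload)))   # 0: a match
print(hamming(average_hash(original), average_hash(unrelated)))  # 8: no match
```

In practice, fact-checkers run this kind of comparison at scale against archives of previously published footage, which is how a clip from years earlier can be flagged when it resurfaces with a new caption.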

The Daily Star has also confirmed that several Facebook pages claiming to operate as media outlets, including “VOD World International,” “Face the People News,” “IBN News,” and “Jagoron News,” have been spreading AI-generated videos and zombie content.

A Facebook page named “Shipon Islam” was also among those spreading misinformation. On March 4, the page posted a video with the caption claiming, “Alhamdulillah, finally the traitor bodyguard of Ayatollah Khamenei who gave Khamenei’s location to Israel has been brought out publicly by the Iranian army on the order of Khamenei’s son.”

However, the investigation found that the same video had previously been posted on X (formerly Twitter) on April 25, 2025. The caption of that post, written in Arabic, stated that assassin Taysir Mahfouz was confronting victims in Damascus’s Mezzeh district. This indicates that the video circulated with the claim that Ayatollah Ali Khamenei’s bodyguard was publicly exposed is not related to Iran at all. In reality, the footage shows the aftermath of Taysir Mahfouz’s arrest in Damascus, Syria, from nearly a year earlier.

Two days later, the same account posted another video claiming, “Alhamdulillah, the US warship Abraham Lincoln has been destroyed by an Iranian drone attack.” However, the investigation revealed that the footage does not show damage to the aircraft carrier USS Abraham Lincoln. Instead, it depicts a fire that broke out aboard the USS Bonhomme Richard at the San Diego Naval Base in California in July 2020.

Another page named “Chetona News” was also identified as a major source of misinformation. Most of the posts published by the page appear to be AI-generated, with at least six such posts identified by The Daily Star (1,2).

In one video post, the page claimed, “Iran has shot down America’s pride, the B-2 bomber, and is dragging it away.” However, the investigation found that the video was generated using artificial intelligence.

One such post featured an 11-second video showing extensive destruction, with the caption claiming, “Iran will wipe Israel off the map. This is the current scene in Tel Aviv, Israel.” However, the investigation found that the footage was not from Tel Aviv. Instead, it shows destruction from the 7.4-magnitude earthquake that struck the Pazarcık district of Turkey in February 2023.

These Facebook posts have spread several types of rumors, with one of the most prominent being related to the death of Iran's Supreme Leader, Ayatollah Khamenei. 

On March 5, 2026, a video was circulated with a ‘just-in’ caption, claiming to show recent footage of Ayatollah Ali Khamenei to debunk rumors of his death. However, investigations traced the same footage back to 2014, meaning a video more than a decade old was presented as current. By the time of this report, the video had been viewed 7.4 million times, while the post had garnered 324,000 reactions and 5,700 shares.

A number of posts have been identified that claim Khamenei is still alive. These claims were made using deepfakes, text-based posts, photocards, or by presenting old videos as current ones (1,2,3,4).

Several posts have also claimed that Israel’s Prime Minister, Benjamin Netanyahu, has been killed (1,2,3,4). A photo was posted stating that “Israel’s head of state (Benjamin Netanyahu) was killed in an Iranian missile attack.” However, AI-detection tool Hive Moderation identified the image as 99.9 percent AI-generated.