Middle East

His mother waved him off to school. He didn’t return—and AI may be to blame

Mikaeil Mirdoraghi was killed by the US strike on his school in Minab, Iran. The use of AI in military operations is now in question

April 03, 2026
Image: Alamy/Prospect; illustration by Prospect

On the morning of 28th February, Mikaeil Mirdoraghi, a nine-year-old pupil at the Shajareh Tayyebeh primary school in Minab, a city in southern Iran, left for school. Standing at the top of the stairs in the block of flats where he lived, he asked his mother: “Take a picture of me.” As he turned around to wave goodbye, his face framed by thick-rimmed eyeglasses and the lanyard that held them around his neck, Shakiba Derykund, 31, captured the small gesture on her phone camera.

She didn’t know this would be the last photo she would take of her little boy, who would become the face of the war engulfing his country. That day, the month-long sabre-rattling between the Iranian regime and the US government finally culminated in a declaration of war. Within hours of taking the picture, Derykund received a call informing her that her son’s school had been bombed.

The school was among the first few targets of the US and Israel’s joint operation. At least 168 people, 110 of them children, were killed, according to the Iranian government. Several independent investigations, including by media and human rights organisations such as Amnesty International, have concluded that it was a US missile that killed Mikaeil Mirdoraghi and his schoolmates. Pete Hegseth, the US secretary of war, has told the press that his department is investigating.

When the US began integrating AI into its military around 2017, human analysts remained central to the equation, and every expert I have talked to has emphasised the dangers of “automation bias”. The Pentagon first used AI in Project Maven, developed with the tech firm Palantir, which helped analysts process large amounts of data. Later, it used Raven Sentry, an AI warning system, to predict insurgent attacks in Afghanistan, reportedly with 70 per cent accuracy.

But as the use of military AI becomes mainstream, experts fear that human oversight is being phased out.

Last year, the director of the US’s National Geospatial-Intelligence Agency boasted of US army units using AI tools to “make 1,000 high-quality decisions, choosing and dismissing targets on the battlefield, in one hour”. And this has continued in its operations against Iran: according to the Washington Post, the US military “leveraged the most advanced artificial intelligence it’s ever used in warfare” during the first 24 hours of the war. Military targets in the country were generated by the Maven Smart System, a command-and-control platform developed by Palantir, which used AI to sift through large amounts of surveillance data and intelligence gathered from satellites and other sources. 

And as further details have emerged since the strike, questions over the use of automated systems in military decision-making have mounted. Satellite images from 2013 show that the Minab school complex was once part of an Islamic Revolutionary Guard Corps compound, but was separated from it by a wall. Google Earth also shows a clinic and an outdoor play area in the vicinity as early as 2017. According to CNN, the US military used outdated information provided by the defence department to create the target coordinates for strikes on Iran.

In a statement after the strike, the NGO Human Rights Watch noted that US forces had in recent years improved their targeting processes to minimise civilian harm. This involved “… relying on multiple intelligence sources, teams to advise on the civilian environment, and confirmation that the target is a lawful military objective before a strike is approved”. But it could well be that the military’s AI tools, fed with outdated data and lacking human oversight, misidentified the school as a legitimate target.

The risks of outsourcing military decision-making to automated systems have arisen in other contexts too, with the use of AI to identify targets a particular focus in recent years. In 2023, Lavender, an AI programme used by the Israeli military, designated nearly 37,000 Palestinians as suspected Hamas militants, creating a “kill list” with very little human oversight or input. Similarly, a recent investigation by Airwars, a watchdog NGO, confirmed that AI targeting had been used in a 2024 US strike in Iraq that killed a civilian, the 20-year-old student Abdul-Rahman al-Rawi.

On the same day that the US launched its war on Iran, the White House was in dispute with Anthropic, creator of the Claude generative AI chatbot, which was being used by the Pentagon and the US military. Anthropic objected to the use of Claude in the US operation in Venezuela in January that led to the capture of Nicolás Maduro, pointing out that its terms of use do not allow Claude to be used either for violent ends, such as in the service of autonomous weapons, or for mass surveillance of American citizens. In response, the US’s defence department designated Anthropic a “supply chain risk”, and Trump announced on social media that the government would end its business with the firm, calling Anthropic a “Radical Left AI company run by people who have no idea what the real World is all about”.

Hours later, the US military used the very same tools to identify targets inside Iran, including the school in Minab. Located in Hormozgan province, one of Iran’s poorest, the school was a non-profit, chiefly providing education for children from economically deprived communities.

Elnaz Mohammadi, an Iranian journalist who spoke with Mikaeil’s mother, reports that the family originally came from Andimeshk in the southwestern Khuzestan province of Iran, and had moved 1,300km to the small city for his father’s job. Mikaeil loved music and art, and relished performing for an audience and a camera, as evidenced by the many videos of him singing that have been shared in Iran’s social media space, one otherwise marked by shutdowns and censorship.

His mother described her son’s generous spirit and angelic heart to an Iranian news broadcaster: “If he sensed someone was unhappy, he would tell them, ‘My name is Mikaeil. Mikaeil means God’s angel. If anyone has a wish, tell me so I can fulfil it.’”

“He’s among angels now,” she said.