
Ukraine Becomes World’s AI Weapons Laboratory

By Craig S. Smith of Eye on AI

I was in Ukraine in February and wrote this piece before the start of the war with Iran, but its implications are even more relevant today. My interest in lethal autonomous weapons dates back to my time with the National Security Commission on Artificial Intelligence, where full autonomy was debated but largely dismissed as ethically unacceptable.

But in practice, the step to full autonomy is smaller than it sounds. Once a human is no longer actively controlling a system and is only monitoring it with the option to intervene, the shift to removing that human entirely is incremental.

It’s similar to how Iran describes its nuclear program. Uranium enrichment for civilian energy is presented as benign, but once enrichment reaches reactor-grade levels, the remaining technical steps to weapons-grade material are a matter of time and intent, not capability.

It is becoming increasingly difficult to argue that fully autonomous weapons will not arrive. They follow naturally from realities already on the battlefield. What is easier to grasp is the fear they generate. Watch first-person-view footage of a quadcopter chasing a soldier to his inevitable death and the abstraction disappears.

Bundled against Ukraine’s subzero February chill, a man in a gray coat threw what looked like a gray model airplane into the pale blue sky. The buzzing of the drone’s propeller slowly faded as it climbed above snowy fields and barren hedgerows. It looked like a toy.

Oleksandr Liannyi was not playing, however. He was working on a way to make drones far deadlier than they are today.

“It’s mostly about accuracy of positioning, of how the navigation part will perform in different conditions,” said Liannyi, cofounder of NORDA Dynamics, which builds autonomous navigation and targeting modules for military drones.

Liannyi, his colleagues, and other Ukrainian teams have achieved partial autonomy, allowing drones to navigate to and strike human-selected targets on their own. The next step is far more controversial: fully autonomous drones, which could navigate to an active front, hunt for targets, and strike without human input. Empowered to make life-or-death choices, such drones would fundamentally change the nature not only of this war, but of all wars.

“The technology is very close,” Liannyi said later inside a battered white van at the tree line. He noted that a number of intermediate stages still need to be developed before such systems exist and that NORDA Dynamics continues to emphasize human approval in the loop when it comes to the strike decision.

Under International Humanitarian Law, humans can’t pass responsibility for killing to a machine.

But Liannyi argues that even if a human is legally required to approve a lethal strike, autonomous target acquisition will, at the very least, increase the number of drones a single pilot can manage. “The drone can notify you when it sees the target, and then you can pull up the picture and approve it, so you can control lots of drones simultaneously,” he said.

I had come to Ukraine, improbably, with a Silicon Valley startup founder to witness tests of his company’s humanoid robot in a combat setting. But because of its sensitive nature, the robot never made it out of its crate at the airport in Warsaw and, for the same reason, never got past the Polish-Ukrainian border in the middle of a snowy night. It was eventually sent back to California. So I began interviewing people about the growing autonomy of weapons in the current war. That led me to the white van on the edge of a snowy field in Western Ukraine – what the Ukrainians call a “polygon,” after the 19th-century European term for a military training ground.

Beside us in the van, a young blond man in a gray parka sat hunched over a screen, watching a video feed from the drone’s camera. He moved a small white box across the screen with his thumbs on the prongs of a drone controller until he spotted a distant tree and flipped a switch with his finger. The box turned green, a red bar at the top of the screen flashed “ENGAGE,” and he lifted his hands away from the controls as if to emphasize that the drone was now flying on its own.

Almost immediately, the drone banked toward the tree outlined on screen by the green glowing square and, within seconds, was hurtling toward it. A moment before the collision, the man took control of the drone again, sending it swooping back into the sky. “Oho!” he exclaimed. Another man in the van muttered in Ukrainian, “Duzhe kruto,” or “very cool.”

Liannyi and his colleagues were testing new control algorithms that can guide a drone to its intended target without human control, a necessity when pilots lose contact with their drones because the enemy has jammed the radio link. Most of these systems allow drones to fly in complete radio silence for the last half mile to two miles, depending on the weather and the cameras used. Once flying autonomously at roughly a hundred miles per hour, the drone is virtually undetectable by the enemy until it is too late.

Autonomy on a Circuit Board

Inside the drone’s plastic housing is a cheap computer chip soldered to a green circuit board modeled on the Raspberry Pi, a single-board computer originally designed to teach British schoolkids to code. These boards are imported from China, but Ukraine is now developing its own onboard AI, including homegrown boards built by dozens of local companies. NVIDIA’s more powerful Jetson Orin modules are used in some long-range, high-value drones, but they are expensive. Cheaper modules offer enough onboard AI to lock onto a target while keeping the unit cost low enough to lose in combat.

Currently, attack drones are still flown by a human operator, who uses a screen and controls to steer the aircraft, choose a target, and decide when to strike. With partial autonomy from companies such as NORDA Dynamics, the machine can take over the final phase of the attack. Once a human has picked the target and sent the drone toward it, onboard software handles the last stretch of navigation, avoiding obstacles and lining up the final approach. In practice, that means the person still decides who or what can be attacked, but the drone’s autonomy decides exactly how to get there and hit.
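The handoff described above, in which a human locks the target and onboard software flies the final stretch, can be pictured as a small guidance loop. The sketch below is purely illustrative and is not NORDA Dynamics’ actual algorithm; the function name, the state representation, and the turn limit are all invented for the example.

```python
import math

def terminal_guidance_step(pos, vel, target, speed, turn_limit):
    """One step of a simple pure-pursuit loop: bend the drone's velocity
    toward the locked target, bounded by a maximum turn rate.
    Illustrative only; real terminal guidance is far more involved."""
    # Desired heading points straight at the target.
    desired = math.atan2(target[1] - pos[1], target[0] - pos[0])
    current = math.atan2(vel[1], vel[0])
    # Smallest signed angle between current and desired heading.
    error = (desired - current + math.pi) % (2 * math.pi) - math.pi
    # Clamp the commanded turn to what the airframe can do this step.
    turn = max(-turn_limit, min(turn_limit, error))
    heading = current + turn
    return (math.cos(heading) * speed, math.sin(heading) * speed)

# Drone flying due east, target to the northeast: the velocity vector
# bends toward the target while speed is held constant.
vel = terminal_guidance_step(pos=(0, 0), vel=(40, 0),
                             target=(100, 100), speed=40, turn_limit=0.2)
```

Run repeatedly against updated position estimates, a loop like this converges on the target without any further operator input, which is what lets the drone finish an attack through jamming.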

Full autonomy would mean the drone, not a human, decides who or what to attack and carries out the strike on its own. The system would search for potential targets, decide which ones fit its programmed rules, and then launch and complete an attack without asking a person for approval.

Such lethal autonomous weapons, called LAWs, would allow warfighters to define a kill box: a geofenced zone in which autonomous drones could hunt, killing any person or destroying any vehicle they find. The box could be as small as a crossroads or as large as 20 square miles of frontline terrain.
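In software terms, a kill box reduces to a containment test: is this detection inside the authorized zone? A minimal sketch, assuming an axis-aligned rectangle and local coordinates (real systems would use geodetic polygons; the class and field names here are hypothetical):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class KillBox:
    """Axis-aligned geofenced engagement zone. Illustrative only."""
    min_x: float
    min_y: float
    max_x: float
    max_y: float

    def contains(self, x: float, y: float) -> bool:
        """True if a detection at (x, y) falls inside the authorized box."""
        return (self.min_x <= x <= self.max_x
                and self.min_y <= y <= self.max_y)

# A zone the size of a crossroads: strikes are authorized inside,
# forbidden outside.
crossroads = KillBox(0, 0, 50, 50)
inside = crossroads.contains(25, 25)    # True: engagement permitted
outside = crossroads.contains(60, 10)   # False: outside the box
```

The check itself is trivial; everything hard about the kill box concept lives in what the drone is allowed to do once the check passes.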

The Legal Gray Zone

To turn the kill box into reality, drones must be able to distinguish a soldier from a medic, a fleeing civilian from a retreating infantryman, a tank from a tractor, in rain and snow, day and night, and do it well enough so that commanders and lawyers are willing to let them fire without a human making the final decision.

Neither International Humanitarian Law nor Ukrainian law specifically prohibits fully autonomous weapons. They require only that weapons distinguish soldiers from civilians and medics, avoid excessive civilian casualties, and allow humans to halt or adjust attacks as battlefield conditions change. Even U.S. law and military doctrine require only that autonomous weapons be designed so commanders and operators can exercise “appropriate levels of human judgment over the use of force.”

Already, Western officials have moved from talking about a human “in the loop,” meaning a person must actively approve each strike, to a human “on the loop,” meaning a person supervises the system and can intervene to stop an attack. Because of “automation bias,” the tendency for humans to trust machines that have proven accurate in the past, “on the loop” risks humans effectively rubber-stamping machine decisions to keep up with the pace of battle.

But autonomy opponents warn about algorithmic errors or hacks that could propagate at machine speed.

“The risks they pose to civilians, friendly forces, and human security in general are staggering,” Dr. Peter Asaro, the Vice Chair of Stop Killer Robots, wrote in an email. “While it may seem expedient in a desperate situation, we need to consider the long-term ramifications of developing these technologies.”

The Asymmetry

Aleksandr Palamarchuk, a soldier with the Azov Brigade who goes by the call sign Paradise, appears as a ghostly image on the laptop screen in my Kyiv hotel room to talk about where the technology is today. A virtual background of the aurora borealis hides any clues to his location, which he says is a research and development lab within a hundred miles of the front.

Azov Brigade is a Ukrainian National Guard special forces unit, formed in 2014 as a volunteer militia to fight Russian-backed forces in Donbas. It has since become one of Ukraine’s fiercest combat units while remaining controversial because of its early ties to far-right groups.

“You need to be 100 percent sure it’s an enemy,” Palamarchuk said, noting that any civilians killed are Ukrainian because the war is primarily on Ukrainian soil. (Russian civilians in border regions have also died from Ukrainian strikes, but in far smaller numbers.)

However, Russia doesn’t play by the same rules. A recent report by the Institute for the Study of War, a U.S. nonprofit funded by private donations, concluded that Russian drone strikes against unmistakably civilian targets, from pedestrians to apartment blocks, are meant to depopulate frontline-adjacent areas. It also argues that this approach is being institutionalized in Russian doctrine and practice, creating a frontline red zone where any movement or vehicle is treated as a legitimate target.

Russia has shown a willingness to kill civilians since the outset of the war, from the indiscriminate shootings in the town of Bucha, just west of Kyiv, to continued strikes on residential buildings in the capital itself.

For Palamarchuk, that is the core asymmetry of the war. “It’s much easier for them to make absolutely autonomous missions, because they don’t care about the target type or where they hit,” he said.

Palamarchuk said Ukraine is seeking to counterbalance that asymmetry by developing AI that can reliably distinguish legitimate military targets from civilians. He said Azov is experimenting with drones that can fly entire missions by themselves.

“You just place the drone on the ground, then you create a mission for it, and it takes off by itself,” he said. “Then AI models can recognize targets by themselves.”

Ukraine is being forced to innovate faster than any other army on Earth and is restructuring its military around unmanned operations, including giving drones full autonomy. It is planning for a 15-kilometer-wide zone along the front in which machines, not infantry, do most of the work.

The First Robot Assault

In early December 2024, a Ukrainian brigade executed what analysts describe as the first successful unmanned air and land assault in military history, against Russian positions in the Kharkiv region. The dawn attack was coordinated by remote operators who simultaneously deployed an integrated swarm of aerial and ground robots. Kamikaze ground vehicles and robotic machine-gun platforms advanced on the trenches, supported by heavily armed quadcopter bombers and smaller, nimble kamikaze drones acting as close-air support, while dozens of reconnaissance drones provided a total operational overview. The intense, two-hour robotic strike caught Russian forces off guard and destroyed the targeted positions.

Ukraine is still scaling command and control tools to make that repeatable.

At the same time, Ukrainian forces are running an enormous, iterative experiment in unmanned and AI-enabled warfare, with constant adjustments by drone makers based on feedback from the front lines.

Kyiv has formalized this role through its “Test in Ukraine” policy, which invites companies to push new drones, ground robots, missiles, and other systems straight into combat, then feed performance data back to industry and governments.

Western and particularly U.S. firms are among those whose systems are being tested on the battlefield — everything from long-range strike drones to maritime and loitering drones that wait in an area until a target appears — sometimes with very public failures.

Altius loitering munitions, built by U.S. manufacturer Anduril, repeatedly crashed or failed to hit targets and proved highly vulnerable to Russian electronic jamming. They were ultimately withdrawn from use by Ukrainian forces in 2024. Anduril says it has since revised the Altius system based on Ukrainian feedback, and that updated versions have been redeployed with some Ukrainian units.

Ukraine’s breakneck cycle of battlefield experimentation offers a trove of operational data about what works, what fails, and how adversaries adapt. The country’s Ministry of Defense has created a Universal Military Dataset, among the largest of its kind in the world, which can be used to train other AI tools in Ukraine’s defense arsenal. The dataset contains more than two million hours of drone footage and millions of labeled military objects.

The ministry has also developed an AI system called Avengers, which processes live video streams, automatically detecting, classifying, and flagging enemy equipment. Ukrainian officials say this combination of scale and detailed labeling allows the system to recognize most Russian weapons in live video in just a few seconds.

Avengers is integrated into the country’s command-and-control system so that AI-detected targets appear directly on tactical maps, passed almost instantly to drone pilots.
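Structurally, the flow described above is a pipeline: live frames in, classified detections out, high-confidence results pushed to the tactical map. The sketch below is a generic stand-in, not the Avengers architecture; the `Detection` fields, the stubbed classifier, and the 0.8 threshold are all assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str        # e.g. "tank", "truck"
    confidence: float  # classifier score in [0, 1]
    grid_ref: str      # position to plot on the tactical map

def classify_frame(frame):
    """Stand-in for a vision model. In this sketch a 'frame' is already
    a list of Detections; a real system would run inference here."""
    return frame

def process_stream(frames, min_confidence=0.8):
    """Yield only detections confident enough to flag for operators."""
    for frame in frames:
        for det in classify_frame(frame):
            if det.confidence >= min_confidence:
                yield det  # would be pushed to the command-and-control map

stream = [[Detection("tank", 0.93, "38U 455 678"),
           Detection("tractor", 0.41, "38U 455 690")]]
flagged = list(process_stream(stream))  # only the high-confidence tank
```

The interesting policy question is not the pipeline but the threshold: who sets `min_confidence`, and what happens to everything just below it.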

While publicly these systems are described as AI-enabled or semiautonomous, with humans nominally in the loop, the line separating that from full autonomy is blurring. A drone can decide to hit a tank, or a commander can pre-authorize that decision so thoroughly that the last human “yes” becomes more of a given than a true ethical barrier.

The Army of Drones

Much of this innovation was driven by Kateryna Chernohorenko, who served as Ukraine’s Deputy Minister of Defense for Digital Development from 2023 to 2025. She arrived at my hotel looking more like a student than a former government official, wearing sneakers and black pants with a striped dress shirt open over a white T-shirt. Her laptop was covered in defense-themed stickers. Her energy and creativity have made her integral to Ukraine’s war.

One of her ideas was the Army of Drones project, which has centralized procurement and standardized platforms, treating drones as standard equipment rather than ad hoc volunteer gear.

“There was a need to have a systemic look at drones’ capabilities and practice,” she said.

That project channeled civilian crowdfunding and volunteer innovation into a coordinated pipeline that supplies the military with thousands of reconnaissance and strike drones, sets technical requirements, and fields them where they are most needed. It also created training and certification tracks for operators, helping build a professionalized cadre of drone units rather than scattered, self-taught teams.

By setting standards, aggregating orders, and validating new concepts at the front, the Army of Drones has turned Ukraine into a live testbed for military drone innovation and influenced how other countries and defense firms think about scaling unmanned systems for modern, high-intensity warfare.

It has also created a thriving defense sector with hundreds of companies in Ukraine building drones that operate in the air, on the ground, or on water. A recent defense technology expo sponsored by Azov took place at Kyiv’s National Museum of the History of Ukraine in the Second World War, a Soviet-era bunker-like building embedded in the Pechersk hills overlooking the Dnipro river. Above it, a towering stainless-steel figure of Mother Ukraine rises hundreds of feet into the air, arms raised, a sword and a shield lifted over the city.

Inside, dozens of firms presented their products. Among the company representatives at the expo was Marko Kushnir, a director at the Ukrainian drone maker General Cherry, whose name refers to the fruit associated with the region where the company’s founders are from.

General Cherry is one of two Ukrainian companies selected to compete in the Pentagon’s Drone Dominance Program, a $1.1 billion initiative to field large numbers of cheap, effective one-way attack drones for American forces. Both General Cherry and Ukrainian Defense Drones Tech Corp. have demonstrated they can mass-produce drones on short notice. General Cherry is now in talks with several Persian Gulf states about supplying interceptor drones for the Iran war.

Kushnir visited me later in my hotel, bringing a General Cherry hoodie and other branded swag. He also brought an unarmed Bullet, a nearly three-foot-tall drone shaped like a rocket and built to hunt other unmanned aircraft.

The Bullet is built to knock out Russia’s long-range, fixed-wing kamikaze drones based on Iran’s Shahed and produced under license in central Russia’s Volga region. Known in Russia as the Geran, the rear-propeller drone has become one of Moscow’s primary weapons for striking Ukraine’s energy infrastructure and residential buildings.

“Our drone can understand that it’s a Shahed,” said Kushnir. “It can go to the target without any operator control.”

The Outsiders

Among the most prominent outsiders building for this new battlespace is former Google C.E.O. Eric Schmidt. His military drone company, Swift Beat, produces a line of drones with bee-inspired names. Its flagship is the Bumblebee, a low-cost AI-enabled kamikaze quadcopter that has logged thousands of combat flights against Russian targets in Ukraine. The drone uses onboard cameras and internal motion sensors to navigate by comparing ground features to maps stored in memory, allowing it to operate without GPS, radio signals, or a live data link. Once a pilot designates a target, the AI takes over.
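Navigating by comparing ground features to a stored map, as the Bumblebee is said to do, amounts to finding where the camera’s view best matches the map. A toy version, assuming tiny integer grids and a sum-of-squared-differences score (real systems use learned visual features fused with inertial data; the function and variable names are invented):

```python
def locate_patch(map_grid, patch):
    """Slide an observed ground patch over the stored map and return the
    (row, col) offset with the best match (lowest sum of squared
    differences). A toy stand-in for GPS-free map-matching navigation."""
    ph, pw = len(patch), len(patch[0])
    best_score, best_pos = float("inf"), None
    for r in range(len(map_grid) - ph + 1):
        for c in range(len(map_grid[0]) - pw + 1):
            score = sum((map_grid[r + i][c + j] - patch[i][j]) ** 2
                        for i in range(ph) for j in range(pw))
            if score < best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

stored_map = [[0, 0, 0, 0],
              [0, 9, 8, 0],
              [0, 7, 6, 0],
              [0, 0, 0, 0]]
camera_patch = [[9, 8],
                [7, 6]]
fix = locate_patch(stored_map, camera_patch)  # → (1, 1)
```

Because the position fix comes entirely from the camera and onboard memory, nothing here needs a radio link, which is the whole point: there is no signal to jam.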

Neither Schmidt nor Swift Beat would comment for this article.

Swift Beat also produces an AI-powered interceptor system designed to hunt and destroy Russian Shahed drones. Called Merops, after the genus of bee-eating birds, it fires fixed-wing drones from mobile launchers and uses onboard machine vision to track and physically ram targets, bypassing radio jamming.

Merops are now being deployed on NATO’s eastern flank. Romania has begun integrating mobile interceptor units into its short-range air-defense networks, and Poland is training military personnel on the system as part of a broader anti-drone shield.

The underlying parts – small minicomputers, commercial computer-vision libraries, visual-inertial navigation – are mostly dual-use technology rather than exotic military hardware. What is emerging in Ukraine is not only a new class of weapon, but a new production logic: autonomy assembled from cheap sensors, commercial computers, and battlefield iteration, then scaled fast enough to make a difference on the battlefield.

Five Levels of Autonomy

While Schmidt is the most prominent technologist building drones for Ukraine, people in the country point to Ukrainian entrepreneur Yaroslav Azhnyuk as the leading expert on autonomy in the drone race.

Azhnyuk is best known in Silicon Valley as the co-founder of Petcube, a startup that makes interactive pet cameras. After Russia’s full-scale invasion, he used his expertise in cameras that detect motion, interpret behavior, and stream video reliably across unstable networks to build AI-driven autonomous systems for drones.

He likens drone autonomy to the five levels of self-driving cars. “Level one is autonomous terminal guidance,” Azhnyuk explained over breakfast at a fashionable gastropub in central Kyiv. “You fly manually, you lock the target, and from that moment the drone can hit it autonomously under all conditions.”

Level two introduces autonomous bombing: the system calculates release timing and performs an escape maneuver. Level three is more controversial: autonomous target recognition and strike decision-making within a defined kill zone.

“The system scans what it sees, recognizes the target, reaches enough confidence, and initiates the strike,” Azhnyuk explained as he ate pork brisket with pink pickled onions.

Level four adds autonomous navigation from launch to the target area without radio or satellite guidance. Level five includes autonomous takeoff and landing, enabling reusable systems rather than one-way missions.
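Azhnyuk’s taxonomy maps naturally onto an ordered enumeration. The sketch below simply restates the five levels described above; the enum member names and the helper function are ours, not his.

```python
from enum import IntEnum

class DroneAutonomy(IntEnum):
    """Azhnyuk's five levels of drone autonomy (names are ours)."""
    TERMINAL_GUIDANCE = 1     # pilot locks target; drone finishes the hit
    AUTO_BOMBING = 2          # release timing and escape maneuver computed
    AUTO_TARGETING = 3        # recognizes targets and decides to strike
    AUTO_NAVIGATION = 4       # launch-to-target flight, no radio or GPS
    AUTO_LAUNCH_RECOVERY = 5  # takeoff and landing, enabling reuse

def needs_human_strike_approval(level: DroneAutonomy) -> bool:
    """Below level three, a human still makes the strike decision."""
    return level < DroneAutonomy.AUTO_TARGETING
```

The ethically charged boundary sits between levels two and three: everything below it automates flying, while level three automates the decision to kill.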

In his framing, the ethical debate may invert. “Within five to ten years,” he said, “it may become unethical to use weapons without AI,” arguing that autonomous precision systems could cause less collateral damage than purely human-operated alternatives.

Baba Yaga

When Russia invaded in 2022, many Ukrainians pivoted into drone warfare. Pavlo Yelizarov, nicknamed Lasar, was a television producer who bought a smuggled agricultural drone and strapped an anti-tank mine to its undercarriage. That effort evolved into Lasar’s Group, one of the military’s most formidable drone formations.

It was the first to put Starlink satellite terminals onto heavy bomber drones, allowing pilots to operate from secure rear positions via internet-based control links, sidestepping Russian jamming of radio frequencies. The arrangement effectively decoupled the pilot’s physical location from the drone’s, allowing the pilots to remain far in the rear — or indeed be based anywhere in the world.

The group has destroyed more than $13 billion of Russian military equipment, including tanks, each strike documented by onboard video. Its signature platform is a four-rotor heavy bomber that Russian troops have nicknamed Baba Yaga, after a witch in Slavic folklore. The drone, mounted with a satellite receiver from Elon Musk’s Starlink, can carry up to 5 kilograms of munitions and travel as far as 35 kilometers and back, often flying low, at treetop level.

Yet even as Lasar’s Group has refined remote piloting, some of its commanders are looking beyond radio, satellite or fiber-optic connections altogether to a day when drones operate without a human pilot at all.

A major named Yurii, who declined to give his family name for security reasons, oversees training and testing of new engineering solutions within Lasar’s Group. He came to see me in my hotel room wearing military fatigues and a name patch that read “Phoenix,” his radio call sign. He told me that, in his view, the next frontier of drone warfare is full onboard autonomy: once a drone is launched, he said, navigation, targeting, and execution will eventually be autonomous, with no need for a live communication link to a pilot.

“Connectivity can be jammed, so you’ve got to do all of that on the edge,” he said, sitting bolt upright, head shaved, a reddish beard fading to white at the tip. In other words, the drone must be able to see, orient itself, identify what matters, and act without relying on a distant operator or a remote server.

“This will help us to place our personnel far away from our enemy, without direct contact,” he said. “It will create a war of drones, not a war of humans.”

To move in that direction, Lasar’s Group is developing what Phoenix calls autonomy modules – standardized packages of hardware and software that can be attached to different airframes. “We are building drones, but we are also building the autonomy modules,” he said. The decision-making element is migrating into code.

The Cost

For now, it’s still a war of drones against humans, machines against men, with devastating consequences. Drones now account for over 70% of casualties on both sides.

At a rehabilitation hospital outside Lviv, I met Vyacheslav Kondrashenko, a soldier with Ukraine’s 93rd Separate Mechanized Brigade. A year earlier, he had been carrying a 15-inch-square quadcopter fitted with two sixty-millimeter mortars in the fiercely contested eastern reaches of Donetsk. As he emerged from his dugout into the open, a smaller Russian quadcopter, carrying a munition of its own, struck his right arm and exploded. The blast set off the mortar rounds he was carrying. When the smoke cleared, Kondrashenko – Slava, to his friends – had lost his right arm below the elbow and both legs above the knee. His remaining left hand was rendered useless.

“He was waiting for me,” Slava told me from his wheelchair. “I didn’t have a chance.”

The drone that hit him had been resting on the ground outside the dugout. Somewhere miles away, a Russian operator was watching the entrance through the drone’s video feed, delivered in real time through a fiber-optic cable as thin as fishing line, which had unspooled behind it, draping over fields and trees.

A few days after speaking with Slava, I stood outside the Garrison Church of Saints Peter and Paul in Lviv, the main house of worship for the city’s military. A priest in black-and-gold vestments appeared with a cross, followed by uniformed pallbearers bearing a black coffin on their shoulders. A military band played a funerary dirge.

There are funerals nearly every day in cities across Ukraine. This one was for Taras Novoselskyi, killed on his 47th birthday.

Ukraine’s cities, with their trams, baroque facades, and coffeehouses, can still seem improbably normal until a military coffin passes through. Then the war becomes visible again – not as a weapons system, or a software stack, or a theory of machine autonomy, but as a dead body being carried to the grave.

The procession moved with the choreography of grief. At the town hall, a lone bugler appeared in an upper window. He played “Il Silenzio,” the final call. People stopped to watch. Some crossed themselves. Others simply stood still.

The drive for full autonomy isn’t restricted to Ukraine. Russia has begun equipping its Lancet drone with machine-vision systems that can patrol a designated area, searching for vehicles or other targets that fit a predefined profile.

The war with Iran is accelerating the move toward machine-led killing. Israel has reportedly used AI-assisted targeting in its campaign against Iran, while the Pentagon says the United States is pushing to field swarms of low-cost attack drones and more autonomous systems of its own. Meanwhile, Ukraine has said it will share interceptor drones, training, and counter-drone expertise with the United States and Gulf partners.

There is no public evidence that terrorist groups are building such systems inside the United States. But the technology is spreading, the costs are falling, and U.S. officials have been warning that the homeland drone threat is growing.

I thought of a comment the entrepreneur Azhnyuk made at breakfast the previous day when I asked if the prospect of fully autonomous weapons frightened him. “What I’m terrified about is that we won’t get there as fast as the enemy does.”

* * * 

Tyler Durden
Thu, 04/02/2026 – 22:10

Allegations Of Pentagon “Casualty Cover-Up”: The Intercept

Well-known national security news site The Intercept has published fresh reporting alleging a Pentagon cover-up of mounting US casualties from Trump’s Operation Epic Fury. Speculation and questions have surged among the public and analysts because casualty updates from the Pentagon have been few and far between. The report also accuses the Pentagon of shoddy record-keeping going back well before the current Iran war.

The official numbers currently stand as follows: “Since the start of Operation Epic Fury, approximately 303 US service members have been wounded,” CENTCOM spokesman Tim Hawkins said earlier this week, and as of April 2nd, 13 US service members have been confirmed killed since the war’s start on February 28, 2026. But The Intercept is alleging an astounding “casualty cover-up” by the Trump administration:

Almost 750 U.S. troops have been wounded or killed in the Middle East since October 2023, an analysis by The Intercept has found. But the Pentagon won’t acknowledge it.

U.S. Central Command, or CENTCOM, which oversees military operations in the Middle East, appears to be engaged in what a defense official called a “casualty cover-up,” offering The Intercept low-ball and outdated figures and failing to provide clarifications on military deaths and injuries.

Two officials confirmed that at least 15 soldiers were injured last week in an Iranian strike on a Saudi air base, adding that “Hundreds of US personnel have been killed or injured in the region since the US launched a war on Iran just over a month ago.”

The Intercept found that CENTCOM’s latest April 2nd casualty count and ‘update’ were “three days old and excluded at least 15 wounded in the Friday attack on Prince Sultan Air Base in Saudi Arabia,” noting that “The command did not reply to repeated requests for updated figures.” This has raised suspicions that other incidents are being omitted too.

The US military has also declined to provide a confirmed death toll since the start of the Iran war. The Intercept estimates it is “no less than 15” – while Washington has publicly acknowledged no more than 13 fatalities.

“This is, quite obviously, a subject that [War Secretary Pete] Hegseth and the White House want to keep under major wraps,” an anonymous defense official was cited in The Intercept as saying. The report ultimately charges the US Army with “hiding losses”.

Figures released under President Trump “lack detail and clarity” – The Intercept alleges further. It cites the following incident as but one example:

The Trump administration’s numbers, by comparison, lack detail and clarity. The current CENTCOM casualty figures do not appear to include more than 200 sailors treated for smoke inhalation or otherwise injured due to a fire that raged aboard the USS Gerald R. Ford before it limped off to Souda Bay, Greece, for repairs. CENTCOM did not reply to close to a dozen requests for clarification on the casualty count and related information sent this week.

Recent polls have shown greater American public skepticism toward the war, especially amid talk there could be some kind of ground operation introduced – which the US public overwhelmingly opposes.

Large US casualties in the Iran war would likely provoke an almost immediate revolt against Trump’s war among the broader US public, and could split the Republican Party over Iran policy as well.

Tyler Durden
Thu, 04/02/2026 – 21:45

Why States Are Right To Reject AI Legal Personhood

Authored by Siri Terjesen and Michael Ryall via The Epoch Times,

A quiet but consequential legal movement is gathering momentum. Idaho and Utah have enacted statutes declaring that artificial intelligence systems are not legal persons. Ohio’s House Bill 469 proposes to declare that AI systems are “nonsentient entities” and bars them from acquiring any form of legal personhood. Similar bills are advancing in Pennsylvania, Oklahoma, Missouri, South Carolina, and Washington. The legislatures driving this movement are not technophobes. They are drawing a necessary line that philosophy, law, and common sense all demand.

The pressure in the opposite direction is real. In January, at the World Economic Forum in Davos, historian Yuval Noah Harari described AI as “mastering language.” Since language is the medium through which law, religion, finance, and culture are constituted, AI may soon be capable of acting within every institution humans have built. Harari asked whether countries would recognize AI as legal persons—whether AI could open bank accounts, file lawsuits, and own property without human supervision. The prospect is not science fiction. It is a policy choice, and the wrong choice would be deeply consequential.

Phantasms versus Nous

Aristotle argued in De Anima that all sentient creatures share a basic cognitive capacity to perceive the world, retain impressions of it, and recombine impressions into new configurations—what he called phantasia, imagination. A dog, a crow, and a chess grandmaster all possess this competency.

Aristotle distinguished human beings as categorically different: possessing nous, the capacity to grasp universal, abstract concepts—ideas like justice, causation, and the good—that cannot be derived from any sensory experience alone. A dog can recognize its owner, but it cannot grasp the concept of ownership. A parrot can reproduce a sentence about fairness, but it has no understanding of fairness.

What is the distinction? Can’t we simply feed an AI system Webster’s definition of “fairness” and let it work from there? No—feeding a machine the dictionary definition only gives it more words to pattern-match against—the concept is not in the words. Any child who grasps fairness can apply it correctly to a situation no definition anticipates. AI can only produce text that statistically resembles how humans talked about fairness before.

This is not a gap that more computing power or better training data will close. Computer scientist Judea Pearl demonstrated mathematically that no amount of pattern recognition over observational data can substitute for genuine causal inference. The appearance of understanding is not understanding itself. And it is precisely the capacity for genuine understanding—for deliberating about what is good and right—that grounds moral responsibility, which is the only coherent basis for legal personhood.

The Problem With the Corporate Analogy

Proponents of AI personhood often invoke corporate personhood as precedent. Corporations are not natural persons, yet the law treats them as legal persons capable of owning property, entering contracts, and being sued. Why not extend this pragmatic fiction to AI? The analogy breaks down at accountability.

Corporate personhood is a legal convenience built on human moral agency. Behind every corporation is a structured network of natural persons—board members, executives, shareholders—who bear fiduciary duties, can be deposed and held liable under piercing-the-veil doctrine, and face reputational and criminal consequences for their decisions. The corporation is a vehicle for organizing human action, not a substitute.

Ohio’s HB 469 captures this logic by denying AI legal personhood, prohibiting AI systems from serving as corporate officers or directors, and assigning all liability for AI-caused harm to identifiable human owners, developers, and deployers.

Labeling a system “aligned” or “ethically trained” does not discharge human responsibility. Granting AI legal personhood would shatter this accountability architecture. An AI “person” could own intellectual property, hold financial assets, and bring lawsuits—all without a human principal who can be held responsible. Sophisticated actors could construct chains of AI-owned shell companies that dissolve liability through layers of nominal personhood.

The result would not be extending rights to a new class of beings; it would be creating accountability vacuums that benefit the powerful humans who deploy AI while insulating them from consequence.

The Moral Stakes for Real People

A deeper moral issue underlies all of this. Legal personhood is not merely an administrative category; it carries normative weight. It signals that an entity has standing to make claims, to be wronged, and to bear obligations. Extending that status to systems that cannot genuinely deliberate, cannot suffer, and cannot be held morally responsible would dilute the concept of personhood in ways that could ultimately harm the humans who most need its protections.

We have not yet secured the full benefits of legal personhood for all human beings in practice—for the displaced, stateless, and structurally invisible. Rushing to extend a contested status to machines while that work remains unfinished would be a profound misallocation of moral and legal energy.

None of this requires hostility to AI as a technology. AI systems can be powerful, useful, and—when properly governed—enormously beneficial. What AI systems cannot be is persons. The states passing anti-personhood legislation are preserving something more important than a competitive advantage—a clear chain of human accountability from every AI action to every AI consequence. When an AI system causes harm, there must always be a human who answers for it. That principle is not a constraint on technology; it is the foundation of a just society.

Aristotle taught that law is reason without passion—a framework for coordinating human beings capable of living well together. AI can help us pursue the good life, but it cannot deliberate about what that life requires. As states across the country move to codify this distinction, they are doing exactly what legislatures exist to do—drawing lines that protect persons: all of them, and only them.

Views expressed in this article are opinions of the author and do not necessarily reflect the views of The Epoch Times or ZeroHedge.

Tyler Durden
Thu, 04/02/2026 – 21:20

Putin To Saudi Crown Prince: Russia Ready To Do Everything To Stabilize Mideast Crisis

Russian President Vladimir Putin said Thursday that Russia is willing to do everything it can to bring stability to the Persian Gulf and the crisis with Iran. “Russia is counting on an early end to the conflict in the Middle East, and is ready to do everything to bring the situation back to normal,” state media paraphrased Putin as telling Egyptian Foreign Minister Badr Abdelatty, who was hosted at the Kremlin.

“We all hope that the conflict will be ended as soon as possible. Yesterday, the US President [Donald] Trump spoke about this. I repeat it again: For our part, we are ready to do everything to bring the situation back to normal, as they say in such cases, to a stable state,” Putin said.

“The situation in the region is of common concern to us,” Putin added. He also on the same day held a phone call with Saudi Crown Prince Mohammed bin Salman, where the Russian leader’s message was similar.

Anadolu Agency

The Kremlin readout of the call indicated that “Both sides emphasized the need for a rapid cessation of hostilities and the intensification of political and diplomatic efforts to achieve a long-term settlement of the conflict.”

The timing of the Putin-MbS call is additionally interesting given Ukraine’s President Zelensky just did a tour of the Gulf states, seeking to deepen relations based on Ukraine selling small drone technology, capable of defending the skies against threats from Iran. He inked a deal with Riyadh for Ukrainian drone expertise.

According to a review of Ukraine’s latest Gulf deal-making in the NY Times:

In the Mideast conflict, Ukraine has sought to shift its image from a recipient of military aid to a supplier. It sees an opening to export its low-cost, innovative designs created during the war with Russia to compensate for shortages of weapons and ammunition. Ukraine’s military often relies on consumer technologies such as virtual-reality goggles for gamers and off-the-shelf drone components.

The agreements under negotiation with the United Arab Emirates and finalized with Qatar extend for 10 years, Mr. Zelensky told reporters on a conference call, and could be worth “billions.” He spoke from Qatar, one of the Persian Gulf states that has been targeted by Iranian drones.

In their call, Putin and Saudi Arabia’s crown prince further stressed that “problems with energy production and transportation resulting from the crisis are negatively impacting global energy security.”

Both were closely watching whether President Trump’s Wednesday night speech would wind down US operations against Iran. It did not: Trump issued no timeline and instead vowed that Iran would be hit very hard over at least the next two to three weeks.

But Moscow is still seen as a beneficiary of the prolonged war: the US has lifted some oil sanctions, and prices have been pushed higher – which means more billions flowing into Russian state coffers.

Tyler Durden
Thu, 04/02/2026 – 20:55

DOJ Sues New Jersey Town Over Natural Gas Ban

Authored by Naveen Athrappully via The Epoch Times (emphasis ours),

The Department of Justice (DOJ) filed a lawsuit against Morris Township in New Jersey over its ban on natural gas and other fossil fuels in newly constructed buildings, the department said in an April 1 statement.

Blue flames from a gas stove at a home in Arlington, Va., on May 3, 2023. Olivier Douliery/AFP via Getty Images

The ban “drives up energy costs for everyday American consumers and weakens our Nation’s energy dominance,” the DOJ said.

“Such policies reflect a radical left effort to outlaw federally regulated gas stoves, furnaces, water heaters, dryers, and other appliances that American families rely on daily to cook their meals and heat their homes.”

The lawsuit, filed on March 31 at the U.S. District Court for the District of New Jersey, takes issue with an ordinance the township passed in 2022.

The ordinance said that beginning Sept. 1, 2022, officials shall not issue a construction permit for any new apartments consisting of 12 or more units unless the building is all-electric.

The ordinance defines an all-electric building as not using natural gas, propane, or oil heaters, or their associated delivery systems—boilers, piping systems, fixtures, and infrastructures—to meet its energy needs.

In its lawsuit, the DOJ argues that the ordinance denies the township’s consumers “reliable, resilient, and affordable energy,” as well as the option to use commonplace gas appliances for heating, cooking, and other household tasks.

Moreover, the township’s ban on natural gas is unlawful, as the Energy Policy and Conservation Act of 1975 preempts state and local regulations related to energy efficiency or energy use of any product subject to the federal government’s energy conservation standard, the complaint said.

The DOJ argued that the Ninth Circuit Court recently ruled that banning the installation of natural gas piping in new buildings was preempted by Congress via EPCA. This legal precedent makes Morris Township’s gas ban “invalid.”

The department asked the court to rule the township’s ordinance as “void and unenforceable.”

The Epoch Times reached out to the mayor of Morris Township for comment but did not receive a response by publication time.

“Where the federal government has exclusive authority to regulate appliances and infrastructure, we will fight state and local overreach,” Principal Deputy Assistant Attorney General Adam Gustafson, from the DOJ’s Environment and Natural Resources Division, said.

“Banning natural gas is illegal. It makes heating, cooking, drying, and other life functions more unaffordable for consumers. This Administration is committed to unleashing American energy and empowering Americans.”

Trump’s Executive Order

In the lawsuit, the DOJ cited President Donald Trump’s April 8, 2025, executive order, titled Protecting American Energy From State Overreach.

State laws and policies that seek to institute climate regulations related to energy weaken America’s national security and bring about financial ruin by pushing up energy costs for families, Trump wrote in the order, adding that such rules undermine federalism by “projecting the regulatory preferences of a few States into all States.”

Trump instructed the Attorney General to take “all appropriate action” necessary to stop the enforcement of state and local laws, policies, and practices that burden the development and use of domestic energy resources.

Attorney General Pamela Bondi said the DOJ’s lawsuit against Morris Township follows two similar successful lawsuits in California.

“Radical environmentalist policies that drive up costs and limit consumer choice will not stand,” Bondi said.

In January, the DOJ filed a lawsuit against Morgan Hill and Petaluma, cities in California, over their natural gas bans.

The DOJ said in the recent statement that due to the lawsuit, both cities recently passed ordinances rescinding natural gas bans.

Meanwhile, a new bill, the Affordable Home Energy Protection Act, which seeks to tackle the issue of local energy restrictions, was introduced last month in the Legislature of New Jersey, where Morris Township is located.

Several localities have attempted to ban or restrict the use of natural gas hookups or combustion-based appliances in newly constructed or renovated buildings without properly considering costs, feasibility, or consumer preferences, the measure said.

The bill explicitly bans state agencies and local governments from adopting any rule that “prohibits or unduly restricts the installation, connection, or use of appliances or heating systems powered by natural gas, propane, or fuel oil in residential or commercial buildings.”

Tyler Durden
Thu, 04/02/2026 – 20:30

Russian Embassy Outraged After US-Israeli Strikes Damage Orthodox Church In Tehran

The Russian Embassy in Tehran has issued an angry denunciation of Wednesday’s US-Israeli airstrikes, which damaged a Russian Orthodox church and an adjacent nursing home.

“Two missile strikes on the morning of April 1st in the immediate vicinity of St. Nicholas Orthodox Cathedral in Tehran caused damage to the main building and outbuildings (windows and doors were blown out),” the Russian Embassy said in a statement which featured some photos of the damage.

Not only the main church complex but also an “almshouse” suffered damage, according to the official statement. A US-based pro-Iranian opposition NGO also confirmed the strike.

The embassy statement further underscored that the tragedy happened during a sacred week of the Christian calendar.

“We note that St. Nicholas Cathedral was damaged during Great Lent and ahead of one of the main religious holidays, Easter. Due to the military recklessness of the United States and Israel, the Orthodox community in Iran is unable to visit the church,” the diplomatic mission wrote on Telegram.

“The adjacent Russian Nursing Home, where elderly residents still live, also sustained significant damage (including a collapsed roof). Thankfully, there were no casualties,” it added.

There has been a significant Russian emigre community in Tehran – with a church there – going back to the period immediately following the 1917 Bolshevik Revolution, after which Russian Orthodox Christians were heavily persecuted in their home country by Soviet authorities. Many so-called White Russians fled to central and eastern Asia and beyond – and to the West.

The Armenian Orthodox Church also has an ancient presence in Iran. The total Christian population of Iran numbers in the low hundreds of thousands, though some modern estimates put it at possibly around one million.

St. Nicholas Church in Tehran

AntiWar.com, citing the AP, has reported that “the missile strike appeared to have targeted the nearby former US embassy compound in Tehran, from where the CIA coordinated the 1953 coup in Iran and where the hostage crisis started in 1979 following the Islamic Revolution that ousted the US-backed Shah.”

“Part of the former embassy has been turned into a museum highlighting the US role in the coup, called the Den of Espionage Museum,” the report continues.

Most Americans are likely unaware of the strong Christian, as well as Jewish, presence throughout Iran. As with Bush’s Iraq war, US mainstream media has done little to educate the American public on these sizable Christian minorities – part of ancient communities that predate the advent of Islam in the Middle East. Middle East Christians are often the first victims of NeoCon regime-change wars.

Tyler Durden
Thu, 04/02/2026 – 20:05

Australia Considers Emergency Powers To Protect Domestic Gas Supply

Authored by Tsvetana Paraskova via OilPrice.com,

Australia’s government intends to consider using emergency powers to protect domestic natural gas supply in case of a shortfall on its east coast in the third quarter of 2026.

The potential consideration of using such powers would be part of the steps the Albanese Government is taking to secure domestic gas supplies for Australian households and industry as the Middle East conflict disrupts global energy markets.

Australian Minister for Resources, Madeleine King, has given notice of her intention to consider using powers under the Australian Domestic Gas Security Mechanism (ADGSM) to protect Australian energy supplies in the event of a possible east coast domestic gas shortfall in the third quarter of 2026, the winter months Down Under.

The minister will consult with major gas producers over the next 30 days regarding supplies to the domestic market and will make a decision on whether to use the ADGSM by the middle of May, the government said.

“My decision to issue a notice of intent is a precautionary measure that gives me the flexibility to intervene if Australia is at risk of facing an energy shortfall,” King said in a statement.

“The notice does not place any limits on gas exports. Currently, Australia’s domestic market is well supplied with Australian gas.”

Australia remains a reliable gas supplier to international partners, but if there is a risk of a domestic supply shortfall, Australians will be the priority for energy supplies during the disruption in global markets caused by the war in the Middle East, the minister said.

On Wednesday, the Australian Competition and Consumer Commission (ACCC) said that wholesale gas supply on Australia’s east coast is expected to be tight and large volumes of gas will likely be required from storage to meet demand in the third quarter of 2026.

Apart from gas supply, Australia has moved to protect consumers from soaring fuel prices.

Early this week, the government halved the fuel excise on gasoline and diesel for three months in a bid to alleviate financial stress from spiking fuel prices.

Tyler Durden
Thu, 04/02/2026 – 19:40

Oracle’s Dubai Data Center Reportedly Hit As Iran Expands Attack On AI Infrastructure

According to Reuters national security reporter Phil Stewart on X, the Islamic Revolutionary Guard Corps has targeted a data center facility operated by Oracle in Dubai. 

Not much is known about the IRGC strike on Oracle’s data center or what type of air-delivered munitions were involved. There is no word on what damage the facility sustained.

Context on Oracle’s data center operations in the Middle East: 

Oracle’s Dubai facility hosts its Oracle Cloud UAE East region, with the region identifier me-dubai-1 and region key DXB. The company also operates a second UAE region in Abu Dhabi.

Oracle’s data center map:

Oracle’s status page currently shows no operational issues in Dubai or worldwide. 

On Wednesday, the IRGC targeted Amazon’s cloud computing operation in Bahrain. Also, last month, numerous data centers operated by U.S. companies were hit by IRGC drones (read report).

Earlier this week, Sepah News, the IRGC’s official news outlet, named 18 U.S. companies with operations in the Middle East that are now considered “legitimate targets.”

“From now on, for every assassination, an American company will be destroyed,” the IRGC-affiliated news outlet said.

The list of companies also included Cisco, HP, Intel, Oracle, IBM, Dell, Palantir, JPMorgan, Tesla, GE, Spire Solutions, Boeing, and UAE-based artificial intelligence company G42.

One thing the U.S.-Iran conflict has taught the world is that civilian infrastructure is not off limits – and that massive security gaps exist in protecting data centers from cheap drones.

Tyler Durden
Thu, 04/02/2026 – 19:15

The Case Against Federal Reserve Independence

Authored by Alexander William Salter via AmericanMind.org,

It’s illegal in theory and ineffective in practice.

The independence of the Federal Reserve System has become a major source of public controversy. As political leaders signal dissatisfaction with monetary policy, officials and commentators rush to defend the central bank’s insulation from democratic pressure.

We are told, as if it were self-evident, that central bank independence is a pillar of sound economic governance.

But this confidence is misplaced. The economic case for central bank independence is far weaker than its defenders suggest. And the constitutional case is weaker still.

Start with economics. The standard argument is that independent central banks deliver low and stable inflation because they are insulated from short-term political incentives.

Elected officials, facing electoral pressures, might be tempted to juice the economy with artificially loose monetary policy. By contrast, independent technocrats can take the long view.

Early empirical studies did show that countries with independent central banks experienced lower inflation. Yet more recent research has cast doubt on this relationship.

The correlation is sensitive to different samples and methods. In many cases, the supposed benefits of independence disappear entirely.

A more plausible explanation has emerged. Countries that enjoy low and stable inflation share deeper institutional characteristics: respect for the rule of law, stable political systems, and credible commitments to property rights. These are the real foundations of sound money. Central bank independence accompanies these basic governance norms, but its standalone effect is debatable.

This matters for a free-enterprise economy. Monetary policy is not a neutral technocratic exercise. Interest rates are prices: the price of time, risk, and capital. When insulated officials tinker with those prices at their discretion, the result is distorted market signals. Cheap credit can mislead investors, encourage unsustainable projects, and redistribute wealth in opaque ways. Independence does not eliminate politics. It simply hides politics behind a veil of expertise.

If the economic case for independence is overstated, the constitutional case is entirely bunk. The Constitution is clear: Congress holds the power “to coin Money” and “regulate the Value thereof.” Monetary authority, like all legislative power, originates with the people’s representatives. Congress may delegate certain functions to administrative bodies, including by creating a central bank. But delegation is not abdication. Those who exercise delegated authority remain accountable to the laws Congress passes and, ultimately, to the chief executive charged with enforcing them.

Yet the modern Fed operates as if our constitutional framework were irrelevant. Its leaders enjoy significant protection from removal. Its decisions (targeting interest rates, allocating credit, regulating banks, etc.) have sweeping consequences for the entire economy. If this does not constitute the exercise of executive power, it is hard to say what does.

The Supreme Court has recently emphasized that administrative agencies cannot be insulated from presidential oversight simply because they possess technical expertise.

The separation of powers does not yield to convenience, nor to the promise of better policy outcomes.

Yet when it comes to the Federal Reserve, the Court has signaled a willingness to tolerate precisely such insulation—a “special case” for the most powerful economic institution in the country.

This exception is indefensible. Appeals to history or prudence, however well-grounded, are not constitutional arguments. An agency that wields executive power must answer to the chief executive. Concerns about how that works in practice do not justify ignoring the Constitution.

The truth is that central bank independence persists not because it is firmly grounded in law or economics, but because the alternative unsettles us. We worry, not without reason, that elected officials might misuse monetary policy for short-term gain. But the Constitution does not permit us to resolve that fear by concentrating vast economic power in the hands of unaccountable experts. A free and self-governing people must confront the difficult task of designing institutions that combine competence with accountability.

That begins with Congress. There are several legislative reforms that can restore the rule of law to monetary policy. First, lawmakers should narrow the Federal Reserve’s mandate to a single, clear objective—price stability—rather than the vague and conflicting goals it currently pursues. A simpler mandate would make it easier to evaluate performance and hold policymakers responsible when they fail.

Second, Congress should revisit the legal protections that shield senior Fed officials from removal. Freedom of judgment is one thing; freedom from oversight is another. Officials entrusted with such consequential authority must ultimately answer to elected leadership. Legislators ought to make it easier to fire central bankers.

Finally, the president should take a more active role in ensuring that the Fed operates within its statutory and constitutional bounds. This does not mean dictating day-to-day interest rate decisions. Instead, it means recognizing that monetary policy, like all exercises of government power, must remain subject to democratic control. President Trump’s nomination of Kevin Warsh as the next Fed chairman is a good start. The two must work together to restore normalcy to the Fed’s everyday operations, something missing since the 2007-08 financial crisis.

Economic stability is obviously desirable. But we cannot purchase it at the cost of self-government.

Republican principles require officials to be answerable to the people.

If we are serious about preserving the constitutional order and free enterprise, we must abandon the comforting myths of central bank independence and restore accountability to the Federal Reserve.

Tyler Durden
Thu, 04/02/2026 – 18:50

“Save Every Drop Of Fuel”: South Korea Tells Citizens To Conserve, Ride Public Transit Amid Energy Shock

The Gulf energy shock is now hitting Asian economies with full force.

In Seoul, President Lee Jae Myung on Thursday urged citizens to “save every drop of fuel,” a new sign policymakers are moving quickly into crisis mode as the U.S.-Iran conflict raises the risk of nasty fuel shortages across some of Asia’s most energy- and Hormuz-dependent economies.

Lee told lawmakers in a parliamentary address about the urgent need to conserve fuel. He warned that the Middle East crisis has triggered one of the worst energy shocks ever.

“I earnestly appeal to all citizens to actively participate in energy-saving movements in daily lives, such as taking public transportation and conserving electricity,” Lee said.

He added, “The current crisis is not a passing shower that quickly subsides, but rather a massive storm whose duration is uncertain, making it all the more severe.”

South Korea is trying to offset a collapse in energy imports from the Gulf region, with the Hormuz chokepoint still disrupted as the U.S.-Iran conflict enters its second month.

Lee’s government proposed a $17 billion emergency program to cushion households and businesses against fuel price shock.

Seoul has already imposed a fuel price cap, expanded fuel tax cuts, and moved to secure alternative supplies of key petrochemicals such as naphtha, as well as urea for fertilizer.

Seoul also announced it will delay the shutdown of coal-fired power plants and has lifted caps on coal-fired electricity, as coal switching across Asia goes into high gear to offset losses in Gulf energy flows.

In South Asia, India has told coal-fired power plants to crank up power generation. 

Australian officials asked citizens to trade their cars for public transport to conserve fuel. Fuel shortages in the country have become visible in recent weeks because it is highly exposed to Gulf energy flows.

Last month, China halted refined fuel exports to the region to preempt a potential fuel shortage.

JPMorgan analysts recently explained that the energy shockwave from the Iran war is already hitting Asia, with Africa next, then Europe, and shortly thereafter the U.S., especially West Coast states.

The disruption of petrochemical production in the Gulf region has also sparked a global plastics supply crisis. To note, China is the world’s largest plastic consumer and producer.

Which country will be next to declare a fuel crisis and urge citizens to take public transportation? 

Tyler Durden
Thu, 04/02/2026 – 18:25