This Israeli-made killer racing drone is a nightmare for some


Last week, an Israeli defense company painted a frightening picture. In a roughly two-minute video on YouTube that resembles an action movie, soldiers out on a mission are suddenly pinned down by enemy gunfire and calling for help.

In response, a tiny drone zips off its mother ship to the rescue, zooming behind the enemy soldiers and killing them with ease. While the situation is fake, the drone — unveiled last week by Israel-based Elbit Systems — is not.

The Lanius, which in Latin can refer to butcherbirds, represents a new generation of drone: nimble, wired with artificial intelligence, and able to scout and kill. The machine is based on racing drone design, allowing it to maneuver into tight spaces, such as alleyways and small buildings.


The company’s promotional content touts its upgrades. After being sent into battle, Lanius’s algorithm can make a map of the scene and scan people, differentiating enemies from allies — feeding all that data back to soldiers who can then simply push a button to attack or kill whom they want.

For weapons critics, that represents a nightmare scenario, which could alter the dynamics of war.

“It’s extremely concerning,” said Catherine Connolly, an arms expert at Stop Killer Robots, an anti-weapons advocacy group. “It’s basically just allowing the machine to decide if you live or die if we remove the human control element for that.”

Representatives from Elbit Systems did not return a request for comment.

Using drones in warfare has become commonplace. The United States’ arsenal of drones is responsible for enemy and civilian deaths in the Middle East. In Russia’s war against Ukraine, Moscow has been seen using a killer drone that can divebomb into targets, destroying them with little notice.

Drones large and small have been making an impact in war. Notably, Ukraine’s use of the Turkish-made Bayraktar TB2 — a drone the size of a small airplane and equipped with laser-guided missiles — has wreaked havoc on Russian tanks and trucks.


For weapons manufacturers, that provides an appealing market.

Elbit Systems, headquartered in Haifa, Israel, says in promotional content that its Lanius is outfitted with features that would be particularly helpful in urban warfare settings, where troops can’t see their enemy well.

According to its data sheet, the drone is palm-size, roughly 11 inches by 6 inches. It has a top speed of 45 miles per hour, can fly for about seven minutes, and has the ability to carry lethal and nonlethal materials. It’s unclear how deadly the lethal materials would be.

The drone is outfitted with WiFi and radio technology for communication. It can maneuver using GPS navigation, and its onboard artificial intelligence system can scan and map urban battle spaces, feeding a 3D map of its surroundings back to soldiers.

The drone’s autonomous software helps with “enemy detection and classification,” according to the company, helpful for “lethal ambush.”

The company notes that the drone cannot decide to kill someone itself and needs a “human-in-the-loop” to make the decision and pull the trigger.


Despite that, Stop Killer Robots’ Connolly has numerous concerns.

The feature requiring humans to be involved in a decision to kill can probably be overridden, she said. “Changing that would probably just require a software upgrade,” Connolly added. “There’s … absolutely nothing to stop the manufacturer from doing that, or a lawyer or agent who’s buying these systems requesting that they do that.”

The Lanius’s ability to use algorithms to differentiate enemies from allies is worrisome, she said. The general public should know how the drone distinguishes between combatant and civilian, what data the system’s algorithm is trained on to make those calls, who has labeled that data, and what kind of behavior is flagged as making someone seem threatening, she said.

“It’s basically just showing that systems now can all but decide to, using an algorithm … to take human life,” she said.