Advancements in Drone Warfare Create New Challenges in Fielding Lethal Autonomous Weapon Systems

Note from the Editor: The Federalist Society takes no positions on particular legal and public policy matters. Any expressions of opinion are those of the author. We welcome responses to the views presented here. To join the debate, please email us at [email protected].
Autonomous drone weapon systems have transitioned from a science fiction trope to a present-day reality on modern battlefields. On the Ukrainian front lines, the interplay of drones and anti-drone countermeasures has transformed drones from rudimentary remote-controlled devices into sophisticated, increasingly autonomous weapons of war. This rapidly escalating arms race creates new pressures on commanders as they implement Department of Defense Directive 3000.09, which outlines the Department’s policy on autonomous weapons. Because this one-of-a-kind policy is publicly available, it also highlights, by contrast, how bad-faith actors might field similar technology on the battlefield without the same restraint.
Electronic warfare has significantly influenced the development of drone technology. As electronic jamming devices have proliferated along the Ukrainian and Russian front lines to interfere with drone control, traditional remotely piloted systems have become less effective; at any given moment, only about 20% of drones are operational. In response, alternative methods of control are being developed, such as frequency-hopping radio systems that attempt to evade jamming through constant channel-switching. While this tactic has had some success, it remains susceptible to advanced countermeasures. Both sides also deploy fiber-optic tethered drones. These systems use kilometers of physical fiber-optic cable to transmit control signals, making them immune to radio-frequency jamming and difficult to detect, since they emit no radio signal for sensors to pick up. On the front lines, however, the length of the tether sharply limits their range, making them a solution of limited success; fiber-optic systems are unsuitable for deep strikes, such as the recent strikes in “Operation Spiderweb.”
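For readers unfamiliar with the technique, here is a minimal sketch of how synchronized frequency hopping works, assuming a pre-shared seed and synchronized clocks (the channel count and dwell time below are invented for illustration). The transmitter and receiver each derive the same pseudorandom channel schedule, so a jammer that cannot predict the next hop must flood the entire band at once:

    import hashlib
    import time

    CHANNELS = 64          # hypothetical channel count for the radio band
    HOP_INTERVAL_MS = 50   # hypothetical dwell time on each channel

    def channel_at(shared_seed: bytes, slot: int) -> int:
        """Derive the channel for a given time slot from a pre-shared seed.

        Transmitter and receiver run the same computation, so they hop in
        lockstep without ever broadcasting the schedule; a jammer without
        the seed cannot predict the next channel.
        """
        digest = hashlib.sha256(shared_seed + slot.to_bytes(8, "big")).digest()
        return int.from_bytes(digest[:4], "big") % CHANNELS

    def current_slot() -> int:
        """Map synchronized wall-clock time to a hop-slot index."""
        return int(time.time() * 1000) // HOP_INTERVAL_MS

    seed = b"pre-shared mission key"  # illustrative only
    slot = current_slot()
    print(f"slot {slot} -> channel {channel_at(seed, slot)}")

Even a scheme like this remains vulnerable to countermeasures such as wideband barrage jamming, which simply overwhelms every channel at once, which is why it offers only partial protection.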
In Operation Spiderweb, Ukraine took the next step, deploying drones equipped with optical recognition and neural-network-based guidance systems that can autonomously strike a target after the drone’s control signal has been jammed. These drones can also navigate by landmarks when GPS is unavailable. Because Ukraine trained the drones on photographs of Russian aircraft, they could identify and strike their targets even while the remote-control signal was being blocked. This technology allows the drones to maintain operational capability even in contested electromagnetic environments. Combined with the new “mothership” drone delivery program, it creates the potential for long-range, low-cost attacks.
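The control logic described above, falling back from remote piloting to onboard recognition when the link is severed, can be sketched roughly as follows. The mode names, confidence threshold, and stub functions are assumptions for illustration, not details of any fielded Ukrainian system:

    from dataclasses import dataclass
    from enum import Enum, auto
    from typing import Optional

    class Mode(Enum):
        REMOTE_PILOTED = auto()  # operator flies the drone over the radio link
        AUTONOMOUS = auto()      # link jammed; onboard model takes over

    @dataclass
    class Detection:
        label: str         # e.g., an aircraft class the network was trained on
        confidence: float  # classifier confidence in [0, 1]

    CONFIDENCE_THRESHOLD = 0.9  # assumed engagement threshold, not a real figure

    def control_step(link_alive: bool, detection: Optional[Detection]) -> Mode:
        """One tick of a hypothetical guidance loop.

        While the radio link is alive, the operator remains in control. If
        jamming severs the link, the drone falls back to its neural-network
        classifier, trained beforehand on photographs of the target class,
        and steers toward a detection only when confidence clears the preset
        threshold; otherwise it navigates by visual landmarks.
        """
        if link_alive:
            return Mode.REMOTE_PILOTED
        if detection is not None and detection.confidence >= CONFIDENCE_THRESHOLD:
            steer_toward(detection)    # terminal guidance on recognized target
        else:
            navigate_by_landmarks()    # GPS-denied navigation
        return Mode.AUTONOMOUS

    def steer_toward(detection: Detection) -> None: ...  # flight-control stub
    def navigate_by_landmarks() -> None: ...             # flight-control stub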
All these measures aim to blunt the effectiveness of anti-drone defenses, which are also improving rapidly. As energy- and ammunition-based anti-drone weapons shoot down more drones, more drones must be launched to overwhelm those defenses. This strategy, called “swarming,” is typically limited by the number of personnel available to actively operate each drone. A fully autonomous drone does not suffer that limitation: it needs no operator to manage its flight path or select its targets. Ukraine’s latest drone designs reportedly incorporate elements of this autonomy, suggesting a near-future battlefield dominated by machines capable of executing missions independently. The next steps are likely to expand recognition beyond aircraft and extend target selection to enemy combatants. Both steps would be necessary to increase the number of drones launched in a single attack.
This trend puts significant pressure on commanders as they develop guidance for fielding these weapons. U.S. policy, as outlined in Department of Defense Directive (DoDD) 3000.09, mandates “appropriate levels of human judgment” in the use of autonomous systems, and it falls to commanders to determine the appropriate level based on the risk the system poses. An autonomous system launching nuclear weapons would be plainly inappropriate, while a system that handles back-end mathematical calculations would pose no real risk and need little oversight. An autonomous drone loaded with explosives lands near the center of this scale.
In practice, launching an autonomous drone with preprogrammed geographic and temporal constraints may fulfill the human judgment requirement, even if no human intervenes during the mission itself. But as combat becomes more complex than a simple geographic front line, these drones would have to distinguish combatants from civilians, raising significant ethical and legal concerns. DoDD 3000.09 addresses this concern by calling for robust testing and risk minimization in lethal autonomous weapon systems before they are fielded. It also requires the troops fielding these weapons to understand how they operate, so that they can decide whether to deploy them based on battlefield risks and opportunities.
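One way to picture the “preprogrammed geographic and temporal constraints” that might satisfy the human judgment requirement is a pre-approved engagement envelope that the weapon enforces during flight. The coordinates, time window, and class below are invented for illustration; the point is that the human judgment happens up front, when a commander approves the bounds:

    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass(frozen=True)
    class EngagementEnvelope:
        """Bounds approved by a commander before launch (illustrative values)."""
        lat_min: float
        lat_max: float
        lon_min: float
        lon_max: float
        window_start: datetime
        window_end: datetime

        def permits(self, lat: float, lon: float, now: datetime) -> bool:
            """Allow engagement only inside the approved box and time window."""
            inside_box = (self.lat_min <= lat <= self.lat_max
                          and self.lon_min <= lon <= self.lon_max)
            inside_window = self.window_start <= now <= self.window_end
            return inside_box and inside_window

    # Hypothetical mission: a small geographic box, open for a two-hour window.
    envelope = EngagementEnvelope(
        lat_min=50.00, lat_max=50.10, lon_min=36.00, lon_max=36.10,
        window_start=datetime(2025, 6, 1, 3, 0, tzinfo=timezone.utc),
        window_end=datetime(2025, 6, 1, 5, 0, tzinfo=timezone.utc),
    )
    print(envelope.permits(50.05, 36.05,
                           datetime(2025, 6, 1, 4, 0, tzinfo=timezone.utc)))  # True

Constraints like these bound where and when the weapon may act, but, as noted above, they cannot by themselves distinguish a combatant from a civilian inside the approved box.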
While the United States has set out a risk-minimization policy for lethal autonomous weapon systems, Russia has displayed a more permissive attitude toward civilian casualties, treating them as an acceptable risk, if not the underlying objective. Although the Russian government’s official stance is that a loss of meaningful human control over autonomous weapon systems is unacceptable, Russia has consistently refused to agree to any international guidelines on implementing these systems and has “prevent[ed] efforts to start negotiations on a new treaty to retain meaningful human control over the use of force.” Russia’s battlefield conduct reflects a policy of strategic advantage rather than risk minimization. Its view is one of self-preservation as it attempts to catch up to competitors who are also investing heavily in research and development of autonomous systems.
Russian defense officials have already announced plans to produce up to two million FPV drones this year, and they have signaled substantial investment in increasing both the quantity and the sophistication of these drones. Seeing the advantage on the battlefield, where 80% of Russian casualties result from drone attacks, the country is ramping up its investments. These cutting-edge drones are expected to use machine vision and artificial intelligence, much like their Ukrainian counterparts. With strategic partnerships between Russia and China accelerating the development of advanced drone platforms, we will soon see a significant test of the laws of war through the deployment of advanced lethal autonomous weapon systems.