RESEARCH AND COMMENTARY

Through its publications INSS provides cutting-edge research, analyses, and innovative solutions on critical national security issues in support of the joint warfighter and Department of War stakeholders.

 
News | March 11, 2026

Precision in Words, Precision in Warfare: Terminology and Control in Military Discourse on Unmanned Systems

By Dr. Elise Annett, John Bitterman, and Dr. James Giordano

Strategic Insights

The Worth of Words to Warfighting

Unmanned vehicular systems (UVS) spanning aerial, maritime, terrestrial, and sub-surface domains have become integral to intelligence, surveillance, and reconnaissance (ISR), logistics, strike operations, and electronic warfare. Yet despite the increasing ubiquity and sophistication of these technologies, discourse surrounding their capabilities can be undermined by imprecise terminology that conflates the terms automatic, remote, and autonomous in technical, operational, strategic, and policy briefings and planning. As the philosopher Ludwig Wittgenstein observed, linguistic imprecision is not merely academic; it carries existential, ethical, and legal implications. In military settings, it can directly affect operational planning, command authority, and the guidance and governance of emerging weapons systems. Simply put, words matter, and precision in language is critical to precision in warfighting.

At a technical level, distinctions between automatic, remote, and autonomous systems are clear, yet frequently blurred in practice. To wit: Automatic systems operate according to pre-programmed algorithms and/or mechanical triggers. Once activated, these systems execute a specific function or set of functions in response to defined inputs. Importantly, automatic systems cannot and do not interpret context or make adaptive decisions beyond the parameters for which they have been programmed. Classic examples include proximity-fused munitions, mine-triggering mechanisms, and (more complex) automated missile defense interceptors. Their actions are deterministic, predictable within discrete operational conditions, and constrained by the programming that bounds their responses.

Remote systems require human operators to control the platform from a distance. Remotely piloted aircraft (RPAs), remotely operated underwater vehicles (ROVs), and teleoperated ground vehicles/robots are in this category. The system itself cannot make independent navigation, targeting, or engagement decisions. So, while the platform may incorporate automated subsystems, such as stabilization, waypoint navigation, or target tracking, the locus of action (and agency) remains human.

Autonomous systems, however, entail a fundamentally different capability in that they can interpret environmental inputs, evaluate conditions, and select actions in pursuit of programmed goals without direct human control at the moment of decision. Autonomous platforms can adapt to changing contingencies, update internal models of the environment (and, with advanced autonomous systems, aspects of the task pursuant to defined goals), and generate novel responses that were not explicitly programmed in advance. Machine learning (ML) and artificial intelligence (AI) architectures are central to enabling these capabilities.

These distinctions may appear straightforward, yet the operational environment can obscure them. For instance, a remotely piloted aircraft may operate in automatic navigation mode during transit; an autonomous maritime vehicle may periodically receive remote tasking updates; and a missile defense system may execute automatic intercept sequences once a human operator authorizes engagement. Thus, systems can incorporate hybrid architectures that combine automatic functions, remote control, and autonomous decision-making. Acknowledging such hybridity should not deflect the need for conceptual clarity; to the contrary, we contend that it makes such clarity all the more necessary.
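The taxonomy above can be expressed as a simple data model. The sketch below is purely illustrative and not drawn from any doctrinal publication; the class names, mode labels, and example subsystems are the authors' assumptions for demonstration. It shows how tagging each subsystem with a control mode, rather than applying one blanket label to the platform, surfaces the hybrid architectures described in the preceding paragraph.

```python
from dataclasses import dataclass
from enum import Enum

class ControlMode(Enum):
    AUTOMATIC = "automatic"    # pre-programmed triggers; no interpretation of context
    REMOTE = "remote"          # a human operator controls the function from a distance
    AUTONOMOUS = "autonomous"  # selects actions toward goals without human input at decision time

@dataclass(frozen=True)
class Subsystem:
    name: str
    mode: ControlMode

@dataclass
class Platform:
    name: str
    subsystems: list

    def label(self) -> str:
        """Derive an honest top-level label from the subsystem modes."""
        modes = {s.mode for s in self.subsystems}
        if len(modes) == 1:
            return next(iter(modes)).value
        return "hybrid (" + ", ".join(sorted(m.value for m in modes)) + ")"

# Example from the essay: a remotely piloted aircraft that flies
# in automatic navigation mode during transit.
rpa = Platform("RPA", [
    Subsystem("flight control", ControlMode.REMOTE),
    Subsystem("transit navigation", ControlMode.AUTOMATIC),
    Subsystem("target tracking", ControlMode.AUTOMATIC),
])
print(rpa.label())  # hybrid (automatic, remote)
```

In this framing, "hybrid" is not a fourth category but a derived description: the platform's label is computed from, and therefore cannot contradict, its subsystem architecture.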

Operational Consequences of Terminological Ambiguity

Failure to clearly distinguish these terms can confuse operational planning, execution, and responsibility. Commanders must understand precisely how a system functions and will act under conditions of disrupted communications and/or degraded sensor performance in contested electromagnetic environments. If a system described as “autonomous” is actually dependent upon remote human control, its reliability in denied environments may be overestimated. Conversely, if a system labeled “remotely operated” possesses autonomous target-selection functions, its potential for independent engagement may be underestimated.

Such misunderstandings can produce mission risk, complicate the rules of engagement (ROE), and incur ethico-legal effects that extend beyond the battlefield. If the terminology used by defense institutions is inconsistent or poorly defined, it invites misunderstanding, mistrust, and potentially destabilizing assumptions about military intent. To be sure, military doctrine and law require clear attribution of decision authority in the use of force. When terminology fails to accurately define the locus and process of command decisions, whether by human operator, automated process, or autonomous system capability, accountability becomes difficult to establish. This issue is especially salient given the growing international discourse surrounding lethal autonomous weapons systems (LAWS).

The Rise of Intelligent Systems

The urgency of resolving these terminological ambiguities is amplified by the rapid evolution of intelligent systems in warfare. Advances in AI, edge computing, sensor fusion, and distributed networks enable ever more sophisticated forms of machine decision-making. Swarming drone architectures, autonomous vehicles capable of long-duration patrols, and terrestrial vehicles equipped with adaptive navigation systems are all current operational realities.

Critical to such functions is the evolving relationship between human decision-makers and machine tools, actors, and agents. Military doctrine emphasizes meaningful human control in the application of lethal force, inclusive of engagements employing increasingly autonomous systems. However, the parameters, implications, and accountability of this control become ambiguous when terminology fails to clearly identify how decisions are generated within a system. It is important to recognize that terminology functions as an instrument of command authority in machine-enabled warfare. How autonomy is defined determines how responsibility is assigned, how oversight is structured, and how escalation is calibrated. We assert that clear conceptualization ensures that human intent remains sovereign within increasingly complex systems while reinforcing operational tempo, strategic credibility, and the lawful application of force.

It is equally important to note that peer competitors are investing heavily in intelligent systems. The People’s Republic of China (PRC) is pursuing extensive research into autonomous swarm systems and unmanned aerial and maritime platforms. Russia has emphasized robotic ground vehicles and loitering munitions that incorporate autonomous targeting functions. These investments underscore the urgency of domestic and allied efforts to secure critical technology supply chains—semiconductors, rare earths, edge computing components—essential to scaling UVS production and deployment without adversary leverage. The global security environment is becoming increasingly shaped by and dependent upon these technologies. Therefore, doctrinal clarity regarding their actual capabilities is essential to their tactical viability and value, and to their relevance in achieving strategic goals as articulated in the 2025 National Security Strategy and the 2026 National Defense Strategy—namely, restoring peace through strength via unmatched military lethality, technological dominance, and resilient supply chains.

In light of this, any authentic discussion of deterrence, arms control, and escalation containment will demand realistic understanding and accurate representation of these systems’ capabilities and limitations. Linguistic precision is not pedantry; it is vital to military mission planning and operational direction, and foundational to strategic stability.

Recommendations

Given the operational and strategic importance of terminological precision, we propose the following recommendations:

(1) The Department of War (DoW) should establish a standardized doctrinal lexicon for unmanned systems that clearly defines automatic, remote, semi-autonomous, and autonomous functions. Such definitions should emphasize the locus of decision authority, degree of environmental interpretation, and level of human oversight required, and should be integrated into joint doctrine publications, acquisition documentation, and training materials.

(2) Operational documentation should specify functional architecture(s) rather than relying on generalized nomenclature. For example, rather than describing a platform as "autonomous," technical documentation and operational briefings should identify which subsystems are automatic, which are remotely controlled, and which possess autonomous decision-making capability, and to what degree. This clarity and transparency would afford better understanding, and we believe better employment, of various systems under particular operational conditions.
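One way to operationalize this recommendation is a machine-checkable capability record that can be audited for label drift. The sketch below is a hypothetical illustration, not an existing DoW format: the platform name, field names, and subsystem modes are invented for the example. It flags the exact failure mode described earlier, a platform labeled "remotely operated" that quietly possesses an autonomous target-selection function.

```python
# Hypothetical capability record illustrating recommendation (2):
# document each subsystem's control mode instead of one blanket label.
record = {
    "platform": "Example Maritime UVS",     # invented platform name
    "declared_label": "remotely operated",
    "subsystems": {
        "navigation": "remote",
        "station-keeping": "automatic",
        "target selection": "autonomous",   # capability the blanket label hides
    },
}

def audit(record):
    """Return warnings where the blanket label understates subsystem capability."""
    modes = set(record["subsystems"].values())
    warnings = []
    if "autonomous" in modes and "autonomous" not in record["declared_label"]:
        autonomous = [n for n, m in record["subsystems"].items() if m == "autonomous"]
        warnings.append(
            "declared label omits autonomous functions: " + ", ".join(autonomous)
        )
    return warnings

for w in audit(record):
    print("WARNING:", w)  # WARNING: declared label omits autonomous functions: target selection
```

The design point is that the per-subsystem record, not the marketing label, is the authoritative description; any summary term is validated against it rather than asserted independently.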

(3) Professional military education (PME) and training should include taxonomies of automatic, remote, and autonomous systems and of human-machine interfaces and interactions. Personnel responsible for employing unmanned systems must understand both how these systems operate and how their capabilities are conceptually defined, as this knowledge is essential for effective operational planning and for communicating system capabilities to the chain of command, allies, and policymakers.

(4) Acquisition programs should incorporate standardized descriptions of system capability. When unmanned systems are proposed or evaluated, acquisition documents should clearly specify the degree of automaticity, remote control, and/or autonomy each system entails. This is important to decision-makers’ understanding of the operational capabilities and implications of emergent technologies before and during their operational fielding.

(5) International military dialogues should adopt and promote standardized terminology. Given that UVS are increasingly a component of multinational operations and arms-control discussions, the DoW should collaborate with allies and international organizations to establish shared definitions of automatic, remote, and autonomous capabilities. We posit that alignment of terminologies would improve coordination (and collaborative effectiveness and efficiency) in coalition operating environments.

Conclusion

The effective integration of new technologies depends on both engineering progress and conceptual clarity. The conflation of the terms automatic, remote, and autonomous relative to the capabilities of UVS can undermine military discourse aimed at evaluating and employing these technologies in various operational settings and conditions. In the emerging battlespace, wherein machines increasingly sense, decide, and act alongside human warfighters, the words used to describe their capabilities are more than simple semantics; they are operational necessities, and terminological accuracy is an instrument of strategic competence.

Disclaimer

The views and opinions expressed in this essay are those of the authors and do not necessarily reflect those of the United States government, Department of War or the National Defense University.

Dr. Elise Annett is a Research Fellow in the Program for Disruptive Technology and Future Warfare of the Institute for National Strategic Studies at the National Defense University.



 

Mr. John Bitterman (CDR, USCG, ret.) is a non-resident Research Fellow in the Program for Disruptive Technology and Future Warfare of the Institute for National Strategic Studies at the National Defense University.



 

Dr. James Giordano is Head of the Center for Strategic Deterrence and Study of Weapons of Mass Destruction, and Program Lead for Disruptive Technology and Future Warfare of the Institute for National Strategic Studies at the National Defense University.