STRATEGIC INSIGHTS

 

Strategic Insights is a forum for concise analyses of critical policy issues that affect U.S. national security interests. It is maintained by the Institute for National Strategic Studies (INSS) at the National Defense University (NDU). Strategic Insights is intended for the exchange of research-informed analysis; it is not a venue for the dissemination of unofficial information and comments, nor a means to survey visitor opinions. The views, findings, conclusions, and recommendations presented in Strategic Insights are solely those of the authors. They do not constitute the official position of INSS, NDU, or the U.S. Department of War (DoW).

 

Strategic Insights

Hacker binary attack code.

Strategic Insights |

Artificial Intelligence and a Reconfiguration of Military Power

Elise Annett and Dr. James Giordano

Under Secretary of War for Research and Engineering Emil Michael has emphasized that the DoW has historically under-deployed artificial intelligence (AI) and that the current moment demands rapid, enterprise-wide integration of AI capabilities across the DoW workforce to better support both efficiency and warfighting functions.

LEARN MORE →


close-up circuit chip

Strategic Insights |

Fortifying Technologic Innovation in National Defense: Strategic Security Imperatives for Research and Acquisition

Dr. James Giordano and Dr. Diane DiEuliis

The recently announced Fundamental Research Security Initiatives and Implementation Memorandum, intended to strengthen protections for Department of War (DoW)-funded research, represents a crucial evolution in how the United States (U.S.) secures the innovation enterprise within the defense industrial base (DIB). This initiative affirms that security and innovation are equal, co-foundational components of national defense and of the activities of the DIB.

LEARN MORE →


close up of a human brain

Strategic Insights |

Cognitive Warfare 2026: NATO’s Chief Scientist Report as Sentinel Call for Operational Readiness

Dr. James Giordano

The recently released NATO Chief Scientist’s 2025 Report on Cognitive Warfare provides a timely acknowledgment of a strategic reality: contemporary conflict is increasingly behavior-centric, and the decisive terrain is often not geographic but lies in how individuals and groups perceive, interpret, decide, and act.

LEARN MORE →



Photo by: NIH/National Human Genome Research Institute.
A stylized digital illustration of a glowing DNA double helix suspended in a futuristic blue interface. Surrounding the helix are schematic icons representing molecular structures, chemical formulas, data grids, network nodes, and atomic symbols. The image visually conveys the convergence of biotechnology, data, artificial intelligence, and advanced scientific systems in a highly networked, modern research environment.

Strategic Insights |

Biotechnologies and the Treaty Gap: Why Biological Weapons Governance Is Falling Behind; and Some Thoughts on How to Fix It

Dr. James Giordano

The Scottish ballad Auld Lang Syne, written in 1788 by poet Robert Burns, is a tune traditionally played to ring out the passing year and ring in the new. The lyrics offer an invitation to celebrate that which was good and to toast to what may come.

LEARN MORE →


Tubes in a laboratory

Strategic Insights |

Biotechnology in the FY 2026 NDAA: Strategic Implications — and Recommendations — for Joint Force Readiness

Dr. James Giordano

The newly released FY 2026 NDAA places explicit emphasis upon the increasing involvement of biotechnology in U.S. military missions. As 2025 comes to a close and we look ahead to the new year, Dr. James Giordano, Director of the CDTFW, offers a view of why biotechnology is, and will increasingly be, intrinsic and important to national defense, and offers a set of recommendations for fortifying Joint Force engagement in the biotechnological domain.

LEARN MORE →


DNA strand graphic

Strategic Insights |

Artificial Intelligence: A Double-Edged Sword in Support and Subversion of the Biological Weapons Convention; Part Two: Implications and Recommendations

Dr. Diane DiEuliis, Elise Annett, Dr. James Giordano

As we noted, the integration of artificial intelligence (AI) into biosurveillance and biodefense architectures to strengthen verification and enforcement mechanisms associated with the Biological Weapons Convention (BWC) can also enable state and non-state actors to obscure, circumvent, or strategically exploit the very compliance frameworks that AI is intended to enhance.

LEARN MORE →


DNA strand graphic

Strategic Insights |

Artificial Intelligence: A Double-Edged Sword in Support and Subversion of the Biological Weapons Convention Part One: Framing the Issues

Elise Annett, Dr. Diane DiEuliis, and Dr. James Giordano

The recent announcement that artificial intelligence (AI) will be employed to surveil and support compliance with the Biological Weapons Convention (BWC) reflects both the capabilities for data collection, integration, and analysis that such systems enable, and the iterative integration of AI within biodefense ecologies and operations.

LEARN MORE →


A soldier wears virtual reality glasses; a graphic depiction of a chess set sits in the foreground. Illustration created by NIWC Pacific.

Strategic Insights |

Critical Technology Areas Part 2: Implications and Recommendations for the Warfighter and Warfighting

Dr. James Giordano

As noted in last week’s special edition Strategic Insights, the Department of War will focus upon furthering research, testing and use of six key domains of disruptive technology (viz., applied artificial intelligence [AI], biomanufacturing, contested logistics technologies, quantum and battlefield information dominance, scaled directed energy, and scaled hypersonics).

LEARN MORE →


Special Edition Image

Strategic Insights |

Convergent Critical Technologies Part 1: The Integrative Transformation of Warfighting

Dr. James Giordano

The Under Secretary of War for Research and Engineering’s designation of six Critical Technology Areas (CTAs; viz., Applied Artificial Intelligence, Biomanufacturing, Contested Logistics Technologies, Quantum and Battlefield Information Dominance, Scaled Directed Energy, and Scaled Hypersonics) constitutes a fundamental conceptualization of how power will be projected, contested, and sustained across the conflict spectrum.

LEARN MORE →


Soldier using virtual tablet hologram army technology

Strategic Insights |

The Agentic Database and Military Command: A Perspective on Autonomous C2 Systems

Elise Annett and Dr. James Giordano

The shift from passive databases to “active reasoning engines” in commercial agentic AI signals a fundamental transformation in how decisions are made, authority is exercised, and accountability is maintained.

LEARN MORE →


Soldier interacting with futuristic interface

Strategic Insights |

Autonomous Artificial Intelligence in Armed Conflict: Toward a Model of Strategic Integration, Ethical Authority, and Operational Constraint

Elise Annett and Dr. James Giordano

Artificially intelligent systems are being developed to have iteratively autonomous function, and these systems are increasingly being considered for use in military settings, weapon platforms, and operations.

LEARN MORE →


Image of a pile of microplastic chips.

Strategic Insights |

Tiny Particles, Big Stakes: The Strategic Implications of Micro‑ and Nanoplastics

Dr. James Giordano and Dr. Ashok Vaseashta

During World War II, plastic production was ramped up to meet demands from the defense industry. In the post-war consumer culture, using technological innovations and advanced synthesis methods to create and manipulate isomers, synthetic polymers became an integral part of our daily existence. Since then, global plastic production has increased exponentially, and current production is over 502.5 million tons (MT) worldwide. At this trajectory and barring any binding treaty to limit plastic production, the number is on track to more than double by 2050.

LEARN MORE →


Digital illustration of a human head profile, overlaid on a digital background of electronic circuits, symbolizing artificial intelligence and the fusion of technology with the human mind.

Strategic Insights |

Moving at WARP Speed Toward Developing the Cyborg Soldier

Dr. James Giordano and Dr. Diane DiEuliis

There is an adage that the fruits of scientific achievement applicable to real-world settings tend to blossom with the fertilization of time and trends.

LEARN MORE →


Two figures. Figure 1 (left): All Military Leader Engagements with Africa.  Figure 2 (right): CMC Vice Chair travel to Africa.

Strategic Insights |

China’s Military Diplomacy in Africa

Matt Kuhlman, Raina Nelson, and Phillip C. Saunders

This article shows another application for regional researchers, analysts, and policymakers. Specifically, it uses the database to explore some specific aspects of the People’s Liberation Army’s (PLA’s) evolving engagement in Africa.

LEARN MORE →


Chinese Defense Minister Chang Wanquan meets with Vietnamese Minister of Defense Ngo Xuan Lich in Beijing, January 13, 2017
(Liu Fang/Xinhua/Alamy Live News)

Strategic Insights |

Visualizing China’s Military Diplomacy

Raina Nelson, Matt Kuhlman, and Phillip Saunders

The National Defense University (NDU) recently released a major update to its comprehensive, publicly available database tracking the People’s Liberation Army’s (PLA) international military-diplomatic engagements from 2002 to 2024.

LEARN MORE →


Biohazard symbol

Strategic Insights |

Bold New Bioweapons: Part 2 — Bold Bolstering of Deterrence and Defense

Dr. James Giordano

Last week’s Strategic Insights addressed how biotechnology has emerged as a foundational and formidable element in the evolving character of warfare. The integrative convergence of big data analytics, artificial intelligence (AI), and advanced bioengineering and manufacturing has created rapidly expanding dual-use capabilities that can be leveraged in both non-kinetic and kinetic engagements.

LEARN MORE →


Biohazard symbol

Strategic Insights |

Bold New Bioweapons: Part 1 — The Burdens of Detection and Attribution

Dr. James Giordano

It has been more than fifty years since the ratification of the Biological Weapons Convention (BWC) in 1972, which sought to provide a formalized venue for international control and prohibition of development, production, and stockpiling of biological and toxin weapons.

LEARN MORE →


A circuit board contains multiple examples of important microelectronics innovation. The Defense Department's microelectronics commons aims to close gaps in America's ability to bring new microelectronics technology to market.

Strategic Insights |

Major Concerns About Microelectronics

Elise Annett, Steven Hanson, and Dr. James Giordano

Artificial Intelligence (AI) is decisively shaping the future of warfare. It accelerates decision cycles, extends operational reach, and enables control of the informational and cognitive dimensions of engagement.

LEARN MORE →


Cover image of the article

Strategic Insights |

Strategic Innovation in the DoD FY 2026 RDTE Budget: Leveraging Disruptive Technologies for Deterrence, Defense, and Command and Control

Dr. James Giordano

The Department of Defense FY 2026 Research, Development, Test and Evaluation (RDTE) budget request marks a strategic inflection that reflects a doctrinal shift toward convergent disruptive technologies, and with it, a re-posturing of how deterrence, defense and decisive command will be engaged on the near-future battlefield.

LEARN MORE →


Eye watching over the earth from space

Strategic Insights |

The Orb’s Eye: Seeing the National Security Implications of Iris Based ‘Proof of Humanity’

Elise Annett, James Keagle, James Giordano

As recently reported in the cover story of Time magazine, the launch of The Orb — a beach‑ball‑sized biometric device developed by Tools for Humanity (co‑founded by Sam Altman) — marks a paradigmatic shift in digital identity and biosecurity technology and its implications.

LEARN MORE →


Magnified glass globe

Strategic Insights |

Brain Scanning: Assessing Emigration of U.S. Scientific Talent to Surveille Strategic Implications for China’s Dual-Use Technological Capabilities

Dr. Diane DiEuliis and Dr. James Giordano

Intensifying global competition in science and technology (S/T), particularly in fields with considerable disruptive potential, such as artificial intelligence (AI), quantum computing, synthetic biology, and neurotechnology, has become a defining feature of 21st-century geopolitical dynamics.

LEARN MORE →


Illustration of a human head and brain, set against a futuristic blue digital background representing neural activity and data flow.

Strategic Insights |

The “Ins” and “Outs” of Cognitive Warfare: What’s the Next Move?

Elise Annett and Dr. James Giordano

INSS has relaunched Strategic Insights. Read the latest post by Elise Annett and Dr. James Giordano.

LEARN MORE →



News | Sept. 24, 2025

Beyond Mechanistic Control: Causal Decision Processing in Neuromorphic Military Artificial Intelligence

By Dr. James Giordano | Strategic Insights

The Next Step in AI: From Simple Cadence to Causal Processing

Recently, a paper by Kevin Mitchell and Henry Potter in the European Journal of Neuroscience provided a valuable overview of current understanding of causation in neurocognitive processing, which I believe has interesting implications for military applications of neuromorphically-based artificial intelligence (AI) systems. As we transition from traditional mechanistic AI architectures to those that are designed and developed to more closely mirror the complex causal dynamics of neural systems, military stakeholders, shareholders, and oversight organizations must confront new paradigms of autonomous decision-making that can challenge conventional understandings of predictability, command control, and accountability in AI.

To date, military AI systems have operated upon relatively linear mechanistic principles; in other words, input A leads to process B, which can generate output C; output C can affect process D to incur output E; and so on. This deterministic framework has proven highly valuable for specific tactical applications where clear stimulus-response patterns are sufficient. However, with the development of increasingly sophisticated, iteratively autonomous systems capable of operating in complex dynamic environments, the limitations of purely mechanistic approaches become evident. Modern battlespaces and engagements are characterized by ambiguity, rapid contextual shifts, and the need for more nuanced interpretation of incomplete information. These are all challenges that mirror, or in some cases are identical to, those engaged by biological neural networks.
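The deterministic chain described above can be sketched in a few lines of purely illustrative Python; every function name, threshold, and label here is a hypothetical stand-in, not any fielded system:

```python
# Illustrative mechanistic pipeline: fixed, deterministic mappings
# (input A -> process B -> output C -> process D -> output E).
def process_b(input_a: float) -> float:
    """Fixed transform: scale the raw sensor signal (B generates C)."""
    return input_a * 2.0

def process_d(output_c: float) -> str:
    """Fixed classification with a hard-coded threshold (D yields E)."""
    return "alert" if output_c > 1.0 else "ignore"

def mechanistic_pipeline(input_a: float) -> str:
    return process_d(process_b(input_a))

# The same input always produces the same output; no context enters.
assert mechanistic_pipeline(0.8) == "alert"
assert mechanistic_pipeline(0.3) == "ignore"
```

The brittleness is visible in the structure itself: nothing outside the single numeric input can influence the outcome, which is exactly the limitation the following sections address.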

The fundamental weakness of the mechanistic paradigm of AI processing is its inability to account for contextual sensitivity or to exercise the adaptive flexibility characteristic of, and necessary for, effective military decision-making. So, while mechanistic AI may excel at pattern recognition in controlled scenarios, it can fail (perhaps catastrophically) when confronted with novel situations that require interpretation beyond the parameters of initially programmed processes of stimulus recognition and response. This inflexibility becomes particularly problematic when considering the potentially lethal consequences inherent to certain military applications.

Criterial Causation and Autonomous Military AI

The concept of criterial causation, as originally described by Peter Ulric Tse, offers a more sophisticated framework for understanding how neuromorphically-designed and -structured AI systems might operate in military contexts. Rather than being triggered by simple stimulus-response mechanisms, these systems would evaluate multiple, often distinct and divergent, converging criteria before initiating both decisional processes and actions. In neuromorphic military AI, the equivalent of synaptic weights, thresholds, and contextual factors would determine whether incoming data (i.e., inputs) meet the criteria necessary and sufficient to evoke specific operational decisions and action (i.e., output) responses.

For example, consider an autonomous, unmanned vehicular system tasked with identifying and engaging a target. A mechanistic system might rely upon predetermined optical signals/cues to identify target features and associate these with recognized risks and threats. In contrast, a neuromorphic system operating on criterially causative principles would integrate multiple streams of visual data, communication intercepts, behavioral patterns, environmental contexts, and mission parameters to establish dynamic criteria that must be evaluated and satisfied before a decision is initiated and an output action is authorized. The system's relative weighting of distinct types and amounts of information would be continuously adjusted based upon accumulated experience and changing operational conditions (i.e., it would act as a dynamical Bayesian system).
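A minimal sketch of such criterially causative evaluation, under stated assumptions: the criterion names, weights, and engagement threshold below are invented for illustration, and the weight-update rule is a crude stand-in for genuine Bayesian inference:

```python
# Hypothetical sketch of criterial causation: action is authorized only when
# weighted evidence from several streams crosses a threshold, and the weights
# themselves adapt with experience. All names and values are illustrative.
ENGAGE_THRESHOLD = 0.75  # weighted-evidence level required to authorize action

def criterial_decision(criteria: dict[str, float],
                       weights: dict[str, float]) -> bool:
    """Integrate weighted evidence rather than firing on a single trigger."""
    score = sum(weights[k] * criteria[k] for k in criteria)
    return score >= ENGAGE_THRESHOLD

def update_weights(weights: dict[str, float], criteria: dict[str, float],
                   outcome_correct: bool, lr: float = 0.1) -> dict[str, float]:
    """Crude experience-driven reweighting (a stand-in for the dynamical
    Bayesian updating described above): criteria that supported a correct
    decision gain weight; after a mistake they lose it."""
    sign = 1.0 if outcome_correct else -1.0
    raw = {k: max(0.0, w + sign * lr * criteria[k]) for k, w in weights.items()}
    total = sum(raw.values()) or 1.0
    return {k: w / total for k, w in raw.items()}  # renormalize to sum to 1

weights = {"visual": 0.4, "signals": 0.3, "behavior": 0.2, "mission_fit": 0.1}
strong = {"visual": 0.9, "signals": 0.8, "behavior": 0.7, "mission_fit": 1.0}
weak = {"visual": 0.2, "signals": 0.1, "behavior": 0.1, "mission_fit": 0.2}

assert criterial_decision(strong, weights)      # converging evidence: engage
assert not criterial_decision(weak, weights)    # sparse evidence: hold
weights = update_weights(weights, strong, outcome_correct=True)
```

Note that because the weights drift with experience, two inspections of the same system at different times can yield different decision surfaces, which is precisely the oversight difficulty raised below.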

This approach offers considerable advantages in terms of operational flexibility and reduced false-positive rates and outcomes. However, it also introduces new challenges for military oversight. Traditional command and control structures tend to assume predictable, attributable decision pathways. In contrast, criterially causal systems may make tactically sound decisions through processes that can be difficult to reconstruct or predict, given the diversity, expanse, and differential weighting of various types, levels, and amounts of data, thereby complicating both real-time oversight and post-action accountability.

Historical Causation and Military Neuromorphic AI Systems

The integration of historical causation into neuromorphically-based military AI represents perhaps the most transformational aspect of these emerging technologies. By incorporating historical causation, such a system would continuously modify its operational parameters based upon accumulated experience, training data, and environmental/situational contingencies and exigencies. The temporal dimension of this type of causation has important implications for military effectiveness and oversight: an AI system that learns from past engagements, adapts to adversarial countermeasures, and develops new tactical approaches could provide unprecedented battlefield advantages. Yet this also presents challenges for command structures that are designed around predictably controllable assets.

Historical causation processing suggests that neuromorphically-based AI’s current decision-making capacity is inextricably linked to its developmental history, training data, previous operational experience, and environmental forces and evolutionary pressures encountered during various deployments. This creates a type of “institutional memory” within the system that may be difficult to audit, modify, or reset. Thus, it will be important to consider how these systems can be managed given their current capabilities and limitations, which are products of complex historical trajectories rather than direct results of explicit programming.
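The "institutional memory" problem can be made concrete with a toy example; the sensor model, thresholds, and feedback rule below are hypothetical illustrations, not any real system:

```python
# Hypothetical sketch of historical causation: two copies of the same system,
# exposed to different operational histories, end up with different decision
# parameters -- behavior that code inspection alone cannot reveal.
class AdaptiveSensor:
    def __init__(self, threshold: float = 0.5):
        self.threshold = threshold
        self.history: list[tuple[float, bool]] = []  # rudimentary audit trail

    def classify(self, signal: float) -> bool:
        decision = signal >= self.threshold
        self.history.append((signal, decision))
        return decision

    def feedback(self, false_alarm: bool, step: float = 0.05) -> None:
        # Each engagement nudges the threshold; current behavior is a product
        # of the whole trajectory, not of the initial programming.
        self.threshold += step if false_alarm else -step

a, b = AdaptiveSensor(), AdaptiveSensor()
for _ in range(4):
    a.feedback(false_alarm=True)    # a's history makes it cautious
    b.feedback(false_alarm=False)   # b's history makes it aggressive

# Identical code, divergent behavior: only the histories differ.
assert a.classify(0.6) != b.classify(0.6)
```

Auditing or "resetting" such a system means reasoning about its entire `history`, not just its source code, which is the managerial burden the paragraph above describes.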

Semantic Causation and Military Neuromorphic AI

Perhaps the most significant advancement in military neuromorphic AI is semantic causation, whereby AI could derive decision-making power not merely from pattern recognition, but from the relative meaning and value such patterns obtain and portend within various operational contexts. A neuromorphically-based military AI operating through semantic causation processes would interpret incoming data through learned association, contextual significance and adaptive value. This approach directly mimics how human intelligence analysts process information. 

For instance, similar intelligence data might evoke differing responses depending upon operational context, mission parameters, and accumulated expertise. An adversary's communication intercept that appears routine and benign in one context might signal imminent risk or threat in another, based upon semantic meaning derived from historical patterns and current situational realities and awareness. In military applications, semantic causation offers potential for more subtle, contextually appropriate responses to increasingly complex circumstances and scenarios. But the meaning an AI system assigns to incoming data may not align with human interpretation, which can lead to unexpected or inappropriate reactions. This interpretive variability may challenge traditional military doctrine, which emphasizes standardized responses and predictable outcomes.
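A schematic sketch of context-dependent interpretation follows; the observations, contexts, and response mappings are invented for illustration and stand in for what would, in a real system, be learned associations rather than a hand-written table:

```python
# Hypothetical sketch of semantic causation: identical input data is assigned
# different meaning -- and so evokes different responses -- depending on
# learned contextual associations. All labels here are illustrative.
LEARNED_ASSOCIATIONS = {
    # (observation, context) -> interpreted meaning
    ("routine_comms_burst", "peacetime_patrol"): "benign",
    ("routine_comms_burst", "pre_hostility_indicators"): "possible_attack_coordination",
}

def interpret(observation: str, context: str) -> str:
    """Meaning is a function of data AND context, not of data alone."""
    return LEARNED_ASSOCIATIONS.get((observation, context), "unknown")

def respond(meaning: str) -> str:
    """Map interpreted meaning to an action; unrecognized meanings escalate."""
    return {"benign": "log_only",
            "possible_attack_coordination": "alert_commander"}.get(
                meaning, "request_human_review")

# The same intercept, two contexts, two different actions:
assert respond(interpret("routine_comms_burst", "peacetime_patrol")) == "log_only"
assert respond(interpret("routine_comms_burst",
                         "pre_hostility_indicators")) == "alert_commander"
```

The divergence risk is visible in `LEARNED_ASSOCIATIONS`: if the system's learned table differs from the human analyst's, the same data yields different actions, which is the misalignment concern noted above.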

Implications for Military Oversight and Ethics

The transition from mechanistic to neuromorphically-based AI systems that exercise causal decision-making processes necessitates reconsideration of some fundamental aspects of military oversight. Traditional approaches to AI governance assume transparent, auditable decision processes and pathways that can be identified, verified, modified, and held accountable. Systems operating through criterial, historical, and semantic causation challenge each of these assumptions and oversight requirements.

Identification and verification of inherent AI decision-making processes of data assimilation, association, and valuation become more difficult given the complexity (and perhaps opacity) of criterially causal mechanisms. The dynamic nature of these systems means that verification at one point in time may not predict behavior under different conditions or after additional learning experiences. Military oversight must develop new methods for assessing system reliability and appropriateness that account for this inherent dynamism.

Modification of neuromorphic systems presents additional challenges in that expanded causal frameworks may require more multifaceted interventions that account for historical causation patterns and semantic meaning structures that have evolved over time and system experience. Military planners should consider what changes to a system are required, and how to implement these changes without disrupting or compromising the adaptively beneficial capabilities these systems provide.
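One possible oversight technique implied by this dynamism, sketched under assumptions (a fixed probe set and exact-match comparison; a real audit regime would use far richer behavioral metrics), is periodic re-verification of an adaptive system against a certified behavioral baseline:

```python
# Hypothetical sketch of behavioral drift detection: re-run a fixed,
# versioned probe set and compare current responses against the responses
# recorded at certification time.
PROBE_SET = [0.2, 0.4, 0.6, 0.8]  # illustrative probe inputs

def fingerprint(model) -> tuple:
    """Behavioral snapshot: the model's outputs on the fixed probe set."""
    return tuple(model(x) for x in PROBE_SET)

def drifted(baseline: tuple, current: tuple) -> bool:
    """Flag any divergence from the certified baseline for human review."""
    return baseline != current

# Certify a system whose effective decision threshold is 0.5 ...
certified = fingerprint(lambda x: x >= 0.5)
# ... then, after further "learning", the effective threshold has shifted:
assert drifted(certified, fingerprint(lambda x: x >= 0.7))
```

The design choice here is that verification becomes a recurring process tied to observed behavior rather than a one-time inspection of code, matching the argument that point-in-time verification cannot predict post-learning behavior.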

Recommendations for Military Implementation

The integration of neuromorphically-based AI systems that utilize expanded causal frameworks into military applications will require careful consideration of opportunities, benefits, burden(s) and risks. Toward such evaluation and to facilitate potential adoption and use of such systems, I recommend a phased approach that begins with low-risk applications where system autonomy can be gradually increased as understanding and oversight capabilities mature.

• First, initial deployment of neuromorphically-based AI systems that use causal decision processing should focus upon intelligence analysis and decision support roles where human oversight and engagement remains robust and final authority for output actions remains with human operators. 

• Second, as experience with these systems grows and oversight mechanisms evolve, gradual expansion of system autonomy in operational contexts becomes more feasible and should be modeled in particular settings and under discrete circumstances.

• Third, military institutions should solicit, develop and sustainably cultivate new expertise in neuromorphically-based AI systems and their functions, inclusive of personnel well-versed in applications of criterial, historical, and semantic causation principles.

• Fourth, military technical education, focused on mechanical and electronic systems, should be supplemented with training in complex adaptive systems’ architectures, functions, and the emergent behavior patterns they obtain, so as to fortify the complement of competent system developers, operators, and command personnel.

Conclusion

The concepts of expanded causal frameworks provide key insights for understanding and implementing neuromorphically-based AI systems that can be employed in military contexts. Without doubt, these systems offer considerable capabilities for adaptive, context-sensitive decision-making; yet they also challenge undergirding assumptions about predictability, control, and accountability that are essential to traditional military doctrine.

Therefore, I posit that success in integrating these technologies will require knowledge, appreciation, and engagement of technical advancement as well as modification of military thinking, training, and oversight mechanisms that will enable effective, efficient and ethically sound operational use of such systems. Simply put, I believe that the stakes are far too high for anything less than engaging a careful, systematic approach to understanding and managing these emerging capabilities. With the integration of iteratively autonomous AI systems in military contexts, I offer that lessons from the brain sciences about the complexity of causal decisional and action processes in neural systems should inform and guide prudent approaches to neuromorphically-based artificial ones.
 

Disclaimer

The views and opinions presented in this essay are those of the author, and do not necessarily reflect those of the Department of War, United States government, or the National Defense University.

 

Dr. James Giordano is Director of the Center for Disruptive Technology and Future Warfare of the Institute for National Strategic Studies at the National Defense University.

 

 

Notes

Kevin Mitchell and Henry Potter, European Journal of Neuroscience, https://doi.org/10.1111/ejn.70064

Tse, Peter Ulric (2013). The Neural Basis of Free Will. Cambridge, MA: MIT Press.