A common phrase in military parlance is “we are always fighting the last war”. In the face of the global response to the current crisis caused by SARS-CoV-2, the virus responsible for COVID-19, the proverbial “last war” was the influenza pandemic of 1918, which killed an estimated 50 million people worldwide, including 675,000 in the United States (US). Although history provides invaluable lessons, it is vital to re-contextualize the issues, problems, and solutions of the past in light of the circumstances, capabilities, and complexities of the present. This is particularly important when considering the technical tools currently available for pandemic response, information acquisition, and issues of governmental transparency and communication in the face of public fear.
But in this light, it is also important to heed another military phrase: “train as you fight, and fight as you train”. Using the tools necessary for effective and efficient response to biosecurity threats requires the development of strategic goals during “peacetime”, and flexible tactics that enable situational adaptability to achieve those objectives during a real-time event. Operationally, this entails methods of employing technology against a backdrop of reliable data, so as to provide the greatest confidence in operational decision-making. Likewise, the social methods employed should be reflective of, and responsive to, the socio-cultural needs, values, and attitudes of the present, not reliant upon the social constructs and norms of 100 years ago.
This is particularly true of emerging and modern biotechnologies. Since the advent of gene editors like CRISPR and other biotechnological tools, there has been high concern about their potential misuse [1], or diversion to the creation of bioweapons. This has prompted a plethora of studies in academic and policy arenas over the past several years. Of particular note in these studies [2, 3] are the ‘democratization’ of biotechnology (its relative ease of use by non-experts), its convergence with other technologies [4] like artificial intelligence (AI) and 3D printing, and the fact that development of these technologies is primarily driven by the private sector, where government has less influence on their trajectories. While not highlighted in explicit detail, all of these studies note that the benefits frequently outweigh the risks, and that many of these remarkable advances will be the exact tools needed to combat any pandemic worst-case scenario.
The COVID-19 pandemic has generated such a worst-case scenario: the very real dilemma of ensuring public safety and health, and saving lives, while simultaneously attempting to stabilize and sustain the economy. More extensive testing; rapid progression of traditional investigational treatments and vaccines through translational pipelines; and surveillance, quarantine, and curfews are all critical tools for managing the pandemic at present. Yet mandatory testing and large-scale surveillance and social control have been perceived by some as severe. This has fostered debate about the relative effectiveness of authoritarian imposition upon, and restriction of, individual liberties and civil rights. We believe it is unnecessary to take pandemic preparedness and response to such extremes; what is required is a prudent application of today’s more effective biotechnologies and other tools to pandemic readiness and response [5]. Thus, it is important to ask whether the demonstrated benefits of currently available advanced technologies are being fully utilized to maximize ethically sound, effective, and efficient approaches to reducing the pandemic burden and its effects.
We would argue that the answer is an unfortunate ‘no’. We have stated elsewhere, and reiterate here, that the circumstances justify employing viable new tools and techniques that have the most validity, and therefore potentially the greatest value, in achieving defined, desired outcomes. If there was ever a time to put such tools to use, it is now. The variables that contribute to the threat domain of the burden-to-benefit calculus could be considered, and leveraged accordingly, when assessing the risks and harms of use (i.e., harms of commission) versus non-use (i.e., harms of omission) of emerging technologies.
A risk assessment and mitigation paradigm to elucidate both idiosyncratic and systemic risks of new technologies has been previously proposed [6]. Such an analytical construct, coupled with methods of gap analysis, would provide insight into which aspects of biosecurity preparedness, response, and sustainment are most in need of technological support, and could facilitate decisions about which technologies are best suited and sufficiently developed to support future responses. This effort should not be limited to the governmental, academic, or commercial sector, but should conjoin resources in a coordinated, whole-of-nation approach. Here, we examine what we believe to be key, observable vulnerabilities revealed by the COVID-19 outbreak, with recommendations for the use of emerging technologies that may fortify current and future biosecurity preparedness and response efforts.
Platforms for disease surveillance data need to be modernized
Testing provides the basis for understanding epidemiological characteristics and patterns (i.e., incidence, prevalence, recovery, and demography). A vital lesson to be learned from the current situation is that testing should have begun much earlier, and with greater fidelity in data acquisition, analysis, dissemination, and use. Reliance on a centralized federal diagnostic response originating out of the Centers for Disease Control and Prevention (CDC) will have to be reconsidered in light of systemic missteps in rolling out nationwide testing capabilities [7]. As we have noted, the private sector is driving biotechnological innovation, and as of this writing, the Food and Drug Administration (FDA) has granted Emergency Use Authorizations to dozens of diagnostic and serological tests [8], many created by small synthetic biology companies. However, it is only now that preparations are underway to employ next-generation sequencing (NGS), serum testing, or the assessment of patient genomic data for both vulnerability and resilience to the virus. Serological tests, in particular, are important for revealing those individuals (and groups) who may have been exposed to the virus but do not express signs or symptoms. The technologies for these tests are mature, but they are uncoordinated and unlinked to the more capable scale-up pathways found in the private sector. The result is that crucial data remain absent, yet are needed to inform current and future decision-making regarding the extent of physical distancing and the implications for socio-economic revitalization. Specifically, it may be that those who were exposed and no longer shed transmissible virus could safely operate in an infectious environment; this is particularly important for the readiness of US military forces [9].
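The decision-making value of serological data depends heavily on how results are interpreted at a given level of population exposure. A brief illustrative sketch using Bayes’ rule (with hypothetical sensitivity, specificity, and prevalence values, not figures for any actual assay) shows why even a good test produces many false positives when prevalence is low, and hence why data fidelity and coordination matter:

```python
# Positive predictive value of a serological test via Bayes' rule.
# All parameter values below are hypothetical, for illustration only.

def positive_predictive_value(sensitivity, specificity, prevalence):
    """Return P(truly exposed | positive test result)."""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# A hypothetical 95%-sensitive, 95%-specific assay at 2% population prevalence:
ppv = positive_predictive_value(0.95, 0.95, 0.02)
# ppv ≈ 0.279: fewer than a third of positive results reflect true exposure.
```

At 50% prevalence the same assay’s positive predictive value rises to 0.95, which is why aggregate testing data, not test accuracy alone, must drive decisions about who can safely return to an infectious environment.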
Similarly, NGS testing could reveal genetic factors that render certain individuals especially vulnerable to the virus, its manifest multi-organ system effects, and/or secondary, opportunistic bacterial infection. At present, such experiments are being performed in cells and animal models [10], but these models are slower and less facile than NGS; China [11] and other countries have the capability to utilize such tools, and indeed employed them at the heart of the original outbreak. These techniques and tools are being applied to the US pandemic response “on the fly”, in large part because our preparedness surveillance platforms have not been oriented to these novel capabilities so as to create reliable baselines [12] from which to mount an adequate and sufficient response.
It has previously been proposed that early recognition of infectious disease threats like COVID-19 requires a globally distributed array of interoperable hardware and software, fielded in sufficient numbers to create a network of linked collection nodes. We argue that achieving this bio-surveillance capability will require a degree of cooperation that does not currently exist, either across the US federal government or among our global partners. Successful fielding of a family of interoperable technologies will necessitate interagency research, development, and purchase (“acquisition”) of bio-surveillance systems through collaborative ventures that will likely involve our strategic allies and public-private partnerships [13].
Advanced computational tools should be utilized regularly
Artificial intelligence (AI), machine learning (ML), and decision technologies should be employed to model the potential spread of the virus, as well as to identify, design, and develop therapeutic, if not preventative, interventions. AI is currently being used [14] to speed the selection of potential drug candidates. However, this work began only two months into the outbreak, and was not engaged as part of a preparedness protocol against the possibility that any of the presently most concerning pathogens would become pandemic. Such preparedness engagement of AI and related data and decisional technologies would have advanced drug identification and selection processes ahead of the curve, rather than requiring a “jump start” amidst a rapidly evolving pandemic crisis.
Additionally, models of disease spread could be developed beyond those traditionally used for epidemiological forecasting. Current decision-making is driven by the geographic distribution and demographics of confirmed case counts, disease severity, and fatalities. While necessary, such metrics are insufficient to account for complex public behavior, economics, or features of the virus learned in real time. For example, the decision to shelter in place across the entire US did not have the requisite data and adaptability to account for the virus’ most vulnerable populations and where in the US they most commonly reside [15]; nor could such modeling afford rigorous assessment and prediction of the emergence of the next infectious hotspots. In light of this, the ongoing debate about whether to sustain social distancing and isolation or to re-start commercial enterprise and re-engage the economy is fraught with inadequate data and models upon which to base decisions that will significantly affect public health, social life, and community, national, and international economics.
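The limits of conventional forecasting can be seen even in the standard compartmental models that underpin much epidemiological decision-making. A minimal sketch of a discrete-time SIR model, with purely illustrative parameters (not fitted to COVID-19 data), shows how few inputs such models take, and hence why they cannot, on their own, capture public behavior, economics, or viral features learned in real time:

```python
# Minimal discrete-time SIR (Susceptible-Infected-Recovered) model.
# Parameter values are illustrative only, not fitted to COVID-19 data.

def sir(population, infected0, beta, gamma, days):
    """Simulate S, I, R compartments one day at a time.

    beta:  transmission rate (contacts x infection probability per day)
    gamma: recovery rate (1 / infectious period in days)
    """
    s, i, r = population - infected0, float(infected0), 0.0
    history = [(s, i, r)]
    for _ in range(days):
        new_infections = beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

# Example: R0 = beta/gamma = 2.5 with a 10-day infectious period.
trajectory = sir(population=1_000_000, infected0=10, beta=0.25, gamma=0.1, days=180)
peak_infected = max(i for _, i, _ in trajectory)
```

Note that the model’s only inputs are a transmission rate and a recovery rate; it has no terms for behavior change, policy interventions, or population heterogeneity, which is precisely the gap described above.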
The ability to respond efficiently and effectively to biosecurity threats is further hampered by a failure to share data, and by a lack of governmental and non-governmental resource interoperability that would enable more timely decisions and actions. At present, biosecurity and defense systems and infrastructures fail to share data, are incapable of cooperative and complementary leveraging for prediction, readiness, and response, and have become disconnected from, and unengaged with, international allies and partners.
Supply chains require technology modernization and fidelity
The impact of COVID-19 upon global trade has clearly illuminated vulnerabilities in US supply chains, particularly a dependence on foreign companies. The commercial sector’s embrace of just-in-time manufacturing to minimize large standing inventories has been quite effective. Unfortunately, this principle did not anticipate the paralysis of global supply chains, leading to shortfalls in critical materials necessary to sustain the fight against the pandemic. Moving forward, innovations in US manufacturing could form the basis for a more permanent transition to less brittle supply chains that are not as susceptible to single points of failure abroad. Many companies in the private sector have demonstrated [16], on their own initiative, that they can rapidly and flexibly steer biotechnology platforms toward the production and scaling of medical molecules, assisted by computational tools. A similar scenario is occurring with diagnostic tests. Further, there has been an uptick in shifting manufacturing platforms to meet the need for medical products (e.g., from parachutes to medical masks, or from clothing to medical gowns), and in the creative use of additive manufacturing to produce necessary personal protective equipment and parts for medical devices such as ventilators [17]. Moreover, companies have created innovative ways to re-sterilize [18] and re-use such medical supplies.
The defense logistical expertise leveraged during the outbreak should also be considered for more routine use. The COVID-19 crisis has necessitated more flexible and expedient supply lines that require rigorous inspection of foreign imports for fidelity and protection from fraud. Measures of this sort fortify the reliability of products and resources obtained from non-domestic suppliers and, in so doing, sustain the integrity and functionality of US supply chain dependencies. We believe that such measures should become standard operating procedure for US supply chain requirements. These modifications to governmental coordination of the scope and services of commercial-sector engagements should serve as lessons learned for the adoption of the whole-of-nation readiness and responsiveness that we strongly advocate [19].
Digital health is a useful tool—when used judiciously
COVID-19 is the first pandemic to occur in the digital age. While the US had been iteratively developing telehealth, telemedicine, and “wearable” health monitoring devices, these technologies, and the protocols for their use, were not ready to enable large-scale deployment and data gathering for real-time national pandemic modeling, or to function in the US’ first large-scale effort to monitor social distancing. These tools could provide means to monitor baselines of individual, group, community, and national health status that would be important for tracking disease spread, vulnerability, and recovery, and in this way develop epidemiological models and forecasts in real time.
The use of digital health technologies is not restricted to assessments of clinical signs and symptoms; it can also be used to correlate other patient and group data (such as genomics, genetics, and existing medical conditions) in ways that may be important in determining vulnerability to disease and susceptibility to more severe infection. This would represent a significant improvement upon current modeling methods, which tend to suffer from a relative paucity of observable data (such as emergency room visits and “syndromic” reports of flu-like symptoms in health care settings), which can be misrepresentative and incur delays in treating those patients most likely to progress to serious or critical disease. Taken together with more readily available and accessible polymerase chain reaction (PCR) and serological testing as described above, these digital health methods could provide information essential to determinations of continued social distancing, relative immunity, and the re-initiation of commerce, the economy, military force readiness, and public safety [20].
Of course, any such approaches to enabling and providing individuals’ readily accessible, real-time medical data generate concerns about personal privacy, socioeconomic bias in healthcare, and ethical, legal, and social issues (in both civilian and military settings). There are ongoing debates about whether such data should be maintained under governmental or commercial custodianship, and about the extent to which such information should be governed so as to ensure public health, safety, and national security while still affording personal protection under the scope of civil and constitutional rights.
While a complete treatment of such issues is beyond the scope of this essay, we have advocated the need for, and development of, legal measures such as a medical information non-discrimination act (MINA), based upon the Genetic Information Nondiscrimination Act (GINA) and consistent with similarly proposed policy for neurological data (i.e., a neurological information non-discrimination act, NINA [21, 22]). And while constitutional rights and civil liberties may not be threatened by current uses of such data [23], the expansion in both the capability and the use of digital approaches such as those we describe would suggest, if not dictate, the need for aligned policy and laws. But as we note elsewhere, these too are not without potential problems, and the current pandemic brings these issues, and the need to address them, into stark relief [4]. Overall, emerging capabilities regarding digital “health identities” or “health status” have the potential to be beneficial, when risks are explored and mitigated appropriately, before we are in the midst of a pandemic scenario.
Conclusions: It’s not 1918, and it needn’t be again moving forward
In sum, the current COVID-19 crisis should be regarded in the literal sense of the word: as a turning point, a time of change. We believe there are indeed lessons to be learned from the pandemic of a century ago that are common to the one we face at present. Just as healthcare systems and national and international systems of medical response were threatened, prompted to respond, and altered by the influenza pandemic of 1918, so too must the infrastructures and functions of today’s governmental, civilian, and commercial sectors respond and adapt.
However, today’s landscape of emerging biotechnologies, including their broadening availability, convergence with other tools and techniques, and localized strength within the private sector, offers features that could be adopted for benefit in current and future responses. The ability to use advanced diagnostic tests and tools, create more computationally complex and better-informed epidemiological models, innovate medical countermeasures and manufacturing platforms, and digitally engage the US citizenry in the response all fall within the attributes of the emerging biotechnology landscape.
Toward this end, we advocate the approaches described in this essay not as stand-alone tools and techniques, but as part of a larger networked biosecurity enterprise that cooperatively and collaboratively engages a whole-of-nation system of readiness, preparedness, and response for risk assessment and threat reduction, irrespective of whether future threats are of natural or man-made origin. The current pandemic gives us an opportunity to envision new tools, methods, and response policies that leverage emerging technologies which, if adopted and prudently employed, would enable us to far better predict, prepare for, if not prevent the “next” biosecurity war, rather than merely repeat the errors of the “last”.
[3] National Academies of Sciences, Engineering, and Medicine. Biodefense in the Age of Synthetic Biology. Washington, DC: The National Academies Press (2018). https://doi.org/10.17226/24890.
[6] Giordano J. Toward an operational neuroethical risk analysis and mitigation paradigm for emerging neuroscience and technology (neuroS/T). Exp Neurol 287(4): 492-495 (2017).
[13] Emanuel P, Jones F, Smith M, Huff W, Jaffe R, Roos J. The key to enabling biosurveillance is cooperative technology development. Biosecur Bioterror 9(4): 386-393 (2011). doi: 10.1089/bsp.2011.0020.
[19] DeFranco JP, DiEuliis D, Bremseth LR, Snow JJ, Giordano J. Emerging technologies for disruptive effects in non-kinetic engagements. HDIAC Currents 6(2): 49-54 (2019).
[21] Kostiuk S. After GINA, NINA? Vanderbilt Law Review 65(3): 933-977 (2012).
[23] Kraft C, Giordano J. Integrating brain science and law: Neuroscientific evidence and legal perspectives on protecting individual liberties. Front Neurosci 11: 1-10 (2017).
Dr. Diane DiEuliis is a Senior Research Fellow at the Center for the Study of Weapons of Mass Destruction at the National Defense University. Dr. Peter Emanuel is a Senior Research Scientist (ST) Bioengineering at the U.S. Army Combat Capabilities Development Command Chemical Biological Center. Dr. Alexander Titus is the Founder & CEO of Bioeconomy.xyz. Dr. James Giordano, MPhil, is Chief of the Neuroethics Studies Program; Co-director of the O’Neill-Pellegrino Program in Brain Science and Global Health Law and Policy in the Pellegrino Center for Clinical Bioethics; and Professor in the Departments of Neurology and Biochemistry at Georgetown University Medical Center.
The views expressed in this piece are the authors' own and do not reflect the official policy of NDU, the Department of Defense or the U.S. government.