Some inventions, Abraham Lincoln once said, are of “peculiar value, on account of their great efficiency in facilitating all other inventions and discoveries.” Although many examples of disruptive ingenuity predate Lincoln’s 1860 speech, the list has grown rapidly since, and revolutions seem to be arriving more frequently. Consider electricity, aviation, antibiotics, vaccines, nuclear energy, space travel, the internet, and most recently, generative artificial intelligence. Like other groundbreaking innovations, many of these technologies can be put toward peaceful or malevolent ends, depending on the goals of those who use them.
Now it is the rise of biotechnology, a sector with its origins in the ability to sequence, synthesize, and edit the genes of organisms, that is joining the pantheon of advances that pose perplexing dual-use implications. Progress in the life sciences is creating both the potential for unprecedented benefits in medicine (and other areas) and new risks of harm from accidents, unforeseen consequences, or even deliberate abuse. There has never been a major technological disruption that countries have not sought to exploit for industrial or military superiority, or that terrorists and criminals have not turned to their own ends. The history of state bioweapons programs and bioterrorism suggests that these patterns will persist as the bioeconomy grows.
Despite the dramatic pace of discoveries in the life sciences, the regulatory systems established for other dual-use risk domains, such as chemical and nuclear research, remain far more mature than those overseeing the bioeconomy. This maturity reflects fears stemming from the last century’s wars, including the chemical trench warfare of World War I and the nuclear bombings of World War II, as well as the health and environmental consequences of accidents in the nuclear and industrial sectors, such as Chernobyl and the Deepwater Horizon oil well disaster.
The oversight of scientific research exists at international, national, state, and institutional levels and within multiple agencies and jurisdictions. Regulations, guidelines, and policies on the safety and security of scientific research have been codified in many legal instruments. This largely outdated mode of biosecurity governance begins at the level of global diplomacy.
The 1972 Biological Weapons Convention, now signed by 183 nations, was designed to prohibit the development, production, acquisition, transfer, and stockpiling of biological and toxin weapons. But the types of hazards identified in the convention have not kept pace with advances in molecular biology, genetics, and synthetic biology. Within the United States and elsewhere, these powerful technologies, now increasingly combined with progress in engineering, autonomous robotics, and advanced computing, substantially complicate the task of building comprehensive regulatory and legislative oversight frameworks. Such frameworks must look beyond simply controlling pathogen research if they are to limit risks without constraining growth in the global bioeconomy.
Developing a well-balanced oversight system will not be easy. Nonetheless, the expanding gaps in national and international governance of dual-use biotechnology dictate that this subject be a core component of national security policies.
The evolution of biological dual-use risk oversight.

Building on experiments in the 1950s that established DNA as the genetic code of life, the science of manipulating genomes, modifying genetic control mechanisms, and creating novel biological functions and organisms not seen in nature progressed rapidly. But with these advances came increasing public concern over genetic research.
Paul Berg, a biochemist, played a role in precipitating the controversy over recombinant (hybrid) DNA with his work in the 1970s introducing bacterial genes into a virus known to cause tumors in rodents. Although Berg had planned to introduce the modified viral DNA into bacterial cells, concern over whether infected cells could escape and cause human cancers ultimately led him to pause the work. Berg and other scientists organized the Asilomar Conference on recombinant DNA in 1975 with the goal of assuaging public fears over the new technology and demonstrating the scientific community’s capacity to police itself. The now-famous conference led to guidelines for government-sponsored genetic engineering research but no onerous new rules, establishing a light-touch approach to oversight in the life sciences that has largely endured.