NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

National Research Council (US) Committee on Trends in Science and Technology Relevant to the Biological Weapons Convention: An International Workshop. Life Sciences and Related Fields: Trends Relevant to the Biological Weapons Convention. Washington (DC): National Academies Press (US); 2011.


2 The Pace of Developments in the Life Sciences

As the range of presentations covered at the workshop illustrates, the meeting surveyed developments in the life sciences broadly. Although it was not able to cover all possible topics in depth, the committee sought to identify major themes and trends and then to consider ways in which these scientific developments might relate to the Biological and Toxin Weapons Convention (BWC). The committee’s discussions were guided by the three major trends identified in Chapter 1:

  • The pace of relevant advances in science and technology (S&T) and in related, enabling technologies;
  • The diffusion of S&T research and its applications; and
  • The breadth of fields now engaged in the “life sciences.”

This chapter examines the first of these trends.


2.1.1. Developments Since 2006

As the message from United Nations Secretary General Ban Ki-moon to the BWC States Parties in 2010 (see Chapter 1) illustrates, one of the important trends that potentially affects the future of the BWC is the rapid pace of advances in S&T. The 2010 workshop provided the international scientific community with an opportunity to review major developments in S&T since the 2006 meeting organized by IAP, the International Council for Science, and the Royal Society. Many of the subject areas discussed in 2010 echoed those highlighted in 2006, including the “omics” fields,1 synthetic biology, delivery technology, and vaccine and countermeasures development. The workshop reviewed the potential to apply areas of S&T not only to the creation or delivery of biological agents that could be employed as weapons, but also to prevention, defense, and response against the misuse of biological agents, and to the promotion of beneficial uses of biology. Progress continues to be made in many of the research areas discussed in 2006 and 2010. Examples of key developments in advancing areas of the life sciences are highlighted below. Particularly rapid developments have also occurred in enabling technologies; these are discussed in more detail in Section 2.2.

2.1.2. Genomics, Systems Biology, and Synthetic Biology


Since the draft sequence of the human genome was published in 2001 and the completed sequence announced in 2003 (HHS and DOE, 2003; International Human Genome Sequencing Consortium, 2004), the sequencing of additional human genomes has proceeded rapidly. A variety of large-scale collaborative genome sequencing initiatives have been undertaken, such as the international 1000 Genomes Project to catalogue human genetic variation as a resource for future biomedical research, which was mentioned at the workshop (The 1000 Genomes Project Consortium, 2010). A recent article on worldwide human genome sequencing efforts notes, “although far from comprehensive, the tally indicates that at least 2,700 human genomes will have been completed by the end of this month [October 2010], and that the total will rise to more than 30,000 by the end of 2011” (Nature, 2010). A significant proportion of this increased sequencing capacity is expected to come from China, where BGI (formerly the Beijing Genomics Institute) is now one of the world’s largest sequencing centers2 and reportedly predicted in 2010 that it would complete 10,000 to 20,000 human genomes by the end of 2011 (Nature, 2010). Beyond human genome sequencing, international collaborations are under way to sequence 1,000 plants and animals of economic and scientific importance (Fox and Kling, 2010) and to characterize the earth’s microbial communities from the soil, air, and water through the Earth Microbiome Project. The project, launched in 2010, plans to “analyze 200,000 samples from these communities using metagenomics, metatranscriptomics and amplicon sequencing to produce a global Gene Atlas describing protein space, environmental metabolic models for each biome, approximately 500,000 reconstructed microbial genomes, a global metabolic model, and a data-analysis portal for visualization of all information” (Earth Microbiome Project website; accessed June 1, 2011).3

As several workshop presenters explained, additional omics fields continue to advance steadily and build on the understanding gained through genomics, providing researchers with functional information to annotate the more static genomic data (de Villiers, 2010; Dhar, 2010; Pitt, 2010a,b). The field of systems biology seeks to integrate these multiple levels of biological knowledge into descriptive, and ultimately predictive, mathematical models, combining experimental knowledge with computational tools to study the interactions among the components that make up a particular biological system. A primary goal of systems biology is thus to understand how the system being studied functions, what properties arise from the interactions of its individual components (also referred to as emergent properties), and the design principles on which it operates (Bruggeman and Westerhoff, 2007; Ferrell, 2009).

The field of synthetic biology seeks to use the knowledge gained through these other biological disciplines in order to design new pathways4 having defined functions. Perhaps of all the S&T areas examined during the workshop, synthetic biology has received the greatest public and policy attention, both for its potential contributions to health, the economy, and the environment and for the security risks that misuse of its discoveries could pose.5 Given this, the committee addressed synthetic biology in the context of all three major trends it identified, and discussions of aspects of synthetic biology are found in Chapters 3 and 4 as well as here.

Synthetic biology has now resulted in the successful creation of individual components or elements that can be used as building blocks within a larger genetic network or pathway (Khalil and Collins, 2010; Purnick and Weiss, 2009),6 bringing ever closer the promise of practical applications based on synthetic biology principles. Examples of successful engineering of specific cellular pathways derived from existing genetic sequences have already been reported, notably the design of a terpenoid biosynthesis pathway in yeast to produce the plant-derived antimalarial drug precursor artemisinic acid (Ro et al., 2006). Terpenoids are a very large class of molecules with diverse functions, many of which may have potential pharmaceutical uses (statin drugs, for example, inhibit an enzyme in a terpenoid synthesis pathway resulting in decreased downstream production of cholesterol). Understanding and manipulating terpenoid pathways, the enzymes involved in those pathways, and pathway regulation also hold promise for the development of novel antimicrobial drugs (Muntendam et al., 2009).

In 2010, yet another milestone in synthetic biology was reported—the design and synthesis of a functioning bacterial genome and its insertion into a cell from which the natural genetic material had been removed (Gibson et al., 2010). This advance was notable because it represented the creation of a fully synthetic genome able to successfully direct the range of activities needed for the bacterial cell to survive, grow, and reproduce itself. It also represented progress along the pathway toward “synthetic life,” although the study itself did not create a fully synthetic organism from scratch (i.e., from a pool of chemical precursors to create not only the genetic information but also the cell membrane and necessary cellular machinery), an achievement that still remains out of reach.

Discussion and Implications

The combination of enabling tools, particularly high throughput measurement techniques (see Section 2.2), and the number of omics projects being undertaken generates vast amounts of biological data that must be analyzed and converted into information useful to systems and synthetic biologists. Based on the workshop discussions, the committee emphasizes, however, that the complexity of biological systems remains a significant obstacle to the construction of accurate mathematical models, even at the level of a single signaling pathway. For example, Dr. Andrew Pitt of the University of Glasgow in the United Kingdom7 noted at the workshop that solving a mathematical model of the epidermal growth factor receptor pathway requires equations for 322 components and the 211 reactions in which they are involved (Oda et al., 2005). As a result, truly rational systems design in biology remains a goal of the field rather than a current capability (Pitt, 2010a). As a recent review of developments in synthetic biology notes,

Whereas traditional engineering practices typically rely on the standardization of parts, the uncertain and intricate nature of biology makes standardization in the synthetic biology field difficult. Beyond typical circuit design issues, synthetic biologists must also account for cell death, crosstalk, mutations, intracellular, intercellular and extracellular conditions, noise and other biological phenomena. A further difficult task is to correctly match suitable components in a designed system. As the number of system components grows, it becomes increasingly difficult to coordinate component inputs and outputs to produce the overall desired behavior. (Purnick and Weiss, 2009)
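The scale of the modeling challenge described above can be illustrated with a toy example: even a minimal two-step signaling cascade requires one differential equation per component, which hints at why pathway maps with hundreds of components, such as the EGFR model Pitt cited, are difficult to parameterize and solve. The species and rate constants below are hypothetical and are not drawn from any model cited in this chapter.

```python
# Illustrative sketch only: a two-component signaling cascade modeled
# as ordinary differential equations, one equation per component.
from scipy.integrate import solve_ivp

def cascade(t, y, k1, k2, d1, d2):
    """Toy cascade: stimulus S activates component A; active A activates B.

    Each component's state is its active fraction (0 to 1); activation is
    balanced against a first-order deactivation term.
    """
    A, B = y
    S = 1.0                              # constant external stimulus (hypothetical)
    dA = k1 * S * (1 - A) - d1 * A       # activation of A minus deactivation
    dB = k2 * A * (1 - B) - d2 * B       # active A drives activation of B
    return [dA, dB]

# Integrate long enough to approach steady state (rate constants are invented).
sol = solve_ivp(cascade, (0, 50), [0.0, 0.0], args=(0.5, 0.3, 0.1, 0.1))
A_end, B_end = sol.y[:, -1]
print(round(A_end, 3), round(B_end, 3))
```

Scaling this pattern to the pathway Pitt described would mean 322 coupled state variables and rate terms for 211 reactions, each needing experimentally measured constants, which is why such models remain laborious to build and validate.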

Nevertheless, advances in omics, systems, and synthetic biology have potential implications for the BWC in several overarching areas. On a fundamental level, these fields continue to advance the understanding of biological systems—including human, animal, plant, and microbial physiology. These fields provide information on how systems function, on networks of interactions (for example, between receptors, ligands that bind to them, and resulting cascades of signaling molecules), and on points at which such systems might be modified or acted upon to cause specified biological effects. In addition to improving the understanding of existing systems, scientists are exploring how to control these systems in ways that are not currently possible and how to design completely new systems. The knowledge that results from these discoveries might eventually be used to explore new targets and mechanisms of action of biological agents, or new agents themselves, with implications both for protective and prophylactic purposes and for bioweapons. For example, understanding of immune pathways gained through systems biology approaches can be applied to the development of new vaccines (Oberg et al., 2011), while studies of drugs and their networks of interactions in the body can aid in the identification of new drug targets (Chua and Roth, 2011). Laboratories in synthetic biology are already working toward designing and synthesizing new microorganisms by manipulating metabolic and biosynthetic pathways, work that is being conducted for socially beneficial ends such as biofuel production (Alper and Stephanopoulos, 2009; Keasling, 2010). However, advances in synthetic biology may also enable the synthetic re-creation of known pathogens, the combination of sequences from several microorganisms to create new chimeric pathogens, or even the design and synthesis of novel pathogens (NRC, 2010b; Tucker and Zilinskas, 2006).8

2.1.3. Immunology

The workshop surveyed the state of life sciences research broadly and considered both whether S&T developments might have the potential to be misused and how advances in science could help provide solutions to BWC concerns. Developments in understanding the immune system have potential relevance to both of these themes.


Advances in molecular biology, high throughput techniques, and bioinformatics tools for data analysis are moving the field from empirical, trial-and-error design of vaccines and drugs toward rational design (Adams et al., 2011; Bagnoli et al., 2011; Bowick and Barrett, 2010; Connell, 2010; Plotkin, 2009). To accomplish this goal, scientists characterize the pathogens, their hosts,9 and systems of pathogen-host interactions that occur during infection and subsequent immune responses. For example, by comparing the genomic sequences of multiple strains of a pathogen, researchers may identify genetic alterations that correlate with greater or lesser virulence. In fact, increasing the virulence of a pathogen is a useful experimental approach to understanding pathogenic mechanisms (Shimono et al., 2003). Yet such manipulations of even mildly virulent organisms could lead to the creation of novel pathogens, which could lead some States Parties to question whether such work violates Article I. By using high throughput microarrays, scientists can also identify the patterns and changes of gene and protein expression that occur in the pathogen and the host. All of these techniques are directed toward determining the specific molecules and signaling pathways involved in host responses to a pathogen and the ways that pathogens disrupt effective host immune reactions in both plant and animal species,10 ultimately enabling scientists to move toward a systems-level understanding of the infection process. This expanded base of knowledge is used to identify proteins, nucleic acids, or attenuated pathogen strains for testing as vaccine candidates, to design vaccines and countermeasures that will stimulate aspects of the host immune response that are predicted to be effective in eliminating the pathogen, or to disrupt the mechanisms that a pathogen uses to bypass an effective host response.
The increased DNA sequencing and characterization of individual genomic data and the correlation of different genetic variations with different responses to a pathogen or to a vaccine are also moving the field toward “personalized vaccinology” (Connell, 2010).
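The comparative-genomics approach described above can be sketched in a few lines: given strains grouped by phenotype, tally the positions at which the two groups consistently differ, flagging them as candidate virulence-associated sites. The sequences and labels here are hypothetical toy data, not real pathogen genomes, and real analyses would use alignment tools and statistical tests rather than simple consensus comparison.

```python
# Toy sketch of comparing strain sequences to find candidate sites that
# correlate with a phenotype such as virulence. Data are invented.
high_virulence = ["ATGGCA", "ATGGCA", "ATGGTA"]   # hypothetical strain sequences
low_virulence = ["ATGACA", "ATGACA", "ATGACA"]

def consensus(seqs, pos):
    """Return the most common base among the strains at a given position."""
    bases = [s[pos] for s in seqs]
    return max(set(bases), key=bases.count)

length = len(high_virulence[0])
# Flag positions where the group consensus bases disagree.
candidate_sites = [
    pos for pos in range(length)
    if consensus(high_virulence, pos) != consensus(low_virulence, pos)
]
print(candidate_sites)  # → [3]
```

In this toy example only position 3 differs consistently between the groups; a within-group variant (position 4 in one high-virulence strain) is correctly ignored by the consensus comparison.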

Researchers developing vaccines and countermeasures are actively studying new expression and delivery systems (see Section 2.1.6), along with options to enable more rapid development and manufacturing (Bagnoli et al., 2011; Plotkin, 2009). One example mentioned during the workshop is the use of nonpathogenic latent viruses as transgene vaccine delivery systems (Connell, 2010). Such viruses result in an ongoing but nonsymptomatic and nondisease-causing infection and so can provide a more long-lived boost to the immune system through continued production of immunogens. For example, altered strains of Herpes Simplex Virus-1 (HSV-1) are being developed to deliver foreign antigens (i.e., immunogenic proteins for protection against infection by bacteria and non-Herpes viruses) (Manservigi et al., 2010; Marconi et al., 2009). An added advantage of this approach is that HSV-1–based vaccines are capable of eliciting a strong cellular immune response.11 DNA-based vaccines are another option, particularly when combined with adjuvants or as the first (prime) immunization in a two-pronged prime and boost strategy (Liu, 2011). The DNA that encodes pathogen proteins against which an immune response is desired can be delivered to cells using viruses or bacteria as vectors or using lipid or polymer-based nonviral particles, as discussed in Section 2.1.6. The immunoprotective proteins encoded by the DNA are subsequently produced within host cells and expressed as antigens on host cell surfaces, generating immune responses (Ledgerwood and Graham, 2009; Plotkin, 2009).

There is also significant interest in the development of new human and veterinary adjuvants, which work in conjunction with vaccines to boost immune responses (Heegaard et al., 2011; Reed et al., 2009). All adjuvants appear to act by stimulating components of the innate immune system, thereby affecting the outcome of adaptive immunity. Thus as more is learned about innate immunity, adjuvants can be designed in ways that direct the efficacy of a given vaccine toward a specific outcome. These studies will greatly enhance vaccine development in the future.

New vaccine platforms are another major focus of countermeasures research. Platforms are flexible systems of vectors (whether viruses, bacteria, or particles) that deliver genes for the pathogen-associated proteins against which immunity is desired, are adaptable so that genes of interest can be swapped in and out of the base platform system, and are optimized for rapid production (Drew, 2007; Ledgerwood and Graham, 2009). Finally, the global prevalence of antimicrobial resistance remains a significant and growing concern, including the spread of multidrug resistant strains, and new antibiotic and antiviral countermeasures are clearly needed. Although the introduction of high throughput screening has greatly reduced the cost and increased the efficiency of drug discovery and the search for new antibiotics, the time, regulatory hurdles, and costs of bringing new compounds into the clinic remain substantial (Hamad, 2010; IDSA, 2011; IOM, 2010).

Discussion and Implications

Advances in vaccine design and production, in particular those associated with rapid manufacturing methodologies, will have obvious benefits for global health and for preparedness for and response to the potential use of bioweapons or bioterrorism. Advances in understanding plant immune systems and plant defenses against infection similarly have relevance to the protection of crops against both natural disease outbreaks and the potential intentional introduction of pathogens. Article X of the BWC, which addresses cooperation in the prevention of disease, promotes the sharing of materials and knowledge in the development of infectious disease therapeutics. However, advanced understanding of the immune system has potential dual use implications because it could be misapplied to create pathogens with increased virulence or to decrease the effectiveness of a human, animal, or plant immune response. A concern has been raised, for example, that as synthetic biology continues to advance it could be used to design novel pathogens for these functions.

Effectively modulating and controlling the immune system, whether for beneficial or harmful purposes, remains a challenge because of the complexity of the immune system itself and of its interactions with other physiological systems such as the endocrine and nervous systems. Biological systems exist in an “exquisite balance” (Connell, 2010), and although scientific knowledge continues to expand, it is still not possible to predict with certainty the downstream effects of disrupting these biological control systems (Connell, 2010; Nixdorff, 2010). The well-known mousepox case study represents one example in which immune modification provoked unintentional negative effects: a virus engineered in the course of vaccine research proved lethal even to vaccinated animals (Jackson et al., 2001).12

Other significant challenges are associated with the development of new vaccines and countermeasures against infectious diseases. Sophisticated laboratory containment systems are required to safely handle certain pathogens, particularly ones of concern as potential bioweapons and as new emerging diseases. Developing and testing vaccines against these pathogens often requires the use of animal models because of ethical considerations that prevent experimental infection in humans and make conducting clinical trials problematic. In many cases, suitable animal models may not currently exist, or the specific types and levels of immune responses that correlate with protection in humans are not well known (Matheny et al., 2007; NRC, 2006b). There are also few significant commercial markets for vaccines, and this fact, coupled with the regulatory requirements necessary to develop a licensed product, results in low commercial interest. As a result, incentives and government and philanthropic investments have been used to drive the creation of new vaccines and medical countermeasures.

Many pathogens of concern as bioweapons and as emerging infectious diseases are zoonoses (e.g., Bacillus anthracis (anthrax), Yersinia pestis (plague), Rift Valley fever virus (Rift Valley fever), Coxiella burnetii (Q fever), Burkholderia mallei (glanders), equine encephalitis viruses (Eastern, Western, and Venezuelan equine encephalitis), Ebola virus (Ebola hemorrhagic fever), influenza viruses such as H5N1 (avian influenza), and others).13 This fact highlights the fundamental importance of cooperation among human, animal, and plant health research communities to support new medicine and vaccine development efforts and global disease surveillance; natural partners include the World Health Organization (WHO), the World Organisation for Animal Health (OIE), and the United Nations’ Food and Agriculture Organization (FAO). The creation of appropriate animal models to support the development and testing of new licensed human products against pathogens of concern is an obvious area for collaboration. The committee noted that contact already exists between the BWC, WHO, FAO, OIE, and other potential partners.14 Further descriptions of this engagement may be found in Chapter 3 as part of a broader discussion of international collaboration on public health.

2.1.4. Neuroscience

The ability to target and deliver substances to the brain and central nervous system brings great promise for the treatment of diseases such as brain cancer. The delivery of therapeutics to influence mood and cognition also plays a role in treating a range of neurological disorders, including depression, attention deficit disorder, and many others.


Neuroscience research is providing new insights into gene expression, variability, and phenotypic plasticity at the level of individual nervous system cells, knowledge that is helpful to understanding the functions of cells in the nervous system as well as exploring improved options for drug screening platforms (Eberwine, 2010). It is also helping scientists to better understand processes in disease development and pathology, for example in elucidating the role of genetics and molecular interactions in Alzheimer’s disease (Holtzman et al., 2011). Advances in delivery methods and formulations intersect with neuroscience research in, for example, developing improved therapeutics to cross the blood brain barrier (BBB).15 Finally, research continues to actively explore the brain-machine interface, which could have positive applications for the replacement of motor or sensory system functions lost due to injury and the creation of functional prosthetics. Signals captured from neurons in the brain can be processed computationally, for example, to allow a subject to move a cursor on a screen or to move a robotic hand (Leuthardt et al., 2009; Warwick, 2011). This area has received significant civilian and military attention, along with some overstatement of current levels of development. Commercial games using noninvasive methods to capture neural output (for example, by wearing a helmet that monitors brain electrical signals) have been on the market for several years (Li, 2010). Small numbers of patients have received initial prototypes of invasive or noninvasive neural interfaces, several companies are actively developing neural systems (e.g., BrainGate), and clinical trials are ongoing (e.g., the U.S. study “Microelectrode Brain-Machine Interface for Individuals with Tetraplegia”; accessed August 18, 2011). A variety of scientific and technical hurdles remain to be overcome, however, in creating more sophisticated and accurate medical devices (Lega et al., 2011).

Advances in the delivery of molecules to the brain also raise the possibility of delivering substances that could influence brain and body pathways as bioregulators and that could either enhance or degrade aspects of cognition, performance, and mood. Oxytocin, for example, is a nine-amino-acid peptide found naturally at high levels in women following childbirth and has been associated with a variety of effects including social behaviors, bonding, and the promotion of trust (Ebstein et al., 2010; Lee et al., 2009). Several recent studies on the intranasal administration of oxytocin in situations of group competition have found more complex effects, including promotion of “in-group trust and cooperation, and defensive, but not offensive, aggression toward competing out-groups” (De Dreu et al., 2010, 2011). The suggestion that oxytocin “enhances the cognitive availability of salient information in the social environment” (Chen et al., 2011) has been raised as an alternative explanation for the results, and further research may be needed to clarify the details of oxytocin’s effects. It has been suggested that there could be dual use military applications for oxytocin because of its trust-promoting properties (Dando, 2011; Nixdorff, 2010), which could perhaps play a role in reinforcing social cohesion and bonding within a military unit. Significantly, it has also been noted that experiments such as those delivering oxytocin demonstrate the theoretical feasibility of employing a bioregulatory molecule to produce changes to a subject’s mood or behavior (Dando, 2011). Understanding the complexity of a particular bioregulator’s effects and issues of dosing and delivery would remain as challenges to actual use (additional discussion of bioregulators may be found in Chapter 4).

A variety of advances in the understanding of human neuroscience could conceivably be used to enhance military performance (e.g., use of the antisleepiness drug modafinil to maintain alertness in pilots [Caldwell et al., 2004; Eliyahu et al., 2007]) or might be considered for law enforcement purposes (e.g., the development of neuroimaging techniques with the goal of detecting lying).16 International frameworks and conventions address appropriate uses of chemical and biological agents under treaties such as the BWC and CWC, under international human rights and humanitarian law, and in human subjects for medical research. The association of neuroscience with personality and with the integrity and dignity of a person seems to raise particular social and ethical issues that should be carefully considered. Science is still far from understanding many details of the brain, and the scientific community can contribute to discussions not only on what is possible (including offering a “reality check” of what is truly feasible, when warranted), but also on the potential implications of emerging neuroscience research.

Discussion and Implications

The social, ethical, and military implications of neuroscience research across many areas of research have garnered increasing attention in recent years (NRC, 2008, 2009d; Royal Society, 2011a). Developments in this field have the potential to raise complex issues about the types of applications that are feasible, ethical, and acceptable for military or law enforcement purposes in the context of international legal frameworks. In the context of the BWC, the improving systems-level understanding of the nervous system and its interactions with other physiological systems, methods that enable improved delivery of drugs and genes to the central nervous system, and the delivery of drugs or peptides to influence cognition or motivation are all areas of potential relevance should such advancing knowledge be used to cause harm. It appears that the science has not yet developed to a point where many of the potential applications of emerging neuroscience research are imminent, but progress continues to be made and interest in these areas is significant. The Beijing workshop, which surveyed developments in S&T broadly, did not allow potential neuroscience issues to be examined in detail. Additional scientific dialogues to examine topics in neuroscience in the context of social and policy issues are ongoing (for example, The Royal Society’s Brain Waves project), and this may be an interesting area for further monitoring as research progresses and developments move closer to applications.

2.1.5. Production Systems

Another area of active research in the life sciences is protein production, whether through the process of translation in transgenic organisms and cell culture systems, through the use of “cell-free” extracts, or by means of chemical synthesis. The increasing importance of biologics to the pharmaceutical and biotechnology industries is helping to drive this trend, and a variety of scientific and enabling technical developments are expanding efficient production options for proteins and peptides.17


Transgenic Organisms

As highlighted during the workshop, multiple options exist for protein production in transgenic organisms. Factors such as cost, required production yield and scale, the need for post-translational protein modifications,18 safety concerns related to potential contaminants, and regulatory requirements influence the selection of a production system (Ma, 2010; Slomski et al., 2010). In general, cell culture systems (both bacterial and mammalian) remain the most popular means of producing large quantities of a particular protein, and these systems are relatively straightforward to scale up in bioreactors, as discussed further below.

Therapeutic proteins are also produced in a variety of animal models, including rabbits, sheep, pigs, and goats. However, the creation of transgenic animals and the optimization of protein production in these systems generally require collaboration among teams of scientists and are both more expensive and more time-consuming than the creation of a cell culture–based protein expression system (Slomski et al., 2010). The silkworm, an insect-based system, also serves as a feasible model for protein production because fairly high expression levels can be achieved (Kato et al., 2010). Plants, which are already grown economically at very large scale for agriculture, offer another interesting option for the production of recombinant proteins, have already demonstrated proof of principle in a variety of systems, and may be coming closer to practical application (Ma, 2010; Rybicki, 2010).

The use of plant-based production systems has been explored for the creation of edible, lower-cost vaccines that would not require cold-chain transport; however, concerns about reproducible dosing and potential environmental escape of the transgenic crop remain. As a result, recent efforts have focused increasingly on transient protein expression in nonedible plant species, such as tobacco, using viral or bacterial infiltration to carry the genes encoding the desired proteins into the plant tissues (Rybicki, 2010). These systems can result in very fast production times—for example, virus-like particles in tobacco, made from H1N1 influenza hemagglutinin (HA) protein for testing in mice as an anti-influenza vaccine, have reportedly been produced in only 18 days from the starting HA DNA sequence (D’Aoust et al., 2008). The current system of influenza vaccine production in chicken eggs takes months. Although biotechnology and pharmaceutical companies have existing investments in cell culture–based production facilities and may be less likely to switch in the near term to plant-based systems for major drugs, it has been suggested that plant production systems could gain a foothold in the production of veterinary drugs and in the rapid production of vaccines against emerging pathogens, including against potential biothreat pathogens (Rybicki, 2010). The field is still developing, but as Julian Ma of the University of London noted during the workshop, the demonstrated ability to rapidly produce active therapeutic proteins and vaccines from plant systems may increasingly provide a “low-tech high-tech” option for economical, massive-scale production (Ma, 2010).


Another notable trend is the increasing sophistication of small, laboratory-scale benchtop bioreactors for cell culture production. These systems, which vary in their construction materials and in the design of components like stirrers and mixers, are used to culture bacterial, mammalian, and insect cell lines to express a desired protein, such as a monoclonal antibody therapeutic. Benchtop bioreactor systems generally hold several liters of cell culture (up to approximately 20 L), and typical protein yields may be milligrams to grams of protein per liter. Real-time sensors are increasingly being integrated into bioreactors to measure parameters that affect cell growth such as temperature, pH, and dissolved oxygen. The data gathered by these sensor systems are also being fed back to computerized control systems to increase process optimization and automation. There is a similar trend toward use of disposable bioreactors, such as wave-mixed culture bags, which decrease sterilization requirements when switching from one product or cell line to another and decrease equipment lead time (Bareither and Pollard, 2011).

Bioreactor process optimization remains an important step in culture-based protein production, and it can take several months or more to optimize production in a lab-scale system by adjusting environmental conditions, cell density, and concentrations of nutrients or enzymes that might be needed to produce the protein. Production is generally optimized in small-scale systems (micro reactors and bench-scale reactors) and gradually scaled up to pilot manufacturing and full manufacturing capacity in very large bioreactor tanks. However, scale-up of cell cultures is not always an easy process. Smaller scale bioreactors offer the ability to more rapidly optimize production conditions. It is also possible to subsequently operate multiple smaller scale bioreactors in parallel to produce desired protein quantities, assuming that very large scale manufacturing capacity is not needed, and the small size of micro- and bench-scale bioreactors would make them difficult to monitor or detect.
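The scale-up trade-off described above can be illustrated with simple arithmetic. A minimal sketch, with hypothetical reactor count, working volume, and volumetric yield drawn from the ranges mentioned in the text:

```python
# Illustrative arithmetic only: the reactor count, working volume, and
# volumetric yield below are hypothetical values chosen from the ranges
# mentioned in the text (benchtop reactors up to ~20 L; yields of
# milligrams to grams of protein per liter).

def total_protein_g(n_reactors: int, volume_l: float, yield_g_per_l: float) -> float:
    """Total protein (grams) per batch from n parallel reactors of a given
    working volume and volumetric yield."""
    return n_reactors * volume_l * yield_g_per_l

# Ten 20 L benchtop reactors at a 1 g/L yield match the output of a single
# 200 L run without any industrial-scale equipment:
print(total_protein_g(10, 20.0, 1.0))  # → 200.0 grams per batch
```

The point of the sketch is that parallel operation of small reactors substitutes linearly for tank volume, which is why laboratory-scale equipment can reach quantities once associated only with larger facilities.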

Cell-free Systems

Increasingly feasible options also exist for creating biological molecules like peptides and proteins in cell-free systems and through chemical synthesis. Cell-free systems rely on the principal biological machinery used to translate proteins (e.g., mRNA, ribosomes, amino acids, and tRNAs), but the steps are carried out in solution rather than inside a cell. These systems may facilitate subsequent protein purification and reduce potential contamination, and may also be advantageous if the protein being produced causes toxicity in the producing cell line at high concentrations. A desired protein can also be fully chemically synthesized, eliminating entirely the need for cell culture or transgenic expression and production. Furthermore, chemical synthetic systems offer the possibility of more easily incorporating unusual or non-natural amino acids or otherwise modifying the protein to include desired functional groups or other chemistry. Chemical synthesis of significant quantities of a product still remains limited to peptides rather than larger proteins, and the complexity of the chemistry needed to synthesize a particular peptide can vary widely, affecting time and cost (Thayer, 2011). However, a chemically synthesized peptide therapeutic, Fuzeon, has been on the market since 2003 and is produced in industrial-scale quantities. The scalability and purification of peptide synthesis have improved, and the market for synthesized peptides is expected to grow (Thayer, 2011).

Discussion and Implications

The pharmaceutical market for biological products is currently more than $100 billion a year and continues to grow (Bain and Shortmoor, 2010). Monoclonal antibodies are a significant component of this market, along with other protein and peptide drugs. The potential magnitude of these markets will continue to drive developments in protein production, although, as with all biological products, substantial investments of knowledge, time, and money in research, development, and clinical trials are required in order to develop a licensed therapeutic, regardless of the production method.

The creation and optimization of transgenic animal and plant models and the design of sophisticated chemical synthetic pathways require significant scientific expertise. Continuing developments in plant-based production systems, however, are expanding the options for rapid, economical, and large-scale protein production. These systems may turn out to be useful for the rapid production of vaccines against emerging pathogens or other disease agents of concern, although such systems could theoretically also be misapplied to create protein toxins for bioweapons. Laboratory-scale cell culture bioreactors are already widely available and enable fairly rapid production of smaller quantities of proteins as well as the ability to scale up production by operating multiple bioreactors in parallel. The small size of laboratory-scale bioreactors also renders it difficult to detect protein production capacity. All of these production systems (transgenic animals and plants, small-scale cell culture bioreactors, and chemical synthesis) thus have the potential to expand the definition of production facilities relevant to the BWC beyond traditional industrial-scale operations.

2.1.6. Delivery Systems for Biological Molecules

The prohibitions embodied by the BWC also apply to the means of delivery of biological agents. Developing effective delivery methods is often cited as a key technological hurdle for the creation of a bioweapons or bioterrorism program (see, for example, Danzig et al., 2011). As a result, the committee considered both aerosol science and recent developments in viral and nonviral delivery technology as part of its analysis of the overall picture of trends in the life sciences.


The delivery of drugs or vaccines through an aerosol route (such as through the use of individual inhalers) has been widely studied as an alternative method to injection. Only a thin wall separates the air spaces from the bloodstream in the alveolar cells of the deep lung, and this can enable drugs or other molecules to pass into the body. Aerosol delivery may also increase the concentration of an agent reaching the bloodstream by avoiding the “first pass” metabolism that occurs in the liver following oral absorption. As described during the workshop, the fluid dynamics of inhaled particles, whether droplets of drugs, viruses, or dust, determine where they reach and deposit in the respiratory system, based on factors such as particle size and density.19 Researchers study and optimize these parameters in order to create successful inhaled-delivery systems (Roy, 2010).
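The dependence of deposition on particle size and density can be sketched with the standard aerodynamic-diameter relation from aerosol physics. This is a simplified form, assuming a spherical particle and ignoring the slip correction; the example particle is hypothetical:

```python
import math

# Simplified form of the standard aerodynamic-diameter relation:
# d_a = d_g * sqrt(rho_p / rho_0), where rho_0 = 1 g/cm^3 (unit density).
# Assumes a spherical particle (unit shape factor) and ignores the slip
# correction; the example particle below is hypothetical.

def aerodynamic_diameter_um(geometric_diameter_um: float,
                            particle_density_g_cm3: float) -> float:
    """Aerodynamic diameter (micrometers) of a spherical particle."""
    return geometric_diameter_um * math.sqrt(particle_density_g_cm3 / 1.0)

# A porous, low-density 5 um particle (0.1 g/cm^3) behaves aerodynamically
# like a far smaller dense particle, which is one way researchers tune
# where in the respiratory tract particles deposit:
print(round(aerodynamic_diameter_um(5.0, 0.1), 2))  # → 1.58
```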

As several workshop presenters discussed, research on the development of delivery systems that protect drugs, vaccines, and even bacterial or viral vectors from degradation in the environment and in the body, that increase uptake, and that target delivery to specific cells and tissues is actively ongoing (Nixdorff, 2010; Roy, 2010; Ying, 2010). One delivery option is the use of various DNA and RNA viruses as vectors, because viruses have evolved specific strategies to infect target cells and deliver the nucleic acids they contain.20 These viral properties can be harnessed to deliver therapeutic, nonviral DNA. Genes and drugs can also be encapsulated or embedded in various types of lipids and polymers as both nano- and micro-particles. These systems protect the molecules from degradation in the body and can be chemically functionalized to target particular cell and tissue types (e.g., through the conjugation of ligands on their exterior to interact with cellular receptors and promote particle uptake into cells via receptor-mediated endocytosis, or through many other strategies). As a result, functionalized nanoparticle delivery systems seek to mimic some of the types of properties that make viruses such efficient delivery vehicles (such as mechanisms for uptake into target cells and for effective transfer of the payload they contain) (Nixdorff, 2010). Nonviral materials can also be designed so that their properties change in response to relevant physiological signals like temperature or glucose concentration (Ying, 2010). Advances in delivery technology, such as the use of liposomal nanoparticles or carriers targeted to transport pathways, may help achieve more effective delivery of drugs, genes, and imaging agents across the blood brain barrier, although effective delivery to the brain remains a challenge. 
In particular, nanoparticulate drug delivery systems are under development that can be targeted to specific cells and organs, such as those of the reticuloendothelial system, by incorporation of surface recognition molecules from viruses or other infectious agents that normally home to these cellular targets. The nanoparticulates may be lipid- or polymer-based and have been modified to carry antibiotics, siRNAs, peptides, nucleic acids, and other small molecules for immunogenic, therapeutic, or antimicrobial effects.

Pulmonary delivery systems are under intensive development because this route lacks many of the barriers to successful drug delivery found in the intestinal tract, such as low pH and mucosal surfaces. Pulmonary vehicles include aerosols and aerosol inhaler systems, dry powder inhalers, and nebulizers. The treatment of respiratory diseases and efficient systemic dissemination of aerosolized drugs make the lung an attractive target; in fact, more than 30 percent of the global drug delivery market consists of aerosol delivery (Kaparissides et al., 2006).

Thus, active research is expected to continue on methods to protect therapeutic molecules such as genes, proteins, and other drugs from premature biological degradation and to increase the delivery of these molecules to target cells and tissues. In particular, aerosol delivery is playing an expanding role in the pharmaceutical enterprise. As with protein production technology, the healthcare industry is expected to remain a significant driver of the delivery technology field. However, formidable biological challenges like protection from physiological degradation and clearance mechanisms, immune system responses, and achievement of therapeutic levels of targeting, uptake, and expression remain, despite continued progress. As a result, the creation of these systems continues to require training and expertise.

Discussion and Implications

Article I of the BWC addresses “means of delivery designed to use such [biological] agents or toxins for hostile purposes or in armed conflict.” This provision relates to means of delivery specifically developed for the dissemination of biological and toxin agents for warfare purposes. But new biological weapons delivery methods may also come about as the result of legitimate research and development into means and methods of dissemination and administration of treatments for entirely legitimate purposes, for example to administer improved therapeutics and vaccines. Although the focus of that work is on better control of the dose, improvement of patient compliance, and better absorption and targeted delivery of the treatment, some of the underlying physical and engineering principles may well be adaptable to biological weapons delivery systems. Advances in delivery of small and large molecules, both protein and nonprotein, along with targeting, would place this research within the purview of Article I. On the other hand, the pharmaceutical industry, which drives much of this research, is largely focused on the individual having easy access to aerosolized therapeutics.

There may also be significant hurdles to scaling up these new delivery systems. Advances in traditional lower technologies for delivery might also require monitoring. For example, it was noted during the workshop that the simplest “delivery system” could consist of an infected human used as a vector to spread disease. Incidents of disease transmission in airline passengers seated within several rows of a SARS-infected traveler illustrate this possibility (ECDC, 2010). It is worth noting that the ease with which a disease is transmitted varies, potentially rendering this method less effective at spreading disease to large numbers of people. However, it may still be possible to create public disruption even if only small numbers of people are directly affected.

2.1.7. Biosensors


As described during the workshop, a wide array of strategies can be employed to create biosensors (Kurochkin, 2010; Resnick, 2010), which also help to support and enable life sciences research. As noted above, sensors incorporated into cell culture production systems are used to control and optimize culture conditions. However, biosensors are also used as diagnostic tools in medicine (Mascini and Tombelli, 2008; Rapp et al., 2010), as tools to support public health disease surveillance (Hajslova et al., 2011; Kamikawa et al., 2010; Pejcic et al., 2006; Rodrigues Ribeiro Teles et al., 2010), and as detection tools for biosecurity monitoring (Cirino et al., 2004; Fischer et al., 2007). Many different technologies are used for these purposes, each with its own advantages and limitations.

Roughly speaking, a wide range of biosensor configurations is possible, and the responsive elements of a sensor may employ direct observation of the material or employ antibodies, enzymes, nucleic acids, physical adsorption, or other techniques. When the sensing element is triggered, the response is translated into changes in electrical, magnetic, chemical, or optical signals that are amplified and separated from background noise and displayed in a form that can be read by end users. One of the goals of biosensors is the rapid identification of molecules or organisms, such as pathogens, without first needing to isolate and culture them (steps which generally require both laboratory conditions and time). In addition to rapid identification, general trends in the field include miniaturization and efforts to develop sensors that can detect multiple and/or complex substances under real-time and real-world conditions. A truly robust, broadly sensitive, handheld system would revolutionize the field, but there are still a number of technical hurdles to overcome before this goal is achieved.21

Considering the role that biosensors play in the areas listed above, it is important to recognize that the same technology is not appropriate for all applications. For example, within a medical setting and if technical support is available, it may not be important that the analysis of biological material be automated. For environmental sensors in the field or in a remote setting, automation of analysis is a higher priority. Within a medical setting, personnel attempting to identify a pathogen may seek a sensor that is sensitive to a broad range of organisms. In addition, it may be acceptable for a sensor to identify multiple possible organisms if the patient is symptomatic, because this is additional information that can be brought into the overall evaluation of the situation. In a setting where the primary concern is exposure to a specific, known pathogen, a selective biosensor with high sensitivity and a low false-positive and -negative rate may be the preferred choice to allow for a greater chance of detection to warn exposed populations. A fixed-site facility may not require the same device portability as does a mobile diagnostics laboratory. Thus, the strong drivers that influence the development of devices for the public health community, for example, will not necessarily result in the development of devices that are appropriate for any other community.22

As noted above, there are strong commercial drivers for the development of improved biosensors, particularly in the area of healthcare diagnostics as well as in detection systems for national security applications. The field continues to advance rapidly, although workshop participants noted that developing sensor systems requires making compromises among variables such as sensitivity, specificity, cost, size, and portability. The particular balance of variables chosen to create a cancer diagnostic for use in a hospital setting, for example, may not be the same as the optimal balance for creating a pathogen detector for use in the field during a disease outbreak. In some cases, for example, a preliminary positive response from a sensor is subsequently confirmed through a more specific, often slower, secondary screening test. As a result, a “one size fits all” sensor platform does not exist. Limits in data analysis and interpretation, such as amplification of true signal from background noise and minimization of false positives and false negatives, also remain. Sensors are thus one of a range of identification and monitoring tools.

The increased sequencing speed described previously is a recent development that could improve the overall efficacy of biosensors and detectors and has two main effects: first, high-speed sequencing can be incorporated into the analysis, whether as part of the device itself or as part of secondary analysis after collection of biological material; second, the pace of genetic sequencing allows researchers to create information databases that can be accessed by analysts to assist in identification of known bacteria and viruses. As the known genetic universe expands, both in the area of pathogens and environmental microecology, more sensitive, targeted analysis systems can be developed, and the risk of false positives can be reduced. In addition, antigen-based sensors continue to improve, allowing for the development of systems that more closely mimic human immune responses than previously possible (Ma et al., 2011). If fully developed, these sensors have the potential to allow for responses based on known and unknown pathogens, which is currently difficult to achieve.
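The database-backed identification described above can be caricatured in a few lines. The sequences and agent names below are invented for illustration; real systems align millions of reads against curated genome databases:

```python
# Toy sketch of database-backed identification: reads from a sample are
# matched against reference sequences, and the reference sharing the most
# k-mers (substrings of length k) is reported. The sequences and agent
# names are invented for illustration only.

def kmers(seq: str, k: int = 8) -> set:
    """All overlapping substrings of length k."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def identify(read: str, references: dict) -> str:
    """Name of the reference sharing the most k-mers with the read."""
    scores = {name: len(kmers(read) & kmers(genome))
              for name, genome in references.items()}
    return max(scores, key=scores.get)

references = {
    "agent_A": "ATGGCGTACGTTAGCCGTATCGGATCCAAGT",
    "agent_B": "TTACGGCATGCAAGTCCGATTGGCACTAGGA",
}
print(identify("GTACGTTAGCCGTAT", references))  # → agent_A
```

As the text notes, the usefulness of such matching grows with the database: an expanding catalog of sequenced pathogens and environmental microbes makes targeted identification more sensitive and reduces false positives.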

Discussion and Implications

Certainly advances in biosensor technology represent an improvement in the tools available to provide advance warning of the release or emergence of a biological threat to human health. However, it is important to remember that in order to interpret results from a biosensor, it is necessary to understand the limitations of the device and the context in which it is being used. For example, a system that relies on collection of culturable material on a surface or in liquid will only be effective if the virus or bacterium survives collection and impact. A detector based on identification of genetic material will identify species that do not survive impact, but it could also issue unnecessary alerts by identifying material that belongs to dead bacteria or viruses posing no immediate risk to human or animal health. Every biosensor is also subject to false positives and negatives, misidentification of species, and other instrumental failures.
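The false-positive problem is fundamentally statistical. A hedged sketch using Bayes' rule, with hypothetical sensitivity, false-positive, and prevalence values, shows why even an accurate sensor can produce mostly false alarms when the target agent is rare:

```python
# Bayes'-rule sketch with hypothetical numbers: even a sensor with 99%
# sensitivity and a 1% false-positive rate produces mostly false alarms
# when the target agent is present in only a small fraction of samples.

def positive_predictive_value(sensitivity: float,
                              false_positive_rate: float,
                              prevalence: float) -> float:
    """Fraction of positive readings that are true detections."""
    true_pos = sensitivity * prevalence
    false_pos = false_positive_rate * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# With the agent present in 1 in 1,000 samples, only ~9% of alarms are real:
print(round(positive_predictive_value(0.99, 0.01, 0.001), 2))  # → 0.09
```

This is one quantitative reason why a positive sensor reading is typically confirmed by a slower, more specific secondary test before a large-scale response is triggered.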

Even if an ideal biosensor were to be developed (one that could combine attributes such as being portable, sensitive and accurate for a broad range of pathogens, able to determine viability of the material, having a low false-positive rate, etc.), having a robust administrative structure in place to respond to positive reports from the device will still be necessary. Information transfer from one group to another—for example, local to national authorities, or defense to public health officials—is critical. Who has responsibility for the information, who has the right to access it, and who decides the actions to take in a geographic area are all questions that might be addressed in advance of an alert. The population being protected by the device, military vs. general public, for example, could be considered because it will likely change the chosen response. Does the area being monitored contain particularly vulnerable populations, such as elderly or young children? What systems of emergency preparedness and public health infrastructure are in place in the area and what are the vaccination policies? How will the response differ if the alert is a result of a natural disease outbreak, the unintentional release of a pathogen (such as escape from a laboratory), or the intentional introduction of a pathogen? Although these decisions operate at the policy level, the need for them may be triggered by the result from a piece of sophisticated technology; they are therefore presented here for consideration.

Workshop participants noted that understanding the scientific basis for biosensor mechanisms has the potential to raise dual use concerns, because such knowledge could theoretically be used to try to evade or take advantage of the biosensors’ limitations. For example, a system employing identification based on genetic material as described above could be manipulated to create a positive result and trigger an emergency or community response that could reveal weaknesses in the response infrastructure, waste resources, reduce confidence in the overall system, and cause fatigue in responders.

Because broad response biosensors pose serious technical challenges (loss of selectivity leading to high false-positive rates, for example), most biosensors today are based on a specific biological or biochemical response to the presence of the target molecule/protein/organism, and this specificity presents a potential target for manipulation or misdirection. For example, at the simplest level, material from threat organisms that have been rendered inactive could be introduced to a targeted sensor to trigger a positive response. At a more sophisticated level, as the ability to change the surface characteristics of organisms becomes more commonplace and easier to accomplish, the ability to change the surface characteristics of potential threat organisms also becomes easier, which could reduce the efficacy of existing detection systems. Acting as a barrier to actually accomplishing this task is the complexity of biology itself: changing the surface characteristics can also change the response of a human being to an organism, so any modification could kill the organism or enhance or negate the anticipated risk to a given population.

Another possible option involves encapsulation of the threat organism or agent within material designed for easier delivery. As discussed earlier in this chapter, this technology advances drug delivery options, allowing for improved uptake of therapeutics. However, such encapsulation technology could also present a challenge for biosensors because it could hide the very surface characteristics being used to identify organisms of concern. This is one example of how advances in technology may result in the emergence of new threats that were not anticipated when the sensor was designed. Technology can be created to respond to known or predicted threats, but the “unknown threat,” whether an emerging infectious disease or an engineered pathogen, will be difficult to identify in this manner.

2.1.8. Discussion and Implications of the Pace of Advances in Science and Technology

Continued progress is being made in a wide variety of S&T areas, although the committee did not identify any advances since 2006 that fundamentally alter the nature of life sciences research. Life sciences research continues to advance rapidly and is expected to do so for the foreseeable future, driven by a combination of academic, commercial, and government influences. The enormous amounts of data and information being generated from research in omics technologies and fields such as immunology, neuroscience, and systems biology are providing scientists with information to better understand processes within biological systems. Research in these fields is helping to support a more complete understanding of human, animal, and plant variability and its relationship to disease and is also identifying and characterizing new microbes and their roles in multiple environments. Scientists are actively seeking to integrate information at multiple biological levels (from genes, to proteins, to networks of intra- and inter-cellular interactions, to community dynamics) in order to improve biological understanding and to support rational engineering and design. As a result, advances in S&T are increasing the overall understanding of biological systems.

Important milestones have been achieved in molecular biology and synthetic biology, and very active research in these areas is expected to continue worldwide. The extraordinary complexity of biological systems and the challenges this complexity presents to the effective understanding and design of biological systems remain significant barriers even as applications building on these research fields draw closer to fruition. This complexity is likely to remain a defining feature of biological systems for the foreseeable future. As a result of this complexity, for example, ab initio design of biological organisms will likely be unachievable for a number of years to come. Well-funded and well-organized research programs are making significant steps toward this goal, but their efforts remain far from commonplace. Although genetic modifications of organisms are already possible and relatively straightforward today, the complexity and stochastic nature of many biological interactions can also render the outcome of novel modifications unpredictable. Understandings reached by the Sixth Review Conference of the BWC include “that all naturally or artificially created or altered microbial and other biological agents and toxins, as well as their components, regardless of their origin and method of production and whether they affect humans, animals or plants, of types and in quantities that have no justification for prophylactic, protective or other peaceful purposes, are unequivocally covered by Article I” (BWC, 2006). This suggests that any forms of artificial biological systems (such as might be created by synthetic biology), or synthetic chemical analogs of biological molecules, would be covered under the prohibitions enshrined in Article I. However, as science continues to advance rapidly, new research developments may provide additional opportunities for further clarification and understandings to be reached.

Developments in S&T in areas such as transgenic animal expression systems, production of proteins in plants through “pharming,” availability and sophistication of small-scale bioreactors, and chemical synthetic methods to produce biological molecules also affect the ways in which biological materials are produced or reduce the time, space, or cost requirements needed to produce them. These advances raise the possibility that molecules that have previously been very difficult or expensive to obtain may be more readily produced in larger amounts (for example, extraction in the 1960s of several grams of the neurotoxin saxitoxin reportedly required processing tons of affected clams [Tucker, 2011b]). The changing nature of biological production systems thus expands the understanding of potentially relevant production capabilities beyond the traditional model of fixed, industrial-scale, cell culture fermentation tanks.

Advances also continue in the development of effective injectable, implantable, and inhalable delivery systems for molecules such as genes and drugs. The medical industry is a primary driver of this development, and the most notable advances are being made at the level of individual-use systems (for example, the delivery of nanoparticles encapsulating chemotherapeutic agents to a cancer patient or the implantation of materials able to release insulin in a diabetic patient in response to glucose levels). In the context of the BWC, questions on the potential for advanced or targeted delivery systems to be scaled up and delivered to multiple people, such as through environmental aerosol dispersal, are particularly relevant. The committee interpreted the obligations contained in Article I(b) as covering advanced forms of delivery systems, should such systems be used to deliver biological agents in violation of the other provisions of the BWC, but noted that delivery systems developed for medical (veterinary, pest control, etc.) purposes may be relevant to the overall assessment of risks posed to the objectives of the BWC by new technological advances. Detailed discussions on these questions were beyond the scope of the Beijing workshop and current report, but may be areas for further discussions and monitoring.

Biosensors and detectors are another area that has seen significant interest since 2006. The biological and engineering advances that underpin the development of these sensors continue to move forward, although there are still limitations in what can be achieved, and sensor development balances factors such as specificity, sensitivity, range of target molecules analyzed, and type of use (for example, sampling environmental components such as a building’s air supply or sampling fluids such as blood from a single individual for diagnostic purposes). Biosensors are also only one tool and are used with information provided by other scientific and policy tools in order to make decisions.

Finally, the committee noted that multiple, parallel S&T fields are developing and advancing. As key advances are achieved in one field, they may be combined with developments in others to achieve new opportunities and new applications.


2.2. Enabling Technologies

Some of the most notable developments since 2006 can be found in the enabling technologies that underlie and support significant advances in life sciences research, particularly the availability of high throughput systems and powerful computational resources. Access to these resources and the availability of large amounts of data storage capacity underpin many of the developments in the omics fields and in systems and synthetic biology (see Section 2.1.1). Increasing global access to computational and data resources is also cited in the Chapter 3 discussion on diffusion of research capacity and applications. These enabling technologies have general implications relevant to the BWC because they are helping to push the overall life sciences research enterprise forward at an ever more rapid pace. Unlike in the previous section, specific implications for the BWC are not drawn out within each subsection; rather, a broader discussion of the potential implications of enabling technologies is provided in Section 2.2.4.

2.2.1. High Throughput Systems

Significant research and development are taking place in new technologies for high throughput sample analysis. High throughput systems generally rely on robotics, computer-based control systems, and detector technologies to automate sample handling and analysis, emphasizing the multidisciplinary nature of modern life sciences research. Although an initial investment in such systems can be significant, they have the ability to greatly increase speed and capacity by analyzing multiple samples in parallel.

DNA sequencing technology is one area that has experienced particularly rapid advances (de Villiers, 2010; Dhar, 2010; Pitt, 2010a,b; Taylor, 2010).23 Next or “second generation” DNA sequencing systems, such as the Illumina HiSeq 2000 released in 2010, have significantly increased DNA throughput capacity. The HiSeq 2000, for example, can reportedly read up to 25 billion bases of DNA per day in 100 base pair read lengths using a modified method of sequencing by synthesis. Second generation sequencing technology such as the HiSeq generates relatively short lengths of DNA sequence, which are aligned and assembled into the complete sequence using software and computer systems. This process is made significantly easier when a previously sequenced reference genome is available to help guide the alignment, such as the reference human genome sequenced in 2003. A variety of new (“third” or “fourth” generation) DNA sequencing technologies are also on the horizon, some of which might produce longer DNA sequence lengths and higher accuracy than the current technology or might further increase speed and decrease costs (Niedringhaus et al., 2011; Shendure and Ji, 2008). In some cases, these technologies streamline steps in the sequencing process so that each nucleotide is directly read as it is incorporated into a single molecular DNA chain (e.g., Pacific Biosciences) (Niedringhaus et al., 2011). In other cases, very different technical processes are being explored for sequencing, such as the detection of alterations in current as individual bases of a DNA molecule pass through a nanopore (e.g., Oxford Nanopore) (Niedringhaus et al., 2011).

Along with the increase in speed has come a dramatic decrease in DNA sequencing costs. Figure 2.1 analyzes data from the U.S. National Human Genome Research Institute (NHGRI). Since 2008, costs have decreased even more rapidly than would be predicted by Moore’s Law,24 reflecting the use of second generation sequencing systems combined with the availability of the existing human genome reference (Wetterstrand, 2011). As a result, human genome sequencing can now be accomplished for approximately $0.10 per million bases of DNA or less than $10,000 per human whole genome, with costs dependent on factors like the sequencing coverage and error rates, as well as which specific costs are factored into the calculation. In 2010, the company Complete Genomics announced that it had sequenced a genome for a cost of approximately $4,400 in consumables such as reagents (Drmanac et al., 2010). Science may be approaching the $1,000 genome in the not too distant future, a price that may in turn bring the concept of personalized medicine closer to reality (Pitt, 2010b; Venter, 2010).
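The rough arithmetic behind these figures can be sketched as follows. This is a minimal illustration only; the genome size, coverage depth, and per-megabase price are representative assumptions rather than values drawn from any particular sequencing platform:

```python
# Back-of-the-envelope cost of sequencing one human genome at the
# per-megabase price cited above. All inputs are illustrative assumptions.
cost_per_megabase = 0.10   # dollars per million bases of raw sequence
genome_size_mb = 3000      # haploid human genome, roughly 3 billion bases
coverage = 30              # each base typically read many times for accuracy

total_cost = cost_per_megabase * genome_size_mb * coverage
print(round(total_cost))  # 9000 -- consistent with "less than $10,000"
```

Because total cost scales linearly with coverage, the same arithmetic shows why quoted genome prices vary so widely with sequencing depth and with which expenses are included.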

[Figure 2.1: Cost per megabase of DNA sequence, plotted from July 2001 (approximately $5,000–6,000) through July 2011 (approximately $0.10). The decline initially tracks Moore’s Law but accelerates sharply after October 2007.]


Decreasing costs of DNA sequencing. NOTE: Based on “production cost” data from the Large-Scale Genome Sequencing Program of the U.S. National Human Genome Research Institute. Costs include labor, reagents and consumables, and DNA preparation, among other components.

High throughput systems are also available to analyze gene and protein expression. For example, gene microarrays consist of small pieces of DNA attached to a solid surface to act as probes. Pieces of nucleic acid from a biological sample will hybridize with the fixed probes if they have a complementary sequence, and through this process researchers identify those genes that are expressed (turned into messenger RNA) in a particular cell and their relative expression levels. Similarly, a variety of protein microarrays exist to identify and quantify the proteins found in a biological sample (Chandra et al., 2011). The use of mass spectrometry (MS), which ionizes proteins and measures the mass-to-charge ratio of the intact protein molecules and fragment ions, has also become a powerful and widely used tool to characterize the proteins and peptides in biological samples and to support proteomics research (Domon and Aebersold, 2006). Advances in techniques to generate ions from biological molecules, including matrix-assisted laser desorption/ionization (MALDI), have enabled improved analysis methods that can provide more detailed structural information about peptides. Examples include time-of-flight (TOF) analysis, in which the mass-to-charge ratio of ions is determined by measuring the time it takes the ion to travel through a vacuum after being accelerated by an electric field, and tandem mass spectrometry (MS/MS), which makes use of multiple stages of MS analysis. These techniques can enable the rapid and sensitive identification of microorganisms as well as their toxins; MS can also be applied to detect a microorganism’s nucleic acids amplified through techniques such as polymerase chain reaction (PCR), which may be useful in cases in which a microorganism cannot be cultured (Boyer et al., 2011; Ho and Reddy, 2011). As a result, these advances can contribute to areas relevant to the BWC including monitoring, diagnostics, and bioforensics.
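The time-of-flight principle mentioned above can be made concrete with the standard relation t = L·√(m / (2zeV)): after electrostatic acceleration, heavier ions traverse the drift tube more slowly. The sketch below uses hypothetical instrument parameters (drift-tube length, accelerating voltage, ion mass) chosen purely for illustration:

```python
import math

E_CHARGE = 1.602176634e-19   # elementary charge, coulombs
AMU_KG = 1.66053906660e-27   # one atomic mass unit, kilograms

def tof_flight_time(mass_amu, charge_z, accel_volts, tube_length_m):
    """Drift time of an ion after electrostatic acceleration:
    t = L * sqrt(m / (2 * z * e * V)); heavier ions arrive later."""
    mass_kg = mass_amu * AMU_KG
    return tube_length_m * math.sqrt(
        mass_kg / (2 * charge_z * E_CHARGE * accel_volts))

# A singly charged ~1000 Da peptide ion, 20 kV acceleration, 1 m drift
# tube (all hypothetical values) arrives after roughly 16 microseconds.
t = tof_flight_time(1000, 1, 20000, 1.0)
print(f"{t * 1e6:.1f} microseconds")
```

Measuring arrival times on this microsecond scale, and inverting the same relation, is what allows a TOF instrument to resolve the mass-to-charge ratios of many species in a single sample.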

These types of high throughput systems all function as tools to help support active research in many of the areas discussed at the workshop, including genomics, proteomics, systems biology, and synthetic biology (de Villiers, 2010; Dhar, 2010; Pitt, 2010a). The characterization of changes in gene and protein expression during the progress of different diseases helps researchers identify new targets for the development of diagnostics and therapeutics, while the ability to analyze gene and protein expression in individuals helps advance the concept of personalized medicine.

2.2.2. Computational Technologies and Data Resources

Increasingly powerful stand-alone supercomputers are being constructed, including specialized computers to investigate computationally intensive problems in the life sciences. For example, Anton, constructed by D.E. Shaw Research in 2008, is a massively parallel machine designed specifically to enable atomic-level simulations of biological molecules on up to millisecond time scales, up to 100 times faster than previously possible (Shaw et al., 2008). These molecular dynamics simulations can be used to investigate the folding and interactions of proteins and nucleic acids, for example to examine predicted interactions between cellular receptors and drug candidates in efforts to advance biological understanding and improve therapeutics development. Supercomputing resources are also now available in regions beyond the United States and Europe. Until June 2011, the world’s fastest stand-alone supercomputer, Tianhe-1A, was located at the National Supercomputing Center in Tianjin, China, surpassing the U.S.-developed supercomputer, Jaguar, in the November 2010 rankings published by the Top500 Project. In June 2011, a computer at the RIKEN Advanced Institute for Computational Science in Japan bumped Tianhe-1A to number two on the list, and four of the top five fastest supercomputers are now located in Asia.25

An alternative strategy to the use of ever more powerful individual supercomputers is the use of distributed computing.26 This strategy allows a network of smaller computers to create the equivalent of a supercomputer, thus enabling wider research access to significant computational resources and the analysis of far more complex problems. In his presentation to the workshop, Dr. Etienne de Villiers of the International Livestock Research Institute (ILRI) in Kenya cited the successful distributed computing example of Folding@Home, a project based at Stanford University that is devoted to understanding protein folding and the relationship of misfolding to disease (de Villiers, 2010). By downloading project software, participants donate a portion of their unused computing resources; the project website notes that “since October 1, 2000, over 5,000,000 CPUs throughout the world have participated in Folding@Home,” making it the equivalent of the largest computer in the world. Similar types of volunteer distributed computing networks are available worldwide. The Asia@home project promotes the use of volunteer computing resources in Southeast Asia, and a recent “Asia@home hackfest,” held during the International Symposium on Grids and Clouds 2011 in Taiwan, focused on applications for earthquake science. Project websites generally describe the motivations, goals, and problems being undertaken and may subsequently publish results. Although participants in these networks control how much of their computing capacity they are willing to make available to the project, they do not know the specific uses to which it is put.

More specialized distributed computing networks, such as the TeraGrid system supported by the U.S. National Science Foundation, also provide the research community with access to high-performance computing and data analysis. TeraGrid, coordinated through the Grid Infrastructure Group at the University of Chicago, links computers from 11 U.S. partner sites to provide computing capability, online and archival data storage, and access to more than 100 discipline-specific databases. Similarly, EGI in Europe “maintain[s] a pan-European Grid Infrastructure (EGI) in collaboration with National Grid Initiatives (NGIs) and European International Research Organisations (EIROs), to guarantee the long-term availability of a generic e-infrastructure for all European research communities and their international collaborators.” These increasingly available distributed computing networks provide researchers with access to computing power, databases, software, and other tools. As a result, they can be thought of as evolving toward “knowledge grids,” a term that has come into use in the past decade to represent virtual social environments that enable access to resources and information as well as the sharing and creation of knowledge (Konagaya, 2006; Zhuge, 2004).

2.2.3. Communication Technologies

Changes in communication technologies, including access to the Internet, email, blogs, social media, mobile communication platforms, and open access publishing, are also enabling widespread dissemination of data and viewpoints and have the potential to change the ways in which scientists work (Meadway, 2010; Royal Society, 2011b).

Internet usage has grown very rapidly. For example, China and Tunisia have experienced 1,800 and 3,000 percent user growth, respectively, since 2000 (Meadway, 2010). A recent report from the Royal Society on international scientific collaborations notes that “the countries showing the fastest rate of growth in publication output and those rising up the global league tables as collaborative hubs show strong trends of growth in mobile phone usage and in internet penetration” (Royal Society, 2011b). Internet penetration is not yet universal and continues to vary widely even among countries in the same region.27 Despite some remaining access challenges, however, the growth in connectivity enables scientists from multiple countries to search and access information, communicate more easily and informally with each other through means like email and video conferences, and share documents for collaborative editing.

Communication tools have enhanced researchers’ access to information in several ways. The ability to search widely used online journal databases such as PubMed, operated by the U.S. National Library of Medicine, coupled with the ability to link to and download journal articles, has become more global as Internet usage has expanded, although institutional subscriptions may be required to access an article’s full text. Several online-only life sciences journals also exist (e.g., PLoS One, Nature Communications). These journals frequently employ some system of peer review, but their online-only format can speed up traditional publishing times. In addition, articles that will appear in future issues of a print journal are frequently available electronically in advance of print publication. The Internet also helps scientists identify specialists with whom to collaborate, although it has been reported that 90 percent of all collaborations are initiated in person (Royal Society, 2011b). However, the Internet and other communication tools certainly help collaborations to develop and move forward once established. In these ways, advances in communications technology continue to improve the ease, speed, and global reach of the traditional ways in which science has been done (in particular, the establishment of individual investigator-to-investigator collaborations that might be initiated at a scientific conference and then carried over to the Internet, ideally leading to the joint publication of a peer-reviewed journal article).

As discussed during the workshop, an additional level of interaction involving greater social participation and networking can also be increasingly facilitated with “Web 2.0” technologies. Sites such as Wikipedia, for example, rely on user-generated content and collective wisdom, and other possibilities include science blogging, direct commenting on scientific articles, tagging of articles of interest to share with fellow users of a particular social networking site, and posting updates on Twitter, among other activities. The extent to which these types of tools have come into widespread use among practicing life scientists is not yet clear. Reportedly, fewer than 10 percent of a sample of 19,800 blogs tagged “science” were written by scientists, and only low percentages of U.K. researchers in 2009 used Twitter (10 percent) or regularly wrote a blog (4 percent) (Meadway, 2010). The challenges involved in creating new Web 2.0 resources that will be useful to life scientists and that can effectively integrate with the existing ways in which science is done have been noted by several authors (Crotty, 2008; Stafford, 2009). David Crotty, formerly an executive editor at the Cold Spring Harbor Laboratory Press, suggested in 2008 that some of these tools, such as blogging or tagging, take investments of time and currently yield insufficient benefits for a scientist, given the continuing emphasis on peer-reviewed journal publications as the gold standard by which academic productivity is judged (Crotty, 2008). There are also variations in the uses of technology by discipline, with fields such as computer science and mathematics reportedly making more widespread use of newer communications technologies than fields such as medical science (Meadway, 2010).
Within the biosciences, it appears that the synthetic biology community may have adopted some of these newer communications tools—the teams participating in the International Genetically Engineered Machine (iGEM) competition, for example, all develop wiki pages as one of the competition requirements.

2.2.4. Discussion and Implications of Enabling Technologies

There has been particularly rapid progress in both access to and power of enabling technologies that underpin life sciences research, including computational and communication resources and high throughput laboratory technologies. The computational power available to researchers continues to increase, through both specialized stand-alone computers and distributed computing networks. The use of high throughput sample handling and analysis methods has become widespread, and these tools increase the speed with which researchers can conduct studies as well as the volume of data they obtain.

As discussed above, the use of high throughput analysis tools and computational resources is enabling faster and cheaper developments in the life sciences, while the rapid global spread of the Internet and other forms of electronic and mobile communication significantly enables global scientific collaboration and the dissemination of scientific information. Some of the newer “Web 2.0” tools also have the potential to provide a greater social context to the process of scientific knowledge creation and dissemination, and the use of these types of tools in the life sciences may become more widespread as ways to integrate them into the existing system of science become more clearly defined.

These developments have several general implications for the BWC. First, the technologies underpin other developments in the life sciences and contribute to the pace and nature of advances being made in fields that might have specific relevance to the treaty. For example, high throughput techniques yield large amounts of data to advance systems biology understanding in areas like immunology and neuroscience, while computational capacity is used to address problems such as protein structure as part of screening drug candidates for therapeutics development. Second, the global and widespread use of communication technologies, along with models such as online and open access publishing of experimental results, makes efforts to control or restrict access to scientific knowledge ever harder. Finally, the same types of mobile and electronic tools that scientists can use to collaborate and share information could also be used by other types of distributed groups, whether state or non-state actors, to trade information and knowledge. Technological resources that enable the life sciences are now available worldwide, although access to them is not yet evenly distributed. However, the life sciences community is only one of many communities that use computational and communication technologies. As a result, rapid progress in these fields is driven by many factors beyond the life sciences.


Developments in advancing and enabling areas of S&T provide both opportunities and potential challenges relevant to the BWC. One potential challenge posed by advancing S&T is the possibility that a novel development will fall outside the scope of the treaty. As discussed in Section 2.1, the committee did not identify any developments among those it surveyed that did so, a finding also reached by the scientific community at a workshop held prior to the Sixth BWC Review Conference in 2006 (Royal Society, 2006a,b). However, rapid advances in the life sciences on many fronts will likely continue to pose challenges for tracking and assessing future research progress—in establishing priorities for which areas to monitor, anticipating new combinations of advances drawn from progress in multiple fields, and expanding the types of expertise required to assess new developments.

Advances in S&T also provide opportunities to address specific BWC concerns. For example, knowledge derived from omics, systems biology, and immunology, and the high throughput tools, computational resources, and bioinformatics that enable these fields can support rational vaccine and drug design, along with efforts to better understand the immune system, pathogen virulence, and how to modulate these factors. This understanding is critical for effective vaccine and countermeasures development.

As has already been widely recognized, there is a potential dual nature to advances in many fields of the life sciences, because the information that could enable scientists to better understand and manipulate fundamental life processes could potentially also be misused to create harm, and a clear dividing line cannot be drawn between the knowledge, skills, and equipment that would be needed for beneficial or for harmful purposes (Atlas and Dando, 2006; Azzi, 2009; NRC, 2004; van der Bruggen, 2011). It has also been widely recognized that efforts to engage the scientific community in discussions on the safety, security, and ethical implications of research are inherently international, given the global nature of the life sciences research enterprise. This global research capacity and growing numbers of international collaborations in the life sciences are discussed further in the following chapter.



“Omics” fields in the life sciences generally refer to the holistic analysis of a set of biological information, in order to achieve a comprehensive understanding of its structure, function, interactions, and other properties. Omics fields include genomics, the study of the complete DNA sequence of an organism; metagenomics, the identification and analysis of the genomes of a community of organisms without first culturing and separating them; transcriptomics, the analysis of the set of RNA transcripts expressed by a cell, tissue, or organism; proteomics, the study of the set of expressed proteins that result from these transcripts; interactomics, the analysis of interactions among the molecules in a cell; metabolomics, the study of the cellular metabolites produced by the cell, tissue, or organism; and many others.


Second generation sequencers at BGI include 137 HiSeq 2000 systems from Illumina and 27 SOLiD 4 systems from Applied Biosystems, along with multiple, earlier generation capillary electrophoresis (“Sanger method”) sequencers. BGI has locations in China, the United States, and Europe.


Descriptions of genomic sequencing projects are derived from articles current at the time of committee discussions. With rapid development in research and sequencing capacity, the state of these projects and the numbers of genomes sequenced also change rapidly.


“A biological pathway is a series of actions among molecules in a cell that leads to a certain product or a change in a cell. Such a pathway can trigger the assembly of new molecules, such as a fat or protein. Pathways can also turn genes on and off, or spur a cell to move” (U.S. National Human Genome Research Institute, Fact Sheets: Biological Pathways, accessed August 29, 2011).


For example, SYNBIOSAFE, a project supported by the European Commission, examines issues of safety, security, and ethics in synthetic biology. Ethical and security issues in synthetic biology have also been addressed in reports from the U.S. Presidential Commission for the Study of Bioethical Issues (2010) and the U.S. National Science Advisory Board for Biosecurity (2010). The Implementation Support Unit (ISU) of the BWC has co-hosted workshops on synthetic biology in partnership with the United Nations Interregional Crime and Justice Research Institute (UNICRI) and with the Geneva Forum, as well as delivered presentations on biosecurity issues at synthetic biology conferences (reports of the activities of the ISU are available online).


These include, for example, various promoters and regulators to influence gene expression. Building on roots in both molecular biology and traditional engineering disciplines, synthetic biologists frequently conceive of cellular systems through the framework of electronic circuit design. As a result, biological modules may be viewed as functioning like switches, oscillators, logic-gates, and other electronic components; the framework is used as an aid in trying to design and conceptualize biological systems similar to the manner in which engineers design machines. Synthetic biologists have also borrowed terminology from the computational sciences, referring to the ability of genetic material to operate as the “software” of living systems and to “boot up” the operations of a cell (which can analogously be thought of as the hardware).


Dr. Pitt is currently affiliated with Aston University.


Discussion continues about the relative risks and extent to which advances in areas such as DNA synthesis and synthetic biology enable the construction of novel viral or bacterial pathogens. Design issues arising from the complex nature of biological systems are noted above (Purnick and Weiss, 2009), suggesting that creating a novel genome that yields specifically desired pathogen functions and virulence, either by de novo design or by combining sequences derived from existing microorganisms in new ways, would continue to take significant time and effort. To create a functional pathogen also requires additional, nontrivial steps beyond the construction of a nucleic acid genome. These include packaging the genome into a viral capsid or a bacterium, replication and production of larger quantities of the pathogen, and possibly steps to protect the pathogen from environmental degradation and render it more suitable for delivery (Tucker, 2011a). Further discussion about tacit and explicit knowledge required to conduct complex scientific experiments may be found in Section 5.1.2.


Because potential biothreat agents could be used not only to cause human disease but also to act against veterinary or agricultural targets, the relevant “host” for a pathogen could be a human, a nonhuman animal, or a plant.


Many pathogens employ strategies designed to diminish the effectiveness of a host’s immune response against them. For example, almost all human cells display Major Histocompatibility Complex (MHC) class II molecules on their surfaces, and certain cells also display MCH class I molecules. These molecules present antigens derived from infecting pathogens to the immune system. Some pathogens decrease MHC I or II expression on cell surfaces, diminishing the resulting immune response. Other pathogens directly target and kill frontline immune sentinel cells such as macrophages and dendritic cells. Plant pathogens also employ strategies to decrease the effectiveness of plant immune responses directed against pathogen-associated molecular patterns and virulence factors. Although plants lack some types of immune responses exhibited by mammals, they employ similar types of “innate” immune responses (Jones and Dangl, 2006).


The mammalian immune system includes innate immune responses (which are rapid in response and are frequently directed against conserved pathogen signals such as bacterial lipopolysaccharides) and adaptive immune responses. The adaptive immune system includes two broad pathways—one that results in the generation of circulating antibodies directed against an extracellular pathogen or toxin (“humoral immunity”), and one that directs the immune system to kill cells that have been infected with an intracellular pathogen such as a virus (“cellular immunity”). The nature and extent of immune system responses are influenced by many factors, including the type and location of immune cells that first encounter the pathogen and by chemical signals such as cytokines that preferentially direct the immune response toward one or the other pathway.


Researchers seeking to create a contraceptive vaccine used a nonpathogenic strain of the Ectromelia virus, which causes mousepox, to deliver DNA encoding a mouse egg protein to mice. The goal was to induce an immune response against the egg protein, preventing fertility. In order to boost the effectiveness of their vaccine, researchers also co-delivered DNA for the cytokine IL-4, which modulates the immune system. By influencing the immune system in such a way that it mounted a less effective response to the vaccine virus, the researchers unintentionally created a mousepox virus that was lethal to the mice.


A zoonotic disease is one that can be transmitted between wild or domesticated animals and humans.


Reports of the activities of the BWC ISU reference relevant meetings with a variety of intergovernmental and nongovernmental organizations and are available online.


The blood brain barrier inhibits the movement of most molecules from the body’s bloodstream into the brain and central nervous system, although small molecules such as dissolved oxygen can pass, and some molecules, such as glucose needed by brain cells, are actively transported across. The barrier consists largely of tight junctions between the endothelial cells that line the capillaries.


It is important to note that limitations continue to exist in the ability of neuroimaging technologies to accurately detect states such as deception or memory. Methods such as magnetic resonance imaging, positron emission tomography, and other techniques aggregate signals from multiple neurons, and spatial and temporal resolution vary depending on the particular technology. How closely imaging studies in controlled laboratory conditions on compliant, healthy volunteers would correlate with other populations also remains unknown. The current state of neuroimaging techniques is discussed in a recent series of modules from the Royal Society (2011a) and the NRC (2008) as well as in presentations from the Second Raymond and Beverly Sackler USA-UK Scientific Forum: Neuroscience and the Law, March 2011 (http://sites.nationalacademies.org/PGA/stl/PGA_062477).


Proteins and peptides are both composed of a series of amino acid building blocks. Molecules containing fewer than approximately 50 amino acids are generally referred to as peptides; larger molecules are generally referred to as proteins.


Eukaryotic organisms like animals and plants frequently modify proteins after they have been translated from mRNA—well-known modifications include the addition of phosphate groups (phosphorylation) and the addition of specific carbohydrate molecules (glycosylation). Bacteria lack the ability to conduct many post-translational modifications, and different eukaryotic systems (yeast, different plant and animal species) also vary in the details of the specific modifications they conduct. The influences that these post-translational modifications have on protein properties, including on correct folding and protein activity, are also ongoing areas of life sciences research.


Gravitational settling of a particle is affected by its aerodynamic diameter, which depends on a particle’s geometric diameter and density and which represents the diameter of an equivalent spherical particle with a density of 1 gram per cubic centimeter that settles at the same rate. Particles with aerodynamic diameters of 2-10 micrometers generally deposit in the trachea and bronchi while particles of less than 2 micrometers reach the alveoli of the lungs (NRC, 2011c). This concept has been employed to develop novel aerosol delivery systems using, for example, porous polymer particles over 5 micrometers in size but having relatively low density (producing an aerodynamic diameter of approximately 2 micrometers) (Edwards et al., 1997). Such particles were inhaled into the deep lung and showed prolonged systemic drug levels compared to smaller and denser particles with the same aerodynamic diameter.
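The relationship described in this note can be captured by the standard simplified formula d_a = d_g·√(ρ/ρ₀), where ρ₀ is the unit reference density. In the sketch below, the particle density is an assumed value chosen to reproduce the porous-particle example, not a figure reported by Edwards et al.:

```python
import math

def aerodynamic_diameter(geometric_diameter_um, density_g_cm3,
                         ref_density_g_cm3=1.0):
    """Simplified aerodynamic diameter of a spherical particle:
    d_a = d_g * sqrt(rho / rho_0), with rho_0 = 1 g/cm^3."""
    return geometric_diameter_um * math.sqrt(
        density_g_cm3 / ref_density_g_cm3)

# A porous 5-micrometer polymer particle with an assumed density of
# 0.16 g/cm^3 settles like a ~2-micrometer unit-density sphere,
# small enough to reach the alveoli despite its large geometric size.
print(aerodynamic_diameter(5.0, 0.16))  # 2.0
```

The square-root dependence on density is what lets large, highly porous particles behave aerodynamically like much smaller solid ones.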


Many viruses have been studied as potential delivery systems. Some of the most commonly studied are retroviruses (particularly lentivirus), vaccinia virus, adenoviruses, and adeno-associated viruses.


Two concepts of relevance to the development of biosensors are sensitivity and specificity. Sensitivity refers to the ability of a sensor to accurately identify true positive signals (i.e., it does not miss cases). Specificity, on the other hand, refers to the ability of a sensor to accurately distinguish true negatives (i.e., it does not give false positive readings). In general, development reflects a balance between these two goals, and it would be extremely difficult, if not impossible, to design a sensor to be both perfectly sensitive and perfectly specific.
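These two measures are conventionally computed from counts of true and false positives and negatives. The sketch below uses hypothetical trial counts, purely to illustrate the definitions:

```python
def sensitivity(true_pos, false_neg):
    # True positive rate: fraction of genuine positives the sensor
    # detects; missed cases (false negatives) lower this value.
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    # True negative rate: fraction of genuine negatives correctly
    # rejected; false alarms (false positives) lower this value.
    return true_neg / (true_neg + false_pos)

# Hypothetical evaluation: 95 of 100 agent-containing samples detected,
# 90 of 100 blank samples correctly rejected.
print(sensitivity(95, 5))    # 0.95
print(specificity(90, 10))   # 0.9
```

Tuning a detection threshold typically trades one measure against the other, which is why a sensor cannot be made perfectly sensitive and perfectly specific at once.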


A variety of portable biosensors exist, including “electronic noses” for applications ranging from industry to law enforcement. Such sensors can, for example, detect cocaine molecules (Stubbs et al., 2003) or help to identify bacterial species (Dutta and Dutta, 2006).


“First generation” DNA sequencing was based on a method initially developed by Frederick Sanger in the 1970s and on the fact that double-stranded DNA is synthesized using its complementary strand as a template. As this synthesis is conducted, regular deoxynucleotide triphosphates (the building blocks of DNA) are mixed with labeled dideoxynucleotides that will terminate an extending DNA chain. The result is a series of DNA molecules that each differ by one nucleotide in length; these are separated by capillary electrophoresis and the terminal nucleotide identified, allowing the DNA sequence to be read.


“Moore’s Law” is the observation by Gordon Moore, a co-founder of Intel Corporation, that the number of transistors on a computer chip roughly doubles every two years. The comparison has frequently been drawn between this exponential growth and a comparable growth in DNA sequencing and synthesis capabilities.
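A fixed doubling period implies exponential growth, which can be sketched as a simple growth factor over a given interval (a minimal illustration of the observation, not a claim about any particular technology):

```python
def moores_law_factor(years, doubling_period_years=2.0):
    """Growth factor implied by a fixed doubling period:
    capability multiplies by 2 every doubling_period_years."""
    return 2 ** (years / doubling_period_years)

# Over a decade, a two-year doubling period implies a 32-fold increase.
print(moores_law_factor(10))  # 32.0
```

Sequencing costs falling faster than this curve after 2008 (as in Figure 2.1) is what makes the post-second-generation decline so striking.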


Supercomputer rankings by the Top500 project are released twice a year based on the use of a benchmark performance measure.


Distributed computing “is any computing that involves multiple computers remote from each other” (de Villiers, 2010); the systems exist in various configurations with slightly different properties (e.g., cloud computing, grid computing). For further examples of the uses of distributed computing in life sciences research, see Burrage et al. (2006), den Besten et al. (2009), and Schatz et al. (2010).


The International Telecommunication Union (ITU) monitors global trends and has created an ICT Development Index (IDI) that reflects multiple factors such as fixed and mobile telephone and Internet infrastructure, access, usage, and skills combined into a single score. Among 159 countries in 2008, Sweden had the highest IDI score (7.85), but significant country-to-country variation is present. Argentina, for example, had an IDI score of 4.38 (number 49 on the list), while Bolivia had a score of 2.62 (number 101); in Africa, Morocco had an IDI score of 2.68 (number 97), while Uganda had a score of 1.30 (number 145) (ITU, 2010). Other groups also monitor trends in world Internet usage. For example, although 66 percent of the general population in Argentina reportedly had access to the Internet as of March 2011, only 10.9 percent did in Bolivia. The rate was 41.3 percent in Morocco, versus 9.2 percent in Uganda and only 0.5 percent in Ethiopia (http://www.internetworldstats.com/stats.htm, accessed July 10, 2011).

Copyright © 2011, National Academy of Sciences.
Bookshelf ID: NBK91462