Panel II: Federal R&D Strategies

Taylor T.


Mr. Taylor introduced himself as “born and raised” in Washington and said that, after a career with NASA that included responsibilities for technology transfer, he believed strongly in the innovation power of the federal government. While the infrastructural roles of government, such as building highways and bridges, are well known, he said, the range of activities described at this workshop was typical of the agencies’ efforts to apply the results of research to innovation, firm formation, and economic growth.



Directorate for Engineering, National Science Foundation

Dr. Peterson began with an overview of the National Science Foundation (NSF)—a national funding agency with no labs of its own. “All the money that comes in, goes out—primarily to universities—to support basic research in science and engineering, as well as educational activities.” He said he would review the parts of the NSF mission that focus on innovation, with emphasis on the importance of public-private partnerships.

He referred to Dr. Mirkin’s early comment that innovation must begin with people—a talented work force. “We invest a lot in programs related to that work force,” he said. “Then we provide research support so that faculty and students can develop their good ideas. Third, we provide mechanisms to facilitate innovation.”

One such mechanism, he said, is the design of the professional staffing at the NSF. “These are people like me,” he said. “We are rotators. We come in from other locations, typically universities, spend three or four years at NSF, and then return to our institutions. The advantage of this process is that it brings in people who have direct experience in universities and the challenges they have. The disadvantage is that sometimes it takes us two or three years to figure out the procedures of government and how to work with them.”

He began with the President’s Innovation Agenda 2009, which has three levels. The first is basic research to “catalyze breakthroughs for national priorities.” This level provides the ideas and “raw material” for innovation and further development. The second level is “translational work” to promote competitive markets that spur productive entrepreneurship. The third is to tackle “grand challenge” issues through investment in the building blocks of American innovation.

He noted that the National Academies, as well as the NSF, are familiar with the role of basic research, which is “our bread and butter.” At the same time, he said, many people are surprised to learn that the charter establishing the NSF in 1950 also contained a clear mandate to focus on activities that have societal benefit. “So it’s not outside the purview of NSF to ask what some of its investments are doing in terms of eventual market value and application.”

Of the NSF’s $7.5 billion budget, he said, “almost all that money goes out the door.” The overhead budget is about 5 percent, he added, and suggested that the agency should “invest at least a small fraction in helping those researchers who may have a potential commercial idea to take it to the next step.”

He said that basic research is an integral part of broad research topics, such as advanced manufacturing, “in which there are not only engineering questions, but also many fundamental issues,” with research investment made throughout many of the NSF directorates. Examples of basic research areas within advanced manufacturing included complex systems design, cyber-based approaches, materials design, and scalable manufacturing.

The National Science Foundation is also investing substantially in Secure and Trustworthy Cyberspace (SaTC), which is “a huge issue right now.” NSF support here is focused primarily on the engineering aspects of the Networking and Information Technology Research and Development (NITRD) strategic plan. In addition, the NSF is allocating investments across a number of directorates in a program titled Enhancing Access to the Radio Spectrum (EARS), whose objectives are more efficient use of the radio spectrum and energy-conserving device technologies.

Dr. Peterson said that some investments by NSF might be surprising, such as its translational research programs. The agency supports many center-like programs that fund not only the principal investigator, but also teams of investigators and teams of universities partnering with teams from industry. These investments span the spectrum from fundamental discovery to potential commercialization. NSF’s first priority in these translational research efforts is not commercialization per se: most such activities fall under the category of basic research, but many look simultaneously at potential applications. At the same time, they gather input from industry to adjust existing basic research programs or design new ones. He noted that many of these translational programs are not new—some have been operating for as long as three decades. The Industry-University Cooperative Research Centers (I/UCRC) started in the late 1970s, the first SBIR program was launched at NSF in 1982, and the Engineering Research Centers (ERCs) were first funded in 1985. The newest of these programs, the i-Corps, began in 2011. Additional translational research programs include the Science and Technology Centers (STC), Materials Research Science and Engineering Centers (MRSEC), Grant Opportunities for Academic Liaison with Industry (GOALI), Partnerships for Innovation (PFI), and Nanoscale Science and Engineering Centers (NSEC).

The role the NSF can play in economic development is greatly extended by partnerships with universities and industries. For example, the I/UCRCs are usually located not only near or at universities, but also near industries with strong research in engineering, computing, or information science, such as Corning, BASF, Kyocera, Kennametal, and Ceradyne. The ERCs also have an extensive geographic reach, with locations in nearly every state, and unexpected endurance. The official lifetime of an ERC under NSF support is only 10 years, but virtually all are still in existence after raising their own continuing support from local, state, federal, and industry partners. About half of them focus on some form of engineering, but mathematics, physical sciences, social sciences, and health sciences are well represented. Industrial partners include Applied Materials, Corning, Raytheon, IBM, Michelin, Genencor, Cisco, Boeing, Agilent, BASF, and many others. “These are fantastic anchors for regional innovation,” he said. “They are especially important at NSF, because unlike DoD, NASA, and some other agencies, we will never buy a product we have invested in. So we can invest in any good idea.” Some companies built on NSF SBIR research include IntraLase Corp, Bluefin Labs, Inc., and ABS Materials.

The NSF has focused on the most effective ways to leverage small amounts of money to allow the fruits of basic research to develop to the proof-of-concept or prototype stage, and from there persuade a VC firm or other partner that the technology is worth a substantial investment. He pointed to i-Corps, which began in the summer of 2011, as an example of that strategy. Even though the FY2012 budget for i-Corps is only $7 million within a $7.5 billion agency, and the maximum award is $50,000, the program has been, according to Dr. Peterson, “wildly successful.” Its purpose is to provide support for current or recent NSF grantees that are just past the discovery stage but not yet ready to apply for an SBIR or SBA grant. “This i-Corps grant provides not only some modest financial support,” he said, “but it is also an educational tool, and a hypothesis-driven process. It helps teams of researchers, who really aren’t skilled at evaluating the market potential of their idea, to get to a point where they can do that.”

i-Corps projects are team-based, including an entrepreneurial lead, who may be a postdoc or student charged with moving the enterprise forward; an i-Corps mentor, who is a volunteer guide with experience in the subject area; and the PI, a researcher with a current or previous award. The project provides a 30-hour, hypothesis-driven curriculum for the entire team. The program made 25 awards in FY2011 and 100 in FY2012, with participation by all but one NSF directorate. Among examples are Graphene Frontiers, with the University of Pennsylvania, which develops a process to produce high-quality, low-cost, large-area graphene films for thin, flexible devices; and Ground Fluor Pharmaceuticals, with the University of Nebraska, developing a single-step fluorination for positron emission tomography (PET) imaging.

He concluded with several examples for which initial basic research investments by NSF provided substantial momentum toward successful commercial products or companies:

  • SBIR support of Qualcomm: In 1985, Andrew Viterbi and six colleagues formed “QUALity COMMunications”, and in 1987 NSF provided $265,000 in SBIR funding for single-chip implementation of the Viterbi decoder. This led to high-speed data transmission via wireless and satellite and to a company that is presently worth about $80 billion and holds more than 10,100 U.S. patents.
  • Developing DNA as a forensic tool: Basic biological research led to the PCR technique behind DNA fingerprinting. The NSF Directorate of Bioscience made numerous investments in this technology, which has become a key to the U.S. legal system.
  • Memory storage devices: NSF funding for the ERC at Carnegie Mellon University in the early 1990s supported fundamental advances in computers, including the nickel-aluminum under-layer that enables high-capacity memory storage. This storage is today used in laptops, MP3 players, and other consumer electronics.
  • Internet search engines: “Sometimes things develop by pure serendipity,” he said. In the 1990s, NSF chanced to fund Stanford professor Hector Garcia-Molina’s Digital Library Project. The university’s annual report, in describing “what else came from this research,” included the following footnote: “Graduate student Larry Page developed a new approach for a search engine. See”
  • Retinal implants: NSF-supported researchers at Johns Hopkins Intraocular Prosthesis Group, North Carolina State University, and, more recently, the University of Southern California are creating retinal prostheses to electronically capture and transmit images to the brain, enabling patients to see light and shapes.
  • Nano-patterning and detection technologies: The NSF funds the Center for Nano-patterning and Detection Technologies at Northwestern, directed by Dr. Mirkin, who holds more than 350 patents in the field. Two companies spun off from this work are NanoInk, founded in 2001 to develop Dip Pen Nanolithography (DPN) tools for fabricating MEMS and other nanoscale devices, and Nanosphere, founded in 2000, which offers nanotechnology-based molecular diagnostic testing. Said Dr. Peterson, “We take credit for a lot of his success.”

Dr. Peterson summarized his theme by saying “we like to refer to ourselves as the Innovation Agency; or Where Discoveries Begin. I think these catch phrases are oversimplified, but they do have a strong element of truth. We rely substantially on public-private partnerships to help our academic community be successful in developing their innovations and translating them into products and companies in the innovation space.”



Innovation Fellow, Office of Naval Research

Dr. Fall began by acknowledging the need for more knowledge about innovation activities both at home and abroad. “We don’t have a lot of situational awareness about what’s going on,” he said, “even across our own government agencies.” He said he would try to make the case for this need by describing the activities of his own agency and its structure. The Office of Naval Research, he said, with a central role in generating technologies for the nation’s defense, must have a technological edge in innovative operational concepts and the science and technology behind them.

He called ONR “the Navy-Marine corps bank for funding research.” More formally, the ONR mission, as defined by Public Law 588 of 1946, is “to plan, foster, and encourage scientific research in recognition of its paramount importance to future naval power and national security.” Among the innovation milestones of ONR, he said, were development of a “drone” airplane in 1916 that could fly by radio control; the timing mechanism that allows the GPS system to work; and early technologies to launch terrestrial satellites.

He proposed a fundamental distinction between innovation and invention. Invention, he said, was working to create ideas, while innovation was putting ideas to work. “ONR,” he said, “is structured to be an innovation machine,” and the office is organized to carry innovations through to practical uses, whether commercially or for the government.

This emphasis dates back nearly a century, to when the corporate laboratories were made subsidiaries of the ONR. The lab itself was founded by Thomas Edison, who was brought into the government in 1916 to establish a new research approach. The real acceleration of this strategy, he said, occurred with the government’s decision to use public funding to support science during the Vannevar Bush era after World War II. With that decision, the ONR became the first federal funding agency, predating both the NSF and DARPA.

The Defense Authorization Act of 2001 revised the structure of ONR. Before that date, the office had closely resembled the NSF in its emphasis on basic research. In 2001, all of the ONR’s translational R&D, he said, “was brought into the building to manage the Navy’s basic, applied and advanced research to foster transition from science and technology to higher levels of research, development, testing, and evaluation.” A new and critical layer of management was added to create three virtually equal directors: the Director of Research, Director of Innovation, and Director of Transition. Virtually the entire budget now flowed through those three offices, which had to compete with one another in arguing for the relative importance of the work they were doing and wanted to do in the future. “In the end,” said Dr. Fall, “this creates a better product. I don’t know of any other agency organized like this. It results in an engine for innovation that works well.”

He described the ONR’s mission in budgetary terms. The majority of the annual budget of approximately $2.25 billion is divided roughly as follows:

  • 45 percent to basic research: This is fundamental science, or “seed corn,” that forms the basis for solutions expected to bear fruit five to 20 years in the future.
  • 12 percent to innovative naval prototypes: These projects are disruptive technologies or “leap-ahead innovations” expected to ripen five to 10 years in the future.
  • 12 percent to future naval capabilities: This category, which he also called “acquisition enablers,” refers to evolutionary or component improvements expected to occur in three to five years.
  • 8 percent to quick-reaction science and technology: These are fleet-driven (i.e., generated by Navy personnel) solutions that can be achieved in one to two years.
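The four categories above account for 77 percent of the roughly $2.25 billion budget; the remainder was not broken out in the talk. As a quick arithmetic sketch of what those shares imply in dollars (the total and the percentages are the approximate, rounded figures cited above, so the results are illustrative only):

```python
# Approximate dollar allocation of ONR's ~$2.25 billion annual budget,
# using the rounded percentages cited in the talk. Illustrative only.
TOTAL_BUDGET = 2.25e9  # dollars

shares = {
    "basic research": 0.45,
    "innovative naval prototypes": 0.12,
    "future naval capabilities": 0.12,
    "quick-reaction science and technology": 0.08,
}

allocations = {name: TOTAL_BUDGET * share for name, share in shares.items()}
for name, dollars in allocations.items():
    print(f"{name}: ${dollars / 1e9:.2f}B")

# The listed categories cover 77 percent; the rest funds ONR activities
# not itemized in the presentation.
remainder = TOTAL_BUDGET * (1 - sum(shares.values()))
print(f"unitemized remainder: ${remainder / 1e9:.2f}B")
```

On these figures, basic research alone comes to just over $1 billion a year, consistent with Dr. Fall’s description of it as the ONR’s “seed corn.”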

He elaborated on the kinds of technologies assigned to each category. Quick-reaction research is aimed at solving specific practical problems described by users within the fleet; the ONR deliberately reaches out to sailors and marines, asking for input. A typical problem is one that cannot be solved by existing technology but might require as much as a year’s worth of research costing about $1 million. One example was revising lighting systems to reduce the glare and noise from overhead lights in submarines; the ONR reduced both annoyances by replacing fluorescent lighting with silent, less harsh LEDs.

The program to develop future naval capabilities (three to five years) is “the pot of money where we squeeze every bit of risk out that we can.” He called it “the engine that takes the basic science and turns it into the stuff people want.” The military is good at finding out what warfighters need, he said, and the ONR’s job is to match the need with the available basic science, and to develop that science to the point of utility. “We have an elaborate framework to accomplish technology transition agreements,” he said.

The projects under the category of innovative naval prototypes are the technologies that “sailors or marines don’t know they need yet, but the best science can produce.” For example, the ONR can see that in order to sustain its technological leadership, it will have to develop complex new systems, such as directed energy,30 ship-borne laser weapons, tactical satellites, electromagnetic railguns,31 and persistent littoral undersea surveillance. Such high-risk, high-payoff programs may cost hundreds of millions of dollars over five years and require the involvement of department leadership.

For basic research, he said, activities are not directed, “but clearly we guess there might be a military use in the future.” This program is basically an investment in people, he said, and is diverse and long-term. Topics being supported by ONR include graphene, electronic warfare, advanced GPS research, spintronics, arctic research, weather modeling, and laser cooling. Investments in this category involve extensive high-level discussions on broader levels of military strategy.

When the Defense Authorization Act of 2001 allocated S&T funding authority to the three categories of research, innovation, and transition, it also brought the directors and staffs together under one roof. This, in the opinion of Dr. Fall, has created a uniquely integrated decision structure that is far more effective than a traditional silo-oriented structure. At ONR, a program officer is well-versed not only in one of the three phases of R&D, but in all three. That is, each program officer oversees the development of a program through its basic research (6.1), applied research (6.2), and advanced development (6.3) phases, with all oversight done in the same building. In the other services, he noted, program officers for each phase tend to be different people in different locations.

The R&D investments of the ONR are distributed among naval labs and centers, universities, and private industry firms. The proportion of these investments varies by type of research. For 6.1 projects, 62 percent of funding goes to universities, 31 percent to naval labs and centers, and 7 percent to industry. For 6.2 projects, 47 percent of funding goes to industry, 30 percent to naval labs and centers, and 23 percent to universities. For 6.3 projects, 65 percent goes to industry, 21 percent to naval labs, and 14 percent to universities.

The ONR presence in Washington, DC, is modest, consisting of only the headquarters and program staff. It connects with a widely distributed network of naval labs and their corporate R&D centers associated with acquisition, and funds part of that R&D work. The ONR is unusual in devoting considerable funding to its presence abroad, which includes 25 PhD-level scientists whose job is to understand new technology trends internationally. It supports R&D programs in 70 countries, all 50 states, 1,078 companies (including 859 small businesses), and 1,035 universities and nonprofits. ONR, together with its sister DoD international agencies, is among the few federal agencies that can fund work abroad. It also spends about $8 million a year on STEM education, from the high school level to programs for young researchers.

In conclusion, he said that the features of the ONR, which few people have heard of, constitute a model worthy of emulation at the national scale. While responsible for the Navy’s applied and translational needs, it fights to retain its core basic research, “the seed corn for innovation. We fight off people who want to change our budget and get rid of the basic research in order to buy weapons today. We understand the importance of both, but we also maintain an independence.” The office must struggle to do that, he emphasized, given the constraints of any federal program, including the Federal Acquisition Regulations, the DoD 5000 process flow chart, Office of Personnel Management rules, and other bureaucracy. “With all those constraints,” he said, “it still looks and functions a lot like the best of corporate innovation frameworks. Our corporate labs, special basic research, and funding structure are balanced between what we want to do now and what we want to do later. And all of that is set up in a reasonable risk profile. We believe that it works for us, and can work for others.”



Center for Strategic Scientific Initiatives, National Cancer Institute, National Institutes of Health

Dr. Lee proposed an innovative approach to research on one of the most intractable research challenges—discovering cures for cancer—and demonstrated that even the largest federal agencies can have the flexibility to experiment with unusual strategies.

The need to innovate in this case, he said, was apparent because of three stark realities. First, cancer continues to be a heavy disease burden in the United States. About half a million Americans died of cancer in 2011, and some 1.6 million will be diagnosed with the disease in 2012. In 2010, cancer care cost Americans $124.6 billion.

Second, there is virtually no curative therapy for disseminated or metastasized cancers, which cause more than 90 percent of cancer deaths. “As the disease begins to spread,” he said, “no matter where it starts, the data shows your ultimate outcome grows dismally worse the farther the disease spreads.”

Third, there has been virtually no change in this reality in the four decades since the “war on cancer” was proclaimed by President Nixon.32 In fact, unlike other major disease killers, cancer continues to take nearly the same toll it did in 1950. For heart diseases, the death rate per 100,000 Americans has dropped from 586.8 to 203.1; for cerebrovascular diseases, it has dropped from 180.7 to 44; and for pneumonia and influenza it has dropped from 48.1 to 18.5. For cancer, the rate has barely changed, declining from 193.9 to 186.2. The cancer health burden is also global; the rate of cancer mortality worldwide is estimated to reach 10 million per year by 2020, with an incidence of 16 million cases per year. Almost every area of the world has suffered a 50 percent increase in cancer deaths since 2002.33

It is difficult to blame this slow progress on a lack of knowledge. In fact, new knowledge is accumulating at an unprecedented rate. Just a decade ago, the world was abuzz when scientists succeeded in identifying all the approximately 20,000–25,000 genes in human DNA, and determining the sequences of the 3 billion chemical base pairs that make up human DNA. Today researchers are launching programs to examine many thousands of genomes.

“It’s not that we aren’t generating enough knowledge in this area,” said Dr. Lee. “But we have to ask whether this additional knowledge is yielding more solutions for patients.” He noted that at present it takes 10 to 15 years to develop a new drug, and the cost of doing so rose exponentially between 1990 and 2006, reaching $1.8 billion.34 Over the same period, total industry R&D expenses for drug discovery and development rose from less than half a billion dollars to more than $35 billion. Despite this increase in spending, there are no more drug approvals today than there were around 1980.

“This is not sustainable,” he said. “In 2009 we were able to get approval for only 24 new molecular entities through the FDA; and of those only 17 were considered brand-new. That is very disheartening. Likewise, the situation with biomarkers is even more dismal, with 1.5 biomarkers being approved per year from thousands of samples.”35

Researchers in different sectors have debated whether too many papers are being published by public-sector research institutions, but Dr. Lee doubts that this is the cause of the problem. Over the last 40 years, 153 FDA approvals were for drugs that originated in public-sector research institutions, he said, or about 9.3 percent of all approvals. “If you just look at the priority review, 20 percent of those were done by public sector research institutions, and virtually all important, innovative vaccines introduced in the last 25 years have been created by PSRIs. So that level of innovation does exist in this sector. The question is how to accelerate it.”

The current paradigm for drug creation, which Dr. Lee called “turning the crank,” has been used for many years. It begins with gene studies and moves through target identification and validation, drug creation, and finally three stages of clinical trials—the “traditional costly and slow route of drug development.” The challenge, he said, was how to break out of this cycle.

Dr. Lee is experimenting with one possible way. “We reached out to the community 10 years ago and asked for their key needs as researchers,” he said. “What we got back was a little surprising. First, everybody wanted standards and protocols. They also wanted real-time, public release of data. They wanted large, multidisciplinary teams and a pilot-friendly team environment to share failures as well as successes with each other. Finally, they wanted team members who themselves have trans-disciplinary training. We thought all of these would be difficult at the time, especially public release of data. We felt that if we were able to meet just a few of these needs we would have the potential of transforming how we do drug discovery and diagnostics for cancer.”

This idea was greeted with enthusiasm by Dr. Anna Barker, former deputy director, and continues to be embraced by Dr. Douglas Lowy, the current deputy director of NCI, who “took all of those bullet points and put them right into our mission. He said we absolutely needed to build programs that had broad deployment of data and tools for everyone in order to empower the entire cancer research continuum—not just basic science or treatment or diagnosis or prevention.”

At the NCI, Dr. Lee’s Center for Strategic Scientific Initiatives (CSSI) was just one piece of the $18 billion National Cancer Program, budgeted at a mere $145 million. Nonetheless, the CSSI launched its bold plan in 2003 with a “Technology Dashboard” called IMAT, or Innovative Molecular Analysis Technologies, announcing to the research world that the program would have two pieces:

  • Innovative Technologies for Molecular Analysis of Cancer, for which proof-of-concept technologies and projects were encouraged, and were driven by milestone and technology development, without biological content;
  • Application of Emerging Technologies for Cancer Research, including validation and dissemination of platforms, and demonstration of impact on basic and clinical research.

“We said, Come with your best ideas. Many people still don’t believe us when we tell them we want new technology-driven ideas with minimal biology. What we got in 2003 were some of the ‘same old-same old,’ but then we also were surprised to find a lot of genomic platforms, some proteomics platforms, and some nanotechnology platforms in the innovation space, and an overwhelming number of genomics platforms in the emerging technologies space. So we responded to the scientific community and took the ‘easy’ path to go after each one of these systematically.”

The first program was to examine how many types of genomics platforms were being studied, and the different reasons people thought cancer was a disease of the genes. They did not try to pick the winner, but let the proposers compete head to head, as in engineering. “I’m a chemical engineer by training,” he said, “so we thought of this as a means to generate an analog of a ‘steam table’ for cancer. To do so, we needed to catalog all the genomic changes using orthogonal platforms with the same patient sample, repeat this on 10-fold more samples than was previously being done in the literature, and finally do this for not just one but for as many cancers as possible. Most important we took seriously that bullet point of making the data public quickly.”
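The “steam table” analogy can be made concrete with a minimal sketch: a reference lookup keyed by cancer type and gene, in which estimates from orthogonal platforms are averaged into a single consensus value, much as a steam table tabulates one property per temperature-pressure pair. The data and structure below are entirely hypothetical placeholders, not TCGA’s actual schema or results:

```python
# Hypothetical sketch of the "steam table" idea: a reference lookup of
# genomic alteration frequencies, keyed by (cancer type, gene), aggregated
# across orthogonal measurement platforms. All values are invented.
from collections import defaultdict

# (cancer_type, gene, platform, fraction of samples with an alteration)
raw_calls = [
    ("GBM", "EGFR", "array", 0.41),
    ("GBM", "EGFR", "sequencing", 0.39),
    ("GBM", "NF1", "array", 0.14),
    ("GBM", "NF1", "sequencing", 0.16),
]

# Group the orthogonal platform estimates under one (cancer, gene) key.
table = defaultdict(list)
for cancer, gene, platform, freq in raw_calls:
    table[(cancer, gene)].append(freq)

# Average them into a single consensus entry per key.
consensus = {key: sum(vals) / len(vals) for key, vals in table.items()}

print(consensus[("GBM", "EGFR")])  # consensus estimate for EGFR in GBM
```

The point of the structure, as in the talk, is repeatability: any researcher can query the same consensus entry rather than re-measuring, and adding a new platform or disease only appends rows to the catalog.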

They quickly launched The Cancer Genome Atlas (TCGA) for three pilot diseases—brain, lung, and ovarian cancer. They funded genome sequencing and characterization centers to generate the data, as well as data coordinating centers to quality-control it and to analyze it orthogonally.

By 2008, the TCGA program released its first reference cancer genome, of glioblastoma, for public use. It was published under a single author, The Cancer Genome Atlas Network, which comprised more than 300 contributors, and was described as a “comprehensive genomic characterization that defines human glioblastoma genes and core pathways.”

“We found,” he said, “despite everybody’s disbelief, a couple of genes no one had ever associated with brain cancer.” This energized the scientific community, many of them not funded by CSSI, to use this reference genome, as a chemical engineer would use a steam table, to find additional signatures. In 2009, a group using the reference data found that response to aggressive therapy differs by subtype, which allowed new ways to exclude patients who were unlikely to respond to a drug. In 2010, another group identified a new subset of GBM that occurs in younger patients and provided better prediction of outcomes.

He then showed a figure he generated using data he downloaded from The Cancer Genome Atlas, which he described as his genomic steam table across diseases, and encouraged conference participants to try downloading the data. The picture depicted rows, each of which represented one patient, and columns, each of which represented one of the 23 human chromosomes. “This shows not only how cancer truly is a different disease, depending on the patient,” he said, “but also how lucky we were that we started with glioblastoma in being able to find a reproducible signature.”

FIGURE 1 Genomic “Steam Table” (Summer 2011)

Compiled from The Cancer Genome Atlas, the chart shows data available through summer 2011 from 2,053 cancer patients. Each row represents one patient and each column one of the 23 human chromosomes.

SOURCE: Jerry S. H. Lee, Presentation at June 28–29, 2012, National Academies Symposium, “Building the Illinois Innovation Economy.”

He noted that the figure used only data available through the summer of 2011, representing approximately 2,000 patients. He then showed an updated figure capturing data through February 2012, in which roughly 2,000 more patients had been added, reflecting the rapid pace of the project. “The ease of obtaining this data is providing the equalizer for everybody to innovate together,” he said. “Those who are not able to afford to do this type of characterization can still benefit from using the data.”

Dr. Lee then reminded everyone that while impressive, this data can only benefit the patient if it is translated into clinical interventions. As such, he said that the first follow-up to the genome program was the Cancer Target Discovery and Development Network (CTD2) which accelerates the translation of patient genomic data into clinical application. “The pilot phase was possible using stimulus funds in 2009 to launch a network to computationally mine large-scale genomic data to identify new therapeutic target candidates and to subsequently confirm novel modulators, such as small molecules and siRNAs,” he said. Models, reagents, analysis tools, and data from this network continue to be shared with the scientific community with the goal of finding and testing new clinical interventions.

Compiled from The Cancer Genome Atlas, the chart shows data available through February 2012 from 4,231 cancer patients. Rows represent individual patients; columns represent the 23 human chromosomes.

FIGURE 2 Genomic “Steam Table” (Spring 2012)

SOURCE: Jerry S. H. Lee, Presentation at June 28–29, 2012, National Academies Symposium, “Building the Illinois Innovation Economy.”

The second follow-up is the Clinical Proteomic Tumor Analysis Centers (CPTAC) program. Because not all genomic aberrations are reflected as proteins, the purpose of the program is to identify the modified proteins using the same samples characterized by the TCGA program. The program was launched in September 2011, with some samples already processed. Data from this network will be shared with the opening of a public data portal in summer 2012.

Dr. Lee then noted that through programs such as TCGA and CPTAC, researchers began to understand more about the molecular aspects of cancer and recognized that perhaps the best interventions would occur at the micro- and nanoscales. Fortunately, a pilot program to push nanotechnology into clinical studies had been launched by CSSI in early 2005. “Amid much skepticism,” he said, “we believed that nanotechnology could be used in the clinic, not just for basic life sciences.” He noted that Dr. Mirkin and others agreed, and helped persuade NCI’s external scientific advisors at the time that this was possible. This effort has now entered its second phase, he said, which builds upon the more than five clinical trials launched in the first phase and will be even more clinically focused. “Already we have more than a dozen nano-enabled diagnostic, therapy, and imaging trials in this network.”

He returned to the topic of data and the challenge of interpreting and understanding the large new flows generated by the various CSSI programs. “Who is going to interpret and understand it all?” he asked. To tackle this, CSSI made the bold move of inviting the participation of scientists from outside the cancer field for whom large data sets are a familiar part of their own work. These included physicists, engineers, mathematicians, computer scientists, and other quantitative scientists, who were asked to look at the data from a different perspective and offer their own ideas of what causes cancer and how the disease works.

“Physical scientists have very different ways of interpreting the data,” he said. “We gave them the difficult charge not just to do better science, but paradigm-shifting science. We asked them to build new fields of study based on their perspective of how the disease works. We want them to build trans-disciplinary teams and infrastructure to better understand and control cancer through the convergence of physical sciences and cancer biology.” He then noted that Dr. Nagahara would be describing this Physical Sciences-Oncology Centers (PS-OC) program in further detail the next day.

He said that when Dr. Harold Varmus became the new NCI Director in 2010, he saw the many unique programs launched by CSSI and suggested that the Center implement a new project that has become known as “NCI’s Provocative Questions” (PQ). Dr. Varmus wanted to challenge the scientific community to think about important but non-obvious questions in cancer research. “He asked: how can we get some of the people who have really good ideas to come and talk to us?” The PQ project is now underway, drawing on workshops, the web site, and other inputs. The first round elicited proposals from many countries, and Dr. Lee said he hoped to announce awardees shortly.

He closed with some reflections about the progress of CSSI. “I don’t think we can actually generate innovation,” he said. “It just happens. We’re still trying to figure out how you actually talk about innovation across sectors without comparing apples to oranges.” He discussed the uncertain passage from creativity to feasibility, the journey of an idea from the pilot stage (“This won’t work”) to early stage (“Will this work?”) to mid-stage (“This might work”) to last stage (“This works!”). Many researchers bring a new idea to the NIH for funding, he said, and each time they face an “innovation funnel” that looks like the mouth of a shark. “Every time the investigator clears the funding hurdle and moves through the funnel to the next level, they have to run the same gauntlet, perhaps less prepared than they were before. How do we retain the ones who want to give up not because their idea was weak, but because they were not prepared to go through that next stage?” After they clear the innovation funnel, he said, they face the final headache of the clinical trial, where the chances of success are low. “Trying to keep them moving forward with a smaller and smaller carrot is difficult, even though many of our investigators are actually outperforming our expectations. How do we capture that beyond just counting publications and patents? Have we now started to subject initiatives to a ‘tenure track’ mentality and reward quantity of output versus the innovation of the output?”


Dr. Mirkin commented that the programs developed by CSSI are truly innovative, and are “beginning to pay off.” He described a “fundamental flaw in the way funding is done across the agencies.” In all the start-up companies he had worked in, he said, the time and effort spent in writing and rewriting proposals proved to have little if any value by the time the project was finished. Similarly, research centers have to run an equivalent gauntlet, such as proving the value of a diagnostic or therapeutic candidate, only to find that after clinical trials, “the funding just stops and you face a different group now and there is no connection between the two. Unless we close those gaps it’s hard to imagine progress.” He asked whether there was a strategy within NCI to adopt a model that more closely resembled that of DARPA, which says, in effect, “We want to see you at the next level fast; here is a check.” NCI cannot currently do that, he said, nor can the SBIR program or other agencies. “The timeline is way too long.”

Dr. Lee said that at the upcoming NCI retreat, a topic on the agenda was how to push initiatives from concept to funded projects as fast “as we want to.” He noted that CSSI has already set the bar high: the PS-OC program was conceived and funded within one year, something that had never been done before. “An issue for us is to accelerate the funding between the gaps, and not the development of the programs themselves,” he said.

Dr. Peterson said that the i-Corps uses this model of rapid funding. “If you look at the process for identifying those areas that have developed good ideas for basic research investments, i-Corps does just what you say. It operates on a quarterly basis, so that the typical time from identifying a potential project to giving a decision on a grant is a matter of weeks. The review is done by the program officer, who often makes the initial contact as well. He already knows about the work so far, and asks the applicant whether this is something they want to pursue. It’s an important experiment. The challenge is doing it at large scale, because we may be accused of picking winners and not having the classical review process.”

Mr. Taylor said he was left with the impression that the landscape is changing in government. The term innovation was not used 20 years ago, nor 10 years ago. Each agency is trying to be more efficient in using public funds. He said he applauded agencies in their ongoing efforts. “It takes inputs from people like you to nudge agencies in the right direction.”

Dr. Roberson followed up on Dr. Mirkin’s point on the importance of timing. “Some deals move very fast. Some require a large amount of money, some a small amount. What is the best vetting process to determine that what you’re doing will actually have impact?”

Dr. Fall added that part of the problem is that the federal government “is incredibly risk-averse. It’s hard to get people to accept the possibility of failure, and fund things with the intention of squeezing the risk out. And it isn’t enough to have a process that makes sense. It has to allow you to accomplish the goal in the right time frame.”

Cmdr. Stuart Walker, a naval reserve officer, asked how well the restructuring of the translational layer of ONR was proceeding. Dr. Fall answered that it created an “ongoing tension” among the three directors who must compete for their share of the R&D budget, including the basic research director “who is there to protect the seed corn.” He added that the tension “works very well” and was an interesting model that was being copied by others. “I don’t think we ask often enough the basic question of how to structure an agency for optimal efficacy, like ours or NCI’s CSSI.”

A questioner asked whether during these austere times agencies were able to achieve savings through collaboration and leveraging their respective resources. Dr. Peterson agreed that many agencies described challenges in trying to work together, but that he had found the opposite for NSF. “All the really good collaborations happen where the action is. That is at the program officer level, where people are specifically looking for partnerships, such as groups at NCI, DOE, and ONR.” He cautioned, however, that he sometimes “almost has to be careful about describing all the great collaborations” because of potential criticism from those who misunderstand collaboration as duplication. “With well-designed collaboration,” he said, “both agencies benefit from using complementary resources.”



Directed energy refers to the use of aimed energy, without a projectile, as a weapon or other application, such as a high-energy laser or high-power microwave. <http://www>.


An EM railgun, according to the ONR website, is a long-range weapon that fires projectiles using electricity instead of chemical propellants. <http://www/Media-Center/Fact-Sheets/Electromagnetic-Railgun.aspx>.


The National Cancer Act of 1971 is generally viewed as the beginning of the war on cancer, although the term “war” did not appear in the legislation.




Paul et al., Nature Reviews Drug Discovery, March 2010.


Anderson, Leigh, Clinical Chemistry, 2010.