NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

National Research Council (US) Chemical Sciences Roundtable. Reducing the Time from Basic Research to Innovation in the Chemical Sciences: A Workshop Report to the Chemical Sciences Roundtable. Washington (DC): National Academies Press (US); 2003.


6 What Have We Learned from Hot Topics?

James R. Heath1

University of California, Los Angeles

This chapter describes two high-profile projects: one involving an institute that I helped build, and the other a scientific program that I have led. Through these projects, I will address issues such as funding for long-term research, high-risk projects, the benefits and pitfalls of industrial collaboration, and technology transfer.

It is always logical to put science on top, and so I will talk about that first. The scientific program began several years ago when I visited the Hewlett-Packard (HP) Laboratories to help my friend Stan Williams set up a wet-chemistry effort there and to work on a collaboration involving the fairly esoteric task of performing transport spectroscopy measurements of two-dimensional solids of quantum dots. This project was funded by the National Science Foundation's (NSF) Grant Opportunities for Academic Liaison with Industry (GOALI) program, which also included Paul Alivisatos. I will come back to the nature of this funding later. In any case, Stan and I soon began to talk about other things, such as whether it would be possible to utilize self-assembly, or something like it, to build a computing machine. This was a problem that I had thought about a few years before when I was on the staff at IBM, but it was one of those thoughts that just didn't go anywhere because I didn't have sufficient "fuel" to run with it. Stan and I had, in the back of our heads, the concept that such a machine might be extremely energy efficient. Consider, for example, the cost of switching 1's into 0's and back again. Rolf Landauer and Richard Feynman thought about this concept many years before, so we had very smart people to learn from.

A 1 and a 0 have to be energetically different from each other or you can't tell the difference between them; therefore, it costs energy to do this switching process. If a calculation is done by switching the 1's and 0's, there are going to be entropy costs as well. In any case, a 1-electron (quantum state) switch appeared to be an attractive option for minimizing the energy of this process. To be robust, the energy difference between a 1 and a 0 should be about 20 to 50 k_BT, and we wanted T to be room temperature. By simply considering a particle-in-a-box model for calculating energy level spacings, it becomes apparent that such a switch must be about the size of a molecule. Nevertheless, Stan and I didn't have much of a clue as to how to make a molecular switch, much less a large-scale circuit of such switches, as would be needed for a computer. Thus, we started with nothing but an aspiration to build this computer. To be honest, I wanted to build the computer simply because it represented a significant challenge, and we clearly did not know how to do it. I thought that if we did build it, it didn't even have to be useful to be worthwhile. We would very likely learn some wonderful science along the way, and we might even discover things that were useful. That has, in fact, happened, and I will return to this point.
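The particle-in-a-box estimate above can be checked in a few lines of arithmetic. The sketch below (my own illustration, not from the chapter) finds the box length L at which the spacing between the two lowest levels, E2 - E1 = 3h^2/(8mL^2), equals 20 to 50 k_BT at room temperature:

```python
# Particle-in-a-box estimate: how big is a box whose lowest level spacing
# E2 - E1 = 3 h^2 / (8 m L^2) equals 20-50 k_B*T at room temperature?
from math import sqrt

h = 6.626e-34      # Planck constant, J*s
m_e = 9.109e-31    # electron mass, kg
k_B = 1.381e-23    # Boltzmann constant, J/K
T = 300.0          # room temperature, K

def box_length(n_kT):
    """Box length L (meters) whose E2 - E1 spacing equals n_kT * k_B * T."""
    dE = n_kT * k_B * T
    return sqrt(3 * h**2 / (8 * m_e * dE))

for n in (20, 50):
    print(f"{n} k_B*T -> L = {box_length(n) * 1e9:.2f} nm")
```

Both answers come out near 1 nm, which is why a robust one-electron switch ends up being molecule sized.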

First, let me talk about nanotechnology and how this project was funded. The project was started before the National Nanotechnology Initiative, and funding for ill-defined (call it exploratory), cross-disciplinary research was (and still is) hard to get. Thus, we submitted a proposal to NSF's GOALI program that actually was fairly focused. The great thing about NSF is that the agency doesn't really care what you do with the funding it gives you, as long as you perform high-quality science. NSF also supports long-term projects, but this is a concept that must be continually retaught to our taxpayers (and even to NSF program monitors on occasion). The advent of the National Nanotechnology Initiative and similar programs has begun to educate the public about the importance of long-term research. Nevertheless, nothing helps garner public and federal research support more than pointing to legitimate commercial products that have come out of government-funded research. This is understandable and as it should be, since there must be some level of payoff to the people who pay for it. However, consider the case of nanotechnology in comparison with biotechnology. Research in biotechnology has stabilized only in the past few years to the extent that there are a number of companies making money. Figure 6.1, which was put together by my colleagues Michael Darby and Lynn Zucker at the University of California, Los Angeles (UCLA), shows a similar rise in the number of publications and patents for nanotechnology and biotechnology, plotted against the number of years from each field's base year. Those base years are 1973 and 1989 for biotechnology and nanotechnology, respectively, and thus nanotechnology should become a major commercial sector in 10 to 15 years. This can only happen if we continue to invest in and nurture this field. Private money (venture capital) will not be available, because it is too early.
Therefore, nanotechnology will not yield returns without sustained federal funding, but it will likely earn wonderful returns if we have the willpower to stick with it. This is an argument that must be made time and time again to our government leaders and to the people who vote for them. This is true for nanotechnology and it is true for any other basic science initiative our nation undertakes.

FIGURE 6.1. The time lag from an emerging science's base year to the time it becomes productive causes a delay in the financial returns on investment. The base year for biotechnology is 1973 and the base year for nanotechnology is 1989. Courtesy of Michael Darby and Lynn Zucker, UCLA.

The NSF GOALI grant that we received allowed a small group of people with different backgrounds to take the time to come to a common understanding on a particular issue. This was really critical, or the project would not have gone anywhere. Let me say a few words about how difficult it is for people from very different backgrounds to come together. In addition to our original quantum dot project, Stan and I began to interact with Phil Kuekes, a computer architect at HP, because Phil had done some interesting work in terms of building a molecular computer. Stan and I wanted to learn from Phil what he knew, and he was anxious to learn some physics and chemistry. We sat together many times for 4-hour sessions trying to learn each other's language. I remember a few times walking out of one of those 4-hour sessions with only a common noun.

Phil had built a wonderful machine called Teramac, which consisted of circuits that were laid out in a “periodic” (as a chemist I thought “crystalline”) arrangement, and it worked perfectly in spite of having a quarter million “defective components” (again I translated this into finite chemical reaction yield). A Pentium computer chip, by contrast, consists of very complex circuit arrangements with no defective components. Stan, Phil, and I used the architecture of Teramac to develop a model for our computing machine. We published this architecture and its implications for nanotechnology in Science in 1998.2 We received a lot of press from that paper, and we also got the attention of the Defense Advanced Research Projects Agency (DARPA). Unlike the NSF, which gave us the freedom to explore but not a lot of money to explore with, DARPA made us focus on a set of deliverables but gave us a lot of money to produce them. We promised DARPA a 16-bit molecular electronic-based random access (reconfigurable) memory circuit within 2 years. To be honest, this deliverable scared us, but we made very rapid progress. In 1999, just prior to the start of the DARPA program, my collaborators and I physically demonstrated that molecules could be useful for simple logic circuitry, and in 2000 we reported the first truly reconfigurable switches based on these molecules using a one-electron process. By the next year we had a 64-bit random access memory working, and this work was cited in December of 2001 as Science journal's Breakthrough of the Year. In 2000, defect-tolerant architectures were placed onto the Semiconductor Industry Association Roadmap—something that was catalyzed by our Teramac paper. In 2001, molecular electronics was also placed on the Roadmap, although as a far-term research goal. 
Recently, we have demonstrated patterning techniques that allow us to make memory circuits similar to the 64-bit memory we delivered to DARPA, but at a density approaching 10^12/cm^2, which is actually beyond where Stan and I had originally intended this project to go.

This research has received a lot of media attention. I had previous experience with the press through my work on the discovery of C60, as well as some other research we had published in the mid-1990s out of my UCLA group. Nevertheless, I did not realize at the time that the press would take one short quote from over an hour of conversation and use it as the whole basis of an article. Some of the quotes put quite a bit of pressure on us. For example, one of the quotes included the phrase "a thousand Pentiums on a grain of sand," which made it into the State of the Union Address a few years later. When this quote was made in 1998, the highest possible bit density was about 10^8 bits/cm^2. If a Pentium chip is about 1 cm^2, then 1,000 Pentiums yield about 10^11 bits. A grain of sand has a volume of about 8 mm^3, equivalent to a cube 2 mm on a side. Since computer architectures are typically planar, the 10^11 bits must be stacked; using four layers, the required bit density to fit 1,000 Pentiums in a grain of sand is about 6.25 × 10^11 bits/cm^2 per layer of the 2-mm-high stack, which is nearly equal to the volumetric storage density of the human brain. This is actually the density of devices that we are working on developing right now, although we only have a few tens of thousands of devices in our largest circuits at this high density. Furthermore, making integrated circuits (logic and memory talking with each other and with the outside world) represents a significant step in complexity beyond where we are working now.
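The arithmetic behind the quote can be checked directly. The following sketch (my own back-of-the-envelope reconstruction, using the rough figures from the text) computes the bit density required to fit 1,000 Pentiums' worth of bits into a 2-mm cube using four stacked layers:

```python
# Back-of-the-envelope check of "a thousand Pentiums on a grain of sand."
# All figures are the rough estimates from the text, not measured values.

n_bits = 1_000 * 1e8          # 1,000 Pentiums x ~10^8 bits each (~1 cm^2 chips)
grain_side_mm = 2.0           # a grain of sand ~ 8 mm^3, i.e., a 2 mm cube
layers = 4                    # planar circuits stacked in four layers

footprint_cm2 = (grain_side_mm / 10) ** 2   # 2 mm x 2 mm = 0.04 cm^2
bits_per_layer = n_bits / layers
density = bits_per_layer / footprint_cm2    # bits/cm^2 required per layer

print(f"required density: {density:.3g} bits/cm^2")
```

The result, about 6.25 × 10^11 bits/cm^2, matches the density quoted above and is roughly the device density of the circuits described in the preceding paragraph.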

Why does HP support this project? Stan Williams leads the project at HP, and it is nearly the only blue-skies project that HP funds. The funding the group receives from DARPA is critical to the program. The culture at HP is like that of any other big company in the information technology business, in that any one of HP's products and services will become obsolete within 3 years. The vast majority of its resources are therefore spent on reinventing itself every 3 years, and HP pays very little attention to technology that clearly cannot be developed within that time frame. Although Stan has kept the HP ship fairly steady, working with HP has, at times, been a struggle. Largely because of Stan's efforts, we have been able to hold HP's attention thus far. This has benefited us with DARPA as well. The very fact that HP has a strong interest in molecular electronics has given a significant stamp of approval to DARPA's goals for this program. For example, DARPA Director Tether has visited HP labs and been briefed on our molecular electronics program. He has not visited UCLA, Harvard, or other academic labs that are significant players in this field.

We have collectively received several million dollars in grants, which have been split roughly 50/50 between HP and UCLA. We have six jointly owned patents that have not only been filed but have also been issued over the past few years. An unusual aspect of our collaborative agreement is that both UCLA and HP retain full ownership privileges for intellectual property that is jointly developed, regardless of the relative effort each party has put forth. We also filed about six other separate patents during that same time period. We have published six joint papers and about 20 other papers; several of these papers have received international press coverage. This type of publicity is clearly of value to HP. Its efforts to publicize some of this work have been remarkable when compared with what universities do, but are probably not so unusual if one tracks the positive impact that such press releases have on HP's stock price. IBM operates similarly.

This program has accomplished much, but it has a long way to go before it produces useful technologies. Probably the most promising avenue is to create energy-efficient (our first goal) computational platforms that are substrate independent. This is a unique niche for molecular electronics that is not easily filled by any other information technology platform. However, out of this work have also come avenues for fabricating very high frequency (GHz) resonators (which help your cell phone talk to your computer), ultra-high-density molecular sensor arrays for interfacing with the molecular language and the timescales of living cells, and several other wonderful discoveries. I believe that these discoveries alone will more than justify the federal investment that has gone into this program.

Now let me talk about the other high-profile project: the institute. For the past 2 1/2 years, I have been involved in the California NanoSystems Institute (CNSI), which is a $100 million program funded by Governor Gray Davis, and which is matched by $200 million from nonstate funding sources. Our objective is to be the catalyst to make Southern California the birthplace of the nanotechnology industry. Our mission, as stated by Governor Davis, is to keep California's technology industry thriving for the next 30 years by following these guidelines:

  • Science—lay the foundation
  • Technology—take it the next step
  • People—train the work force
  • Corporate contracts and start-up companies—facilitate cooperation
  • Intellectual property (IP)—find new ways to transfer technology

In my experience at the University of California (UC), managing IP has been a very difficult process. When companies have shown interest in forming a co-development agreement with UC, more often than not they have ultimately given up because of the barriers involved in the process. Some faculty have left UC for start-up companies as a result of the university bureaucracy's tight and conservative policies regarding IP. As a result of this lack of success, Governor Davis gave CNSI a specific directive to manage IP differently, in a more liberal and open manner.

It is of maximum benefit to UC to have companies spring from its research labs but to have the faculty remain to invent and discover yet again. This requires a facile IP program that works for the faculty and is responsive to the needs of industry, venture capitalists, and others. Consider what is involved in a technology transfer process, especially in a high-technology area such as nanotechnology. A patent, while sufficient to get venture capital interests, is not by itself very useful. Knowledge transfer from the academic lab to the start-up or existing company is a critical component, and all of this—IP licensing, knowledge transfer, and so on—must be done rapidly to maintain a competitive edge. At many universities, including UC, this is nearly impossible to do.

Consider the following example. Several years ago my colleague at UC, Mike Phelps, devised a demonstration project that was to illustrate a new way of managing IP. His goal was to build something known as the LA Tech Center, which was a research arm of a company known as CTI Molecular Imaging. CTI is a publicly traded $1.5 billion company that makes PET scanners and is very successful. CTI came out of Mike's labs (Mike is the inventor of PET). Anyway, this project required two deals to be made. One was done by hiring a private lawyer who had to work with a single individual at the UC Office of the President in Oakland (no committees and high-quality legal assistance). The other was done through the regular channels and required a committee for the UCLA IP office, a committee for the president's office of the UC system, low-quality lawyers assigned by the university, and nearly complete exclusion of Mike in the negotiation process.

As of this writing, the LA Tech Center has been up and running and profitable for 1 1/2 years, because the first (demonstration) deal was done. The second deal is not yet done! The goal of the LA Tech Center is very interesting. It relates back to the issues of patents and knowledge going hand in hand to transition IP out of the university and into the commercial sector. Figure 6.2 shows the setup of a research facility in the LA Tech Center, a unique collaboration between academia and industry. At one side is basic research that reaches back to UCLA and CNSI, with a scientific motivation and fundamental scientific approach. The commercial motivation with the applied science and engineering approach is kept separate on the other side of the building. The boundary separating the scientific from the commercial is mobile according to the project's need. While the money that runs the LA Tech Center doesn't change sides, the people do. This is an extremely valuable aspect of tech transfer.

FIGURE 6.2. The commercial side of the LA Tech Center is highly leveraged by the scientific side (100:10); the boundary separating the scientific from the commercial is mobile according to the project's needs.



Based on ideas like this, and on my own frustrating experiences with the UC office of IP, I drafted a charter for the CNSI that does several things. First, the CNSI answers to a board that, while having three UC representatives, also has eight other members. Second, it gives the CNSI faculty their own authority over membership and building space. Finally, it gives CNSI the authority to pursue its own IP program. This means that CNSI can retain independent counsel and doesn't have to move through the slow bureaucracy of UC. Furthermore, the director or co-director has signature authority on intellectual property. Speed is also a crucial factor in IP transfer, and the exclusion of a committee-based decision process has allowed CNSI to maintain a one-month time line for its operations. CNSI is a joint UCLA/UC Santa Barbara institute, and both Chancellors Carnesale (UCLA) and Yang (UCSB) have signed this document, as have all of the CNSI faculty. The UC Office of the President has been reluctant to sign it, but I believe it will eventually come around.

Nanotechnology needs long-term funding, which currently does not exist in the private sector. Corporate research labs occasionally fund such projects, but the prospects for corporate involvement in long-term research projects have diminished greatly over the past decade. Other private funding, such as what is represented by venture capital firms, is also not appropriate for long-term, high-payoff projects. For these reasons and to maintain the U.S. high-technology industrial sector into the foreseeable future, it is vital for research support to be available through programs like NSF's GOALI, DARPA programs, the National Nanotechnology Initiative, and institutes like CNSI.


Mary L. Good, University of Arkansas, Little Rock: I am very impressed with your presentation. Are there other areas you believe to be as important as nanotechnology?

James R. Heath: One area that I think is extremely important and of national interest is detecting and responding to an unknown pathogen. If something is released in the environment today and people begin to die, there is no way pharmaceutical companies will be able to develop a drug in sufficient time to help. If you look at the physical problems involved, diagnosis is already a very difficult problem, and rapid pharmaceutical development is virtually impossible. Although currently no one knows how to diagnose a pathogen and develop a pharmaceutical response within a few days, there is no reason why it cannot be accomplished. People are beginning to think about that very seriously. I am developing a research program on detection with Lee Hood of the Institute for Systems Biology and Mike Phelps from the UCLA molecular and medical pharmacology department, for which the idea is to go from molecules to patients. There are a few other people who are beginning to do this as well. As medicine becomes a molecular world, people will begin to understand the mechanisms of disease. They will then begin to understand how to better use that knowledge, but that type of technology will be very hard to push forward without some sort of government assistance. In fact, if the Department of Defense were to develop a program along these lines with an eye toward national security, it would probably involve some combination of nanotechnology and biotechnology.

Kenneth A. Pickar, California Institute of Technology: I really want to compliment you because everyone knows the University of California is user hostile with respect to commercializing technology. It is not just the professors who hate it; it is also people on the outside. The best revolutions in technology transfer are still coming from people like you who find innovative solutions.

James R. Heath: Fighting this battle took too much out of me.

Kenneth A. Pickar: It is a lot harder when you are fighting upstream. Mary Good mentioned the problem of too much hype about energy in the 1970s. We all can discuss what traveling to the moon in the 1940s would have been like if we didn't have the technology to do it. It would have been an enormous flop as well. However, what I see as a major danger is not the potential of nanotechnology but all this horrific hype, which could cause a crash like the one biotechnology suffered back in the 1980s.

James R. Heath: You are right. At present, all of our technologies are based on "low information content" systems, such as small molecules and atoms, or periodic solids. The nanotechnology world is interested in "high information content" macromolecules as the functional units. We are in the process of learning how to incorporate form, function, and activity at the macromolecular level into nanotechnology. We are not certain how to do that yet, but I think it is a world-changing approach to science and technology. It is a very slow process, but we must always stress the excitement of this field while warding off expectations of immediate gratification. I would like to refer back to the previous comment about the national technology road map for semiconductors. The very fact that there is something called a "road map" dissuades students from pursuing this field. They think you look at the map and do the prescribed research. That is not very exciting, but in the world I described, and in other areas like it, there is actually quite minimal existing knowledge, and discovery science is happening at a rapid pace. The route toward developing nanotechnologies from this science is very long term, and so the excitement must last. A cautionary note about too much hype is equally important.

Robert J. Bianchini, Colgate-Palmolive Company: We just signed an agreement with the University of Michigan to get into antimicrobial nano-emulsions to control anthrax and other diseases. We had a lot of problems getting IP results. In the end, the university did set up a spin-off company. If there is any way that IP issues can be resolved with universities, it certainly would help increase the rate of the innovation process. Our company wanted to protect itself, and we needed the agreement worked out before we could move forward.

James R. Heath: When a university lawyer and the committee behind the lawyer are arguing with IBM, the opponents are unequal. UC runs from that fight, and things get lost in committees as a result. Now UCLA (through the CNSI) offers some flexibility to allow faculty to obtain good, qualified counsel. In my opinion, this flexibility actually pays tremendous dividends. If a faculty member and a company want to make a deal with the university, policies are generally not so restrictive that they keep that from happening. It is the practice that is so frightening.

Joseph S. Francisco, Purdue University: How did you manage to get the University of California to allow you to agree with its policies while in turn taking a very different route in terms of practice?

James R. Heath: We have a bit of a luxury in that the policies at the University of California are actually very flexible. They have been built on top of each other over many years, and just because there is a new rule doesn't mean it negates an old one. So one can do almost anything. However, the university's attitude is "I don't want to get sued, and our invention is worth $5 million." The attitude of the company on the other side is that your invention is worth 10,000 shares of start-up stock, the cost of the lawyer, and the cost of your IP filings. The truth is probably in between, but it is closer to 10,000 stock shares than $5 million. In terms of the CNSI charter, it wasn't very difficult to convince our chancellor that the process for translating IP to the commercial sector was broken. The CNSI also had significant weight in that a large fraction of the UCLA/UCSB inventors were part of the CNSI. However, the CNSI has only about 30 faculty total, so the absolute number is very small. Consider the argument that we gave our chancellor: UCLA is third in the country in terms of IP revenue brought in to the university, yet our IP office still runs in the red and has never come close to running in the black. In fact, if you were to factor in the cost of replacing faculty who leave UC to form companies, our IP office is in really bad shape. So that already is a pretty good argument that there is a problem, and it is one that the UCSB and UCLA chancellors agreed with.



James Heath is currently professor of chemistry at the University of California, Los Angeles, and director of the California NanoSystems Institute, formed by California Governor Gray Davis in December 2000. He was previously a research staff member at the IBM T. J. Watson Research Labs.


J.R. Heath, P.J. Kuekes, G. Snider, and R.S. Williams. 1998. A Defect-tolerant Computer Architecture: Opportunities for Nanotechnology. Science 280:1716.

Copyright © 2003, National Academy of Sciences.
Bookshelf ID: NBK36342

