NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

Jamison DT, Breman JG, Measham AR, et al., editors. Disease Control Priorities in Developing Countries. 2nd edition. Washington (DC): The International Bank for Reconstruction and Development / The World Bank; 2006. Co-published by Oxford University Press, New York.


Chapter 5. Science and Technology for Disease Control: Past, Present, and Future


As we move into the new millennium it is becoming increasingly clear that the biomedical sciences are entering the most exciting phase of their development. Paradoxically, medical practice is also passing through a phase of increasing uncertainty, in both industrial and developing countries. Industrial countries have not been able to solve the problem of the spiraling costs of health care resulting from technological development, public expectations, and—in particular—the rapidly increasing size of their elderly populations. The people of many developing countries are still living in dire poverty with dysfunctional health care systems and extremely limited access to basic medical care.

Against this complex background, this chapter examines the role of science and technology for disease control in the past and present and assesses the potential of the remarkable developments in the basic biomedical sciences for global health care.

Medicine Before the 20th Century

From the earliest documentary evidence surviving from the ancient civilizations of Babylonia, China, Egypt, and India, it is clear that longevity, disease, and death are among humanity's oldest preoccupations. From ancient times to the Renaissance, knowledge of the living world changed little, the distinction between animate and inanimate objects was blurred, and speculations about living things were based on prevailing ideas about the nature of matter.

Advances in science and philosophy throughout the 16th and 17th centuries led to equally momentous changes in the medical sciences. The elegant anatomical dissections of Andreas Vesalius swept away centuries of misconceptions about the relationship between structure and function in the human body; the work of Isaac Newton, Robert Boyle, and Robert Hooke disposed of the basic Aristotelian elements of earth, air, fire, and water; and Hooke, through his development of the microscope, revealed a hitherto invisible world to explore. In 1628, William Harvey described the circulation of the blood, a discovery that, because it was based on careful experiment and measurement, signaled the beginnings of modern scientific medicine.

After steady progress during the 18th century, the biological and medical sciences began to advance at a remarkable rate during the 19th century, which saw the genuine beginnings of modern scientific medicine. Charles Darwin changed the whole course of biological thinking, and Gregor Mendel laid the ground for the new science of genetics, which was used later to describe how Darwinian evolution came about. Louis Pasteur and Robert Koch founded modern microbiology, and Claude Bernard and his followers enunciated the seminal principle of the constancy of the internal environment of the body, a notion that profoundly influenced the development of physiology and biochemistry. With the birth of cell theory, modern pathology was established. These advances in the biological sciences were accompanied by practical developments at the bedside, including the invention of the stethoscope and an instrument for measuring blood pressure, the first use of x-rays, the development of anesthesia, and early attempts at the classification of psychiatric disease as well as a more humane approach to its management. The early development of the use of statistics for analyzing data obtained in medical practice also occurred in the 19th century, and the slow evolution of public health and preventive medicine began.

Significant advances in public health occurred on both sides of the Atlantic. After the cholera epidemics of the mid-19th century, public health boards were established in many European and American cities. The Public Health Act, passed in the United Kingdom in 1848, provided for the improvement of streets, construction of drains and sewers, collection of refuse, and procurement of clean domestic water supplies. Equally important, the first attempts were made to record basic health statistics. For example, the first recorded figures for the United States showed that life expectancy at birth for those who lived in Massachusetts in 1870 was 43 years; the number of deaths per 1,000 live births in the same population was 188. At the same time, because it was becoming increasingly clear that communicable diseases were greatly depleting the workforce required to generate the potential rewards of colonization, considerable efforts were channeled into controlling infectious diseases, particularly hookworm and malaria, in many countries under colonial domination.

However, until the 19th century, curative medical technology had little effect on the health of society, and many of the improvements over the centuries resulted from higher standards of living, improved nutrition, better hygiene, and other environmental modifications. The groundwork was laid for a dramatic change during the second half of the 20th century, although considerable controversy remains over how much we owe to the effect of scientific medicine and how much to continued improvements in our environment (Porter 1997).

This balance between the potential of the basic biological sciences and simpler public health measures for affecting the health of our societies in both industrial and developing countries remains controversial and is one of the major issues to be faced by those who plan the development of health care services for the future.

Science, Technology, and Medicine in the 20th Century

Although rapid gains in life expectancy followed social change and public health measures, progress in the other medical sciences was slow during the first half of the 20th century, possibly because of the debilitating effect of two major world wars. The position changed dramatically after World War II, a time that many still believe was the period of major achievement in the biomedical sciences for improving the health of society. This section outlines some of these developments and the effect they have had on medical practice in both industrial and developing countries. More extensive treatments of this topic are available in several monographs (Cooter and Pickstone 2000; Porter 1997; Weatherall 1995).

Epidemiology and Public Health

Modern epidemiology came into its own after World War II, when increasingly sophisticated statistical methods were first applied to the study of noninfectious disease to analyze the patterns and associations of diseases in large populations. The emergence of clinical epidemiology marked one of the most important successes of the medical sciences in the 20th century.

Up to the 1950s, conditions such as heart attacks, stroke, cancer, and diabetes were bundled together as degenerative disorders, implying that they might be the natural result of wear and tear and the inevitable consequence of aging. However, information about their frequency and distribution, and in particular the speed with which their frequency rose in association with environmental change, provided strong evidence that many of them have a major environmental component. For example, death certificate rates for cancers of the stomach and lung rose so sharply between 1950 and 1973 that major environmental factors must have been at work in different populations.

The first major success of clinical epidemiology was the demonstration of the relationship between cigarette smoking and lung cancer by Austin Bradford Hill and Richard Doll in the United Kingdom. This work was later replicated in many studies; currently, tobacco is estimated to cause about 8.8 percent of deaths (4.9 million) and 4.1 percent of disability-adjusted life years (59.1 million) (WHO 2002c). Despite this information, the tobacco epidemic continues, with at least 1 million more deaths attributable to tobacco in 2000 than in 1990, mainly in developing countries.

The application of epidemiological approaches to the study of large populations over long periods has provided further invaluable information about environmental factors and disease. One of the most thorough, the follow-up of more than 5,000 residents of Framingham, Massachusetts, showed unequivocally that a number of factors are linked with the likelihood of developing heart disease (Castelli and Anderson 1986). Such work led to the concept of risk factors, among them smoking, diet (especially the intake of animal fats), blood cholesterol levels, obesity, lack of exercise, and elevated blood pressure. Epidemiologists' subsequent appreciation that interventions directed at the large number of people with modestly elevated risk may prevent more disease than interventions confined to the small number of people at high risk was an important advance. Later, this work also led to the definition of how important environmental agents may interact with one another, for example, the increased risk of death from tuberculosis among smokers in India.
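The risk factor reasoning above can be made concrete with a little arithmetic. The sketch below (all counts and rates are hypothetical, chosen purely for illustration; they are not Framingham or WHO data) computes a relative risk from a simple two-group comparison and shows why a weak but widespread risk factor can generate more cases of disease than a strong but rare one:

```python
# Illustrative relative-risk arithmetic (hypothetical counts, not study data).

def relative_risk(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Ratio of disease incidence in the exposed group to that in the unexposed group."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed

# Hypothetical cohort: 30 cases among 1,000 exposed vs. 15 among 1,000 unexposed.
print(relative_risk(30, 1000, 15, 1000))  # → 2.0

# A modest relative risk applied to many people (e.g., mildly raised blood
# pressure) can yield more excess cases than a high relative risk confined
# to a few -- the rationale for population-wide prevention.
rr_low, n_low = 1.5, 100_000     # hypothetical: weak risk factor, common
rr_high, n_high = 4.0, 5_000     # hypothetical: strong risk factor, rare
baseline = 0.01                  # hypothetical baseline incidence

excess_low = n_low * baseline * (rr_low - 1)     # extra cases among the many
excess_high = n_high * baseline * (rr_high - 1)  # extra cases among the few
print(excess_low, excess_high)  # → 500.0 150.0
```

Under these invented numbers, the common low-level exposure accounts for more than three times as many excess cases as the rare high-level one, which is the logic behind targeting whole-population risk factors.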

A substantial amount of work has gone into identifying risk factors for other diseases, such as hypertension, obesity and its accompaniments, and other forms of cancer. Risk factors defined in this way, along with similar analyses of the pathological role of environmental exposures such as unsafe water, poor sanitation and hygiene, and pollution, form the basis of The World Health Report 2002 (WHO 2002c), which sets out a program for controlling disease globally by reducing exposure to 10 risk factors: underweight status; unsafe sex; high blood pressure; tobacco consumption; alcohol consumption; unsafe water, sanitation, and hygiene; iron deficiency; indoor smoke from solid fuels; high cholesterol; and obesity. These risk factors are calculated to account for more than one-third of all deaths worldwide.

The epidemiological approach has its limitations, however. Where risk factors seem likely to be heterogeneous or of only limited importance, even studies involving large populations continue to give equivocal or contradictory results. Furthermore, a major lack of understanding, on the part not just of the general public but also of those who administer health services, still exists about the precise meaning and interpretation of risk. The confusing messages have led to a certain amount of public cynicism about risk factors, thus diminishing the effect of information about those risk factors that have been established on a solid basis. Why so many people in both industrial and developing countries ignore risk factors that are based on solid data is still not clear; much remains to be learned about social, cultural, psychological, and ethnic differences with respect to education about important risk factors for disease. Finally, little work has been done regarding the perception of risk factors in the developing countries (WHO 2002c).

A more recent development in the field of clinical epidemiology—one that may have major implications for developing countries—stems from the work of Barker (2001) and his colleagues, who obtained evidence suggesting that death rates from cardiovascular disease fell progressively with increasing birthweight, head circumference, and other measures of increased development at birth. Further work has suggested that the development of obesity and type 2 diabetes, which constitute part of the metabolic syndrome, is also associated with low birthweight. The notion that early fetal development may have important consequences for disease in later life is still under evaluation, but its implications, particularly for developing countries, may be far reaching.

The other major development that arose from the application of statistics to medical research was the randomized controlled trial. The principles of numerically based experimental design were set out in the 1920s by the geneticist Ronald Fisher and applied with increasing success after World War II, starting with the work of Hill, Doll, and Cochrane (see Chalmers 1993; Doll 1985). Variations on this theme have become central to every aspect of clinical research involving the assessment of different forms of treatment. More recently, this approach has been extended to provide broad-scale research syntheses to help inform health care and research: increasing the numbers of patients involved in trials and applying meta-analysis and electronic technology for updating results have made it possible to combine the results of many different trials. Although meta-analysis has its problems, notably the lack of publication of negative trial data, and although many potential sources of bias exist in the reporting of clinical trials, these difficulties are gradually being addressed (Egger, Davey-Smith, and Altman 2001).
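The pooling of results across trials described above can be sketched numerically. A minimal fixed-effect meta-analysis combines each trial's log odds ratio using inverse-variance weights; the three trial summaries below are hypothetical placeholders, not results from any study cited in this chapter:

```python
import math

# Fixed-effect meta-analysis: inverse-variance pooling of log odds ratios.
# Each tuple is (odds_ratio, standard_error_of_log_OR) -- hypothetical trials.
trials = [(0.80, 0.10), (0.75, 0.15), (0.90, 0.08)]

weights = [1 / se**2 for _, se in trials]   # weight = 1 / variance of log OR
log_ors = [math.log(or_) for or_, _ in trials]

# Weighted average of the log odds ratios, and its standard error.
pooled_log_or = sum(w * lo for w, lo in zip(weights, log_ors)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

pooled_or = math.exp(pooled_log_or)
ci_low = math.exp(pooled_log_or - 1.96 * pooled_se)
ci_high = math.exp(pooled_log_or + 1.96 * pooled_se)
print(f"pooled OR {pooled_or:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```

The pooled estimate is pulled toward the most precise (smallest standard error) trial, and its confidence interval is narrower than any single trial's, which is why aggregating many moderate-sized trials can detect treatment effects that individual trials miss. Real syntheses must also check heterogeneity and publication bias, as the text notes.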

More recent developments in this field come under the general heading of evidence-based medicine (EBM) (Sackett and others 1996). Although it is self-evident that the medical profession should base its work on the best available evidence, the rise of EBM as a way of thinking has been a valuable addition to the development of good clinical practice. It covers skills that are not always self-evident, including finding and appraising evidence and, particularly, implementation, that is, actually getting research into practice. Its principles are equally germane to industrial and developing countries, and the skills required, particularly numerical ones, will have to become part of the education of physicians of the future. To this end, the EBM Toolbox Web site was established. However, evidence for best practice obtained from large clinical trials may not always apply to a particular patient; balancing EBM against the kind of individualized care that forms the basis of good clinical practice will be a major challenge for medical education.

Partial Control of Infectious Disease

The control of communicable disease has been the major advance of the 20th century in scientific medicine. It reflects the combination of improved environmental conditions and public health together with the development of immunization, antimicrobial chemotherapy, and the increasing ability to identify new pathogenic organisms. Currently, live or killed viral or bacterial vaccines, or those based on bacterial polysaccharides or bacterial toxoids, are licensed for the control of 29 common communicable diseases worldwide. The highlight of the field was the eradication of smallpox by 1977. The next target of the World Health Organization (WHO) is the global eradication of poliomyelitis. In 1988, the disease was endemic in more than 125 countries. After a resurgence in 2002, when the number of cases rose to 1,918, the numbers dropped again in 2003 to 748; by March 2004, only 32 cases had been confirmed (Roberts 2004).

The Expanded Program on Immunization (EPI), launched in 1974, which has been taken up by many countries with slight modification, includes Bacillus Calmette-Guérin (BCG) and oral polio vaccine at birth; diphtheria, tetanus, and pertussis at 6, 10, and 14 weeks; measles; and, where relevant, yellow fever at 9 months. Hepatitis B is added at different times in different communities. By 1998, hepatitis B vaccine had been incorporated into the national programs of 90 countries, but an estimated 70 percent of the world's hepatitis B carriers still live in countries without programs (Nossal 1999). Indeed, among 12 million childhood deaths analyzed in 1998, almost 4 million were the result of diseases for which adequate vaccines are available (WHO 2002a).

The development of sulfonamides and penicillin in the period preceding World War II was followed by a remarkable period of progress in the discovery of antimicrobial agents effective against bacteria, fungi, viruses, protozoa, and helminths. Overall, knowledge of the pharmacological mode of action of these agents is best established for antibacterial and antiviral drugs. Antibacterial agents may affect cell wall or protein synthesis, nucleic acid formation, or critical metabolic pathways. Because viruses live and replicate in host cells, antiviral chemotherapy has presented a much greater challenge. However, particularly with the challenge posed by HIV/AIDS, a wide range of antiviral agents has been developed, most of which are nucleoside analogues, nucleoside or nonnucleoside reverse-transcriptase inhibitors, or protease inhibitors. Essentially, those agents interfere with critical self-copying or assembly functions of viruses or retroviruses. Knowledge of the modes of action of antifungal and antiparasitic agents is increasing as well.

Resistance to antimicrobial agents has been recognized since the introduction of effective antibiotics; within a few years, penicillin-resistant strains of Staphylococcus aureus became widespread and penicillin-susceptible strains are now very uncommon (Finch and Williams 1999). At least in part caused by the indiscriminate use of antibiotics in medical practice, animal husbandry, and agriculture, multiple-antibiotic-resistant bacteria are now widespread. Resistance to antiviral agents is also occurring with increasing frequency (Perrin and Telenti 1998), and drug resistance to malaria has gradually increased in frequency and distribution across continents (Noedl, Wongsrichanalai, and Wernsdorfer 2003). The critical issue of drug resistance to infectious agents is covered in detail in chapter 55.

In summary, although the 20th century witnessed remarkable advances in the control of communicable disease, the current position is uncertain. The emergence of new infectious agents, as evidenced by the severe acute respiratory syndrome (SARS) epidemic in 2002, is a reminder of the constant danger posed by the appearance of novel organisms; more than 30 new infective agents have been identified since 1970. Effective vaccines have not yet been developed for some of the most common infections—notably tuberculosis, malaria, and HIV—and rapidly increasing populations of organisms are resistant to antibacterial and antiviral agents. Furthermore, development of new antibiotics and effective antiviral agents with which to control such agents has declined. The indiscriminate use of antibiotics, both in the community and in the hospital populations of the industrial countries, has encouraged the emergence of resistance, a phenomenon exacerbated in some of the developing countries by the use of single antimicrobial agents when combinations would have been less likely to produce resistant strains. Finally, public health measures have been hampered by the rapid movement of populations and by war, famine, and similar social disruptions in developing countries. In short, the war against communicable disease is far from over.

Pathogenesis, Control, and Management of Non-communicable Disease

The second half of the 20th century also yielded major advances in understanding pathophysiology and in managing many common noncommunicable diseases. This phase of development of the medical sciences was characterized by a remarkable increase in knowledge about the biochemical and physiological basis of disease. That knowledge, combined with some remarkable developments in the pharmaceutical industry, has produced a situation in which few noncommunicable diseases lack any treatment and many, although not curable, can be controlled over long periods.

Many of these advances have stemmed from medical research rather than improved environmental conditions. In 1980, Beeson published an analysis of the changes that occurred in the management of important diseases between the years 1927 and 1975, based on a comparison of methods for treating these conditions in the 1st and 14th editions of a leading American medical textbook. He found that of 181 conditions for which little effective prevention or treatment had existed in 1927, at least 50 had been managed satisfactorily by 1975. Furthermore, most of these advances seem to have stemmed from the fruits of basic and clinical research directed at the understanding of disease mechanisms (Beeson 1980; Comroe and Dripps 1976).

Modern cardiology is a good example of the evolution of scientific medicine. The major technical advances leading to a better appreciation of the physiology and pathology of the heart and circulation included studies of its electrical activity by electrocardiography; the ability to catheterize both sides of the heart; the development of echocardiography; and, more recently, the development of sophisticated ways of visualizing the heart by computerized axial tomography, nuclear magnetic resonance, and isotope scanning. These valuable tools and the development of specialized units to use them have led to a much better understanding of the physiology of the failing heart and of the effects of coronary artery disease and have revolutionized the management of congenital heart disease. Those advances have been backed by the development of effective drugs for the management of heart disease, including diuretics, beta-blockers, a wide variety of antihypertensive agents, calcium-channel blockers, and anticoagulants.

By the late 1960s, surgical techniques had been developed to relieve obstruction of the coronary arteries. Coronary bypass surgery and, later, balloon angioplasty became major tools. Progress also occurred in the treatment of abnormalities of cardiac rhythm, both pharmacologically and by the implantation of artificial pacemakers, which the development of microelectronic circuits subsequently made far more compact and sophisticated. Following the success of renal transplantation, cardiac transplantation and, later, heart and lung transplantation also became feasible.

Much of this work has been backed up by large-scale controlled clinical trials. These studies, for example, showed that the early use of clot-dissolving drugs together with aspirin had a major effect on reducing the likelihood of recurrences after an episode of myocardial infarction (figure 5.1). The large number of trials and observational studies of the effects of coronary bypass surgery and dilatation of the coronary arteries with balloons have given somewhat mixed results, although overall little doubt exists that, at least in some forms of coronary artery disease, surgery is able to reduce pain from angina and probably prolong life. Similar positive results have been obtained in trials that set out to evaluate the effect of the control of hypertension (Warrell and others 2003).

Figure 5.1. Effects of a One-Hour Streptokinase Infusion Together with Aspirin for One Month on the 35-Day Mortality in the Second International Study of Infarct Survival Trial among 17,187 Patients with Acute Myocardial Infarction Who Would Not Normally Have Received (more...)

The management of other chronic diseases, notably those of the gastrointestinal tract, lung, and blood has followed along similar lines. Advances in the understanding of their pathophysiology, combined with advances in analysis at the structural and biochemical levels, have enabled many of these diseases to be managed much more effectively. The pharmaceutical industry has helped enormously by developing agents such as the H2-receptor antagonists and a wide range of drugs directed at bronchospasm. There have been some surprises—the discovery that peptic ulceration is almost certainly caused by a bacterial agent has transformed the management of this disease, dramatically reducing the frequency of surgical intervention. Neurology has benefited greatly from modern diagnostic tools, while psychiatry, though little has been learned about the cause of the major psychoses, has also benefited enormously from the development of drugs for the control of both schizophrenia and the depressive disorders and from the emergence of cognitive-behavior therapy and dynamic psychotherapy.

The second half of the 20th century has witnessed major progress in the diagnosis and management of cancer (reviewed by Souhami and others 2001). Again, this progress has followed from more sophisticated diagnostic technology combined with improvements in radiotherapy and the development of powerful anticancer drugs. This approach has led to remarkable improvements in the outlook for particular cancers, including childhood leukemia, some forms of lymphoma, testicular tumors, and—more recently—tumors of the breast. Progress in managing other cancers has been slower and reflects the results of more accurate staging and assessment of the extent and spread of the tumor; the management of many common cancers still remains unsatisfactory, however. Similarly, although much progress has been made toward the prevention of common cancers—cervix and breast, for example—by population screening programs, the cost-effectiveness of screening for other common cancers—prostate, for example—remains controversial.

Many aspects of maternal and child health have improved significantly. A better understanding of the physiology and disorders of pregnancy, together with improved prenatal care and obstetric skills, has led to a steady reduction in maternal mortality. In industrial countries, few children now die of childhood infection; the major pediatric problems are genetic and congenital disorders, which account for about 40 percent of admissions to pediatric wards, and behavioral problems (Scriver and others 1973). Until the advent of the molecular era, little progress was made toward understanding the cause of these conditions. It is now known that a considerable proportion of cases of mental retardation result from definable chromosomal abnormalities or monogenic diseases, although at least 30 percent of cases remain unexplained. Major improvements have occurred in the surgical management of congenital malformations, but only limited progress has been made in treating genetic disease. Although a few factors, such as parental age and folate deficiency, have been implicated, little is known about why congenital abnormalities occur.

In summary, the development of scientific medical practice in the 20th century led to a much greater understanding of deranged physiology and has enabled many of the common killers in Western society to be controlled, though few to be cured. However, although epidemiological studies of these conditions have defined a number of risk factors and although a great deal is understood about the pathophysiology of established disease, a major gap remains in our knowledge about how environmental factors actually cause these diseases at the cellular and molecular levels (Weatherall 1995).

Consequences of the Demographic and Epidemiological Transitions of the 20th Century

The period of development of modern scientific medicine has been accompanied by major demographic change (Chen 1996; Feachem and others 1992). Increasing urbanization, war and political unrest, famine, and massive population movements must have had a major effect on the health of communities during the 20th century, yet childhood mortality fell steadily throughout the New World, Europe, the Middle East, the Indian subcontinent, and many parts of Asia during this period, although there has been much less progress in many parts of Sub-Saharan Africa. Although much of the improvement can be ascribed to improving public health and social conditions, the advent of scientific medicine, particularly the control of many infectious diseases of childhood, seems likely to be playing an increasingly important part in this epidemiological transition. Surveys of the health of adults in the developing world carried out in the 1980s suggested that many people between the ages of 20 and 50 were still suffering mainly from diseases of poverty. Many countries have since passed through an epidemiological transition, however, and the global pattern of disease is projected to change dramatically by 2020, with cardiorespiratory disease, depression, and the results of accidents replacing communicable disease as their major health problems.

Countries undergoing the epidemiological transition are increasingly caught between the two worlds of malnutrition and infectious disease on the one hand and the diseases of industrial countries, particularly cardiac disease, obesity, and diabetes, on the other. The growing epidemic of tobacco-related diseases in developing countries exacerbates this problem. The global epidemic of obesity and type 2 diabetes is a prime example (Alberti 2001). An estimated 150 million people are affected by diabetes worldwide, and that number is expected to double by 2025. Furthermore, diabetes is associated with a greatly increased risk of cardiovascular disease and hypertension; in some developing countries, the rate of stroke is already four to five times that in industrial countries. These frightening figures raise the questions of whether, once developing countries have gone through the epidemiological transition, they may face the same pattern of diseases now affecting industrial countries, and whether such diseases may occur much more frequently and be more difficult to control.

Partly because of advances in scientific medicine, industrial countries face another large drain on health resources in the new millennium (Olshansky, Carnes, and Cassel 1990). In the United Kingdom, for example, between 1981 and 1989, the number of people ages 75 to 84 rose by 16 percent, and that of people age 85 and over by 39 percent; the population of males age 85 or over is expected to reach nearly 0.5 million by 2026, at which time close to 1 million females will be in this age group. Those figures reflect the situation in many industrial countries, and a similar trend will occur in every country that passes through the epidemiological transition. Although data about the quality of life of the aged are limited, studies such as the 1986 General Household Survey in the United Kingdom indicate that restricted activity among people over the age of 65 amounted to 43 days per year in men and 53 days in women; such data say little about the loneliness and isolation of old age. An estimated 20 percent of all people over age 80 suffer from some degree of dementia, a loss of intellectual function sufficient to render them unable to care for themselves. Scientific medicine in the 20th century provided highly effective technology for partially correcting the diseases of aging while making little progress toward understanding the biological basis of the aging process. Furthermore, the problems of aging and their effect on health care have received little attention from the international public health community; these problems are not restricted to industrial countries but are becoming increasingly important in middle-income and, to a lesser extent, some low-income countries.

Although dire poverty is self-evident as one of the major causes of ill health in developing countries, this phenomenon is emphatically not confined to those populations. For example, in the United Kingdom, where health care is available to all through a government health service, a major discrepancy in morbidity and mortality exists between different social classes (Black 1980). Clearly this phenomenon is not related to the accessibility of care, and more detailed analyses indicate that it cannot be ascribed wholly to different exposure to risk factors. Undoubtedly social strain, isolation, mild depression, and lack of social support play a role. However, the reasons for these important discrepancies, which occur in every industrial country, remain unclear.

Economic Consequences of High-Technology Medicine

High-technology medical practice based on modern scientific medicine inevitably drives health expenditures steadily upward. Regardless of the mechanisms for the provision of health care, spiraling costs, fueled by ever more sophisticated technology and the ability to control most chronic illnesses, combined with greater public awareness of and demand for medical care, have left most industrial countries unable to contain the costs of their health services.

The U.K. National Health Service (NHS) offers an interesting example of the steady shift toward high-technology hospital practice since its inception more than 50 years ago (Webster 1998). Over that period, the NHS's overall expenditure on health increased fivefold, even though health expenditure in the United Kingdom absorbs a smaller proportion of gross domestic product than in many neighboring European countries. At the start of the NHS, 48,000 doctors were practicing in the United Kingdom; by 1995 there were 106,845, of whom 61,050 were in hospital practice and 34,594 in general (primary care) practice. Although the number of hospital beds halved over the first 50 years of the NHS, the throughput of the hospital service increased from 3 million to 10 million inpatients per year, over a time when the general population grew by only 19 percent. Outpatient activity also rose substantially, with total outpatient visits growing from 26 million to 40 million per year. Because many industrial countries do not have the kind of primary care referral system that is traditional in the United Kingdom, the skew toward hospital medicine in those countries is likely to be even greater.

The same trends are clearly visible in countries such as Malaysia that have passed rapidly through the epidemiological transition and in which health care is provided on a mixed public-private basis. In Malaysia, hospitalization rates have increased steadily since the 1970s, with use outstripping population growth, and the number of private hospitals and institutions rose by more than 300 percent over the same period. In 1996, the second National Health and Morbidity Survey in Malaysia showed that the median charge per day in private hospitals was 100 times higher than that in Ministry of Health hospitals. Those figures reflect, at least in part, the acquisition of expensive medical technology that in some cases has led to inefficient use of societal resources. As in many countries, the Malaysian government has now established a Health Technology Assessment Unit to provide a mechanism for evaluating the cost-effectiveness of new technology.

Those brief examples of the effect of high-technology practice against completely different backgrounds of the provision of health care reflect the emerging pattern of medical practice in the 20th century. In particular, they emphasize how the rapid developments in high-technology medical practice and the huge costs that have accrued may have dwarfed expenditure on preventive medicine, certainly in some industrial countries and others that have gone through the epidemiological transition.

A central question for medical research and health care planning is whether the reduction in exposure to risk factors that is the current top priority for the control of common diseases in both industrial and developing countries will have a major effect on this continuing rise of high-technology hospital medical practice. The potential of this approach has been discussed in detail recently (WHO 2002c). Although the claims for the benefits of reducing either single or multiple risk factors are impressive, no way exists of knowing to what extent they are attainable. Furthermore, if, as seems likely, they will reduce morbidity and mortality in middle life, what of later? The WHO report admits that it has ignored the problem of competing risks—that is, somebody saved from a stroke in 2001 is then "available" to die from other diseases in ensuing years. Solid information about the role of risk factors exists only for a limited number of noncommunicable diseases; little is known about musculoskeletal disease, the major psychoses, dementia, and many other major causes of morbidity and mortality.

The problems of health care systems and improving performance in health care delivery have been reviewed in World Health Report 2000—Health Systems: Improving Performance (WHO 2000). Relating different systems of health care to outcomes is extremely complex, but this report emphasizes the critical nature of research directed at health care delivery. As a response to the spiraling costs of health care, many governments are introducing repeated reforms of their health care programs without pilot studies or any other scientific indication for their likely success. This vital area of medical research has tended to be neglected in many countries over the later years of the 20th century.

Summary of Scientific Medicine in the 20th Century

The two major achievements of scientific medicine in the 20th century—the development of clinical epidemiology and the partial control of infectious disease—have made only a limited contribution to the health of developing countries. Although this limited effect is partly a reflection of poverty and dysfunctional health care systems, it is not the whole story. Of 1,233 new drugs marketed between 1975 and 1999, only 13 were approved specifically for tropical diseases, a figure suggesting that the problem goes much deeper and reflects neglect by industrial countries of the specific medical problems of developing countries.

For those countries that have gone through the epidemiological transition and for industrial countries, the central problem is quite different. Although the application of public health measures for the control of risk factors appears to have had a major effect on the frequency of some major killers, those gains have been offset by an increase in the frequency of other common chronic diseases and by the problems of an increasingly elderly population. At the same time, remarkable developments in scientific medicine have allowed industrial countries to develop an increasingly effective high-technology, patch-up form of medical practice. None of these countries has worked out a way to control the spiraling costs of health care, and given their aging populations, there is little sign that things will improve. Although some of the diseases that produce this enormous burden may be at least partially preventable by more effective control of risk factors, to what extent such control will be achievable is unclear, and for many diseases these factors have not been identified. In short, scientific medicine in the 20th century, for all its successes, has left a major gap in our understanding of pathogenesis: the gap between the action of environmental risk factors and the basic disease processes that follow from exposure to them and that produce the now well-defined deranged physiology characteristic of each disease.

These problems are reflected, at least in some countries, by increasing public disillusion with conventional medical practice that is rooted in the belief that if modern medicine could control infectious diseases, then it would be equally effective in managing the more chronic diseases that took their place. When this improvement did not happen—and when a mood of increasing frustration about what medicine could achieve had developed—a natural move occurred toward trying to find an alternative answer to these problems. Hence, many countries have seen a major migration toward complementary medicine.

It is against this rather uncertain background that the role of science and technology for medical care in the future has to be examined.

Science, Technology, and Medicine in the Future

Before considering the remarkable potential of recent developments in basic biological research for improvements in health care, we must define priorities for their application.

Priorities for Biomedical Research in the Future

In setting priorities for biomedical research in the future, the central objective is to restore the balance of research between industrial and developing countries so that a far greater proportion is directed at the needs of the latter. In the 1990s, it was estimated that even though 85 percent of the global burden of disability and premature mortality occurs in the developing world, less than 4 percent of global research funding was devoted to the communicable, maternal, perinatal, and nutritional disorders that constitute the major burden of disease in developing countries (WHO 2002b).

The second priority is to analyze in much more detail methods of delivery of those aspects of health care that have already been shown to be both clinically effective and cost-effective. It is vital that the delivery of health care be based on well-designed, evidence-based pilot studies rather than on current fashion or political guesswork. It is essential to understand why there are such wide discrepancies in morbidity and mortality between different socioeconomic groups in many industrial countries and to define the most effective approaches to educating the public about the whole concept of risk and what is meant by risk factors. In addition, a great deal more work is required on mechanisms for assessing overall performance of health care systems.

The third priority must be to focus research on the important diseases that the biomedical sciences have yet to control, including common communicable diseases such as malaria, AIDS, and tuberculosis; cardiovascular disease; many forms of cancer; all varieties of diabetes; musculoskeletal disease; the major psychoses; and the dementias. Of equal importance is gaining a better understanding of both the biology and pathophysiology of aging, together with trying to define its social and cultural aspects.

In the fields of child and maternal health, the requirements for research differ widely in industrial and developing countries. Industrial countries need more research into the mechanisms of congenital malformation and the better control and treatment of monogenic disease and behavioral disorders of childhood. In developing countries, both child and maternal health pose different problems, mainly relating to health education and the control of communicable disease and nutrition. In many developing countries, some of the common monogenic diseases, notably the hemoglobin disorders, also require urgent attention.

In short, our priorities for health care research fall under two main headings: first, to apply more effectively the knowledge we already have; and second, to mount a multidisciplinary attack on the diseases about which we have little or no understanding. These issues are developed further in chapter 4.

New Technologies

The sections that follow briefly outline some examples of the new technologies that should help achieve these aims.

Genomics, Proteomics, and Cell Biology

Without question, molecular and cell biology were the major developments in the biological sciences in the second half of the 20th century. The announcement of the partial completion of the human genome project in 2001 was accompanied by claims that knowledge gained from this field would revolutionize medical practice over the next 20 years. On further reflection, some doubts have been raised about this claim, not least over the time scale involved; nevertheless, considerable reason for optimism remains. Although the majority of common diseases clearly do not result from the dysfunction of a single gene, most diseases can ultimately be defined at the biochemical level; because genes regulate an organism's biochemical pathways, their study must ultimately tell us a great deal about pathological mechanisms.

The genome project is not restricted to the human genome but encompasses many infectious agents, animals that are extremely valuable models of human disease, disease vectors, and a wide variety of plants. However, obtaining a complete nucleotide sequence is one thing; working out the regulation and function of all the genes that it contains and how they interact with each other at the level of cells and complete organisms presents a much greater challenge. The human genome, for example, will require the identification and determination of the function of the protein products of 25,000 genes (proteomics) and the mechanisms whereby genes are maintained in active or inactive states during development (methylomics). It will also involve the exploration of the roles of the family of regulatory ribonucleic acid (RNA) molecules that have been discovered recently (Mattick 2003). All this information will have to be integrated by developments in information technology and systems biology. These tasks may take the rest of this century to carry out. In the process, however, valuable fallout from this field is likely to occur for a wide variety of medical applications. Many of these are outlined in a recent WHO report, Genomics and World Health 2002 (WHO 2002a).

The first applications of DNA technology in clinical practice were for isolating the genes for monogenic diseases. Either through the candidate gene approach or by using DNA markers for linkage studies, researchers have defined the genes for many monogenic diseases. This information is being used in clinical practice for carrier detection, for prenatal diagnosis, and for defining the mechanisms of phenotypic variability. It has been particularly successful in the case of the commonest monogenic diseases, the inherited disorders of hemoglobin, which affect hundreds of thousands of children in developing countries (Weatherall and Clegg 2001a, 2001b). Through North-South collaborations, it has been possible to set up screening and prenatal diagnosis programs for these conditions in many countries, resulting in a marked decline in their frequency, particularly in Mediterranean populations (figure 5.2). Gene therapy—that is, the specific correction of monogenic diseases—has been fraught with difficulties, but these are slowly being overcome, and this approach seems likely to succeed for at least some genetic diseases in the future.

Figure 5.2. Decline in Serious Forms of Thalassemia in Different Populations after the Initiation of Prenatal Diagnosis in 1972 following the Development of North-South Partnerships.


From the global perspective, one of the most exciting prospects for the medical applications of DNA technology is in the field of communicable disease. Remarkable progress has been made in sequencing the genomes of bacteria, viruses, and other infective agents, and it will not be long before the genome sequence of most of the major infectious agents is available. Information obtained in this way should provide opportunities for the development of new forms of chemotherapy (Joët and others 2003) and will be a major aid to vaccine development (Letvin, Bloom, and Hoffman 2001). In the latter case, DNA technology will be combined with studies of the basic immune mechanisms involved in individual infections in an attempt to find the most effective and economic approach. Recombinant DNA technology was used years ago to produce pure antigens of hepatitis B in other organisms for the development of safe vaccines. More recently, and with knowledge obtained from the various genome projects, interest has centered on the utility of DNA itself as a vaccine antigen. This interest is based on the chance observation that the direct injection of DNA into mammalian cells could induce them to manufacture—that is, to express—the protein encoded by a particular gene that had been injected. Early experiences have been disappointing, but a variety of techniques are being developed to improve the antigens of potential DNA-based vaccines.

The clinical applications of genomics for the control of communicable disease are not restricted to infective agents. Recently, the mosquito genome was sequenced, leading to the notion that it may be possible to genetically engineer disease vectors to make them unable to transmit particular organisms (Land 2003). A great deal is also being learned about genetic resistance to particular infections in human beings (Weatherall and Clegg 2002), information that will become increasingly important when potential vaccines go to trial in populations with a high frequency of genetically resistant individuals.

The other extremely important application of DNA technology for the control of communicable disease—one of particular importance to developing countries—is its increasing place in diagnostics. Rapid diagnostic methods based on the polymerase chain reaction (PCR) are being developed to identify pathogen sequences in blood or tissues. These approaches are being further refined to identify organisms that exhibit drug resistance and to subtype many classes of bacteria and viruses. Although much remains to be learned about the cost-effectiveness of these approaches compared with more conventional diagnostic procedures, some promising results have already been obtained, particularly for identifying organisms that are difficult to grow or in cases that require a very early diagnosis (Harris and Tanner 2000). This type of technology is being widely applied for the identification of new organisms and is gaining a place in monitoring vaccine trials (Felger and others 2003). The remarkable speed with which a new coronavirus and its different subtypes were identified as the causative agent of SARS, and the way this information could be applied to tracing the putative origins of the infection, exemplifies the power of this technology (Ruan and others 2003).

Genomics is likely to play an increasingly important role in the control and management of cancer (Livingston and Shivdasani 2001). It is now well established that malignant transformation of cell populations usually results from acquired mutations in two main classes of genes:

  • First are oncogenes—genes that are involved in the major regulatory processes whereby cells interact with one another, respond to environmental signals, regulate how and when they will divide, and control the other intricate processes of cell biology (box 5.1).
  • Second are tumor suppressor genes; loss of function by mutation may lead to a neoplastic phenotype.

Box 5.1

Chronic Myeloid Leukemia: The Path from Basic Science to the Clinic.
1960: An abnormal chromosome, named the Philadelphia chromosome, was found in the white cells of most patients with chronic myeloid leukemia (CML).
1973: By the use of specific dyes to (more...)

In the rare familial cancers, individuals are born with one defective gene of this type, but in the vast majority of cases, cancer seems to result from the acquisition during a person's lifetime of one or more mutations of oncogenes. For example, in the case of the common colon cancers, perhaps up to six different mutations are required to produce a metastasizing tumor. The likelihood of the occurrence of these mutations is increased by the action of environmental or endogenous carcinogens.

Array technology, which examines the pattern of expression of many different genes at the same time, is already providing valuable prognostic data for cancers of the breast, blood, and lymphatic system. This technology will become an integral part of diagnostic pathology in the future, and genomic approaches to the early diagnosis of cancer and to the identification of high-risk individuals will become part of clinical practice. It is also becoming possible to interfere with the function or products of oncogenes as a more direct approach to the treatment of cancer (box 5.1), although early experience indicates that drug resistance may be caused by mutation, as it is in more conventional forms of cancer therapy.

The genomic approach to the study of the common diseases of middle life—coronary artery disease, hypertension, diabetes, and the major psychoses, for example—has been widely publicized (Collins and McKusick 2001). Except in rare cases, none of these conditions is caused by a single defective gene; rather, they appear to result from multiple environmental factors combined with variation in individual susceptibility attributable to the action of several different genes. The hope is that if these susceptibility genes can be identified, an analysis of their products will lead to a better understanding of the pathology of these diseases and will offer the possibility of producing more definitive therapeutic agents. Better still, this research could provide the opportunity to focus public health measures for prevention on genetically defined subsets of populations.

Pharmacogenomics is another potential development from the genomics revolution (Bumol and Watanabe 2001) (table 5.1). Considerable individual variability exists in the metabolism of drugs; hence, clinical medicine could reach a stage at which every person's genetic profile for the metabolism of common drugs will be worked out and become part of their physicians' toolkit. This information will also be of considerable value to the pharmaceutical industry for designing more effective and safer therapeutic agents.

Table 5.1. Pharmacogenomics.

A word of caution is necessary: although well-defined genetic variation is responsible for some unwanted side effects of drugs, this information is still rarely used in clinical practice; a possible exception is screening for glucose-6-phosphate dehydrogenase (G6PD) deficiency to detect primaquine sensitivity, though the costs preclude its application in many developing countries. Furthermore, plasma levels after the administration of most common drugs follow a normal distribution, indicating that if genetic variation exists, a number of different genes must be involved. Hence, although the idea that everyone will have a genetic profile for drug handling as part of their standard medical care will take a long time to achieve, if it ever happens, this field will undoubtedly impinge gradually on medical research and clinical practice.

Many other potential applications of genomic research for medical practice remain to be developed. The role of DNA array technology for the analysis of gene expression in tumors has already been mentioned. Advances in bioengineering, with the development of biomicroelectromechanical systems, microlevel pumping, and reaction circuit systems, will revolutionize chip technology and enable routine analysis of thousands of molecules simultaneously from a single sample (Griffith and Grodzinsky 2001), with applications in many other fields of research. Although somatic cell gene therapy—that is, the correction of genetic diseases by direct attack on the defective gene—has gone through long periods of slow progress and many setbacks, the signs are that it will be successful for at least a limited number of monogenic diseases in the long term (Kaji and Leiden 2001). It is also likely to play a role in shorter-term objectives—in the management of coronary artery disease and some forms of cancer, for example. DNA technology has already revolutionized forensic medicine and will play an increasingly important role in this field. Although it is too early to assess to what extent the application of DNA technology to studies of the biology of aging will produce information of clinical value, considering the massive problem of our aging populations and the contribution of the aging process to their illnesses, expanding work in this field is vital. Current work in the field of evolution using DNA technology seems a long way from clinical practice; however, it has considerable possibilities for helping us understand the lack of adaptation of present-day communities to the new environments that they have created.

Stem Cell and Organ Therapy

Stem cell therapy, or, to use its more popular if entirely inappropriate title, therapeutic cloning, is an area of research in cellular biology that is raising great expectations and bitter controversies. Transplant surgery has its limitations, and the possibility of a ready supply of cells to replace diseased tissues, even parts of the brain, is particularly exciting. Stem cells can be obtained from early embryos, from some adult and fetal tissues, and (at least theoretically) from other adult cells.

Embryonic stem cells, which retain the greatest plasticity, are present at an early stage of the developing embryo, from about the fourth to the seventh day after fertilization. Although some progress has been made in persuading human embryonic stem cells to produce specific cell types, much of the evidence for this field's potential so far has come from studies of mouse embryonic stem cells. For example, mouse stem cells have been transplanted, with some therapeutic success, into mice with a condition similar to human Parkinson's disease, and they have also been used in attempts to restore neural function after spinal cord injuries.

Many adult tissues retain stem cell populations. Bone marrow transplantation has been applied to the treatment of a wide range of blood diseases, and human marrow clearly contains stem cells capable of differentiating into the full complement of cell types found in the blood. Preliminary evidence indicates that they can also differentiate into other cell types if given the appropriate environment; they may, for example, be a source of heart muscle or blood vessel cell populations. Although stem cells have also been found in brain, muscle, skin, and other organs in the mouse, research into characterizing similar cell populations from humans is still at a very early stage.

One of the major obstacles to stem cell therapy with cells derived from embryos or adult sources is that, unless they come from a compatible donor, they may be treated as "foreign" and rejected by a patient's immune system. Thus, much research is directed at trying to transfer cell nuclei from adult sources into an egg from which the nucleus has been removed, after which the newly created "embryo" would be used as a source of embryonic stem cells for regenerative therapy for the particular donor of the adult cells. Because this technique, called somatic cell nuclear transfer, follows similar lines to those that would be required for human reproductive cloning, this field has raised a number of controversies. Major ethical issues have also been raised because, to learn more about the regulation of differentiation of cells of this type, a great deal of work needs to be carried out on human embryonic stem cells.

If some of the formidable technical problems of this field can be overcome and, even more important, if society is able to come to terms with the ethical issues involved, this field holds considerable promise for correction of a number of different intractable human diseases, particularly those involving the nervous system (Institute of Medicine 2002).

Information Technology

The explosion in information technology has important implications for all forms of biomedical research, clinical practice, and teaching. The admirable desire on the part of publicly funded groups in the genomics field to make their data available to the scientific community at large is of enormous value for the medical application of genomic research. This goal has been achieved by the trio of public databases established in Europe, the United States, and Japan (European Bioinformatics Institute, GenBank, and DNA Data Bank of Japan, respectively). The entire data set is securely held in triplicate on three continents. The continued development and expansion of accessible databases will be of inestimable value to scientists, in both industrial and developing countries.

Electronic publishing of high-quality journals and related projects and the further development of telepathology will help link scientists in industrial and developing countries. The increasing availability of telemedicine education packages will help disseminate good practices. Realizing even these few examples of the huge potential of this field will require a major drive to train and recruit young information technology scientists, particularly in developing countries, and the financial support to obtain the basic equipment required.

Minimally Invasive Diagnostics and Surgery: Changes in Hospital Practice

Given the spiraling costs of hospital care in industrial countries and the likelihood of similar problems for developing countries in the future, it is important to review aspects of diagnosis and treatment that may help reduce these costs. Changes in clinical practice in the latter half of the 20th century have already made some headway on this problem. In the U.K. NHS, the number of hospital beds occupied daily halved between 1950 and 1990, even though the throughput of the service, after allowance for changes of definition, increased from 3 million to 10 million inpatients per year. Remarkably, by 1996, 22 percent of 11.3 million finished consultant episodes were single-day cases. How can this efficient trend be continued? A major development with this potential is the application of minimally invasive and robotic surgery (Mack 2001). Advances in imaging, endoscopic technology, and instrumentation have made it possible to convert many surgical procedures from an open to an endoscopic route. Such procedures are now used routinely for gall bladder surgery, treatment of adhesions, removal of fibroids, nephrectomy, and many minor pediatric urological procedures. The recent announcement of successful hip replacement surgery using an endoscopic approach offers an outstanding example of its future potential. Although progress has been slower, a number of promising approaches exist for the use of these techniques in cardiac surgery and for their augmentation by the introduction of robotics into surgical practice. Transplant surgery will also be made more efficient by advances in the development of selective immune tolerance (Niklason and Langer 2001).

These trends, and those in many other branches of medicine, will be greatly augmented by advances in biomedical imaging (Tempany and McNeil 2001). Major progress has already been made in the development of noninvasive diagnostic methods using MRI, computed tomography, positron emission tomography, and improved ultrasonography. Image-guided therapy and related noninvasive treatment methods are also showing considerable promise.

Human Development and Child and Maternal Health

Among the future developments in molecular and cell biology, a better understanding of the mechanisms of human development and the evolution of functions of the nervous system offers some of the most exciting, if distant, prospects (Goldenberg and Jobe 2001). In the long term, this field may well have important implications for reproductive health and birth outcomes. The role of a better understanding of the monogenic causes of congenital malformation and mental retardation was mentioned earlier in this chapter. Already thoughts are turning to the possibility of the isolation and clinical use of factors that promote plasticity of brain development, and specific modulators of lung and gut development are predicted to play an increasing role in obstetric practice. A better understanding of the mechanisms leading to vasoconstriction and vascular damage as a cause of preeclampsia has the potential for reducing its frequency and thus for allowing better management of this common condition. Similarly, an increasing appreciation of the different genetic and metabolic pathways involved in spontaneous preterm births should lead to effective prevention and treatment that target specific components of these pathways, reducing the frequency of premature births. An increasing knowledge of the mode of action of different growth factors and promoters of gut function will enhance the growth and development of preterm infants.


Neuropsychiatry will be of increasing importance in the future, particularly because depression and related psychiatric conditions are predicted to be a major cause of ill health by 2020 and because of the increasing problem of dementia in the elderly (Cowan and Kandel 2001). Developments in the basic biomedical sciences will play a major role in the better diagnosis and management of these disorders. Furthermore, the application of new technologies promises to lead to increasing cooperation between neurology and psychiatry, especially for the treatment of illnesses that overlap the two disciplines, such as mental retardation and the cognitive disorders associated with Alzheimer's and Parkinson's diseases.

The increasing application of functional imaging, together with a better understanding of biochemical function in the brain, is likely to lead to major advances in our understanding of many neuropsychiatric disorders and, hence, provide opportunities for their better management. Early experience with fetally derived dopaminergic neurons to treat parkinsonism has already proved successful in some patients and has raised the possibility that genetically manipulated stem cell treatment for this and other chronic neurological disorders may become a reality. Promising methods are being developed for limiting brain damage after stroke, and there is increasing optimism in the field of neuronal repair based on the identification of brain-derived neurotrophic factors. Similarly, a combination of molecular genetic and immunological approaches is aiding progress toward an understanding of common demyelinating diseases—notably multiple sclerosis.

Strong evidence exists for a major genetic component to the common psychotic illnesses—notably bipolar depression and schizophrenia. Total genome searches should identify some of the genes involved. Although progress has been slow, there are reasonable expectations of success. If some of these genes can be identified, they should provide targets for completely new approaches by the pharmaceutical industry to the management of these diseases. Recent successes in discovering the genes involved in such critical functions as speech indicate the extraordinary potential of this field. Similarly, lessons learned from the identification of the several genes involved in familial forms of early-onset Alzheimer's disease have provided invaluable information about some of the pathophysiological mechanisms involved, work that is having a major effect on studies of the pathophysiology and management of the much more common forms of the disease that occur with increasing frequency in aged populations.

Nutrition and Genetically Modified Crops

By 2030, the world's population is likely to increase by approximately 2.5 billion people, with much of this projected growth occurring in developing countries. As a consequence, food requirements are expected to double by 2025. However, the annual rate of increase in cereal production has declined and now falls well below the rate of population increase. Pathogens are estimated to claim about 40 percent of potential crop productivity in parts of Africa and Asia and about 20 percent in the industrial world.

Given these considerations, the genetic modification (GM) of plants has considerable potential for improving the world's food supplies and, hence, the health of its communities. The main aims of GM plant technologies are to enhance the nutritional value of crop species and to confer resistance to pathogens. GM technology has already recorded several successes in both these objectives.

Controversy surrounds the relative effectiveness of GM crops as compared with those produced by conventional means, particularly with respect to economic issues of farming in the developing world. Concerns are also expressed about the safety of GM crops, and a great deal more research is required in this field. The results of biosafety trials in Europe raise some issues about the effects of GM on biodiversity (Giles 2003).

Plant genetics also has more direct potential for the control of disease in humans. By genetically modifying plants, researchers hope it will be possible to produce molecules toxic to disease-carrying insects and to produce edible vaccines that are cheaper than conventional vaccines and that can be grown or freeze dried and shipped anywhere in the world. A promising example is the production of hepatitis B surface antigen in transgenic plants for oral immunization. Work is also well advanced for the production of other vaccines by this approach (WHO 2002a).

Social and Behavioral Sciences, Health Systems, and Health Economics

In addition to the mainstream biomedical sciences, research into the provision of health care for the future will require a major input from the social and behavioral sciences and health economics. These issues are discussed in more detail in chapter 4.

The World Health Report 2002 (WHO 2002c) emphasizes the major gaps in public perception of what is meant by health and, in particular, by risk factors, in both industrial and developing countries. Epidemiological studies have indicated that morbidity and mortality may be delayed among populations that are socially integrated. Increasing evidence of this kind underlines the importance of psychosocial factors in the development of a more positive approach to human health, clearly a valuable new direction for research in the social sciences.

Neither developing nor industrial countries have come to grips with the problems of the organization and delivery of health care. Learning more about how to build effective health delivery strategies for developing countries is vital. Similarly, the continual reorganization of the U.K. NHS, based on short-term political motivation and rarely on carefully designed pilot studies, illustrates the need for research into the optimal approaches to the provision of health care in industrial countries. Indeed, across the entire field of health provision and the education of health care professionals, there is an urgent need for research into methodology and, in particular, for the development of more robust endpoints for its assessment.

Similar problems exist with respect to research in health economics. Many of the parameters for assessing the burden of disease and the cost-effectiveness of different approaches to the provision of health care are still extremely crude and controversial, and they require a great deal more research and development. These problems are particularly relevant to the health problems of developing countries.

One of the main barriers to progress in these fields is the relative isolation of the social sciences and health care economics from the mainstreams of medical research and practice. Better integration of these fields will be a major challenge for universities and national and international health care agencies.

Integration of the Medical Sciences: Organizational Priorities for the Future

From these brief examples of the likely direction of biomedical research in the future, some tentative conclusions can be drawn about its effects on the pattern of global health care.

The control of communicable disease will remain the top priority. Although this goal can be achieved in part by improving nutrition and sanitation and applying related public health measures in developing countries, the search for vaccines or better chemotherapeutic agents must also remain a high priority. However, although optimism that new vaccines will become available is well founded, many uncertainties still exist, particularly in the case of biologically complex diseases like malaria. It is vital that a balance be struck between the basic biomedical science approach and the continued application of methods to control these diseases by more conventional and well-tried methods.

For the bulk of common noncommunicable diseases, the situation is even less clear. Although much more humane, cost-effective, and clinically effective approaches to their management seem certain to be developed, mainly by high-technology and expensive procedures, the position regarding prevention and a definitive cure is much less certain. Hence, the program for reducing risk factors, as outlined in the World Health Report 2002 (WHO 2002c), clearly should be followed. However, a strong case exists for a partnership of the public health, epidemiological, and genomic sciences to develop pilot studies to define whether focusing these programs on high-risk subsets of populations will be both cost-effective and more efficient. For those many chronic diseases for which no risk factors have been defined, strategies of the same type should be established to define potential environmental factors that may be involved. Although surprises may arise along the way, such as the discovery of the infective basis for peptic ulceration, the multilayered environmental and genetic complexity of these diseases, combined with the ill-understood effects of aging, suggests that no quick or easy answers to these problems will present themselves; future planning for global health services must take this factor into consideration.

Given these uncertainties, an important place exists for the involvement and integration of the social sciences and health economics into future planning for biomedical research. Major gaps in knowledge about public perceptions and understanding of risk factors, a lack of information about the social and medical problems of aging populations, and widespread uncertainty about the most cost-effective and efficient ways of administering health care—both in developing countries and in those that have gone through the epidemiological transition and already have advanced health care systems—still exist.

In short, the emerging picture shows reasonable grounds for optimism that better and more definitive ways of preventing or curing communicable diseases will gradually become available; only the time frame is uncertain. Although there will be major improvements in management based on extensive and increasingly high-technology practice, the outlook for the prevention and definitive cure of the bulk of noncommunicable diseases is much less certain. Hence, it is vital that research in the basic biomedical sciences be directed at both the cause and the prevention of noncommunicable diseases, and that work in the fields of public health and epidemiology continues to be directed toward better use of what is known already about their prevention and management in a more cost-effective and efficient manner.

New Technologies and Developing Countries

The role of genomics and related high-technology research and practice in developing countries is discussed in detail in Genomics and World Health 2002 (WHO 2002a). The central question addressed by the report was, given the current economic, social, and health care problems of developing countries, is it too early to be applying the rather limited clinical applications of genomic and related technology to their health care programs? The report concluded that it is not too early, and subsequent discussion has suggested that this decision was right. Where DNA technology has already proven cost-effective, it should be introduced as soon as possible (Weatherall 2003). Important examples include the common inherited disorders of hemoglobin (see chapter 34) and, in particular, the use of DNA diagnostics for communicable disease. The advantage of this approach is that it offers a technical base on which further applications can be built as they become available. It also provides the impetus to develop the training required, to initiate discussions on the many ethical issues that work of this type may involve, and to establish the appropriate regulatory bodies. The way this type of program should be organized—through North-South collaboration, local networking, and related structures, monitored by WHO—was clearly defined in the report.

For the full benefits of genomics to be made available to developing countries—and for these advances not to widen the gap in health care provision between North and South—current scientific research in the industrial countries will have to direct the most promising developments of the new technologies of science and medicine toward the diseases of the developing world.

This need is particularly pressing in the case of the major communicable killers: malaria, tuberculosis, and AIDS. Similarly—and equally important—if developing countries are to make the best use of this new technology for their own particular disease problems, partnerships will have to be established between both academia and the pharmaceutical industries of the North and South.

Although this approach should be followed as a matter of urgency, that developing countries build up their own research capacity is equally important. Genomics and World Health 2002 (WHO 2002a) includes some encouraging accounts of how this capacity is being achieved in Brazil, China, and India. The establishment of the Asian-Pacific International Molecular Biology Network is a good example.

It is important that work start now to apply the advances stemming from the basic biological sciences to the health of the developing world. This beginning will form a platform for the integration of future advances into health care programs for these countries. However, because of uncertainty about the time scale involved, more conventional public health approaches to medical care must not be neglected, and a balance should be struck between research in this area and research in the emerging biomedical sciences.

Economic Issues for Future Medical Research

The central economic issues regarding medical research in the future are how it is to be financed and how its benefits are to be used in the most cost-effective way in both industrial and developing countries. Currently, research is carried out in both private and public sectors (table 5.2). Work in the private sector is based mainly in the pharmaceutical industry and, increasingly, in the many large biotechnology companies that evolved rapidly following the genomic revolution. In the public sector, the major sites of research are universities, government research institutes, and centers—either within the universities or freestanding—that are funded through a variety of philanthropic sources. The input of philanthropic sources varies greatly between countries. In the United Kingdom, the Wellcome Trust provides a portion of funding for clinical and basic biomedical research that approaches that of the government, and in the United States, the Howard Hughes Medical Institute also plays a major, though proportionally less important, role in supporting medical research. Similarly, the Bill & Melinda Gates Foundation and other large international philanthropic foundations are contributing a significant amount of funding for medical research. In developing countries, such research funding as is available comes from government sources. For example, Thailand and Malaysia spend US$15.7 million and US$6.9 million each year, representing 0.9 percent and 0.6 percent of their health budgets, respectively (WHO 2002b).

Table 5.2. Estimated Global Health Research and Development Funding for 1998.


As examined in the report of the WHO Commission on Macroeconomics and Health (WHO 2001), considerable discussion is taking place about how to mobilize skills and resources of the industrial countries for the benefit of the health of the developing world. However, how this international effort should be organized or, even more important, funded is still far from clear. A number of models have been proposed, including the creation of a new global institute for health research and a global fund for health research with an independent, streamlined secretariat analogous to the Global Fund to Fight AIDS, Tuberculosis, and Malaria. Recently, a number of large donations have been given—either by governments or by philanthropic bodies—to tackle some of the major health problems of the developing world. Although many of these approaches are admirable, those that involve single donations raise the critical problem of sustainability. People with experience in developing interactions between the North and South will have no doubts about the long period of sustained work that is often required for a successful outcome.

Because of the uncertainties about sustainability and the efficiency of large international bodies, it has been suggested that a virtual global network for health research be established in which the leading research agencies of the North and South take part, together with a coordinating council (Keusch and Medlin 2003). In this scheme or in a modified form (Pang 2003), both government funding agencies and philanthropic bodies would retain their autonomy and mechanisms of funding while at the same time their individual programs would be better integrated and directed toward the problems of global health.

A central problem of both private and public patterns of funding for medical research is that industrial countries have tended to focus their research on their own diseases and have, with a few exceptions, tended to ignore the broader problems of developing countries. This trend has resulted in the well-known 10/90 gap, in which more than 90 percent of the world's expenditure on health research is directed at diseases that, numerically, affect a relatively small proportion of the world's population. If the enormous potential of modern biomedical research is not to result in a widening of the gap in health care between North and South, this situation must be corrected. The governments of industrial countries may be able to encourage a more global view of research activity on the part of their pharmaceutical and biotechnology industries through tax advantages and other mutually beneficial approaches. Progress in this direction seems likely to be slow, however. For this reason, moving quickly toward a virtual global network for research that would bring together the research agencies of the North and South holds many attractions. Although agencies of the North that rely on government and charitable funding may find it equally difficult to convince their governments that more of their budget should be spent on work in the developing world, the need to move in this direction is vital; one possibility is to turn at least some proportion of overseas aid toward this highly effective approach to developing North-South partnerships.

In short, to produce the funding required for medical research in the future and to ensure that it takes on a much more global view of its objectives, a complete change in attitude is called for on the part of the industrial countries. This transformation, in turn, will require a similar change of outlook on the part of those who educate doctors and medical scientists. The introduction of considerable sums of research monies into the international scene by governments or philanthropic bodies as single, large donations, while welcome, will not form the basis for the kind of sustainable research program that is required. Rather, the attitudes of both government funding agencies and charitable bodies in industrial countries will have to change, with a greater proportion of their funding being directed at diseases of the developing world in the future. Achieving this end will require a major program of education on the global problems of disease at every level, including governments, industry, universities, charitable organizations, and every other body that is involved in the medical research endeavor.

Issues requiring the assessment of the economic value of medical research are discussed in chapter 4.


The central theme of the previous sections is that the potential fruits of the exciting developments in the biomedical sciences will be realized only if a complete change in attitude occurs on the part of industrial countries, with the evolution of a much more global approach to the problems of medical research and health care. Change will have to start in the universities of the industrial countries, which will need to incorporate a more global perspective into medical education so that the next generation of young people is motivated to pursue research careers that take a more international view of medical research. A major change of emphasis in education will be required and will be difficult to achieve unless those who control university education and research programs can be convinced that funding is available for further development in these new directions (Weatherall 2003). Excellent examples of the value of North-South partnerships between universities and other academic institutions already exist.

An effective approach to increasing global funding for internationally based research is through virtual global networks involving the leading research agencies of the North and South. Hence, a similar effort will be required to convince these agencies and their governments that this approach to improving the level of health globally is cost-effective. In particular, it will be vital to persuade them that it may constitute an effective use of their programs of aid for developing countries. It may be necessary to carry out a number of pilot studies showing the economic value of North-South partnerships in specific areas of medical research. Indeed, a number of such partnerships have already been formed in several countries, and information of this type almost certainly exists (WHO 2002a).

Of course, much broader issues involving education need to be resolved for the better exploitation of medical research. The problems of educating the public so that developing countries can partake in the advances of the genome revolution were set out in detail in Genomics and World Health 2002 (WHO 2002a), but a great deal of work along these lines is also required for industrial countries. People are increasingly suspicious of modern biological science and of modern high-technology medicine, a factor that, together with concerns over the pastoral skills of today's doctors, is probably playing a role in driving many communities in industrial countries toward complementary medicine (see Horton 2003). These trends undoubtedly are attributable to inadequacies of medical education and the way that science is taught in schools—reflected by the lack of scientific literacy both in the general public and in governments. If trust is to be restored between the biomedical sciences and the public, significant efforts will have to be made to improve the level of scientific literacy, and a much more open dialogue will need to be developed between scientists and the community. This requirement will be increasingly important as work on basic biomedical sciences impinges on areas such as gene therapy, stem cell research, and the collection of large DNA databases to be used for both research and therapeutic purposes in the future.

The difficulties in achieving a more global view of medical research and health care on the part of industrial countries for the future should not be underestimated. Without a major attempt to solve these difficulties, the potential of modern biomedical sciences seems certain to simply widen the gap in health care between North and South.

Ethical Issues

Few advances in scientific medicine have not raised new ethical issues for society. The genomics era has encountered many problems in this respect, and although many of the initial fears and concerns have been put to rest by sensible debate and the development of effective control bodies, new problems continue to appear (WHO 2002a). The ill-named field of therapeutic cloning is still full of unresolved issues regarding human embryo research, the creation of embryos for research purposes, and other uncertainties, but these questions should not be overemphasized at a time when most societies face even more onerous ethical issues. For example, as the size of our aging population increases, many societies may have to face the extremely difficult problem of rationing medical care. The theme recurring throughout both industrial and developing countries is how to provide an adequate level of health care equally to every income group.

Many developing countries still lack the basic structures for the application of ethical practices in research and clinical care, including institutional ethics committees, governmental regulatory bodies, and independent bioethical research bodies. Every country requires a completely independent bioethics council that can debate the issues uninhibited by pressure from government, commerce, or interest groups of any kind. The development of a more adequate ethical framework for medical decision making, whether it involves preventive medicine, clinical practice, or research, is another neglected area that requires research input from many different disciplines.

The important question of the ethical conduct of research in the developing countries by outside agencies has been reviewed in detail recently (Nuffield Council on Bioethics 2002).

Why Do We Need Research?

It is important to appreciate that considerable public suspicion exists about both the activities and the value of biomedical research. This suspicion has been generated in part by the field's exaggerated claims in recent years, by an uneasy feeling that research is venturing into areas best avoided, and by a lack of understanding of the complexity of many of the problems it is attempting to solve. At the same time, many government departments that run national health care programs, the private sector (with the exception of the pharmaceutical industry), and many nongovernmental organizations set aside extremely small fractions of their overall expenditure for research. For many of those organizations, research seems irrelevant as they deal with the stresses of daily provision of health care programs and with the crisis management that follows rapid change or major failures in providing health care.

One of the major challenges for the biomedical research community will be to educate the public better about its activities and to restore the public's faith in and support for the medical research endeavor. Educating many governments and nongovernmental organizations about the critical importance of decision making based on scientifically derived evidence will be vital. Medical care will only become more complex and expensive in the future; its problems will not be solved by short-term, politically driven activity. The need for good science, ranging from studies of molecules to studies of communities, has never been greater.


Clearly, the most important priorities for medical research are the development of more effective health delivery strategies for developing countries and the control of the common and intractable communicable diseases. In this context, the argument has been made that much of the medical research carried out in industrial countries, with its focus on noncommunicable disease and its outcomes in high-technology practice, is completely irrelevant to the needs of developing countries. This view of the medical scene, however, is short-sighted. Although some redistribution of effort is required, every country that passes through the epidemiological transition encounters the major killers of industrial countries. Learning more about those killers' basic causes, prevention, and management is crucial. Although the initial costs of providing the benefits of this research are often extremely high, they tend to fall as particular forms of treatment become more widely applied. Hence, because we cannot rely completely on our current preventive measures to control these diseases, medical research must continue.

Research in basic human biology and the biomedical sciences is entering the most exciting phase of its development. However, it is difficult to anticipate when the gains of this explosion in scientific knowledge will become available for the prevention and treatment of the major killers of mankind. Thus, medical research must strike a balance between the well-tried approaches of epidemiology, public health, and clinical investigation at the bedside and the application of discoveries in the completely new fields of science that have arisen from the genome revolution.

If this balanced approach toward the future provision of health care is not to continue to worsen the gap between North and South, however, a complete change of attitude is necessary toward health care research and practice on the part of the industrial countries. A major effort will be required to educate all parties—international nongovernmental organizations, governments, universities, and the private sector—in global health problems (Weatherall 2003). Equally important will be a major change of emphasis in the universities of industrial countries toward education programs in science and medicine to provide medical scientists of the future with a more global perspective of health and disease. If this transformation can be achieved—if it can form the basis for the establishment of networks for sustainable research programs between universities and related bodies in the North and South—much progress will be made toward distributing the benefits of biomedical research and good practice among the populations of the world. However, the great potential of advances in the biomedical sciences for global health will not come to full fruition without much closer interaction between the fields of basic and clinical research and the fields of public health, health economics, and the social sciences.


  1. Alberti G. Noncommunicable Diseases: Tomorrow's Pandemics. Bulletin of the World Health Organization. 2001;79(10):907. [PMC free article: PMC2566675] [PubMed: 11693971]
  2. Barker, D., ed. 2001. Fetal Origins of Cardiovascular and Lung Disease. New York: Marcel Dekker.
  3. Bartram C. R., deKlein A., Hagemeijer A., van Agthoven T., Geurts van Kessel A., Bootsma D. et al. Translocation of c-Abl Oncogene Correlates with the Presence of a Philadelphia Chromosome in Chronic Myelocytic Leukemia. Nature. 1983;306(5940):277–80. [PubMed: 6580527]
  4. Beeson P. B. Changes in Medical Therapy during the Past Half Century. Medicine (Baltimore). 1980;59(2):79–99. [PubMed: 7360043]
  5. Black, D. 1980. Inequalities in Health: Report of a Working Party, Department of Health and Society Security. London: Her Majesty's Stationery Office.
  6. Bumol T. F., Watanabe A. M. Genetic Information, Genomic Technologies, and the Future of Drug Discovery. Journal of the American Medical Association. 2001;285(5):551–55. [PubMed: 11176857]
  7. Castelli W. P., Anderson K. Population at Risk: Prevalence of High Cholesterol Levels in Hypertensive Patients in the Framingham Study. American Journal of Medicine. 1986;80(Suppl. 2A):23. [PubMed: 3946458]
  8. Chalmers I. The Cochrane Collaboration: Preparing, Maintaining, and Disseminating Systematic Reviews of the Effects of Health Care. Annals of the New York Academy of Science. 1993;703:156–63. ; discussion 163–165. [PubMed: 8192293]
  9. Chen, L. C. 1996."World Population and Health." In 2020 Vision: Health in the 21st Century. Washington, DC: National Academy Press.
  10. Collins F. S., McKusick V. A. Implications of the Human Genome Project for Medical Science. Journal of the American Medical Association. 2001;285(5):540–44. [PubMed: 11176855]
  11. Comroe J. H. Jr., Dripps R. D. Scientific Basis for the Support of Biomedical Science. Science. 1976;192(4235):105–11. [PubMed: 769161]
  12. Cooter, R., and J. Pickstone. 2000. Medicine in the Twentieth Century. Amsterdam: Harwood.
  13. Cowan W. M., Kandel E. R. Prospects for Neurology and Psychiatry. Journal of the American Medical Association. 2001;285(5):594–600. [PubMed: 11176865]
  14. Doll R. Preventive Medicine: The Objectives. Ciba Foundation Symposium. 1985;110:3–21. [PubMed: 3845884]
  15. Druker B. J., Tamura S., Buchdunger E., Ohno S., Segal G. M., Fanning S. et al. Effects of a Selective Inhibitor of the Abl tyrosine kinase on the Growth of Bcr-Abl Positive Cells. Nature Medicine. 1996;2(5):561–66. [PubMed: 8616716]
  16. Egger, M., G. Davey-Smith, and D. G. Altman. 2001. Systematic Reviews in Health Care: Meta-Analysis in Context. London: BMJ Publications.
  17. Feachem, R. G. A., T. Kjellstrom, C. J. L. Murray, M. Over, and M. A. Phillips. 1992. The Health of Adults in the Developing World. Oxford, U.K.: Oxford University Press.
  18. Felger I., Genton B., Smith T., Tanner M., Beck H. P. Molecular Monitoring in Malaria Vaccine Trials. Trends in Parasitology. 2003;19(2):60–63. [PubMed: 12586469]
  19. Finch, R. G., and R. J. Williams. 1999. Antibiotic Resistance. London: Baillière Tindall.
  20. Giles J. Biosafety Trials Darken Outlook for Transgenic Crops in Europe. Nature. 2003;425(6960):751. [PubMed: 14574368]
  21. Goldenberg R. L., Jobe A. H. Prospects for Research in Reproductive Health and Birth Outcomes. Journal of the American Medical Association. 2001;285(5):633–39. [PubMed: 11176872]
  22. Griffith L. G., Grodzinsky A. J. Advances in Biomedical Engineering. Journal of the American Medical Association. 2001;285(5):556–61. [PubMed: 11176858]
  23. Harris E., Tanner M. Health Technology Transfer. British Medical Journal. 2000;321(7264):817–20. [PMC free article: PMC1118623] [PubMed: 11009526]
  24. Horton, R. 2003. Second Opinion: Doctors, Diseases and Decisions in Modern Medicine. London: Grant Books.
  25. Institute of Medicine. 2002. Stem Cells and the Future of Regenerative Medicine. Washington, DC: National Academy Press.
  26. ISIS-2 (Second International Study of Infarct Survival) Collaborative Group. Randomised Trial of Intravenous Streptokinase, Oral Aspirin, Both, or Neither among 17,187 Cases of Suspected Acute Myocardial Infarction: ISIS-2. Lancet. 1988;2(8607):349–60. [PubMed: 2899772]
  27. Joët T., Eckstein-Ludwig U., Morin C., Krishna S. Validation of the Hexose Transporter of Plasmodium falciparum as a Novel Drug Target. Proceedings of the National Academy of Sciences of the U.S.A. 2003;100(13):7476–79. [PMC free article: PMC164611] [PubMed: 12792024]
  28. Kaji E. H., Leiden J. M. Gene and Stem Cell Therapies. Journal of the American Medical Association. 2001;285(5):545–50. [PubMed: 11176856]
  29. Keusch G. T., Medlin C. A. Tapping the Power of Small Institutions. Nature. 2003;422(6932):561–62. [PubMed: 12686973]
  30. Klein A., van Kessel A. G., Grosveld G., Bartram C. R., Hagemeijer A., Bootsma D. et al. A Cellular Oncogene Is Translocated to the Philadelphia Chromosome in Chronic Myelocytic Leukaemia. Nature. 1982;300(5894):765–67. [PubMed: 6960256]
  31. Land K. M. The Mosquito Genome: Perspectives and Possibilities. Trends in Parasitology. 2003;19(3):103–5. [PubMed: 12643988]
  32. Letvin N. L., Bloom B. R., Hoffman S. L. Prospects for Vaccines to Protect against AIDS, Tuberculosis, and Malaria. Journal of the American Medical Association. 2001;285(5):606–11. [PubMed: 11176867]
  33. Livingston D. M., Shivdasani R. Toward Mechanism-based Cancer Care. Journal of the American Medical Association. 2001;285(5):588–93. [PubMed: 11176864]
  34. Mack M. J. Minimally Invasive and Robotic Surgery. Journal of the American Medical Association. 2001;285(5):568–72. [PubMed: 11176860]
  35. Mattick J. S. Challenging the Dogma: The Hidden Layer of Non-Protein-Coding RNAs in Complex Organisms. Bioessays. 2003;25(10):930–39. [PubMed: 14505360]
  36. Modell B., Bulyzhenkov V. Distribution and Control of Some Genetic Disorders. World Health Statistics Quarterly. 1988;41:209–18. [PubMed: 3232409]
  37. Niklason L. E., Langer R. Prospects for Organ and Tissue Replacement. Journal of the American Medical Association. 2001;285(5):573–76. [PubMed: 11176861]
  38. Noedl H., Wongsrichanalai C., Wernsdorfer W. H. Malaria Drug-Sensitivity Testing: New Assays, New Perspectives. Trends in Parasitology. 2003;19(4):175–81. [PubMed: 12689648]
  39. Nossal, G. J. V. 1999. "Vaccines." In Fundamental Immunology, ed. W. E. Paul. Philadelphia: Lippincott-Raven.
  40. Nowell P. C., Hungerford D. A. A Minute Chromosome in Human Chronic Granulocytic Leukemia. Science. 1960;132:1497–501.
  41. Nuffield Council on Bioethics. 2002. The Ethics of Research Related to Healthcare in the Developing Countries. London: Nuffield Council on Bioethics.
  42. Olshansky S. J., Carnes B. A., Cassel C. In Search of Methuselah: Estimating the Upper Limits to Human Longevity. Science. 1990;250(4981):634–40. [PubMed: 2237414]
  43. Pang T. Complementary Strategies for Efficient Use of Knowledge for Better Health. Lancet. 2003;361(9359):716. [PubMed: 12620734]
  44. Perrin L., Telenti A. HIV Treatment Failure: Testing for HIV Resistance in Clinical Practice. Science. 1998;280(5371):1871–73. [PubMed: 9669946]
  45. Porter, R. 1997. The Greatest Benefit to Mankind: A Medical History of Humanity from Antiquity to the Present. London: Harper Collins.
  46. Roberts L. Polio: The Final Assault? Science. 2004;303(5666):1960–68. [PubMed: 15044779]
  47. Rowley J. D. A New Consistent Chromosomal Abnormality in Chronic Myelogenous Leukemia Identified by Quinacrine Fluorescence and Giemsa Staining. Nature. 1973;243(5405):290–93. [PubMed: 4126434]
  48. Ruan Y. J., Wei C. L., Ee A. L., Vega V. B., Thoreau H., Su S. T. et al. Comparative Full-Length Genome Sequence Analysis of 14 SARS Coronavirus Isolates and Common Mutations Associated with Putative Origins of Infection. Lancet. 2003;361(9371):1779–85. [PubMed: 12781537]
  49. Sackett D. L., Rosenberg W. M., Gray J. A., Haynes R. B., Richardson W. S. Evidence Based Medicine: What It Is and What It Isn't. British Medical Journal. 1996;312(7023):71–72. [PMC free article: PMC2349778] [PubMed: 8555924]
  50. Scriver C. R., Neal J. L., Saginur R., Clow A. The Frequency of Genetic Disease and Congenital Malformation among Patients in a Pediatric Hospital. Canadian Medical Association Journal. 1973;108(9):1111–15. [PMC free article: PMC1941389] [PubMed: 4704890]
  51. Souhami, R. L., I. Tannock, P. Hohenberger, and J. C. Horiot, eds. 2001. The Oxford Textbook of Oncology. 2nd ed. Oxford, U.K.: Oxford University Press.
  52. Tempany C. M., McNeil B. J. Advances in Biomedical Imaging. Journal of the American Medical Association. 2001;285(5):562–67. [PubMed: 11176859]
  53. Warrell, D. A., T. M. Cox, J. D. Firth, and E. J. Benz, eds. 2003. Oxford Textbook of Medicine. 4th ed. Oxford, U.K.: Oxford University Press.
  54. Weatherall, D. J. 1995. Science and the Quiet Art: The Role of Research in Medicine. New York: Rockefeller University, W. W. Norton, and Oxford University Press.
  55. ———. 2003. Genomics and Global Health: Time for a Reappraisal. Science 302(5645): 597–99. [PubMed: 14576421]
  56. Weatherall D. J., Clegg J. B. Inherited Haemoglobin Disorders: An Increasing Global Health Problem. Bulletin of the World Health Organization. 2001a;79(8):704–12. [PMC free article: PMC2566499] [PubMed: 11545326]
  57. ———. 2001b. The Thalassaemia Syndromes. 4th ed. Oxford, U.K.: Blackwell Scientific Publications.
  58. ———. 2002. Genetic Variability in Response to Infection: Malaria and After. Genes and Immunity 3(6): 331–37. [PubMed: 12209359]
  59. Webster, C. 1998. The National Health Service: A Political History. Oxford, U.K.: Oxford University Press.
  60. WHO (World Health Organization). 2000. World Health Report 2000—Health Systems: Improving Performance. Geneva: WHO.
  61. ———. 2001. Macroeconomics and Health: Investing in Health for Economic Development: Report of the Commission on Macroeconomics and Health. Geneva: WHO.
  62. ———. 2002a. Genomics and World Health 2002. Geneva: WHO.
  63. ———. 2002b. Global Forum for Health Research: The 10/90 Report on Health Research 2001–2002. Geneva: WHO.
  64. ———. 2002c. The World Health Report 2002: Reducing Risks, Promoting Healthy Life. Geneva: WHO. [PubMed: 14741909]
Copyright © 2006, The International Bank for Reconstruction and Development/The World Bank Group.
Bookshelf ID: NBK11740; PMID: 21250320