NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

National Academy of Sciences (US) Committee on an Assessment of Centers for Disease Control and Prevention Radiation Studies from DOE Contractor Sites: Subcommittee to Review the Hanford Thyroid Disease Study Final Results and Report. Review of the Hanford Thyroid Disease Study Draft Final Report. Washington (DC): National Academies Press (US); 2000.


Public Summary


In 1986, officials of the US Department of Energy revealed that the Hanford Atomic Products Operations in Richland, Washington, had been releasing radioactive material, in particular iodine-131, into the environment over a period of years. This information, which confirmed the suspicions of some people in the Pacific Northwest about what they called the Hanford Reservation or just Hanford, created quite a stir. Both the US Congress and citizens of the Northwest became keenly interested in knowing whether these radiation releases had caused human health effects. They were particularly concerned about whether Hanford releases of iodine-131 had led to an increase in thyroid disease among the population of the area.

In 1988, Congress ordered a study of the human health effects of exposure to the iodine-131 released from Hanford. Funded by the Centers for Disease Control and Prevention (CDC), the study was carried out by the Seattle-based Fred Hutchinson Cancer Research Center over the last decade. The study examined estimates of thyroid exposure1 and rates of thyroid disease because iodine-131 concentrates in the thyroid, making that organ the best indicator of radiation damage in the population.

Scientists have recognized for about 45 years that iodine-131 intake can lead to substantial radiation exposure of the thyroid and possibly to the development of thyroid cancer and other thyroid diseases. The likelihood that a given person will develop thyroid disease after being exposed depends on the size of exposure. The amount of radiation received by people living downwind of the Hanford site depended on specific characteristics of their individual lives, such as when they were born, where they lived, what foods they ate, and where they obtained those foods. The iodine-131 exposure of children occurred mainly through the milk they drank and to a lesser extent through the leafy vegetables and fish they ate. Breathing contaminated air also exposed Hanford area residents and was included in the exposure calculations. The radiation exposures of the thyroid glands of small children were, on the average, much higher than those of adults because children's thyroids are much smaller than those of adults and children consume a lot of milk.

To conduct the Hanford Thyroid Disease Study (HTDS), a 9-year, $18 million effort, the investigators had to contact 5,199 eligible people who had been born near Hanford (in Franklin, Adams, Benton, Walla Walla, Okanogan, Stevens, and Ferry counties) in the period 1940–1946, chosen because the period of greatest radiation releases was 1944–1947. Eventually, the HTDS investigators enrolled 3,441 subjects in the study, gave them extensive medical examinations to look for evidence of thyroid disease, and administered a questionnaire on risk factors for thyroid disease. The HTDS investigators estimated individual radiation exposures for the 3,190 people who, during 1944–1957, had ever lived in the geographic area for which dose calculations were made. Estimating radiation exposures of 50 years ago is a daunting task because of the many unknowns about people's lives, habits, and diet. Nevertheless, a detailed method developed previously by the Pacific Northwest National Laboratory was used to estimate the exposure received by each HTDS participant.

Armed with the rates of thyroid disease found among the 3,441 participants and estimates of radiation exposure received by 3,190 of them, the HTDS investigators used statistical methods to determine whether there was a relationship between the rates of disease found and the estimated radiation exposures. Ordinarily, one would expect that participants with larger radiation exposures would have higher rates of disease. The statistical analysis was complex for a number of reasons, including difficulty in determining the radiation exposure received by each person.

On January 28, 1999, the HTDS investigators and CDC released a Draft Final Report (FHCRC, 1999a) of the study to the public. The report was a draft because, although it had undergone internal review by CDC, it had yet to be reviewed and commented on by the National Academy of Sciences-National Research Council (NAS-NRC). The draft was released 2 months earlier than planned for several reasons, including public pressure for the report's release without changes made by CDC and the desire by NAS-NRC to have an open review of the report. The primary finding of the HTDS draft report was that there was no evidence linking radiation exposure from Hanford to the rate of thyroid disease found in the study population. The lack of evidence of an effect, in scientific terms, is often called a "negative" finding. While presenting their findings to the media and regional citizen groups, the HTDS investigators overstated the certainty of their results.

Many Northwest citizens were upset not only about the findings of the study, but also about how the results of the study were conveyed by the investigators. Shortly after the draft's release, at CDC's request, NAS-NRC began an independent and comprehensive appraisal of the study methods, results, and interpretation and of how the study's findings were communicated to the public. This report is a fulfillment of that request.

The NRC subcommittee studied the HTDS Draft Final Report and discussed its contents in a series of meetings and e-mail communications over about 9 months (February–October 1999).

The subcommittee arrived at consensus views on six specific questions asked by CDC: how well the HTDS investigators analyzed their data, how well the results were presented, whether their conclusions were reasonable, whether the material provided to the public was accurate and useful in helping the public to understand the study findings, how the presentation might need to be changed for the final report, and how CDC might improve communication with the public in the future. The subcommittee also developed a number of issues of its own to evaluate.

Detailed comments concerning the HTDS Draft Final Report are included in various chapters of the main report. The executive summary following this section highlights the views of the NRC subcommittee. Answers to the questions mentioned above are summarized in the executive summary and answered fully at the end of the subcommittee's report. This public summary is intended to review the main points of the executive summary in nontechnical language.

For its report, the NRC subcommittee concentrated on five main subjects for evaluation: design of the HTDS, estimated radiation exposures, data analysis, statistical power, and communication issues. Its major findings and recommendations appear below in boldface type.

Design of the HTDS

The NRC subcommittee considered the HTDS design to be appropriate to address its goals. The methods to determine who the participants should be and where they were living were exceptionally good, and the HTDS collected the appropriate data on participants to enable the proper type of analysis. Although the subcommittee found the study methods to be of high quality, there are considerable uncertainties in some of the information.

The investigators chose the most relevant population to study: those in the most highly exposed areas who were young children at the time of the greatest iodine-131 releases. It was also reasonable to study, as the investigators did, a low-exposure group upwind of and more distant from the Hanford site. The investigators were able to examine a high percentage of eligible persons, and this was a strength of the study. The information collected included such items as sex, history of other radiation exposures (such as from medical procedures), smoking history, and ethnicity.

Knowing the childhood milk-drinking habits of the participants in the study was particularly important because iodine-131 is most readily transferred to children through cow's milk as a result of the fallout that settles on pasture grass. The investigators attempted to question a parent or other close relative about each participant's residence history, where milk was obtained, and the amount of milk that was consumed during the period of the iodine-131 releases (1944–1957). If relatives were not available, then participants were given a questionnaire at the time of the medical examinations to get their history of residences and sources of milk. For 38% of the subjects, no parent or close relative was available to provide detailed information about childhood milk-drinking.

The NRC subcommittee found that the clinical examinations and laboratory studies were performed with good-quality, scientifically valid methods.

Ultrasound and palpation methods were used in the examinations. In palpation, a physician feels a person's thyroid gland in the neck with his or her fingers to determine its size and detect lumps. The subcommittee's only criticisms of the medical procedures were related to some quality-control procedures in the pathology review and to the fact that some requested medical records could not be obtained. But those criticisms were not important enough to invalidate the findings of the study.

Estimated Radiation Exposures

The NRC subcommittee's review found that the exposure estimates were imprecise: the true exposures could plausibly range from one-third or one-half of the best estimate to 2 or 3 times the best estimate. That range is reasonable for historical dose reconstructions. Evaluations of the model by other scientists have cast doubt on some of the factors involved in the model. The subcommittee also has concerns about some factors that might lead to greater overestimation or underestimation of the radiation exposures than was acknowledged by the HTDS investigators.

Pacific Northwest National Laboratory developed the computer model used to estimate the radiation exposures in the Hanford Environmental Dose Reconstruction (HEDR) project. This model had to take into account many factors: how much iodine-131 was vented from the Hanford site, the wind directions and other weather-related measures, how fast the iodine-131 settled to Earth, how much stayed on vegetation, how much vegetation was consumed by cows (which depended on the season), the fraction of the iodine-131 eaten or drunk by cows that was transferred to their milk, the length of time between when the farmer milked the cow and when the milk was consumed by a child, where the milk consumed by a child came from (for instance, a local versus a distant dairy), how much milk was consumed by the child at various ages, the fraction of the iodine-131 consumed (or breathed) that was deposited in the thyroid gland, and how long it stayed there. The model had to be able to estimate thyroid exposures of persons of different sexes, ages, places of residence, and dietary habits. The subcommittee found that the general method used in the model was suitable for the HTDS, assuming that the proper information about each participant could be obtained and used.
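The structure of such a pathway model can be sketched as a chain of multiplicative transfer factors. The sketch below is only an illustration of that structure; every numerical value is a hypothetical placeholder, not an actual HEDR parameter, and the real model tracked far more detail (weather, season, decay over time, and individual diets).

```python
# Illustrative sketch of a pasture-to-thyroid iodine-131 pathway, in the
# spirit of (but much simpler than) the HEDR model. All numbers below are
# hypothetical placeholders, NOT actual HEDR parameter values.

def thyroid_dose_mGy(
    deposition_Bq_per_m2,            # iodine-131 deposited on pasture
    grass_interception=0.25,         # fraction retained on vegetation
    cow_intake_m2_per_day=45.0,      # pasture area a cow grazes per day
    milk_transfer_d_per_L=0.01,      # fraction of daily intake per litre of milk
    decay_fraction=0.7,              # decay between milking and consumption
    milk_L_per_day=0.7,              # child's daily milk consumption
    dose_factor_mGy_per_Bq=3.7e-3,   # thyroid dose per Bq ingested (child)
    days=30,                         # duration of the exposure period
):
    # concentration in milk after each transfer step
    bq_in_milk_per_L = (deposition_Bq_per_m2 * grass_interception
                        * cow_intake_m2_per_day * milk_transfer_d_per_L
                        * decay_fraction)
    # total activity ingested by the child over the period
    bq_ingested = bq_in_milk_per_L * milk_L_per_day * days
    return bq_ingested * dose_factor_mGy_per_Bq

print(round(thyroid_dose_mGy(100.0), 2))  # → 0.61
```

Because the result is a product of many uncertain factors, relative errors in the factors combine multiplicatively, which is why dose-reconstruction uncertainties are usually expressed as "within a factor of 2 or 3" rather than as an additive range.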

The NRC subcommittee found that the resulting exposure estimates for the HTDS participants were probably fairly accurate, mostly within a factor of 2 or 3. This statement is based on the results of validation exercises using the HEDR models (Napier and others, 1994). Recently, however, several scientists have claimed that the amount of iodine-131 released from the Hanford site was higher than calculated by HEDR developers and that the HEDR model therefore underestimated the thyroid exposures by roughly 30%. In addition, the NRC subcommittee believes that the model overestimates the iodine-131 that was transferred from pasture grass to cows' milk; this would mean that the model overestimated exposures. A careful reassessment of these elements of the model by the model developers is needed.

Errors like those can affect a study's findings about a relationship between disease rates and estimated radiation exposures. The ability of the HTDS to find the true relationship is called its "statistical power" and was a focus of the NRC subcommittee's attention.

The NRC subcommittee found that the HEDR and HTDS investigators probably assessed individual exposures as being more precise than they actually were because some sources of uncertainty were underestimated or not dealt with.

The subcommittee noted that exposures that took place 40–50 years ago could not be precisely estimated and that such a situation could substantially reduce the ability of the study to detect a radiation effect. (Uncertainty and the power of the study are discussed further in this summary.)

The HTDS did examine the impact of fallout exposures from nuclear weapons tests conducted at the Nevada Test Site (NTS) but overlooked other sources of fallout exposure (such as nuclear tests in the Pacific and the Soviet Union). The NRC made a crude assessment of the exposures from global fallout and found that, on the average, the thyroid doses from global fallout were somewhat smaller than those from NTS fallout. In addition, the global fallout exposures occurred during the teenage years and early 20s of the study population. The NRC concluded that global fallout is not likely to have a large impact on the results of the epidemiologic study.

Analysis of HTDS Data

The subcommittee found some limitations in the HTDS data analysis, including exclusive use of the HEDR estimates of thyroid exposures from the Hanford releases, possible inaccuracies in exposure estimates for people who had lived only part of the time in the Hanford area, the need to analyze thyroid-disease rates by geographic area, and the absence of some key tables.

It is difficult to analyze the results of a study of the occurrence of disease if the number of cases is small. Although more than 3,000 people were evaluated for thyroid disease in the HTDS, only 20 had thyroid cancer, and only 14 of those had lived in the region covered by the HEDR model during 1944–1957 and could therefore have their exposures estimated. The numbers were greater for most other thyroid diseases; for instance, benign thyroid nodules (noncancerous lumps) were found in 250 people. The radiation effect in causing such a disease could be estimated with more certainty because of the larger number of cases.
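The effect of a small case count on precision can be illustrated with a rough back-of-the-envelope calculation: treating a case count n as a Poisson observation, an approximate 95% confidence interval for the underlying expected count is n ± 1.96·√n (a normal approximation that is admittedly crude for small n). Using the two counts quoted above, 14 cancers with dose estimates and 250 benign nodules:

```python
import math

def approx_poisson_ci(n, z=1.96):
    """Normal-approximation 95% CI for a Poisson count (crude for small n)."""
    half = z * math.sqrt(n)
    return n - half, n + half

for label, n in [("thyroid cancer", 14), ("benign nodules", 250)]:
    lo, hi = approx_poisson_ci(n)
    rel_width = (hi - lo) / n  # interval width relative to the count itself
    print(f"{label}: {n} cases, CI ~ ({lo:.1f}, {hi:.1f}), "
          f"relative width {rel_width:.2f}")
```

The interval for 14 cases is about as wide as the count itself, while for 250 cases it is only about a quarter of the count, which is why the dose-response estimate for benign nodules is much better determined than the one for thyroid cancer.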

The NRC subcommittee was critical of the HTDS investigators' exclusive use of the HEDR estimates of thyroid exposure for the data analysis and suggests supplemental analyses that could help to confirm or weaken the conclusions of the study. The subcommittee also found the analyses of the radiation effect (called "dose-response relationship" in the study) difficult to interpret for a variety of reasons. The subcommittee believes that a more complete analysis should be carried out to estimate exposures of people who were out of the study's geographic area for some of the time when the iodine-131 releases took place.

The subcommittee recommends that the HTDS investigators conduct more analyses to address the fact that the thyroid disease rates in the HTDS appeared to differ in unexpected ways between one geographic area and another. The geographic area in which each person was born should be taken into account to explain the unusual finding that thyroid disease rates tended to be higher in areas that were expected to have the smallest amount of iodine-131 deposited on the ground.

The subcommittee believes that the HTDS investigators were correct in emphasizing analyses of the radiation effect rather than comparisons with another population. It does not believe that comparing the HTDS study group with some unexposed general population would be useful.

Members of the public have repeatedly questioned why no unexposed control group was involved in the HTDS so that disease rates could be compared. There are several major reasons why the subcommittee does not think that that would be a valid comparison. First, for reasons unrelated to radiation, persons living in different geographic areas can vary in their likelihood of developing thyroid cancer. Second, the rates of disease found in the HTDS are based on thyroid examinations. Intensive medical examinations usually find more thyroid disease than would otherwise be detected in routine medical practice. Because no other population in the Northwest has been examined this way, a valid comparison with other populations cannot be made. Any conclusions drawn from comparisons with another population defined as a "control group" would have more potential for error than the conclusions drawn from the analyses that the HTDS investigators conducted. Third, the analysis of a radiation effect is a valid guide to the risk to the Hanford population even without an unexposed control group, as long as there is a sufficient range of exposure levels and they are estimated with reasonable accuracy.

The subcommittee is concerned that the results of the study were reported—and interpreted—in black and white terms of whether a statistical test was passed or failed. It recommends that confidence limits be provided throughout the report to allow readers to judge how large a radiation effect might be consistent with the data. It feels that the HTDS investigators probably overstated the strength of their finding that there was no radiation effect.

Usually, scientists provide both a best estimate and a range of estimates—called "confidence limits" or "confidence intervals"—to use to interpret statistical results. However, the HTDS investigators provided only their best estimate, not the confidence limits, for the size of possible radiation effects in the report or in their public statements. That made their findings seem more solid than they actually were.

Furthermore, the HTDS investigators should have calculated confidence limits that account for both the imprecision in the exposure estimates and the conventional statistical imprecision. By not presenting confidence limits, especially ones that consider imprecision in exposure estimates, the HTDS investigators overstated the strength of their main findings in the draft report.
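The point about confidence limits can be made concrete with a toy calculation; the numbers below are hypothetical and are not taken from the HTDS. A best estimate of excess relative risk near zero, once reported with its confidence interval, shows immediately how large an effect the data still allow.

```python
# Toy illustration (all numbers hypothetical, not HTDS results): a best
# estimate near zero says little by itself; the confidence interval shows
# what range of effects the data are consistent with.

def wald_ci(estimate, std_err, z=1.96):
    """Simple Wald-type 95% confidence interval."""
    return estimate - z * std_err, estimate + z * std_err

est, se = 0.05, 0.60   # hypothetical excess relative risk per Gy and its SE
lo, hi = wald_ci(est, se)
print(f"best estimate {est:.2f}/Gy, 95% CI ({lo:.2f}, {hi:.2f})")
# The interval runs from negative values to well above 1 per Gy, so these
# hypothetical data would neither demonstrate an effect nor rule out a
# sizeable one.
```

Accounting for imprecision in the exposure estimates, as the subcommittee recommends, would widen the standard error and hence the interval further.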

Statistical Power and the HTDS Interpretation

The subcommittee believes that the assumptions used by the HTDS investigators to estimate the needed sample size and to calculate statistical power were incorrect; the assumptions did not acknowledge that exposures could be estimated only very imprecisely. The subcommittee found that the HTDS ignored five sources of imprecision, and this decreased the ability of the study to detect a small radiation effect. That means that the negative results that the study obtained are less definitive than the report and press releases stated.

Because the HTDS results found no increase in thyroid disease with an increase in radiation exposure from iodine-131, a critical issue is how to interpret those findings correctly. To evaluate the HTDS interpretation, the subcommittee asked a series of questions. For example, were the data good enough? Do the underlying patterns of exposure and disease agree or disagree with the negative findings? Was the statistical power of the study high enough to make the negative findings convincing? (The higher the statistical power of the study, the more confidence people can put in the study's findings.)

The subcommittee reviewed the factors that influence statistical power, focusing on the impact of lack of precision in the thyroid exposures calculated by the HEDR project. It found that the statistical-power calculations made inadequate allowance for imprecision in the dose estimates. Given that situation, the subcommittee believes that the HTDS did not have as much statistical power to detect radiation effects as the investigators claimed. That means that the results of no effect ("negative" findings) reported by the HTDS are less definitive than the report and related public documents stated. Hence, this subcommittee recommends that, if possible, the HTDS investigators redo the statistical-power calculations to take into account all the sources of imprecision and that they reinterpret the study results in accordance with the limitations of statistical power.
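The way dose imprecision erodes statistical power can be illustrated with a small Monte Carlo sketch. This is not the HTDS power calculation: the sample size, effect size, and error magnitude below are arbitrary illustrative choices, and the outcome is a simple continuous measure rather than a disease rate. Multiplicative error in the doses both flattens the fitted dose-response slope and adds noise, so fewer simulated studies reach statistical significance.

```python
import math
import random

def ols_slope_t(xs, ys):
    """Slope and its t statistic from simple linear regression."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    sse = sum((y - a - b * x) ** 2 for x, y in zip(xs, ys))
    se = math.sqrt(sse / (n - 2) / sxx)
    return b, b / se

def power(dose_error_gsd, beta=0.1, n=100, reps=300):
    """Fraction of simulated studies detecting the true slope at p < 0.05.

    dose_error_gsd is the geometric standard deviation of a multiplicative,
    mean-1 lognormal error on each person's dose; 1.0 means doses are exact.
    All parameter values are illustrative, not HTDS quantities.
    """
    rng = random.Random(42)          # fixed seed for reproducibility
    sigma_u = math.log(dose_error_gsd)
    hits = 0
    for _ in range(reps):
        true_dose = [rng.lognormvariate(0.0, 1.0) for _ in range(n)]
        obs_dose = [d * rng.lognormvariate(-sigma_u ** 2 / 2, sigma_u)
                    for d in true_dose]
        # outcome depends on the TRUE dose; analysis sees only the noisy dose
        y = [beta * d + rng.gauss(0.0, 1.0) for d in true_dose]
        _, t = ols_slope_t(obs_dose, y)
        if abs(t) > 1.96:
            hits += 1
    return hits / reps

print("power, doses known exactly  :", power(1.0))
print("power, doses uncertain x/÷ 2:", power(2.0))
```

Under these illustrative settings the detection rate drops substantially once each dose is uncertain by a factor of about 2, which is the qualitative point the subcommittee makes: power calculations that ignore dose imprecision overstate a study's ability to detect a real effect.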

The subcommittee believes that the findings of the HTDS cannot be reliably distinguished from the findings of the study of thyroid disease among children in Nevada and Utah who had been exposed to fallout resulting from atmospheric nuclear weapons tests conducted at the Nevada Test Site in the 1950s. A marginally positive radiation effect was found in that study. It is likely that, given the confidence limits for both studies, there would be an overlap, even though one appears positive and one negative. That is because the findings of both studies are very imprecise.

Communication of HTDS Results to the Public

The subcommittee believes that the original communication plan developed for the HTDS, particularly the parts that emphasized open public communication, was well developed and should have been moderately successful if implemented as planned. However, several factors led to an early release of a draft report, rather than a final report. When the Draft Final Report was released, a number of communication errors were made that caused public outcry.

Compared with the history of less than open public information from the US Department of Energy and its predecessor agencies, the early plans by CDC and the HTDS investigators for open communication about the study were enlightened and promising. So were the decision to establish a citizen advisory group for the study and the apparent level of cooperation offered to various other citizen groups in the region over the years of the study. All those early efforts should have helped to build trust and credibility for the study.

Some of the public outcry on release of the draft report might have been avoided if the original communication plans outlined in the HTDS draft had been followed. The draft report outlined a good communication plan for its release, which included an admirable concern for translating the technical information in the report into an understandable booklet for the public and other efforts, including a Web site, to share information with the public. But the plan also called for delivery to the public of final information about the study, not a draft that had not been subject to review by outside scientists. Instead, several events forced the early release of the Draft Final Report and pre-empted the original communication plan.

Not only was the early release of the report a problem, but so was the main message in the report (namely, a strong statement that iodine-131 releases had caused no thyroid disease). In trying to decide how to present this message, CDC was on the horns of a dilemma. CDC personnel had been urged by some members of citizen groups not to alter the report before its release; they wanted the report to be released just as the HTDS investigators had written it. CDC also had to respect issues of academic freedom regarding the principal investigators' views. But after the draft's release, the CDC people were blamed for not intervening to counter the strong message delivered to the public by the HTDS investigators.

A key weakness of the communication effort surrounding the release of the draft report was that the public materials written and the oral statements made by HTDS investigators overstated the certainty of the study (the statistical power) and the conclusiveness of the negative findings, but did not report any of the uncertainties.

The public materials factually represented what appeared in the draft report. But, given the state of a draft document that had not been reviewed externally and a number of uncertainties in the data, the strong statements that the investigators made publicly were unwarranted. On the basis of comments received by the NRC subcommittee from members of the public, it is clear that many persons with an interest in the findings of the study were not only disappointed with the reported negative results, but also upset by how the results were disseminated and described.

A number of factors contributed to the problems surrounding the draft report's release, including (1) a perceived need for an information blackout that included the citizen groups that had been privy to most other parts of the study; (2) a complex schedule of briefings of groups in person in Washington, DC, and by telephone in the Hanford area to various state health agencies and citizen organizations only several hours before the media and public briefings on the findings; (3) a leak to the New York Times that related the findings to the public before most of the briefings in the Hanford area; and (4) a message that contradicted what most of the public thought would be the outcome of the study.

The subcommittee believes that in the media and public briefings the HTDS investigators paid insufficient attention to the audience's health concerns and fears and that HTDS investigators and CDC officials should have offered more balanced, and possibly alternative, interpretations of the findings and discussed their implications for individuals.

During the media briefing and public meeting held to announce the findings of the HTDS, the investigators emphasized the overall statistical results of the study and did not seriously discuss the outcome for individuals. That approach angered many members of the community who had thyroid-related health problems. More care should have been taken to explain the differences between statistical relationships and individual outcomes. The subcommittee recommends that when the final report is released, implications for individuals and families that have suffered because of thyroid disease be explained and highlighted in the written materials and the public briefings. In addition, legitimate differences in viewpoints regarding study findings between the HTDS and CDC personnel should be explained and discussed.

The subcommittee recommends that a new communication plan be developed for the release of the final report, taking into account the serious problems encountered with the release of the draft report. In the final report and all public documents related to it, any important changes made from the draft report and all remaining uncertainties should be clearly outlined and explained. The subcommittee applauds CDC's open-communication policy and strongly recommends that this policy continue with the HTDS and similar studies.

The complicated briefing strategy used for releasing the Draft Final Report did not work well, and the subcommittee suggests that a simpler and more efficient briefing plan be devised for releasing the final report. In particular, it recommends that telephone briefings be abandoned because all involved with release of the draft report disliked them. Citizen groups that have participated in a study over the years should not be kept out of the information flow concerning the study report's release until the very last minute, as they were with the briefings on the draft report.

The subcommittee also suggests that a small group of risk-communication experts, scientists, journalists, and citizens be convened to consider more effective public release and discussion of controversial draft reports that have not been peer reviewed, as well as other issues that could affect the future release of important CDC reports.

Concluding Remarks

In concluding its review of the HTDS Draft Final Report, the NRC subcommittee considered the notion raised by the public that the HTDS is inconclusive in its findings. The subcommittee believes that the issue cannot be answered with a simple "agree" or "disagree," because the certainty of the interpretations from a complex study like the HTDS is always a matter of degree. The subcommittee members believe that the high certainty with which the HTDS investigators presented the negative findings of the draft report amounted to an overstatement. Still, the main finding of the HTDS final report could prove to be that no radiation effect can be observed. Given the imprecision in the exposure estimates and the effect of other statistical issues, the absence of any observable radiation effect is not proof that there is none. It does mean that the iodine-131 exposure did not have large effects. However, until estimates are given with appropriate confidence limits, we do not know how much risk to the thyroid is compatible with the data.

It seems doubtful that a better study could have been conducted in the downwind area, short of having some way to improve the exposure estimates greatly—an unlikely prospect because so little information is available on the exposures of 45 years ago. This carefully designed study, with sound follow-up and sound medical methods, examined a large fraction of the most heavily exposed population and failed to find any obvious evidence of a radiation effect; that is, there was no evidence of abnormally high rates of thyroid disease among the Hanford "downwinders" examined who had the largest estimated exposures. Thus, at face value, the study was negative, and no increased risk was found. The pattern of individual exposures is in accord with such basic factors as the prevailing wind direction and distance from the Hanford site, and this accord generally supports the exposure modeling. The negative results of both the geographic and the exposure comparisons imply that the iodine-131 exposures had no strong impact on thyroid disease.

However, if a similar exposure occurred elsewhere, one could not predict with confidence whether a positive or negative result would be seen. The small numbers of thyroid-cancer cases and the lack of precision in estimating individual exposures mean that one can have little confidence in the size of the risk estimates found in the HTDS.

At the time of the initial release of the Draft Final Report, the HTDS investigators indicated that residents of downwind areas should feel relief that being close to the Hanford nuclear site did not result in increased risk of any thyroid disease. Such statements are too broad, but they might be reasonable in specific instances. For example, a healthy 55-year-old who lived near Hanford and drank a large amount of milk as a child can take comfort in learning that there is no evidence that he or she will have a greater risk of thyroid disease than other people in the general HTDS study area.

At various public-comment meetings, people who lived in downwind areas stated that their families experienced more thyroid disease than would have been expected in the population at large. Their disease could have been the result of unusual fallout or eating patterns or unusual susceptibility to radiation effects. But one should bear in mind that some cases of thyroid disease occur for reasons not understood by medical science. For example, thyroid disease tends to run in families, and family clusters could be related to genetic factors in the families or to chance. The lack of evidence of a dose-response relationship for any type of thyroid disease in the HTDS suggests, but does not prove, that the overall risk was not affected by Hanford fallout. The evidence does not rule out (although it does not support) the possibility that a weak association could affect, for instance, people who are already susceptible to thyroid disease because they are predisposed to it by genetic factors.



Although dose is the correct technical term, this summary will use exposure to refer loosely to a person's total radiation dose to the thyroid gland resulting from either short- or long-term exposure to iodine-131 in the atmosphere and environment from releases during the period 1944–1957.

Copyright 2000 by the National Academy of Sciences. All rights reserved.
Bookshelf ID: NBK225225

