Am J Public Health. 2011 April; 101(4): 720–729.
PMCID: PMC3052342
NIHMSID: NIHMS258504

Excess Black Mortality in the United States and in Selected Black and White High-Poverty Areas, 1980–2000

Abstract

Objectives. Black working-aged residents of urban high-poverty areas suffered severe excess mortality in 1980 and 1990. Our goal in this study was to determine whether this trend persisted in 2000.

Methods. We analyzed death certificate and census data to estimate age-standardized all-cause and cause-specific mortality among 16- to 64-year-old Blacks and Whites nationwide and in selected urban and rural high-poverty areas.

Results. Urban men's mortality rate estimates peaked in 1990 and declined between 1990 and 2000 back to or below 1980 levels. Evidence of excess mortality declines among urban or rural women and among rural men was modest, with some increases. Between 1980 and 2000, there was little decline in chronic disease mortality among men and women in most areas, and in some instances there were increases.

Conclusions. In 2000, despite improved economic conditions, working-age residents of the study areas still died disproportionately of early onset of chronic disease, suggesting an entrenched burden of disease and unmet health care needs. The lack of consistent improvement in death rates among working-age residents of high-poverty areas since 1980 necessitates reflection and concerted action given that sustainable progress has been elusive for this age group.

In their seminal article, McCord and Freeman estimated that in 1980 Black male youths in Harlem, New York City, were less likely to survive to the age of 65 years than were male youths in Bangladesh.1 Mortality rates in 1980 were approximately 6 times greater among Harlem women aged 25 to 34 years and Harlem men aged 35 to 44 years than among White women and men in the same age groups nationwide. Geronimus et al. estimated that Black youths in a geographically diverse set of US high-poverty urban areas faced even worse mortality outcomes through middle age in 1990 than in 1980, including when these youths were compared with Black residents of equally poor rural communities.2,3

These striking findings suggest that national or statewide studies of population mortality may conceal important local variations. In addition, comparisons that include all age groups may obscure trends among working-age adults. For example, measures of life expectancy may be disproportionately influenced by survival probabilities of elderly people and infants, with life expectancy increasing with increased access to tertiary care. Variations in mortality among working-age adults may be more sensitive to circumstances that affect chronic disease trajectories, including access to and continuity of primary care, health education, work environments, neighborhood conditions, and the extent to which competing work and family obligations trigger sustained stress responses.4–6

Whether the severe mortality profiles of Black urban working-age adults persisted through the end of the 20th century is unknown. The 1990s witnessed significant socioeconomic, population health, and health care changes with potentially countervailing effects. On the positive side, the middle to late 1990s saw unprecedented economic growth, unemployment rates fell to all-time lows, poverty was deconcentrated in urban centers,7 highly active antiretroviral therapy became widely available, and the incidence of homicide declined.8

Yet, the extent to which economic growth affected residents of segregated urban communities varied by race and gender, with low-skilled Black men, in particular, being “left behind.”9–11 For poor mothers, Aid to Families with Dependent Children was replaced with Temporary Assistance to Needy Families, establishing lifetime limits, setting stringent work requirements, and reducing Medicaid enrollments. Studies revealed that Temporary Assistance to Needy Families participants expressed pride in employment but reported exhaustion and chronic anxiety, with possibly adverse health implications.6,12,13 Poor Black individuals faced additional challenges to accessing medical care in the context of a more privatized, market-based health care delivery system14; the movement of private practitioners out of the inner city15; and lack of health insurance for the working poor. Gentrification may have reduced the presence of inner-city federally qualified health centers.16 Antiretroviral treatment was less available to high-poverty populations than to more advantaged groups.17,18

In light of these competing and significant changes to the socioeconomic, health, and medical care landscape during the 1990s, we extended the analyses of McCord and Freeman1 and Geronimus et al.2,3 to the year 2000, the most recent year for which necessary census data were available.

METHODS

We analyzed census and death certificate data from 7 local poverty areas during 3 time periods.

Study Populations

Our study populations comprised all 16- through 64-year-old non-Hispanic Black or White residents of 7 local high-poverty areas studied by McCord and Freeman1 or Geronimus et al.3 Each area is composed of economically similar and contiguous census tracts or zip codes (urban areas) or counties (rural areas).

We focused on Black residents of 3 high-poverty urban areas: Southside Chicago, Illinois; Eastside Detroit, Michigan; and Harlem. For comparison, we analyzed Black residents of 2 poor rural areas (Black Belt Alabama and Delta Louisiana), White residents of 1 poor urban area (Central Cleveland, Ohio) and 1 poor rural area (Appalachian Kentucky), and Blacks and Whites nationwide (see the Appendix, available as a supplement to the online version of this article at http://www.ajph.org, for a listing of the neighborhoods that were included in the different study areas).

Statistical Analyses

We analyzed geocoded death certificate data for 1979 through 1981, 1989 through 1991, and 1999 through 2001, combining deaths in 3-year intervals to reduce the impact of random fluctuations. To estimate the population base for mortality rate calculations, we matched local area death certificate data with population counts by age, race, and gender for the same geographic area from the 1980, 1990, and 2000 decennial censuses. Coverage of mortal events in death certificate data is virtually universal. To mitigate biases introduced by census undercounting, we adjusted population counts for coverage errors.
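As an illustration of the undercount adjustment, the population denominator can be inflated by an estimated net undercount rate before it enters the rate calculation. The sketch below uses hypothetical coverage figures, not the study's actual adjustment factors:

```python
def adjust_for_undercount(counted, net_undercount_rate):
    """Inflate a census count by an estimated net undercount rate.

    A net undercount rate of 0.05 means the census enumerated only 95%
    of the true population, so the adjusted denominator for mortality
    rate calculations is counted / (1 - 0.05).
    """
    if not 0 <= net_undercount_rate < 1:
        raise ValueError("net undercount rate must be in [0, 1)")
    return counted / (1.0 - net_undercount_rate)

# Hypothetical example: 9500 people counted, estimated 5% net undercount.
print(round(adjust_for_undercount(9_500, 0.05)))  # 10000
```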

For each study year, we computed age-standardized death rates (SDRs),19 directly standardizing our calculations with reference to the age distribution of the White population nationwide in 2000 according to gender.20 Using Greville's method,19–22 we calculated the likelihood that a 16-year-old resident of a given area would survive to the age of 65 years. We relied on standard life table methodology to calculate years of life lost between 16 and 65 years.19,23
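Direct standardization weights each area's age-specific death rates by the age shares of the standard population (here, as in the study, the 2000 White population nationwide of the same gender). A minimal sketch with hypothetical counts, not the study's data:

```python
def directly_standardized_rate(deaths, population, standard_population):
    """Deaths per 100 000, standardized to the standard age distribution.

    deaths, population, standard_population: parallel lists, one entry
    per age group (e.g., 16-24, 25-34, ..., 55-64).
    """
    total_standard = sum(standard_population)
    sdr = 0.0
    for d, p, s in zip(deaths, population, standard_population):
        age_specific_rate = d / p      # local death rate in this age group
        weight = s / total_standard    # standard population's age share
        sdr += age_specific_rate * weight
    return sdr * 100_000

# Hypothetical local area with steeply rising mortality by age.
local_deaths = [30, 60, 120, 250, 400]
local_pop = [12_000, 11_000, 10_000, 9_000, 7_000]
standard_pop = [35_000_000, 40_000_000, 45_000_000, 38_000_000, 28_000_000]

print(round(directly_standardized_rate(local_deaths, local_pop, standard_pop), 1))
```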

We computed 2 relative mortality measures. We calculated excess death rates (EDRs) as 100 000 times the difference between observed and expected deaths, divided by the study population, where expected deaths are the number that would have occurred had the study population experienced the age-specific death rates of the White population nationwide. To compute standardized mortality ratios (SMRs), we divided the number of observed deaths by the number of expected deaths. We estimated 95% confidence intervals for SMRs to provide a qualitative sense of the variability in estimates. We used the International Classification of Diseases (9th and 10th revisions) to disaggregate excess mortality rates according to the 6 underlying causes of death most influential in explaining existing health disparities in the United States24: circulatory disease, cancer, accidents, homicide, HIV/AIDS, and infections. We also included a residual category for other causes.
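Under the standard definitions (observed over expected deaths for the SMR; excess deaths per 100 000 study-population members for the EDR), the two relative measures can be sketched as follows, again with hypothetical counts rather than the study's data:

```python
def expected_deaths(reference_rates, population):
    """Deaths expected if `population` faced the reference age-specific rates."""
    return sum(r * p for r, p in zip(reference_rates, population))

def smr(observed, expected):
    """Standardized mortality ratio: observed / expected deaths."""
    return observed / expected

def edr(observed, expected, population_total):
    """Excess deaths per 100 000 study-population members."""
    return (observed - expected) / population_total * 100_000

# Hypothetical reference (White nationwide) age-specific rates and a
# local study population with its observed death count.
reference_rates = [0.001, 0.002, 0.004, 0.008, 0.015]
local_pop = [12_000, 11_000, 10_000, 9_000, 7_000]
observed = 860

exp = expected_deaths(reference_rates, local_pop)  # ~251 expected deaths
print(round(smr(observed, exp), 2), round(edr(observed, exp, sum(local_pop))))  # 3.43 1243
```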

RESULTS

Table 1 displays population characteristics for the years 1980, 1990, and 2000. In urban areas and in Appalachian Kentucky, Black or White residents made up the majority of the population (64%–99%, depending on area and year). Although Blacks accounted for less than half of the residents of rural study counties, segregation indices for Black Belt Alabama and Delta Louisiana were very high, indicating that our rural Black populations were concentrated in high-poverty segregated areas within counties.25 Also, the percentage of Black residents in Harlem declined over the course of the 3 decades, from 94% in 1980 to 76% in 2000. The White population in Central Cleveland declined from 91% in 1980 to 64% in 2000. Overall, the size of most urban study populations decreased over the study period, whereas rural population sizes remained relatively stable.

TABLE 1
Race-Specific Summary Data: United States and Selected High-Poverty Areas, 1980, 1990, and 2000

Mean family incomes were calculated in 2000 dollars for all years. Reflecting national distributions, the poor White populations were not as economically disadvantaged as were the poor Black populations in the study. However, the poverty rates for poor White populations in Central Cleveland and Appalachian Kentucky were 2 and 4 times the White rate nationwide, respectively. White residents of Appalachian Kentucky were poorer than were Blacks in the United States as a whole.

Between 1980 and 2000, US mean family incomes increased, and poverty rates declined among both Whites and Blacks, with larger improvements between 1990 and 2000 than between 1980 and 1990. In poor Black areas, mean family incomes increased and poverty rates decreased as well between 1980 and 2000, but in all areas other than Harlem this 20-year improvement masked substantial deteriorations between 1980 and 1990 followed by significant improvements between 1990 and 2000. Similar trends were evident in poor White areas.

Summary Mortality Measures in 2000

Table 2 provides summary measures of mortality in 2000. In absolute terms, standardized death rates were higher among Blacks nationwide and among Blacks and Whites in the local study areas than they were among Whites nationwide.

TABLE 2
All-Cause Mortality for Blacks and Whites: United States and in Selected High-Poverty Areas, 2000

In 2000, SMRs for urban Black men ranged from 2.25 to 3.32, and EDRs ranged from 487 to 904. The probability that a 16-year-old urban Black male would die before the age of 65 years ranged from 38% to 50%, compared with about 20% and 33%, respectively, among young White and Black males nationwide. Years of life lost were also higher among urban Black males than they were among Blacks or Whites nationwide, ranging from 5.3 in Harlem to 8.4 in Southside Chicago.

Mean incomes were lowest in Black rural areas (up to 43% lower than they were among Blacks nationwide), and poverty rates in these areas were comparable with those in Black urban areas (approaching twice the rate among Blacks nationwide). However, Black rural mortality rates were only modestly elevated relative to Black mortality nationwide and were lower than were mortality rates among urban Blacks.

The 2 poor White study populations fared better than did the poor Black populations, with respective SMRs of 1.59 and 1.75 and EDRs of 229 and 291 for White men. Whites residing in Appalachian Kentucky had a better mortality profile than did Whites residing in Central Cleveland, despite having almost twice the poverty rate and 39% lower mean incomes, and a better profile than Blacks nationwide despite being 37% poorer. Among women in the Black and White local populations, the geographic patterns of all-cause mortality and SMRs were similar to those for men in these populations, but magnitudes were smaller.

Changes in Mortality Profiles From 1980 to 2000

Nationwide. Figures 1 and 2 compare SDRs, EDRs, and SMRs over the study period for US Whites, US Blacks, and each local population. Calculations for the additional absolute measures (probability of dying and years of life lost between the ages of 16 and 65 years) were completed for each decade (data not shown) and generally followed similar patterns (with exceptions as noted).

FIGURE 1
Mortality measures for US Black and White men nationally and in selected high-poverty populations, by decade, 1980, 1990, and 2000.
FIGURE 2
Mortality measures for US Black and White women nationally and in selected high-poverty populations, by decade, 1980, 1990, and 2000.

In terms of absolute measures, Black and White men in the United States as a whole showed improvement across the decades studied. The SDR for US White men in 2000 (390) was only 8% smaller than it was in 1990 but a full 27% smaller than it was in 1980. Black men nationwide showed SDR declines each decade from 1980 to 2000. However, they showed improvements in years of life lost solely in the second decade, with no improvement between 1980 and 1990. Because White men exhibited greater improvements than did Black men in absolute death rates over the 20-year study period, the Black–White SMR was higher in 2000 than it was in 1980. The EDRs for Black men showed a net decrease of 14% between 1980 and 2000 after having peaked in 1990.

The SDR for White women nationwide (Figure 2) was lower in 2000 than it was in 1980 but showed no change relative to 1990. Between 1990 and 2000, there was virtually no change among White women nationwide with respect to probability of dying by the age of 65 years or years of life lost. The SDR for Black women nationwide decreased by only 5% between 1990 and 2000, compared with the 10% decrease from 1980 to 1990. From 1980 to 2000, EDRs for Black women nationwide improved by 10% but were virtually constant between 1980 and 1990. Because the SDR for White women improved more than that for Black women, the Black–White SMR was slightly greater in 2000 than it was in 1980, having peaked in 1990.

Local areas. Black populations in Southside Chicago and Harlem showed sizable deteriorations in SDRs between 1980 and 1990 followed by improvements from 1990 to 2000, albeit muted for women. In Eastside Detroit, there were incremental improvements in SDRs among men across both decades, whereas there was a small improvement among women between 1980 and 1990 that reversed direction and resulted in a slightly higher death rate in 2000 than in 1990. Women in Black urban areas saw little to no change in their probability of dying across the decades. Women in Harlem and Eastside Detroit saw a small net decrease in years of life lost between 1980 and 2000, whereas women in Southside Chicago witnessed a slight increase.

Although there were improvements in absolute measures of mortality among men and women in most populations between 1980 and 2000, the magnitudes of those improvements varied widely by area and gender, with implications for changes in relative measures. In Chicago and Detroit from 1980 to 2000, SDRs decreased by 20% and EDRs by 17% among men, compared with decreases of only 7% and 2%, respectively, among women. The largest SDR decreases were observed among Harlem men and women, but the percentage reductions for women were less than half of those for men. Between 1980 and 2000, SMRs stayed the same for Harlem women and showed a small decline for men; by contrast, these rates increased among men and women residing in Southside Chicago and Eastside Detroit.

Black rural areas showed limited reductions in all-cause mortality over the study period among both men and women. Among rural men, SDRs improved by 3% to 5%. These rates remained the same among women in Delta Louisiana, whereas they increased by 8% among women in Black Belt Alabama. Neither men nor women in either Black rural area saw any decrease in their probability of dying. Men saw a very slight decrease in years of life lost, whereas years of life lost among women remained the same in Black Belt Alabama and increased slightly in Delta Louisiana. Among both men and women in Black rural areas, EDRs and SMRs increased between 1980 and 2000 by 20% to 45%.

White Populations

Among men, absolute mortality measures improved in both poor White populations between 1980 and 2000, with relative measures improving in Cleveland but deteriorating in Appalachian Kentucky. Among women, absolute and relative mortality profiles improved in Cleveland but deteriorated in Appalachian Kentucky. For example, among women in Appalachian Kentucky, SDRs increased from 320 to 369, EDRs increased from 39 to 133, and SMRs increased from 1.14 to 1.56 between 1980 and 2000.

Causes of Mortality in 2000

In Table 3, we present standardized death rates (SDRs) for US Whites and excess death rates for Blacks nationwide and for all local populations disaggregated by cause of death for 1980 and 2000, along with percentages and directions of change between the 2 time periods. We focused on deaths caused by circulatory disease, cancer, homicide, and HIV/AIDS, which were important contributors to SDRs and EDRs. Decompositions of 2000 EDRs showed that circulatory disease was a primary contributor in every poor population. It was the leading cause of excess death in all groups other than Harlem men (for whom the leading cause was HIV/AIDS), accounting for roughly one third to one half of all excess deaths and often outpacing other causes by a wide margin.

TABLE 3
Excess Death Rates (EDRs) Among US Blacks and Selected High-Poverty Populations by Key Causes: 1980 and 2000

Cancers were another substantial contributor to elevated mortality rates in high-poverty populations. They accounted for 10% to 15% of excess deaths among urban Black and White men and 15% to 25% of excess deaths among rural Black men and urban and rural women. HIV/AIDS accounted for a substantial share of excess deaths in Harlem (29% and 24% among men and women, respectively) but contributed little in White or rural populations. In 2000, homicide was a less significant but still important underlying cause of excess deaths among Black urban men, accounting for 8.4% to 13.8% of excess deaths in this population.

Changes in Causes of Mortality, 1980 to 2000

In Table 4, we also report changes in SDRs among US Whites between 1980 and 2000, disaggregated by cause. About two thirds of the declining death rate among Whites nationwide was attributed to reductions in circulatory disease deaths, with an additional 17% (men) or 37% (women) resulting from reductions in cancer deaths. Rates of and changes in mortality due to homicide or HIV/AIDS were slight.

Excess deaths declined among Blacks nationwide as well as among Black and White urban populations. However, all rural populations experienced increases in EDRs. Among US Black men and men in every local population, declines in excess deaths caused by homicide were disproportionately large relative to those among US Whites. Conversely, declines in the proportion of excess deaths caused by circulatory diseases or cancer were disproportionately low, in some cases accounting for none of the decline and in others even contributing to increases in excess deaths.

Although Harlem men experienced the greatest overall reduction in EDRs, virtually none of it was a result of declines in circulatory disease deaths. Only men in Central Cleveland and Southside Chicago saw significant decreases in circulatory EDRs, and even these declines were low relative to those for circulatory EDRs among White men nationwide. Although cancer EDRs declined modestly in urban populations, they increased substantially in rural areas, accounting for 53% to 75% of total EDR increases in these areas.

Excess homicide deaths also fell nationwide for Black women, as they did for Black and White women in most of the local areas. Circulatory EDRs declined more for urban women than for urban men but proportionately less than did circulatory SDRs for White women nationwide. In 2 of the 3 rural areas—Black Belt Alabama and Appalachian Kentucky—excess circulatory disease deaths contributed to increases in women's excess deaths over the study period. Increasing proportions of cancer deaths contributed to increases in excess deaths among all rural area populations and among US Black and Harlem women while representing only a very small portion of the declines in Eastside Detroit and Cleveland. In Harlem and in all 3 rural areas, increases in women's excess cancer deaths more than offset decreases in excess circulatory disease deaths. On balance, the result was a relatively modest improvement in EDRs over the study period for Black women nationwide and Black women in urban areas, with rural women experiencing sizeable increases in EDRs.

By definition, there was an increase in HIV/AIDS deaths in 2000 relative to 1980 in every population, in that HIV/AIDS had not been identified in 1980. Despite the introduction of highly active antiretroviral therapy during the mid-1990s, only US White men and Harlem men and women witnessed a decline in HIV/AIDS deaths between 1990 and 2000 (data not shown).

The impact of HIV/AIDS on excess deaths between 1980 and 2000 varied widely over the local study areas, from virtually no effect in Appalachian Kentucky to a significant impact in Harlem. Although smaller than the increase in Harlem, excess HIV/AIDS deaths increased substantially over the study period among men and women in Southside Chicago and Eastside Detroit and among men in Cleveland. The contribution of HIV/AIDS to excess deaths in rural areas was small. It is noteworthy that the larger increases in excess HIV/AIDS deaths nationally and in urban areas occurred in a context where excess deaths showed a net decline, a testament to the magnitude of the decline in other causes, most notably homicide among urban men. The small contribution of excess HIV/AIDS deaths in areas with net EDR increases is a testament to the importance of increases in other causes of excess death, notably cancer and circulatory disease in rural populations.

DISCUSSION

A snapshot of the mortality experience of Black and White working-aged adults in the year 2000 shows inequalities by race, gender, poverty rate, and rural–urban region of the country. As in other years, death rates among Black residents of high-poverty urban areas remained alarmingly high. For example, 16-year-old Black males residing in urban locales in 2000 had only a 50% to 62% chance of surviving to the age of 65 years, whereas those residing in rural areas or in the country as a whole had only modestly better chances (62%–67%). By contrast, their White counterparts nationwide had an 80% probability of surviving to the age of 65 years.

Comparing the 3 census years examined suggests a number of noteworthy trends. First, Black urban residents of Harlem and Southside Chicago showed alarming increases in mortality between 1980 and 1990, whereas mortality among those in Eastside Detroit remained stable but high. With the exception of women in Eastside Detroit, urban Black populations exhibited mortality declines between 1990 and 2000, with larger decreases for men than for women and the largest decreases in Harlem. However, those net improvements that occurred among Blacks subsequent to 1980 did not keep pace with those for White men or White women nationwide. The resulting SMRs for Black populations in 2000 were roughly the same as or higher than those in 1980.

Second, dramatic declines in excess deaths caused by homicide among urban Black men contributed heavily to the reversal back to or below 1980 mortality levels between 1990 and 2000. These declines were not matched by similarly sized improvements in excess deaths caused by circulatory disease or cancer. Among urban Black women, far smaller declines in excess deaths reflected more modest decreases across multiple causes of death and increasing excess cancer deaths in some areas.

Third, a mortality advantage for high-poverty rural versus urban locales remained but was smaller in 2000 than in earlier decades. Between 1980 and 2000, there were only small improvements in SDRs among rural men, whereas rural women's absolute death rates deteriorated. EDRs and SMRs worsened among rural men and women over the study period. The deteriorations in EDRs were largely a result of increasing excess circulatory disease and cancer deaths, with declines in excess deaths caused by homicide too small to offset them.

Fourth, death rates were lower in White local populations than they were in their Black counterparts. However, similar to Blacks, White residents of these areas showed evidence of an urban–rural divide in absolute levels of mortality, with declining mortality in urban areas but increasing mortality in rural areas. Finally, gender disparities in the size and direction of trends were apparent at the national and local levels. Nationwide, White and Black men witnessed their absolute life chances improve across all of the decades studied. Although women's absolute death rates were lower than were men's, their improvements were not universal and were far more modest overall.

The lack of universal improvement in excess mortality occurred in a context of improved economic characteristics between 1980 and 2000 in all study areas. The smaller improvements in death rates seen among urban women relative to urban men occurred in a context in which the women were likely to experience greater gains in income and employment.9–11 Our data did not permit us to test possible explanations, but different trends by gender (as well as by urban–rural residence) suggest a more complex relationship between socioeconomic characteristics and mortality than often assumed.

Income gains among women may have been insufficient to offset other health threats, including those that may follow from increased employment. The low-wage labor market may expose employees to hazardous working conditions or result in their engaging in persistent high-effort coping with stressors or suffering sleep deprivation as they struggle to meet competing obligations.6,12 Meanwhile, the working poor are often without health insurance, may have difficulty finding time to attend to their own health needs,12 or may find health care providers have become less accessible.15,16

Over the study period, life expectancy at birth increased nationally among both Whites and Blacks, by 4 years for men and 2 years for women.26 That we do not see a commensurate improvement among working-age Black adults, nationally or in the poor study areas examined here, suggests that increases in Black life expectancy disproportionately represent improvements in the survival of infants and the elderly.

Our findings suggest that failure to adequately address excessive rates of chronic disease in high-poverty locales limited the extent to which progress was made in reducing excess mortality in vulnerable populations. Instead, declining homicide rates were key to many of the improvements observed. These declines affected men more than women and urban residents more than rural residents. Certainly, evidence of declining death rates among Black urban men—the group with the highest level of absolute mortality—is encouraging. However, the large contribution of dropping homicide rates is less so, in that homicide rates ceased their national decline in 2003 and may now have begun to rise in urban areas.27 Because rates of chronic disease associated with obesity (e.g., diabetes and hypertension) have begun to affect individuals at younger ages,13,28 the apparent failure to avert chronic disease deaths proportionately for high-poverty populations raises additional concern that excess mortality rates may be on the rise.

Limitations

An important study limitation is that we cannot know the extent to which mortality differences between areas or decades reflect disparities in disease prevalence or inequalities in health care that might prevent incident cases of disease from resulting in death. Either way, our findings document a substantial burden of chronic disease among working-age Black adults living in high-poverty areas and are suggestive of unmet health care needs. The technology for avoiding chronic disease mortality improved dramatically in the United States over the study period. If Blacks in some of the study populations felt the benefits of these gains, they apparently benefited less than Whites.

Changes in the size and economic characteristics of the urban study populations suggest shifting residential composition. Our data do not enable us to determine the size of any in- or out-migration from the study areas or the extent to which it may have been health-related. To the extent that improved economic characteristics between 1990 and 2000 reflect gentrification,11,29,30 such compositional changes might result in healthier local populations, raising the disturbing possibility that mortality improvements in some of the urban populations described here could be overstated.

In addition, incarceration of Black men rose dramatically over the study decades.31 In 2000, almost 15% of Black men aged 20 to 35 years nationwide were in prison.32 Individuals who, but for incarceration, would have been residents of our study areas in a census year (or, if deceased, at the time of death) would not be counted in either the local area census or the corresponding mortality statistics.33,34 To the extent that those incarcerated might well be at a heightened risk for dying relative to other residents of the local area they left for prison, their absence from the population would serve to overstate mortality improvements in that area.

The national trends reported here would not be affected by rising incarceration rates, because incarcerated individuals are included in national mortality and population counts. Changing patterns of incarceration cannot explain rising mortality rates between 1980 and 1990, nor are they likely to explain cross-area patterns during the 1990s. However, given that incarceration rates rose disproportionately among Black men in poor urban areas, this compositional shift in local population may have had a downward influence on our estimates of male mortality rates in Harlem, Eastside Detroit, and Southside Chicago, especially for 2000.

Conclusions

Nonelderly women and men perform critical social, economic, reproductive, and caretaking roles; improving their health in high-poverty populations would reap health advantages for residents of all ages and for their communities and society as a whole. The lack of consistent improvement in excess mortality among working-age Black residents of high-poverty areas since McCord and Freeman's original analysis necessitates reflection and concerted action. The intervening period was one in which the reduction of social disparities in health was an explicit and high-priority national objective,35 but making sustainable inroads toward that goal has been elusive for this critical age group.

Acknowledgments

We are indebted to the Eunice Kennedy Shriver National Institute of Child Health and Human Development (grant R21HD056307) for financial support, as well as to the Center for Advanced Study in the Behavioral Sciences at Stanford University for fellowships to Arline T. Geronimus and John Bound and the Robert Wood Johnson Foundation Health and Society Scholars program for a fellowship to Cynthia G. Colen. We are also grateful to Reynolds Farley for helpful comments, Domenico Parisi for providing segregation indices for rural counties, Lisa A. Neidert for assistance with census data, Timothy A. Waidmann for statistical consultation, N. E. Barr and Diane Laviolette for help with preparation of the article, and 3 anonymous reviewers for helpful comments.

Human Participant Protection

This study was approved by the institutional review board at the University of Michigan. All data were obtained from publicly available secondary sources without personal identifiers.

References

1. McCord C, Freeman HP. Excess mortality in Harlem. N Engl J Med. 1990;322(3):173–177.
2. Geronimus AT, Bound J, Waidmann TA, Hillemeier MM, Burns PB. Excess mortality among blacks and whites in the United States. N Engl J Med. 1996;335(21):1552–1558.
3. Geronimus AT, Bound J, Waidmann TA. Poverty, time, and place: variation in excess mortality across selected U.S. populations, 1980–1990. J Epidemiol Community Health. 1999;53(6):325–334.
4. McEwen BS. Protective and damaging effects of stress mediators. N Engl J Med. 1998;338(3):171–179.
5. Geronimus AT, Hicken M, Keene D, Bound J. “Weathering” and age patterns of allostatic load scores among blacks and whites in the United States. Am J Public Health. 2006;96(5):826–833.
6. London AS, Scott EK, Edin K, Hunter V. Welfare reform, work-family tradeoffs, and child well-being. Fam Relat. 2004;53(2):148–158.
7. Jargowsky PA, Yang R. The “underclass” revisited: a social problem in decline. J Urban Aff. 2006;28(1):55–70.
8. Travis J, Waul M. Reflections on the Crime Decline: Lessons for the Future? Washington, DC: Urban Institute; 2002.
9. Freeman R, Rodgers W. Area economic conditions and the labor market outcomes of young men in the 1990's expansion. In: Cherry R, Rodgers W, eds. Prosperity for All? The Economic Boom and African Americans. New York, NY: Russell Sage Foundation; 2000:50–87.
10. Holzer HJ, Offner P. The puzzle of black male unemployment. Public Interest. 2004;154:74–85.
11. Jargowsky PA. Stunning Progress, Hidden Problems: The Dramatic Decline of Concentrated Poverty in the 1990s. Washington, DC: Brookings Institution; 2003.
12. Burton LM, Whitfield KE. Weathering toward poorer health in later life: comorbidity in low income urban families. Public Policy Aging Rep. 2003;13(3):13–18.
13. Geronimus AT, Bound J, Keene D, Hicken M. Black-white differences in age trajectories of hypertension prevalence among adult women and men, 1999–2002. Ethn Dis. 2007;17(1):40–48.
14. Schlesinger M. Paying the price: medical care, minorities, and the newly competitive healthcare system. Milbank Q. 1987;65(suppl 2):270–296.
15. Fossett JW, Perloff JD, Peterson JA, Kletke PR. Medicaid in the inner city: the case of maternity care in Chicago. Milbank Q. 1990;68(1):111–141.
16. Barrett RE, Young IC, Weaver KE, et al. Neighborhood change and distant metastasis at diagnosis of breast cancer. Ann Epidemiol. 2008;18(1):43–47.
17. Rubin MS, Colen CG, Link BG. Examination of inequalities in HIV/AIDS mortality in the United States from a fundamental cause perspective. Am J Public Health. 2010;100(6):1053–1059.
18. Ghani AC, Donnelly CA, Anderson RM. Patterns of antiretroviral use in the United States of America: analysis of three observational databases. HIV Med. 2003;4(1):24–32.
19. Chiang CL. The Life Table and Its Applications. Malabar, FL: Robert E. Krieger Publishing; 1984.
20. Shryock HS, Siegel JS. The Methods and Materials of Demography. Washington, DC: US Government Printing Office; 1975.
21. Greville T. Short methods of constructing abridged life tables. Rec Am Inst Actuaries. 1943;32(1):29–43.
22. Siegel JS, Swanson DA, eds. The Methods and Materials of Demography. 2nd ed. New York, NY: Elsevier; 2004.
23. Smith DP. Formal Demography. New York, NY: Plenum Press; 1992.
24. Williams DR, Jackson PB. Social sources of racial disparities in health. Health Aff. 2005;24(2):325–334.
25. Lichter DT, Parisi D, Taquino MC, Beaulieu B. Race and the micro-scale concentration of poverty. Cambridge J Regions Econ Soc. 2008;1(1):51–67.
26. United States Life Tables, 2004. Hyattsville, MD: National Center for Health Statistics; 2007.
27. Fox JA, Zawitz MW. Homicide Trends in the United States. Washington, DC: Bureau of Justice Statistics; 2007.
28. Health, United States, 2008 With Chartbook. Hyattsville, MD: National Center for Health Statistics; 2009.
29. Freeman L. There Goes the Hood: Views of Gentrification From the Ground Up. Philadelphia, PA: Temple University Press; 2006.
30. Nyden P, Edlynn E, Davais J. The Differential Impact of Gentrification on Communities in Chicago. Chicago, IL: Loyola University, Chicago Center for Urban Research and Learning; 2006.
31. Western B. Punishment and Inequality in America. New York, NY: Russell Sage Foundation; 2006.
32. Charles KK, Luoh MC. Male incarceration, the marriage market, and female outcomes. Rev Econ Stat. 2010;92(3):614–627.
33. US Census Bureau. Group quarters enumeration: Census 2000 evaluation E.5, revision 1. Available at: http://www.census.gov/pred/www/rpts/E.5%20R.pdf. Accessed January 5, 2011.
34. Medical Examiners' and Coroners' Handbook on Death Registration and Fetal Death Reporting. Hyattsville, MD: National Center for Health Statistics; 2003.
35. Healthy People 2000: National Health Promotion and Disease Prevention Objectives. Washington, DC: US Dept of Health and Human Services; 1990.

Articles from American Journal of Public Health are provided here courtesy of American Public Health Association
