Iron deficiency and iron overload
Iron is an essential element with important functions such as oxygen transport, DNA synthesis and muscle metabolism (6, 7). Iron deficiency is the main cause of anaemia, the most prevalent nutritional deficiency worldwide, affecting 29% of non-pregnant women, 38% of pregnant women and 43% of children (8). On the other hand, iron overload (i.e. accumulation of iron in the body from any cause) is generally the result of disorders such as hereditary haemochromatosis, thalassaemias, repeated blood transfusions or other conditions that affect iron absorption or regulation, and can also have serious deleterious consequences for health if left untreated (9). Owing to its high reactivity, iron is always bound to proteins, which differ according to its role and location in the body. Iron circulates in the plasma bound to transferrin, is stored in cells as ferritin, and functions as part of haemoglobin or myoglobin molecules.
Iron deficiency
Iron deficiency exists when body iron stores are inadequate to meet metabolic needs. Progressive iron deficiency can result in iron-deficient erythropoiesis (formation of red blood cells) and, eventually, iron deficiency anaemia (10). However, even in the absence of anaemia, iron deficiency appears to be associated with clinical signs and symptoms including fatigue (11), impaired physical performance (12), decreased work productivity (13) and suboptimal brain development (14). Iron deficiency may result from physiological, environmental, pathological, drug-related, genetic or iron-restricted erythropoietic causes (15); broadly, it may be caused by inadequate iron intake, excess iron (i.e. blood) loss, or excess iron utilization. Inadequate iron intake may result from a diet that is poor in iron and/or that contains iron in a poorly bioavailable form. Iron may also fail to be absorbed in individuals with intestinal disorders such as coeliac disease, and perhaps Helicobacter pylori infection (16). Inflammation can also impair iron absorption, which may mediate iron deficiency in athletes (17). The most common cause of blood loss is menstruation, which is the primary reason that iron deficiency is more common in women. In low-income settings, other important causes include chronic blood loss from hookworm infection and schistosomiasis. In all settings, blood donation (18) and bleeding from intestinal lesions (19) must be considered. Iron requirements are increased during growth (especially in infants and children under 5 years of age) and adolescence, and during pregnancy, when additional iron is needed for maternal and fetal erythropoiesis and fetal growth.
When considering these factors, it is not surprising that iron deficiency and iron deficiency anaemia are most common in preschool-age children and women of reproductive age, and that, overall, iron deficiency is most common in low-income settings where dietary iron content and availability are low and parasitic infections are highly prevalent. It is estimated that approximately 33% of the world’s population has anaemia, with iron deficiency considered to be the leading cause, and that anaemia accounts for almost 9% of the global burden of years lived with disability (20). It has also been estimated that, worldwide, 273 million preschool-age children have anaemia (43% of all children, with 42% of cases responsive to iron); 32 million pregnant women have anaemia (38% of all pregnant women, 50% of cases responsive to iron); and 496 million non-pregnant women have anaemia (29% of all non-pregnant women, 50% of cases responsive to iron). Anaemia is most prevalent in central and west Africa and south Asia (8).
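To illustrate the scale implied by these figures, the short sketch below converts the reported case counts and prevalences into approximate group sizes and numbers of iron-responsive cases. It is a back-of-envelope check only; the inputs are the figures cited above (8), and the calculation itself is not part of the source estimates.

```python
# Back-of-envelope check of the anaemia figures cited above (millions of cases,
# prevalence in the group, and the stated share of cases responsive to iron).
groups = {
    "preschool-age children": {"cases_millions": 273, "prevalence": 0.43, "iron_responsive": 0.42},
    "pregnant women":         {"cases_millions": 32,  "prevalence": 0.38, "iron_responsive": 0.50},
    "non-pregnant women":     {"cases_millions": 496, "prevalence": 0.29, "iron_responsive": 0.50},
}

for name, g in groups.items():
    group_size = g["cases_millions"] / g["prevalence"]        # implied size of the group
    responsive = g["cases_millions"] * g["iron_responsive"]   # cases potentially responsive to iron
    print(f"{name}: ~{group_size:.0f} million in group, "
          f"~{responsive:.0f} million anaemia cases potentially responsive to iron")
```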
Iron overload
Because elemental iron is toxic to the body (due to its propensity to initiate redox reactions and generate free radicals, causing tissue damage), it must be chaperoned and stored in the body by binding proteins (i.e. transferrin, ferritin). There is no physiological mechanism to excrete iron; hence, homeostatic regulation of iron stores is mediated entirely by changes in iron absorption (mainly via modulation of the hepatic hormone hepcidin) (9). Iron overload results from excess iron absorption, generally caused by genetic conditions (hereditary haemochromatosis, caused chiefly by mutations in the HFE gene, but also less commonly by mutations in the genes that encode haemojuvelin [HJV], transferrin receptor 2 [TFR2] and ferroportin [SLC40A1]); conditions associated with ineffective erythropoiesis (for example, thalassaemia intermedia and haemoglobin E-beta thalassaemia); and iron accumulation from repeated red cell transfusions, usually given to treat inherited (e.g. thalassaemia and other congenital conditions) or acquired (e.g. aplastic anaemia, myelodysplasia) anaemias (21, 22).
Over time, iron overload results in excess iron accumulation in organs, especially the liver (resulting in cirrhosis, liver failure and hepatocellular carcinoma), endocrine organs (causing pituitary and gonadal failure), pancreas (causing diabetes), skin (causing pigmentation) and heart (resulting in cardiomyopathy, heart failure and arrhythmia). Untreated, most patients with severe iron overload succumb to cardiac or hepatic complications. Thus, early diagnosis and non-invasive tests for monitoring treatment are essential to the optimal management of iron overload (23). The global burden of iron overload is uncertain and varies by population (ethnicity, age and sex), as well as by the methodology used for estimation (screening with ferritin, transferrin saturation or genetic testing). The prevalence of hereditary haemochromatosis in Caucasian populations has been estimated at 3.5 to 4.5 per 1000 population, with clinical expression being more frequent in males than in females (24, 25). Annually and worldwide, approximately 21 000 children are born with haemoglobin E-beta thalassaemia (about half of whom are transfusion dependent) and approximately 23 000 are born with thalassaemia major; a further 14 000 are born with haemoglobin H disease. Thus, genetic conditions associated with a risk of iron overload are prevalent worldwide (26).
Assessment of iron status
Indicators of iron status in populations are important for determining the magnitude and distribution of impaired iron status, for deciding on intervention options, and for monitoring and evaluating the impact of implemented public health programmes. The intervention strategy may include one or more direct or indirect interventions affecting iron status, such as nutrition education or counselling; universal or targeted provision of iron supplements; point-of-use fortification of foods with micronutrient powders containing iron; fortification of staple foods or condiments with iron and other micronutrients; deworming; and water, sanitation and hygiene interventions.
The assessment of iron status is not straightforward, because different indicators reflect the status of different iron compartments in the body. For example, measurement of serum ferritin assesses storage iron (27–32), while measurements of serum iron and the percentage of transferrin saturation reflect the iron supply to tissues. Serum transferrin receptor (sTfR), erythrocyte ferritin and red cell zinc protoporphyrin are indicators of the iron supply to the bone marrow. The use of iron by the bone marrow can be assessed by the percentage of hypochromic red blood cells, mean corpuscular volume and reticulocyte haemoglobin content. Risk of iron overload is usually studied by liver biopsy or magnetic resonance imaging (MRI).
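The correspondence between indicators and compartments can be summarized as a simple lookup. The sketch below merely restates the mapping from the preceding paragraph; it is illustrative only and not an exhaustive or authoritative classification.

```python
# Illustrative mapping of the laboratory indicators named above to the iron
# compartment each one reflects (restating the preceding paragraph only).
IRON_STATUS_INDICATORS = {
    "serum ferritin":                    "storage iron",
    "serum iron":                        "iron supply to tissues",
    "transferrin saturation (%)":        "iron supply to tissues",
    "serum transferrin receptor (sTfR)": "iron supply to bone marrow",
    "erythrocyte ferritin":              "iron supply to bone marrow",
    "red cell zinc protoporphyrin":      "iron supply to bone marrow",
    "% hypochromic red blood cells":     "iron use by bone marrow",
    "mean corpuscular volume":           "iron use by bone marrow",
    "reticulocyte haemoglobin content":  "iron use by bone marrow",
    "liver biopsy or MRI":               "tissue iron (risk of overload)",
}

print(IRON_STATUS_INDICATORS["serum ferritin"])  # -> storage iron
```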
As these biomarkers are affected by other conditions such as age, sex, disease, smoking, infection and inflammation, it may be difficult to identify a unique indicator of iron status.
Ferritin
Ferritin is the primary iron-storage protein and is critical to iron homeostasis (33). The ferritin molecule is an intracellular hollow protein shell, composed of 24 subunits surrounding an iron core that may contain as many as 4000–4500 iron atoms. In the body, small amounts of ferritin are secreted into the blood circulation. In the absence of inflammation, the concentration of this plasma (or serum) ferritin is positively correlated with the size of the total body iron stores (10, 33, 34). A low serum ferritin concentration reflects depleted iron stores, but not necessarily the severity of the depletion as it progresses (34).
To determine its usefulness in detecting low iron reserves or iron deficiency in populations, the concentration of ferritin can be compared with the iron contained in the bone marrow (35). The absence of stainable iron in a bone marrow aspirate that contains spicules is diagnostic of iron deficiency. In some studies, however, bone marrow aspirates have failed to detect iron deficiency, suggesting methodological and interpretation limitations (36, 37). Although bone marrow is the appropriate tissue in which to assess iron deposits, aspirations and biopsies are invasive and costly procedures that are not free of methodological difficulties. For these reasons, they have been largely replaced by other determinations, such as ferritin, serum iron and total iron-binding capacity, for diagnosing iron deficiency (38).
At the other end of the spectrum, liver biopsies have commonly been used to detect iron overload, because the liver is the dominant iron-storage organ; liver iron concentration correlates closely with total iron balance; and the liver is the only organ in which the iron concentration is elevated in all forms of systemic iron overload (39). Non-invasive methods such as MRI have become established for the diagnosis and quantification of iron overload. Other methods, including superconducting quantum interference device (SQUID) biomagnetic susceptometry and computerized tomography (CT), are also used to assess the iron content of the liver. An advantage of MRI over the other methods is its low variability between measurements and its ability to detect iron loading in the liver, heart and endocrine tissues (40–42).
Ferritin cut-off values
Definitive identification of iron status for a patient requires invasive or expensive studies and is generally not available in public health or primary care settings. Iron deficiency is considered to exist when bone marrow iron staining is absent, but bone marrow aspiration cannot be routinely used in the population health context and is often unacceptable to patients in routine clinical practice. Iron overload can be assessed by histological assessment of accumulation of iron in tissues (usually the liver), or measurement of tissue iron content, either directly (i.e. by biochemical assessment of biopsied liver samples) or indirectly with MRI. These investigations can only be done in selected patients and the determination of iron status in patients or populations therefore relies on measurement of indices in peripheral blood (43).
Appropriate cut-off values for ferritin need to be characterized to define pathology (for both iron deficiency and iron overload). However, ferritin concentrations are also raised in inflammation with or without infection, liver disease, obesity, and some rare haematological conditions. Inflammation can distort interpretation of ferritin concentrations, obscure the diagnosis of iron deficiency and be misleading in the diagnosis of iron overload (44). Clinical or biochemical assessment for concomitant inflammation is therefore essential, but optimal adjustments of ferritin measures to account for inflammation remain uncertain (43).
The existing cut-off points for serum ferritin as a measure of iron stores have been summarized by WHO (34). Although widely implemented and cited, these cut-off values are based on qualitative expert opinion and not a systematic appraisal of the published literature (45), and have not been universally adopted. A review of the literature revealed use of a broad range of cut-off values and approaches for obtaining those values (44).
Selection of appropriate cut-off values for an index that yields continuous data, in order to indicate the dichotomous presence or absence of disease, necessitates trade-offs between sensitivity and specificity. The optimal cut-off point depends on the purpose of the test: for a screening test, or when the consequence of a false-positive result is not severe, a cut-off value that provides higher sensitivity is appropriate. Conversely, if a false-positive result puts the patient or population at risk, a cut-off value with higher specificity is preferable. Reporting the diagnostic properties of selected thresholds helps end-users interpret results appropriately (43).
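The trade-off can be made concrete by tabulating sensitivity and specificity across candidate cut-offs against a reference classification (for instance, bone marrow iron). The sketch below does this on simulated data; the distributions, prevalence and candidate cut-offs are all illustrative assumptions, not recommended values.

```python
# Illustrative sketch of the sensitivity/specificity trade-off as a ferritin
# cut-off is varied. `deficient` stands in for a reference classification
# (e.g. bone marrow iron); all values here are simulated, not real data.
import numpy as np

rng = np.random.default_rng(0)
n = 500
deficient = rng.random(n) < 0.30                                  # assumed 30% prevalence
ferritin = np.where(deficient,
                    rng.lognormal(mean=2.0, sigma=0.6, size=n),   # deficient: lower ferritin
                    rng.lognormal(mean=3.8, sigma=0.6, size=n))   # replete: higher ferritin

for cutoff in (10, 15, 20, 30):                                   # candidate cut-offs, in µg/L
    positive = ferritin < cutoff
    sensitivity = (positive & deficient).sum() / deficient.sum()
    specificity = (~positive & ~deficient).sum() / (~deficient).sum()
    print(f"cut-off {cutoff:>2} µg/L: sensitivity {sensitivity:.2f}, specificity {specificity:.2f}")
```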
Clinical pathways for iron deficiency and overload
Evaluation of iron status may be performed clinically, for individual patients, or across a population. Measurement of iron status in individuals is important to correctly define iron status and provide appropriate treatment for iron deficiency; to prompt further testing and management if iron overload is suspected; and to monitor interventions for both iron deficiency and iron overload. Measurement of iron status in populations is important to determine the prevalence and distribution of iron deficiency and risk of iron overload in the population, and thus to decide appropriate interventions, and to monitor and evaluate the impact and safety of implemented public health programmes (46).
A missed case of iron deficiency in infants and young children may have irreversible consequences, as this is a critical period for development. A false-negative test in adulthood may leave fatigue, lethargy and reduced exercise performance untreated; leave impaired pregnancy outcomes (anaemia, reduced birth weight, shortened gestation) in pregnant women unresolved; and leave potentially serious underlying causes of iron deficiency undetected. Conversely, a false-positive diagnosis of iron deficiency in a non-iron-deficient individual would cause unnecessary side-effects from iron therapy (e.g. constipation, abdominal pain, metallic taste), a potential increased risk of infection, and a risk of toxicity in overdose. When an individual with anaemia tests negative for iron deficiency, it is critical to search for other causes of anaemia and to consider masked iron deficiency, since ferritin may be elevated by infection or inflammation. In addition to inflammatory markers, the clinician should also consider other markers of iron metabolism and relevant clinical conditions. A clinical pathway for the investigation of iron deficiency is proposed in the figure below.
Clinical pathway for iron deficiency.
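As a rough companion to the figure, the pseudocode sketch below restates the investigation logic described in the preceding paragraph. It is a hedged illustration only: the cut-off value, function name and messages are hypothetical, and it does not reproduce the pathway shown in the figure.

```python
# Hedged sketch of the iron deficiency investigation logic described above.
# The cut-off and the return messages are illustrative, not WHO recommendations.
def investigate_iron_deficiency(ferritin_ug_l: float, anaemic: bool, inflamed: bool,
                                low_cutoff: float = 15.0) -> str:  # illustrative threshold
    if ferritin_ug_l < low_cutoff:
        return "low ferritin: iron deficiency likely; treat and seek the underlying cause"
    if anaemic and inflamed:
        return ("ferritin may be raised by inflammation: consider masked iron deficiency "
                "and other markers of iron metabolism")
    if anaemic:
        return "search for causes of anaemia other than iron deficiency"
    return "iron deficiency unlikely on ferritin alone"

print(investigate_iron_deficiency(ferritin_ug_l=60.0, anaemic=True, inflamed=True))
```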
For iron overload, a positive ferritin test (above a determined threshold) would trigger additional testing to assess differential diagnoses (with or without anaemia), with investigation of aetiology through history and genetic testing for blood disorders (e.g. the HFE gene, thalassaemia); haematological tests for haemoglobinopathies; and liver function testing. After confirmation of the diagnosis of iron overload, medical management of the underlying condition, including reduction of the iron overload (i.e. phlebotomy or iron chelation), would follow. A missed diagnosis of iron overload would be a lost opportunity to treat its complications (liver disease, cardiomyopathy, diabetes, pituitary failure, hypothyroidism, arthropathy), as well as a missed opportunity to detect affected family members with similar conditions. False-positive test results would entail unnecessary exposure to additional tests (e.g. genetic studies, MRI) and, potentially, unnecessary treatment with phlebotomy or chelators. The clinical pathway for iron overload based on ferritin is depicted in the figure below.
Clinical pathway for iron overload. Dx: diagnosis; HFE: hereditary haemochromatosis.
Ferritin assays
Interpretation of, and comparison between, studies undertaken in various laboratories at different times over recent decades may be complicated by variation and evolution in assay techniques and platforms, as well as by the limited use of WHO reference materials (47, 48).
The WHO Expert Committee on Biological Standardization has established international reference materials for developing tests and for evaluating inter-laboratory performance. The reference materials for ferritin are intended for calibrating the working/secondary standards used in routine laboratory assays, and for evaluating and standardizing new assays for ferritin quantification. At least three international reference materials have been developed: the first (liver), the second (spleen) and the third (recombinant) (49–51).
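To illustrate what calibration against a reference material involves in practice, the sketch below fits a standard curve from serial dilutions of a reference preparation and converts an assay signal back to a concentration. The numbers are hypothetical, and real immunoassays typically use more elaborate curve models (e.g. four-parameter logistic fits); this is a minimal illustration of the principle only.

```python
# Minimal sketch of calibrating a ferritin assay against a reference preparation:
# fit a standard curve from serial dilutions, then read unknown samples off it.
# All numbers are hypothetical; real assays usually use 4-parameter logistic fits.
import numpy as np

reference_conc = np.array([5.0, 15.0, 50.0, 150.0, 450.0])   # µg/L, dilutions of reference material
assay_signal   = np.array([0.08, 0.22, 0.71, 2.05, 6.10])    # hypothetical instrument response

# Fit a straight line on the log-log scale (response roughly proportional to concentration)
slope, intercept = np.polyfit(np.log(reference_conc), np.log(assay_signal), 1)

def signal_to_concentration(signal: float) -> float:
    """Convert an instrument signal into a ferritin concentration (µg/L) using the fitted curve."""
    return float(np.exp((np.log(signal) - intercept) / slope))

print(round(signal_to_concentration(1.0), 1))                 # estimate for an unknown sample
```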
Since ferritin concentration is widely used as a marker of iron stores and status, it is important to determine whether all commonly used methods can detect and discriminate the full range of iron status (deficiency, repletion and overload), and to assess the comparability of methods across measurement systems.
History of the project on the use of ferritin for the assessment of iron status
Previous WHO documents on the use of ferritin for the assessment of iron status in populations arose from consultations held in 1987 and 1993 (1, 2, 51).
In 2004, a joint WHO/CDC technical consultation was held on assessment of the iron status of populations (1) and an analysis of indicators of iron status and acute phase proteins was undertaken. In preparation for this consultation, a non-systematic review sought to identify the most efficient indicators to evaluate the impact of interventions to control iron deficiency and detect a true change in iron status of a population, using the fewest and simplest tests (52). Several iron indicators were reviewed to assess their ability to measure change in iron status due to an iron intervention. Based on the data analysis and the consultation, participants concluded that the concentration of haemoglobin should be measured for the assessment of iron status, even though not all anaemia is caused by iron deficiency, and that the assessment of serum ferritin and soluble transferrin receptor would be the best approach for measuring the iron status of populations. In evaluating the impact of interventions to control iron deficiency in populations, it was recommended to use serum ferritin as the indicator of a response to an intervention to control iron deficiency and to measure it along with the haemoglobin concentration in all programme evaluations. Additionally, the consultation concluded that if funding was available, it may also be useful to measure the concentration of one or both of the acute phase proteins, C-reactive protein (CRP) or α-1 acid glycoprotein (AGP), to account for a high serum ferritin caused by inflammation, as well as measuring transferrin receptor during repeated surveys.
Meetings on the ferritin project
WHO guideline development group meeting: Use of ferritin concentrations to assess iron status in populations, Panama City, Panama 15–17 September 2010
WHO convened the first guideline development group meeting in Panama City, Panama in 2010, on priorities in the assessment of vitamin A and iron status in populations (53), to discuss and initiate the work of updating WHO guidelines on indicators for the assessment of vitamin A and iron status. With regard to the assessment of iron status, serum ferritin and soluble transferrin receptor were ranked of highest priority for undergoing a thorough review.
Starting in 2013, WHO developed a project to retrieve, summarize and assess the evidence to inform WHO recommendations on the use and interpretation of serum/plasma ferritin concentrations for assessing iron status in populations. Five protocols were initially developed to address specific topics, through systematic reviews of published data and analysis of raw data from international databases. These protocols are described below.
Serum/plasma ferritin for assessing iron status in populations. This protocol examined whether ferritin concentration reflects all possible iron statuses (deficiency, repletion and overload) and what cut-off points define each status. It examined the association of serum/plasma ferritin with other measures of iron stores, as indicated by bone marrow aspirates, haemoglobin, blood smears and liver biopsies.
How ferritin concentration responds to nutrition interventions. The aim was to summarize systematic reviews that assess the effects of nutrition-specific and nutrition-sensitive interventions on ferritin concentrations, particularly in children aged 6–59 months, school-age children and pregnant and non-pregnant women of reproductive age.
The accuracy and comparability of methods for measuring ferritin concentration. This protocol reviewed the different laboratory methods for assessing ferritin concentration and aimed to estimate between-method adjustments where necessary, as illustrated in the sketch below.
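One simple way to derive such a between-method adjustment is to regress paired measurements of the same samples made on two platforms and use the fitted relationship to re-express one method's results on the other's scale. The sketch below is a minimal illustration with hypothetical data; it is not the approach used in the protocol.

```python
# Hedged sketch of a between-method adjustment for ferritin: regress paired
# measurements of the same samples on two platforms, then re-express method B
# results on method A's scale. The paired values are hypothetical.
import numpy as np

method_a = np.array([8.0, 14.0, 22.0, 35.0, 60.0, 110.0, 210.0])   # µg/L on platform A
method_b = np.array([6.5, 12.0, 19.5, 30.0, 52.0, 98.0, 185.0])    # same samples on platform B

slope, intercept = np.polyfit(method_b, method_a, 1)                # simple linear recalibration

def adjust_to_method_a(value_b: float) -> float:
    """Re-express a platform B ferritin value on platform A's scale."""
    return slope * value_b + intercept

print(round(adjust_to_method_a(40.0), 1))
```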
The influence of inflammation on ferritin concentrations: methodological approaches to adjust values in population-based surveys. This protocol aimed to describe and compare different approaches to accounting for inflammation, using the measurement of acute phase proteins. The reviews aimed to identify a single “best” approach that accurately describes the prevalence of low ferritin, is easy to use and can be applied in most populations; two of the candidate approaches are sketched below.
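Two approaches commonly discussed for handling inflammation are sketched here: excluding individuals with raised acute phase proteins, and applying a correction factor to ferritin in inflamed individuals. The thresholds, correction factor and data are illustrative placeholders, not the values evaluated in the protocol.

```python
# Hedged sketch of two ways to account for inflammation when estimating the
# prevalence of low ferritin in a survey: (1) exclude individuals with raised
# acute phase proteins, or (2) apply a correction factor to their ferritin.
# All thresholds, factors and data below are illustrative only.
import numpy as np

ferritin = np.array([9.0, 14.0, 28.0, 45.0, 80.0, 12.0])   # µg/L
crp      = np.array([1.0, 8.0, 2.0, 12.0, 1.5, 6.0])       # C-reactive protein, mg/L

CRP_THRESHOLD     = 5.0    # illustrative cut-off for "inflamed"
LOW_FERRITIN      = 15.0   # illustrative cut-off for depleted iron stores
CORRECTION_FACTOR = 0.65   # illustrative downward adjustment for inflamed individuals

inflamed = crp > CRP_THRESHOLD

# Approach 1: exclude inflamed individuals from the estimate
prevalence_exclusion = (ferritin[~inflamed] < LOW_FERRITIN).mean()

# Approach 2: adjust ferritin in inflamed individuals, then estimate on everyone
adjusted = np.where(inflamed, ferritin * CORRECTION_FACTOR, ferritin)
prevalence_corrected = (adjusted < LOW_FERRITIN).mean()

print(f"low-ferritin prevalence: exclusion {prevalence_exclusion:.2f}, "
      f"correction factor {prevalence_corrected:.2f}")
```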
The use of ferritin concentrations in defining levels of public health concern with respect to the iron status of populations. The objective was to describe the magnitude and distribution of iron deficiency as measured by ferritin concentration in different regions of the world, to inform guidelines and to define ranges, severities and proportions of iron deficiency at national or regional levels.
WHO/CDC technical consultation: Ferritin concentrations to assess iron status in populations, Emory University, Atlanta, United States of America (USA) 3–5 March 2014
WHO, in collaboration with CDC, convened a technical consultation to validate the methodological approach to retrieve, summarize and assess the evidence to inform future guidelines on the use of ferritin as an indicator of iron status in populations, including cut-off points to define iron deficiency, repletion and overload in different population groups. Experts in the areas of ferritin methodology, immunology, nutrition and programme implementation met with authors of the meta-analyses, to discuss the draft review protocols and any preliminary results.
Meeting: Review of evidence to inform WHO/CDC guidelines on the use of ferritin concentrations to assess iron status in populations, Geneva, Switzerland 4–5 December 2014
WHO, in collaboration with CDC, convened a meeting with the authors of the reviews and invited guests to discuss the outcomes of the reviews and draft recommendations in preparation for a meeting of the WHO guideline development group. The objectives were to describe the WHO guideline development process; present results of each of the reviews for informing the WHO/CDC guideline on the use of ferritin concentrations to assess iron status in populations; and discuss results and develop a draft guideline for consideration by the WHO guideline development group.
Meeting: Review of evidence to inform WHO/CDC recommendations on the use of ferritin concentrations to assess iron status in populations, Bethesda, USA 6–8 May 2015
This meeting was held in collaboration with CDC and the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD). The objectives were to present preliminary results of each of the five protocols, retrieving evidence for informing the WHO/CDC recommendations on the use of ferritin concentrations to assess iron status in populations, and to discuss the results and main conclusions of the findings (54).
Meeting: Review of evidence to inform WHO recommendations on the use of ferritin concentrations to assess iron status in populations, Geneva, Switzerland 3–4 March 2016
WHO, in collaboration with CDC, convened a review group meeting to present the final results of each of the reviews informing the WHO recommendations on the use of ferritin concentrations to assess iron status in populations; to discuss the results and main conclusions of the findings; and to draft recommendations to put forward to the WHO guideline development group (55).
WHO guideline development group meeting: Use of ferritin concentrations to assess iron status in populations, Geneva, Switzerland 15–17 June 2016
The WHO Department of Nutrition and Food Safety established the WHO guideline development group – ferritin, to present and discuss evidence to finalize a recommendation on the use of ferritin concentrations to assess iron status in populations; determine the strength of the recommendations, considering costs, values and preferences; define implications for further research; and discuss challenges for implementation of the guideline (56).
Why is it important for WHO to develop this guideline?
The WHO 13th General Programme of Work (GPW13) 2019–2023 (57) focuses on delivering impact for people at the country level in all countries – low-, middle- and high-income – and is based on the United Nations Sustainable Development Goals (SDGs) (58). The three strategic priorities set out in GPW13, referred to as the triple billion goals, are achieving universal health coverage, addressing health emergencies and promoting healthier populations. Nutrition, as a cross-cutting area in the health and development sectors, is an integral part of these goals (57, 59). Accurate determination of iron status is crucial for diagnostic and screening purposes in the clinical setting, and for guiding public health interventions at the population level. In an individual patient, diagnosis of iron deficiency or overload will help guide management, including further investigations and appropriate therapy. At the population level, determination of the magnitude and distribution of iron deficiency can help prioritize appropriate interventions in settings in which the prevalence is regarded as a severe public health problem, or help identify populations with hereditary conditions that predispose them to iron overload.
In support of WHO GPW13, the 2030 SDG agenda, particularly SDG2 and SDG3, and in concert with the United Nations Decade of Action on Nutrition (2016–2025) (60), WHO’s Ambition and action in nutrition 2016–2025 (61) aims for “A world free from all forms of malnutrition where all people achieve health and well-being”. It defines the unique value of WHO for advancing nutrition – the provision of leadership, guidance and monitoring – and proposes a theory of change. In line with these three core functions, WHO’s work in nutrition will build on the outcomes of the efforts to accelerate progress towards achieving the Global Nutrition Targets (60, 61), as well as WHO’s triple billion goals, in the key areas of guidance, policy, surveillance and engagement for achieving universal health coverage, addressing health emergencies and promoting healthier populations (57, 59).
This guideline is in line with GPW13, particularly outcomes 1.1: Improved access to essential nutrition actions as part of quality essential health services; 3.1: Determinants of health addressed; 3.2: Risk factors reduced through multi-sectoral action; and 4.1: Strengthened country capacity in data and innovation (57).