Br J Nutr. Author manuscript; available in PMC Dec 21, 2012. PMCID: PMC3398148

A re-analysis of the iron content of plant based foods in the United Kingdom


In the UK, contemporary estimates of dietary iron intake rely upon food iron content data from the 1980s or earlier. Moreover, there has been speculation that the natural iron content of foods has fallen over time, predominantly owing to changes in agricultural practices. We therefore re-analysed common plant based foods of the UK diet for their iron content (the “2000s analyses”) and compared the values with the most recent published values (the “1980s analyses”) and the much older published values (the “1930s analyses”), the latter two being from different editions of the McCance and Widdowson food tables. Overall, there was remarkable consistency between analytical data for foods spanning the 70 years. There was a marginal, but significant, apparent decrease in natural food iron content from the 1930s to the 1980s/2000s. Whether this represents a true difference or analytical error between the eras is unclear, and how it might translate into differences in intake requires clarification. However, fortificant iron levels (and fortificant iron intake, based upon linked national data) did appear to have increased between the 1980s and 2000s, and this deserves further attention in light of recent concerns over the long term safety and effectiveness of fortificant iron. In conclusion, the overall iron content of plant based foods is largely consistent between the 1930s and 2000s, with a fall in natural dietary iron content negated, or even surpassed, by a rise in fortificant iron, the long term effects of which are uncertain.

Keywords: dietary iron, non-haem iron, food composition, food analysis, fortification


Reliable information on the nutrient composition of foods, such as their iron content, is essential to meet the needs of a wide variety of groups, including nutritionists, government agencies, health and agriculture professionals, policy makers and planners, food producers, retailers, and consumers. Tables from McCance and Widdowson’s ‘The Composition of Foods’, the UK Government’s official source of food composition, provide the most recent available data on the iron content of common foods in the UK, yet much of the nutrient analysis was carried out in the 1980s or earlier (1).

Several studies based upon the comparison of data from different food composition tables have suggested that significant changes in the mineral content of food have occurred over time, with iron contents declining in recent years in both the UK and USA (2-5). However, these studies have had to assume that analytical data from different eras are similarly accurate, while statistical comparisons have been basic (e.g. not allowing for multiple testing). Nonetheless, there are several lines of evidence to suggest that the nutrient content of certain foods could have altered in recent years. Mostly, such reports hypothesised that a decline in food nutrient content has occurred due to changes in agricultural practices, particularly depletion of available soil minerals (3).

Indeed, over the last century, food production has undergone a revolution, with changes in farming practice at the forefront. First, prior to World War II agricultural chemicals were rarely used, yet in modern agriculture traditional organic fertilisers, such as manure, have been largely replaced by chemical fertilisers (6). Secondly, some foods, such as tomatoes, are now frequently grown hydroponically in nutrient solutions (7). Thirdly, evidence is emerging that recent higher atmospheric CO2 levels may affect plant nutrient content, at least in wheat and brown rice, by increasing the proportion of carbohydrate and thus leading to a relative reduction in the content of other nutrients such as iron (8). Fourthly, Fan et al. provide robust evidence, from contemporary analysis of archived wheat grain and soil samples taken from the Broadbalk Wheat Experiment established in 1843 at Rothamsted (UK), that there has been a significant decrease in the content of iron, and other minerals, in wheat over 160 years. However, they did not attribute this to a change in fertiliser usage, or to a decrease in soil mineral content; rather, they suggested that the introduction of high-yielding semi-dwarf cultivars in the mid to late 1960s had a significant effect on the mineral content of wheat, including iron (9). Whether this is true for other plant based foods, and whether the modern practice of iron fortification and restoration negates or even exceeds such declines, is not clear. There is no repository that allows direct comparison, by contemporary analysis, of the nutrient content of similar foods from different eras. However, one can at least address whether the values given for the iron content of plant based foods in the current UK version of ‘The Composition of Foods’ are matched by data from re-analysis of the same foods more than 20 years later.
Here we report on the analysis of total iron content in 146 commonly ingested plant based foods and make a more thorough comparison of the iron content of foods reported in the different versions of the UK ‘The Composition of Foods’ since the 1930s.


Food samples

The foods analysed were purchased and carefully sampled in 2001-2 as part of an investigation into the silicon content of foods in the UK (10). Briefly, foods were purchased from three different shops or supermarkets in South-East England, with the food varieties selected on the basis that they provided a fair representation of those commonly consumed by the British population. An equal amount was taken from each of the three samples of a given food and combined into a composite sample. Representative samples were taken; in the case of lettuce, for example, both outer and inner leaves were used. Where appropriate, foods were prepared as in common domestic practice; for example, wholemeal pasta was boiled in water for 10 minutes. Composite samples were homogenised in a blender, placed in a polypropylene tube and frozen, ready for peroxide/acid digestion prior to analysis.

Composite samples were digested using an acid-assisted microwave digestion system (Milestone Analytical UK Ltd, Northwich, UK) as described previously (10). Briefly, 0.25 g of each sample was placed in a microwave vessel with 10 ml 65% (v/v) nitric acid (p.a. plus (≤ 0.1 mg/kg iron), Fluka, Aldrich-Chemical Co, Gillingham, UK) and 1 ml of 30% (v/v) hydrogen peroxide (AristaR, Merck Ltd, Nottingham, UK); the vessel was then sealed and heated at 200°C for 15 min. To allow for possible iron contamination from the environment, a baseline blank sample was similarly prepared with each digestion run (i.e. one blank for every nine samples); these vessels contained only the nitric acid and hydrogen peroxide. Digested samples and blanks were diluted with approximately 33 ml ultra-high purity water (18 MΩ/cm; Elga Ltd, High Wycombe, UK) and total final volumes, assessed accurately by weight, were recorded before analysis.

Iron content

Total iron content of composite samples and blanks was determined using an inductively coupled plasma-optical emission spectrometer (ICP-OES; Jobin-Yvon JY24 Instruments SA, Longjumeau, France) with a V-groove nebuliser and a Scott-type double-pass spray chamber, at 259.940 nm. Iron concentration was determined by comparison with standards prepared from a stock ICP standard solution (1000 mg/l; Spectrol/Merck, Feltham, UK) in a matrix-matched diluent (i.e. containing nitric acid and hydrogen peroxide). Blanks were randomly distributed throughout every sample batch. Iron levels in the samples are reported where the value for the sample (i.e. ‘sample minus mean batch blank’) exceeded the highest blank value of the batch. The mean ± standard deviation iron content of all blanks was 5.4 ± 1.8 μg/l. The minimally detected content of iron in food was 0.12 mg iron per 100g food, corresponding to a blank-subtracted iron content of 6.82 μg/l in the analysed sample (i.e. post digestion and dilution). A certified reference material (Seronorm™ Trace Elements, lot NO0371; Alere Ltd, Stockport, UK) was processed as described above for the food samples and found by analysis to contain 1.87 ± 0.02 mg iron/kg, consistent with its certified level of 1.95 mg iron/kg (range 1.71-2.18 mg iron/kg).
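As a consistency check, the reported detection limit can be back-calculated from the digestion conditions described above. The sketch below assumes the nominal final volume of approximately 44 ml (10 ml acid + 1 ml peroxide + ~33 ml water) and the 0.25 g sample mass; the variable names are ours, not the study's.

```python
# Back-calculation of the reported detection limit, under the assumption
# that the final digest volume is the nominal ~44 ml described in Methods.
sample_mass_g = 0.25       # mass of food digested
final_volume_l = 0.044     # ~10 ml acid + 1 ml peroxide + 33 ml water

# Blank-subtracted detection threshold in the analysed solution (ug/l)
solution_limit_ug_per_l = 6.82

# Convert solution concentration back to iron content per 100 g of food
iron_ug = solution_limit_ug_per_l * final_volume_l        # ug Fe in the digest
limit_mg_per_100g = iron_ug / 1000 / sample_mass_g * 100  # mg Fe per 100 g food

print(round(limit_mg_per_100g, 2))  # ~0.12, matching the reported limit
```

The two reported figures (0.12 mg/100g food and 6.82 μg/l in solution) are thus mutually consistent given the stated digestion volumes.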

Comparisons with UK literature values

Results are expressed in terms of iron content per 100g, while average portion sizes (11, 12) are indicated for each food. Data are compared with those from the 1980s and 1930s which, respectively, are published values from the 6th edition of McCance and Widdowson’s ‘The Composition of Foods’ (1) and McCance and Widdowson’s ‘The Chemical Composition of Foods’ (13). Using the Bland-Altman method, any bias was determined by comparing the mean of the differences to zero, with its associated 95% confidence interval (14).
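For illustration, the Bland-Altman bias test used here reduces to asking whether the mean of the paired differences departs from zero, judged by its 95% confidence interval. A minimal sketch, using hypothetical iron values rather than study data:

```python
# Bland-Altman style bias test: for paired iron values from two eras,
# test whether the mean difference's 95% CI excludes zero.
# All values below are illustrative, not data from the study.
import math

era_a = [0.9, 1.2, 0.5, 2.1, 0.8]   # mg Fe/100g, hypothetical 1930s values
era_b = [0.8, 1.0, 0.6, 1.9, 0.7]   # mg Fe/100g, hypothetical 2000s values

diffs = [a - b for a, b in zip(era_a, era_b)]
n = len(diffs)
mean_diff = sum(diffs) / n
sd = math.sqrt(sum((d - mean_diff) ** 2 for d in diffs) / (n - 1))
sem = sd / math.sqrt(n)

t_crit = 2.776  # two-sided 95% t-value for n - 1 = 4 degrees of freedom
ci = (mean_diff - t_crit * sem, mean_diff + t_crit * sem)

# Bias is 'significant' only if the 95% CI excludes zero
print(mean_diff, ci)
```

With these toy numbers the interval straddles zero, so no bias would be declared; in the study the same calculation is applied to each food-group comparison in Table 2.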

We recognised that some foods, especially breakfast cereals, are fortified with iron and that manufacturers’ fortification practices change frequently. Thus, analytical differences between the eras could represent (i) different fortification practices, (ii) genuine endogenous differences in mineral content or (iii) analytical error. To help control for the latter two possibilities, we analysed another, non-fortificant mineral, namely magnesium, in four common breakfast cereals. ICP-OES was used as described above for iron, on the same digested composite samples and blanks; the wavelengths for magnesium analysis were 279.553 and 279.079 nm.

Comparison of iron intake in the UK adult population

The impact that our newly measured values may have on the mean daily iron intake of the UK adult population was estimated using data from the National Diet and Nutrition Survey (NDNS) conducted in 2000-1 (15, 16). Using intake data from adults aged 19-64 years, we replaced the iron content established in the 1980s (1) with our new data, also from 2001, and reported here for 128 plant based foods (corresponding to 203 food codes). We did not undertake this for all subjects of the NDNS dataset, but, rather, for a subgroup (n=497) where we had a full set of matching data for breakfast cereal consumption (10 breakfast cereals corresponding to 17 food codes) for the reasons explained below in the Results section. The means were compared using the mean of the paired difference with 95% confidence intervals.


We analysed 146 plant based foods, 128 of which had detectable iron, i.e. an iron content of at least 0.12 mg iron per 100g food (Table 1). Historical values are also given in Table 1, and difference plots (14), demonstrating variation between the eras, are shown in Figures 1-4.

Figure 1
Comparison of the iron content (mg iron/100g food) of the same fruits between different decades: the analysis presented here (2000s), the latest published values (1980s) (1) and the earliest published values (1930s) (13). Comparisons were made using the ...
Figure 2
Comparison of the iron content (mg iron/100g food) of the vegetables between the different decades: the analysis presented here (2000s), the latest published values (1980s) (1) and the earliest published values (1930s) (13). Comparisons were made using ...
Figure 3
Comparison of the iron content (mg iron/100g food)a of the cereal products between the analysis presented here (2000s) and the latest published values (1980s)(1). Comparison was made using the Bland-Altman method (14). Briefly, for each food, the difference ...
Figure 4
Percentage changes in the iron content (mg iron/100g food)a of breakfast cereals between our analysis (2000s) and the latest published values (1980s)(1).
Table 1
The iron contenta of selected UK foods in 2001-2b, analysed by inductively coupled plasma – optical emission spectrometry (ICP-OES), and published values

To determine whether significant differences existed between the reported iron contents of major food groups for each of the three decades, we calculated means of differences and 95% CI where matching analytical data were available. There were too few 1930s data for cereals and cereal products, or for legumes, to allow widespread comparisons, so in Table 2 we compared data on (a) fruit for the three decades, (b) vegetables including legumes and pulses for the last two decades, (c) vegetables alone for the comparison with the 1930s and (d) cereals and cereal products for the last two decades. Overall, data from the 1930s were marginally but significantly higher than data from the later years, although not quite significantly so for vegetables between the 2000s and 1930s. Between the 1980s and 2000s there was a tendency for analytical values to be higher in the later decade, although this was only significant, and markedly so, for cereals and cereal products (Table 2). This may be explained by changes in fortification practices, because magnesium values generally matched between the 1980s and 2000s for the four cereals analysed (all as mg/100g prepared food, 1980s versus 2000s): 240 versus 220 for high-fibre wheat bran cereal; 10 versus 13 for cornflakes; 40 versus 49 for toasted and crisped rice cereal; and 120 versus 91 for shredded wholegrain wheat cereal biscuits.

Table 2
Statistical means of the differences in the iron content (mg iron/100g food)a of the plant based food groups between the decadesb (number of foods in the group, means, standard deviation and 95% confidence intervals)

Finally, we questioned whether these differences in cereal iron content between the 1980s and 2000s would impact upon reported dietary iron intakes in the population. As shown in Figure 3, fortified breakfast cereals had the highest iron content, as well as the greatest variation in iron content between the 1980s and 2000s. Using the NDNS data, we compared iron intakes in those adults who ingested breakfast cereals for which we had a full set of matching analytical values for the two decades (i.e. 100% of their breakfast cereal consumption was of breakfast cereals for which there were matching analytical data between the 1980s and 2000s). These were 10 different breakfast cereals (Figure 4), and 497 adult individuals met this criterion. To assess iron intakes from the 1980s values, the latest version of McCance and Widdowson (1) was used, as shown in Table 1. To assess iron intakes from the 2000s values, we again based the analysis on the latest version of McCance and Widdowson but substituted in all new iron content values from the 2000s analyses where they were available. This was for 128 foods (Table 1), translating to 203 food codes, with, as noted above, 10 of these being breakfast cereals (17 food codes). For the 497 adult individuals described above, mean ± SD daily iron intake, based upon 7-day weighed dietary records, was 12.2 ± 4.2 (95% CI 11.9, 12.6) mg iron per day using the 1980s values for the iron content of foods. Using the 2000s values, however, mean intake was 11.5% higher, at 13.6 ± 7.2 (95% CI 13.0, 14.2) mg iron per day. The mean of the paired differences was 1.4 (95% CI 1.0, 1.8) mg iron per day. This sub-sample of 497 adults had an iron intake marginally higher than that of the entire sampled NDNS population (n=1724): 12.2 (95% CI 11.9, 12.6) versus 11.4 (95% CI 11.2, 11.6) mg iron per day respectively, presumably because they were selected as “breakfast cereal consumers”.
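The substitution procedure above can be sketched as follows: each subject's intake is recomputed with the 2000s iron values replacing the 1980s table values wherever a re-analysed value exists, falling back to the 1980s value otherwise. The food codes and quantities below are hypothetical, not NDNS data.

```python
# Sketch of the intake re-estimation by food-code substitution.
# All codes and values are hypothetical illustrations.
iron_1980s = {"FC001": 6.7, "FC002": 1.6, "FC003": 0.5}   # mg Fe/100g, table values
iron_2000s = {"FC001": 11.8, "FC003": 0.4}                # re-analysed subset only

# One subject's weighed diary: (food code, grams consumed per day)
diary = [("FC001", 40), ("FC002", 150), ("FC003", 80)]

def total_iron(diary, table):
    """Daily iron intake (mg) from a diary and a composition table."""
    return sum(grams / 100 * table[code] for code, grams in diary)

old = total_iron(diary, iron_1980s)

# Substitute 2000s values where available, keeping 1980s values otherwise
merged = {**iron_1980s, **iron_2000s}
new = total_iron(diary, merged)

paired_diff = new - old  # per-subject difference, averaged across subjects in the study
```

In the study, the mean of these per-subject paired differences (with its 95% CI) was then compared to zero, as for the food-group comparisons.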


Considering that the analytical data presented here span 70 years, are from different samples, used markedly different analytical techniques and come from different laboratories, the similarities in the reported results are remarkable. Whether the marginally higher values of the 1930s reflect a genuine difference or a constant, small analytical bias is not clear. Reasonable arguments could be made either way: separating interferences from the true analytical signal would have been challenging in the 1930s. Conversely, Fan et al. (9) have reported, from contemporary analysis of stored wheat samples, a significant decrease in the iron content of wheat since the 1960s, as discussed above (see Introduction), which may perhaps be true for other plant based foods. However, whatever the reason(s), the differences are small (e.g. 0.14 mg iron/100g for fruits overall and 0.09 mg iron/100g for vegetables overall for the 1930s versus 2000s, in the context of mean iron contents of 0.79 and 0.76 mg iron/100g for those food groups respectively). These differences are unlikely to translate into marked differences in total daily iron intake from these sources and will be more than compensated for by the use of fortificant iron in more recent years. In fact, between the 1980s and 2000s it is noteworthy how iron fortification of breakfast cereals appears generally to have increased (Figure 4).

Thus, on closer examination of food iron content across different years or eras, there is, as yet, no real suggestion that, quantitatively, dietary iron exposure is falling over time, as some authors have suggested (2-5). However, firstly, in our analysis we were unable to compare the iron content of cereals and cereal products between all three eras, and this food group is the major source of dietary non-haem iron. Secondly, there may be reasons to question whether, qualitatively, current dietary iron intakes are satisfactory, given this potentially increasing reliance upon fortificant iron. For example, in 2003, Henderson et al. estimated that fortified breakfast cereals alone contributed about 20% of daily iron intake in the NDNS 19-64 year old adults (15). Fortificant iron is used in a number of different chemical forms with variable bioavailability (17, 18). Current UK legislation does not consider this, nor does it set a maximum value for fortification; rather, it implies a minimum value for bread and flour in terms of ‘restoration’ of the iron removed during milling (the iron content of flour should be at least 1.65 mg/100 g flour) (19). Recent observations that fortificant iron may negatively affect the bacterial flora (20) and, at least in certain populations, colonic health (21, 22) argue that better consideration should be given to (a) maximum dietary levels of fortificant iron and (b) the chemical form of the fortificant iron. Nonetheless, this study addresses only the iron content of plant based foods. Both bioavailability, influenced for example by ascorbic acid rich foods, and the addition of haem iron to the omnivorous diet would alter the quantity of bioavailable iron and its quality (i.e. chemical speciation and reactivity).

Finally we note that when considering dietary iron intakes, and especially fortificant iron exposure, it would be advisable to have up-to-date values for the food iron content.


This project made use of samples collected from an original FSA funded project to determine silicon levels of plant based foods in the UK diet (10). Sample analyses were performed in the Gastrointestinal laboratory, The Rayne Institute, St Thomas’ Hospital, King’s College London, UK. The authors would like to thank Mark Sykes, from the above named institute, for his contribution in the sample preparation and analysis. This work was supported by the Medical Research Council [Unit Programme number U105960399]. JP and RJ designed the study, TC and RJ conducted the study, CT contributed all content in relation to the National Diet and Nutrition Survey and contributed to the preparation of the other data for the manuscript, SB analysed the data, performed statistical analysis with AO’s assistance and prepared the manuscript. SB and JJP had primary responsibility for the final content. There are no conflicts of interest.


1. Food Standards Agency. McCance and Widdowson’s The Composition of Foods. Sixth Summary Edition. Royal Society of Chemistry; Cambridge: 2002.
2. White P, Broadley M. Historical variation in the mineral composition of edible horticultural products. J Hortic Sci Biotech. 2005;80:660–667.
3. Thomas D. A study on the mineral depletion of the foods available to us as a nation over the period 1940 to 1991. Nutr Health. 2003;17:85–115. [PubMed]
4. Mayer A. Historical changes in the mineral content of fruits and vegetables. Br Food J. 1997;99:207–211.
5. Davis DR, Epp MD, Riordan HD. Changes in USDA food composition data for 43 garden crops, 1950 to 1999. J Am Coll Nutr. 2004;23:669–682. [PubMed]
6. Hornick SB. Factors affecting the nutritional quality of crops. Am J Alternative Agr. 1992;7(1-2):63–68. Special Issue on Soil Quality.
7. Kelly SD, Bateman AS. Comparison of mineral concentrations in commercially grown organic and conventional crops – Tomatoes (Lycopersicon esculentum) and lettuces (Lactuca sativa) Food Chem. 2010;119:738–745.
8. Loladze I. Rising atmospheric CO2 and human nutrition: toward globally imbalanced plant stoichiometry? Trends Ecol Evol. 2002;17:457–461.
9. Fan MS, Zhao FJ, Fairweather-Tait SJ, et al. Evidence of decreasing mineral density in wheat grain over the last 160 years. J Trace Elem Med Biol. 2008;22:315–324. [PubMed]
10. Powell JJ, McNaughton SA, Jugdaohsingh R, et al. A provisional database for the silicon content of foods in the United Kingdom. Br J Nutr. 2005;94:804–812. [PubMed]
11. Mills A, Patel S. Food Portion Sizes. 2nd ed. The Stationery Office; London: 1993.
12. Davies J, Dickerson J. Nutrient Content of Food Portions. The Royal Society of Chemistry; Cambridge: 1991.
13. McCance R, Widdowson E. The Chemical Composition of Foods. His Majesty’s Stationery Office; London: 1940.
14. Bland JM, Altman DG. Statistical methods for assessing agreement between two methods of clinical measurement. Lancet. 1986;1(8476):307–310. [PubMed]
15. Henderson L, Gregory J, Swan G. National Diet and Nutrition Survey: Adults Aged 19 to 64 Years. Vol 1: Types and quantities of foods consumed. The Stationery Office; London: 2002.
16. Henderson L, Irving K, Gregory J, et al. National Diet and Nutrition Survey: Adults Aged 19 to 64 Years. Vol 3: Vitamin and mineral intake and urinary analytes. The Stationery Office; London: 2003.
17. Zhu L, Glahn RP, Nelson D, et al. Comparing soluble ferric pyrophosphate to common iron salts and chelates as sources of bioavailable iron in a Caco-2 cell culture model. J Agr Food Chem. 2009;57:5014–5019. [PubMed]
18. Hurrell RF, Lynch S, Bothwell T, et al. Enhancing the absorption of fortification iron. A SUSTAIN Task Force report. Int J Vitam Nutr Res. 2004;74:387–401. [PubMed]
19. Statutory Instrument No. 141. The Bread and Flour Regulations 1998. The Stationery Office; London: 1998.
20. Zimmermann MB, Chassard C, Rohner F, et al. The effects of iron fortification on the gut microbiota in African children: a randomized controlled trial in Cote d’Ivoire. Am J Clin Nutr. 2010;92:1406–1415. [PubMed]
21. Werner T, Wagner SJ, Martinez I, et al. Depletion of luminal iron alters the gut microbiota and prevents Crohn’s disease-like ileitis. Gut. 2011;60:325–333. [PubMed]
22. Werner T, Hoermannsperger G, Schuemann K, et al. Intestinal epithelial cell proteome from wild-type and TNFDeltaARE/WT mice: effect of iron on the development of chronic ileitis. J Proteome Res. 2009;8:3252–3264. [PubMed]