A pair of studies suggests physicians should consider testing female adolescents for iron deficiency within a few years of starting menses.
Women are typically screened for anemia in their teens using a quick, affordable hemoglobin test.
However, iron deficiency can develop years before anemia and can be missed by hemoglobin testing alone.
Blood tests for iron deficiency without anemia are more costly and more difficult to obtain than hemoglobin testing for anemia.
Deepa Sekhar, MD, of Penn State College of Medicine in Hershey, Pennsylvania, and her colleagues set out to determine risk factors for iron deficiency without anemia in order to pinpoint which women could benefit most from the more costly testing.
The results of the researchers’ 2 studies were published in PLOS ONE and The Journal of Pediatrics.
PLOS ONE study
The researchers evaluated data from 6216 females, ages 12 to 49, who took part in the National Health and Nutrition Examination Survey (NHANES) between 2003 and 2010. As part of the survey, participants were tested for both iron deficiency and anemia.
Eight percent of all subjects (n=494) had iron deficiency.
Nine percent (n=250) of non-anemic younger women (ages 12-21) had iron deficiency, as did 7% (n=244) of older women (ages 22-49) who were not anemic.
The researchers looked at potential risk factors for iron deficiency, including the age when women started menstruating, as well as their race/ethnicity, poverty status, food insecurity, tobacco or nicotine use, dietary information, body mass index, and physical activity.
All of these factors have been associated with iron-deficiency anemia in women in prior studies.
In this study, there was only 1 risk factor significantly associated with iron deficiency without anemia.
Young women (ages 12-21) who had been menstruating for more than 3 years had a significantly higher risk of iron deficiency without anemia (risk ratio=3.18).
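For readers unfamiliar with the statistic, a risk ratio compares the rate of an outcome in one group with the rate in another. A minimal sketch of the calculation, using hypothetical counts chosen only to illustrate a ratio near 3.18 (these are not the study's actual data):

```python
# Illustrative only: how a risk ratio such as the 3.18 reported above is
# calculated. The counts below are hypothetical, not from the NHANES analysis.

def risk_ratio(cases_exposed, total_exposed, cases_unexposed, total_unexposed):
    """Risk ratio = risk in the exposed group / risk in the unexposed group."""
    risk_exposed = cases_exposed / total_exposed
    risk_unexposed = cases_unexposed / total_unexposed
    return risk_exposed / risk_unexposed

# Hypothetical example: 90 of 1000 women menstruating for more than 3 years
# have iron deficiency, versus 28 of 990 menstruating for 3 years or fewer.
rr = risk_ratio(90, 1000, 28, 990)
print(round(rr, 2))  # prints 3.18
```

A risk ratio of 3.18 means the outcome was a little over three times as common in the first group as in the second.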
The Journal of Pediatrics study
In this study, the researchers looked at whether a questionnaire could better predict iron status.
The questionnaire included questions on depression, poor attention, and daytime sleepiness, all of which have been associated with iron deficiency or iron-deficiency anemia, but were not captured in the prior NHANES analyses.
This questionnaire was compared to the 4 questions assessing iron-deficiency anemia risk in the Bright Futures Adolescent Previsit Questionnaire, a survey recommended for physician use by the American Academy of Pediatrics.
Ninety-six female adolescents participated in this study. Eighteen percent of them (n=17) had iron deficiency, and 5% (n=5) had iron-deficiency anemia.
Both the Bright Futures questions and the researchers’ risk assessment questionnaire poorly predicted ferritin and hemoglobin values in these subjects.
Mean differences in depression, poor attention, food insecurity, daytime sleepiness, and body mass index percentile were not significantly associated with ferritin or hemoglobin.
Conclusions
The results of these 2 studies suggest that risk factors and screening questionnaires cannot accurately identify which young women should receive testing for iron deficiency, although results from the first study might be used to determine when that testing should occur.
“I think we need to establish the optimal timing for an objective assessment of adolescent iron deficiency and anemia,” Dr Sekhar said.
She believes the appropriate age may be 16 years old, when most females will have been menstruating for at least 3 years.
Further research will be needed to determine which blood test for iron deficiency without anemia is accurate, cost-efficient, and practical for routine doctor’s office use.
This test should be administered alongside hemoglobin testing to catch young women across the full spectrum of iron deficiency, Dr Sekhar said.