Ironing Out Some Mental Limitations
By Janet Raloff
As the pace of urban life picks up, people increasingly find themselves forced to multitask.
It may be as simple as gleaning news from a radio broadcast while packing the kids’ lunches and emptying the dishwasher. Other times, it may be more challenging—breaking up a backseat sibling spat while maneuvering lane changes during rush hour. Or imagine an emergency-room physician handling several patients at once after a community catastrophe—while regularly calling home to see if everyone’s all right there.
What all of these situations call for, explains John Beard of Pennsylvania State University, is executive functioning—the integration of thinking skills and memory with quick reactions and accuracy. Most people cover these bases pretty well. However, his team has just completed a study indicating that iron deficiency can subtly compromise an individual’s ability to simultaneously perform multiple challenging tasks.
That’s disturbing, he says, because recent federal surveys have indicated that some 8 million U.S. adults are iron deficient. Many of those people, Beard notes, are reproductive-age women who lose significant quantities of iron each month as part of their menstrual-blood loss.
Research has amply demonstrated that iron plays a pivotal role in children’s development. Youngsters who get too little of the element in their diets suffer slowed brain development—a condition that can have irreversible impacts. Some affected children grow into adults who permanently lag behind their peers in intellectual functioning. However, Beard notes, few studies have focused on the cognitive impacts of iron deficiency that begins during adulthood. Researchers have generally assumed that once the brain has developed, it’s fairly invulnerable to an iron shortfall.
In his work with rodents, however, Beard found that an iron deficiency initiated in adulthood can have substantial impacts on the brain, including changes in concentrations of the chemicals that carry neural signals. “I began thinking,” Beard recalls, that such changes might play out as impaired thinking.
To see whether such an effect occurs in people, Beard and Laura Murray-Kolb set out to investigate mental performance in a group of generally healthy people liable to have iron deficiency: 149 female college students and staff of reproductive age. The researchers reasoned that the menstrual cycling of these women might periodically or chronically deplete the iron in their bodies, known as their iron stores.
At the Experimental Biology meeting in Washington, D.C., last month, Murray-Kolb’s group reported its findings: Iron-sufficient women far outperformed iron-short peers in every mental category tested—attention, memory, and learning.
What’s the score?
Prior to the study, blood tests categorized the women as iron sufficient or iron deficient. The women ranged from 18 to 35 years old, with an average age of 21. Overall, Murray-Kolb found, some 30 percent were iron deficient, some to the point of being anemic.
To start the study, each woman completed a series of eight tests on a computer touch screen. One test of attention and reaction time, for instance, asked each woman to touch a spot on the screen every time a particular light appeared. Another displayed a pattern on the screen and asked the women to recall it. Another asked the volunteers to memorize increasingly long lists of displayed images.
The computer scored each woman’s accuracy and the time she took to complete each test.
Women who had started the study moderately iron-deficient completed the tests as quickly as iron-sufficient women did but performed only half as accurately. Volunteers who were anemic not only performed with poor accuracy, but also took longer than any other women to complete most tasks. Indeed, Murray-Kolb points out, the more severe a woman’s iron deficiency, the longer it took her to complete each test.
Then the women in each iron category were randomly assigned to receive daily doses of either a 60-milligram iron supplement or a look-alike inert pill. The supplement was enough to eliminate iron deficiency in either moderately iron-deficient or anemic women. Each woman was invited to come back and repeat the mental-performance tests after 4 months of taking the pills, and 113 did.
The women who had been iron-deficient but received the iron supplements were as quick and accurate on the tests as iron-sufficient women were. Therefore, Beard calls the intellectual impairment associated with adult iron deficiency “very impressive but reversible.”
Unfortunately, women who took the inert pills and remained iron deficient didn’t improve in accuracy, though their speed picked up—something Beard attributes to their having done the tests before. Beard says that the new data indicate that women with an iron deficiency aren’t as focused, don’t remember as well, and don’t learn as well as women getting plenty of iron. He also worries about the “cumulative effect” of these impairments on decision-making abilities. The brain effects of iron deficiency could foster accidents in the kitchen, on the road, or at the workplace, he says.
Although the researchers tested only young women in this study, Beard says “there’s no biologic reason” why older women and men of any age shouldn’t suffer similar effects from iron deficiency. In the United States, some 2 percent of males 16 to 69 years old and 3 percent of those 70 and older are iron deficient, according to a Centers for Disease Control and Prevention survey. That same survey indicated that between 6 and 9 percent of postmenopausal women also are iron deficient.
Those percentages add up to a large number of individuals. However, they pale in comparison to estimates of iron deficiency in developing countries. The World Health Organization (WHO) reports that “iron deficiency is the most common nutritional disorder in the world.” The numbers are staggering: as many as 5 billion people, or up to 80 percent of the world’s population, may be iron deficient. Of these, WHO estimates that as many as 2 billion people fully qualify as anemic.
If the new Penn State data hold up, they indicate that the intellectual functioning of all of these people may be subtly compromised by inadequate iron.
Don’t rely on Popeye
For years, children have been regaled with tales of how spinach pours iron and strength into the oft-beleaguered hero Popeye the Sailor Man. In fact, the story is an extremely tall tale, since research now shows that spinach isn’t an especially robust source of iron.
In general, a serving of this or other vegetables, fruits, grains, or pasta naturally contains a mere 0.1 to 1.4 milligrams of iron, according to a 2001 Institute of Medicine (IOM) review of iron-consumption needs and recommended sources. That’s well below the 18-mg/day recommended dietary allowance (RDA) for menstruating women and the 8-mg/day RDA for men and postmenopausal women.
People interested in boosting their iron intake would do better to reach for meat, eggs, and iron-fortified breads, cereals, and breakfast bars, the IOM report said. Indeed, the report noted that some heavily fortified breakfast cereals now offer a whopping 24 mg of iron per 1-cup serving. People in the United States currently derive about half of their dietary iron from breads and other fortified grain products.
However, the report also cautions that it’s important not to overdo it and recommends an iron-intake limit of about 40 to 45 mg/day. Regularly downing too much supplemental iron risks causing vomiting, diarrhea, and even death, the report notes. These are signs of iron toxicity, but even subtoxic concentrations of the element may compromise a person’s health. Studies have indicated that too much iron can increase an individual’s risk not only of cancer (SN: 2/26/94, p. 132), but also of heart disease (SN: 9/19/92, p. 180).