When categorizing its 10 greatest public health achievements of the 20th century, the Centers for Disease Control and Prevention made sure to include "safer and healthier foods." Yet the iodization of salt, the most effective dietary intervention of the 20th century, received a paltry eight words in a 3,400-word article (CDC, 1999).
Although the effectiveness of iodine in treating goiter had been known since 1821, a full century passed before successful management of the disease became a fact of life. During World War I, doctors in Michigan were shocked at the number of young draftees who had to be turned away from the military because of goiter; in some areas, its prevalence ran as high as 64%. Goiter proved to be the largest single cause of medical disqualification for military service.
In 1924, the Michigan State Medical Society endorsed the iodization of salt, and Morton Salt Co. began marketing iodized table salt for country-wide consumption. From that moment on, with a simple jiggle of the salt shaker, Americans dispatched the scourge of iodine-deficiency diseases—goiter, cretinism, and hypothyroid coma (myxedema)—into the dustbin of medical history.
What was not known at the time was that the fourth and most devastating horseman of the iodine-deficiency apocalypse is mental retardation. In the past, it was not uncommon for significant numbers of children in certain regions to be considered "dull" or "dim-witted," without any connection being made to iodine deficiency. Within the past two decades, however, it has been demonstrated that where 5% or more of school-age children have goiter, the average cognitive ability of the entire population is reduced by as much as 15 IQ points (Bleichrodt and Born, 1994), a drop sufficient to move a child into the mildly retarded category.
The World Health Organization (WHO, 1999) has declared iodine deficiency at critical stages of fetal development and early childhood to be the world's single most important preventable cause of mental retardation, and many developing countries are forging ahead to promote national salt iodization programs. China, poised to be the 21st century's economic and technological powerhouse, has given such programs high priority—a sure sign that it intends its citizens to achieve their full intellectual potential in a competitive world.
In the U.S., however, the National Health and Nutrition Examination Surveys carried out over the past 30 years have shown a dramatic drop in urinary iodine. From 1971–74 to 2001–02, the median urinary iodine excretion in adults declined from 320 μg/L to 168 μg/L and the frequency of moderate iodine deficiency in pregnant women jumped from 1% to 7% (Caldwell et al., 2005). While the current levels are not low enough to declare a public health emergency, the trend is a matter of great concern.
What is responsible for this disturbing trend? Iodine in milk has decreased because iodophor sanitizers used in the dairy industry are being phased out. While this accounts for some reduction, I believe there is a far more compelling explanation for reduced iodine intake.
During the past 30 years, we have witnessed the remarkable growth of the restaurant and foodservice sectors. The rising trend of meals consumed outside the home parallels the drop in urinary iodine. This makes sense: meals consumed outside the home are almost always made with non-iodized salt, the notable exception being those restaurants and foodservice operations that buy consumer packs in bulk. Although we are not there yet, we may soon approach a critical level that could reintroduce iodine-deficiency disorders and all their social and economic consequences.
The restaurant and foodservice trade must demand and use iodized salt so that those meals that consumers and schoolchildren eat away from home provide the same level of iodine as would be obtained within the home. If we are to continue producing such brilliant triumphs as space exploration, breakthroughs in medical and agricultural sciences, and all the other intellectual endeavors that require our utmost cognitive faculties, our nutrition must be worth its salt.
The author, Morton Satin, a Member of IFT, is Director, Technical and Regulatory Affairs, Salt Institute, 700 N. Fairfax St., Alexandria, VA 22314 ( [email protected] ).
Bleichrodt, N. and Born, M.P. 1994. A meta-analysis of research on iodine and its relationship to cognitive development. In "The Damaged Brain of Iodine Deficiency," ed. J.B. Stanbury, pp. 195-200. Cognizant Communication Corp., New York.
Caldwell, K.L., Jones, R., and Hollowell, J.G. 2005. Urinary iodine concentration: United States National Health and Nutrition Examination Survey 2001-2002. Thyroid 15: 692-699.
CDC. 1999. Ten great public health achievements—United States, 1900-1999. Centers for Disease Control and Prevention, Morbidity Mortality Wkly. Rept. 48: 241-243. www.cdc.gov/mmwr/preview/mmwrhtml/00056796.htm.
WHO. 1999. Progress towards the elimination of Iodine Deficiency Disorders (IDD). WHO/NHD/99.4, World Health Org., Geneva.