A report in the March 23 issue of Archives of Internal Medicine (a journal from JAMA/Archives) indicates that in the United States from 1994 to 2004, the average blood levels of vitamin D seem to have decreased.
In the past, the most important health conditions linked to vitamin D deficiency were poor bone mineral content in adults and rickets in children. The report indicates those problems were addressed by fortifying food with vitamin D. More recently, vitamin D deficiency has been linked to heart disease, cancer, infections, and poor health in general. Data suggest that blood levels of 30 to 40 nanograms per milliliter may be necessary for optimum health.
The authors say, “Vitamin D supplementation appears to mitigate the incidence and adverse outcomes of these diseases and may reduce all-cause mortality.” Still, current recommendations for supplement doses focus mostly on bone health: 200 international units per day from birth to age fifty, 400 international units per day from age fifty-one to seventy, and 600 international units per day from age seventy-one and up. Moreover, reduced outdoor activity and campaigns urging people to limit sun exposure have contributed to vitamin D deficiency, given that sunlight exposure is an important determinant of vitamin D levels in humans.
Adit A. Ginde, M.D., M.P.H., of the University of Colorado Denver School of Medicine, Aurora, and his team conducted a comparative study of levels of serum 25-hydroxyvitamin D (25[OH]D, a measure of the amount of vitamin D in the blood) collected during the Third National Health and Nutrition Examination Survey (NHANES III) between 1988 and 1994, and those collected during NHANES between 2001 and 2004. Complete records were available for 18,883 participants in the first survey and 13,369 in the second.
The authors write: “Overall, the mean [average] serum 25(OH)D level in the U.S. population was 30 nanograms per milliliter during the 1988-1994 collection and decreased to 24 nanograms per milliliter during the 2001-2004 collection.” Between the two periods, the prevalence of levels below 10 nanograms per milliliter increased from 2 percent to 6 percent, and fewer individuals had levels of 30 nanograms per milliliter or higher (45 percent compared with 23 percent).
Racial and ethnic disparities persisted across both surveys: among non-Hispanic blacks, the prevalence of 25(OH)D levels below 10 nanograms per milliliter increased from 9 to 29 percent, and the prevalence of levels of 30 nanograms per milliliter or higher decreased from 12 to 3 percent.
“These findings have important implications for health disparities and public health,” the researchers write. “We found that the mean serum 25(OH)D level in the U.S. population dropped by 6 nanograms per milliliter from the 1988-1994 to the 2001-2004 data collections. This drop was associated with an overall increase in vitamin D insufficiency to nearly three of every four adolescent and adult Americans.”
The authors conclude: “Current recommendations for dosage of vitamin D supplements are inadequate to address this growing epidemic of vitamin D insufficiency. Increased intake of vitamin D (1,000 international units per day or more) – particularly during the winter months and at higher latitudes – and judicious sun exposure would improve vitamin D status and likely improve the overall health of the U.S. population. Large randomized controlled trials of these higher doses of vitamin D supplementation are needed to evaluate their effect on general health and mortality.”
Arch Intern Med. 2009;169:626-632.
Editor’s Note: Senior author Dr. Camargo was supported by the Massachusetts General Hospital Center for D-receptor Activation Research, and he and co-author Dr. Liu were supported by grants from the National Institutes of Health. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.
Written by Stephanie Brunner (B.A.)