Dietary Supplements: Too Much of a Good Thing?


Americans are spending $23 billion a year on dietary supplements, and the National Institutes of Health thinks that might be about $22.99 billion too much. In its long-awaited final statement on multivitamin and mineral supplements, released in August, the NIH says there's no convincing evidence that taking supplements is a good idea for the general population.

Worse, overdosing on nutrients might have negative effects.

This might sound like bad news for the dietary supplement industry and its legion of naive, pill-popping devotees, a group that includes more than half of American adults, and me, a daily multivitamin junkie.

Yet the NIH left some wiggle room. The final statement also said that not enough is known to draw a final conclusion. If that sounds confusing, welcome to the world of dietary supplements, where enough studies have been performed at this point to support any stance you want.

Good and bad

Vitamin C, for example, is good at warding off cancer, except when it fuels its growth. Laboratory animals that received vitamin C before exposure to a powerful dose of the herbicide paraquat were largely protected from cancer. For animals that received vitamin C after exposure, however, the vitamin aggravated the damage caused by the herbicide and led to more cancers.

Similarly contradictory findings have been published on beta carotene, which is strongly associated with fueling lung cancer in smokers. Vitamin E has been shown to prevent heart disease yet also raise the risk of a second heart attack. Selenium showed promise for cancer prevention in small studies but had zero benefit in a large, long-term study of 60,000 nurses.

Part of the problem is the complexity of disease prevention. Free radicals, those nasty cell-damaging chemicals that we try to smother with antioxidants, are also important to the immune system, killing bacteria and cancer cells. An influx of vitamins and minerals with antioxidant properties (such as vitamins C and E and selenium) at the wrong time could prevent free radicals from helping the body.

And part of the problem is that we love nutrients to death. The NIH final statement was careful to state the importance of vitamins and minerals—well, before bashing them.

Vital amines

Scientists have long known that certain foods prevent certain diseases, with the first convincing case made in the 18th century with the discovery that lemons and limes cured scurvy. It was not until the 20th century, however, that vitamins (vital amines) were isolated, such as vitamin C in citrus fruits.

Supplementing poor diets 100 years ago with newly discovered vitamins and minerals saved lives. Thiamin cured beriberi; iodine cured goiter.

Of course, folks back then were lacking these nutrients, and the supplements brought them back to normal levels. Today, however, we load up on megadoses of nutrients with precious little evidence to suggest this is beneficial and a good deal of evidence suggesting it might be a bad idea. This follows the very American notion that more is better: if 100 milligrams of vitamin C is recommended daily, then 1,000 milligrams must be 10 times better.

"We're concerned that some people may be getting too much of certain nutrients," said J. Michael McGinnis of the National Academy of Sciences, who chaired the panel that created the NIH statement. "The bottom line is that we don't know for sure that they're benefiting from them."

This follows the "Antioxidant Paradox," a concept introduced by Barry Halliwell of the National University of Singapore in a Lancet article in 2000. Halliwell lamented the fact that although diets rich in antioxidants seem to have a positive effect on health, popping antioxidant supplements can be harmful, and the results are not at all predictable.

Downing chemicals

The NIH's supplement recommendations apply only to select groups: vitamin D and calcium for postmenopausal women, for strong bones; zinc and antioxidants (including beta carotene) for nonsmokers with intermediate-stage age-related macular degeneration, an eye disease; and folate for women of childbearing age, to prevent certain birth defects. That's about it.

Outside of this dietary supplement report, the NIH and other health organizations have recommended iron supplements for women of childbearing age, because of iron lost in menstrual blood. But men, in general, don't need iron pills. (Earlier I confessed to taking a multivitamin; this contains no iron, and no vitamin or mineral exceeds the U.S. RDA.)

The long-standing advice has been to get your nutrients from food, not pills.

Would you willingly down a concoction of pyridoxine, pantothenic acid, ascorbic acid, and alpha-tocopherol? How about 2-methyl-3-polyisoprenyl-1,4-naphthoquinone? Understanding that vitamins B6, B5, C, E and K are chemicals, and not just alphabet soup, might caution us to consume them in moderation.

Christopher Wanjek is the author of the books "Bad Medicine" and "Food at Work." Got a question about Bad Medicine? Email Wanjek. If it's really bad, he just might answer it in a future column. Bad Medicine appears each Tuesday on LiveScience.
