When you eat something, your body increases the blood flow to your stomach muscles to help with digestion. The larger the meal you scarf down, the more oxygenated blood your stomach needs for digestion. But this means less oxygen available for your arms and legs, which require an increased amount during exercise (whether you're swimming, running, or cycling). Depriving your muscles of vital oxygen can lead to cramps, conceivably increasing your risk of drowning.
For recreational swimmers, the risk of getting cramps after eating is actually very low; your body has more than enough oxygen to share between your stomach and limbs. The real danger lies with those who eat huge meals before vigorous, triathlon-level exercise. Such cases can indeed lead to cramps and even vomiting. But even then, the medical consensus has long been that it's unlikely to result in drowning; that is, unless the swimmer all-out panics and forgets how to float.
Believers have dreamed up several different explanations for why our bovine friends would hit the ground in anticipation of a storm, and many of them sound equally plausible. The simplest is that cows can sense increasing air moisture and will plop down to preserve a dry patch of grass. Another theory states that cows lie down to ease their stomachs, which are supposedly sensitive to changes in atmospheric pressure brought on by rainfall.
The most complicated explanation suggests that cow legs are micro-porous structures that rapidly absorb moisture. As the relative humidity builds from an oncoming downpour, the cow's legs will absorb more and more moisture from the air, softening until they can no longer support the weight of the cow.
But is there any weight behind this tale? Not likely; cows lie down for many reasons, and there's no scientific evidence that rain is one of them. As the Farmer's Almanac puts it, "Cows lying down in a field more often means they're chewing their cud, rather than preparing for raindrops." And just think: If weather predictions were made based on the actions of cows, the forecast would always be grim.
Thankfully, the legend is false.
As gastroenterologist Dr. Rodger Liddle of the Duke University School of Medicine explained to Scientific American: "Nothing would reside that long unless it was so large it couldn't get out of the stomach or it was trapped in the intestine."
Chewing gum passes through the digestive system like any other food. Your body is able to break down some of the gum's components, such as sweeteners and oil derivatives, but the gum's rubber or latex base gets churned out in a matter of days.
However, this doesn't mean you should start swallowing your chewing gum regularly. In several reported cases, doctors had to remove taffylike wads of gum from children's bowels. Swallowing a lot of chewing gum in a relatively short amount of time, it seems, can cause the pieces to accumulate and block the digestive tract, causing constipation.
While a cat could accidentally suffocate a sleeping baby by cozying up too close to its face, experts agree it's highly unlikely a cat would smother an infant on purpose. Reports of cat-caused infant deaths are scarce, so how did this tale become so common?
One case from 300 years ago may have given this tale all the oomph it needed to reach its current scare level. In the Annual Register, a publication that records the year's interesting events, there is an entry for Jan. 25, 1791: "A child of eighteen months old was found dead near Plymouth; and it appeared, on the coroner's inquest, that the child died in consequence of a cat sucking its breath, thereby occasioning a strangulation." Coroner knows best, so it must be true, right?
Adding to this report is the fact that cats have long been thought of as the familiars of witches, so if parents (or even coroners) found an infant dead with a cat nearby, the cat was automatically blamed for the incident. Nowadays, however, we know that otherwise healthy babies can die without any known causes, an occurrence known as sudden infant death syndrome.
But in the 1980s, scientists put this old wives' tale to rest, at least in the medical community (a lot of people still believe this one). Studies showed that spicy food doesn't cause ulcers, though it can irritate existing ulcers, which explains the misunderstanding.
The real culprit behind the majority of ulcers, researchers found, was the bacterium Helicobacter pylori. When H. pylori enters the body, it heads for the stomach, excreting protective enzymes to shield it from the stomach's harmful digestive acids. H. pylori then burrows into the stomach's mucosal lining, which partially protects it from white blood cells, the immune system's main weapon against bacterial intruders. Ulcers then develop as the bacteria colonize the stomach.
Today's ulcer treatments usually involve antibiotics to kill the infection, but recent research has shown that cranberry juice may be effective, too. Interestingly, cranberry juice has long been a part of another popular and possibly true tale asserting that the tart drink effectively fights bladder infections. The mechanism behind both treatments is thought to be the same: Compounds in cranberry juice prevent bacteria from adhering to the cells lining the urinary tract and prevent H. pylori from sticking to the lining of the stomach.
Similar anecdotal evidence exists for preventing restless leg syndrome (RLS) with soap, but on a smaller scale. On another popular medical talk show, "The Dr. Oz Show," Dr. Mehmet Oz recommended placing a bar of lavender soap beneath the bed sheets to alleviate RLS, hypothesizing that the smell of lavender is relaxing in itself and may be beneficial for the condition. However, there are no peer-reviewed studies that suggest lavender or lavender soap can successfully treat RLS.
So if you're suffering from nightly leg cramps or RLS, perhaps you should try placing a bar of soap under your sheets near your feet. Even though science has yet to show that these treatments work, what have you got to lose? Just don't try Dove or Dial; those soaps don't work, according to many online testimonials. Why? Your guess is as good as any.
The hair shaft naturally tapers at the end, so what you typically see are the thinnest portions of your hair. When you shave, however, you are crossing the midshaft and exposing the thicker part of the hair, making it seem as if each individual strand is taking up a bit more space. Moreover, the stubble feels stiffer because it's shorter and cut straight across (body hair feels softer as it gets longer). Even the apparent darkening of the cut hair is an illusion: the stubble appears darker because you are now seeing the dots of hair directly against the backdrop of your normal skin color.
Scientists have actually conducted studies to test whether shaving affects hair growth. In a 1928 study published in the journal Anatomical Record, forensic anthropologist Mildred Trotter found that shaving has no effect on hair's color, texture or growth rate. More recently, research published in the Journal of Investigative Dermatology also looked at this tale. "No significant differences in total weight of hair produced in a measured area, or in width or rate of growth of individual hairs, could be ascribed to shaving," the researchers concluded in their 1970 study.
In 2005, a study of more than 2,500 pregnant women by the New York University College of Dentistry found that as a woman's number of children increases, so does her risk of losing teeth. More children also equated to a greater risk of developing periodontal disease.
There are several things that can affect a pregnant woman's oral health, including morning sickness (vomiting erodes tooth enamel); dry mouth from hormonal changes (less saliva increases the risk of cavities); and an increased desire for sugary and starchy foods (which can deteriorate teeth). On top of this, research has shown pregnant women are less likely to visit their dentists.
But these issues are not new. A 2008 study in the journal Current Anthropology found that women have had worse dental health than men ever since the rise of agriculture 10,000 years ago and the subsequent boom in the human population.
Gain a child, lose a tooth? Probably not. Gain a child, gain a cavity? Perhaps.
In 1969, research published in the Journal of the American Medical Association found that chocolate doesn't worsen acne, and several subsequent studies have backed up that conclusion. Now both the American Academy of Dermatology and the National Institute of Arthritis and Musculoskeletal and Skin Diseases say there's no connection between chocolate and acne. In fact, the organizations assert that your diet, in general, has little effect on pimple development.
Still, recent research shows the issue isn't so clear-cut. Last year a study published in the journal Clinics in Dermatology concluded the 1969 study was flawed in several ways, while a study published in the Journal of the American Academy of Dermatology found a link between pure chocolate and pimple formation.
Despite the new findings, the major medical associations aren't ready to change their rulings just yet. As usual, more research is needed.
The myth got its start in the late 1960s, when General Electric sold television sets that emitted radiation at levels as much as 100,000 times higher than what federal health experts considered safe. To its credit, GE quickly recalled and repaired its hazardous TVs.
But there was a danger even before GE's big blunder. Televisions developed before the 1950s emitted levels of radiation that could heighten a person's risk of eye problems after repeated and extended exposure, Dr. Norman Saffra, chairman of ophthalmology at Maimonides Medical Center in Brooklyn, told the New York Times.
These issues are now a thing of the past; modern TVs come with proper shielding to block radiation. Nowadays, the only eye problems that televisions cause are strain and fatigue, both of which can be cured by simply resting your eyes. (The same goes for another popular old wives' tale about reading in dim light.)