Health & Wellness



Leprosy: India's Hidden Disease

Image: Seyi Rhodes in a leprosy colony. © Channel 4
Leprosy has officially been eliminated in India, yet 130,000 new cases are diagnosed every year. Richard Cookson and Seyi Rhodes report on the plight of the patients shunned by society.

Narsappa was just 10 years old when he was told he had leprosy, and the news changed the course of his life forever. People in his Indian village immediately began to shun him and told his parents that he had to leave. He says his mother started grieving for him "as if I was already dead". Shortly afterwards, his father took him to a hospital two hours away from home and left him there. No one ever came to visit him and Narsappa never went home again.

Now 42, he lives in a leprosy colony on the outskirts of Hyderabad and campaigns on behalf of people affected by the disease. "I lie awake at night thinking about how I was treated and how I can stop others from going through the same thing," he says.

India may have one of the fastest growing economies in the world, but 130,000 Indians are diagnosed with leprosy every year - more than all other countries combined. That is partly because the country's population is so huge, but also, campaigners say, because the Indian government and some international donors are neglecting the fight against the disease. Hundreds of thousands of Indians suffer from leprosy and its debilitating after-effects.

Given the number of new cases, it may come as some surprise that India announced it had eliminated leprosy in 2005. According to a target set by the World Health Organisation, countries can announce 'elimination' when there is less than one case per 10,000 people. Since then, the government has channelled funding previously dedicated to leprosy back into the general health system. Leprosy charities say that donations have also fallen significantly and some projects have had to close.


Study Finds: Cell Phone Use May Weaken Bones

Image: Research suggests that wearing a cell phone on your hip may weaken an area of your pelvis. © drmercola.com
Exactly how dangerous cell phone radiation might be has long been disputed, with the industry, not surprisingly, insisting the devices are safe. While numerous studies point to links between the devices and serious health problems, including cancer and impaired fertility, a new study has found a potential link to reduced bone density.

Dr Fernando Saraví of Argentina's National University of Cuyo reports that men who wore their cell phones on the right side of their belts had reduced bone mineral content (BMC) and bone mineral density (BMD) in that hip, according to TG Daily. "The different patterns of right-left asymmetry in femoral bone mineral found in mobile cell phone users and non-users are consistent with a nonthermal effect of electromagnetic radiofrequency waves not previously described," he said, as quoted by TG Daily.


Even 'BPA-Free' Plastics Leach Endocrine-Disrupting Chemicals

Image: Despite all of the media attention to the safety of plastic products over the last year, the toxic chemical bisphenol-A (BPA) is still found in many products. © scrink.com
Plastic containers and linings often leach chemicals into the surrounding environment. And some of those chemicals, like the endocrine-disrupting bisphenol-A (BPA) and phthalates, may be harmful to your health.

Manufacturers have even begun advertising some products as "BPA-free." But a recent study found that most plastic products leach endocrine-disrupting chemicals even if they're labeled "BPA-free." The scientists found that 70 percent of common plastic products tested positive for estrogenic activity, and that number rose to 95 percent when the products were subjected to real-world conditions such as dishwashing or microwaving.

Time Magazine reports:
"BPA is particularly worrisome simply because it is so common. Nearly every American has some amount of BPA in his or her body, in part because plastics are so ubiquitous."
Sources:

Time Magazine, March 8, 2011

Environmental Health Perspectives, March 2, 2011 (Epub ahead of print)


Chemicals Linked to Early Menopause

Image: Menopause symptoms. © Unknown
Study Suggests Exposure to Chemicals Called PFCs May Be Associated With Earlier Menopause

Women exposed to high levels of chemicals called perfluorocarbons (PFCs) may enter menopause earlier, new research suggests.

PFCs are man-made chemicals found in many household products such as food containers and stain-resistant clothing as well as in water, soil, and plants.

"Before this study, there was strong evidence from animal research that PFCs were endocrine disruptors," says researcher Sarah Knox, PhD, professor of epidemiology at the West Virginia University School of Medicine, Morgantown.

For the study, she evaluated the levels of two PFCs, PFOS (perfluorooctane sulfonate) and PFOA (perfluorooctanoate), in nearly 26,000 women ages 18 to 65.

Overall, she found, "the higher the perfluorocarbons, the earlier the menopause." Women between ages 42 and 64 with the highest blood levels of the PFCs were more likely to have experienced menopause than those with the lowest levels.


Real Food, Wise and Robust Old Age

Image: Inverness Castle in the Scottish Highlands, home of a healthy people. © conner395
Old people in modern times are considered weak, foolish, and helpless, unable to survive without care. Most people expect to be weak and helpless when they get old, and to end their lives in a "rest home." We often read in the news media that young workers will have the burden of taking care of an aging population.

Yet this is a new and horrible way of aging. Through most of history, old age was associated with wisdom, strength, and leadership. Most older people who ate a traditional diet not only took care of themselves, but led their communities, taught the young, and were the repository of knowledge and leadership for their peoples.

What is the difference? Why did old age change from a time of wisdom and leadership to a time of failing minds, deteriorating bodies, and chronic illness?

What we do know is that people eating the healthy traditional diets of their ancestors, with little or no medical care, remained wise and strong into their nineties.

We also know that modern people eating the Standard American Diet (SAD) become helpless in their sixties and seventies, or even younger, unable to care for themselves, needing all kinds of expensive medical care and procedures just to keep breathing.

In other words, real food is the key to a wise and healthy old age.


Why Did We Evolve a Taste for Sweetness?

Image © VisualPhotos
After I did my post on Seth Roberts's new therapies for circadian rhythm disorders, Seth learned of my experience with scurvy and blogged about a similar experience of his own.

Seth made the important point that food cravings are driven by nutritional deficiencies - a point I heartily agree with, which is why it's so important for those seeking to lose weight to be well nourished - and asked, "Why do we like sweet foods?" His suggested answer was that the taste for sweetness encouraged Paleo man "to eat more fruit so that we will get enough Vitamin C."

This led to a fascinating contribution from Tomas in the comment thread:
I have read several books on Traditional Chinese Medicine, and they say that an increased craving for sweets in fact signals serious nutritional deficiencies. They say it is actually meat, starches, or other nutritionally dense foods that will soothe the craving, but sweets are more readily available. The taste of meat is in fact sweet as well.

In my experience the TCM view seems to be true. I have always been very skinny despite eating enormous amounts of sweets. After I switched to a proper, paleo-like diet, the situation changed in many ways: I no longer have such strong cravings, and I am slowly gaining some weight.


Handy dandy carb index

There are a number of ways to gauge your dietary carbohydrate exposure and its physiologic consequences.

One of my favorite ways is to do fingerstick blood sugars for a one-hour postprandial glucose. I like this because it provides real-time feedback on the glucose consequences of your last meal. This can pinpoint problem areas in your diet.

Another way is to measure small LDL particles. Because small LDL particles are created through a cascade that begins with carbohydrate consumption, measuring them provides an index of both carbohydrate exposure and sensitivity. Drawback: Getting access to the test.

For many people, the most practical and widely available gauge of carbohydrate intake and sensitivity is your hemoglobin A1c, or HbA1c.

HbA1c reflects blood sugar fluctuations over the previous 60 to 90 days, since hemoglobin is irreversibly glycated by blood glucose. (Glycation is also the phenomenon responsible for cataracts, from glycation of lens proteins; kidney disease; arthritis, from glycation of cartilage proteins; atherosclerosis, from glycation of LDL and of components of the arterial wall; and many other conditions.)

HbA1c of a primitive hunter-gatherer foraging for leaves, roots, and berries, and hunting elk, ibex, wild boar, reptiles, and fish: 4.5% or less.

HbA1c of an average American: 5.2% (In the population I see, however, it is typically 5.6%, with many 6.0% and higher.)

HbA1c of diabetics: 6.5% or greater.

Don't be falsely reassured by not having an HbA1c that meets "official" criteria for diabetes. An HbA1c of 5.8%, for example, means that many of the complications suffered by diabetics (kidney disease, heightened risk for atherosclerosis, osteoarthritis, cataracts) are experienced at nearly the same rate as in diabetics.

With our wheat-free, cornstarch-free, sugar-free diet, we have been aiming to reduce HbA1c to 4.8% or less, much as if you spent your days tracking wild boar.
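
To put these numbers in perspective, here is a minimal sketch (illustrative only, not from the original post) that converts an HbA1c value into an estimated average glucose using the commonly cited ADAG approximation, eAG (mg/dL) = 28.7 × HbA1c − 46.7, and checks it against the rough bands listed above; the function names are hypothetical.

```python
# Illustrative sketch: translate an HbA1c reading into an estimated average
# glucose (ADAG approximation: eAG mg/dL = 28.7 * HbA1c - 46.7) and compare
# it against the rough bands described in the post.

def estimated_average_glucose(hba1c_percent: float) -> float:
    """Estimated average glucose in mg/dL for a given HbA1c (%)."""
    return 28.7 * hba1c_percent - 46.7


def interpret_hba1c(hba1c_percent: float) -> str:
    """Classify an HbA1c value against the bands quoted in the post."""
    if hba1c_percent <= 4.8:
        return "at or below the 4.8% target (hunter-gatherer range)"
    if hba1c_percent < 6.5:
        return "above the target but below the official diabetes cutoff"
    return "meets the official criterion for diabetes (6.5% or greater)"


for a1c in (4.5, 5.2, 5.8, 6.5):
    eag = estimated_average_glucose(a1c)
    print(f"HbA1c {a1c}%  ~{eag:.0f} mg/dL average glucose  ({interpret_hba1c(a1c)})")
```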


Not all trans fats are created equal

'Trans-fatty acids' can be formed in the processing of fats. They usually start out life as a vegetable oil, which is then treated in a multi-stage process to, say, solidify it and extend its shelf life. The word 'trans' refers to the chemical shape of these molecules. In general terms, these fats have a different shape from most fats found in nature, which usually have the 'cis' configuration. Trans fats have been linked with a variety of health issues including enhanced risk of cardiovascular disease [1-4] and diabetes [5-7].

However, trans fats can be found in nature too. For example, butter contains trans fatty acids. The food industry sometimes refers to this fact, I suspect in an attempt to suggest that the industrially produced trans fats they put in foods are somehow 'natural' too. But are the trans fats found in nature the same as those that are formed in a factory?

Actually, industrially produced and naturally occurring trans fats have different chemical structures: industrially produced trans fats are predominantly monounsaturated trans fats, of which something known as 'elaidic acid' is a major component. Trans fats found naturally in food, on the other hand, occur mainly as very different fats known as 'trans vaccenic acid' and 'conjugated linoleic acids'. Do these structural differences translate into different effects on health?


ADHD: It's The Food, Stupid

Image © triliumhealth.com
Over five million children ages four to 17 have been diagnosed with attention deficit hyperactivity disorder (ADHD) in the United States and close to 3 million of those children take medication for their symptoms, according to the Centers for Disease Control. But a new study reported in The Lancet last month found that with a restricted diet alone, many children experienced a significant reduction in symptoms. The study's lead author, Dr. Lidy Pelsser of the ADHD Research Centre in the Netherlands, said in an interview with NPR, "The teachers thought it was so strange that the diet would change the behavior of the child as thoroughly as they saw it. It was a miracle, the teachers said."

Dr. Pelsser's study is the first to conclusively say that diet is implicated in ADHD. In the NPR interview, Dr. Pelsser did not mince words: "Food is the main cause of ADHD," she said, adding, "After the diet, they were just normal children with normal behavior. They were no longer more easily distracted, they were no more forgetful, there were no more temper tantrums." The study found that in 64 percent of children with ADHD, the symptoms were caused by food. "It's a hypersensitivity reaction to food," Pelsser said.

Comment: For more information about the connection between diet and hyperactivity in children, read the following articles:

Study: Cutting Out Suspect Foods Could Help Calm ADHD Children
According to a new study by Dutch scientists, restricting the range of foods fed to children suffering from ADHD can "significantly improve" their disruptive behavior and could become a standard treatment for such children.
Study: Western Diet Link to ADHD
A new study from Perth's Telethon Institute for Child Health Research shows an association between ADHD and a 'Western-style' diet in adolescents.

The leader of nutrition studies at the Institute, Associate Professor Wendy Oddy, said the study examined the dietary patterns of 1,800 adolescents from the long-term Raine Study and classified their diets into 'Healthy' or 'Western' patterns.

Dr Oddy said:
"We suggest that a Western dietary pattern may indicate the adolescent has a less optimal fatty acid profile, whereas a diet higher in omega-3 fatty acids is thought to hold benefits for mental health and optimal brain function.

"It also may be that the Western dietary pattern doesn't provide enough essential micronutrients that are needed for brain function, particularly attention and concentration, or that a Western diet might contain more colors, flavors and additives that have been linked to an increase in ADHD symptoms.
Do Synthetic Food Colors Cause Hyperactivity?

Food Dyes Linked to Hyper Kids, Group Asks FDA to Ban



Iodine for Health

Image: Seaweeds, kelp and algae, all natural sources of iodine. © Unknown
There is growing evidence that Americans would have better health and a lower incidence of cancer and fibrocystic disease of the breast if they consumed more iodine. A decrease in iodine intake coupled with an increased consumption of competing halogens, fluoride and bromide, has created an epidemic of iodine deficiency in America.

People in the U.S. consume an average of 240 micrograms (µg) of iodine a day. In contrast, people in Japan consume more than 12 milligrams (mg) of iodine a day (12,000 µg), a 50-fold greater amount. They eat seaweed, which includes brown algae (kelp), red algae (nori sheets, with sushi), and green algae (chlorella). Compared to terrestrial plants, which contain only trace amounts of iodine (0.001 mg/gm), these marine plants have high concentrations of this nutrient (0.5 - 8.0 mg/gm). When studied in 1964, Japanese seaweed consumption was found to be 4.5 grams (gm) a day, and the seaweed eaten had a measured iodine concentration of 3.1 mg/gm (= 13.8 mg of iodine). According to public health officials, mainland Japanese now consume 14.5 gm of seaweed a day (= 45 mg of iodine, if its iodine content, not measured, remains unchanged). Researchers have determined that residents on the coast of Hokkaido eat a quantity of seaweed sufficient to provide a daily iodine intake of 200 mg a day. Saltwater fish and shellfish contain iodine, but one would have to eat 15 - 25 pounds of fish to get 12 mg of iodine.
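
As a quick check on the arithmetic behind these figures, here is a minimal sketch (illustrative only); it simply multiplies daily seaweed consumption by the 1964 measured iodine concentration of 3.1 mg per gram quoted above, and the assumption that this concentration still applies today comes from the text itself.

```python
# Illustrative sketch: reproduce the back-of-the-envelope iodine arithmetic
# quoted above. The 3.1 mg/g concentration is the 1964 measurement cited in
# the text; whether it still holds for today's seaweed is an assumption.

SEAWEED_IODINE_MG_PER_G = 3.1  # measured iodine concentration, 1964 survey

def iodine_from_seaweed(grams_per_day: float) -> float:
    """Daily iodine intake (mg) from a given daily seaweed intake (g)."""
    return grams_per_day * SEAWEED_IODINE_MG_PER_G

print(iodine_from_seaweed(4.5))   # ~13.95 mg/day (quoted as 13.8 mg in the text)
print(iodine_from_seaweed(14.5))  # ~45 mg/day for present-day consumption

# The "50-fold" comparison in the text: more than 12 mg/day in Japan
# versus an average of 240 micrograms (0.24 mg) a day in the U.S.
print(12.0 / 0.24)  # = 50.0
```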