Saturday, November 30, 2013



Sugary drinks linked to increased endometrial cancer risk

The usual bunk  -- correlational data with social class confounding

Sugar-sweetened beverages have long been associated with a number of health risks – including obesity, diabetes and heart disease.  And now, a new study reveals that sugary drinks may also be associated with a significantly increased risk of a common type of endometrial cancer.

In a study published in the journal Cancer Epidemiology, Biomarkers & Prevention, researchers analyzed data collected from 23,039 postmenopausal women as part of the Iowa Women’s Health Study. The data included information on the women’s dietary intake and medical history.

As part of the study, participants were asked to report their typical consumption of sugar-sweetened beverages – including Coke, Pepsi and other carbonated beverages with sugar – in addition to their consumption of noncarbonated fruit drinks, like Hawaiian Punch or lemonade.

Overall, the researchers discovered that the women who reported the highest intake of sugary drinks had a 78 percent increased risk of developing estrogen-dependent type 1 endometrial cancer – the most common type of endometrial cancer. The more sugary drinks the women consumed, the higher their risk of developing the cancer.

It is estimated that 49,560 women in the United States will be diagnosed with endometrial cancer in 2013, according to the Centers for Disease Control and Prevention. According to the researchers, type 1 endometrial cancer is an estrogen-dependent cancer, which may explain why sugary beverages are linked to an increased risk for the disease.

“We know…that higher consumption of sugar-sweetened beverages may increase body fat, and higher body fat may increase estrogen levels,” study author Maki Inoue-Choi, a research associate at the University of Minnesota School of Public Health in Minneapolis, told FoxNews.com.

While the risk posed by sugar-sweetened beverages may seem substantial, Inoue-Choi said that it’s relatively small compared to the overall risk for endometrial cancer associated with obesity.

“Obese and overweight (women) may have up to 3.5 times higher risk for endometrial cancer,” Inoue-Choi said.

Interestingly, the study didn’t find any correlation between the cancer and increased intake of other commonly criticized foods, such as sugar-free sodas, sweets, baked goods and starches.

“One theory is that sugar from whole foods comes with other nutrients, fat, (and) fiber, so they may slow sugar absorption,” Inoue-Choi said. “But beverages don’t come with these nutrients, so the sugar may be absorbed more quickly.”

While this study is the first to link sugar-sweetened beverages to endometrial cancer, Inoue-Choi emphasizes that more research is still needed to confirm the connection.

“This needs to be replicated in other studies, but everyone should follow the current guidelines to avoid sugar-sweetened beverage intake, because it may increase the risk of other health conditions like obesity, diabetes, heart disease and cancer,” Inoue-Choi said.

SOURCE




Disregard toxic advice on Turkey Day

Toxic chemicals lurk in the “typical” Thanksgiving meal, warns a green activist website. Eat organic, avoid canned food, and you might be okay, according to their advice. Fortunately, there’s no need to buy this line. In fact, the trace levels of man-made chemicals found in these foods warrant no concern and are no different from trace chemicals that appear in food naturally.

The American Council on Science and Health (ACSH) illustrates this reality best with their Holiday Dinner Menu, which outlines all the “toxic” chemicals found naturally in food. The point is, at such low levels, both the man-made and naturally occurring chemicals pose little risk. This year the ACSH puts the issue in perspective, explaining:

    "Toxicologists have confirmed that food naturally contains a myriad of chemicals traditionally thought of as “poisons.” Potatoes contain solanine, arsenic, and chaconine. Lima beans contain hydrogen cyanide, a classic suicide substance. Carrots contain carototoxin, a nerve poison. And nutmeg, black pepper, and carrots all contain the hallucinogenic compound myristicin. Moreover, all chemicals, whether natural or synthetic, are potential toxicants at high doses but are perfectly safe when consumed in low doses.”

Watch ACSH’s video on this topic here.

Nevertheless, green groups continue to demonize man-made chemicals, suggesting that they are somehow different than naturally occurring ones. At the top of the green hit list is the chemical bisphenol A (BPA), which is used to make hard, clear plastics and resins that line food cans. A couple years back, the Breast Cancer Fund issued a report that measured the trace levels of BPA in food. It warned: “An unwelcome visitor may be joining your Thanksgiving feast: bisphenol A. BPA is an estrogenic chemical that lab studies have linked to breast cancer.”

Seriously, if you are worried about chemicals with estrogenic properties, you’d need to avoid many healthy foods, such as beans, nuts, and any soy-based products, which contain naturally occurring hormonally active chemicals. These naturally occurring chemicals are tens of thousands of times more potent than traces of synthetic chemicals in food. And guess what? Even though they are more potent and plentiful than BPA, these chemicals pose little risk as well.

When they hype the risks of BPA, anti-chemical activists never communicate truly useful information about actual BPA risk, which is negligible, according to extensive scientific reviews that numerous government agencies and research bodies around the world have conducted. The U.S. Food and Drug Administration affirmed BPA safety again this past March, stating: “FDA’s current assessment is that BPA is safe at the very low levels that occur in some foods. This assessment is based on review by FDA scientists of hundreds of studies including the latest findings from new studies initiated by the agency.”

The negligible risks of BPA are certainly worth taking given the huge benefit that BPA provides in making long-term safe food storage and distribution possible. Get more information on BPA here.

Levels of BPA, like those of so many trace chemicals — man-made and natural — are simply too low to pose much risk. So enjoy your turkey along with canned green beans and cranberry dressing, and don’t worry!


SOURCE



Thursday, November 28, 2013



More Than 20% of 14-Year-Old Boys Diagnosed With ADHD

Good evidence that a lot of normal behavior is being medicalized

More than 20 percent of the 14-year-old boys in the United States have been diagnosed at some point in their lives with attention-deficit/hyperactivity disorder (ADHD), according to a newly released study by the federal Centers for Disease Control and Prevention.

The study also said more than 20 percent of 11-year-old boys had been diagnosed with ADHD at some point in their lives.

The study indicated that American boys were 125 percent more likely than girls to be diagnosed with ADHD, and that boys were 127 percent more likely than girls to be medicated for it.

Some 13.3 percent of American 11-year-old boys are being medicated for ADHD, said the study.

Overall, the percentage of children from 4 through 17 years of age who have  been diagnosed with ADHD increased 42 percent from 2003 through 2011.

The study also found that children in public health programs (Medicaid and the State Children’s Health Insurance Program) were 53 percent more likely to be diagnosed with ADHD than children with private health insurance.

“The parent-reported prevalence of a history of an attention-deficit/hyperactivity disorder (ADHD) diagnosis by a health care provider among U.S. school-aged children increased from 7.8% in 2003 to 11% in 2011, an increase of 42% in less than a decade,” said the study published by the Journal of the American Academy of Child & Adolescent Psychiatry.

“This study is really based on the parent-reported survey data and it extends what we know about the increasing prevalence of health-care-provider diagnosed ADHD,” said Susanna Visser of the National Center on Birth Defects and Developmental Disabilities in a CDC podcast.

“It highlights the consistent increases in ADHD diagnoses since 2003,” said Visser, who is one of the authors of the study. “Now we also document that there’s been significant increases in the percentage of kids 4-17 years of age who are taking medication for ADHD since 2007.”

ADHD diagnoses, according to the study, are not distributed evenly among the nation’s children.

“Ever-diagnosed ADHD was more common among children with health care coverage than those without coverage, and among those with public coverage than with private coverage,” said the study.

“Nearly 1 in 5 high school boys and 1 in 11 high school girls had been diagnosed with ADHD,” said the study.

“Estimates of medicated ADHD increased in 2011, as compared to 2007, particularly among teen boys. In 2011, the highest medicated ADHD prevalence was among 11-year-old boys (13.3%).”

The study was based on the National Survey of Children’s Health, which has been conducted in three phases, including one in 2003, another in 2007, and a third in 2011. The survey interviewed a random sample of tens of thousands of parents (95,677 in the 2011-2012 phase), asking each parent interviewed about one child in their family. Among the questions asked was whether a doctor or other health provider had ever told the parent the child in question had ADHD, and whether this child is currently taking medication for ADHD.

In the 2003 survey, 7.8 percent of parents said their child had been diagnosed at some point with ADHD. In 2007, 9.5 percent said so; in 2011, 11.0 percent did.

Children who did not have health-care coverage were least likely to have been diagnosed with ADHD, and children on Medicaid and SCHIP were most likely. In the 2011 survey, 6.4 percent of children without health coverage were diagnosed, compared to 9.4 percent with private coverage, and 14.4 percent with Medicaid or SCHIP.

The 14.4 percent of children on Medicaid or SCHIP diagnosed with ADHD was 125 percent more than the 6.4 percent with no health coverage and 53 percent more than the 9.4 percent with private health coverage.

According to the 2011 survey, 15.1 percent of all American boys 4 to 17 years old have been diagnosed at some point with ADHD. That compares with 6.7 percent of girls in that age bracket. Thus, boys are 125 percent more likely than girls to be diagnosed with ADHD.

The 2011 survey also showed that 8.4 percent of American boys who are 4 to 17 years old are currently being medicated for ADHD, while only 3.7 percent of girls are being currently medicated. Thus, boys are 127 percent more likely than girls to be medicated for ADHD.
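
All of these “percent more likely” figures are simple relative increases computed from the survey’s prevalence numbers. A quick sketch of the arithmetic in Python, using only the rounded figures quoted in this article:

    # Relative increase between two prevalence rates, as a percentage.
    def relative_increase(higher, lower):
        return (higher - lower) / lower * 100

    print(relative_increase(15.1, 6.7))   # boys vs girls ever diagnosed: ~125%
    print(relative_increase(8.4, 3.7))    # boys vs girls currently medicated: ~127%
    print(relative_increase(14.4, 6.4))   # Medicaid/SCHIP vs no coverage: ~125%
    print(relative_increase(14.4, 9.4))   # Medicaid/SCHIP vs private coverage: ~53%
    print(relative_increase(11.0, 7.8))   # 2011 vs 2003 prevalence: ~41% with these
                                          # rounded figures; the study reports 42%

Note that these are relative, not absolute, differences: the gap between 14.4 percent and 9.4 percent is five percentage points, which the “53 percent more” framing makes sound larger than it is.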

The study said: “Among boys, the 2003 prevalence of ever diagnosed ADHD … was less than 15%, regardless of age; in 2007, the estimates exceeded 15% for individuals 9 to 17 years of age, with the exception of those 12 years of age (13.6%); in 2011 the estimates exceeded 15% for those 10 to 17 years, and exceeded 20% for those 11 years and 14 years.”

“Nationally, the increases in parent-reported ADHD diagnosis and associated medication treatment occurred during a period in which the Food and Drug Administration (FDA) issued 3 Public Health Alerts and a series of communications regarding cardiac and psychiatric risks of ADHD medications,” said the report.

The CDC describes ADHD symptoms as follows: “A child with ADHD might: daydream a lot, forget or lose things a lot, squirm or fidget, talk too much, make careless mistakes or take unnecessary risks, have a hard time resisting temptation, have trouble taking turns, have difficulty getting along with others.”

“We also have new information from the survey about the age of health-care-provider diagnosed ADHD and overall we estimate that children are diagnosed at an average age of 7, with about half of these children diagnosed by age 6,” said the CDC’s Visser in her podcast.

SOURCE






Australia:  Anti-Vaccination body loses appeal against name change order

Fruitcakes

The Australian Vaccination Network has again been ordered to change its name, after losing an appeal against a ruling that its current name is misleading.

The New South Wales Administrative Decisions Tribunal has upheld a ruling by the state's Fair Trading department that the anti-vaccination group's current name could mislead the public.

The AVN can elect to make a further appeal against the ruling, but Fair Trading Minister Anthony Roberts has warned the organisation risks a hefty legal bill because the department will seek legal costs.

"The AVN must change its name now," Mr Roberts said.  "We're awaiting advice from the AVN as to what they consider an appropriate name would be.

"We reserve the right to reject any names we consider inappropriate, but again my clear message to the Australian Vaccination Network is be open and up-front about what you stand for."

The Australian Medical Association was among those that complained to Fair Trading about the AVN's name.  AMA NSW president Brian Owler says the AVN has a right to exist but not to mislead.

"The State Government should be commended on its efforts to improve the health of children through its support of vaccination and its stand against the anti-vaccination lobby," associate professor Owler said.

"The importance of vaccination cannot be understated in helping to keep children free from harm.  "Ultimately, your family GP is your best source of advice about vaccination."

The AVN has also issued a statement about the decision.

"We believe that the Administrative Decision Tribunal, in finding against the AVN, exemplified the current climate of government-sanctioned abuse and hatred of anyone who steps away from mainstream medical dogma," the statement said.

The statement gives no indication whether the AVN will launch a further appeal against the decision.

The AVN currently has a disclaimer on its website informing visitors of the name change order.

SOURCE




Tuesday, November 26, 2013



How a McDonald's sign makes us MISERABLE: Fast food logos 'stop people enjoying music and art'

This sounds like the respondents just gave the researchers the response that they knew was wanted

Just looking at a McDonald's sign, or any other symbols of our ‘culture of convenience’, makes us sad, according to new research.

Canadian researchers claim being exposed to symbols of fast food and other signs of a disposable society could make the smaller, everyday things in life harder to enjoy.

The study found that people regularly exposed to fast food signs are less likely to savour enjoyable experiences, such as finding pleasure in art and music.

Researchers from the University of Toronto picked the McDonald's symbol to examine, as they claim it has become the ‘ultimate symbol of time efficiency’ in the modern world.

Student Julian House and professors Sanford E. DeVoe and Chen-Bo Zhong, from the university, told Psypost: ‘It is ironic that technologies designed to improve well-being by minimising time spent on mundane chores may ultimately undermine the surplus leisure time they permit.

‘By instigating a sense of impatience, these technologies may prevent people from savouring the enjoyable moments life offers serendipitously.'

The research, published in the journal Social Psychological and Personality Science, found that people who regularly see fast-food symbols are not as likely to savour enjoyment in art and music.

An examination of 280 participants in the U.S. found that people living in neighbourhoods packed with fast-food outlets were less likely to savour enjoyable experiences.

In another experiment, 250 people rated the suitability of five advertising images, three of which were ‘neutral’ and two of which showed McDonald's meals.

Half of the survey participants saw food displayed in the McDonald's packaging, while the other half saw the same food on ceramic plates.

Half of the participants were also shown scenes of natural beauty and all those who took part were asked to rate their happiness.

The scientists found that people who looked at the pictures of natural beauty were happier than those who had not, but the effect was lost on those who had also looked at the McDonald's symbol: they reported being less happy than those who had only looked at the scenery.

Another 122 participants rated the same five images of food, but some also listened to 86 seconds of ‘The Flower Duet’ from the opera Lakmé.

Those who had only listened to the music and seen the neutral food images thought the music had lasted longer than it did, while those who had also seen the McDonald's food in its official packaging reported a less positive emotional response to the piece of music and were less patient.

The researchers believe it is important to understand the influence of advertising symbols as they are so prevalent in our everyday environment.

‘As a ubiquitous symbol of an impatient culture, fast food not only impacts people’s physical health but may also shape their experience of happiness in unexpected ways,’ they said.

However, they stressed the findings only examined a small sample of ‘early pleasures’ and that happiness does not simply rely on the savouring of experiences.

SOURCE






Drinking doesn't make you fat: A startling new book claims that nightly glass of wine won't go straight to the hips

Here’s just some of the evidence. Professor Charles S. Lieber of Harvard University, who died in 2009, was probably the greatest expert on alcohol and health the world has ever seen.

In the Seventies, he founded the first scientific journal on alcohol, and was also the first to establish a link between alcohol and liver disease. So he was no friend of alcohol.

Yet in 1991 he firmly rejected the notion that alcohol has any significant effect on weight.

Lieber, however, was relying mainly on evidence drawn from studies that were looking at alcohol’s other effects. It wasn’t until later that anyone actually decided to examine this conundrum directly.

In the Nineties, researchers at Harvard embarked on a survey of almost 20,000 middle-aged women, whose drinking habits and weight were tracked for almost 13 years.

At the start, the women were all roughly UK dress sizes 8 to 12.

By the end, about 9,000 had put on significant amounts of weight, and some had become clinically obese.

All other things being equal, you’d expect the fatties to be the drinkers. But they weren’t.

In fact, the fatties were the women who didn’t drink, and the skinnies were the heaviest drinkers.

The women who drank five grams of alcohol a day reduced their risk of being overweight by 4 per cent. Those who drank 15 grams (roughly one medium glass of wine) a day reduced their risk of piling on the pounds by 14 per cent.

The figures were even more striking when it came to obesity.

Drinking 30 grams (two medium glasses of wine) a day or more gave the women an incredible 70 per cent reduction in obesity risk.

So it was the non-drinkers who turned into size 18s or more.

In other words, this study showed that alcohol is not only non-fattening, but actually helps prevent weight gain.

A rogue result? Well, this was certainly no tin-pot study.

The researchers made full allowances for obvious lifestyle differences that might have skewed the results, such as exercise, food intake and smoking habits.

Indeed, if the study had been a 13-year trial of a new slimming pill, the drug company involved would have been laughing all the way to the bank.

However, this was just one piece of research.

In the world of science, to stand a chance of anyone believing such startling evidence, the results need to be independently replicated.

Which means other researchers have to find pretty much the same thing.

And they have — in spades. Here are just three of the studies conducted in the past 25 years which demonstrate that alcohol doesn’t cause weight gain:

    A six-year study of 43,500 people by the University of Denmark. Key findings: teetotallers and infrequent drinkers ended up with the biggest waistlines, daily drinkers had the smallest.

    An eight-year study of 49,300 women by University College Medical School, London. Key findings: women who drank below 30 grams a day (around two medium glasses of wine) were up to 24 per cent less likely to put on weight than teetotallers.

    A ten-year study of 7,230 people by the U.S. Centers for Disease Control and Prevention. Key findings: drinkers gained less weight than non-drinkers. Alcohol intake did not increase the risk of obesity.

And there are at least a dozen more studies on alcohol and weight which, by and large, confirm these results. So why isn’t the medical world rejoicing?

Given the obesity epidemic in the Western world, you’d expect doctors to be rushing to prescribe two glasses of wine a day for overweight patients.

Well, science doesn’t quite work like that. Although the data show that people who drank alcohol didn’t put on weight, they don’t actually prove it beyond any doubt.

Yes, the studies made adjustments for other factors — such as a person’s social class, fitness and education — but what if they’d missed something?

On the other hand, it’s highly unlikely that so many studies were wrong. And the methods used are certainly widely accepted as proof when it comes to, say, evaluating new vaccines.

Even so, it can take decades to challenge long-held scientific theories successfully.

So we’re back to where we started: nutritionists remain adamant that because alcohol is high in calories, drinking must therefore cause weight gain. To think otherwise is tantamount to heresy.

Fortunately, in the past ten years, a few nutritionists have had the courage to question this dogma.

One of the simplest studies was done in 1997 by U.S. sports scientists who wanted to find out whether or not drinking a couple of glasses of wine a day causes weight gain.

A total of 14 men were studied for 12 weeks, during which they either drank a third of a bottle of red wine a day for six weeks, then abstained for the next six weeks, or vice-versa.

The result? The addition of two glasses of red wine to the evening meal had no effect on the men’s weight.

But that still didn’t convince sceptical nutritionists.

If taking in extra calories from alcohol doesn’t cause weight gain, they argued, it must mean alcohol somehow makes people eat less.

So in 1999, Swiss physiologists tested 52 people to see if the sceptics were right.

Predictably, they weren’t: alcohol, they found, actually made people want to eat more. What a surprise.

Next, they tested other theories. Was alcohol causing the body to heat up? Was it affecting fat metabolism? Again the answer was no: the team was stumped.

In fact, if you search the literature, you’ll find no one has any explanation for why alcohol calories don’t seem to count.

More HERE


Monday, November 25, 2013



Racist sandwiches?

Did you know that eating or even talking about a peanut butter and jelly sandwich could be considered racist?

That’s right.  Apparently, it’s because people in some cultures don’t eat sandwich bread. Verenice Gutierrez, principal of Harvey Scott K-8 School in Portland, explained in an interview with the Portland Tribune:

“Take the peanut butter sandwich, a seemingly innocent example a teacher used in a lesson last school year,” the Tribune said.

“What about Somali or Hispanic students, who might not eat sandwiches?” Gutierrez asked. “Another way would be to say: ‘Americans eat peanut butter and jelly, do you have anything like that?’ Let them tell you. Maybe they eat torta. Or pita.”

…The Tribune noted that the school started the new year with “intensive staff trainings, frequent staff meetings, classroom observations and other initiatives,” to help educators understand their own “white privilege,” in order to “change their teaching practices to boost minority students’ performance.”

"Last Wednesday, the first day of the school year for staff, for example, the first item of business for teachers at Scott School was to have a Courageous Conversation — to examine a news article and discuss the ‘white privilege’ it conveys,” the Tribune added.

Gutierrez completed a week-long seminar called “Coaching for Educational Equity,” a program the Tribune says focuses “on race and how it affects life.” She also serves on an administrative committee that focuses on systematic racism.

“Our focus school and our Superintendent’s mandate that we improve education for students of color, particularly Black and Brown boys, will provide us with many opportunities to use the protocols of Courageous Conversations in data teams, team meetings, staff meetings, and conversations amongst one another,” she said in a letter to staff.

You can read more about principal Gutierrez’s sandwich-sensitivity philosophy here.

Next time you’re in the bread aisle at the grocery store, you may want to think twice. Sensitive liberal educators are now recommending the “torta” or the “pita” as a more culturally inclusive alternative.

Now that you’ve been made aware of the evil of PB&J, there’s only one question left to answer: Is white bread more racist than whole wheat?

SOURCE






Leading cardiologist says Mexican food is the world's most deadly

Working on conventional but probably wrong assumptions.  Mexican life expectancy is not the best but you would have to factor bullets into that as well as enchiladas

TURN down those tacos. Eschew that enchilada. Quell your craving for quesadillas. A leading Mexican cardiologist says that Mexican food is officially the world's most lethal.

Dr Enrique C. Morales Villegas, director of the Cardiometabolic Research Centre in Aguascalientes, Mexico, made the extraordinary statement overnight that his national cuisine is more dangerous than the sludge served up by fast food chains.  "It's a combination of fried food, junk food and soft drinks," the good doctor said.

With obesity reaching frightening levels in Mexico - 73 per cent of women, 69 per cent of men and 35 per cent of adolescents are overweight - Dr Villegas then proceeded to take a heavy-handed swipe at the lifestyle of his countrymen and women.

"The philosophy of life is around comfort. People eat too much and every day they watch four hours of TV, spend two hours at the computer and do less than 10 minutes of physical activity."

Dr Villegas has proposed that the Mexican government measure the glucose, cholesterol, blood pressure and body mass index (BMI) of all 18-year-olds, with repeat assessment every three years.

"The prevalence of overweight and obesity in Mexico is one of the highest in the world and the problem is increasing in all age groups," he told the Mexican Congress of Cardiology.

"Obesity begins in childhood and persists into adolescence and adulthood."

SOURCE




Sunday, November 24, 2013



Sugary drinks linked to womb cancer: Those who consume the most are 78 per cent more likely to contract killer disease (?)

Soft drinks laden with sugar could raise a woman’s risk of developing womb cancer, claim researchers.  They said those who downed the highest amounts were 78 per cent more likely to suffer from the disease than those who did not.

The disease tends to hit women aged 50-plus and is Britain’s fourth most common female cancer, killing nearly 2,000 a year.

The 14-year study involving almost 25,000 women in their 50s and 60s looked into endometrial cancer, which affects the lining of the womb.  The participants gave detailed data about what they ate and drank, with around half consuming fizzy drinks.

Almost 600 developed endometrial cancer, the most common form of womb cancer. However, there was no link with diet versions of the drinks.
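
To put the headline “78 per cent” in perspective, here is a rough back-of-the-envelope sketch in Python. It assumes – an assumption for illustration, not a figure from the study – that the overall case rate of roughly 600 in 25,000 approximates the baseline risk:

    # Convert the 78% relative increase into approximate absolute risk.
    cases, cohort = 600, 25_000               # figures quoted in the article
    baseline_risk = cases / cohort            # ~2.4% over the 14-year study
    high_intake_risk = baseline_risk * 1.78   # the headline "78 per cent" increase
    print(f"{baseline_risk:.1%} -> {high_intake_risk:.1%}")
    # -> 2.4% -> 4.3%: an absolute rise of about 1.9 percentage points over
    # 14 years, much less dramatic than the relative figure suggests.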

The University of Minnesota researchers said that they couldn’t rule out that women who had lots of sugar-laden drinks had lots of unhealthy habits.

However, they believe the sugar in the soft drinks to be key as it could make the women put on weight. This is important because fat cells make oestrogen, a hormone that is believed to fuel endometrial cancer.

Overweight women also tend to make more insulin, another hormone linked to the disease.

Researcher Dr Maki Inoue-Choi said: ‘Research has documented the contribution of sugar-sweetened beverages to the obesity epidemic.

‘Too much sugar can boost a person’s overall calorie intake and may increase the risk of health conditions such as obesity, diabetes, heart disease and cancer.’

The study, published in the journal Cancer Epidemiology, Biomarkers & Prevention, is the latest in a long line to raise concerns about the health effects of the soft drinks enjoyed by millions of Britons every day.

Previous studies have linked them to a host of health problems, including heart attacks, diabetes, weight gain, brittle bones, pancreatic and prostate cancer, muscle weakness and paralysis.

The soft drinks industry says that its products account for a tiny amount of overall calorie intake.

SOURCE







Alcohol is good for your health: leading science writer claims tipple can prevent cancer and may help improve your sex life

A businessman goes to his GP. ‘My hands hurt, I get a bit of a pain in my chest sometimes, and I’m beginning to forget things,’ he complains.

The doctor examines him and says: ‘You’ve got a touch of arthritis, possibly mild heart disease, and you may be in the first stages of dementia. How much are you drinking?’

‘Never touch a drop, doctor,’ says the patient proudly.  ‘Ah, that explains it,’ says the GP, wagging an admonishing finger. ‘Here’s a prescription for red wine — a quarter of a litre a day.’

Ridiculous? Absurd? A story from the Bumper Book of Jokes for  Alcoholics? Absolutely not.

Let me tell you this: if GPs fail to recommend alcohol to at least some of their patients, they should be had up for medical negligence.

That, at least, is the logical conclusion I’ve drawn from an in-depth study of around half-a-million scientific papers about alcohol.

To my surprise, they contained findings that would make any pharmaceutical company uncork the champagne. Except they never will, of course, because alcohol can’t be patented as a drug.

Most of the evidence suggests that if red wine, in particular — and to a lesser degree white wine, beer, lager and spirits — were used as a preventive and therapeutic medicine, disease rates would fall substantially. Not only that, but lives would be saved — with huge benefits to the economy.

In fact, red wine may well be one of the most effective ‘medications’ in history.

Like other drugs, it has side-effects. It has a minimum and maximum therapeutic dose — take too little and it won’t work; take too much and it may make you ill.

And it has a daily treatment regime: ideally, you should take wine once a day with the evening meal.

Yes, daily. I know that the medical profession urges us to stop drinking at least two or three days a week, but this isn’t borne out by scientific studies. These consistently show that daily moderate drinking is the best for health.

OK, but what is a moderate amount? Unfortunately, that’s where it gets a bit more complicated, because the prevention and treatment of different diseases seem to require differing amounts — varying from a small to a large glass a day, but sometimes more.

Of course, most people already know red wine is supposed to be good for your heart. But, that aside, the endlessly repeated public message is that alcohol is Bad News.

Now, I’d be foolish to deny that over-indulging in booze can be harmful to society. You need only think of alcohol-fuelled crime, road deaths, city centre mayhem, domestic violence, and costs to the NHS.

But to show only one side of the picture, as government and medical authorities inevitably do, is simply bad medicine. It prevents people making sensible decisions about their own health.

Why haven’t doctors ever come clean about all this? The reason is fairly obvious: they don’t trust us.

One shining exception is Professor Karol Sikora, the UK-based consultant oncologist who’s written the foreword to my new book about the benefits of alcohol.

Too much booze, he warns, not only kills but ‘ruins lives, destroys families, ends successful careers, causes untold physical and mental illness and has a huge adverse impact on society’.

However, he continues: ‘If you don’t drink at all, you have a defined risk of developing all sorts of medical problems in your heart, joints, brain, blood sugar levels, and kidneys — indeed all round your body.

‘As you begin to drink, there seems to be evidence of benefit. As you drink more, that gradually disappears and the damaging effects kick in.’

But let’s be clear here: I’m not recommending anything personally. I’m just an averagely intelligent science journalist who’s done what anyone else can if they have the time: I’ve looked at the scientific and medical data published in top-flight journals, and collated the evidence.

So, readers should consult knowledgeable health professionals before acting upon anything they read below. The trouble is, most doctors know very little about this area, because they, like you, have been largely kept in the dark.

More HERE




Friday, November 22, 2013




Women who take the Pill for more than three years have double the risk of eye condition that leads to blindness

The effect described below is larger than the usual rubbish but the finding is still only a correlational one with no obvious causal path.  If estrogen is at fault, women should be going blind all over the place

Women who take the contraceptive pill for more than three years are twice as likely to suffer from an eye condition that can lead to blindness, a new study has found.  The research revealed women who take the pill are more likely to develop glaucoma.

The condition occurs when the drainage tubes within the eyes become blocked, preventing fluid from draining properly and causing pressure to build up, damaging the optic nerve and nerves from the retina.

In England alone around 480,000 people have the most common form of glaucoma, which is most often seen in white Europeans.

The new study is the first to establish an increased risk of glaucoma in women who have used oral contraceptives.

Scientists have now urged gynaecologists and ophthalmologists to tell women of the increased risk and to screen for the condition.

The eye damage caused by glaucoma cannot be reversed but prompt treatment will halt the progression of the condition.

Dr Shan Lin of the University of California in San Francisco said: ‘This study should be an impetus for future research to prove the cause and effect of oral contraceptives and glaucoma.

‘At this point, women who have taken oral contraceptives for three or more years should be screened for glaucoma and followed closely by an ophthalmologist.’

The researchers used data from 2005 to 2008 in the National Health and Nutrition Examination Survey.

This included 3,406 female participants aged 40 years or older from across the U.S. who completed the survey’s vision and reproductive health questionnaire and underwent eye examinations.

It found women who had used oral contraceptives, no matter which kind, for longer than three years were 2.05 times as likely to report that they had also been diagnosed with glaucoma.

Although the study could not establish a direct causative effect of oral contraceptives on the development of glaucoma, it indicates that long-term use of oral contraceptives might be a risk factor for the condition.

Previous studies have shown that oestrogen may play a significant role in the development of glaucoma.

The finding was presented at the 117th Annual Meeting of the American Academy of Ophthalmology in New Orleans.

About one in 50 white Europeans over the age of 40 have glaucoma, as do one in 10 people over the age of 75.

The most common form of glaucoma - chronic open-angle glaucoma - progresses very slowly and often does not cause any noticeable symptoms.

However, over time the patient will start to lose vision from the outer rim of their eye.

The loss of vision will slowly move inwards towards the centre of the eye.
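
Taking the one-in-50 baseline quoted above at face value, here is a rough Python sketch of what “double the risk” would mean in absolute terms. Applying the reported 2.05 ratio directly to the general prevalence is an illustrative simplification, not a result from the study:

    # Illustrative absolute-risk calculation for the glaucoma finding.
    baseline = 1 / 50                    # ~2% of white Europeans over 40
    pill_over_3_years = baseline * 2.05  # the reported association
    print(f"{baseline:.1%} -> {pill_over_3_years:.1%}")
    # -> 2.0% -> 4.1%: about two extra cases per 100 women, and only if the
    # correlational finding were causal, which the study cannot show.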

SOURCE







Can’t get children to eat greens? Blame it on the survival instinct

It is a question that has perplexed parents for decades: why do children refuse to eat greens?  Now, two American academics believe they have the answer.

After studying dozens of toddlers as they played with various objects, the researchers noted that they were far more reluctant to grasp plants than artificial items such as spoons or pipe cleaners.

They believe this is because evolution has biologically programmed children to be wary of flora as it may contain potentially hazardous toxins.

Due to a susceptibility to illness or injury in the early years of life, the body has evolved an inbuilt defence mechanism that limits a child’s contact with plants, they think.

The researchers believe this is why infants in the study were wary of grabbing plants — and why children turn their noses up when faced with a plate of broad beans.

The findings are published in an academic paper by Dr Annie E Wertz and Dr Karen Wynn, both psychologists at Yale University, titled “Thyme to touch: Infants possess strategies that protect them from dangers posed by plants”.

They wrote: “Throughout human evolution … plants have been essential to human existence. Yet, for all of these benefits, plants have always posed very real dangers.

“Plants produce toxins as defences against predators that can be harmful, or even deadly, if ingested. Some plants also employ physical defences, such as fine hairs, thorns, and noxious oils that can damage tissues and cause systemic effects.”

They added: “We predicted that infants may possess behavioural strategies that reduce their exposure to hazards posed by plant defences by minimising their physical contact with plants.”

To test their theory, the academics studied how children aged eight to 18 months reacted when presented with a variety of objects while sitting on their parent’s lap.

In each case they would place six objects, one at a time, in front of the infant while saying “look what I’ve got” and timing how long it took before they clasped the item. Parents were told to keep their eyes shut to minimise their influence. By repeating the same test on almost 100 subjects, the researchers discovered significant variations in the time it took for children to reach out for different items.

On average, it took 3.4 seconds for a child to reach out for shells, 4.6 seconds for lamps and spoons, but almost 10 seconds for parsley and basil plants – twice as long as for many non-plant items.

Objects that were faked to look like plants also triggered a slow response time.

The authors concluded that trial and error across centuries of human existence had taught infants to be intrinsically wary of physical contact with plants.

“We are not suggesting that infants are actively afraid of plants,” Dr Wertz and Dr Wynn concluded.

“Rather, we propose that once infants identify an object as a plant, they deploy a behavioural strategy of inhibited manual exploration, which serves to help protect them from plants’ potential dangers.”

For parents who have been noting this “behavioural strategy” every time they attempt to spoon some peas into the mouth of their child, this finding should come as a welcome relief, they said.

SOURCE



Thursday, November 21, 2013



Weight loss surgery could REVERSE the effects of ageing: Study finds it can protect against diabetes, heart disease and cancer

This is a very small study and telomeres are still poorly understood so the conclusion is speculative

Weight loss surgery may reverse the effects of ageing, new research suggests.  After the surgery some patients’ telomeres – genetic biomarkers that play a role in the ageing of cells – lengthened, suggesting the ageing process had been reversed.

Usually telomeres become shorter as a person gets older.

‘Obesity has an adverse effect on health, causes premature ageing and reduces life expectancy,' said study co-author Dr John Morton, Chief of Bariatric Surgery at Stanford University Medical Centre.

'This is the first study to show that surgical weight loss may be able to reverse the effects.'

He added: ‘If your telomeres get longer, you’re likely to reverse the effects of ageing and have a lower risk of developing a wide range of age-related diseases such as type 2 diabetes, heart and respiratory diseases, and certain types of cancer.’

Researchers at Stanford University studied genetic data from 51 patients before and after gastric bypass surgery.  Three quarters of the patients were women and their average age was 49.

On average, the patients lost 71 per cent of their excess weight after the surgery and saw the level of inflammation in their body drop by more than 60 per cent.  Inflammation is associated with risk of heart disease and diabetes.

They also saw a four-fold decline in their fasting insulin – a marker of type 2 diabetes risk – within 12 months of surgery.

These findings were consistent with the results of previous studies but the researchers went further this time.

They measured the length of each patient’s telomeres before and after their weight loss.

They found that after the gastric bypass, some of the patients’ telomeres became longer.

The benefits were most pronounced in patients who had had high levels of ‘bad cholesterol’ and of inflammation before their surgery.

Just as the tips of shoelaces prevent fraying, telomeres keep chromosomes stable and prevent deterioration when the cells containing them divide.

Researchers say further studies are needed to confirm the direct effects of telomere length on health outcomes.

The research was presented at ObesityWeek 2013, an event hosted by the American Society for Metabolic and Bariatric Surgery and The Obesity Society.

SOURCE






Parents Fined For Not Sending Ritz Crackers In Kids' Lunches

It's quite possible that the single stupidest school lunch policy on the planet comes courtesy of a strange interpretation of the Manitoba Government's Early Learning and Child Care lunch regulations.

Apparently if a child's lunch is deemed "unbalanced", where "balance" refers to ensuring that a lunch conforms to the proportions of food groups as laid out by Canada's awful Food Guide, then that child's lunch is "supplemented", and their parent is fined.

Blog reader Kristen Bartkiw received just such a fine.

She sent her children to daycare with lunches containing leftover homemade roast beef and potatoes, carrots, an orange and some milk.

She did not send along any "grains".

As a consequence the school provided her children with, I kid you not, supplemental Ritz Crackers, and her with a $10 fine.

As Kristen writes, had she sent along lunches consisting of, "microwave Kraft Dinner and a hot dog, a package of fruit twists, a Cheestring, and a juice box" those lunches would have sailed right through this idiocy. But her whole food, homemade lunches? They lacked Ritz Crackers.

So what say you? Have you come across a more inane school lunch policy? Because I sure haven't.

SOURCE




Wednesday, November 20, 2013



Eating a full English breakfast CAN help you lose weight: Protein – not cereal or fruit – is best for preventing hunger pangs

This seems a reasonable study

Eating a full English for breakfast can help you lose weight, a new study suggests.  Research shows that a meal high in protein instead of carbohydrate or fibre for breakfast can fight off hunger and avoid the urge to over-eat later in the day.

A hearty serving of foods like sausage, egg or bacon for the first meal of the day helps to curb hunger throughout the morning and cut the number of calories eaten at lunch time.

An experiment at the University of Missouri on a group of 18 to 55-year-old women showed that a high-protein breakfast kept them fuller longer than a meal with less protein but the same amount of fat and fibre.

The team, led by research scientist Dr Kevin Maki, found eating around 35 grams of protein for breakfast - the equivalent of a four-egg omelette or two sausages and a rasher of bacon - helped regulate appetite.

He said: 'Eating a breakfast rich in protein significantly improves appetite control and may help women to avoid overeating later in the day.'

In the experiment the participants all ate a 300-calorie meal with equal amounts of fat and fibre, although one group had between 30 and 39g of protein in their bowls, another group had less, and a third group was given just a glass of water.

Dr Heather Leidy, an assistant professor specialising in appetite regulation, explained: "In the USA, many people choose to skip breakfast or choose low protein foods because of a lack of convenient high protein choices."

The team tracked the test subjects' hunger throughout the morning, using appetite questionnaires every half an hour to gauge levels of hunger, fullness, and desire to eat before breakfast and up until lunch.

The group who ate a high-protein meal had improved appetite rating scores and ate less of the lunchtime meal of tortellini and sauce than the other groups.

Dr Leidy said: "These results demonstrate that commercially prepared convenient protein-rich meals can help women feel full until lunch time and potentially avoid overeating and improve diet quality."

SOURCE




The FDA's Ill-Conceived Proposal to Ban Trans Fats

We may soon have to say goodbye to many doughnuts, crackers, frozen pizza, coffee creamer and other goodies—whether or not we agree

On Thursday the FDA made the surprise announcement that it would move to ban artificial trans fats, which are found in foods containing partially hydrogenated vegetable oils. The ban would not apply to naturally occurring trans fats, such as those found in meat and dairy products. Adoption of the proposal, which is open to public comment until Jan. 7, 2014, would mean that food producers who want to use partially hydrogenated oils would first have to prove to the FDA the safety of the ingredient.

Considering that the FDA’s announcement this week declared preemptively “that there is no safe level of consumption of artificial trans fat,” the burden of proof for future trans fat use would appear to be quite high.

What does trans fat research say?

Studies on artificial trans fats have found generally that they raise the amount of bad cholesterol (LDL) and lower the amount of good cholesterol (HDL) in the blood. (More here on HDL and LDL generally.)

But a recent meta-analysis, Effect of Animal and Industrial Trans Fatty Acids on HDL and LDL Cholesterol Levels in Humans—A Quantitative Review, published in the open-access, peer-reviewed journal PLOS ONE in 2010, concludes that those negative effects may also be shared by natural trans fats.

The study looked at the results from twenty-nine human studies in which subjects were fed artificial trans fats and six studies in which subjects were fed natural trans fats derived from milk fat. It found the impact of artificial and natural trans fats on HDL and LDL levels to be roughly equivalent.

Like the 2010 PLOS ONE study, a 2005 book by the Institute of Medicine that appears to form much of the basis for the FDA’s action (the agency went so far as to link to it in yesterday’s press release) appears to make no distinction between artificial and natural trans fats.

What’s more, the IOM appears torn over trans fats. On the one hand, it refers to them as “not essential” and says they “provide no known health benefit.” The FDA cites these points, of course. But the IOM also concludes in the same paragraph that “trans fatty acids are unavoidable in ordinary, nonvegan diets.”

How much trans fat do we eat?

Thanks to the fact many food producers have responded to consumer demand and removed trans fats from their foods in recent years, the FDA’s press release noted that “trans fat intake among American consumers has declined from 4.6 grams per day in 2003 to about 1 gram per day in 2012.”

The American Heart Association, meanwhile, suggests Americans consume “less than 2 grams of trans fats a day.” So if the FDA and AHA are correct, then current consumption levels—prior to and without any ban—are well within safe levels. Still, that didn’t stop the AHA from endorsing the FDA’s suggested ban.

And what about natural trans fats? According to the USDA, a pound of ground beef contains more than 8g of trans fat.

SOURCE



Tuesday, November 19, 2013



Could symptoms of autism be ERASED by stimulating the brain? Magnets could boost social skills

This is a very small study over a very short time period so is of tentative significance only

A clinical trial has revealed that magnetic brain stimulation can improve social interaction in people with autism.

The treatment can boost the part of the brain that is underactive in people with autism, the researchers found.

Dr Lindsay Oberman, a neurologist at the Beth Israel Deaconess Medical Centre in Boston who was not involved in the study, told New Scientist that the findings are an ‘excellent start’.

Researchers at Monash University, in Melbourne, drew on previous research that showed the brain’s dorsomedial prefrontal cortex (dmPFC) is less active in people with autism.

They used repetitive transcranial magnetic stimulation (rTMS), a technique which sees magnetic pulses delivered into the brain, to try and boost the activity of the dmPFC.  This is also the part of the brain which allows people to comprehend other people’s feelings and thoughts.

Lead researcher Dr Peter Enticott, and his team, carried out the trial with 28 adults who have Asperger’s syndrome.

They gave some of the volunteers 15 minutes of rTMS every day for 10 days. The other participants were told they might be having the treatment and had coils placed on their heads, but were not given the magnetic pulses.

Before and after the treatment, the volunteers took part in a range of tests of their social skills – the results showed that the patients who had been given rTMS had significantly improved social skills.

The researchers said that in one case, a patient started making tea for her sibling who was revising for an exam.  The researchers say this was a dramatic improvement as it showed the woman understood her sibling’s needs and that she wanted to help.

Dr Enticott told New Scientist: ’As far as her family were concerned, this was completely foreign to her. She had never shown any inclination toward that sort of activity in her life.’

The team also found the volunteers who had been given rTMS experienced reduced anxiety levels.  In contrast, people who had been given placebo therapy did not experience any improvement.

Despite the improvements seen in the participants’ social skills, they showed no improvement in their ability to understand other people’s mental states.

Dr Enticott told New Scientist: ‘It surprises me. We are stimulating the region of the brain that is most closely associated with these tasks.’

Dr Oberman believes this could be because the dmPFC is buried deep within the brain meaning it is hard to target with rTMS.

The researchers say the next step is to establish how long the improvements last.

SOURCE





A Christian perspective on vegetarianism

If one wants to look to someone superior and follow him or her as to what or what not to grub, then I think we should not look to anyone currently schlepping the third rock from the sun; especially the freaks at PETA.

I know for me and my casa, instead of eying a Hollywood skank’s food choice, we’re going to look to God’s special son and what he dined on for our life’s culinary glide path. Call us crazy.

Which brings me to this enquiry: What did Jesus grub on when he donned an earth suit 2000 years ago? Boy, that sure would be interesting to find out, eh?

When it comes to talking about someone being superior in the purest/ultimate sense of the word, well … Jesus leaves us all in the dust, right? Seeing that he never sinned in word or deed, which includes what he ate and drank, I wonder what he banqueted on when he sat down at Denny’s way back in the day?

Did he order barbecued aubergine on lentil polenta patties with a black bean sauce? Or was it a “Best in Class” Tomato Tart? I know, it was Chinese tea eggs with a noodle salad. Dee-Lish! No, wait. It had to be the sassy and crunchy Red Lentil Kofte.

How can we ever know what he masticated upon? Oh, wait. Maybe the Bible documented what he ate? Let us now turn to the holy text to see shall we?

According to the New Testament, we know Jesus definitely dug on fish because huge chunks of the Gospel are taken up with him both eating and supernaturally aiding and abetting the mass slaughter of thousands of fish (see Lk.5:1-10; Mk.6:35-44; Jn.6:1-14).

Matter of fact, in John’s recounting of the proliferation of fish fillets, that act amounted to a divine attestation to those lucky enough to watch that phenomenon go down. Indeed, that little ditty showed that audience that Jesus was no normal Nazarene. They sure as heck didn’t see it as a sin. Oh, no. On the contrary, they saw that paranormal provision of meat as proof that Jesus was sent from Jehovah. Boom.

I know, some folks are probably thinking, “Okay, smart-ass. Jesus enabled his disciples to slay and eat fish. Big whoop. A lot of vegetarians are cool with eating fish. However, the referenced passages said squat about Jesus eating actual meat, meat.”

You’re right. You got me. It said fish and not beef. I’m so busted.

I still don’t get how the anti-hunters and anti-meat eaters decide that slaughtering a pretty fish and grubbing it is okay while shooting, killing and eating a four-legged animal is not; but … whatever.

I guess I lose. You guys win. I’m such a dork.

Oh, wait. I forgot about the Passover. You remember, the famous Last Supper in Mark 14:12-18, don’t you?

The Passover, to those not hip to it, is a holy event Jesus partook in that entailed the slaughter of a lamb. Not only did they kill that little rascal, they also grilled it and ate it.

Please note in Mk.14:14 Jesus said he’s going to eat it, the lamb, with his disciples and again in v.18 Jesus said the one who betrays me is eating with me.

It’s pretty clear to this C-student that Jesus not only was cool with consuming fish, but he also was down with meat consumption; so much so that it was his last meal. Yep, before his execution Jesus loved him some tasty lamb chops, thereby blowing all to hell any vegan attempt to use Jesus as an example of vegetable-quiche-only eating.

One more note of importance. Y’know, just to rub it in that Jesus was no vegan: In his post-resurrection, pre-ascension person, Jesus proved to his doubting disciples in John 21 that he was the risen Christ, not by hugging them and singing to them “Friends are Friends Forever When the Lord’s the Lord of Them”, but by allowing them to catch 153 humongous fish and then personally grilling the fish and eating them with his boys.

Check it out: the catching and killing of another mind-blowing haul of fish, and then cooking and consuming their succulent flesh, was all the proof his disciples needed to know that that man on the beach was no casual clod, but the risen Son of God.

Oh, and by the way, probably one of the most amazing stories of redemption that Jesus ever parlayed on the public’s noggin was the parable of the prodigal son.

In this familiar narrative, we have a rich look inside the father’s heart towards lost causes who’re squandering their lives away on vice. Please note, fair reader, in one of the most stunning stories of redemption, reconciliation and restoration there’s a big ol’ Texas BBQ in the big middle of it. Yep, the father showed his joy, love and acceptance by giving his repentant boy a nice ring, a wardrobe upgrade, some killer sandals and then ordering his posse to kill and grill their fattened calf.

It’s pretty telling what the Bible thinks about eating meat that one of its most touching tales of salvation entails the father (God) ordering a massive meat platter to celebrate this son who was once lost, but now is found.

And lastly, for the coup de grace – the end-all to any attempt to cast Jesus as an anti-meat-killing/eating hippie – I offer you Mark 7:14-19…

"Again Jesus called the crowd to him and said, “Listen to me, everyone, and understand this. Nothing outside a person can defile them by going into them. Rather, it is what comes out of a person that defiles them. After he had left the crowd and entered the house, his disciples asked him about this parable. “Are you so dull?” he asked. “Don’t you see that nothing that enters a person from the outside can defile them? For it doesn’t go into their heart but into their stomach, and then out of the body.” (In saying this, Jesus declared all foods clean.)

So, in conclusion, I guess Jesus is okay with eating fish and meat and, as stated above, I’d rather follow his lead than a screechin’ vegan any old day.

SOURCE




Monday, November 18, 2013


Buffaloberry: Next Commercial ‘Super Fruit,’ Scientists Say

Zero evidence of clinical utility presented.  Just discredited theory -- JR

Buffaloberry is a deciduous, thorny, thicket-forming shrub growing up to 6 m tall. It is a member of the oleaster family (Elaeagnaceae), native to western North America, and is found on many Indian reservations, often where little else grows well.

The bright red fruit has a tart flavor, and has historically been used as a source of nutrients for many Native Americans.

The sugar and acidity of the fruit make it desirable as a fresh or dried product. In addition to its potential health benefits, the fruit’s red pigment, methyl-lycopenoate (see below), may also be used as a natural food colorant.

Recently the buffaloberry has drawn attention from several commercial wine producers.

In the new study, lead author Dr Ken Riedl and his colleagues from the Ohio State University found that buffaloberries contain large amounts of lycopene and a related acidic compound, methyl-lycopenoate, which are important antioxidants and nutrients beneficial for human health.

“These plants produce fruits rich in carotenoid and phenolic antioxidants, which may have health benefits that may make buffaloberry commercially valuable,” the scientists wrote in the paper.

“This species is adapted to poor soils and can tolerate drier climates. In the Dakotas, buffaloberry flourishes on the American Indian Tribal Reservations, yielding copious amounts of health-beneficial fruit for fresh and processing markets, making it a potentially valuable new crop for marginal lands.”

SOURCE






Don’t Give More Patients Statins

Wow! The tide is turning.  Below is from the NYT

ON Tuesday, the American Heart Association and the American College of Cardiology issued new cholesterol guidelines that essentially declared, in one fell swoop, that millions of healthy Americans should immediately start taking pills — namely statins — for undefined health “benefits.”

This announcement is not a result of a sudden epidemic of heart disease, nor is it based on new data showing the benefits of lower cholesterol. Instead, it is a consequence of simply expanding the definition of who should take the drugs — a decision that will benefit the pharmaceutical industry more than anyone else.

The new guidelines, among other things, now recommend statins for people with a lower risk of heart disease (a 7.5 percent risk over the next 10 years, compared with the previous guidelines’ 10 to 20 percent risk), and for people with a risk of stroke. In addition, they eliminate the earlier criteria that a patient’s “bad cholesterol,” or LDL, be at or above a certain level. Although statins are no longer recommended for the small group of patients who were on the drugs only to lower their bad cholesterol, eliminating the LDL criteria will mean a vast increase in prescriptions over all. According to our calculations, it will increase the number of healthy people for whom statins are recommended by nearly 70 percent.

This may sound like good news for patients, and it would be — if statins actually offered meaningful protection from our No. 1 killer, heart disease; if they helped people live longer or better; and if they had minimal adverse side effects. However, none of these are the case.

Statins are effective for people with known heart disease. But for people who have less than a 20 percent risk of getting heart disease in the next 10 years, statins not only fail to reduce the risk of death, but also fail even to reduce the risk of serious illness — as shown in a recent BMJ article co-written by one of us. That article shows that, based on the same data the new guidelines rely on, 140 people in this risk group would need to be treated with statins in order to prevent a single heart attack or stroke, without any overall reduction in death or serious illness.

At the same time, 18 percent or more of this group would experience side effects, including muscle pain or weakness, decreased cognitive function, increased risk of diabetes (especially for women), cataracts or sexual dysfunction.
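
For readers who want to see the arithmetic behind those two figures, here is a minimal sketch. The 140 and 18 per cent values are taken from the paragraphs above; the formula NNT = 1/ARR, relating the number needed to treat to the absolute risk reduction, is standard.

    # Relating the number needed to treat (NNT) to absolute risk
    # reduction (ARR) via the standard formula NNT = 1 / ARR.
    # The figures come from the passage above; purely illustrative.
    nnt = 140                 # treated to prevent one heart attack or stroke
    arr = 1 / nnt             # implied absolute risk reduction over the period
    side_effect_rate = 0.18   # "18 percent or more" experience side effects

    print(f"Implied absolute risk reduction: {arr:.2%}")       # ~0.71%
    print(f"Per 140 treated: 1 event prevented, "
          f"~{side_effect_rate * nnt:.0f} with side effects")  # ~25

In other words, on these numbers roughly 25 of every 140 low-risk people treated would experience a side effect for each single event prevented.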

Perhaps more dangerous, statins provide false reassurances that may discourage patients from taking the steps that actually reduce cardiovascular disease. According to the World Health Organization, 80 percent of cardiovascular disease is caused by smoking, lack of exercise, an unhealthy diet, and other lifestyle factors. Statins give the illusion of protection to many people, who would be much better served, for example, by simply walking an extra 10 minutes per day.

Aside from these concerns, we have more reasons to be wary about the data behind this expansion of drug therapy.

When the last guidelines were issued by the National Heart, Lung, and Blood Institute in 2001, they nearly tripled the number of Americans for whom cholesterol-lowering drug therapy was recommended — from 13 million to 36 million. These guidelines were reportedly based strictly on results from clinical trials. But this was contradicted by the data described in the document itself.

For example, even though the guidelines recommended that women between the ages of 45 and 75 at increased risk of heart disease and with relatively high LDL levels take statins, the fine print in the 284-page document admitted, “Clinical trials of LDL lowering generally are lacking for this risk category.” The general lack of evidence for LDL level targets is why they have been dropped from the current guidelines. In fact, committee members noted that cholesterol lowered by drugs may not have the same effect as cholesterol lowered by nondrug methods, such as diet, exercise and being lucky enough to have good genes.

The process by which these latest guidelines were developed gives rise to further skepticism. The group that wrote the recommendations was not sufficiently free of conflicts of interest; several of the experts on the panel have recent or current financial ties to drug makers. In addition, both the American Heart Association and the American College of Cardiology, while nonprofit entities, are heavily supported by drug companies.

The American people deserve to have important medical guidelines developed by doctors and scientists on whom they can confidently rely to make judgments free from influence, conscious or unconscious, by the industries that stand to gain or lose.

We believe that the new guidelines are not adequately supported by objective data, and that statins should not be recommended for this vastly expanded class of healthy Americans. Instead of converting millions of people into statin customers, we should be focusing on the real factors that undeniably reduce the risk of heart disease: healthy diets, exercise and avoiding smoking. Patients should be skeptical about the guidelines, and have a meaningful dialogue with their doctors about statins, including what the evidence does and does not show, before deciding what is best for them.

SOURCE




Sunday, November 17, 2013


Fat is good for you

"About TURN!" -- as they say in the army. The "bad fat" gospel is now increasingly being challenged.  And below is one such challenge.  I don't believe either gospel.  "Eat what you like" is my gospel.  Human beings are omnivores.  They can thrive on a wide variety of diets.

It’s tempting to call David Perlmutter’s dietary advice radical. The neurologist and president of the Perlmutter Health Center in Naples, Fla., believes all carbs, including highly touted whole grains, are devastating to our brains. He claims we must make major changes in our eating habits as a society to ward off terrifying increases in Alzheimer’s disease and dementia rates.


And yet Perlmutter argues that his recommendations are not radical at all. In fact, he says, his suggested menu adheres more closely to the way mankind has eaten for most of human history.

What’s deviant, he insists, is our modern diet. Dementia, chronic headaches, depression, epilepsy and other contemporary scourges are not in our genes, he claims. “It’s in the food you eat,” Perlmutter writes in his bestselling new book, Grain Brain: The Surprising Truth About Wheat, Carbs, and Sugar – Your Brain’s Silent Killers. “The origin of brain disease is in many cases predominantly dietary.”

How We Got Here

Perlmutter’s book is propelled by a growing body of research indicating that Alzheimer’s disease may really be a third type of diabetes, a discovery that highlights the close relationship between lifestyle and dementia. It also reveals a potential opening to successfully warding off debilitating brain disease through dietary changes.

Perlmutter says we need to return to the eating habits of early man, a diet generally thought to be about 75% fat and 5% carbs. The average U.S. diet today features about 60% carbs and 20% fat. (A 20% share of dietary protein has remained fairly consistent, experts believe.)
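
To put those percentage splits in concrete terms, here is a small illustrative calculation. The 2,000-calorie reference intake and the usual energy densities (fat 9 kcal/g, carbohydrate and protein 4 kcal/g) are my assumptions, not figures from the article:

    # Translate the quoted macronutrient splits into grams per day.
    # Assumptions (not from the article): a 2,000 kcal diet and the usual
    # energy densities -- fat 9 kcal/g, carbs and protein 4 kcal/g.
    KCAL = 2000
    KCAL_PER_GRAM = {"fat": 9, "carbs": 4, "protein": 4}

    diets = {
        "early man (per Perlmutter)": {"fat": 0.75, "carbs": 0.05, "protein": 0.20},
        "average US today": {"fat": 0.20, "carbs": 0.60, "protein": 0.20},
    }

    for name, split in diets.items():
        grams = {m: round(KCAL * share / KCAL_PER_GRAM[m])
                 for m, share in split.items()}
        print(name, grams)
    # early man:  {'fat': 167, 'carbs': 25, 'protein': 100}
    # average US: {'fat': 44, 'carbs': 300, 'protein': 100}

On those assumptions, the claimed ancestral diet works out to roughly 167 g of fat and only 25 g of carbohydrate a day, against about 44 g of fat and 300 g of carbohydrate in the average modern American diet.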

Some in the nutrition and medical communities take issue with Perlmutter’s premise and prescription. Several critics, while not questioning the neurological risks of a high-carb diet, have pointed out that readers may interpret his book as a green light to load up on meat and dairy instead, a choice that has its own well-documented cardiovascular risks.

“Perlmutter uses bits and pieces of the effects of diet on cognitive outcomes — that obese people have a higher risk of cognitive impairment, for example — to construct an ultimately misleading picture of what people should eat for optimal cognitive and overall health,” St. Catherine University professor emerita Julie Miller Jones, Ph.D., told the website FoodNavigator-USA.

Grain Brain does delve deeply into the negative neurological effects of dietary sugar. “The food we eat goes beyond its macronutrients of carbohydrates, fat and protein,” Perlmutter said in a recent interview with Next Avenue. “It’s information. It interacts with and instructs our genome with every mouthful, changing genetic expression.”

Human genes, he says, have evolved over thousands of years to accommodate a high-fat, low-carb diet. But today we feed our bodies almost the opposite, with seemingly major effects on our brains. A Mayo Clinic study published earlier this year in the Journal of Alzheimer’s Disease found that people 70 and older with a high-carbohydrate diet face a risk of developing mild cognitive impairment 3.6 times higher than those who follow low-carb regimens. Those with the diets highest in sugar did not fare much better. However, subjects with the diets highest in fat were 42% less likely to face cognitive impairment than the participants whose diets were lowest in fat.

Further research published in the New England Journal of Medicine in August showed that people with even mildly elevated levels of blood sugar — too low to register as a Type 2 diabetes risk — still had a significantly higher risk of developing dementia.

“This low-fat idea that’s been drummed into our heads and bellies,” Perlmutter says, “is completely off-base and deeply responsible for most of our modern ills.”

Turning to Nutrition, Not Pills

This fall, the federal government committed $33.2 million to testing a drug designed to prevent Alzheimer’s in healthy people with elevated risk factors for the disease, but “the idea of lifestyle modification for Alzheimer’s has been with us for years,” Perlmutter says, and it’s cost-free.

The author hopes his book and other related media on the diet-dementia connection will inspire more people to change the way they eat. “Dementia is our most-feared illness, more than heart disease or cancer,” Perlmutter says. “When you let Type 2 diabetics know they’re doubling their risk for Alzheimer’s disease, they suddenly open their eyes and take notice.

“People are getting to this place of understanding that their lifestyle choices actually do matter a whole lot,” he says, “as opposed to this notion that you live your life come what may and hope for a pill.”

As we learn more about the brain’s ability to maintain or even gain strength as we age, Perlmutter believes, diet overhauls could become all the more valuable.

“Lifestyle changes can have profound effects later in life,” he says. “I’m watching people who’d already started to forget why they walked into a room change and reverse this. We have this incredible ability to grow back new brain cells. The brain can regenerate itself, if we give it what it needs.”

What it needs most of all, Perlmutter says, is “wonderful fat.” There’s no room in anyone’s diet for modified fats or trans fats, he says, but a diet rich in extra-virgin olive oil, grass-fed beef and wild fish provides “life-sustaining fat that modern American diets are so desperate for.”

Too few of us understand there’s “a big difference between eating fat and being fat,” he says. People who eat more fat tend to consume fewer carbs. As a result, they produce less insulin and store less fat in their bodies.

Change We Ought to Believe In

Changing minds, however, is an uphill climb. “The idea that grains are good for you seems to get so much play,” he says. “But grains are categorically not good for you,” not even whole grains.

“We like to think a whole-grain bagel and orange juice makes for the perfect breakfast,” Perlmutter continues. “But that bagel has 400 calories, almost completely carbohydrates with gluten. And the hidden source of carbs in this picture is that 12-ounce glass of fresh-squeezed orange juice. It has nine full teaspoons of pure sugar, the same as a can of Coke. It’s doing a service with Vitamin C, but you’ve already gotten 72 grams of carbs.

“It’s time to relearn,” he says. “You can have vegetables at breakfast – the world won’t come to an end. You can have smoked salmon, free-range eggs with olive oil and organic goat cheese and you’re ready for the day. And you’re not having a high-carb breakfast that can cause you to bang on a vending machine at 10 a.m. because your blood sugar is plummeting and your brain isn’t working.”
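
As a rough check on the juice arithmetic in that quote: a level teaspoon of sugar is about 4.2 g, so nine teaspoons is roughly 38 g, which does line up with the widely cited figure of about 39 g of sugar in a 12-ounce can of Coke. The teaspoon conversion below is my assumption, not the book's:

    # Rough check on the orange-juice sugar figure quoted above.
    # Assumption (not from the article): 1 level teaspoon of sugar ~= 4.2 g.
    G_PER_TSP = 4.2
    juice_sugar_g = 9 * G_PER_TSP   # "nine full teaspoons"
    coke_sugar_g = 39               # widely cited for a 12 oz can

    print(f"9 teaspoons ~= {juice_sugar_g:.0f} g of sugar")    # ~38 g
    print(f"12 oz can of Coke: ~{coke_sugar_g} g -- about the same")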

Changing one’s diet is a challenge, he acknowledges. Giving up the gluten found in most carbs makes it even tougher. “The exact parts of the brain that allow people to become addicted to narcotics are stimulated by gluten,” Perlmutter points out. “People absolutely go through withdrawal from gluten. It takes a couple of weeks.”

But the change is worth making, he says, at any age.

“Nutrition matters,” Perlmutter says. “The brain is more responsive to diet and lifestyle than any other part of the body and until now it’s been virtually ignored. We load up on medications when our mood is off, we hope for an Alzheimer’s disease pill when we get older. I submit that we need to take a step back and ask, ‘Is this really how we want to treat ourselves?’”

SOURCE





Wider use of statins 'disturbing'

Wider use of statins will have minimal benefit and could needlessly expose thousands to severe side effects, doctors warn -- following change in US prescription guidelines

Prescribing statins to millions more healthy people would make only a minimal difference to their average lifespan but risk exposing thousands to harmful side effects, a leading doctor has claimed.

Dr Aseem Malhotra, a cardiology specialist registrar at Croydon University Hospital, said he would be "disturbed" if Britain followed America in changing prescription guidelines to widen use of statins.

There is "no doubt" that the cholesterol-lowering drugs reduce the likelihood of heart attacks and strokes in people with heart disease, he said, but the potential benefits of medicating millions more who are at low risk could be dramatically outweighed by the associated harms.

Side effects experienced by up to one in five patients include severe muscle aches, memory disturbance, sexual dysfunction, cataracts and diabetes.

New US guidelines on statins, issued on Tuesday by the American College of Cardiology and the American Heart Association, recommend that doctors should consider prescribing the drugs to all people with at least a 7.5 per cent risk of suffering a heart attack or stroke within the next decade.

US experts who drafted the new guidance said doctors had been "undertreating" patients and that the new advice would mean "more people who would benefit from statins are going to be on them".

But the guidelines have also raised concerns among doctors in America, and in Britain where current advice that statins should be prescribed to those with a 20 per cent risk over 10 years is under review.

The National Institute for Health and Care Excellence has confirmed that the same recent clinical evidence which prompted the change in US policy will form part of its own decision, and experts believe the threshold could be lowered.

Dr Malhotra said: "I think it is very possible that this will also happen in Britain.

"One thing we have learned in the past decade is the considerable influence of a very financially powerful pharmaceutical industry over prescribing and modern medicine, and the trends suggest that this influence will have the same kind of effect over in the UK [as in America]."

Statins, which cost the NHS less than 10p per day, have become the most widely prescribed drugs in Britain and are currently used by an estimated six million people.

Some experts have claimed that all over-50s should take the drugs routinely to lower their levels of "bad" LDL cholesterol and protect against heart attacks and strokes.

Dr David Wald, a cardiologist at Queen Mary, University of London, said on Wednesday it would be “sensible” to lower the threshold on eligibility, which would be “heading towards the point where statins may eventually be offered to everyone once they reach a certain age of around 55.”

But a recent analysis published in the British Medical Journal found that even patients with a 20 per cent risk of a heart attack or stroke who were over the age of 50 may not benefit from the drugs.

"This expansion of use of statins is not good for public health," Dr Malhotra said. "There is no doubt that for people with established heart disease the benefits outweigh the risks, but for people who do not have established heart disease this isn't the case ... I would be very disturbed if the UK were to follow suit."

Writing in the New York Times, Dr John D Abramson, who co-wrote the BMJ review, and Dr Rita F Redberg said wider use of statins "will benefit the pharmaceutical industry more than anyone else".

"For people who have less than a 20 per cent risk of getting heart disease in the next 10 years, statins not only fail to reduce the risk of death, but also fail even to reduce the risk of serious illness," they said.

"Instead of converting millions of people into statin customers, we should be focusing on the real factors that undeniably reduce the risk of heart disease: healthy diets, exercise and avoiding smoking."

SOURCE

Friday, November 15, 2013




Women who eat nuts 'less likely to develop pancreatic cancer'

The usual rubbish.  Middle class people are more likely to eat a "correct" diet  -- in which nuts are praised  -- and they are healthier anyway

Women who snack on brazil nuts, cashews, pecans and other popular varieties are less likely to develop pancreatic cancer, a study has suggested.  Researchers found a handful of nuts twice a week is enough to significantly reduce the risk of contracting the disease.

Pancreatic cancer is one of the most deadly forms of cancer in western countries, with smoking and obesity thought to be the two leading causes.

But the disease is also associated with diabetes mellitus, which can be inhibited by tree nuts – the family that also includes almonds, pistachios and walnuts.

Tree nuts contain a range of vitamins, minerals and phytochemicals, the compounds which give plants their colour, smell and other physical qualities.

The results, published by the British Journal of Cancer, come from a detailed study of more than 75,000 women in the US, none of whom had any history of the disease.

Researcher Dr Ying Bao of Harvard Medical School and Brigham and Women's Hospital in Boston looked at the connection between pancreatic cancer and nut consumption.

Nuts are calorie-dense, and excess weight is a factor in many forms of cancer, but the study showed women who consumed the most nuts tended to be slimmer than average.

The report does not say if this is down to other factors of their lifestyle – such as exercise – or eating nuts instead of less healthy, calorie-laden snacks such as chocolate.

But Dr Bao said: "Women who consumed a one-ounce serving of nuts two or more times per week had a significantly reduced risk of pancreatic cancer compared to those who largely abstained from nuts.

"This reduction in risk was independent of established or suspected risk factors for pancreatic cancer including age, height, obesity, physical activity, smoking, diabetes and dietary factors."

Dr Bao added: "In our cohort women who consumed the most nuts tended to weigh less."

Nuts have been shown to have several health benefits in recent years.

Almonds, walnuts and cashews were found to reduce the risk of heart attacks by more than 10 per cent in a large European study looking into the links between diet, cancer and heart disease.

Just two eight-gram servings of nuts a week – enough to cover a small plate – were enough to decrease the risk, according to the study presented to the World Congress of Cardiology.

Moreover, a 2012 study found pregnant women who eat nuts are less likely to have children who develop allergies.

Researchers at the Statens Serum Institute in Copenhagen discovered children of women who ate peanuts and other nuts during pregnancy were a third less likely to suffer from asthma by the age of seven, compared to those whose mothers avoided them.

The report contradicted a series of studies in the 1980s that purported to find evidence of a link between nut-eating during pregnancy and allergic response.

Colin Michie, chairman of nutrition at the Royal College of Paediatrics and Child Health, said the results revealed previous studies to be weak.

SOURCE






Damp, mouldy rooms increase risk of Parkinson's

If you are a fruit fly.  Mould can be quite harmful to people, so a study of the theory in a human sample should be done without delay

A damp, mouldy house has long been known to trigger asthma and allergies - and new research has now linked it to an increased risk of Parkinson's disease.

Scientists found a compound emitted by mould can be linked to the development of the neurological condition.

The U.S. researchers found a connection between the compound given off by mould and mildew – a vapour known as 'mushroom alcohol' – and the malfunctioning of two genes associated with the brain chemical dopamine.

Dopamine is lost in patients with Parkinson's disease.

While the condition has previously been linked to exposure to toxins, these were man-made rather than natural chemicals, said researcher Dr Arati Inamdar, from Rutgers University.

The idea for the research came from the study's co-author Dr Joan Bennett, also from Rutgers University, who became interested in the health effects of living in a damp building after her house was flooded when Hurricane Katrina struck New Orleans in 2005.

After her house flooded it became riddled with moulds and Dr Bennett started to collect samples wearing protective clothing.

She said: ‘I felt horrible – headaches, dizziness, nausea. I knew something about “sick building syndrome” but until then I didn’t believe in it.

‘I didn’t think it would be possible to breathe in enough mould spores to get sick.’

As a result of this experience, Dr Bennett decided to investigate the connection between moulds and the symptoms she had experienced.

She and Dr Inamdar discovered that the volatile organic compound 1-octen-3-ol, otherwise known as mushroom alcohol, can cause movement disorders in fruit flies.

They also found that it attacks the two genes that deal with dopamine, causing the neurones to degenerate and producing Parkinson’s-like symptoms.

Parkinson's disease occurs when the nerve cells in the brain that make dopamine are slowly destroyed. Without dopamine, the nerve cells in that part of the brain cannot properly send messages.

Studies indicate that the condition - a progressive disease of the nervous system marked by tremor, muscular rigidity and slow, imprecise movement - is increasing in rural areas, where it’s usually attributed to pesticide exposure.

But rural environments also have a lot of mould and mushroom exposure.

‘Our work suggests that 1-octen-3-ol might also be connected to the disease, particularly for people with a genetic susceptibility to it,’ Dr Inamdar said. ‘We’ve given the epidemiologists some new avenues to explore.’

Claire Bale, Research Communications Manager at Parkinson’s UK, said: ‘Understanding what causes Parkinson’s remains one of the big unanswered questions for researchers today.

‘We already know that exposure to some chemicals can slightly increase the risk of Parkinson's, and this is the first study to suggest that chemicals produced by fungi may play a part in what causes the condition to develop.

‘It is important to remember this study was conducted using tiny fruit flies, so before we can really be confident about this new connection we need to see evidence from studies in people.

‘Whilst exposure to chemicals produced by fungi – and possibly other chemicals – may play a role in Parkinson's in some people, it's likely just a small part of a much bigger puzzle and we wouldn’t want people to worry unnecessarily about developing the condition if they found mould or fungi in their homes.

‘We still don't know exactly what causes Parkinson's – for most people it's likely to be a combination of natural ageing, genetic susceptibility, lifestyle and environmental factors.’

The findings, which were produced with help from researchers at Emory University, were published online in the Proceedings of the National Academy of Sciences and the study was funded by Rutgers University and the National Institutes of Health.

SOURCE



Thursday, November 14, 2013


The saturated fat myth has had its day

For more than 20 years, there have been one or two medical commentators in newspapers, such as the Telegraph's great James Le Fanu, who have rejected the cholesterol theory of heart disease. Dr Le Fanu has always maintained that (most) people should stick to boiled eggs and buttered soldiers for breakfast and avoid margarine as if their lives depended on it. But the mainstream view for 40 years, as dished out to the public in health campaigns and via the NHS, has been: cut down saturated fat to lower your risk of cardiovascular disease. That has meant that butter, full-fat milk and cheese were ruthlessly demonised, while oil-based spreads and low-fat products flew off the shelves.

But this is changing – and if you doubt it, consider that we have a leading young cardiologist, Aseem Malhotra, writing in the British Medical Journal today, saying quite plainly: "If you have a choice between butter and margarine, have the butter every time."

The evidence is pretty persuasive. The essence of it is: recent studies have failed to support any significant association between saturated fat intake and cardiovascular risk. Indeed, saturated fat has actually been found to be protective – though it should come from natural sources, not processed food where it's mixed with harmful chemical additives. That is to say, steak is all right, but an Ulster Fry – sausage slice, black pudding, bacon etc – should be for special occasions only.

What has happened over the past four decades is that public health campaigns have relentlessly hammered home the "cut fat" message. The general public has responded, egged on by the food companies flogging "low-fat" processed snacks and meals. The trouble is, fat makes food palatable, and if you take it out, the food tastes bland.

Here is where the industry stepped in – replacing all that fat with sugar to provide a substitute kick. And did obesity decline in this period? It did not. In fact, over the past 30 years in the US, as the proportion of energy derived from fat in the diet fell from 40 per cent to 30 per cent, obesity rocketed. And there's increasing evidence that sugar could be an independent risk factor for the metabolic syndrome that was discussed in a blog the other day.

All this is in Malhotra's report and I recommend reading it in full. It has implications for me because like thousands of others I take a daily statin – a tiny dose of atorvastatin in my case. I was prescribed it for raised cholesterol and some family history of blocked arteries and heart problems. I'm now seriously considering ditching the little white pill and carefully following Malhotra's advice on diet instead. He says consuming a Mediterranean mixture of foods – fresh vegetables, olive oil, complex carbohydrates – is "almost three times as powerful in reducing mortality as taking a statin", citing the PREDIMED study.

He does seem to accept that statins have an important role in treating patients who have already had heart attacks, but notes that "the benefits of statins are independent of their effect on cholesterol". I still have a feeling that statins are a bit of a wonder drug. But it seems that, as with aspirin, they exert their beneficial effect in mysterious ways not yet fully understood.

The saturated fat/cholesterol theory of heart disease resembles the serotonin model of depression in that it's a gross simplification of complex and poorly understood mechanisms. But at least Malhotra's message is clear, and it should be listened to: the saturated fat myth has had its day.

SOURCE




   

Organic ISN'T better than factory farmed: Why caged chickens have 'less stressed' lives than free-range



For consumers with a conscience, they are top of the shopping list.  But it seems free-range eggs may not be all they are cracked up to be. According to researchers, hens kept indoors in cages lead happier lives.

A study found birds raised in ‘enriched cages’ enjoyed lower levels of stress and mortality, and were less likely to suffer from bone fractures or pecking than free-range chickens.

Demand for free-range eggs, which cost almost double the caged equivalent, has risen steeply as a result of high-profile campaigns.

But Professor Christine Nicol, who led the research at the University of Bristol, said although free-range farms had the potential to offer birds a better quality of life than their caged counterparts, many had poor welfare standards.

As a result, shoppers concerned about the well-being of birds should opt for eggs from caged hens, or those from free-range flocks that are part of a farm assurance scheme.

‘Caged hens are more comfortable than people think and have higher welfare as standard than free-range hens,’ she said.

‘It would be nice to think the current free-range system gave the birds the best welfare, but the problem is that the management of free-range systems in the UK is so variable. Although you get some brilliant farms, you get some that are really not good.’

Prof Nicol added: ‘The challenge for the industry is realising the potential of the free-range system... so that they actually do what consumers think they do, which is provide all hens with good welfare.’

Battery farming of chickens, which involved five or six birds living in cramped conditions in a small cage, was outlawed in 2012.

It was replaced by ‘enriched cages’ – flocks of 70 or 80 birds living in stacked enclosures with access to food, water, perches and scratching posts.

Half of all eggs produced in Britain come from caged hens. But the majority of eggs sold in British supermarkets are free range.

A carton of 15 cage eggs from Tesco costs £1.75, or less than 12p an egg. For a dozen medium free-range eggs, the price is £2.50, or 21p each.

A spokeswoman for Compassion in World Farming said consumers should still opt for free-range eggs.

She added: ‘Only in free-range (or organic) farms can hens fully perform all their important natural behaviours, like stretching and flapping their wings, perching up high, foraging, scratching, dust-bathing and laying their eggs in a comfortable nest.’

SOURCE




Wednesday, November 13, 2013




A daily bowl of wild blueberries could protect against obesity, heart disease and diabetes


If you are a specially bred obese lab rat

A bowl of wild blueberries a day could protect against a range of health problems including obesity, heart disease and diabetes.  Regular consumption of the berries over an eight-week period can improve or prevent metabolic syndrome, researchers say.

Metabolic syndrome is the medical term for a combination of diabetes, high blood pressure and obesity.

It increases the risk of heart disease, stroke and other conditions affecting blood vessels.

On their own, diabetes, high blood pressure, and obesity can potentially damage the blood vessels, but having all three together is particularly dangerous.

Berries are rich in polyphenols - antioxidants that protect cells in the heart and help lower blood pressure.

This means they may help reduce damage to the lining of blood vessels and tackle glucose intolerance - excess sugar in the blood that can lead to diabetes.

In the study, published in the journal Applied Physiology, Nutrition, and Metabolism, specially bred obese lab rats were fed a diet of blueberries - the equivalent of two cups a day for a human.

The researchers found this improved the relaxing and constricting in the blood vessels (endothelial function), which had a significant impact on blood flow and blood pressure.

Dorothy Klimis-Zacas, professor of clinical nutrition at the University of Maine and a co-author of the study, said: 'Metabolic syndrome is a group of risk factors characterised by obesity, hypertension [high blood pressure], inflammation, high cholesterol, glucose intolerance and insulin resistance, and endothelial dysfunction.

'Many substances found in food have the potential to prevent metabolic syndrome, thus reducing the need for medication and medical intervention.'

She added that previous research had shown the heart benefits of the 'polyphenol-rich' wild blueberry, using rats that had high blood pressure.

This study used rats whose bodies act in a similar way to those of humans.

Professor Klimis-Zacas added this study showed that eating wild blueberries long-term could normalise inflammation and improve endothelial function.

But it's best to eat the berries raw. Previous research has found that cooking them in pies or muffins can reduce their disease-fighting nutrients.

Heating the fruit affects the levels of some polyphenols - which give them their 'superfood' credentials - potentially reducing their ability to cut the risk of heart attack, soothe inflammation and sharpen thinking.

Experts say eating blueberries raw is the best way to get the most nutritional benefit from them, whereas baking them into breads, muffins or pies can cut their polyphenol levels by up to a fifth.

SOURCE






Should the drug that transformed Megan's skin be banned? Experts think it's a wonder cure but it's been linked to depression and suicidal thoughts

A generally balanced discussion below but it is a pity that the teratogenic effects get only a passing mention.  Women taking it should exercise rigorous birth control

Megan Taylor began to fear her modelling and acting career was over last year when, at the age of 27, she suddenly developed spots on her face.

Her GP diagnosed cystic acne, a severe form of the condition that extends deep into the skin, and prescribed antibiotics - the standard treatment - for six weeks. When this didn't help, Megan was given a different antibiotic, then another and another - but none made a difference.

She had sailed through her teens without any spots, and developing acne in her 20s was not only distressing but threatened her livelihood. 'Obviously your skin is scrutinised at castings and I became really self-conscious,' says Megan, 28, who lives with her fiancé Liam, 31, a sports physiotherapist, in Thames Ditton, Surrey.

'I tried everything to improve my skin. I gave up chocolate and alcohol, and tried dozens of over-the-counter skin lotions and creams - some were very expensive.'  Then an actress friend mentioned that she'd had a similar skin problem and recommended the acne drug Roaccutane.

Roaccutane is a brand name for isotretinoin, a compound derived from vitamin A. Dermatologists regard it as something of a 'wonder drug'. But recently it has made headlines for the wrong reasons, with suggestions that it is linked to suicidal feelings. As a result, people who might benefit from the drug - and their doctors - are reluctant to use it.

When Megan asked her GP to prescribe the drug, he said that only a dermatologist could do this, and her condition wasn't severe enough to refer her. Desperate for an improvement to her skin, Megan paid £300 to see a private dermatologist and was put on Roaccutane straight away.

After five months her skin had cleared completely. Now, six months after finishing the treatment, Megan remains acne-free and is back modelling. 'It gave me my life back and my career,' she says.

So have safety fears over Roaccutane been exaggerated, denying patients a potentially life-changing treatment? Or was Megan merely the positive side of a story that remains more troubling?

Acne is triggered by an oversensitivity to hormones, such as testosterone. This causes the sebaceous glands - the oil-producing glands in the skin - to go into overdrive.

At the same time, the dead skin cells lining the pores are not shed properly. The two factors result in a build-up of oil (sebum), which leads to blackheads (a darkened plug of oil and dead skin)  and whiteheads.

This is the ideal environment for the acne bacterium, Propionibacterium acnes, to flourish. We all have this on our skin and usually it causes no problems, but in those prone to acne it triggers inflammation and the formation of red or pus-filled spots.

Roaccutane works by reducing the size and activity of the sebaceous glands, so reducing the amount of oil; it also eases inflammation. It is more effective than antibiotics for severe acne because it tackles the main cause - the oil - while antibiotics only kill pore-blocking bacteria. It is usually prescribed for 16 to 24 weeks.

For dermatologists the drug represents a real breakthrough. 'Isotretinoin came along 30 years ago and revolutionised the treatment of severe acne - from a dermatologist's point of view, it is an extraordinarily useful drug,' says Dr Neil Walker, honorary consultant dermatologist at the Churchill Hospital and the Stratum Clinic, both of which are in Oxford.

And its use is soaring - prescriptions for acne products containing isotretinoin rose from 1,697 in 2008 to 48,797 in 2012.
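
Just to put that rise in numbers, here is a quick back-of-the-envelope calculation; the raw figures come from the sentence above, and treating 2008-2012 as a four-year compounding span is my framing:

    # Annualised growth implied by the prescription figures above
    # (1,697 in 2008 to 48,797 in 2012, taken as a four-year span).
    start, end, years = 1697, 48797, 4
    factor = (end / start) ** (1 / years)

    print(f"Overall increase: {end / start:.0f}x")         # ~29x
    print(f"Compound growth: ~{factor - 1:.0%} per year")  # ~132% per year

That is roughly a 29-fold increase in four years.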

Around 80 per cent of teenagers will have acne at some point, and for 10 to 15 per cent (an estimated 250,000) it is bad enough to warrant treatment with an isotretinoin product.

However, as well as side-effects such as dry skin (linked to reduced sebum) and a risk to unborn babies (women must use contraception during treatment and for a month after stopping), it has been associated with depression and suicidal thoughts. Although this is rare - the manufacturer, Roche, says depression affects one in 1,000 patients and suicide/suicidal thoughts affect one in 10,000 - several recent high-profile cases have highlighted the terrible cost for those involved.

Last year, 16-year-old Jack Bowlby, the nephew of racehorse trainer Jenny Pitman, was found dead at his boarding school. The cause of death was neck compression due to a ligature.

He had been taking Roaccutane for four months, stopped it, then started back on it. His family said they felt Roaccutane 'may have played a part' in his death, and that patients and their parents should be 'very aware of the possible risks'. The coroner recorded an open verdict.

James Sillcock, 26, from Kent, also killed himself, after years of mental health problems that he blamed on Roaccutane, which he took at 16. In a suicide note he said the drug had left his world 'in tatters'.

Perhaps not surprisingly there have been calls for Roaccutane to be banned.

So how might the drug have this effect?  A study of rats by Dr Sarah Bailey, a pharmacologist from the University of Bath, suggests that isotretinoin may trigger depressive behaviour.

'Our theory is that it may turn on genes responsible for metabolising serotonin - a feel-good chemical made in the brain,' explains Dr Bailey. 'This means less serotonin, which may then precipitate depression.'

What is not clear is if this is a temporary or permanent effect. For although the drug clears from the body a month after the last pill is taken, some families of those who committed suicide after taking it maintain its psychological side-effects persist for much longer - months, even years.

However, evidence cited by Roche and leading dermatologists suggests that those who suffer from acne are at an increased risk of suicide because of the acne itself, not the treatment. Dr Anthony Bewley, consultant dermatologist at Barts Health NHS Trust, says: 'I see a lot of patients with acne who are very depressed - it can be absolutely devastating for confidence and self-esteem.

'Young teenagers are vulnerable to depression and having acne can increase the suicide rate in this group by up to five times.'

Dr Bewley, who runs specialist psycho-dermatology clinics in partnership with consultant psychiatrists, says that Roaccutane has been used for decades and is largely safe. 'The evidence suggests that if it is an effect, it's a small one, and it may just be that it exacerbates existing low mood.

'Even so, it's a drug that should be treated with the greatest respect and prescribed carefully by dermatology specialists only.

'They should always ask questions about mood and how the patient is feeling beforehand and during treatment.'

Unfortunately, in some cases the drug may be being prescribed inappropriately to those with moderate to mild acne, says Tony Chu, professor of dermatology at Hammersmith Hospital.

'There is a lot of pressure in the NHS for dermatologists to clear patients from their books quickly - we're told we should have a maximum of two follow-up appointments because of financial pressures, so I think Roaccutane is sometimes prescribed to speed things up when it's not always the most appropriate drug.

'A combination of antibiotics and retinoid cream might work, or a laser treatment, as they can be effective and have fewer side-effects.'

Professor Chu also warns that, while isotretinoin treatment can be a cure in some cases, 'in my experience, 50 per cent of patients will relapse after a course of treatment. This can be very fast, within two weeks of stopping the drug, or it may be several years later'.

He adds that the drug can, in his view, also affect concentration. 'Another rare side-effect I've seen is chronic fatigue. I once had a bright, straight-A-star patient referred to me who had developed this after taking Roaccutane for four months. It had persisted for years after stopping the drug.

'He subsequently dropped out of Oxford University. Although his acne cleared, he never recovered from the chronic fatigue and, sadly, two years later I learnt that he'd taken his own life. I think his suicide may well have been related to Roaccutane.

'Although the drug only stays in the system for about a month, the effects on the brain can in some cases be permanent. A biological switch is flicked by Roaccutane and sometimes it doesn't turn on or off again.

'All these side-effects are rare but there is no way of predicting beforehand whether a patient may be affected. My experience has made me very cautious about prescribing it. You have to weigh up the benefits against the potential side-effects - in severe cases of acne that hasn't responded to treatment, it is worth it, but in milder cases it probably isn't.'

Lisa Tester strongly agrees. When she first took Roaccutane, she was delighted. The 44-year-old translator had suffered years of misery because of acne.

'I'd been bullied at school. I even decided against university because it affected my confidence so much. My face, neck, back and chest were all covered in ugly red pustules, which left scars.'

After she was prescribed Roaccutane at 26, Lisa's spots cleared up within weeks.

'I'd never felt so good,' says Lisa, who is divorced and lives in Brighton with her son Andrew, 16.

'Back then I was given no warnings about the drug affecting mood or causing depression, but I didn't notice anything like that anyway - I'd never felt happier.'

Lisa was given repeated courses over the next few years and her skin stayed clear. Then last year she saw a different dermatologist. 'He said the latest thinking was that a higher dose of Roaccutane would prevent the acne coming back for longer. He more than doubled my dose to 60mg a day.'

This is a standard dose, and teenagers are often put on it. But while her skin cleared, Lisa's mood was affected and two weeks later she woke up and - with no previous history of depression - decided to kill herself.

'Suddenly I had very dark thoughts. Over the next four weeks I felt worse and worse and could not see a way forward.'

When her boyfriend, Jeremy, and her son were both away one weekend - six weeks after her dose was raised - Lisa took an overdose of alcohol and painkillers. Fortunately this failed and the next day she went to her GP.

'Even at that stage I didn't connect my feelings to Roaccutane - it was only when I was discussing possible triggers with the community psychiatric nurse that I realised it had started after my dose had been doubled.

'I will never take Roaccutane again. My son has severe acne, but he doesn't want to take the risk either.'

Dermatologists are concerned that such rare experiences will put patients and doctors off the drug. 'I'm fearful that a good drug may start to be withheld because of the publicity about suicides,' says Dr Bewley.

Dr Neil Walker says Roaccutane must be handled carefully. 'Dermatologists should counsel patients and their families about the risks and tell them to report any changes in behaviour, and it should not be prescribed to anyone with a history of depression.'

Megan Taylor is in no doubt about the drug's benefits. 'I've got the greatest sympathy for the families of teenagers who have taken their own lives after taking Roaccutane, but those side-effects are extremely rare.

'I think if your acne is severe and persistent, and you inform your family of the risks, and the dermatologist monitors you carefully, the benefits may outweigh the risks.'

SOURCE