Friday, August 31, 2012

Marmite: the latest superfood?

Marmite -- and Vegemite, a similar Australian product -- is a complete mystery to Americans, who generally find it revolting.  But in much of the British Commonwealth it has a huge and dedicated following.  I enjoy the stuff myself.  I always have a large jar of Vegemite in the fridge.  That's almost a patriotic duty in Australia.  And it has always been clear that it has some useful nutrients in it.  I doubt that the concentration of niacin is high enough for relevance to the mouse study mentioned below, however.

Five months ago, crisis struck in New Zealand. Earthquake damage to a factory in Christchurch halted production of a staple foodstuff, crippling supply chains nationwide. Supermarket shelves were stripped bare and store cupboards emptied. Consumers started panic buying, hoarding secret supplies and auctioning half-full containers online for extortionate amounts.

They called it “Marmageddon”. The foodstuff? Marmite. That sticky, gloopy, salty spread, made from yeast extract. It’s so popular on the other side of the world that when Sanitarium, its main manufacturer in New Zealand, shut down, the prime minister appeared on television urging the public to stay calm. Now, Marmite could become just as in demand in Britain, after scientists labelled it the latest “superfood”, capable of helping our bodies fight off life-threatening infections.

According to research in the Journal of Clinical Investigation, high doses of niacin (or vitamin B3), one of the main ingredients in Marmite, help boost the body’s defences against staphylococcus bacteria. In tests, concentrated niacin – which stimulates the production of neutrophils, the white blood cells that fight bacteria – increased the immune system’s ability to kill different strains of the bugs by up to 1,000 times. This could mark a turning point in the battle against antibiotic-resistant superbugs, such as MRSA, the deadly strain that poses a threat in hospitals.

As the saying goes, you either love Marmite or you hate it. On one side are devoted fans who worship “black gold” and would pour it on their cornflakes if they could. On the other are those who hate its yeasty, bitter tang. I’m one of the latter: for me, Marmite has the taste of stale, acrid sardines and the texture of cold treacle. They say the best things for you often taste the worst – such as cabbage, lentils and green tea – but I’d need a lot more convincing before spreading Marmite on my toast.

This isn’t the first time it has been billed as a superfood. First produced in Burton-on-Trent in Staffordshire in 1902, Marmite contains concentrated brewer’s yeast, salt, spices and celery. Due to its high nutritional value, it was part of soldiers’ ration packs during the First World War, and in the Thirties, English scientist Lucy Wills found that the folic acid in Marmite could be used to treat anaemia. Its high vitamin B content also reportedly makes the spread an effective mosquito repellent.

“Marmite helps my pregnant clients get over morning sickness and it’s great for elderly people who have lost their sense of taste,” explains nutritionist Melanie Brown. “I would recommend it to vegetarians, who miss out on vitamin B12, and children who don’t eat much wholegrain bread.”

But not everyone agrees. Concerns have been raised over the high salt content of Marmite (11g per 100g), which led the local council in Ceredigion, Wales, to ban it in primary schools in 2008. More recently, the Danish Veterinary and Food Administration declared Marmite illegal because of its large quantities of additives – it hasn’t been sold in Denmark since May last year.

So before you start slathering yourself in the sticky spread, be warned: scientists from Oregon State University, who carried out the latest research, have urged people not to take high doses without medical supervision. The recommended daily intake of niacin is 17mg (13mg for women), and excessive quantities can cause skin flushes and liver damage.

For all those Marmite obsessives out there, the experts recommend a spoonful at a time. “A thin layer is all you need, not piled on your toast like chocolate spread,” says Brown. “A little of what you love won’t do you any harm.”


Eating nuts in pregnancy 'reduces chance of childhood allergy'

Finally the word is getting out

Mothers-to-be should eat nuts because doing so reduces the chances of their children developing allergies, new research has found.
The study adds to evidence that most women should not fear eating nuts in pregnancy, or while breast feeding.

Children of women who eat peanuts and other nuts during pregnancy are a third less likely to suffer from asthma by the age of seven, compared to those whose mothers avoid them, researchers discovered.

For years pregnant women were advised against eating nuts of any kind, due to concerns that they could increase the risk of allergies in their offspring.

But in 2009, the Food Standards Agency revised its advice, stating there was “no clear evidence that eating or not eating peanuts during pregnancy, breastfeeding or early childhood has any effect on the chances of a child developing a peanut allergy”.

Now Danish researchers have gone a step further - finding that eating nuts while expecting has a protective effect on babies.

British experts said they hoped the “robust” study would help discredit the myth that foods containing nuts were somehow intrinsically dangerous for most children.

The new study looked at more than 60,000 mothers and their children, following them from early pregnancy until the children were seven.

Nut eating during pregnancy reduced the chance of a child being classed as asthmatic by about a quarter at 18 months, and by a third at seven years.

Writing in the Journal of Allergy and Clinical Immunology, Ekaterina Maslova and colleagues from the Statens Serum Institute in Copenhagen, said: “We found that maternal peanut and tree nut intake one or more times per week during pregnancy decreases the risk of allergic disease in childhood. These results do not support avoidance of nuts during pregnancy.”

Colin Michie, chairman of nutrition at the Royal College of Paediatrics and Child Health, hoped women would take note of the findings, which mirrored others showing early exposure to nuts was beneficial for the developing immune system.

He said a rash of studies in the 1980s that purported to find evidence of a link between nut eating during pregnancy and allergic response had since been found to be weak.

Unfortunately public health officials had leapt on these and others, and advised against eating nuts. By invoking the ‘precautionary principle’ they had unwittingly done more harm than good, he said.

He went on: “Recent studies such as this robust research show the truth of granny’s wisdom, that a little bit of everything tends to be good for you.

“If your body has experienced something before, it’s not going to think that it’s an enemy and come out fighting against it, which is what happens with an allergic response.

“Scientifically speaking, if you have antigens that are present when you are building up your immune repertoire as a foetus and infant, you are less likely to regard something as foreign or dangerous when you encounter large quantities of it.”

This school of thought is much the same as the hygiene hypothesis, which contends that growing up in a home that is too clean is bad for a child, because it prevents exposure to the bugs that stimulate the immune system.

Dr Michie cautioned that, while it was now largely accepted that most pregnant women and young children should not restrict their diets for fear of allergies, there were still clinical exceptions.

Women who had a “dreadful family history of allergy” to nuts should still avoid them, he said, while those unsure should consult their doctors.

Michael Walker, a food chemist, said two studies, called EAT and LEAP, were currently ongoing to determine if early introduction of potentially allergenic foods could help prevent food allergies.


Thursday, August 30, 2012

Whoops!  Chocolate and red wine may not be so good for you, as scientists say there is no evidence they battle heart disease

Scientists claim there is no proof that chocolate and red wine cut heart disease – despite millions hoping they do.

The mechanisms by which they could make a difference have still to be explained, according to heart specialists.

The evidence that dark chocolate protects the heart remains elusive, even though a recent study showed a 37 per cent cut in risk for those eating a square a day.

This was only a 'sign', however, and not proof because the study was flawed, said Steffen Desch from the University of Leipzig Heart Centre in Germany.

He said a more conclusive trial could be difficult because the real thing would have to be tested against a 'dummy' substance that looked and tasted like chocolate.

Some small studies have claimed that chocolate lowers blood pressure and reduces inflammation in the body. But Dr Desch is unconvinced.  'Despite the studies I couldn't yet recommend dark chocolate as a prevention or treatment in cardiovascular disease,' he said.

'There's no strong evidence of a benefit and no clear explanation of an effective mechanism.' The calories contained in chocolate are likely to offset any protection to the heart, he added.

His reservations came as Dutch researchers dampened down speculation about the benefits of red wine on the heart.  Even though it is also supposed to help heart health, there is no single ingredient which appears to work, they said.

They have tested resveratrol, which is found in the skin of red grapes and is believed to have a range of life-enhancing properties.  Eric Sijbrands, of Erasmus University Medical Centre in Rotterdam, led a series of studies which failed to replicate the findings of heart benefits from taking resveratrol.

Using it in capsules for four weeks did not lower blood pressure in patients with hypertension, he said. 'Certainly I would never actively prescribe red wine for a heart condition and, even if I was asked about it, I would be cautious,' he added.

If red wine does work, the explanation is likely to be 'complex', he said. Any benefit from moderate consumption is likely to be small and outweighed by the adverse effects of drinking too much.

The scientists were speaking at the European Congress of Cardiology in Munich yesterday.


Cannabis smoking 'permanently lowers IQ'

Teenagers who regularly smoke cannabis are putting themselves at risk of permanently damaging their intelligence, according to a landmark study.  Researchers found persistent users of the drug, who started smoking it at school, had lower IQ scores as adults.

They were also significantly more likely to have attention and memory problems in later life, than their peers who abstained.

Furthermore, those who started as teenagers and used it heavily, but quit as adults, did not regain their full mental powers, found academics at King’s College London and Duke University in the US.

They looked at data from over 1,000 people from Dunedin in New Zealand, who have been followed through their lives since being born in 1972 or 1973.

Participants were asked about cannabis usage when they were 18, 21, 26, 32 and 38. Their IQ was tested at 13 and 38. In addition, each nominated a close friend or family member, who was asked about attention and memory problems.

About one in 20 admitted to starting cannabis use before the age of 18, while a further one in 10 took up the habit in the early or mid 20s.

Professor Terrie Moffitt, of KCL’s Institute of Psychiatry, who contributed to the study, published in the journal Proceedings of the National Academy of Sciences, said “persistent users” who started as teenagers suffered a drop of eight IQ points at the age of 38, compared to when they were 13.

Persistent users were defined as those who reported cannabis use at three or more of the assessments between 18 and 38, and who said on each occasion they were smoking it on at least four days a week.

She said: “Adolescent-onset cannabis users, but not adult-onset cannabis users, showed marked IQ decline from childhood to adulthood.  “For example, individuals who started using cannabis in adolescence and used it for years thereafter showed an average eight-point IQ decline.

“Quitting or reducing cannabis use did not appear to fully restore intellectual functioning among adolescent-onset former persistent cannabis users,” she said.

Although eight points did not sound much, it was not trivial, she warned.

It meant that an average person dropped far down the intelligence rankings, so that instead of 50 per cent of the population being more intelligent than them, 71 per cent were.
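The percentile arithmetic behind that claim is easy to check. A minimal Python sketch, assuming the conventional IQ scale (mean 100, standard deviation 15) and a normal distribution of scores; it lands within a point of the article's 71 per cent figure:

```python
from statistics import NormalDist

# Conventional IQ scale: normally distributed, mean 100, SD 15.
iq = NormalDist(mu=100, sigma=15)

# An exactly average person starts at the 50th percentile.
start = iq.cdf(100)          # 0.5: half the population scores higher

# After an eight-point decline their score is 92.
after = iq.cdf(92)           # fraction of the population now below them
share_above = 1 - after      # fraction now more intelligent than them

print(f"Share of population scoring higher after the drop: {share_above:.1%}")
```

The exact result is about 70 per cent; the article's 71 per cent presumably reflects rounding somewhere in the original calculation.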

“Research has shown that IQ is a strong determinant of a person’s access to a college education, their lifelong total income, their access to a good job, their performance on the job, their tendency to develop heart disease, Alzheimer’s disease, and even early death,” she said.

“Individuals who lose eight IQ points in their teens and 20s may be disadvantaged, relative to their same-age peers, in most of the important aspects of life and for years to come.”

The cognitive abilities of the 10 per cent of people who started in their 20s - who could loosely be classed as college smokers - also suffered while they were still smoking.

However, if they gave up at least a year before their IQ test at 38, their intelligence recovered, suggesting their brains were more resilient and bounced back.

Prof Moffitt said adolescent brains appeared "more vulnerable to damage and disruption" from cannabis than those of fully mature adults.

Reliable figures on cannabis usage among today’s British teens and twentysomethings are hard to come by.

But Prof Moffitt said there was growing concern in the US that cannabis was increasingly being seen as a safe alternative to tobacco.  “This is the first year that more secondary school students in the US are using cannabis than tobacco, according to the Monitoring the Future project at the University of Michigan,” she noted.  “Fewer now think cannabis is damaging than tobacco. But cannabis is harmful for the very young.”


Wednesday, August 29, 2012

Middle class people more likely to do what the do-gooders tell them

Whether that is what makes them healthier is not established, however

The middle classes are getting healthier by giving up bad habits as the less well-off fail to get the message, a report has found.

The increasing social class divide in health will put ‘unavoidable pressure’ on an already hard-pressed NHS, it says.

The report, from the influential King’s Fund health think-tank, says many poorer people are failing to give up habits such as smoking and eating junk food.

Researchers analysed official data from England covering four behaviours linked to disease and early death: smoking; excess alcohol use; poor diet and sedentary lifestyles.

These bad habits account for almost half the burden of ill health in developed countries and are linked to everything from heart problems and diabetes to cancer.

They found the proportion of people engaging in three or four of these risky behaviours fell from 33 per cent in 2003 to 25 per cent in 2008.

But the ‘significant’ change was very different among the social classes.  The report found ‘these reductions have been seen mainly among those in higher socioeconomic and educational groups’.

Those with no educational qualifications were more than five times as likely as those with degrees to engage in four key damaging behaviours in 2008, compared with three times as likely in 2003.

‘The health of the overall population will improve as a result of the decline in these behaviours, but the poorest and those with the least education will benefit least, leading to widening health inequalities and unavoidable pressure on the NHS’, the report says.

It found the better off someone was, the more likely they were to have begun living a healthier life during 2003-08 – when the Labour government embarked on a campaign for healthier living.

David Buck, a senior fellow at the King’s Fund who was head of health inequalities at the Department of Health until 2010, led the research.  He said: ‘The widening... gap is due to the improvement in those at the top, and, to a lesser degree, those in the middle, not because those at the bottom have got worse per se. They’re stuck in a rut.’

Those from poorer backgrounds or with less education are more likely to develop long-term conditions such as cancer and diabetes earlier and to experience them more severely, Mr Buck said. He added: ‘As well as this being a  public health problem, this does also store up problems for the NHS in future.’

The report warns about 70 per cent of adults in England still engage in two of the four habits.

Health Secretary Andrew Lansley has pledged to ‘improve the health of the poorest fastest’, with the better-off currently living seven years longer on average.

A Department of Health spokesman said: ‘We are working hard to tackle health inequalities – from next year, local authorities will receive a specific public health budget for the first time, targeted at the areas that need it most.’


An aspirin a day could help in fight against depression among the elderly

The effects found below were slight in absolute terms but the evidence for benefit from aspirin intake does seem wide-ranging

Taking an aspirin pill a day could help combat depression in the elderly.  Trials found a regular dose reduced the risk in sufferers by around 40 per cent.

It seems to work by lowering levels of homocysteine, an amino acid in the blood thought to increase the chances of heart attacks and strokes when levels are too high.

Now some scientists think excess homocysteine may also be a factor in poor mental health and that nearly one in six cases of depression in the elderly could be avoided by using aspirin to lower levels in the blood.

Up to 20 per cent of us suffer depression at some point in our lives, with women affected more than men.

And the elderly are at high risk because of the effect from declining health, bereavements and loneliness.

To test whether lowering homocysteine levels prevented depression, scientists at the University of Western Australia in Perth studied 3,700 men aged between 69 and 87 and monitored their medical records to see which ones had a history of depression.

They were also tested to see if they had raised levels of homocysteine. The findings, published in the journal Translational Psychiatry, showed men with excessive homocysteine levels were 60 per cent more likely to suffer with depression.

The report said: 'This study showed, for the first time, that aspirin is associated with a significantly lower risk of depression among older men with high homocysteine.'

Researchers say it is still not clear how homocysteine makes someone more susceptible to depression, but the men with high homocysteine who took a daily aspirin saw their risk of depression drop 43 per cent.

Taking vitamin B supplements, which can also lower homocysteine, did not have the same effect. US scientists recently discovered daily aspirin users are 16 per cent less likely to die if they develop any type of cancer.

Other studies suggest the drug can also cut the risk of prostate cancer by almost 30 per cent and bowel cancer by up to 60 per cent.

However aspirin can cause stomach bleeding in around one in a thousand patients.

Emer O’Neill, chief executive of the Depression Alliance, said although the research was ‘interesting’, patients should not change their treatment because of the findings.


Tuesday, August 28, 2012

Could  drinking red wine help keep old people  steady on their feet?

Mouse study only  -- using gigantic doses

Red wine isn’t usually associated with being steady on your feet.  But a ‘miracle ingredient’ in it could have that effect on pensioners, scientists claim.  They say that resveratrol, which is already credited with a host of health benefits from cutting cholesterol to warding off cancer, boosts balance and improves mobility.

In tests, old mice that were given the plant chemical for a few weeks became just as sprightly as young animals.

If resveratrol has similar effects on the human body, it could help prevent the painful falls and fractures from which many pensioners struggle to recover.

Falls are one of the leading causes of death in the over-75s, and half of elderly women die within two years of a fall.

The US researchers said: ‘Our study suggests that a natural compound like resveratrol, which can be obtained through dietary supplementation or diet itself, could actually decrease some of the motor deficiencies that are seen in our ageing population.

‘That would therefore increase an ageing person’s quality of life and decrease their risk of hospitalisation due to slips and falls.’ The researchers, from Duquesne University in Pittsburgh, fed resveratrol to young and old mice for eight weeks and regularly tested their ability to walk along a rodent-sized beam.

Initially, the older mice struggled but, over time, they became just as deft on their paws as the younger animals.

An American Chemical Society conference heard that it is not entirely clear how resveratrol, which is found in the grape skins that give red wine its colour, improves balance.

But rather than strengthening bones or muscles, studies on cells in a dish suggest the compound helps ailing brain cells survive.

But don’t reach for the wine bottle just yet – you would fall over long before you drank the required amount.

Despite its potential in lab tests, resveratrol is so poorly absorbed by the human body that someone would have to drink several hundred glasses of wine a day to get the benefits enjoyed by the mice.

The researchers are now looking for compounds that work just as well but at much lower quantities.

They say that while there are medicines available to help improve balance and co-ordination in people with diseases such as Parkinson's, there is nothing for otherwise healthy pensioners who are not as steady on their feet as they used to be.


Anorexia is genetically transmitted

As it is clearly a mental illness in the OCD category, this is what you would expect

Claire Vickery was not surprised when scientists announced that eating disorders have a genetic link, because she and her two daughters suffered from the illness.

Eating disorders specialist Professor Howard Steiger, of McGill University in Montreal, told a conference in Adelaide last week that new discoveries in epigenetics show mothers pass a genetic predisposition to eating disorders to their children.

"The science of epigenetics is relatively new," he said at the National Eating Disorders Collaboration National Workshop. "Epigenetics helps explain how adverse development, stress, malnutrition and other influences can affect development of mental-health problems - including eating disorders."

Ms Vickery, 56, said she had bulimia from the ages of 16 to 29. "I'm sure I'm carrying the gene."

However, the president of the Australian and New Zealand Academy for Eating Disorders, Dr Anthea Fursland, said that genes alone would not result in a child developing an eating disorder. "Genetic influences do play a part but they will not cause an eating disorder on their own," she said. "Eating disorders arise as a result of a combination of factors but the common factor in every case is dieting."

Ms Vickery's two daughters, Anna and Laura, both had eating disorders when they were younger. They have all recovered but Ms Vickery's experience led her to set up the Butterfly Foundation, which encourages prevention, treatment and support of those affected by eating disorders.

Professor Steiger said epigenetics would play a large role in understanding eating disorders. "If eating disorders are about anything, they're about the ways in which environments switch on hereditary vulnerabilities," he said.

"It will give us a better understanding how it is that some people develop an eating disorder. It's not due to moral weakness or character flaws, but real susceptibilities, for which we can find real physical evidence."

By identifying the genes, he hopes to develop a test and even medication.

One of Ms Vickery's daughters, Anna Spraggett, who is 33 and has three children, said she was excited about the discovery. "It's a positive step forward to finding a cure and treatment," she said.

While she thinks that environmental factors play a part, Ms Vickery said if parents were aware their children were susceptible, they could be mindful of stressful triggers.

"This is not about guilt for mothers," she said. "But if there was a take-home message, it's to choose your words with children … no fat talk in the household - or ever, in fact."


Monday, August 27, 2012

Breastfeeding fanatics

Class told baby formula 'was like AIDS'

EXPECTANT mums and their partners were told baby formula was "like AIDS" during an Australian Breastfeeding Association class.  Couples were also repeatedly told a baby died "every 30 seconds" from formula feeding, prompting a rebuke from doctors.

"Formula is a little bit like AIDS," one of the association's leading counsellors told couples in the breastfeeding education class.

"Nobody actually dies from AIDS; what happens is AIDS destroys your immune system and then you just die of anything and that's what happens with formula. It provides no antibodies.

"Every 30 seconds a baby dies from infections due to a lack of breastfeeding and the use of bottles, artificial milks and other risky products. Every 30 seconds."

The association has received $4.3 million from the Federal Government during the past five years and its patron is Governor-General Quentin Bryce.

The counsellor is commended in the ABA's latest annual report for taking the highest number of calls to the body's taxpayer-subsidised National Breastfeeding Helpline.  Other documents show she helped more than 900 callers in 2010 and was honoured at a branch conference last year.

The Royal Australasian College of Physicians said the baby mortality cited was "certainly not true in Australia" and could be "highly frightening" for new parents.

"There are better ways to try to explain the benefits of breastmilk," paediatric and child health division president Susan Moloney said.  "We highly support breastfeeding. It is the optimal form of nutrition for any human infant. But in the cases where it isn't able to be done, formula feeding is safe in Australia."

Australian Breastfeeding Association president Rachel Fuller immediately launched an investigation into the comments.

"These statements were inappropriate in this situation and the individual concerned has acted outside the instructions and guidelines given," Ms Fuller said.  "We take such matters seriously and are following this matter up internally today."

An expectant mother attended the class at the ABA's Brisbane office on behalf of The Sunday Mail after a complaint about a previous session.

A dozen couples paid $85 each, including a compulsory membership fee, to attend. Similar sessions are held regularly around the country.

"Of course, there's the higher IQ and all of the diseases that you don't get," the breastfeeding counsellor said in her opening remarks.

"We used to talk about all those sorts of things, but we don't talk about any of those any more."  She added: "A couple of years ago I broke this leg, quite badly. Nobody said to me 'we have this wonderful range of wooden legs now' ... they fixed the leg."

Like wooden-leg salespeople, formula companies would try to promote benefits, attendees heard.  "That's what formula is; it's pure sales pitch. They don't say 'look, a baby dies from this product every 30 seconds' ... they forget about that bit."

No information was offered about deaths in Australia.

Australian Medical Association Queensland president Alex Markwell said the statements were "inappropriate" and could amount to "scaremongering".

"I just don't think those comments are helpful in the long term. We have enough evidence that shows breastfeeding is best wherever possible. But women who for whatever reason are unable to breastfeed should not be ostracised," Dr Markwell said.

Why we went undercover:

To gain a true picture of what was being told to couples one of our reporters, who is an expectant mother, attended the class as a member of the public. The Australian Press Council and Media Alliance guidelines allow for undercover investigations in circumstances of significant public interest and when no alternative is available.


I have personally encouraged young mothers I know to breastfeed but have also been supportive when they have found it too difficult -- JR

Little need to chew over secret to long life

I like the last sentence below

Doctors, dietitians and divines have long sought to identify the secret of a long life. The answer? Minestrone soup, according to nine siblings from Sardinia who have been recognised as the world's oldest in terms of combined age.

The oldest member of the Melis family, Consolata, was turning 105 yesterday, while the youngest of her siblings, Mafalda, is 78.

"To have such a large number of living siblings with an average age of more than 90 years is incredibly rare," the editor-in-chief of Guinness World Records, Craig Glenday, said on Tuesday of the Melises, who hail from Perdasdefogu in the mountainous Ogliastra province.

"We believe Ogliastra contains the highest number of centenarians per capita in the world."

Scientists have tried to work out what makes Sardinians live so long - 371 are over the age of 100, or 22 in every 100,000 - and credit genetic heritage, a frugal Mediterranean diet and a hardy lifestyle.

"We eat real food, meaning lots of minestrone and little meat, and we are always working," said Alfonso Melis, 89, who narrowly escaped being captured by German soldiers in World War II.

"Every free moment I have, I am down at my vineyard or at the allotment where I grow beans, aubergines, peppers and potatoes," he said.  "You just keep working and you eat minestrone, beans and potatoes," added his older sister Claudia, 99.

Consolata, who has had 14 children, nine of whom are still alive, plus 24 grandchildren, 25 great-grandchildren and three great-great-grandchildren, still cooks and feeds her goats.

"My grandchildren have washing machines, dishwashers and vacuum cleaners, and when I hear them say, 'I am stressed', I don't understand," she told Corriere della Sera.


Sunday, August 26, 2012

Amusing "junk" food idiocy in the NYT

Economist Don Boudreaux has a laugh at some addled Leftist hatred in the letter to the New York Times below:

    Asserting that “Not everyone can afford fresh fruits and vegetables,” Mark Bittman pleads for policies that would replace today’s large commercial farms with smaller farms (“Celebrate the Farmer!” Aug. 22).  He writes: “The naysayers will yell, ‘this mode of farming will not produce enough corn and soy to feed our junk food and cheeseburger habit,’ and that’s exactly the point.  It would produce enough food so that we can all eat well”.

    Not all food experts agree with Mr. Bittman’s suggestion that agricultural markets and policies result in too little availability of fresh foods and, hence, prevent Americans – and especially poor Americans – from eating well.  Only last September one expert found that “In fact it isn’t cheaper to eat highly processed food….  In general, despite extensive government subsidies, hyperprocessed food remains more expensive than food cooked at home.  You can serve a roasted chicken with vegetables along with a simple salad and milk for about $14, and feed four or even six people” – a price, this expert reported, that is half of what it costs at McDonald’s for the same number of people to dine on burgers, fries, and soda.  (This fact, of course, means that people who eat lots of hyperprocessed foods choose to do so, and even pay a premium to indulge that preference.)

    Oh, I almost forgot: the expert who found that junk food is more pricey than are many healthier options such as “rice, grains, pasta, beans, fresh vegetables” is your very own Mark Bittman writing in your very own pages (“Is Junk Food Really Cheaper?” Sept. 24).


The empty-headed self-righteousness of the original NYT article is nauseating, so I am pleased that Boudreaux has exposed the author for what he is -- JR

The "incorrect" diet that seems to beat fibromyalgia

When I wrote in the Daily Mail about how I’d overcome fibromyalgia, the response from readers was overwhelming.  Clearly, many people, like me, have been floored by the condition — and the lack of effective treatment — and were anxious for more details.

Unfortunately, no one really knows what causes fibromyalgia and there’s no cure.  Treatments such as painkillers rarely do more than ease the symptoms (characterised by debilitating muscle pain).

Many patients end up giving up work and normal daily life — I longed to retire early from my job as a GP just so I could rest all day.

After two years of misery, my condition was getting worse — but I then came across the theory that fibromyalgia may be linked to oxalates, which are compounds found in ‘healthy’ foods such as fruit, vegetables, salad, nuts and beans.

I cut these out of my diet and overnight my symptoms disappeared — the disabling muscle pains, tingling legs, fatigue and inability to concentrate all went.  But if I ate foods rich in oxalates, the symptoms returned within hours.

Why would this be so?  Oxalates are a kind of ‘natural’ plant pesticide and if the body doesn’t excrete them properly for some reason, it’s possible they accumulate in the muscles, brain and urinary system, causing a range of problems.

But though this made sense, no one could have been as surprised as me that the low oxalate diet actually helped.

And it really did — I was so happy to function normally again, to be able to run instead of amble, do my housework, carry on working and feel animated again.

I must stress that by no means am I an expert in fibromyalgia — eminent doctors and researchers, such as those behind the Fibromyalgia Association UK, have spent years studying this condition, and done much to support sufferers.

Indeed, the article I wrote was about my personal experiences and those of a small number of my patients.

But I can’t believe we are unique — I’m willing to believe my physiology may be a bit odd, but I felt sure there would be others in the same situation.

The important thing to remember is that this approach appears to go against the healthy eating principles you’ve been following for years.

Your fruit and vegetable intake is going to be limited to low oxalate produce, which will likely result in you eating much less than before (though this is no reason not to get your five a day — you just won’t have a wide range of fruit and vegetables to choose from).

Going low oxalate also means avoiding healthy wholewheat products and potatoes.

I’d also recommend avoiding vitamin C supplements — in large doses, this vitamin is metabolised into oxalate.

Some low-oxalate foods, such as sponge cake and shortbread biscuits, are high in sugar, so shouldn’t be eaten to excess.

However, there are plenty of low-oxalate foods that are low in sugar, such as eggs, meat and cheese.


Friday, August 24, 2012

Living proof that the food Fascists are wrong

Some more proof of extremely limited diets being quite viable

When William Staub died of natural causes at the age of 96 last month, his longevity seemed a tribute to the benefits of healthy living. After all, in the Sixties Staub invented the first mass-produced running treadmill, which found its way into millions of homes and gyms. He was still using his own treadmill right up to the last weeks of his life.

But there was also something odd about his lifestyle — an extremely restricted diet that runs contrary to all sensible ideas of nutrition. For most of his long life, Mr Staub lived solely on tomatoes, plain toast and tea — occasionally brightened by a slice of cheese and lettuce. How can anyone exist on such a regime for a month — let alone many decades?

Mr Staub’s story is just the latest in a long line of strange tales of people who, for years, will eat only a few odd foods, such as cheese and chips, or even just Monster Munch crisps (and only the one flavour, at that).

Why are they still alive? After all, we are constantly reminded how we must enjoy balanced diets that include five-a-day fruit and veg, along with the right proportions of protein, dairy and carbs, and all the vitamins and minerals that a body needs (and not too much of anything, remember!).

Nevertheless, thousands get along by eating far more restricted fare every day of their lives. Infamously, Lord Lucan would only ever have the same food for dinner: pork chops.

According to Muriel Spark, the novelist who researched Lucan’s life, the missing peer’s idea of variety was to have the chops glazed in gelatine during the summer months, while in winter he would have them grilled.  Lucan’s friends claimed this as evidence that he was too dull to have done anything so bold as to attempt to murder his wife, kill his nanny by mistake and then disappear.

But it is not only the famous or infamous who are affected. Last month Abi Stroud, an 18-year-old from Newport, South Wales, revealed that she has eaten only cheese and chips for the past eight years. The regime might sound like teenage heaven to some kids, but Stroud says it has been utter hell.  She eats three blocks of mature cheddar and three bags of chips a week. She will eat white bread — but only one particular brand.

The A-level student says that this is not through choice. She has a deep phobia of new foods. They terrify her, she says. Even the sight of a banana being peeled makes her gag.

As a result, her social life is as sorely restricted as her diet. ‘I never go out for dinner with friends or eat with other people because I’m so worried about being expected to eat something else,’ she told reporters.  ‘When people ask me to try something different, I feel sick and dizzy. A teacher tried to get me to eat a chicken nugget and I burst into tears.’

Now Miss Stroud has been diagnosed by specialists with a condition called Selective Eating Disorder. Her food aversion began when she was ten, and she believes it was linked to the death of her grandmother. Her condition saw her weight spiral to 15st when she was 16. Exercise then saw her slim down to 13st.

She is off to university next month, and she hopes her diagnosis can help her to break her phobic cycle. ‘Now I know it’s not just me being a fussy eater, I’m determined to try something new,’ she says.

Selective eating disorder (SED) is such a newly identified condition that it has not yet been accepted into the ‘bible’ of psychiatry, the American Diagnostic and Statistical Manual of Mental Disorders. It is expected to be included in the 2013 edition.

Meanwhile, The British Journal of Clinical Child Psychology and Psychiatry has described SED as: ‘A little-studied phenomenon of eating a highly limited range of foods, associated with an unwillingness to try new foods. When this happens social avoidance, anxiety and conflict can result.’

Pilot studies in America have found many thousands of people who seem to fit the criteria for the disorder. But SED should not be confused with normal childhood fussiness.

According to the Royal College of Psychiatrists, about 12 per cent of three-year-olds suffer from persistent selective eating — extremely faddy about their food — but fewer than one per cent carry it into adulthood.

Debbie Taylor is one of this minority. For more than a decade, the 32-year-old has eaten nothing but crisps. For the past two years, she has eaten only beef-flavoured Monster Munch for breakfast, lunch and dinner — two family-size bags a day.

The mother of a 12-year-old son, she says she has always been a fussy eater. ‘I can remember my mum trying everything to get me to eat healthily, cooking spaghetti bolognese and chopping up veg, which I refused to eat. She finally said: “If you don’t eat that, there’s nothing else.” I replied: “Fine. I don’t want anything.”’

Her food aversions led to anorexia and bulimia as a schoolgirl. In her late teens, she ate only dry-roasted peanuts, and bread sprinkled with salt. At the age of 25, she bought a packet of barbecue-flavoured crisps and fell in love with them.  ‘I didn’t eat anything else for the next eight years, until the day I decided to go wild and try Monster Munch. They had been a childhood treat, and they became my crisp of choice,’ she has said.

The amazing thing is that Ms Taylor looks remarkably healthy, as do many selective eaters.

So how on earth do their bodies manage to survive? The secret lies in the human frame’s remarkable diversity and adaptability, according to Rick Miller, a registered dietitian and spokesman for the British Dietetic Association.

He says the dietary guidelines put out by Government experts are our best scientific guess at a one-size-fits-all recommendation. But our individual nutritional needs vary widely — and at the far edges of this spectrum are people whose bodies exist happily on strange diets.

‘The human body is a fascinating organism. It has been built for survival, and people’s nutritional requirements can differ from person to person,’ Mr Miller explains.

‘The official recommended daily intakes of nutrients — called Dietary Reference Values (DRVs) — are only a guide. There are individuals who can survive on very little, as well as those who need a lot more every day. So some people can be apparently healthy on very restricted diets. However, they may be missing out on vital vitamins and minerals.’

On top of this, our systems can hoard scarce nutrients, which may also help people to survive on bizarre food regimes.

Mr Miller adds: ‘The body can store minerals, iron and B vitamins in the liver, so people on restricted diets can rely on their own stores for a while. People with SED may also have tastes that reflect their body’s vital nutritional needs.  ‘We see cravings for certain nutrients in pregnant women, and there might be something similar happening with some selective eaters.’

And he has a warning for healthy-diet evangelists: the worst thing you could ever do to someone with SED is to make them suddenly eat a ‘proper’ meal.

‘If you force someone with SED to suddenly take on lots of other nutrients, it can send their body into a form of shock,’ he says. ‘This is called “re-feeding syndrome” and can have serious consequences, such as causing heart attacks.’

Of course, no one with conventional tastes should try voluntarily eating a severely restricted diet. But if one had to do it, what would be the best thing to eat?

Scientists have looked into this question and found that Sophie Ray, 19, from Wrexham, North Wales, might be on the right track.  She has reportedly eaten nothing but cheese and tomato pizza for the past eight years after an attack of the stomach bug gastroenteritis left her with an extreme fear of food.  She says: ‘I love pizza. The thought of trying other foods makes me very anxious, I feel sick and clam up.’

Naturally, she would be healthier on a full-spectrum diet, but an investigation in 1997 by Dr Marion Nestle, a professor of nutrition and food studies at New York University, showed that real cheese pizza with real tomato sauce can provide us with sufficient nutrients to survive.

Professor Nestle says pizza mixes a lot of ingredients and can provide protein (from wheat crust and cheese) and essential nutrients, such as vitamin B12 (again from cheese) and vitamin C (from tomato), along with antioxidants and other nutrients. The olive oil used for good Italian pizza provides both calories and vitamin E.

‘Vitamin D can come from the sun, there is a fair amount of vitamin A in tomatoes,’ she says. ‘And to top it off, tomato sauce is a good source of nutrients such as lycopenes, with their rich anti-oxidant potential.  ‘If you are stuck on a desert island that happened to have a pizza parlour, you could do a lot worse.’

Only one food might be better — it is the food that many of us consumed solely for six months or more. And that is breast milk.

According to Jo Ann Hattner, a nutrition consultant and the author of Gut Insight, a book about digestive health: ‘Mother’s milk is a complete food. We may add some solid foods to an infant’s diet in the first year of life to provide more iron and other nutrients, but there is a little bit of everything in human milk.’

Technically, adults could survive on breast milk, too. The only problem (outside of the comedy world of Little Britain) would be finding a woman willing to provide it — and in sufficient quantities to keep a grown-up supplied.


Junk science about junk food

In the fight against obesity, should science matter? It depends on whom you ask. The answer may surprise you, and could make you realize that you shouldn't always trust the do-gooders.

A study published in the journal Pediatrics this month shows an association between obesity reduction and states with strict school rules against salty and fatty foods and sugary drinks. The researchers were properly prudent to caution that while they found a link between less obesity and rules against goodies, their study did not prove causation.

They noted that they did not control for key factors that could explain the results some other way. The conclusion of the study is clear and should be undisputed: These laws may, but don't necessarily, make a difference -- the same way umbrellas may be a leading cause of rain.

But consider the reaction from the executive director of the New York State Healthy Eating and Physical Activity Alliance: In response to the accurate NBC News headline, "School junk food bans may really help curb obesity," Nancy Huehnergarth tweeted, "Worth repeating. School food policy works!" But that's not repeating; it's distorting. Neither the study, the headline nor the story said the bans work.

There's nothing wrong with promoting a policy agenda, but it's wrong to mislead the public by knowingly twisting the findings of a study to serve that agenda. Unfortunately, policymakers and the public tend to give a free ride to anyone fighting obesity, smoking or any societal ill. If science is to determine policy, that is a mistake. We shouldn't blindly trust those who mislead us even if they want to save the world.

Similar fuzzy thinking applies in what turns out to be an asymmetrical battle over disclosure of funding and the credibility of scientific research. The study itself appears to be scientifically sound and comes with appropriate caveats. It was partially funded by the Robert Wood Johnson Foundation, or RWJF, which most media outlets disclosed. However, that disclosure is woefully incomplete; a distortion by omission. The typical reader would consider the funding source to bolster the credibility of the report. But I've found no coverage that also discloses the critical fact that the RWJF is one of the nation's leading proponents of the very laws being evaluated for their efficacy. Everyone might safely assume pretzel purveyors oppose the laws, but not everyone will know that RWJF has a dog in the fight.

Don't disbelieve the study just because it was funded by the RWJF, but be aware of the potential for bias, just as you would if a study funded by Coca-Cola reached the opposite result.

The same caution is also in order even for today's government-funded studies. The Centers for Disease Control and Prevention is pushing the limits of federal law by using taxpayer dollars, first from the stimulus bill and now from the health care law, to lobby for policy changes at the state and local level. Remember the Bloomberg administration's controversial (and unscientific) subway ads, where soda turned into globs of fat? Those were the type of federally funded campaigns meant to lay the groundwork for soda taxes. Less controversial are federally funded studies meant to justify the policies. The government isn't funding studies to determine whether these laws work; it is funding them to justify a position it has already taken.

If you ignore these principles you aren't following the science -- you are biased in favor of nanny state laws. That's fine, but in that case, don't pretend the science is on your side.


Thursday, August 23, 2012

Green tea extract 'eradicates cancer tumours'

A very preliminary study in laboratory glassware only

Powerful new anti-cancer drugs based on green tea could soon be developed after scientists found an extract from the beverage could make almost half of tumours vanish.  The University of Strathclyde team made 40 per cent of human skin cancer tumours disappear using the compound, in a laboratory study.

Green tea has long been suspected of having anti-cancer properties and the extract, called epigallocatechin gallate, has been investigated before.  However, this is the first time researchers have managed to make it effective at shrinking tumours.

Previous attempts to capitalise on its cancer-fighting properties have failed because scientists used intravenous drips, which failed to deliver enough of the extract to the tumours themselves.

So, the Strathclyde team devised a “targeted delivery system”, piggy-backing the extract on proteins that carry iron molecules, which cancer tumours hoover up.  The lab test on one type of human skin cancer showed 40 per cent of tumours disappeared after a month of treatment, while an additional 30 per cent shrank.

Dr Christine Dufès, a senior lecturer at the Strathclyde Institute of Pharmacy and Biomedical Sciences, who led the research, said: “These are very encouraging results which we hope could pave the way for new and effective cancer treatments.

“When we used our method, the green tea extract reduced the size of many of the tumours every day, in some cases removing them altogether.  "By contrast, the extract had no effect at all when it was delivered by other means, as every one of these tumours continued to grow.

“This research could open doors to new treatments for what is still one of the biggest killer diseases in many countries.”  She added: “I was expecting good results, but not as strong as these.”

Dr Dufès said population studies had previously indicated that green tea had anti-cancer properties, and scientists had since identified the active compound as epigallocatechin gallate.

But the Strathclyde researchers were the first to deliver it to tumours in doses high enough to have an effect.

She explained: “The problem with this extract is that when it’s administered intravenously, it goes everywhere in the body, so when it gets to the tumours it’s too diluted.  “With the targeted delivery system, it’s taken straight to the tumours without any effect on normal tissue.”

Cancer scientists are increasingly using targeted delivery to improve results, relying on the many different ‘receptors’ that tumours have for different biological substances.

In this instance, the scientists used the fact that tumours have receptors for transferrin, a plasma protein which transports iron through the blood.

The results have been published in the journal Nanomedicine.

The “ultimate objective” was a clinical trial in humans - but Dr Dufès said that was some way off.  “We have got to optimise the delivery system and therapeutic effect first,” she said.

Dr Julie Sharp, from Cancer Research UK, said: “A few studies have shown that extracts from green tea may have some effect on cancer cells in the lab but this has not yet been backed up by research in humans.”  She added: “It’s far too soon to say if enjoying a cup of green tea has any wider benefits in combating cancer but we know that a healthy balanced diet can help to reduce the risk.”


Midwives told to drop ‘30-second rule’ on cutting umbilical cord after delaying longer shown to benefit babies

This has been known for some time.  It seems a pity that it is not already generally implemented

A radical change in the way babies are delivered will see midwives delay cutting the umbilical cord following evidence that it improves the health of newborns.

The Royal College of Midwives is preparing to update its guidance to recommend delayed clamping for most women who give birth in hospitals, which will affect about 90 per cent of all births.

Current guidance from the RCM and the National Institute for Health and Clinical Excellence is to cut and clamp the umbilical cord within 30 seconds to protect babies from too much exposure to a synthetic hormone given to mothers to speed up labour and deliver the placenta.

It was also thought to help prevent a baby getting jaundice, a condition that causes yellowing of the skin, and was encouraged because of the risk of bleeding in new mothers.

However, doctors have long been divided over the issue – and studies have now found that delaying the procedure by just a few minutes has significant health benefits.

It is thought that being connected to the maternal blood supply for longer helps protect babies against iron deficiency and anaemia, and allows vital stem cells to be transferred.

Increasing numbers of women have also been asking midwives to delay cutting and clamping to allow more blood to drain from the placenta into the baby, and also simply so they are connected for longer.

The new guidance is being developed and will be announced at the College’s conference in November.

Mervi Jokinen, practice and standards development adviser at the RCM, said: ‘We are supporting the midwives not to clamp the cord immediately. We’ve not finalised the guidelines and in terms of how long it will recommend delaying clamping for, we don’t know.

‘Guidelines drawn up by different organisations vary from one to five minutes, and even up to ten.

‘Most midwives will have to use their judgment in terms of the clinical situation. It’s more likely to happen within three to five minutes.’

Mrs Jokinen added that the change was driven by the evidence from clinical studies, but also because women were increasingly asking for midwives to delay clamping.

‘The issue here was studies started to show that with early clamping you’re denying a baby a boost of blood and it was recognised that haemoglobin levels were much lower later on,’ she said.

‘It is said that babies who are healthy and well would benefit from greater haemoglobin levels. Women have also asked us to give their babies to them while they are attached.’

A study from Sweden found a delay of three minutes could reduce the risk of iron deficiency later in childhood as well as anaemia in newborns, which can lead to poor brain development.

At four months, fewer than one per cent of infants who had delayed clamping were deficient in iron compared with six per cent of those clamped immediately.  There was no increase in jaundice or other complications thought to be linked to delayed clamping.

In an editorial published in the same journal as the study, Dr Patrick van Rheenen, a consultant paediatrician at Groningen University in the Netherlands, said: ‘Delayed clamping clearly favours the child.

‘How much evidence is needed to convince obstetricians and midwives that it is worthwhile to wait for three minutes to allow for placental transfusion?’

A major US study published in 2007, which involved more than 1,900 newborns, found a two-minute delay was enough to reduce the risk of anaemia by half and low iron levels in the blood by a third.

The World Health Organisation dropped early clamping from its guidelines in 2007 and best practice on the issue varies across Europe.  Guidelines in the UK, drawn up by NICE, recommend early clamping although an update is due in 2014.

The Royal College of Obstetricians and Gynaecologists updated its guidance last year to recommend the cord ‘should not be clamped earlier than necessary, based on a clinical assessment of the situation’.

Although hospitals will still be able to decide their own birth protocols, it is likely that they will follow RCM policy.

David Hutchon, a retired consultant obstetrician and gynaecologist who has campaigned for years for a change in policy, said: ‘This is very welcome.  ‘But whether doctors will take any notice is another issue.  ‘There’s a lot of ignorance out there and people have just blindly followed guidance for years without questioning it.’


Wednesday, August 22, 2012

Mouse study:  Bowel cancer 'could be fuelled by E coli stomach bug'

Or is it that mice with cancer are more likely to harbour E coli because of weakened resistance?

One of Britain’s most common cancers could be fuelled by the E coli stomach bug, scientists believe.

The breakthrough raises the prospect of a vaccine against bowel cancer, which claims 16,000 lives a year and is the second most common form of the disease in women after breast cancer and the third most diagnosed in men.

The elderly, who are most at risk of bowel cancer, could also be screened for the ‘sticky’ strain of E coli that makes a DNA-damaging poison.

Although the idea that a bug is involved in cancer might seem strange, it is not unheard of, with a virus being to blame for most cases of cervical cancer and a bacterium strongly linked to stomach cancer.

Now, tests on mice and people, carried out in the UK and US, have pointed to E coli being a strong suspect in bowel cancer.

The concern surrounds a version that sticks well to the inside of the lower bowel, or colon.  It also contains genes that make a poison which causes the type of damage to DNA usually seen in cancer.

Although we usually think of E coli as causing food poisoning, these strains had been thought to live in the bowel without causing any problems.

However, tests show them to be much more common in bowel cancer patients than in healthy people.

Two-thirds of the 21 samples taken from bowel cancer patients contained the bug, compared to just one in five of those taken from healthy people, the journal Science reports.

Experiments also showed that mice inoculated with the bug are far more likely to develop bowel cancer – as long as the E coli carries the poison-making ‘pks’ genes.

Liverpool University’s Dr Barry Campbell, a co-author of the study, said: ‘The research suggests that E coli has a much wider involvement in the development of colon cancer than previously thought.

‘It is important to build on these findings to understand why this type of bacteria, containing the pks genes, is present in some people and not in others.’

Professor Jonathan Rhodes said: ‘The bottom line message is that there seems to be a strong association between a type of E coli and the development of colon cancer.

‘And given that this type of E coli is specifically able to damage DNA and inflict the sort of damage you get in a cancer, it is very likely it has a causative role, at least in some patients.’

The scientists, who collaborated with colleagues from the University of North Carolina, aren’t sure why some people who have the bug go on to develop cancer and others don’t.

But factors such as genes and diet are probably important.

Professor Rhodes said: ‘The literature on colon cancer taken as a whole suggests that having the right genes, taking exercise, possibly taking an aspirin a day, limiting red meat and eating plenty of leafy green vegetables all have a protective effect.’

If the link is confirmed, it could lead to tests for the rogue form of E coli being included in bowel cancer screening for the elderly.

In the long-term, a vaccine that stops the bug from taking root is also possible, added the professor.

There is a precedent for this – the HPV vaccine which is given to teenage girls wards off infection by the human papilloma virus - the bug behind the majority of cases of cervical cancer.

Henry Scowcroft, of Cancer Research UK, said: ‘This is an intriguing study in mice suggesting that the bacteria in our gut may play a role in the development of bowel cancer.

‘This would make sense, as we know that being infected with bacteria called H pylori can increase the chances of developing stomach cancer.

‘But since this study only involved mice and is still at an early stage, it’s not yet clear whether E coli is actually linked to bowel cancer in humans at all, let alone whether this knowledge could be used to help improve things for patients or people at risk.’


How your blood group can affect your heart disease risk: Britons with 'O' type 'benefit from natural protection'

The Japanese are fanatical about blood type.  Maybe they are onto something!  The effects below are, however, too small to be given much credence

A person’s blood group helps determine their risk of heart disease, a study has found.  Researchers claim almost half of Britons with blood group O, the most common blood type, benefit from some natural protection against the illness.

However, they said people from groups A and B are more at risk, while people from AB, the rarest blood group, are the most vulnerable.

The findings, published in the journal Arteriosclerosis, Thrombosis and Vascular Biology, are based on an analysis of two large US health and lifestyle studies.

The Harvard University researchers concluded people with blood group AB were 23 per cent more likely to suffer from heart disease.  Group B blood increased the risk by 11 per cent, and type A by 5 per cent.
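To see how modest those figures are, here is a minimal sketch (an illustration only, not an analysis from the study) that converts the reported relative-risk increases into absolute risks, assuming a purely hypothetical 10 per cent baseline risk of heart disease for group O — the baseline is my assumption, not a figure from the paper:

```python
# Hypothetical baseline risk for group O (assumed for illustration only)
baseline = 0.10

# Relative-risk increases reported in the article: A +5%, B +11%, AB +23%
increases = {"O": 0.00, "A": 0.05, "B": 0.11, "AB": 0.23}

# Convert each relative increase into an absolute risk
absolute = {group: baseline * (1 + rel) for group, rel in increases.items()}

for group in ("O", "A", "B", "AB"):
    print(f"Group {group}: {absolute[group]:.1%} absolute risk")
# Group O: 10.0%, A: 10.5%, B: 11.1%, AB: 12.3%
```

Even for the highest-risk AB group, the assumed absolute risk rises from 10 per cent only to 12.3 per cent, which illustrates why effect sizes of this order deserve limited weight.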

Lead researcher Professor Lu Qi, from the Harvard School of Public Health in Boston, said: ‘While people cannot change their blood type, our findings may help physicians better understand who is at risk for developing heart disease.

‘It’s good to know your blood type the same way you should know your cholesterol or blood pressure numbers.  'If you know you’re at higher risk, you can reduce the risk by adopting a healthier lifestyle, such as eating right, exercising and not smoking.’

The study compared blood groups and heart disease incidence but did not analyse the complex biological mechanisms involved.

There is evidence that type A blood is associated with higher levels of ‘bad’ type of cholesterol, low density lipoprotein (LDL), which is more likely to fur up the arteries.

AB blood is linked to inflammation, which also plays an important role in artery damage.

People with type O blood may benefit from a substance that is thought to assist blood flow and reduce clotting.

The researchers pointed out the study group was mostly white Caucasian and it is not clear whether the same findings applied to other ethnic groups.

Prof Qi said: ‘It would be interesting to study whether people with different blood types respond differently to lifestyle intervention, such as diet.’

Scientists from the University of Pennsylvania last year found the same gene that causes people to be blood group ‘O’ gives them some protection against heart attack.

But experts warn that while blood type O may offer some protection from heart trouble, blood type alone will not compensate for other factors that are linked to cardiovascular disease.

Other research found blood group O patients may be at greater risk for bleeding and blood transfusions after heart surgery.  Patients with AB blood type are 20 per cent less likely to die after heart bypass surgery than those with A, B or O blood types, said Duke University Medical Center researchers.

Doireann Maddock, Senior Cardiac Nurse at the British Heart Foundation, said: ‘While these findings are certainly interesting we’ll need more research to draw any firm conclusions about blood type and its role in heart disease risk.

‘Nobody can influence what type of blood they are born with but a healthy lifestyle is something everybody can have an influence over. Eating healthily, getting active and stopping smoking are the types of things you should be worrying about, not your blood type.’


Tuesday, August 21, 2012

California Initiative Puts Profit Ahead of Science

Proposition 37 props up profits for organic growers and denies the scientific consensus in favor of biotech crops.

The Proposition 37 petition asserts that “genetic engineering of plants and animals often causes unintended consequences. Manipulating genes and inserting them into organisms is an imprecise process. The results are not always predictable or controllable, and they can lead to adverse health or environmental consequences.” All of these claims, quoted from the findings and declarations section of the initiative, are solidly contradicted by the scientific consensus regarding biotech crops.

In a 2004 report, Safety of Genetically Engineered Foods: Approaches to Assessing Unintended Health Effects, the National Academy of Sciences (NAS) reviewed and compared the unintended consequences of conventional, mutagenic, and biotech plant breeding. The NAS report noted that all types of plant breeding—conventional, mutagenic, and biotech—could on rare occasions produce crops with unintended consequences. However, the report concluded, “The process of rDNA [biotech breeding] is itself not inherently hazardous.”

What about the claim that biotech breeding is “an imprecise process”? Not so, says the NAS report. Conventional breeding transfers thousands of unknown genes with unknown functions along with desired genes, and mutation breeding induces thousands of random mutations via chemicals or radiation. In contrast, the NAS report notes, “Genetic engineering methods are considered by some to be more precise than conventional breeding methods because only known and precisely characterized genes are transferred.”

Any adverse health consequences? After reviewing all the scientific evidence, the NAS report concluded, “To date, no adverse health effects attributed to genetic engineering have been documented in the human population.” In 2003, the International Council for Science (ICSU) representing 111 national academies of science and 29 scientific unions issued a report declaring, “Currently available genetically modified foods are safe to eat.” The ICSU pointedly added, “There is no evidence of any ill effects from the consumption of foods containing genetically modified ingredients.” With regard to eating foods made from biotech crops, the World Health Organization flatly states, “No effects on human health have been shown as a result of the consumption of such foods by the general population in the countries where they have been approved.”

At its annual meeting in June, the American Medical Association endorsed a report on the labeling of bioengineered foods from its Council on Science and Public Health. The report found that, “Bioengineered foods have been consumed for close to 20 years, and during that time, no overt consequences on human health have been reported and/or substantiated in the peer-reviewed literature.” The AMA report further noted, “Despite strong consumer interest in mandatory labeling of bioengineered foods, the FDA’s science-based labeling policies do not support special labeling without evidence of material differences between bioengineered foods and their traditional counterparts. The Council supports this science-based approach….” Every independent scientific body that has ever evaluated the safety of current biotech crop varieties has found them to be as safe or even safer than conventional crop varieties.

So who is funding this pack of lies? The petition for Proposition 37 was filed and launched by notorious trial lawyer James Wheaton. The corporations that back the initiative include Nature’s Path, which sells $300 million worth of organic cereals annually and has pledged $500,000 to the anti-science campaign, and Dr. Bronner’s Magic Soap, a private company with annual revenues of $50 million derived from peddling organic soaps, which has given $300,000. The biggest donor is Mercola Health Resources, run by Chicago osteopath and self-styled alternative-medicine guru Joseph Mercola, who promotes his sketchy supplements through his online health newsletter. Mercola has donated $800,000 to the campaign.

The Organic Consumers Association (OCA) has spent $635,000 promoting the initiative. OCA lists no donors on its 2010 IRS Form 990 and apparently gets most of its $1.3 million in revenues from phone solicitations contracted out to the Hudson Bay Company of Illinois, based in Lincoln, Nebraska. Lundberg Family Farms, with revenues of nearly $50 million from selling organic rice, has committed $200,000 to the campaign. Among the activist groups favoring Proposition 37 is the Institute for Responsible Technology (IRT), which is part of an anti-science coalition jumpstarted with a $1 million grant from Mercola. Among other claims, the IRT suggests that eating foods made from biotech crops is a cause of autism.

The traditional anti-biotech environmentalist groups have piled on and endorsed Proposition 37 as well, including Greenpeace, Friends of the Earth, the Pesticide Action Network, and the Sierra Club. Shoving science aside, the California Democratic Party has formally endorsed Proposition 37. In particular, Sen. Barbara Boxer (D-Calif.), who insists on accepting the scientific consensus concerning climate change, rejects the consensus on the safety of biotech crops and supports Proposition 37. Rep. Maxine Waters (D-Calif.) is also on board the pro-Proposition 37 bandwagon.

One other claim made in the Proposition 37 petition is that “90 percent of the public want to know if their food was produced using genetic engineering.” That is unfortunately about right. And why not? After all, profitmongering organic foods purveyors and scaremongering environmentalists have been spreading disinformation about the safety of biotech crops for more than two decades now.

However, there may be less than meets the eye to those poll results. The citizens of the European Union are supposed to be especially averse to biotech crops. However, a new European Commission report, A Decade of EU-Funded GMO Research, finds that polls may not be a good way to gauge actual consumer attitudes toward foods made with biotech crops. The researchers found that despite strongly negative polls, when it came to looking at actual buying behavior, “most people do not actively avoid GM [genetically modified] food, suggesting that they are not greatly concerned with the GM issue.”

Based on scientific assessments, the Food and Drug Administration requires labels only when a product raises safety or nutritional issues, which current foods using ingredients from biotech crops clearly do not. Thus the agency is correct when it says that such labels would be "inherently misleading" and would "imply that GM/GE foods are in any way different from other foods." Of course, the whole point of Proposition 37 is to mislead with regard to the safety of biotech crops. The coalition of anti-science campaigners want to mandate labels in this case because they hope that consumers will treat them as warning labels, turning away from perfectly safe and cheaper biotech and conventional foods toward pricier and more profitable organic fare. Of course, people who have been suckered by organic fearmongering and want to avoid biotech foods can simply purchase foods labeled organic now.

Although the Proposition 37 anti-science disinformation campaign is cloaked in bogus health fears and alleged consumer-choice concerns, Organic Consumers Association director Ronnie Cummins gave the game away in an open letter earlier this month. “The burning question for us all then becomes how—and how quickly—can we move healthy, organic products from a 4.2% market niche, to the dominant force in American food and farming?,” writes Cummins. Sadly, many well-meaning Californians appear to have been duped by the promoters of Proposition 37, whose corporate and special-interest backers cynically calculate that an electoral victory in November will produce higher profits and more donations. Here is a real case of putting profits ahead of science.


Scientists hopeful of drug addiction cure

Early days yet

AUSTRALIAN and international scientists may have found a cure for heroin and morphine addiction. The discovery could have far-reaching implications, leading to better pain relief without the risk of addiction to prescription drugs, while also helping heroin users kick the habit.

Dr Mark Hutchinson from the University of Adelaide said a team of researchers had shown for the first time that blocking an immune receptor, called TLR4, stopped opioid cravings.

"Both the central nervous system and the immune system play important roles in creating addiction, but our studies have shown we only need to block the immune response in the brain to prevent cravings for opioid drugs," Dr Hutchinson said.

The scientists, including a team from the University of Colorado Boulder, used an existing drug to target and block the TLR4 receptor.

The National Institute on Drug Abuse in the United States is further developing the drug, which has been proven to work in the laboratory, to test in clinical trials. As a result, clinical trials on patients could be underway in just two to three years’ time, Dr Hutchinson said.

If the clinical trials were successful, opioid drugs used to treat acute pain could potentially be co-formulated with the additional drugs to limit the chance of addiction.  This approach could also treat patients with heroin or other opioid addictions who are admitted to hospital and require pain relief.

These patients generally needed larger doses of drugs like morphine to treat pain because their bodies have developed a higher tolerance.

However, Dr Hutchinson said co-formulated drugs would mean these patients could be given lower doses.  "It might make it much easier to treat those already addicted or tolerant populations," Dr Hutchinson told AAP.

President of the Australian and New Zealand College of Anaesthetists, Dr Lindy Roberts, said that although opioids were important for the treatment of pain, they could have adverse effects.

She said treatments that could potentially separate the pain-relief aspects of drugs from their adverse effects would be welcome.

The findings were published this week in the Journal of Neuroscience.


Monday, August 20, 2012

Popcorn ingredient linked to Alzheimer’s

This had to come. Anything to knock something that is popular! The journal article is The Butter Flavorant, Diacetyl, Exacerbates β-Amyloid Cytotoxicity. Note that it was a study in laboratory glassware only.

Movie popcorn has often been criticized for its high calorie count, but now the tasty treat may harm more than just your waistline.

A recent study has found that diacetyl, an ingredient in popcorn responsible for its buttery flavor and smell, may be linked to Alzheimer’s disease.

The scientists said they focused on the substance because it has already been associated with respiratory and other health problems in workers at microwave popcorn and food-flavoring factories. Diacetyl is also used in other products such as margarines, snack foods, candies and baked goods, and in some beers and chardonnay wine.

Robert Vince, director of the Center for Drug Design at the University of Minnesota and the study’s lead author, said diacetyl is similar in structure to another substance that aids the clumping of beta-amyloid proteins in the brain – a significant indicator of Alzheimer’s.

Just like this substance, diacetyl was found to increase the amount of beta-amyloid clumping, the researchers said. The popcorn ingredient was also able to penetrate the blood-brain barrier, the defense that prevents harmful substances from entering the brain.

The study was published in the journal Chemical Research in Toxicology.


Boosting bacteria in drinking water may improve health

Every gallon of purified drinking water is home to hundreds of millions of bacteria. Water treatment facilities try to remove them – but perhaps encouraging some of the microbes to grow could benefit human health.

Lutgarde Raskin of the University of Michigan in Ann Arbor says that workers at water treatment facilities across the US try to destroy all of the bacteria in drinking water with infusions of chlorine and other disinfectants. But this is nearly impossible to achieve with the current technology.

The present approach also ignores the fact that the drinking water microbiome contains some bacteria that can be beneficial. For instance, nitrates that can contaminate drinking water could be converted by some bacteria into harmless nitrogen gas. Raskin and her team suggest that encouraging the growth of these bacteria in drinking water could actually improve the quality and safety of the product.

Between April and October 2010, the researchers analysed bacterial DNA in drinking water treated at municipal facilities in Ann Arbor. They wanted to work out exactly which bacteria were present, and what factors influenced the abundance of the various components of the bacterial community.

They found that slightly altering the water's pH during the filtration process, or even changing how filters were cleaned, helped good bacteria outcompete more harmful microorganisms for the limited resources in the water.

"It does no good to try to remove bacteria entirely," says Raskin. "We are suggesting that a few simple changes can be made that will give bacteria that are good for human health an edge over harmful competitors."


Sunday, August 19, 2012

Could dark chocolate stave off dementia?

A number of things to note: Firstly, this is a study of some fairly dippy oldies, so it should not be generalized beyond that. Secondly, we have no means of knowing whether the study generalizes even to all dippy oldies. Is the effect peculiar to Italians, for instance? Thirdly, as the abstract below shows, the alleged beneficial effect was not observed on the widely used Mini Mental State Examination, a pretty comprehensive test for dementia. So which set of results should we trust? Fourthly, averages based on the results of only 30 people are likely to be quite unstable. Fifthly, it is again only dark chocolate that gets a tick of approval. Most people don't like dark chocolate much, so it suits the usual elite tendency to condemn what is popular and praise what is not
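The point about 30-person averages being unstable is easy to demonstrate with a quick simulation. The sketch below is an illustration only, not the study's data: it assumes a plausible test score distribution (mean 45, standard deviation 15, roughly the scale of the Trail Making Test A times in the abstract) and shows how much group means of n=30 bounce around by chance alone.

```python
import random
import statistics

# Illustration only (assumed numbers, not the study's data): how variable
# a mean based on only 30 subjects can be.
random.seed(42)

def group_mean(n=30, mu=45.0, sigma=15.0):
    """Mean score of one simulated group of n subjects."""
    return statistics.mean(random.gauss(mu, sigma) for _ in range(n))

# Draw many groups of 30 from the SAME population and look at the spread
# of their means: differences of several points arise by chance alone.
means = [group_mean() for _ in range(1000)]
print(round(min(means), 1), round(max(means), 1))
print(round(statistics.stdev(means), 1))  # roughly sigma/sqrt(30), i.e. ~2.7
```

In other words, two identical groups of 30 can easily differ by five or more points on such a test with no treatment effect at all, which is why small-group averages deserve caution.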

A daily dose of chocolate could help keep dementia and Alzheimer's at bay, a study suggests.

Researchers found that consuming cocoa every day helped improve mild cognitive impairment – a condition involving memory loss which can progress to dementia or  Alzheimer's – in elderly patients.

For the study, 90 people aged 70 or older  diagnosed with mild cognitive impairment were split into three groups of 30 and given either a high, medium or low dose of a  cocoa drink daily.

The drink contained flavanols – chemicals associated with a decreased dementia risk which are found in a variety of foods, including cocoa products such as dark chocolate.

The participants' diet was restricted to  eliminate other sources of flavanols, such as tea or red wine.

Their cognitive function was examined using tests of factors including working memory and processing speed.

Researchers found those who drank the high and medium doses daily had significantly better cognitive scores by the end of the eight-week study in a number of categories, including working memory.

Those given the higher doses of the flavanol drink improved far more than those given the lowest dose, the study, published in the journal Hypertension, found. 

Insulin resistance and blood pressure also decreased in those drinking high and medium doses of the flavanol drink.

Doctor Giovambattista Desideri of the  University of L'Aquila in Italy, lead author of the study, said: 'This study provides encouraging evidence that consuming cocoa flavanols, as  a part of a calorie-controlled and nutritionally-balanced diet, could improve cognitive function.

'Larger studies are needed to validate the findings, figure out how long the positive effects will last and determine the levels of cocoa flavanols required for benefit.'

Dr Laura Phipps, of Alzheimer's Research UK, said: 'Cocoa-based treatments for brain  function would likely have patients queuing out the door, but this small study of flavanols is not yet conclusive.'

Benefits in Cognitive Function, Blood Pressure, and Insulin Resistance Through Cocoa Flavanol Consumption in Elderly Subjects With Mild Cognitive Impairment

By  Giovambattista Desideri et al.


Flavanol consumption is favorably associated with cognitive function. We tested the hypothesis that dietary flavanols might improve cognitive function in subjects with mild cognitive impairment. We conducted a double-blind, parallel arm study in 90 elderly individuals with mild cognitive impairment randomized to consume once daily for 8 weeks a drink containing ≈990 mg (high flavanols), ≈520 mg (intermediate flavanols), or ≈45 mg (low flavanols) of cocoa flavanols per day. Cognitive function was assessed by Mini Mental State Examination, Trail Making Test A and B, and verbal fluency test. At the end of the follow-up period, Mini Mental State Examination was similar in the 3 treatment groups (P=0.13). The time required to complete Trail Making Test A and Trail Making Test B was significantly (P<0.05) lower in subjects assigned to high flavanols (38.10±10.94 and 104.10±28.73 seconds, respectively) and intermediate flavanols (40.20±11.35 and 115.97±28.35 seconds, respectively) in comparison with those assigned to low flavanols (52.60±17.97 and 139.23±43.02 seconds, respectively). Similarly, verbal fluency test score was significantly (P<0.05) better in subjects assigned to high flavanols in comparison with those assigned to low flavanols (27.50±6.75 versus 22.30±8.09 words per 60 seconds). Insulin resistance, blood pressure, and lipid peroxidation also decreased among subjects in the high-flavanol and intermediate-flavanol groups. Changes of insulin resistance explained ≈40% of composite z score variability through the study period (partial r2=0.4013; P<0.0001). To the best of our knowledge, this is the first dietary intervention study demonstrating that the regular consumption of cocoa flavanols might be effective in improving cognitive function in elderly subjects with mild cognitive impairment. This effect appears mediated in part by an improvement in insulin sensitivity.
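The abstract reports enough summary statistics to check its headline comparison by hand. The sketch below is a back-of-the-envelope recomputation, not the authors' actual analysis: it runs Welch's t-test from the reported Trail Making Test A figures (high-flavanol 38.10±10.94 s versus low-flavanol 52.60±17.97 s, assuming 30 subjects per group as stated in the design).

```python
import math

# Back-of-the-envelope check from the abstract's summary statistics
# (not the authors' analysis): Welch's t-test for unequal variances.
def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t statistic and approximate (Welch-Satterthwaite) degrees of freedom."""
    v1, v2 = s1**2 / n1, s2**2 / n2
    t = (m1 - m2) / math.sqrt(v1 + v2)
    df = (v1 + v2)**2 / (v1**2 / (n1 - 1) + v2**2 / (n2 - 1))
    return t, df

# Trail Making Test A: high-flavanol vs low-flavanol groups, n=30 each
t, df = welch_t(38.10, 10.94, 30, 52.60, 17.97, 30)
print(round(t, 2), round(df, 1))  # |t| well above 2, consistent with the abstract's P<0.05
```

The statistic comes out far beyond the conventional significance threshold, so the reported P<0.05 for this particular comparison is at least internally consistent with the group means and standard deviations given.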


When Did Milk Become Bad for You?

Last week, as I entered Union Station Metro station in Washington, I saw ads for what appeared to be First Lady Michelle Obama’s “Let’s Move” campaign. It was a series of three ads, the first said: “Let’s move hot dogs out of school lunch.” Okay, fine, because, let’s face it, while hot dogs may be scrumptious and all-beef, they look like small batons of questionable meat.

The second ad said: “Let’s move cheese out of school lunch.” I mean, I guess. Cheese, while a good source of calcium (and delicious), is not necessarily the healthiest thing in the world. I don’t think it should be removed from children’s lunches, but I just chalked it up to liberal nanny-state policies.

It was the third ad that really got my goat. Hidden in the corner, the least noticeable sign read: “Let’s move milk out of school lunch.” Really? Milk? Arguably one of the best sources of calcium, which, as I’ve been told since I was old enough to remember, makes bones strong?

When did milk become unhealthy? Rather, when did milk become so bad for you that it should be banned from school lunches and put on the same level as the hot dog?

Curious about the reasoning behind the sudden “war on milk,” I visited the website mentioned on the ad. To my surprise, it was not, in fact, Michelle Obama’s “Let’s Move” website, but was a separate organization called “Let’s Really Move!” – an apparent response to the failure of the First Lady’s core initiative:

    “The stalled ‘Let’s Move’ campaign needs to get back in gear. The ‘Let’s Move’ campaign has abandoned any major effort to improve the nation’s nutrition, focusing instead on noncontroversial recommendations about exercise. That strategy will not combat skyrocketing rates of childhood obesity, diabetes, and high cholesterol.”

As for the organization’s crusade against milk (even skim milk), they claim it does not actually promote bone health or protect against osteoporosis and is high in fat, cholesterol and sugar. Instead of milk, they suggest beans, broccoli, kale, tofu and whole grains. Mmmm! That’s sure to get the kids excited about healthy eating!

Conversely, Michelle Obama’s “Let’s Move” suggests fat-free milk is okay.

Perhaps you’ve noticed, as I have, that it’s nearly impossible these days to keep track of what foods are good for you and which ones aren’t. I grew up thinking milk was great, now it’s apparently as bad for you as what’s sold at sporting events.

Advocacy groups and nanny-state politicians have for decades tried to control us through our diets. But they don’t just try to control us by telling us what we should or should not eat, they also control the very supply of food.

For milk, the government subsidizes the dairy industry and sets production limits, which means taxpayers and consumers are paying more for a gallon of milk than they should be.

And the current version of the farm bill will make such market distortions even worse. The new Dairy Management Supply Program will set milk prices and effectively tax dairy farmers if prices fall below those price controls. The government will then use that tax money to purchase products, controlling the supply.

Dairy farmers are not happy about this, and are meeting with members of Congress this month in order to discuss their issues with the farm bill. Wonderful. We know what that will mean: dairy farmers will receive special consideration and carve-outs within the farm bill, further muddling the bill and promoting corporate welfare.

With this kind of control over something so basic to our diets, it’s no wonder the farm bill has stalled in Congress. Now, lawmakers need to work to remove these kinds of market-distorting special handouts, not just because they promote cronyism, but because, seriously, “milk does a body good.”


Friday, August 17, 2012

Are walnuts good for sperm?

No journal source is mentioned, so it is difficult to evaluate. The effect was tiny and may have been a Rosenthal effect if the study was not double-blind

Men who want to increase their  fertility levels might benefit from eating walnuts, according to a study.

Researchers in America asked a group of young men in their 20s and 30s to eat a 75g packet every day for three months.  Compared with a group of men who avoided walnuts, they managed to increase their sperm count and its quality, potentially giving them a  better chance of fathering a child.

Scientists at the University of California chose walnuts because they are a major source of ‘good’ polyunsaturated fats.  They are rich in omega 3 and omega 6 – also found in oily fish – which are thought to be good for sperm development and function but are lacking in many Western diets.

One in six couples struggle to conceive, and it is thought around 40 per cent of these problems are due to problems with the man’s sperm.

Professor Wendie Robbins, of UCLA’s School of Public Health, said that as the 117 volunteers were healthy non-smokers, it was not clear whether walnuts would help men with fertility problems, but the nuts had a positive effect.

The researchers analysed the men’s sperm concentration, how strongly they swam and their genetic makeup.  Those eating walnuts saw a modest 3 per cent average increase in sperm swimming, compared with no increase in the group who did not eat walnuts.

And fewer of the walnut eaters were seen with aneuploidy – a  disorder where sperm have too many or too few chromosomes.

Allan Pacey, a fertility expert at the University of Sheffield, said the study found only a ‘quite modest’ increase in sperm count. ‘I would be cautious about recommending this as a therapy for infertility until it has been studied further,’ he added.


Powerful new drug eases pain and inflammation of arthritis sufferers

A powerful new drug that could bring relief to hundreds of thousands of Britons crippled by rheumatoid arthritis is being developed.

Patients taking the pill, tofacitinib, suffered less pain and inflammation than those on today's best treatments.

Scientists say it was also more effective at slowing damage to joints, after results of an ongoing clinical trial of nearly 1,000 sufferers showed the pill is 'superior' to the common treatment methotrexate, or MTX.

Tofacitinib targets pathways in the cells that regulate inflammation. And, unlike many treatments for rheumatoid arthritis - which affects around 400,000 Britons - it can be taken orally instead of by injection.

Half of those on the trial had fewer symptoms than those on MTX and displayed less joint damage.

Judith Brodie, chief executive of UK charity Arthritis Care, told the Daily Express: 'This looks very promising. Anything that can make a difference to people with rheumatoid arthritis is hugely important.'

The disease, in which the body's immune system attacks the joints, can strike at a young  age, unlike the more common osteoarthritis, which mainly strikes older people.

It usually affects hands and feet, although any of the body's joints can become inflamed and painful.

Tofacitinib, still in development, belongs to a new group of drugs called Janus kinase (JAK) inhibitors, which can be used to treat adults with moderate to severe rheumatoid arthritis.

The current more common treatment involves painkillers and anti-inflammatory drugs which tackle the pain and swelling.

Disease-modifying anti-rheumatic drugs or DMARDS - of which methotrexate is the most common - are used to slow down the progression of the disease and joint damage.

Tofacitinib is being reviewed by regulators in the U.S., Europe and Japan. If approved, it would become the first new-generation inhibitor treatment drug on the market.

A spokesman for pharmaceutical giant Pfizer, which has developed the drug, said: 'Tofacitinib is a novel, oral small molecule Janus kinase (JAK) inhibitor that is being investigated as an immunomodulator and disease-modifying therapy for rheumatoid arthritis.

'Tofacitinib is currently under review by several regulatory agencies around the world, including in the European Medicines Agency.'

Rheumatoid arthritis is less common than osteoarthritis, which affects around 8.5 million Britons.

Osteoarthritis is a degenerative condition associated with age, treated by exercise and painkillers.