Thursday, October 31, 2013






A miscarriage of science: BPA’s unproven pregnancy risk

The headlines are out: Pregnant women should fear the chemical Bisphenol A (BPA) because a “new study” says it increases the risk of miscarriage. Fortunately, we have lots of good reasons to doubt these headlines.

What does the study really say? We don’t completely know since it’s not available in any peer-reviewed publication. All that’s available is an abstract produced for a recent presentation at a conference hosted by the American Society for Reproductive Medicine (ASRM). The abstract does, however, provide enough information to allay fears—despite how the headlines read.

Based on the abstract, we can see that the study examined 114 women in the early stages of pregnancy who were also at high risk of miscarriage. The researchers took “spot samples” of the women’s blood and measured BPA in their blood serum. Sixty-eight of these women suffered miscarriages, and the rest carried their babies to term.

The researchers then compared serum BPA levels to see whether the women who miscarried had more BPA in their blood than the others. They divided the women into four groups (quartiles) based on BPA exposure, ranging from those with the lowest levels to those with the highest. Using this data, the researchers calculated that those in the highest-exposure quartile had a roughly 80 percent higher risk of miscarrying than those in the lowest.

Sounds pretty clear, right?  WRONG!  Here’s why this study isn’t as meaningful as you might think.

First, the authors did not find a cause-and-effect relationship; they simply found an association, and a pretty weak one. Researchers express the strength of such associations numerically as a “risk ratio.” In this study, the risk ratio for the highest-exposure group was 1.83, which is low and suggests that the result may have arisen from chance or researcher bias. “Although any measure of risk would follow a continuous distribution and there are no predefined values that separate ‘strong’ from ‘moderate’ or ‘weak’ associations, relative risks below 3 are considered moderate or weak,” points out Paolo Boffetta of the Tisch Cancer Institute at Mount Sinai School of Medicine in an article on the topic of relative risk.
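For readers who want to see where a figure like 1.83 comes from: a risk ratio is simply the rate of the outcome in the highest-exposure group divided by the rate in the lowest. The counts in this sketch are hypothetical, chosen only to produce a ratio near the one reported; the abstract does not give the per-quartile numbers.

```python
# Risk ratio = event rate in the exposed group / event rate in the reference group.
# The counts below are hypothetical, picked to land near the study's 1.83.

def risk_ratio(events_hi, total_hi, events_lo, total_lo):
    """Ratio of event rates between two exposure groups."""
    return (events_hi / total_hi) / (events_lo / total_lo)

# e.g. 22 of 28 women in the top BPA quartile miscarry vs 12 of 28 in the bottom
rr = risk_ratio(22, 28, 12, 28)
print(round(rr, 2))  # → 1.83
```

Note that a ratio like this says nothing about cause and effect; it only describes how the two groups' rates compare.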

The study’s promoters are aware that the findings are inconclusive, yet they advance alarming claims with clever rhetoric. For example, Dr. Linda Giudice, president of the ASRM, commented to the press that although the study proves nothing, it adds to the “biological plausibility” that BPA affects fertility and health. It is true that if an association also has a biological explanation (plausibility), researchers can make a stronger argument for a cause-and-effect relationship—particularly if their study discovers a reasonably strong association. But using biological plausibility to rationalize a weak association is itself pretty weak!

Second, the sample size here is very small, which greatly increases the probability that the weak association is little more than chance. Larger samples are by definition more representative of the larger population. Accordingly, if this study were ten or twenty times bigger, a weak association would carry greater meaning.
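One way to see the sample-size point concretely: the standard 95 percent confidence interval for a risk ratio (computed with the usual log-normal approximation) is very wide when the group counts are small, and narrows as they grow. This is an illustrative sketch only; the quartile counts are hypothetical, not taken from the study.

```python
import math

def rr_confint(a, n1, c, n2, z=1.96):
    """95% CI for a risk ratio via the log-normal approximation.
    a/n1 = events/total in the exposed group, c/n2 in the reference group."""
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)  # std. error of ln(risk ratio)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts on the scale of this study: a wide interval
print(rr_confint(22, 28, 12, 28))
# The same rates with ten times the sample: a much narrower interval
print(rr_confint(220, 280, 120, 280))
```

The point estimate is identical in both cases; only the larger sample pins it down tightly enough to mean much.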

Third, the BPA was measured only once. Since BPA levels in the body can fluctuate considerably over time, one-time measures can’t reveal which women really have higher exposures. Accordingly, the data going into the study is not good enough to draw conclusions.

Finally, while the authors and news organizations suggest this adds to the research indicting BPA, they fail to note the many other studies that contradict it. This study would be much stronger if it were consistent with the larger body of research, particularly the largest, best-designed studies. But it’s not.

Scientific panels around the world have reviewed the full body of BPA research repeatedly, and they all concluded that current consumer exposure to BPA is simply too low to have any adverse public health impacts!

This study is consistent with the almost daily barrage of government-funded, usually small and often poorly designed studies reporting largely meaningless weak associations. Politicians have funded them in response to the clamoring and hype created by misguided, anti-chemical environmental activists. We can expect to see more of this misuse of science as the researchers and their activist partners continue the fear-mongering.

This case is full of such examples. In a press release, referencing this unpublished study, Giudice notes: “Many studies on environmental contaminants’ impact on reproductive capacity have been focused on infertility patients and it is clear that high levels of exposure affect them negatively. These studies extend our observations to the general population and show that these chemicals are a cause for concern to all of us.”

This cryptic comment has no real relevance to the study at hand, even though Giudice placed it directly under the abstract in the release. After all, what chemicals is she talking about? Who knows? It is clear that the high-level chemical exposures to which she refers have nothing to do with extremely low, easily metabolized trace levels of BPA. Nothing at all.  But her comment worked well in the “news” stories alleging BPA-induced miscarriage risk. And that’s the only apparent reason to make such a misplaced comment.

The authors have also spun the issue to grab headlines. Lead author Ruth B. Lathi noted in USA Today that although the study did not show that BPA is dangerous, “it’s far from reassuring that BPA is safe.” Well, that comment may be safe to make, since you can’t prove absolute safety—of anything! Not even a glass of water.

She also recommends avoiding plastic food packaging, not cooking in plastic food containers, and not leaving bottled water in the sun. Never mind that BPA is not used in most of these products. There isn’t any BPA in the lightweight, flexible plastic used for single-use water bottles or in plastic storage containers such as Gladware, which are the very products Lathi seems to be suggesting we avoid. She also says to limit eating canned food.

BPA is used to make hard, clear plastics, such as safety goggles and the 5-gallon jugs used in office water coolers. It is also used in resins that line canned goods to prevent rust and the growth of deadly pathogens in our food supply. Can Lathi assure us that removing those resins would be safe? Certainly not, yet her rhetoric pushes policy in that direction.

But what does it matter? After all, the scientific discussion about BPA apparently isn’t about the facts—or good science.

SOURCE






Pregnant women who gain a lot of weight may be more likely to have a child with autism

Some appropriate caution about the causal path expressed below

Pregnant women who put on a lot of weight may be more likely to have an autistic child. Weight gain in pregnancy varies greatly, with most women gaining between 17.5lb and 30lb, most of it after week 20.

Now a study has found that autism spectrum disorder was more prevalent in children born to mothers who gained 33.5lb during pregnancy than in those who gained 30.5lb, for instance.

Previous studies have identified links between  pregnancy weight gain and autism.  But this new study found a link between autism and pregnancy weight gain after taking into account how much a woman weighed before.

Lead author Dr Deborah Bilder, an assistant professor of psychiatry at the University of Utah, said: 'These findings suggest that weight gain during pregnancy is not the cause of autism.

'Rather, [it] may reflect an underlying process that it shares with autism spectrum disorders, such as abnormal hormone levels or inflammation.'

Researchers carried out the study by comparing the cases of eight-year-olds living in Salt Lake, Davis and Utah counties.

A group of 128 children diagnosed with autism spectrum disorders were compared to a control group of 10,920 children of the same age and gender.

The research was possible in Utah because the state tracks how much weight women put on during pregnancy.

The study had one surprising result: researchers found no connection between a mother’s BMI before she became pregnant and the incidence of autism, a finding at odds with previous studies.

Researchers also examined a second group of 288 Utah children diagnosed with autism spectrum disorders and compared their data with that of unaffected siblings.

In both comparisons, the pregnancy weight gain patterns obtained from birth certificate records were identified as a common factor among mothers of children with autism.

The findings suggest that these small differences in pregnancy weight gain and autism may share the same underlying cause.

Dr Bilder added: 'The findings in this study are important because they provide clues to what may increase the risk of having an autism spectrum disorder.

'Doctors have known for a long time that proper nutrition is essential to a healthy pregnancy. Pregnant women should not change their diet based on these results. Rather, this study provides one more piece for the autism puzzle that researchers are exploring.'

The findings were published in the journal Pediatrics.

SOURCE

Wednesday, October 30, 2013



Alzheimer’s to be treated by replacing faulty genes -- one day

Alzheimer’s will be treated or even prevented by replacing faulty genes, an expert in the disease has predicted.  Men and women could be given a nasal spray packed with healthy versions of the defective genes that cause the illness.

Professor Julie Williams, of Cardiff University, said the entire population could eventually be screened in middle-age to identify those at most risk of the memory-robbing disease.

They could then be given cutting-edge gene therapy and other treatments to stop the disease ever developing.

Alzheimer’s and other forms of dementia affect more than 800,000 Britons, with the number expected to double in a generation as the population ages.

Existing drugs delay the progress of Alzheimer’s, but their failure to tackle the underlying cause in the brain means that the effect quickly wears off and the disease soon takes its devastating course.

The professor, who was given a CBE in the Queen’s birthday honours last year for her work on Alzheimer’s, made the prediction after jointly leading the biggest-ever study into the genetics of the disease.

The landmark study, involving more than 180 researchers from 15 countries, pinpointed 11 genes that raise the risk of Alzheimer’s.

The size of the collaboration allowed them to identify more genes in less than three years than have been found in the past two decades.

By taking the total to 21, it also more than doubles the number of known Alzheimer’s genes, the journal Nature Genetics reports.

Alzheimer’s charities said the ‘exciting’ discovery of genes linked with the disease ‘opens up new avenues to explore in the search for treatments for the condition’.

The new genes were found by comparing the DNA of more than 25,000 people suffering from Alzheimer’s with that of 48,000 people without the disease.

Professor Williams, who is chief scientific adviser to the Welsh Assembly in addition to being a working researcher, said: ‘What surprised us most about the findings was the very strong pattern that showed several genes implicating the body’s immune system in causing dementia.

‘Each individual gene will carry a relatively low risk but when you put all the information together, they are telling us an interesting and novel story and that takes us in a new direction.’

She added that the find needs to be followed up with ‘great urgency’ to determine just how the genes cause dementia. Knowing this will speed the search for new drug treatments.

But another possibility is correcting the flawed DNA, or the genetic variations that cause Alzheimer’s, by giving people a nasal spray packed with healthy genes.

Professor Williams said: ‘I do think that in ten years’ time we might be looking at a genetic therapy. That might be feasible but not quite yet.

‘If you have variation that you know is contributing to a disease, the most effective way of reducing the risk is to change the variation in a very precise way. Genetic therapies will allow you to just change the elements that are contributing to the disease.’

She added that ‘in the distant future’ everyone in their 40s or 50s could be screened for dementia genes and given genetic therapy and other treatments in a bid to stop the disease ever developing.

The study also suggested links between Alzheimer’s and multiple sclerosis and Parkinson’s disease.

Professor Hugh Perry of the Medical Research Council, which part-funded the study, as did Alzheimer’s Research UK, said: ‘Understanding how our genetic code contributes to Alzheimer’s disease, other dementias and neurodegenerative diseases is a crucial part of the puzzle in learning how we can prevent their devastating effects.’

SOURCE






Kids should spend no more than TWO HOURS online per day, warn doctors

This is just opinion -- relying on a few correlational studies that prove nothing

The recommendations are bound to prompt eye-rolling and LOLs from many teens, but an influential pediatricians' group says parents need to know that unrestricted media use can have serious consequences.

It's been linked with violence, cyberbullying, school woes, obesity, lack of sleep and a host of other problems. It's not a major cause of these troubles, but 'many parents are clueless' about the profound impact media exposure can have on their children, said Dr. Victor Strasburger, lead author of the new American Academy of Pediatrics policy.

'This is the 21st century and they need to get with it,' said Strasburger, a University of New Mexico adolescent medicine specialist.

The policy is aimed at all kids, including those who use smartphones, computers and other internet-connected devices. It expands the academy's longstanding recommendations on banning televisions from children's and teens' bedrooms and limiting entertainment screen time to no more than two hours daily.

Under the new policy, those two hours include using the internet for entertainment, including Facebook, Twitter, TV and movies; online homework is an exception.

The policy statement cites a 2010 report that found U.S. children aged 8 to 18 spend an average of more than seven hours daily using some kind of entertainment media. Many kids now watch TV online and many send text messages from their bedrooms after 'lights out,' including sexually explicit images by cellphone or internet, yet few parents set rules about media use, the policy says.

'I guarantee you that if you have a 14-year-old boy and he has an internet connection in his bedroom, he is looking at pornography,' Strasburger said.

The policy notes that three-quarters of kids aged 12 to 17 own cellphones; nearly all teens send text messages, and many younger kids have phones giving them online access.

'Young people now spend more time with media than they do in school — it is the leading activity for children and teenagers other than sleeping,' the policy says.

Strasburger said he realizes many kids will scoff at advice from pediatricians — or any adults.

'After all, they're the experts! We're media-Neanderthals to them,' he said. But he said he hopes it will lead to more limits from parents and schools, and more government research on the effects of media.

The policy was published online Monday in the journal Pediatrics. It comes two weeks after police arrested two Florida girls accused of bullying a classmate who committed suicide. Police say one of the girls recently boasted online about the bullying and the local sheriff questioned why the suspects' parents hadn't restricted their internet use.

SOURCE

Tuesday, October 29, 2013




Drinking three cups of coffee a day could halve the risk of liver cancer

Some proper reservations about the direction of causation expressed below

Three cups of coffee a day could reduce the risk of liver cancer by up to 50 per cent, latest research has shown.

One study found the drink reduces the risk of the most common type of liver cancer, hepatocellular carcinoma (HCC), by 40 per cent but separate research indicated that risk could be reduced by half.

Study author Dr Carlo La Vecchia said: 'Our research confirms past claims that coffee is good for your health, and particularly the liver.'

Dr La Vecchia, of the department of epidemiology, Istituto di Ricerche Farmacologiche 'Mario Negri', and the department of clinical sciences and community health, Università degli Studi di Milano, Italy, added: 'The favorable effect of coffee on liver cancer might be mediated by coffee's proven prevention of diabetes, a known risk factor for the disease, or by its beneficial effects on cirrhosis and liver enzymes.'

The researchers conducted a meta-analysis of articles published between 1996 and September 2012, involving 16 high-quality studies and a total of 3,153 cases.

It also included data on 900 more recent cases of HCC published since the last detailed research in 2007.

Despite the consistency of results across studies, time periods and populations, it is difficult to establish whether the association between coffee drinking and HCC is causal, or if this relationship may be partially attributable to the fact that patients with liver and digestive diseases often voluntarily reduce their coffee intake.

Dr La Vecchia, whose research was published in the journal Clinical Gastroenterology and Hepatology, added: 'It remains unclear whether coffee drinking has an additional role in liver cancer prevention.

'But, in any case, such a role would be limited as compared to what is achievable through the current measures.'

Primary liver cancers are largely avoidable through hepatitis B virus vaccination, control of hepatitis C virus transmission and reduction of alcohol drinking.

These three measures can, in principle, avoid more than 90 percent of primary liver cancer worldwide.

Liver cancer is the sixth most common cancer in the world, and the third most common cause of cancer death.

HCC is the main type of liver cancer, accounting for more than 90 percent of cases worldwide.

Chronic infections with hepatitis B and C viruses are the main causes of liver cancer; other relevant risk factors include alcohol, tobacco, obesity and diabetes.

SOURCE





Eat chocolate to banish greys: How to munch your way to younger hair

This seems to be just opinion

You can spend hundreds on conditioners and treatments — but healthy, beautiful hair is more about what you put into your body than what you slather on your head.

Ricardo Vila Nova, resident trichologist at Urban Retreat in Harrods says: ‘Poor diet can cause hair thinning and hair loss as well as lacklustre hair, dryness and excess sebum.

A high-stress lifestyle and bad nutrition can be catalysts for damage that may take four years to recover from.’

If your hair keeps breaking, eat red meat: ‘Iron is a top strength booster,’ says Vila Nova. It carries the blood’s supply of oxygen around the body keeping hair strong and nourished.

Thin, limp hair is the first sign that you aren’t getting enough iron. As red meat is a great source of iron, eat a portion of beef or lamb at least twice a week. Vegetarian? Lentils and tofu are also good sources of iron.

Going grey? Try chocolate.

Melanin, the pigment that forms the colour in your hair and skin, is responsible for keeping your tresses vibrant.

Foods that boost the presence of melanin in your body include chocolate (especially the dark variety).

If you want softer hair, eating salmon might help as the Omega-3 and 6 it contains boost scalp health.

Essential fatty acids keep the scalp moisturised and help hair maintain hydration and elasticity.

Oily fish such as tuna, salmon and mackerel are excellent sources of fatty acids: eat them once or twice a week.

Finally, avocados and walnuts could boost hair shine.

The better lubricated the cuticle layer on your hair, the smoother the surface of the hair and the shinier it looks. Bon appetit!

SOURCE


Monday, October 28, 2013



The savvy snacker's secret? Eating 30 almonds a day reduces hunger pangs and doesn't cause weight gain  -- if you are a pre-diabetic

The actual finding was that the nuts did nothing.  Total calorie intake was unaffected

Snacking has become something of a national pastime, with an estimated 97 per cent of people munching their way through at least one snack a day.  While this habit may keep hunger at bay, it's fuelling an obesity epidemic.

Now new American research may hold the answer -  munching on almonds can reduce hunger without increasing weight.

Researchers at Purdue University, in Indiana, found that eating 1.5oz of dry-roasted, lightly salted almonds every day reduced volunteers’ hunger, improved their Vitamin E levels and ‘good’ fat intake, and did not cause them to pile on the pounds. 1.5oz of almonds is equivalent to 43g, or around 30 individual nuts.

The researchers conducted a four-week trial to investigate the effects of eating almonds on weight and appetite.

The study included 137 adults at increased risk of type 2 diabetes. The participants were divided into five groups: a control group that avoided all nuts and seeds, a group that ate 1.5oz of almonds at breakfast, and one that ate the nuts at lunch.

There was also a group that snacked on them in the morning, and one that ate them in the afternoon.

The volunteers were not given any other rules other than to follow their usual eating patterns and physical activity.

The results showed that even though they were eating approximately 250 calories a day in the form of the almonds, they did not eat any more calories overall.

‘This research suggests that almonds may be a good snack option, especially for those concerned about weight,’ said Dr Richard Mattes, professor of nutrition science at Purdue University and the study's lead author.

‘In this study, participants compensated for the additional calories provided by the almonds so daily energy intake did not rise.

‘They also reported reduced hunger levels and desire to eat at subsequent meals, particularly when almonds were consumed as a snack [as opposed to during a meal].’

Almonds have also previously been shown to increase the feeling of fullness in both normal weight, and overweight people.

This is thought to be due to almonds' monounsaturated fat, protein, and fibre content.

Previous research has shown that eating almonds can cut a person's risk of liver cancer because of the nuts’ Vitamin E content.

The Vitamin E in almonds is also thought to protect against heart disease and eye damage in old age.

Another study suggested that eating almonds can help prevent diabetes because it can help improve insulin sensitivity and reduce cholesterol levels.

SOURCE






If you want your partner to trust you, make them an omelette: Compound found in eggs credited with increasing feelings of trust

That hormones can influence behaviour, every married man knows, but it seems unlikely that diet would have lasting effects

Chaps, if you fear your wife doesn’t entirely trust you, get on her good side by whipping up an omelette.  And to really make an impression, serve chocolate mousse for dessert.

Research has credited tryptophan, a compound found in eggs and chocolate, with increasing feelings of trust.

Other foods rich in tryptophan include red meat, cottage cheese, spinach, nuts and seeds, bananas, tuna, shellfish and turkey.

The advice follows a study in which Dutch researchers asked a group of volunteers to pair up and take part in a game of trust.

In the game, the first member of the pair is given some money and the option of handing some of it to their partner. The gifted cash is then tripled, and the second person can then give some of it back.

The game is seen as a measure of trust because the first player could end up a lot better off but only if he trusts the second player enough to give him a large sum initially.

Those taking part in the study were given orange juice to drink and in half of the cases, the juice was supplemented with tryptophan.

Players who had the tryptophan transferred almost 40 per cent more cash, the journal Psychological Science reports.

The Leiden University researchers said: ‘Interpersonal trust is an essential element of social life and co-operative behaviour.

‘After all, most people will only work together if they expect others to do so also, making mutual trust an important precondition for establishing mutual co-operation.

‘We found that people who took tryptophan transferred significantly more euros than people who took the placebo.

‘Our results support the materialist approach that you are what you eat, the idea that the food one eats has a bearing on one’s state of mind.

‘So the food we take may act as a cognitive enhancer that modulates the way we think and perceive the physical and social world.

‘In particular, the supplementation of tryptophan or diets containing tryptophan may promote interpersonal trust in inexpensive, efficient and healthy ways.’

Tryptophan is formed in the body during the digestion of some proteins and is a building block of the ‘feel-good’ brain chemical serotonin.

It is also a natural sedative, which has led to it being blamed for making people doze off after eating a big turkey dinner.

SOURCE



Sunday, October 27, 2013



Five Phony Public Health Scares

Activist misinformation harms Americans

Health activists, nutrition nannies, medical paternalists, and just plain old quacks regularly conjure up menaces that are supposedly damaging the health of Americans. Their scares range from the decades-long campaign against fluoridation to worries that saccharin causes cancer to the ongoing hysteria over crop biotechnology. The campaigners' usual "solution" is to demand that regulators ban the offending substance or practice. Here are five especially egregious examples of activist misinformation.

1. Americans should consume no more than 1,500 milligrams of sodium per day, in order to reduce everybody's risk of heart disease, strokes, and high blood pressure.

You hear this one all the time. The American Heart Association recommends consuming less than 1,500 milligrams of sodium per day. A June 2013 report by the Center for Science in the Public Interest asserted, "Immediately reducing average sodium consumption levels to between 2,200 mg to 1,500 mg per day would save about 700,000 to 1.2 million lives over 10 years." These nutrition nannies have been urging the U.S. government to lower the upper limit of daily recommended sodium intake to just two-thirds of a teaspoon of salt.

But a May 2013 study by the Institute of Medicine calls those longstanding recommendations into question. Contrary to years of anti-salt dogma, consuming less than 2,300 milligrams of sodium a day may actually harm people suffering from congestive heart failure. There was also "no evidence for benefit and some evidence suggesting risk of adverse health outcomes" if the person with a low-salt diet has diabetes, chronic kidney disease, or pre-existing cardiovascular disease.

"The evidence on health outcomes," the report concluded, "is not consistent with efforts that encourage lowering of dietary sodium in the general population to 1,500 milligrams per day."

2. Vaccines cause autism.

In 1998 the British researcher Andrew Wakefield claimed in The Lancet that he had identified an association between vaccination against measles, mumps, and rubella (MMR) and the onset of autism. Thus was launched one of the more destructive health scares of recent years, in which tens of thousands of frightened parents refused to have their children vaccinated. Anti-vaccine cheerleaders such as the actress Jenny McCarthy fanned those fears.

Years of research and numerous studies have thoroughly debunked this scare. For example, the Institute of Medicine issued a 2011 report, "Adverse Effects of Vaccines," that found no association between MMR vaccination and autism. The Centers for Disease Control and Prevention agrees that "there is no relationship between vaccines containing thimerosal and autism rates in children." The Lancet finally retracted the infamous Wakefield study in 2010. Also in 2010, Britain's General Medical Council banned Wakefield from the practice of medicine after concluding that his paper had been not just inaccurate but dishonest.

3. Cellphone use causes cancer.

The fear here is that radio frequency waves emitted by cellular phones are associated with higher risk of various brain cancers. One anecdotal report even suggested that women who secreted their cellphones in their bras were more likely to get breast cancer.

It is true that in 2011 the hyper-precautionary International Agency for Research on Cancer classified cellphones as a "possible carcinogen." But as a somewhat snarky response in the Journal of Carcinogenesis pointed out, the agency classifies coffee and pickles as possible carcinogens, too. Meanwhile, the National Cancer Institute flatly states that "to date there is no evidence from studies of cells, animals, or humans that radiofrequency energy can cause cancer." A 2012 comprehensive review of studies in the journal Bioelectromagnetics found "no statistically significant increase in risk for adult brain or other head tumors from wireless phone use."

4. High fructose corn syrup is responsible for the obesity "epidemic."

This particular scare was launched by a 2004 article in the American Journal of Clinical Nutrition, which noted, "The increased use of HFCS in the United States mirrors the rapid increase in obesity." The authors pointed out that American consumption of HFCS had increased by more than 1,000 percent between 1970 and 1990, and they estimated that Americans consumed an average of 132 kilocalories of HFCS per day. Digesting fructose, they suggested, failed to send signals to the brain to tell people to stop eating.

Since this scare was unleashed, a lot of research has investigated many different hypotheses about how HFCS might be worse for people than table sugar (sucrose). Most have turned up nothing significant.

A 2012 review article in the journal Advances in Nutrition summarized this research: "a broad scientific consensus has emerged that there are no metabolic or endocrine response differences between HFCS and sucrose related to obesity or any other adverse health outcome. This equivalence is not surprising given that both of these sugars contain approximately equal amounts of fructose and glucose, contain the same number of calories, possess the same level of sweetness, and are absorbed identically through the gastrointestinal tract." Another 2012 review article, in the Journal of Obesity, concluded, "In the past decade, a number of research trials have demonstrated no short-term differences between HFCS and sucrose in any metabolic parameter or health related effect measured in human beings including blood glucose, insulin, leptin, ghrelin and appetite."

So if HFCS is not to blame for the fattening up of Americans, what is? How about pigging out? The U.S. Department of Agriculture reports that in 1970 Americans consumed an average of 2,169 calories per day. In 2010, the figure was about 2,614. Sweeteners such as sugar and HFCS provided only 42 of this 445-calorie increase.
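The back-of-envelope arithmetic behind that claim is easy to check using only the USDA figures quoted above:

```python
# USDA figures quoted above: average daily calories in 1970 vs 2010,
# and the calories of the increase attributable to caloric sweeteners.
calories_1970 = 2169
calories_2010 = 2614
sweetener_increase = 42  # calories/day from sugar and HFCS combined

total_increase = calories_2010 - calories_1970
share = sweetener_increase / total_increase
print(total_increase)   # → 445
print(f"{share:.0%}")   # → 9%
```

In other words, sweeteners account for less than a tenth of the added daily calories; the bulk of the increase came from everything else on the plate.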

5. Exposure to trace amounts of synthetic chemicals is a major cause of cancer.

Rachel Carson's passionate 1962 bestseller Silent Spring warned that we "are living in a sea of carcinogens." More recently, a 2010 report issued by the President's Cancer Panel declared, "The true burden of environmentally induced cancers has been grossly underestimated."

But is that so? As the American Cancer Society's Cancer Facts and Figures 2013 notes, "Exposure to carcinogenic agents in occupational, community, and other settings is thought to account for a relatively small percentage of cancer deaths – about 4% from occupational exposures and 2% from environmental pollutants (man-made and naturally occurring)." The same group rejected the President's Cancer Panel's conclusion as well, arguing that it "does not represent scientific consensus."

In fact, at the same time that human ingenuity has been generating all these useful synthetic compounds, both cancer incidence and death rates have been falling. While cancer remains the second leading cause of death in the United States, a 2012 report by the National Cancer Institute confirms that overall cancer death rates continue to decline, and that over the past decade the incidence of cancer continues to fall for men while holding steady for women.

Once a bogus health alarm has been launched, more careful researchers must waste years and tens of millions of dollars battling the misinformation. In the meantime, worried Americans actually harm their health by refusing to get their kids vaccinated, or squander their money on such items as "chemical-free" products.

Scaremongering, unfortunately, can be both lucrative and a source of gratifying media attention, so it's not likely to go away anytime soon.

SOURCE





Friday, October 25, 2013



3-year-old baby born with HIV may have been cured thanks to 'unusually aggressive' treatment

Babies are very flexible developmentally so this may not be replicable in adults

Doctors now have convincing evidence that they put HIV into remission, hopefully for good, in a Mississippi baby born with the AIDS virus — a medical first that is prompting a new look at how hard and fast such cases should be treated.

The case was reported earlier this year but some doctors were skeptical that the baby was really infected rather than testing positive because of exposure to virus in the mom's blood.

The new report, published online Wednesday by the New England Journal of Medicine, makes clear that the girl, now 3, was infected in the womb. She was treated unusually aggressively and shows no active infection despite stopping AIDS medicines 18 months ago.

Doctors won't call it a cure because they don't know what proof or how much time is needed to declare someone free of HIV infection, long feared to be permanent.

'We want to be very cautious here. We're calling it remission because we'd like to observe the child for a longer time and be absolutely sure there's no rebound,' said Dr. Katherine Luzuriaga, a University of Massachusetts AIDS expert involved in the baby's care.

The government's top AIDS scientist, Dr. Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases, agreed.

'At minimum, the baby is in a clear remission. It is possible that the baby has actually been cured. We don't have a definition for cure as we do for certain cancers, where after five years or so you can be relatively certain the person is not going to go and relapse,' he said. A scientist at his institute did sophisticated tests that showed no active virus in the child.

A government-sponsored international study starting in January aims to test early treatment in babies born with HIV to see if the results in this case can be reproduced.

Most HIV-infected moms in the U.S. get AIDS medicines during pregnancy, which greatly cuts the chances they will pass the virus to their babies. But the Mississippi mom got no prenatal care and her HIV was discovered during labor. Doctors considered the baby to be at such high risk that they started the child on three powerful medicines 30 hours after birth, rather than waiting for a test to confirm infection as is usually done.

Within a month, the baby's virus fell to undetectable levels. She remained on treatment until she was 18 months old when doctors lost contact with her. Ten months later when she returned, they could find no sign of infection even though the mom had stopped giving the child AIDS medicines.

Only one other person is thought to have been cured of HIV infection — a San Francisco man who had a bone marrow transplant in 2007 from a donor with natural resistance to HIV, and showed no sign of infection five years later.

In the Mississippi baby, 'there's no immune mechanism we can identify that would keep the virus in check' like that bone marrow donor, said another study author, Dr. Deborah Persaud of the Johns Hopkins Children's Center, who helped investigate the case because she has researched treatment in children.

Dr. Peter Havens, pediatric HIV chief at Children's Hospital of Wisconsin and a government adviser on HIV treatment guidelines, said the child may have an undiscovered genetic trait that helped her manage the virus.

'I'm just not convinced that her dramatic response would be replicable in a large population,' he said. It's too soon to recommend treating other high-risk babies so aggressively without more study, he said.

In the upcoming study, doctors plan to give AIDS medicines for at least two years before suspending treatment and watching to see whether remission results.

The Mississippi case 'did open people's eyes further' about a possible cure, Luzuriaga said. 'We might be able to intervene early and spare children a lifetime of therapy. That is the potential impact of this case.'

SOURCE






Strokes in under-64s soar by 25% in 20 years as doctors issue warning over toll taken by unhealthy lifestyle

DESPITE the way statins are handed out like peanuts

Rocketing global rates of stroke among the young and middle-aged are a ‘wake-up call’, say British experts.  In the past 20 years the number of strokes afflicting people aged 20 to 64 has jumped by a quarter, an international study shows.

Strokes in this age group now make up 31 per cent of the worldwide total, compared with 25 per cent before 1990.

Although the rate of strokes among older people is declining in the UK, Britons are still more likely to die from stroke than someone living in France, Germany or the US.

In the poorest areas of the UK the number of people dying from stroke is around three times higher than in the least economically deprived areas.

Jon Barrick, chief executive of the Stroke Association, called the study a ‘wake-up call to governments across the globe’.

Poor lifestyles, increasing sedentary habits, obesity and diabetes threatened to eradicate advances made in Britain in recent years, he warned.

Figures from King’s College London earlier this month revealed a 40 per cent fall in stroke rates among elderly people since the mid-1990s.

Experts said the drop was due to better treatment of high blood pressure and cholesterol.

But rates remain high for people aged 45 and under, and for those of black Caribbean and African origin, and a drop in the total number of strokes is unlikely because more people are living longer.

The latest findings, published in The Lancet medical journal, form part of the Global Burden of Disease Study 2010, looking at major diseases and causes of ill health in 50 countries.

The results reveal stark differences between rich and poor. [As usual]

Strokes were linked to 46 per cent more disability and illness and 42 per cent more deaths in poorer countries than in richer ones between 1990 and 2010. In rich countries, stroke rates fell 12 per cent over the two decades.

Lifestyle factors such as smoking, high blood pressure and unhealthy diet were thought to play a role in increasing stroke rates in low-to-middle income countries.

Professor Valery Feigin of New  Zealand’s National Institute for Stroke and Applied Neurosciences said: ‘The worldwide stroke burden is growing very fast and there is now an urgent need for culturally acceptable and affordable stroke prevention, management and rehabilitation strategies to be developed and implemented worldwide.’

Each year around 152,000 strokes occur in the UK, costing the NHS an estimated £3.7billion, and there are over a million Britons living with the effects of stroke.

In January, BBC presenter Andrew Marr, 54, suffered a stroke, which he blamed on his relentless schedule.

Mr Barrick said at least half of strokes could be prevented by simple lifestyle changes, such as taking more exercise.

‘The report reveals a shocking disparity between rich and poor, where death rates from stroke are up to ten times higher in lower income countries,’ he said.

‘Closer to home, within the UK, the number of people dying from stroke is around three times higher in the most economically deprived areas, compared to the least deprived.

‘Stroke survivors often face a black hole when discharged from hospital, with one in five in the UK receiving no support from services to help them recover.

‘This is a stark warning. We urgently need to address this global stroke crisis by prioritising stroke prevention and investment into stroke research.’

A second study, published in The Lancet Global Health, showed that in 2010 three-fifths of global disability and more than half of deaths due to stroke were from bleeding in the brain.

This type of stroke, known as a haemorrhagic stroke, is deadlier than more common ischaemic strokes that cut blood supply to the brain.

SOURCE



Thursday, October 24, 2013



Does living in a sunny place reduce the risk of a child developing ADHD?

Correlational data only.  Interpretation speculative. 

The percentage of the variance in ADHD explained by sunlight (correlations around .7) was extremely high but that is to be expected of ecological correlations. 

Individuals were not examined.  High sunlight AREAS and high ADHD AREAS were correlated.  Are high sunlight areas simply less likely to REPORT ADHD?  Could be.  The study may simply show that plenty of sunlight makes you Pollyanna-ish.


Children who live in sunny areas may be less likely to suffer from ADHD, a new study suggests.

Dutch researchers have found that very sunny regions have a lower prevalence of the condition, suggesting that high sunlight intensity may exert a 'protective' effect.

Researchers came to the conclusion after analysing maps that showed the U.S. states (and nine other countries) most affected - and how sunny they were.

Even after taking into account other factors that are known to be associated with ADHD, they found there was still a relationship between solar intensity and how many children suffered from the condition.

To test their theory further, the researchers also looked at whether there appeared to be a relationship between sunlight exposure and autism or major depressive disorders - but there was not.

The data maps were released by the U.S. Centers for Disease Control and Prevention and the U.S. Department of Energy.

Study leader Martijn Arns, of Utrecht University in the Netherlands, suggests that our biological clocks may help explain the apparent connection with sun exposure.

Sleep disorder treatments intended to restore normal circadian rhythms, including light exposure therapy, have been shown to improve ADHD symptoms.

He suggests one solution in less sunny areas could be to install more skylights in classrooms and schedule playtime to get maximum sun exposure.

He also suggests that, with our heavy use of tablets, smartphones and PCs - which disrupt sleep - manufacturers should develop colour-adjusted screens to filter out disruptive blue light.

ADHD is the most common childhood psychiatric disorder, with an average worldwide prevalence of about five to seven per cent, but it also varies greatly by region.

Symptoms include an inability to focus, poor attention, hyperactivity, and impulsive behavior, and the normal process of brain maturation is delayed in children with ADHD. Many people also report sleep-related difficulties and disorders.

Scientists do not know what causes it, but genetics play a clear role. Other risk factors have also been identified, including premature birth, low birth weight, a mother's use of alcohol or tobacco during pregnancy, and environmental exposures to toxins like lead.

The research is published in the current issue of Biological Psychiatry.

Commenting on the findings, editor Dr. John Krystal said: 'The reported association is intriguing, but it raises many questions that have no answers. Do sunny climates reduce the severity or prevalence of ADHD and, if so, how? Do people prone to develop ADHD tend to move away from sunny climates and, if so, why?'

SOURCE
 

Geographic Variation in the Prevalence of Attention-Deficit/Hyperactivity Disorder: The Sunny Perspective

By Martijn Arns et al.

Abstract

Attention-deficit/hyperactivity disorder (ADHD) is the most common psychiatric disorder of childhood, with an average worldwide prevalence of 5.3%, varying by region.

Methods

We assessed the relationship between the prevalence of ADHD and solar intensity (SI) (kilowatt hours/square meters/day) on the basis of multinational and cross-state studies. Prevalence data for the U.S. were based on self-report of professional diagnoses; prevalence data for the other countries were based on diagnostic assessment. The SI data were obtained from national institutes.

Results

In three datasets (across 49 U.S. states for 2003 and 2007, and across 9 non-U.S. countries) a relationship between SI and the prevalence of ADHD was found, explaining 34%–57% of the variance in ADHD prevalence, with high SI having an apparent preventative effect. Controlling for low birth weight, infant mortality, average income (socioeconomic status), latitude, and other relevant factors did not change these findings. Furthermore, these findings were specific to ADHD, not found for the prevalence of autism spectrum disorders or major depressive disorder.

Conclusions

In this study we found a lower prevalence of ADHD in areas with high SI for both U.S. and non-U.S. data. This association has not been reported before in the literature. The preventative effect of high SI might be related to an improvement of circadian clock disturbances, which have recently been associated with ADHD. These findings likely apply to a substantial subgroup of ADHD patients and have major implications in our understanding of the etiology and possibly prevention of ADHD by medical professionals, schools, parents, and manufacturers of mobile devices.

SOURCE







Is a high-fat diet GOOD for the heart? Doctors say carbs are more damaging to the arteries than butter or cream

The great backflip has begun

Cutting back on butter, cream and fatty meats may have done more harm to heart health than good.

Experts say the belief that high-fat diets are bad for arteries is based on faulty interpretation of scientific studies and has led to millions being ‘over-medicated’ with statin drugs.

Doctors insist it is time to bust the myth of the role of saturated fat in heart disease.

Some western nations, such as Sweden, are now adopting dietary guidelines that encourage foods high in fat but low in carbs.

Cardiologist Aseem Malhotra says almost four decades of advice to cut back on saturated fats found in cream, butter and less lean meat has ‘paradoxically increased our cardiovascular risks’.

He leads a debate on the British Medical Journal's website, bmj.com, that challenges the demonisation of saturated fat.

A landmark study in the 1970s concluded there was a link between heart disease and blood cholesterol, which correlated with the calories provided by saturated fat.

‘But correlation is not causation,’ said Dr Malhotra, interventional cardiology specialist registrar at Croydon University Hospital, London.

Nevertheless, people were advised to reduce total fat intake to 30 per cent of energy and saturated fat intake to 10 per cent.

Recent studies fail to show a link between saturated fat intake and risk of cardiovascular disease, with saturated fat actually found to be protective, he said.

One of the earliest obesity experiments, published in the Lancet in 1956, comparing groups on diets of 90 per cent fat versus 90 per cent protein versus 90 per cent carbohydrate revealed the greatest weight loss was among those eating the most fat.

Professor David Haslam, of the National Obesity Forum, said: ‘The assumption has been made that increased fat in the bloodstream is caused by increased saturated fat in the diet … modern scientific evidence is proving that refined carbohydrates and sugar in particular are actually the culprits.’

Another US study showed a ‘low fat’ diet was worse for health than one low in carbohydrates such as potatoes, pasta and bread.

Dr Malhotra said obesity has ‘rocketed’ in the US despite a big drop in calories consumed from fat.  ‘One reason,’ he said, ‘is that when you take the fat out, the food tastes worse.’

The food industry compensated by replacing saturated fat with added sugar but evidence is mounting that sugar is a ‘possible independent risk factor’ for metabolic syndrome, which can lead to diabetes.

Dr Malhotra said the government’s obsession with cholesterol ‘has led to the over-medication of millions of people with statins’.   But why has there been no demonstrable effect on heart disease trends when eight million Britons are being prescribed cholesterol-lowering drugs, he asked.

Adopting a Mediterranean diet after a heart attack is almost three times as powerful in reducing death rates as taking a statin, and statins have been linked to unacceptable side effects in real-world use, he added.

Dr Malhotra said: ‘The greatest improvements in morbidity and mortality have been due not to personal responsibility but rather to public health.

‘It is time to bust the myth of the role of saturated fat in heart disease and wind back the harms of dietary advice that has contributed to obesity.’

Dr Malcolm Kendrick, a GP and author of The Great Cholesterol Con, said Sweden had become the first western nation to develop national dietary guidelines that rejected the low-fat myth, in favour of low-carb high-fat nutrition advice.

He said: ‘Around the world, the tide is turning, and science is overturning anti-fat dogma.

'Recently, the Swedish Council on Health Technology Assessment has admitted that a high-fat diet improves blood sugar levels, reduces triglycerides and improves ‘good’ cholesterol - all signs of insulin resistance, the underlying cause of diabetes - and has nothing but beneficial effects, including assisting in weight loss.

‘Aseem Malhotra is to be congratulated for stating the truth that has been suppressed for the last forty years,’ he added.

Timothy Noakes, Professor of Exercise and Sports Science at the University of Cape Town, South Africa, said: ‘Focusing on an elevated blood cholesterol concentration as the exclusive cause of coronary heart disease is unquestionably the worst medical error of our time.

‘After reviewing all the scientific evidence I draw just one conclusion - Never prescribe a statin drug for a loved one.’

SOURCE



Wednesday, October 23, 2013



Daily exercise 'can boost pupils' secondary school results by a grade'

So who were the inactive ones?  Probably lower class, who are dimmer anyway.  The results may tell us NOTHING about exercise as such

One hour’s exercise each day can boost children’s GCSE results by a grade amid fresh evidence of a link between physical activity and academic achievement.

Researchers found that pupils could improve their results in a series of key academic subjects with increased exposure to activities such as PE, lunchtime games or cycling to school.

The study – based on an analysis of almost 5,000 schoolchildren – found that grades increased in direct correlation with the amount of physical exercise undertaken in the average day.

It emerged that an extra 17 minutes of exercise for boys and 12 minutes for girls at the age of 11 – beyond the current average for the age group – could boost children’s results by the age of 16.

Overall, researchers found an average of 60 minutes of “moderate to vigorous” physical activity could be the difference between a C and a B grade by the time pupils sat their GCSEs.

The effect was particularly marked for girls in science, it emerged.

The disclosure – in a study by academics at Dundee and Strathclyde universities – comes amid concerns that children in Britain are leading increasingly sedentary lifestyles as they spend hours every day glued to televisions, the internet and games consoles.

Previous figures have shown that almost nine in 10 children fail to get the 60 minutes of daily exercise recommended for good health and a third complete less than an hour each week.

It is thought that physical activity can stimulate chemicals in the brain that lead to improvements in academic performance.

Repeated studies have also found a link between physical fitness and memory, attention span and "on-task" focus, which can have an effect on classroom performance.

Writing in today’s report, academics said: “If moderate to vigorous physical activity does influence academic attainment this has implications for public health and education policy by providing schools and parents with a potentially important stake in meaningful and sustained increases in physical activity.”

The study, published in the British Journal of Sports Medicine, was based on data from 4,755 children born in the early 1990s and tracked through their education.

Scientists analysed physical activity levels for between three and seven days when children were aged 11 using a motion sensor.

Factors likely to influence academic attainment, such as birthweight, mother’s age at delivery, smoking during the pregnancy and socioeconomic factors were taken into account.

The study found that boys took part in an average of 29 minutes of moderate to vigorous physical activity each day, while girls clocked up just 18 minutes. It was significantly less than the recommended level of 60 minutes for all children.

Academics then compared exercise regimes with children's academic performance in English, maths, and science at the ages of 11, 13 and 16.

Boys’ GCSE results at the age of 16 increased for every additional 17 minutes’ exercise – beyond the average – registered at the age of 11, while girls showed an improvement for each extra 12 minutes.

The study, led by Dr Josephine Booth, from Dundee, and Prof John Reilly, from Strathclyde, said that increasing activity levels among boys and girls to a recommended hour “would translate to predicted increases of academic attainment of almost one GCSE grade (e.g. an increase from a grade C in English to a grade B)”.

Exercise appeared to impact on science results the most, particularly among girls.

“This is an important finding, especially in light of the current UK and European Commission policy aimed at increasing the number of females in science subjects,” the study said.

SOURCE






Forget creams and ointments – duct tape really can cure verrucas

Verrucas are a type of wart on the foot.  People who go barefoot rarely get them.  Wearing thongs (flip-flops) instead of shoes may also be protective

Quack remedies such as duct tape to cure verrucas and bathing in oats to cure skin complaints really can work, a panel of doctors has found.

Some sufferers of common complaints have sworn by household cures for years, which also include using the lubricant WD-40 to ease arthritis and drinking breast milk to cure infections.

They had never been put to the test until the Channel 4 programme Health Freaks, broadcast last night, carried out controlled studies on them.

Dr Ellie Cannon, a west London GP, was among the medics who assessed some of the unusual homespun treatments presented to them by advocates of the cures.

She said: “We know people do use duct tape for verrucas and we did see in the trial we did that it improved them for some. In one case the verruca went completely.”

One patient featured on the programme said they had had a verruca for eight years and had been unable to shift it until they used duct tape, which finally cleared it up.

In the trial, some participants used duct tape and some used Elastoplast.

Dr Cannon said further investigation was needed to establish why duct tape was so effective as a remedy, as this remained unclear.

She added: “It’s not what we would call a consistent treatment. It doesn’t work for everyone.

“The three of us on the panel had different theories about why it was working. It might be that the tape is starving the verruca of oxygen, or it might be that the adhesive in the tape is causing an immune reaction.”

Other unusual treatments tested on the programme include breast milk to cure infections and an oat bath for the skin complaint psoriasis.

Breast milk was found to be less effective than one might expect, Dr Cannon said, while oat baths did help a little.

Another homespun remedy, using the lubricant WD-40 to treat arthritis and chest pain, was suggested by two builders but was deemed to be too unsafe to trial.

Dr Cannon said: “One of the things that surprised me was just how widespread the use of some of these remedies is. WD-40 is so widely used on building sites to treat arthritis that the manufacturers have had to put a notice on their website saying it’s not for human use.

“In years gone by the remedy might only be known within a particular family but the internet has made them much more commonly known.

“Another thing that surprised me was how prepared people are to try out remedies that could have dangerous side-effects, like drinking their own urine.”

Other remedies that the doctors were presented with included amber necklaces to cure teething trouble, copper coins to clear up styes, turmeric as an acne cure and leeches to cure deep vein thrombosis.

The doctors concluded that many of the “cures” are a result of the placebo effect, when patients’ bodies heal themselves because they are convinced they have been given a miracle remedy.

“Even when people have three doctors telling them their treatment has no medical benefits, once they are in that zone, believing in their remedy, they won’t be persuaded otherwise.”

In the test, all of the people given duct tape found their verrucas shrank by at least one millimetre, whereas none of those who used surgical tape saw any difference at all.

Dr Cannon said: "I'll certainly suggest to my patients that they give it a try if they aren't having any success with other treatments."

SOURCE

Tuesday, October 22, 2013


Three glasses of wine could reduce chance of conception -- if you already have serious fertility problems

Drinking three small glasses of wine a week could reduce some women's chances of conceiving by two thirds, research has found

The study of women's drinking habits in the months before they began fertility treatment found that even low quantities of alcohol had a dramatic impact on the ability to conceive.

Research on couples who had already undergone around three failed cycles of IVF found that women who abstained from all alcohol had a 90 per cent chance of achieving a successful pregnancy over three years.

However, women who drank an average of just three small glasses of wine a week had a 30 per cent chance of conceiving over the same period.

Researchers said the same patterns were likely to hold true for couples trying to conceive naturally.

The study found that even women who drank just one or two glasses of wine a week - well within Government safe drinking limits for those trying to conceive - drastically jeopardised their fertility, with success rates falling to 66 per cent.

Government advice recommends that women trying to get pregnant should drink no more than 1 to 2 units of alcohol twice a week - the equivalent of up to two glasses of wine.

Researchers who led the study of 90 women, presented at the American Society for Reproductive Medicine's annual conference in Boston, US, said it was not clear why relatively small quantities of alcohol had such an impact.

Lead author Dara Godfrey, an IVF specialist from Reproductive Medicine Associates of New York, said: "My advice to patients is always to limit or abstain from alcohol. But whether they do or not is up to them. Alcohol definitely has a detrimental effect on pregnancy success."

Dr Godfrey said the same impact was likely to occur in women trying to conceive naturally, with the greatest effect likely to be felt among those who had several drinks on the same evening.

She said researchers had not identified the mechanism which meant alcohol reduced fertility, but that it was possible it jeopardises normal egg development.

Some fertility clinics recommend that patients stop drinking for three months before they start IVF treatment, because it takes that long for an egg to develop.

Dr Allan Pacey, a fertility expert at the University of Sheffield, said the differences in pregnancy rates between the groups were substantial, and consistent with advice to avoid alcohol if trying to conceive.

However, he said it was possible that there were other differences between the women who abstained from alcohol entirely, and those who had several drinks a week.

Dr Pacey said: "I would wonder whether alcohol could be a surrogate marker for something else - that the women who have something to drink are more likely to be stressed."

Stress levels affect hormones such as cortisol, which can interfere with reproductive cycles.

The university's research on sperm quality last year suggested that moderate intake of alcohol did not affect male fertility, he said.

"There is a certainly a bit of a difficulty in advising men that it is okay for them to drink if trying to conceive but women shouldn't touch a drop - that could create tensions in many a household," he said.

SOURCE






A jog in the park won’t cure serious depression

A study of over-prescription for depression and anxiety deserves analysis because it contains a mix of truth… and hidden agendas

GPs are turning us into a nation of pill-poppers, according to shock headlines last week. The research, commissioned by the charity Nuffield Health, found that GPs are 46 times more likely to prescribe medication for depression and anxiety ‘rather than recommend other, medically proven alternatives such as exercise’.

This feeds in nicely to the social narrative surrounding primary care: that GPs are too busy and harassed to listen and are only interested in pushing us out of the door clutching a prescription to keep us quiet. Dr Davina Deniszczyc, the medical director of Nuffield Health, said: ‘The compelling evidence that physical activity can play an important role in both treating and alleviating early symptoms of mental ill health isn’t sufficiently filtering through to front-line and primary care services.’

This study deserves a closer analysis because it contains a mix of truth… and hidden agendas. The newspaper reports indicated, correctly, that it was commissioned by a charity. But although Nuffield Health is technically a charity, it is actually a private hospital chain. It was criticised when it emerged that the group paid only £100,000 corporation tax in 2011, despite a turnover of £575 million, because of its ‘charity’ status. Its chief executive, David Mobbs, has a salary package of £860,000. It has 31 hospitals but also 60 membership gyms. So, a cynic could argue that it has a vested interest in, firstly, undermining people’s confidence in GPs and, secondly, commissioning research that promoted exercise. The study is, in essence, a nicely dressed up piece of covert marketing.

And it works as a marketing message because it does contain some truth. I should emphasise that I routinely prescribe antidepressants to patients with moderate to severe depressive illness, and they are effective. It is also true that sometimes antidepressants are prescribed to people for whom exercise would be beneficial, such as those with a mild depressive illness. But for many, their depression is so severe that the idea of a brisk jog in the park to lift their spirits is absurd. It can be a life-threatening illness that deserves prompt pharmacological intervention.

However, what the study failed to explore was why GPs were so ready to prescribe antidepressants. The real story here is about psychological therapy services. Historically, GPs have been reliant on antidepressants because access to the alternative – the ''talking therapies’’ – in the NHS is subject to very long waiting lists.

When I began training it was not unusual to hear of patients waiting for several years to receive therapy on the NHS. The response to this was the IAPT scheme – or Improving Access to Psychological Therapies, which evolved from a paper first tabled by health economist Lord Layard in 2005. He argued that, as well as humanitarian grounds, there was a sound economic argument for providing evidence-based therapies quickly and effectively for people with depression and anxiety, as it would reduce the cost of incapacity benefit. On the basis of his assessment, staff were recruited and trained and services were rolled out across the country providing cognitive behavioural therapy. At the end of the first three full financial years of operation in March 2012, more than one million people had used the service and 45,000 people had been moved off benefits as a result.

Unfortunately, it has been a victim of its own success. Those with only mild symptoms are seen quickly by specially trained professionals, but not doctors or psychologists, which limits the complexity of the cases they are able to deal with. So while money has poured into IAPT services to deal with minor complaints, waiting times for more complex cases have lengthened. There are reports of people with severe depression having to wait over a year. This is not to denigrate IAPT services – they do a great job. But, with the success of IAPT, the Government feels it has solved the problems of accessing “talking therapies” when there are still shamefully long waiting lists for those that need help the most.

GPs still prescribe antidepressants because, for some patients, the alternative is an insufferably long wait before they get any respite from their symptoms.

SOURCE




Monday, October 21, 2013


Super-vaccine could eliminate need for annual flu jabs within five years after successful trials

A new 'Holy Grail' flu vaccine which gives lifelong protection against all strains of the virus could be available within five years.  Scientists from Britain and Europe are getting ready to start large-scale trials of a universal vaccine after early tests on humans proved successful.

If all goes to plan the new injection would stop the need for annual flu jabs and could save thousands of lives every year.

It could also be effective against highly dangerous forms of the disease, such as Spanish flu, even if they mutate, preventing global pandemics like the one which killed 100 million people in 1918.

Although human trials on almost 100 patients have been carried out over many years, this is the first positive news.

Professor John Oxford, British flu expert and a key researcher of the study, said that his team are 'wildly enthusiastic' about the vaccine's prospects.

The programme has recently received a multi-million pound EU grant to fund its research.

At the moment, vaccines work by identifying viruses by their 'coats'; however, as viruses mutate, these coats change, making old vaccines ineffective.

The universal vaccine works by attacking proteins hidden within the virus which are common throughout harmful strains.


The news comes at the end of a week which has seen a new strain of bird flu re-emerge in China, after it was reported to have passed between humans in August.

A 32-year-old woman was said to have died after caring for her father who was infected by the H7N9 strain of bird flu.

Reports of human infection began in March this year but have trailed off in the last few months; the virus has killed at least 45 of the 136 people infected.

However, as poultry stocks swell ahead of Chinese New Year, a 35-year-old man in the eastern province of Zhejiang has been hospitalised, and the World Health Organisation has confirmed that two more people are in hospital, with another 88 sent home.

A nasal flu spray has also been made available for all children aged between two and three years old, and will eventually be extended into a national programme for all under-16s.

If trials of the new flu super-jab are successful it could be available for use by 2018.

SOURCE






Man-made virus could defeat the disease that stole Coleen Rooney's sister: Tests on mice offer new hope for Rett Syndrome

For Beth Johnsson, having a daughter with Rett Syndrome is ‘like losing a child you still have.’ Until she was 18 months old, Hannah was alert, responsive and developing normally.

Yet, Beth, 35, an English teacher, explains: ‘Very suddenly, the beautiful baby we knew slowly began to disappear – she was alive, but we couldn’t get to her.’

Hannah is now six yet has the mental capabilities of an 18-month-old – a reality that is only too stark when she is playing with her brothers Matthew, three, and Noah, who is almost two.

Just like her baby brother, she frequently screams, tries to eat everything within arm’s reach, pulls her hair (and her brother’s) and finds it hard to support her own weight.

Yet now scientists may be close to eliminating the condition. Astonishingly, Rett Syndrome (RS) – which also afflicted Coleen Rooney’s little sister, Rosie, who died aged 14 in January – has been reversed in mice.

One child in 12,000 is born with RS, yet few people have heard of it.  The genetic disorder affects almost exclusively females, causing them to regress neurologically and physically.

Almost all cases are caused by a mutation in the MECP2 gene which prevents nerve cells in the brain from working properly. Currently there is no cure and only the symptoms are treated. Sufferers can live to their 40s but most die before 25.

Beth and her husband Vince, 37, from Sutton in Surrey, first noticed something was wrong when Hannah started to react oddly to people’s emotions.

‘Her normal responses reversed – she would become very distressed when someone laughed,’ says Beth. ‘This was accompanied by screaming which was difficult to bear.’

There was a sudden slowing in her development, and nursery workers noticed she had started to shake occasionally. Beth and Vince took Hannah to the doctor but they were reassured there was nothing wrong with her because she was reaching all her milestones, albeit slowly. But Hannah continued to regress until she started to pull out her hair in handfuls.

‘Even at this point no one could tell me what was wrong with her,’ adds Beth. ‘It was incredibly frustrating.’

Finally, tests revealed the true cause. When the couple were given the news they were strictly told not to search the internet about the condition and to ‘carry on as normal’. Doctors wanted to prevent them frightening themselves.

‘I’m not sure I’d have managed to get out of bed if I’d known what lay ahead,’ Beth admits.

Since then there has been a steady decline in Hannah’s abilities. She wakes throughout the night for hours at a time and needs a strict routine, otherwise she screams for long periods.

Until recently, neurodevelopmental conditions were thought to be irreversible. Yet there is hope from the Edinburgh University research which in 2007 reversed RS symptoms in mice.

‘That took our breath away,’ says Dr Adrian Bird, who led the study. The mice were infected with a virus that altered the gene and reversed symptoms. ‘Mice are different from humans, but it is a very strong indicator that therapies could be developed in our lifetime,’ says Dr Bird.

For Beth, the research is the light at the end of the tunnel and she will work tirelessly to raise money until there is a therapy for Hannah. She says: ‘I refuse to lose hope.’

SOURCE

Sunday, October 20, 2013



Do high doses of vitamin C raise prostate cancer risk? Study shows popping too many supplements could give men tumours

Men who take high doses of vitamin supplements could be increasing their risk of lethal prostate cancer by nearly 30 per cent, say researchers.

A study of 48,000 men spanning more than two decades suggests popping too many vitamin pills can put them in danger of tumours that are more likely to be fatal.

The researchers linked high doses of vitamin C to an increased risk of lethal and advanced prostate cancer.

The results, by experts from Harvard School of Public Health in Boston, in the US, and the University of Oslo in Norway, are not the first to raise the alarm over the dangers of excess vitamin consumption.

Nearly a quarter of adults in the UK are estimated to take antioxidant supplements or multivitamins regularly in the hope that it will help protect them against illnesses such as heart disease and cancer. The market for such products is worth around half a billion pounds a year.

In recent years, high-dose vitamins have become popular, with people taking more in the belief that it is better for them.

For example, health food shops now sell vitamin C tablets in doses of 1,000mg each, but the body needs only about 40mg a day to keep cells healthy and promote healing.

In the latest research, the scientists set out to see if antioxidants in vitamin pills and food could reduce the chances of a prostate tumour.

From 1986 to 2008 they followed 48,000 men aged between 40 and 75. Every four years, the men completed food questionnaires designed to record their dietary habits.  The researchers followed them up to see which ones developed prostate cancer.

The results, published in the International Journal Of Cancer, show that total antioxidant intake – from foods or pills – neither increased nor decreased the risk of a tumour. Antioxidants fight the process, called oxidation, that destroys cells.

There was some suggestion antioxidants from coffee had a slightly protective effect.

But the most alarming finding was that men with the highest intake of antioxidants from vitamin pills were 28 per cent more likely to get lethal prostate cancer than those who took the fewest pills or none at all.

Those with the highest intake of antioxidants from vitamin pills were 15 per cent more likely to get advanced prostate cancer – a tumour that spreads quickly beyond the prostate, reducing the chances of survival.

In a report the researchers said: ‘High intake of antioxidants from  supplements was associated with increased risk for lethal and advanced prostate cancer.

'The main contributor is vitamin C, and this finding warrants further investigation.’

But the researchers stressed that, until more research is carried out, they cannot be sure that vitamin tablets actually cause cancer.

It may be that the cancer victims felt unwell for several months before their diagnosis and simply increased vitamin intake to try to ward off symptoms such as fatigue.

Dr Carrie Ruxton, of the Health Supplements Information Service, which is funded by supplements makers, said: ‘It is entirely possible that these men may have had prostate-related symptoms and fatigue long before diagnosis.  'The cancer may have had nothing to do with the supplements.’

SOURCE





New HRT drug may help to PREVENT breast cancer in British women

A menopause treatment which could prevent breast cancer, rather than causing it, may soon be available for British women.

Evidence shows the new hormone replacement therapy pill is even more effective at combating menopause symptoms, such as hot flushes, than the standard treatment.

But crucially, early trials show it may prevent growth of breast cancer tumours. Existing forms of HRT, by contrast, are thought to cause the disease.

Researchers say it could be given to millions of women worldwide who are too afraid to take menopause treatment due to the risk of breast cancer.

And they also believe it could be given to younger women with a strong family history of the disease to prevent it occurring.

The pill – called Duavee – has just been approved for use in America and will be available in chemists there from January.

It is now being considered for use in Britain by the EU watchdog – the European Medicines Agency – which is expected to make a decision in the next few months.

The drug contains the hormone oestrogen which combats symptoms of the menopause including hot flushes, night sweats, sleeping problems and thinning of the bones, or osteoporosis.

But the problem with oestrogen, which is in standard HRT, is that it is thought to trigger the growth of cancer tumours.

To combat this, a chemical called bazedoxifene is added to the Duavee pill, blocking the cancer-causing effects of oestrogen. This means the drug offers the same relief from menopause symptoms as normal HRT but does not trigger breast cancer.

Trials on 8,000 women have shown it reduces hot flushes by 85 per cent – making it more effective than standard HRT, which cuts them by 75 per cent. It also prevents fractures caused by bone thinning by 40 per cent and participants said it had improved their overall happiness.

But in tests on mice the chemical prevented the growth of breast cancer tumours – and scientists are convinced it will have the same effect on women.

Professor Richard Santen of the University of Virginia, who is an expert in the role of oestrogen in breast cancer, said: ‘If this does what we think it does this is huge.’

Unveiling the findings at the American Society for Reproductive Medicine conference in Boston, he added: ‘I’ve been around for 45 years studying breast cancer and when you look at the effects of these agents in animals, the animals have really predicted what’s going to happen in patients.’

Breast cancer is the most common form of the disease in women and in the UK there are just under 50,000 new cases each year and 11,500 deaths.

In 2001 and 2002 two major US studies suggested breast cancer was being triggered by HRT, leading to millions of women worldwide abandoning the drug. In Britain, the number of women on HRT fell by half – only a million take it today.

Doctors are concerned that many are suffering the debilitating symptoms of the menopause and putting themselves at risk of osteoporosis because they are too afraid to take HRT.

The new drug’s manufacturer Pfizer has not revealed the cost of the pill – which would be taken once a day – but say it would be comparable to current forms of HRT, which is between £2 and £7 for a month’s supply depending on the type.

Professor Santen said that if it was shown to prevent breast cancer, it could be given to thousands of younger women at high risk of the disease. This summer the NHS began offering these women the drug Tamoxifen, but it can have very unpleasant side effects such as depression, tiredness, blood clots, hot flushes and headaches.

The professor said trials had so far shown the new pill had limited side effects.

The drug could be available in Britain next year if the EU watchdog approves it, although it may take several years to show it prevents breast cancer in humans.

SOURCE




Friday, October 18, 2013




Air pollution leading cause of cancer, WHO finds

If you are talking about third worlders cooking over a cow-dung fire in a windowless hut, maybe. For the rest it's just opinion based on rodents or an inconclusive correlational base.

The World Health Organisation (WHO) has said outdoor air pollution is a leading cause of cancer in humans.

The International Agency for Research on Cancer declared on Thursday that air pollution is a carcinogen, alongside known dangers such as asbestos, tobacco and ultraviolet radiation. The decision came after a consultation by an expert panel organised by IARC, the cancer agency of the World Health Organisation.

"The air we breathe has become polluted with a mixture of cancer-causing substances," said Kurt Straif of the WHO's International Agency for Research on Cancer (IARC).

"We now know that outdoor air pollution is not only a major risk to health in general, but also a leading environmental cause of cancer deaths."

The IARC said a panel of top experts had found "sufficient evidence" that exposure to outdoor air pollution caused lung cancer and raised the risk of bladder cancer.

Although the composition of air pollution and levels of exposure can vary dramatically between locations, the agency said its conclusions applied to all regions of the globe.

Air pollution was already known to increase the risk of respiratory and heart diseases.

The IARC said pollution exposure levels increased significantly in some parts of the world in recent years, notably in rapidly industrialising nations with large populations.

The most recent data, from 2010, showed that 223,000 lung cancer deaths worldwide were the result of air pollution, the agency said.

In the past, the IARC had measured the presence of individual chemicals and mixtures of chemicals in the air - including diesel engine exhaust, solvents, metals, and dust.

But the latest findings were based on overall air quality.

"Our task was to evaluate the air everyone breathes rather than focus on specific air pollutants," said the IARC's Dana Loomis.

"The results from the reviewed studies point in the same direction: the risk of developing lung cancer is significantly increased in people exposed to air pollution," he added.

The predominant sources of outdoor air pollution were transport, power generation, emissions from factories and farms, and residential heating and cooking, the agency said.

SOURCE





The half-price IVF that boosts chance of getting pregnant: New technique uses far fewer drugs and could be done during a woman's lunch hour

A half-price version of IVF that women could have in their lunch hour has been shown to drastically raise the chances of having a baby.

It is particularly effective for those in their late 30s and early 40s and success rates are almost twice as high as the conventional fertility treatment.

The technique involves using far lower doses of drugs, with the result that it is not only far cheaper, but also has virtually no side effects.

Many women undergoing normal IVF suffer from mood swings, nausea and headaches. In rare cases it can cause a life-threatening condition in which the abdomen fills with fluid.

This new method, known as mini-IVF, involves giving women a daily pill for ten to 12 days which contains a low dose of the fertility drug clomid.

This encourages their ovaries to produce eggs and during this time the women undergo ultrasound scans every few days to check the eggs are developing healthily.

Once the eggs are large enough – around another ten days later – they are removed during a five-minute operation which does not require a general anaesthetic.

This means women can have it done before work, or during their lunch break, unlike normal IVF which lasts half a day and requires them to be put to sleep.

Doctors from St Louis, Missouri, who developed the method say it should be routinely offered to all women as it is a cheaper, safer and far more effective alternative.

Trials involving 520 women unveiled at the American Society for Reproductive Medicine conference in Boston showed that success rates for women over 35 were a third higher compared with those undergoing conventional IVF.

The results were even better among the over-40s – those using the new method were twice as likely to have a baby as those who had the conventional fertility treatment.

Dr Sherman Silber, a fertility expert who helped develop the method, said it was so effective it offered women in their 40s the same chance of falling pregnant as those in their twenties.

In women aged 35 or below, success rates are about the same as standard IVF but the researchers say they would also benefit from using it because it is far cheaper and has fewer side effects.

Dr Silber said: ‘This is perfect for Britain and it would save an incredible amount of money. This is the magic solution.’

A course of mini-IVF costs between £1,200 and £1,800 compared with standard IVF which is between £3,000 and £4,000.

The reason it is so much cheaper is that it involves using far lower doses of the fertility drug clomid.

But this also makes it safer, as large amounts of medication can lead to the deadly condition ovarian hyperstimulation syndrome, in which the abdomen fills with fluid – this occurs in 1 per cent of women having IVF.

One of the main reasons women in their late 30s and early 40s have problems conceiving, either naturally or with IVF, is that they do not produce enough healthy eggs capable of developing into an embryo and eventually a foetus.

In fact, the high-dose fertility drugs used in conventional IVF worsen this problem: although they make a woman produce more eggs, they also appear to cause changes in the eggs’ DNA that make them defective.

But mini-IVF only uses very low doses of the drugs that do not make the woman’s eggs less healthy.

Professor Geeta Nargund, a consultant gynaecologist at the London fertility clinic Create, said: ‘This study is a valuable addition to the growing evidence that mild stimulation IVF needs to become the first choice in IVF clinics for many women.’

SOURCE



Thursday, October 17, 2013



Epigenetics: Can we alter genes?

A useful summary of a school of thought below but most of the supposed epigenetic effects have fairly obvious alternative explanations.  The children of the Dutch famine survivors may have been disadvantaged in a number of ways -- such as a damaged hormonal environment in utero etc.

Towards the end of the Second World War, something unprecedented happened in modern Europe: a famine. Operation Market Garden, the Allies’ attempt to push across the Rhine in September 1944, had failed, and in retaliation for Dutch collusion the Nazis blockaded towns across the western Netherlands for more than six months. The resultant food shortages – known as the Dutch Hongerwinter – were severe: with just 580 calories of food per person per day, over 22,000 people died from malnutrition, and thousands of babies were born badly underweight.

When scientific researchers analysed the meticulous Dutch medical records decades later, they could see the health effects of prenatal exposure to famine: that the infants who survived were more susceptible to health problems. But they also found a curious anomaly: that these children’s own children – born years later, and well fed – were also significantly underweight. The famine had, it seemed, “scarred” the victims’ DNA.

Which was surprising. After all, for decades we’ve all been told: you are what you eat. You are what you drink. You are how much, or how little, you exercise; you are whatever toxins you imbibe or inhale. Your genes may have destined you to a little baldness, or an increased susceptibility to some vulgar tumour. But as health experts have cautioned you repeatedly: you are a product of your own lifestyle choices.

And yet a quiet scientific revolution is changing that thinking. For it seems you might also be what your mother ate. How much your father drank. And what your grandma smoked. Likewise your own children, too, may be shaped by whether you spend your evenings jogging, worrying about work, or sat on the sofa eating Wotsits. And that nurture, rather than our intractable nature, may determine who we are far more than was ever previously thought.

Epigenetics is a relatively new scientific field; research only began in earnest in the mid Nineties, and has only found traction in the wider scientific community in the last decade or so. And the sources of its data are eclectic, to say the least – stretching from famines in northern Sweden to the 9/11 attacks to the medical notes of Audrey Hepburn.


But already epigenetics is offering explanations of how our diets, our exposure to toxins, our stress levels at work – even one-off traumatic events – might be subtly altering the genetic legacy we pass on to our children and grandchildren. It has opened up new avenues for explaining – and curing – illnesses that genes alone can’t explain, ranging from autism to cancer. Moreover, its momentum is resurrecting old theories long dismissed – and rewriting the textbooks and biological rules once thought sacrosanct.

Ever since the existence of genes was first suggested by Gregor Mendel in the 1860s, and James Watson and Francis Crick came up with the double-helix model in 1953, science has held one idea untouchable: that DNA is nature’s blueprint. That chromosomes passed from parent to child form a detailed genetic design for development. So when, 10 years ago, researchers finished mapping the human genome, it promised to revolutionise the field of molecular medicine.

In many ways it did, but something was still missing. Back in the Fifties, biologists had already theorised that something on top of the DNA sequence was actually responsible for “expressing” what came out. As Adrian Bird, a genetics professor at the University of Edinburgh, explains: “We knew there are millions of markers on your DNA and on the proteins that sit on your DNA. What are they doing? What is their function? How do they help genes work, or stop working?”

It was termed the epigenome (literally “upon genetics”). But only in the last few years has research revealed more detail of the vast array of molecular mechanisms that affect the activity of the genes. And that your DNA itself might not be a static, predetermined programme, but instead can be modified by these biological markers. Chief among them are what are called methyl groups – tiny carbon-hydrogen instruction packs that bind to a gene and say “ignore this bit” or “exaggerate this part”.

Methylation is how the cell knows it needs to grow into, say, an eyeball rather than a toenail. In addition, there are also what are called “histones”, controlling how tightly the DNA is spooled around its central thread, and therefore how “readable” the information is. And it’s these two epigenetic controls – an on-off switch and a volume knob, if you will – which give each cell its orders.

Except this epigenetic “interpretation” of your DNA is not fixed – it alters dramatically. And not just during big life changes, like puberty or pregnancy. Now research has found it can also change due to environmental factors, such as our stress levels and whether we smoke.

As an example: scientists now know that a bad diet can interfere with this methylation. Which means a cell can grow abnormally. Which means you get a disease or – at worst – a cancer. Scientists used to think that these little epigenetic instructions would be stripped from your DNA before it was passed on to your children. That when a sperm and egg combined, the embryo had a “clean slate”. Alas, no. New research has found that about one to two per cent of our epigenetic tags cling on. And thus your worst habits – smoking, overeating – are the ones that can be passed on to your offspring, and even further down the hereditary line. Or, put another way: your grandfather was making lifestyle decisions that affect you today.

In biological terms, the idea is heretical. After all, Darwin’s central premise is that evolutionary change takes place over millions of years of natural selection, whereas this new model suggests characteristics are epigenetically “memorised” and transmitted between individual generations. And yet, slowly but surely, the evidence is mounting.

The Hongerwinter is one field of study. Another project focused on the inhabitants of Överkalix, an isolated town in northern Sweden. During the mid 1800s, the community was hit by several periods of intense famine when the crops failed. By studying the medical records found in parish registers, researchers were able to show that the population who went from a normal diet to overeating during a year of crop success produced grandchildren who lived far shorter lives. And significantly so: a difference of around 32 years.

“There are social implications to these results,” says Marcus Pembrey, emeritus professor of paediatric genetics at University College London, who collaborated on the Överkalix research. “In the sense that you don’t live your life just for yourself but also for your descendants. Although it is important to realise that transgenerational effects are for better as well as worse.” For the medical world, however, the implications could be hugely important.

Suddenly, new “epidemics” such as auto-immune disorders or diabetes might be traced back to epigenetic markers left generations ago. At the University of Texas, for example, a study of rats suggests that soaring obesity and autism rates in humans could be due to “the chemical revolution of the Forties” — and our grandparents’ exposure to new plastics, fertilisers and detergents. As professor of psychology and zoology David Crews explains: “It’s as if the exposure three generations before has reprogrammed the brain.” There could also be implications to what we eat. Already, pregnant women are encouraged to take folic acid, vitamin B-12 and other nutrients containing “methyl groups”, as they decrease the risk of asthma and brain and spinal cord defects in their foetuses.

There is also increasing evidence that certain cancers are caused by misplaced epigenetic tags. So scientists are developing new drugs to silence the bad genes which were supposed to be silenced in the first place. A team of molecular biologists at Temple University in Philadelphia, for example, are currently investigating an ingenious potential alternative to traditional chemotherapy: treating cancer patients with drugs that “reprogramme” cancer cells by reconfiguring the epigenetic markers. Team leader Prof Jean-Pierre Issa, director of the Fels Institute for Cancer Research, hopes this “reshuffling” of the epigenome could, perhaps one day, even produce a cure.

However, the biggest excitement – and, indeed, controversy – surrounds growing research that suggests it’s not just physical characteristics or illnesses we might be passing on to future generations. Instead, our DNA might be affected by behavioural epigenetics too.

Research on rats by Prof Michael Meaney of McGill University, Montreal, and Frances Champagne, a behavioural scientist at Columbia University in New York, has identified changes in genes caused by the most basic psychological influence: maternal love. The 2004 study showed that the quality of a rat mother’s care significantly affects how its offspring behave in adulthood – rat pups that had been repeatedly groomed by their mothers during the first week of life were subsequently better at coping with stressful situations than pups who received little or no contact.


You might think this is nothing new; that we already know a loving upbringing has positive psychological effects on children. But Prof Meaney’s research suggests the changes are physiological, not psychological. Epigeneticists also think socioeconomic factors like poverty might “mark” children’s genes to leave them more prone to drug addiction and depression in later life, regardless of whether they’re still poor or not.

There’s also evidence that markers put down during pregnancy can affect our psychological welfare. Further research into the Hongerwinter found that children who were affected in the second trimester of their mother’s pregnancy had an increased incidence of schizophrenia and neurological defects. The actress Audrey Hepburn may be a case in point. She spent her childhood in the Netherlands during the famine, suffering from anaemia and respiratory illnesses at the time; she attributed her clinical depression later in life to the malnutrition in her formative years.

But even one-off traumas could affect later generations. The attacks of 9/11 offered a key insight. An estimated 530,000 New York City residents suffered symptoms of post-traumatic stress disorder (PTSD) after witnessing the attacks, of whom approximately 1,700 were pregnant women. Research by Rachel Yehuda, professor of psychiatry and neuroscience at the Icahn School of Medicine at Mount Sinai, suggested the effects could reach the next generation. She found that mothers who were in their second or third trimester on the day of the attacks were far more likely to give birth to stressed-out infants – i.e. children who reacted with unusual levels of fear and stress when faced with loud noises, unfamiliar people, or new foods.

In short, it seems some children inherited the nightmare their mothers experienced on that day. Will these 9/11 children pass that fear on to their own children? It remains to be seen. But Yehuda has obtained similar results in the adult offspring of Holocaust survivors, and is currently trying to identify the epigenetic markers associated with PTSD in combat veterans.

Indeed, in the space of less than two decades, the field of epigenetics has exploded. With it have emerged new strands of data analysis, sociology, pharmaceutical research and medical discovery. The field is still young – and yet already its bold claims are causing scientific schisms.

As Bird warns: “I do think people have jumped the gun and seen more positive results than are really out there. As yet, there is no evidence worthy of the name that lifestyle choices affect the health of children, let alone grandchildren. I worry that suggesting this is a scientific fact will encourage more futile parental guilt.” But researchers leading the charge, such as Champagne, are philosophical. As she has said: “Critics keep everyone honest. The enthusiasm in the field is obviously great, but I think people’s expectations of what this means need to chill out a little bit.”

SOURCE

Oreos are as addictive as cocaine, say scientists

Ho hum! Putting the cart before the horse again. It would be more realistic to say that cocaine and other drugs hit the receptors that we have always had in order to enable food appreciation and discrimination. Food came first; drugs imitate it.

Oreos can be as addictive to the brain as cocaine, the authors of a scientific study have claimed.

The chocolate cookies have been found to trigger the same neurons in the brain's 'pleasure centre' as the outlawed drug during extensive lab testing on rats.

Neuroscientist Joseph Schroeder from Connecticut College in New London, Connecticut, led research into the addictive effect of the indulgent treat.

His team discovered that the hungry rodents' reaction to the biscuit was comparable to that of rats who had been offered cocaine in earlier tests.

As well as finding that, like humans, rats prefer to eat the cream part of their Oreo first, the scientists also saw similarities between the levels of addiction in 'Oreo rats' and their cocaine-hooked cousins.

To arrive at this conclusion, Schroeder placed rats in a maze that had two routes to different treats. On one side, the researchers placed rice cakes; on the other, they placed Oreos.

After the animals had explored the maze fully, they were left to choose which side they preferred to spend time on.

Speaking of his findings, Schroeder said: 'Just like humans, rats don’t seem to get much pleasure out of eating rice cakes.'

The results, which showed the rodents had a strong preference for the chocolate treat, were compared to those of an identical test involving drugs.

On one side of the maze, the rats were given an injection of saline, while on the other they were given a dose of cocaine or morphine.

According to Schroeder, the rats in the Oreo experiment spent as much time hanging around their Oreo zone in the food test as they did the cocaine zone in the drug test, showing similar levels of addiction.

Writing in a statement describing the study, to be presented at the Society for Neuroscience in San Diego next month, Schroeder added: 'Our research supports the theory that high-fat and high-sugar foods stimulate the brain in the same way that drugs do.

'That may be one reason people have trouble staying away from them and it may be contributing to the obesity epidemic.

'(The results) lend support to the hypothesis that maladaptive eating behaviors contributing to obesity can be compared to drug addiction.'

Lauren Cameron, a student at Connecticut College who worked on the study, said: 'It really just speaks to the effects that high-fat and high-sugar foods – and foods in general – can have on your body.

'The way they react in your brain, that was really surprising for me.'

SOURCE