Thursday, October 17, 2013



Epigenetics: Can we alter genes?

A useful summary of a school of thought below, but most of the supposed epigenetic effects have fairly obvious alternative explanations. The children of the Dutch famine survivors may have been disadvantaged in a number of ways -- such as a damaged hormonal environment in utero.

Towards the end of the Second World War, something unprecedented happened in modern Europe: a famine. Operation Market Garden, the Allies’ attempt to push across the Rhine in September 1944, had failed, and in retaliation for Dutch collusion with the Allies the Nazis blockaded towns across the western Netherlands for more than six months. The resultant food shortages – known as the Dutch Hongerwinter – were severe: with just 580 calories of food per person per day, over 22,000 people died from malnutrition, and thousands of babies were born badly underweight.

When scientific researchers analysed the meticulous Dutch medical records decades later, they could see the health effects of prenatal exposure to famine: the infants who survived were more susceptible to health problems. But they also found a curious anomaly: these children’s own children – born years later, and well fed – were also significantly underweight. The famine had, it seemed, “scarred” the victims’ DNA.

Which was surprising. After all, for decades we’ve all been told: you are what you eat. You are what you drink. You are how much, or how little, you exercise; you are whatever toxins you imbibe or inhale. Your genes may have destined you for a little baldness, or an increased susceptibility to some vulgar tumour. But as health experts have cautioned you repeatedly: you are a product of your own lifestyle choices.

And yet a quiet scientific revolution is changing that thinking. For it seems you might also be what your mother ate. How much your father drank. And what your grandma smoked. Your own children, too, may be shaped by whether you spend your evenings jogging, worrying about work, or sitting on the sofa eating Wotsits. And that nurture, rather than our intractable nature, may determine who we are far more than was ever previously thought.

Epigenetics is a relatively new scientific field; research only began in earnest in the mid-Nineties, and has only found traction in the wider scientific community in the last decade or so. And the sources of its data are eclectic, to say the least – stretching from famines in northern Sweden to the 9/11 attacks to the medical notes of Audrey Hepburn.


But already epigenetics is offering explanations of how our diets, our exposure to toxins, our stress levels at work – even one-off traumatic events – might be subtly altering the genetic legacy we pass on to our children and grandchildren. It has opened up new avenues for explaining – and curing – illnesses that genes alone can’t explain, ranging from autism to cancer. Moreover, its momentum is resurrecting old theories long dismissed – and rewriting the textbooks and biological rules once thought sacrosanct.

Ever since the existence of genes was first suggested by Gregor Mendel in the 1860s, and James Watson and Francis Crick came up with the double-helix model in 1953, science has held one idea untouchable: that DNA is nature’s blueprint. That chromosomes passed from parent to child form a detailed genetic design for development. So when, 10 years ago, researchers finished mapping the human genome, it promised to revolutionise the field of molecular medicine.

In many ways it did, but something was still missing. Back in the Fifties, biologists had already theorised that something on top of the DNA sequence was actually responsible for “expressing” what came out. As Adrian Bird, a genetics professor at the University of Edinburgh, explains: “We knew there are millions of markers on your DNA and on the proteins that sit on your DNA. What are they doing? What is their function? How do they help genes work, or stop working?”

It was termed the epigenome (literally “upon the genome”). But only in the last few years has research revealed more detail of the vast array of molecular mechanisms that affect the activity of the genes. And that your DNA itself might not be a static, predetermined programme, but instead can be modified by these biological markers. Chief among them are what are called methyl groups – tiny carbon-hydrogen instruction packs that bind to a gene and say “ignore this bit” or “exaggerate this part”.

Methylation is how the cell knows it needs to grow into, say, an eyeball rather than a toenail. In addition, there are “histones” – the protein spools around which the DNA is wound – and how tightly the DNA is wound determines how “readable” the information is. And it’s these two epigenetic controls – an on-off switch and a volume knob, if you will – which give each cell its orders.
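As a rough way to picture that switch-and-knob analogy, gene activity can be sketched in a few lines of Python. This is purely a conceptual toy – the function and numbers are invented for illustration and are not drawn from any of the studies mentioned here:

# Conceptual toy only: methylation as an on/off switch, histone packing as a volume knob.
def expression_level(base_activity, methylated, histone_openness):
    # methylated=True stands in for the "ignore this bit" instruction
    if methylated:
        return 0.0
    # histone_openness: 0.0 = DNA tightly wound (unreadable), 1.0 = fully open
    return base_activity * histone_openness

# The same gene, "read" differently in two cell types
print(expression_level(10.0, methylated=True, histone_openness=0.9))   # 0.0
print(expression_level(10.0, methylated=False, histone_openness=0.3))  # 3.0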

Except this epigenetic “interpretation” of your DNA is not fixed – it alters dramatically. And not just during big life changes, like puberty or pregnancy. Now research has found it can also change due to environmental factors, such as our stress levels or whether we smoke.

As an example: scientists now know that a bad diet can interfere with this methylation. Which means a cell can grow abnormally. Which means you get a disease or – at worst – a cancer. Scientists used to think that these little epigenetic instructions would be stripped from your DNA before it was passed on to your children. That when a sperm and egg combined, the embryo had a “clean slate”. Alas, no. New research has found that about one to two per cent of our epigenetic tags cling on. And thus your worst habits – smoking, overeating – are the ones that can be passed on to your offspring, and even further down the hereditary line. Or, put another way: your grandfather was making lifestyle decisions that affect you today.

In biological terms, the idea is heretical. After all, Darwin’s central premise is that evolutionary change takes place over millions of years of natural selection, whereas this new model suggests characteristics are epigenetically “memorised” and transmitted between individual generations. And yet, slowly but surely, the evidence is mounting.

The Hongerwinter is one field of study. Another project focused on the inhabitants of Överkalix, an isolated town in northern Sweden. During the mid 1800s, the community was hit by several periods of intense famine when the crops failed. By studying the medical records found in parish registers, researchers were able to show that those who went from a normal diet to overeating during a year of good harvest produced grandchildren with significantly shorter lives: around 32 years shorter.

“There are social implications to these results,” says Marcus Pembrey, emeritus professor of paediatric genetics at University College London, who collaborated on the Överkalix research. “In the sense that you don’t live your life just for yourself but also for your descendants. Although it is important to realise that transgenerational effects are for better as well as worse.” For the medical world, however, the implications could be hugely important.

Suddenly, new “epidemics” such as auto-immune disorders or diabetes might be traced back to epigenetic markers left generations ago. At the University of Texas, for example, a study of rats suggests that soaring obesity and autism rates in humans could be due to “the chemical revolution of the Forties” – and our grandparents’ exposure to new plastics, fertilisers and detergents. As professor of psychology and zoology David Crews explains: “It’s as if the exposure three generations before has reprogrammed the brain.” There could also be implications for what we eat. Already, pregnant women are encouraged to take folic acid, vitamin B-12 and other nutrients containing “methyl groups”, as they decrease the risk of asthma and brain and spinal cord defects in their foetuses.

There is also increasing evidence that certain cancers are caused by misplaced epigenetic tags. So scientists are developing new drugs to silence the bad genes which were supposed to be silenced in the first place. A team of molecular biologists at Temple University in Philadelphia, for example, are currently investigating an ingenious potential alternative to traditional chemotherapy: treating cancer patients with drugs that “reprogramme” cancer cells by reconfiguring the epigenetic markers. Team leader Prof Jean-Pierre Issa, director of the Fels Institute for Cancer Research, hopes this “reshuffling” of the epigenome could, perhaps one day, even produce a cure.

However, the biggest excitement – and, indeed, controversy – surrounds growing research that suggests it’s not just physical characteristics or illnesses we might be passing onto future generations. Instead, our DNA might be affected by behavioural epigenetics too.

Research on rats by Prof Michael Meaney of McGill University, Montreal, and Frances Champagne, a behavioural scientist at Columbia University in New York, has identified changes in genes caused by the most basic psychological influence: maternal love. The 2004 study showed that the quality of a rat mother’s care significantly affects how its offspring behave in adulthood – that rat pups that had been repeatedly groomed by their mothers during the first week of life were subsequently better at coping with stressful situations than pups who received little or no contact.


You might think this is nothing new; that we already know a loving upbringing has positive psychological effects on children. But Prof Meaney’s research suggests the changes are physiological, not psychological. Epigeneticists also think socioeconomic factors like poverty might “mark” children’s genes to leave them more prone to drug addiction and depression in later life, regardless of whether they remain poor.

There’s also evidence that markers put down during pregnancy can affect our psychological welfare. Further research into the Hongerwinter found that children who were affected in the second trimester of their mother’s pregnancy had an increased incidence of schizophrenia and neurological defects. The actress Audrey Hepburn may be a case in point. She spent her childhood in the Netherlands during the famine, suffering from anaemia and respiratory illnesses at the time; she attributed her clinical depression later in life to the malnutrition in her formative years.

But even one-off traumas could affect later generations too. The attacks of 9/11 offered a key insight. An estimated 530,000 New York City residents suffered symptoms of post-traumatic stress disorder (PTSD) after witnessing the attacks, among them approximately 1,700 pregnant women. But research by Rachel Yehuda, professor of psychiatry and neuroscience at the Icahn School of Medicine at Mount Sinai, suggested the effects could last longer. She found that mothers who were in their second or third trimester on the day of the attacks were far more likely to give birth to stressed-out infants – i.e. children who reacted with unusual levels of fear and stress when faced with loud noises, unfamiliar people, or new foods.

In short, it seems some children inherited the nightmare their mothers experienced on that day. Will these 9/11 children pass that fear onto their own children? It remains to be seen. But Yehuda has obtained similar results in the adult offspring of Holocaust survivors, and is currently trying to identify the epigenetic markers associated with PTSD in combat veterans.

Indeed, in the space of less than two decades, the field of epigenetics has exploded. With it have emerged new strands of data analysis, sociology, pharmaceutical research and medical discovery. The field is still young – and yet already its bold claims are causing scientific schisms.

As Bird warns: “I do think people have jumped the gun and seen more positive results than are really out there. As yet, there is no evidence worthy of the name that lifestyle choices affect the health of children, let alone grandchildren. I worry that suggesting this is a scientific fact will encourage more futile parental guilt.” But researchers leading the charge, such as Champagne, are philosophical. As she has said: “Critics keep everyone honest. The enthusiasm in the field is obviously great, but I think people’s expectations of what this means need to chill out a little bit.”

SOURCE







Oreos are as addictive as cocaine, say scientists

Ho hum! Putting the cart before the horse again. It would be more realistic to say that cocaine and other drugs hit the receptors that we have always had in order to enable food appreciation and discrimination. Food came first. Drugs imitate it.

Oreos can be as addictive to the brain as cocaine, the authors of a scientific study have claimed.

The chocolate cookies have been found to trigger the same neurons in the brain's 'pleasure centre' as the outlawed drug during extensive lab testing on rats.

Neuroscientist Joseph Schroeder from Connecticut College in New London, Connecticut, led research into the addictive effect of the indulgent treat.

His team discovered that the hungry rodents' reaction to the biscuit was comparable to that of rats who had been offered cocaine in earlier tests.

As well as finding that, like humans, rats prefer to eat the cream part of their Oreo first, scientists also saw similarities between the levels of addiction in 'Oreo rats' and their cocaine hooked cousins.

To arrive at the conclusion, Schroeder placed rats in a maze which had two routes to different treats. On one side the researchers placed rice cakes and on the other they placed Oreos.

After the animals had explored the maze fully, they were then left to choose which treat they would prefer to stay beside.

Speaking of his findings, Schroeder said: 'Just like humans, rats don’t seem to get much pleasure out of eating rice cakes.'

The results, which showed the rodents had a strong preference for the chocolate treat, were compared to those of an identical test involving drugs.

On one side of the maze, the rats would be given an injection of saline, while on the other they were given a dose of cocaine or morphine.

According to Schroeder, the rats in the Oreo experiment spent as much time hanging around their Oreo zone in the food test as they did the cocaine zone in the drug test, showing similar levels of addiction.

Writing in a statement describing the study, to be presented at the Society for Neuroscience in San Diego next month, Schroeder added: 'Our research supports the theory that high-fat and high-sugar foods stimulate the brain in the same way that drugs do.

'That may be one reason people have trouble staying away from them and it may be contributing to the obesity epidemic.

'(The results) lend support to the hypothesis that maladaptive eating behaviors contributing to obesity can be compared to drug addiction.'

Lauren Cameron, a student at Connecticut College who worked on the study, said: 'It really just speaks to the effects that high-fat and high-sugar foods, and foods in general, can have on your body.

'The way they react in your brain, that was really surprising for me.'

SOURCE

1 comment:

John A said...

Epigenetics sort-of makes sense, though if politicians take it up I fear for the future.

And, does this validate Lamarckism?