Why getting a university degree is also the secret to a long life
More stupid causal assumptions. It is politically incorrect to note that high IQ people live longer. And high IQ helps you to get a degree. Need I say more? It's not the degree that gives you a long life but rather the better health associated with a high IQ
Higher education could help you live longer, according to a study. It found people who went to college or university had lower blood pressure as they aged than those whose education finished when they left school in their teens.
With high blood pressure doubling the risk of dying from a heart attack or stroke, according to the Blood Pressure Association, the finding suggests a good education could save your life.
The biggest health benefits were found among those with master’s degrees or doctorates, and were stronger for women, the journal BMC Public Health reports.
Researchers at Brown University, Rhode Island, who tracked the health of nearly 4,000 American men and women for 30 years, also found highly educated men tended to be thinner and smoked and drank less than those without further education.
Well-educated women also smoked less and were thinner – but drank more than those who did not go to college or university.
The jobs taken by school-leavers may also impact on health.
Study leader Eric Loucks said: ‘Low educational attainment has been demonstrated to predispose individuals to high strain jobs, characterised by high levels of demand and low levels of control, which have been associated with elevated blood pressure.’
He isn’t sure why women’s blood pressure is particularly affected by education – or the lack of it. But it may be that lack of education affects a woman’s lifestyle, and so her physical health, more than a man’s.
Dr Loucks said: ‘Women with less education are more likely to be experiencing depression, they are more likely to be single parents, more likely to be living in impoverished areas and more likely to be living below the poverty line.
‘Socio-economic gradients in health are very complex. But there’s a question of what we do about it. One of the big potential areas to intervene on is education.’
The British Heart Foundation cautioned that the differences in blood pressure noted were small but added: ‘Action is needed across all parts of society to give children the best possible start in life and reduce health inequalities.’
Education has also been linked with warding off Alzheimer’s. But it may be the case that when the condition does hit, it hits harder and progresses faster.
SOURCE
Parents warned against giving paracetamol and ibuprofen for mild fever
Proper caution at last
Parents should not give children with a mild fever regular spoonfuls of paracetamol and ibuprofen, doctors advise today, as they warn that doing so could extend their illness or put their health at risk.
A misplaced “fever phobia” in society means parents too frequently use both medicines to bring down even slight temperatures, say a group of American paediatricians, who warn that children can receive accidental overdoses as a result.
As many as half of parents are giving their children the wrong dosage, according to a study carried out by the doctors.
In new guidance, the American Academy of Pediatrics advises that a high temperature is often the body’s way of fighting an infection, and warns parents that to bring it down with drugs could actually lengthen a child’s illness. [Nice to have that rediscovered]
Family doctors too readily advise parents to use the medicines, known collectively as “antipyretics”, according to the authors of the guidance.
GPs also often tell parents to give their children alternate doses of paracetamol and ibuprofen – known as combination therapy – believing the risk of side effects to be minimal.
In its official guidance, the National Institute for Health and Clinical Excellence (Nice) says the use of the drugs “should be considered in children with fever who appear distressed or unwell”.
Although Nice says that both drugs should not “routinely” be given to children with a fever, it states that this approach “may be considered” if the child does not respond to being given just one of them.
Children’s paracetamol solutions such as Calpol and ibuprofen solutions such as Nurofen for Children are sold over the counter in chemists. Recommended dosage quantities vary by age.
There is a range of solutions for different age groups, meaning it is possible for parents with children of different ages to mix up which they are giving.
According to the British National Formulary, which GPs consult when prescribing or advising on medication, children should receive no more than four doses of the right amount of paracetamol in a 24-hour period, and no more than four doses of ibuprofen a day.
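[As a rough illustration of what that limit amounts to in practice: the sketch below is mine, not from the article or the BNF, the function name is invented, and it deliberately ignores the age- and weight-based dose sizes that matter just as much as the count.]

    from datetime import datetime, timedelta

    # Illustrative only: the limit quoted above is no more than four doses of
    # paracetamol in any 24-hour period, and no more than four doses of
    # ibuprofen a day. Dose sizes by age and weight are NOT modelled here.
    MAX_DOSES_PER_24H = {"paracetamol": 4, "ibuprofen": 4}

    def another_dose_allowed(medicine, dose_times, now=None):
        """True if fewer than the maximum doses were given in the last 24 hours."""
        now = now or datetime.now()
        cutoff = now - timedelta(hours=24)
        recent = [t for t in dose_times if t > cutoff]
        return len(recent) < MAX_DOSES_PER_24H[medicine]

    # Example: three paracetamol doses already given during the day
    given = [datetime(2011, 2, 28, h) for h in (7, 13, 19)]
    print(another_dose_allowed("paracetamol", given, now=datetime(2011, 2, 28, 22)))  # True: 3 < 4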
In its guidance today, however, the American Academy of Pediatrics notes that both medications have potential side effects and says the risks should be taken seriously.
Doctors, the authors write, should begin “by helping parents understand that fever, in and of itself, is not known to endanger a generally healthy child”. “It should be emphasised that fever is not an illness but is, in fact, a physiological mechanism that has beneficial effects in fighting infection.”
Despite this, the academy says, many parents administer paracetamol or ibuprofen even though there is only a minimal fever, or none at all. “Unfortunately, as many as half of all parents administer incorrect doses,” the authors say. A frequent error is giving children adult-sized doses, while children who are small for their age can also receive doses that are too high even if their parents follow the instructions correctly.
Paracetamol has been linked to asthma, while there have been reports of ibuprofen causing stomach ulcers and bleeding, and leading to kidney problems.
“Questions remain regarding the safety” of combination therapy, say the authors, led by Dr Janice Sullivan, of the University of Louisville Pediatric Pharmacology Research Unit, and Dr Henry Farrar, of the University of Arkansas.
Dr Clare Gerada, the chairman of the Royal College of GPs, said: “In my experience of 20 years as a GP, parents are usually pretty careful. “I think the most important thing to be worried about is keeping medicines out of the reach of children, because some taste quite nice.”
SOURCE
Monday, February 28, 2011
Sunday, February 27, 2011
Tea gives your brain a lift and reduces tiredness
Since tea contains caffeine, which is a well-known stimulant, I am not at all clear on why this article had to be written
Natural ingredients found in a cup of tea can improve brain power and increase alertness, it is claimed. Researchers looked at the effect of key chemicals found in tea on the mental performance of 44 young volunteers.
The effects of these ingredients, an amino acid called L-theanine – which is also found in green tea – and caffeine at levels typically found in a cup of tea, were compared with a dummy treatment. The active ingredients significantly improved accuracy across a number of switching tasks for those who drank the tea after 20 and 70 minutes, compared with the placebo. The tea drinkers’ alertness was also heightened, the study found.
Tea was also found to reduce tiredness among the volunteers, who were aged under 40, according to the Dutch researchers reporting on their findings in the journal Nutritional Neuroscience.
‘The results suggest the combination helps to focus attention during a demanding cognitive task,’ they said. Previous trials have shown that adding milk to a cup of tea does not affect the drinker’s absorption of flavonoids – or antioxidants – or disrupt the health benefits from these.
Tea drinking has already been linked with lowering the risk of heart disease, cancer and Parkinson’s. Other research shows drinking tea on a regular basis for ten or more years may help improve bone density.
Dr Tim Bond, of the industry-backed Tea Advisory Panel, said the latest findings backed a previous study which showed drinking two cups of black tea ‘improves the ability to react to stimuli and to focus attention on the task in hand’. ‘Taken together, these two studies provide evidence that consumption of black tea improves cognitive function, in particular helping to focus attention during the challenge of a demanding mental task,’ he said.
‘As a result, all this new data adds to the growing science that drinking tea, preferably four cups of tea a day, is good for our health and well being.’
SOURCE
Cannabis ingredient 'restores taste buds and lost pleasure in food to cancer patients'
It is well-known that pot gives users "the munchies" so this is not a big surprise. But the methodology below is best passed over in forgiving silence. I think the experimenters must be regular users
The ingredient that gives cannabis its 'high' can help cancer patients recover their sense of taste, researchers say.
A group of patients who had been treated with chemotherapy for advanced cancer were given capsules that either contained THC - the psychoactive chemical in cannabis - or dummy lookalike pills. The 21 volunteers took the tablets for 18 days and were then asked to fill in questionnaires.
Researchers from the University of Alberta, Canada, found 73 per cent of those who took THC reported an increased liking for food, compared to 30 per cent of the placebo group. Just over half of the THC takers said the medication 'made food taste better' compared to one in 10 of the control group.
While both groups consumed roughly the same number of calories during the trial, the THC patients said they ate more protein and enjoyed savoury foods more. The THC-takers also reported better quality of sleep and relaxation than the placebo group.
While the experiment is small in scale, it is the first to explore the touted qualities of THC through random selection of volunteers and use of a 'control' group by which to make a comparison.
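[To see how few patients those percentages represent, a back-of-the-envelope calculation follows. The article does not give the split between the two arms, so the 11-versus-10 division below is my assumption, as are the rounded head counts.]

    from scipy.stats import fisher_exact

    # Assumed split of the 21 volunteers (not stated in the article): 11 THC, 10 placebo.
    thc_n, placebo_n = 11, 10
    thc_liked = round(0.73 * thc_n)          # 73% of the THC arm is about 8 people
    placebo_liked = round(0.30 * placebo_n)  # 30% of the placebo arm is about 3 people

    table = [[thc_liked, thc_n - thc_liked],
             [placebo_liked, placebo_n - placebo_liked]]
    odds_ratio, p_value = fisher_exact(table)
    print(thc_liked, placebo_liked, round(p_value, 3))
    # With groups this small, the quoted percentages rest on a handful of patients.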
Lead investigator Professor Wendy Wismer said the findings were important because cancer, or its treatment, can cripple appetite and lead to dangerous weight loss.
Many cancer patients eat less as they say meat smells and tastes unpleasant following treatment. 'For a long time, everyone has thought that nothing could be done about this,' Professor Wismer said. 'Indeed, cancer patients are often told to 'cope' with chemosensory problems by eating bland, cold and colourless food. This may well have the result of reducing food intake and food enjoyment.'
Professor Wismer said that doctors should consider THC treatment for cancer patients suffering from loss of taste, smell and appetite.
THC was well tolerated, and in terms of side effects there were no differences between the THC and placebo groups, which suggests that long-term therapy is also an option, she said.
Cannabis is a Class B drug in the UK and is illegal to have, give away or sell.
The study appears in the journal Annals of Oncology, published by the European Society for Medical Oncology.
SOURCE
Saturday, February 26, 2011
Eating more than three slices of ham a day DOES increase the risk of bowel cancer, say government experts
More epidemiological and theoretical speculation sourced from the sensation-mongering WCRF
You should limit the amount of red meat you eat to the equivalent of three slices of ham, one lamb chop or two slices of roast beef a day, Government advisors have warned. The Scientific Advisory Committee on Nutrition (SACN) published recommendations designed to cut the risk of bowel cancer.
The latest findings are bound to muddy the already confusing debate around the nutritional benefits of red meat. Only last week a British Nutrition Foundation study claimed that the majority of adults ate ‘healthy amounts’ of red meat and there was an ‘inconclusive’ link to cancer. However, the government insists that people who eat 90g or more of red and processed meat a day should cut back. Cutting down to the UK average of 70g a day can help reduce the risk, the study from SACN said.
Red meat contains substances that have been linked to bowel cancer. One compound in particular, haem, which gives red meat its colour, has been shown to damage the lining of the colon in some studies.
The World Cancer Research Fund (WCRF) recommends limiting red meat consumption to 500g a week of cooked weight (about 700g to 750g uncooked). And it says people should avoid processed meats altogether because of the even higher risk of bowel cancer.
The charity estimated 3,800 cases of bowel cancer could be prevented every year if everyone ate less than 70g of processed meat a week. Some 1,900 cases of bowel cancer could also be prevented through cutting red meat consumption to under 70g per week.
Processed meat is generally defined as any meat preserved by smoking, curing or salting, or with chemical preservatives added to it. It is thought this process causes the formation of carcinogens, which can damage cells in the body and allow cancer to develop.
To help consumers the Government published a list today of what is considered a 70g portion of red or processed meat. These are: one medium portion of shepherd's pie and a rasher of bacon; two standard beef burgers; six slices of salami; one lamb chop; two slices of roast lamb, beef or pork; or three slices of ham. Some 90g of cooked meat is equivalent to about 130g of uncooked meat, due to the loss of water during cooking.
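[The cooked-to-uncooked conversion quoted above implies a simple rule of thumb, roughly 0.69g of cooked meat per gram uncooked. A quick illustration of my own, using only the figures in this article:]

    # 90g cooked corresponds to about 130g uncooked, per the article above.
    COOKED_PER_UNCOOKED = 90 / 130   # roughly 0.69

    def uncooked_weight(cooked_g):
        return cooked_g / COOKED_PER_UNCOOKED

    print(round(uncooked_weight(70)))    # the 70g/day cooked guideline is about 101g uncooked
    print(round(uncooked_weight(500)))   # WCRF's 500g/week cooked limit is about 722g uncooked,
                                         # consistent with the 700g-750g range quoted earlier
    print(70 * 7)                        # 490g cooked per week, close to the WCRF 500g weekly figure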
Men are more likely to eat a lot of red and processed meat - 42 per cent eat more than 90g a day compared to 12 per cent of women.
Interim chief medical officer, Professor Dame Sally Davies, said: 'Following simple diet and lifestyle advice can help protect against cancer.
'Red meat can be part of a healthy balanced diet. It is a good source of protein and vitamins and minerals, such as iron, selenium, zinc and B vitamins. 'But people who eat a lot of red and processed meat should consider cutting down. 'The occasional steak or extra few slices of lamb is fine but regularly eating a lot could increase your risk of bowel cancer.'
Experts estimate the average Briton's lifetime risk of bowel cancer to be about 5 per cent. This rises to 6 per cent if people eat an extra 50g of processed meat a day on top of what they already consume.
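[Those figures are easier to judge when split into absolute and relative risk. A quick illustration of my own, using only the numbers in the paragraph above:]

    # Absolute vs relative risk, using the lifetime figures quoted above.
    baseline = 0.05      # average Briton's lifetime bowel-cancer risk
    with_extra = 0.06    # with an extra 50g of processed meat a day

    absolute_increase = with_extra - baseline         # about 0.01, i.e. one percentage point
    relative_increase = absolute_increase / baseline  # about 0.20, i.e. a 20% relative increase

    print(f"{absolute_increase:.0%} absolute, {relative_increase:.0%} relative")
    # Out of 100 people, roughly 5 would develop bowel cancer anyway; the extra meat adds about one more.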
Mark Flannagan, chief executive of Beating Bowel Cancer, welcomed the advice. 'The evidence suggests that a diet high in red and processed meat may increase your risk of developing bowel cancer, but the good news is that red meat can still be enjoyed in moderation as part of a healthy balanced diet. 'This combined with an active lifestyle, and awareness of the symptoms and risk factors, could help protect you from the UK's second biggest cancer killer.'
Dr Rachel Thompson, deputy head of science for the World Cancer Research Fund, said: 'We welcome the fact that this report recognises the strong evidence that it increases risk of bowel cancer. 'We are also pleased that its suggested maximum intake is similar to the 500g per week (cooked weight) limit that World Cancer Research Fund recommends.
'However, our report made the distinction between red and processed meat and we recommended that while people should limit intake of red meat, they should avoid processed meat. 'This means that we would suggest that people following this new report's guidelines should try and make sure as little as possible of their 70g per day is processed.'
Peter Baker, chief executive of the Men's Health Forum, said: 'Men who enjoy regular breakfast fry-ups or roast beef dinners will be surprised to learn that eating too much red or processed meat might increase their risk of bowel cancer.
'We're not saying men can't occasionally enjoy a bacon sandwich or some sausages for breakfast - but the evidence tells us we need to think about cutting down on how much red and processed meat we're eating. 'This is a health benefit surely worth giving up a few sausages for.'
Last year, experts from the Harvard School of Public Health in the U.S. found that eating processed meats can increase the risk of heart disease and diabetes.
The round-up of 20 studies published worldwide found people who eat processed meats have a 42 per cent higher risk of heart disease and a 19 per cent increased risk of Type 2 diabetes. However, unprocessed red meats, such as beef, pork or lamb, do not raise the risk, the study found.
SOURCE
Another hymn of praise to the virtues of nuts
The fact that antioxidants shorten your lifespan is not mentioned, funnily enough. Even if it's true that they are good for your heart, they are obviously bad for other things
Eating pecan nuts can lower the risk of developing heart disease or cancer, say researchers. A study showed their naturally occurring antioxidants help reduce inflammation in the arteries.
The nuts are particularly rich in one form of the antioxidant vitamin E called gamma-tocopherol, and the study showed that its levels in the body doubled eight hours after eating pecans.
The researchers analysed 16 men and women who ate a sequence of three diets, one of whole pecans, one of pecans blended with water, and a neutral ‘control’ meal. Even after three hours, unhealthy oxidation of ‘bad’ cholesterol in the blood – which can cause heart problems – fell by up to a third.
‘Previous research has shown that pecans contain antioxidant factors. Our study shows these antioxidants are indeed absorbed in the body and provide a protective effect against diseases,’ said Ella Haddad, of California’s Loma Linda University, whose findings were published in The Journal of Nutrition. ‘This protective effect is important in helping to prevent development of various diseases such as cancer and heart disease.’
It is only the latest evidence that nuts can boost health. Walnuts help lower cholesterol, while almonds are a great source of bone-building calcium and Brazil nuts are high in the antioxidant selenium, linked to preventing some cancers.
'Our tests show that eating pecans increases the amount of healthy antioxidants in the body,' said LLU researcher Ella Haddad, associate professor in the School of Public Health department of nutrition. 'This protective effect is important in helping to prevent development of various diseases such as cancer and heart disease.'
Dr Haddad added: 'This study is another piece of evidence that pecans are a healthy food. Previous research has shown that pecans contain antioxidant factors. Our study shows these antioxidants are indeed absorbed in the body and provide a protective effect against diseases.'
Research from Loma Linda University published earlier in the Journal of Nutrition showed that a pecan-enriched diet lowered levels of LDL cholesterol by 16.5 percent - more than twice the American Heart Association's Step I diet, which was used as the control diet in that study. Similarly, the pecan-enriched diet lowered total cholesterol levels by 11.3 percent.
SOURCE
Friday, February 25, 2011
SCOTUS limits lawsuits against vaccine makers
A win for all. Legal burdens could easily wipe out vaccine provision
Vaccine makers such as Pfizer are breathing much easier today: The Supreme Court ruled they can't be sued for defective vaccine designs. That puts the kibosh on some 5,000 cases in which parents blame vaccines for their children's autism, and generally gives the pharmaceutical companies much more certainty about their potential liability. The decision was 6-2, with Justice Elena Kagan sitting out. Justices Sonia Sotomayor and Ruth Bader Ginsburg dissented.
When a product hurts someone, one possible way the victim can sue is to claim that the product was designed defectively. Claiming a defective design is the tricky one, because products can be inherently dangerous but still be good products -- chainsaws or cars, for example. While different courts use different tests to determine if a design was defective, the basic idea is to strike a balance between the product's usefulness as intended and the risks it creates.
Let's apply this to a hypothetical vaccine: Imagine it's possible to have a vaccine that has no side effects, but isn't particularly effective at preventing the disease it targets, and another formulation that's extremely effective at disease prevention, but does have side effects, including -- in very rare cases -- horrible ones such as brain damage or even death. Because of society's interest in promoting the use of effective vaccines -- when someone fails to immunize their child, both that child and other people are put at risk, particularly infants and the elderly -- Congress has emphatically endorsed vaccines that are effective but have some rare risks as the better design.
To prevent vaccine makers from being bankrupted by lawsuits over those rare but horrible side effects, Congress created a special vaccine compensation program for victims. If a victim's claim is rejected by that program, the victim can still sue under state law, claiming the vaccine was defective. In the case the Court just decided, the issue was whether Congress allowed all kinds of defective vaccine claims, or just claims a vaccine was defectively manufactured or defectively marketed (i.e.: that the maker failed to warn users of known risks.)
The court heard the case of Hannah Bruesewitz, who developed a seizure disorder and mental disabilities as an infant, shortly after being given Wyeth's DPT vaccine. (Wyeth is now part of Pfizer.) Her family's claim for compensation was rejected by the federal program, and they turned to state court, alleging the vaccine was defectively designed because Wyeth owned a different design that was safer, but chose not to market it. Wyeth disputes the claim that the other design was safer.
In deciding that the Bruesewitz family's claim was barred, the Court turned to the text and structure of the law creating the federal compensation program. Effectively the decision turned on what the three words "even though" and "unavoidable" meant in this context:
"No vaccine manufacturer shall be liable [for] a vaccine-related injury or death...if the injury or death resulted from side effects that were unavoidable even though the vaccine was properly prepared and was accompanied by proper directions and warnings."
Justice Scalia, a famous textualist, wrote the opinion. The majority decided that the language meant that only manufacturing and marketing defects, which follow the "even though", are allowed. If design defects were allowed, it should have said something like "properly prepared and designed and accompanied..."
The dissent disagreed, arguing that the word "unavoidable" had a special meaning from a law treatise, which changed the analysis, and Congress hadn't clearly said it was preempting the state claims. Finally, the dissenters emphasized that shielding vaccine makers from design defect cases eliminates a powerful incentive for manufacturers to keep improving their designs. As to the last argument, Scalia conceded the tort liability motive for design improvement was indeed eliminated by his opinion, but insisted the law provided other incentives.
Scalia, who has a reputation for witty, readable and caustic opinions, clearly reveled in parsing the sentence's structure as well as talking trash (in polite Supreme Court fashion) about Sotomayor and Ginsburg's dissent. For example, Justice Scalia noted that the "even though" clause is known as a "concessive subordinate clause by grammarians" and said things like "dissent's principal textual argument is mistaken ... We do not comprehend ... its reasoning."
Justice Scalia also took a passing swipe at Congress, noting that it had been unnecessarily wordy:
"Petitioners and the dissent contend that the interpretation we propose would render part of [the vaccine law] superfluous: ... ("the injury or death resulted from side effects that were unavoidable even though") is unnecessary. True enough. But the rule against giving a portion of text an interpretation which renders it superfluous does not prescribe that a passage which could have been more terse does not mean what it says."
At the end of the day, vaccine makers win. Society also wins -- especially if the vaccine makers' threats of withdrawing from the business if they lost this case were sincere. But the Bruesewitz family loses, painfully, as their now teenage daughter still suffers from the seizure condition and mental disabilities. And if the dissent is right, and manufacturers will fail to improve their vaccine designs as a result of the decision, we will all lose eventually as side effects that could have been reduced or eliminated continue to hurt people.
SOURCE
Moderate amounts of alcohol protect against heart disease
But the effect is small
Drinking a glass of wine or pint of beer every evening reduces the risk of heart disease by up to a quarter, according to research.
Just days after a warning that Britain faces up to 250,000 extra deaths from liver disease unless its binge-drinking culture is tackled, two reports claim that moderate amounts of alcohol are actually good for the health.
They say that a small glass of wine for women and up to two bottles or one pint of beer can prevent the build-up of bad cholesterol, and so sensible drinkers are at lower risk of developing heart disease than teetotallers.
This is because alcohol taken in moderation increases the amount of “good” cholesterol circulating in the body.
Prof William Ghali, of the University of Calgary, said: “With respect to public health messages there may now be an impetus to better communicate to the public that alcohol, in moderation, may have overall health benefits that outweigh the risks in selected subsets of patients - any such strategy would need to be accompanied by rigorous study and oversight of impacts.”
However Cathy Ross, senior cardiac nurse at the British Heart Foundation, warned: “Drinking more than sensible amounts of alcohol does not offer any protection and can cause high blood pressure, stroke, some cancers and damage to your heart.
“If you don't drink, this is not a reason to start. Similar results can be achieved by being physically active and eating a balanced and healthy diet.”
In the first study, published online by the BMJ on Wednesday, Prof Ghali and colleagues reviewed 84 previous studies of alcohol consumption and disease. They compared the number of drinkers and non-drinkers who suffered, or died from, heart disease and stroke, and concluded that “alcohol consumption was associated with lower risk” of between 14 and 25 per cent.
The second paper, led by Dr Susan Brien at the same Canadian university, looked at 63 existing studies on alcohol consumption and cholesterol and fat levels. It concluded that consumption of one drink (of about 15g of alcohol) for women and two for men was good for the health, and that the benefit was felt regardless of whether beer, wine or spirits was drunk.
The study said that alcohol “significantly increased” high-density lipoprotein cholesterol, the “good” form that cleans blood vessel walls of “bad” cholesterol and returns it to the liver, preventing the build ups that can lead to heart disease.
SOURCE
Thursday, February 24, 2011
The "too clean" theory of asthma rises again. It even makes the Wall St. Journal!
The germ theory has been in some eclipse in recent years because of some awkward epidemiological facts. For instance, Australian Aborigines often live in extraordinary squalor but don't seem to be protected from anything because of that. In fact they have quite high rates of autoimmune diseases such as diabetes
So how do we evaluate the findings below? It's a bit difficult as the article in NEJM has not yet appeared but there are at least two possibilities. The most favourable to the theory is that it is not the overall bacterial load that matters but rather just some bacteria. So southern German farmhouses might have the helpful bacteria but Aboriginal camps may not. That is not inherently absurd but would be very much in need of proof, considering that both populations have extensive contact with all the world's infective agents via the modern-day "global village".
The second much more skeptical possibility derives from the fact that we are only looking at epidemiology here -- so all causal links are speculative. For instance, it has recently been found that Paracetamol (Tylenol) use in children under 15 months doubles their chance of getting asthma. So maybe the "dirty" farms were less health conscious in general and so used fewer medications, including paracetamol. Isn't epidemiology wonderful?
The possibilities are endless, in fact. It was found last year, for instance, that receptors for bitter tastes are not confined to the tongue but are also found in the smooth muscles of the lungs and airways. And bitter tastes RELAX those airways. So in doing any epidemiological comparisons of asthma incidence, we would have to ask whether the different groups used in the research differed in their preferences for bitter drinks, including, of course, beer!
OK. I could go on but I will have mercy at that point
Children living on farms have a lower risk of asthma than children who don't because they are surrounded by a greater variety of germs, according to two large-scale studies published Wednesday.
The prevalence of asthma in the U.S. has doubled over the past 30 years, and one theory for the increase blames urban and suburban living environments that are too clean. The latest findings, published in the New England Journal of Medicine, bolster what is often known as the hygiene theory, which says that contact with bacteria and other microbes is necessary to building a normal immune system.
The key appears to be exposure to a diversity of bugs, not just more of them, according to Markus Ege, an epidemiologist at the Children's Hospital of Munich and first author on the paper that covered both studies. "Bacteria can be beneficial for asthma," said Dr. Ege. "You have to have microbes that educate the immune system. But you have to have the right ones."
Previous research, including some conducted by Dr. Ege's group, has found that children raised on farms exhibit substantially reduced risk for asthma and allergies—lower by 30% or more—than those raised elsewhere. Though scientists had hypothesized that the difference was linked to germs, they also had to determine whether it could be due to other elements of farm life such as fresh air, exposure to farm animals, or dietary factors like drinking raw milk.
The latest study helps untangle that question by providing evidence that the reduction in risk is indeed significantly related to the variety of bacteria and other bugs a child is exposed to, according to James Gern, a professor of pediatrics and medicine at the University of Wisconsin-Madison who wrote an editorial to accompany the paper in the journal but wasn't involved in the study.
In Wednesday's paper, the researchers surveyed and collected samples of house dust in two studies of children from Southern Germany, Austria and Switzerland. One study comprised 6,800 children, about half of whom lived on farms, and the other studied nearly 9,700 children, 16% of whom were raised on a farm. Researchers then examined the dust for presence and type of microbes.
Those living on farms were exposed to a greater variety of bugs and also had a lower risk of asthma. There was evidence that exposure to a particular type of bacteria, known as gram-negative rods, was also related to lower rates of allergic responses.
Identifying which microbes are beneficial to the immune system is important because those germs could help the development of new treatments or vaccines to prevent asthma, Dr. Ege said. His group is now studying some of the microbes in greater detail.
The findings don't yield much in the way of practical suggestions, however. Dr. Ege said it wouldn't help for parents to take their children to a farm two or three times a year or to get a dog or other pet for the purpose of exposing their children to microbes, since the biggest effect appeared to be related to prolonged exposure to cows and pigs.
SOURCE
Could your blood group determine your health?
Since different blood types carry different antigens, it is not surprising that they might vary in their ability to fight different diseases. They may well have evolved to fight the threats most common in their original local environments. In modern populations, however, differences in disease resistance would appear to be small
Could your blood group determine your risk of major cancers, infertility and stomach ulcers, as well as diseases such as cholera and malaria? For years, the idea that blood groups had any medical significance beyond blood transfusions was dismissed by scientists.
It hasn’t been helped by the celebrity favourite, the ‘blood group diet’, which claims your blood type determines how your body responds to certain foods.
But a growing number of studies is revealing how our blood groups may make us more prone to lethal illnesses — or even protect us from them.
The latest research into blood types shows that having group O blood can lower your risk of heart attacks. Researchers at Pennsylvania University discovered this benefit in a study involving 20,000 people. Their research, to be published in The Lancet, found that most people who have a gene called Adamts7 face a significantly raised risk of suffering a heart attack. But in people with blood group O who have the Adamts7, there is no raised risk.
Dr Muredach Reilly, the lead researcher, says this knowledge may help to develop new therapies for people at risk of heart attacks. Such drugs may mimic the beneficial effect of the O blood group gene.
Only 40 per cent of people in Britain know what their group is, according to the National Blood Service. But in future, we may be far more keen to learn it — and to understand its life-saving implications. Our blood group is determined by genes inherited from our parents.
Millennia of evolution have split human blood into four types: A, B, AB and O — around 44 per cent of Britons are type O, 42 per cent are type A, 10 per cent type B and 4 per cent are AB.
What distinguishes each type are their antigens (the immune defence systems) on the surface of the red blood cells. Each blood group type evolved to provide defences against lethal diseases.
But each has its own weaknesses, too. People with type O blood are at less risk of dying from malaria than people with other blood groups. But they are more vulnerable to cholera and stomach ulcers caused by viruses and bacteria.
For a long time, the study of blood groups and disease was discredited — thanks to the Nazis. Otto Reche, a Nazi German ‘professor of racial science’, claimed in the Thirties that pure Aryans all had blood type A. The main ‘enemy’ blood group was, he said, B type. He used this to identify ‘inferior’ races for persecution during Hitler’s rise to power.
While such claims are scientifically absurd, in Japan there is still widespread discrimination on the grounds of blood group. In the Twenties, Japanese scientists claimed blood groups produced different personalities. The idea became so ingrained that in World War II, the Imperial Army formed battle groups based on blood type.
The idea resurfaced in the Seventies and a rash of Japanese best-sellers has spread the belief that type As are sensitive but anxious; Type Bs are cheerful but focused; Os are outgoing but stubborn; and ABs are arty and unpredictable.
This theory has a dark side. Bura-hara (blood-group harassment) is common in Japan. Company chiefs often consider candidates’ blood types when picking staff. Children at some kindergartens are also divided by blood type. Matchmaking agencies provide blood-type compatibility tests.
Nevertheless, there is serious science behind the idea that blood groups can hold the secret to fighting deadly diseases.
In the Fifties, research at four London hospitals found the risk of developing gastric cancer was much higher for people with blood group A than for those with blood group O. But people with group O had a greater risk of peptic ulcers.
This month, those findings have been confirmed by investigators at Sweden’s Karolinska Institute, which studied more than a million people over a period of 35 years. The lead researcher, Dr Gustaf Edgren, says people with group A may be more susceptible to gastric cancer risks such as smoking, alcohol and use of non-steroidal anti-inflammatory drugs. Type O people may be more vulnerable to a bacterium that can cause peptic ulcers, Helicobacter pylori.
Last October, U.S. scientists showed that a woman’s blood group can affect her chances of becoming pregnant. The study of more than 560 women undertaking fertility treatment found that those with blood type O were up to twice as likely to have a lower egg count and poorer egg quality, which could affect the chances of conceiving. Women with blood group A seemed to be better protected against their egg counts falling over time.
Researcher Edward Nejat, from New York’s Albert Einstein College, says the exact reason for a link between blood group and ovarian reserve is not clear.
Blood groups have been linked to other reproductive troubles. Last month, a study at Harvard University found that women with AB or B group blood have a raised risk of developing ovarian cancer.
There are also fears that AB blood may double or even treble the risk of pregnant mothers suffering from the potentially lethal blood pressure condition pre-eclampsia. This finding could be harnessed to identify women at higher risk.
Other research has found that people with type AB and B blood have a much higher risk of developing pancreatic cancer.
Meanwhile, people with type O might be less at risk of cancer, but research shows they are also more vulnerable than others to norovirus, the potentially lethal vomiting and diarrhoea bug.
And men with type O might be more prone to piling on the pounds, say Danish researchers. They have found that type O males who are exposed routinely to pollution at work have a significantly raised risk of obesity compared with men of other blood types.
The researchers, at Copenhagen’s Bispebjerg University Hospital, speculate that the pollution sets off chronic inflammatory responses in the men’s bodies that can result in them becoming overweight. It’s a good excuse anyway.
Taken overall, such a weight of medical evidence might prompt us to question why we are not told of the health threats we might face due to our blood type. But in the UK, there is little work in this field.
Professor Mike Murphy, of the NHS Blood and Transplant authority, says: ‘Our colleagues in the U.S. have become increasingly involved in this type of research, particularly in trying to harness the power of blood types to fight infectious diseases. But the interest in Britain is sparse.’
Meanwhile, a lone group of British researchers is trying to turn blood-group science into a bona-fide lifesaver in one area: malaria. The effort is being led by Alex Rowe, an infection specialist at Edinburgh University’s School of Biological Sciences. Her work shows that people with blood group O are resistant to the tropical disease, which kills millions every year.
SOURCE
The germ theory has been in some eclipse in recent years because of some awkward epidemiological facts. For instance, Australian Aborigines often live in extraordinary squalor but don't seem to be protected from anything because of that. In fact they have quite high rates of autoimmune diseases such as diabetes
So how do we evaluate the findings below? It's a bit difficult as the article in NEJM has not yet appeared but there are at least two possibilities. The most favourable to the theory is that it is not the overall bacterial load that matters but rather just some bacteria. So southern German farmhouses might have the helpful bacteria but Aboriginal camps may not. That is not inherently absurd but would be very much in need of proof, considering that both populations have extensive contact with all the world's infective agents via the modern-day "global village".
The second much more skeptical possibility derives from the fact that we are only looking at epidemiology here -- so all causal links are speculative. For instance, it has recently been found that Paracetamol (Tylenol) use in children under 15 months doubles their chance of getting asthma. So maybe the "dirty" farms were less health conscious in general and so used fewer medications, including paracetamol. Isn't epidemiology wonderful?
The possibilities are endless, in fact. It was found last year, for instance, that receptors for bitter tastes are not confined to the tongue but are also found in the smooth muscles of the lungs and airways. And bitter tastes RELAX those airways. So in doing any epidemiological comparisons of asthma incidence, we would have to ask whether the different groups used in the research differed in their preferences for bitter drinks, including, of course, beer!
OK. I could go on but I will have mercy at that point
Children living on farms have a lower risk of asthma than children who don't because they are surrounded by a greater variety of germs, according to two large-scale studies published Wednesday.
The prevalence of asthma in the U.S. has doubled over the past 30 years, and one theory for the increase blames urban and suburban living environments that are too clean. The latest findings, published in the New England Journal of Medicine, bolster what is often known as the hygiene theory, which says that contact with bacteria and other microbes is necessary to building a normal immune system.
The key appears to be exposure to a diversity of bugs, not just more of them, according to Markus Ege, an epidemiologist at the Children's Hospital of Munich and first author on the paper that covered both studies. "Bacteria can be beneficial for asthma," said Dr. Ege. "You have to have microbes that educate the immune system. But you have to have the right ones."
Previous research, including some conducted by Dr. Ege's group, has found that children raised on farms exhibit substantially reduced risk for asthma and allergies—lower by 30% or more—than those raised elsewhere. Though scientists had hypothesized that the difference was linked to germs, they also had to determine whether it could be due to other elements of farm life such as fresh air, exposure to farm animals, or dietary factors like drinking raw milk.
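As a rough illustration of what a risk "lower by 30%" means in studies like these, here is a minimal relative-risk calculation in Python. The counts are invented purely for illustration and are not taken from the paper.

# Minimal sketch: relative risk of asthma, farm vs non-farm children.
# The counts below are hypothetical, chosen only to illustrate a ~30% lower risk.
farm_asthma, farm_total = 70, 1000          # 7.0% with asthma among farm children
nonfarm_asthma, nonfarm_total = 100, 1000   # 10.0% among non-farm children

risk_farm = farm_asthma / farm_total
risk_nonfarm = nonfarm_asthma / nonfarm_total
relative_risk = risk_farm / risk_nonfarm

print(f"Relative risk: {relative_risk:.2f}")         # 0.70
print(f"Risk reduction: {1 - relative_risk:.0%}")    # 30%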
The latest study helps untangle that question by providing evidence that the reduction in risk is indeed significantly related to the variety of bacteria and other bugs a child is exposed to, according to James Gern, a professor of pediatrics and medicine at the University of Wisconsin-Madison who wrote an editorial to accompany the paper in the journal but wasn't involved in the study.
In Wednesday's paper, the researchers surveyed and collected samples of house dust in two studies of children from Southern Germany, Austria and Switzerland. One study comprised 6,800 children, about half of whom lived on farms, and the other studied nearly 9,700 children, 16% of whom were raised on a farm. Researchers then examined the dust for presence and type of microbes.
Those living on farms were exposed to a greater variety of bugs and also had a lower risk of asthma. There was evidence that exposure to a particular type of bacteria, known as gram-negative rods, was also related to lower rates of allergic responses.
Identifying which microbes are beneficial to the immune system is important because those germs could help the development of new treatments or vaccines to prevent asthma, Dr. Ege said. His group is now studying some of the microbes in greater detail.
The findings don't yield much in the way of practical suggestions, however. Dr. Ege said it wouldn't help for parents to take their children to a farm two or three times a year or to get a dog or other pet for the purpose of exposing their children to microbes, since the biggest effect appeared to be related to prolonged exposure to cows and pigs.
SOURCE
Wednesday, February 23, 2011
Organic produce 'not as good for your health': Vegetables grown with pesticides contain MORE vitamins
The organic approach to gardening, which avoids chemicals, will not deliver healthier or tastier produce, it is claimed.
A controversial study from Which? Gardening suggests produce grown using modern artificial methods may well be better for you.
The claims, which will alarm producers and consumers who put their faith in natural food, follow a two-year study.
Non-organic broccoli was found to have significantly higher levels of antioxidants than organically grown samples. Antioxidants are beneficial chemicals that are said to improve general health and help prevent cancer.
The research found that non-organic potatoes contained more Vitamin C than the organic crop, and expert tasters found that non-organically grown tomatoes had a stronger flavour than the organic samples.
Organic bodies have rejected the claims, insisting the trial was too small to offer meaningful results.
SOURCE
Using mobile phones 'does not increase the risk of cancer'
The only reason this is still an issue is that a lot of conceited people hate anything that is popular and need to feel that they know better
Using a mobile phone does not increase the risk of getting brain cancer, claim British scientists. There has been virtually no change in rates of the disease - despite around 70 million mobile phones being used in the UK.
A study by scientists at the University of Manchester looked at data from the Office for National Statistics on rates of newly diagnosed brain cancers in England between 1998 and 2007. It found no statistically significant change in the incidence of brain cancers in men or women during the nine-year period.
The study, published in the journal Bioelectromagnetics, suggests radio frequency exposure from mobile phone use has not led to a 'noticeable increase' in the risk of developing brain cancers.
Lead researcher Dr Frank de Vocht, an expert in occupational and environmental health in the University of Manchester’s School of Community-Based Medicine, said it was 'unlikely we are at the forefront of a cancer epidemic'.
He said: 'Mobile phone use in the United Kingdom and other countries has risen steeply since the early 1990s when the first digital mobile phones were introduced.
'There is an ongoing controversy about whether radio frequency exposure from mobile phones increases the risk of brain cancer. Our findings indicate that a causal link between mobile phone use and cancer is unlikely because there is no evidence of any significant increase in the disease since their introduction and rapid proliferation.'
The study says there is no 'plausible biological mechanism' for radio waves to directly damage genes, resulting in cells becoming cancerous. If they are related to cancer, they are more likely to promote growth in an existing brain tumour.
The researchers said they would expect an increase in the number of diagnosed cases of brain cancer to appear within five to 10 years of the introduction of mobile phones and for this to continue as mobile use became more widespread.
The time period studied, between 1998 and 2007, would relate to exposure from 1990 to 2002 when mobile phone use in the UK increased from zero to 65 per cent of households.
The team, which included researchers from the Institute of Occupational Medicine in Edinburgh and Drexel University, Philadelphia, found a small increase in the incidence of cancers in the temporal lobe of 0.6 cases per 100,000 people or 31 extra cases per year in a population of 52 million.
Brain cancers of the parietal lobe, cerebrum and cerebellum in men actually fell slightly between 1998 and 2007. 'Our research suggests that the increased and widespread use of mobile phones, which in some studies was associated with increased brain cancer risk, has not led to a noticeable increase in the incidence of brain cancer in England between 1998 and 2007,' said Dr de Vocht.
'It is very unlikely that we are at the forefront of a brain cancer epidemic related to mobile phones, as some have suggested, although we did observe a small increased rate of brain cancers in the temporal lobe.
'However, to put this into perspective, if this specific rise in tumour incidence was caused by mobile phone use, it would contribute to less than one additional case per 100,000 population in a decade.
'We cannot exclude the possibility that there are people who are susceptible to radio-frequency exposure or that some rare brain cancers are associated with it but we interpret our data as not indicating a pressing need to implement public health measures to reduce radio-frequency exposure from mobile phones.'
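For what it is worth, the figures quoted above hang together if the 0.6 extra cases per 100,000 is read as the rise over the whole decade studied. A quick back-of-envelope check (my reading of the numbers, not something spelled out in the report):

# Back-of-envelope check of the temporal-lobe figures quoted above.
# Assumption: the 0.6 extra cases per 100,000 is the increase over the whole decade.
extra_per_100k_per_decade = 0.6
population = 52_000_000

extra_per_year = (extra_per_100k_per_decade / 10) * (population / 100_000)
print(round(extra_per_year))   # about 31 extra cases per year, matching the figure given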
SOURCE
Tuesday, February 22, 2011
How dumb can officialdom get?
Only one person out of over 1,900 met the AHA's definition of ideal heart health -- yet it doesn't occur to them that their criteria are wrong. Procrustes obviously has many modern-day followers.
All the nonagenarians tottering around the place must have good hearts. How about using them as a criterion for heart health? That would put the cat among the pigeons! A lot of them smoke, drink, are inactive, grew up on high fat foods etc.
Only one out of more than 1,900 people evaluated met the American Heart Association (AHA) definition of ideal cardiovascular health, according to a new study led by researchers at the University of Pittsburgh School of Medicine.
Their findings were recently published online in Circulation.
Ideal cardiovascular health is the combination of these seven factors: nonsmoking, a body mass index less than 25, goal-level physical activity and healthy diet, untreated cholesterol below 200, blood pressure below 120/80 and fasting blood sugar below 100, explained senior investigator and cardiologist Steven Reis, M.D., associate vice chancellor for clinical research at Pitt.
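To make the definition concrete, here is a minimal sketch of how the seven-point check might be scored for one person. The field names, and the mg/dL units for cholesterol and blood sugar, are my assumptions for illustration rather than anything taken from the study itself.

# Sketch of the AHA 'ideal cardiovascular health' definition described above.
# Field names and mg/dL units are assumptions made for illustration only.
def ideal_cardiovascular_health(p):
    checks = [
        not p["smoker"],                                  # nonsmoking
        p["bmi"] < 25,                                    # body mass index below 25
        p["meets_activity_goal"],                         # goal-level physical activity
        p["healthy_diet"],                                # healthy diet met
        p["cholesterol_mg_dl"] < 200,                     # untreated total cholesterol
        p["systolic"] < 120 and p["diastolic"] < 80,      # blood pressure below 120/80
        p["fasting_glucose_mg_dl"] < 100,                 # fasting blood sugar
    ]
    return all(checks)

person = {"smoker": False, "bmi": 24.0, "meets_activity_goal": True,
          "healthy_diet": True, "cholesterol_mg_dl": 185,
          "systolic": 118, "diastolic": 76, "fasting_glucose_mg_dl": 92}
print(ideal_cardiovascular_health(person))   # True -- the study found only 1 such person in 1,933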
"Of all the people we assessed, only one out of 1,900 could claim ideal heart health," said Dr. Reis. "This tells us that the current prevalence of heart health is extremely low, and that we have a great challenge ahead of us to attain the AHA's aim of a 20 percent improvement in cardiovascular health rates by 2020."
As part of the Heart Strategies Concentrating on Risk Evaluation (Heart SCORE) study, the researchers evaluated 1,933 people ages 45 to 75 in Allegheny County with surveys, physical exams and blood tests. Less than 10 percent met five or more criteria; 2 percent met the four heart-healthy behaviors; and 1.4 percent met all three heart-healthy factors. After adjustment for age, sex and income level, blacks had 82 percent lower odds than whites of meeting five or more criteria.
A multipronged approach, including change at the individual level, the social and physical environment, policy and access to care, will be needed to help people not only avoid heart disease, but also attain heart health, Dr. Reis said.
"Many of our study participants were overweight or obese, and that likely had a powerful influence on the other behaviors and factors," he noted. "Our next step is to analyze additional data to confirm this and, based on the results, try to develop a multifaceted approach to improve health. That could include identifying predictors of success or failure at adhering to the guidelines."
SOURCE
Daily pill may stop the ringing in your ears
The trials of this theory have not yet even begun!
A mineral found in spinach and other green leafy vegetables is being used to treat people with chronic tinnitus — characterised by an inexplicable ringing or buzzing in the ears. Researchers believe the mineral magnesium plays a key role in protecting our hearing system and that supplements taken daily will reduce tinnitus.
This condition is believed to permanently affect one in ten adults, with one in three people experiencing it at some point in their life. The clinical trial of 40 patients, at the Mayo Clinic in Arizona, America, is due to start this month. The trial subjects will be split into two groups; one will take a 535mg magnesium tablet every day, while the other group will take a daily placebo pill.
The trial follows previous studies that linked low levels of magnesium in the body to a higher risk of noise-induced hearing loss.
Tinnitus is usually accompanied by some hearing loss and researchers believe the same biological malfunction in our body’s hearing system may cause both conditions.
Tinnitus is triggered by a range of factors, such as ear infections, adverse reactions to some medications (such as aspirin), high blood pressure or age-related hearing damage. Prolonged exposure to loud noise can also trigger it and sufferers include musicians Phil Collins and Eric Clapton.
Tinnitus can affect one or both ears and there is no cure. The condition is linked to problems with hair cells in the inner ear. These cells vibrate in response to sound waves and these vibrations are translated into electrical signals which are sent to the brain via nerves.
When these cells become weakened or damaged — through infection or over-exposure to loud noise, for instance — they send a constant stream of abnormal signals along the nerves. The brain interprets these signals as sounds of ringing, humming or buzzing. Damage to these hair cells also causes deafness.
Magnesium is needed to help maintain normal nerve function in the body and good sources include green leafy vegetables, bread and dairy products.
The UK recommended daily intake is 300mg. Higher doses can trigger diarrhoea, stomach cramps and cause complications in patients with kidney disease. Therefore, they should be taken only under medical supervision, say the scientists.
The team believe a lack of the mineral in the hair cells may contribute to tinnitus.
One function of magnesium is to stop too much calcium being released by the body. Calcium causes small blood vessels to narrow and a lack of blood flow to the hair cells is thought to contribute to the condition as it reduces their supply of oxygen and nutrients. Another theory is that magnesium blocks glutamate, a brain chemical responsible for sending signals between nerve cells.
Although this chemical is important for relaying messages throughout the body, too much of it can damage nerve cells, especially in the body’s hearing system. Previous studies suggest that exposure to loud noise triggers the over-production of glutamate.
Dr Ralph Holme, head of biomedical research at the Royal National Institute for Deaf People (RNID), says: ‘Everyday life can often be frustrating and distressing for people experiencing tinnitus, and RNID is keen to see effective treatments developed to cure or treat the condition. ‘Only a small group of people are being tested in this study, so it will be hard for researchers to show whether a magnesium supplement can meaningfully reduce the effects of tinnitus. But, the research may encourage future larger-scale trials.’
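Dr Holme's caution about the small sample can be made concrete with a rough power calculation. This is only a sketch: it assumes a modest true effect (Cohen's d of 0.5) and the trial's 20 patients per arm, and uses the statsmodels package purely for illustration.

# Rough power estimate for a 40-patient, two-arm trial (20 per group),
# assuming a modest true effect size (Cohen's d = 0.5). Illustrative only.
from statsmodels.stats.power import TTestIndPower

power = TTestIndPower().solve_power(effect_size=0.5, nobs1=20, alpha=0.05)
print(f"Power: {power:.2f}")   # roughly 0.34 -- a modest effect would usually be missed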
Elsewhere, researchers are testing a new treatment for hearing loss in people who listen to loud music or work in noisy environments. The trial, which is being conducted in the U.S., Spain and Sweden, will involve 60 young people who use MP3 players, 25 army officers taking part in combat training in Sweden, 130 Nato soldiers and 120 factory workers. Half of the group will be given a placebo, while the other half will be given a daily pill containing the antioxidants beta-carotene and vitamins C and E. The team hope these antioxidants will help protect the hearing cells in the ear.
The group, who have good hearing, will take the pill for two years and will be tested throughout this time.
Animal studies have found this combination of compounds can be effective in protecting against hearing loss. This is the first trial to test the theory in humans.
SOURCE
Monday, February 21, 2011
C-section puts children at food risk?
This idea has been grumbling on for years. Nobody seems to mention that less healthy women might be more likely to need a Caesarian and that it might be the poorer average maternal health that leads to poorer average child health -- nothing to do with the delivery method
GIVING birth by caesarean section increases the risk of your child suffering from food allergies, an expert has warned.
Pediatric allergy specialist Dr Peter Smith is urging expectant mothers to consider a vaginal delivery because of growing evidence a c-section can "significantly increase the risk of your child suffering from an allergy to cow's milk".
Admissions to hospital emergency departments for allergic reactions have increased by 500 per cent since 1990 in Australia. "It is at epidemic proportions," Dr Smith said of the massive rise in food allergies, likely to be attributed to several causes rather than one.
But symptomatic food allergy was found to occur more frequently in children born by c-section. There has been a 30 per cent growth in caesareans in the past decade in Australia.
"Several studies have shown a difference in the composition of the gastrointestinal flora of children with food allergies compared to those without," Dr Smith said. "When a child moves through the birth canal, they ingest bacteria and become naturally inoculated through a small mouthful of secretions. "The oral ingestion of those healthy bugs is the first bacteria that comes into their system." Dr Smith said that first bacteria entering the body established "the population".
Not only does Australia have one of the highest prevalence of allergic disorders in the developed world, but recent studies have demonstrated a doubling in some conditions such as allergic rhinitis (hay fever), eczema and potentially dangerous anaphylaxis. Asthma, hay fever, chronic sinusitis and "other allergy" comprise four of the top 10 most common long-term, self-reported illnesses in young people aged 12-24 in Australia.
Dr Smith said the next best thing to a "natural" birth was to follow birth with breast feeding. "Breast milk contains lots of healthy bugs (probiotics) to promote the growth of healthy bacteria and assist your child's immune system in the first few weeks of life," he said.
SOURCE
City life is making us sick, study warns
Pure anecdote, without even a pretence of research
CITY slickers juggling phones, computers and the stresses of modern life are being struck down by a new condition called "urban mental health", an international mental health conference heard yesterday.
In the next few decades it will be the single biggest issue facing those in big cities who may not realise their hectic lifestyles are adding to their stress, which could lead to a mental illness.
Compounding the problem is that many people are living in units on their own and parks and backyards are disappearing, causing people to be cut off from society.
One in five Australians is diagnosed each year with a mental condition of some sort, from anxiety and depression to more serious conditions such as schizophrenia, the conference at St Vincent's Hospital in Sydney was told.
Faces in the Street Urban Mental Health Research Institute director Professor Kay Wilhelm said many people were suffering chronic stress and placing themselves at risk. "We've heard there are problems living in the city which are probably becoming more so with stress, pollution etcetera," she said.
"And it's thought that in terms of the social determinants of mental health, one of the underlying factors is being chronically stressed by a whole lot of things.
"Urban mental health is really about the particular mental health issues that have to do with people living in the inner city, it's not really so much the suburbs."
In order to combat urban mental health, town planners and developers are being urged to consider community interaction and encourage meeting spots in their designs.
University of NSW Faculty of Built Environment Associate Professor Susan Thompson said the rise in community gardens was helping to bring people outdoors. "A lot of people in the city live on their own. Others don't have backyards, so they are not out in the garden or interacting with neighbours," she said. "It is about designing cities and letting residents have an input into things that will make them happy."
SOURCE
Sunday, February 20, 2011
Red meat DOES increase cancer risk, new report will say
"Although the evidence is not conclusive". Well what is it then? Speculation is what it is -- motivated by the fact that meat is popular. The "superior" people will attack ANYTHING that is popular
Britons should cut their consumption of red and processed meat to reduce the risk of bowel cancer, scientific experts are expected to recommend in a report. The Scientific Advisory Committee on Nutrition (SACN) was asked by the Department of Health to review dietary advice on meat consumption as a source of iron.
In a draft report published in June 2009 the committee of independent experts said lower consumption of red and processed meat would probably reduce the risk of colorectal cancer.
The committee said: 'Although the evidence is not conclusive, as a precaution, it may be advisable for intakes of red and processed meat not to increase above the current average (70g/day) and for high consumers of red and processed meat (100g/day or more) to reduce their intakes.'
A daily total of 70g is equivalent to about three rashers of bacon.
The Sunday Telegraph said the full report, to be published within days, was expected to echo the committee's draft report.
A Department of Health spokeswoman said: 'The DH committee of independent experts on nutrition will shortly publish their final report on iron and health.'
The World Cancer Research Fund already recommends people limit their intake of red meat, including pork, beef, lamb and goat, to 500g a week. The fund also advises consumers to avoid too much processed meat, including hot dogs, ham, bacon and some sausages and burgers.
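Worth noting in passing: the SACN daily figure and the WCRF weekly limit are, in practice, almost the same ceiling. A one-line check:

# Quick check: the SACN daily figure versus the WCRF weekly limit for red meat.
sacn_daily_g = 70
print(sacn_daily_g * 7)   # 490 g a week, just under the WCRF's 500 g limit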
It follows a review by the British Nutrition Foundation last week which set out to demolish the ‘myths and misconceptions’ about the meat, saying that most people eat healthy amounts which are not linked to a greater risk of disease.
Modern farming methods have cut fat levels, which can be even lower than chicken, while red meat provides high levels of vital nutrients, including iron.
A vegetarian having a Cheddar cheese salad will eat seven times more fat, pound for pound, than lean red meat contains, said the review, which looked at current evidence on health and red meat and found no evidence of ‘negative health effects’.
SOURCE
The Ultimate in Nanny-State Paternalism
Aside from the air we breathe, nothing is more important than the food and drink we consume. Not healthcare, not employment, not housing — nothing. Obviously, the best healthcare, the highest-paying job, and the biggest mansion in the world can’t do anything for you if you don’t eat. For someone to dictate to someone else the food and drink he should and shouldn’t consume is the ultimate in paternalism; for the state to tell someone the food and drink he should and shouldn’t consume is the ultimate in nanny-state paternalism.
Although the government’s war on poverty has been around about fifty years, its war on drugs about forty years, and its war on terrorism about ten years, it was only last year that the government declared war on childhood obesity. First Lady Michelle Obama has made this latest war her signature issue. “Obesity in this country is nothing less than a public health crisis,” said the president’s wife. She further claims that because military leaders say that one in four young people are unqualified for military service because of their weight, “childhood obesity isn’t just a public health threat, it’s not just an economic threat, it’s a national security threat as well.”
But the first lady is not alone. To help fight the war on childhood obesity, Congress last year passed, and President Obama signed into law, the Healthy Hunger-Free Kids Act. This new law, which amends the Child Nutrition Act, the Food and Nutrition Act, and the Richard B. Russell National School Lunch Act, gives the government more power to decide what kinds of foods can be sold at schools. School-sponsored fundraisers like candy sales are exempt, but only if they are “infrequent within the school.”
What many Americans probably don’t realize is that the federal government is not just concerned about what children eat in school. Since 1980, and every five years since then, the Department of Agriculture (USDA) and the Department of Health and Human Services (HHS) have joined forces to publish Dietary Guidelines for Americans. The 112-page seventh edition dated 2010 has just been published. It provides nutritional guidelines for all Americans two years and older. It is based on the 453-page Report of the Dietary Guidelines Advisory Committee on the Dietary Guidelines for Americans, 2010.
This edition of Dietary Guidelines for Americans recommends that Americans reduce their daily sodium intake, as well as their consumption of saturated fat, trans fat, cholesterol, added sugars, refined grains, and alcohol. It recommends that Americans increase their consumption of vegetables, fruit, whole grains, fat-free or low-fat milk and milk products, and seafood. It also recommends that Americans choose a variety of protein foods and foods that contain more potassium, fiber, calcium, and vitamin D. And, of course, it is also recommended that Americans “increase physical activity and reduce time spent in sedentary behaviors.”
There are two problems with these dietary guidelines, one nutritional and one philosophical.
"Although the evidence is not conclusive". Well what is it then? Speculation is what it is -- motivated by the fact that meat is popular. The "superior" people will attack ANYTHING that is popular
Britons should cut their consumption of red and processed meat to reduce the risk of bowel cancer, scientific experts are expected to recommend in a report. The Scientific Advisory Committee on Nutrition (SACN) was asked by the Department of Health to review dietary advice on meat consumption as a source of iron.
In a draft report published in June 2009 the committee of independent experts said lower consumption of red and processed meat would probably reduce the risk of colorectal cancer.
The committee said: 'Although the evidence is not conclusive, as a precaution, it may be advisable for intakes of red and processed meat not to increase above the current average (70g/day) and for high consumers of red and processed meat (100g/day or more) to reduce their intakes.'
A daily total of 70g is equivalent to about three rashers of bacon.
The Sunday Telegraph said the full report, to be published within days, was expected to echo the committee's draft report.
A Department of Health spokeswoman said: 'The DH committee of independent experts on nutrition will shortly publish their final report on iron and health.'
The World Cancer Research Fund already recommends people limit their intake of red meat, including pork, beef, lamb and goat, to 500g a week. The fund also advises consumers to avoid too much processed meat, including hot dogs, ham, bacon and some sausages and burgers.
It follows a review by the British Nutrition Foundation last week which claimed to demolish the ‘myths and misconceptions’ about the meat, saying that most people eat healthy amounts which are not linked to greater risk of disease.
Modern farming methods have cut fat levels, which can be even lower than chicken, while red meat provides high levels of vital nutrients, including iron.
A vegetarian having a Cheddar cheese salad will eat seven times more fat, pound for pound, than lean red meat contains, said the review which looks at current evidence on health and red meat and found no evidence of ‘negative health effects’.
SOURCE
The Ultimate in Nanny-State Paternalism
Aside from the air we breathe, nothing is more important than the food and drink we consume. Not healthcare, not employment, not housing — nothing. Obviously, the best healthcare, the highest-paying job, and the biggest mansion in the world can’t do anything for you if you don’t eat. For someone to dictate to someone else the food and drink he should and shouldn’t consume is the ultimate in paternalism; for the state to tell someone the food and drink he should and shouldn’t consume is the ultimate in nanny-state paternalism.
Although the government’s war on poverty has been around for about fifty years, its war on drugs for about forty years, and its war on terrorism for about ten years, it was only last year that the government declared war on childhood obesity. First Lady Michelle Obama has made this latest war her signature issue. “Obesity in this country is nothing less than a public health crisis,” said the president’s wife. She further claims that because military leaders say that one in four young people are unqualified for military service because of their weight, “childhood obesity isn’t just a public health threat, it’s not just an economic threat, it’s a national security threat as well.”
But the first lady is not alone. To help fight the war on childhood obesity, Congress last year passed, and President Obama signed into law, the Healthy Hunger-Free Kids Act. This new law, which amends the Child Nutrition Act, the Food and Nutrition Act, and the Richard B. Russell National School Lunch Act, gives the government more power to decide what kinds of foods can be sold at schools. School-sponsored fundraisers like candy sales are exempt, but only if they are “infrequent within the school.”
What many Americans probably don’t realize is that the federal government is not just concerned about what children eat in school. Since 1980, and every five years since then, the Department of Agriculture (USDA) and the Department of Health and Human Services (HHS) have joined forces to publish Dietary Guidelines for Americans. The 112-page seventh edition dated 2010 has just been published. It provides nutritional guidelines for all Americans two years and older. It is based on the 453-page Report of the Dietary Guidelines Advisory Committee on the Dietary Guidelines for Americans, 2010.
This edition of Dietary Guidelines for Americans recommends that Americans reduce their daily sodium intake, as well as their consumption of saturated fat, trans fat, cholesterol, added sugars, refined grains, and alcohol. It recommends that Americans increase their consumption of vegetables, fruit, whole grains, fat-free or low-fat milk and milk products, and seafood. It also recommends that Americans choose a variety of protein foods and foods that contain more potassium, fiber, calcium, and vitamin D. And, of course, it is also recommended that Americans “increase physical activity and reduce time spent in sedentary behaviors.”
There are two problems with these dietary guidelines, one nutritional and one philosophical.
Some physicians, nutritionists, and health professionals would strongly disagree with some of what is recommended in the Guidelines. For example, the demonization of cholesterol, butter, saturated fat, and unpasteurized dairy products, the dismissal of the glycemic index and the recommendation that 45 to 65 percent of one’s caloric intake should be from carbohydrates, and the lack of any warning about the dangers of aspartame, soy, and genetically modified foods. In fact, some of the above individuals blame the government itself for contributing to the current obesity and diabetes epidemics because it accepted the “lipid hypothesis” and the “cholesterol myth” that links dietary fat to coronary heart disease and recommended an unhealthy excess of carbohydrates in the form of bread, cereal, rice, and pasta at the bottom of its food pyramid.
There is no question that obesity is a growing problem in America. If government figures in the Dietary Guidelines for Americans are to be believed, in 2008, 10 percent of children ages 2 to 5 were obese, 20 percent of children ages 6 to 11 were obese, 18 percent of adolescents were obese, and 34 percent of adults were obese. A visit to your local buffet will probably confirm these figures.
But even if the government recruited the best and brightest nutritional scientists to solve the deepest and darkest mysteries of metabolism, diet, nutrition, exercise, and weight loss, even if they came up with the perfect diet to ensure that every American leads a long and healthy life, even if they won the war on obesity, and even if they did their work without government funding — there would still be a problem with the government’s issuing dietary guidelines.
It’s not that libertarians are indifferent to the obesity epidemic, unconcerned about the tragedy of childhood obesity, and dismissive of the health risks associated with being obese.
The more important issue is the role of government in the family and society. It is just simply not the purpose of government to issue nutrition guidelines, make food pyramids, wage war on obesity, conduct scientific research, subsidize agriculture, promote or demonize certain foods, monitor school lunches, ban unpasteurized dairy products, encourage healthy eating and exercise, regulate food production and labeling, and gather statistics on obesity.
And unlike programs like Social Security, which some people say we just can’t abolish because there is no free-market alternative, in the case of diet and nutrition there are already scores if not hundreds of private organizations in existence offering analysis and advice on a myriad of health-, medical-, food-, exercise-, nutrition-, and diet-related subjects.
But, it is argued, with so many organizations offering such a variety of opinions there is no way to know what is right and so, it is claimed, we need the Departments of Agriculture and Health and Human Services to serve as the final arbiter. And what about the people who are just too lazy or too mentally deficient to do any reading and research on their own? Don’t we need the government to take care of those people by issuing things like dietary guidelines?
But how do we know that the government will get it right? Just look at how many times the Food and Drug Administration has gotten it wrong on drug policy with deadly consequences for tens of thousands of Americans. And what about those people who are just too lazy or too mentally deficient to read and follow the government’s pronouncements and guidelines? Should the state spoon-feed them every day and force them to exercise?
Once the government dictates to us the food and drink we should and shouldn’t consume, there is no stopping its reach into the family and society. And as Ludwig von Mises pointed out:
It is a fact that no paternal government, whether ancient or modern, ever shrank from regimenting its subjects’ minds, beliefs, and opinions. If one abolishes man’s freedom to determine his own consumption, one takes all freedoms away.
The issue is one of freedom. Freedom to consume or not to consume. Freedom to exercise or not to exercise. Freedom to make one’s own health and welfare decisions. Freedom to not have to fund the FDA, USDA, and HHS bureaucracies. Freedom from a nanny state. And yes, freedom to be obese.
As G. K. Chesterton reminds us:
The free man owns himself. He can damage himself with either eating or drinking; he can ruin himself with gambling. If he does he is certainly a damn fool, and he might possibly be a damned soul; but if he may not, he is not a free man any more than a dog.
The new Dietary Guidelines for Americans should be taken with a grain of salt, but no more than a grain lest you run afoul of the government-recommended daily allowance.
SOURCE
Saturday, February 19, 2011
Hurrah - eating red meat is good for you! After all the warnings, Sunday roast not linked to heart disease
No details are given of the study below but all the evidence I have seen that opposes meat eating is very weak -- motivated more by vegetarian convictions than anything else
After years of worrying that tucking into red meat could lead to a heart attack or cancer, you can relax and enjoy the Sunday roast, say researchers. A report demolishes the ‘myths and misconceptions’ about the meat, saying that most people eat healthy amounts which are not linked to greater risk of disease.
Modern farming methods have cut fat levels, which can be even lower than chicken, while red meat provides high levels of vital nutrients, including iron.
A vegetarian having a Cheddar cheese salad will eat seven times more fat, pound for pound, than lean red meat contains, says a review by the British Nutrition Foundation.
However, the World Cancer Research Fund, which advises people to curb red meat consumption and cut out processed meat, disputed the findings. [They would. Scares are meat and potatoes to them]
The 77-page review, which looks at current evidence on health and red meat, found no evidence of ‘negative health effects’. It shows that, on average, men in the UK eat 96g of red and processed meat a day and women eat 57g.
Those eating more than 140g a day are advised by the Scientific Advisory Committee on Nutrition to cut down, as these levels are linked to disease. There has been a cut in consumption over the last 30 years, with Britons eating less than many other European countries including Spain, Italy, France, Sweden and the Netherlands.
The review says there is ‘no conclusive link’ between cardiovascular disease and red meat, which actually contains some fatty acids that may protect the heart. At current levels of average consumption, there also is no evidence of a link to cancer, it says.
Cooking methods which overdo or char the meat are a much more likely cause of any link with bowel cancer, says the review.
Dr Carrie Ruxton, an independent dietician and member of the Meat Advisory Panel, which is supported by a grant from the meat industry, said: ‘This review highlights that eating red meat in moderation is an important part of a healthy balanced diet.
‘It also lays to rest many of the misconceptions about meat and health. People have been told they can’t eat it and they feel guilty when they do, but given that current intakes, on average, are well within health targets, there is no reason to eat less red meat if you enjoy it.’ An average slice of ham is 23g, beef 45g and a thick slice of lamb 90g. A small piece of steak is 100g.
Dr Ruxton said: ‘There is less saturated fat in a grilled pork steak than a grilled chicken breast with the skin left on.’
Although meat eaters often have more body fat than vegetarians, the review says it is impossible to attribute this to shunning meat as vegetarians tend to have more health-conscious lifestyles.
Dr Ruxton said many young women were iron-deficient and should be eating more red meat, but she advised that processed meat should be no more than an occasional treat. ‘You don’t need red meat every day, people should be eating fish twice a week, but if you ate a slice of red meat in a sandwich daily you can eat a portion of red meat for dinner up to four times a week and still stay within healthy limits,’ she said.
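Dr Ruxton's figures lend themselves to a quick sanity check. A minimal sketch in Python is given below; the portion weights are the ones quoted in this article, the two limits are the SACN 70g-a-day average and the WCRF 500g-a-week cap mentioned above, and the size of a "dinner portion" is an assumption made for illustration, since the article never defines one.

    # Back-of-envelope check of weekly red meat intake against the quoted limits.
    # Portion weights (grams) come from the article above; the dinner portion is assumed.
    PORTIONS = {"ham_slice": 23, "beef_slice": 45, "lamb_slice": 90, "small_steak": 100}
    SACN_WEEKLY = 70 * 7      # SACN draft advice: average of 70g a day
    WCRF_WEEKLY = 500         # WCRF advice: 500g of red meat a week

    def weekly_total(servings):
        """servings: list of (portion_name, count per week); returns grams per week."""
        return sum(PORTIONS[name] * count for name, count in servings)

    dinner_g = 70             # assumed size of the unspecified "dinner portion"
    total = weekly_total([("ham_slice", 7)]) + 4 * dinner_g
    print(total, "g/week; SACN average works out to", SACN_WEEKLY, "g; WCRF limit is", WCRF_WEEKLY, "g")

With those assumptions the hypothetical week comes to 441g, inside both limits; a larger dinner portion would change the verdict, which is exactly why the size of Dr Ruxton's unspecified "portion" matters.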
Since 2006 researchers have been giving hollow warnings about red meat. Professor Martin Wiseman, medical and scientific adviser for World Cancer Research Fund, said the study was being promoted by the meat industry, but added: ‘This paper is not a systematic review of the evidence and does not change the fact that there is convincing evidence that red and processed meat increase risk of bowel cancer. ‘This is why we recommend limiting red meat to 500g cooked weight per week and avoiding processed meat.
‘It is true that red meat contains valuable nutrients and this is why we do not recommend avoiding it altogether. But to suggest, as the authors of this review have done, that there is “no evidence” that a moderate intake of lean red meat has any negative health effects is wrong.
‘Essentially, the public has a choice between believing our findings – which are those of an independent panel of scientists after a systematic and transparent review of the complete global evidence – or the conclusions of this review.’
The review was published in the Nutritional Bulletin, the journal of the British Nutrition Foundation, a charity with funding from various sources including the food industry.
SOURCE
The tantalising evidence that belief in God makes you happier and healthier
I don't think there is much doubt that Christian religious belief de-stresses people. That alone could account for the correlations summarized below. And militant atheists seem such angry people -- and there is little doubt that chronic anger is bad for your heart
God has had a tough time over the past few years. On TV, in newspapers and on the internet, the debate as to whether faith has any relevance in a sceptical modern world has been as ubiquitous as it has been vigorous.
And it has been pretty clear which side is the most splenetic. From Richard Dawkins’ powerful atheist polemics to Christopher Hitchens’ public derision of the Roman Catholic Tony Blair and Stephen Hawking’s proclamation that the universe ‘has no need for God’, it seems that unbelievers have had the dwindling faithful on the run.
As research for my latest novel, Bible Of The Dead, I have spent months investigating the science of faith versus atheism, and discovered startling and unexpected evidence. It might just change the way you think about the whole debate, as it has changed my view.
I am not a religious zealot. On the contrary, I was a teenage atheist. And although in adulthood I have had a vague and fuzzy feeling that ‘there must be something out there’, I was never a regular church-goer. But what I have discovered, on my voyage through the science of faith, has astonished me.
My journey began a couple of years ago when I was travelling in Utah, the home of Mormonism. During my first week there, I approached this eccentric American religion with a typically European cynicism. I teased Mormons about their taste in ‘spiritual undergarments’; I despaired at being unable to find a decent cappuccino (Mormons are forbidden coffee, as well as alcohol, smoking, tea and premarital sex).
But then I had something of an epiphany. One night, after a long dinner, I was walking back to my hotel in downtown Salt Lake City at 2am and I suddenly realised: I felt safe. As any transatlantic traveller knows, this is a pretty unusual experience in an American city after midnight.
Why did I feel safe? Because I was in a largely Mormon city, and Mormons are never going to mug you. They might bore or annoy you when they come knocking on your door, touting their faith, but they are not going to attack you.
The Mormons’ wholesome religiousness, their endless and charitable kindliness, made their city a better place. And that made me think: Why was I so supercilious about such happy, hospitable people? What gave me the right to sneer at their religion? From that moment I took a deeper, more rigorous interest in the possible benefits of religious faith. Not one particular creed, but all creeds. And I was startled by what I found.
For a growing yet largely unnoticed body of scientific work, amassed over the past 30 years, shows religious belief is medically, socially and psychologically beneficial.
In 2006, the American Society of Hypertension established that church-goers have lower blood pressure than the non-faithful. Likewise, in 2004, scholars at the University of California, Los Angeles, suggested that college students involved in religious activities are more likely to have better mental and emotional health than those who are not. Meanwhile, in 2006, population researchers at the University of Texas discovered that the more often you go to church, the longer you live. As they put it: ‘Religious attendance is associated with adult mortality in a graded fashion: there is a seven-year difference in life expectancy between those who never attend church and those who attend weekly.’
Exactly the same outcome was recently reported in the American Journal of Public Health, which studied nearly 2,000 older Californians for five years. Those who attended religious services were 36 per cent less likely to die during this half-decade than those who didn’t. Even those who attended a place of worship irregularly — implying a less than ardent faith — did better than those who never attended.
Pretty impressive. But there’s more; so much more that it’s positively surreal. In 1990, the American Journal of Psychiatry discovered believers with broken hips were less depressed, had shorter hospital stays and could even walk further when they were discharged compared to their similarly broken-hipped and hospitalised, but comparatively heathen peers.
It’s not just hips. Scientists have revealed that believers recover from breast cancer quicker than non-believers; have better outcomes from coronary disease and rheumatoid arthritis; and are less likely to have children with meningitis.
Intriguing research in 2002 showed that believers have more success with IVF than non-believers. A 1999 study found that going to a religious service or saying a few prayers actively strengthened your immune system. These medical benefits accrue even if you adjust for the fact that believers are less likely to smoke, drink or take drugs.
And faith doesn’t just heal the body; it salves the mind, too. In 1998, the American Journal of Public Health found that depressed patients with a strong ‘intrinsic faith’ (a deep personal belief, not just a social inclination to go to a place of worship) recovered 70 per cent faster than those who did not have strong faith. Another study, in 2002, showed that prayer reduced ‘adverse outcomes in heart patients’.
But perhaps this is just an American thing? After all, those Bible-bashing Yanks are a bit credulous compared to us more sceptical Europeans, aren’t they?
Not so. In 2008, Professor Andrew Clark of the Paris School of Economics and Doctor Orsolya Lelkes of the European Centre for Social Welfare Policy and Research conducted a vast survey of Europeans. They found that religious believers, compared to non-believers, record less stress, are better able to cope with losing jobs and divorce, are less prone to suicide, report higher levels of self-esteem, enjoy greater ‘life purpose’ and report being happier overall.
What is stunning about this research is that the team didn’t go looking for this effect — it came to them unexpectedly. ‘We originally started the research to work out why some European countries had more generous unemployment benefits than others,’ says Professor Clark. But as they went on, the pattern of beneficial faith presented itself. ‘Our analysis suggested religious people suffered less psychological harm from unemployment than the non-religious. Believers had higher levels of life satisfaction.’
So what’s going on? How does religion work this apparent magic? One of the latest surveys to suggest that religious people are happier than the non-religious was conducted by Professors Chaeyoon Lim and Robert Putnam, from Harvard, and published last year.
They discovered that many of the health benefits of religion materialise only if you go to church regularly and have good friends there. In other words, it’s the ‘organised’ part of organised religion that does a lot of the good stuff. Going to a friendly church, temple or mosque gives you a strong social network and a ready-made support group, which in turn gives you a more positive outlook on life — and offers vital help in times of need. The Harvard scientists were so startled by their findings that they considered altering their own religious behaviour.
As Professor Lim said: ‘I am not a religious person, but . . . I personally began to think about whether I should go to church. It would make my mum happy.’
But if the ‘congregation’ effect is one explanation for the good health of churchgoers, it’s not the only one. Other surveys have found that intrinsic faith is also important.
For instance, a study of nearly 4,000 older adults for the U.S. Journal of Gerontology revealed that atheists had a notably higher chance of dying over a six-year period than the faithful. Crucially, religious people lived longer than atheists even if they didn’t go regularly to a place of worship. This study clearly suggests there is a benefit in pure faith alone — perhaps this religiousness works by affording a greater sense of inner purpose and solace in grief.
This raises the question: Given all this vast evidence that religion is good for you, how come the atheists seem so set against it? They pride themselves on their rationality, yet so much of the empirical evidence indicates that God is good for you. Surely, then, it is the atheists, not the devout, who are acting irrationally?
All this will come as no surprise to many students of genetics and evolution, who have long speculated that religious faith might be hard-wired into the human mind. For instance, twin studies (research on identical siblings who are separated at birth) show that religion is a heritable characteristic: if one twin is religious, the other is likely to be a believer as well, even when raised by different parents.
Neurologists are making exciting progress in locating the areas of the brain, primarily the frontal cortex, ‘responsible’ for religious belief — parts of the brain that seem designed to accommodate faith. This research even has its own name: neurotheology.
Why might we be hard-wired to be religious? Precisely because religion makes us happier and healthier, and thus makes us have more children. In the purest of Darwinian terms, God isn’t just good for you, He’s good for your genes, too.
All of which means that, contrary to expectation, it is the atheists who are eccentric, flawed and maladaptive, and it’s the devout who are healthy, well-adjusted and normal.
Certainly, in purely evolutionary terms, atheism is a blind alley. Across the world, religious people have more children than non-religious (go forth and multiply!), while atheist societies are the ones with the lowest birth rates.
The Czech Republic is a classic example. It proclaims itself the most atheist country in Europe, if not the world; it also has a puny birthrate of 1.28 per woman, one of the lowest on the planet (so soon there won’t be any godless Czechs to proclaim their atheism).
The existence of atheism is therefore something of an anomaly. But then again, anomalies are not unknown in evolution. Think of the dodo or the flightless parrot, doomed to extinction. Are atheists similarly blighted? Are Richard Dawkins and his type destined to vanish off the face of the Earth — the victims of their own intellectual arrogance?
That’s not for me to say; it’s for you to ponder. All I do know is that reassessing the research has changed the way I think about faith. These days I go to church quite a lot, especially when I am travelling and researching my books. For instance, the other day I found myself in Cambridge — the home of Stephen Hawking — and took the opportunity to do some sightseeing of the city’s intellectual landmarks.
I strolled by the labs where Hawking does his brilliant work, popped into the pub where they announced the discovery of DNA and admired the library where Charles Darwin studied. As I did, I was in awe at the greatness of Man’s achievements. And then I went to Evensong at King’s College Chapel, and it was beautiful, sublime and uplifting. And I felt a very different kind of awe.
Sneer at faith all you like. Just don’t assume science is on your side.
SOURCE
Friday, February 18, 2011
Why Almost Everything You Hear About Medicine Is Wrong
If you follow the news about health research, you risk whiplash. First garlic lowers bad cholesterol, then—after more study—it doesn’t. Hormone replacement reduces the risk of heart disease in postmenopausal women, until a huge study finds that it doesn’t (and that it raises the risk of breast cancer to boot). Eating a big breakfast cuts your total daily calories, or not—as a study released last week finds. Yet even if biomedical research can be a fickle guide, we rely on it.
But what if wrong answers aren’t the exception but the rule? More and more scholars who scrutinize health research are now making that claim. It isn’t just an individual study here and there that’s flawed, they charge. Instead, the very framework of medical investigation may be off-kilter, leading time and again to findings that are at best unproved and at worst dangerously wrong. The result is a system that leads patients and physicians astray—spurring often costly regimens that won’t help and may even harm you.
It’s a disturbing view, with huge implications for doctors, policymakers, and health-conscious consumers. And one of its foremost advocates, Dr. John P.A. Ioannidis, has just ascended to a new, prominent platform after years of crusading against baseless health and medical claims. As the new chief of Stanford University’s Prevention Research Center, Ioannidis is cementing his role as one of medicine’s top mythbusters. “People are being hurt and even dying” because of false medical claims, he says: not quackery, but errors in medical research.
This is Ioannidis’s moment. As medical costs hamper the economy and impede deficit-reduction efforts, policymakers and businesses are desperate to cut them without sacrificing sick people. One no-brainer solution is to use and pay for only treatments that work.
But if Ioannidis is right, most biomedical studies are wrong.
In just the last two months, two pillars of preventive medicine fell. A major study concluded there’s no good evidence that statins (drugs like Lipitor and Crestor) help people with no history of heart disease. The study, by the Cochrane Collaboration, a global consortium of biomedical experts, was based on an evaluation of 14 individual trials with 34,272 patients. Cost of statins: more than $20 billion per year, of which half may be unnecessary. (Pfizer, which makes Lipitor, responds in part that “managing cardiovascular disease risk factors is complicated”).
In November a panel of the Institute of Medicine concluded that having a blood test for vitamin D is pointless: almost everyone has enough D for bone health (20 nanograms per milliliter) without taking supplements or calcium pills. Cost of vitamin D: $425 million per year.
Ioannidis, 45, didn’t set out to slay medical myths. A child prodigy (he was calculating decimals at age 3 and wrote a book of poetry at 8), he graduated first in his class from the University of Athens Medical School, did a residency at Harvard, oversaw AIDS clinical trials at the National Institutes of Health in the mid-1990s, and chaired the department of epidemiology at Greece’s University of Ioannina School of Medicine.
But at NIH Ioannidis had an epiphany. “Positive” drug trials, which find that a treatment is effective, and “negative” trials, in which a drug fails, take the same amount of time to conduct. “But negative trials took an extra two to four years to be published,” he noticed. “Negative results sit in a file drawer, or the trial keeps going in hopes the results turn positive.” With billions of dollars on the line, companies are loath to declare a new drug ineffective. As a result of the lag in publishing negative studies, patients receive a treatment that is actually ineffective. That made Ioannidis wonder, how many biomedical studies are wrong?
His answer, in a 2005 paper: “the majority.” From clinical trials of new drugs to cutting-edge genetics, biomedical research is riddled with incorrect findings, he argued. Ioannidis deployed an abstruse mathematical argument to prove this, which some critics have questioned. “I do agree that many claims are far more tenuous than is generally appreciated, but to ‘prove’ that most are false, in all areas of medicine, one needs a different statistical model and more empirical evidence than Ioannidis uses,” says biostatistician Steven Goodman of Johns Hopkins, who worries that the most-research-is-wrong claim “could promote an unhealthy skepticism about medical research, which is being used to fuel anti-science fervor.”
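The nub of that 2005 argument is just Bayes' theorem: how likely a "significant" finding is to be true depends on how plausible the hypothesis was before the study, how well-powered the study is, and the false-positive rate, not merely on clearing p < 0.05. The sketch below uses illustrative numbers, not Ioannidis's own figures, to show how a field full of long-shot hypotheses and modest power ends up with mostly false positives; bias and selective publication, which the paper also models, push the numbers lower still.

    # Positive predictive value of a "significant" finding, via Bayes' theorem.
    # All numbers are illustrative assumptions, not taken from Ioannidis's paper.
    def ppv(prior, power=0.8, alpha=0.05):
        """Probability that a statistically significant result reflects a real effect."""
        true_positives = power * prior
        false_positives = alpha * (1 - prior)
        return true_positives / (true_positives + false_positives)

    print(round(ppv(prior=0.10, power=0.5), 2))  # 1 in 10 hypotheses true, modest power: ~0.53
    print(round(ppv(prior=0.02, power=0.5), 2))  # long-shot hypotheses: ~0.17, so most positives are false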
Even a cursory glance at medical journals shows that once-heralded studies keep falling by the wayside. Two 1993 studies concluded that vitamin E prevents cardiovascular disease; that claim was overturned by more rigorous experiments, in 1996 and 2000.
A 1996 study concluding that estrogen therapy reduces older women’s risk of Alzheimer’s was overturned in 2004.
Numerous studies concluding that popular antidepressants work by altering brain chemistry have now been contradicted (the drugs help with mild and moderate depression, when they work at all, through a placebo effect), as has research claiming that early cancer detection (through, say, PSA tests) invariably saves lives.
The list goes on.
Despite the explosive nature of his charges, Ioannidis has collaborated with some 1,500 other scientists, and Stanford, epitome of the establishment, hired him in August to run the preventive-medicine center. “The core of medicine is getting evidence that guides decision making for patients and doctors,” says Ralph Horwitz, chairman of the department of medicine at Stanford. “John has been the foremost innovative thinker about biomedical evidence, so he was a natural for us.”
Ioannidis’s first targets were shoddy statistics used in early genome studies. Scientists would test one or a few genes at a time for links to virtually every disease they could think of. That just about ensured they would get “hits” by chance alone. When he began marching through the genetics literature, it was like Sherman laying waste to Georgia: most of these candidate genes could not be verified.
The claim that variants of the vitamin D–receptor gene explain three quarters of the risk of osteoporosis? Wrong, he and colleagues proved in 2006: the variants have no effect on osteoporosis.
That scores of genes identified by the National Human Genome Research Institute can be used to predict cardiovascular disease? No (2009). That six gene variants raise the risk of Parkinson’s disease? No (2010). Yet claims that gene X raises the risk of disease Y contaminate the scientific literature, affecting personal health decisions and sustaining the personal genome-testing industry.
Statistical flukes also plague epidemiology, in which researchers look for links between health and the environment, including how people behave and what they eat. A study might ask whether coffee raises the risk of joint pain, or headaches, or gallbladder disease, or hundreds of other ills. “When you do thousands of tests, statistics says you’ll have some false winners,” says Ioannidis. Drug companies make a mint on such dicey statistics. By testing an approved drug for other uses, they get hits by chance, “and doctors use that as the basis to prescribe the drug for this new use. I think that’s wrong.”
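That "false winners" point is easy to demonstrate for yourself: run enough tests of effects that do not exist and a 5 per cent significance threshold will hand you spurious "discoveries" at a steady rate. A minimal simulation, purely illustrative and not tied to any particular study, is sketched below.

    # Simulate many comparisons where NO real effect exists; roughly 5% come out
    # "significant" at p < 0.05 purely by chance.
    import random
    import statistics

    def fake_trial(n=50):
        """Compare two groups drawn from the same distribution; return True if 'significant'."""
        a = [random.gauss(0, 1) for _ in range(n)]
        b = [random.gauss(0, 1) for _ in range(n)]
        diff = statistics.mean(a) - statistics.mean(b)
        se = statistics.pstdev(a + b) * (2 / n) ** 0.5   # standard error of the difference
        return abs(diff / se) > 1.96                     # roughly p < 0.05, two-sided

    hits = sum(fake_trial() for _ in range(1000))
    print(hits, "spurious 'discoveries' out of 1000 tests of a non-existent effect")

Run it and you should see somewhere around 50 hits, none of them real; test an approved drug against hundreds of other outcomes and the same arithmetic applies.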
Even when a claim is disproved, it hangs around like a deadbeat renter you can’t evict. Years after the claim that vitamin E prevents heart disease had been overturned, half the scientific papers mentioning it cast it as true, Ioannidis found in 2007.
The situation isn’t hopeless. Geneticists have mostly mended their ways, tightening statistical criteria, but other fields still need to clean house, Ioannidis says. Surgical practices, for instance, have not been tested to nearly the extent that medications have. “I wouldn’t be surprised if a large proportion of surgical practice is based on thin air, and [claims for effectiveness] would evaporate if we studied them closely,” Ioannidis says.
That would also save billions of dollars. George Lundberg, former editor of The Journal of the American Medical Association, estimates that strictly applying criteria like those Ioannidis pushes would save $700 billion to $1 trillion a year in U.S. health-care spending.
Of course, not all conventional health wisdom is wrong. Smoking kills, being morbidly obese or severely underweight makes you more likely to die before your time, processed meat raises the risk of some cancers, and controlling blood pressure reduces the risk of stroke.
The upshot for consumers: medical wisdom that has stood the test of time—and large, randomized, controlled trials—is more likely to be right than the latest news flash about a single food or drug.
SOURCE
Stress blocker helps bald mice regrow hair
If this generalizes to humans, the discoverers of the compound should make a fortune
US researchers looking at how stress affects the gut stumbled upon a potent chemical that caused mice to regrow hair by blocking a stress-related hormone, said a study today.
While the process has not yet been tested in humans, it grew more hair in mice than minoxidil, the ingredient in Rogaine, a popular treatment for baldness, said the study in the online journal PLoS One.
"This could open new venues to treat hair loss in humans through the modulation of the stress hormone receptors, particularly hair loss related to chronic stress and aging," said co-author Million Mulugeta.
Researchers from University of California at Los Angeles and the Veterans Administration discovered the chemical compound "entirely by accident", said the study. Scientists were using genetically engineered mutant mice that were altered to produce too much of a stress hormone called corticotrophin-releasing factor, or CRF. The chronic stress condition makes them lose hair on their backs.
They injected a chemical compound called astressin-B, developed by the California-based Salk Institute, into the mice to see how the CRF-blocker would affect gastrointestinal function. When they saw no effect at first, they continued for five days. The researchers completed their gastrointestinal tests and put the mice back in cages with their hairier counterparts.
When they returned to get the stressed-out mice three months later for more tests, they discovered they could no longer tell them apart because the mice had regrown all the hair they had lost.
"Our findings show that a short-duration treatment with this compound causes an astounding long-term hair regrowth in chronically stressed mutant mice," said Professor Mulugeta of the David Geffen School of Medicine at UCLA.
Not only did it help grow hair, it also appeared to help hair maintain its color and not go grey. "This molecule also keeps the hair color, prevents the hair from turning gray," he said.
The short five-day time span of treatments brought hair growth effects that lasted up to four months, which was also surprising to researchers. "This is a comparatively long time, considering that mice's life span is less than two years," Professor Mulugeta said.
Researchers also gave the bald mice treatments of minoxidil alone, "which resulted in mild hair growth, as it does in humans. This suggests that astressin-B could also translate for use in human hair growth," said the study.
Co-author Yvette Tache, a professor of medicine at UCLA, said it could take up to five years to start a clinical trial in humans. "This research could be beneficial in a lot of diseases which are stress-related in their manifestations or exacerbation of symptoms," she said, noting that no sign of toxicity has appeared after extensive tests on mice.
Professor Mulugeta said talks were underway with a major cosmetics firm to fund a study in humans. "In general, the concept that interfering with the stress hormones or their receptors to prevent or treat stress-related diseases is a very valid concept," he said.
Jean Rivier, a Swiss professor at the Salk Institute, said he has reason to believe the process could be useful in other stress-related ailments, from psoriasis to depression to heart and artery problems. "You bring back the skin to a normal acidic state where it can essentially go back to a normal stage whereby hair will grow again, and this is very reminiscent of what is seen in relapses and remissions," he said.
The CRF-blocker was studied by other major pharmaceutical companies but was abandoned because it could not be administered orally, he said.
However, the latest method uses peptide analogs that bind to the same receptors but via different types of molecules, which could be administered through injections or potentially through a nasal spray.
SOURCE
If you follow the news about health research, you risk whiplash. First garlic lowers bad cholesterol, then—after more study—it doesn’t. Hormone replacement reduces the risk of heart disease in postmenopausal women, until a huge study finds that it doesn’t (and that it raises the risk of breast cancer to boot). Eating a big breakfast cuts your total daily calories, or not—as a study released last week finds. Yet even if biomedical research can be a fickle guide, we rely on it.
But what if wrong answers aren’t the exception but the rule? More and more scholars who scrutinize health research are now making that claim. It isn’t just an individual study here and there that’s flawed, they charge. Instead, the very framework of medical investigation may be off-kilter, leading time and again to findings that are at best unproved and at worst dangerously wrong. The result is a system that leads patients and physicians astray—spurring often costly regimens that won’t help and may even harm you.
It’s a disturbing view, with huge im-plications for doctors, policymakers, and health-conscious consumers. And one of its foremost advocates, Dr. John P.A. Ioannidis, has just ascended to a new, prominent platform after years of crusading against the baseless health and medical claims. As the new chief of Stanford University’s Prevention Research Center, Ioannidis is cementing his role as one of medicine’s top mythbusters. “People are being hurt and even dying” because of false medical claims, he says: not quackery, but errors in medical research.
This is Ioannidis’s moment. As medical costs hamper the economy and impede deficit-reduction efforts, policymakers and businesses are desperate to cut them without sacrificing sick people. One no-brainer solution is to use and pay for only treatments that work.
But if Ioannidis is right, most biomedical studies are wrong.
In just the last two months, two pillars of preventive medicine fell. A major study concluded there’s no good evidence that statins (drugs like Lipitor and Crestor) help people with no history of heart disease. The study, by the Cochrane Collaboration, a global consortium of biomedical experts, was based on an evaluation of 14 individual trials with 34,272 patients. Cost of statins: more than $20 billion per year, of which half may be unnecessary. (Pfizer, which makes Lipitor, responds in part that “managing cardiovascular disease risk factors is complicated”).
In November a panel of the Institute of Medicine concluded that having a blood test for vitamin D is pointless: almost everyone has enough D for bone health (20 nanograms per milliliter) without taking supplements or calcium pills. Cost of vitamin D: $425 million per year.
Ioannidis, 45, didn’t set out to slay medical myths. A child prodigy (he was calculating decimals at age 3 and wrote a book of poetry at 8), he graduated first in his class from the University of Athens Medical School, did a residency at Harvard, oversaw AIDS clinical trials at the National Institutes of Health in the mid-1990s, and chaired the department of epidemiology at Greece’s University of Ioannina School of Medicine.
But at NIH Ioannidis had an epiphany. “Positive” drug trials, which find that a treatment is effective, and “negative” trials, in which a drug fails, take the same amount of time to conduct. “But negative trials took an extra two to four years to be published,” he noticed. “Negative results sit in a file drawer, or the trial keeps going in hopes the results turn positive.” With billions of dollars on the line, companies are loath to declare a new drug ineffective. Because of that publication lag, doctors and patients keep relying on treatments that the still-unpublished evidence suggests are ineffective. That made Ioannidis wonder: how many biomedical studies are wrong?
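Publication bias of this kind is easy to see with a toy simulation. The sketch below is not from the article and its numbers are invented; it only illustrates how promptly publishing positive results while many negative ones never appear makes a useless drug look promising in the visible record.

import random

random.seed(1)
N_TRIALS = 200   # hypothetical trials of a drug with no real effect
ALPHA = 0.05     # conventional false-positive rate

published = []
for _ in range(N_TRIALS):
    if random.random() < ALPHA:        # ~5% come up "positive" by chance alone
        published.append("positive")   # positive trials get published promptly
    elif random.random() < 0.4:        # assume only ~40% of negative trials ever appear
        published.append("negative")

positives = published.count("positive")
print(f"{positives} of {len(published)} visible trials look positive "
      f"({100 * positives / len(published):.0f}%), though the drug does nothing.")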
His answer, in a 2005 paper: “the majority.” From clinical trials of new drugs to cutting-edge genetics, biomedical research is riddled with incorrect findings, he argued. Ioannidis deployed an abstruse mathematical argument to prove this, which some critics have questioned. “I do agree that many claims are far more tenuous than is generally appreciated, but to ‘prove’ that most are false, in all areas of medicine, one needs a different statistical model and more empirical evidence than Ioannidis uses,” says biostatistician Steven Goodman of Johns Hopkins, who worries that the most-research-is-wrong claim “could promote an unhealthy skepticism about medical research, which is being used to fuel anti-science fervor.”
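The core of that 2005 argument can be sketched in a few lines. The function below is a simplified form of the positive-predictive-value calculation Ioannidis used, ignoring his adjustments for bias and competing research teams; R, the pre-study odds that a tested hypothesis is true, is the assumption doing all the work, and the example values are hypothetical.

def ppv(R, power, alpha=0.05):
    """Share of statistically 'significant' findings that are actually true."""
    return (power * R) / (power * R + alpha)

# If only 1 in 10 tested hypotheses is true (R = 0.1) and studies have 50% power,
# half the claimed discoveries are false; in exploratory work (R = 0.01), most are.
print(round(ppv(R=0.1, power=0.5), 2))    # 0.5
print(round(ppv(R=0.01, power=0.5), 2))   # 0.09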
Even a cursory glance at medical journals shows that once heralded studies keep falling by the wayside. Two 1993 studies concluded that vitamin E prevents cardiovascular disease; that claim was overturned by more rigorous experiments, in 1996 and 2000.
A 1996 study concluding that estrogen therapy reduces older women’s risk of Alzheimer’s was overturned in 2004.
Numerous studies concluding that popular antidepressants work by altering brain chemistry have now been contradicted (the drugs help with mild and moderate depression, when they work at all, through a placebo effect), as has research claiming that early cancer detection (through, say, PSA tests) invariably saves lives.
The list goes on.
Despite the explosive nature of his charges, Ioannidis has collaborated with some 1,500 other scientists, and Stanford, epitome of the establishment, hired him in August to run the preventive-medicine center. “The core of medicine is getting evidence that guides decision making for patients and doctors,” says Ralph Horwitz, chairman of the department of medicine at Stanford. “John has been the foremost innovative thinker about biomedical evidence, so he was a natural for us.”
Ioannidis’s first targets were shoddy statistics used in early genome studies. Scientists would test one or a few genes at a time for links to virtually every disease they could think of. That just about ensured they would get “hits” by chance alone. When he began marching through the genetics literature, it was like Sherman laying waste to Georgia: most of these candidate genes could not be verified.
The claim that variants of the vitamin D–receptor gene explain three quarters of the risk of osteoporosis? Wrong, he and colleagues proved in 2006: the variants have no effect on osteoporosis.
That scores of genes identified by the National Human Genome Research Institute can be used to predict cardiovascular disease? No (2009). That six gene variants raise the risk of Parkinson’s disease? No (2010). Yet claims that gene X raises the risk of disease Y contaminate the scientific literature, affecting personal health decisions and sustaining the personal genome-testing industry.
Statistical flukes also plague epidemiology, in which researchers look for links between health and the environment, including how people behave and what they eat. A study might ask whether coffee raises the risk of joint pain, or headaches, or gallbladder disease, or hundreds of other ills. “When you do thousands of tests, statistics says you’ll have some false winners,” says Ioannidis. Drug companies make a mint on such dicey statistics. By testing an approved drug for other uses, they get hits by chance, “and doctors use that as the basis to prescribe the drug for this new use. I think that’s wrong.”
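The "false winners" point is simple arithmetic. A quick back-of-the-envelope calculation (the test counts here are made up) shows why screening many hypotheses against a 0.05 threshold guarantees some spurious hits:

ALPHA = 0.05   # conventional significance threshold
for n_tests in (20, 1_000, 10_000):          # hypothetical numbers of comparisons
    expected_false_hits = ALPHA * n_tests    # expected false positives if every null is true
    print(f"{n_tests:>6} true-null tests -> about {expected_false_hits:.0f} "
          f"'significant' results by chance alone")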
Even when a claim is disproved, it hangs around like a deadbeat renter you can’t evict. Years after the claim that vitamin E prevents heart disease had been overturned, half the scientific papers mentioning it cast it as true, Ioannidis found in 2007.
The situation isn’t hopeless. Geneticists have mostly mended their ways, tightening statistical criteria, but other fields still need to clean house, Ioannidis says. Surgical practices, for instance, have not been tested to nearly the extent that medications have. “I wouldn’t be surprised if a large proportion of surgical practice is based on thin air, and [claims for effectiveness] would evaporate if we studied them closely,” Ioannidis says.
That would also save billions of dollars. George Lundberg, former editor of The Journal of the American Medical Association, estimates that strictly applying the kind of criteria Ioannidis pushes for would save $700 billion to $1 trillion a year in U.S. health-care spending.
Of course, not all conventional health wisdom is wrong. Smoking kills, being morbidly obese or severely underweight makes you more likely to die before your time, processed meat raises the risk of some cancers, and controlling blood pressure reduces the risk of stroke.
The upshot for consumers: medical wisdom that has stood the test of time—and large, randomized, controlled trials—is more likely to be right than the latest news flash about a single food or drug.
SOURCE
Stress blocker helps bald mice regrow hair
If this generalizes to humans, the discoverers of the compound should make a fortune
US researchers looking at how stress affects the gut stumbled upon a potent chemical that caused mice to regrow hair by blocking a stress-related hormone, said a study today.
While the process has not yet been tested in humans, it grew more hair in mice than minoxidil, the ingredient in Rogaine, a popular treatment for baldness, said the study in the online journal PLoS One.
"This could open new venues to treat hair loss in humans through the modulation of the stress hormone receptors, particularly hair loss related to chronic stress and aging," said co-author Million Mulugeta.
Researchers from University of California at Los Angeles and the Veterans Administration discovered the chemical compound "entirely by accident", said the study. Scientists were using genetically engineered mutant mice that were altered to produce too much of a stress hormone called corticotrophin-releasing factor, or CRF. The chronic stress condition makes them lose hair on their backs.
They injected a chemical compound called astressin-B, developed by the California-based Salk Institute, into the mice to see how the CRF-blocker would affect gastrointestinal function. When they saw no effect at first, they continued the injections for five days. The researchers completed their gastrointestinal tests and put the mice back in cages with their hairier counterparts.
When they returned to get the stressed-out mice three months later for more tests, they discovered they could no longer tell them apart because the mice had regrown all the hair they had lost.
"Our findings show that a short-duration treatment with this compound causes an astounding long-term hair regrowth in chronically stressed mutant mice," said Professor Mulugeta of the David Geffen School of Medicine at UCLA.
Not only did it help grow hair, it also appeared to help hair maintain its color and not go grey. "This molecule also keeps the hair color, prevents the hair from turning gray," he said.
The short, five-day course of treatment produced hair-growth effects that lasted up to four months, which also surprised the researchers. "This is a comparatively long time, considering that mice's life span is less than two years," Professor Mulugeta said.
Researchers also gave the bald mice minoxidil alone, "which resulted in mild hair growth, as it does in humans. This suggests that astressin-B could also translate for use in human hair growth," said the study.
Co-author Yvette Tache, a professor of medicine at UCLA, said it could take up to five years to start a clinical trial in humans. "This research could be beneficial in a lot of diseases which are stress-related in their manifestations or exacerbation of symptoms," she said, noting that no sign of toxicity had appeared after extensive tests on mice.
Professor Mulugeta said talks were underway with a major cosmetics firm to fund a study in humans. "In general, the concept that interfering with the stress hormones or their receptors to prevent or treat stress-related diseases is a very valid concept," he said.
Jean Rivier, a Swiss professor at the Salk Institute, said he has reason to believe the process could be useful in other stress-related ailments, from psoriasis to depression to heart and artery problems. "You bring back the skin to a normal acidic state where it can essentially go back to a normal stage whereby hair will grow again, and this is very reminiscent of what is seen in relapses and remissions," he said.
The CRF-blocker was studied by other major pharmaceutical companies but was abandoned because it could not be administered orally, he said.
However, the latest method uses peptide analogs that bind to the same receptors but via different types of molecules, which could be administered through injections or potentially through a nasal spray.
SOURCE