Is Monsanto’s glyphosate the true cause of sensitivity to glutenous foods?
Over 18 million people are said to have some intolerance to gluten, the sticky protein found in wheat, barley, and rye. But how scientifically grounded is this sudden wave of large-scale gluten intolerance? As it turns out, it may not be gluten that is triggering health problems, but a reaction to agrochemicals used in the harvesting of wheat.
In 2011, Peter Gibson, a scientist at Monash University in Australia, conducted an experiment to determine whether dietary gluten can cause gastrointestinal distress in people who do not have celiac disease. When the experiment appeared to confirm this hypothesis, his team named the condition ‘non-celiac gluten sensitivity’ or NCGS, helping launch the gluten-free trend, which had grown into an estimated $15 billion industry by 2016.
For this new experiment, Gibson sought out 37 self-identified gluten sensitive patients. The study was done double-blind with subjects that had NCGS and irritable bowel syndrome, but not celiac disease. For two weeks, the patients were given high-gluten, low-gluten, and no-gluten meals (as the control group), followed by a two-week “washout” period.
In opposition to the results of the first experiment, the new study found that gluten intolerance does not appear to exist in people without celiac disease. A third study, also by Gibson, further supports these findings, suggesting that much of what we see as gluten sensitivity may be psychosomatic.
“In contrast to our first study… we could find absolutely no specific response to gluten,” – Dr. Peter Gibson
It May Not Be The Gluten – But Don’t Eat That Wheat Too Soon
Although gluten is no longer believed to be the culprit behind the health problems reportedly associated with consuming glutenous wheat, that does not mean that conventionally grown wheat is completely safe to eat. In fact, until 2005, GMO wheat was being tested in 16 states, and it is known to have escaped testing grounds, genetically polluting nearby fields via airborne seeds and cross-pollination.
“Further testing by USDA laboratories indicates the presence of the same GE glyphosate-resistant wheat variety that Monsanto was authorized to field test in 16 states from 1998 to 2005.” – USDA, reported by Natural News
In addition, even non-GMO wheat is drenched with Monsanto’s carcinogenic glyphosate herbicide Roundup just days before harvest because, as it turns out, wheat fields produce slightly more seed when sprayed with this poison 7-10 days before harvest, according to research by Dr. Stephanie Seneff of MIT.
“It ‘goes to seed’ as it dies. At its last gasp, it releases the seed.” –Dr. Seneff
“If Glyphosate ends up in bread it’s impossible for people to avoid it, unless they are eating organic. On the other hand, farmers could easily choose not to use Glyphosate as a spray on wheat crops – just before they are harvested. This is why the Soil Association is calling for the immediate ending of the use of Glyphosate sprays on wheat destined for use in bread.” – Peter Melchett of the Soil Association
It is true that not every food fad turns out to be well founded. However, we should still take caution when choosing the foods we feed our families. Although it has been found that gluten itself is not causing an intolerance in people without celiac disease, there are still other issues with wheat production that we need to be aware of. Get your wheat from local, organic farms when possible, and do what you can to keep the chemical toxins of Monsanto and other pesticide companies from finding their way into your body.
Experience suggests that preservatives and gut flora may also be implicated in this so-called gluten sensitivity, which supports the scientist who, having first identified gluten sensitivity, now says it doesn’t exist.
This article reinforces the importance of iodine in our modern society, which is great, but it doesn’t do enough to address what I call “the halogen effect”: the cumulative effect of all the halogens we are exposed to. The most important are the three above iodine on the periodic table, namely fluorine, chlorine, and bromine, which are more reactive than iodine and therefore compete with it in the body, possibly at the expense of our health.
During the last several years, I have sent out much information on iodine and how important it is to your well-being and your life. Here is more: now that we are being ravaged by radiation, which has only just begun to escalate beyond limits we may not be able to endure, take heed now. Save these articles and refer to them often. Get on iodine after understanding it, how to take it, and what kind of iodine to take. The life you save could be yours, your family’s, your friends’, or that of tiny babies everywhere.
Published on March 13, 2017 in Iodine
We knew a hundred years ago that we needed more iodine and governments started putting a tiny bit in table salt. It was not enough and after the nuclear accident at Fukushima, we need more because of the radioactive iodine released into the environment.
Humans in the 21st Century have an absolute need for iodine supplementation. Iodine is the only medicinal that stands between antibiotic resistant hell and us.
There are many reasons we need iodine in abundance.
There are three principal reasons, however, that stand out, thrusting iodine into its place as a medicine of supreme importance:
The first is its antibiotic, anti-fungal, and anti-viral effects, which go beyond antibiotics: iodine kills viruses, which antibiotics do not, and it kills fungus and yeast like candida, which antibiotics do not, all without creating antibiotic-resistant strains of bacteria. Scientists are also finding that antibiotics can cause some bacteria to grow faster instead of killing them, so it is almost suicidal not to employ iodine as the first line of defense in our fight against infections.
Salon Magazine published, “Over 95 percent of physicians are concerned about antibiotic resistance,” and that is all we need to know about iodine, which everyone should have on hand in their homes for everyday use as well as emergencies. Iodine has it all over antibiotics, not only because it takes down viruses as well as fungus, but also because it does not provoke bacteria to become resistant to it.
Though it kills 90 percent of bacteria on the skin within 90 seconds its use as an antibiotic has been tragically ignored. Iodine exhibits activity against bacteria, molds, yeasts, protozoa, and many viruses. Indeed, of all antiseptic preparations suitable for direct use on humans and animals and upon tissues, only iodine is capable of killing all classes of pathogens: gram-positive and gram-negative bacteria, mycobacteria, yeasts and protozoa.
Most bacteria are killed within 15 to 30 seconds of contact.
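The contact-time figures above can be put on a common footing with a standard disinfection model. Antiseptic kill rates are often described with first-order kinetics via a D-value, the contact time needed for a one-log (90 percent) reduction; the “90 percent within 90 seconds” figure quoted above would imply a D-value of roughly 90 seconds. A minimal sketch, with that D-value taken as an illustrative assumption rather than a measured constant:

```python
# First-order disinfection kinetics: the surviving fraction of a microbial
# population after t seconds of contact is 10^(-t / D), where D (the
# "D-value") is the time for a one-log, i.e. 90 percent, reduction.
# D ~ 90 s is inferred from the article's figure; it is an assumption.

def surviving_fraction(t_seconds: float, d_value: float = 90.0) -> float:
    """Fraction of the starting population still viable after t_seconds."""
    return 10 ** (-t_seconds / d_value)

# 90 s -> 10% survive; 180 s -> 1%; 270 s -> 0.1%
for t in (30, 90, 180, 270):
    print(f"after {t:3d} s: {surviving_fraction(t):.4f} of the population survives")
```

Under this model, tripling the contact time takes the kill from 90 percent to 99.9 percent, which is why contact time matters as much as concentration for antiseptics.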
By early 2017, nearly three dozen people in the United States had been diagnosed with a deadly and highly drug-resistant fungal infection that has been rapidly spreading around the world.
The fungus is a strain of a kind of yeast known as Candida auris. Unlike garden-variety yeast infections, this one causes serious bloodstream infections, spreads easily from person to person in health-care settings, and survives for months on skin and for weeks on bed rails, chairs and other hospital equipment. Some strains are resistant to all three major classes of antifungal drugs. Up to 60 percent of people with these infections have died. Candida auris is not resistant to iodine.
Viral infections always have doctors scrambling because they have forgotten how effective iodine is against them. Dr. Richard Kunin, after fifty years of practice, concluded that iodide destroys the herpes virus; both oral and genital lesions are treatable this way.
Dr. Eliot Dick observed a 50 percent reduction in respiratory illnesses when using iodine. Many reports by patients find that a gargle of ten drops of potassium iodide in a glass of water, with or without additional vitamin C, relieved sore throat in a matter of hours. Research at the Massachusetts General Hospital tells us, “HIV was completely inactivated and could no longer replicate after exposure to the povidone-iodine preparations, even at very low concentrations.”
The second is its importance with prevention and treatment of cancer.
Iodine is indispensable in protecting against thyroid cancer, breast cancer, ovarian cancer, and prostate cancer, because all of these glands concentrate iodine more than other tissues do. Deficiencies in iodine leave these glands vulnerable. Iodine is also indispensable for treating anything on the skin, even skin cancer, mainly because it kills everything on contact that does not belong.
Breast tissue contains the body’s third highest concentration of this essential mineral, so shortfalls in iodine have a highly negative impact on breast tissue. Iodine shortfalls coupled with bromine and other toxic halogens cause fibrocystic breast disease and breast cancer.
High intake of iodine is associated with a lower risk of breast cancer. Low iodine intake is associated with liver cancer. Iodine is ideal for treating skin cancer. These are just the tip of the iceberg in terms of how important iodine is for cancer patients.
The third is iodine is protective against radioactive iodine.
We cannot begin to understand how important this is until we also learn how dire the threat is of radiation.
Dr. John W. Gofman, Professor Emeritus of Molecular and Cell Biology in the University of California at Berkeley, has written extensively about the effort to belittle the menace of low-level radiation. People associated with the nuclear and medical industries assert falsely, “there is no evidence that exposure to low-dose radiation causes any cancer—the risk is only theoretical,” or “the risk is utterly negligible,” or “the accidental exposures were below the safe level,” and even “there is reasonably good evidence that exposure to low-dose radiation is beneficial and lowers the cancer rate.” By any reasonable standard of scientific proof, the weight of the human evidence shows decisively that cancer is inducible by ionizing radiation even at the lowest possible dose and dose-rate—which means that the risk is never theoretical.
Different isotopes of radioactive iodine, one with an incredibly long half-life, have been dumped into the environment by the Fukushima meltdown. Iodine deficient adults and children are sitting ducks to their radioactive cousins especially if they are eating milk and cheese because radioactive iodine gets into the grass that the cattle eat and it just goes up the food chain to your door.
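The contrast between short- and long-lived iodine isotopes is easy to quantify with the standard radioactive-decay law. As a sketch, using the well-established half-lives of iodine-131 (about 8 days) and iodine-129 (about 15.7 million years):

```python
# Radioactive decay: the fraction of an isotope remaining after time t
# is 0.5 ** (t / half_life).

def remaining_fraction(days: float, half_life_days: float) -> float:
    """Fraction of a radioactive isotope remaining after `days` of decay."""
    return 0.5 ** (days / half_life_days)

I131_HALF_LIFE_DAYS = 8.02             # iodine-131: ~8 days
I129_HALF_LIFE_DAYS = 15.7e6 * 365.25  # iodine-129: ~15.7 million years

# I-131 is intense but short-lived: after ~80 days (10 half-lives),
# less than 0.1 percent of it remains.
print(remaining_fraction(80.2, I131_HALF_LIFE_DAYS))

# I-129 decays so slowly that over an 80-year human lifetime it is
# effectively undiminished.
print(remaining_fraction(80 * 365.25, I129_HALF_LIFE_DAYS))
```

This is why iodine-131 dominates the exposure risk in the weeks after a release, while iodine-129 is the isotope that persists in the environment indefinitely.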
Fourth is the absolute necessity of iodine in metabolism.
Human life is not possible without iodine. That truth is important to every cell in our bodies.
Fifth is its role in the production of hormones.
Iodine helps synthesize thyroid hormones, and iodine sufficiency prevents and can reverse both hypo- and hyperthyroidism. Iodine’s ability to revive hormonal sensitivity seems to significantly improve insulin sensitivity: iodine attaches to insulin receptors and improves glucose metabolism. Iodine is the best nutritional support for your thyroid. Your thyroid controls your metabolism, and the efficiency of your metabolism is directly related to that of your immune system.
Sixth is its role in the immune system.
The body’s ability to resist infection and disease is hindered by long-term deficiency in iodine. Poor immune response is directly tied in with impaired thyroid function. A deficiency in iodine can greatly affect the immune system because low levels of iodine lead to problems with the thyroid gland.
Iodine purifies water, and it does the same job on the bloodstream, which the thyroid filters every 17 minutes. Sufficient levels of iodine, especially in children, keep the body free of pathogens, no vaccines needed! Iodine’s true role, possibly making up as much as half of the body’s immune system, has yet to be understood by doctors, but it should be, as the age of antibiotic-resistant, fungal-resistant, and antiviral-resistant infections threatens the human race.
Iodide is accumulated during phagocytosis, the process of engulfing and ingesting bacteria and other foreign bodies. The iodide is attached to the bacteria and to proteins, creating iodoproteins including monoiodotyrosine (T1). Sometimes, the thyroid hormones are utilized as the source of the iodide.
Dr. Gabriel Cousens lists many other important functions of iodine. Iodine offers dozens of under-utilized applications and should always be included when treating or preventing disease.
Simply put, there is no bacterium, virus, or other microorganism that can survive or adapt to an iodine-rich environment.
1) Iodine prevents heart disease.
2) Iodine eliminates toxic halogens from the body (including radioactive I-131).
3) Iodine supports apoptosis (meaning it supports the natural, programmed death of cells, without it the cells start to mutate and the formation of cancerous growths/cysts/tumors begins).
4) Iodine activates hormone receptors and helps prevent certain forms of cancer.
5) Iodine protects ATP function and enhances ATP production.
6) Iodine prevents fibrocystic breast disease.
7) Iodine decreases insulin needs in diabetics.
8) Iodine helps support protein synthesis.
9) Iodine deficiency is a global health threat.
10) Iodine destroys pathogens, molds, fungi, parasites, and malaria.
11) Iodine is needed with the use of cordless phones, cell phones and now smart meters to prevent hypothyroidism.
12) Iodine supports pregnancy (as the fetus undergoes more apoptosis than any other developmental stage).
13) Iodine regulates estrogen production in the ovaries.
14) Iodine is anti-mucolytic (meaning it reduces mucus catarrh).
15) Iodine neutralizes hydroxyl ions and hydrates the cells.
16) Iodine makes us smarter.
17) High doses of iodine may be used for wounds, bedsores, inflammatory and traumatic pain, and restoration of hair growth when applied topically.
18) Iodine helps in the diminishing of tissue scarring, cheloid formations, and Dupuytren’s and Peyronie’s contractures, which are hyper-scarring conditions.
19) High doses of iodine may be used to reverse certain diseases.
20) Iodine supports spiritual development.
I have always recommended liquid iodine in the form of Nascent Iodine, which I recommend for children and iodine sensitive patients who need to start at very low dosages, and Lugol’s Iodine, which has been around for almost two centuries. Solid forms are available.
I am preparing a second edition of my iodine book for my New York publisher, and it is quite a shock to re-realize how important iodine is. Perhaps I will change the title to Iodine to the Rescue, or Iodine – The Most Important Medicine in the Age of Antibiotic Resistant Infections.
This has been an interesting debate that has been going on for years. My thought has been: if the low-fat diet and cholesterol drugs were supposed to be so good, how come heart disease is still one of the biggest killers in Western nations? Zoe Harcombe is the lead author of a new study that debunks the low-fat myths, and her findings, reported on diabetes.co.uk, may help clarify what happened.
Authentic Turkish BakeHouse Bread has only ever been a no-added-fat product, which still means it is low in fat compared to most breads; hopefully you add the healthier fats to the bread when you use it.
The choice is yours to make!
“Every last shred of evidence”: Why low-fat dietary guidelines should never have been introduced
When the US government introduced “Dietary Goals for the United States”, it did not have unanimous support. The guidelines, which urged the public to cut saturated fat from their diet, were challenged by a number of scientists in a Congressional hearing. The findings were not based on sufficient evidence, they argued.
They were ignored. Dr. Robert Olson recounts an exchange he had with Senator George McGovern, in which he said: “I plead in my report and will plead again orally here for more research on the problem before we make announcements to the American public.” McGovern replied: “Senators don’t have the luxury that the research scientist does of waiting until every last shred of evidence is in.”
And so, backed by a questionable base of evidence, the guidelines were implemented, triggering one of the most complex and controversial scientific debates of the last three or four decades.
Senator George McGovern had a big influence on the introduction of low-fat dietary guidelines. He also argued that “senators don’t have the luxury that the research scientist does of waiting until every last shred of evidence is in.” image source: usnews.nbcnews.com
We are only now seeing a significant challenge to the low-fat guidelines. In June of last year, Time published Bryan Walsh’s “Ending the War on Fat.” Walsh quotes Philip Handler, who was president of the National Academy of Sciences in 1980. He called the fat guidelines a “vast nutritional experiment.” According to Walsh, “the experiment was a failure. We cut the fat, but by almost every measure, Americans are sicker than ever.”
“Senators don’t have the luxury that the research scientist does of waiting until every last shred of evidence is in.”
On June 1, 2015, Open Heart, The BMJ‘s cardiology journal, published “Evidence from randomised controlled trials did not support the introduction of dietary fat guidelines in 1977 and 1983: a systematic review and meta-analysis,” the first study to comprehensively critique the body of evidence supporting the low-fat guidelines.
The study, led by Zoe Harcombe, argues that “dietary recommendations were introduced for 220 million US and 56 million UK citizens by 1983, in the absence of supporting evidence from [randomised controlled trials].”
The study presents a number of findings, including:
The original RCTs [randomised controlled trials] did not find any relationship between dietary fat intake and deaths from CHD [coronary heart disease] or all-causes, despite significant reductions in cholesterol levels […] this undermines the role of serum cholesterol levels as an intermediary to the development of CHD and contravenes the theory that reducing dietary fat generally and saturated fat particularly potentiates a reduction in CHD.
In other words, it’s generally believed that having lower cholesterol levels means having a lower risk of heart disease, and the best way to lower cholesterol levels is to reduce one’s intake of fat. But the study punctured this belief: there is no real established link between cholesterol levels and risk of heart disease. Therefore, reducing fat intake isn’t going to reduce the risk of heart disease.
Uffe Ravnskov, a Danish researcher notorious for challenging the conventional wisdom surrounding heart disease, makes a similar point:
Today a reduction of the intake of saturated fat is one of the cornerstones in dietary prevention of obesity, diabetes and cardiovascular disease. However, several recent trials have documented that the main argument for this advice, its alleged influence on blood cholesterol, is questionable. More disturbing is that a large number of epidemiological, clinical, pathological and experimental studies have falsified the idea that a high intake is harmful to human health and that a reduction may be beneficial; indeed, several observations point to the opposite.
Harcombe et al also identify flaws in the original research, flaws that compromise the integrity of its findings. For example, many of the original studies focused on men who had suffered a myocardial infarction. But they failed to take into account that:
men who have suffered a myocardial infarction (MI) subsequently make multiple lifestyle changes (weight loss, smoking cessation, increase in physical activity, for example), which makes them a poor group for testing the lipid hypothesis.
Ms Catherine Collins, Principal Dietitian at St. George’s Hospital NHS Trust, also described the failings of the original research:
Studies had small numbers (most with a history of heart disease), were highly varied in dietary and lifestyle approach, and reviewed outcomes over a short time scale – around five years on average. All these factors would influence the outcome – whatever dietary changes were made.
In short, the body of evidence that led to the introductions of anti-fat dietary guidelines was incomplete. The researchers – Harcombe et al – summarised the huge gaps:
At the time dietary advice was introduced, 2467 men had been observed in RCTs. No women had been studied; no primary prevention study had been undertaken; no RCT had tested the dietary fat recommendations; no RCT concluded that dietary guidelines should be introduced. It seems incomprehensible that dietary advice was introduced for 220 million Americans and 56 million UK citizens, given the contrary results from a small number of unhealthy men.
Whoever decided to implement the anti-fat guidelines seems to have done so by ignoring all evidence to the contrary. A few supportive studies seem to have been given a greater importance, even though, as Ravnskov notes, “A few supportive studies cannot outweigh more than one hundred studies that have falsified the hypothesis.” And, having wrongfully supported the anti-fat guidelines, fresh evidence to the contrary has also been ignored. The guidelines remain unchanged. Ravnskov argues that what is “most disturbing is that the numerous contradictory epidemiological, pathological and clinical observations […] have been ignored by all guideline authors.”
So the guidelines were pushed through on the back of a thin suggestion that saturated fat was the biggest cause of heart disease, without any of the rigorous, comprehensive examination that should come with it. The decision to promote anti-fat guidelines seems to stem from a selective attitude to evidence. Because, as Ravnskov notes, “any idea can be verified if one looks for confirmation only.” It wasn’t just flawed; it was woefully inadequate. Or, as the researchers put it: “dietary advice not merely needs review; it should not have been introduced.”
Zoe Harcombe is the lead author of a new study that debunks the low-fat myths. Image source: walesonline.co.uk.
People with diabetes have been tarred with the low-fat brush as much as anybody. Historically, all the official advice, from the NHS to Diabetes UK, has urged people with diabetes to avoid fat. This despite the fact that anecdotal evidence has for some time suggested that people with diabetes find a low-carb, high-fat diet more conducive to good blood glucose control.
The NHS recommends “reducing the amount of fat” in the diets of people with type 2 diabetes, and including “starchy carbohydrates, such as pasta,” while Diabetes UK suggest that people with diabetes “choose low-fat alternatives wherever possible,” while having “some starchy food” every day.
Slowly but surely, the advice is changing. Last year, Dr. David Unwin published “Low carbohydrate diet to achieve weight loss and improve HbA1c in type 2 diabetes and pre-diabetes: experience from one general practice.” The study, published in Practical Diabetes, indicated that a low-carb diet could enhance weight loss and improve HbA1c levels in people with type 2 diabetes, in contrast to the official guidelines of the NHS and Diabetes UK.
Other studies report similar findings. Halton et al’s “Low-carbohydrate-diet score and the risk of coronary heart disease” concluded that “diets lower in carbohydrate and higher in protein and fat are not associated with increased risk of coronary heart disease in women. When vegetable sources of fat and protein are chosen, these diets may moderately reduce the risk of coronary heart disease.”
The two pieces of advice are more or less mutually exclusive. If we cut down on carbs, we have to make up for it by eating more fat. If we cut down on fat, we have to compensate by eating more carbs. Broadly speaking, there can only be one winner. The original low-fat guidelines recognised this, telling people to “increase carbohydrate consumption to account for 55 to 60 per cent of the energy (caloric) intake [and] reduce overall fat consumption from approximately 40 to 30 per cent energy intake.” But according to Richard Feinman, the Editor of Nutrition and Metabolism, “fat consumption, if anything, went down during the obesity epidemic – almost all of the increase in calories was in carbohydrates.”
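The guideline percentages quoted above translate into grams via the standard Atwater energy factors (carbohydrate at roughly 4 kcal/g, fat at roughly 9 kcal/g). As a rough worked example, assuming an illustrative 2,000 kcal daily intake (an assumption for the sketch, not a figure from the guidelines):

```python
# Convert a share of daily energy into grams of a macronutrient, using
# Atwater factors: carbohydrate ~4 kcal/g, fat ~9 kcal/g.
# The 2,000 kcal daily intake below is illustrative, not a recommendation.

def grams_from_energy_share(total_kcal: float, share: float,
                            kcal_per_gram: float) -> float:
    """Grams per day supplying `share` of `total_kcal` at `kcal_per_gram`."""
    return total_kcal * share / kcal_per_gram

DAILY_KCAL = 2000

carbs_g = grams_from_energy_share(DAILY_KCAL, 0.55, 4)  # 55% energy from carbs
fat_g = grams_from_energy_share(DAILY_KCAL, 0.30, 9)    # 30% energy from fat

print(f"carbohydrate: {carbs_g:.0f} g/day")  # 275 g
print(f"fat:          {fat_g:.0f} g/day")    # ~67 g
```

At that intake, the 1977 targets amounted to roughly 275 g of carbohydrate against about 67 g of fat per day, which makes the trade-off between the two macronutrients concrete: energy removed from one column largely reappears in the other.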
Shops are full of low-fat alternatives to yoghurt, butter and the like. But are they the answer to heart disease and obesity?
So why were the guidelines introduced, if they were so obviously flawed from the start?
It’s not the sinister conspiracy we might have hoped. In fact, the motives for introducing the guidelines appear to have been completely noble and well-meaning. On the publication of the guidelines, Senator George McGovern said:
My hope is that this report will perform a function similar to that of the Surgeon General’s Report on Smoking. Since that report, we haven’t eliminated the hazards of smoking, nor have people stopped smoking because of it, but the cigarette industry has modified its products to reduce risk factors, and many people who would otherwise be smoking have stopped because of it. The same progress can and must be made in matters of nutritional health.
If anything, the sincere desire to improve health may have been at fault for the implementation of such shoddily-evidenced guidelines.
“Fat consumption, if anything, went down during the obesity epidemic – almost all of the increase in calories was in carbohydrates.”
Matters were exacerbated by another commendable motive: the desire to propose guidelines that ordinary people could stick to. Dr. D.M. Hegsted, Professor of Nutrition at the Harvard School of Public Health in Boston, Massachusetts, said when the guidelines were published:
It is increasingly obvious that if new knowledge is to result in new behaviours then people must be able to act, without undue obstacles, in accordance with the information they learn. The problem of education for health as it has been practiced is that it has been in isolation, not to say oblivion, of the real pressures, expectations, and norms of society which mould and constrain individual behaviour. There must be some coordination between what people are taught to do and what they can do.
This is a noble approach for the most part. Certainly, modern dietary guidelines would be more effective if they took into account the practical realities of people’s lives. And yet, at the same time, the approach can be simplistic. This is a complex issue, and ignoring its complexity in order to make more convenient guidelines isn’t going to help anyone.
Ironically, the guidelines were introduced because “there needs to be less confusion about what to eat and how our diet affects us,” yet ended up triggering one of the biggest scientific debates of the century.
The low-fat dietary guidelines are finally being challenged, with a number of experts suggesting that it’s carbohydrates, and not fat, that does the damage.
When it comes to dietary guidelines, there is no simple answer. Studies will always conflict, and findings will always be questionable to some degree. Harcombe’s study isn’t perfect; according to Christine Williams, Professor of Human Nutrition at the University of Reading, it takes
a classical pharmaceutical approach to the evidence, with the assumption that RCTs also provide the gold standard for diet, representing the pinnacle of the evidence hierarchy against which other types of study are inevitably weaker. Most dietary guidelines have been developed using an approach which takes the degree of consistency of a number of lines of evidence as the gold standard for risk assessment. This approach uses RCTs where such data is available.
And it isn’t productive to insist people adhere to any one strict diet; different things will work for different people. But it is necessary to call out deeply-flawed guidelines. A number of studies have suggested a “protective effect of saturated fat from dairy products.” We may be increasing our risk of heart disease by avoiding saturated fats.
Maybe so, maybe not. Decades of scientific dispute haven’t put the fat versus carbs debate to bed; it’s very unlikely to happen in this article. But whatever you believe, the debate shows how important it is that we insist on a thorough scientific basis for dietary guidelines. We need to ask questions of the researchers. Otherwise we’re stuck with low-fat recommendations that could, just maybe, be killing us.
I love chocolate. I keep a stash of dark chocolate in the pantry at all times for after-dinner cravings or mid-day pick-me-ups. In fact, it is rare that a day goes by where I don’t eat at least a small square of chocolate. But is my love of chocolate a bad habit, or is chocolate good for you? I am happy to report that dark chocolate, specifically, is actually quite healthy; there continues to be new evidence for the various health benefits of dark chocolate. Provided below are seven reasons to eat dark chocolate.
Why is Chocolate Good for You?
Cacao, which comes from seeds of the tree Theobroma cacao, is the main component of dark chocolate. Cacao is full of compounds called polyphenols (particularly flavanols), which have a variety of health benefits. Polyphenols are potent antioxidants, which help to fight diseases, particularly of the brain and heart. And dark chocolate has two to three times more of these compounds and a higher antioxidant activity than green tea, which is well known for its health benefits. Dark chocolate might be an especially effective form of polyphenols because of the way bacteria in our gut interact with the dark chocolate that we ingest.
Health benefits of Dark Chocolate
The list of reasons why chocolate is good for you is seemingly endless. Here are just a few of the impressive health benefits of dark chocolate:
Improve insulin functioning. Eating dark chocolate can help with blood sugar regulation. It leads to improved insulin resistance, and can even affect β-cell functioning (the cells that produce insulin).[1,2] In fact, people who eat one ounce of chocolate two to six times per week are 34% less likely to be diagnosed with diabetes.
Protect against the effects of stress. Recent research shows that intake of dark chocolate can help to protect against the body’s negative reaction to stress. It seems that flavanols in chocolate can reduce the stress response by affecting hormone activity of the hypothalamic-pituitary-adrenal axis in the brain.
Fight dementia and improve cognitive performance. Dark chocolate can help to improve cognitive function in both young and older adults.[5,6] Flavanols in cacao are associated with a decreased risk of dementia and cognitive impairment. This may be because the flavanols can help to increase the number and strength of connections between neurons, which can help to preserve memory and improve cognitive ability. They also increase blood flow, and blood flow to the brain helps improve cognitive performance as well.
Improve cholesterol and triglycerides levels. A systematic review of 42 studies found that chocolate intake is associated with reduced LDL cholesterol and triglyceride levels and an increase in HDL cholesterol.
Reduce blood pressure. Flavanols are effective blood pressure reducers, as well. People who eat flavanol-rich cacao products regularly have significantly reduced blood pressure readings.
Lower risk of cardiovascular disease. Eating chocolate can decrease your risk for cardiovascular disease mortality, as dark chocolate can lower blood pressure and cholesterol, has anti-inflammatory capabilities, and affects heart health in numerous other ways.
Fight cancer. Although research has not yet made a direct association between chocolate intake and cancer risk, many compounds found in chocolate have cancer-fighting properties. They protect against oxidative damage, suppress proliferation of cells, and protect from mutation.
Scientists continue to discover new reasons why chocolate is good for our health. It may help improve mood, fight symptoms of chronic fatigue syndrome, and more.[8,9] So if you have been limiting your intake of chocolate out of fear that it was an unhealthy indulgence, don’t worry; you can now enjoy a square of dark chocolate after dinner or a few dark chocolate covered nuts for a snack without guilt. Just make sure you choose a high quality dark chocolate with a high cacao content (look for over 70%).
Chelsea Clark is a writer with a passion for science, human biology, and natural health. She holds a bachelor’s degree in molecular and cellular biology with an emphasis in neuroscience from the University of Puget Sound in Tacoma, WA. Her research on the relationship between chronic headache pain and daily stress levels has been presented at various regional, national, and international conferences. Chelsea’s interest in natural health has been fueled by her own personal experience with chronic medical issues. Her many profound experiences with natural health practitioners and remedies have motivated Chelsea to contribute to the world of natural health as a researcher and writer for Natural Health Advisory Institute.