Dr. Bell's Blog

Does Organic Always Mean Healthier?


Organic food has become one of the fastest-growing sectors of our food industry as more and more people try to take better care of themselves. Many questions, however, remain.

Does the food simply being organic mean that it is healthy for you? 

Is organic food really necessary?

What qualifies a food product as being organic? 

What is the difference between "All Natural" and "Organic"?

First off, let's look at what an organic food product is and what that means. According to the Department of Agriculture, organic means the following: "Organic food is the product of a farming system which avoids the use of man-made fertilizers, pesticides, growth regulators and livestock feed additives. Irradiation and the use of genetically modified organisms (GMOs) or products produced from or by GMOs are generally prohibited by organic legislation."

That being said, how does that differ from "All Natural"? This can be a source of major confusion. "All Natural" is a very loosely regulated term that frankly means whatever the food manufacturer and its marketing division want it to mean. To put it simply, don't trust the "All Natural" label when you see it. It has a lot of wiggle room for interpretation, and it by no means indicates that the food is healthy for you.

Should you then assume that when a food is organic, it is healthy for you? The answer is a firm no. You can make a food product that is loaded with sugar and other unhealthy ingredients and still meets the definition of organic. You still need to check labels. "Healthy" sugar is still sugar, and it has a negative impact on your health. When it comes to meat and vegetables, the short answer is generally yes: the organic version tends to be healthier for you than its non-organic counterpart. The nutrient content tends to be much higher, and the pesticide content very minimal. Just be aware, especially this time of year when local fruits and vegetables become available, that "local" does not necessarily mean organic.

Finally, after all this information, is organic food really necessary? The answer is definitely yes. There are some products that I believe are more important to eat organic than others, but as a whole, organic foods tend to be significantly healthier for you (cumulatively over time). Yes, they tend to come at a higher price, but your health is that important. One way to help offset some of the cost would be to grow your own fruit and vegetables in a home garden. This can be done quite inexpensively and can provide you with very nutritious food.

Remember, this is only one piece of the health puzzle. You must also focus on a healthy lifestyle in other aspects of your daily life. With some consistent effort, however, you too can be well on your way to...

LIVE well.  BE well.