Call me a crazy optimist. Call me a dreamer. But I think the tide has finally turned, and America has gone organic.
I just read a new poll showing that, when given a choice, American consumers prefer to purchase organic foods. In fact, 58 percent of Americans say they choose organic over conventionally produced foods when they have the opportunity.
Among those who prefer organic foods, 36 percent said they do so to support local farmers markets. And 34 percent said they wanted to avoid exposure to toxins in non-organic foods.
This is great news!
I have always advised my patients to eat as much local, organic food as possible.
It takes all of the guesswork out of what you're putting into your body. With organic foods, you avoid the growth hormones and antibiotic residue found in commercially farmed foods. You also get more nutrition from your food.
This is especially true for vegetables (particularly the green leafy varieties, like spinach), eggs, fish, cheese, milk, poultry, and meat. Those foods are especially important because the organic versions naturally contain more omega-3 fatty acids, which help reduce inflammation in the body.
You can find organic foods in just about every supermarket these days, but also check out the farmers' markets in your area while they're in season. They're the best places to find locally grown organic foods.
I know that organic foods can be more expensive, but the truth is you're actually getting much more nutrition for your dollar. Think of it as an investment in your future, and the very best "health insurance" money can buy.