At the beginning of the summer, I saw a billboard ad for a local grocery store by the freeway. It had two ears of grilled corn. The caption over the image proclaimed:
Make Vegetables Less Vegetabley!
Make vegetables less vegetabley? What the hell does that mean? Isn't that the point of vegetables? That they taste like vegetables? Has our food culture gone so far astray that we can't appreciate fresh food when we have it? I happen to love the taste of fresh vegetables, and I think it's unfortunate that grocery stores, rather than promoting their fresh food as is, have to demean it in order to sell it.
Another bizarre sign I saw was outside a local brewery. Their sign said,
Schmiddt's Now Offers Organic Water
What the hell is organic water? It seems to me that water is one of the purest things out there. Is organic water simply the absence of added minerals, sugars, electrolytes, and flavors? It's too bad that there is now so much other crap in our water beyond hydrogen and oxygen that we have to advertise water as "organic."
What the hell?!