Over the past few years, in my interactions with friends who live on the west coast of the US, I have noticed some things. I am not sure whether it is just a US phenomenon or whether it holds for urban versus suburban living worldwide. It may sound stereotypical or far too general, but here are my thoughts for what they are worth.
Awareness of one's ecological footprint, going green, buying organic, buying local, and so on runs rampant among west-coasters. On the other hand, I see little evidence of it in my local circle here. Even if my friends here hold strong views on these things, I rarely hear about them. The same goes for Walmart bashing. I call it bashing because I personally like Walmart. I have heard about their hiring practices, their working hours, their benefits, and so on. Still, I believe they are picked apart largely because they are huge and successful. Again, these are just my views.
I noticed the same pattern with hybrid cars. Long before I had even heard that such a thing as a hybrid car existed, a family I know on the west coast had been driving one for a year. Cloth diapers, a return to a simpler way of life: all of it seems to take hold on the west coast before the east catches on.
So, what do you think? Is it a west coast thing, or something else entirely?