So there's an argument going on on Facebook right now about the science behind why going barefoot is good for you. I don't follow all the "positive ions and engaging all the bones" talk, but I do have an opinion. I don't know the science behind it, if there is any. I just know that I feel better when I'm barefoot. Whether indoors or outdoors, I feel free and younger and healthier when I don't have shoes on. And if I have to wear them, I prefer sandals every time.
I remember in high school taking my shoes and socks off to walk home from school. I remember walking around the neighbourhood as a kid in my bare feet. I still run out to the garbage cans or to get the mail in bare feet in the dead of winter. I wander my garden barefoot (watching for thistles; my weeding isn't up to snuff). I love the feel of connection with the ground, the scent of earth, the touch of skin to soil.
I've heard that soil contains organisms that somehow have anti-depressant qualities, and that this transfers to you through your skin. Whether that's true or not, I know I feel better when my hands are in dirt and my feet are directly on the ground. When winter drags on, I crave getting my hands back in the garden. I've been known to stick my head in a bag of damp potting soil just to breathe in the smell of it when winter has lasted too long. I just close the curtains first so the neighbours don't think I've lost it altogether...
I have pots of green beans growing in my dining room right now, along with rosemary and a lemon tree. I harvest a mere trickle of beans, but the food isn't the point. It's the smell of wet soil, of green growing things, in the winter that I glean from them. It's a tonic and a promise, a bit of bright hope that spring will come again.