When did humans’ main interaction with Earth become seen as wrecking it?

June 18, 2022 by Joshua
in Art, Nature

At an art show recently, a painting caught my eye. It was of a beach or shallow ocean, viewed from underwater so you could see the waves from below, with the sun filtered and refracted through the water, echoing the waviness of the sea floor. Here’s a similar piece by the artist, though the online image doesn’t match its beauty in person:

As I looked at it, a woman approached me who turned out to be the artist. I’ve shown art in shows and galleries, so I like learning about artists’ methods and perspectives. She said she was creating art showing Earth before humans. In particular, she said “before humans wrecked the planet,” as if that were the most natural way to describe the time before us.

She was younger than me. Young enough, maybe, not to remember that we didn’t always consider humans’ main impact on the Earth to be wrecking it. I know many people consider nature dangerous and deadly, and believe that our taming it and seizing dominion over it is what made it pleasant, but I think most people still see beauty in untouched nature.

I’m asking the following question rhetorically, but I suggest it’s worth considering, or lamenting: when did it become this way, that we think of humans as wreckers of Earth, not stewards, caretakers, or just benign?
