The decline of the West — and all the rest…
Question from the Internet:
“What are the signs that Western Culture is actually ‘declining’ and not just changing?”
To be honest, I am not even sure what “Western culture” means.
People cite individual freedom, democracy, and free-market capitalism, among other factors. But we are not really free; no democracy anywhere in the world matches the original definition of the notion; and capitalism has become raw and crony, with a very small minority exploiting the majority.
We all live in an artificial Matrix built around excessive consumption and ruthless, exclusive competition, which forces people to succeed and survive at each other’s expense.
We are all brainwashed by an all-encompassing media and entertainment machinery that controls every facet of our lives, making us slaves of the consumerist system.
The “non-Western” part of the world is not much better either; it aspires to copy the West, blindly repeating the same mistakes and crimes.
We have reached a unique crossroads, where we can finally understand that whatever we have invented and built before was devised by our inherently “cancer-like”, egocentric, subjective, self-serving nature, a nature that leads us towards self-destruction.
After such an honest, self-critical recognition and admission, we will become able to start a completely different, this time conscious, “truly Human” development, using Nature’s integral, balanced template to build a new Human society.