Looking for an explanation: Which direction does American society plan to take?
In my opinion, American society, along with many others, is changing rapidly toward an uncharted way of life.
Years ago, America and most other Western nations honored and attempted to follow Judeo-Christian principles and laws. These successfully guided humanity for thousands of years toward a reasonable betterment for all.
Despite this historic success, it seems to be a "prerequisite" nowadays for most leaders — politicians and their cronies, the news media and their spokespersons, governmental institutions of every party and viewpoint, even product advertisers — to lie, cheat, or suppress crucial facts (providing only partial truths) and to engage in shady, unethical deals.
Furthermore, the English language is full of words that describe most situations accurately and are easy to comprehend. There is no need for the "pseudo-intelligentsia" to invent convoluted new expressions in an effort to fake their superiority or to bamboozle me, or the average Joe, into believing something different.
As an avid news follower, I am tiring of the continuing falsehoods. I have lost trust, become more sarcastic, and am at a loss as to who or what to believe!
If I am not alone in my observation, then this is a truly sad state of affairs for America, for humanity at large, and for our collective future.