If you’re American, do you concur with my assessment that our country is rapidly going to hell in a handbasket? Or do you believe that everything is hunky-dory and America’s light is shining as brightly as ever?
If you’re not an American, based upon what you’ve read, seen, and heard, do you feel that America has, indeed, faltered? Or do you think that America will weather this storm?
Since I am NOT an American, I don’t know much about politics in this country. When we moved to America in 2015, Obama was the president. The elections were near, and we watched the GOP select Trump as its candidate. We never thought he would be elected the next president. But so it was!
I give this context because of the difference I saw and felt between America under the two presidents. In Obama’s time, people were more tolerant, no racial slurs were cast at us as Muslims and brown people, and the atmosphere was far more peaceful. When Trump took office, things went downhill. Open racism became common. People would say nasty things in passing, and the entitlement of some white people grew by leaps and bounds.
What happened during the last couple of years of Trump’s rule, and after the election, was unbelievable. It reminded me of a third-world country caught in a power struggle at the hands of despots, much like what has been going on in my own country since it became independent.
What Trump did was unleash the racism, hatred, and intolerance that some people already carried in their minds; now it became okay to be openly racist. This smells like the beginning of the end of the tolerance and acceptance that were a hallmark of American society.
The recent acquittal of a white man who openly shot three people was another example of why sensible people feel that justice in America is anything but blind; it is totally lopsided.
Too many issues have cropped up in recent years, and many people are losing hope.
How can you fix this?