I'm not a big fan of the USA and I hate a lot of things about it, but nowadays the hatred it gets is completely blown out of proportion.
It is now depicted as some kind of bloodthirsty nation that invades every country that has oil, commits atrocities and murder sprees, and leaves with all the oil it stole.
I've seen people outright compare the US to Nazi Germany.
The 'murica memes were funny at first, but then came the "let's bring them freedom/democracy" memes that generally present the US as an evil warmonger that excuses all its actions by saying "it's for freedom."
Oh, and the best one: that all of the wars in the modern era are caused by the US. All our misery? Their fault. Thanks, Obama.
Do people not know even a tiny bit of history?
It's like people magically forgot that before the US invasion, Iraq was ruled by an insane dictator who invaded his neighbors and killed countless thousands of innocent people with chemical weapons.
Or how about Afghanistan being under the threat of radical extremists who kill women for the sole crime of acquiring an education(!)?
How about the fact that South Korea could have ended up like the hell on earth known as North Korea, if it weren't for US intervention?
People could argue that there are plenty of other oppressed nations in need of liberating that the US does nothing about, and that therefore the US only invades to control the oil/money/whatever.
That might be true. But what difference does it make? The US still toppled horrible regimes and dictators in hopes of liberating those nations.
I know that most people who demonize Americans are just teenage brats trying to be cool and edgy by hating whatever is popular to hate.
And that's the problem. WHY IS AMERICA POPULAR TO HATE? We really need to stop it. There are already kids online saying crap like "the world would have been a better place if America had lost WWII and Germany had been the victor."