I guess my question is who gave the Americans the right? I say this as an American. But would not the world be a better place if we just minded our own business and quit nation-building and stoking nonexistent fires?
This right here. The US was isolationist prior to WWII but then got attacked and drawn into active war.
Since the US mainland was untouched by the war directly, and industry boomed coming out of the Depression and through the war years, the US emerged better off than Europe, which had a lot of rebuilding to do.
As a result of the war and the need for defense, the US established bases all across the globe, and over the last 80-90 years, as the political system grew more corrupt, American hegemony expanded along with it.
South America wants to have a word with you.
After World War I, the United States returned to limited isolationism, primarily due to war disillusionment, economic concerns, and political opposition to international organizations like the League of Nations.
Two things can be true at the same time.