I guess my question is: who gave the Americans the right? I say this as an American. But wouldn't the world be a better place if we just minded our own business and quit nation-building and stoking nonexistent fires?

  • AES_Enjoyer@reddthat.com
    3 days ago

    It most definitely did not start with the Cold War. The US was happily invading and controlling the politics of as much of the Americas as it could well before WW2, with things like the United Fruit Company or Big Stick ideology. The 1898 invasion of Cuba and the establishment of a military junta come to mind.