I guess my question is: who gave the Americans the right? I say this as an American. But wouldn't the world be a better place if we just minded our own business and quit nation-building and stoking non-existent fires?

  • salacious_coaster@infosec.pub · 4 days ago

    It mostly started with the Cold War. The US was obsessed with stopping the perceived threat of communism. In the process, it discovered the benefits of power-mongering and war profiteering.

    • AES_Enjoyer@reddthat.com · 3 days ago

      It most definitely did not start in the Cold War. The US was happily invading and controlling the politics of as much of the Americas as it could well before WW2, with things like the United Fruit Company and the Big Stick ideology. The 1898 invasion of Cuba and the establishment of a military junta there come to mind.