Here is a clue, Andrew: the USA has never been an imperialist nation. Don't shove your guilt onto us. How dare you accuse us, of all people, of colonialism?
You need to acquaint yourself more closely with your own nation's history. Pay particular attention to the Spanish-American War and the subsequent colonisation of Puerto Rico and the Philippines.
And if the earlier westward expansion wasn't imperialism, then I don't know what is.
America WAS a colonial nation at one point in its history.