Posted on 10/14/2007 7:14:10 AM PDT by proudofthesouth
In that case, we've been an empire since at least when? 1803, when Jefferson bought the Louisiana territory? 1783, when we got everything up to the Mississippi? 1775, when we tried to conquer Canada?
I'm not trying to argue with you, but would like to point out to the troops that in 19th century America, it was Democrats like Jefferson, Jackson, and Polk -- who tended towards states' rights, limited government, and strict construction -- who were most passionately expansionist and, in one sense at least, imperialist.
A lot of the talk of our lost Republic relates to what Jefferson and Jackson did -- the "Revolutions" of 1800 and 1828 -- more than to what the founders may have intended.
In other words, the radical states' rights view wasn't Washington's, Hamilton's, or Adams's, but something the victorious Jeffersonians and Jacksonians imposed on the Constitution and the country.
So the very premises of the original question may be shaky.
If we had not entered WWI, there would have been no Hitler.
“I’m not trying to argue with you”
Argue away. That’s why I love being on FR. I don’t get a chance to debate in real life.
The only thing I don’t like is when the person I’m debating can’t be an adult and responds to my points by calling me a poopoo head. : )
Lincoln was never the aggressor. Read his first inaugural address. The South started the war when it fired on Fort Sumter. Thereafter, the federal government was in a fight for survival against insurgents who wanted to seize the nation's capital and take control of the government. The war was never totally about states' rights. A brief study of history will show this. Read "The Civil War" by Benson Lossing.