Republicans have always been the "War Party." Why did anyone think that would change?
I've heard it said that Democrats start wars, but Republicans are the ones left to wage them and handle the clean-up.
I don't know whether that's true. I'm just throwing it out there and hoping someone with military or historical expertise will set me right if I'm wrong.