I think the problem is that "empire" is usually only used in a pejorative sense. The Roman and British empires spread civilised values around the world, and, while they both had their darker moments, were generally Good Things for humanity. I don't see why the American empire should be any different.
Typical leftist cant: rewriting the definition of a word or phrase to mean whatever they want it to mean.
Real definition (Merriam-Webster):
"empire-a major political unit of great extent, or a number of territories or peoples UNDER A SINGLE SOVEREIGN AUTHORITY."
By that definition, the US is NOT an empire.
Yes, the US is "empire like", but not in the traditional sense. Instead of extracting wealth from other coutries, wealth is increased through an open competitive society and free trade among nations. As far as defending its national interests outside its own borders, why should anyone feel ashamed about that?
As long as these actions are consistent with defending national interests, Western ideals, and available resources, what's the problem?