How so?
Western Europe hated Bush. They liked Obama at first. Now things are worse. What went wrong?
When people write about "America's image abroad," they mostly focus on the foreign elites, in places like Europe, who are paying attention.
You can argue that Obama wasn't trying to improve our image abroad, but Western Europeans liked America more at first and then things went sour.
So something, or several things, went wrong? What were they?
The article goes over the things that went wrong during the Obama years pretty thoroughly.
Five other posters on this thread agree with me.
Post #10 says it best.