Anyone else remember how Obama was supposed to restore America’s image and prestige in the world?
Remember being lectured by liberals that Bush practiced cowboy diplomacy and all that? Remember liberals saying we weren't respected in the world because of "W", that our image was tarnished all because of "W"?
I am shocked, just shocked, to see NBC News reporting a story like this. Obama was supposed to lead us out of the dark ages of the Bush years and into a greater communion with the world community, yadda, yadda, yadda. So what the heck happened?
Obama surrendered, and now nobody in the world fears the U.S. the way they used to.