Americans have always grown up believing we are the good guys. We fight the wars of the righteous, help the oppressed, and free the world from despots and evil dictators. I seriously wonder when all that changed. Do I have to dig up John Wayne to do battle with evil? Isn’t anyone up to the challenge anymore? It is no longer a given that America is on the right side of history; now it seems we are on the wrong one, or can’t even find the sides at all. When good men ignore evil and allow it...