For forty years critics have attacked Western culture in general and its American brand in particular for an assortment of perceived sins. Minority groups have alleged that America is singularly racist. Radical feminists have charged that it is sexist and male-dominated. Gays have complained about homophobia. Hard-core Leftists have argued that the United States is exploitative and in thrall to a few elite capitalists. ...First, the charge was that our culture was inordinately dominated by white, heterosexual Christian men, who had systematically oppressed others to maintain their own privilege. Second, the solution was to enact affirmative action, change attitudes, pay fines, create...