When Did White Men Become The Bad Guys in America?
The question should be: when did blacks and women gain their rights? The answer is: when they stood up and refused to be treated as second-class citizens.
Well, now it's white men's turn.
4 posted on 06/11/2016 4:54:48 AM PDT by TwelveOfTwenty
(See my home page for some of my answers to the left's talking points.)