Besides "Whitey, gimme all your stuff" and freely killing each other, does anybody know what "social change" they want?
I haven't heard any of them demanding the things that most people do to succeed in life: get a good education, stay in school, don't do drugs, stay sober, get a good job, show up on time every day, don't have babies out of marriage, denounce violent anti-woman rap crap, respect the police and authority, treat everybody civilly, be polite, call your mother.
Yeah, just last week some black woman announced that all white people should just give up their homes to a black person. Does the black person get to pay off the mortgage then? Or do they think that a nice house is just freely given to a white person with no cost to the owner?
I just laughed; hubby and I are working toward paying off our home before he retires, but we should just hand the house over to someone else after we have worked to pay it off. Yeah, that sounds fair.