I know it's a bit off topic, but this statement piqued my interest: ‘Many American blacks are already angry.’ Why? About what? They’ve been given so much but have succeeded so poorly. Are they angry at ‘us’ for giving, or for not *making* them succeed? Are they angry with themselves and taking it out on ‘us’? Sorry for being so thick, but I just don’t get it.
They (many American blacks) have been indoctrinated for several generations, continuously told that they are victims of "the man": that Republicans (white males) want to deny them the right to vote, welfare, respect (whether deserved or not), free stuff (in general), etc., etc.
They are not happy, and they just want everyone in America to be as unhappy as they are.