The South was called the "Solid South" for many years because it voted solidly Democratic for more than a hundred years. During that era the Democratic Party was in control while the KKK terrorized the Black population, and little seemed to be done about it. It certainly doesn't seem that the Democrats were trying very hard to eradicate the KKK.
I have been trying to learn something about the politics of the Democratic Party in the South. Does anyone know of any good books that really tell the whole story, not the sanitized version but the whole truth?
My book, Back to Basics for the Republican Party, explains the politics of the Democratic Party in the South from the Republican point of view. Cheers,