I haven’t wasted my time reading most of those books since I was a teenager, so of course I don’t remember the names. But probably half the ones at your local bookstore are that type. They give the impression that the South was all evil and the North was all good, as if things were that simple, and they often try to promote a sense of white guilt over slavery: everything Lincoln did was good, all Southerners were racists, and all Northerners supported abolition. While most don’t take it quite to this extreme, some do (I remember reading them as a child), and many come pretty close.
I've read a great deal on the subject as well and have yet to come across a volume such as this. Can you provide even a single title?