No, they have not. If anything, a lot of the history books and memorials tend to portray the Confederates in a more favorable light than they deserve.
The textbooks have denigrated the Confederacy, but there are plenty of Civil War books out there, read by the few who can understand them.
As well as the Union. It was a nasty war.
Really? My impression is that it's very much the other way around. The propaganda/fiction that the Union invaded the South to stop slavery has been repeated so often that most people now believe it.
The truth is more mundane and less noble: they invaded the South to prevent its independence, not to end slavery.