Practically every year there is a theatrical release or a TV movie about the evil South and the saintly North.
Truth is truth, and should be shared whenever it can be.
Well, waaaa! Maybe if you cry a little louder we'll hear you.
I seem to recall it being the other way around in the 1960s and 1970s.
The movies portrayed the South as noble and the North as animals.
Times change.
Movies portray a saintly North:
Like “The Outlaw Josey Wales,” which portrayed the US Army shooting down unarmed, surrendering Confederate soldiers with Gatling guns. May I point out that no such event ever happened.
Actually the movie business has been overly kind to the slave power.
If you're talking about the Civil War itself, I'd say that's far from true.
If we go all the way back to “Gone with the Wind” or “Birth of a Nation,” it's clear that even liberal Hollywood has bent over backwards to be "fair and balanced" toward Southerners.
Of course, I watch very few movies, so there might be whole genres out there I don't know about, but I can't think of even one recent film that matches your description.
Can you name some?