Not in the South it wasn't.
At the Founding it was (almost) universally acknowledged that slavery was a bad thing and should be gotten rid of. Since it wasn’t very profitable in most of the country, people generally assumed it would die out on its own.
Over the course of the 19th century, as the cotton gin and the cotton boom made slavery very profitable indeed, people’s ideas about it unsurprisingly began to change. By 1860 it was almost universally believed in the South (by white people) that slavery was a positive good and should be expanded indefinitely in time and space.
Plenty of triumphalist books, speeches in Congress, and of course the infamous Cornerstone Speech demonstrate this.