Is the left deliberately letting it slip that they hate this country, all the way back to its founding? At this stage, they're not attacking a particular philosophy or political point of view but the very roots of America.
We've been saying this from day one, and now they don't even make much of an effort to hide it.
Leftists hate a lot of things and love little, if anything.