I know that's become conventional wisdom among conservatives, but I don't believe it. As near as I can tell, the Dem Party has been home to the American Left since at least the end of the 19th century. However, it is true that their anti-American, anti-traditionalist radicalism blossomed into full public view in the 1960s and has only gotten worse ever since.