I have some history books written during and right after WWII.
They state that the conservatives (right wing) were the last holdouts against the Nazis, which makes sense, as the Nazis were left-wing socialists/fascists.
History books tend to be written by history professors, who in the USA have become increasingly left wing.
At some point, one or several of them (I haven't figured out exactly when this started, though it seems to have been the mid-1950s) began calling the Nazis "right wing", and ever since the "right" has been tarred by the left with this lie.
Perhaps some other Freepers who are into history can look into this, find more examples, and try to figure out who started it?
If one views the political spectrum as running from no government control to total government control, then anarchists and hippies fighting the man would fall on the left, while leftists and conservatives who believe in limited government would fall somewhere to their right.
As the leftists seize more control of government, they become the true far right on that spectrum, yet they label conservatives the fascists... I blame the educational system and its indoctrination.