“The German government has announced a package of legal measures aimed at fighting right-wing extremism.”
The USA should announce “a package of legal measures aimed at fighting [left]-wing extremism.” What’s good for the goose is good for the gander.
I have some history books written during and right after WWII.
They state that the conservatives (the right wing) were the last holdouts against the Nazis, which makes sense, as the Nazis were left-wing socialists/fascists.
History books tend to be written by history professors, who in the USA have become increasingly left wing.
One, or some, of them (I haven't figured out exactly when this started, though it was around the mid-1950s) began calling the Nazis "right wing," and ever since, the "right" has been tarred by the left with this lie.
Perhaps some other Freepers into history can look into this, find more examples, and try to figure out who started it?