Are we sure the Allies did win WWII? I don’t think the U.S. did.
Just like always, we won the war but lost the peace.
Just like we won the Cold War over the Soviet Union, but the commies won the propaganda war inside the US.
The commies took the battle to the schools, colleges, and media. McCarthy tried to warn us, but instead we listened to the media, and so the media and leftist professors rewrote history and defeated us from the inside.
I know, right? Crazy.
Considering we had Woodrow Wilson as president during WWI and FDR for most of WWII, and knowing what Marxists they were, I'm shocked we didn't join the Axis.