Why should anyone fight for a country that has ‘leaders’ who fight against it?
Today the USA isn’t worth fighting for. Students I have taught believe D-Day was a terrible mistake and that all the GIs who fell were fools who died for nothing. Why “liberate Europe” when, within a year, we had the A-bomb and could have vaporized Berlin and Hitler from a bomber and ended the war? In this woke, de-fanged America, only crime pays, and it’s all about just two things: the bucks and fame. Corruption is embraced at the highest levels, and all that was once seen as “good” is now bad: marriage, children, patriotism, and religion (save for Islam, whose adherents will die for Allah and take you with them). The flag is mocked, and even the anthem is being replaced. It’s all about race, gender, and control. Nothing worth saving. If WW III starts, who will fight for Biden’s woke America? Why should they?