Next, study your geography and history. They (His "children") never got the land back.
God gave our founding fathers America. Many Americans embraced Christ, and because of that, God blessed us richly. But about 50 years ago we permitted a movement to hijack the country. It's called humanism. It's an ancient philosophy that exalts the individual human as the ultimate source of morals, values, and ethics.
These humanists in education, government and media, evicted God from the country and gave America over to baby murderers, faggots (despicable sub humans for which God destroyed whole territories), and degenerates.
Do you not see a correlation? He's giving our country to our enemies for our sins, and there's nothing you can do about it. America is a misbehaving child and is going to be punished.
If you really believe that, go curl up and die somewhere, and spare the rest of us your defeatist moaning.