I’m in my mid-50s and cannot believe what has happened to this country.
America has rotted in a way that only a post-Christian country can.
Will it ALWAYS be post-Christian? I have the strongest sense it won’t. That God’s last hurrah has yet to be seen, even here.