The Western World, once a bastion of Christianity, is now spiraling down the path of Sodom and Gomorrah.
Shocking, but not unexpected, given a culture that celebrates and worships Hollywood.
Barring a religious revival, the West is finished. I'm not betting on one.