I wonder when supposedly “Christian” people will stop watching movies that center around Christian themes and are produced, directed, written and acted by people who openly hate Christianity?
It makes no sense to me.
If a Hollywood movie or TV show mentions God, church, Jesus, heaven, hell, etc., I immediately turn it off. I don't want to be propagandized about issues of faith by the people who control our entertainment industry. I will go to church to hear Bible stories and find out their meanings.