Without question. Hollywood has been consistently attacking Christianity for three decades.
Hollywood films have always glamorized immorality.
Very few films have not promoted immoral thinking.
Even the best, if you look at them honestly and you know the Bible, were, in modern parlance, sleazy. And the main characters, the heroes, were often just as sleazy as the “bad guys”.
It’s almost always men chasing women, with the women’s fathers either nowhere in the picture or presented as the “bad guy” who won’t let his daughter be with her “true love”.
This is how the replacement of marriage in America with a constant stream of “romantic” sexual encounters came to be seen as a good thing, and how the idea that a father’s approval should be sought before courting his daughter came to seem first dated and then actually scorned. Seeking that approval is now viewed as depriving a young woman of her “freedom”, when in truth, bypassing the father removes his protection of his own daughter from lechery.