For years the movies coming out of Hollywood have centered on cramming in curse words, sex scenes, and nudity over actual plots. The movie makers are obviously making a play for the actors, male and female, and the kids (and don't think we don't know about that), or they don't get work. They must think we are blind.
Half the time I think Hollywood has a secret Satanic cult you have to belong to in order to work there. The other half I just think they are narcissists without conscience.
Having worked in the industry, I can’t disagree.