The irony here is striking: the market, the pursuit of profits, the love of money, or whatever you want to call it, may actually revive morality in Hollywood. Or rather, it may create a veneer of morality, an appearance of it; Hollywood itself will never be a moral place.
I have to agree: Hollywood will never be a moral place, but I'll take whatever I can get when it comes to putting more positive messages and better role models in front of audiences.