Actually, Hollywood has been historically good for America. It has spread American ideas and culture abroad, and it has been a huge money maker for our country.
But I sympathize with those who would like to see Hollywood die. Instead of spreading ideas like freedom and the American way around the world, or American-style humor and romance, it is spreading sexual license, perversion, and hatred of America. It still makes a lot of money for our country, improving our balance of payments and providing a lot of jobs, but the people running the industry today are mostly a bunch of sick, self-hating, sorry excuses for human beings.
I don’t know what it would take to bring Hollywood back to its senses, but that would certainly be preferable to watching it destroy itself as it tries to bring down the country with it.
There are plenty of good movies every year. You just have to seek them out. Last year was unusually strong, actually.