And further, when they do make a movie with carefully crafted dialogue and good acting (without explosions, etc.), it's pushing abortion, or mercy killing, or blaming the US for global warming, or whatever.
Hollywood is in denial. They think they're squarely in the mainstream, that their movies are extremely popular, and that they'd be making billions upon billions if it weren't for pirates.
The only two American movies I've seen in recent years that left me with a "wow, that was GOOD" feeling were Return of the King and March of the Penguins.