Who cares if Hollywood suffers a financial hit? They’ve been culturally irrelevant to the average American for years. Even ignoring the leftist ideology pushed by the celebs, the storylines are tired and the characters difficult to relate to. Haven’t set foot in a movie theater since before the scam-demic.
That’s true. There’s nothing out there except pointless sequels, reboots and remakes.
Any trace of goodwill they had, they threw in the trash a long time ago. You can’t make good movies for Americans if you hate America.