Will Hollywood ever stop portraying whites negatively?
Al Bundy, Archie Bunker, the list goes on and on, from the beginning of movies right through to today.
Lazy, stupid, racist, childish, greedy, adulterous, drunkard, “whitebread”, prejudiced, “evil rich”, privileged, thief, hypocrite, etc., etc., etc. If you’re smart, you’re a criminal. If you’re a cop, you use excessive force. Even if a white male character is portrayed as a “good guy”, he’s got to have tons of evil, rotten, wicked traits as “baggage”. He’s a good guy, just a little deviant, or a perpetual loser, a drunk, etc.
For every “good guy”, Hollywood portrays several “bad guys”. And the “bad guys” are glorified.
The same could be said of the way white women are portrayed: mostly negatively.
In Hollywood’s eyes, everyone else is a saint, while white people are the “root of all evil”.
Well, unless they jump on the “progressive” bandwagon and support socialism/communism/new world order/globalist foundations/etc., otherwise known as the Dem party. Then, in Hollywood’s eyes, they are free to commit every crime and immorality imaginable - and then some. And they’re still OK, because they’re the minions of the “elites”.
Not really — white folks are not the specific stereotypical “evil dude” — that’s English-accented, German-accented, or Russian-accented folks.