Presumably this will free Hollywood and the Oscars from the "Oscars So White" stigma. Now, films featuring predominantly white actors and directors will be free to collect awards because they have black actors in the scenery. Or will it be a sincere effort to "right a wrong"? Hint: When was the last time Hollywood, the home of those who fake sincerity for entertainment, produced anything sincere?
Not a chance. Nothing is ever enough for the left.