Not all women or women directors are woke. Your generalizations regarding women are skewed… no comment as to why.
Given our culture as it is and the way things work in Hollywood, yes, they're all woke.