How can this possibly be a surprise to the feminist? A couple of things I admire about Islam are its family orientation and its roles for men and women. While they take it to an unacceptable extreme, they have clear, mutually supportive roles for the genders, much like we had forty years ago, when most mothers were stay-at-home moms. How has our culture improved since feminists took over?
I am far from an expert in Islam, but the things I mentioned are common knowledge, especially the supportive role of women.
In fairness, I've heard that sort of thing from some Christians.