Feminism is *really* a left-wing, anti-Western movement that favors bigger government and mandated equality of outcomes between men and women. I remember reading somewhere about how Gloria Steinem was pushing for more female firefighters. Never mind that women can't lift as much weight as men. Steinem's solution to that problem was that women could drag people out of burning buildings by their ankles.
There was another instance where a European feminist, in response to being raped by Muslim men, said that women should dress more modestly. You can all be sure what her response would have been had the rapist been a white man.
The reason these feminists are silent about the abuse of women in Muslim countries is that their main target is white, conservative, heterosexual Christian men. That is how they plan to bring down the West. They want to turn America into a Third World banana republic.