Many women, he adds, have told him that it is more socially affirming to work outside the home than to give up their careers to take care of their children. That ideology, he says, has been shaped by feminists who demean the work of women who stay at home as primary caregivers. He's got that right. I think women should work if they want to, but it's disgusting how those pretending to care about women have done their best to push women into work they don't want, by demeaning choices (like having a baby) that the feminists don't like.
Men are pushed into work they don't want all the time, through the demeaning of choices like the father being the primary caregiver.