Somehow in America women seem to have become a super privileged class of humans who supersede the value of everyone else.
When did this happen?
It is demanded that we always believe women.
We are forever told that the “choice” rests with the woman alone. Really, the child has no choice to, you know, not have their life ripped from them? A man, for whom pregnancy is not possible, has no valued say, as if the male is of no worth.
And don’t get me started on every third woman in society opening every public statement by trying to elicit pity with “As a single mom...”, as if this is pity code for “the male left, and of course it was his fault, so I’m a poor, pitiable, wronged female.”
Women have absolutely no moral right to terminate a child. You chose to have sex. You chose not to use contraception. Know what? There are times in life when you may have to shoulder a heavier load. Just because you are female does not mean you are any more morally worthy than any other human being.
This idea that women have somehow become society’s unassailable privileged class is abhorrent.
“Single mom” used to mean widow, or maybe “got dumped by a cheater” ...
Now, it sort of means “slut”. But at least it means “slut who didn’t slaughter her kid” ... so at least there’s that.