Has our society become so demented that rape is regarded as no more significant than someone grabbing and shaking your hand without permission, so that people now have to emphasize its violent aspect just to make it seem serious?
Rape is an act of violence in which sex is the weapon. The perpetrators meant to degrade and demean their victim and, in this way, set themselves above her. It's a common enough human trait, and it's why God had to reestablish the Ten Commandments - people without God grow increasingly depraved.