If people would take note, there are constant instances of women slapping, hitting, kicking, and doing other violent things to men on television and in the movies. This is presented as funny or, worse yet, as the appropriate thing to do.
It's no wonder this woman felt empowered to do what she did.
I mentioned to a coworker that a friend of mine had been assaulted by a woman years ago, and the coworker thought it was hilarious. This mindset has been instilled in women by the major media outlets.
Ask yourself whether it would be appropriate for the media to show men doing these things to women, in a setting that wasn't trying to demonize the man, and you'll begin to see the point I'm trying to make.
It's wrong for women or men to use physical force unless they are defending themselves or someone else from physical harm.
At some point, men are going to have to start taking female violence against males seriously and start treating violent women as criminals. Until then, as long as men let women get away with this, women with violent tendencies will continue to assault men.