Feminism destroyed America.
As far as I'm concerned, feminism didn't just destroy America; it destroyed male-female relationships too.
No. Americans allowed feminism to destroy America. Look at who people voted for in 2008 & 2012. SICK!