English departments no longer do much teaching of English literature. An awful lot of it is what is commonly called "gender bending": women in the novel, women writers, the evil crimes of dead white male writers, the postmodernist significance of race, class, and gender, and so forth.
It's really a shame. I suspect most of these departments are losing students, since any normal person takes an English course because they enjoy reading poems, plays, or novels and wants to know more about them.
So now it's turning from attempted mutilation of students' minds to self-mutilation, literally.
This raises the question: are there any colleges that are not mental institutions? Seriously, what are some healthy universities and colleges?