Maybe it's my imagination, but I seem to be seeing more stories about sexual abuse of animals.
What the hell is happening to this world??
I've been seeing more stories of physical and sexual abuse of animals in the news, both local and national.
I want to think it's just leftover from Obama, and that once we have Trump for 8 years we'll have a better society. The leftists seem to have brought out all the perverts. This is just my theory.