Is it just me, or is anyone else just sick and tired of this issue being in our face all the time?
I have gays in my family, and gay friends. But it seems nearly every show has to put this issue front and center at some point. From Bones to The Good Wife, sooner or later homosexuality gets thrust into the spotlight.
Then there’s Glee. Ugh. It’s getting so tedious. If you’re gay, I’m not sure all this exposure is helping your cause. There’s a backfire in here somewhere.