Speaks volumes about how Hollywood views America.
I don't care why it didn't win, and I'm not surprised. For all their claims of being "open minded" or "progressive" or "out of step," they're no better than the rest of us. Even the actors themselves seemed uncomfortable with the publicity the film got. It was a cultural moment (which kind of worked against it) that the actors didn't really seem to want to embrace, and Hollywood was just like them. The Academy voters can feel "good" about themselves because they gave the director an Oscar for his risk-taking.
Just like they can feel good about being paid to pimp for charities, for example. They'll rationalize it all and blame conservatives, or homophobes, or older, out-of-touch Academy voters.