One usually goes with the other. The Oscars are part and parcel of American movies.
I disagree.
The Oscars are an awards celebration where the awards are handed out by the participants themselves... it is a pat-ourselves-on-the-back affair and NOT indicative of what the ordinary "man/woman on the street" feels about the work being presented. It is an opportunity for the industry to TELL us what we should like or love about the work it has put out over the past year.
Look back at the history of this awards presentation and you will see example after example of movies and people given awards they did not truly earn.
American culture, vis-à-vis the movie industry, is truly served and reflected by the common folk who view the art as presented and respond to it through their actions in everyday life.
The Academy Awards are purely about the artists awarding themselves and one another; they are not about the common man or woman and their opinion.
Movies take from the culture. Sometimes they faithfully represent it and sometimes they bastardize it. Too often it is the latter.