I must be oblivious; I never noticed any leftist stuff in them, though I'm not surprised if it's there. Many movies seem to have an anti-American reference in them somewhere.
I don't watch too many movies, or even much TV. It's a shame the whole arts world leans to the left.