Seriously, Hollywood? You’re the ones making such a big deal about having ‘strong roles’ for women and minorities; you’re the ones making the films that supposedly lack those qualities; and then you blame the audiences for being bigots because we…what? Go see the movies you actually make?