I think Hollywood's gotten more reactionary and conservative over the years, because there's no longer art in Hollywood. Art suffers in Hollywood.
Even in such a lowly art as TV, you've got to get stuff off your chest, because that's what makes something different and original: your particular take on stuff.
I'm not a film snob at all. I much prefer a really good Hollywood blockbuster to a thought-provoking art house movie, because entertainment is sort of where it's at.