Essential films that go behind the scenes of Hollywood
Cinematic moments where Hollywood held up a mirror to itself
The lore of show business runs so deep that films have been made about the drama that unfolds behind the scenes, drama that often eclipses what ends up on screen.
And who better to tell Hollywood's stories than Hollywood itself? Click through for the essential films that offer the most engrossing (and only somewhat biased) depiction of what Hollywood is really like.