The best films about women's rights
There's far more to film than dominant male characters and bravado
Women have played a huge role on screen since the days of silent cinema, and these days many films are telling important stories from women's perspectives. Don't believe it? Read on. Here are the films that have told important stories about women's rights, or featured incredible female leads, and will go down in history.