
The Women Who Changed Hollywood

As the prestige-movie and awards seasons approach, a survey of the women who transformed an industry through art, persistence, or both. Read the full story on VanityFair.com.