Dianna Agron makes some pretty big claims about the way women are treated in Hollywood. The ‘Glee’ star says the industry is hugely sexist and it’s getting worse.
Video: 'Glee' Star Dianna Agron Says Hollywood Is Sexist and Getting Worse
Hollyscoop