What happened to Feminism? When did it become a dirty word?
"In the last thirty to forty years women's lives really have been transformed, for better and for worse, and feminism has played a role in that transformation" (Monica Dux and Zora Simic, The Great Feminist Denial). So why don't young people identify as feminists? What went wrong? Feminism is not a simple issue of equality - it's complex.