The history of American women is about the fight for freedom, but it's less a war against oppressive men than a struggle to straighten out the perpetually mixed message about women's role that was accepted by almost everybody of both genders.
You hear younger women say, 'I don't believe I'm a feminist. I believe women should have equal rights and I believe in fighting for the rights of other women, but I'm certainly not a feminist. No, no, not that!' It's just a word. If you called it 'Fred,' would it be better?