It would seem like a lot of people have forgotten what the definition of feminism really is. Here, just so you don’t get further confused:
the advocacy of women’s rights on the basis of the equality of the sexes.
It’s not man-hating, it’s not blaming men for everything, and it’s not blaming a patriarchy that has changed over the years to include more and more women. It’s definitely not the divisive force it’s been made into in the recent past.
Men and women can’t ever be fully equal, at least not from a biological standpoint. We’re formed differently and our bodies have different functions, but at the core we are the same. That being said, real feminists don’t seek to tear down the opposite gender; they try to work with men to gain equal rights and an equal say within this country.
Women do deserve equality, but ladies, if you want to be completely equal to men in everything but biology, then you’ve got to take the bad with the good. You don’t get to pick and choose which parts of the legal system apply to you. You can’t say you want to be treated equally and then say, “Oh no, I don’t want that part or this part.” Sorry, things don’t work that way. Would it be that way if our roles were reversed and women had been dominant for so long?
In an equal world, yes.