In this day and age, feminism should be embedded in society. It should be normal for a woman to go through her day treated as an equal to men; it should be normal for ‘strong’ female characters to appear throughout the media; ‘alpha females’ should not be a rarity.
Instead, it’s a concept that only a few seem to truly embrace, and that’s pretty sad. Feminism isn’t burning bras, living alone with cats your whole life, refusing to shave your legs, or shaving your head. It’s not. Feminism is simply the recognition that women are equal to men; it’s about equality for both sexes. At heart, it’s the idea that you can be whatever the hell you want to be, regardless of the junk between your legs.
Thank you. You just perfectly summed up what I’ve been feeling for a while now.