I’ll tell you what feminism really is. A lot of people think it’s just man-shaming or man-hating, and that’s totally wrong. Feminism is not just about women vs. men, either. It is about sex, gender, race, rights, and social “norms”.
Take this, for example. When a mother is in labor and the baby pops out with a vagina, the doctor says, “Oh, it’s a girl!” and everybody goes, “Aww, let’s run to the gift shop and buy a bunch of pink dresses and bows” — and vice versa with a boy. Why has society labeled girls and boys with a color? When I tell you I have a daughter and she’s a tomboy, what exactly do you picture? Probably a little girl in baggy shorts and a shirt, playing in the mud with a backwards baseball cap. That’s not technically your fault; it’s just what society has made it out to be.
Whenever I hear people talking about feminists, I hear that “they” are nothing but ugly lesbians who hate men, have ugly personalities, can’t get anyone, etc. There is so much more to feminist groups than that.
Why do men get paid more for doing the exact same job as a woman — a custodian, for example?
Why does a black woman get paid less than a white woman for doing the exact same job?
I’ll tell you why, SOCIETY.
Please don’t be shy — post your opinions. Thank you!