Redefining Gender Roles?
Discussing and reading Lavaque-Manty's and Mill's views on women and feminism made me think a lot about the term feminism.
For almost all of my life, it seems that feminism has had a negative connotation, and I thought that way too. Growing up, the term was always said with the underlying assumption that the woman you were talking about wasn't a very nice person. For example, when talking about a certain teacher: "Man, I hate Mrs. Robinson. I always have to listen to her crazy radical views on equal rights for women."
I had a self-proclaimed feminist teacher in high school who just seemed to rub everyone the wrong way. And not just the guys, but the girls as well; in fact, the girls were more often the ones with the negative views.
Famous women's rights advocates (feminists) are never seen in a negative way, so why does that word have such a negative connotation? Maybe men feel threatened. I know there is always this tension between couples about "who wears the pants," which usually leads to disagreements, and I'm no different. Maybe it has to do with the roles that society has assigned to us as men and women.
Whenever I go out with my girlfriend, I make sure that I drive because, for some reason, I just feel uncomfortable having a girl drive me around (and no, that was not a joke about women's driving abilities). Then when we get to dinner, I always feel obligated to pay, which I end up doing most of the time, because if a girl buys dinner, society says something is wrong. But what evidence is there that men have more money than women? Nowadays, women and men are educated equally.
Then, when marriage rolls around, it is always the man's job to propose to his girlfriend. I know not a lot of college kids are married (I'm not), but how many of us guys actually got asked to a school dance that wasn't Sadie Hawkins? Eventually the couple has kids, and most of the time Mom stays home with them, but what's wrong with Mr. Mom? Why is there this stigma around stay-at-home dads?
Maybe these roles aren't such a bad thing. I personally don't have a problem with them, but where did they come from, and will they ever change? Will we ever see women proposing to men on a regular basis? Or more stay-at-home dads?
I end with this video. It is not a deep, philosophical clip with a hidden message. It's simply a very funny and entertaining video that deals with the gender roles most familiar to us college kids.