Are Gender Roles Necessary?
February 10, 2015
Gender roles have existed since the beginning of recorded history, determining what women and men are expected to do. Unfortunately, our options, our jobs, and our beliefs have been dictated by these roles. Whether they are necessary or beneficial to society is the central question. Gender is something every single person is involuntarily influenced by at some point in life, and it can unfairly define the roles, behaviors, and physical characteristics a person is supposed to have. These roles are not necessary and can actually be very harmful to men, to women, and to the harmony between them.
Many cultures around the world strongly enforce these roles on men and women, creating gender inequality and gender gaps. Cultural practices such as foot binding in China; sati in India, in which a widow throws herself onto her dead husband's burning funeral pyre; and female genital mutilation (FGM) in some African tribal cultures have deemed women inferior and useless without men. These birth-given roles have altered the lives of countless women and men. They have become embedded in the culture and now persist as culturally acceptable forms of gender inequality, widening the gender gap for women across the world. As the culture is passed down, so is the inequality.
The gender roles enforced today have produced an imbalanced and unequal society. For instance, men and women are rarely paid the same for the same job; in most cases, the man earns more even when both are doing identical work. This shows how men are viewed as more capable while women are seen as inferior. Many believe women do not belong in the same workforce as men and therefore should not be paid equally. Women are also the primary victims of sexual harassment, in part because the role assigned to them casts them as weak and vulnerable. Society assigns women the role of “housewife” or “dependent woman,” which makes it extremely difficult for them to gain respect and be seen as strong. These gender roles appear in every society, and their side effects run even deeper.
Because these roles are established in almost every culture and society, they have altered the relationship between men and women in a dramatic and harmful way. Men are told to be tough, strong, and macho; women are told to be gentle, kind, and sweet. These sets of traits are complete opposites and do not work well together. Sometimes the sexes enforce the gender roles themselves without knowing it. Men often locate their sense of manhood in strength and power, so when a woman appears to have power over a man, he feels threatened and fights back through physical violence or mental domination. In the same way, when men fail to follow their gender role of being strong and tough, women tend to shame or ignore them. Ultimately this is very harmful to society and to each gender because it creates a battle of the sexes, taking away any chance for equality and leaving more room for judgment and hatred toward each other.
Take a look at our own school today. The gender roles are there if you look for them. How many female chemistry teachers do you know? How many male English teachers have you seen? Not many. Women have been told not to believe they can become scientists or great inventors, while men have been told they cannot get in touch with their feelings and become great poets or writers. This has led to a large imbalance between men and women in the workforce. At school, girls are stared at for not wearing makeup or not being skinny enough, and boys are laughed at for not being tough or big enough.
These enforced gender roles have created an uncomfortable and judgmental society. They have taken away the peace a society or a school needs to advance and succeed. Roles that define our lives and dictate who we can become are only constraining. Clearly, gender roles are not necessary at all.