We live in a society with set norms to follow, especially around gender roles. It is believed that a man has to go out and work while the woman is meant to take care of the home and children. A lot of this has been changing in recent times, but there is still a long way to go. The change I see today is this: yes, the man has to work, and the woman also has to work and earn, but the man may not take on the housework, whereas the woman, along with earning, still has to take care of the housework and the children. And if for any reason one of them has to sacrifice their job, it will mostly be the woman who does so.
I do not see any problem with gender-defined roles as long as they are mutually agreed upon, with each other's consent and respect in place: the man can work and earn, and the woman can take care of the home and children, with her financial needs taken care of by the man, just the way she takes care of the home and, in turn, of him.
But I do have a problem when it is believed that a woman should work and earn, and then by default is also responsible for taking care of the home and children. A woman is considered the physically weaker sex, and yet she is expected to do more physical work than a man does. A woman will be truly empowered only when society changes to the point where she is in a position to decide that her man will quit his job and take care of the house. She will be empowered when she is in a position to dictate the terms.
Nowadays the whole empowerment movement is focused on women getting out of the stereotypical role of homemaker and becoming working professionals. But this empowerment should not mean that a woman is burdened with additional responsibilities. Otherwise, I feel it is perfectly okay for a woman to stay home and take care of the home and children, and let the man go out and earn the money to provide for the family.
Thank you for visiting my blog. 👼🏻👼🏻💖💖🌹🌺🌸