The Role of Women in Western Society

on Thursday, April 11, 2013

Women have been dominated by men since the earliest human societies. For centuries they lived mainly to serve and obey men: their principal role in society was to bear children, keep the house, and remain faithful and loyal to their husbands. They were not given an equal place with men. Even today, women are kept under near-total male domination in many countries, including Saudi Arabia, Afghanistan, Egypt, India, Pakistan, Indonesia, China, Malaysia, and many African nations.

Women in these countries are not encouraged to pursue higher education. They are not allowed to speak freely, and they are forced to dress according to the will of their fathers or husbands. Most are denied the right to take part in activities such as work, sports, and business. They are treated as little more than glorified housekeepers.

Until recently, women in some Western countries were likewise barred from higher education and from many public activities, but the situation today is different. Western women hold a range of political and legal rights that allow them to pursue higher education and take part in as many activities as they wish. The Western world grants women broad freedom: like a man, a woman can take part in virtually any activity.

A large proportion of women in the West are well educated. Many hold advanced degrees and play a key role in their countries' economic growth through business and government positions. Like men, most Western women are employed and earn substantial incomes over the course of their careers.

As we know, every coin has two sides. Alongside its many benefits, feminism has had some harmful effects in these countries. Pregnancy before marriage, among other factors, has contributed to the breakdown of families in the West, and divorce has become commonplace.

Traditionalist intellectuals such as Allan Carlson and Carolyn Graglia believe that the change in women's role in the Western world, from caring mother to self-identified, trained professional, has created a social disaster that continues to take its toll on the family. There are differing points of view regarding the freedom of women in the Western world: many professional, experienced, and well-educated liberal women in the West are quick to endorse the importance of business and employment.

While the position of women in neo-liberal Western society is well established and of great importance, some thought should still be given to the real extent of their role in society, business, and the family.


