Historically, gender roles kept women out of the workforce and confined them to the home. This was a problem because most women were solely dependent on a man and could not acquire any financial independence of their own. As women began to gain independence and working outside the household became more accepted, many felt that women were abandoning their domestic duties and taking jobs that would otherwise have gone to men. However, equal participation in the workforce by both men and women actually benefits the economy.
Americas Quarterly outlines how gender equality benefits a nation as a whole. The publication states that female political participation is constructive (responding to women's rights as citizens; enriching political discourse, decision-making, and inclusiveness; and improving social conditions), and that women's economic presence also benefits the country, producing societies better equipped to reach their economic and social goals.
The term feminism, as defined by Merriam-Webster.com, is the belief that men and women deserve equal opportunities, treatment, respect, and social rights. Or is it? The movement has undeniably made many gains in the trek toward a prosperous, egalitarian American society (and world). However, is modern feminist practice really all about equality, as it claims to be? Although feminism has made great advancements in cultures worldwide, perhaps it does not live up to its definition and is, in fact, less focused on gender equality than on evening the score. Or so some people say.
In a 2013 Economist/YouGov poll, 72 percent of respondents said that they did not consider themselves feminists. More women than men – 38 percent versus 18 percent – identified as feminist, but in neither group did a majority use the label. Why then, when feminism has done so much for the equality of the sexes, are people so reluctant to wear its label?