Should People Be Taught Not to Rape? Essay

People should be taught not to rape rather than not to be raped. Oftentimes, people will deny that they have raped someone, using excuses like "they wanted it" or "they asked for it." The dictionary definition of rape is "unlawful sexual intercourse or any other sexual penetration of the vagina, anus, or mouth of another person, with or without force, by a sex organ, other body part, or foreign object, without the consent of the victim."

People also frequently confuse feminism with misandry, the hatred of men, but that is not what it is at all. The Merriam-Webster definition of feminism is "the theory of the political, economic, and social equality of the sexes."

You can often hear people denying that rape culture exists, a claim I believe to be extremely false. Rape culture is the idea that society teaches us not to be raped rather than teaching people not to rape in the first place. It plays a part in almost everything we do or are associated with, including dress codes (specifically for females), social media, magazines, and TV; it is a standard that has become an everyday normality to us. People are taught to change themselves to prevent rape, which can roughly be interpreted as victim blaming. For example, in most modern schools, you could find a girl being dress-coded for wearing a tank top or shorts that are "too short" because they are "distracting" to other people, specifically boys.
Somewhere along the way, boys have inherited this reputation of “boys…
