Essay On The Effects Of Advertising In The United States
Since the industrial and economic expansion after World War II, the United States has shifted toward a society based on consumerism. In the modern era, not only are goods and services more accessible to consumers than ever before, but a single product may come in numerous options from which to choose. This is the result of technology being utilized to produce more goods at cheaper prices, which in turn alters the supply and demand curve. Yet the biggest influence on the way Americans consume goods and services is directly linked to modern advertising. Advertising has become an industry of its own. The average person sees so many ads per day that he or she becomes desensitized.
This has changed how Americans dine. Vendors purposefully make unhealthy food look more appealing on television by using special make-up techniques, such as adding simulated water droplets to a glass of soda to suggest that the soda will quench one's thirst, or airbrushing extra color onto hamburger buns to make the sandwich look toastier, fresher, and more appealing than what one would find in an actual restaurant. This advertising drives people to eat at these establishments, which not only changes what foods are consumed but also directly affects their health. It is known that eating releases dopamine, the brain's "feel good" neurotransmitter, which can cause individuals to become addicted to food as if it were a drug. Fast food is greasy and full of sodium, sugar, and carbohydrates. In moderation, these foods are not completely terrible for a person, but when they are eaten too frequently, as the commercials suggest people should, health problems arise. Problems such as obesity, cardiac disease, and diabetes may all develop over time. This is an effect of advertising: because the market is able to inform more people about its products, more people are consuming unhealthy food, becoming addicted, and ending up with diseases they might not have developed otherwise.
Rape Culture and Gender
"Rape culture" is a term that describes how a society normalizes sexual assault committed by men. This idea may stream into the minds of young men through television shows, pornography, movies, and advertising. Advertising takes part in this