Considering its popularity, Western medicine must be the best way to treat health problems, right? If not, why do we continue to use these practices if they aren’t helping us? More importantly, if they aren’t helping us, are they hurting us? Western medicine remains remarkably prevalent because the health industry is one of the most lucrative in the United States, and people are willing to pay a high price for a “magic cure-all.” In addition, pharmaceutical companies provide money and gifts to medical schools in exchange for advertising, giving many medical students a biased education. Western medicine also has many health-related drawbacks, including negative side effects, weakened immune systems, stronger bacteria, and the lazy mentality that “quick-fixes” tend to encourage.