For most of the war the United States declared neutrality so that it could trade with both sides without "getting involved." It held to this position until the Axis powers' true intentions, and what they were capable of, became clear. After WWII the shift to a policy of intervention was most apparent. The United States falsely believed that if it stayed out of Europe's affairs, the problems that followed would not concern it: "Americans believed that they were immune from Europe's problems as long as they refused to get involved." They were obviously wrong, as the attack on Pearl Harbor showed, when Japan drew the United States into the war and it joined the Allies to defeat the Axis powers. This wasn't the first time the US was forced into a war, but after this one everything was different. The United States finally realized that it could not stay secluded any longer and had to play an active role in global situations even when they did not involve it directly. The US quickly adapted to its new policy and immediately put it into use. After WWII the United States became the big brother of the world, since every other country was devastated and needed money and help, and the United States could assist in both of those areas.
The war saw women enter the workforce, saw the US change from isolationist to interventionist, and saw the United States become a global superpower from then on. If the US had never entered the war, these events probably would have happened years down the line, or maybe not at all. Until then, the United States had only been blindly following what George Washington told them, not adapting to a changing world.