In fact, America was initially viewed favourably because it was perceived to be anti-colonial. This perception faded after World War Two, however, when America became the dominant Western influence in the Middle East. The first way in which this is evident is through America’s ties to oil. Whilst America officially began accessing Middle Eastern oil in 1928 with the Red Line Agreement, it was only after World War Two that it truly intervened in the region’s politics. This became evident in 1953, when America joined Britain in Operation Ajax to overthrow Prime Minister Mohammad Mossadegh and restore Shah Mohammad Reza Pahlavi to power. The two countries acted because Mossadegh had nationalised the Anglo-Iranian Oil Company, and they feared the Soviet Union would gain further influence over the Iranian government. Yet this was not the end of America’s involvement.
Firstly, during World War One Britain agreed to help create an independent Arab state, yet after the war it, hypocritically, colonised the Middle Eastern region and promised its territory to other nations. The West did this again in later years, when it seemed to side with the concept of an Arab Palestine but altered its view at the end of World War Two. Even after this, the West, its leadership now shifting to America, continued to interfere in Middle Eastern politics, especially by trying to act as the peace-broker of the Arab-Israeli conflict. Therefore, it is evident how Western policies shifted and intervened in Middle Eastern affairs.