In 1941 the United States was drawn into the Second World War when the Empire of Japan attacked American naval and air bases in Hawai'i and the Philippines. Was U.S. entry into the war inevitable, or could the United States have avoided being pulled in? Why or why not?
