The History Bank - Teaching & Learning Resources

How did the USA emerge from the First World War?

The USA had been reluctant to get involved in the First World War. Throughout the nineteenth century, the USA's official policy was to isolate itself from European politics: it did not take sides in the disputes that affected Europe, nor did it enter alliances that might drag America into a war. Yet in 1917 the USA joined the war on the side of Britain and her Allies and made a decisive contribution to the defeat of Germany and the Central Powers. However, the question many Americans were asking in 1919 was: 'Was it worth it?'



Lessons & Resources » What was the USA like in the 1920s?


