Westward Expansion refers to the period of American history during which the United States extended its territory from the Atlantic coast to the Pacific coast of North America. Spanning the 19th and early 20th centuries, this expansion shaped the country into the nation it is today. Click below to learn more about specific topics related to American Westward Expansion.
READINGS & SOURCES
TEACHING RESOURCES & LESSONS