The United States and the World

Throughout much of the second half of the nineteenth century, U.S. interest in foreign policy took a backseat to territorial expansion in the American West. The United States fought the Indian wars while European nations carved empires in Asia, Africa, Latin America, and the Pacific.

At the turn of the twentieth century, the United States pursued a foreign policy consisting of two currents—isolationism and expansionism. Although the determination to remain detached from European politics had been a hallmark of U.S. foreign policy since the nation’s founding, Americans simultaneously believed in manifest destiny—the “obvious” right to expand the nation from ocean to ocean. With its own inland empire secured, the United States looked outward. Determined to protect its sphere of influence in the Western Hemisphere and to expand its trade in Asia, the nation turned away from isolationism and toward a more active role on the world stage, one that led to intervention in China’s Boxer Uprising and war with Spain.