Define Manifest Destiny

Manifest Destiny was an ideology that emerged in the 19th-century United States, holding that American settlers were destined, by divine providence, to expand the country's territory across the North American continent. Its proponents believed it was the duty, the "manifest destiny," of the United States to spread democracy, capitalism, and the American way of life to the western frontier. This ideology justified and motivated westward expansion, contributing to the acquisition of territories such as Texas, Oregon, California, and the Southwest through treaties, wars, and other means.