The United States was initially neutral in World War II but entered the war after Japan's attack on Pearl Harbor on December 7, 1941. The U.S. played a major role in the conflict, supplying military and economic aid to the Allies, fighting in major campaigns in Europe and the Pacific, and ultimately helping to defeat the Axis powers.