The United States has been at war every day since its founding, often covertly and often in several parts of the world at once. As ghastly as that sentence is, it still does not capture the full picture. Indeed, even before its founding, what would become the United States was engaged—as it would continue to be for more than a century afterward—in internal warfare to piece together its continental territory. Even during the Civil War, both the Union and Confederate armies continued to war against the nations of the Diné and Apache, the Cheyenne and the Dakota, inflicting hideous massacres upon civilians and forcing their relocation. Yet when considering the history of U.S. imperialism and militarism, few historians trace their genesis to this period of internal empire-building. They should.