The United States Is A Colonial Empire, And An Extremely Successful One – Analysis

For those who remember their high school textbooks, the issue of America’s imperial history is very clear-cut. American imperialism, the textbooks told us, was a fairly brief period in American history that lasted from 1898 to 1946. This was the period during which the United States acquired a number of overseas territories, including the Philippines and Puerto Rico, among others. The era of imperialism, we are told, ended in 1946 when the Philippines was finally granted full independence from Washington. Even more dubious—the textbook writers tell us—is the idea that the US is or has been a colonial empire. After all, American citizens did not set up colonies in the Philippines or in Puerto Rico in the way that British migrants populated Virginia or New England.