Before considering whether America was an empire in the post-World War II period, we must first try to define the notion of empire; this will help us analyze America's situation at that time. As Patrick Wolfe observed in the American Historical Review, "Imperialism and thus the notion of empire resembles Darwinism, in that many use the term but few can say what it really means."
The Cambridge Online Dictionary gives several meanings for empire and imperialism. An empire is "a group of countries ruled by a single person, government or country". Imperialism is then defined as "a system in which a country rules other countries, sometimes having used force to obtain power over them", or "when one country has a lot of power or influence over others, especially in political and economic matters". The Greek, Roman, and British Empires each embody these definitions in their own way; the Greek and Roman empires, in particular, controlled vast territories across the known world.