Opinion: American empire

By Zachary Schuster

Empire. To most U.S. citizens, empires are failed institutions of the past belonging to the British and Romans. As hard as it is to believe, the United States has become an empire. An American empire is not inherently bad, but accepting that the United States has become one is a critical step in preparing the nation for the future.

Thinking of the United States as an empire is difficult. U.S. citizens are taught about the nation’s exceptionalism and benevolence, and students learn that U.S. foreign policy spreads the ideals of freedom and democracy around the world. In reality, the United States merely exerts its influence and power.

The American empire did not begin with the war in Iraq. The United States has been moving from its republican roots toward an empire since the 1898 Spanish-American War gave the nation its first overseas territories.

Most U.S. citizens can identify with the American empire based on the exporting of U.S. brand names. Corporations like Nike and McDonald’s are mainstays on every continent. This form of corporate imperialism is one aspect of the American empire, but not the most important to the security of the United States.

Of greater concern is the expansion of the military and its supporting industries after World War II. Harry S. Truman’s presidency established foreign intervention as the cornerstone of U.S. foreign policy. With the United States committed to foreign intervention, the military-industrial complex has grown to the point where it needs military conflict to justify its existence.

U.S. foreign intervention was initially driven by the belief that it provided national security during the Cold War. It now is driven by a desire to remake the world in the United States’ image. Both ideologies have led to nothing short of imperialism; the United States now maintains more than 700 military bases in more than 100 countries.

Whispers of empire surfaced following the 2003 war with Iraq. Some on the political left point to the Iraq war as the beginning of the American empire, blaming the whole thing on President Bush. But their view is shortsighted.

The war in Iraq is merely the latest expansion of the American empire. It resulted in the presence of more than 100,000 U.S. soldiers in Iraq and an expansion of U.S. influence in the Middle East. The lesson from the war should be clear: As the empire expands, the United States becomes weaker. Money is spent on expansion instead of protecting the homeland.

America’s vulnerability at home was tragically displayed on Sept. 11, 2001. Conventional political wisdom said that the terrorists attacked because they hated U.S. freedom and democracy. Although popular, this line of thought is wrong.

History has shown that people controlled by an imperialistic power sometimes get angry and choose to rise up to expel their foreign occupier. Insurgents obviously cannot match foreign empires on a traditional battlefield, so they resort to unconventional tactics.

These tactics are known today as terrorism. Those who understand the nature of the American empire understand that terrorists hate the United States because of its intervention in the Middle East to obtain oil and protect other interests.

Retribution is part of maintaining an empire. Today, that retribution has been taken to a whole new level.

To avoid future catastrophic attacks, the only solution might be to shift U.S. foreign policy toward that dreaded political four-letter word: isolationism.

With the United States wrapped in a web of corporate and military imperialism, isolationism might be tough to fathom. However, the United States would benefit most from putting itself first. Getting involved in other peoples’ lands creates hostility, and spending billions of dollars on foreign aid takes away money that could be spent to make the United States stronger.

A complete shift to isolationism would be a mistake, but isolationist measures should be taken. No one should be ashamed about putting the needs of the United States over the needs of other countries. Reducing military commitments abroad would save money and allow the nation to build a stronger military at home, one intended for defense. It would also signal to the world that the United States is attempting to curb its imperialistic tendencies.

It makes no sense to continue on the road to empire when both history and contemporary times show it is a mistake.