I HAVE BEEN ASKED TO talk today about education and economic development. The standard thing to say on this topic is that the former is vital to the latter. We live in the modern world, so we all have to be highly informed and highly skilled and understand the power of modern science. It is a task of the very first importance to train a workforce that will be able to compete in the global marketplace. That is the standard thing to say, and we hear it said often by education bureaucrats from the federal level on down. And of course it is perfectly true, as far as it goes. But there is more to be said.
The practical point of this standard thing to say is that America needs more technical education—more scientists and mathematicians. And of course we do need scientists and mathematicians. But I like to remind people when they say this that the word “technical” comes from the Greek word “techne,” which means “art.” And Aristotle points out that art is about making, and that the question of what one should make is always superior, in point of order and logic, to the question of how to make it.
What does this mean? Consider one of the greatest scientific achievements of the last century—the development of the atomic bomb. The question of whether to build an atomic bomb, and then the question of whether to drop it on Hiroshima and Nagasaki in order to end World War II without the need of invading and conquering the Japanese mainland, were more important questions—superior in order and logic—to the question of how to make the bomb. The brilliant physicists who accomplished the latter had immense technical training, but that training gave them no special knowledge about those more important questions. Or to put the point in a slightly different and more general way, a technical education can make a person wealthy and famous, but it does not teach that person what is best to do with wealth and fame.
So the first point I would make about education and economics is the importance of liberal arts education, which is the kind of education offered at Hillsdale College. Many think of liberal arts education as a broad education, but in fact it is a high education. We understand things to be arranged in a hierarchy. Hillsdale College has plenty of science and math majors, and our students go on to the very best graduate and professional schools. But whatever their majors, they learn the distinction I just made about questions of greater and lesser significance, and they study how to think about the very greatest ones.
The second point I want to make has to do with politics and education. The greatest example of economic development in human history was in the United States during the 19th century. At the beginning of that century, we were about five million people huddled along the East Coast. By the end of it we had grown at a rate of about 25 percent—much faster than China is growing today—and had settled an entire continent, largely without the help of modern science. To the question of how it was done, I think the short answer is the Homestead Act—the greatest piece of legislation I know. Signed by President Lincoln in 1862, the Homestead Act is short and beautiful—two qualities good legislation should have, and two qualities in which legislation today is utterly lacking.
What the Homestead Act did was to take the western land of the United States—surely one of the greatest assets ever held by any government in history—and give 160-acre plots to anyone with the backbone to live on them and work them. These plots of land were granted regardless of who someone was and with the certainty that no one settling on them could ever vote for this congressman or that. It is one of the greatest impartial acts of legislation in all of human history. It, and things like it, built America and the character of the people who spread across it.
How does this connect to my first point? It connects because the spirit of the Homestead Act, which led to unprecedented economic growth, could not be more different from the spirit of our legislation today. And the key to this difference is the difference between the education our leaders today have had, and the education students get at Hillsdale.
The principle that justified the Homestead Act has two parts, and both are found in the first 15 lines of the Declaration of Independence. The first is the idea of human equality—the idea that it does not matter what race or what family you come from, it only matters what you do—an idea that has been the source of our greatest struggles as we have tried to live up to it. The second is the idea of the “Laws of Nature and of Nature’s God.” At Hillsdale College, we study the Declaration of Independence as the greatest thing of its kind. The signers of the Declaration were risking their lives. There is a beautiful passage at the end of it where they write, “we mutually pledge to each other our Lives, our Fortunes and our sacred Honor.” But the document begins in an opposite mood, because the cause they are willing to die for is not specifically about them at all: “When in the course of human events”—that means not our time, but any time—“it becomes necessary for one people”—that means not our people, but any people—and then the sentence goes on to speak of the “Laws of Nature and of Nature’s God,” laws true always and everywhere.
Understood comprehensively, the Declaration points us to an unalterable law of God, visible in nature, that man is inferior to God and superior to the beasts, such that it is unjust for one human being to rule any other without his consent. And it is this same understanding of human nature on which Madison rests his case in Federalist 51, in explaining why government is both necessary and must be limited:
. . . [W]hat is government itself but the greatest of all reflections on human nature? If men were angels, no government would be necessary. If angels were to govern men, neither external nor internal controls on government would be necessary. In framing a government which is to be administered by men over men, the great difficulty lies in this: you must first enable the government to control the governed; and in the next place oblige it to control itself.
This is the understanding that animates legislation like the Homestead Act. And note the humility in it. America’s founders understood themselves to be bound and limited by something higher. And it is precisely this understanding that is missing among our political leadership today. Nearly 20 years ago now, when Clarence Thomas was testifying before the Senate Judiciary Committee during his confirmation hearings, several senators questioned him about the idea of natural law, which seemed to them a foreign and dangerous idea. And why would it seem that way?
These senators have been taught to understand government as a means by which they can do marvelous things, changing society for the better in countless and unlimited ways. And in this light, the old-fashioned idea of natural law—which, as we saw in the passage from Madison, leads to the idea of limited government—becomes simply an impediment to progress.
President Obama is an impressive man, and there is much good to be said about him. But he falls firmly into this newer school of thought. Let me read you a passage from his book, The Audacity of Hope:
Implicit in [the Constitution’s] structure, in the very idea of ordered liberty, was a rejection of absolute truth, the infallibility of any idea or ideology or theology or “ism,” any tyrannical consistency that might lock future generations into a single, unalterable course. . . .
One can see immediately the practical results of this in the health care debate. Advocates of one of the latest plans are proud to place the cost at only $900 billion—apparently it takes $1 trillion to impress in this day and age! But consider that, in most of the plans that have advanced in the Congress, people making in the range of $30,000 to $80,000 a year will be forced to pay health insurance costs—or fines of about the same amount—that come to between 10 and 20 percent of their income. They will be compelled to buy plans that have certain specific features. There will be an allocation of health care resources as part of the plan. And it will not be legal to buy or sell a plan that does not fit the criteria. Compare the spirit of this legislation with the spirit of the Homestead Act. There is a bullying spirit behind it. And that bullying spirit is becoming ever more pervasive.
The means are already in place for the federal government to control what people say in elections. As a recent example of how it tries this between elections, consider that Henry Waxman—a congressman of some power and influence—sent a letter in August to the CEOs of health care companies asking for schedules of all salaries above a certain amount, and of the conferences they had been to, and how much they cost, and who was there. Was it a coincidence that he wanted this information just as a health care debate was starting up? Could it be that he was trying to intimidate and silence potential opposition? One of the many “czars”—isn’t that an ominous word?—in the Obama administration is Cass Sunstein, the czar of regulatory policy. Mr. Sunstein is a very smart man—a law professor, like the president—but he is on record saying that speech rights should be redistributed by government bureaucrats much as wealth is redistributed through post-New Deal tax and entitlement policy. This is not supposed to be a country where there are czars dealing with things like speech. But it is such a country right now.
The economic policies being proposed these days are very bad. But the principles behind them are worse. They represent a return to the idea that the American Revolution repudiated—the idea that some are equipped by nature or training to manage the lives of others without their consent. I have been making the point lately that people are wrong who accuse the Obama administration of being socialist. I take the president at his word when he says that he has no desire to own the automobile companies. Instead, he wants to control them—and the rest of us as well—through a regulatory apparatus overseen by czars and bureaucrats. And again, his intentions are good. What is bad is the view underlying them of what human beings are. Rather than looking on us as equal beings with a set nature—such that none of us should rule another in the way that God rules man or man rules beast—our political leaders today have been taught to see us as material to be shaped and perfected by experts who have the proper technical training.
For close to 100 years now, the majority of people teaching in American colleges and universities have agreed with Woodrow Wilson, one of the founders of the Progressive movement and the first to write explicitly that the Declaration of Independence is obsolete, and that we need to liberate the Constitution from the Declaration’s restraints. This liberation leads to the idea of a “living Constitution,” characterized by constant change or progress. Absolute truth, to the extent that ordinary people still believe in it, obstructs change or progress—which is why President Obama refers to it, in the passage I read, as tyrannical. But if change or progress is the rule, who is to determine what version of change or progress is good? The logical problem here—as any Hillsdale student could tell you—is that once you deny the existence of absolute truth, the definition of “good” becomes subjective and the only standard of behavior is what we want—“we,” in the political sense, meaning the government or bureaucracy. This reduces politics not to right, but to force. That is why there is this bullying spirit about our government today, and why so many Americans are worried.
It is time for that to stop, and there are two conditions for stopping it. The first is for the ordinary folk of the United States to see in this the despotism that it is, and to rise up and repudiate it. The second thing is longer term, but equally vital: It is to replace leaders who have bad educations with leaders who have good educations. This is our work at Hillsdale College. We aim to recover the meaning of the “Laws of Nature and of Nature’s God” and to place that meaning firmly in the minds and hearts of ambitious young men and women who have the courage to do something with that knowledge. And I swear that we shall not stop pursuing that task.
The following is adapted from a lecture delivered at Hillsdale College on October 1, 2009, during the author’s four-week teaching residency.
I want to talk about the Western way of war and about the particular challenges that face the West today. But the first point I want to make is that war is a human enterprise that will always be with us. Unless we submit to genetic engineering, or unless video games have somehow reprogrammed our brains, or unless we are fundamentally changed by eating different nutrients—these are possibilities brought up by so-called peace and conflict resolution theorists—human nature will not change. And if human nature will not change—and I submit to you that human nature is a constant—then war will always be with us. Its methods or delivery systems—which can be traced through time from clubs to catapults and from flintlocks to nuclear weapons—will of course change. In this sense war is like water. You can pump water at 60 gallons per minute with a small gasoline engine or at 5000 gallons per minute with a gigantic turbine pump. But water is water—the same today as in 1880 or 500 B.C. Likewise war, because the essence of war is human nature.
Second, in talking about the Western way of war, what do we mean by the West? Roughly speaking, we refer to the culture that originated in Greece, spread to Rome, permeated Northern Europe, was incorporated by the Anglo-Saxon tradition, spread through British expansionism, and is associated today primarily with Europe, the United States, and the former commonwealth countries of Britain—as well as, to some extent, nations like Taiwan, Japan, and South Korea, which have incorporated some Western ideas. And what are Western ideas? This question is disputed, but I think we know them when we see them. They include a commitment to constitutional or limited government, freedom of the individual, religious freedom in a sense that precludes religious tyranny, respect for property rights, faith in free markets, and an openness to rationalism or to the explanation of natural phenomena through reason. These ideas were combined in various ways through Western history, and eventually brought us to where we are today. The resultant system creates more prosperity and affluence than any other. And of course, I don’t mean to suggest that there was Jeffersonian democracy in 13th century England or in the Swiss cantons. But the blueprint for free government always existed in the West, in a way that it didn’t elsewhere.
Just as this system afforded more prosperity in times of peace, it led to a superior fighting and defense capability in times of war. This is what I call the Western way of war, and there are several factors at play.
First, constitutional government was conducive to civilian input when it came to war. We see this in ancient Athens, where civilians oversaw a board of generals, and we see it in civilian control of the military in the United States. And at crucial times in Western history, civilian overseers have enriched military planning.
Second, Western culture gave birth to a new definition of courage. In Hellenic culture, the prowess of a hero was not recognized by the number of heads on his belt. As Aristotle noted in the Politics, Greek warriors didn’t wear trophies of individual killings. Likewise, Victoria Crosses and Medals of Honor are awarded today for deeds such as staying in rank, protecting the integrity of the line, advancing and retreating on orders, or rescuing a comrade. This reflects a quite different understanding of heroism.
A third factor underlies our association of Western war with advanced technology. When reason and capitalism are applied to the battlefield, powerful innovations come about. Flintlocks, percussion caps, rifled barrels, and Minié balls, to cite just a few examples, were all Western inventions. Related to this, Western armies—going back to Alexander the Great’s army at the Indus—have had superior logistics. A recent example is that the Americans invading Iraq were better supplied with water than the native Iraqis. This results from the application of capitalism to military affairs—uniting private self-interest and patriotism to provide armies with food, supplies, and munitions in a way that is much more efficient than the state-run command-and-control alternatives.
Yet another factor is that Western armies are impatient. They tend to want to seek out and destroy the enemy quickly and then go home. Of course, this can be both an advantage and a disadvantage, as we see today in Afghanistan, where the enemy is not so eager for decisive battle. And connected to this tradition is dissent. Today the U.S. military is a completely volunteer force, and its members’ behavior on the battlefield largely reflects how they conduct themselves in civil society. One can trace this characteristic of Western armies back to Xenophon’s Ten Thousand, who marched from northern Iraq to the Black Sea and behaved essentially as a traveling city-state, voting and arguing in a constitutional manner. And their ability to do that is what saved them, not just their traditional discipline.
Now, I would not want to suggest that the West has always been victorious in war. It hasn’t. But consider the fact that Europe had a very small population and territory, and yet by 1870 the British Empire controlled 75 percent of the world. What the Western way of war achieved, on any given day, was to give its practitioners—whether Cortez in the Americas, the British in Zululand, or the Greeks in Thrace—a greater advantage over their enemies. There are occasional defeats such as the battles of Cannae, Isandlwana, and Little Bighorn. Over a long period of time, however, the Western way of war prevails—and it has led us to where we are today.
But where exactly are we today? There have been two developments over the last 20 years that have placed the West in a new cycle. They have not marked the end of the Western way of war, but they have brought about a significant change. The first is the rapid electronic dissemination of knowledge—such that someone in the Hindu Kush tonight can download a sophisticated article on how to make an IED. And the second is that non-Western nations now have leverage, given how global economies work today, through large quantities of strategic materials that Western societies need, such as natural gas, oil, uranium, and bauxite. Correspondingly, these materials produce tremendous amounts of unearned capital in non-Western countries—and by “unearned,” I mean that the long process of civilization required to create, for example, a petroleum engineer has not occurred in these countries, yet they find themselves in possession of the monetary fruits of this process. So the West’s enemies now have instant access to knowledge and tremendous capital.
In addition to these new developments, there are five traditional checks on the Western way of war that are intensified today. One of these checks is the Western tendency to limit the ferocity of war through rules and regulations. The Greeks tried to outlaw arrows and catapults. The Romans had restrictions on the export of breastplates. In World War II, we had regulations against poison gas. Continuing this tradition today, we are trying to achieve nuclear non-proliferation. Unfortunately, the idea that Western countries can adjudicate how the rest of the world makes war isn’t applicable anymore. As we see clearly in Iran, we are dealing with countries that have the wealth of Western nations (for the reasons just mentioned), but are anything but constitutional democracies. In fact, these nations find the idea of limiting their war-making capabilities laughable. Even more importantly, they know that many in the West sympathize with them—that many Westerners feel guilty about their wealth, prosperity, and leisure, and take psychological comfort in letting tyrants like Ahmadinejad provoke them.
The second check on the Western way of war is the fact that there is no monolithic West. For one thing, Western countries have frequently fought one another. Most of the people killed in war have been Europeans killed by other Europeans, due to religious differences and political rivalries. And consider, in this light, how fractured the West is today. The U.S. and its allies can’t even agree on sanctions against Iran. Everyone knows that once Iran obtains nuclear weapons—beyond threatening Israel and supporting terrorists—it will begin to aim its rockets at Frankfurt, Munich, and Paris, to demand further trade concessions, and to seek regional hegemony. And in this case, unlike when we deterred Soviet leaders during the Cold War, Westerners will be dealing with theocratic zealots who claim that they do not care about living, making them all the more dangerous. Yet despite all this, to repeat, the Western democracies can’t agree on sanctions or even on a prohibition against selling technology and arms.
The third check is what I call “parasitism.” It is very difficult to invent and fabricate weapons, but it is very easy to use them. Looking back in history, we have examples of Aztecs killing conquistadors with steel breastplates and crossbows, and of Native Americans using rifles against the U.S. Cavalry. Similarly today, nobody in Hezbollah can manufacture an AK-47—a Russian-built weapon made possible by Western design principles—but its members can make deadly use of one. Nor is there anything in the tradition of Shiite Islam that would allow a Shiite nation to create centrifuges, which require Western physics. Yet centrifuges are hard at work in Iran. And this parasitism has real consequences. When the Israelis went into Lebanon in 2006, they were surprised that young Hezbollah fighters had laptop computers with sophisticated intelligence programs; that Hezbollah intelligence agents were sending doctored photos to Reuters and the AP, making it seem as if Israel was targeting civilians; and that Hezbollah had obtained sophisticated anti-tank weapons on the international market using Iranian funds. At that point it didn’t matter that the Israelis had a sophisticated Western culture; they could not win the war.
A fourth check is the ever-present anti-war movement in the West, stemming from the fact that Westerners are free to dissent. And by “ever-present” I mean that long before Michael Moore appeared on the scene, we had Euripides’ Trojan Women and Aristophanes’ Lysistrata. Of course, today’s anti-war movement is much more virulent than in Euripides’ and Aristophanes’ time. This is in part because people like Michael Moore do not feel they are in any real danger from their countries’ enemies. They know that if push comes to shove, the 101st Airborne will ultimately ensure their safety. That is why Moore could say, right after 9/11, that Osama bin Laden should have attacked a red state rather than a blue state. And since Western wars tend to be fought far from home, rather than as a defense against invasions, there is always the possibility that anti-war sentiment will win out and that armies will be called home. Our enemies know this, and often their words and actions are aimed at encouraging and aiding Western anti-war forces.
Finally and most seriously, I think, there is what I call, for want of a better term, “asymmetry.” Western culture creates citizens who are affluent, leisured, free, and protected. Human nature being what it is, we citizens of the West often want to enjoy our bounty and retreat into private lives—to go home, eat pizza, and watch television. This is nothing new. I would refer you to the banquet scene in Petronius’s Satyricon, written around 60 A.D., in which affluent Romans make fun of the soldiers up on the Rhine protecting them. This is what Rome had become. And it’s not easy to convince someone who has the good life to fight against someone who doesn’t.
To put this in contemporary terms, what we are asking today is for a young man with a $250,000 education from West Point to climb into an Apache helicopter—after emailing back and forth with his wife and kids about what went on at a PTA meeting back in Bethesda, Maryland—and fly over Anbar province or up to the Hindu Kush and risk being shot down by a young man from a family of 15, none of whom will ever live nearly as well as the poorest citizens of the United States, using a weapon whose design he doesn’t even understand. In a moral sense, the lives of these two young men are of equal value. But in reality, our society values the lives of our young men much more than Afghan societies value the lives of theirs. And it is very difficult to sustain a protracted war with asymmetrical losses under those conditions.
My point here is that all of the usual checks on the tradition of Western warfare are magnified in our time. And I will end with this disturbing thought: We who created the Western way of war are very reluctant to resort to it due to post-modern cynicism, while those who didn’t create it are very eager to apply it due to pre-modern zealotry. And that’s a very lethal combination.
The following is adapted from a speech delivered in Washington, D.C., on September 11, 2009, in the “First Principles on First Fridays” lecture series sponsored by Hillsdale College’s Allan P. Kirby, Jr. Center for Constitutional Studies and Citizenship.
President Obama’s Foreign Policy: An Assessment
I THINK it is important, on the eighth anniversary of the 9/11 attacks, to take a look at our foreign policy and to judge whether or not we’re on a path to becoming safer. In doing so, we should not be intimidated by those who say that criticism of foreign policy—criticism that suggests we’re less safe as a consequence of certain policies—is somehow disloyal or hyper-partisan. It is the essence of political debate over foreign policy to judge whether the interests of the United States are being protected and advanced. If we believe they are not, it is our responsibility to speak out.
For the last eight months, we’ve had a different kind of president than we’ve had in the past. Barack Obama is the first post-American president. And by this I don’t mean he’s anti-American. What I mean by post-American is suggested by a response the president gave to a reporter’s question during a recent trip to Europe. The reporter asked about his unwillingness to discuss American exceptionalism—the notion that the United States has a unique mission, that it’s “a shining city on a hill” as Ronald Reagan liked to say (echoing our pilgrim fathers). Mr. Obama responded that he believes in American exceptionalism in the same way that the British believe in British exceptionalism and the Greeks believe in Greek exceptionalism. Given that there are 192 member countries in the United Nations, I’m sure he could have gone on naming another 189 that believe in their own exceptionalism. But in any case, the idea that all countries believe themselves to be exceptional in the same way leads to the unmistakable conclusion that none are truly exceptional. In other words, the president’s response reflects his belief that America is not so different from other countries.
Mr. Obama’s supporters in the mainstream media share this view. Newsweek editor Evan Thomas, for example, delivered this revealing comment when previewing the president’s speech on the anniversary of D-Day last June:
Reagan was all about America . . . . Obama is ‘we are above that now.’ We’re not just parochial, we’re not just chauvinistic, we’re not just provincial. We stand for something—I mean in a way Obama’s standing above the country, above—above the world. He’s sort of God.
This image of President Obama standing above his country and above the world sums up the post-American way of thinking. The practical point it makes is that America’s interest is no different from, or better than, any other country’s interest. But is that true? Is America’s interest not superior to Sudan’s or Cuba’s or Zimbabwe’s?
In line with this way of thinking, the Obama administration is pursuing a policy that can accurately be described as neoisolationist—a policy characterized by an unwillingness to be assertive in the world in defense of America’s interests and those of our friends and allies. This policy traces back in the Democratic Party to George McGovern’s acceptance speech at the 1972 Democratic National Convention. McGovern’s refrain was “Come Home, America”—come home from Vietnam and come home from a lot of other places as well. This is the attitude that has come to dominate liberal foreign policy circles.
Consider our current policy regarding Iraq. The Obama administration is determined to withdraw American forces along the lines of a plan formulated at the end of the Bush administration, but without regard to the actual situation in Iraq. American forces have pulled back from their prominent roles in the major urban areas, and violence has increased. But the administration remains fixed on the withdrawal schedule, because it is withdrawal—rather than the political stability of Iraq—that matters to it most. And this strict adherence to the exit timetable without regard to the political and military consequences could prove to be very harmful to our interests—not only in Iraq, but in the broader region as well.
In Afghanistan, there is legitimate room for discussion about what our strategic objectives should be. I doubt we will transform it into a stable democratic society. It is not going to become Switzerland—or even Honduras. On the other hand, we have a serious strategic interest in making sure that the Taliban and al-Qaeda don’t use Afghanistan as a base to launch future terrorist attacks. But today, what was for years portrayed as the good war by liberals—as opposed to the “bad” Iraq War—has become just another war from which they want to get out. This is creating a difficult political problem for President Obama. And the path he chooses to take in Afghanistan is going to be significant, not least because of the consequences it will have in Pakistan.
Our interests in Pakistan are even more acute than in Afghanistan, and the potential risks to the United States and to our allies even graver. The reason is that if radical Islamists are able to create enough chaos inside Pakistan to enable them to take control of the government, they will immediately come into possession of a substantial arsenal of nuclear weapons. This would lead to a greater risk of conflict on the Indian subcontinent and also increase the chance that these weapons will fall into the hands of terrorist groups. So our national interest is not simply preventing al-Qaeda and the Taliban from returning to their safe havens in Afghanistan. The cross-border nature of Taliban and al-Qaeda activities requires us to work even harder to ensure that Pakistan’s nuclear capabilities don’t fall into the wrong hands.
More broadly, the Obama administration believes that its predecessor didn’t negotiate enough on issues like the proliferation of weapons of mass destruction. The president has said repeatedly—starting with his Inaugural Address—that the United States must hold out its hand to countries like North Korea and Iran in the hopes that they will unclench their fist and enter into negotiation. This reflects a curious view of history, since in fact the Bush administration negotiated directly or indirectly with Iran and North Korea for six-and-a-half years. But more importantly, it reflects a fundamental misunderstanding of the nature of negotiation. Negotiation is not a policy. It is a technique. It is a way of achieving our objectives. It doesn’t tell us what the objectives are. The emphasis on negotiation as an end in itself reflects a shallowness in this administration’s approach to international affairs, and gives us little confidence that our interests will be well served.
The Obama administration has extended its hand to North Korea, only to see that country conduct another nuclear test, launch more ballistic missiles, and kidnap and incarcerate two American reporters. Kim Jong Il apparently didn’t get the message about the “reset button” when President Obama replaced President Bush. And in fact, Kim Jong Il will never be talked out of his nuclear weapons program, which he sees as a trump card against the United States, Japan and South Korea. It’s the ultimate protection for his regime, and it’s a source of revenue and leverage elsewhere in the world, particularly in the Middle East. On the other hand, the North Koreans have been very successful over the years in using negotiations to leverage economic and political concessions. They’ve even been happy to pledge to give up nuclear weapons—five times, by my count, over the past 18 years. But of course they never carry through.
Sometime during the next year, North Korea will probably agree to negotiate. And why not? It’s to their advantage. It buys them time, it increases the possibility of further economic and political concessions, and it will fundamentally satisfy a U.S. administration whose supreme objective is negotiations. It won’t reduce the nuclear threat that North Korea poses to the world, but it will take it out of the media spotlight. And for this administration, that would appear to be as good as solving the problem itself.
In Iran we see another example of the outstretched hand being slapped away. Indeed, there is now at least anecdotal evidence that the regime in Tehran saw the Obama administration as so eager for negotiations that it would overlook any harsh steps Iran might take internally. So in response to the administration’s friendly overtures, the mullahs in Tehran conducted a grossly fraudulent presidential election on June 12 and have spent the subsequent months repressing their opponents. Close observers believe that there is no longer a power struggle in the Iranian government between hard-liners and moderates—if any moderates are left—but rather that power is flowing away from the ayatollahs and toward the Islamic Revolutionary Guard Corps. In other words, Iran is being transformed from a theological autocracy into a theological military dictatorship. And given that the Islamic Revolutionary Guard Corps controls both Iran’s nuclear weapons program and its funding of international terrorism, this means that Iran will only become more dangerous as time goes on.
As the failure of negotiations with Iran becomes more obvious by the day, the Obama administration’s next strategy seems to be a reliance on sanctions. In theory, sanctions will take advantage of the vulnerability stemming from Iran’s inability to refine petroleum. But this is the strategy that the Europeans and the Bush administration pursued unsuccessfully for the last seven years. The U.N. Security Council has passed three sanction resolutions, which have had almost no impact whatsoever on Iran’s ongoing nuclear weapons program. Another U.N. resolution is not likely, especially given Russia’s firm opposition. And if Europe and the United States don’t help Iran with oil, Venezuela’s President Chavez has pledged his country will do so.
There are really only two scenarios by which Iran can be stopped from possessing nuclear weapons. The first is regime change, which seems less and less likely now that the outrage following the fraudulent presidential election has dissipated. The second is preemptive military force. This is an extraordinarily unattractive option, but the alternative is much less attractive. The Obama administration almost certainly will do nothing militarily, which puts the entire onus on Israel. In the past, Israel has not hesitated to act when faced with an existential threat. It destroyed Saddam Hussein’s Osirak reactor outside Baghdad in 1981, and in September 2007 it destroyed a North Korean reactor in Syria. So the spotlight in the near future is very much going to be on Israel.
Toward Israel, the Obama administration’s policy to this point has been an essentially European policy. Its underlying assumption is that solving the Israeli-Palestinian problem will lead to a greater peace in the Middle East. But the real root of the problem in the Middle East is Iran’s continuing support for terrorist groups like Hamas and Hezbollah. Nonetheless, the administration has thus far spent more time and energy pressuring Israel to stop building settlements than pressuring Iran to stop funding terrorism.
Here at home, the Obama administration has gravely impaired our capability to gather human intelligence by declassifying hundreds of pages of documents that explain our interrogation techniques—information that is now probably in al-Qaeda training manuals. And at a time of the grossest profligacy in domestic spending in American history, the administration has imposed a ceiling on defense spending. At the same time it advocates an $800 billion stimulus plan that seems to include every idea ever hatched in Washington, it is making radical cuts in missile defense and cancelling the F-22 fighter aircraft. It supports a deposed president in Honduras—deposed, in accordance with the Honduran Constitution, for attempting to subvert the Constitution as his thuggish ally President Chavez did in Venezuela—against its legitimate government, which promises a free and transparent election. The list goes on. And even where the administration has pursued sensible policies, it has only done so grudgingly, and with the clear understanding that, absent political constraints, it would have done things differently.
I understand that Americans are concerned about the economy. And I understand that every new president is going to have domestic priorities. But our adversaries around the world are not standing idly by while we debate these domestic issues. Our current focus on health care is very important, but people like Kim Jong Il don’t care about it. We need a president who is going to provide us with leadership in international affairs—not one who believes that America should simply come home. And we need a president who believes that the best place to defend our interests is overseas rather than in the streets of America. ■
The following is adapted from a lecture delivered on August 2, 2009, during a Hillsdale College cruise from Venice to Athens aboard the Crystal Serenity.
Future Prospects for Economic Liberty
One of the justifications for the massive growth of government in the 20th and now the 21st centuries, far beyond the narrow limits envisioned by the founders of our nation, is the need to promote what the government defines as fair and just. But this raises a prior and more fundamental question: What is the legitimate role of government in a free society? To understand how America’s Founders answered this question, we have only to look at the rule book they gave us—the Constitution. Most of what they understood as legitimate powers of the federal government are enumerated in Article 1, Section 8. Congress is authorized there to do 21 things, and as much as three-quarters of what Congress taxes us and spends our money for today is nowhere to be found on that list. To cite just a few examples, there is no constitutional authority for Congress to subsidize farms, bail out banks, or manage car companies. In this sense, I think we can safely say that America has departed from the constitutional principle of limited government that made us great and prosperous.
On the other side of the coin from limited government is individual liberty. The Founders understood private property as the bulwark of freedom for all Americans, rich and poor alike. But following a series of successful attacks on private property and free enterprise—beginning in the early 20th century and picking up steam during the New Deal, the Great Society, and then again recently—the government designed by our Founders and outlined in the Constitution has all but disappeared. Thomas Jefferson anticipated this when he said, “The natural progress of things is for liberty to yield and government to gain ground.”
To see the extent to which liberty is yielding and government is gaining ground, one need simply look at what has happened to taxes and spending. A tax, of course, represents a government claim on private property. Every tax confiscates private property that could otherwise be freely spent or freely invested. At the same time, every additional dollar of government spending demands another tax dollar, whether now or in the future. With this in mind, consider that the average American now works from January 1 until May 5 to pay the federal, state, and local taxes required for current government spending levels. Thus the fruits of more than one third of our labor are used in ways decided upon by others. The Founders favored the free market because it maximizes the freedom of all citizens and teaches respect for the rights of others. Expansive government, by contrast, contracts individual freedom and teaches disrespect for the rights of others. Thus clearly we are on what Friedrich Hayek called the road to serfdom, or what I prefer to call the road to tyranny.
As I said, the Constitution restricts the federal government to certain functions. What are they? The most fundamental one is the protection of citizens’ lives. Therefore, the first legitimate function of the government is to provide for national defense against foreign enemies and for protection against criminals here at home. These and other legitimate public goods (as we economists call them) obviously require that each citizen pay his share in taxes. But along with people’s lives, it is a vital function of the government to protect people’s liberty as well—including economic liberty or property rights. So while I am not saying that we should pay no taxes, I am saying that they should be much lower—as they would be, if the government abided by the Constitution and allowed the free market system to flourish.
And it is important to remember what makes the free market work. Is it a desire we all have to do good for others? Do people in New York enjoy fresh steak for dinner at their favorite restaurant because cattle ranchers in Texas love to make New Yorkers happy? Of course not. It is in the interest of Texas ranchers to provide the steak. They benefit themselves and their families by doing so. This is the kind of enlightened self-interest discussed by Adam Smith in his Wealth of Nations, in which he argues that the social good is best served by pursuing private interests. The same principle explains why I take better care of my property than the government would. It explains as well why a large transfer or estate tax weakens the incentive a property owner has to care for his property and pass it along to his children in the best possible condition. It explains, in general, why free enterprise leads to prosperity.
Ironically, the free market system is threatened today not because of its failure, but because of its success. Capitalism has done so well in eliminating the traditional problems of mankind—disease, pestilence, gross hunger, and poverty—that other human problems seem to us unacceptable. So in the name of equalizing income, achieving sex and race balance, guaranteeing housing and medical care, protecting consumers, and conserving energy—just to name a few prominent causes of liberal government these days—individual liberty has become of secondary or tertiary concern.
Imagine what would happen if I wrote a letter to Congress and informed its members that, because I am fully capable of taking care of my own retirement needs, I respectfully request that they stop taking money out of my paycheck for Social Security. Such a letter would be greeted with contempt. But is there any difference between being forced to save for retirement and being forced to save for housing or for my child’s education or for any other perceived good? None whatsoever. Yet for government to force us to do such things is to treat us as children rather than as rational citizens in possession of equal and inalienable natural rights.
We do not yet live under a tyranny, of course. Nor is one imminent. But a series of steps, whether small or large, tending toward a certain destination will eventually take us there. The philosopher David Hume observed that liberty is seldom lost all at once, but rather bit by bit. Or as my late colleague Leonard Read used to put it, taking liberty from Americans is like cooking a frog: It can’t be done quickly because the frog will feel the heat and escape. But put a frog in cold water and heat it slowly, and by the time the frog grasps the danger, it’s too late.
Again, the primary justification for increasing the size and scale of government at the expense of liberty is that government can achieve what it perceives as good. But government has no resources of its own with which to do so. Congressmen and senators don’t reach into their own pockets to pay for a government program. They reach into yours and mine. Absent Santa Claus or the tooth fairy, the only way government can give one American a dollar in the name of this or that good thing is by taking it from some other American by force. If a private person did the same thing, no matter how admirable the motive, he would be arrested and tried as a thief. That is why I like to call what Congress does, more often than not, “legal theft.” The question we have to ask ourselves is whether there is a moral basis for forcibly taking the rightful property of one person and giving it to another to whom it does not belong. I cannot think of one. Charity is noble and good when it involves reaching into your own pocket. But reaching into someone else’s pocket is wrong.
In a free society, we want the great majority, if not all, of our relationships to be voluntary. I like to explain a voluntary exchange as a kind of non-amorous seduction. Both parties to the exchange feel good in an economic sense. Economists call this a positive-sum game. For example, if I offer my local grocer three dollars for a gallon of milk, implicit in the offer is that we will both be winners. The grocer is better off because he values the three dollars more than the milk, and I am better off because I value the milk more than the three dollars. That is a positive-sum game. Involuntary exchange, by contrast, means that one party gains and the other loses. If I use a gun to steal a gallon of milk, I win and the grocer loses. Economists call this a zero-sum game. And we are like that grocer in most of what Congress does these days.
Some will respond that big government is what the majority of voters want, and that in a democracy the majority rules. But America’s Founders didn’t found a democracy, they founded a republic. The authors of The Federalist Papers, arguing for ratification of the Constitution, showed how pure democracy has led historically to tyranny. Instead, they set up a limited government, with checks and balances, to help ensure that the reason of the people, rather than the selfish passions of a majority, would hold sway. Unaware of the distinction between a democracy and a republic, many today believe that a majority consensus establishes morality. Nothing could be further from the truth.
Another common argument is that we need big government to protect the little guy from corporate giants. But a corporation can’t pick a consumer’s pocket. The consumer must voluntarily pay money for the corporation’s product. It is big government, not corporations, that has the power to take our money by force. I should also point out that private businesses can force us to pay them by employing government. To see this happening, just look at the automobile industry or at most corporate farmers today. If General Motors or a corporate farm is having trouble, they can ask me for help, and I may or may not choose to help. But if they ask government to help and an IRS agent shows up at my door demanding money, I have no choice but to hand it over. It is big government that the little guy needs protection against, not big business. And the only protection available is in the Constitution and the ballot box.
Speaking of the ballot box, we can blame politicians to some extent for the trampling of our liberty. But the bulk of the blame lies with us voters, because politicians are often doing what we elect them to do. The sad truth is that we elect them for the specific purpose of taking the property of other Americans and giving it to us. Many manufacturers think that the government owes them a protective tariff to keep out foreign goods, resulting in artificially higher prices for consumers. Many farmers think the government owes them a crop subsidy, which raises the price of food. Organized labor thinks government should protect its members’ jobs from non-union competition. And so on. We could even consider many college professors, who love to secure government grants to study poverty and then meet at hotels in Miami during the winter to talk about poor people. All of these—and hundreds of other similar demands on government that I could cite—represent involuntary exchanges and diminish our freedom.
This reminds me of a lunch I had a number of years ago with my friend Jesse Helms, the late Senator from North Carolina. He knew that I was critical of farm subsidies, and he said he agreed with me 100 percent. But he wondered how a Senator from North Carolina could possibly vote against them. If he did so, his fellow North Carolinians would dump him and elect somebody worse in his place. And I remember wondering at the time if it is reasonable to ask a politician to commit political suicide for the sake of principle. The fact is that it’s unreasonable of us to expect even principled politicians to vote against things like crop subsidies and stand up for the Constitution. This presents us with a challenge. It’s up to us to ensure that it’s in our representatives’ interest to stand up for constitutional government.
Americans have never done the wrong thing for long, but if we are not to go down the tubes as a great nation, we must set about changing things while we still have the liberty to do so.
The Constitution and American Sovereignty
“WOULD WE be far wrong,” President Lincoln asked in a special message to Congress in 1861, “if we defined [sovereignty] as a political community without a political superior?” Maybe that’s not exhaustive, but it comes on good authority. And notice that for Lincoln, sovereignty is a political or legal concept. It’s not about power. Lincoln didn’t say that the sovereign is the one with the most troops. He was making a point about rightful authority.
By contrast, sovereignty wasn’t an issue in the ancient world. Cicero notes that the ancient Romans had the same word for “stranger” as for “enemy.” In the ancient world, people didn’t interact with foreigners enough to think about their relation to them except insofar as it meant war. Nor was sovereignty an issue in medieval Europe, since the defining character of that period was overlapping authority and a lot of confusion about which authority had primary claims. No one had to think about defining national boundaries. This became an issue only in the modern era, when interaction between different peoples increased.
The first important writer to address sovereignty was Jean Bodin, a French jurist of the late 16th century. In his work, Six Books of the Republic, Bodin set out an understanding of sovereignty whereby the King of France represented an independent political authority rather than owing allegiance to the Holy Roman Emperor or to the Pope. In the course of developing this argument, Bodin also advocated religious toleration and insisted that a monarch can neither seize property except by law nor raise taxes except by the consent of a representative body. He was in favor of free trade, and he insisted on the monarch’s general obligation to respect the law of nature and the law of God. His main practical point was that the government must be strong enough to protect the people’s rights, yet restrained enough not to do more than that. Subsequently, I might add, Bodin wrote a book about witchcraft—which he very much opposed. Witches are people who think they can make an end run around the laws of nature and of God using magical spells, and Bodin saw them as a menace.
It was not until the 17th century that the word “sovereignty” became common. This was also when people first came to think of representative assemblies as legislatures. Indeed, the word “legislature” is itself a 17th century term reflecting the modern emphasis on law as an act of governing will rather than impersonal custom. It is therefore related to the modern notion of government by consent. Significantly, it was also in this same era that professional armies came into being. Before the 17th century, for instance, there was no such thing as standard military uniforms. Uniforms indicate that soldiers have a distinct status and serve distinct governments. They reflect a kind of seriousness about defense.
The 17th century is also the period when people began thinking in a systematic way about what we now call international law or the law of nations—a law governing the relation of sovereign nations. The American Declaration of Independence refers to such a law in its first sentence: “When in the Course of human events, it becomes necessary for one people to dissolve the political bands which have connected them with another, and to assume among the powers of the earth, the separate and equal station to which the Laws of Nature and of Nature’s God entitle them . . . .” The Declaration assumes here that nations have rights, just as individuals do.
The Sovereign Constitution
Returning to Lincoln, his understanding was that in an important sense American sovereignty rested in the Constitution. Article 7 of the Constitution declares that it will go into effect when it is ratified by nine states, for those nine states. And once ratified—once the people of those states have entered into the “more perfect Union” described in its Preamble—the Constitution is irrevocable. Unlike a treaty, it represents a commitment that cannot be renegotiated. Thus it describes itself unambiguously as “the supreme Law of the Land”—even making a point of adding, “any Thing in the Constitution or Laws of any State to the Contrary notwithstanding.”
The Constitution provides for treaties, and even specifies that treaties will be “the supreme Law of the Land”; that is, that they will be binding on the states. But from 1787 on, it has been recognized that for a treaty to be valid, it must be consistent with the Constitution—that the Constitution is a higher authority than treaties. And what is it that allows us to judge whether a treaty is consistent with the Constitution? Alexander Hamilton explained this in a pamphlet early on: “A treaty cannot change the frame of the government.” And he gave a very logical reason: It is the Constitution that authorizes us to make treaties. If a treaty violates the Constitution, it would be like an agent betraying his principal or authority. And as I said, there has been a consensus on this in the past that few ever questioned.
Let me give you an example of how the issue has arisen. In 1919, the United States participated in a conference to establish the International Labour Organization (ILO). The original plan was that the members of the ILO would vote on labor standards, following which the member nations would automatically adopt those standards. But the American delegation insisted that it couldn’t go along with that, because it would be contrary to the Constitution. Specifically, it would be delegating the treaty-making power to an international body, and thus surrendering America’s sovereignty as derived from the Constitution. Instead, the Americans insisted they would decide upon these standards unilaterally as they were proposed by the ILO. In the 90 years since joining this organization, I think the U.S. has adopted three of them.
Today there is no longer a consensus regarding this principle of non-delegation, and it has become a contentious issue. For instance, two years ago in the D.C. Court of Appeals, the Natural Resources Defense Council (NRDC), an environmental group, sued the Environmental Protection Agency (EPA), claiming that it should update its standards for a chemical that is thought to be depleting the ozone layer. There is a treaty setting this standard, and the EPA was in conformity with the treaty. But the NRDC pointed out that Congress had instructed the EPA to conform with the Montreal Protocol and its subsequent elaborations. In other words, various international conferences had called for stricter emission standards for this chemical, and Congress had told the EPA to accept these new standards as a matter of course. The response to this by the D.C. Court of Appeals was to say, in effect, that it couldn’t believe Congress had meant to do that, since Congress cannot delegate its constitutional power and responsibility to legislate for the American people to an international body. This decision wasn’t appealed, so we don’t yet have a Supreme Court comment on the issue.
The delegation of judicial power is another open question today. There’s no doubt that the U.S. can agree to arbitrations of disputes with foreign countries, as we did as early as the 1790s with the Jay Treaty. But it’s another thing altogether to say that the rights of American citizens in the U.S. can be determined by foreign courts. This would seem to be a delegation of the judicial power, which Article 3 of the Constitution says “shall be vested in one Supreme Court, and in such inferior Courts as the Congress may from time to time ordain and establish.” This became an issue last year in the case of Medellin v. Texas, which considered an International Court of Justice ruling that Texas could not execute a convicted murderer, because he had not been given the chance to consult the Mexican consulate before his trial, as he had the right to do under an international treaty. The Supreme Court, after much hand-wringing, concluded that it didn’t think the Senate had intended to give the International Court of Justice the power to decide these questions of American law as applied by American courts. I would go further and say that no matter what the Senate intended, this is not a power which can be delegated under the Constitution. But it is no longer clear that a majority on the Supreme Court would agree.
Or consider the Spanish judges who want to arrest American politicians if they venture into Europe, in order to try them for war crimes. This is preposterous. It is akin to piracy. And not only has our government not protested this nonsense, but it has contributed to building up an international atmosphere in which this sort of thing seems plausible—an atmosphere where the old idea of a jury of one’s peers and the idea of Americans having rights under the Constitution give way to the notion of some hazy international standard of conduct that everyone in the world can somehow agree upon and then enforce on strangers.
The Loss of Sovereignty
It is important to think about these issues regarding sovereignty today, because it is possible to lose sovereignty rather quickly. Consider the European Union. The process that led to what we see today in the EU began when six countries in 1957 signed a treaty agreeing that they would cooperate on certain economic matters. They established a court in Luxembourg—the European Court of Justice—which was to interpret disputes about the treaty. To make its interpretations authoritative, the Court decreed in the early 1960s that if the treaty came into conflict with previous acts of national parliaments, the treaty would take precedence. Shortly thereafter it declared that the treaty would also take precedence over subsequent statutes. And in the 1970s it said that even in case of conflicts between the treaty and national constitutions, the treaty would take precedence. Of course, judges can say whatever they want. What is more remarkable is that all the nations in the EU have more or less grudgingly accepted this idea that a treaty is superior to their constitutions, so that today whatever regulations are cranked out by the European Commission—which is, not to put too fine a point on it, a bureaucracy—supersede both parliamentary statutes and national constitutions. And when there was eventually a lot of clamor about protection of basic rights, the court in Luxembourg proclaimed that it would synthesize all the different rights in all the different countries and take care of that as well.
So on the one hand the European Union has constitutional sovereignty, but on the other it doesn’t have a constitution. When its bureaucrats recently attempted to write a constitution and get it adopted, a number of countries voted it down in referendums. Apart from lacking a constitution, the EU doesn’t have an army or a police force or any means of exercising common control of its borders. In effect, it claims political superiority over member states but declines to be responsible for their defense. Indeed, I think inherent in this whole enterprise of transcending nation-states through the use of international institutions is the idea that defense is not so important.
All of this has happened in Europe in a very short period, and is the reason we should be concerned about the loss in our own country of a consensus regarding constitutional sovereignty. Think of the Kyoto Protocol on global warming, which many of our leading politicians now say we should have ratified. Doing so would have delegated the authority over huge areas of important public policy to international authorities. It would have been a clear delegation of the treaty-making power. Nevertheless, the Obama administration is aiming to negotiate a new treaty along those lines.
Of even more urgent concern is the increasing sense that human rights law transcends the laws of particular countries, even those pertaining to national defense. Of course, the idea that there should be standards that all countries respect when engaged in armed conflict is fair enough. But who is going to set the standards? And who is going to enforce them—especially against terrorists who refuse to act like uniformed professional soldiers? What we once called the “law of war” is now commonly referred to as “international humanitarian law.” Many today say that we need to follow this law as it is defined by the International Red Cross. But who makes up this organization in Geneva, Switzerland, and what gives them the authority to supersede national statutes and constitutions? Currently the International Red Cross thinks it is a violation of humanitarian standards for the U.S. to hold prisoners in Guantanamo Bay—not on the basis of any claim that these prisoners are mistreated, but based on the argument that they cannot be held indefinitely and should be put on trial in ordinary criminal courts. Even the Obama administration is not yet willing to conform to this particular standard of so-called international law, believing that holding these prisoners is vital to national defense and that the right to self-defense is morally compelling.
* * *
Where does this trend away from the sovereignty of national constitutions lead? I do not think the danger is a world tyranny. I think that idea is fantastical. Rather what it will lead to, I think, is an undermining of the idea that national governments can protect people, with the result that people will start looking for defense elsewhere. We saw this in an extreme way in Iraq when it collapsed into chaos before the surge, and people looked for protection to various ethnic or sectarian militias. A similar phenomenon can be seen today in Europe with the formation of various separatist movements. We’re even hearing loud claims for Scottish independence. And it’s not surprising, because to the extent that Britain has surrendered its sovereignty, Britain doesn’t count for as much as it used to. So why not have your own Scotland? Why not have your own Wales? Why not have your own Catalonia in Spain? And of course the greatest example of this devolution in Europe is the movement toward Muslim separatism. While this is certainly driven to a large extent by trends in Islam, it also reflects the fact that it doesn’t mean as much to be British or to be French any more. These governments are cheerfully giving away their authority to the EU. So why should immigrants or children of immigrants take them seriously?
At the end of The Federalist Papers, Alexander Hamilton writes: “A nation, without a national government, is, in my view, an awful spectacle.” His point was that if you do not have a national government, you can’t expect to remain a nation. If we are really open to the idea of allowing more and more of our policy to be made for us at international gatherings, the U.S. government not only has less capacity, it has less moral authority. And if it has less moral authority, it has more difficulty saying to immigrants and the children of immigrants that we’re all Americans. What is left, really, to being an American if we are all simply part of some abstract humanity? People who expect to retain the benefits of sovereignty—benefits like defense and protection of rights—without constitutional discipline, or without retaining responsibility for their own legal system, are really putting all their faith in words or in the idea that as long as we say nice things about humanity, everyone will feel better and we’ll all be safe. You could even say they are hanging a lot on incantations or on some kind of witchcraft. And as I mentioned earlier, the first theorist to write about sovereignty understood witchcraft as a fundamental threat to lawful authority and so finally to liberty and property and all the other rights of individuals.
Jean Yarbrough is professor of government and Gary M. Pendy, Sr. Professor of Social Sciences at Bowdoin College. She received her B.A. at Cedar Crest College and her M.A. and Ph.D. at the New School for Social Research. The author of American Virtues: Thomas Jefferson on the Character of a Free People and editor of The Essential Jefferson, she is currently completing a study of Theodore Roosevelt and the Progressive critique of the Founders.
The following address was delivered at Hillsdale College on April 16, 2009, at the dedication of a statue of Thomas Jefferson by Hillsdale College Associate Professor of Art Anthony Frudakis.
It is one of the wonders of the modern political world that John Adams and Thomas Jefferson both died on the 50th anniversary of the Declaration of Independence. Unaware that the “Sage of Monticello” had died earlier in the day, the crusty Adams, as he felt his own life slipping away, uttered his last words, “Thomas Jefferson still lives.” And so he does.
Today, as we dedicate this marvelous statue of our third President, and place him in the company of George Washington, Winston Churchill, and Margaret Thatcher on Hillsdale’s Liberty Walk, soon to be joined by Abraham Lincoln, it is fitting to reflect on what of Thomas Jefferson still lives. What is it that we honor him for here today?
Without question, pride of place must go to Jefferson as the author of the Declaration of Independence. That document established Jefferson as one of America’s great political poets, second only to Abraham Lincoln. And fittingly, it was Lincoln himself who recognized the signal importance of its first two paragraphs when he wrote: “All honor to Jefferson—to the man who, in the concrete pressure of a struggle for national independence by a single people, had the coolness, forecast, and capacity to introduce into a merely revolutionary document, an abstract truth, applicable to all men and all times,” where it continues to stand as “a rebuke and a stumbling block to the very harbingers of reappearing tyranny and oppression.”
That abstract truth, of course, was that “all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty, and the pursuit of Happiness.—That to secure these rights Governments are instituted among Men, deriving their just powers from the consent of the governed.” It is surely a sign of our times that so many Americans no longer know what these words mean, or what their signal importance has been to peoples around the world. The one thing they are certain of, however, is that Jefferson was a hypocrite. How could he assert that all men were created equal and yet own slaves? What these critics fail to notice is that this is precisely what makes Jefferson’s statement so remarkable. Under no necessity for doing so, he penned the immortal words that would ultimately be invoked to put the institution of slavery on the road to extinction. His own draft of the Declaration was even stronger. In it, he made it clear that blacks were human and that slavery was a moral abomination and a blot upon the honor of his country.
Jefferson was serving as Minister in Paris while the Constitution was being drafted, and played no direct part in framing it. But he did make known his objections, the most important being the omission of a Bill of Rights. After the Constitution was ratified, he returned to the United States to serve as Secretary of State in the Washington administration. In and out of government in the 1790s, he challenged Hamilton’s expansive views of federal power, warning against a mounting federal debt, a growing patronage machine, and what he considered dangerous monarchical pretensions.
In the tumultuous contest for the presidency in 1800, Jefferson presided over the first peaceful transition of power in modern history, assuring those he had defeated that they too had rights that the majority was bound to respect. His observation, “We are all Republicans, we are all Federalists,” established a standard toward which every incoming administration continues to strive.
As president of the United States, Jefferson sought to rally the country around the principles of limited government. His First Inaugural Address reminded his fellow citizens that their happiness and prosperity rested upon a “wise and frugal Government, which shall restrain men from injuring one another, shall leave them otherwise free to regulate their own pursuits of industry and improvement, and shall not take from the mouth of labor the bread it has earned.” This, he thought, was “the sum of good government” and all that was “necessary to close the circle of our felicities.” Although Jefferson had omitted property from the inalienable rights enumerated in the Declaration, he strongly defended private property because it encouraged industry and liberality—and, most importantly, because he thought it just that each individual enjoy the equal right to the fruits of his labor.
From these political principles, Jefferson never wavered. Writing in 1816, he once again insisted that the tasks of a liberal republic were few: government should restrain individuals from encroaching on the equal rights of others, compel them to contribute to the necessities of society, and require them to submit their disputes to an impartial judge. “When the laws have declared and enforced all this, they have fulfilled their functions.”
At the same time, Jefferson believed that constitutions must keep pace with the times. If the people wished to alter their frame of government, say, to fund public improvements or education, they were free to do so. But they should do so by constitutional amendment and not by allowing their representatives to construe the powers of government broadly. He particularly objected to the Court’s sitting in judgment on the powers of the legislative and executive branches, or acting as an umpire between the states and the federal government. To cede to the judiciary this authority, he believed, would render the Constitution a “ball of wax” in the hands of federal judges. In his battles with Chief Justice John Marshall, he defended the principle of coordinate construction, as Lincoln (and almost every strong president since then) did after him, arguing that each branch of government must determine for itself the constitutionality of its acts.
After his retirement from politics, Jefferson returned to Monticello, where he continued to think about the meaning and requirements of republican government. Republicanism, he was convinced, was more than just a set of institutional arrangements; at bottom, it depended upon the character of the people. To keep alive this civic spirit, he championed public education for both boys and girls, with the most talented boys going on at public expense all the way through college. He envisioned the University of Virginia, to which he devoted the last years of his life, as a temple that would keep alive the “vestal flame” of republicanism and train men for public service. And here, I cannot help but notice how the recent renovations and additions to the Hillsdale campus seem to take their inspiration from Mr. Jefferson’s university, paying graceful homage to an architecture of democracy that inspires and ennobles.
As Jefferson understood it, education had a distinctly political mission, beginning at the elementary level: schools were to form citizens who understood their rights and duties, who knew how earlier free societies had risen to greatness, and by what errors and vices they had declined. Knowing was not enough, however. Jefferson also believed that citizens must have the opportunity to act. Anticipating Tocqueville, Jefferson admired the strength of the New England townships and sought to adapt them to Virginia. The wards, as he called them, would allow citizens to have a say on those matters most interesting to them, such as the education of their children and the protection of their property. If ever they became too dispirited to care about these things, republican government could not survive.
The wards were certainly not the greatest of Jefferson’s contributions to the natural rights republic—that honor must be awarded to the Declaration—but they were his most original. Instead of consolidating power or attempting to forge a general will, Jefferson went in the opposite direction, “dividing and sub-dividing” political power, while multiplying the number of interests and views that could be heard. He saw these units of local self-government as a way of bringing the large republic within the reach of citizens and so keeping alive the spirit of republicanism so vital to its preservation. And in this day and age, when the federal government seems to intrude on every aspect of our daily lives, and people feel powerless over matters of most interest to them, can we doubt that he was right? For this insight, too, let us echo Lincoln: “All honor to Jefferson”!
Allen C. Guelzo is the Henry R. Luce Professor of the Civil War Era and Director of Civil War Era Studies at Gettysburg College. A two-time winner of the Lincoln Prize, his books include Lincoln’s Emancipation Proclamation: The End of Slavery in America and Lincoln and Douglas: The Debates That Defined America.
The following are excerpts from a speech delivered at Hillsdale College on May 8, 2009, at the dedication of a statue of Abraham Lincoln by Hillsdale College Associate Professor of Art Anthony Frudakis.
Heroes have become invisible. Their virtues have become unexplainable in the language we now use to explain human actions . . . . Great deeds somehow keep on being done, but we have lost a capacity to see them as great. Biographies grow to greater and greater lengths, while the subjects of them shrink into the shadows of the pedestrian, the ordinary, and the relentlessly disclosed secret. And no history textbook can today pass muster unless it highlights the insignificant, reduces absolutes to local accident, and eliminates grand narratives in favor of a collection of tales, full of sound and fury, whose chief goal is to elicit pity, sympathy or guilt.
The hero is the story, not just of a good deed, but a great deed—a great deed which climbs the unclimbable, endures the unendurable, holds fast to the lost. But who can be a hero when climbing is so routine that Mt. Everest has become littered with discarded bottles and cans? The dark side of our bottomless wealth and comfort is a cynicism which disarms any motivation for sacrifice, and a suspicion that, in a world of comforts, heroes can only be play-actors. Something other than the heroic must be motivating the heroes, we seem to reason, because there is so little need for heroism. . . .
* * *
What we do here today, in dedicating Tony Frudakis’s statue of Abraham Lincoln, flies so finely in the face of this age of post-heroism that somewhere, we can be sure some voice will fix on this event to tell us that this is all farce—that Lincoln cannot be a hero because he was a racist, or that he cannot be the savior of the Union because the Union was rotten to its exploitative, capitalist, war-mongering, imperialist, Christ-loving, minority-massacring, little-Eichmann core and couldn’t deserve a savior.
For six decades after his death, this was not so. Lincoln was the quintessential, the indispensable, American hero. Of the 600 or so statues dedicated to American presidents, fully one third are of Abraham Lincoln; one of them, Daniel Chester French’s seated Lincoln in the Lincoln Memorial, may be the most famous American statue ever created. But the post-World War One cultural malaise, which inaugurated an era of literary debunking and political minimalism, curved the arc of other Lincoln statuary downwards, away from the wise, heroic statesman and in the direction of a more folksy, proletarian Lincoln. Even in Lincoln’s Illinois, statuary of Lincoln continues to bring him off pedestals, closer to the earth, sitting on park benches, in the fashion of Jeff Garland’s 2001 Just Don’t Sit There, Do Something, a park-bench Lincoln which was decapitated in 2007 as a wedding prank…Rick Harney’s 2006 Lincoln at Leisure, which captures a shirt-sleeved Lincoln leaning on a fence…and, in Springfield, John W. McClarey’s A Greater Task, which is supposed to depict Lincoln grasping his coat around him as he delivers his farewell speech in 1861, but which ends up making him look like a derelict panhandling for spare quarters.
The statues, however, only reflect a larger decline in our estimate of Lincoln. In a multicultural perspective, no triumphal, Union-saving Lincoln is allowed to emerge; multiculturalism is the celebration of ordinariness, informality, and egalitarianism. Which is why most people today are interested in knowing whether Lincoln was gay rather than knowing whether he was right. . . .
* * *
The price we pay for this, in our schools and in our public discourse as well as in our statuary, is a steep one. Political systems, whether constitutional regimes or political parties, rest on a bedrock of culture—of certain shared assumptions, rituals, and unexamined attitudes—which can sometimes seem to have the stolid immovability of granite, and which at others can seem to have the fragility of snow crusts. The difference is made by confidence, which itself is composed in equal parts of practical results and constant reminders. So a constitutional regime appears to be a collection of laws and statutes; but those laws and statutes depend first on a reverence for words, for reason, and for orderliness. And that reverence must grow both from the confidence that words, reasons, order, laws and statutes really do protect and assist them, and from the constant dinning into the ears of its citizens that same confidence. On the other hand, in a culture of repudiation, where venality, corruption and incompetence produce chaos or violence, and knowledge is reduced to a species of power, confidence in words evaporates, and so do constitutions; but when examples of civic good are corroded and dissolved by victimhood and grievance, confidence evaporates just as quickly. And all the king’s horses and all the king’s men cannot put it back together again, because there are no more kings among men. . . .
* * *
So what is there of the hero in the statue we dedicate here today? If we mean by ‘hero’ merely a sword-swinging swashbuckler on a spree, we will find little of that here (and in fact, it’s noticeable that in genuinely heroic statues of real soldiers, like the St. Gaudens of William Sherman in Central Park or the Henry Merwin Shrady statue of Ulysses Grant at the U.S. Capitol, no swords are ever swung). But this is because heroism is not about skull-cracking. It is, first of all, about profound moral conviction. The face of this Lincoln is set, not in excitement or antagonism, but in conviction. “I expect to maintain this contest until successful, or till I die, or am conquered, or my term expires, or Congress or the country forsakes me,” he wrote to Secretary of State William Seward in the summer of 1862, when things appeared bleak for the cause of the Union. Especially, Lincoln was single-minded in his commitment to emancipation. “While I remain in my present position,” Lincoln said in 1863, “I shall not attempt to retract or modify the emancipation proclamation; nor shall I return to slavery any person who is free by the terms of that proclamation.” And if, he added a year later, “the people should, by whatever mode or means, make it an Executive duty to re-enslave such persons, another, and not I, must be their instrument to perform it.” As he himself said, “I am a slow walker, but I never walk back.”
But heroism cannot be only a matter of conviction, since conviction and mere stubbornness are easy to confuse. The hero must also be the possessor of ability, and be conscious of that ability without any self-flattering hubris. People routinely underestimated Lincoln. After his election, one indignant newspaper editor demanded, “Who will write this ignorant man’s state papers?” That editor needn’t have worried. “Any man who took Lincoln for a simpleminded man,” said his old friend and legal associate, Leonard Swett, “would very soon wake [up] with his back in a ditch.” Swett especially remembered the deceptive shrewdness with which Lincoln conducted matters: “He kept a kind of account book of how things were progressing for three, or four months, and whenever I would get nervous and think things were going wrong, he would get out his estimates and show how everything on the great scale of action—the resolutions of Legislatures, the instructions of delegates, and things of that character—was going exactly as he expected.” No wonder that two years into the Civil War, Lincoln’s secretary, John Hay, could marvel that “the old man sits here and wields like a backwoods Jupiter the bolts of war and the machinery of government with a hand equally steady and equally firm. . . . There is no man in the country, so wise, so gentle and so firm. I believe the hand of God placed him where he is.”
Still, conviction and ability can often wilt in the face of antagonism, and Lincoln suffered enough antagonism to make the word fail on the lips. This statue shows a Lincoln of conviction and ability, but also of perseverance. Not angry defiance—for that, the hands would not be clasped behind him, but closed as fists in front of him, and the face would be contorted with rage. Instead, Lincoln’s face is set, composed, unblinking in the face of reality. The hands are joined, almost as a symbol of the Union he is determined to preserve—but notice that they are kept behind. Were they crossed before him, it would mean an end of forward motion. No, the man must lead the Union. He must endure a hurricane of abuse, and reconcile himself even to the prospect of failure, without whimper or casting blame; but he must always be prepared to move forward. Early in his career as an Illinois legislator, Lincoln said, “The probability that we may fall in the struggle ought not to deter us from the support of a cause we believe to be just; it shall not deter me.”
Francis Carpenter, who would go on to paint one of the greatest historical canvases in American history, the First Reading of the Emancipation Proclamation, understood how the old masters of the old world “had delighted in representations of the birth from the ocean of Venus, the goddess of love,” drifting in sweetly to shore on the half-shell. But the new republic in the new world had witnessed a far greater birth—what Carpenter called “the immaculate conception of Constitutional Liberty.” Ninety years after being conceived in liberty, the republic had experienced a new birth of freedom: “The long prayed for year of jubilee had come; the bonds of the oppressed were loosed; the prison doors were opened.” Surely, Carpenter believed, a voice might proclaim from heaven: “Behold…how a Man may be exalted to a dignity and glory almost divine, and give freedom to a race. Surely Art should unite with Eloquence and Poetry to celebrate such a theme.” Today, it has, and this statue is the mark. For a moment, the heroic has reasserted itself—not the reeking heroic of kings and emperors, but the heroic republican citizen, in broadcloth rather than in uniform…armed with conviction, perseverance and ability rather than a sword…standing, and always facing forward to the light.