Category: Religion and society

  • Economics for community

    As I mentioned previously, Daly and Cobb’s central concern is that the abstractions of economics leave out aspects of reality that are crucial to understanding the world and shaping the economy in a way that nourishes community and is sustainable in the long run. Following A.N. Whitehead, they refer to the phenomenon of treating an abstraction as exhaustive of the reality it describes as the “fallacy of misplaced concreteness.”

    Chief among these abstractions is the market. While the free exchange of goods and services is key to any flourishing economy, treating “the market” in isolation has some built-in limitations. These include the tendency for competition to be self-eliminating (monopoly), the corrosive effect that encouraging the pursuit of self-interest has on the moral context necessary to sustain the virtues the market order requires, the need for public goods and the existence of public “bads” (externalities), and the market’s blindness to judgments of value such as those pertaining to the distribution of wealth or the overall scale of the economy in relation to the surrounding ecosystem.

    Daly and Cobb also criticize the reliance on GNP as a measure of economic well-being. They argue that it doesn’t accurately reflect income, much less genuine economic welfare. Homo economicus is the model of the human self posited by much economic thought. It assumes a human being who’s interested primarily in maximizing utility understood in terms of consumption. Economics qua economics forbids us from making value judgments about individual preferences and seeks instead to understand how those preferences can be maximized. Finally, “land,” the economic stand-in for all of non-human nature, rather than being seen as a productive and living system with its own intrinsic value, is reduced to a largely passive and inert commodity. An overly idealistic point of view tends to see all resources as having their ultimate source in human ingenuity, predisposing economics to ignore the question of the finitude of resources.

    All of these abstractions, Daly and Cobb contend, serve to create an overly individualistic and short-term picture of the world and to lend support to similarly constituted policies. Their goal is to reconceive the context of economic life as being in service to community, including the wider community of non-human nature.

    To this end, they advocate a shift from short-term to long-term thinking, with particular attention to the scale of the economy. Their argument here is fairly simple: the economy is situated within an ecosystem that is finite in size (i.e., in terms of resources). Therefore, the economy cannot grow indefinitely. They define “scale” as population times the per-capita rate of resource use and maintain that our trajectory of growth is pushing against the limits imposed by the natural ecosystem within which our economic life exists.
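    Put in symbols (my own shorthand, not Daly and Cobb’s notation), the definition is simply:

    \[ S = N \times r \]

    where \(S\) is the scale of the economy (its total resource throughput), \(N\) is population, and \(r\) is the per-capita rate of resource use. The point of defining scale this way is that growth in either factor raises total throughput, and it’s total throughput, not population or affluence taken alone, that runs up against the finite limits of the surrounding ecosystem.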

    Consequently, what they think is necessary is an economy that is oriented away from growth and toward more of a steady-state model. Economic well-being shouldn’t be measured in terms of increasing consumption, but by a combination of economic and non-economic welfare. Individualism should be replaced by a vision of human beings as persons-in-community whose relationships to others are seen as constitutive of their identity. Economic development should focus on the well-being of the community as a whole rather than individuals.

    Concerning this last point, Daly and Cobb see communities as the fundamental building blocks of a sound economic order. But they are also decentralists who would like to see a revival of local communities over against the atomized cosmopolitanism that globalization promises. They envision a world in which one’s primary loyalty is to one’s local community, with increasing and overlapping circles of loyalty expanding outward. Unlike many on the Left, they have no particular affinity for “post-national” globalism.

    In fact, Daly and Cobb acknowledge that in our world the only entities currently able to resist globalization and foster steps toward an economic order more in line with their aspirations are nations. They are more or less unapologetic nationalists, which results in some surprising policy prescriptions that would put them at odds with much of the Left. They are against free trade and for protecting domestic industries by means of tariffs, they favor population control, and for most developed countries, including the US, they advocate a curtailment of immigration, particularly illegal immigration. Sounding for all the world like Pat Buchanan, they argue that a chief function of the nation-state is to secure its borders against unwanted immigrants. They oppose not only economic entanglements with foreign nations, but also foreign aid. All nations, in their view, need to be self-sufficient, at least in essentials. Finally, they support a defense policy of what could fairly be called non-interventionism and suggest that a United States less enmeshed in a global market would have less cause for foreign meddling.

    The keystone of Daly and Cobb’s position, then, is a community of more or less self-reliant communities whose economic life is geared to stability and self-sufficiency rather than expanded growth. This is rooted in what they describe as a biocentric and theistic vision that sees all of creation as related to a good God and as having value apart from human needs and interests. Their emphasis on the value of the biosphere leads them to support sustainable and organic agriculture, to favor subsistence agriculture over agriculture for commodity export, and to endorse a tax system similar to that proposed by Henry George, which treats land as a trust rather than a commodity.

    A lot of what’s contained in this volume will be familiar to anyone who’s paid much attention to debates about the economics of sustainability. What I find appealing about Daly and Cobb is their desire to foster a more decentralized, humane, and participatory economy instead of increased centralization. I also think they’re more realistic than some in viewing the nation-state as the best hope for gaining some measure of democratic control over economic life. Too often folks on the Left put what appears to me to be an unrealistic hope in international institutions like the UN, which, after all, are even further removed from popular control and participation than most national governments.

    However, I still can’t help but have some reservations about Daly and Cobb’s vision. On a sheerly factual level, I wish they’d spent more time making the case for a finite economy. To a certain extent they seem to cherry-pick their opponents, using the most extreme-sounding quotes from people like George Gilder. I would’ve liked to see more engagement with serious opponents of their view. Secondly, they seem to me at times insufficiently appreciative of the real benefits of liberal individualism. Like many who oppose “community” to “individualism,” they tend to paint the former almost exclusively in glowing terms that downplay the genuine difficulties of close-knit community. There’s a real tension between individual liberty and community control, however democratic. To the extent that the community exercises control over a particular area of life, it leaves less room for individual discretion. There’s a genuine balancing act there, and I’m not sure Daly and Cobb have paid much attention to it (their discussion of population control, for instance, is disturbingly sanguine about China’s coercive policies without actually advocating them). Finally, they don’t, in my view, deal adequately with the objection that participation in an expanding economy is necessary for many people in the world to escape from grinding poverty.

    Overall, though, Daly and Cobb seem to me to be asking the right questions: Is an ever-expanding economy consistent with the limits imposed by ecological fragility? How do we reconcile the need for democratic control over the economy with individual freedom? What kind of balance should be struck between ties to local community and a more cosmopolitan outlook? How do we honor the value of God’s creation without sacrificing vital human interests? These all strike me as among the most important questions we face in the 21st century, even if I’m not satisfied in every case with Daly and Cobb’s answers.

  • Religious myths

    I got my hands on a copy of Keith Ward’s Is Religion Dangerous? courtesy of our local library and have been enjoying it very much.

    In the introduction alone Ward takes on several myths about the study of religion that tend to be propagated by its cultured despisers:

    1. “Religion” is a univocal term. Ward points out the obvious (but frequently overlooked or elided) fact that the term “religion” covers a broad array of phenomena and it’s by no means easy to identify a core of belief or practice common to everything we would identify as a religion. “Is Communism a religion? Or football? Or Scientology? How do we know what a religion is?” (p. 8). And this makes it extremely difficult to say that “religion” as such is good or bad:

    There are obviously many different sorts of things that we can call ‘religion’. Since religions have existed as far back as we can trace the history of the human race, and in almost every society we know about, there are going to be as many different religions as there are human cultures. They are going to exhibit all the variety and all the various stages of development of the cultures in which they exist. That is going to make it virtually impossible to say that religion, as such, at every stage of its development and in all its varieties, is dangerous. (pp. 9-10)

    2. The true nature of religion is given by its earliest examples. Early anthropological studies of religion that first took up the attempt to explain religion as a natural phenomenon made two questionable assumptions. The first was that religious beliefs were false and thus to be explained entirely in naturalistic terms. The second was that so-called primitive religion showed the “essence” of religion and that all more developed religions were ultimately reducible to this essence. Religion, the story goes, began when people attributed personalistic characteristics to the natural objects around them, giving rise to animism, the earliest form of religion. Gradually, however, these spirits were combined into a single spirit and monotheism was born. These beliefs were rooted in early humans’ attempts to make sense of and exert control over their environment. But now that we have science these beliefs have been revealed as superstitious and irrelevant.

    The problem with this view, says Ward, is that there is very little evidence to support it. We simply don’t have access to the religious beliefs of early human beings, nor do we know in what order they developed. “It seems more like pure speculation without any evidence at all — a story that might appeal to us, given certain general beliefs about the universe and a generally materialist philosophical outlook” (p. 13).

    3. Early people took their religious beliefs “literally.” We commonly assume that people in the past took their religious beliefs literally and only gradually came to think of them as symbols or metaphors. Sometimes atheists accuse more “sophisticated” religious believers of not really being religious since they recognize the role of myth, symbolism, and metaphor in religion. The implication is that real, sincere religious belief means literalism.

    But Ward calls this assumption into question. For starters, we simply have very little evidence about the content of the religious beliefs of “primitive” people. “We simply have no way of knowing how they interpreted their religious ideas. The truth is that we know virtually nothing about the first origins of religious belief” (p. 13). Again, the assumption that the evolution of belief starts from literalism and gradually moves to symbolism and metaphor is more a philosophical dogma than the result of empirical investigation. In fact, Ward suggests, it may well be that literalism is the latecomer on the scene:

    If humans have evolved, then it will be true that at some stage, many tens of thousands of years ago, human thought would have been less developed than it is now. But does that mean it would have been more literal? Perhaps literalness is a late development, and the idea that artefacts should literally be like what they represent — or even the idea of ‘literalness’ itself — is a concept that only developed when humans began to think scientifically or analytically. (p. 15)

    Ward cites anthropological investigations in India where worshipers are puzzled by questions about whether the gods are “real” or whether the images “really” represent them. And linguists have long recognized that virtually all human language is metaphorical to some degree. A purely literal language about anything, much less about the divine, may well be impossible for us. “Metaphorical thinking is deeply rooted in the human mind. It may be the case that very early human thinking was more metaphorical than literal in nature” (p. 15).

    4. It is inauthentic for religion to develop. This myth can take religious or anti-religious forms. The atheist may point to later, more sophisticated forms of religion as not reflecting the “real” nature of the faith. This is often an attempt to catch the “moderate” believer on the horns of a dilemma: either you’re a fundamentalist or you’re not a genuine believer. Ironically, the same argument can be made by fundamentalists of all stripes; the “faith once delivered” is taken to be a set of timeless truths that can never change, and any re-thinking of previous expressions of the faith is tantamount to apostasy.

    Ward’s contention is that one of the positive fruits of the scientific study of religion has been the realization that religions do develop and that later forms aren’t necessarily inauthentic expressions of the faith. Since religious ideas are ways of trying to give expression to a reality that is “beyond all images” they naturally become more or less effective over time. That doesn’t mean they have no basis in objective reality, but that they can never perfectly depict it and are therefore subject to critique and revision. “Once we escape the delusion that [religion’s] earliest stage provides its real essence, we will be able to see that it is a continually developing set of diverse traditions” (p. 20).

    5. Religious belief is primarily aimed at explanation. One common atheistic argument, related to a particular story about how religion developed, assumes that religious belief is primarily about explaining why things happen, a kind of proto-science. But once science with its superior explanatory power comes along, the “God hypothesis” is rendered unnecessary.

    This may be a powerful argument against, say, 18th-century deism, but it’s not particularly convincing as an argument against religious belief as such. It’s not at all obvious that religious people either today, or historically, believe in God primarily as some kind of explanatory hypothesis. For instance, it’s been a commonplace of biblical scholarship for some time that the ancient Israelites first became aware of Yahweh through the powerful experience of deliverance from Egypt and only later did his role as universal creator become apparent to them. They didn’t propose the existence of God as a hypothesis to explain creation; rather through their awareness of his power and loving-kindness it became obvious that he must also be the Lord of all creation.

    As Ward says, “if we look at present religious beliefs, they are not only, or even mainly, used to explain why things happen. They are used to console, inspire and motivate, but not to explain” (p. 17):

    It looks as if the roots of religious belief do not lie in attempts to explain why things happen. If we ask intelligent modern believers where the roots of their belief lie, many different sorts of answers would be given, but rarely that their beliefs explain why things happen. One answer, and I think it is a very important one, would refer to experiences of a transcendent power and value, of greater significance and moral power than anything human. The metaphors of religious speech — metaphors of ‘dazzling darkness’ or ‘personal presence’ — are inadequate attempts to express such experiences of transcendence. Why should it ever have been different? For all we know, early religion could have originated in experiences of a transcendent spiritual reality, especially in the vivid experiences, sometimes in dreams and visions, of shamans or holy men and women. (pp. 17-18)

    I’m sure Ward wouldn’t deny that religious belief can sometimes play the role of explanation, but more often than not this isn’t to explain particular phenomena, but to offer more “global” sorts of explanations. For instance, Leibniz’s question Why is there something rather than nothing? may not demand the existence of a god, but it can point to or suggest it. The same goes for the question of why the universe has the particular order it does, one that seems “fine-tuned” to give rise to intelligent personal life. The existence of a personal God can make sense of these global phenomena that appear to be beyond the reach of scientific explanation.

    Ward’s point in discussing these myths is that any study of religion that proposes to evaluate whether it is on the whole and all things considered a good or bad thing needs to look at it in all its complexity and as it is actually lived. Too often critics of “religion” are attacking what is essentially a straw man or an ideological construct.

  • Debating tactics

    Only in Berkeley would you get a debate between Christopher Hitchens, who thinks that all religion is evil, and Chris Hedges, who merely thinks that all “religious orthodoxy” is evil, billed as a debate over the merits of religion. Hitchens seems to like soft targets; I’d like to see him debate a serious orthodox Christian thinker: Stanley Hauerwas, maybe? I have a feeling the cantankerous Texan could hold his own against Hitch.

  • Niebuhr and the neocons

    Thanks to Michael Westmoreland-White for pointing out this interview with liberal theologian and social ethicist Gary Dorrien. Dorrien, who now holds the Reinhold Niebuhr chair in social ethics at Union Theological Seminary, points out that while Niebuhr held many different and incompatible political views over the course of his life, the current US policy in Iraq is completely at odds with the main thrust of Niebuhr’s thought, which emphasized the perils of unintended consequences and the selfishness of collectives such as nations, a selfishness that often clothes itself in the robes of righteousness.

    Q. What insights of Niebuhr’s are most pertinent for the nation’s public life today?

    A. His sense that elements of self-interest and pride lurk even in the best of human actions. His recognition that a special synergy of selfishness operates in collectivities like nations. His critique of Americans’ belief in their country’s innocence and exceptionalism — the idea that we are a redeemer nation going abroad never to conquer, only to liberate.

    Q. You’ve written two critical books on political neoconservatism. Don’t many neoconservatives claim to be Niebuhrians?

    A. In various phases of his public career, Niebuhr was a liberal pacifist, a neo-Marxist revolutionary, a Social Democratic realist, a cold war liberal and, at the end, an opponent of the war in Vietnam. He zigged and zagged enough that all sorts of political types claim to be his heirs. Even the neoconservatives can point to a few things.

    But over all, they’re kidding themselves. Niebuhr’s passion for social justice was a constant through all his changes. Politically he identified with the Democratic left. We can only wish that the neocons had absorbed even half of his realism.

    Niebuhr often gets criticized nowadays for having been too complacent about the use of power and inattentive to the need for a Christian ethic that offered a countercultural witness to the norms of “realism.” And while there’s some truth to that, we could still stand to re-learn some of the lessons he tried to impart.

  • Just War theory and the “charism of discernment”

    This post from Catholic theologian William Cavanaugh revisits some of the arguments of pro-Iraq war Catholics, in particular papal biographer George Weigel (link via Eric).

    Weigel’s notion of a “charism of political responsibility/discernment” is muddled at best. Here’s the relevant passage from his “Moral Clarity in a Time of War”:

    If the just war tradition is indeed a tradition of statecraft, then the proper role of religious leaders and public intellectuals is to do everything possible to clarify the moral issues at stake in a time of war, while recognizing that what we might call the “charism of responsibility” lies elsewhere — with duly constituted public authorities, who are more fully informed about the relevant facts and who must bear the weight of responsible decision-making and governance. It is simply clericalism to suggest that religious leaders and public intellectuals “own” the just war tradition in a singular way.

    As I have argued above, many of today’s religious leaders and public intellectuals have suffered severe amnesia about core components of the tradition, and can hardly be said to own it in any serious intellectual sense of ownership. But even if today’s religious leaders and public intellectuals were fully in possession of the tradition, the burden of decision-making would still lie elsewhere. Religious leaders and public intellectuals are called to nurture and develop the moral-philosophical riches of the just war tradition. The tradition itself, however, exists to serve statesmen.

    There is a charism of political discernment that is unique to the vocation of public service. That charism is not shared by bishops, stated clerks, rabbis, imams, or ecumenical and interreligious agencies. Moral clarity in a time of war demands moral seriousness from public officials. It also demands a measure of political modesty from religious leaders and public intellectuals, in the give-and-take of democratic deliberation.

    Now, you could legitimately argue, I think, that public officials have the unique responsibility for making decisions to go to war, but that’s no reason to suppose that they are given a unique gift of discernment or judgment. It’s true that they will often have access to privileged information (though, fat lot of good it did ‘em in the case of Iraq) but that’s a separate issue.

    What Weigel seems to imply is that public officials are granted almost supernatural aid in deciding whether or not a given war is just. I can’t imagine what in the tradition would support this claim unless we’re reverting to the idea of the king as God’s anointed.

    Cavanaugh puts it well:

    Regardless of the facts of this particular case, moral judgments about war, like all moral judgments, are not primarily a matter of good information. Good information is a necessary, but not sufficient, condition for sound moral judgments. Sound moral judgments depend on being formed in certain virtues. Why a Christian should assume that the president of a secular nation-state would be so formed – much less enjoy a certain “charism” of moral judgment – is a mystery to me. “Charism” is a theological term denoting a gift of the Holy Spirit. To apply such a term to whomever the electoral process of a secular nation-state happens to cough up does not strike me as theologically sound or practically wise.

    It’s also worth pointing out that the Constitution envisioned war being declared by Congress, not the President (Article I, Section 8). While again it’s true that public officials have a unique responsibility for making these decisions, they aren’t guaranteed a special wisdom. It seems to me that only an inflated, quasi-monarchical concept of the presidency would even be tempted to impute this kind of “charism” to the occupant of the Oval Office. If the decision to go to war were kept with Congress (or, heck, with a plebiscite), there would probably be much less temptation toward this kind of obscurantism.

  • Is religion dangerous?

    Saw an ad for this in the new First Things: Keith Ward (see here) has written a response of sorts to the “new atheist” crowd. I imagine it’s the usual kind of irenic, thoughtful stuff Ward is known for.

    I’ve often thought that the whole issue of whether “religion” is on the whole good or bad is a pretty muddled one. In addition to the probably insoluble matter of deciding what exactly counts as a religion, there’s no religion-less society to act as a control group in determining whether the influence of religion has been on the whole good or bad. And beyond that it’s very difficult to see how you would weigh the moral improvements against the moral defects that are arguably attributable to a particular religion. Was the Inquisition worth the outlawing of infanticide? And so on. Plus there’s the issue of causality: how do we know what’s attributable to religion? For instance, several scholars, including secular ones, have made the case that modern science arose in the West in part precisely because of the Christian worldview. The idea of a God who creates a universe that displays a rational order served as an impetus to discovering that order. But such a hypothesis hardly admits of definitive proof one way or the other.

  • Christians and war revisited

    Doug Bandow has an article worth reading on Christians and the Iraq war.

    I think we see here one of the problems with Just War theory, a problem that many pacifists have pointed out, namely that it can be so flexible as to (rhetorically at least) justify virtually any war.

    However, Just War adherents obviously think that pacifism is too high a price to pay for a bright, clear line about when to go to war. But Bandow articulates what some JW thinkers have called the presumption against the use of force:

    Christians should be particularly humble before advocating war. War means killing, of innocent and criminal alike. It means destroying the social stability and security that creates an environment conducive for people to worship God, raise families, create communities, work productively, and achieve success – in short, to enjoy safe and satisfying lives. Wars rarely turn out as expected, and the unintended consequences, as in Iraq, often are catastrophic.

    Indeed, in Iraq the U.S. has essentially killed hundreds of thousands of people in the name of humanitarianism. Christians, even more than their unbelieving neighbors, should be pained by the horror of sectarian conflict unleashed by the actions of their government with their support. Believers especially should eschew nationalistic triumphalism in pursuit of war. And when they err, like predicting health, wealth, liberty, and happiness in occupied Iraq, they should acknowledge fault – and seek forgiveness. At the very least they should exhibit humility before saddling their white horses to begin another crusade.

    I tried to make a similar point here, specifically with respect to proposed humanitarian interventions. A lot depends on whether we see war as an extraordinary last resort or as a routine tool of statecraft. Andrew Bacevich and others have argued that Americans have come to see war as the latter, with disastrous results. And Bandow is surely right that Christians, even if they’re not pacifists, should be wary of war and set the bar high for supporting it.

  • Is Ron Paul right?

    The debate kerfuffle between Ron Paul and Rudy Giuliani over the question of the causes behind the 9/11 attacks has generated a fair amount of comment. I think Paul got the better of the exchange and Giuliani came across as a bit of a demagogue, but it’s still worth asking whether Paul is right here.

    Talking about the connection between our interventionist foreign policy and “blowback” in the form of terrorism has been the genuine third rail of US politics over the last 5+ years. What I didn’t hear Paul say was that we in any way deserved the 9/11 attacks. This is the canard frequently used against people who try to explain the motives of the terrorists with reference to US foreign policy. But there’s a big difference between explaining something and justifying it. Saying that OBL and co. want to attack us because we’re “over there” as Paul puts it does not imply that they were right to do so.

    My view has been that our interventions in the Middle East are at least a contributing factor in Islamist terrorism and the 9/11 attacks. I don’t want to discount the role of Islamic extremism, as some leftists and anti-war conservatives seem to do. The former often advert to sheerly economic or political explanations, while the latter sometimes fixate on the role of Israel. Nevertheless, as Paul pointed out in the debate, bin Laden and his confederates have explicitly said that they attacked us because of our presence over there. It would be extremely foolish to disregard their own account of their motives, even if it’s not the full story.

    An important component, I would think, of any sound strategy against terrorism would be to “peel off” potential supporters of terrorist groups by listening to their concerns about our presence in the region. Granted, there is a hard core of radicalized jihadists who will be swayed by nothing, but terrorist groups seem to thrive only when they have some kind of support from the larger public. Presumably one of the reasons the IRA was able to carry on its campaigns for so long was that there were people not directly involved who at least sympathized to some degree. Paul is surely right that it’s important to ask how we would feel if some other country were meddling in our affairs the way we do in the Middle East (and elsewhere).

    And even apart from the question of blowback, we need to ask whether our interventions are a) good for the US on the whole and in the long run and b) morally legitimate. Even if Osama bin Laden didn’t oppose it, there’s still reason to doubt whether US forces should’ve been stationed in Saudi Arabia, just as there’s a legitimate question whether our forces should remain stationed in Iraq. And the fact that it would likely make the Iranian people dislike us even more (possibly leading to terrorist reprisals) is not the only reason to doubt the wisdom of attacking Iran to prevent the government there from acquiring nuclear weapons.

    Conservatives have reacted (at times understandably) against the leftist litany of American misdeeds, but this has all too often spilled over into an uncritical approval of everything the US does or has ever done. If conservatism means anything it means dealing with reality as it is, not as you would wish it to be. At least the kinds of conservative thinkers I’ve always found congenial are those who criticize simplistic, utopian, and ideological thinking. Repeating the mantra that “they hate us because we’re free” won’t help us understand our enemies and ultimately deal more intelligently with them.

    Moreover, Christians of all people should be able to look unflinchingly at their own sins. We don’t need to pretend that we, individually or collectively, are free from fault. Believing in the power of forgiveness ought to enable us to look honestly at our own failings and those of our country, without sliding into self-loathing. We shouldn’t have to fear acknowledging them and, if necessary, changing course. That’s part of what I think Christians should bring to the civic conversation, especially when political parties seem institutionally committed to an uncritical nationalism.

  • The triumph of anti-Constantinianism

    Over at Faith and Theology there’s a (somewhat tongue-in-cheek) poll on the “worst theological invention.” What’s interesting is not just that only one of the “inventions” is an actual heresy, but that “Christendom” and “just war theory” got enough nominations to make the poll. (Though, in fairness, biblical inerrancy and “the Rapture” are the current leading contenders for worst.)

    I say this is interesting not so much to disagree but to wonder at the fact that, at least in certain theological circles, the radical reformation/free church revisionist account of Christian history has triumphed almost completely and with little opposition. The story is that the early church was radically countercultural and pacifist until the conversion of the Emperor Constantine (who didn’t make Christianity the state religion, as is sometimes asserted, but did institute religious toleration and opened the door for eventual establishment). From there the story is one of steep decline wherein the church becomes complicit in war, imperialism, crusades, slavery, genocide, you name it, roughly until, well, now. Just war theory is one manifestation of Christendom’s attitude of compromise toward worldly powers. Granted, there are always dissenters upheld as heirs of the true anti-Constantinian gospel, such as the Anabaptists, but the overall picture is a pretty bleak one. The prescription that usually follows this re-telling of the history is for the church to return to its countercultural roots in order to provide a radical witness against war, capitalism, consumerism, “radical individualism,” and other ills of the modern age.

    In much of the recent academic theology I’ve read (which is admittedly a limited sample) this story seems to be taken almost for granted. The only major theologian I can think of who has really contested this account is Oliver O’Donovan. But I can’t help but wonder why magisterial Protestants, Catholics, and Orthodox Christians (for whom the Emperor Constantine is in fact a saint) haven’t been more ready to look critically at this anti-Constantinian/anti-Christendom narrative. After all, doesn’t it imply that the church went deeply and radically wrong for pretty much most of its history? What does this imply for the doctrine of providence, for instance? And what does it say about the practice of infant baptism, which seems to fit better with the quasi-state church model than with the practice of believer’s baptism associated with the free churches? And what about the Christological dogmas formulated in many cases under the watchful eye of the emperor? Can they still be deemed legitimate?

    Again, I’m not saying the revisionist story is out-and-out false. I just suspect that mainline Christians have been too quick to jump on the anti-Constantinian bandwagon rather than sifting the wheat from the chaff when it comes to the legacy of Christendom.

  • Preemption, prevention, and the Pope

    Michael Novak and Richard John Neuhaus have both offered some critical comments on Pope Benedict’s Easter address, in which Benedict reiterated (by implication, at least) some of his criticisms of the Iraq war. Novak has consistently remained a steadfast supporter of President Bush, so his comments aren’t particularly novel or surprising; he offers the now-clichéd rebuttal that the Pope, much like the “American Left,” is ignoring all the “good news” coming out of Iraq.

    Neuhaus, by contrast, has expressed at least some misgivings about the war over the last several months, but here tries to get the Bush Administration off the hook for its embrace of “preventive war,” which, as numerous theologians, including the Pope himself, have pointed out, is incompatible with Catholic teaching on Just War:

    Talk about preemptive war was part of the Bush administration’s less than careful (others would say arrogant) strategic language, most assertively expressed in the statement on national security of September 2002. Language about preemptive war was provocative and entirely unnecessary. As George Weigel has explained (here and here) in the pages of First Things, traditional just-war doctrine adequately provides for the use of military force in the face of a clear and present threat of aggression. Such a use of force is more accurately described as defensive rather than preemptive, and it is worth keeping in mind that in 2003 all the countries with developed intelligence services agreed that Saddam Hussein had or was quickly developing weapons of mass destruction that he intended to use in aggressive war.

    There needs to be a distinction made between “preemptive” war and “preventive” war. Fr. Neuhaus is correct that preemption is allowed for in Just War thinking. If a country is facing an imminent threat it needn’t wait for the other side to attack before engaging in defensive action. The textbook (literally) example of this is Israel’s preemptive attack which began the Six Day War.

    But “preventive” war refers to initiating hostilities when the threat is only hypothetical. Daniel Larison dissects some of the problems with this concept here, but it is, to say the least, far harder to justify according to traditional Just War criteria.

    Fr. Neuhaus, unfortunately, seems to be engaging in a bit of sleight-of-hand here when he talks about the supposed threat from Iraq as a “clear and present threat of aggression” and says that “all the countries with developed intelligence services agreed that Saddam Hussein had or was quickly developing weapons of mass destruction that he intended to use in aggressive war.” The “threat” posed by Hussein’s regime was always a very hypothetical one, relying on a chain of inferences involving its possession of WMDs, its alleged ties to al-Qaeda (always the weakest of the Administration’s arguments), and the claim that it couldn’t be deterred from launching what would appear to be a suicidal attack on the U.S. via these terrorist proxies. Even Administration spokesmen shied away from describing this “threat” as “imminent.” In fact, President Bush himself, in his 2003 State of the Union address, said:

    Some have said we must not act until the threat is imminent. Since when have terrorists and tyrants announced their intentions, politely putting us on notice before they strike? If this threat is permitted to fully and suddenly emerge, all actions, all words, and all recriminations would come too late.

    In fact, after it became clear that the threat from Saddam’s Iraq was largely illusory, there was a concerted effort by Administration spokesmen to deny that they ever claimed that the threat was “imminent.”

    Now, it’s open to the defender of preventive war to argue that a threat needn’t be imminent for war to be justified, but that would represent a serious departure from the Just War tradition; to mention only one problem, it’s very difficult to see how preventive war could be reconciled with the criterion of “last resort.” But, if so, it should at least be admitted that it is a departure. Either the Administration was claiming that the threat from Saddam was imminent, in which case it was either wrong or dissembling, or it was not claiming the threat was imminent, in which case it went to war in contravention of accepted Just War principles.