Note from the Editor. Richard Rorty's invited address to the 2005 annual meeting of the Society for Business Ethics prompted extensive comments from and discussion by those attending the meeting. In light of this, Professor Rorty's address is reprinted here, and is followed by invited responses from three past presidents of the Society for Business Ethics, and a brief concluding comment from Richard Rorty.

IS PHILOSOPHY RELEVANT TO APPLIED ETHICS? Invited Address to the Society for Business Ethics Annual Meeting, August 2005

Richard Rorty

Abstract: If, like Hegel and Dewey, one takes a historicist, anti-Platonist view of moral progress, one will be dubious about the idea that moral theory can be more than the systematization of the widely shared moral intuitions of a certain time and place. One will follow Shelley, Dewey, and Patricia Werhane in emphasizing the role of the imagination in making moral progress possible. Taking this stance will lead one to conclude that although philosophy is indeed relevant to applied ethics, it is not more relevant than many other fields of study (such as history, law, political science, anthropology, literature, and theology).

2006. Business Ethics Quarterly, Volume 16, Issue 3. ISSN 1052-150X. pp. 369-380

Philosophy has a glorious past and an uncertain future. That is why, when thinking about our role in intellectual life, we philosophy professors prefer to look backward. Doing so lets us see ourselves as the successors of Plato, St. Augustine, Spinoza, Kant, Marx, and Nietzsche. Those men imagined new shapes that the lives of individuals and communities might assume. Thinking of ourselves as their heirs helps us imagine that we might shape the human future.

When we turn from the past to the present, however, we remember that we are not being paid to foment intellectual or social revolutions. We have been hired by colleges and universities to be responsible professionals, content to work within a well-defined area of expertise. As philosophy became one more academic discipline, it became harder for philosophers to do something bold and original. For the more original and imaginative a philosopher is, the less she looks like a skilled, well-trained, disciplined professional. You cannot professionalize imaginativeness.

The professionalization of philosophy began about two hundred years ago, in a period when the modern research university was beginning to take shape. That period, the time of Hegel and Humboldt, was also philosophy's acme. Since then, philosophy has gradually lost prestige. It has become invisible to the general public, and has drifted off the radar screens of most intellectuals.

The principal reason for this marginalization is that the so-called "warfare between science and theology" has tapered off. By and large, science has triumphed. Despite occasional flare-ups, such as the current assault on evolutionary biology, most Westerners who read books are content to let their view of the universe be shaped by the natural sciences. Religious faith has been revised, by most educated believers, so as not to conflict with the stories scientists tell.

As long as the warfare between science and theology lasted, there was an important role for philosophical theories to play. For thinkers like Hobbes, Spinoza, Hume, Kant, Hegel, Mill, and Marx all confronted an urgent question: how can the moral idealism common to Platonism, Judaism, and Christianity survive after we have accepted a materialist account of the way the universe works? What is the place of moral ideals in a clockwork universe? As long as those questions dominated intellectual life, philosophical theories about the nature of reality and about the scope and limits of human knowledge were still relevant to felt needs. The question of whether human beings were more than just clever animals remained urgent.

In recent times, however, the question "What is it to be a human being?" ("What makes human beings special?") has lost its urgency. That question has been replaced by two others.
The first is political: "How can we create a better world for our descendants to inhabit?" The second is existential: "What sort of person shall I try to become?" This change resulted from an increased awareness of the possibility of radical historical change, and from an increased tolerance for human diversity. In the course of the past two centuries, historians, novelists, and anthropologists have helped us realize that the human future need not resemble the human past. The philosophers of the seventeenth and eighteenth centuries shared with Plato and Epicurus the assumption that the human situation was essentially the same at all times and places. But we moderns have come to think that there is no single Good Life for Man. We no longer take for granted that all the members of our species share an unchanging essence, and that knowledge of this essence can help us decide what to do with ourselves.

Plato invented philosophy by postulating the existence of unchanging essences. So the changes in the intellectual life of the West have, in the course of the last two centuries, produced a spate of radically anti-Platonic philosophies. The historicism that Marx took over from Hegel was the first radical break with Platonic essentialism. Marx's thesis that philosophers should stop trying to understand the world and start trying to change it epitomized the intellectuals' new-found willingness to turn away from metaphysics in favor of politics.



Nietzsche's existentialism was the counterpart of Marxism at the level of the private individual. "If you have a virtue," Nietzsche said, "let it be your virtue." His point was that virtue should not be thought of as a matter of conforming to antecedent norms but rather of being faithful to one's own project of self-creation. In the hundred-odd years since Nietzsche wrote, anti-Platonist movements with names like "pragmatism," "existentialism," and "postmodernism" have proliferated. The burden of all of them was pretty much the same: there are no unchanging essences to be grasped, only new ways of describing both ourselves and the universe to be invented.

Anti-Platonists are often accused of relativism. Doubting the existence of something transcultural and ahistorical that can provide guidance for moral choice, these accusers say, amounts to denying the existence of absolute values. But we know that slavery and torture are absolutely, objectively, wrong. So, they argue, there must be something that makes them so: if not the Will of God, then the Moral Law, or Reason, or some other secular surrogate for the divine.

Anti-Platonists like myself, the people sometimes described as "postmodern relativists," do not accept the correspondence theory of truth. So we have no use for the notion of a belief being made true by the world. We think that adding "absolute" to "wrong" or "objective" to "truth" is an empty rhetorical gesture. It is just a way of pounding on the podium. Saying that torture is absolutely wrong does nothing to still doubts about whether to save the city by torturing the terrorist. Everybody agrees that it is absolutely wrong to refuse sick children medical help, but nobody agrees on how the doctor's bill is to be paid. When decisions get tough, invoking notions like "absoluteness" and "objectivity" does nothing to make them easier. Philosophers' claims to have Truth and Reason on their side resemble theologians' claims to know the will of God.
Such claims are advertising slogans rather than arguments. God has provided no algorithms for resolving tough moral dilemmas, and neither have the great secular philosophers. Urging that there is something that makes actions wrong or moral beliefs true is an empty gesture. For we have no way of getting in touch with this purported truth-maker save to seek coherence among our own moral intuitions.

Though truth and wrongness are not relative notions, justification is. For what counts as justification, either of actions or of beliefs, is always relative to the antecedent beliefs of those whom one is seeking to convince. Anti-slavery arguments that we find completely persuasive would probably not have convinced Jefferson or Aristotle. Our best arguments against torture would probably not have budged the devout and learned prelates who ran the Holy Inquisition. That is why we are sometimes tempted to say, misleadingly, that a certain practice is right in one culture and wrong in another, or that a certain astrophysical theory was true for Aristotle but false for Newton. The reason this turn of phrase is misleading is that all we really mean is that, given his other beliefs, Aristotle was perfectly justified in accepting a false theory. Analogously, the Mongol horde was perfectly justified in gang-raping the women of Baghdad, given their other beliefs. Their behavior was, to be sure, wrong. If there were such a thing as absolute justification, we could say that it was absolutely unjustified. But there is no such thing.

Justification is a relation between beliefs and other beliefs. What can be justified to one audience cannot be justified to others. We are no closer to absolute justification for our moral beliefs than was Genghis Khan. We justify our actions and beliefs to each other by appealing to our own lights, to the intuitions fostered at our own time and place. The Mongols did the same.

Philosophy, however, got its start when Plato insisted that there was such a thing as absolute justification, that "the light of reason" was equally accessible to all reflective human beings. Whereas empirical inquiry into how things work depends on the contingent availability of relevant evidence, Plato thought that we already had within