We love stories, and we will continue to love them. But for more than 30 years, as Theory has established itself as “the new hegemony in literary studies” (to echo the title of Tony Hilfer’s cogent critique), university literature departments in the English-speaking world have often done their best to stifle this thoroughly human emotion.
Every year, heavy hitters in the academic literary world sum up the state of the discipline in the Modern Language Association of America’s annual, Profession. In Profession 2005, Louis Menand, Harvard English professor, Pulitzer Prize-winning author, and New Yorker essayist, writes that university literature departments “could use some younger people who think that the grownups got it all wrong.” He has no hunch about what they should say his generation got wrong, but he deplores the absence of a challenge to the reigning ideas in the discipline. He laments the “culture of conformity” in professors and graduate students alike. He notes with regret that the profession “is not reproducing itself so much as cloning itself.”
But then, curiously, he insists that what humanities departments should definitely not seek is “consilience, which is a bargain with the devil.” Consilience, in biologist E. O. Wilson’s book of that name, is the idea that the sciences, the humanities, and the arts should be connected with each other, so that science (most immediately, the life sciences) can inform the humanities and the arts, and vice versa. Menand claims that he wants someone to say “You are wrong,” but he rules out anyone challenging the position in which he and his generation have entrenched themselves. For they are certain there is at least one thing that just cannot be wrong: that the sciences, especially the life sciences, have no place in the study of the human world. Well, Professor Menand, you, and those you speak for, are wrong.
The position you represent has neither the intellectual nor the moral high ground you are so sure it occupies. Until literature departments take into account that humans are not just cultural or textual phenomena but something more complex, English and related disciplines will continue to be the laughingstock of the academic world that they have been for years because of their obscurantist dogmatism and their coddled and preening pseudo-radicalism. Until they listen to searching criticism of their doctrine, rather than dismissing it as the language of the devil, literature will continue to be betrayed in academe, and academic literary departments will continue to lose students and to isolate themselves from the intellectual advances of our time.
Not everything in human lives is culture. There is also biology. Human senses, emotions, and thought existed before language, and as a consequence of biological evolution. Though deeply inflected by language, they are not the product of language. Language, on the contrary, is a product of them: if creatures had not evolved to sense, feel, and think, none would ever have evolved to speak. In his presidential address to the 2004 MLA convention, the distinguished
critic Robert Scholes offered an overview of the problems and prospects for literary studies. When another critic, Harold Fromm, challenged him in a letter in PMLA for ignoring biology, Scholes answered: “Yes, we were natural for eons before we were cultural . . . but so what? We are cultural now, and culture is the domain of the humanities.” We were natural? Have we ceased to be so? Why do Scholes, Menand, and the MLA see culture as ousting nature rather than as enriching it? Don’t they know that over the last couple of decades biology has discovered culture—knowledge transmitted nongenetically and subject to innovation and fashion—in birds, whales and dolphins, and among primates other than ourselves, at least in chimpanzees, orangutans, and gorillas? Do they not see that without our own species’ special biology, culture could not be as important to us as it is?
Menand forcefully expresses his sense of the dramatic change in literary studies that began in 1966. A “greatest generation” of iconoclasts established two fundamental principles: first, anti-foundationalism, the idea that there is no secure basis for knowledge; and, second, difference, the idea that any universal claims or attempts to discuss universal features of human nature are instead merely the product of local standards, often serving the vested interests of the status quo, and should be critiqued, dismantled, overturned.
Menand and those he speaks for believe that the French poststructuralists, beginning with Jacques Derrida, offered an unprecedentedly profound challenge to the history of thought, a challenge since summed up as Theory. Despite admitting that the humanities are now sick, Menand nevertheless exhorts them not to retrench but “to colonize.” As the critic Christopher Ricks notes, “Theory’s empire [is] an empire zealously inquisitorial about every form of empire but its own.”
Like others of his era, Menand is sure that (1) the "greatest generation" secured for its "disciples" (these are his terms) the intellectual and moral high ground; (2) the insights of anti-foundationalism would be accepted by all other disciplines, if only they would listen; and (3) the crusade made possible by an understanding of "difference" must continue.
He ends by saying that when these positions are challenged, academics are being invited “to assist in the construction of the intellectual armature of the status quo. This is an invitation we should decline without regrets.”
I, like others who think that humans need to be understood as more than cultural or textual entities, do not wish to affirm the status quo. But in the four decades since Menand’s “greatest generation,” science and technology have altered the status quo far more radically than anything literature professors have managed. By increasing the world’s food output dramatically, scientists have saved hundreds of millions of people from hunger. Their labor-saving devices have freed scores of millions from domestic drudgery and allowed countless women into the paid work force. They have raised life expectancy around the world. And if knowledge is indeed power, as Michel Foucault says, then through the Internet, scientists have made possible the greatest democratization of power ever. True, there is much more to be accomplished, but the triumphalist defeatism that has been so dominant in the profession of literary studies (the practice has actually been less narrow) seems unlikely to help.
The “we” to and for whom Menand speaks imagine that they have the intellectual high ground because of their anti-foundationalism, the cornerstone of Theory. Anti-foundationalism is an idea uncongenial to common sense: that we have no secure foundation for knowledge. But the fact that it is uncongenial does not make it wrong. Indeed there are good reasons, however troubling, to think it right.
Yet the particular brand of anti-foundationalism Derrida offered in the late 1960s was not the challenge to the whole history of Western thought that he supposed or that literary scholars assumed it must be. Derrida insisted that an untenable “metaphysics of presence” pervaded Western thought—in less gaudy and grandiose terms, a yearning for certainty. Unless meaning or knowledge could be founded in the intention or the referent of the speaker or in some other unshakable way—ultimately, perhaps, in the authority of God or gods—it would have to be endlessly referred or deferred to other terms or experiences, themselves part of an endless chain of referral or deferral.
If they had been less parochial, the literary scholars awed by Derrida’s
assault on the whole edifice of Western thought would have seen beyond the provincialism of this claim. They would have known that science, the most successful branch of human knowledge, had for decades accepted anti-foundationalism, after Karl Popper’s Logik der Forschung (The Logic of Scientific Discovery, 1934) and especially after Popper’s 1945 move to England, where he was influential among leading scientists. They should have known that a century before Derrida, Darwin’s theory of evolution by natural selection—hardly an obscure corner of Western thought—had made anti-foundationalism an almost inevitable consequence. I say “parochial” because Derrida and his disciples think only in terms of humans, of language, and of a small pantheon of French philosophers and their approved forebears, especially the linguist Ferdinand de Saussure. There was some excuse for Derrida in 1966, but there is none for the disciples in 2006, after decades of scientific work on infant and animal cognition.
Just where is the problem in the supposedly devastating insight that meaning or knowledge has to be referred or deferred to other terms or experiences, themselves part of an endless chain of referral or deferral? How could things be otherwise? This state is not only to be expected, but in an evolutionary perspective can be explained without apocalyptic paroxysms. In a biological view, our understanding of the world always depends on earlier and less-developed forms of understanding, on simpler modes of knowledge. Knowledge registers regularities in the environment (shape, orientation, light, color, and so on), qualities, therefore, not contained within the moment of perception but repeatedly similar enough to previous circumstances to produce similar effects. Knowledge is also registered by emerging regularities in the senses and the brains that process their input, through capacities, therefore, that have been developed by minute increments over thousands of generations.
Repetition applies not only to objects of knowledge and organs of knowledge but also to communication, through a process that biologists call ritualization. When it is typically to the advantage of one member of a species for another to understand its behavior in circumstances like courtship or threat, key patterns of action gradually become formalized, intensified, exaggerated, and contrasted sharply with other behaviors in order to maximize distinctions and minimize confusion. For the French heirs of Saussure, the principle of phonemic opposition in language (that b and p, say, generate significant distinctions in English, as in bat and pat, but not necessarily in another language, such as Arabic, which lacks a distinct p sound) is supposed to suggest an arbitrariness at the base of all thought. But this principle can easily be understood as merely another case of ritualized behavior that helps a given community keep its signals straight.
How could concepts or communication not be endlessly deferred or referred back, once we accept the fact of evolution, once we move beyond language to consider how human understanding slowly emerged? If we are evolved creatures, our brains are not guarantors of truth, citadels of reason, or shadows of the mind of God but simply organs of survival, built to cope with the immediate environment and perhaps to develop some capacity to recall and anticipate. Evolution has no foresight and no aims, least of all an aim like truth. It simply registers what suffices, what allows some organisms to last and reproduce better than others.
Because accurate information is costly, evolution must economize. A bacterium does not need to know its environment in detail, but only which nearby substances harbor opportunities and dangers. So too for humans. We do not need the long-distance visual acuity of hawks or the fine canine sense of smell or the high or low hearing ranges of bats or elephants. These extra capacities might be handy, but not at the expense of the range of senses that most often allow us to cope better.
Evolution has equipped us with fast and frugal heuristics, rough ways of knowing that suffice for our mode of life. We can expect imprecision and even systematic error in our “knowledge” if they help us to survive. We therefore have, for instance, a systematic bias toward overinterpreting objects as agents, in case that thing moving on the ground is a snake and not just the shadow of a branch, and we have a bias in memory toward recency, so that we recall more easily something encountered yesterday—and therefore likelier to recur today—than something from two decades ago.
Human minds are as they are because they evolved from earlier forms. Being ultimately biological, knowledge is likely to be imperfect, affording no firm foundation, no “originary” moment, in Derridean diction. Reality is enormously complex and vast. If we want to go beyond the familiar, beyond the immediate world of midsized objects that our senses were shaped to understand, beyond the inferences our minds naturally make, all we can do is guess, grope, or jump from whatever starting points we happen to have reached. Almost all our attempts at deeper explanations are likely to be flawed and skewed, as the hundred thousand religious explanations of the world suggest.
The best we can do is generate new hunches, test them, and reject those found wanting in the clearest, most decisive tests we can concoct. Of course we may not be predisposed to devise severe tests for ideas we have become attached to through the long cumulative processes of evolutionary, cultural, or individual trial and error. And it is not easy to discern what can be tested, let alone how it can be tested, especially in the case of “truths” we have long accepted. But in a milieu that rewards challenges to received notions, others will test our conclusions if we do not. If exacting tests contradict our predictions, we may be motivated to seek new explanations or to find flaws in the critics’ tests. The discovery of possible error can prompt us to look for less inadequate answers, even if there is no guarantee that the next round of hypotheses will fare better. Most, indeed, will again prove flawed—yet one or two may just inaugurate new routes to discovery.
Some people find that such a view of science amounts to extreme skepticism, and some scientists suppose much in science is conclusively confirmed. But Newton’s laws seemed to have been confirmed endlessly, until Einstein showed that they were not universally valid, that Newtonian motion was only a special case of a much larger picture, a deeper truth. Or to take an even simpler case: the stability of species seems to be confirmed every time we see another sparrow, swan, or duck. Yet after Darwin, that turns out to be wrong. We just do not know where something that appears to be repeatedly confirmed may prove to be inadequate, even drastically so, in a larger perspective.
Every day seems to confirm the stability of the earth and the “fact” that the sun goes around the earth. Francis Bacon, the first great theorist of science, thought it unbelievably perverse to imagine that the earth revolved around the sun and rotated on its own axis: we would feel the motion, and since we don’t, experience proves every moment that the earth is unmoving. In later decades, once the findings of Copernicus and Galileo were assimilated, it was assumed that if the universe was no longer geocentric, it was heliocentric. Then it was discovered that, no, the sun is just one star within the galaxy. Then it was discovered that the galaxy was just one among hundreds of galaxies, no, wait, millions, no, wait, billions. Now we are wondering why 90 percent of the matter we think is in the universe (and in our own galaxy) is invisible. Who knows how the quest for dark matter will turn out, and what new understanding of our planet, solar system, galaxy, and universe we will have?
A biological view of our knowledge shows both its insecurity and its dependence on older and poorer forms of knowing, while also explaining the possibility of the growth of knowledge. Derrida’s challenge to the basis of knowledge seems bold, but it cannot explain advances in understanding, evident in the slow gradient from single cells to societies and the steep one from smoke signals to cell phones. Evolutionary biology offers a far deeper critique and explanation of the origins and development of knowledge, as something, in Derrida’s terms, endlessly deferred, yet also, as biology and history show, recurrently enlarged.
Recognizing our uncertainty helps us in our search to understand more. But those in the humanities who have become “disciples” of the “greatest generation” argue against the possibility of knowledge or truth, since meaning is forever deferred. That is the knowledge or truth, however self-contradictory and self-defeating, that they insist on imparting. Their commitment to undermining the possibility of knowledge, even while claiming this as bracing new knowledge, explains much of the stasis of the Theorized humanities that Menand deplores.
A biocultural perspective, by contrast, can explain how evolution has made knowledge possible, albeit imperfect, and how it has made the quest for better knowledge possible. The process offers no guarantee of truth, only the prospect of our collectively learning from one another through both cooperation (sharing ideas) and competition (challenging ideas). In that sense, an evolutionary epistemology is progressive, but far from naïvely optimistic, for every apparent advance in knowledge may turn out to be flawed in its turn, although even to discover this advances our knowledge. Derrida announced an anti-foundationalist epistemology in a spirit of revolutionary self-congratulation. He did not know, any more than his acolytes, that the sciences had already begun to accept a much less flawed anti-foundationalism, based not on parochialism and arrogance, contradiction and despair, but on humility and hope.
Just as Menand thinks the “greatest generation” secured the intellectual high ground for its “disciples” by virtue of establishing anti-foundationalism, so he supposes it secured the moral high ground by establishing what he usually calls difference, and occasionally situatedness or a critique of ethnocentrism. The “greatest generation,” in this view, introduced the humbling recognition that any claims to universal truth, or to universal human nature, are merely local ideas, often in the service of those in power, even where they attempt to be universal and self-evident. All claims to objective truth, it pointed out, are situated in a particular social origin.
At this point the anti-foundationalism of Theory segues into Cultural Critique. The massive Norton Anthology of Theory and Criticism (2001) offers, as Menand notes, a kind of “guided tour” for North American graduate programs in literature. With the characteristic provincialism and hubris of recent literary theory, it claims “Theory” as its empire, as if all theory were literary, as if the theories of gravity, evolution, and relativity were nugatory compared with the anti-foundationalist truths of the “greatest generation.” And as the Norton Anthology itself suggests, the approach that unites them should now be called Cultural Critique, in consequence of the emphasis on difference established in Barthes, Derrida, Foucault, and after.
Menand writes: “Humanities departments have turned into the little boy who cries, ‘Difference!’ Humanities professors are right: there is difference, it always is more complicated, concepts are constructed.” Although he complains that they often go no further, Menand nevertheless does not lament their insistence on difference, for he adds that “Humanities departments do not need to retrench; they need, on the contrary, to colonize.” He claims that the insight of the “greatest generation” into difference has been resisted because it challenges ethnocentrism.
There are many problems in this account. First, the simple logical one. The idea that there is no universal truth runs into crippling difficulties straightaway, since it claims to be a universal truth. The idea that all is difference, merely local and situated, must apply, if true, to itself, and if this disqualifies its claim to truth, as the implication seems to be, then it contradicts itself. The only way out of the muddle of such paradoxes is by assuming that the propositions are false: then no self-contradictions arise.
A second problem arises from the attempt to define difference as uniquely human. “Culture . . . is constitutive of species identity,” writes Menand, meaning human species identity. The implied corollary is that culture is always local, always marked by difference. Actually, culture by itself is not uniquely constitutive of human identity, for many other species have culture. Every known group of wild chimpanzees has its own unique complex of cultural traditions and could not survive without them. But apart from being unable to distinguish humans as he thinks it can, Menand’s declaration also contradicts his claim of difference, since it presupposes a distinctive, species-typical trait, a common feature, as he thinks, uniting all humans and only humans. Yet this is exactly what the doctrine that all is difference purports to deny: that there are some features common to all human natures.
In fact, not everything in human lives is difference. Commonalities also exist, and without those commonalities between people, culture could not exist, since it could not pass from one person to another or one tradition to another. Cultural Critique stresses the “situatedness” of all that is human, but it defines that situation only in terms of particular cultures. Why not also the unique situation of being human, with the special powers evolution has made possible in us?
The idea that all claims are situated, that they have particular origins, is surely not one many would argue with. But it does not follow that if a claim has a specific origin, this proves its error or incompleteness or nonobjectivity. If a Cretan, or Baron von Münchhausen, or Pinocchio says “This is a cow,” and it is, it is no less true than if George Washington says it. An idea may derive from observation, tradition, a dream, a guess, intoxication, or hallucination, or any combination of these. No origin guarantees the validity or invalidity of the idea.
Of course at a given time, in a given place, certain thoughts are less likely to arise than in other places. But significant new truth remains difficult to reach from any starting point. Nevertheless this does not mean it cannot be attained or approached wherever one begins. Take the example of evolution. Ideas of evolution preceded Darwin by centuries. They became more likely after the establishment of systematic taxonomy and the search for new species around the world, after Malthus’s work on population, after Lyell’s laying down the principles of modern geology and showing how imperceptibly small changes could accrete into major differences, and after the explanation for and systematic collection of fossils. Evolution would have been still more readily explicable had Mendel’s hypothesis of particulate inheritance been known to Darwin and Wallace—yet even without that, both developed a powerful explanation for evolution that seemed to contradict the apparent observed stability of species.
Menand supposes that others resist the claim of difference because they resist the challenge to ethnocentrism. In fact challenges to ethnocentrism had been widespread long before Derrida’s seminal paper of 1966, in the recoil from the horrors of Nazi racism, in the hope for a better world that led to the founding of the United Nations and to anti-colonial independence movements; in the American recognition of the role of African Americans in World War II, the Civil Rights movement, the increasing acceptance of and interest in African-American musical cultures in the 1950s and 1960s; and in anthropology, from early in the 20th century. It is a strange fantasy to suppose that the humanities, inspired by the “greatest generation,” have led the attack against ethnocentrism that was already well established in both intellectual and political culture.
What others resist in Cultural Critique is not critiques of ethnocentrism but the self-contradictory and defeatist claim that all knowledge, except the knowledge of the situatedness of all knowledge, is situated and therefore flawed. A corollary, making the idea even less inviting, has been that if all claims to universality and transparency of knowledge are false, then the appropriate response is to challenge the claims obscurely. Hence, in part, the vogue for bad writing, the self-confessedly exclusionary opacity of much writing inspired by Theory.
Of course there have been many claims of universal truths or of universal human nature that reflect the partial vision of particular cultures or interest groups within them. But rejecting false claims to the universality of a particular set of views does not entail a need to reject universality altogether. As Ghanaian-American philosopher Kwame Anthony Appiah notes, what postcolonial opponents of universals actually object to in these cases “is the posture which conceals its privileging of one national (or racial) tradition against others in false talk of the Human Condition”: “those who pose as antiuniversalists . . . use the term ‘universalism’ as if it meant ‘pseudouniversalism’; and the fact is that their complaint is not with universalism at all. What they truly object to—and who would not?—is Eurocentric hegemony posing as universalism.”
Indeed to reject claims of a common human nature, far from securing the moral high ground, is to undermine the grounds for treating other
human beings as equals. One of the most extreme advocates of difference was Hitler, with his sense of the special destiny of the Aryan people and the German nation, and of the utter difference between Aryan and Jew. This is not to equate Cultural Critique with Nazism—although those who have tried to critique Cultural Critique by considering human nature in terms of biology have themselves been accused of Nazism—but merely to stress that claims of utter human difference are not themselves ethically sufficient. We need to accept both the commonality of human nature and the differences between individuals and peoples. If we reject all claims to commonality, we risk denying a sufficient basis for concern for other humans.
For most of the 20th century, anthropology stressed the difference between peoples, since anthropologists earn attention by reporting on the exoticism of other ways of life. But that does not mean that human universals are not there, as the anthropologist Donald E. Brown and others have been able to document extensively. And indeed the universals of human nature, the factors that made it possible to understand another people in depth, had simply been taken for granted, even ignored, in the emphasis on difference.
In all species, from bacteria up, communication within the species is possible because of shared senses and interests. In the human case, we can understand one another, even across cultures, because of a range of intraspecies similarities. And we can understand one another especially well because humans are geared to learn from one another through joint attention, the expressiveness of the human facial musculature, the precision of human pointing (all of which develop before language, and make it possible), and language. Our capacity for social learning, for acquiring our own culture, also makes it possible to appreciate and enjoy the culture of others.
The idea that there is only cultural difference between peoples discourages cultural contact and cultural sharing, which have been of benefit to all, over the years, from stone tools to the Internet. The insistence on difference, on refusing to see similarities, inhibits dialogue and the chance to learn from, understand, and appreciate others. This is particularly disturbing in the case of art and literary studies. Menand writes that “a 19th-century novel is a report on the 19th century; it is not an advice manual for life out here on the 21st-century street.” So all those who have read Pride and Prejudice in the 20th century and since and felt that it showed something about the dangers of first impressions and the error of equating social ease with merit and social stiffness with coldness or disdain have been wrong?
Appiah offers a much more attractive and defensible attitude toward the arts of other times and places. Because he sees us all as humans, and not as defined primarily by group differences, he argues that cultural patrimonies should be seen as part of the whole human heritage, not the exclusive property of a single place or people. In his recent Cosmopolitanism (2005), he takes as his example the Nok sculptures of the sixth century B.C.E., from an area now part of Nigeria:
If they are of cultural value—as the Nok sculptures undoubtedly are—it strikes me that it would be better for [the Nigerian government and citizens] to think of themselves as trustees for humanity. While the government of Nigeria reasonably exercises trusteeship, the Nok sculptures belong in the deepest sense to all of us. “Belong” here is a metaphor: I just mean that the Nok sculptures are of potential value to all human beings. . . . It is the value of the cultural property to people and not to peoples that matters. It isn’t peoples who experience and value art; it’s men and women.
Further, Appiah advances the claim for an artistic connection “not through identity but despite difference. We can respond to art that is not ours; indeed, we can only fully respond to ‘our’ art if we move beyond thinking of it as ours and start to respond to it as art. . . . My people—human beings—made the Great Wall of China, the Sistine Chapel, the Chrysler Building: these things were made by creatures like me, through the exercise of skill and imagination.”
Yet all too often in the academy today, the literature of other times and places is taught only as a demonstration of difference, of the local, false, constructed, and oppressive or contested nature of the concepts of those times and places. But as a species uniquely capable of social learning, of learning from others—and remember, this is what makes our deep immersion in culture possible—we are capable also of learning from, responding to, feeling a kinship with, times and places other than our own.
A biocultural perspective on the human offers the strongest possible reasons to take into account artistic accomplishment in all areas and cultures, and the strongest reasons for considering local difference in terms of a genuinely broad understanding of species-wide commonalities and differences. It is the least likely to fix on an artistic canon within a particular language or region, a particular cultural level (“high” art versus “low,” say), a particular state of civilization. In a biocultural view, the Paleolithic and the present, the hunter-gatherer and the cosmopolitan, orature and opera are all part of our human repertoire.
Not only does the stress on difference discourage the study of works of art outside the present, except as demonstrations of the truths of theory, it discourages attention to works of art tout court. Menand writes disapprovingly that “there is talk of a return to the literary and to sterile topics like beauty—the very things that the greatest generation rescued us from.” Why should literary studies think they have been fortunate to be rescued from the literary? Would Menand—who, recall, advocates that the humanities should colonize other areas—deem it a success if medicine were rescued from the medical, perhaps by a Theory-inspired denial of the possibility of knowledge (think of all the money that could be saved on cancer research) and a Cultural Critique insistence on difference (genital mutilation? diagnosis by divination? cures by incantation?)?
It is deeply troubling that those teaching any branch of the arts should find beauty sterile, rather than something to enjoy, to augment, and to explain. Is Menand indifferent to his wife’s looks? Does he dress shabbily or write sloppily? Does he refuse to decorate his home? Does he disdain the lively concern that people like the Wodaabe or the Nuba or the Maasai have for physical beauty? Does he scorn, indeed, the vast majority of people everywhere who take an interest in personal, scenic, or artistic beauty, and who try to decorate their homes or their lives accordingly? I recently saw a photograph of a Haitian woman who had just eaten a mud pie—yes, just mud—because she could afford no more, yet she clutched a transistor radio as she danced to its music: the beauty of sound, at least, she could have some share in. Does Menand not see that an interest in beauty is a real part of humanity (as of other species), and of the humanities, and that it needs to be explained rather than dismissed? Does he not realize that his dismissal of the sterility of beauty in theory, if not in his personal practice, is all too typical of the pharisaic hypocrisy of Theory?
Our shared sense of beauty is one of our surest avenues to cross-cultural understanding and enrichment. Does the real, deep beauty of the Nok vases that Appiah discusses and illustrates not speak more, and more swiftly, for the cultural creativity, and the justified pride in their achievement, of a people otherwise unknown to us? Dürer in the 1520s, encountering elaborate Mexican craftsmanship, commented that he had never in all his life “seen . . . anything that has moved my heart so much.” Goethe, reading Chinese novels, observed that “these people think and feel much as we do.” Japanese audiences respond to Shakespeare and Beethoven with rapture. And if audiences appreciate, artists appropriate. Over a millennium ago, the makers of the Book of Kells sublimely synthesized calligraphic and pictorial traditions from Europe and around the Mediterranean. In the 19th and 20th centuries, Maori and New Guinea carvers picked up Western tools and techniques as keenly as Gauguin or Picasso borrowed from non-Western cultures.
Menand wrongly assumes that the recent insistence on cultural difference in university literature departments has helped to undermine whatever is most deplorable in the status quo. I suspect it has done little to undermine anything except student interest in academic literary study, while it has shored up the status of English professors who enlisted as disciples of the “greatest generation” and their conviction that they are in the intellectual and moral right.
Evolution has made knowledge possible. Not necessarily reliable knowledge, but knowledge good enough, on average, to confer a benefit. Evolution has developed sociality to the point where members of many species can transfer knowledge across time: culture, in other words. As comparative and developmental psychology have shown, evolution has developed the human brain’s capacity to understand false belief—to understand that others, or we ourselves, might be mistaken about a situation—and hence has driven our quest for better knowledge. Human culture and our awareness of the possibility of being mistaken eventually gave rise to science, the systematic challenging of our own ideas. The methods of science make relatively rapid change and improvement possible—as well, of course, as unforeseen new problems. They offer no guarantee of the validity of individual ideas we propose, but they do offer the prospect of our collectively learning from one another.
In the long perspective of evolution, testing proposals systematically, as science does, is a very new step for all humanity. Anyone, regardless of origins, can participate in the process, which harvests the natural strengths of our twin tendencies to compete and cooperate. But in order to work, science requires a commitment to the possibility that we can improve our thinking. Insisting that no ideas are valid except the idea that all ideas are invalid, or that all ideas are merely local, except this one idea, is the least likely route to genuine change.