Now that I am about to turn 85 and the hourglass is almost empty, I have come to understand better than ever before an odd meeting I once had with the late Czech president and playwright Václav Havel. I was part of a group he had assembled at the ornate presidential castle in Prague. The participants included Hillary Rodham Clinton, Henry Kissinger, Adam Michnik (a Polish historian and journalist, and a much-revered dissenter), Meir Lau (a chief rabbi from Israel), Ashis Nandy (an Indian political psychologist), and Bishop Jonas Jonson of the Church of Sweden. We were each invited to speculate about where we thought history was headed next. It took a private chat with Michnik and an exchange with Havel, mediated by an interpreter, for me to realize that Havel (like other leaders of the anti-Communist uprisings that drove the Soviet Union out of Eastern Europe) was desperately looking for another wave to ride. As a hero of the masses who rose up against their oppressors and prevailed, Havel had every right to feel that he had played a crucial historical role, after which all that followed paled in comparison. It was clear to me then and even more apparent now that Havel and his comrades would have given almost anything to move history just one more time.
In comparison, my wave was so small that it escaped the attention of most people. The media coverage that followed Margaret Thatcher’s death last April prompted me to reflect on what had caused my wavelet. Journalists and public figures recounted the measures Thatcher introduced while in power, such as privatization and union busting, as if they were either meritorious or faulty in their own right. But it is far more productive to view such policies in the context of history. Societies are like cars with loose steering wheels; we keep steering them so far to the left or right that their course often must be corrected, which leads to overcorrection—which itself calls for still more course adjustments. Before Thatcher, many British industries that had been nationalized during World War II remained centrally controlled. Powerful unions representing workers in these industries regularly disrupted public life, going on strike when their demands, some fairly outlandish, were not met. Thatcher overcorrected, leaving the British economy and polity underregulated and vulnerable to the 2008 financial crisis.
In the United States, President Reagan was elected in part as a conservative reaction to the liberal 1960s and the plethora of social programs introduced under Presidents Kennedy and Johnson. Reagan also left behind an underregulated economy, which led directly to the savings-and-loan crisis and indirectly to much else. Culturally, both Thatcher and Reagan promoted individual preferences over the common good, celebrating numero uno—that is, the self. Thatcher most famously stated: “There is no such thing as society.”
I came upon a telling indication of the culture of those times, more or less by accident. At the end of the Reagan era, I was teaching ethics at the Harvard Business School. Preparing for my classes, I read a report showing that though young Americans felt strongly about their right to be tried by a jury of their peers, they themselves had no interest in serving on a jury. They responded, basically, “Find someone else.” I argued in class and later in my book The Spirit of Community that it is morally obscene to take and not to give, that strong rights presume strong responsibilities, and that if young people did not serve on juries, then there would obviously be no juries of their peers.
I also noted that older Americans were afflicted with this same malaise of excessive individualism. They sought a smaller government and objected to paying more taxes but demanded more government services, including public support for education, housing, and health care. Their feelings were later captured by the Tea Party movement and summed up at a town hall meeting when an audience member famously shouted: “Keep your government hands off my Medicare!”
In 1991, I observed that Americans were proud of the way the United States won the Gulf War yet did not want to serve in the armed forces, and did not want their children to serve either. I wrote that while individual rights surely matter, these rights must be balanced with commitments to the common good—for instance, by protecting the environment and public health. These ideas and similar ones held by colleagues became the basis for a movement that even had a platform. Although my colleagues strongly objected to it, I called the movement “communitarian.” (Although the word shares the same root, it should not be confused with communism or communion; it derives from community.)
I added one more idea to the mix. I pointed out that the various liberation movements of the 1960s (civil rights, women’s rights, sexual liberation) had brought down the old regime, undermined authority figures from fathers to labor leaders, from priests to presidents, and also cast by the wayside the accepted standards of upright conduct. Many of these old norms were racist, sexist, and authoritarian, but the liberation movements failed to create new shared moral understandings, thereby allowing for a moral vacuum. The need was not for a return to the bad old days (the agenda of social conservatives dismayed by the 1960s) but for new understandings. Formulating new rules through moral dialogue would be the mission of communitarians.
The time for these ideas had come. Americans, Brits, and others suffering from an overdose of Reaganism and Thatcherism had discovered that when everyone just watches out for number one, the result is a rough-and-tumble society, one that is too self-centered and isolating. They looked for more togetherness and more attention to the common good. Polls showed that people welcomed the balance between “I” and “we” that communitarianism offered, and the line “the Me needs the We to Be” was very well received. The secret to our success was that for ideas to take off, they had to be historically appropriate. Ours were.
Communitarian policies developed by the New Democrats in the United States, the Neue Mitte in Germany, the “purple coalition” in Holland—and, above all, by New Labour in the United Kingdom—helped elect Bill Clinton, Tony Blair, and other centrist leaders who wisely avoided the term communitarianism but embraced its message. “Responsibility from all, responsibility for all” was a winning campaign theme for Tony Blair. The British media called me “the father” of his ideas, a claim that I never made, as I told him during an hour-long meeting I had with him about communitarianism. Blair generously responded, “I don’t mind at all being associated with you,” which admittedly made me glow.
The timing of my visit to Britain, in March 1995, was particularly auspicious. Several months earlier, Blair as Labour leader had been engaged in a fight for the soul of the party, moving away from socialism toward a centrist, Third Way approach. In October 1994, Blair had lost a major round of that battle when the Labour Party refused to drop a clause to which it had stubbornly clung for 77 years, committing it “to secure for the workers by hand or by brain the full fruits of their industry … upon the basis of the common ownership of the means of production, distribution, and exchange, and the best obtainable system of popular administration and control of each industry or service.”
In plain English, Labour promised that if it ever returned to power, it would nationalize anything that moved. Although this socialist idea had long been abandoned by all but the doctrinaire left, the Tories used its continued existence in the Labour platform to scare voters. The clause was a major reason why Labour lost the 1992 election, yet in 1994, the old left dug in and refused to drop it. Its removal thus became the defining mission of the New Labour movement, for which Blair was fighting.
While I was still in London, Blair got another chance to do battle with his party over the clause. In what has often been described as a “defining moment” for the party, he triumphed: Labour replaced the clause with a communitarian substitute proposed by Blair. The new clause states, “The Labour Party … believes that by the strength of our common endeavour we achieve more than we achieve alone … to realise our true potential and for all of us a community in which … the rights we enjoy reflect the duties we owe, and where we live together, freely, in a spirit of solidarity, tolerance and respect.”
The Sunday Times called it a “communitarian document” and added: “The new clause 4, with its emphasis on rights and duties, sounds remarkably Etzioni-ite. … Community has been talked about as the big idea for some time, since the end of the Reagan-Thatcher ‘me-first’ era.”
Reports in the media that I influenced other leaders—including Bill Clinton, Helmut Kohl of Germany, and Jan Peter Balkenende of the Netherlands—were both flattering and a source of concern. I feared that such reports might lead these leaders to distance themselves from the message, but I won’t deny that I got a high from being credited for ideas public leaders found winning. Still, it was far more important that the ideas lead to societal change and not be dropped because they were attributed to some college professor. Much higher highs exist than seeing your picture on the front page and the evening news.
If I claim that I did not get a kick out of small dinners in the White House with Bill, Hillary, and Al (beat that for name dropping!), don’t believe me. However, I felt much better after I persuaded President Clinton to address a conference on character education (a major communitarian building block) that we put together for the White House. And I was delighted to see a fellow communitarian, William A. Galston, helping the president launch AmeriCorps, which enabled many thousands of Americans to do community service.
When he was a presidential candidate, Barack Obama showed great familiarity with communitarian thinking (including mine) when we met on two occasions. In his book The Audacity of Hope, he emphasizes that individual rights must be balanced with social responsibilities. However, since he became president, communitarianism has been the philosophy that dare not speak its name. Although the president has often drawn on its principles—in trying to engage rather than confront opponents overseas and at home, in often reminding us that although we come from red states or blue states, we all come from the United States, and in making security and public health decisions that limit individual rights for the greater good—he never mentions communitarianism, nor has he enlisted any of its advocates to work in his administration.
I could regale you with other stories about the communitarian wavelet. But this is all mainly in the past. Since then, no matter how fiercely I huff and puff, my sails have been left luffing and my seas are becalmed. I lost the voice for translating social science findings and insights into public appeals for addressing societal problems. Thus, these days, I fume at the news, rush to my computer, and fire off another salvo. And then I fume some more because—despite my confidence that the message I have hammered out would do the world a lot of good—no one seems to be listening.
If only there were racing forms for public intellectuals of the kind they have for horse and dog races. These sheets would tell you the outcome of, say, the last three predictions or prescriptions a particular pundit has made. In this way, I fantasize, we would be able to handicap public voices, granting more weight to those who get it right. But this is not exactly what is happening.
David Brooks, in his sociologically insightful book Bobos in Paradise (2000), points out that if a public intellectual says something outlandish, however wrongheaded, he or she will get a lot of attention, as other commentators scramble to make a name for themselves by refuting the claims. This surely was the case for Francis Fukuyama, who predicted in 1990, as the Soviet Union collapsed, that the world had reached the “end of history,” wherein every nation would adopt an American-style democracy and keep it for all time. (Those nations that lagged behind could be helped with a kick in the pants, the kind neoconservatives delivered to Iraq in 2003.) Never mind that most of the world is still well “within history,” to use Fukuyama’s phrase, or that nations as different as Russia, Hungary, and Venezuela are sliding backward. For the media, Fukuyama continued to qualify as a wise man whose opinions were frequently solicited.
Runners-up for the dubious title of most-wrong-but-celebrated public intellectual are Paul Kennedy and Ezra Vogel, who in the 1980s predicted the collapse of the United States as a world power and the rise of Japanese hegemony. And there’s Samuel Huntington, who predicted that the flood of Latino immigrants would lead us to lose Texas and California to Mexico, without one shot being fired.
But however irritating it is when those who get things wrong do not get their comeuppance, it is worse that even when I make valid prognostications that defy the prevailing consensus, I can no longer reach those who need to hear them. Take, for instance, the thesis advanced in It’s Even Worse Than It Looks, a much-discussed book by two highly respected and often-quoted political scientists, Norman Ornstein and Thomas Mann. They argue that Washington is deadlocked because the radical right Tea Party movement has dragged the GOP out of the American political mainstream. Once the Republicans were pushed so far to the right, they were unable to compromise with the reasonable Democrats, causing the debilitating gridlock in Washington. This argument seems plausible and is indeed the prevailing explanation for politicians’ intransigence.
I beg to differ. Gridlock takes place when one party wants to go left and the other wants to go right, with the result that things stay in place. However, when the Democrats want to go left, and the Republicans want to stay put, and things stay in place, then the Republicans win. That the 112th Congress enacted about half as many laws as the average of those Congresses before it was not an indication of gridlock but of Republican success. I also contend that we are subject to confusion when we assume that American politics is divided between the Democrats and the Republicans, with each party having more or less the same pull—but pull in opposite directions, hence the gridlock. What seems truer is that we have a majority conservative party that includes most of the GOP and a good portion of the Democrats, and a much weaker liberal party (consisting of about two-thirds of the Democrats). Analyzing American politics in these terms makes it clear why most of the time the conservatives win.
That is why what Congress has enacted—or chosen to ignore—since the first election of President Obama has been a far cry from what liberals would have preferred. The 2009 stimulus package was much smaller than liberals believed necessary, and roughly a third of it was diverted to conservative-preferred tax cuts. The amount of wealth exempt from the estate tax was increased fivefold. The Affordable Care Act, considered by some to be a major liberal victory, was advanced without even a hearing for the preferred liberal option of a single-payer system or moderately liberal proposals such as a public option.
The Patriot Act was extended, and so was domestic surveillance. The Obama administration has deported 1.5 times more illegal immigrants than the Bush administration. There were few differences between the foreign policy of the second Bush term and the first Obama one, other than Obama’s ordering five times as many drone strikes and, for the first time, authorizing the targeted killing of an American suspected of being a terrorist. Obama may well have proceeded in this way because of what he thinks the nation needs, but given the conservative majority in the electorate, acting otherwise would have endangered his chances of being reelected. Being weak on defense has always been the Achilles heel of the Democrats.
Thus, I have suggested, we have not gridlock but a conservative headlock. I published all of these findings and quite a few more in an academic journal. A fellow communitarian sent the article to TV news show producers and newspaper columnists—who continue to maintain that Washington is gridlocked, without a single mention of a contrary opinion. I was dismayed because it seemed all too clear that as long as people did not understand what was happening, they would be unable to change it.
My longstanding interest in communitarianism, and hence in communities, has led me to ask under what conditions nations could become members of more encompassing communities. My studies have led me to warn that economic union must be accompanied by political union, and that a political union requires a core of shared values so that citizens of different nations will transfer part of their political allegiance to the larger community. I warned in my book Political Unification Revisited that by centralizing and imposing economic unification before engaging in community building, the European Union was making a mistake, which is apparent in the growing alienation of Europeans from the EU. Again, I find no satisfaction in having predicted correctly; I’d much rather see the advice followed and people spared the resulting agony.
My recommendations on how to save Medicare without cutting entitlements (in Policy Review), how to reduce gun violence when the agency charged with doing so has been gutted (on Salon and The Huffington Post), and how to fight obesity by focusing on societal factors and early intervention (in Health Affairs)—like my warning long before Edward Snowden that we cannot have homeland security as long as the private sector refuses to do its share (in Political Science Quarterly)—fell upon similarly deaf ears among public officials and policymakers.
The stream of findings, predictions, and prescriptions, far from gaining me a following among policymakers, landed me in hot water. The media dropped me from their Rolodexes and deleted me from their smartphones.
The New York Times, which once ran my op-eds, my business section articles, and magazine pieces, now rejects my submissions. The Washington Post, which once granted me regular appearances in the Outlook section and published my book reviews, loves me no more. The same is true of NPR and other once-welcoming outlets. Without these major media, it is hard to get out the word, build a following for new ideas, connect with others who have similar ideas, and make a wave that will help carry change forward. If I were a true-blue academic, concerned only with my own studies, I would not have been bothered. But I want to share my ideas with the community at large and help develop a message around which new social movements may jell—and that is impossible except through mass media.
One reason for my gradual loss of a megaphone (if you are a public intellectual or have the urge to become one, or wonder about the way ideas sprout and spread, take note) is that I violated cardinal rules of public dialogue. First, as a communitarian, I don’t fit into either the liberal or the conservative category. Hence when the media seek a pair to comment on each issue, one of each kind, there is no room left for a third position. Other communitarians have been similarly left behind. Thus, Michael Sandel, who has been treated like a rock star in South Korea and Japan, filling stadiums with audiences for his lectures, and Charles Taylor, who has been awarded the Order of Canada, are hardly found in the American media.
More damaging was my failure to stick to my knitting. The media like public intellectuals to be specialized. Race? Henry Louis Gates. Feminism? Gloria Steinem. Congress? Norman Ornstein. Very few intellectuals have much of a voice if they do not specialize, on the academic assumption that to know more about a subject, you must cover less turf. The smaller the field, the deeper you can dig. I failed this test many times over. Time magazine labeled me the “everything expert”—and did not mean it as a compliment. When I ran into a media big shot who was once my student at Columbia University, he introduced me to his wife as “someone who writes a lot, who wants to cover it all.” I told myself that I always used the communitarian angle to try to cast new light on a variety of subjects, but that did not make me seem like less of a generalist.
Perhaps my biggest mistake has been railing against those who want to see the United States on a collision course with China. History is on the side of the Dragon Slayers (as the anti-China hawks are called). Historians point out that when a new superpower rises, the outgoing one is reluctant to yield power and a conflict ensues. Moreover, China has made moves that can be interpreted as hostile. Societies (and, I am sad to note, even communities) consolidate best against outsiders, and hence are tempted to find enemies. And the U.S. military and corporations in the defense business are looking for new targets, now that the wars in the Middle East are winding down. To argue, against these heavyweights, that China will be preoccupied with its domestic needs and has shown no eagerness to become a world power, and that there is time to test cooperation and competition rather than confrontation, was to be ahistorical, to try to turn back a wave rather than ride a swelling one.
In the past two years, I have found some solace in the last place I expected it—among my academic colleagues. They have given me more opportunity to air what I believe must be voiced than I had been given in any of my many preceding years. Academic journals at institutions ranging from the London School of Economics to Yale to Stanford University have published what I needed to say. The same held for several crossover publications, such as Foreign Affairs and The National Interest.
My young colleagues tell me that to be heard beyond the walls of academia, I should reduce what I have to say to 140 characters and Tweet. Send it to my “friends,” in the hope that they will share the word with their friends. Photoshop what needs to be broadcast and Instagram it. Stream, beam, and scream. This all may be beyond me. Still, until I am shown that my predictions or prescriptions are ill-founded, or not of service, I will try to get out what must be said. I’ll keep pulling at the oars, however small my boat, however big or choppy the sea.