Active Measures: The Secret History of Disinformation and Political Warfare by Thomas Rid; Farrar, Straus and Giroux, 528 pp., $30
As November’s presidential contest looms, reminders of Russia’s hacking of the 2016 election pepper every news cycle, with warnings of more to come. The number of countries running disinformation campaigns has more than doubled to 70 over the past two years, abetted by “black PR” firms that slander and deceive for hire. Yet as high anxiety over digital disinformation swells, the United States, like open societies everywhere, risks deeper damage by seeing what’s happening today as something entirely new under the sun. As Thomas Rid, a political scientist at Johns Hopkins University, writes in his engrossing new book, Active Measures, “This sense of novelty is a fallacy, a trap. The election interference of 2016 and the renewed crisis of the factual has a century-long prelude.”
Democracies have survived previous onslaughts. Although the Internet has opened myriad avenues for mischief and skullduggery, Rid contends that it has also created a dynamic that has prompted many observers “to highlight the potentials of disinformation over its limitations.” A comparison of old and new methods not only reveals the latent deficiencies of digital disinformation but can also remind us how best to preserve “our ability to assess facts on their merits and to self-correct accordingly,” the central building block for any open society.
What should we learn from the evolution of “active measures”—the term coined by the Soviet Union and its Warsaw Pact allies for what the Central Intelligence Agency called “political warfare”—from the 1920s to the present? Start by recognizing that “active measures are not spontaneous lies by politicians, but the methodical output of large bureaucracies.” Rid shows that we have the Cold War to thank for this professionalization, “with American intelligence agencies leading the way in aggressive and unscrupulous operations.”
Postwar Berlin was a prime battleground. The CIA’s just-minted Office of Policy Coordination bankrolled West German activist groups that tracked East German political prisoners, exposed Soviet informers, sabotaged East German factories with fake orders, and let fly 15,000 leaflet-bearing balloons a month. A publishing front headed by a former U-boat commander created slick fakes of East German publications targeting official audiences, even a jazz magazine that became popular among East German youth organizations. As Bill Harvey, its swashbuckling CIA overseer, told his bosses, “Along with astrology, we consider [jazz] one of the most potent psychological forces available to the West for an attack on Moscow Communism.” Hepcats of the world unite—the stars are on your side!
Although active measures always contain some element of disinformation (forgery, for instance, or fake sourcing), some of the most effective ones trafficked in truth. Consider The Penkovsky Papers (1965), the memoirs of Oleg Penkovsky, a Soviet military intelligence colonel and one of the Cold War’s most effective spies (among other things, he provided U.S. intelligence with details of Soviet missile launch sites in Cuba). Because the CIA wanted to sow turmoil in the ranks of Soviet intelligence as well as lay bare its methods for a broader public, writes Rid, “the covert operators in Langley did not falsify any content, only the cover story” of how the manuscript reached the public. (Full disclosure: my journalist father, Frank Gibney, edited the book as a witting CIA asset.) Serialized in major newspapers around the world, it became a bestseller that also inspired John le Carré’s 1989 novel, The Russia House.
Active measures have always worked best by playing on existing societal divisions. Ladislav Bittman, a Czech defector who spearheaded numerous such schemes, told Rid that for disinformation to succeed, it must “at least partially respond to reality, or at least accepted views.” In 1959, for instance, the KGB and East German intelligence tapped unease over Germany’s Nazi past by defacing synagogues in West Germany and elsewhere in Europe with swastikas and anti-Semitic slogans—a coordinated effort that also hit synagogues and Jewish cemeteries in New York and elsewhere in the United States, as well as in Israel and other countries.
The goal was not just stirring general fear and turmoil. Successful active measures are targeted measures. As a decrypted message from Moscow to East Berlin praising the campaign put it, “the socialist [Russian] government’s argument that West Germany is a potential bastion of Nazism and that consequently West Germany must under no circumstances be fully rearmed has been considerably strengthened.” By the mid-1960s, the KGB was running 350 to 400 such disinformation operations per year.
Yet by then, according to Rid, U.S. intelligence had “retreated from the disinformation battlefield almost completely.” That categorical assertion may raise an eyebrow, but public unease and greater congressional oversight certainly led to more restraint, and with good reason. Rid explains: “It is impossible to excel at disinformation and at democracy at the same time.” Active measures erode trust in factual authority and in the institutions upon which open societies depend.
The Soviet Union had no such epistemic qualms, ramping up its efforts to exploit Vietnam-era tensions by leaking dubious forgeries to antiwar groups all too eager to believe them. That strategy also bore fruit in the subversion of European peace activists protesting nuclear weapons—a concerted effort that East Germany’s Stasi labeled, with Orwellian precision, Friedenskampf, or “peacewar.”
The West’s proven ability to withstand this barrage of falsehoods should offer comfort. But secular trends that predate the West’s Cold War victory—the decline of public trust in institutions and the rise of postmodern thinking—pose a larger challenge, especially given the Internet’s toxic powers. Facebook and Twitter have disempowered journalism’s gatekeepers, mainlining disinformation and privileging feelings over facts. Utopian notions of radical transparency have inspired leakers, who can now acquire and share information with greater anonymity and ease. Indeed, Rid cites the “celebrity culture” that enfolded high-profile leakers such as Julian Assange, Chelsea Manning, and Edward Snowden as a main ingredient in creating “a dream come true for old-school disinformation professionals.”
Russia, led by a former KGB officer who honed his disinformation skills in Cold War Dresden, is making the most of this moment. Rid, who testified before Congress on Russia’s interference in the 2016 election, provides an authoritative blow-by-blow account of the hacking of the Democratic National Committee and the Clinton campaign and of the accompanying social media disinformation effort. Whether or not that nefarious activity tipped the election (Rid thinks few U.S. voters were persuaded to change their votes by Russian disinformation on Facebook and Twitter), it has raised seismic doubts in the public’s mind about both the 2016 outcome and the integrity of U.S. elections going forward.
Yet the paradoxes revealed in the course of that effort hold lessons for the future. First, although disinformation has been turbocharged by the Internet, it has also become a “low-skilled, remote and disjointed” endeavor that is “harder to control and harder to assess.” The nearly 1,000 trolls at Russia’s fabled Internet Research Agency were not happy Stakhanovites, much less hard-bitten KGB agents. They had quotas to meet, but the metrics they generated tended to overstate their real impact. Rid’s breakdown of their operations suggests that “social media had actually increased the significance of traditional journalism as an amplifier of disinformation operations.” In one case, a front-page New York Times story on a crude and hitherto obscure IRA Facebook ad gave it a visibility that it utterly lacked the first time around.
In other words, when confronted by disinformation, keep calm and carry on. Whether it is the media inadvertently playing up a false story or governments ham-handedly silencing one, Rid argues that “one of the most insidious threats posed by successful disinformation campaigns” is overreacting to them.
Rid is mostly silent on the larger issue of how to address the mistrust and polarization that create such fertile ground for information mayhem. But FBI Director Christopher Wray recently suggested a time-tested way forward. Pushing back on President Trump’s contention that Ukraine, not Russia, interfered in the 2016 election, he said: “I think part of us being well-protected against malign foreign influence is to build together an American public that’s resilient, that has appropriate media literacy, and that takes its information with a grain of salt.” A tall order, to be sure, but one that reading Rid’s invaluable work can make easier.