Winter 2018

Tuskegee Truth Teller

Peter Buxtun, like many medical whistleblowers, got little thanks for exposing a notorious scandal

By Carl Elliott | December 4, 2017
Poor black men, many of them suffering from syphilis, have their blood drawn by a U.S. Public Health Service employee during the Tuskegee study (National Archives at Atlanta)

One July evening in 2016, at the Viva Goa restaurant in San Francisco, I was sitting across from a silver-haired gentleman named Peter Buxtun. We had just ordered our meal when a loud thud sounded from across the room. Buxtun, who was nearly 80, leaped up from the table and rushed over to help a dazed-looking woman sprawled on the floor. “Did you hit your head?” Buxtun asked. “Are you okay?” When she didn’t answer right away, Buxtun switched to German. Yes, she was fine, although very embarrassed; apparently she had misjudged the location of her seat and sat down in empty space.

As he was helping the woman to her feet, a waiter pushed his way between them to fill her water glass. The waiter was intent on pretending that nothing unusual had taken place, even though it had been only seconds since the woman had crashed to the floor. When Buxtun returned to our table, he was still shaking his head, baffled by the waiter’s behavior. This reaction, I have learned, is not uncharacteristic of Buxtun. Despite his easy laugh and genial manner, he has the air of a man who fears the world is populated by blockheads and scoundrels.

In 1972, Buxtun exposed the most notorious medical research scandal in American history. For 40 years, the U.S. Public Health Service had deceived and exploited hundreds of poor black men with syphilis near Tuskegee, Alabama, using free meals and burial insurance to lure them into an experiment in which they would receive no treatment for a potentially deadly disease. Very few employees of the health service apart from Buxtun saw anything wrong with this. Only when Associated Press reporter Jean Heller wrote about the abuse, using documents provided by Buxtun, did the Tuskegee study eventually end. Buxtun’s revelations triggered Senate hearings, a federal inquiry, a class-action lawsuit, and in concert with several other research scandals of the period, a lasting set of federal guidelines and institutional structures intended to protect the subjects of medical research.

It would be difficult to name a figure in the history of American medical ethics whose actions have been more consequential than Buxtun’s. Yet most ethicists have never heard of him, and many accounts of the Tuskegee scandal do not even mention his name. Buxtun did not appear in Jean Heller’s 1972 exposé, nor was he mentioned in the first major scholarly article about the scandal, written in 1978 by Harvard historian Allan M. Brandt. In the well-known play (and later, film) based on the scandal, Miss Evers’ Boys, Buxtun is completely absent. His name is rarely uttered along with the other notable whistleblowers of his era, such as Daniel Ellsberg, Karen Silkwood, and Frank Serpico. If the role played by Buxtun in exposing the scandal is at all familiar, it is largely because of James H. Jones’s influential 1981 history of the Tuskegee study, Bad Blood, and Wellesley College historian Susan Reverby’s 2009 book, Examining Tuskegee.

Included in Bad Blood is a photo of Buxtun as a bearded young man in 1973, posing next to Senator Ted Kennedy. For many years that photo—plus the knowledge that Buxtun lived in San Francisco during the 1960s—had placed a certain image of him in my head. If asked to describe it, I would have mentioned radical politics, psychedelic drugs, and maybe the Grateful Dead. Nothing could be less accurate. Buxtun is a lifelong Republican, a member of the National Rifle Association, and a collector of vintage weapons. When I told him about a recent visit I had made to the City Lights bookshop, famous as the home of San Francisco’s Beat poetry scene in the ’50s, he replied, “Sometimes I like to go there and ask to see their military history section.” Buxtun left the Public Health Service in 1968, moving on to law school and then to a career in investments, but he has lived in the same Telegraph Hill apartment for more than 50 years. In front of a large bay window overlooking San Francisco Bay is a set of German aviation binoculars mounted on a tripod. The walls of his apartment are lined with bookshelves, including one shelf devoted to books he keeps solely for their names. One is titled The Romance of the Gas Industry.

Maybe the most unusual thing about Buxtun is how few counterparts he has. The past half century has seen many medical research scandals, but few of them have been exposed by whistleblowers. In many scandals, even those that came well after Tuskegee, doctors and nurses have stayed silent even after seeing research subjects being shamefully mistreated. In cases where medical insiders have worked up the courage to speak out publicly, the result has often been professional vilification. This raises a larger question: In a research enterprise supposedly built on a humanitarian ethos, why are whistleblowers like Buxtun so rare?


* * *

Buxtun never planned to work for the Public Health Service. Raised on a ranch in Oregon, he had enlisted in the army and trained as a psychiatric social worker after he finished a political science degree at the University of Oregon. In 1965, he was doing graduate work in history when he saw a job flier. The health service was funding a venereal disease program in San Francisco. Buxtun says, “I found this thing and thought: San Francisco, working in VD control? What a stitch!”

Soon he had become a venereal disease tracker. “A typical day would be, Come in, look in your mail slot to see if you had some other people’s names,” Buxtun says. “Charlie Jones, met in a gay bar by another gay guy, and they had gay sex, the other guy had syphilis. Okay, chase this guy down. How do you find the guy? Well, there are a lot of ways to do it, and we had some of the resources that a typical detective in a police department would get—reverse directories and things like that.” Once Buxtun had tracked his subject down—sometimes in a flophouse, sometimes in one of the city’s better neighborhoods—he would persuade the person to be tested. Those who tested positive were treated effectively with penicillin. “Men could get a sore that would scare the hell out of you,” Buxtun says. “One of them looked like a dog had taken a bite out of a weenie.”

One day in the coffee room, Buxtun overheard a coworker talking about a syphilis patient in Alabama. “The family knew that he was really ill, that something was really wrong,” Buxtun says. “He was plainly insane, had symptoms, and for some reason they took him some distance away to a doctor they knew of.” The doctor diagnosed tertiary syphilis (the later stages of infection, which can damage the central nervous system) and gave the man a shot of penicillin. But when Public Health Service officials found out, they got very upset. The doctor, unaware of the Tuskegee study, had treated a research subject who was not supposed to be treated.

The next day Buxtun was on the phone with someone at the Communicable Disease Center (now the Centers for Disease Control and Prevention). “I said, ‘Hey, what do you have on this Tuskegee study?’ He said, ‘Oh, I’ve got a lot on it. What do you want?’ I said, ‘Send me everything.’ Damned if I didn’t get—and I’ve still got it—a brown manila envelope.” It had about 10 reports of what were called roundups—the occasions when the subjects were found and brought in for examination. What Buxtun read about the Tuskegee study in that envelope contradicted everything that he’d been advising doctors to do with a syphilis patient. “You treat him. You don’t let him get back out in society and infect someone else,” Buxtun says. Yet in Tuskegee, the researchers were simply following patients to see what would happen if they went without therapy. “It was an autopsy-oriented study,” Buxtun says. “They wanted these guys dead on a pathology table.”

The research subjects were all black men in Macon County, Alabama, many of them sharecroppers. Nearly 400 had syphilis and another 200 or so served as healthy controls. The syphilitic men were never told they had an infectious disease, only “bad blood,” nor were they offered any treatment apart from tonics and pills such as aspirin for aches and pains. Many subjects underwent painful lumbar punctures (spinal taps) to determine whether the infection had spread to the nervous system. The researchers persuaded them to enroll in the study by giving them free meals and minor remedies and by promising to pay their burial expenses in exchange for permission to autopsy their bodies. When the Tuskegee study began in 1932, treatment for syphilis involved a lengthy, toxic course of arsenic-based therapy. By the late 1940s, however, the disease was easily curable with penicillin. The consequences of untreated syphilis are summed up on a yellow matchbook that Buxtun and his colleagues used to distribute in bars and bathhouses: “Blindness, heart injury, insanity, death.”

Buxtun took the roundup reports with him to the city library. “I wanted to look up German war crimes proceedings,” he says. Buxtun had come to America as an infant in 1937, the son of a Jewish Czech father and a Catholic Austrian mother. He knew that the “Doctors’ Trial” in Nuremberg, in which German physicians were indicted for experimenting on concentration camp prisoners—seven were executed—had led to the modern code of research ethics. The very first principle of the Nuremberg Code states, “The voluntary consent of the human subject is absolutely essential.” The code also directs researchers to protect subjects from disability, injury, or death, no matter how remote the possibility. Buxtun remembers, “It was toward the end of the evening in that library downtown, and I thought: I’ve got to do something.”

The first thing he did was prepare a report on the Tuskegee study. “I directly compared the work of the CDC in Atlanta, in Tuskegee, to what the Nazis had done,” Buxtun says. He showed the report to his boss and said he planned to send it to William Brown, the head of the Venereal Disease Section of the Public Health Service. He recalls his boss saying, “When they come to fire you, or do whatever they’re going to do, forget my name. I’ve got a wife and a couple of kids. I want to keep my job.”

It is unclear whether Brown ever read that report, but he certainly read a letter Buxtun sent him in November 1966. “Have any of the men been told the nature of this study?” Buxtun asked. “In other words, are untreated syphilitics still being followed for autopsy?” Brown drafted a reply assuring Buxtun that the subjects were volunteers who were “completely free to leave the study at any time.” But he apparently never sent it and instead decided to talk to Buxtun in person.

“To my surprise, I got orders to go to Atlanta, from Dr. Brown and company,” Buxtun says. “I was being called on the carpet, and they thought from the high position that they had that they were going to correct an errant employee. Maybe I was an alcoholic, or a lunatic of some sort.” The March 1967 summons to Atlanta coincided with an annual conference for healthcare workers specializing in venereal disease. Buxtun was scheduled to meet Brown after the first session. “So these stern-looking bureaucrats come out,” he says. They led him to a meeting room with a large, dark wooden table. “All these guys came in and sat at one end of the table, so I went a little way down the table, sat down, and put my things down,” Buxtun says. “They were sitting right in front of the American flag and the flag of the Public Health Service. The leader of the group was Brown, who turned out to be a mousy little bureaucrat.” The real enemy in the room was John Cutler, an assistant surgeon general and venereal disease specialist who was deeply involved in the Tuskegee study. “He was bursting with rage,” Buxtun remembers. “He couldn’t wait for the door to be shut to that meeting room.”

“That guy pinned my ears back,” Buxtun says. “He proceeded to give me a tongue-lashing. ‘See here, young man. This is serious work we are doing. You are talking about harm to these black sharecroppers? This is something they are doing as volunteers.’ ” Buxtun responded by reading from one of Cutler’s own reports, which stated clearly that the subjects would never have agreed to the study without the “suasion” of burial expenses. Buxtun remembers Cutler saying, “I didn’t write that! I didn’t write that! It must have been written by one of my colleagues!” At that point, Buxtun says, everyone in the room began to look nervous.


* * *

“It’s tough being a whistleblower when you don’t even know you’re a whistleblower,” Buxtun told me. When he began his work with the Public Health Service, the word was completely unknown. “Would I have known the term whistleblower in 1972? I don’t know. I might have,” he says. “But it wouldn’t have the connotations that it carries now.”

In Buxtun’s era, people who called out wrongdoing in their own organizations were more likely to be called turncoats, snitches, or squealers. In the introduction to a 1972 book on whistleblowing, historian Taylor Branch writes that it is difficult to find in history or mythology any case where people are honored for having publicly exposed the actions of their superiors. “Whistle-blowing is severely hampered by the image of its most famous historical model, Judas Iscariot,” Branch writes. “Martin Luther seems to be about the only figure of note to make much headway with public opinion after doing an inside job on a corrupt organization.”

Anyone who has studied the notorious research scandals of Buxtun’s era cannot help being struck by the absence of whistleblowers—or even any real public dissent—in studies that often lasted years. It took decades, for example, for someone to finally blow the whistle, in 1972, on the federal government’s practice of testing the toxic effects of radioactive substances on unwitting citizens. Nor did anyone act when University of Pennsylvania dermatologist Albert Kligman tested organophosphates and other toxic chemicals on inmates at Holmesburg Prison in Philadelphia. In Montreal, McGill University faculty members remained silent for a decade while CIA-funded psychiatrist Ewen Cameron subjected his patients to some of the most bizarre experimental techniques imaginable—drug-induced comas lasting weeks, repeated high-voltage electroconvulsive therapy, the administration of powerful psychoactive drugs ranging from LSD to PCP, helmets broadcasting taped “psychic driving” messages into their ears. In a famous 1966 article in The New England Journal of Medicine, Harvard anesthesiologist Henry Beecher identified 22 stunningly abusive studies that had been published in medical journals over a period of years, seemingly without prompting any ethical objections whatsoever.

Of course, this was also a time when public trust in institutional authority was plummeting—the era of the My Lai massacre and the Watergate cover-up, of Eichmann in Jerusalem and Unsafe at Any Speed. In 1978, more than 900 Americans committed suicide by drinking cyanide-laced punch at Jonestown, the Peoples Temple religious commune in Guyana, on the orders of their minister, Jim Jones. By the end of the 1970s, the consequences of such blind obedience to authority had given whistleblowing a measure of cultural respectability.

Today whistleblowers in medical research are not as rare as they once were. For instance, it was largely the actions of whistleblower John Pesando, an oncologist at the Fred Hutchinson Cancer Research Center in Seattle, that in 2001 exposed a series of ill-conceived, deceptive cancer studies. Yet many other scandalous studies in recent decades have taken place without public disclosure. Some were conducted by private companies, such as Pfizer’s Trovan trials during a meningitis epidemic in Nigeria in the ’90s, which resulted in the deaths of 11 children. But many others occurred in academic health centers: the controversial treatment withdrawal study of patients with schizophrenia at UCLA in the 1980s, the recent “bacteria-in-the-brain” studies at the University of California–Davis, and the infamous “symptom provocation” studies on patients with schizophrenia at Yale University, Columbia University, the National Institute of Mental Health, and the University of Cincinnati, among others. My own institution, the University of Minnesota, has endured a series of research scandals in its psychiatry department dating back to the early ’90s, many of which remained hidden for years until mistreated research subjects or their family members contacted reporters.

In the decades since the Tuskegee study, the moral standing of whistleblowers has become more complicated. In movies such as Serpico and The Insider, Hollywood portrays them as brave, conscience-tortured martyrs who triumph in the end, but in reality, people who try to blow the whistle on wrongdoing often fail and face brutal punishment. In 2010, scholars at the universities of Chicago and Toronto studied 216 cases of corporate fraud. They found that in more than 82 percent of the cases in which an employee who reported fraud was named, that employee was fired, quit under duress, or was punished in some other way. Many never worked again. The scholars concluded, “Not only is the honest behavior not rewarded by the market, but it is penalized. Why employers prefer loyal employees to honest ones is an interesting question that deserves separate study.”

This pattern is consistent across a whole array of organizations, both public and private: engineering firms, banks, military bases, government agencies, and hospitals. Even nurses who speak up about dangers to patients are often punished. One study found that 28 percent of nurses who reported misconduct had been formally reprimanded, and every single nurse surveyed had suffered some kind of informal retaliation, such as ostracism or pressure to resign. Ten percent were asked to see a psychiatrist.

Although no such studies of whistleblowers in clinical research have been conducted, there is little reason to think that the results would be much different. One of the most demoralizing recent assessments of medical whistleblowing came from a Harvard study of 26 people who had exposed fraud and corruption in pharmaceutical companies using qui tam lawsuits. The purpose of qui tam lawsuits is to encourage whistleblowers by allowing them to collect a share of the resulting financial settlement. Many of the 26 whistleblowers eventually collected millions of dollars, yet few felt that it was worth the personal devastation. Often they had been asked to take extraordinary risks, such as smuggling files out of the company or wearing a wire to meetings, yet federal investigators treated them with suspicion, as if they were complicit in the crimes. Nearly half of the whistleblowers experienced stress-related illnesses, and more than 30 percent were financially ruined.

Research whistleblowers have even fewer protections. The federal research oversight system has no formal mechanism for dealing with whistleblower complaints. The Food and Drug Administration will not even tell whistleblowers whether it is investigating a complaint. If an FDA investigation is undertaken and completed, the only way for a member of the public to find out the result is to file a Freedom of Information Act request. Even worse, those who dare to blow the whistle on abuses at their own institution do so at their own risk. No federal statute offers them any legal protection. (In some states, however, they may have protection under other laws.)

Even back in 1967, Buxtun was well aware that his efforts to stop the Tuskegee study might backfire on him. “You bet I thought about having to find another job, perhaps in another city and probably outside of government,” Buxtun tells me. “Make no mistake, my confrontation with the CDC aristocracy was intended to get rid of me. They knew it, and I knew it.”


* * *

Buxtun heard nothing more from William Brown after his summons to Atlanta. In November 1968, seven months after the assassination of Martin Luther King Jr., Buxtun wrote to Brown once again, this time pointing out the political volatility of the study. “The group is 100% negro. This in itself is political dynamite and subject to wild journalistic misinterpretation,” Buxtun wrote. Brown didn’t respond. Instead, in the spring of 1969, he convened a blue-ribbon panel, this time with experts from outside the Public Health Service. But even they decided against stopping the study.

Although Buxtun was unaware of it, he was not the only person to find the Tuskegee study objectionable. Over the decades, a handful of others had protested. Some were physicians at other universities who learned about the study from lectures or journal articles, such as Count Gibson of the Medical College of Virginia and Irwin Schatz of the University of Michigan. Another was a Public Health Service employee: Bill Jenkins, an epidemiologist and one of the first African Americans to work for the CDC. Yet none of them had the staying power of Buxtun, who for seven years simply refused to let the issue die.

By the early 1970s, Buxtun had left the Public Health Service for law school but was still living in San Francisco. Among his circle of friends was a group of women who had been at Stanford together, one of whom was Edith Lederer. She is now the Associated Press correspondent for the United Nations, but in 1972 she was only six months into her first job as an AP reporter. Buxtun started telling her and some of her journalist friends about the Tuskegee study. “I remember one night, a bunch of us went to a pizza place, and two of these reporters were right across the table from me.” He gave them his pitch. “This one guy looked at me, put his pizza down, and said, ‘Look, pal, we deal with the news all the time. Give us a break. We just want to have a pizza.’ ”

A month later at a dinner party, he tried Lederer again. “That night she listened,” Buxtun says. He recalls her saying, “What? Black people? All of them black?” Lederer asked Buxtun if he had any documentation. “We’d had dinner already so I said come on, hop in my car, and I’ll show you.” Lederer can still remember her reaction to those documents. “I was horrified,” she says. Buxtun says Lederer was sitting on the couch in his apartment. “She kept looking and looking and finally she said, ‘Can I borrow this and Xerox it?’ And I said, ‘I wish you would!’ ”

Lederer took the documents to her bureau chief, who decided that the story should be written by a more experienced reporter. Lederer sent the material to Jean Heller, an investigative reporter in the Washington office. When Heller’s article appeared on July 25, 1972, it carried the headline, “Syphilis Victims in U.S. Study Went Untreated for 40 Years.”

“It just blew the story wide open,” Buxtun says. Not only did the Tuskegee revelations shatter the popular image of doctors as honorable professionals, they confirmed the African-American community’s worst fears about institutionalized medical racism. That such a study could be sponsored by the federal government was bad enough. That it could continue in plain sight for four decades—the results published openly in medical journals with little objection or comment—was stunning. Before Tuskegee, there were no formal federal guidelines or required structures to protect research subjects. Americans thought only Nazis needed that kind of formal oversight. After Tuskegee, it was impossible to believe that the honor and good intentions of doctors were enough.


* * *

When bystanders to organizational wrongdoing are asked why they remained silent, they usually give one of two reasons. The first is that nobody would listen. The second is fear of retribution. For decades, findings such as these have guided reformers who want to encourage whistleblowers. The Whistleblower Protection Act of 1989, for example, made it illegal for federal agencies to retaliate against government employees who report things such as waste, mismanagement, and violations of the law.

Yet many bystanders to wrongdoing in medical research are academic physicians with all of the protections of tenure. These physicians can’t be fired, at least not easily, yet still they remain silent. Most of us understand intuitively how difficult it is to defy authority or break from a group, especially a close-knit group, even if doing so presents no real danger at all. The true horror of My Lai and Jonestown—or for that matter, any number of college hazing scandals—is not that the shame of going along with the group is unimaginable, but that we can imagine it all too well.

Recently a team of social psychologists in Amsterdam, Padua, and Palo Alto designed an unusual study of research whistleblowing based on Stanley Milgram’s famous obedience experiments at Yale in the early 1960s. In an elaborately constructed sham scenario, Milgram’s subjects were ordered to administer what they believed to be dangerous electrical shocks to unwitting people. The subjects weren’t enthusiastic about obeying; in the films Milgram made of the experiment, you can see the subjects sweating, trembling, and repeatedly protesting as they increased the “shocks” to the highest voltage. Yet in the end, faced with a conflict between the demands of their conscience and those of a man in a white coat, 65 percent obeyed the man in the white coat.

In the new version, the social psychologists wanted to see whether people would blow the whistle on an obviously dangerous experiment. They arranged for groups of Dutch university students to be approached by a stern, formally dressed “scientist” (actually an actor) who wanted help recruiting volunteers for a study of sensory deprivation. The scientist told the students that his experimental subjects would be isolated and unable to see or hear anything for an extended period. An earlier pilot study had gone badly; the traumatized subjects had hallucinated, panicked, and lost their ability to think rationally. Two of the six subjects had begged to have the experiment stopped, the scientist explained, but stopping it would have ruined his data. Now the scientist was repeating the experiment with younger subjects, whose brains were even more sensitive to the traumatic effects of sensory deprivation. He wanted the Dutch students to help him by recommending the experiment by email to their acquaintances and friends. Those who agreed to help were instructed to write a statement using at least two of the words “exciting,” “incredible,” “great,” or “superb.”

This study was designed to make it as easy as possible for the subjects to refuse to cooperate and blow the whistle to an oversight committee. The “scientist” left the room so that no subject would have to confront an authority figure, and the subjects were given plenty of time to consider their decisions. In addition, the subjects were told that the university’s research committee was still deciding whether to approve the study. The students were all given a form encouraging them to register any ethical objections, which they could submit anonymously.

When a group of subjects was asked to predict how they would handle such a scenario, virtually none of the subjects could imagine themselves cooperating. More than 96 percent said they would disobey the scientist or blow the whistle to the research committee or both. But when a matched group of subjects was placed in a room with the fake scientist, the overwhelming majority complied. More than three-quarters of the students wrote a statement intended to recruit their friends and acquaintances into the dangerous study, and only one in 10 blew the whistle to the research committee.

What accounts for this alarming result? Certainly not fear of retribution or a sense of futility. Subjects could register their dissent confidentially, and there was no reason for them to believe that the research committee would ignore their concerns. Nor was it anything about the subjects as individuals. Personality tests could detect no differences between those who resisted authority and those who obeyed. According to the psychologists who designed the study, the explanation is simply the one that Milgram laid out decades ago: in most situations, we simply do what is expected of us by people we see as legitimate authorities. Milgram called this surrender of autonomy the “agentic state.”

The phrase sounds like something from The Manchurian Candidate, but there is nothing sinister about it. In most social situations, we expect that someone will be in charge—a host at a dinner party, a flight attendant on a plane, an usher at the theater. In such situations, we naturally do as the authority tells us. The problem, of course, comes when a legitimate authority figure asks us to do something cruel or dishonest. According to Milgram, we often try to weasel out of such conflicts by imagining ourselves as mere instruments for the wishes of the authority. Adolf Eichmann claimed he was just following orders; today, we’re more likely to excuse ourselves by saying, “Above my pay grade,” or “Not my circus, not my monkeys.” A version of this response comes through in the whistleblowing experiment. The subjects who cooperated tried to deflect responsibility to the fake scientist, while the rare subjects who blew the whistle did so precisely because they felt personally implicated.

Of course, in actual cases where research subjects are mistreated, the potential whistleblowers are highly trained, knowledgeable adults with medical expertise, not students listening to a research presentation for the first time. Like Buxtun, they have the chance to do their own background reading, ask tough questions, and talk to colleagues. Rarely is there a single moment of decision, as there was for the students; potential whistleblowers usually have months or even years to ponder their choices. Yet many of them still fail to act.

As alarming as Milgram’s findings were, they also suggested a solution. By manipulating study conditions, Milgram found that he could tip the scales away from obedience and toward the demands of conscience. Diminishing the prestige of the authority figure helped—getting rid of the scientist’s lab coat, for instance, or moving the experiment from Yale to a nearby community building. So did bringing the victim into the same room as the person administering the shocks, so that the screams and protests by the victim became more personal. But the most profound changes came when Milgram placed dissenting confederates in the room. If the “scientist” was accompanied by a second “scientist” who objected to the shocks, not a single subject was willing to continue administering them. And if a person was placed at a table with two others who refused to administer shocks, that person was emboldened to resist as well. In other words, people were far more likely to follow their consciences if they didn’t feel so isolated and alone in their dissent.

In many academic health centers, the default solution for encouraging whistleblowing is an “ethics hotline,” by which employees can file anonymous, confidential complaints. But anonymous hotlines have many problems. They are easily abused by unhappy employees looking to settle petty workplace scores, and genuine whistleblowers often have little reason to trust the people on the other end of the phone. The former director of Whistleblowers Australia, a physician, once wrote, “A very important piece of advice for whistleblowers, which they ignore at their peril, is never to use an official, internal anti-corruption body for anything but the most trivial matter, and preferably not to risk using it even then.”

If academic health centers genuinely want to protect research subjects, they need to make far deeper changes along the lines that Milgram suggested: deflating the institutional authority of researchers while making it easier for dissenters to find common cause. In the rare cases where research whistleblowers have been successful, they have often banded together in teams. For example, the exposure of the “unfortunate experiment” in 1987 at National Women’s Hospital in Auckland, New Zealand—which involved the nontreatment of cervical carcinoma in situ for decades and is often compared to the Tuskegee study—came only when three physicians at National Women’s wrote a journal article and one later spoke to magazine journalists. More recently, four physicians at the Karolinska Institute in Sweden mounted a public campaign to expose the wrongdoing of the transplant surgeon Paolo Macchiarini, who managed the rare trifecta of contributing to the deaths of research subjects, committing scientific misconduct in his publications, and achieving international disgrace in his personal life. (An NBC producer told Vanity Fair that Macchiarini proposed to her and promised a wedding ceremony overseen by the pope and attended by celebrities such as Elton John and Russell Crowe.) In both cases, however, institutional leaders steadfastly supported the researchers for years and tried to shut down dissent before the wrongdoing was eventually exposed.


* * *

The natural assumption is that Buxtun’s actions were the consequence of his extraordinary character and tenacity. Not many people would take up a moral cause on behalf of strangers and stick with it for seven years without any discernible success. “Once Peter gets something in his head, he’s going to pursue it and give it 100 percent,” author James Jones told me. That Buxtun persisted so long and eventually succeeded, without the help of like-minded colleagues, sets him apart from even the most determined crusaders.

But it is also important to note the social factors that made it more likely that Buxtun would resist authority. He was not a doctor, so he had not been trained to see senior doctors as his superiors. Nor was he committed to a career in public health. He lived some 2,000 miles from Macon County, Alabama, and the authority figures he answered to in San Francisco were not involved in the Tuskegee study. By the time Buxtun met Brown and Cutler, he had already committed himself deeply to dissent. He never saw the Tuskegee doctors as legitimate authorities and never surrendered his moral agency to them.

Many whistleblowers see their actions as the defining event of their lives and never let go of their bitterness over the difficulties they have faced. Buxtun, in contrast, seems remarkably free of rancor. “I’ve moved on,” he says. “A lot of good things have happened.” The only figure in the Tuskegee scandal he still seems to harbor any resentment toward is Cutler, the man who gave him a tongue-lashing in Atlanta and remained unrepentant for decades. In 2010, President Obama issued a formal apology to victims after Susan Reverby discovered that Cutler had also directed Public Health Service experiments in Guatemala in the 1940s, in which researchers intentionally gave syphilis and gonorrhea to soldiers, prisoners, and mentally ill patients. “He’s my villain for all of this,” Buxtun says of Cutler. “I can see Dr. Mengele saluting this guy.”

If Buxtun carries any resentment about his relative anonymity, he hides it well. “I don’t want to be embarrassed by an oversupply of compliments,” he says. “I am who I am. There’s nothing to try to change, up or down.” I told him I was gratified to see he had recently been given a Freedom of Information prize by a journalism association in Northern California. Buxtun replied that he was not the only person honored that night. Then he added, “Another recipient was arrested the following week for public corruption and gun trafficking.”
