About 18 months ago, I began the transition from academic researcher to science communicator. It has been, by turns, rewarding and disappointing, painful and effortless and surprising, which is everything you’re supposed to feel, though that doesn’t make the feeling any easier.
One of the more discomfiting discoveries was that the values of my old world were not always compatible with the values of my new one. Take, for instance, expertise. The more I know about a scientific subject—about the researchers, the methodologies, the conflicting theories—the better prepared I am to communicate that science to others. Right? Expertise is what matters. It’s what earns you the opportunity to peer-review papers. It puts you on funding panels, and in front of 300-person lecture halls.
But what passes for expertise among researchers smacks of bias among some journalists. I know the researchers? I’ve used these methods? I may have preconceived ideas about the scientific theories under investigation? Then I’m the last person who can be trusted to cover this research objectively. And in many journalistic corners, objectivity, or even its appearance, is prized over expertise.
Such value misalignments exist everywhere. In little ways, they chip away at the self-assurance of anyone rash enough to let two disciplines collide. This is something Facebook now knows all too well.
You’ve probably heard about the recent study, published in the prestigious Proceedings of the National Academy of Sciences, in which the feeds of nearly 700,000 Facebook users were manipulated to favor either posts filled with negative words or posts filled with positive words. Would all that positivity or negativity affect our own moods, as measured by the positivity or negativity of the words in our own status updates? That is indeed what the researchers found, albeit barely, even unconvincingly.
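To make that measurement concrete, here is a minimal sketch of the kind of word-count mood score the study relied on. The published analysis used the LIWC word lists; the tiny lexicons and sample updates below are hypothetical stand-ins for illustration only.

```python
# Toy version of a word-count mood measure: score a status update by
# the share of positive words minus the share of negative words.
# The real analysis used the LIWC word lists; these are hypothetical.

POSITIVE = {"happy", "great", "love", "wonderful"}   # hypothetical lexicon
NEGATIVE = {"sad", "awful", "hate", "terrible"}      # hypothetical lexicon

def mood_score(status_update: str) -> float:
    """Fraction of positive words minus fraction of negative words."""
    words = [w.strip(".,!?") for w in status_update.lower().split()]
    if not words:
        return 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (pos - neg) / len(words)

print(mood_score("What a wonderful, happy day!"))        # > 0: reads positive
print(mood_score("Traffic was awful. I hate Mondays."))  # < 0: reads negative
```

Effects this crude and this small, aggregated over hundreds of thousands of users, are exactly the sort of thing that can reach statistical significance while remaining practically negligible.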
The hubbub arises because none of those nearly 700,000 users gave informed consent to participate in a study, a linchpin of the code of ethics that academic researchers agree to abide by. Instead, the study seemed to skate by on Facebook’s blanket user agreement, and on the general assumption that, well, Facebook tweaks its algorithms without informing anyone all the time, so why not for science?
Facebook users are none too pleased. But make no mistake: for Facebook and many other companies, this really is business as usual. With the rise of A/B testing to optimize clicks, orders, eyeballs, and that ever-elusive engagement, each one of us participates in dozens of mini-experiments every time we browse the web (a sketch of how such tests typically work appears below). The “line” Facebook crossed isn’t in manipulating us, or in making us feel worse about our lives. It’s in publishing the findings, in trying to pass off what it had done as science, intended to produce knowledge for the greater good. And when held to that set of values, which includes informed consent and other human-subject protections, the study fails utterly. (See interface designer and blogger Sebastian Deterding for a similar take.) If Facebook has learned anything from its brush with academia, it is to stay the hell away: from academia, not from experimentation.
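For the curious, here is a minimal sketch of the deterministic bucketing behind a typical A/B test, assuming a simple hash-based assignment. The experiment names and the even split are hypothetical; this is no particular company’s system.

```python
# Minimal sketch of deterministic A/B bucketing: hash the user together
# with the experiment name so each visitor always sees the same variant.

import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Map a (user, experiment) pair to a stable variant bucket."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same user lands in the same bucket on every visit, while separate
# experiments split the population independently of one another.
print(assign_variant("user-42", "checkout-button-color"))
print(assign_variant("user-42", "headline-wording"))
```

Because assignment is a pure function of the user and the experiment, no one has to be told they are in a test at all, which is precisely what makes this machinery so frictionless and so easy to run at scale.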
Compounding the problem, the tech sector is shiny-new and ever evolving. Unlike in the more established fields of academia and journalism, many ethical questions have been posed, but few have been firmly settled.
A better lesson from the Facebook fiasco is that this must change. As it stands, we have no real sense of what to expect when we visit a website, download an app, or are targeted by an online marketing campaign. Is it okay if we experience something different from what our neighbors do? What if that experience is tailored to our gender, or to our race, or to our income level, either explicitly or for all practical purposes? What if it is tailored instead to whatever has gotten a rise out of us in the past, or just to dumb, blind chance, to be analyzed later by marketing teams and academics alike? These questions are too big and too many to keep going unanswered.
Scientists abide by one code; journalists, by another. But the land of code, the Internet, is still found wanting.