Empathy for Inanimate Objects

Do not attempt this experiment at home

Barb Dybwad/Flickr

“Watch this poor, abused, washing machine go completely insane and explode,” urges the technology website Gizmodo. Over the next three or so minutes, a videographer, “Aussie50,” inserts a heavy piece of metal into the drum of a front-loading washer and activates its spin cycle. The machine hammers itself to death: its door flies open, the back falls off, wires twist loose, and finally the washer lies deconstructed on the ground. “Best washer-kill ever,” says Aussie50, tittering.

I showed the video to a friend, who said he felt sorry for the machine and asked why it deserved to be destroyed. That empathic reaction makes me wonder why humans feel pity for inanimate objects.

Some insight into this question comes from Astrid M. Rosenthal-von der Pütten, a social psychologist at the University of Duisburg-Essen in Germany. She and her research team have published two studies analyzing how humans respond when a robot is tortured.

In the first study, she divided 41 participants into two groups. Group One watched a two-minute video of a person in a black sweater choking and beating a robot dinosaur, Pleo, as it emitted sounds of suffering, including crying. Group Two watched a two-minute video of Pleo being stroked and fed as it sang, purred, and babbled. Subjects in Group One felt significant pity for the robot and anger at its tormentor; they also showed higher “physiological arousal,” a measure of the human “fight or flight” response.

In the second experiment, published in 2014, Rosenthal-von der Pütten and her team employed brain-scanning to examine how 14 participants would respond to videos of a human, a robot (Pleo), and an inanimate object (a green box) being tortured or treated nicely. Activation of neurons in the brain’s limbic system—areas that process emotions such as anger, happiness, or fear—was similar when robots and humans were treated affectionately. Subjects showed significantly more empathy and emotional distress, however, when the human was abused, as compared to the robot.

Do humans feel empathy for robots because they seem humanlike or, as in the case of the robot-dino, because it appears to suffer when mistreated, as do live animals? “I think, to some extent robots activate the same mechanisms of empathetic processes” that humans do, Rosenthal-von der Pütten responded to my question via email, “but there are not enough studies to draw concrete conclusions. But one can say that the human likeness of robots (in terms of their appearance and of their behavior) plays a role.”

If that is the case, why would anyone feel empathy for a washing machine, which doesn’t seem human at all? Rosenthal-von der Pütten said she is “not aware of any study investigating empathy in the context of non-robotic machines” and cannot explain what the underlying brain mechanisms might be. But one clue, I believe, comes from the studies of Swiss child psychologist Jean Piaget. He noted that children go through a stage of “animistic thinking,” in which they imbue inanimate objects with human emotions; or, as my four-year-old son recently said, “the tiny tractor is tired so he is not scooping up.”

Perhaps adults’ feelings for wasted washers and other non-living matter are a residue of childhood. Or maybe we express empathy because we see what a waste of resources it is to shatter a decent device. Possibly, as we watch the wanton destruction, we intuit the human care with which it was created.


Josie Glausiusz writes about science and the environment for magazines that include Nature, National Geographic, Scientific American Mind, Discover, New Scientist, and Wired. From 2013 to 2015 she wrote The American Scholar’s “On Science” blog. Her Hakai Magazine article, “Land Divided, Coast United,” won Amnesty International Canada's 2015 Online Media Award.
