
Artificial emotion

Sherry Turkle is at it again. This Friday, she's hosting a daylong powwow at MIT to discuss "Evocative Objects." The front of the brochure includes pictures of an electric guitar, VW Beetle, rubber duckie, and a pipe. "Objects bring philosophy down to earth," she says.

Over the past two decades, the founder of the MIT Initiative on Technology and Self has been watching how our relationships with machines, high tech and low tech, develop. Turkle is best known for her place at the table in any discussion of how computers -- and robots in particular -- will change our lives. This makes her an essential interlocutor in the palaver, sharpened two years ago by an essay by Sun Microsystems cofounder Bill Joy warning that robots could soon take over the world.

"The question is not what computers can do or what computers will be like in the future," she maintains, "but rather, what we will be like."

What has become increasingly clear to her is that, counterintuitively, we become attached to sophisticated machines not for their smarts but their emotional reach. "They seduce us by asking for human nurturance, not intelligence," she says. "We're suckers not for realism but for relationships."

Kids, she has found, define aliveness in terms of emotion: "In meeting objects as simple as Furbys, their conversations of aliveness have to do with if a computer loves them and if they love the computer."

Quite simply, the research boundaries in this field between cognitive thought and feeling are eroding.

Exhibit A: A simple toy like Hasbro's My Real Baby (no longer in production), which exhibits and craves emotion. Last year, Turkle studied the effects of the toy on residents at the Pine Knoll Nursing Center in Lexington. The center had acquired four of the dolls and found them particularly effective for the emotional comfort they provided some residents suffering from dementia.

"It is a useful tool to reduce their constant anxiety," says Terry McCatherin, activities director there.

Japan is far ahead of us in this regard, adds Turkle. A major movement is already under way there to bring robots into nursing homes for companionship, to dispense medicine, and to turn patients to prevent bedsores, among other roles.

Then there is AIBO, the Sony robotic dog described on a Web page as follows: "From the first day you interact with it, it will become your companion." Indeed, it remembers you, follows your commands, and develops a personality shaped by you. Turkle famously quoted an elderly woman who said that the robotic pet was better than a real one because it never died.

"We are very vulnerable to technology that knows how to push our buttons in a human way," she says. "We're a cheap date. Something makes eye contact with us and we cave hard. We'd better accept that about ourselves."

Turkle, who has worked closely with the wizards at MIT's Artificial Intelligence lab, remembers vividly the first time she saw Cog, a robot developed there. "It made eye contact with me and traced my movement across the room," she recalls. "It moved its face and torso and paid attention to me and gestured toward me with an outstretched arm. It takes your breath away how you react to a robot looking at you."

The very names are loaded. Entering the scientific lexicon are words like "robo-nanny" and "robo-nurse."

The market for robotics in health care is about to explode, Turkle says. The question is: Do we want machines moving into these emotive areas? "We need a national conversation on this whole area of machines like the one we're having on cloning," Turkle says. "It shouldn't be just at AI conventions and among AI developers selling to nursing homes."

So who are the bad guys here? The developers? It's not that simple, Turkle says: "The developer says, 'Hello. I make a doll that says I love you.'" Turkle says its nursing home application speaks more to a society that understaffs its old-age facilities: "If they were packed with young people helping out every day, little robot dolls would not be such a big thrill."

The line between real and simulation continues to blur, too. "Authenticity is to our generation what sex was to the Victorians," she says.

What the hell does it mean? It means, she says, there is a sense of taboo and a fascination with the fake versus the genuine article. Like a virtual personality in a computer game like EverQuest.

And the line separating us from machines has moved. Emotions alone no longer do the job.

"People are special now because we are vulnerable," Turkle explains. "We are special because we lose. [Think of the IBM computer dusting chess whiz Garry Kasparov.] We're defined by our imperfections. And the fact that we die. It is this life cycle, our being in a family and our biology that separate us now."

Turkle is an agnostic about the future. "It's too simple to say, 'The horror! The horror!'" she says. "But we're not pausing to ask, 'What are they doing to us?'"

Nothing good, says McCatherin, if they replace human contact: "I don't like the idea of forming an emotional bond with a machine."

Me neither.

Sam Allis can be reached at
