Inside the MIT Media Lab

The former head of the MIT Media Lab highlights the work of its Personal Robots Group.

Nexi tilts her head inquisitively to one side, glances down at her hands, leans forward ever so slightly as she bats her blue eyes and speaks in a soft, pleasant, feminine voice.

“When everyone plays with Huggable, I feel left out. When was a time that you felt alienated or left out?”

She’s talking to Professor Cynthia Breazeal, the head of the MIT Media Lab’s Personal Robots Group. Breazeal nods sympathetically and says, “Yes, my parents definitely like my brother best.”

Nexi and her nemesis Huggable are robots who occupy the fourth-floor workshop of the Personal Robots Group here in Cambridge, Massachusetts, MIT’s hometown. They are essentially students in a robot kindergarten whose population comes in all shapes and sizes. Their teachers, who are also their designers, builders and programmers, are Breazeal’s students; her group is at the vanguard of a movement to design robots that can relate to people in decidedly human terms, living, learning, working and playing among us. And not just for fun: these robots could serve as helpful companions for the sick, the disabled and the elderly.

A key goal of Breazeal’s group is to build robots that can learn from average people—not just computer programmers—in the same way people learn from each other. This is no small feat. The learning process is incredibly complex, involving all kinds of subtle social cues and signals. It’s also highly variable; as we all know, different people have very different learning styles. To understand and reproduce that process, Breazeal and her apprentices need to build a lot of different robots, then test them in a lot of different situations with a lot of different people. Thus the school’s diverse population.

A star pupil, Nexi is one of the world’s first mobile-dexterous-social (MDS) robots: In addition to being able to interact socially, she can move around on her motorized wheels, and her arms and hands are capable of gesturing and manipulating objects. Deceptively lifelike, Nexi can look happy, sad, confused or bored; she can blink her eyes and arch her eyebrows. She has microphones for ears and color video cameras for eyes. The 3-D camera embedded in her forehead allows her to track human facial, head and body movements; her laser range finder constantly scans the room, measuring the distances to the people and objects around her. When Nexi looks you in the eye, you feel as if she knows you, and when she expresses an emotion—like sibling rivalry toward Huggable—you can’t help but feel for her.
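To make that tracking concrete: once the 3-D camera reports where a visitor’s face sits relative to Nexi’s head, pointing her gaze at it reduces to computing two angles. The short Python sketch below illustrates the geometry under our own assumptions about coordinate frames and names; the FacePosition class, the gaze_angles function and the sample numbers are all hypothetical, not the Personal Robots Group’s actual software.

```python
import math
from dataclasses import dataclass

@dataclass
class FacePosition:
    """3-D position of a detected face, in meters, in the robot's head frame.

    Assumed convention: x points forward from the camera, y to the
    camera's left, z upward. (Hypothetical; not the lab's real frames.)
    """
    x: float
    y: float
    z: float

def gaze_angles(face: FacePosition) -> tuple[float, float]:
    """Return (pan, tilt) in degrees that aim the head at the face.

    Pan is positive when turning left, tilt is positive when looking up.
    """
    pan = math.degrees(math.atan2(face.y, face.x))
    horizontal_range = math.hypot(face.x, face.y)  # distance along the floor plane
    tilt = math.degrees(math.atan2(face.z, horizontal_range))
    return pan, tilt

if __name__ == "__main__":
    # A visitor standing 1.5 m away, slightly to the robot's left and a
    # little above camera height. Illustrative values only.
    visitor = FacePosition(x=1.5, y=0.3, z=0.2)
    pan, tilt = gaze_angles(visitor)
    print(f"pan {pan:.1f} deg, tilt {tilt:.1f} deg")
```

On a real robot a calculation like this would run continuously, so the head keeps re-aiming as the camera reports new face positions.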

And that’s exactly the point. The “sociable robot” is meant to seem like a person so it can function as an ally on the job, a learning companion in the classroom, a devoted assistant for the infirm or injured or an emergency worker whom people are willing to trust with their lives.

When Cynthia Breazeal was ten years old, she was so taken with the loyal droids of the Star Wars movies, R2-D2 and C-3PO, that she decided to build one herself when she grew up. As a graduate student at MIT, she first designed insect-like robots intended for space exploration, and she notes the irony that although robots have already explored the depths of the oceans, braved the Arctic and volcanoes and ventured deep into space, the real “final frontier”—where it’s most difficult for them to function—is human society.

From the beginning, Breazeal recognized that in order to be effective companions, robots would need to be able to interact with humans in meaningful ways. But, she wondered, could a robot really be taught everything it would need to know in order to live in the real world, where situations, personalities and habits are constantly evolving?

Inspired by developmental psychology, Breazeal looked to human parent-child relationships for answers. It takes years of nurturing for children to develop adult reasoning abilities, and it dawned on her that it is unrealistic to think such skills could simply be programmed into a robot. She concluded that, just as for humans, it would take many years of experience in many different situations for a robot to develop anything resembling true human intelligence.

Huggable, which looks like a teddy bear, is probably the most adorable robot you will ever encounter. Unlike Nexi, Huggable can’t navigate a fire maze or pull a child out of a burning building, but it can be held in one’s arms and channel the words and emotions of an actual human. Designed to function as a “physical avatar,” Huggable has a number of potential uses. It could provide comfort to a hospitalized child, letting the child’s grandmother “be there” remotely in the form of a cuddly bear. At the same time, Huggable could contain embedded devices to monitor the child’s vital signs. At home, Huggable could be used to teach a child a second language via a tutor a continent away.
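As a rough sketch of how the “physical avatar” idea could work in software, here is a minimal two-way relay: operator commands flow to the bear, and sensor readings flow back. The AvatarCommand and VitalSigns message types, the placeholder readings and all the names are our assumptions for illustration, not Huggable’s actual code.

```python
import queue
from dataclasses import dataclass

@dataclass
class AvatarCommand:
    """What the remote operator (e.g., the grandmother) wants the bear to do."""
    speech: str   # words to play through the bear's speaker
    gesture: str  # a named gesture, e.g. "wave" or "hug"

@dataclass
class VitalSigns:
    """Readings the bear's embedded sensors would send back to the operator."""
    heart_rate_bpm: int
    temperature_c: float

def run_avatar(commands: "queue.Queue[AvatarCommand]",
               vitals: "queue.Queue[VitalSigns]") -> None:
    """Act out any pending operator commands, then report a sensor reading."""
    while not commands.empty():
        cmd = commands.get_nowait()
        # A real robot would drive speakers and motors here; we just log.
        print(f"Bear says {cmd.speech!r} while performing {cmd.gesture!r}")
    # Placeholder reading; real hardware would sample embedded sensors.
    vitals.put(VitalSigns(heart_rate_bpm=92, temperature_c=36.9))

if __name__ == "__main__":
    commands: "queue.Queue[AvatarCommand]" = queue.Queue()
    vitals: "queue.Queue[VitalSigns]" = queue.Queue()
    commands.put(AvatarCommand(speech="Goodnight, sweetheart", gesture="hug"))
    run_avatar(commands, vitals)
    print("Operator sees:", vitals.get())
```

The design point is simply the two-way flow: the remote grandmother both animates the bear and sees how the child is doing.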

While Huggable is still in the prototype stage, Nexi has already ventured into the real world. Breazeal and her students have taken her to visit a number of senior centers in the Boston area, where the seniors largely embraced her; some shook Nexi’s hand and even hugged her as they grew more familiar with her. The researchers also discovered that the more expressive Nexi was, the longer people would talk to her. With each outing, Breazeal and her students move closer to their goal of developing social robots that can provide companionship, comfort and support.

Adapted from Frank Moss’s forthcoming book, The Sorcerers and Their Apprentices: How the Digital Magicians of the MIT Media Lab Are Creating the Innovative Technologies That Will Transform Our Lives (Crown Business, June).

Fact: Media Lab professor Cynthia Breazeal was a consultant on the 2001 Kubrick-Spielberg film A.I. Artificial Intelligence.

Media Lab Inventions

First-year students at the MIT Media Lab take a class called “How to Make (Almost) Anything,” where they learn to use all the tools—laser cutters, welders, power drills—in the Lab’s central workshop. Twice a year, the Lab’s research groups demonstrate their prototypes for visiting corporate sponsors such as General Motors and LEGO. The emphasis is as much on building and testing concepts as on thinking them up; as a result, many of the Lab’s faculty, students and alumni have seen their creations, a few of which are detailed below, find their way into the wider world.

Death and the Powers Opera

Head of the Opera of the Future group, Tod Machover is perhaps best known for the electronic hypercello he created for Yo-Yo Ma. Machover’s latest work, an opera called Death and the Powers, with a hybrid cast of robots and humans—not to mention lyrics by former U.S. poet laureate Robert Pinsky—had its world premiere in Monaco last fall and has since been performed in Boston and Chicago.

Rock Band and Guitar Hero

Harmonix Music Systems, the company that developed the video game series Guitar Hero and Rock Band, was founded in 1995 by Alex Rigopulos and Eran Egozy, who met as students in Tod Machover’s Opera of the Future group. Both are musicians themselves (Rigopulos is a composer, Egozy a clarinetist), and their goal was to give everyone, regardless of training, the ability to have fun making music.

iWalk Powerfoot BiOM

Soon to be available commercially, the Powerfoot BiOM prosthesis uses robotic engineering to re-create—and even surpass—the muscle function of a lost lower leg. It is based on technology developed by the Lab’s Biomechatronics group, led by Professor Hugh Herr, himself a double amputee as a result of severe frostbite. An avid climber, Herr found standard prosthetics exceedingly uncomfortable; with the Powerfoot, he can now climb higher and faster than he did with his biological feet.

iSET “Face Reader”

Developed by Professor Rosalind Picard’s Affective Computing group, the iSET, or Interactive Social Emotional Toolkit, is a prototype “face reader”: a tablet-sized computer, equipped with cameras, designed to help people with autism interpret the expressions, gestures and other nonverbal signals of those around them, as well as understand the impact of their own.