Today's robots can cap bottles, assemble cars, even make pizzas. But when it comes to using "brainpower" to size up a new situation and act appropriately, robots quickly falter: Scientists cannot stuff robots' circuits with all the information they need to act independently.

But a few simple hand signals from a human, researchers at the University of Rochester are finding, can increase a robot's usefulness and independence dramatically.

"A human operator can provide the guidance that a robot needs," says graduate student Polly Pook, who presented her work at the recent conference on Intelligent Robots and Systems (IROS) in Germany. "Otherwise, designing a truly autonomous robot is much tougher than anyone thought. Any time you're dealing with an unknown environment, or with people, the robot needs judgment and common sense -- abilities that are extremely difficult to program."

Teleoperation, in which humans guide the robot's every motion, is one approach to overcoming these programming difficulties. Teleoperation has given the world robots that accomplish amazing feats, like repairing the shuttle as it zips through space, or exploring the inside of a volcano. But teleoperation is quite tedious, and time delays often hinder performance. "It's like having to think about every movement you make each time you reach for a cup of coffee or walk down a hallway," says Pook. "That awkwardness is compounded by delay. Imagine how difficult it would be to walk if it took a whole minute for you to feel the floor after you put your foot down."

Pook and others are trying a variation called teleassistance, where just a few strategic cues from a human operator -- and a little programming -- allow the robot to accomplish tasks autonomously.

"Today's robots are rather myopic," Pook says. "They are good at responding to local feedback, but they still need someone to provide context, to set their agenda. Teleassistance acknowledges the robot's myopia: The robot has control over local problems like balancing and avoiding obstacles, and we provide the goals and high-level direction, such as where to go."

Pook uses hand signals to communicate with the robot. The signals tell the robot what to do next, and sometimes, where or how to do it. The robot detects the hand motions, not with its eyes but through sensors mounted on a special glove Pook wears.
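The article does not describe the glove's software, but the idea of turning sensor readings into discrete commands can be sketched simply. The gesture names, sensor layout, and template values below are illustrative assumptions, not details of Pook's actual system:

```python
# Hypothetical sketch: map glove finger-flex readings to a discrete gesture
# by nearest-template matching. Values range from 0.0 (finger extended)
# to 1.0 (finger fully curled). All names and numbers are assumptions.

GESTURE_TEMPLATES = {
    "point":      (0.0, 0.0, 1.0, 1.0, 1.0),  # index extended, others curled
    "stop":       (0.0, 0.0, 0.0, 0.0, 0.0),  # flat open hand
    "power_grip": (1.0, 1.0, 1.0, 1.0, 1.0),  # whole-hand grip, e.g. a lever
    "pinch":      (1.0, 1.0, 0.0, 0.0, 0.0),  # thumb-index pinch, e.g. a knob
}

def classify_gesture(flex_readings):
    """Return the template gesture closest to the glove's current readings."""
    def distance(template):
        # Squared Euclidean distance between readings and template.
        return sum((r - t) ** 2 for r, t in zip(flex_readings, template))
    return min(GESTURE_TEMPLATES, key=lambda name: distance(GESTURE_TEMPLATES[name]))
```

Noisy readings close to the "point" template, such as `(0.1, 0.05, 0.9, 0.95, 1.0)`, would still classify as a point, which is the kind of robustness a discrete cue vocabulary buys.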

"People frequently communicate with nonverbal signs," says Pook. "We point when giving directions, hold up a hand to mean 'stop,' and gesture to say, 'Come along.' I can do the same with the robot."

So far, Pook has guided the robot in two tasks: opening a door, and flipping a plastic fried egg. Those might sound like easy tasks, but they're difficult for a robot in a changing world. Insignificant changes, such as substituting a different door knob or spatula, or rearranging the positions of objects, can easily confound the robot. In the door-opening exercise, the robot reaches for the door after Pook points to it, and it checks how Pook shapes her hand to discover the type of handle. Then the robot takes over using pre-defined programs to grasp and turn the handle and swing the door open.
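The division of labor in the door-opening exercise can be sketched as a short sequence in which human cues select the context and the robot then runs its pre-defined local programs. The function and step names here are illustrative assumptions, not the actual control code:

```python
# Illustrative sketch of the teleassistance flow described above:
# pointing supplies the goal, hand shape supplies the handle type,
# and the robot completes the task autonomously from there.
# Step names and structure are assumptions for illustration.

def open_door(pointed_target, hand_shape):
    steps = []
    # Human cue 1: pointing tells the robot where to reach.
    steps.append(f"reach toward {pointed_target}")
    # Human cue 2: the operator's hand shape selects the grasp program.
    grasp_program = {"pinch": "grasp knob", "power_grip": "grasp lever"}[hand_shape]
    # From here, pre-defined programs run without further human input.
    steps.append(grasp_program)
    steps.append("turn handle")
    steps.append("swing door open")
    return steps
```

Swapping a knob for a lever changes only which grasp program the cue selects; the rest of the sequence is unchanged, which is why a small cue vocabulary goes a long way.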

In Pook's experiments, the tele-assisted robot opened the door far more reliably than an autonomous robot, while demanding only a fraction of the human effort that traditional teleoperation requires.

Pook's sign language guides the robot through the task, placing the robot in a series of constrained situations with well-defined goals. Within each context, the robot can respond quickly and appropriately to sensory feedback, without having to consider all possible world scenarios. "Once the robot knows its context, the behavior flows smoothly. We can skip a lot of mathematics," says Pook.

Pook's advisor, Professor Dana Ballard, says a similar process takes place every day within us. "When you put your hand on a hot stove, you remove it instantly -- you don't think about it," says Ballard. "This is why your reflexes aren't wired to your brain -- it would take too long.

"Likewise, when a sprinter hears the gun, she doesn't ask herself to run, and then think about pulling each of the individual muscles in her legs. The skilled reaction is pre-coded: She just runs. In the same way, a robot can run low-level reactive behaviors independently, once it knows its context."

Mechanical telemanipulators have existed for decades, but the new science of computationally assisted telemanipulation is growing extremely fast, Pook says. When she began her work in teleassistance a few years ago, hers was one of a handful of such presentations, but at the IROS conference, dozens of groups presented work in the area. Possible applications of the approach include robot assistants for the elderly and disabled, mining, exploration of remote sites, and hazardous waste clean-up.

Pook's work is funded by the National Science Foundation, the National Institutes of Health, and the Human Frontier Science Program.