Give It a Thought--And Make It So
Outfitted with a virtual reality helmet and a computer program adept at recognizing key brain signals, volunteers use their thoughts to take actions like those of any apartment dweller--turning on the television or the stereo, for instance.
This line of research, which links a brain and a computer in a near real-world environment, may someday allow patients with extreme paralysis to regain some control over their surroundings. It could eventually eliminate the keyboard and mouse as the go-betweens connecting our thoughts and the actions we wish to see in our environment.
While several teams around the world are working on brain-computer interfaces (BCIs), computer-science graduate student Jessica Bayliss is the first to demonstrate that detection of the brain's weak electrical signals is possible in a busy environment filled with activity. She has shown that volunteers who don a virtual reality helmet in her lab can control elements in a virtual world, including turning lights on and off and bringing a mock-up of a car to a stop by thought alone.
Though all this is so far taking place only in virtual reality, the team is confident that the technology will make the jump to the real world and should soon enable people to look around a real apartment and take control in a way they couldn't before.
"This is a remarkable feat of engineering," says Dana Ballard, professor of computer science and Bayliss's advisor. "She has managed to separate out the tiny brain signals from all the electric noise of the virtual reality gear. We usually try to read brain signals in a pristine, quiet environment, but a real environment isn't so quiet. Jessica has found a way to effectively cut through the interference."
The National Institutes of Health is supporting Bayliss's research because it may someday give back some control to those who have lost the ability to move. A person so paralyzed that he or she is unable even to speak may be able to communicate once again if this technology can be perfected, Bayliss says.
By merely looking at the telephone, television, or thermostat and wishing it to be used, a person with disabilities could phone up a friend or turn up the heat on a chilly day. Bayliss hopes that someday such people may even be able to operate a wheelchair by themselves simply by thinking their commands.
"Virtual reality is a safe testing ground," she says. "We can see what works and what doesn't without the danger of driving a wheelchair into a wall. We can learn how brain interfaces will work in the real world, instead of how they work when someone is just looking at test patterns and letters. The brain normally interacts with a 3-D world, so I want to see if it gives off different signals when dealing with a 3-D world than it does with a chart."
The brain signal Bayliss listens for is called the "P300 evoked potential." It's not a specific signal that could be translated as "Aunt Nora" or "Stop at the red light," but rather a sign of recognition--more like "That's it!"
"It's as if each neuron is a single person who is talking," Bayliss explains. "If there's just one person, then it's easy to hear what he's saying, but the brain has billions of neurons, so imagine a room full of a billion people all talking at once. You can't pick out one person's voice, but if everyone suddenly cheers or oohs or aahs, you can hear it. That's what we listen for--when several neurons suddenly say, 'That's it!'"
Bayliss looks for this signal to occur in sync with a light flashing on the television or stereo. If the rhythm matches the blinks of the stereo light, for instance, the computer knows the person is concentrating on the stereo and turns it on. A person doesn't even have to look directly at the stereo; as long as the object is in the field of view, it can be controlled by the person's brain signals. Since it's not necessary to move even the eyes, this system could work for paralysis patients who are completely "locked in," a state in which even blinking or moving the eyes is impossible.
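The rhythm-matching idea can be sketched in a few lines. Assume we already have the timestamps of detected P300 responses and the flash schedule of each appliance's blinking light; the appliance whose rhythm best predicts the detections (allowing the P300's roughly 300-millisecond latency) is the one selected. Every name and number below is a hypothetical illustration, not the actual system:

```python
# Hedged sketch: pick the object whose flash rhythm best explains the
# observed P300 detections. Timestamps are in seconds; the latency and
# tolerance values are illustrative assumptions.
P300_LATENCY = 0.3   # the P300 peaks roughly 300 ms after a stimulus
TOLERANCE = 0.1      # accept detections within +/-100 ms of the expected peak

def match_score(flash_times, p300_times):
    """Count P300 detections landing near an expected post-flash peak."""
    return sum(
        1 for p in p300_times
        if any(abs(p - (f + P300_LATENCY)) < TOLERANCE for f in flash_times)
    )

def select_object(flash_schedules, p300_times):
    """Return the object whose flash rhythm best matches the detections."""
    return max(flash_schedules,
               key=lambda obj: match_score(flash_schedules[obj], p300_times))

# Toy example: the stereo's light blinks every 0.7 s, the TV's every 1.1 s.
schedules = {
    "stereo": [0.0, 0.7, 1.4, 2.1, 2.8],
    "tv":     [0.2, 1.3, 2.4, 3.5],
}
# Detections arriving ~300 ms after each stereo flash: the user is
# concentrating on the stereo, so the system would switch it on.
detections = [0.31, 1.01, 1.72, 2.41]
print(select_object(schedules, detections))  # -> stereo
```

In practice the detections themselves come from a classifier running on the EEG, but the selection step reduces to this kind of timing comparison.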
The virtual apartment in which volunteers have been turning appliances on and off is modeled after Bayliss's own. Such a simple, virtual world is the first step toward developing a way to accurately control the real world. Once Bayliss has perfected the computer's ability to determine what a person is looking at in the virtual room, the next hurdle will be to devise a system that can tell what object a person is looking at in the real world.
Bayliss and Ballard work in the University's National Resource Laboratory for the Study of Brain and Behavior, which brings together computer scientists, cognitive scientists, visual scientists, and neurologists to study neural functions in complex settings.
So in the future will we all be wearing little caps that will let us open doors, channel surf, and drive the car on a whim? "Not likely," Bayliss says.
"Anything you can do with your brain can be done a lot faster, cheaper, and easier with a finger and a remote control."
Maintained by University Public Relations
© Copyright 1999–2002 University of Rochester