
[escepticos] Controlling a remote control with your mind (Brain-Computer Interfaces, BCIs)



A curious article:

at: http://www.beyond2000.com/news/story_605.html

How many times have you wished your noisy neighbours would turn their stereo
down? Soon that wish might be all you need. Thought-controlled appliances
have been a stalwart ingredient of science fiction stories for decades, but
inside a laboratory at the University of Rochester, people are communicating
their mental musical wishes to a stereo without a remote control (or genie)
in sight. There's also a chance that in a future emergency you'll be able to
stop your car as soon as you think about it, rather than having to transfer
your feet between pedals. It all sounds like a gimmick for the couch potato,
but the true beneficiaries of such technology will be the disabled.

Presently there is a rather cumbersome accessory involved, though one day
that will be redundant. But for now, outfitted with a virtual reality helmet
and a computer program adept at recognising key brain signals, volunteers
use their thoughts to take actions like those of any average citizen, such
as turning on the TV or the music.

This brain-to-computer line of research may someday allow patients with
extreme paralysis to regain some control of their surroundings, say the
project's developers. Eventually it could eliminate keyboards, mice,
switches and all the other interfaces between our thoughts and desired
results.

No Brainer

While dozens of teams around the world are working on such brain-computer
interfaces (BCIs), Rochester computer science graduate student Jessica
Bayliss is the first to show that detection of the brain's weak electrical
signals is possible in a busy environment filled with activity. She has
shown that volunteers who don a virtual reality-style helmet in her lab can
control elements in a virtual world, including turning lights on and off and
bringing a mock-up of a car to a stop by thought alone.

The laboratory's research combines tools that mimic real-world sensations,
such as driving simulators and gloves that simulate the feel of physical
objects. Recently the lab added virtual people, robot-like actors with which
volunteers can interact in a limited way. Though all of this currently takes
place only in a virtual setting, the team is confident that the technology
will eventually make the jump to the real world.

"Virtual reality is a safe testing ground," says Bayliss. "We can see what
works and what doesn't without the danger of driving a wheelchair into a
wall. We can learn how brain interfaces will work in the real world, instead
of how they work when someone is just looking at test patterns and letters.
The brain normally interacts with a 3-D world, so I want to see if it gives
off different signals when dealing with a 3-D world than with a chart."

Skull Talk

The brain signal Bayliss listens for is called the "P300 evoked potential."
It's not a specific signal that could be translated as "turn this crap off"
or "stop at the red light," but rather a sign of satisfaction; more like
"That's it!" (Plus it's not language specific, so you won't encounter the
problems Clint Eastwood had in "Firefox". - Ed.)

The difficulty lies in hearing that signal above the intense neurological
crowd noise inside a person's skull. "It's as if each neuron is a single
person who's talking," Bayliss explains. "If there's just one person, then
it's easy to hear what he's saying, but the brain has billions of neurons,
so imagine a room full of a billion people all talking at once. You can't
pick out one person's voice, but if everyone suddenly cheers, or oohs or
aahs, you can hear it. That's what we listen for, when several neurons
suddenly say 'that's it!'"
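
(For the technically minded: the cheering-crowd analogy is ordinary signal
averaging. Chatter that is random with respect to the stimulus cancels out
when many trials are averaged, while a response time-locked to the stimulus
adds up, so the signal-to-noise ratio grows roughly with the square root of
the number of trials. The Python toy below is our own illustration with
made-up numbers, not anything from the Rochester lab. - Ed.)

import numpy as np

rng = np.random.default_rng(seed=0)
n_trials, n_samples = 200, 150

# the "cheer": a small 5 uV P300-like bump, identical in every trial
signal = np.zeros(n_samples)
signal[70:90] = 5.0

# the "chatter": independent random noise four times larger than the bump
noise_sd = 20.0
trials = signal + rng.normal(0.0, noise_sd, size=(n_trials, n_samples))

# averaging: the time-locked bump adds up, the random chatter cancels
avg = trials.mean(axis=0)
print(f"single-trial bump-to-noise ratio: {5.0 / noise_sd:.2f}")
print(f"after {n_trials}-trial average:   {5.0 / (noise_sd / n_trials ** 0.5):.2f}")
print(f"average inside bump window:  {avg[70:90].mean():+.1f} uV")
print(f"average outside bump window: {avg[:70].mean():+.1f} uV")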

The trick is in looking for this signal to occur in sync with a light
flashing on the appliance. If the rhythm matches the blinks of the stereo
light, for instance, the computer knows the person is concentrating on the
stereo and turns it on. A person doesn't even have to look directly at the
stereo; as long as the object is in the field of view, it can be controlled
by brain signals. Since it's not necessary to move even the eyes, this
system could work for paralysis patients who are completely "locked in," a
state in which even eye blinks or movement are impossible.
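
(Again, purely our own sketch of how the synchronization trick might look in
code, not the Rochester software: average the EEG epochs that follow each
appliance's flashes, then score each appliance by how strongly a P300-like
bump rises above baseline in its average. The sampling rate, the window
sizes and the single-electrode input are all our assumptions. - Ed.)

import numpy as np

def attended_appliance(eeg, flash_times, fs=256):
    """Guess which appliance the user is attending to, from one electrode.

    eeg         : 1-D array of EEG samples (microvolts)
    flash_times : dict mapping appliance name -> sample indices at which
                  that appliance's light flashed
    fs          : sampling rate in Hz (an assumed value)
    """
    window = int(0.6 * fs)                        # examine 600 ms per flash
    p300 = slice(int(0.25 * fs), int(0.40 * fs))  # ~250-400 ms post-flash
    base = slice(0, int(0.10 * fs))               # first 100 ms as baseline

    scores = {}
    for name, times in flash_times.items():
        # average the epochs following this appliance's flashes; only the
        # attended appliance should leave a P300 bump in its average
        epochs = [eeg[t:t + window] for t in times if t + window <= len(eeg)]
        if epochs:
            avg = np.mean(epochs, axis=0)
            scores[name] = avg[p300].max() - avg[base].mean()

    # the appliance with the strongest post-flash bump wins
    return max(scores, key=scores.get)

If the stereo's flashes line up with P300 responses and the lamp's don't,
the stereo's average carries the bump and wins the score. In practice many
electrodes, band-pass filtering and a trained classifier would replace this
crude peak-versus-baseline measure, but the synchronization idea is the same.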

Once Bayliss has perfected the computer's ability to determine what a person
is looking at in the virtual room, the next hurdle will be to devise a
system that can discriminate objects in the real world.

"This is a remarkable feat of engineering," says Professor Dana Ballard, who
is Bayliss' adviser. "She's managed to separate out the tiny brain signals
from all the electric noise of the virtual reality gear. We usually try to
read brain signals in a pristine, quiet environment, but a real environment
isn't so quiet. Jessica has found a way to effectively cut through the
interference."

Smart Wheels

Although being able to turn out the lights without leaving bed sounds kinda
cool for the terminally lazy, the really important application of this work
is in assisting the handicapped. America's National Institutes of Health is
supporting the Rochester research because it may someday provide an
element of control to those who have only a limited ability to move.

For example, someone so paralyzed that he or she is unable even to speak may
be able to communicate once again, explains Bayliss. By merely looking at
the telephone, television or thermostat and wishing it to be used, a person
with disabilities could call a friend or turn up the heat on a chilly day.
Ultimately, she dreams, such people may even be able to operate a wheelchair
simply by thinking a series of commands.

Other BCI groups are also close to surmounting another obstacle: attaching
the required sensors to the head. Currently, dozens of electrodes must be
attached to the scalp one at a time using a gooey adhesive gel. Bayliss,
though, says dry sensors are just around the corner, and simple slip-on head
caps should not be far behind.

"One place such an interface may be very useful is in wearable computers,"
says Professor Ballard. "With the roving eye as a mouse and the P300 wave as
a mouse-click, small computers that you wear as glasses may be more
promising than ever."

BCIs are broadly divided into two categories: biofeedback and
stimulus-response. Bayliss uses the latter approach, which simply measures
the response the brain has to an event.

Biofeedback is a method whereby a person learns to control some aspect of
his or her body, such as relaxing, and the resulting change in the brain can
be detected and applied as an output. Simple video games can be played like
this. Though many research groups use such an approach, Bayliss decided
against it because people must be trained, sometimes for a year or more, and
some subjects never learn to accurately control their thought patterns.

So in the future will we all be wearing little caps that will let us open
doors, channel surf and drive the car on a whim? "Not likely," Bayliss says.
"Anything you can do with your brain can be done a lot faster, cheaper and
easier with a finger and a remote control."

Bad luck, couch dwellers. You'll still have to spend all that energy reaching
for the remote, though it really would be nice to turn down the neighbours.