Sunday, September 9, 2007

Machine Reads Thoughts to Navigate Wheelchair

A fascinating article, amply demonstrating the frontiers of technology. My thought: if we can "send" thoughts to a machine interface that can "understand" and carry out the thought or command, why is it so hard to believe that we could also send thoughts directly to other people's brains?

Thinking of words can guide your wheelchair
13:02 06 September 2007
NewScientist.com news service
Tom Simonite

A motorised wheelchair that moves when the operator thinks of particular words has been demonstrated by a US company. The wheelchair works by intercepting signals sent from the user's brain to their voice box, even when no sound is actually produced.

The company behind the chair, Ambient, is developing the technology with the Rehabilitation Institute of Chicago in the US. The wheelchair could help people with spinal injuries, or neurological problems like cerebral palsy or motor neurone disease, operate computers and other equipment despite serious problems with muscle control.

The system will work provided a person can still control their larynx, or "voice box", which may be the case even if they lack the muscle coordination needed to produce coherent speech. The larynx control system, called Audeo, was developed by researchers Michael Callahan and Thomas Coleman at the University of Illinois at Urbana-Champaign, US, who together also founded Ambient.

Restored speech

The system works via a sensor-laden neckband which eavesdrops on electrical impulses sent to the larynx muscles. It then relays the signals, via an encrypted wireless link, to a nearby computer. The computer decodes these signals and matches them to a series of pre-recorded "words" determined during training exercises.

These "words" can then be used to direct the motorised wheelchair. Callahan and Coleman say they can also be sent to a speech synthesiser, allowing a paralysed person to "speak" out loud. Recent refinements to the algorithms may make it possible to interpret whole sentences thought out by the user. This could potentially restore near-normal speech to people who have not spoken for years, the researchers say.

"Everyone working on brain-computer interfaces wants to be able to identify words," says Niels Birbaumer from Eberhard Karls University in Tübingen, Germany, who is developing similar systems for use by stroke victims. "If this works reliably I would be very impressed; it is very hard to record signals from nerves through the skin."

Space suit applications

Birbaumer adds that measuring brain waves, using an electrode cap or implants placed directly in the brain, has been used to control computers and wheelchairs before, but so far there is little evidence that either method can reproduce single words or continuous speech. "Recording from outside the brain like this may be the only way to do it," he says.

On the other hand, reading information directly from the brain is the only way to help people with very severe spinal injuries. "I have some patients not even able to send nerve signals to the muscles in their face," he told New Scientist. "In those cases you have to try and interface with the brain."

Ramaswamy Palaniappan, who works on EEG-based brain-computer interfaces at Essex University, agrees this is a limitation. "The main advantages of their device are that it is very portable, not cumbersome to set up, and easy to use," he told New Scientist.

NASA produced a system similar to Audeo in 2004. It can recognise a handful of short words and numbers, and the individual letters of the alphabet. The agency hopes to eventually integrate the technology into spacesuits.
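The article gives a high-level picture of how the chair turns nerve signals into motion: features extracted from the neckband are matched against word templates recorded during training, and the recognised word is translated into a movement command. The little Python sketch below illustrates only that general idea; it is not Ambient's Audeo algorithm, and every vocabulary word, feature vector, threshold and motor value in it is invented for illustration.

import numpy as np

# Toy sketch of template matching for a subvocal-speech controller.
# NOT the real Audeo algorithm: names, feature shapes and thresholds
# are all made up to show the idea described in the article.

# One feature vector per trained "word" (imagine normalised signal
# energy in a few frequency bands, captured during training).
TEMPLATES = {
    "forward": np.array([0.9, 0.2, 0.1, 0.4]),
    "back":    np.array([0.1, 0.8, 0.3, 0.2]),
    "left":    np.array([0.3, 0.1, 0.9, 0.5]),
    "right":   np.array([0.5, 0.4, 0.2, 0.9]),
    "stop":    np.array([0.1, 0.1, 0.1, 0.1]),
}

# How each recognised word maps onto a (linear, angular) velocity pair.
MOTOR_COMMANDS = {
    "forward": (0.5, 0.0),
    "back":    (-0.3, 0.0),
    "left":    (0.0, 0.5),
    "right":   (0.0, -0.5),
    "stop":    (0.0, 0.0),
}

def classify(features, max_distance=0.6):
    """Return the closest trained word, or None if nothing matches well."""
    best_word, best_dist = None, float("inf")
    for word, template in TEMPLATES.items():
        dist = np.linalg.norm(features - template)
        if dist < best_dist:
            best_word, best_dist = word, dist
    return best_word if best_dist <= max_distance else None

def to_motor_command(features):
    """Map a feature vector to a motor command, defaulting to "stop"
    so the chair never moves on an ambiguous or unrecognised signal."""
    word = classify(features)
    return MOTOR_COMMANDS.get(word, MOTOR_COMMANDS["stop"])

if __name__ == "__main__":
    # A noisy reading that should still resolve to "forward".
    reading = np.array([0.85, 0.25, 0.15, 0.35])
    print(classify(reading), to_motor_command(reading))

A real system would of course need proper signal conditioning and a far more robust classifier than this nearest-template lookup, but the overall flow (train word templates, match incoming signals, map matches to commands) follows what the article describes.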
