The vision of telepathic computing long championed by Jef Raskin, the original leader of Apple's Macintosh project, moved closer to reality with NASA's move to build software that lets computers understand words not yet spoken.
The software "comes close to reading thoughts not yet spoken by analyzing nerve commands to the throat," according to AFP News.
NASA developer Chuck Jorgensen said: "A person using the subvocal system thinks of phrases and talks to himself so quietly it cannot be heard, but the tongue and vocal cords do receive speech signals from the brain."
The team placed sensors under the chin and on either side of the Adam's apple to pick up the nerve signals sent from the brain to the speech organs. In early trials, the software recognized a very limited vocabulary of six words and ten digits 92 per cent of the time.
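The recognition step described above — matching nerve-signal patterns against a small fixed vocabulary — can be sketched as a nearest-centroid classifier. Everything here is illustrative: the feature vectors, the vocabulary words, and the matching rule are assumptions for the sketch, not NASA's actual signal-processing method.

```python
# Hypothetical sketch of small-vocabulary recognition: each word in the
# vocabulary has a stored "template" feature vector, and an incoming
# signal sample is labeled with the closest template. The features and
# word list below are invented for illustration only.
import math

VOCAB = ["stop", "go", "left", "right", "alpha", "omega"] + [str(d) for d in range(10)]

def distance(a, b):
    # Euclidean distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(sample, templates):
    # Return the vocabulary word whose stored template is nearest the sample.
    return min(templates, key=lambda word: distance(sample, templates[word]))

# Toy templates: one three-dimensional feature vector per word.
templates = {word: [i * 1.0, i * 0.5, i * 0.25] for i, word in enumerate(VOCAB)}

# A noisy sample near the template for "left".
sample = [2.1, 0.9, 0.55]
print(classify(sample, templates))
```

A real system would derive the feature vectors from the nerve-signal sensors and train the templates on many repetitions per word, but the match-against-a-small-dictionary structure is what limits early systems to a handful of words.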
The scientists plan to try silently commanding a robot with the technology next. With accessibility high on the IT agenda, the team noted: "A logical spin-off would be that handicapped persons could use this system for a lot of things."
In a glimpse of the future of human-computer interfaces, Raskin said in January: "People are too fed up with the complexity of computers. I predict we will see more wearable, head-mounted displays. We will eventually have direct mind input; we could do it now."