Engaging the brain — but where?
In The Count of Monte-Cristo, Monsieur Noirtier de Villefort cannot move or speak but communicates with his granddaughter through words and letters in a book.
CREDIT: PROJECT GUTENBERG
with the outside world. These technologies typically use an implanted device to record the brain waves associated with speech and then use computer algorithms to translate the intended messages. The most exciting advances require no blinking, eye tracking or attempted vocalizations, but instead capture and convey the letters or words a person says silently in their head.
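To make that pipeline concrete, the sketch below is a toy illustration, not code from any of the studies described here: it maps simulated multichannel "neural" activity onto a tiny four-word vocabulary with a standard linear classifier. The vocabulary, the number of channels and trials, and the choice of classifier are all illustrative assumptions; real interfaces record from implanted electrodes and use far more sophisticated decoding models.

```python
# Illustrative sketch only: a toy speech-decoding pipeline.
# Simulated firing-rate features stand in for signals from an implanted device,
# and a linear classifier stands in for the decoding algorithm.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
vocabulary = ["yes", "no", "water", "help"]  # hypothetical word set

# Simulate 400 trials of 64-channel activity; each intended word nudges the
# channels toward a word-specific pattern, plus noise.
n_trials, n_channels = 400, 64
labels = rng.integers(len(vocabulary), size=n_trials)
word_patterns = rng.normal(size=(len(vocabulary), n_channels))
features = word_patterns[labels] + rng.normal(scale=2.0, size=(n_trials, n_channels))

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0
)

# Train a simple decoder and check how often it recovers the intended word.
decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Decoded {decoder.score(X_test, y_test):.0%} of held-out trials correctly")
print("First prediction:", vocabulary[decoder.predict(X_test[:1])[0]])
```

The point of the toy is only the shape of the problem: multichannel brain activity in, a predicted word out, with accuracy measured on trials the decoder has never seen.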
“I feel like this technology really has the potential to help the people who have lost the most, people who are really locked down and cannot communicate at all anymore,” says Sarah Wandelt, a graduate student in computation and neural systems at Caltech in Pasadena. Recent studies by Wandelt and others have provided the first evidence that brain-machine interfaces can decode internal speech. These approaches, while promising, are often invasive, laborious and expensive, and experts agree they will require considerably more development before they can give locked-in patients a voice.
The first step of building a brain-machine interface is deciding which part of the brain to tap. Back when Dumas was young, many believed the contours of a person's skull provided an atlas for understanding the inner workings of the mind. Colorful phrenology charts — with tracts blocked off for human faculties like benevolence, appetite and language — can still be found in antiquated medical texts and the home decor sections of department stores. “We, of course, know that's nonsense now,” says David Bjånes, a neuroscientist and postdoctoral researcher at Caltech. In fact, it's now clear that our faculties and functions emerge from a web of interactions among various brain areas, with each area acting as a node in the neural network. This complexity presents both a challenge and an opportunity: With no one brain region yet found that's responsible for internal language, a number of different regions could be viable targets.
For example, Wandelt, Bjånes and their colleagues found that a part of the parietal lobe called the supramarginal gyrus (SMG), which is typically associated with grasping objects, is also strongly activated during speech. They made the surprising discovery