By Tia Ghose January 29, 2016
A new computer program can decode people’s thoughts almost in real time, new research shows.
Researchers can predict what people are seeing based on the electrical signals coming from electrodes implanted in their brains, and this decoding happens within milliseconds of someone first seeing the image, the scientists found.
“Clinically, you could think of our result as a proof of concept toward building a communication mechanism for patients who are paralyzed or have had a stroke and are completely locked in,” said Rao, a neuroscientist at the University of Washington in Seattle, in a statement.
In recent years, scientists have made tremendous strides in decoding human thoughts. In a 2011 study, researchers translated brain waves into the movie clips people were watching at the time. In 2014, two scientists transmitted thoughts to each other using a brain-to-brain link. And other studies have shown that computers can “see” what people are dreaming about, using only their brain activity.
Rao and his colleagues wanted to see if they could further this effort. They asked seven people with severe epilepsy, who had already undergone surgery to implant electrodes into their temporal lobes, if they would mind having their thoughts decoded. (The patients had the electrodes implanted for a single week so that doctors could pinpoint where the seizures originated within the temporal lobe, which is a common source of seizures, the researchers said.)
“They were going to get the electrodes no matter what; we were just giving them additional tasks to do during their hospital stay while they are otherwise just waiting around,” said study co-author Dr. Jeff Ojemann, a neurosurgeon at the University of Washington Medical Center in Seattle.
The temporal lobe is also the brain region responsible for processing sensory input, such as visualizing and recognizing objects that a person sees.
Rao, Ojemann and their colleagues had the participants watch a computer screen as several images briefly flickered by. The images included pictures of faces and houses, as well as blank screens, and the subjects were told to stay alert and watch for an image of an upside-down house.
At the same time, the electrodes were hooked up to a powerful computer program that analyzed brain signals 1,000 times a second, determining what brain signals looked like when someone was viewing a house versus a face. For the first two-thirds of the images, the computer program got a label, essentially telling it, “This is what brain signals look like when someone views a house.” For the remaining one-third of the pictures, the computer was able to predict, with 96 percent accuracy, what the person actually saw, the researchers reported Jan. 21 in the journal PLOS Computational Biology. What’s more, the computer accomplished this task within 20 milliseconds of the instant the person looked at the object.
It turned out that different neurons fired when people were looking at faces versus when they were looking at houses. It also turned out that the computer needed two types of brain signals to decode the images: an event-related potential and a broadband spectral change. The event-related potential is a characteristic spike in brain cell firing that appears when the brain responds to any stimulus, whereas the broadband spectral change is detected by electrodes as an overall change in power across the brain region.
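The approach described above — extracting a couple of signal features from each electrode trace, training on labeled trials from the first two-thirds of the images, then classifying the rest — can be sketched with simulated data. Everything here is an illustrative assumption: the fake signal shapes, the two stand-in features, and the simple nearest-centroid classifier are not the study's actual decoding algorithm, which ran on real electrode recordings.

```python
import random
import statistics

random.seed(0)

def simulate_trial(kind):
    """Fake 100-sample electrode trace. 'Face' trials get a larger early
    spike (standing in for the event-related potential) and noisier samples
    (standing in for a broadband power change). Purely illustrative."""
    spike = 4.0 if kind == "face" else 2.0
    noise_amp = 1.5 if kind == "face" else 0.8
    return [spike * (1 if 10 <= t < 20 else 0) + random.gauss(0, noise_amp)
            for t in range(100)]

def features(trace):
    """Two features loosely mirroring the paper's two signal types:
    mean voltage in an early window (ERP-like) and overall power
    (broadband-like)."""
    erp = statistics.mean(trace[10:20])
    power = statistics.mean(v * v for v in trace)
    return (erp, power)

# 90 labeled trials, alternating stimuli; first two-thirds train, rest test.
trials = [("face" if i % 2 else "house",
           simulate_trial("face" if i % 2 else "house")) for i in range(90)]
train, test = trials[:60], trials[60:]

# Nearest-centroid "decoder": average feature vector per class.
centroids = {}
for label in ("face", "house"):
    feats = [features(tr) for lab, tr in train if lab == label]
    centroids[label] = tuple(statistics.mean(f[i] for f in feats)
                             for i in range(2))

def predict(trace):
    f = features(trace)
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2
                                   for a, b in zip(f, centroids[lab])))

correct = sum(predict(tr) == lab for lab, tr in test)
print(f"accuracy: {correct}/{len(test)}")
```

Because the simulated classes are well separated on both features, this toy decoder labels nearly every held-out trial correctly; the real study's 96 percent accuracy came from far subtler differences in actual brain signals.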
“Traditionally, scientists have looked at single neurons,” Rao said. “Our study gives a more global picture, at the level of very large networks of neurons, of how a person who is awake and paying attention perceives a complex visual object.”
By allowing researchers to identify, in real time, which parts of the brain respond to certain stimuli, the new technique could help doctors map the entire human brain one day, the researchers said.