Researchers have demonstrated how to decode what the human brain is seeing by using artificial intelligence to interpret fMRI scans of people watching videos, a form of mind-reading technology.
The advance could aid efforts to improve artificial intelligence and lead to new insights into brain function. Critical to the research is a type of algorithm called a convolutional neural network, which has been instrumental in enabling computers and smartphones to recognize faces and objects.
“That type of network has made an enormous impact in the field of computer vision in recent years,” said Zhongming Liu, an assistant professor in Purdue University’s Weldon School of Biomedical Engineering and School of Electrical and Computer Engineering. “Our technique uses the neural network to understand what you are seeing.”
Convolutional neural networks, a form of “deep-learning” algorithm, have been used to study how the brain processes static images and other visual stimuli. However, the new findings represent the first time such an approach has been used to see how the brain processes movies of natural scenes, a step toward decoding the brain while people are trying to make sense of complex and dynamic visual surroundings, said doctoral student Haiguang Wen.
The researchers acquired 11.5 hours of fMRI data from each of three female subjects watching 972 video clips, including clips showing people or animals in action and nature scenes. First, the data were used to train the convolutional neural network model to predict the activity in the brain’s visual cortex while the subjects were watching the videos. Then they used the model to decode fMRI data from the subjects and reconstruct the videos, even clips the model had never seen before.
The model was able to accurately decode the fMRI data into specific image categories. Actual video images were then presented side-by-side with the computer’s interpretation of what the person’s brain saw based on fMRI data.
“For example, a water animal, the moon, a turtle, a person, a bird in flight,” Wen said. “I think what is a unique aspect of this work is that we are doing the decoding nearly in real time, as the subjects are watching the video. We scan the brain every two seconds, and the model rebuilds the visual experience as it occurs.”
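The "nearly in real time" decoding Wen describes amounts to classifying each two-second scan as it arrives. A minimal sketch of that step, assuming the decoder reduces each volume to a feature vector and compares it against per-category templates learned from training videos (the category names come from the article; everything else here is synthetic):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical category "templates": one mean feature vector per category,
# learned from the training clips (random stand-ins here).
categories = ["person", "bird", "turtle", "water_animal", "moon"]
templates = {c: rng.standard_normal(64) for c in categories}

def decode_volume(volume_features):
    """Pick the category whose template best correlates with the
    feature vector decoded from one fMRI volume (one ~2 s scan)."""
    scores = {
        c: np.corrcoef(volume_features, t)[0, 1] for c, t in templates.items()
    }
    return max(scores, key=scores.get)

# Simulate a stream of incoming volumes: each is a noisy copy of one
# template, as if the subject were watching that category on screen.
truth = ["bird", "moon", "person"]
stream = [templates[c] + 0.3 * rng.standard_normal(64) for c in truth]

decoded = [decode_volume(v) for v in stream]
print(decoded)
```

Because each volume is classified independently as it is acquired, the visual experience can be rebuilt scan by scan, matching the every-two-seconds cadence Wen describes.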
The researchers were able to figure out how certain locations in the brain were associated with specific information a person was seeing.
“Neuroscience is trying to map which parts of the brain are responsible for specific functionality,” Wen said. “This is a landmark goal of neuroscience. I think what we report in this paper moves us closer to achieving that goal. A scene with a car moving in front of a building is dissected into pieces of information by the brain: one location in the brain may represent the car; another location may represent the building. Using our technique, you may visualize the specific information represented by any brain location, and screen through all the locations in the brain’s visual cortex. By doing that, you can see how the brain divides a visual scene into pieces, and re-assembles the pieces into a full understanding of the visual scene.”
The researchers were also able to use models trained with data from one human subject to predict and decode the brain activity of a different human subject, a process called cross-subject encoding and decoding. This finding is important because it demonstrates the potential for broad applications of such models to study brain function, even for people with visual deficits.
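Cross-subject encoding can be illustrated by fitting the feature-to-voxel model on one subject and testing its predictions on another. The sketch below idealizes the key assumption, that subjects watching the same clips share similar functional organization, with a common underlying map plus subject-specific noise; none of this data or these shapes come from the study.

```python
import numpy as np

rng = np.random.default_rng(2)
n_frames, n_features, n_voxels = 400, 50, 120

features = rng.standard_normal((n_frames, n_features))
shared_map = rng.standard_normal((n_features, n_voxels))

# Two subjects watching the same clips: a shared functional mapping
# plus independent subject-specific noise (an idealizing assumption).
subj_a = features @ shared_map + 0.2 * rng.standard_normal((n_frames, n_voxels))
subj_b = features @ shared_map + 0.2 * rng.standard_normal((n_frames, n_voxels))

# Fit the encoding model on subject A only (ordinary least squares).
W_a, *_ = np.linalg.lstsq(features, subj_a, rcond=None)

# Cross-subject encoding: predict subject B's responses with A's model.
pred_b = features @ W_a
r = np.corrcoef(pred_b.ravel(), subj_b.ravel())[0, 1]
print(round(r, 2))
```

In practice, anatomical and functional alignment between subjects is far from this clean, which is why cross-subject transfer working at all is a notable result.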
“We think we are entering a new era of machine intelligence and neuroscience where research is focusing on the intersection of these two important fields,” Liu said. “Our mission in general is to advance artificial intelligence using brain-inspired concepts. In turn, we want to use artificial intelligence to help us understand the brain. So, we think this is a good strategy to help advance both fields in a way that otherwise would not be accomplished if we approached them separately.”