This latest research led by CMU’s Marcel Just builds on the pioneering use of machine learning algorithms with brain imaging technology to “mind read.” The findings indicate that the mind’s building blocks for constructing complex thoughts are formed by the brain’s various sub-systems and are not word-based. Published in Human Brain Mapping and funded by the Intelligence Advanced Research Projects Activity (IARPA), the study offers new evidence that the neural dimensions of concept representation are universal across people and languages.
“One of the big advances of the human brain was the ability to combine individual concepts into complex thoughts, to think not just of ‘bananas,’ but ‘I like to eat bananas in the evening with my friends,'” said Just, the D.O. Hebb University Professor of Psychology in the Dietrich College of Humanities and Social Sciences. “We have finally developed a way to see thoughts of that complexity in the fMRI signal. The discovery of this correspondence between thoughts and brain activation patterns tells us what the thoughts are built of.”
Previous work by Just and his team showed that thoughts of familiar objects, like bananas or hammers, evoke activation patterns that involve the neural systems that we use to deal with those objects. For example, how you interact with a banana involves how you hold it, how you bite it and what it looks like.
The new study demonstrates that the brain’s coding of 240 complex events, expressed in sentences such as the shouting-during-the-trial scenario, uses an alphabet of 42 meaning components, or neurally plausible semantic features, such as person, setting, size, social interaction and physical action. Each type of information is processed in a different brain system—which is how the brain also processes the information for objects. By measuring the activation in each brain system, the program can tell what types of thoughts are being contemplated.
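The idea of a semantic "alphabet" can be illustrated with a minimal sketch. The feature names and ratings below are hypothetical stand-ins (the study's actual 42 features and coding scheme are not detailed here); the point is only that a sentence becomes a fixed-length vector whose components each correspond to a type of meaning:

```python
import numpy as np

# Hypothetical illustration: each sentence is characterized by a vector of
# neurally plausible semantic features. The study uses 42; we show five of
# the kinds of features it names.
FEATURES = ["person", "setting", "size", "social_interaction", "physical_action"]

def featurize(ratings: dict) -> np.ndarray:
    """Map per-feature ratings (0 = absent .. 1 = strongly present) to a vector."""
    return np.array([ratings.get(f, 0.0) for f in FEATURES])

# A sentence like "The witness shouted during the trial" involves people,
# a courtroom setting, social interaction, and a physical (vocal) action.
vec = featurize({"person": 1.0, "setting": 1.0,
                 "social_interaction": 1.0, "physical_action": 1.0})
print(vec)  # [1. 1. 0. 1. 1.]
```

Because each feature is tied to a different brain system, a decoder can estimate each component of this vector from the activation measured in the corresponding system.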
For each of seven adult participants, the researchers used a computational model to assess how the brain activation patterns for 239 sentences corresponded to the neurally plausible semantic features that characterized each sentence. The program was then able to decode the features of the 240th, left-out sentence. This procedure was repeated with each of the 240 sentences left out in turn, a technique known as leave-one-out cross-validation.
The model was able to predict the features of the left-out sentence with 87 percent accuracy, despite never having been exposed to its activation pattern before. It was also able to work in the other direction: knowing only a previously unseen sentence's semantic features, it could predict the sentence's activation pattern.
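The reverse direction — predicting an unseen activation pattern from semantic features alone — can be sketched the same way. Again, the data and the ridge model below are toy assumptions, not the study's implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: semantic features linearly generate voxel activations plus noise.
n_sent, n_feat, n_vox = 240, 42, 200
features = rng.normal(size=(n_sent, n_feat))
activation = features @ rng.normal(size=(n_feat, n_vox)) \
             + 0.1 * rng.normal(size=(n_sent, n_vox))

# Hold out the last sentence; fit a features -> activation map on the rest.
X_train, Y_train = features[:-1], activation[:-1]
W = np.linalg.solve(X_train.T @ X_train + np.eye(n_feat), X_train.T @ Y_train)

# Predict the activation pattern of the unseen sentence from its features alone,
# then compare it with the pattern actually "measured" for that sentence.
pred = features[-1] @ W
r = np.corrcoef(pred, activation[-1])[0, 1]
print(f"correlation with the actual pattern: {r:.2f}")
```

The same learned correspondence thus runs both ways: brain pattern to meaning, and meaning to brain pattern.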
“Our method overcomes the unfortunate property of fMRI to smear together the signals emanating from brain events that occur close together in time, like the reading of two successive words in a sentence,” Just said. “This advance makes it possible for the first time to decode thoughts containing several concepts. That’s what most human thoughts are composed of.”
He added, “A next step might be to decode the general type of topic a person is thinking about, such as geology or skateboarding. We are on the way to making a map of all the types of knowledge in the brain.”