Carnegie Mellon University scientists can now use brain activation patterns to identify complex thoughts, such as, “The witness shouted during the trial.”
This latest research led by CMU’s Marcel Just builds on the pioneering use of machine learning algorithms with brain imaging technology to “mind read.” The findings indicate that the mind’s building blocks for constructing complex thoughts are formed by the brain’s various sub-systems and are not word-based. Published in Human Brain Mapping and funded by the Intelligence Advanced Research Projects Activity (IARPA), the study offers new evidence that the neural dimensions of concept representation are universal across people and languages.
“One of the big advances of the human brain was the ability to combine individual concepts into complex thoughts, to think not just of ‘bananas,’ but ‘I like to eat bananas in the evening with my friends,’” said Just, the D.O. Hebb University Professor of Psychology in the Dietrich College of Humanities and Social Sciences. “We have finally developed a way to see thoughts of that complexity in the fMRI signal. The discovery of this correspondence between thoughts and brain activation patterns tells us what the thoughts are built of.”
Previous work by Just and his team showed that thoughts of familiar objects, like bananas or hammers, evoke activation patterns that involve the neural systems that we use to deal with those objects. For example, how you interact with a banana involves how you hold it, how you bite it and what it looks like.
The new study demonstrates that the brain’s coding of 240 complex events (sentences like the shouting-during-the-trial scenario) uses an alphabet of 42 meaning components, or neurally plausible semantic features, such as person, setting, size, social interaction and physical action. Each type of information is processed in a different brain system, which is how the brain also processes the information for objects. By measuring the activation in each brain system, the program can tell what types of thoughts are being contemplated.
For seven adult participants, the researchers used a computational model to assess how the brain activation patterns for 239 sentences corresponded to the neurally plausible semantic features that characterized each sentence. The program was then able to decode the features of the 240th, left-out sentence. The researchers repeated this procedure, leaving out each of the 240 sentences in turn, in what is called leave-one-out cross-validation.
The model was able to predict the features of the left-out sentence with 87 percent accuracy, despite never having been exposed to its activation before. It could also work in the other direction, predicting the activation pattern of a previously unseen sentence from its semantic features alone.
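The leave-one-out procedure described above can be sketched in code. The toy example below is an illustration only, not the study's actual model: it generates synthetic "activation" data as a noisy linear mixture of hypothetical binary semantic features, fits a ridge-regression map from activation to features on 239 sentences, and checks whether the decoded feature vector for the held-out sentence is closer to that sentence's true features than to any other sentence's. All dimensions, the noise level, and the scoring rule are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 240 sentences, 200 voxels, 42 semantic features.
n_sentences, n_voxels, n_features = 240, 200, 42

# Synthetic ground truth: each sentence gets a binary feature vector,
# and "brain activation" is a noisy linear mixture of those features.
F = rng.integers(0, 2, size=(n_sentences, n_features)).astype(float)
mixing = rng.normal(size=(n_features, n_voxels))
X = F @ mixing + 0.5 * rng.normal(size=(n_sentences, n_voxels))

def ridge_fit(A, B, lam=1.0):
    """Solve min ||A W - B||^2 + lam ||W||^2 for the weight matrix W."""
    d = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(d), A.T @ B)

correct = 0
for i in range(n_sentences):
    train = np.arange(n_sentences) != i          # leave sentence i out
    W = ridge_fit(X[train], F[train])            # activation -> features
    pred = X[i] @ W                              # decoded feature vector
    # Identification scoring: is the decoded vector nearest to the true
    # left-out sentence's features among all 240 candidates?
    dists = np.linalg.norm(F - pred, axis=1)
    correct += int(np.argmin(dists) == i)

accuracy = correct / n_sentences
print(f"leave-one-out identification accuracy: {accuracy:.2f}")
```

On clean synthetic data like this the identification accuracy is high; the study's reported 87 percent reflects the much harder setting of real fMRI signals.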
“Our method overcomes the unfortunate property of fMRI to smear together the signals emanating from brain events that occur close together in time, like the reading of two successive words in a sentence,” Just said. “This advance makes it possible for the first time to decode thoughts containing several concepts. That’s what most human thoughts are composed of.”
He added, “A next step might be to decode the general type of topic a person is thinking about, such as geology or skateboarding. We are on the way to making a map of all the types of knowledge in the brain.”