Could a computer, at a glance, tell the difference between a joyful image and a depressing one?
Could it distinguish, in a few milliseconds, a romantic comedy from a horror film?
Yes, and so can your brain, according to research published this week by CU Boulder neuroscientists.
“Machine learning technology is getting really good at recognizing the content of images—of deciphering what kind of object it is,” said senior author Tor Wager, who worked on the study while a professor of psychology and neuroscience at CU Boulder. “We wanted to ask: Could it do the same with emotions? The answer is yes.”
Part machine-learning innovation, part human brain-imaging study, the paper, published Wednesday in the journal Science Advances, marks an important step forward in the application of “neural networks”—computer systems modeled after the human brain—to the study of emotion.
It also sheds new light on how and where images are represented in the human brain, suggesting that what we see—even briefly—could influence our emotions more swiftly and powerfully than we might assume.
“A lot of people assume that humans evaluate their environment in a certain way and emotions follow from specific, ancestrally older brain systems like the limbic system,” said lead author Philip Kragel, a postdoctoral research associate at the Institute of Cognitive Science. “We found that the visual cortex itself also plays an important role in the processing and perception of emotion.”
The birth of EmoNet
For the study, Kragel started with an existing neural network, called AlexNet, which enables computers to recognize objects. Using prior research that identified stereotypical emotional responses to images, he retooled the network to predict how a person would feel when they see a certain image.
He then “showed” the new network, dubbed EmoNet, 25,000 images ranging from erotic photos to nature scenes and asked it to sort them into 20 categories such as craving, sexual desire, horror, awe and surprise.
EmoNet could accurately and consistently categorize 11 of the emotion types. But it was better at recognizing some than others. For instance, it identified photos that evoke craving or sexual desire with more than 95 percent accuracy. But it had a harder time with more nuanced emotions like confusion, awe and surprise.
Even a simple color elicited a prediction of an emotion: When EmoNet saw a black screen, it registered anxiety. Red conjured craving. Puppies evoked amusement. If there were two of them, it picked romance. EmoNet was also able to reliably rate the intensity of images, identifying not only the emotion an image might elicit but how strong it might be.
When the researchers showed EmoNet brief movie clips and asked it to categorize them as romantic comedies, action films or horror movies, it got it right three-quarters of the time.
What you see is how you feel
To further test and refine EmoNet, the researchers then brought in 18 human subjects.
As a functional magnetic resonance imaging (fMRI) machine measured their brain activity, they were shown 4-second flashes of 112 images. EmoNet saw the same pictures, essentially serving as the 19th subject.
When activity in the neural network was compared to that in the subjects’ brains, the patterns matched up.
“We found a correspondence between patterns of brain activity in the occipital lobe and units in EmoNet that code for specific emotions. This means that EmoNet learned to represent emotions in a way that is biologically plausible, even though we did not explicitly train it to do so,” said Kragel.
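One common way to test this kind of correspondence is to ask whether brain activity patterns can linearly predict the model's emotion-unit activations on held-out images. The snippet below is a rough, hypothetical stand-in for that analysis, using random numbers in place of real fMRI and EmoNet data (shapes match the study's 112 images and 20 emotion categories; the voxel count is invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: responses to 112 images from 20 EmoNet emotion
# units and (an assumed) 500 occipital-lobe voxels. Real analyses
# would use measured fMRI signals; these values are random.
n_images, n_units, n_voxels = 112, 20, 500
emonet_acts = rng.standard_normal((n_images, n_units))
brain_acts = rng.standard_normal((n_images, n_voxels))

# Fit a linear map from voxel patterns to each EmoNet unit on a
# training split, then predict activations for held-out images.
train, test = slice(0, 90), slice(90, 112)
weights, *_ = np.linalg.lstsq(brain_acts[train], emonet_acts[train],
                              rcond=None)
predicted = brain_acts[test] @ weights

# Per-unit correlation between predicted and actual activations;
# high values would indicate matching representations.
corrs = [np.corrcoef(predicted[:, u], emonet_acts[test, u])[0, 1]
         for u in range(n_units)]
```

With random inputs the correlations hover near zero; a result like the paper's would show reliably positive correlations for emotion-coding units in visual cortex.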
The brain imaging itself also yielded some surprising findings. Even a brief, basic image – an object or a face – could ignite emotion-related activity in the visual cortex of the brain. And different kinds of emotions lit up different regions.
“This shows that emotions are not just add-ons that happen later in different areas of the brain,” said Wager, now a professor at Dartmouth College. “Our brains are recognizing them, categorizing them and responding to them very early on.”
Ultimately, the researchers say, neural networks like EmoNet could be used in technologies that help people digitally screen out negative images or find positive ones. The approach could also improve human-computer interactions and help advance emotion research.
The takeaway for now, says Kragel:
“What you see and what your surroundings are can make a big difference in your emotional life.”
Learn more: A computer system that knows how you feel