Computers operate self-driving cars, pick friends’ faces out of photos on Facebook, and are learning to take on jobs typically entrusted only to human experts.
Researchers from the University of Wisconsin–Madison and Oak Ridge National Laboratory have trained computers to quickly and consistently detect and analyze microscopic radiation damage to materials under consideration for nuclear reactors. And the computers bested humans in this arduous task.
“Machine learning has great potential to transform the current, human-involved approach of image analysis in microscopy,” says Wei Li, who earned his master’s degree in materials science and engineering this year from UW–Madison.
Many problems in materials science are image-based, yet few researchers have expertise in machine vision — making image recognition and analysis a major research bottleneck. As a student, Li realized that he could leverage training in the latest computational techniques to help bridge the gap between artificial intelligence and materials science research.
Li, with Oak Ridge staff scientist Kevin Field and UW–Madison materials science and engineering professor Dane Morgan, used machine learning to make artificial intelligence better than experienced humans at analyzing damage to potential nuclear reactor materials. The collaborators described their approach in a paper published July 18 in the journal npj Computational Materials.
Machine learning uses statistical methods to guide computers toward improving their performance on a task without following explicit, human-written rules. Essentially, machine learning teaches computers to teach themselves.
“In the future, I believe images from many instruments will pass through a machine learning algorithm for initial analysis before being considered by humans,” says Morgan, who was Li’s graduate school advisor.
The researchers targeted machine learning as a means to rapidly sift through electron microscopy images of materials that had been exposed to radiation, and identify a specific type of damage — a challenging task because the images can resemble a cratered lunar surface or a splatter-painted canvas.
Automating that job, which is absolutely critical to developing safe nuclear materials, could make a time-consuming process far more efficient and effective.
“Human detection and identification is error-prone, inconsistent and inefficient. Perhaps most importantly, it’s not scalable,” says Morgan. “Newer imaging technologies are outstripping human capabilities to analyze the data we can produce.”
Previously, image-processing algorithms depended on human programmers to provide explicit descriptions of an object’s identifying features. Teaching a computer to recognize something simple like a stop sign might involve lines of code describing a red octagonal object.
More complex, however, is articulating all of the visual cues that signal something is, for example, a cat. Fuzzy ears? Sharp teeth? Whiskers? A variety of critters have those same characteristics.
Machine learning now takes a completely different approach.
“It’s a real change of thinking. You don’t make rules. You let the computer figure out what the rules should be,” says Morgan.
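The change of thinking Morgan describes can be sketched with a toy example (hypothetical, not the team's code): a hand-coded rule for flagging a "defect" versus a rule the computer infers on its own from labeled examples.

```python
# Toy illustration of rule-writing vs. rule-learning. The "images" here are
# reduced to single brightness values; a defect is anything darker than some
# cutoff that the programmer does not know in advance.

def hand_coded_rule(brightness):
    """Old approach: a human guesses the cutoff and hard-codes it."""
    return brightness < 0.5  # the programmer's explicit rule

def learn_rule(examples):
    """New approach: infer the cutoff from labeled examples.

    examples: list of (brightness, is_defect) pairs.
    Returns a classifier whose threshold sits halfway between the
    brightest defect and the dimmest non-defect it was shown.
    """
    defects = [b for b, label in examples if label]
    normals = [b for b, label in examples if not label]
    threshold = (max(defects) + min(normals)) / 2
    return lambda brightness: brightness < threshold

# Training data: the computer is never told the rule, only the answers.
training = [(0.1, True), (0.25, True), (0.3, True),
            (0.7, False), (0.8, False), (0.95, False)]

classify = learn_rule(training)
print(classify(0.2))   # dark patch -> True (defect)
print(classify(0.9))   # bright background -> False
```

Real systems learn millions of parameters rather than one threshold, but the division of labor is the same: humans supply labeled answers, and the machine works out the rules.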
Today’s machine learning approaches to image analysis often use programs called neural networks that seem to mimic the remarkable layered pattern-recognition powers of the human brain. To teach a neural network to recognize a cat, scientists simply “train” the program by providing a collection of accurately labeled pictures of various cat breeds. The neural network takes over from there, building and refining its own set of guidelines for the most important features.
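The training loop described above can be sketched at its smallest scale with a single artificial neuron, the basic building block of a neural network. This is an illustrative example, not the researchers' code; the two input features and the labels are invented, and real networks stack many thousands of such units.

```python
# Minimal sketch: training one artificial neuron on labeled examples.
# Each "image" is reduced to two numeric features. The neuron is never told
# what distinguishes the classes; it repeatedly adjusts its weights to
# shrink its error on the labeled training set (gradient descent).
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(examples, epochs=2000, lr=0.5):
    """Fit weights to (features, label) pairs by gradient descent."""
    random.seed(0)
    w = [random.uniform(-0.1, 0.1) for _ in range(2)]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in examples:
            p = sigmoid(w[0] * x1 + w[1] * x2 + b)
            err = p - y                  # gradient of the log-loss
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return lambda x1, x2: sigmoid(w[0] * x1 + w[1] * x2 + b)

# Labeled training set: feature pairs for class 1 ("cat") vs. class 0.
data = [((0.9, 0.8), 1), ((0.8, 0.9), 1), ((0.7, 0.9), 1),
        ((0.1, 0.2), 0), ((0.2, 0.1), 0), ((0.3, 0.2), 0)]

predict = train(data)
print(predict(0.85, 0.85) > 0.5)   # resembles the class-1 examples -> True
print(predict(0.15, 0.15) > 0.5)   # resembles the class-0 examples -> False
```

The key point is that no line of this code describes what a "cat" looks like; the decision boundary emerges entirely from the labeled examples.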
Similarly, Morgan and colleagues taught a neural network to recognize a very specific type of radiation damage, called dislocation loops, which are some of the most common, yet challenging, defects to identify and quantify even for a human with decades of experience.
After training with 270 images, the neural network, combined with another machine learning algorithm called a cascade object detector, correctly identified and classified roughly 86 percent of the dislocation loops in a set of test pictures. For comparison, human experts found 80 percent of the defects.
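Those percentages are recall figures: the fraction of true defects a detector actually finds. The counts below are invented for illustration and simply chosen to match the reported percentages.

```python
# Illustrative recall calculation with hypothetical counts chosen to match
# the reported results (roughly 86% machine vs. 80% human).

def recall(found, total):
    """Fraction of true defects that a detector identified."""
    return found / total

total_loops = 100      # hypothetical number of true dislocation loops
machine_found = 86
human_found = 80

print(f"machine: {recall(machine_found, total_loops):.0%}")  # machine: 86%
print(f"human:   {recall(human_found, total_loops):.0%}")    # human:   80%
```

A fuller evaluation would also report precision (how many flagged loops are real), but the article gives only the detection rates.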
“When we got the final result, everyone was surprised,” says Field, “not only by the accuracy of the approach, but the speed. We can now detect these loops like humans while doing it in a fraction of the time on a standard home computer.”
After he graduated, Li took a job with Google, but the research is ongoing. Morgan and Field are working to expand their training data set and teach a new neural network to recognize different kinds of radiation defects. Eventually, they envision creating a massive cloud-based resource for materials scientists around the world to upload images for near-instantaneous analysis.
“This is just the beginning,” says Morgan. “Machine learning tools will help create a cyber infrastructure that scientists can utilize in ways we are just beginning to understand.”