Computers operate self-driving cars, pick friends’ faces out of photos on Facebook, and are learning to take on jobs typically entrusted only to human experts.
Researchers from the University of Wisconsin–Madison and Oak Ridge National Laboratory have trained computers to quickly and consistently detect and analyze microscopic radiation damage to materials under consideration for nuclear reactors. And the computers bested humans in this arduous task.
“Machine learning has great potential to transform the current, human-involved approach of image analysis in microscopy,” says Wei Li, who earned his master’s degree in materials science and engineering this year from UW–Madison.
Many problems in materials science are image-based, yet few researchers have expertise in machine vision — making image recognition and analysis a major research bottleneck. As a student, Li realized that he could leverage training in the latest computational techniques to help bridge the gap between artificial intelligence and materials science research.
Li, with Oak Ridge staff scientist Kevin Field and UW–Madison materials science and engineering professor Dane Morgan, used machine learning to make artificial intelligence better than experienced humans at analyzing damage to potential nuclear reactor materials. The collaborators described their approach in a paper published July 18 in the journal npj Computational Materials.
Machine learning uses statistical methods to guide computers toward improving their performance on a task without receiving any explicit guidance from a human. Essentially, machine learning teaches computers to teach themselves.
“In the future, I believe images from many instruments will pass through a machine learning algorithm for initial analysis before being considered by humans,” says Morgan, who was Li’s graduate school advisor.
The researchers targeted machine learning as a means to rapidly sift through electron microscopy images of materials that had been exposed to radiation, and identify a specific type of damage — a challenging task because the images can resemble a cratered lunar surface or a splatter-painted canvas.
Automating that job, which is critical to developing safe nuclear materials, could make a time-consuming process much more efficient and effective.
“Human detection and identification is error-prone, inconsistent and inefficient. Perhaps most importantly, it’s not scalable,” says Morgan. “Newer imaging technologies are outstripping human capabilities to analyze the data we can produce.”
Previously, image-processing algorithms depended on human programmers to provide explicit descriptions of an object’s identifying features. Teaching a computer to recognize something simple like a stop sign might involve lines of code describing a red octagonal object.
More complex, however, is articulating all of the visual cues that signal something is, for example, a cat. Fuzzy ears? Sharp teeth? Whiskers? A variety of critters have those same characteristics.
Machine learning now takes a completely different approach.
“It’s a real change of thinking. You don’t make rules. You let the computer figure out what the rules should be,” says Morgan.
Today’s machine learning approaches to image analysis often use programs called neural networks that seem to mimic the remarkable layered pattern-recognition powers of the human brain. To teach a neural network to recognize a cat, scientists simply “train” the program by providing a collection of accurately labeled pictures of various cat breeds. The neural network takes over from there, building and refining its own set of guidelines for the most important features.
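The train-on-labeled-examples idea can be illustrated with a deliberately tiny sketch. The toy data and single-neuron model below are hypothetical stand-ins, not the researchers' actual pipeline: each "image" is a 3-by-3 grid of pixels, labeled 1 if it contains a bright center blob (standing in for a defect) and 0 otherwise. No detection rules are hand-written; the weights are adjusted from the labeled examples alone.

```python
# A minimal sketch of supervised image classification. All data and
# names here are hypothetical toy examples for illustration only.

# Tiny 3x3 "images": 1 = bright pixel. Label 1 = bright center blob
# (a stand-in for a defect), label 0 = scattered background noise.
TRAINING_DATA = [
    ([0, 0, 0, 0, 1, 0, 0, 0, 0], 1),
    ([0, 1, 0, 1, 1, 1, 0, 1, 0], 1),
    ([1, 0, 0, 0, 0, 0, 0, 0, 1], 0),
    ([0, 0, 1, 0, 0, 0, 1, 0, 0], 0),
]

def predict(weights, bias, pixels):
    """A single artificial neuron: weighted sum thresholded at zero."""
    score = bias + sum(w * p for w, p in zip(weights, pixels))
    return 1 if score > 0 else 0

def train(data, epochs=20, lr=0.1):
    """Perceptron rule: nudge the weights toward each mislabeled
    example. The program refines its own guidelines for which pixel
    patterns matter; no rules are written by hand."""
    weights, bias = [0.0] * 9, 0.0
    for _ in range(epochs):
        for pixels, label in data:
            error = label - predict(weights, bias, pixels)
            if error:
                weights = [w + lr * error * p
                           for w, p in zip(weights, pixels)]
                bias += lr * error
    return weights, bias

weights, bias = train(TRAINING_DATA)
accuracy = sum(predict(weights, bias, px) == y
               for px, y in TRAINING_DATA) / len(TRAINING_DATA)
print(f"training accuracy: {accuracy:.0%}")
```

Real defect detectors replace this single neuron with deep convolutional networks and thousands of labeled micrographs, but the principle is the same: the model infers its own discriminating features from the labels.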
Similarly, Morgan and colleagues taught a neural network to recognize a very specific type of radiation damage, called dislocation loops, which are some of the most common, yet challenging, defects to identify and quantify even for a human with decades of experience.
After training with 270 images, the neural network, combined with another machine learning algorithm called a cascade object detector, correctly identified and classified roughly 86 percent of the dislocation loops in a set of test pictures. For comparison, human experts found 80 percent of the defects.
“When we got the final result, everyone was surprised,” says Field, “not only by the accuracy of the approach, but the speed. We can now detect these loops like humans while doing it in a fraction of the time on a standard home computer.”
After he graduated, Li took a job with Google, but the research is ongoing. Morgan and Field are working to expand their training data set and teach a new neural network to recognize different kinds of radiation defects. Eventually, they envision creating a massive cloud-based resource for materials scientists around the world to upload images for near-instantaneous analysis.
“This is just the beginning,” says Morgan. “Machine learning tools will help create a cyber infrastructure that scientists can utilize in ways we are just beginning to understand.”