Machine-learning techniques that mimic human recognition and dreaming processes are being deployed in the search for habitable worlds beyond our solar system. A deep belief neural network, called RobERt (Robotic Exoplanet Recognition), has been developed by astronomers at UCL to sift through detections of light emanating from distant planetary systems and retrieve spectral information about the gases present in the exoplanet atmospheres.
RobERt will be presented at the National Astronomy Meeting (NAM) 2016 in Nottingham by Dr Ingo Waldmann on Tuesday 28th June.
“Different types of molecules absorb and emit light at specific wavelengths, embedding a unique pattern of lines within the electromagnetic spectrum,” explained Dr Waldmann, who leads RobERt’s development team. “We can take light that has been filtered through an exoplanet’s atmosphere or reflected from its cloud-tops, split it like a rainbow and then pick out the ‘fingerprint’ of features associated with the different molecules or gases. Human brains are really good at finding these patterns in spectra and labelling them from experience, but it’s a really time-consuming job and there will be huge amounts of data.
“We built RobERt to independently learn from examples and to build on his own experiences. This way, like a seasoned astronomer or a detective, RobERt has a pretty good feeling for which molecules are inside a spectrum and which are the most promising data for more detailed analysis. But what usually takes days or weeks takes RobERt mere seconds.”
Deep belief neural networks, or DBNs, were developed more than a decade ago and are commonly used for speech recognition, Internet searches and tracking customer behaviour. RobERt’s DBN has three layers of unit processors, or ‘neurons’. Information is fed into a bottom layer of 500 neurons, which performs an initial filtering of the data and passes a subset up to the second layer. Here, 200 neurons refine the selection and pass data up to a third layer of 50 neurons, which makes the final identification of the gases most likely to be present.
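The bottom-up flow through the three layers can be sketched in a few lines of Python. This is only an illustration: the 500/200/50 layer sizes come from the article, but the input dimensionality, sigmoid units and random weights (which a real DBN would learn by pre-training each layer as a restricted Boltzmann machine) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical input: a spectrum binned into 1000 wavelength channels.
n_input = 1000
layer_sizes = [500, 200, 50]  # sizes reported for RobERt's three layers

# Random weights stand in for values a trained network would have learned.
weights, biases = [], []
size_in = n_input
for size_out in layer_sizes:
    weights.append(rng.normal(0.0, 0.01, (size_in, size_out)))
    biases.append(np.zeros(size_out))
    size_in = size_out

def forward(spectrum):
    """Pass a spectrum up through the stack; the 50-unit top layer
    scores which gas species are most likely present."""
    h = spectrum
    for W, b in zip(weights, biases):
        h = sigmoid(h @ W + b)
    return h

spectrum = rng.random(n_input)
top = forward(spectrum)
print(top.shape)  # (50,)
```

In a real DBN the layers are first trained one at a time as RBMs and then fine-tuned together, so the forward pass above is the easy half of the story.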
To prepare RobERt for his challenge, Waldmann and colleagues at UCL created a total of 85,750 simulated spectra, covering five different types of exoplanet ranging from GJ1214b, a potential “ocean planet”, to WASP-12b, a hot Jupiter orbiting very close to its star. Each spectrum in the training set contained the fingerprint of a single gas species. RobERt’s learning progress was tested at intervals during the training with ‘control’ spectra. At the end of the training phase, RobERt had a recognition accuracy of 99.7%.
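The ‘control’ check described above amounts to measuring the fraction of held-out spectra whose predicted gas matches the known label. A minimal sketch, with made-up labels and predictions:

```python
import numpy as np

# Hypothetical held-out control spectra: true gas labels vs. the
# network's predictions (all values here are invented for illustration).
true_gas  = np.array(["H2O", "CH4", "CO2", "CO", "NH3", "H2O"])
predicted = np.array(["H2O", "CH4", "CO2", "CO", "NH3", "CH4"])

# Recognition accuracy = fraction of correct identifications.
accuracy = np.mean(true_gas == predicted)
print(f"{accuracy:.1%}")  # 83.3%
```

Applied to the real control set, this is the figure that reached 99.7% at the end of RobERt’s training.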
“RobERt has learned to take into account factors such as noise, restricted wavelength ranges and mixtures of gases,” said Waldmann. “He can pick out components such as water and methane in a mixed atmosphere with a high probability, even when the input comes from the limited wavebands that most space instruments provide and when it contains overlapping features.”
RobERt’s DBN can also be reversed so that instead of analysing data fed into the system, he can enter a ‘dreaming state’ in which he can generate full spectra based on his experiences.
“Robots really do dream. We can ask RobERt to dream up what he thinks a water spectrum will look like, and he’s proved very accurate,” said Waldmann. “This dreaming ability has been very useful when trying to identify features in incomplete data. RobERt can use his dream state to fill in the gaps. The James Webb Space Telescope, due for launch in 2018, will tell us more about the atmospheres of exoplanets, and new facilities like Twinkle or ARIEL will be coming online over the next decade that are specifically tailored to characterising the atmospheres of exoplanets. The amount of data these missions will provide will be breathtaking. RobERt will play an invaluable role in helping us to analyse data from these missions and find out what these distant worlds are really like.”
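The ‘dreaming state’ described above can be sketched as running the same stack in reverse: clamp a pattern on the top layer and propagate it back down through the transposed weights to synthesise a spectrum. Again the 500/200/50 layer sizes are from the article, while the input size, the random weights and the choice of which top-layer unit stands for water are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_input = 1000                 # hypothetical number of spectral channels
layer_sizes = [500, 200, 50]   # RobERt's reported layer sizes

# Random stand-ins for trained weights, one matrix per layer.
weights = []
size_in = n_input
for size_out in layer_sizes:
    weights.append(rng.normal(0.0, 0.01, (size_in, size_out)))
    size_in = size_out

def dream(top_activation):
    """Propagate a top-layer pattern back down through the transposed
    weights, yielding a synthetic ('dreamed') spectrum."""
    h = top_activation
    for W in reversed(weights):
        h = sigmoid(h @ W.T)
    return h

# Activate one (hypothetical) top-layer unit, e.g. the one for water.
top = np.zeros(50)
top[0] = 1.0
synthetic_spectrum = dream(top)
print(synthetic_spectrum.shape)  # (1000,)
```

With trained weights, the dreamed output approximates the spectrum of the chosen gas, which is what lets RobERt fill in gaps in incomplete data.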