The brain power of neuromorphic computing
Imagine yourself as a leading researcher.
How many academic papers do you think you can read in a week? In a good week, with no other urgent work or distractions, even the best could probably get through about ten.
Unfortunately, an estimated 8,000 papers are published every single day — so keeping up with the latest research developments can be likened to surfing a tsunami.
“Even if you restrict your search to a specific field, the number of papers being published — and the knowledge contained in those papers — is increasing exponentially. Not only that, most of the data being produced is unstructured, making it difficult for us to make sense of it,” said Dr Alessandro Curioni, Vice President of IBM Europe and Director of IBM Research-Zurich.
For Dr Curioni, who was speaking at the Supercomputing Frontiers 2017 conference held in Singapore from 13 to 16 March 2017, the solution is obvious: train computers to read papers for us, condensing and presenting the information in such a way that we can quickly understand it.
In the course of his 20-year career in high performance computing, he has seen simulation become entrenched as the third pillar of science, alongside theory and experimentation.
“Without a doubt, simulation is now considered an important part of research and development in many fields, from molecular simulation to computational fluid dynamics. Over the years, we as a community have made simulations easier to use, improved accuracy and fidelity, and enhanced the throughput of these simulations,” said Dr Curioni, a two-time winner of the prestigious Gordon Bell Prize.
As essential as simulations have become to the scientific enterprise, they have also added a new dimension of complexity to solving research questions.
“The problem is that when you apply these simulations to a real R&D environment, they add additional data points and enlarge the scope of the question. It turns out that we cannot beat complexity by brute force simulation.”
At the heart of the matter is the fact that humans can only process information linearly, while knowledge is increasing at an exponential rate.
Instead, IBM is pursuing what it calls ‘cognitive discovery’: using machine learning to rapidly process information and build a knowledge graph that shows researchers where the gaps in existing knowledge lie. This in turn guides the simulations that need to be run and narrows down the number of experiments that need to be performed, he said.
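The gap-finding role of a knowledge graph can be sketched in miniature. The snippet below is a toy illustration, not IBM's system: it stores concepts and their relations as Python sets and flags pairs of concepts that share a neighbour but have no direct link, one simple way a graph can point at unexplored connections. All concept names are hypothetical.

```python
# Toy knowledge graph: concept -> set of directly related concepts.
# A "gap" here is a pair of concepts that share a neighbour but have
# no direct link -- a candidate connection worth investigating.

def find_gaps(graph):
    """Return concept pairs that share a neighbour but are not linked."""
    gaps = set()
    for a in graph:
        for b in graph:
            # a < b avoids duplicates; skip pairs already linked
            if a < b and b not in graph[a]:
                if graph[a] & graph[b]:  # at least one common neighbour
                    gaps.add((a, b))
    return gaps

# Tiny literature-derived graph (hypothetical concepts).
graph = {
    "solvent_X": {"catalyst_A"},
    "catalyst_A": {"solvent_X", "yield_boost"},
    "yield_boost": {"catalyst_A"},
}

print(find_gaps(graph))  # {('solvent_X', 'yield_boost')}
```

Here the graph "knows" that solvent_X relates to catalyst_A and catalyst_A to yield_boost, so the unlinked pair (solvent_X, yield_boost) surfaces as a gap worth a targeted simulation or experiment.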
“What is done today is that you start with a research team that has limited knowledge about the problem, and they build an empirical simulation using their experience. To go from the initial idea to results can often take many years and multiple failures,” Dr Curioni explained.
“Cognitive discovery can reduce this process to a matter of weeks or even days. Now that the field is mature, artificial intelligence, quantum computing and big data can start making a big difference for all of us and make our field much more impactful.”
The Quantum of Brain Power
Complementing IBM Research’s cognitive discovery approach is its attempt to build computers inspired by the human brain, known as neuromorphic computing. According to Dr Curioni, IBM Research is simultaneously working on three aspects of neuromorphic computing: connectivity, neuron-inspired chips and brain-like packaging.
The most advanced of these efforts is the TrueNorth chip built by the SyNAPSE team, he shared. Each chip contains a million ‘neurons’ and 256 million ‘synaptic’ connections, and a full-scale ‘brain’ of ten billion neurons built from such chips is expected to consume less than one kilowatt, an unprecedentedly low power draw.
“We have demonstrated that once a network is trained using these neurons, it can perform different types of classification three orders of magnitude more efficiently than standard computing,” Dr Curioni said. “But we didn’t stop there. More recently we have been working on the first artificial neurons based on phase-change memory, which can do signal processing without any supervised learning.”
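A flavour of how a spiking ‘neuron’ computes can be given with a leaky integrate-and-fire model. This is a textbook abstraction, not the actual design of TrueNorth or the phase-change devices, and the leak and threshold values below are purely illustrative.

```python
# Minimal leaky integrate-and-fire neuron: the membrane potential
# decays (leaks) each time step, accumulates the incoming current,
# and emits a spike -- then resets -- when it crosses a threshold.
# Parameters are illustrative, not taken from any real chip.

def lif_run(inputs, leak=0.9, threshold=1.0):
    """Return a list of 0/1 spikes for a sequence of input currents."""
    potential, spikes = 0.0, []
    for current in inputs:
        potential = potential * leak + current  # leak, then integrate
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0  # reset after spiking
        else:
            spikes.append(0)
    return spikes

print(lif_run([0.5, 0.5, 0.5, 0.0, 1.2]))  # [0, 0, 1, 0, 1]
```

The appeal for hardware is that such a neuron is event-driven: it only produces output when its potential crosses the threshold, which is part of why neuromorphic chips can run at such low power.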
One other way that the brain has inspired IBM researchers, Dr Curioni added, is in how it uses a single fluid, blood, to deliver both energy and cooling. Most computing structures, in contrast, use electrical circuitry to supply power and a separate fluid for cooling, a split that keeps researchers from increasing the density and efficiency of existing computers.
For Dr Curioni, however, the most exciting project at IBM is IBM Q, the company’s attempt to provide quantum computing commercially. Announced earlier this month (6 March), IBM Q offers 20 qubits of computing power on the cloud to enable users to tackle problems that are too large and complex for classical computers to handle.
“With quantum computing, every qubit that you add to your system doubles the performance of the whole system. So with a linear scaling of the system you get an exponential increase in performance,” Dr Curioni said.
“In other words, quantum computing can help us bring Moore’s Law back.”