Neuromorphic Computing: How the Brain-Inspired Technology Powers the Next Generation of Artificial Intelligence

As a remarkable product of evolution, the human brain has a baseline energy footprint of about 20 watts, yet it can process complex tasks in milliseconds. Today's CPUs and GPUs dramatically outperform the human brain at serial processing tasks. However, moving data from memory to a processor and back creates latency and expends enormous amounts of energy.

Neuromorphic systems attempt to imitate how the human nervous system operates. This field of engineering tries to mimic the structure of biological sensing and information-processing systems. In other words, neuromorphic computing implements aspects of biological neural networks as analogue or digital copies on electronic circuits.

Neuromorphics is not a new concept. Like many other emerging technologies that are only now gaining momentum, it has been quietly under development for a long time, waiting for its moment to shine.

Over 30 years ago, in the late 1980s, Professor Carver Mead, an American scientist, engineer, and microprocessor pioneer, developed the concept of neuromorphic engineering, also known as neuromorphic computing.

Neuromorphic engineering describes the use of very-large-scale integration (VLSI) systems containing electronic analog circuits, arranged in a way that mimics the neurobiological architectures present in the human nervous system.

Neuromorphic computing draws its inspiration from the human brain's architecture and dynamics to create energy-efficient hardware for information processing, making it capable of highly sophisticated tasks.

Neuromorphic computing includes the production and use of neural networks. It takes its inspiration from the human brain, with the goal of designing computer chips that merge memory and processing. In the human brain, synapses provide direct memory access to the neurons that process information.

For decades, electrical engineers have been fascinated by biophysics, neural computation, and the development of practical mixed-signal circuits for artificial neural networks. The challenge lies in working across a broad range of disciplines, spanning from electron devices to algorithms. However, neuromorphic systems promise practical usefulness in everyday life, and this alone makes the effort worthwhile.

"Artificial Intelligence (AI) needs new hardware, not just new algorithms. Were at a turning point, where Moores law is reaching its end leading to a stagnation of the performance of our computers. Nowadays, we are generating more and more data that needs to be stored and classified,"said Professor Dmitri Strukov, an electrical engineer at the University of California at Santa Barbara in an interview with Nature Communications about the opportunities and challenges in developing brain-inspired technologies, namely neuromorphic computing, when asked why we need neuromorphic computing.

Strukov went on to tell Nature Communications how recent progress in AI has allowed this process to be automated, with data centers multiplying at the cost of an exponentially increasing amount of electricity, a potential problem for our environment. "This energy consumption mainly comes from data traffic between memory and processing units that are separated in computers," said Strukov.

"It wastes electrical energy and it considerably slows down computational speed. Recent developments in nanotechnology offer the possibility to bring huge amounts of memory close to processing, or even better, to integrate this memory directly in the processing unit,said Dmitri Strukov.

According to Strukov, the idea of neuromorphic computing is to take inspiration from the brain in designing computer chips that merge memory and processing. In the brain, synapses provide direct memory access to the neurons that process information. That is how the brain achieves impressive computational power and speed with very little power consumption. By imitating this architecture, neuromorphic computing provides a path to building smart neuromorphic chips that consume very little energy while computing fast.
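To make the spiking-neuron idea concrete, here is a minimal sketch of a leaky integrate-and-fire neuron, the basic unit that neuromorphic chips implement in silicon. The weights, threshold, and leak constant are illustrative assumptions, not the parameters of any real chip; the point is that the neuron's memory (its synaptic weights and membrane potential) lives with the computation rather than in a separate memory bank.

```python
# Minimal sketch of a leaky integrate-and-fire neuron (illustrative values).
import numpy as np

def simulate_lif(input_spikes, weights, threshold=1.0, leak=0.9):
    """Simulate one leaky integrate-and-fire neuron over discrete time steps.

    input_spikes: (T, N) array of 0/1 spikes from N presynaptic neurons.
    weights:      (N,) synaptic weights stored locally at the neuron.
    Returns a length-T array of the neuron's output spikes.
    """
    potential = 0.0
    out = np.zeros(len(input_spikes))
    for t, spikes in enumerate(input_spikes):
        potential = leak * potential + weights @ spikes  # leaky integration
        if potential >= threshold:                       # fire on threshold
            out[t] = 1.0
            potential = 0.0                              # reset after a spike
    return out

rng = np.random.default_rng(0)
presyn = (rng.random((100, 8)) < 0.2).astype(float)  # random input activity
print(int(simulate_lif(presyn, rng.normal(0.3, 0.1, 8)).sum()), "output spikes")
```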

To some, it may seem that neuromorphic computing belongs to a distant future. However, neuromorphic technology is here, closer than you might think. Beyond research and futuristic speculation, Intel's Neuromorphic Computing Lab created a self-learning neuromorphic research chip initially under the code name Loihi (pronounced low-ee-hee). Loihi, Intel's fifth neuromorphic chip, was announced in September 2017 as a predominantly research chip. Since then, it has come a long way.

As an interesting related fact, Intel's chosen name for the chip, Lōʻihi, means 'long' in Hawaiian; it is also the name of the newest (sometimes referred to as the youngest) active submarine volcano in the Hawaiian-Emperor seamount chain, a string of volcanoes that stretches about 6,200 km (3,900 miles) to the northwest of Lōʻihi.

Now back to the chip. Loihi is a neuromorphic manycore processor with on-chip learning. Intel's 14-nanometer Loihi chip contains over 2 billion transistors, 130,000 artificial neurons, and 130 million synapses.

The Loihi chip integrates a wide range of features that are novel for the field, such as programmable synaptic learning rules. According to Intel, the neuromorphic chip is a next-generation Artificial Intelligence enabler.

The abstract of the paper "Loihi: A Neuromorphic Manycore Processor with On-Chip Learning," published in IEEE Micro, reads:

Loihi is a 60-mm² chip fabricated in Intel's 14-nm process that advances the state-of-the-art modeling of spiking neural networks in silicon. It integrates a wide range of novel features for the field, such as hierarchical connectivity, dendritic compartments, synaptic delays, and, most importantly, programmable synaptic learning rules. Running a spiking convolutional form of the Locally Competitive Algorithm, Loihi can solve LASSO optimization problems with over three orders of magnitude superior energy-delay product compared to conventional solvers running on a CPU iso-process/voltage/area. This provides an unambiguous example of spike-based computation, outperforming all known conventional solutions.
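For readers curious about the algorithm named in the abstract, below is a rough, non-spiking sketch of the Locally Competitive Algorithm solving a small LASSO problem. The random dictionary, step size, and threshold are illustrative assumptions; Loihi runs a spiking convolutional variant, which this plain NumPy version only approximates. The core idea is that neurons receiving feed-forward input inhibit one another until the surviving activations form a sparse solution.

```python
# Non-spiking sketch of the Locally Competitive Algorithm (LCA) for
# LASSO: min_a 0.5*||x - D a||^2 + lam*||a||_1 (illustrative parameters).
import numpy as np

def soft_threshold(u, lam):
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def lca_lasso(x, D, lam=0.1, tau=0.1, steps=500):
    b = D.T @ x                        # feed-forward drive to each neuron
    G = D.T @ D - np.eye(D.shape[1])   # lateral inhibition between neurons
    u = np.zeros(D.shape[1])           # membrane potentials
    for _ in range(steps):
        a = soft_threshold(u, lam)     # active neurons inhibit their rivals
        u += tau * (b - u - G @ a)     # leaky dynamics with competition
    return soft_threshold(u, lam)

rng = np.random.default_rng(1)
D = rng.normal(size=(20, 50))
D /= np.linalg.norm(D, axis=0)         # unit-norm dictionary atoms
a_true = np.zeros(50)
a_true[[3, 17]] = 1.0                  # sparse ground truth
a_hat = lca_lasso(D @ a_true, D)
print("recovered support:", np.flatnonzero(np.abs(a_hat) > 0.05))
```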

Most recently, Intel and Sandia National Laboratories signed a three-year agreement to explore the value of neuromorphic computing for scaled-up Artificial Intelligence problems.

According to Intel, Sandia will kick off its research using a 50-million-neuron Loihi-based system delivered to its facility in Albuquerque, New Mexico. This initial work with Loihi will lay the foundation for the later phase of the collaboration, which is expected to include continued large-scale neuromorphic research on Intel's upcoming next-generation neuromorphic architecture and the delivery of Intel's largest neuromorphic research system to date, which could exceed 1 billion neurons in computational capacity.

Upon the release of the agreement, Mike Davies, Director of Intel's Neuromorphic Computing Lab, said: "By applying the high-speed, high-efficiency, and adaptive capabilities of neuromorphic computing architecture, Sandia National Labs will explore the acceleration of high-demand and frequently evolving workloads that are increasingly important for our national security. We look forward to a productive collaboration leading to the next generation of neuromorphic tools, algorithms, and systems that can scale to the billion neuron level and beyond."

Clearly, there are great expectations for what neuromorphic technology promises. While most neuromorphic research to date has focused on the technology's promise for edge use cases, new developments show that neuromorphic computing could also provide value for large, complex computational problems that require real-time processing, problem solving, adaptation, and, fundamentally, learning.

Intel, as a leader in neuromorphic research, is actively exploring this potential by releasing a 100-million-neuron system, Pohoiki Springs, to the Intel Neuromorphic Research Community (INRC). Initial research conducted on Pohoiki Springs demonstrates that neuromorphic computing can provide up to four orders of magnitude better energy efficiency for constraint satisfaction, a standard high-performance computing problem, compared to state-of-the-art CPUs.
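For context, constraint satisfaction means finding an assignment of values that violates none of a set of constraints. The sketch below solves a toy graph-coloring instance with a conventional min-conflicts search on a CPU, purely to illustrate the problem class; it is not the spiking approach Intel benchmarked on Pohoiki Springs, and the graph, color count, and step budget are illustrative assumptions.

```python
# Toy constraint-satisfaction problem: color a graph so that no edge
# connects two nodes of the same color (conventional CPU baseline only).
import random

def min_conflicts_coloring(edges, n_nodes, n_colors=3, steps=10_000, seed=0):
    rng = random.Random(seed)
    colors = [rng.randrange(n_colors) for _ in range(n_nodes)]
    neighbors = [[] for _ in range(n_nodes)]
    for u, v in edges:
        neighbors[u].append(v)
        neighbors[v].append(u)

    def conflicts(node, color):  # neighbors sharing this color
        return sum(colors[n] == color for n in neighbors[node])

    for _ in range(steps):
        bad = [u for u in range(n_nodes) if conflicts(u, colors[u]) > 0]
        if not bad:
            return colors                   # all edge constraints satisfied
        node = rng.choice(bad)              # repair a random violated node
        colors[node] = min(range(n_colors), key=lambda c: conflicts(node, c))
    return None                             # step budget exhausted

# A 5-cycle is 3-colorable; any returned coloring satisfies every constraint.
print(min_conflicts_coloring([(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)], 5))
```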

One goal of the joint effort is to better understand how emerging technologies, such as neuromorphic computing, can be used as a tool to address some of today's most pressing scientific and engineering challenges.

These challenges include problems in scientific computing, counterproliferation, counterterrorism, energy, and national security. The possibilities are diverse, perhaps unlimited, and extend well beyond the applications one might have imagined at the start.

Advancing research in scaled-up neuromorphic computing is, at this point, paramount to determining where these systems are most effective and how they can provide real-world value. For starters, this upcoming research will evaluate the scaling of a variety of spiking neural network workloads, from physics modeling to graph analytics to large-scale deep networks.

According to Intel, these sorts of problems are useful for performing scientific simulations such as modeling particle interactions in fluids, plasmas, and materials. Moreover, these physics simulations increasingly need to leverage advances in optimization, data science, and advanced machine learning capabilities in order to find the right solutions.

Accordingly, potential applications for these workloads include simulating the behavior of materials, finding patterns and relationships in datasets, and analyzing temporal events from sensor data. This is just the beginning; it remains to be seen what real-life applications will emerge.

The fact that neuromorphic systems are designed to mimic the human brain raises important ethical questions. Neuromorphic chips used in Artificial Intelligence have, indeed, more in common with human cognition than with conventional computer logic.

What perceptions, attitudes, and implications might this bring in the future, when a person encounters a machine in the room whose neural networks bear more similarity to a human's neural networks than to a microprocessor?

While neuromorphic technology is still in its infancy, the field is advancing rapidly. In the near future, commercially available neuromorphic chips will most likely have an impact on edge devices, robotics, and Internet of Things (IoT) systems. Neuromorphic computing is on its way toward low-power, miniaturized chips that can infer and learn in real time. Indeed, we can expect exciting times ahead in the field of neuromorphic computing.
