Neuromorphic Computing for Artificial Intelligence

Free Topic: Christopher A. Leidich is sharing his interest in, and findings on, an emerging technology.

Neuromorphic Computing Promises to Open Exciting New Possibilities

Neuromorphic computing, an approach to building intelligent machines, is evolving into a method of computer engineering in which elements of a computer are modeled after systems in the human brain and nervous system. The term refers to the design of both hardware and software computing elements. Neuromorphic computing is sometimes referred to as neuromorphic engineering[1]. Neuromorphic computing includes new algorithmic approaches that emulate how the human brain interacts with the world to deliver capabilities closer to human cognition.

Neuromorphic computing promises to open exciting new possibilities and is already in use in a variety of areas, including sensing, robotics, healthcare, and large-scale AI applications[2]. Neuromorphic computing offers a means of designing AI systems capable of learning and adapting in a way more similar to human cognition than traditional AI algorithms. It aims to address the challenges of next-generation AI by providing a brain-inspired, energy-efficient computing paradigm that focuses primarily on the ‘thinking’ and ‘processing’ side of these human-like systems. Inspired by the human brain, neuromorphic computing technologies have made important breakthroughs in recent years as alternatives that overcome the power and latency shortfalls of traditional digital computing[2].

A Snapshot of the History

Limits of the von Neumann Computing Paradigm

The von Neumann computing paradigm, also known as the von Neumann model or Princeton architecture, is a computer architecture based on a 1945 description by John von Neumann and others. Artificial intelligence (machines with human-brain-like capabilities) and machine learning (machines getting smarter by learning through experience) have reached an inflection point thanks to the digitization of massive amounts of information. Moore’s law estimates that the number of transistors on an Integrated Circuit (IC) doubles about every two years [3].

Moore’s law drove performance and power efficiency to new heights, but scaling has since saturated as it hits fundamental physical limits, forcing architects and designers to revisit their computing paradigms. The power wall and the memory wall are now recognized as fundamental limits of the von Neumann computing paradigm.

Neuromorphic Computing Paradigm

With the end of Moore’s law approaching and transistor density reaching its limits, the computing community is increasingly looking to new technologies to enable continued performance improvements. Neuromorphic computers are one such technology. Neuromorphic computing is a computing approach inspired by the brain’s structure and function. A neuromorphic computer is composed of neurons and synapses, and its programs are defined by the structure of the neural network and its parameters, rather than by explicit instructions as in a von Neumann computer. The term neuromorphic was coined by Carver Mead in the late 1980s. In 1990, Carver Mead projected that custom analog mixed-signal Matrix-Vector Multiplications (MVM) would be thousands of times more energy efficient than custom digital computation[3].
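To make the idea of a network-defined program concrete, the minimal sketch below simulates a small layer of leaky integrate-and-fire (LIF) neurons in plain Python. It is not tied to any particular neuromorphic chip, and all parameter values (weights, threshold, leak factor) are illustrative; the point is that the “program” lives entirely in the weight matrix and neuron parameters rather than in a list of instructions.

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) layer.
# The "program" is the weight matrix plus the neuron parameters,
# not a sequence of explicit instructions.
rng = np.random.default_rng(0)

n_in, n_out = 4, 3
weights = rng.uniform(0.0, 1.0, size=(n_out, n_in))  # synaptic weights (illustrative)
v = np.zeros(n_out)           # membrane potentials
v_thresh, leak = 1.0, 0.9     # firing threshold and leak factor (illustrative)

for t in range(20):
    in_spikes = (rng.random(n_in) < 0.3).astype(float)  # random input spike train
    v = leak * v + weights @ in_spikes                   # leak, then integrate weighted spikes
    out_spikes = v >= v_thresh                           # fire when the threshold is crossed
    v[out_spikes] = 0.0                                  # reset neurons that fired
    print(f"t={t:2d} output spikes: {out_spikes.astype(int)}")
```

Changing the behavior of this network means changing the weights and thresholds, which is exactly the sense in which neuromorphic programs differ from instruction streams.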

The field has continued to evolve with the advent of large-scale funding opportunities for brain-inspired computing systems, such as the Defense Advanced Research Projects Agency (DARPA) SyNAPSE project and the European Union’s Human Brain Project. The term neuromorphic has come to encompass a wide variety of hardware computing element implementations. Neuromorphic algorithms can be further developed following neural computing principles and neural network architectures inspired by biological neural systems.

The Future of AI - Exciting Emerging Areas

AI processing will increasingly move from the cloud into edge devices as neuromorphic processors become available.

The AI field will undergo an exciting expansion beyond visual recognition to cover all the human senses, including odor and gas recognition, taste classification, voice identification, vibration analysis for early fault detection, and other ‘expert systems’. AI will become more commonplace in everyday products, e.g., refrigerators that can smell whether any food could cause food poisoning. AI tools will become easier to use, requiring no specialist knowledge: the development systems will hide the complicated ‘expert’ work under a user-friendly graphical user interface (GUI), much like what Apple did for computer operating systems when the “non-GUI” Microsoft Disk Operating System (MS-DOS) was the standard [4].

Optical Memristors for Neuromorphic Computing and AI in the Optical Domain

An exciting emerging technology uses light to revolutionize computing. Optical memristors (resistors with memory) promise to transform computing and information processing across several applications. They can enable active trimming of Photonic Integrated Circuits (PICs), allowing on-chip optical systems to be adjusted and reprogrammed as needed without continuously consuming power. They also offer high-speed data storage and retrieval, promising to accelerate processing, reduce energy consumption, and enable parallel processing. Optical memristors can be used for artificial synapses and brain-inspired architectures. Dynamic memristors with nonvolatile storage and nonlinear output replicate the long-term plasticity of synapses in the brain and pave the way for spiking integrate-and-fire computing architectures, a form of brain-inspired computing for machine intelligence[5].
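As a rough software analogy for what such a device does physically, the sketch below models a nonvolatile memristive synapse whose conductance (the stored weight) is nudged up or down by the relative timing of pre- and post-synaptic spikes, a simplified spike-timing-dependent plasticity (STDP) rule. The class name and all device parameters are hypothetical and chosen for illustration only; a real optical memristor would realize this behavior in its material state rather than in code.

```python
import math

class MemristiveSynapse:
    """Toy model of a nonvolatile synapse with STDP-style long-term plasticity."""

    def __init__(self, g: float = 0.5, g_min: float = 0.0, g_max: float = 1.0):
        self.g = g            # conductance, i.e. the stored synaptic weight
        self.g_min = g_min    # device limits (illustrative values)
        self.g_max = g_max

    def transmit(self, pre_spike: bool) -> float:
        # Output is proportional to the stored conductance when a spike arrives.
        return self.g if pre_spike else 0.0

    def update(self, dt: float, a_plus: float = 0.05, a_minus: float = 0.03, tau: float = 20.0):
        # dt = t_post - t_pre (ms). Pre-before-post potentiates; post-before-pre depresses.
        if dt > 0:
            self.g += a_plus * math.exp(-dt / tau)
        elif dt < 0:
            self.g -= a_minus * math.exp(dt / tau)
        self.g = min(max(self.g, self.g_min), self.g_max)  # stay within the device range

syn = MemristiveSynapse()
syn.update(dt=5.0)    # pre spike 5 ms before post spike -> potentiation
syn.update(dt=-15.0)  # post spike 15 ms before pre spike -> depression
print(f"conductance after two pairings: {syn.g:.3f}")
```

The key property mirrored here is that the weight persists without power between updates, which is what makes nonvolatile memristive devices attractive as artificial synapses.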

Loihi 2 and Lava

Intel Labs’ neuromorphic research goes beyond today’s deep-learning algorithms by co-designing optimized hardware with next-generation AI software. Built with the help of a growing community, this pioneering research effort seeks to accelerate the future of Adaptive AI. Adaptive AI is a characteristic of next-generation AI systems: it can adjust its code in response to real-world changes, even changes the coders did not know about or anticipate when they wrote the code. Neuromorphic computing’s innovative architectural approach will power future autonomous AI solutions that require energy efficiency and continuous learning. It promises to open exciting new possibilities in computing and is already in use in a variety of areas including sensing, robotics, healthcare, and large-scale AI applications.

Intel Labs’ second-generation neuromorphic research chip, codenamed Loihi 2, and an open-source software framework, Lava, will drive innovation and adoption of neuromorphic computing solutions. Enhancements include: up to 10x faster processing capability; up to 60x more inter-chip bandwidth; up to 1 million neurons with 15x greater resource density; 3D scalability with native Ethernet support; a new, open-source software framework called Lava; and fully programmable neuron models with graded spikes and enhanced learning and adaptation capabilities.
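To give a flavor of what programming such a system looks like, the sketch below wires two layers of LIF neurons together through a dense synaptic connection and runs them in CPU simulation using the open-source Lava framework. The import paths, process names, and parameters follow Lava’s public tutorials but may differ between releases, so treat this as an assumption-laden sketch rather than verified Loihi 2 code.

```python
import numpy as np

# Two-layer spiking network sketch in Lava (lava-nc), run in CPU simulation.
# Module paths and parameters follow the public Lava tutorials and are
# assumptions; they may change between Lava versions.
from lava.proc.lif.process import LIF
from lava.proc.dense.process import Dense
from lava.magma.core.run_conditions import RunSteps
from lava.magma.core.run_configs import Loihi1SimCfg

lif_in = LIF(shape=(3,), du=0, dv=0, bias_mant=3, vth=10)   # input layer driven by a constant bias
dense = Dense(weights=np.eye(3))                            # one-to-one synaptic connections
lif_out = LIF(shape=(3,), du=0, dv=0, vth=10)               # output layer

lif_in.s_out.connect(dense.s_in)    # spikes flow from the input layer into the synapses
dense.a_out.connect(lif_out.a_in)   # weighted currents drive the output layer

lif_out.run(condition=RunSteps(num_steps=50), run_cfg=Loihi1SimCfg())
lif_out.stop()
```

The same process graph is intended to map onto Loihi 2 hardware by swapping the run configuration, which is the co-design idea behind pairing Lava with the chip.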

Space Industry

Neuromorphic computing is recognized by the electronics and aerospace industries as a promising tool for enabling high-performance computing with ultra-low power consumption. Satellites, rovers, and other key assets impose strict limits on size, weight, and power consumption, as well as a need for radiation tolerance. TechEdSat-13 was launched on January 13, 2022, aboard Virgin Orbit’s LauncherOne rocket. TechEdSat-13 is a 3U form factor nanosatellite (a type of CubeSat) that carries a unique Artificial Intelligence/Machine Learning (AI/ML) module featuring the first orbital flight of a neuromorphic processor. Radiation-hardened neuromorphic processor designs are currently in development by various companies. “AI is really being used all across the space exploration enterprise, everything from making the spacecraft smarter, to analyzing the huge datasets on the ground, and then to operating things like the huge communications antennas that we need to talk to the spacecraft,” says Steve Chien, head of Artificial Intelligence (AI) at NASA’s Jet Propulsion Laboratory (JPL).

Reference List

[1] https://research.ibm.com/projects/neuromorphic-computing
[2] https://www.arrow.com/en/research-and-events/articles/neuromorphic-computing-chips-and-ai-hardware
[3] https://www.investopedia.com/terms/m/mooreslaw.asp
[4] https://www.arrow.com/en/research-and-events/articles/neuromorphic-computing-chips-and-ai-hardware
[5] https://www.sciencedaily.com/releases/2023/06/230605181314.htm