Is Intel Labs’ brain-inspired AI approach the future of robot learning? 

Can computer systems develop to the point where they can think creatively, identify people or items they have never seen before, and adjust accordingly — all while working more efficiently, with less power? Intel Labs is betting on it, with a new hardware and software approach using neuromorphic computing, which, according to a recent blog post, “uses new algorithmic approaches that emulate how the human brain interacts with the world to deliver capabilities closer to human cognition.” 

While this may sound futuristic, Intel’s neuromorphic computing research is already fostering interesting use cases, from adding new voice-interaction commands to Mercedes-Benz vehicles, to creating a robotic hand that delivers medications to patients, to developing chips that recognize hazardous chemicals.

A new approach in the face of capacity limits

Machine learning-driven systems such as autonomous cars, robots, drones and other self-sufficient technologies have relied on ever-smaller, more powerful, energy-efficient processing chips. But traditional semiconductors are now reaching their miniaturization and power-capacity limits, compelling many experts to believe that a new approach to semiconductor design is required.

One intriguing option that has piqued tech companies’ curiosity is neuromorphic computing. According to Gartner, traditional computing technologies built on legacy semiconductor architectures will hit a digital wall by 2025, forcing a shift to new paradigms such as neuromorphic computing, which mimics the physics of the human brain and nervous system through spiking neural networks (SNNs): spikes from individual electronic neurons activate other neurons in a cascading chain.
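
To make that cascade concrete, here is a minimal sketch of a single leaky integrate-and-fire (LIF) neuron, a common building block of SNNs. The threshold, leak and weight constants and the input spike train are hypothetical values chosen for illustration, not parameters of any Intel chip:

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch. The neuron
# integrates incoming spikes, leaks charge between events, and fires
# its own spike once its membrane potential crosses a threshold.
# All constants here are illustrative assumptions.

THRESHOLD = 1.0   # membrane potential at which the neuron fires
LEAK = 0.8        # fraction of potential retained each time step
WEIGHT = 0.5      # contribution of one incoming spike

input_spikes = [1, 1, 1, 0, 0, 1, 1, 1, 0, 0]  # binary input spike train

potential = 0.0
for t, spike in enumerate(input_spikes):
    potential = LEAK * potential + WEIGHT * spike   # integrate + leak
    fired = potential >= THRESHOLD
    if fired:
        potential = 0.0                             # reset after firing
    print(f"t={t} potential={potential:.2f} fired={fired}")
```

Running this shows the neuron firing only after integrating several closely spaced spikes, and doing no work at all when its inputs are silent; scaled up to millions of neurons, that event-driven behavior is where the efficiency gains discussed below come from.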

“Neuromorphic computing will enable fast vision and motion planning at low power,” Yulia Sandamirskaya, a research scientist at Intel Labs in Munich, told VentureBeat via email. “These are the key bottlenecks to enable safe and agile robots, capable to direct their actions at objects in dynamic real-world environments.”

In addition, neuromorphic computing “expands the space of neural network-based algorithms,” she explained. By co-locating memory and compute in one chip, it allows for energy-efficient processing of signals and enables on-chip continual, lifelong learning.

One size does not fit all in AI computing

As the AI space becomes increasingly complex, a one-size-fits-all solution cannot optimally address the unique constraints of each environment across the spectrum of AI computing.

“Neuromorphic computing could offer a compelling alternative to traditional AI accelerators by significantly improving power and data efficiency for more complex AI use cases, spanning data centers to extreme edge applications,” Sandamirskaya said.

Neuromorphic computing mirrors the way biological neurons transmit and receive the signals that drive movement and sensation in our bodies. But where traditional systems orchestrate computation in strict binary terms, neuromorphic chips compute more flexibly and broadly. By continually re-mapping their connections, SNNs replicate natural learning, allowing the neuromorphic architecture to adapt its decisions to patterns learned over time.

These asynchronous, event-based SNNs allow neuromorphic computers to achieve orders-of-magnitude gains in power and performance over traditional designs.
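
A back-of-the-envelope sketch shows where those gains come from: a dense layer touches every weight on every time step, while an event-driven layer only does work for inputs that actually spiked. The layer sizes and the 2% spike rate below are invented for illustration, not measurements from Loihi or any other chip:

```python
# Rough comparison, not a benchmark: counts of multiply-accumulate
# operations per time step for a dense layer versus an event-driven
# layer that only processes active (spiking) inputs.

n_in, n_out = 1024, 1024      # hypothetical layer dimensions
spike_rate = 0.02             # assume 2% of input neurons fire per step

dense_ops = n_in * n_out                      # one op per weight, every step
event_ops = int(n_in * spike_rate) * n_out    # only active inputs propagate

print(f"dense:        {dense_ops:,} ops/step")
print(f"event-driven: {event_ops:,} ops/step "
      f"(~{dense_ops / event_ops:.0f}x fewer)")
```

The advantage grows with sparsity: the quieter the input, the less work is done, and on clockless hardware, the less energy is drawn.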

A study by Emergen Research predicts that the worldwide neuromorphic processing industry will reach $11.29 billion by 2027.

Intel’s real-time learning solution

Neuromorphic computing will be especially advantageous for applications that must operate under power and latency constraints and must adapt in real time to unforeseen circumstances, Sandamirskaya said.

One particular challenge is that intelligent robots need object recognition to meaningfully understand their working environments. Intel Labs’ new neuromorphic approach to neural network-based object learning, developed in partnership with the Italian Institute of Technology and the Technical University of Munich, is aimed at future applications like robotic assistants that interact with unconstrained environments, including those used in logistics, healthcare or elderly care.

In a simulated setup, a robot actively senses objects by moving its “eyes”: an event-based camera, or dynamic vision sensor. The events collected are used to drive a spiking neural network on Loihi, Intel’s neuromorphic research chip. If an object or view is new to the model, its SNN representation is learned or updated; if the object is known, the network recognizes it and provides feedback to the user. This technology allows robots to continuously learn about every nuance in their environment.
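
In conventional Python terms, that interactive loop behaves roughly like the nearest-prototype sketch below. It is a stand-in for the SNN representation on Loihi, not Intel’s implementation; the feature dimension, the novelty threshold and the observe() helper are all hypothetical:

```python
import numpy as np

DIM = 64        # hypothetical size of a feature vector from the event camera
NOVELTY = 0.5   # hypothetical distance beyond which a view counts as "new"

# Learned object representations, standing in for on-chip SNN weights.
prototypes: dict[str, np.ndarray] = {}

def observe(label: str, features: np.ndarray) -> str:
    """Recognize a known object, or learn/update its representation."""
    if prototypes:
        nearest = min(prototypes,
                      key=lambda k: np.linalg.norm(prototypes[k] - features))
        if np.linalg.norm(prototypes[nearest] - features) < NOVELTY:
            return f"recognized: {nearest}"      # known object: give feedback
    if label in prototypes:
        # Known label seen from a new view: blend it into the representation.
        prototypes[label] = 0.9 * prototypes[label] + 0.1 * features
    else:
        prototypes[label] = features.copy()      # brand-new object: learn it
    return f"learned: {label}"

rng = np.random.default_rng(1)
mug = rng.normal(size=DIM)
print(observe("mug", mug))                                 # learned: mug
print(observe("mug", mug + 0.01 * rng.normal(size=DIM)))   # recognized: mug
```

On Loihi, the equivalent update happens in the spiking domain through on-chip plasticity rather than an explicit running average, which is what lets learning continue on-device at low power.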

Intel and its collaborators successfully demonstrated continual interactive learning on the Loihi neuromorphic research chip, measuring about 175 times lower energy to learn a new object instance, with similar or better speed and accuracy, than conventional methods running on a central processing unit (CPU).

Computation is more energy-efficient

Sandamirskaya said the computation is more energy-efficient because Loihi uses clockless, asynchronous circuits that naturally exploit sparse, event-driven processing.

“Loihi is the most versatile neuromorphic computing platform that can be used to explore many different types of novel bio-inspired neural-network algorithms,” she said, from deep learning and attractor networks to optimization and search algorithms, sparse coding and vector symbolic architectures.

Loihi’s power efficiency also shows promise for making assistive technologies more valuable and effective in real-world situations. Since Loihi is up to 1,000 times more energy efficient than general-purpose processors, a Loihi-based device could require less frequent charging, making it ideal for use in daily life.

Intel Labs’ work contributes to neural network-based machine learning for robots with a small power footprint and interactive learning capability. According to Intel, such research is a crucial step in improving the capabilities of future assistive or manufacturing robots.

“On-chip learning will enable ongoing self-calibration of future robotic systems, which will be soft and thus less rigid and stable, as well as fast learning on the job or in an interactive training session with the user,” Sandamirskaya said. 

Intel Labs: The future is bright for neuromorphic computing

Neuromorphic computing isn’t yet available as a commercially viable technology.

While Sandamirskaya says the neuromorphic computing movement is “gaining steam at an amazing pace,” commercial applications will require improvement of neuromorphic hardware in response to application and algorithmic research — as well as the development of a common cross-platform software framework and deep collaborations across industry, academia and governments. 

Still, she is hopeful about the future of neuromorphic computing.

“We’re incredibly excited to see how neuromorphic computing could offer a compelling alternative to traditional AI accelerators,” she said, “by significantly improving power and data efficiency for more complex AI use cases spanning data center to extreme edge applications.”

Author: Victor Dey
Source: VentureBeat
