
Axelera raises $68M to rival Nvidia with edge AI chips

Nvidia leads the age of AI hardware. The company’s high-performance GPUs are used by some of the world’s biggest technology companies to power training and inference for the largest AI models.

But as the Jensen Huang-led giant continues to thrive, a batch of smaller AI hardware companies has also come to the fore, targeting specific niches within the domain. Today, one such startup from the Netherlands, Axelera AI, announced it has raised $68 million in Series B funding.

Axelera builds solutions powered by AIPUs, or AI processing units, to run computer vision inference workloads at the edge. The investment marks Europe’s largest Series B round in the fabless semiconductor category and was led by major institutional backers, including the Invest-NL Deep Tech Fund, the European Innovation Council Fund, the Innovation Industries Strategic Partners Fund and the Samsung Catalyst Fund.

Axelera said it will use the capital to take its existing AIPU solutions to new geographies and markets, as well as to add new products that address the computing needs of next-generation AI workloads, including multimodal LLMs.




Understanding Axelera’s value proposition

Spun out of the AI innovation lab of Bitfury Group, a blockchain technology company, Axelera AI focuses on AI acceleration, particularly for edge computing applications. The company launched in 2021 and has developed a platform called Metis, which combines hardware and software to handle computer vision inference at the edge while balancing performance, efficiency and ease of use.

At its core, the platform has two key components: a 12nm CMOS AI Processing Unit (AIPU) and a software development kit for building computer vision applications that run on devices using the chip. Each AIPU comes with four “self-sufficient” AI cores that can either collaborate on a single workload to boost throughput or concurrently process the different neural networks an application requires.
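The announcement does not detail the SDK’s API, so the snippet below is only a hypothetical sketch of the two scheduling modes just described; names such as run_on_core and the model files are illustrative placeholders, not Axelera’s actual SDK calls.

```python
from concurrent.futures import ThreadPoolExecutor

def run_on_core(core_id: int, model: str, frames: list) -> list:
    """Hypothetical stand-in for dispatching an inference job to one AI core."""
    return [f"core{core_id} ran {model} on {frame}" for frame in frames]

frames = [f"frame_{i}" for i in range(8)]

with ThreadPoolExecutor(max_workers=4) as pool:
    # Mode 1: all four cores collaborate on one network, each taking a slice
    # of the incoming frames to raise overall throughput.
    slices = [frames[i::4] for i in range(4)]
    throughput_mode = list(pool.map(run_on_core, range(4), ["detector.onnx"] * 4, slices))

    # Mode 2: each core runs a different network of the same application concurrently.
    models = ["detector.onnx", "classifier.onnx", "segmenter.onnx", "tracker.onnx"]
    multi_network_mode = list(pool.map(run_on_core, range(4), models, [frames] * 4))
```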

“The AI core is a RISC-V-controlled dataflow engine delivering up to 53.5 TOPS of AI processing power featuring several high-throughput data paths to provide balanced performance over a vast range of layers and to address the heterogeneous nature of modern neural network workloads. The total throughput of Axelera’s four-core Metis AIPU can reach 214 TOPS at a compute density of 6.65 TOPS/mm2,” Axelera writes on its website. 
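Those figures are internally consistent; here is a quick back-of-the-envelope check (the die-area number below is inferred from the quoted compute density and is not a figure Axelera states):

```python
# Sanity-check the quoted throughput figures (assumes all four cores run at peak).
cores = 4
tops_per_core = 53.5                      # quoted peak TOPS per AI core
total_tops = cores * tops_per_core        # 214.0, matching the quoted four-core figure

# The quoted compute density implies a rough compute-area estimate (an inference, not a spec).
density_tops_per_mm2 = 6.65
compute_area_mm2 = total_tops / density_tops_per_mm2   # ~32.2 mm^2

print(total_tops, round(compute_area_mm2, 1))
```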

Each AI core also uses the company’s proprietary digital in-memory computing technology to accelerate matrix operations, offering a high level of energy efficiency at 15 TOPS/W. Axelera describes this as a radically different approach to data processing, where crossbar arrays of memory devices store a matrix and perform matrix-vector multiplications “in place” without intermediate movement of data.

“Based on SRAM (Static Random-Access Memory) combined with digital computations, each memory cell effectively becomes a compute element. This radically increases the number of operations per computer cycle without suffering from issues such as noise or lower accuracy,” the company noted.
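Conceptually, what such a crossbar computes is an ordinary matrix-vector product; the difference is that the weight matrix stays resident in the memory array instead of being shuttled to a separate compute unit. The NumPy sketch below models only the math being accelerated, not the SRAM crossbar itself, and the power figure is a back-of-the-envelope estimate derived from the quoted peak numbers.

```python
import numpy as np

# The operation an in-memory crossbar accelerates: y = W @ x, with the weight
# matrix W held "in place" in the memory array while activations x stream through.
rng = np.random.default_rng(0)
W = rng.standard_normal((256, 256)).astype(np.float32)   # weights stored in the crossbar
x = rng.standard_normal(256).astype(np.float32)          # input activations
y = W @ x                                                 # one matrix-vector multiply

# Rough power implied by the quoted peak figures (an estimate, not a published spec):
total_tops = 214.0            # four-core Metis AIPU peak throughput
efficiency_tops_per_w = 15.0  # quoted energy efficiency
print(round(total_tops / efficiency_tops_per_w, 1))       # ~14.3 W at peak
```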

Goal to scale up, handle next-gen AI workloads

Currently, Axelera is shipping Metis evaluation kits to enterprises like Fogsphere, XXII and System Electronics.

With this funding, which brings its total capital raised to $120 million, the company plans to build on this momentum and take the platform into full production for widespread delivery. It says the product can deliver up to a five-fold increase in efficiency and performance for enterprises, and reports a business pipeline of more than $100 million.


As part of its growth strategy, Axelera plans to expand to North America, Europe and the Middle East, focusing on key verticals such as automotive, digital healthcare, Industry 4.0, retail, robots and drones, and surveillance. The starting point in these markets will be Metis, but the company also plans to launch price-competitive, data center-focused accelerators, enabling it to address the growing computing needs of generative AI models, including large multimodal models.

“There’s no denying that the AI industry has the potential to transform a multitude of sectors. However, to truly harness the value of AI, organizations need a solution that delivers high performance and efficiency while balancing costs,” Fabrizio Del Maffeo, co-founder and CEO of Axelera AI, said in a statement.

“This funding supports our mission to democratize access to artificial intelligence, from the edge to the cloud. By expanding our product lines beyond the edge computing market, we are able to address industry challenges in AI inference and support current and future AI processing needs with our scalable, proven technology,” he added.

Another notable player in the edge AI acceleration space is Hailo, which recently announced Hailo-10, an energy-efficient processor designed to run generative AI applications on edge devices such as cars and commercial robots.





Author: Shubham Sharma
Source: VentureBeat