EdgeQ samples SoC for 5G and AI inference engines

EdgeQ revealed today it has begun sampling a 5G base station-on-a-chip that allows AI inference engines to run at the network edge. The goal is to make it less costly to build enterprise-grade 5G access points, as well as radio units and distributed units that make up an open radio access network.

EdgeQ’s decision to deliver the base station as a system-on-chip (SoC) platform also reduces the time and effort wireless network providers need to create the physical-layer software that governs all the essential protocols and features of an integrated 4G/5G network, said EdgeQ CEO Vinay Ravuri.

At the core of the product, simply named the EdgeQ 5G Base Station-on-a-Chip, is a specialized baseband processor based on the RISC-V ISA, which EdgeQ created by extending the open source RISC-V instruction set with more than 50 custom instructions. The SoC is also programmable, which will enable carriers to leverage a network functional application platform interface (nFAPI) to add custom extensions as they roll out 5G services. The host module paired with the SoC, meanwhile, is based on an eight-core ARM Neoverse CPU cluster that handles service provisioning and data processing.
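
For a sense of what a carrier-supplied extension might look like at the nFAPI layer, the Python sketch below packs a hypothetical vendor-extension element into a simplified configuration message. The tag values, field names, and byte layout are illustrative assumptions, not the Small Cell Forum nFAPI specification or EdgeQ’s actual SDK.

import struct

def pack_tlv(tag: int, value: bytes) -> bytes:
    # Pack one tag-length-value element with 16-bit tag and length fields.
    return struct.pack("!HH", tag, len(value)) + value

def build_config_request(cell_id: int, vendor_payload: bytes) -> bytes:
    # Assemble a simplified CONFIG.request-style body: one standard parameter
    # followed by a vendor-extension element carrying carrier-specific settings.
    body = pack_tlv(0x100A, struct.pack("!H", cell_id))   # hypothetical cell-ID tag
    body += pack_tlv(0xA000, vendor_payload)              # hypothetical vendor-extension tag
    header = struct.pack("!HH", 0x02, len(body))          # hypothetical message type + body length
    return header + body

print(build_config_request(cell_id=7, vendor_payload=b"\x01\x02").hex())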

The ability to run AI inference engines on the same SoC used to process 5G signals will reduce the total cost of pushing AI out to the network edge, said Ravuri. Organizations won’t need to deploy a separate processing platform to run AI inference engines, he noted. The overall performance of the application environment should also improve, since building a base station around an SoC reduces latency and power consumption, Ravuri said, adding, “A lot of use cases are battery operated.”
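
As an illustration of that co-location, the sketch below runs a cloud-trained model on the same edge host that terminates the 5G traffic, rather than forwarding data to a separate inference server. ONNX Runtime is used purely as a stand-in here; the article does not describe EdgeQ’s AI software stack, and the model file and input name are placeholders.

import numpy as np
import onnxruntime as ort

# Load a model that was trained elsewhere (for example, in the cloud) and exported to ONNX.
session = ort.InferenceSession("model.onnx")
input_name = session.get_inputs()[0].name

def infer(frame: np.ndarray) -> np.ndarray:
    # Run one inference pass on data arriving over the local 5G link.
    return session.run(None, {input_name: frame.astype(np.float32)})[0]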

Pushing AI to the edge

Platforms such as the 5G Base Station-on-a-Chip will make it easier for organizations to apply AI across a broad range of use cases because base stations can be attached to utility poles or embedded within an oil rig. The 5G connections will provide enough bandwidth to deploy augmented and virtual reality applications infused with AI models, noted Ravuri.

While it’s feasible to deploy an AI inference engine at the 5G edge, it may be a while before there is enough raw compute power at the edge to also train AI models. In the meantime, however, inference engines based on AI models typically trained in the cloud will soon become much more widely distributed.

EdgeQ has been building the 5G Base Station-on-a-Chip for more than three years, Ravuri said. Led by former executives from Qualcomm, Intel, and Broadcom, EdgeQ is backed by investors such as Threshold Partners, Fusion Fund, and AME Cloud Ventures. The company also says it has an additional unidentified investor that is one of its customers. As of last fall, the company had raised $51 million in total funding. A recent IHS Markit report forecasts that 5G technology will enable $13.2 trillion in global economic output through 2035, while the market for 5G infrastructure itself is valued at approximately $10 billion.

It’s not clear to what degree other startups or processor manufacturers such as Qualcomm or Broadcom are pursuing similar initiatives. What is clear is that SoC architectures are spurring a wave of platform innovation that makes it possible to run a wide range of workloads more efficiently at lower cost. The next step will be the development of frameworks that make it simpler for developers to invoke the capabilities of these next-generation platforms. After all, providing access to 5G network bandwidth is one thing. Making that bandwidth programmatically available to developers is another thing altogether.


Author: Michael Vizard
Source: VentureBeat
