
Modular looks to boost AI mojo with $100M funding raise

Make no mistake about it: there is a lot of excitement, and a lot of money, in early-stage AI.

A year and a half after being founded, and only four months after the first previews of its technology, AI startup Modular announced today that it has raised $100 million, bringing its total funding to date to $130 million.

The new round is led by General Catalyst, with participation from GV (Google Ventures), SV Angel, Greylock and Factory. Modular has positioned itself to tackle the audacious goal of fixing AI infrastructure for the world’s developers, and it is pursuing that goal with a product-led approach built around the Modular AI runtime engine and the Mojo programming language for AI.

The company’s cofounders, Chris Lattner and Tim Davis, are no strangers to the world of AI; both worked at Google in support of TensorFlow initiatives.


A challenge that the cofounders saw time and again with AI is how complex deployment can be across different types of hardware. Modular aims to help solve that challenge in a big way.

“After working on these systems for such a long time, we put our heads together and thought that we can build a better infrastructure stack that makes it easier for people to develop and deploy machine learning workloads on the world’s hardware across clouds and across frameworks, in a way that really unifies the infrastructure stack,” Davis told VentureBeat.

How the Modular AI engine aims to change the state of inference today

Today, when AI inference is deployed, it typically runs on an application stack tied to a specific combination of hardware and software.

The Modular AI engine is an attempt to break that siloed approach to running AI workloads. Davis said the engine accelerates AI workloads so they scale faster, while keeping them portable across hardware.

Davis explained that the TensorFlow and PyTorch frameworks, which power many of the most common AI workloads, both rely on runtime compilers on the backend. Those compilers take an ML graph, which is a series of operations and functions, and enable it to be executed on a system.
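
To make the idea of an ML graph concrete, here is a minimal PyTorch sketch (illustrative only, not Modular code) that traces a small model into the kind of operation graph a runtime compiler consumes:

    # Illustrative sketch, not Modular code: trace a small PyTorch model into a
    # graph of operations, the kind of intermediate form a runtime compiler executes.
    import torch
    import torch.fx as fx

    class TinyModel(torch.nn.Module):
        def __init__(self):
            super().__init__()
            self.linear = torch.nn.Linear(4, 2)

        def forward(self, x):
            return torch.relu(self.linear(x))

    traced = fx.symbolic_trace(TinyModel())  # capture forward() as a graph of ops
    print(traced.graph)                      # placeholder -> call_module -> call_function -> output

Each node in the printed graph is a single operation; an execution engine’s job is to schedule and run those nodes efficiently on whatever hardware is available.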

The Modular AI engine is functionally a new backend for the AI frameworks, acting as a drop-in replacement for the execution engines that already exist for PyTorch and TensorFlow. Initially, Modular’s engine works for AI inference, but the company plans to expand it to training workloads in the future.
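
For a sense of what a drop-in execution backend looks like from the framework side, here is a hedged sketch using PyTorch’s own compile API; "inductor" is PyTorch’s stock backend, and the "modular" name in the final comment is purely hypothetical, not a published Modular integration:

    # Hedged sketch of a swappable execution backend in PyTorch. "inductor" is the
    # stock PyTorch 2.x backend; a vendor engine would register its own backend
    # name, and the calling code would stay the same.
    import torch

    model = torch.nn.Sequential(torch.nn.Linear(8, 8), torch.nn.ReLU())
    x = torch.randn(1, 8)

    compiled = torch.compile(model, backend="inductor")  # choose the execution backend
    print(compiled(x).shape)

    # Hypothetical, for illustration only (not a real Modular integration):
    # compiled = torch.compile(model, backend="modular")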

“[Modular AI engine] enables developers to have choice on their back end so they can scale across architectures,” Davis explained. “That means your workloads are portable, so you have more choice, you’re not locked to a specific hardware type, and it is the world’s fastest execution engine for AI workloads on the back end.”

Need some AI mojo? There’s now a programming language for that

The other challenge that Modular is looking to solve is that of programming languages for AI.

The open source Python programming language is the de facto standard for data science and ML development, but it runs into issues at high scale. As a result, developers often have to rewrite code in C++ to get the performance they need. Mojo aims to solve that problem.

“The challenge with Python is it has some technical limitations on things like the global interpreter lock not being able to do large scale parallelization style execution,” Davis explained. “So what happens is as you get to larger workloads, they require custom memory layouts and you have to swap over to C++ in order to get performance and to be able to scale correctly.”
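
A minimal Python sketch of the limitation Davis is describing: a CPU-bound function gains little from threads because the global interpreter lock serializes bytecode execution, while separate processes can actually use multiple cores (timings will vary by machine):

    # Minimal sketch of the GIL limitation: CPU-bound work barely speeds up with
    # threads, but does with processes, which each get their own interpreter.
    import time
    from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

    def busy(n: int) -> int:
        total = 0
        for i in range(n):
            total += i * i
        return total

    def timed(executor_cls) -> float:
        start = time.perf_counter()
        with executor_cls(max_workers=4) as pool:
            list(pool.map(busy, [5_000_000] * 4))
        return time.perf_counter() - start

    if __name__ == "__main__":
        print("threads:  ", timed(ThreadPoolExecutor))   # roughly serial under the GIL
        print("processes:", timed(ProcessPoolExecutor))  # can run on separate cores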

Davis explained that Modular is taking Python and building a superset around it. Rather than requiring developers to juggle Python and C++, Mojo aims to provide a single language that can support existing Python code with the required performance and scalability.

“The reason this is such a huge deal is you tend to have the researcher community working in Python, but then you have production deployment working in C++, and typically what would happen is people would send their code over the wall, and then they would have to rewrite it in order for it to be performant on different types of hardware,” said Davis. “We have now unlocked that.”

Supercharging the AI development community

To date, Mojo has only been available in private preview, with access opening up today to some developers who have been on the preview waitlist. Davis said there will be broader availability in September. Mojo is currently entirely proprietary, though Davis noted that Modular plans to open source part of the language by the end of 2023.

“Our goal is to really just supercharge the world’s AI development community, and enable them to build things faster and innovate faster to help impact the world,” he said.



Author: Sean Michael Kerner
Source: VentureBeat
Reviewed By: Editorial Team

