Spell unveils deep learning operations platform to cut AI training costs

Spell today unveiled an operations platform that provides the tooling needed to train AI models based on deep learning algorithms.

The platforms currently employed to train AI models are optimized for machine learning algorithms. AI models based on deep learning algorithms require their own deep learning operations (DLOps) platform, Spell head of marketing Tim Negris told VentureBeat.

The Spell platform automates the entire deep learning workflow using tools the company developed in the course of helping organizations build and train AI models for computer vision and speech recognition applications that require deep learning algorithms.

Deep roots

Deep learning traces its lineage to neural networks, a branch of machine learning that structures algorithms in layers so the resulting network can learn and make intelligent decisions on its own. The artifacts and models created with deep learning algorithms, however, don’t lend themselves to the same platforms used to manage machine learning operations (MLOps), Negris said.

An AI model based on deep learning algorithms can require tracking and managing hundreds of experiments with thousands of parameters spanning large numbers of graphics processing units (GPUs), Negris noted. The Spell platform specifically addresses the need to manage, automate, orchestrate, document, optimize, deploy, and monitor deep learning models throughout their entire lifecycle, he said. “Data science teams need to be able to explain and reproduce deep learning results,” Negris added.
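The article does not show Spell’s own tooling, but the kind of experiment tracking Negris describes can be illustrated with a minimal sketch using the open-source MLflow library. The run name, hyperparameters, metric name, and train_one_epoch() helper below are hypothetical placeholders, not anything from Spell.

```python
# Minimal experiment-tracking sketch using the open-source MLflow library,
# not Spell's own API. The hyperparameters, metric name, and train_one_epoch()
# helper are hypothetical placeholders.
import mlflow


def train_one_epoch(params, epoch):
    # Stand-in for a real training step; returns a fake, decreasing loss.
    return 1.0 / (epoch + 1)


hyperparams = {"learning_rate": 1e-3, "batch_size": 64, "num_layers": 50}

with mlflow.start_run(run_name="resnet-baseline"):
    mlflow.log_params(hyperparams)                         # record every hyperparameter
    for epoch in range(10):
        val_loss = train_one_epoch(hyperparams, epoch)     # hypothetical training step
        mlflow.log_metric("val_loss", val_loss, step=epoch)  # per-epoch result, so the run can be reproduced and compared later
```

Logging every parameter and metric per run is what makes it possible to explain and reproduce a result after hundreds of experiments, which is the gap Negris says MLOps platforms leave for deep learning work.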

While most existing MLOps platforms are not well suited to managing deep learning algorithms, Negris said the Spell platform can also be employed to manage AI models based on machine learning algorithms. Spell does not provide tools to manage the lifecycle of those models, but data science teams can plug the third-party framework they already use for that purpose into the Spell platform.

The Spell platform also reduces cost by automatically invoking, whenever feasible, the spot instances that cloud service providers make available for a finite amount of time, Negris said. That capability can cut the total cost of training an AI model by as much as 66%, he added. That’s significant because training an AI model based on deep learning algorithms can in some cases cost millions of dollars.
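The article does not break that figure down, but a back-of-the-envelope sketch shows how a roughly two-thirds saving falls out when spot capacity is priced at about a third of on-demand rates. The hourly prices below are made-up placeholders, not real cloud pricing.

```python
# Back-of-the-envelope comparison of on-demand vs. spot pricing for a GPU training job.
# The hourly rates below are illustrative placeholders, not real cloud prices.
gpu_hours = 500          # total GPU-hours consumed by a training run
on_demand_rate = 3.00    # $/GPU-hour on demand (hypothetical)
spot_rate = 1.00         # $/GPU-hour on spot capacity (hypothetical, ~1/3 of on-demand)

on_demand_cost = gpu_hours * on_demand_rate
spot_cost = gpu_hours * spot_rate
savings = 1 - spot_cost / on_demand_cost

print(f"On-demand: ${on_demand_cost:,.0f}, spot: ${spot_cost:,.0f}, saving: {savings:.0%}")
# -> On-demand: $1,500, spot: $500, saving: 67%
```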

A hybrid approach

In time, most AI applications will be constructed using a mix of machine and deep learning algorithms. In fact, as the building of AI models using machine learning algorithms becomes more automated, many data science teams will spend more of their time constructing increasingly complex AI models based on deep learning algorithms. The cost of building AI models based on deep learning algorithms should also steadily decline as GPUs deployed in an on-premises IT environment or accessed via a cloud service become more affordable.

In the meantime, Negris said that while the workflows for building AI models will converge, it’s unlikely that the traditional DevOps approaches used to manage application development will simply be extended to incorporate AI models. The continuous retraining of AI models that are subject to drift does not lend itself to the more linear processes employed today to build and deploy traditional applications, he said.
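As a rough sketch of why that retraining cycle is circular rather than linear, the loop below keeps folding monitoring results back into training instead of ending at deployment. The drift check, threshold, and retraining step are all hypothetical placeholders.

```python
# Hypothetical continuous-retraining loop driven by drift monitoring.
# check_drift(), retrain_and_deploy(), and DRIFT_THRESHOLD are placeholders;
# a real pipeline would compare live prediction statistics against training data.
import random

DRIFT_THRESHOLD = 0.1


def check_drift():
    # Stand-in for a statistical drift test on production traffic.
    return random.random()


def retrain_and_deploy():
    print("Drift detected: retraining model and redeploying to production")


def retraining_cycle(max_checks=5):
    # Unlike a linear build-test-release pipeline, this loop never "finishes":
    # deployment feeds monitoring, and monitoring can trigger another training run.
    for _ in range(max_checks):          # bounded here only so the sketch terminates
        if check_drift() > DRIFT_THRESHOLD:
            retrain_and_deploy()


retraining_cycle()
```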

Nevertheless, all the AI models being trained eventually need to find their way into an application deployed in a production environment. The challenge many organizations face today is aligning the rate at which AI models are developed with the faster pace at which applications are now deployed and updated.

One way or another, it’s only a matter of time before every application — to varying degrees — incorporates one or more AI models. The issue going forward is finding a way to reduce the level of friction that occurs whenever an AI model needs to be deployed within an application.

Author: Michael Vizard
Source: VentureBeat
