Dell customizes GenAI and focuses on data lakehouse

Dell Technologies is growing its portfolio of generative AI products and services to help more of its customers harness the power of artificial intelligence.

Today Dell announced a series of initiatives that expand on the generative AI efforts the company has been incrementally rolling out since early 2023. Back in May, Dell announced Project Helix in partnership with Nvidia as an effort to bring the power of large language models (LLMs) to on-premises environments with Dell hardware. A few months later in July, Dell and Nvidia announced the first fruits of the Project Helix effort with validated designs for running AI inference workloads and professional services to support enterprise deployments. Now Dell is going a step further with validated designs for model customization with Nvidia to help organizations fine-tune AI models.

Dell is also now detailing its strategy for making data available for generative AI, with an open data lakehouse platform that benefits from a partnership with data query platform vendor Starburst.

“One note that I want to make about enterprise is that we’re obviously seeing a lot of interest in this area, and I think that it is somewhat early days in the deployment of generative AI on premises,” Carol Wilder, VP for cross-portfolio software and solutions at Dell Technologies, said during a press briefing. “Dell is committed to providing the best solutions and options to our customers so that they have the flexibility and resilience in order to address the business outcomes that they’re trying to achieve.”

Dell grows GenAI efforts beyond inference to model customization

The first iteration of Dell Validated Designs for Generative AI with Nvidia focused solely on inference. That solution enables organizations to run an optimized stack of Dell and Nvidia hardware, along with software tuned for AI inference.

Wilder explained that with the inference solution, the models are pre-trained and ready to deploy. Model customization, she said, demands more hardware resources than inference typically does.

With the new model customization offering, Dell is looking to enable organizations to tune models for an enterprise’s specific use case using its own data. Among the use cases Dell expects the model customization service to support are virtual assistants, content generation and software development.

Wilder noted that the benefit of customization is that enterprises get models optimized for their own deployments, as well as models that make more efficient use of hardware.
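
Dell has not published code for the customization workflow, but as a rough sketch of what fine-tuning an open LLM on an organization’s own data generally involves, a parameter-efficient approach such as LoRA with the Hugging Face transformers and peft libraries might look like the following. The base model, dataset file and hyperparameters below are placeholders for illustration, not part of Dell’s or Nvidia’s validated designs.

```python
# Illustrative only: parameter-efficient fine-tuning (LoRA) of an open LLM
# on an organization's own text data. Model name, dataset path and
# hyperparameters are placeholders, not Dell's or Nvidia's actual stack.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base_model = "meta-llama/Llama-2-7b-hf"  # placeholder base model
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base_model)

# LoRA trains small adapter matrices instead of all model weights, which is
# why customization needs more GPU resources than inference, but far less
# than training a model from scratch.
model = get_peft_model(model, LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM"))

# Hypothetical internal dataset: one JSON object per line with a "text" field.
data = load_dataset("json", data_files="internal_docs.jsonl")["train"]
data = data.map(lambda x: tokenizer(x["text"], truncation=True, max_length=512),
                remove_columns=data.column_names)

Trainer(
    model=model,
    args=TrainingArguments(output_dir="custom-model", num_train_epochs=1,
                           per_device_train_batch_size=2, learning_rate=2e-4),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()
```

The resulting adapter weights can then be merged into the base model or loaded alongside it for deployment, which is one way an enterprise ends up with a model tuned to its own data without retraining the full network.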

Dell’s vision for an open, modern data lakehouse

Being able to fine-tune as well as train generative AI is a process that relies on data, lots and lots of data.

For enterprise use cases, that data isn’t just generic data taken from a public source; rather, it is data that an organization already has in its data centers or cloud deployments, likely spread across multiple locations. To help enterprises fully benefit from data for generative AI, Dell is building out an open data lakehouse platform.

The data lakehouse concept was originally pioneered by Databricks as a way of enabling organizations to more easily query data stored in data lakes built on cloud object storage. The Dell approach is a bit more nuanced in that it takes a hybrid approach to data, with a goal of being able to query data across on-premises as well as multi-cloud deployments.

Greg Findlen, senior VP of data management at Dell, explained during the press briefing that the open data lakehouse will be able to use Dell storage and compute capabilities as well as multi-cloud storage. On top of the storage, Dell will be integrating the Starburst Enterprise platform, which provides data query and management capabilities, enabling data from disparate sources to be used for data analytics and AI.

“We also want to make sure that customers can discover, integrate and process data across the organization, that’s one of the big reasons why we have partnered with Starburst,” Findlen said.

He noted that with the Starburst integration for Dell’s data lakehouse effort, organizations will be able to leverage data where it exists. Findlen emphasized that the number one priority for the lakehouse effort is accelerating how quickly data science and AI developer teams can get access to data from across the organization.
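
Starburst Enterprise is built on the open source Trino query engine, which federates SQL across catalogs that map to different storage systems. Purely as an illustration of the “query data where it exists” idea, and not Dell’s actual configuration, a federated query issued through the open source trino Python client might look like this; the host, catalogs, schemas and table names are hypothetical.

```python
# Illustrative only: a federated SQL query across two hypothetical catalogs,
# one backed by an object-store data lake and one by an operational database.
# Host, user, catalog and table names are placeholders.
import trino

conn = trino.dbapi.connect(
    host="starburst.example.internal",  # hypothetical coordinator endpoint
    port=8080,
    user="data-scientist",
    catalog="lake",    # e.g. Hive/Iceberg tables on object storage
    schema="sales",
)

cur = conn.cursor()
cur.execute("""
    SELECT c.region, count(*) AS orders
    FROM lake.sales.orders o      -- data lake table
    JOIN crm.public.customers c   -- operational database table
      ON o.customer_id = c.id
    GROUP BY c.region
""")
for region, orders in cur.fetchall():
    print(region, orders)
```

The point of the pattern is that the join happens in the query engine, so analytics and AI teams can work against data in place rather than first copying everything into a single store.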

From Findlen’s perspective, the growth of generative AI has helped to reinforce the primary importance of data for enterprises overall.

“I’m excited by how GenAI has really put a lot more focus on these technologies and the importance of data within the enterprise and the importance of protecting the data that you have and making sure that it stays private but also enabling customers to accelerate their businesses,” Findlen said. “It’s important to think about how all the different kinds of data in the enterprise can feed the many use cases for AI.”

Author: Sean Michael Kerner
Source: VentureBeat
Reviewed By: Editorial Team
