
Snowflake doubles down on enterprise AI with no-code studio and more

After making headlines with the announcement of Polaris, a vendor-neutral open data catalog for Apache Iceberg, Snowflake is launching new tools to help enterprises double down on the development of trusted AI-powered applications.

In his second keynote of the company’s annual Data Cloud Summit, CEO Sridhar Ramaswamy shared several enhancements to the Cortex AI service and Snowflake ML that make it easier for enterprises to build, govern and manage AI applications using the data hosted on the platform. The capabilities touch several aspects, but the highlight for us is a new no-code AI & ML Studio that gives any enterprise user the ability to start building AI applications for their desired use cases.

The move is another step in Snowflake’s effort to give its customers all the tools and models they need to build strong AI applications. The company has been moving in this direction ever since Ramaswamy took over as CEO, and the push appears to be part of a broader strategy to better position itself against Databricks, which has been AI-centric for quite some time, with MosaicML integrated across its platform. Just recently, Snowflake also launched Arctic, an open LLM for enterprise-centric tasks.

Bolstering AI efforts with Cortex and Snowflake ML

Since its inception, Snowflake has heavily focused on building the data infrastructure of choice for enterprises (the data cloud) and enabling a wide range of downstream use cases, including AI and analytics. 


When ChatGPT came on the scene, customers started asking for a way to develop generative AI-powered applications using the data stored on the platform. This led the company to launch Snowflake ML, a set of features that helps teams build and manage ML model workflows on the data cloud, as well as Cortex, a fully managed service that provides a suite of AI building blocks, including open-source LLMs, for analyzing data and building applications targeting different business-specific use cases.
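
To make the building-block idea concrete, here is a minimal sketch of calling one of Cortex’s hosted LLMs from Snowpark Python. It assumes an account with access to the SNOWFLAKE.CORTEX.COMPLETE SQL function; the connection placeholders, the support_tickets table and the model name are illustrative, not part of Snowflake’s announcement.

```python
# A minimal sketch of calling one of Cortex's hosted LLMs from Snowpark Python.
# Connection placeholders, the support_tickets table and the model name are
# illustrative; assumes the account has access to SNOWFLAKE.CORTEX.COMPLETE.
from snowflake.snowpark import Session

connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()

# Ask an LLM hosted in Cortex to summarize text already sitting in a Snowflake table.
rows = session.sql(
    """
    SELECT SNOWFLAKE.CORTEX.COMPLETE(
        'mistral-7b',
        CONCAT('Summarize this support ticket in one sentence: ', ticket_text)
    ) AS summary
    FROM support_tickets
    LIMIT 5
    """
).collect()

for row in rows:
    print(row["SUMMARY"])
```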

Now, the company is enhancing both offerings at the Data Cloud Summit. 

Snowflake Cortex AI ecosystem

Cortex is being expanded with four new capabilities: AI & ML Studio, Cortex Analyst, Cortex Search and Cortex Guard.

The new AI & ML Studio democratizes and accelerates development with a no-code interactive interface that any user can use to put state-of-the-art large language models (LLMs) to work and start developing applications targeting different use cases. They can use the environment to test and evaluate different models to find the most suitable option and then build custom apps with it. The offering also includes a fine-tuning service, dubbed Cortex Fine-Tuning, that can be used to further enhance the performance of LLMs and deliver more personalized experiences. However, Snowflake notes that the capability is only available for a subset of Mistral AI and Meta models.
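
Below is a hypothetical sketch of what that fine-tuning flow could look like from Snowpark Python. The FINETUNE call, the tuning_examples table and all names shown are assumptions for illustration; the preview API may differ.

```python
# Hypothetical sketch of the Cortex Fine-Tuning flow from Snowpark Python. The
# FINETUNE call, the tuning_examples table and all names below are assumptions
# for illustration; the preview API may differ.
from snowflake.snowpark import Session

session = Session.builder.configs(
    {"account": "<account>", "user": "<user>", "password": "<password>"}
).create()

# Kick off a fine-tuning job on prompt/completion pairs stored in a table
# (assumed columns: PROMPT and COMPLETION).
session.sql(
    """
    SELECT SNOWFLAKE.CORTEX.FINETUNE(
        'CREATE',
        'my_db.my_schema.support_mistral',                -- name for the tuned model
        'mistral-7b',                                     -- supported base model
        'SELECT prompt, completion FROM tuning_examples'  -- training data query
    )
    """
).collect()

# Once the job finishes, the tuned model can be called like any other Cortex model.
answer = session.sql(
    """
    SELECT SNOWFLAKE.CORTEX.COMPLETE(
        'my_db.my_schema.support_mistral',
        'How do I rotate my API keys?'
    ) AS answer
    """
).collect()[0]["ANSWER"]
print(answer)
```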

The other three Cortex capabilities, Analyst, Search and Guard, are aimed at the development of high-performance LLM chatbots. Cortex Analyst, based on Meta’s Llama 3 and Mistral Large models, lets businesses build applications that can answer business questions about their analytical data in Snowflake. Search, on the other hand, is a fully managed text search solution for RAG chatbots and enterprise search. It uses the state-of-the-art retrieval and ranking technology Snowflake acquired from Neeva, along with the company’s Arctic embed model, to deliver hybrid search – a combination of vector search and text search – that lets users build applications against documents and other text-based datasets.
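
Here is an illustrative retrieval-augmented generation (RAG) pattern of the kind Search is meant to power. The retrieve_chunks helper is a hypothetical stand-in for a Cortex Search query (the service was in private preview at announcement time); the doc_chunks table and the model name are assumptions.

```python
# Illustrative RAG pattern combining retrieval with Cortex generation. The
# retrieve_chunks helper is a hypothetical stand-in for a Cortex Search query;
# the doc_chunks table and the model name are assumptions.
from snowflake.snowpark import Session

session = Session.builder.configs(
    {"account": "<account>", "user": "<user>", "password": "<password>"}
).create()

def retrieve_chunks(question: str, k: int = 5) -> list[str]:
    """Hypothetical retrieval step: in practice this would query a Cortex Search
    service doing hybrid (vector + keyword) search over indexed documents."""
    rows = session.sql(
        f"SELECT chunk_text FROM doc_chunks LIMIT {int(k)}"  # placeholder retrieval
    ).collect()
    return [row["CHUNK_TEXT"] for row in rows]

def answer(question: str) -> str:
    # Ground the LLM on the retrieved context, then generate with Cortex COMPLETE.
    context = "\n".join(retrieve_chunks(question))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    row = session.sql(
        "SELECT SNOWFLAKE.CORTEX.COMPLETE('llama3-70b', ?) AS response",
        params=[prompt],
    ).collect()[0]
    return row["RESPONSE"]

print(answer("What does our refund policy say about partial returns?"))
```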

Finally, the new Cortex Guard, based on Meta’s Llama Guard, is a safeguarding product that uses LLMs to flag and filter out harmful content across organizational data and assets, thereby ensuring enterprises get reliable and safe answers from the chatbots built via Cortex.
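
The announcement does not detail how Cortex Guard is wired into the stack, but the concept can be sketched as a safety model screening responses before they reach the user. Everything below, including the guard model id, the guard prompt and the post-generation filtering flow, is a hypothetical illustration.

```python
# Hypothetical sketch of the Cortex Guard idea: screen a chatbot answer with a
# safety model before returning it. The guard model id, guard prompt and the
# post-generation filtering flow are assumptions for illustration only.
from snowflake.snowpark import Session

session = Session.builder.configs(
    {"account": "<account>", "user": "<user>", "password": "<password>"}
).create()

def complete(model: str, prompt: str) -> str:
    # Thin wrapper around the Cortex COMPLETE SQL function.
    row = session.sql(
        "SELECT SNOWFLAKE.CORTEX.COMPLETE(?, ?) AS out", params=[model, prompt]
    ).collect()[0]
    return row["OUT"]

def guarded_answer(question: str) -> str:
    draft = complete("mistral-large", question)  # candidate answer
    verdict = complete(
        "llama-guard-2",  # hypothetical guard model id
        f"Classify the following assistant response as 'safe' or 'unsafe':\n{draft}",
    )
    # Only release the answer if the guard model deems it safe.
    return draft if "unsafe" not in verdict.lower() else "Sorry, I can't help with that."

print(guarded_answer("Summarize our password reset policy for customers."))
```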

Strengthened MLOps

Once the apps are deployed, the next step is to ensure that the models powering them stay on track. This is what the company aims to address with Snowflake ML’s MLOps capabilities, which allow teams to seamlessly discover, manage and govern features, models and metadata across the entire ML lifecycle.

The offering already included a Model Registry, which lets users govern the access and use of all types of AI models so they can deliver more personalized experiences and cost-saving automations. Now, as the next step, it is being enhanced with a Feature Store, an integrated solution for data scientists and ML engineers to create, store, manage and serve consistent ML features for model training and inference.
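
A minimal sketch of how the two pieces could fit together with the snowflake-ml-python package is shown below. Both features were in preview at announcement time, so the exact API surface may differ; the database, warehouse, table and model names are illustrative.

```python
# Minimal sketch of pairing the Model Registry with the new Feature Store using
# the snowflake-ml-python package. Both were in preview at announcement time, so
# the exact API surface may differ; all object names are illustrative.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from snowflake.snowpark import Session
from snowflake.ml.registry import Registry
from snowflake.ml.feature_store import FeatureStore, Entity, FeatureView, CreationMode

session = Session.builder.configs(
    {"account": "<account>", "user": "<user>", "password": "<password>"}
).create()

# 1) Register reusable features so training and inference read the same definitions.
fs = FeatureStore(
    session=session,
    database="ML_DB",
    name="CHURN_FEATURES",
    default_warehouse="ML_WH",
    creation_mode=CreationMode.CREATE_IF_NOT_EXIST,
)
customer = Entity(name="CUSTOMER", join_keys=["CUSTOMER_ID"])
fs.register_entity(customer)

feature_df = session.sql(
    "SELECT customer_id, COUNT(*) AS orders_90d FROM orders GROUP BY customer_id"
)
fv = FeatureView(name="customer_activity", entities=[customer], feature_df=feature_df)
fs.register_feature_view(feature_view=fv, version="v1")

# 2) Train a toy model and log it in the Model Registry for governed reuse.
X = pd.DataFrame({"ORDERS_90D": [0, 1, 5, 12]})
y = [0, 0, 1, 1]
model = LogisticRegression().fit(X, y)

registry = Registry(session=session, database_name="ML_DB", schema_name="MODELS")
registry.log_model(
    model,
    model_name="churn_classifier",
    version_name="v1",
    sample_input_data=X,
)
```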

The company is also adding ML Lineage into the mix, giving teams the ability to trace the usage of features, datasets, and models across the end-to-end ML lifecycle.

Currently, the Feature Store is in public preview, while ML Lineage is in private preview, with access limited to select enterprises. The new Cortex capabilities are also in private preview. Snowflake says it expects to move them into public preview soon, but the exact timeline remains unclear.

Snowflake Data Cloud Summit runs from June 3 to June 6, 2024.





Author: Shubham Sharma
Source: Venturebeat
Reviewed By: Editorial Team