AI & RoboticsNews

Docker dives into AI to help developers build GenAI apps




Underneath just about every generative AI application for training or inference today you’ll likely find Docker containers as the primary approach to deployment.

Today at the DockerCon conference in Los Angeles, Docker Inc., the eponymous company behind the open-source Docker container technology, is taking a dive into the deep end of AI with a series of initiatives designed to help developers build generative AI applications more rapidly.

Among the efforts is the launch of a new GenAI stack that integrates Docker with the Neo4j graph database, the LangChain model-chaining framework and Ollama for running large language models (LLMs). The new Docker AI product is also debuting at DockerCon as an integrated way for developers to get AI-powered insights and guidance for container-based development.

The critical importance of Docker to the modern development ecosystem cannot be overstated, and the new AI initiatives could have a big impact on how GenAI applications are built. Docker has doubled down on its developer focus in recent years, an effort the company's CEO said is paying off.


"For four years running, Stack Overflow's community of developers has voted us number one most wanted, number one most loved developer tool," Docker CEO Scott Johnston told VentureBeat. "And we're now up to 20 million monthly active developers from all around the world."

Credit: Docker Inc.

What the Docker GenAI stack brings to developers

While the use of Docker containers to help share and deploy AI is pervasive, Johnston said that there is also a need to make development of GenAI applications easier.

GenAI applications typically require a few core elements: a vector database, which Neo4j now offers as part of its graph database platform; an LLM, which Ollama provides through a platform that lets users run models such as Llama 2 locally; and, because modern GenAI applications are commonly multi-step, an orchestration framework, which is where LangChain fits in. Getting all those pieces configured to work together in containers would normally take real effort, which the GenAI stack now significantly simplifies.
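The pattern those pieces implement together — retrieve relevant context from a vector store, then hand it to an LLM — is retrieval-augmented generation. The toy sketch below is not code from Docker's stack; it substitutes a bag-of-words cosine similarity for real embeddings purely to show the retrieve-then-prompt flow:

```python
import math
import re
from collections import Counter

def embed(text):
    """Toy 'embedding': a term-frequency vector over word tokens."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, docs):
    """Assemble a RAG prompt: retrieved context plus the user's question."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Containers are started with docker run.",
    "Images are built with docker build.",
]
print(build_prompt("How do I build an image?", docs))
```

In the actual stack, Neo4j's vector index plays the role of `retrieve`, Ollama serves the model that answers the prompt, and LangChain chains the two steps together.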

The Docker GenAI stack is designed to help developers, and the enterprises they work for, get started with AI development using containers more easily. Several use cases are being targeted, including a support-agent bot with retrieval-augmented generation (RAG) capability, a Python coding assistant and automated content generation.

"It's pre-configured, it's ready to go and they [developers] can start coding and experimenting to help get the ball rolling," Johnston said.

The whole stack is designed so it can run locally on a developer system and is being made freely available. As developers build out applications and need deployment and commercial support, Johnston said that there will be options available from Docker and its partners.
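Running locally in practice means the stack ships as a Docker Compose project: one compose file brings up the database, the model runner and the demo application together on a developer machine. The sketch below is illustrative only — the service names, images and ports are assumptions for this example, not the stack's actual compose file:

```yaml
services:
  database:
    image: neo4j:5          # graph database with vector index support
    ports: ["7474:7474", "7687:7687"]
  llm:
    image: ollama/ollama    # local LLM runner (e.g. for Llama 2)
    ports: ["11434:11434"]
  app:
    build: .                # the LangChain-based demo application
    depends_on: [database, llm]
```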

Docker AI: a ‘mech suit’ for developers

There is no shortage of GenAI developer tools in the market today, with popular options including GitHub Copilot and Amazon CodeWhisperer, among others.

Docker is now entering that fray with its own GenAI tool, simply called Docker AI. Rather than referring to Docker AI as a copilot — a term that Microsoft and other vendors increasingly use for GenAI tools that assist users — Docker is using the term "mech suit." The basic idea is that with the mech suit, developers have more power and strength to accomplish tasks.

Docker AI has been trained on Docker’s proprietary data from millions of Dockerfiles, compose files, and error logs. Docker AI integrates directly into developers’ workflows to provide assistance when errors occur. It will display potential fixes within the development environment and allow developers to test the fix before committing changes. The goal is to create a better experience for developers to troubleshoot and fix issues when they arise.
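Docker has not published how Docker AI works internally. Purely as a hypothetical illustration of the "match a known error, suggest a fix" workflow described above, a rule-based stand-in might look like this (the error patterns and suggestions here are invented for the example, not Docker AI's actual behavior):

```python
import re

# Hypothetical mapping of common Docker error patterns to suggested fixes.
FIXES = [
    (re.compile(r"pull access denied"),
     "Check the image name and run 'docker login' first."),
    (re.compile(r"port is already allocated"),
     "Another container is using that port; pick a different host port."),
    (re.compile(r"no space left on device"),
     "Free disk space, for example with 'docker system prune'."),
]

def suggest_fix(error_log):
    """Return the first suggestion whose pattern matches the error log, if any."""
    for pattern, suggestion in FIXES:
        if pattern.search(error_log):
            return suggestion
    return None

print(suggest_fix("Bind for 0.0.0.0:8080 failed: port is already allocated"))
```

The real product presumably replaces the hand-written rules with a model trained on the Dockerfile and error-log corpus the article describes, but the surface workflow — error in, candidate fix out, developer verifies before committing — is the same.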

Johnston noted that while tools like GitHub Copilot are useful and powerful, Docker AI is specifically tuned to enable container development.

“It has been trained on a rich proprietary stream of Docker data that other LLMs don’t have access to,” he said.



Author: Sean Michael Kerner
Source: VentureBeat
Reviewed By: Editorial Team

