AI & RoboticsNews

From prompt chaos to clarity: How to build a robust AI orchestration layer

Editor’s note: Emilia will lead an editorial roundtable on this topic at VB Transform next week. Register today.

AI agents seem like an inevitability these days. Most enterprises already use an AI application and may have deployed at least one single-agent system, with plans to pilot multi-agent workflows.

Managing all that sprawl, especially when attempting to build interoperability in the long run, can become overwhelming. Reaching that agentic future means creating a workable orchestration framework that directs the different agents. 

The demand for AI applications and orchestration has given rise to an emerging battleground, with companies focused on providing frameworks and tools gaining customers. Enterprises can now choose between orchestration framework providers such as LangChain, LlamaIndex, CrewAI, Microsoft’s AutoGen and OpenAI’s Swarm.

Enterprises also need to consider the type of orchestration framework they want to implement. They can choose between a prompt-based framework, agent-oriented workflow engines, retrieval and indexed frameworks, or even end-to-end orchestration. 

As many organizations are just beginning to experiment with multiple AI agent systems or want to build out a larger AI ecosystem, specific criteria are at the top of their minds when choosing the orchestration framework that best fits their needs. 

This larger pool of options pushes the space forward, letting enterprises evaluate the full range of orchestration approaches rather than forcing their AI systems into an ill-fitting framework. While the choice can seem overwhelming, organizations can look to best practices for selecting an orchestration framework and figure out what works well for them.

Orchestration platform Orq noted in a blog post that AI management systems include four key components: prompt management for consistent model interaction, integration tools, state management and monitoring tools to track performance. 

Best practices to consider

For enterprises planning to embark on their orchestration journey or improve their current one, some experts from companies like Teneo and Orq note at least five best practices to start with. 

  • Define your business goals 
  • Choose tools and large language models (LLMs) that align with your goals
  • Lay out what you need from an orchestration layer and prioritize those needs, i.e., integration, workflow design, monitoring and observability, scalability, security and compliance
  • Know your existing systems and how to integrate them into the new layer
  • Understand your data pipeline

As with any AI project, organizations should take cues from their business needs. What do they need the AI application or agents to do, and how are these planned to support their work? Starting with this key step will help better inform their orchestration needs and the type of tools they require.

Teneo said in a blog post that once that’s clear, teams must identify what they need from their orchestration system and make those the first features they evaluate. Some enterprises may care more about monitoring and observability than workflow design, for example. Most orchestration frameworks offer a range of features, and components such as integration, workflow, monitoring, scalability and security are often the top priorities for businesses. Understanding what matters most to the organization will better guide how it builds out its orchestration layer. 

In a blog post, LangChain stated that businesses should be aware of what information or work is passed to models. 

“When using a framework, you need to have full control over what gets passed into the LLM, and full control over what steps are run and in what order (in order to generate the context that gets passed into the LLM). We prioritize this with LangGraph, which is a low-level orchestration framework with no hidden prompts, no enforced ‘cognitive architectures’. This gives you full control to do the appropriate context engineering that you require,” the company said. 
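The principle LangChain describes — explicit control over which steps run, in what order, and exactly what context reaches the model — can be sketched without any framework at all. The hand-rolled pipeline below is illustrative only (the step names and state shape are invented, and it is not LangGraph’s API):

```python
from typing import Callable

# Each step is a plain function over shared state: every step is visible,
# the order is explicit, and the prompt is assembled in the open --
# no hidden prompts, no enforced architecture.
Step = Callable[[dict], dict]

def retrieve(state: dict) -> dict:
    # Illustrative retrieval step: attach context for later steps.
    state["context"] = f"docs matching '{state['question']}'"
    return state

def build_prompt(state: dict) -> dict:
    # We decide exactly what the LLM would see.
    state["prompt"] = f"Context: {state['context']}\nQuestion: {state['question']}"
    return state

def run_pipeline(steps: list[Step], state: dict) -> dict:
    for step in steps:  # ordering is explicit and inspectable
        state = step(state)
    return state

final = run_pipeline([retrieve, build_prompt], {"question": "What is orchestration?"})
print(final["prompt"])
```

Because the step list is just data, reordering steps, inserting a guardrail, or logging the exact context passed to the model are all one-line changes — which is the control the quote is arguing for.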

Since most enterprises plan to add AI agents into existing workflows, it’s best practice to know which systems need to be part of the orchestration stack and find the platform that integrates best. 

As always, enterprises need to know their data pipeline so they can compare the performance of the agents they are monitoring. 
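One concrete reason to know the pipeline: agents can only be compared fairly if they are scored against the same data. A minimal, entirely hypothetical evaluation harness (the agents here are stubs, not real models):

```python
# Shared evaluation set drawn from a known data pipeline, so both
# agents are measured on identical inputs.
eval_set = [
    {"input": "2+2", "expected": "4"},
    {"input": "3*3", "expected": "9"},
]

def agent_a(question: str) -> str:
    # Stub agent that happens to know both answers.
    return {"2+2": "4", "3*3": "9"}.get(question, "")

def agent_b(question: str) -> str:
    # Naive stub agent that always answers "4".
    return "4"

def accuracy(agent, dataset) -> float:
    hits = sum(agent(ex["input"]) == ex["expected"] for ex in dataset)
    return hits / len(dataset)

print(accuracy(agent_a, eval_set), accuracy(agent_b, eval_set))  # 1.0 0.5
```

Without a common, well-understood data pipeline feeding that evaluation set, the two accuracy numbers would not be comparable in the first place.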


Author: Emilia David
Source: VentureBeat
Reviewed By: Editorial Team
