AI & RoboticsNews

Prophecy’s generative AI assistant ushers in a new era of data pipeline automation

Prophecy

Data engineering startup Prophecy is taking a fresh approach to data pipeline creation.

Known for its low-code SQL tooling, the California-based company today announced Data Copilot, a generative AI assistant that can create trusted data pipelines from natural language prompts and improve pipeline quality with greater test coverage.

The capability has the potential to make pipeline development far faster, freeing up data engineers' time for more pressing tasks. Generative AI has previously been applied to other parts of the data workflow, such as querying and cataloging.

Along with the tool, Prophecy announced a new platform to help companies build generative AI applications on top of their privately held data.


How does Data Copilot help?

Historically, building a data pipeline revolved around writing complex SQL code, and data engineers had to invest significant time and effort just to bring business users' desired pipelines to life. Then players like Prophecy came in, providing low-code solutions (a visual drag-and-drop canvas) to simplify things.

Now, as the next step in this work, Prophecy has introduced Data Copilot, which simply requires users to describe what they want in natural language.

Once a prompt is given, the platform uses it to suggest a pipeline that brings together all the data for the desired report. The user can then preview and accept the pipeline, or reject it and ask for something else.

“This will enable businesses to be less bottlenecked on data engineering resources, as business data users and others are able to serve themselves… further, these data products will also be made more consistent and of higher quality, as Prophecy Data Copilot suggests transformations and expressions as well,” Raj Bains, Prophecy CEO and cofounder, told VentureBeat.

To achieve this, Bains explained, Prophecy creates a comprehensive knowledge graph of a company’s data models. The graph contains technical metadata associated with tables and schemas; business metadata from data catalogs; and historical queries and code executed on SQL, Spark or Airflow. It is then fed into a state-of-the-art large language model that translates the user’s natural language query into a performant data pipeline. The system also learns and improves from user feedback, he added.
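To make the idea concrete, the sketch below shows how schema and catalog metadata from such a knowledge graph might be serialized into context for a language model. This is a simplified illustration only; the table names, dictionary layout and prompt format are invented for the example, and Prophecy's internal representation is not public.

```python
# Illustrative sketch: assemble knowledge-graph metadata into an LLM prompt.
# All names and structures here are hypothetical, not Prophecy's actual format.

def build_pipeline_prompt(knowledge_graph: dict, user_request: str) -> str:
    """Serialize table schemas and business descriptions as model context."""
    lines = ["You are a data-pipeline assistant. Available tables:"]
    for table, info in knowledge_graph.items():
        cols = ", ".join(f"{c} ({t})" for c, t in info["schema"].items())
        lines.append(f"- {table}: {cols}  # {info['description']}")
    lines.append(f"Task: {user_request}")
    lines.append("Respond with a SQL pipeline that produces the requested report.")
    return "\n".join(lines)

# Toy knowledge graph: two tables with technical and business metadata.
graph = {
    "orders": {
        "schema": {"order_id": "int", "amount": "decimal", "region": "text"},
        "description": "one row per customer order",
    },
    "regions": {
        "schema": {"region": "text", "manager": "text"},
        "description": "sales regions and their managers",
    },
}

prompt = build_pipeline_prompt(graph, "total order amount per region manager")
```

The resulting prompt would then be sent to the language model, whose SQL response becomes the suggested pipeline the user previews and accepts or rejects.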

Platform to build gen AI apps

In addition to Data Copilot, Prophecy is adding a platform for building generative AI solutions, such as chatbots, on top of privately owned enterprise data.

The offering works in two steps. First, a data engineer populates a knowledge warehouse with unstructured text from internal messaging systems, documents, support tickets and more, converted into vector embeddings. Next, the engineer builds a streaming inference pipeline (that is, a chatbot) backed by OpenAI.
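The first step can be sketched as follows. A toy bag-of-words "embedding" stands in for a real embedding model, and a plain list stands in for the knowledge warehouse; the point is the structure (documents converted to vectors and indexed), not the math, and none of these names come from Prophecy's product.

```python
# Minimal sketch of step one: convert internal documents to vectors and
# store them in an in-memory index. The embedding is a toy stand-in for a
# real model; "knowledge_warehouse" is a hypothetical name for the store.
import math
from collections import Counter

def embed(text: str) -> dict:
    """Toy embedding: unit-normalized term frequencies."""
    counts = Counter(text.lower().split())
    norm = math.sqrt(sum(v * v for v in counts.values()))
    return {w: v / norm for w, v in counts.items()}

knowledge_warehouse = []  # list of (document, vector) pairs

for doc in [
    "reset a password from the account settings page",
    "support tickets are triaged within one business day",
]:
    knowledge_warehouse.append((doc, embed(doc)))
```

In a production system the toy `embed` would be replaced by a real embedding model and the list by a vector database, but the ingestion loop keeps the same shape.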

“When a user poses a question, a vector lookup is performed to retrieve the internal documents that are most relevant to that question,” Bains explained. “Those documents provide crucial context for answering the question at hand. This context plus the question itself make up the prompt, which is sent to OpenAI via APIs. The answer is then returned to the user.”
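The query path Bains describes, vector lookup, context assembly and a call to OpenAI, can be sketched like this. The embedding and similarity functions are toy stand-ins, and the final API call is shown only as a comment rather than executed; everything here is illustrative, not Prophecy's implementation.

```python
# Sketch of the retrieval-augmented query flow: embed the question, find the
# closest stored documents, and build a prompt with them as context.
import math
from collections import Counter

def embed(text: str) -> dict:
    """Toy embedding: unit-normalized term frequencies."""
    counts = Counter(text.lower().split())
    norm = math.sqrt(sum(v * v for v in counts.values()))
    return {w: v / norm for w, v in counts.items()}

def cosine(a: dict, b: dict) -> float:
    """Cosine similarity of two unit vectors stored as sparse dicts."""
    return sum(v * b.get(w, 0.0) for w, v in a.items())

docs = [
    "reset a password from the account settings page",
    "invoices are emailed on the first of each month",
]
index = [(d, embed(d)) for d in docs]

def answer_prompt(question: str, k: int = 1) -> str:
    """Retrieve the k most relevant documents and build the model prompt."""
    q_vec = embed(question)
    ranked = sorted(index, key=lambda pair: cosine(q_vec, pair[1]), reverse=True)
    context = "\n".join(d for d, _ in ranked[:k])
    return f"Context:\n{context}\n\nQuestion: {question}"

prompt = answer_prompt("how do I reset my password?")
# In production, this prompt would be sent to a model via the OpenAI API,
# and the model's answer returned to the user.
```

The retrieved document about password resets ends up in the prompt's context, which is exactly the "crucial context" Bains describes being sent alongside the question.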

Both new offerings are available starting today.



Author: Shubham Sharma
Source: Venturebeat
