
Vectara aims to ground generative AI conversational search without hallucinations

Vectara is continuing to grow as an AI-powered conversational search platform, with new capabilities announced today that aim to improve generative AI for business data.

The Santa Clara, Calif.-based startup emerged from stealth in Oct. 2022, led by the former CTO and cofounder of big data vendor Cloudera. Vectara originally branded its platform as a neural search-as-a-service technology, an approach that combines AI-based large language models (LLMs), natural language processing (NLP), data integration pipelines and vector techniques to create a neural network that can be optimized for search.

Now, the company is expanding its capabilities with generative AI that provides summarization of results for a more conversational AI experience. The company is also adding what it calls “grounded generation” capabilities in a bid to help reduce the risk of AI hallucinations and improve overall search accuracy.

“It’s all about moving from legacy, which is a search engine that gives you a list of results, and what ChatGPT opened our eyes to, which is that all consumers want is the answer,” Vectara CEO and cofounder Amr Awadallah told VentureBeat. “We just want the answer, don’t give me a list of results and I have to go read to figure out what I’m looking for — just give me the answer itself.”

Alongside the new features, Vectara announced that it has closed a $28.5 million seed round, which includes the $20 million the company had previously announced in Oct. 2022. The funding was led by Race Capital, and Vectara has added a new strategic board of advisors that includes Databricks CTO Matei Zaharia.

Generative AI-powered search is increasingly competitive

When Vectara first emerged in 2022, there were few competitors in the generative AI search space, but that has changed dramatically in the first half of 2023.

In recent months, Google has entered the space with a preview of its Search Generative Experience, announced at the Google I/O conference in May. Microsoft’s Bing has integrated OpenAI technology to provide a generative AI experience as well. Elasticsearch has also been expanded to integrate generative AI, with an update announced on May 23.

Awadallah is well aware of the increasingly competitive landscape and is confident in his firm’s differentiation. A core element of the Vectara platform is what is known as a “retrieval engine,” the technology that matches the right semantic concepts with entries in a vector database.

The original basis for the Vectara retrieval engine comes from research that Awadallah’s cofounder Amin Ahmad did while at Google, described in the 2019 paper “Multilingual Universal Sentence Encoder for Semantic Retrieval.” Awadallah explained that Vectara has improved on that original design to provide a highly accurate retrieval system.
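
As a rough illustration of how this kind of semantic retrieval works in general (not Vectara’s actual implementation), the sketch below embeds documents and a query into dense vectors and ranks the documents by cosine similarity. The embed() function here is a hypothetical stand-in for a real multilingual sentence encoder of the sort described in the paper.

```python
import numpy as np

# Hypothetical stand-in for a real sentence encoder: in practice this would be
# a multilingual embedding model that maps text to a fixed-length dense vector.
def embed(text: str) -> np.ndarray:
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    vec = rng.standard_normal(512)
    return vec / np.linalg.norm(vec)  # unit-normalize so dot product = cosine

documents = [
    "Our refund policy allows returns within 30 days of purchase.",
    "The Q3 earnings call is scheduled for October 12.",
    "Support tickets are answered within one business day.",
]

# Index step: embed every document once and keep the vectors.
doc_vectors = np.stack([embed(d) for d in documents])

def retrieve(query: str, top_k: int = 2) -> list[tuple[float, str]]:
    """Rank documents by cosine similarity to the query embedding."""
    q = embed(query)
    scores = doc_vectors @ q
    best = np.argsort(scores)[::-1][:top_k]
    return [(float(scores[i]), documents[i]) for i in best]

print(retrieve("How long do I have to return a purchase?"))
```

With a real encoder, semantically related text lands close together in vector space, so the refund-policy document would score highest for the return question even though the wording differs; the random stub above only demonstrates the mechanics.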

What grounded generation is all about

Prior to the new update, the search platform provided users with a list of results that benefited from both keyword and semantic AI capabilities. The list, however, was still just a list that a user had to read through to get an answer.

With the platform update, users can now get a generative AI result that will summarize the most relevant sources to provide an answer to a query.  

Generative AI results, such as those from ChatGPT, carry the risk of AI hallucination, in which an inaccurate result is presented as if it were correct. Awadallah explained that hallucinations occur in LLMs because the model has compressed a vast amount of information and can generate an answer that is not true.

To help solve that issue, Vectara has integrated a grounded generation approach, which other vendors sometimes refer to as retrieval-augmented generation (RAG). The basic idea is that generated answers are tied to source citations, which helps improve accuracy and directs users to more information from the original source.
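
A minimal sketch of the general RAG pattern (not Vectara’s specific grounded generation implementation) looks like the following. It reuses the retrieve() helper from the earlier sketch, and call_llm() is a hypothetical placeholder for whatever completion API is actually used.

```python
def call_llm(prompt: str) -> str:
    # Hypothetical placeholder: wire this to an actual LLM client.
    raise NotImplementedError("plug in an LLM client here")

def grounded_answer(query: str, top_k: int = 3) -> str:
    """Answer a query from retrieved passages only, with numbered citations."""
    passages = [doc for _, doc in retrieve(query, top_k=top_k)]
    sources = "\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    prompt = (
        "Answer the question using ONLY the numbered sources below, and cite "
        "the sources you rely on like [1]. If the sources do not contain the "
        "answer, say you don't know.\n\n"
        f"Sources:\n{sources}\n\n"
        f"Question: {query}\nAnswer:"
    )
    return call_llm(prompt)
```

Because every claim in the answer is tied back to a numbered source, a reader can check the citation against the original document, which is what makes the output “grounded” rather than free-form generation.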

Zero-shot ML

The Vectara platform also uses what is known as a “zero-shot” machine learning (ML) approach that enables the model to continuously incorporate new data, without the need for time-consuming fine-tuning and retraining.

“As data is coming in, within a few seconds that data is already part of the mix and it will be reflected in the answers that are being generated by the engine,” said Awadallah.
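
Conceptually, that continuous ingestion can be pictured as an index where adding a document only requires embedding it, with no retraining of the model. The toy class below (building on the embed() stub and numpy import from the first sketch) makes new text searchable the moment it is added; it is an illustration of the general idea, not Vectara’s engine.

```python
class LiveIndex:
    """Toy in-memory index: documents become searchable as soon as they are
    embedded; the encoder itself is never fine-tuned or retrained."""

    def __init__(self) -> None:
        self.texts: list[str] = []
        self.vectors: list[np.ndarray] = []

    def add(self, text: str) -> None:
        # Embedding is the only work required at ingest time.
        self.texts.append(text)
        self.vectors.append(embed(text))

    def search(self, query: str, top_k: int = 3) -> list[str]:
        q = embed(query)
        scores = np.stack(self.vectors) @ q
        order = np.argsort(scores)[::-1][:top_k]
        return [self.texts[i] for i in order]

index = LiveIndex()
index.add("New pricing takes effect on June 1.")  # searchable immediately
print(index.search("When does the new pricing start?"))
```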

Overall, he emphasized that the strategy for his company is to help businesses not just find the right search results, but to deliver actions for end users.

“The longer term belief is we’re moving from search engines to answer engines,” said Awadallah. “Right now what we’re doing is ‘answer engines’ — meaning I don’t give you back a list of results, I’m giving you back the answer. But if you get the answers to be truly accurate, we can move from answer engines to action engines.”



Author: Sean Michael Kerner
Source: VentureBeat
