
Vectara’s AI-based neural search-as-a-service challenges keyword-based searches


Is there a better way to build a search tool, one that produces more relevant results than keyword-based techniques alone?

That’s one of the many questions that former Google staffers Amr Awadallah (CEO), Amin Ahmad (CTO) and Tallat Shafaat (chief architect) set out to answer with their new startup, which has been operating in stealth under the name ZIR AI. Today, ZIR AI is emerging from stealth under the name Vectara, with $20 million in seed funding and the availability of the company’s neural search-as-a-service technology.

The foundational premise of Vectara is that artificial intelligence (AI)-based large language models (LLMs) combined with natural language processing (NLP), data integration pipelines and vector techniques can create a neural network that is useful for multiple use cases, including search.

“At the heart of what we have built is a neural network that makes it very simple for any company to tap that power and do something useful with it,” Awadallah told VentureBeat. “Large language models and neural networks have transformed how we understand the meaning behind text, and the first offering that we’re launching is neural search-as-a-service.”


How Vectara combines multiple AI techniques into something new

LLMs and neural networks in general use vectors as a foundational element. “One of the key elements of doing large language models and neural network inference is a vector-matching system in the middle,” he said.

Awadallah explained that neural networks take in information and output vectors that represent what the network has learned. He stressed that Vectara’s platform isn’t just about analyzing vectors; rather, it covers the whole data pipeline.

Multiple vendors in the market today, such as Pinecone, provide vector-database technology, but a vector database is only one part of what Vectara is providing.

Awadallah explained that when a user issues a query, Vectara uses its neural network to convert that query from the language space, meaning vocabulary and grammar, into the vector space, which is numbers and math. Vectara indexes all the data an organization wants to search in a vector database, which then finds the vectors closest to the user’s query.
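Vectara has not published its models or APIs; the following is a minimal sketch of the general technique, using the open sentence-transformers library and the "all-MiniLM-L6-v2" encoder purely as stand-ins. It shows queries and documents being mapped into a shared vector space, with retrieval reduced to a nearest-neighbor lookup.

```python
# Sketch only: an open encoder standing in for Vectara's (unpublished) models.
import numpy as np
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")

documents = [
    "How to reset a forgotten account password",
    "Quarterly revenue grew 12% year over year",
    "Instructions for configuring two-factor authentication",
]
# Map documents from the language space into the vector space.
doc_vectors = encoder.encode(documents, normalize_embeddings=True)

query = "I can't log in to my account"
query_vector = encoder.encode([query], normalize_embeddings=True)[0]

# With normalized vectors, cosine similarity is a dot product; a production
# system would use an approximate nearest-neighbor index instead of brute force.
scores = doc_vectors @ query_vector
best = int(np.argmax(scores))
print(documents[best], float(scores[best]))
```

Note that the query shares no keywords with the top-ranked document; the match comes from proximity in the vector space rather than term overlap, which is the contrast with keyword-based search that Awadallah is drawing.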

Feeding the vector database is a large data pipeline that ingests different data types. For example, the pipeline knows how to handle standard Word documents as well as PDF files and can understand their structure. The Vectara platform also ranks results with an approach known as cross-attentional ranking, which takes into account the meaning of both the query and the candidate results to improve relevance.
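Again as a hedged illustration rather than Vectara’s actual implementation, the sketch below shows the general shape of cross-attention-style reranking with an open cross-encoder model: the query and each candidate are read together, so the score reflects their joint meaning rather than a comparison of two independently computed vectors.

```python
# Sketch only: an open cross-encoder standing in for Vectara's reranking stage.
from sentence_transformers import CrossEncoder

reranker = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")

query = "I can't log in to my account"
candidates = [
    "How to reset a forgotten account password",
    "Quarterly revenue grew 12% year over year",
    "Instructions for configuring two-factor authentication",
]

# Score every (query, candidate) pair jointly, then re-order the candidates.
scores = reranker.predict([(query, c) for c in candidates])
reranked = sorted(zip(candidates, scores), key=lambda pair: pair[1], reverse=True)
for text, score in reranked:
    print(f"{score:.2f}  {text}")
```

In practice this kind of reranker is applied only to the short list returned by the vector index, since scoring every document jointly with the query would be far too slow.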

From big data on Hadoop to neural search-as-a-service

Vectara isn’t the first startup Awadallah has helped get off the ground; he was also a cofounder of Hadoop provider Cloudera back in 2008. Lessons from that experience are informing decision-making at the new startup.

One of the lessons he has learned over the years is that it’s never a good idea to build technology just for the sake of technology. Awadallah emphasized that Vectara’s neural data-processing pipeline is powerful and could be used for many different applications. The founders chose to start with search because it’s a challenge that a large number of organizations face.

“We wanted to start with a problem that everybody has that needs to be solved in a good way,” Awadallah said.

Awadallah and his cofounders all had experience at Google, where LLMs and transformer techniques have been widely used. He explained that with a transformer, it’s possible to better understand context and get a better result for a query. A transformer-based system doesn’t just understand the meaning of a given word; it also understands how the word relates to the other words in its sentence, as well as in the previous and following sentences, to get the right context.
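The point about context can be made concrete with a small sketch, again using an open model (BERT via Hugging Face transformers) rather than anything Vectara has disclosed: the same word receives different vectors depending on the sentence around it.

```python
# Sketch only: showing that a transformer's word representation depends on context.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual embedding of `word` within `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (num_tokens, dim)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

v_river = word_vector("we sat on the bank of the river", "bank")
v_money = word_vector("she deposited cash at the bank", "bank")

cos = torch.nn.functional.cosine_similarity(v_river, v_money, dim=0)
print(f"same word, different contexts, cosine similarity: {cos:.2f}")
```

A static, keyword-style representation would treat both occurrences of “bank” identically; the transformer assigns them noticeably different vectors because the surrounding words differ.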

“We did this at Google,” he said. “We know how to properly fine-tune the parameters to get the best outcome for our customers, and that’s truly what differentiates us.”

Search is only the first service for Vectara. Awadallah said that his company will add new services over time, with likely future candidates including providing recommendations, as well as tools to help users surface related topics.

“The Industrial Revolution was about how we make stuff with our hands and now we’re helping people to build things with stuff that is coming out of their brains,” Awadallah said. “That’s the foundation of this pipeline that we’re building, which is a neural network pipeline that allows you to process and extract value out of data.”



Author: Sean Michael Kerner
Source: Venturebeat
