
OpenAI rival Cohere AI has flown under the radar. That may be about to change.



Aidan Gomez, cofounder and CEO of Cohere AI, admits that the company, which offers developers and businesses access to natural language processing (NLP) powered by large language models (LLMs), is “crazy under the radar.” 

Given the quality of the company’s foundation models, which many say are competitive with the best from Google, OpenAI and others, that shouldn’t be the case, he told VentureBeat.

Perhaps, he mused, it’s because the company isn’t releasing attention-grabbing consumer demos like OpenAI’s ChatGPT. But Cohere, he emphasized, has been “squarely focused on the enterprise and how we can add value there.” 

Cohere reportedly in talks for new funding

In any case, the Toronto-based Cohere, founded in 2019 by Gomez, Ivan Zhang and Nick Frosst, may not remain unnoticed for long.


Reuters reported on Tuesday that Cohere is in talks to raise hundreds of millions of dollars in a funding round that could value the startup at more than $6 billion, in “the latest sign of the investment frenzy around generative AI.” And back in October 2022, the Wall Street Journal reported that Cohere had been in talks with both Google and Nvidia about a possible investment.

While Cohere has not commented on the funding rumors, one vote of confidence for the company is the recent addition of Martin Kon, formerly YouTube’s finance chief, who joined as president and chief operating officer in December. 

Kon said he was impressed not only with the deep expertise of Cohere’s cofounders, but also with their focus on making LLMs relevant to developers and enterprises.

“I saw this next wave of disruption and transformation and it was just really exciting,” he said. “But thinking about developers, about enterprises and solving real business problems, that was where I said, ‘I think I can bring something here.’”

According to its website, the Cohere platform can be used “to generate or analyze text to do things like write copy, moderate content, classify data and extract information, all at a massive scale.” It is available through an API as a managed service and via cloud machine learning (ML) platforms like Amazon SageMaker and Google Vertex AI. For enterprise customers with the highest data-protection and latency demands, Cohere’s platform is available as private LLM deployments in a virtual private cloud (VPC) or even on-premises.
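For developers, that access looks like an ordinary SDK call. The snippet below is a minimal sketch using Cohere’s Python SDK (installed with pip install cohere); the API key is a placeholder and the prompt is purely illustrative, not an example from the article.

```python
# Minimal sketch of calling Cohere's managed API via its Python SDK.
# The API key is a placeholder; the default model is used.
import cohere

co = cohere.Client("YOUR_API_KEY")  # placeholder key

# Generate a short piece of marketing copy from a prompt.
response = co.generate(
    prompt="Write a one-sentence product description for a reusable water bottle.",
    max_tokens=50,
)
print(response.generations[0].text)
```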

“We’re working directly with developers and enterprises to develop or apply the applications that will help them solve business problems,” Kon said. For example, “We’re working now with a global audio streaming platform to use multilingual semantic search to enable much better search through podcasts, and we’re also working with AI-powered copywriting companies like HyperWrite.” [Ed. note: Quote corrected 2/9/23 at 10:09 am ET.]
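As a rough illustration of how multilingual semantic search can work, the sketch below embeds a few invented transcript snippets and an English query into a shared vector space and ranks the snippets by cosine similarity. The model name “embed-multilingual-v2.0” and the snippets are assumptions made for the example, not details confirmed by the article.

```python
# Illustrative sketch of multilingual semantic search with Cohere embeddings.
# The API key, model name and transcript snippets are placeholders/assumptions.
import cohere
import numpy as np

co = cohere.Client("YOUR_API_KEY")  # placeholder key

transcripts = [
    "In this episode we discuss the history of jazz in New Orleans.",
    "Hoy hablamos sobre recetas tradicionales de la cocina mexicana.",
    "Cette semaine, un entretien sur l'entrainement au marathon.",
]

# Embed documents and query into the same multilingual vector space.
docs = np.array(
    co.embed(texts=transcripts, model="embed-multilingual-v2.0").embeddings
)
query = np.array(
    co.embed(texts=["podcasts about cooking"], model="embed-multilingual-v2.0").embeddings[0]
)

# Rank documents by cosine similarity to the query; the Spanish cooking
# episode should score highest even though the query is in English.
scores = docs @ query / (np.linalg.norm(docs, axis=1) * np.linalg.norm(query))
print(transcripts[int(np.argmax(scores))])
```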

Cohere founded by co-author of Transformer paper

Back in 2017, Gomez and his fellow Google Brain colleagues, who had co-authored the original Transformer paper, titled “Attention Is All You Need,” were frustrated.

The team had struck gold with Transformers — a neural network NLP breakthrough that captured the context and meaning of words more accurately than its predecessors: the recurrent neural network and the long short-term memory network. The Transformer architecture became the underpinning not only of LLMs like GPT-3 and ChatGPT, but also of non-language applications including OpenAI’s Codex and DeepMind’s AlphaFold.
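For readers curious about the mechanism behind that breakthrough, here is an illustrative NumPy sketch (not taken from the article) of scaled dot-product attention, the core operation of the Transformer: each token scores its affinity to every other token and builds a context-weighted representation from those scores.

```python
# Illustrative sketch of scaled dot-product attention from
# "Attention Is All You Need"; shapes and data are toy examples.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: arrays of shape (seq_len, d_k)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # pairwise token affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over keys
    return weights @ V                                # context-weighted values

# Toy example: 4 tokens, each projected to 8 dimensions.
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 8)
```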

“We built it initially for Google Translate, but then it was adopted in Search, Gmail, YouTube,” said Gomez. “So it kind of just swept Alphabet’s product areas, almost uniformly. It was driving really incredible changes inside Google.” 

But while Gomez saw huge adoption of Transformers within Google, there was not a lot of adoption outside of it. “There were crazy demos internally, but nothing was changing outside,” he said. “None of the infrastructure necessary for getting it into production was being built or adopted or being considered. Nobody really understood language models or how to make them useful, and this was before GPT-3. We were just getting so antsy — you’re face-to-face with something extraordinary and no one else sees it.” 

Compute resources and AI/ML expertise were adoption barriers

As a result, several Transformer co-authors, including Gomez, famously decided to leave Google and found their own startups (for example, Noam Shazeer founded Character AI, while Niki Parmar and Ashish Vaswani founded Adept AI).

“We just decided we needed to do our own thing,” said Gomez. “We felt there were some fundamental barriers keeping enterprises and young developers and startup founders from [adopting NLP], and there’s got to be a way to bring those barriers down.”

One of the biggest barriers for organizations that want to build products using NLP at scale, Gomez explained, was compute resources.

“To build these models, you need supercomputers with thousands of GPUs,” he said. “And there’s not a lot of supercomputers on earth, so it’s not like everyone could do it in-house.”

In addition, the AI and ML expertise to create these models is extremely rare and competitive. “We wanted to create a product that eliminates those two barriers,” he added. “We wanted to take something really hard — that only experts in that domain know how to do — and create an interface onto it that lets every single developer go and build with it.”

Cohere is not bound to a single cloud

One of Cohere’s selling points is that it is not bound to a single cloud, Gomez pointed out. “We’re not locked into Azure,” he said, referring to OpenAI’s relationship with Microsoft. “We have a relationship with Google and have access to their supercomputer TPU pods, and we also recently announced a partnership with AWS.”

This means that customers can deploy within their chosen cloud or even on premises. “If you want to be extremely low-latency, or if you don’t want us to have visibility into your customer data because it’s something super sensitive, we can support that in a way that no one else can,” he said. “No one else is offering that, not with the models that we have at the quality that we have.”

Thanks to the runaway success of ChatGPT, Gomez said educating people about the power of LLMs has become vastly easier. “Most of my time was spent educating people, but that has completely changed,” he said. “Now people are coming to us and saying, ‘hey, we saw this, we really want to build this.’”

When a new technology emerges, he explained, at first it tends to be all about education, and then it becomes common knowledge and all about deployment or production. “I think within the past couple months, we just flipped into deployment,” he said.

In particular, Gomez said he thinks knowledge assistance is a big emerging use case for enterprise businesses. “Copywriting was one of the first products to find market fit, like Jasper, but now it’s starting to spread out a lot more,” he explained. “We’re starting to see stuff like summarization. We’re starting to see large enterprises saying ‘hey, I really need this.’ I think having a much more natural, powerful way to discover information specific to your organization (or to you) is about to be unlocked.”

A look back at Google — and ahead

The Transformer paper was a big success for its Google co-authors, who had the earliest inkling of what was coming down the pike when it came to LLMs.

But, said Gomez, each of the cohort has a different vision of what they want to build.

“We’re each solving a different layer of the stack,” he said. “Some folks are at the application layer, building fun chatbots to talk to. I’m down at the foundational layer, where we want to build the infrastructure and the platform that everyone can build off of, and there are people all the way in between. I think we each have a different vision of where we’re most excited about contributing, but it’s all very complementary.”

As for Google, Gomez said that he is “super excited” about his former employer’s next generation of products, which includes the newly announced Bard.

“They really look like they’re pulling up their socks and diving into productizing AI,” he said. “It seems like there has been a total turnaround.”

And without noting the similarity to his own goals for Cohere, he added: “That’s really exciting for the world — that means this stuff is going to be out there in applications, changing things and providing value.”



Author: Sharon Goldman
Source: VentureBeat

