
Why AI leaders need a ‘backbone’ of large language models



AI adoption may be steadily rising, but a closer examination shows that most enterprise companies may not be quite ready for the big time when it comes to artificial intelligence.

Recent data from Palo Alto, California-based AI unicorn SambaNova Systems, for example, shows that more than two-thirds of organizations think using artificial intelligence (AI) will cut costs by automating processes and using employees more efficiently. But only 18% are rolling out large-scale, enterprise-class AI initiatives. The rest are introducing AI individually across multiple programs, rather than risking an investment in big-picture, large-scale adoption. 

That will create a widening gap between companies that are AI leaders and innovators and those that fall behind, said Marshall Choy, senior vice president of product at SambaNova, which offers custom-built dataflow-as-a-service (and won VentureBeat’s AI Innovation Award for Edge AI in 2021).

Companies that are more mature in AI and able to invest in large-scale adoption will reap the rewards, he told VentureBeat, while the ones introducing AI across multiple programs will suffer from information and insight silos. “We see time and time again that leaders need to have a holistic view across their organization.” 


AI is going to transform industries, segments and organizations as dramatically as the internet did, Choy explained. Today’s AI innovators are laying down a unified AI ‘backbone’ of large language models (LLMs) for natural language processing (NLP), which will serve as the foundation for the next 5-10 years of application and deployment in their organizations. 

“We’re seeing that with those taking a leadership position – it started with the hyperscale, cloud services providers who have done this at massive scale,” he said. “Now, it’s the banks, the energy companies, the pharmaceutical companies, the national laboratories.”

Soon, he said, it’s going to be “unheard of” for enterprises not to have an LLM-based AI “backbone.”

“The long-term benefit will be to start building out what organizations need to get where they want to be by doing it [all] now, rather than piecing it all together and then having to do a redo in a couple of years,” Choy said. 

The AI maturity curve predicts enterprise-scale adoption

Many organizations are early on the AI maturity curve, which typically means they are self-educating, experimenting and running pilots to try to determine the right use cases for AI.

“I think those folks are a long way away from enterprise-scale adoption, if they don’t even know what the use cases are,” said Choy. 

But there are many organizations that are further along, deploying AI for departmental use and beginning to reach a maturity stage. “They’ve got architectural and data maturity, they’re starting to standardize on platforms, they have budgets,” he said. 

Still, the organizations thinking big and rolling out large-scale projects tend to be in industries like banking, which may have hundreds or thousands of disparate AI models running across the enterprise. Now that foundation models based on tools like GPT-3 are feasible, these organizations can make the kind of big-picture AI investment they need to truly transform their business and provide more customized services for their end users. 

“It’s almost like a do-over for them – they would have devised this as a strategy three years ago, had the technology been available,” he said. “The banking industry is at the stage where there’s a recognition that AI is going to be the accelerant for the next transitional shift for the enterprise.” 

Other industries may look to AI for tactical efforts, such as cost optimization and efficiency gains. But the industries that are truly reforming and reshaping themselves to create new products and services, and therefore new revenue streams and lines of business, are the ones that will need that foundational AI “backbone,” Choy added.

Advances in language models make ‘backbone’ possible

Mature AI organizations are gravitating toward LLMs and language processing in their deep learning efforts. “Inherent in that application is document, text and speech-heavy industries like banking, insurance, some areas of manufacturing like warehousing and logistics,” said Choy. “I think in a few short years, no industry will be untouched because language is effectively the connector to everything we do.”

What’s making this all possible now, he added, are the advances in the language models themselves.

“The magic of these new, large language models, like our own GPT banking model, is their generative capabilities,” he said. “From auto-summarization of a voice-ready meeting transcript, for example, to robotic claims processing and completion, this generative quality takes it to the next level with regard to language – it’s a huge step forward for both front-office customer service-oriented tasks, and also back-office stuff like risk and compliance.”
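For a concrete sense of the auto-summarization capability Choy describes, here is a minimal sketch using the open-source Hugging Face transformers library as a stand-in. SambaNova’s GPT banking model is not shown here; the model choice, sample transcript and generation settings below are illustrative assumptions, not the company’s implementation.

```python
# Minimal sketch: abstractive summarization of a meeting transcript.
# NOTE: This is NOT SambaNova's GPT banking model. The model name
# (facebook/bart-large-cnn) and generation settings are illustrative
# assumptions using the open-source Hugging Face transformers library.
from transformers import pipeline

# Load a general-purpose summarization pipeline from the Hugging Face Hub.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

# A toy transcript standing in for a transcribed meeting or claims call.
transcript = (
    "Adjuster: The customer reported water damage to the basement on March 3rd. "
    "Manager: Has an inspection been scheduled yet? "
    "Adjuster: Yes, the contractor visits on Friday, and the written estimate "
    "should arrive early next week, so we can approve the payout after review."
)

# Generate a short abstractive summary of the transcript.
result = summarizer(transcript, max_length=60, min_length=15, do_sample=False)
print(result[0]["summary_text"])
```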



Author: Sharon Goldman
Source: VentureBeat
