Within a week of launch, ChatGPT, the AI-powered chatbot developed by OpenAI, had over 1 million users, and it reached an estimated 100 million users within two months. The flood of attention from the press and consumers alike comes in part from the software’s ability to offer human-like responses across long-form content creation, in-depth conversations, document search, analysis and more.
Uljan Sharka, CEO of iGenius, believes that generative AI has world-changing potential in the business world, because for the first time, data can be truly democratized. GPT stands for generative pre-trained transformer, a family of language models trained with supervised and reinforcement learning techniques; in ChatGPT’s case, roughly 45 terabytes of text data power all that content creation.
But what if generative AI could be used to answer essential data-related queries in the business world, not just to generate content?
“Up till now, data, analytics and even ‘data democratization’ have been data-centered, designed for data-skilled people,” Sharka says. “Business users are being left out, facing barriers to the information they need to make data-driven decisions. People are not about data. They want business answers. We have an opportunity today to shift the user interface toward language interfaces, and humanize data to make it people-centric.”
But the interface is only a small part of what a complex system needs to deliver in order to make this kind of information integrated, certified, safe, equal, and accessible for business decisions. Composite AI means bringing together data science, machine learning, and conversational AI in a single system.
“I like to think of it as the iPhone of the category, which provides an integrated experience to make it safe and equal,” Sharka says. “That’s the only way we’ll have generative AI delivering impact in the enterprise.”
Generative AI and the humanization of data science
As the gap between B2C and B2B apps has grown, business users have been left behind. B2C companies have put billions of dollars into creating exemplary apps that are user friendly, operable with a few taps or a conversation. At home, users are writing research papers with the help of ChatGPT, while back at work, a wealth of data stays siloed when the complex dashboards that connect it go unused.
In organizations, generative AI can connect every data product anywhere in the world and index it in an organization’s “private brain.” And with algorithms, natural language processing and user-created metadata, or what iGenius calls advanced conversational AI, the complexity of data can be managed and its quality improved. Gartner has dubbed this ‘conversational analytics.’
Virtualizing complexity unlocks the potential to clean, manipulate and serve data for every use case, whether that’s cross-correlating information or simply bringing it together as a single source of truth for an individual department.
On the back end, generative AI helps scale the integration between systems, using the power of natural language to create what Sharka calls an AI brain, composed of private sources of information. With no-code interfaces, integration is optimized and data science is democratized even before business users start consuming that information. It’s an innovation accelerator that cuts costs as the time it takes to identify and develop use cases is slashed dramatically.
On the front end, business users are having a conversation with data and getting business answers in plain natural language. Making the front-end user experience even more consumerized is the next step. Instead of a reactive, single-task platform that takes text questions and returns text answers, it can become multi-modal, offering charts and graphs that optimize the way people understand the data. It can become a Netflix- or Spotify-like experience, as the AI learns from how users consume information to proactively serve up the knowledge they need.
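As a rough illustration of what such a multi-modal answer could look like (this is a hypothetical sketch, not a description of any iGenius product; the Answer structure and the to_answer helper are invented for the example), a single query result can be packaged as both a plain-language sentence and a chart payload a front end could render:

```python
# Illustrative sketch only: packaging one query result as a multi-modal answer,
# i.e. a plain-language sentence plus a chart payload a front end could render.
# The Answer structure and to_answer helper are hypothetical, not an iGenius API.
from dataclasses import dataclass
from typing import Dict


@dataclass
class Answer:
    text: str    # plain-language summary for the business user
    chart: Dict  # simple chart spec a UI component could draw


def to_answer(metric: str, values: Dict[str, float]) -> Answer:
    """Summarize a metric broken down by category as text plus a bar-chart spec."""
    best = max(values, key=values.get)
    text = f"{metric} is highest in {best} ({values[best]:,.0f})."
    chart = {
        "type": "bar",
        "title": metric,
        "x": list(values.keys()),
        "y": list(values.values()),
    }
    return Answer(text=text, chart=chart)


if __name__ == "__main__":
    print(to_answer("Q1 revenue", {"EMEA": 1200.0, "APAC": 950.0, "AMER": 1800.0}))
```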
Generative AI and iGenius in action
From an architectural perspective, this natural language layer is added to the applications and databases that already exist, becoming a virtual AI brain. Connecting across departments unlocks new opportunities.
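To make the idea concrete, here is a minimal sketch of what such a natural-language layer over an existing database could look like, assuming a toy sales table and a hardcoded question-to-SQL mapping that stands in for the generative model (none of this reflects iGenius’s actual implementation):

```python
# Illustrative sketch only: a thin natural-language layer over an existing database.
# The schema, sample data and question-to-SQL mapping are hypothetical; in a real
# system a language model would generate the SQL from the schema and the question.
import sqlite3

SCHEMA = "sales(region TEXT, product TEXT, revenue REAL, sale_date TEXT)"


def translate_to_sql(question: str) -> str:
    """Stand-in for the generative step: map a business question to a SQL query."""
    # A production system would prompt a language model with SCHEMA plus the question.
    if "revenue by region" in question.lower():
        return "SELECT region, SUM(revenue) FROM sales GROUP BY region"
    raise ValueError(f"Question not understood: {question!r}")


def answer(question: str, conn: sqlite3.Connection) -> str:
    """Run the generated query and turn the result set back into plain language."""
    rows = conn.execute(translate_to_sql(question)).fetchall()
    return "; ".join(f"{region}: {total:,.0f}" for region, total in rows)


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, product TEXT, revenue REAL, sale_date TEXT)")
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?, ?, ?)",
        [("EMEA", "widgets", 1200.0, "2023-01-10"), ("APAC", "widgets", 950.0, "2023-01-11")],
    )
    print(answer("What is revenue by region?", conn))
```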
“This is not about using data more — this is about using data at the right time of delivery,” Sharka says. “If I can use data before or while I make a decision, whether I’m in marketing or sales or supply chain, HR, finance, operations — this is how we’re going to make an impact.”
For instance, connecting marketing data and sales data means not only monitoring campaigns in real time, but also correlating results with transactions, conversions and sales cycles to surface clear performance KPIs and see the direct impact of a campaign as it unfolds. A user can even ask the AI to adapt campaigns in real time. At the same time, the interface surfaces further questions and areas of inquiry the user might want to pursue next to deepen their understanding of a situation.
At Enel, Italy’s leading energy company, now focused on sustainability, engineers consume real-time IoT information, mixing finance data with data coming from the production plants and conversing with that data in real time. Whenever their teams need to perform preventive maintenance, plan activities in a plant or measure how actual results compare to budgets, asking the interface for the synthesized information they need unlocks powerful operational analytics that can be acted on immediately.
The future of generative AI
ChatGPT has sparked massive interest in generative AI, but iGenius and OpenAI (which both launched in 2015) long ago realized they were headed in different directions, Sharka says. OpenAI built the GPT for text, while iGenius has built the GPT for numbers, a product called Crystal. Its private AI brain connects proprietary information to its machine learning model, allowing users to train it from scratch. It uses more sustainable small and wide language models instead of large language models, giving organizations control over their IP.
It also enables large-scale collaboration: companies can leverage their experts and knowledge workers to certify both the data used to train models and the information those models generate, reducing bias at scale and providing more localized, hyper-personalized experiences. It also means users don’t need to be prompt engineers to safely work with the data these algorithms provide or turn it into high-quality, actionable information.
“I’ve always believed that this is going to be a human-machine collaboration,” Sharka says. “If we can leverage the knowledge that we already have in people or in traditional IT systems, where you have lots of semantic layers and certified use cases, then you can reduce bias exponentially, because you’re narrowing it down to quality. With generative AI, and a system that’s certified on an ongoing basis, we can achieve large-scale automation and be able to reduce bias, make it safe, make it equal, and keep pushing this idea of virtual copilots in the world.”
This is a VB Lab Insight article presented by iGenius. VB Lab Insights content is created in collaboration with a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. For more information, contact sales@venturebeat.com.
Author: Jen Larsen
Source: VentureBeat