VentureBeat Transform Day 1: Moving fast with care advised for AI adoption

This year’s VentureBeat Transform event focused on generative AI, the technology that has been causing massive change, excitement and concern over the past few months. VentureBeat CEO Matt Marshall opened the event with an observation that is becoming important for the enterprise: “One language model will not rule them all. There will be many models. And today, you can build a best-in-breed model for your customers, your data, and for a low cost.”

Top experts from different industries shared their experiences and explained how they are using LLMs and other generative models in their products and businesses to improve efficiency and customer experience. Here are some of the themes that were prominent on the first day of VB Transform.

Putting humans at the center of the generative AI experience

Models such as ChatGPT have made it possible for more and more people to use generative AI in their everyday lives. We are starting to see the technology in many domains. At the Women in AI breakfast panel, Mastercard’s chief data officer JoAnn Stonier compared it to the Oscar-winning movie Everything Everywhere All at Once. “The pace is really really fast,” she said.

The panelists noted that generative AI is becoming democratized, but stressed that everyone must be able to take advantage of the opportunities: the right people need to be involved, the right questions asked and the right constraints put in place to achieve equitable results.

In a fireside chat, Uljan Sharka, CEO of iGenius, stressed the importance of taking a human-centered approach to developing generative AI products. Today’s efforts, he argued, are mostly focused on technology, model size and data size, but they should focus on human needs if everyone is to benefit from this new wave of technology.

“In the past 20 years, we designed amazing technology, but we did not get the adoption we hoped for,” he said. “This will happen again if we don’t design for the human.”

Building on top of a strong data foundation

As with all things AI, generative models rely on abundant, high-quality training data. Ashok Srivastava, senior vice president and chief data officer at Intuit, highlighted two key aspects of a strong data foundation: having clean data at scale and having real-time data at scale.

Intuit has long used machine learning in its products. It has two million operational models doing personalization, 730 million customer-driven AI interactions per year, and 30 million interactions with experts and humans. “When you have that kind of interaction going on, AI starts to play a very critical role,” he said.

The company has built a generative AI operating system, GenOS, that uses gen AI to bring new and personalized experiences to customers. While LLMs are playing an enormous role in GenOS, classic machine learning models such as classifiers and recommendation systems are not going away, he said.
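
How LLMs and classic models might sit side by side in such a system can be illustrated with a small routing sketch. The Python below is a minimal, hypothetical example under assumed names (IntentClassifier, Recommender, llm_generate, route_request); it is not Intuit’s GenOS, only an illustration of sending structured tasks to a classic model and open-ended questions to an LLM.

```python
# Minimal, hypothetical sketch: routing requests between an LLM and classic ML
# models. All names here are illustrative assumptions, not Intuit's GenOS APIs.

class IntentClassifier:
    """Stand-in for a classic ML classifier that labels an incoming request."""
    def predict(self, text: str) -> str:
        # A real system would use a trained model; keywords keep the sketch runnable.
        return "recommendation" if "recommend" in text.lower() else "open_ended"

class Recommender:
    """Stand-in for a classic recommendation system."""
    def top_items(self, user_id: str, k: int = 3) -> list:
        return [f"item_{i}" for i in range(k)]  # placeholder results

def llm_generate(prompt: str) -> str:
    """Placeholder for a call to a hosted or fine-tuned LLM."""
    return f"[LLM answer to: {prompt!r}]"

def route_request(user_id: str, text: str,
                  classifier: IntentClassifier, recommender: Recommender) -> str:
    """Send structured tasks to classic models and open-ended questions to the LLM."""
    if classifier.predict(text) == "recommendation":
        return "Recommended for you: " + ", ".join(recommender.top_items(user_id))
    return llm_generate(text)

if __name__ == "__main__":
    clf, rec = IntentClassifier(), Recommender()
    print(route_request("u42", "Recommend a tax checklist for freelancers", clf, rec))
    print(route_request("u42", "Explain how quarterly estimated taxes work", clf, rec))
```

The design point the sketch makes is the one Srivastava described: the LLM handles open-ended language tasks, while well-understood problems such as recommendation stay with the classic models that already do them well.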

Slowing down to speed up

Mark Tack, CMO at Treasure Data, and Gail Muldoon, data scientist at Stellantis, spoke about using generative AI to accelerate personalization and improve customer insights. Tack warned about the consequences of falling into the “shiny object syndrome” trap. Before using AI as an accelerator, you must know what you are accelerating: you need an objective, a strategy and metrics before deciding what role AI can play.

“People are excited, they’re moving fast. Slow down in order to speed up,” Tack said. “If you jump in full-throttle and you don’t have those foundational elements in place, you do risk going in the wrong direction and it could potentially do more harm than good.”

That said, there is a lot of untapped potential for the enterprise. Muldoon explained how Stellantis, the world’s third-largest automobile company, is using AI to transition to a fully online shopping experience. “[Treasure Data’s] Customer Data Cloud [data consolidation platform] allowed us to anticipate customers’ shopping interests, enabling us to suggest specific products from our range and understand their preferences,” she said.

Boosting creativity and efficiency

Enterprises are still exploring how best to use generative AI, but efficient access to information is certainly one of the low-hanging fruit. In a fireside chat, Sarah Hoffman, VP of AI and ML at Fidelity, said that gen AI is enabling new ways to collaborate and define workflows, such as interfaces that use text boxes instead of a “web page with lots and lots of tabs.”

In terms of creativity, generative AI will be very useful in brainstorming, where hallucinations are not necessarily an issue. “Any type of brainstorming you’re doing, it’s good to look at this technology,” she said.

Democratizing automation

Steve Wood, SVP of product and platform at Slack, discussed the role LLMs will play in making automation available to everyone at organizations. “I think too many organizations are holding on to automation as a practitioner’s role. And I think we need to open it up and … [empower] everybody to build and automate things, and they may not get it perfectly right,” he said.

Integrating the knowledge held in LLMs with the data in Slack channel conversations can unlock bespoke business intelligence for users of the collaboration tool, Wood said.

“Today there’s all these pervasive productivity gains through these tools and we just have to let them be discovered,” said Wood.
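
One common way to ground an LLM in channel conversations is retrieval-augmented generation: retrieve the messages most relevant to a question, then ask the model to answer using them as context. The Python below is a minimal sketch of that pattern under assumed names (embed, retrieve, generate_answer) with a toy character-frequency embedding standing in for a real embedding model; it is not Slack’s API or any specific vendor integration.

```python
# Minimal sketch of retrieval-augmented generation over chat messages.
# All names and the toy embedding are illustrative assumptions.

import math

def embed(text: str) -> list:
    """Toy embedding: normalized character-frequency vector.
    A real system would call an embedding model instead."""
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list, b: list) -> float:
    """Cosine similarity of two already-normalized vectors."""
    return sum(x * y for x, y in zip(a, b))

def retrieve(question: str, messages: list, k: int = 2) -> list:
    """Return the k channel messages most similar to the question."""
    q = embed(question)
    return sorted(messages, key=lambda m: cosine(q, embed(m)), reverse=True)[:k]

def generate_answer(question: str, context: list) -> str:
    """Placeholder for an LLM call that answers using the retrieved context."""
    return f"Answer to {question!r} grounded in: " + " | ".join(context)

if __name__ == "__main__":
    channel_history = [
        "Q3 launch slipped two weeks because of the billing migration",
        "Design review for the new onboarding flow is Thursday",
        "Billing migration is now complete; launch back on track",
    ]
    question = "What is the status of the Q3 launch?"
    print(generate_answer(question, retrieve(question, channel_history)))
```

In a production setting the retrieval step would respect channel permissions and the generation step would cite the retrieved messages, but the basic shape is the same: conversation data supplies the context, and the LLM supplies the language.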

Author: Ben Dickson
Source: VentureBeat

