Conversational AI, which allows chatbots to engage in human-like conversations, has been a much talked about (and debated) topic in enterprise IT. Some say it’s the future of how companies will work with their employees and customers. Others claim that the technologies behind conversational AI aren’t fully mature and fail to understand the nuances of English, let alone other languages.
Perspectives vary, but the numbers show that conversational AI is on track for widespread adoption. A recent survey conducted by Replicant found that nearly 80% of consumers are willing to speak with conversational AI, and Gartner predicts that enterprise-level chatbot implementation will see over a 100% increase in the next two to five years. This surge in demand is exactly what Moveworks has witnessed in recent years. The California-based company, which leverages conversational AI to offer end-to-end employee support, has seen demand grow particularly throughout the pandemic, as the need for hybrid and remote work rose significantly.
“There are three secular shifts that are leading the way to produce a brand-new era of conversational AI — SaaS [software-as-a-service] integrations, enterprise messaging, and NLU advancements,” Bhavin Shah, founder and CEO of Moveworks, said during a panel at VentureBeat’s Future of Work Summit.
More often than not, the response to conversational solutions like chatbots is underwhelming: they fail to grasp the meaning and nuances of a user’s sentence and return incorrect responses. This, Shah said, is a result of hard-coding the tools with rigid logic flows (an if-this-then-that kind of system), and it can be avoided by employing advanced machine learning models that make the tools more seamless.
“By using machine learning, new techniques, and ensembles of techniques – from spell corrector models to statistical grammar models – you can actually react to the conversation as it emerges with the employee instead of predetermining it,” he said.
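To make the distinction concrete, here is a minimal, illustrative sketch in Python (not Moveworks’ actual code) of how a rigid if-this-then-that flow breaks on a typo, while a small ensemble of scorers, including a stand-in for a spell-corrector model, still recovers the intent. All intent names and example phrases are hypothetical.

```python
# Minimal sketch (not Moveworks' actual stack): contrast a rigid
# if-this-then-that flow with a small ensemble of scorers that can
# still recover the intent when wording or spelling drifts.
from difflib import get_close_matches

INTENTS = {
    "reset_password": ["reset my password", "forgot password", "password locked"],
    "request_laptop": ["need a new laptop", "order a laptop", "replace my laptop"],
}

def rigid_flow(utterance: str) -> str:
    # Hard-coded logic: any deviation ("pasword", "locked out") falls through.
    if utterance == "reset my password":
        return "reset_password"
    if utterance == "order a laptop":
        return "request_laptop"
    return "fallback_to_human"

def spell_normalize(utterance: str) -> str:
    # Stand-in for a spell-corrector model: snap each word to a known vocabulary.
    vocab = {w for phrases in INTENTS.values() for p in phrases for w in p.split()}
    fixed = [get_close_matches(w, vocab, n=1, cutoff=0.8) or [w]
             for w in utterance.lower().split()]
    return " ".join(w[0] for w in fixed)

def ensemble_flow(utterance: str) -> str:
    # Stand-in for an ensemble: score every intent by token overlap
    # with its example phrases after spell normalization.
    text = set(spell_normalize(utterance).split())
    scores = {
        intent: max(len(text & set(p.split())) / len(set(p.split())) for p in phrases)
        for intent, phrases in INTENTS.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] >= 0.5 else "fallback_to_human"

print(rigid_flow("i forgot my pasword"))     # -> fallback_to_human
print(ensemble_flow("i forgot my pasword"))  # -> reset_password
```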
Enterprises mastering hybrid, remote work with conversational AI
The sophisticated conversational AI offered by Moveworks has already driven positive business outcomes for enterprises working remotely, Shah emphasized, pointing to the case of Palo Alto Networks, one of the largest cybersecurity companies in North America.
At the peak of the pandemic, in April 2020, Palo Alto Networks envisioned Flexwork, an ecosystem tying together Uber, Box, Splunk, and Zoom for seamless remote working. To bring the vision to life, however, the company needed a digital hub to ensure personalized (based on location, role, and working habits) and friction-free employee support. That’s where Moveworks came in, developing Sheldon, a conversational AI chatbot that lets Palo Alto Networks employees seek IT help, HR help, and more.
“Over 90% of employees now use Sheldon on a regular basis. And over 4,000 issues are solved by Sheldon completely autonomously, end-to-end, saving Palo Alto Networks over 180,000 hours of productivity,” the founder said, adding that the company’s stock has grown 252% since then.
The CEO went on to cite other success stories in which chatbot solutions not only helped enterprises thrive in a hybrid work environment, but also drove the overall advancement of conversational AI technology.
For instance, Hearst Media, which has been around for 130 years, uses a chatbot named Herbie to provide hybrid employees with support information and resources from systems scattered across more than 360 subsidiary organizations. Herbie, Shah said, tackles this massive challenge with an Enterprise Cache system, which indexes the available resources every four hours so that employees get a single, precise snippet of information as the answer to every question.
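As an illustration of the general pattern, here is a minimal sketch (not Hearst’s or Moveworks’ actual system): an index that refreshes itself on a four-hour interval and answers each question with the single best-matching snippet. The snippet data and class names are hypothetical.

```python
# Minimal sketch of a periodically refreshed cache that answers a question
# with one best-matching snippet. Not the actual Enterprise Cache system.
import time

REFRESH_INTERVAL = 4 * 60 * 60  # four hours, per the article

def fetch_all_snippets() -> list[str]:
    # Placeholder for pulling knowledge snippets from many subsidiary systems.
    return [
        "To reset your VPN token, open the IT portal and choose 'MFA reset'.",
        "New hires enroll in benefits within 30 days via the HR portal.",
        "Expense reports are submitted in Concur and approved by your manager.",
    ]

class EnterpriseCache:
    def __init__(self) -> None:
        self.snippets: list[str] = []
        self.last_indexed = 0.0

    def _maybe_refresh(self) -> None:
        # Rebuild the index only when it is older than the refresh interval.
        if time.time() - self.last_indexed > REFRESH_INTERVAL:
            self.snippets = fetch_all_snippets()
            self.last_indexed = time.time()

    def answer(self, question: str) -> str:
        self._maybe_refresh()
        q_terms = set(question.lower().split())
        # Return a single snippet: the one sharing the most terms with the question.
        return max(self.snippets, key=lambda s: len(q_terms & set(s.lower().split())))

cache = EnterpriseCache()
print(cache.answer("how do I reset my VPN token"))
```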
Moveworks also enhanced a chatbot called ALBot for chemical giant Albemarle. The solution stands apart because it supports questions not only in English but in other languages as well, letting the company treat its entire global workforce as first-class citizens while saving the cost of hiring multilingual support agents. The task wasn’t easy, however: supporting foreign languages meant starting from the ground up and building new machine learning models with language data (examples of queries and use cases) that isn’t as widely available as English data.
“So we discovered a technique called collective learning,” Shah said. “We are able to abstract all these different sentences that are spoken, no matter what language, and from there we can take that idea and permute all of these different examples into millions of use cases that we can use to train our machine learning models, making them more robust and more precise.”
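To illustrate the general idea behind this description, here is a minimal sketch (not Moveworks’ actual method) of how a few abstracted, multilingual templates can be permuted with slot values to multiply a handful of examples into many labeled training utterances. The templates, slot values, and label are hypothetical.

```python
# Minimal sketch of permuting abstracted templates into training examples.
# Not Moveworks' collective-learning implementation.
from itertools import product

# A handful of abstracted templates; {app} and {issue} are slots.
# In practice, slot values would also be localized per language.
TEMPLATES = [
    "I can't log in to {app} because of {issue}",         # English
    "Je n'arrive pas à me connecter à {app} : {issue}",   # French
    "Ich kann mich nicht bei {app} anmelden: {issue}",    # German
]

APPS = ["Okta", "Salesforce", "Workday", "Zoom"]
ISSUES = ["an expired password", "a locked account", "a missing MFA code"]

def permute_examples() -> list[tuple[str, str]]:
    # Every (template, app, issue) combination becomes one labeled example.
    return [
        (template.format(app=app, issue=issue), "login_issue")
        for template, app, issue in product(TEMPLATES, APPS, ISSUES)
    ]

examples = permute_examples()
print(len(examples))   # 3 templates x 4 apps x 3 issues = 36 examples
print(examples[0])
```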
Market opportunity for conversational AI
With companies like these coming to the fore and leveraging NLU and AI to power remote employee experiences through chatbots, conversational AI is expected to become commonplace in the long run. According to a Markets and Markets study, the market for the technology is expected to grow at roughly 22% a year, reaching nearly $19 billion by 2026.
Author: Shubham Sharma
Source: VentureBeat