Just three months after launching text-generating superstar ChatGPT, and only a few days after publishing a blog post about its “plans” for artificial general intelligence (AGI), OpenAI released its ChatGPT and Whisper APIs yesterday. The APIs make it easier for developers to integrate both models into their own applications.
According to the blog post, the APIs give developers “access to cutting-edge language (not just chat!) and speech-to-text capabilities.” In addition, thanks to “system-wide optimizations,” OpenAI said it has cut the cost of running ChatGPT by 90% since December and is now passing those savings on to API users.
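For developers, the new chat endpoint amounts to a single call. As a rough illustration (not code from the announcement), here is a minimal sketch using the openai Python package as it shipped at launch; the API key and prompt are placeholders:

```python
# Minimal sketch of a ChatGPT API call with the openai Python package
# available at launch (v0.27.x). The API key and prompt are placeholders.
import openai

openai.api_key = "YOUR_API_KEY"

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # the model behind the ChatGPT API
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain what the Whisper API does in one sentence."},
    ],
)

print(response["choices"][0]["message"]["content"])
```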
Thousands of developers probably immediately scrapped their weekend plans to start building.
“We’re diving in as soon as possible,” said Nate Sanders, cofounder of qualitative data insights platform Artifact IO, which has already fully integrated GPT-3 into its platform. “We have several features that leverage question-answer, summarization and interrogation techniques. We’ll be experimenting with how chained context and windowed tasks could increase the accuracy of the tasks we perform.”
Pricing is the ‘biggest headline’
The ChatGPT API is priced at $0.002 per 1k tokens, which OpenAI says is 10x cheaper than existing GPT-3.5 models.
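As a back-of-envelope illustration of what that 10x gap means (assuming text-davinci-003’s $0.02-per-1K-token list price as the comparison point):

```python
# Rough cost comparison implied by the prices cited above (illustrative only).
TOKENS = 1_000_000  # e.g., a month of modest chatbot traffic

chatgpt_cost = TOKENS / 1000 * 0.002  # gpt-3.5-turbo: $0.002 per 1K tokens
davinci_cost = TOKENS / 1000 * 0.02   # text-davinci-003: ~10x the price

print(f"ChatGPT API:      ${chatgpt_cost:.2f}")   # $2.00
print(f"text-davinci-003: ${davinci_cost:.2f}")   # $20.00
```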
“From our perspective, the pricing is the biggest headline,” said Max Shaw, SVP of product at Yext, which offers digital experience software solutions. “Developers can now actively explore use cases that would’ve previously been cost-prohibitive.”
The drop in the cost of running ChatGPT is “impressive,” Gartner analyst Rowan Curran told VentureBeat by email — probably the result of a “combination of improvements to the infrastructure running the model and application software itself.”
OpenAI offers clear format and framework to build on top of APIs
One interesting aspect of the announcement, Curran added, is the release of the Chat Markup Language (ChatML), OpenAI’s format for structuring the messages developers send to the ChatGPT API.
“This gives developers working with OpenAI a very clear format and framework to build on top of these APIs,” he explained. “It is also a good first step to creating best practices around model prompts to enable greater security for applications that use LLM [large language model] prompts and responses. They have indicated that they will be doing more work around developing the markup language and in making the model more ‘steerable’ with it – so this will hopefully be a successful attempt to lay the groundwork for a standardized format for interacting with these models.”
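In practice, ChatML surfaces to developers as a list of role-tagged messages that the API serializes with special tokens behind the scenes. A short sketch, based on OpenAI’s launch documentation (the message contents are placeholders):

```python
# How ChatML surfaces to developers: a list of role-tagged messages.
# (Sketch based on OpenAI's launch documentation; contents are placeholders.)
messages = [
    {"role": "system", "content": "You are a customer-support assistant for Acme Corp."},
    {"role": "user", "content": "How do I reset my password?"},
]

# Under the hood, OpenAI serializes these messages into raw ChatML,
# delimiting each turn with special tokens, roughly:
#
#   <|im_start|>system
#   You are a customer-support assistant for Acme Corp.<|im_end|>
#   <|im_start|>user
#   How do I reset my password?<|im_end|>
#   <|im_start|>assistant
#
# Keeping untrusted user text in its own "user" message, rather than
# concatenating it into one big prompt string, is the security benefit
# Curran alludes to: the model gets a clearer boundary between
# instructions and input.
```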
OpenAI offers opt-in for data sharing
Another much-discussed aspect of the announcement is that data sharing is now opt-in rather than opt-out: data sent through the API will no longer be used to train OpenAI’s models unless customers explicitly choose to share it.
“The default assumption is that companies keep the data sent to their APIs and can use them for whatever they want, including improving their models,” said Mark Riedl, professor in the Georgia Tech School of Interactive Computing and associate director of the Georgia Tech Machine Learning Center. “AI researchers and companies need all the data they can get and this leaves readily available data on the table.”
But strategically, OpenAI needs companies to adopt its technology before competitors emerge, he explained. “Third-party companies that roll out services built on ChatGPT might be chatting about proprietary things,” he said. For example, JP Morgan prohibits workers from using ChatGPT because they might talk about clients or investments. “They don’t want OpenAI using that, selling that, or training it into future versions of the technology. This may give confidence to more companies to build on top of the technology,” he explained.
Shopify offers one of first ChatGPT API use cases
One company that already has the ChatGPT API up and running is ecommerce platform Shopify. Yesterday, it announced an AI-powered search feature for its Shop app, leveraging OpenAI’s ChatGPT API.
Shopify’s consumer shopping app has always been essential to solving one of the company’s biggest challenges for its more than 1.75 million merchants — bringing in shoppers. Improving search within the app was a key opportunity in its work as part of the ChatGPT API beta, Miqdad Jaffer, director of product leading Shopify’s AI initiatives, told VentureBeat.
“It gave us the opportunity to look at chat as a way to do search in a different way altogether,” he said. “We didn’t want to waste any time, we wanted to get something out there as quick as we could.”
The result is that Shop’s 100 million users now have an AI-powered personal shopper that can chat about everything from gift recommendations to style advice and home decor ideas. Users can tap a purple icon in the search bar at the top of the home tab, and start chatting by asking a specific question or using one of the existing prompts.
There were, however, some necessary tweaks. “The thing that was really challenging was that you have a corpus of 175 billion-plus parameters in the ChatGPT API, and then you have an API that it is going to be referencing through — so it’s not going to necessarily know the best way to communicate directly with that,” he explained. “It’s always going to truncate some of its search parameters to what it thinks is appropriate, so you have to play with the prompting a little bit to get it to give you a little bit more information, and then have more to call into the search that’s going to result in meaningful results coming back to the buyer.”
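Jaffer’s description points at a pattern other early adopters will likely follow: prompt the model to emit structured search parameters instead of free text, then hand those parameters to the product-search backend. The sketch below is purely illustrative; the JSON schema and the search_products() stub are hypothetical, not Shopify’s code:

```python
# Purely illustrative sketch of the pattern Jaffer describes: prompt the model
# to emit structured search parameters, then hand them to a search backend.
# The JSON schema and the search_products() stub are hypothetical, not Shopify's.
import json
import openai

openai.api_key = "YOUR_API_KEY"

SYSTEM_PROMPT = (
    "Turn the shopper's request into JSON with keys "
    '"query", "category", "max_price" and "keywords". '
    "Keep every detail from the request; do not drop or shorten terms."
)

def search_products(query, category=None, max_price=None, keywords=None):
    """Stand-in for a real product-search API call."""
    return f"searching for {query!r} in {category!r} under {max_price}"

def shop_search(user_message: str) -> str:
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_message},
        ],
    )
    # A production version would validate the model's JSON before using it.
    params = json.loads(response["choices"][0]["message"]["content"])
    return search_products(**params)

print(shop_search("a birthday gift for a dad who loves grilling, under $50"))
```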
Cala was early user of DALL-E API, now plans to tap ChatGPT API
Meanwhile, New York City-based Cala, a startup that bills itself as the “world’s first operating system for fashion,” showed off an early DALL-E API use case back in October. Now, the company says it is on board to implement the ChatGPT and Whisper APIs as well.
“I’ve recorded all of my Zoom calls for the last four years, so I am planning to use the Whisper API to transcribe them and then train via the ChatGPT API,” said Andrew Wyatt, cofounder of Cala. “This will enable me to converse with myself to work through tricky problems, create outlines for decks, draft tweets, or blog posts with very detailed previous context specific to me and Cala.”
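A hypothetical sketch of the pipeline Wyatt describes, chaining the two new APIs: transcribe a recording with Whisper, then have ChatGPT turn the transcript into an outline. The file name, key and prompts are placeholders, not Cala’s actual code:

```python
# Hypothetical sketch of the pipeline Wyatt describes: transcribe a recorded
# call with the Whisper API, then pass the transcript to the ChatGPT API.
# File name, key and prompts are placeholders, not Cala's actual code.
import openai

openai.api_key = "YOUR_API_KEY"

with open("zoom_call_2023-02-14.m4a", "rb") as audio_file:
    transcript = openai.Audio.transcribe("whisper-1", audio_file)

outline = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You turn meeting transcripts into concise outlines."},
        {"role": "user", "content": transcript["text"]},
    ],
)

print(outline["choices"][0]["message"]["content"])
```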
Wyatt added that he wouldn’t be surprised if help articles and FAQs disappear forever, replaced by some sort of chatbot. “I’ve even thought about scrapping our landing page and just having a ‘Welcome to Cala, how can we help?’ with a text input,” he said.
What happens now?
Besides the ChatGPT API hackathons already being planned in San Francisco this weekend, one thing is certain, at least according to Wyatt. “Every Fortune 500 company board member is asking leadership about their AI strategy, so I’m sure we’re going to see the ChatGPT API implemented all over the place,” he said.
Of course, the familiar challenges of ChatGPT — most notably its tendency to confidently hallucinate — have not gone away. And while the Shop app is primarily about implementing the ChatGPT API on the search side, when it comes to prompts, “they can be prompt-hacked in whatever way people are going to do,” said Shopify’s Jaffer, who said users can have a conversation with the bot about planning a party, for example.
“We’ve tried to put constraints on the search results that will come back, but the bot is the bot,” he said.
Meanwhile, developers are chomping at the bit to get started.
“I’m excited to see where this goes and how the role of the developer will evolve,” said Hadi Chami, developer advocate and manager at Leadtools. “Low-code apps set the scene for citizen developers and AI will help push forward a whole new set of capabilities from customer service, coding and anticipating software updates.”
Author: Sharon Goldman
Source: VentureBeat