Microsoft-backed OpenAI has rolled out application programming interfaces (APIs) for ChatGPT, the popular generative artificial intelligence (AI) tool made by the company. The APIs will make it easier for companies to plug ChatGPT into their conversation platforms, the company said in a blog post, adding that this also lowers the cost of deploying AI in chat services.
It said that the version of ChatGPT being offered to developers is based on a language model called ‘GPT-3.5 Turbo’, which is distinct from GPT-3.5, the version that originally powered ChatGPT. It is also different from Microsoft’s ChatGPT integration in Bing, which combines the OpenAI tool with Microsoft’s own in-house technologies and data to create what it calls the ‘Prometheus’ model.
An API is a set of software commands that allows a service to be used as a plug-in tool atop an existing platform. In API form, ChatGPT can now be officially added to existing chatbot platforms, something conversational AI companies have been attempting over the past two months.
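As a rough illustration of what such a plug-in call involves, the sketch below assembles the headers and JSON body for one request to OpenAI's chat-completions endpoint. The endpoint URL, model name, and message format follow OpenAI's published developer documentation; the API key shown is a placeholder, and the request is not actually sent here.

```python
import json

# Endpoint and model name as published by OpenAI for the ChatGPT API.
API_URL = "https://api.openai.com/v1/chat/completions"
MODEL = "gpt-3.5-turbo"

def build_chat_request(user_message, api_key="YOUR_API_KEY"):
    """Assemble the HTTP headers and JSON body for a single ChatGPT API call."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": user_message}],
    })
    return headers, body

# A chatbot platform would POST `body` with `headers` to API_URL
# and read the model's reply from the JSON response.
headers, body = build_chat_request("What are your store hours?")
```

In practice a conversation platform would append each user turn and model reply to the `messages` list, which is how the API carries chat history between calls.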
With the API, OpenAI is seemingly looking to address issues and concerns that many have raised about businesses using ChatGPT in their operations. For one, the company said in its blog post that users of ChatGPT will now get the option to voluntarily allow OpenAI to collect the data being fed to the service to train its AI language model. Until now, OpenAI collected all data submitted through the tool by default, unless a user specifically opted out by writing to the company. This could address the major privacy concerns that Bern Elliot, vice-president and analyst at Gartner, highlighted to Mint on February 27.
OpenAI also said that businesses that run massive volumes of data through ChatGPT to custom-train the generative language model will get more control over the tool itself, including which GPT model it uses and the length of conversations it can support.
Businesses using ChatGPT for their operations will need to pay $0.002 per 1,000 tokens. Each token corresponds to roughly four characters of a query, and the company will also offer businesses a tool to estimate how many tokens a single user query would consume, so they can extrapolate that figure to their expected customer query volume. OpenAI also claimed that its pricing is “10x cheaper than existing GPT3.5 language models”.
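The back-of-the-envelope arithmetic described above can be sketched as follows. The four-characters-per-token ratio is the approximation mentioned in the piece, not an exact tokenizer; the sample query and monthly volume are illustrative assumptions.

```python
PRICE_PER_1K_TOKENS = 0.002  # gpt-3.5-turbo rate quoted by OpenAI

def estimate_tokens(text):
    """Rough token count using the ~4 characters-per-token approximation."""
    return max(1, round(len(text) / 4))

def estimate_monthly_cost(text, queries_per_month):
    """Extrapolate one query's token cost to an expected monthly volume."""
    tokens = estimate_tokens(text)
    return tokens / 1000 * PRICE_PER_1K_TOKENS * queries_per_month

# Hypothetical customer-support query and volume, for illustration only.
query = "What is the status of my order #1234?"
monthly_cost = estimate_monthly_cost(query, queries_per_month=100_000)
```

A real deployment would also count the model's response tokens, which are billed at the same rate, so actual costs run higher than this input-only estimate.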
The update comes as businesses continue to evaluate the challenges of deploying ChatGPT on their platforms. Mint reported on Monday that while companies are evaluating the use of ChatGPT in their businesses, concerns over the consistency of results, a lack of sufficient privacy-related governance settings, and the inability to fully customize the tool to a business’ own data sets are holding them back from deploying it.
OpenAI also announced pricing for Whisper, its AI-powered speech-to-text platform for transcription and translation, which will cost businesses $0.006 for every minute of audio input. While Whisper is already available as an open-source tool, using OpenAI’s hosted version will reportedly give businesses more tools to custom-deploy it in their operations.
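Since Whisper is billed by audio duration rather than tokens, a business estimating its bill needs the clip length. The sketch below, using only Python's standard-library `wave` module, reads a WAV file's duration and applies the quoted per-minute rate; the helper names are illustrative.

```python
import wave

WHISPER_PRICE_PER_MIN = 0.006  # per-minute rate quoted by OpenAI

def wav_duration_seconds(fileobj):
    """Read a WAV header and return the clip length in seconds."""
    with wave.open(fileobj, "rb") as w:
        return w.getnframes() / w.getframerate()

def whisper_cost_usd(seconds):
    """Price an audio clip at $0.006 per minute of input."""
    return seconds / 60 * WHISPER_PRICE_PER_MIN
```

For example, a ten-minute support call would come to six cents of transcription cost under this rate.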