
OpenAI's most powerful AI model, GPT-4, is freely available in Microsoft's Bing chatbot - but it's not always in use. Presumably for cost reasons, Microsoft also uses its Turing language models, which could affect the quality of some responses.

About three months after its launch, the Bing chatbot is still prone to making mistakes and giving unreliable answers, more so than the original ChatGPT from OpenAI. Now there's a possible explanation: the Bing chatbot doesn't consistently rely on OpenAI's AI models.

The new Bing is powered by an AI model called Prometheus, Microsoft said at the service's launch. The system assigns questions whose answers are classified as "simple" to Microsoft's Turing language models, said Jordi Ribas, head of search and AI, in an interview with Wired.
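
Microsoft has not disclosed how Prometheus decides which model handles a given query. Purely as an illustration, a cost-aware router along those lines might look like the sketch below - the complexity heuristic, the threshold, and the model labels are assumptions, not Microsoft's implementation.

```python
# Illustrative sketch only: Microsoft has not published how Prometheus routes
# queries. The heuristic, threshold, and model labels below are assumptions.

def estimate_complexity(query: str) -> float:
    """Toy heuristic: longer, multi-clause questions score as more complex."""
    words = len(query.split())
    clauses = query.count(",") + query.count(" and ") + 1
    return words * 0.1 + clauses * 0.5

def route_query(query: str, threshold: float = 2.0) -> str:
    """Send 'simple' queries to a cheaper in-house model, the rest to GPT-4."""
    if estimate_complexity(query) < threshold:
        return "turing"  # cheaper Microsoft Turing model (assumed label)
    return "gpt-4"       # more capable, but more expensive per query

print(route_query("capital of France?"))  # -> turing
print(route_query("Compare Bing's and Google's ad models, and explain which "
                  "is more profitable and why."))  # -> gpt-4
```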

Turing models are also an integral part of Bing search results, as Microsoft explains on a landing page. According to Microsoft, they are responsible for ranking, autocomplete, and evaluating the relevance of ads, among other things.

Peter Sarlin, founder of AI startup Silo AI, suggests that the first answer is usually given by a Turing model, but that follow-up queries would yield better results because they would be processed by GPT-4. In this way, Microsoft could save on computing costs.

Ribas rejects the notion that the Bing chatbot's initial answers are generally of lower quality. Inaccurate answers, he says, are more likely due to a lack of context in users' initial queries.

GPT-4 is expensive to run

OpenAI has not yet released revenue and expense figures for ChatGPT. Dylan Patel, an analyst at research firm SemiAnalysis, estimates that ChatGPT currently costs OpenAI about $700,000 per day, or 0.36 cents per query - more than $20 million per month.

OpenAI makes money through its paid API - access to GPT-4 in particular is expensive - and through the $20-per-month ChatGPT Plus subscription, which is required to use GPT-4 via ChatGPT. Whether OpenAI makes or loses money overall is currently unknown.

According to projections by SimilarWeb, ChatGPT had about 100 million active users in January. Based on the figures above, if about one percent of them had a paid account, the pure operation of ChatGPT could be roughly cost-covering. However, this is a very shaky calculation.
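
The back-of-envelope arithmetic behind that estimate can be made explicit; every input below is one of the rough figures cited above, not a confirmed number.

```python
# Back-of-envelope check of the figures above; every input is a rough estimate.
daily_cost = 700_000            # SemiAnalysis estimate, USD per day
monthly_cost = daily_cost * 30  # roughly 21 million USD per month

users = 100_000_000             # SimilarWeb projection for January
subscription = 20               # ChatGPT Plus price, USD per month

# Share of users that would need a paid account to cover operating costs
break_even_share = monthly_cost / (users * subscription)
print(f"break-even share: {break_even_share:.2%}")  # about 1%
```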

The high computing costs are mainly due to the use of power-hungry Nvidia graphics cards, which Microsoft plans to replace with its own AI chip, according to The Information. Meanwhile, the company, which provides the infrastructure for ChatGPT and similar services on Azure, is apparently trying to minimize the use of the expensive GPT-4.

Microsoft's AI chatbot has been integrated into Bing since early February 2023 and provides a glimpse into a possible future of search: instead of generating a list of links, the AI answers the question in a compact summary - in the best case including links to relevant sources, so that the chatbot's statements can be verified externally. This is the biggest advantage of GPT-4 in Bing over native ChatGPT.

Summary
  • The new Bing was the first freely available web interface to GPT-4.
  • However, the search engine's queries are not always answered by OpenAI's most powerful language model.
  • Instead, Microsoft also uses its own Turing language models, an architecture it probably chose for cost reasons.
Jonathan works as a freelance tech journalist for THE DECODER, focusing on AI tools and how GenAI can be used in everyday work.