$700,000/day: ChatGPT’s Eye-Popping Operating Cost Revealed

ChatGPT, the AI-powered language model developed by OpenAI, has been hailed as one of the fastest-growing technologies in human history. With over 100 million active users just two months after its launch, it has outpaced the growth of popular apps like TikTok and Instagram. However, the cost of operating such an AI system has always been a concern, with OpenAI CEO Sam Altman describing the cost of getting it up and running as “eye-popping.”

Recently, SemiAnalysis, a technology research firm, published estimates indicating that ChatGPT costs approximately $700,000 per day to operate, with most of that money going toward the hardware infrastructure required to run the system. The computational hardware cost alone is put at $694,444 per day, with OpenAI requiring 3,617 HGX A100 servers (28,936 GPUs) to serve the model.
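For readers who want to sanity-check those figures, here is a rough back-of-envelope calculation based only on the numbers quoted above. The 8-GPUs-per-server figure reflects the standard HGX A100 chassis, and the derived per-GPU hourly rate is purely illustrative, not a number from the SemiAnalysis report:

```python
# Back-of-envelope check of the SemiAnalysis figures quoted above.
# Assumption: 8 A100 GPUs per server (standard HGX A100 configuration).
servers = 3_617
gpus_per_server = 8
daily_hardware_cost = 694_444  # USD per day, SemiAnalysis estimate

total_gpus = servers * gpus_per_server
cost_per_gpu_day = daily_hardware_cost / total_gpus
cost_per_gpu_hour = cost_per_gpu_day / 24

print(f"Total GPUs: {total_gpus:,}")                        # ~28,936
print(f"Cost per GPU per day: ${cost_per_gpu_day:.2f}")     # ~$24
print(f"Cost per GPU per hour: ${cost_per_gpu_hour:.2f}")   # ~$1
```

In other words, the headline number works out to roughly a dollar per A100 per hour, which is why the bill scales so quickly with the size of the GPU fleet.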

Dylan Patel, chief analyst at SemiAnalysis, said the initial cost estimates were based on GPT-3, and that the newer GPT-4 model may be even more expensive to run. This is a significant concern for OpenAI, as operating costs at this scale can quickly eat into its profits. If a ChatGPT-style model were applied to Google’s existing search business, “LLM inference costs” alone would wipe $36 billion off the company’s profits, according to SemiAnalysis.

To address this issue, Microsoft, a prominent OpenAI backer, is reportedly working on an AI-specific chip that it hopes will reduce operating costs. OpenAI, for its part, began generating revenue earlier this year by launching ChatGPT Plus, a $20-per-month premium subscription. However, the long-term sustainability of the service remains in question, as the cost of operating an AI model at this scale can quickly spiral out of control.
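To put the subscription angle in perspective, here is another rough, illustrative calculation using only the figures mentioned in this article. It ignores free-tier usage, API revenue, and every other real-world factor, so it is a thought experiment rather than an analysis:

```python
# Rough illustration: how many ChatGPT Plus subscriptions (at $20/month)
# would be needed to offset a $700,000/day operating cost on their own?
daily_cost = 700_000   # USD per day, SemiAnalysis estimate
plus_price = 20        # USD per month for ChatGPT Plus

monthly_cost = daily_cost * 30
subscribers_to_break_even = monthly_cost / plus_price

print(f"Monthly operating cost: ${monthly_cost:,}")                           # $21,000,000
print(f"Subscribers needed to break even: {subscribers_to_break_even:,.0f}")  # ~1,050,000
```

On those assumptions, subscriptions alone would need to reach well over a million paying users just to cover the compute bill, which underlines why cheaper hardware matters so much.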

While ChatGPT is a major breakthrough in the field of AI, the high cost of running it remains a serious concern. As OpenAI continues to develop and improve ChatGPT, it must find ways to reduce operating costs to ensure the project’s long-term sustainability. With Microsoft’s AI-specific chip in development and ChatGPT Plus now bringing in subscription revenue, there may be hope for a more cost-effective future for ChatGPT.

