OpenAI’s GPT-5 Development Confirmed by CEO Sam Altman


OpenAI’s CEO, Sam Altman, has confirmed that GPT-5, the next generation of OpenAI’s language model, is under development. He made the announcement in an interview with the Financial Times.

Altman did not specify a release date for GPT-5, suggesting a potentially longer interval between releases than there was for earlier models such as GPT-3 and GPT-4. The delay is attributed to the larger volume of data needed to train the model and the significant costs involved.

OpenAI appears to be entering a new phase, prioritizing sustainable economics over rapid innovation. This strategic shift is evident in the recent introduction of GPT-4 Turbo and other GPT models.

Altman maintained a measured approach when discussing GPT-5’s potential capabilities. Although it is expected to be more powerful and sophisticated than its predecessors, even OpenAI is uncertain about what it will be able to do. “Until we start training the model, it’s going to be a fun guessing game for us,” Altman stated, while emphasizing the importance of predicting capabilities for safety reasons.

As noted above, the development and launch timeline of GPT-5 is heavily influenced by two critical factors: the data required for training and the financial resources. Altman said he intends to use publicly available data sets as well as proprietary data provided by companies and organizations.

The economic aspect is particularly crucial, as OpenAI has been constrained by the high demand for NVIDIA’s H100 chips, which are essential for the data centers needed to train AI models. The situation is expected to improve next year, however, as players like AMD and Microsoft develop their own hardware to compete with NVIDIA.

Despite significant revenue growth this year, OpenAI continues to operate at a loss due to the high costs of training AI models. Microsoft’s investment of about $10 billion this year is substantial, but more is anticipated.

OpenAI’s ultimate objective remains the development of Artificial General Intelligence (AGI). “There’s a long way to go and a lot of computing power to develop between where we are today and AGI. And the training expenses are enormous,” Altman explained.

Vishak is Editor-in-chief at Code and Hack with a passion for AI and coding. He follows the latest trends and breakthroughs in machine learning, natural language processing, and software development, and delivers insightful articles and blog posts that keep his readers informed and engaged.
