OpenAI Roadmap: GPT-4 API Costs to Be Reduced, Open-Sourcing GPT-3 Under Consideration

2023-06-01 19:58:54

According to ChainCatcher, citing a blog post from the AI development platform Humanloop, OpenAI CEO Sam Altman said in a closed-door seminar that OpenAI is currently severely constrained by GPU availability, forcing it to delay many short-term plans. Most complaints about ChatGPT's reliability and speed stem from this shortage of GPU resources.

Sam Altman also shared OpenAI's near-term roadmap: in 2023, the cost of the GPT-4 API will be reduced; longer ChatGPT context windows (up to 1 million tokens) are planned, along with a future API version that remembers conversation history; and GPT-4's multimodal capabilities will not be publicly available until 2024, because the vision-enabled version of GPT-4 cannot be scaled to everyone until more GPU resources are acquired.
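The "API that remembers conversation history" point contrasts with today's stateless chat-completion APIs, where the client must resend all prior turns with every request, consuming context-window tokens each time. A minimal sketch of that current pattern (function and variable names are illustrative, and no real API is called):

```python
# Sketch of the stateless pattern used by today's chat-completion APIs:
# the client keeps the transcript locally and resends it on every request.
# A "stateful" API as described in the roadmap would instead store this
# history server-side. Illustrative only; no network call is made.

def build_request(history, user_message, model="gpt-4"):
    """Assemble a chat-completion-style payload that includes full history."""
    messages = history + [{"role": "user", "content": user_message}]
    return {"model": model, "messages": messages}

history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi! How can I help?"},
]

# Every prior turn rides along with the new message, so long conversations
# burn context-window tokens on each call.
request = build_request(history, "Summarize our chat so far.")
print(len(request["messages"]))  # 4
```

This is why a longer context window and a history-remembering API address the same user pain from two directions: one raises the ceiling on how much transcript fits per request, the other removes the need to resend it at all.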

Additionally, OpenAI is considering open-sourcing GPT-3. One reason it has not yet done so is the belief that few individuals and companies are capable of responsibly operating such a large language model. The recent claims in many articles that "the era of giant AI models is over" are incorrect: OpenAI's internal data indicate that scaling laws (performance improving predictably with model scale) still hold, and the scale of OpenAI's models may double or triple each year (various sources put GPT-4 at around 1 trillion parameters), rather than increasing by many orders of magnitude. (Source link)
