
OpenAI’s latest model hits a roadblock. There’s not enough data in the world to train it: Report

Welcome To Latest IND >> Fastest World News

Dec 22, 2024 09:43 AM IST

This is highly critical, since the $157 billion valuation investors gave OpenAI in October is largely dependent on GPT-5, also known as Orion, working out.

OpenAI’s new artificial intelligence (AI) project is behind schedule, running up huge bills, and it isn’t even clear whether it will work, according to a report by the Wall Street Journal. One reason: there may not be enough data in the world to make it smart enough.

OpenAI logo is seen in this illustration taken May 20, 2024. (Dado Ruvic/Reuters)

This is highly critical, since the $157 billion valuation investors gave the company in October is largely dependent on the project working out.


The project, officially known as GPT-5 and codenamed Orion, has been in the works for over 18 months, ever since GPT-4 was launched. OpenAI’s closest partner, Microsoft, expected to see the new model around mid-2024.

However, new problems kept surfacing, and Orion kept falling short even after OpenAI conducted at least two large training runs, each involving months of crunching huge amounts of data to make the model smarter.

The problem is that while Orion does perform better than all of the company’s current products, it hasn’t advanced far enough to justify the huge cost of developing it.

For reference, computing costs alone for a six-month training run can come to around half a billion dollars. In contrast, the current GPT-4 cost around $100 million to train.

GPT-5, or Orion, was meant to advance to the point where it could unlock new scientific discoveries and handle routine human tasks such as booking appointments or flights.

For comparison, if GPT-4 is a smart high-schooler, GPT-5 would have to perform like a Ph.D. holder at some tasks.


Problems in training Orion

The biggest issue in training the model is that the company’s researchers found the public internet didn’t have enough data to train it; previous models, by contrast, had been trained largely on such public data.

OpenAI’s solution was to create data from scratch by hiring software engineers and mathematicians to write fresh software code or solve math problems for Orion to learn from, though it is a painfully slow process.

The company is also using synthetic data, meaning data created by AI itself, to train Orion. However, this has resulted in malfunctions or nonsensical answers.

None of these issues were apparent in small-scale training efforts; they surfaced only after full-fledged training had started. By then, the company had spent too much money to start all over again.


OpenAI’s corporate problems

On top of the challenges with Orion, OpenAI is also facing other issues, such as internal turmoil and near-constant attempts by rivals to poach its top researchers, sometimes with offers worth millions of dollars.

As a result, over two dozen key executives have left the company this year, including co-founder and Chief Scientist Ilya Sutskever, Chief Technology Officer Mira Murati, and Alec Radford, a widely admired researcher who served as lead author on several of OpenAI’s scientific papers.

Rivals have also started to catch up as GPT-4 ages, with Anthropic’s newest LLM rated by many in the industry as better than GPT-4, and Google releasing NotebookLM.
