The race for ultra-powerful AI: who will reach the 2-trillion-parameter peak

Elizabeth Smith

If the emergence of generative AI has left you stunned, know that we have seen nothing yet. The genie is out of the bottle, and it will be hard to put back in.

Indeed, the race is wilder than ever: six projects are vying to create AI models that go beyond 2 trillion parameters. Yes, you read that right: trillions.

AI titans with 2 trillion parameters

The six major projects competing for the 2-trillion-parameter milestone are OpenAI, Anthropic, Google/DeepMind, Meta, a British government project, and one that is still secret.

And it is a “deadly” race: keeping up requires economic resources. Lots of them. Between $1 billion and $2 billion a year, to constantly upgrade hardware (increasingly voracious in computation and energy), as well as to hire hundreds of specialists and retain the best team members with million-dollar salaries and stock options.

GPT-5: The return of the king

After taking half the world by storm, OpenAI already has a possible knockout blow in the works: one that could secure supremacy for Sam Altman’s company.

GPT-5 is expected to be completed by the end of 2023 and released in early 2024, with between 2 and 5 trillion parameters. Its capabilities are, for now, hard to imagine.


Claude-Next: Anthropic and its ambitious project

Anthropic, the team founded by former OpenAI employees, is working on a model called Claude-Next, which aims to be 10 times more powerful than current AI. With $1 billion in funding already secured and another $5 billion on the way, Anthropic expects to achieve its goals over the next 18 months.

Their flagship model will require 10^25 FLOPs of training compute, using clusters of tens of thousands of GPUs. Among Anthropic’s backers is Google itself, which is hedging its bets across multiple projects.
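For a rough sense of what 10^25 FLOPs implies, here is a minimal back-of-envelope sketch in Python. The per-GPU throughput, utilization rate, and cluster size below are illustrative assumptions, not figures reported by Anthropic.

```python
# Back-of-envelope: how long would a 1e25-FLOP training run take?
# All hardware figures below are illustrative assumptions.

TOTAL_FLOPS = 1e25          # training compute cited in the article
GPU_PEAK_FLOPS = 312e12     # assumed per-GPU throughput (~A100-class, BF16)
UTILIZATION = 0.4           # assumed fraction of peak actually achieved
NUM_GPUS = 25_000           # "tens of thousands of GPUs"

effective_flops_per_s = NUM_GPUS * GPU_PEAK_FLOPS * UTILIZATION
seconds = TOTAL_FLOPS / effective_flops_per_s
print(f"~{seconds / 86_400:.0f} days of continuous training")
# -> roughly 37 days under these assumptions
```

Under these assumed numbers, the run occupies the whole cluster for over a month, which helps explain the billion-dollar annual budgets mentioned above.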

Gemini: Google seeking redemption with DeepMind

Google and DeepMind are collaborating to develop a GPT-4 competitor called Gemini. The project got under way recently, after Bard proved unable to compete with ChatGPT.

Gemini will be a large language model with trillions of parameters, similar to GPT-4 or GPT-5, and will use tens of thousands of Google’s TPU AI chips for training. No word yet on whether it will be multimodal.

DeepMind has also developed Sparrow, a web-connected chatbot similar to ChatGPT and optimized for safety. DeepMind researchers found that Sparrow’s answers are plausible and supported by evidence 78 percent of the time.

Another leading DeepMind model is Chinchilla, a 70-billion-parameter model trained on 1.4 trillion tokens.

The parameters of an unthinkable future

If you want to get an idea of what 2 trillion parameters means, consider that the estimated total amount of usable, high-quality textual data in the world is between 4.6 trillion and 17.2 trillion tokens.

That covers all the books, scientific articles, news stories, the whole of Wikipedia, publicly available code, and much of the rest of the Internet, filtered for quality. In short, virtually all digitized human knowledge.
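Why does that range matter for a 2-trillion-parameter model? Here is a minimal sketch, assuming the Chinchilla scaling heuristic of roughly 20 training tokens per parameter; the heuristic is a rule of thumb from DeepMind’s Chinchilla work, not a figure from this article.

```python
# Data needed for a compute-optimal 2-trillion-parameter model, using the
# assumed Chinchilla heuristic of ~20 training tokens per parameter.

params = 2e12                               # 2 trillion parameters
tokens_needed = 20 * params                 # compute-optimal training tokens
usable_low, usable_high = 4.6e12, 17.2e12   # estimated usable text (article)

print(f"tokens needed: {tokens_needed:.1e}")                 # 4.0e+13
print(f"usable text:   {usable_low:.1e} to {usable_high:.1e}")
```

By this rule of thumb, a compute-optimal 2-trillion-parameter model would want around 40 trillion tokens, more than even the upper estimate of usable text. In other words, data, not just compute, becomes a bottleneck at this scale.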

As larger models arrive, new capabilities will emerge. Over the next six years, improvements in computing power and algorithms are expected to allow models to scale a thousand-fold, and likely far more.

Nvidia CEO Jensen Huang has predicted AI models a million times more powerful than ChatGPT within 10 years.

