The viral chatbot interface is based on GPT-3, said to be one of the largest and most complex language models ever created – built from 175 billion “parameters” (the adjustable weights a model learns during training, not data points).
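To make the term concrete: a parameter count is just the sum of learnable weights across all of a model's layers. A minimal sketch, using a hypothetical fully connected layer (the function name and sizes here are illustrative, not anything from OpenAI's code):

```python
# Illustrative only: counting the learnable parameters of one dense
# (fully connected) layer, the kind of building block whose counts
# add up to figures like GPT-3's 175 billion.
def dense_layer_params(inputs: int, outputs: int) -> int:
    weights = inputs * outputs   # one weight per input-output connection
    biases = outputs             # one bias per output unit
    return weights + biases

# A toy layer mapping 1,024 features to 4,096 features:
print(dense_layer_params(1024, 4096))  # 4198400 parameters
```

A full model stacks hundreds of such layers, which is how the totals climb into the billions.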
However, it’s something of an open secret that its creator – the AI research organization OpenAI – is well into development of its successor, GPT-4. Rumor has it that GPT-4 will be far more powerful and capable than GPT-3. One source even went as far as claiming that the parameter count has been upped to the region of 100 trillion, although this has been disputed in colorful language by Sam Altman, OpenAI’s CEO.
But could it be the case that GPT-4 is already among us? Nothing has been announced for certain, but some have speculated that this is the version powering the newly launched ChatGPT functionality within Microsoft’s Bing search engine. Although unexpected, it’s a claim that makes sense, given that the Redmond tech giant recently became OpenAI’s largest single investor, with a $10 billion stake.
So how does GPT-4 differ from what has come before it, and does it take AI closer to becoming, in the words of Google CEO Sundar Pichai, “more profound than fire or electricity” in terms of its impact on society? Here’s what we know so far:
GPT-4 will be released sometime in 2023
Although, as of writing, nothing has been announced officially, numerous outlets, including the New York Times, have reported that rumors are rife around the tech industry that GPT-4 is ready for release and is likely to see the light of day outside of OpenAI’s research laboratories this year. In fact, as we've stated, some believe that it's already here, in the form of the chat functionality recently added to Bing. Currently, users have to join a waiting list to get access to ChatGPT-powered Bing, but Microsoft has said it plans to open it to millions of users before the end of February.
If this turns out to be nonsense and Bing is, in fact, running on plain old GPT-3 or GPT-3.5 (an updated version released last year), then we may have to wait a little longer. GPT-3 was initially made available to selected partners, paying customers, and academic institutions before it became widely available to the public with the launch of ChatGPT in late 2022, and a similar controlled release may be used with GPT-4.
It might not be trained on much more data than GPT-3
Again, this is unconfirmed, but it seems a safe bet. Altman himself has dismissed the claim that GPT-4 will have 100 trillion parameters as "complete bullshit," but some sources claim it could be up to 100 times larger than GPT-3, which would put it in the region of 17.5 trillion parameters. However, Altman has also gone on record as saying it may not, in fact, be much larger than GPT-3. This is because effort may instead go into improving the model’s ability to make use of existing data rather than simply throwing more and more data at it. Some experts point to the fact that a rival large language model (LLM) – Nvidia and Microsoft’s Megatron-Turing NLG – has substantially more parameters than GPT-3 yet does not outperform OpenAI’s platform in testing, as evidence that bigger is not always better in the realm of AI. Improving the efficiency of the algorithm would reduce the running cost of GPT-4 and, presumably, ChatGPT – an important factor if it is to become as widely used as the most popular search engines, as some predict.
GPT-4 will be better at generating computer code
Earlier this year, news broke that OpenAI was actively hiring programmers and software developers – specifically, programmers skilled at using human language to describe what their code does. This has led many to predict that future products, including GPT-4, will push AI even further in generating computer code. That could mean more powerful versions of tools such as Microsoft's GitHub Copilot, which currently uses a fine-tuned version of GPT-3 to improve its ability to turn natural language into code.
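To illustrate the kind of task these tools perform: a developer writes a plain-language description (here, a docstring), and the model is expected to fill in the body. The function below is a made-up example of that workflow, not OpenAI or GitHub code:

```python
# Illustrative of the natural-language-to-code task Copilot-style tools
# handle: the developer supplies the docstring, the model completes the body.
def moving_average(values, window):
    """Return the average of each consecutive `window`-sized slice of
    `values`, e.g. moving_average([1, 2, 3, 4], 2) -> [1.5, 2.5, 3.5]."""
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

print(moving_average([1, 2, 3, 4], 2))  # [1.5, 2.5, 3.5]
```

This is also why OpenAI hiring programmers who can describe code in human language matters: pairs of descriptions and implementations are exactly the training signal such models need.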
GPT-4 will not add graphics to its capabilities
There had been some speculation that the next evolution of generative AI would combine the text generation of GPT-3 with the image creation abilities of OpenAI’s other flagship tool, DALL-E 2. This is an exciting idea because it raises the possibility of turning data into charts, graphics, and other visualizations – functionality missing from GPT-3. However, Altman denied that this is true and said that GPT-4 would remain a text-only model.
Some people will be disappointed by GPT-4
When something causes as much excitement as GPT-3 has, it’s almost inevitable that the next iteration will seem less groundbreaking. After all, once we’ve been amazed by a computer writing poetry, will we be as amazed a few years later by a computer writing slightly better poetry? Even Altman himself has expressed this sentiment, saying in a January interview: “The GPT-4 rumor mill is a ridiculous thing. I don't know where it all comes from … people are begging to be disappointed, and they will be.”