
Billions of parameters of GPT-4, if released

Question

GPT stands for "Generative Pre-Training" and was introduced in a 2018 paper from OpenAI. GPT-2 became famous within the machine learning community in 2019 for producing surprisingly coherent written text samples; it used 1.5 billion parameters.

In May 2020, OpenAI released GPT-3, a 175 billion parameter model widely regarded as having impressive language generation abilities. The massive increase in parameter count over GPT-2 likely reflects an earlier OpenAI investigation that revealed the relationship between neural language model size and performance (sketched in code after the quote below). Many now interpret OpenAI's strategy as one intended to scale neural models to their ultimate practical limit. Gwern writes,

The scaling hypothesis that, once we find a scalable architecture like self-attention or convolutions, which like the brain can be applied fairly uniformly (eg “The Brain as a Universal Learning Machine” or Hawkins), we can simply train ever larger NNs and ever more sophisticated behavior will emerge naturally as the easiest way to optimize for all the tasks & data, looks increasingly plausible. [...]

In 2010, who would have predicted that over the next 10 years, deep learning would undergo a Cambrian explosion causing a mass extinction of alternative approaches throughout machine learning, that models would scale up to 175,000 million parameters, and that these enormous models would just spontaneously develop all these capabilities, aside from a few diehard connectionists written off as willfully-deluded old-school fanatics by the rest of the AI community.
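For concreteness, the size-performance investigation mentioned above is OpenAI's "Scaling Laws for Neural Language Models" (Kaplan et al., 2020), which fits test loss to a power law in parameter count. The sketch below uses the paper's approximate reported constants (alpha_N ≈ 0.076, N_c ≈ 8.8e13 non-embedding parameters); the function name and the 1-trillion-parameter entry are illustrative assumptions, and the numbers should be treated as rough rather than as a prediction about GPT-4.

```python
# A minimal sketch of the parameter-count scaling law reported in
# "Scaling Laws for Neural Language Models" (Kaplan et al., 2020):
#     L(N) = (N_c / N) ** alpha_N
# where N is the non-embedding parameter count and L is test loss in
# nats/token. Constants are the paper's approximate fit; all numbers
# here are illustrative.

def loss_from_params(n_params: float,
                     n_c: float = 8.8e13,
                     alpha_n: float = 0.076) -> float:
    """Predicted converged test loss (nats/token) for a model with
    n_params non-embedding parameters, data/compute unconstrained."""
    return (n_c / n_params) ** alpha_n

for name, n in [("GPT-2", 1.5e9),
                ("GPT-3", 175e9),
                ("hypothetical 1T model", 1e12)]:
    print(f"{name:>22}: {loss_from_params(n):.2f} nats/token")
```

Under this fit, the roughly 100x jump from GPT-2 to GPT-3 lowers predicted loss from about 2.3 to 1.6 nats/token, and the curve flattens only slowly, which is part of why "just keep scaling" reads as a viable strategy.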

If GPT-4 is released by OpenAI, how many parameters will it contain, in billions? Resolution is via a report from OpenAI.

If OpenAI does not release GPT-4 by January 1st, 2023, this question resolves ambiguously.

In case OpenAI does not explicitly refer to the relevant model as GPT-4, members of the community, community moderators, or admins will run a straw poll on the /r/openai subreddit and ask:

In your opinion, is it roughly correct to say that this model is the successor to GPT-3?

After one week, the majority answer wins, with a tie counting as "yes".
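As a minimal sketch of the tie rule just described (the function name and the vote counts are hypothetical, not part of the resolution criteria):

```python
# Hypothetical encoding of the straw-poll rule above: after one week,
# the majority answer wins, and a tie counts as "yes".

def poll_outcome(yes_votes: int, no_votes: int) -> str:
    """Return the poll result; ties resolve as "yes"."""
    return "yes" if yes_votes >= no_votes else "no"

assert poll_outcome(10, 10) == "yes"  # tie counts as "yes"
assert poll_outcome(3, 7) == "no"     # clear majority for "no"
```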
