Will a 100 trillion parameter deep learning model be trained before 2026?

In the last few years, the size of the largest deep learning models has grown enormously. Within natural language processing, the largest models have gone from 94 million parameters (ELMo, in 2018) to 17 billion parameters (Turing-NLG, in early 2020).
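As a rough illustration of that pace, here is a naive constant-growth extrapolation from just those two data points (an illustration only, not a forecast):

```python
import math

# The two NLP data points above.
params_2018 = 94e6    # largest NLP model, 2018
params_2020 = 17e9    # largest NLP model, early 2020
target = 100e12       # 100 trillion parameters

# Annualized growth factor over the ~2-year gap, then years to reach
# the target if that rate were to continue unchanged.
growth_per_year = (params_2020 / params_2018) ** 0.5   # ~13x per year
years_to_target = math.log(target / params_2020) / math.log(growth_per_year)

print(f"~{growth_per_year:.0f}x per year; 100T reached around {2020 + years_to_target:.1f}")
```

Taken at face value, the trend alone crosses 100 trillion parameters well before 2026, though there is no guarantee that parameter counts keep growing this fast.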

Now, Microsoft has released DeepSpeed, a new library, along with a memory-efficient optimizer, both of which aid in training extremely large models distributed across GPU clusters. From their blog post:

The Zero Redundancy Optimizer (abbreviated ZeRO) is a novel memory optimization technology for large-scale distributed deep learning. ZeRO can train deep learning models with 100 billion parameters on the current generation of GPU clusters at three to five times the throughput of the current best system. It also presents a clear path to training models with trillions of parameters, demonstrating an unprecedented leap in deep learning system technology. [...] With all three stages enabled, ZeRO can train a trillion-parameter model on just 1024 NVIDIA GPUs.
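For concreteness, here is a minimal sketch of what enabling ZeRO looks like with DeepSpeed. The toy model and config values are hypothetical, and training is normally launched across many GPUs with the `deepspeed` launcher; consult the DeepSpeed documentation for the exact API:

```python
import torch
import deepspeed

# Hypothetical toy model standing in for a very large network.
model = torch.nn.Sequential(
    torch.nn.Linear(1024, 4096),
    torch.nn.ReLU(),
    torch.nn.Linear(4096, 1024),
)

# DeepSpeed config enabling ZeRO. Higher stages partition more state
# across GPUs: stage 1 shards optimizer states, stage 2 adds gradients,
# stage 3 adds the model parameters themselves.
ds_config = {
    "train_batch_size": 32,
    "fp16": {"enabled": True},
    "zero_optimization": {"stage": 2},
}

# Wraps the model in a distributed engine that applies ZeRO partitioning.
model_engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)
```

The memory win is that each GPU holds only its shard of the optimizer states, gradients, and (at stage 3) parameters, which is what lets trainable model size scale with the number of GPUs rather than being capped by single-GPU memory.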

For comparison, the current top supercomputer, Summit, has 27,648 NVIDIA V100 GPUs, suggesting that training models with tens of trillions of parameters is already within theoretical reach.
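The arithmetic behind that claim, taking the blog post's trillion-parameters-on-1024-GPUs figure at face value (a back-of-envelope sketch that ignores communication and activation-memory overheads):

```python
# ZeRO's claim: ~1e12 parameters trainable on 1024 GPUs,
# i.e. roughly 1e9 parameters' worth of training state per GPU.
params_per_gpu = 1e12 / 1024

# Summit's GPU count, scaled at the same per-GPU capacity.
summit_gpus = 27_648
theoretical_params = params_per_gpu * summit_gpus

print(f"~{theoretical_params / 1e12:.0f} trillion parameters")  # ~27 trillion
```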

Also, recent advances in neural architectures, such as the new Reformer, may make it possible to train large models that use memory much more efficiently.
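The Reformer's central memory saving comes from replacing full self-attention, whose score matrix grows quadratically with sequence length, with locality-sensitive-hashing attention that scales roughly as O(L log L). A rough sense of the gap at the 64K-token sequence lengths used in the Reformer paper (a simplification that ignores heads, layers, and constant factors):

```python
import math

seq_len = 65_536        # 64K tokens
bytes_per_score = 4     # fp32

# Full attention materializes an L x L matrix of scores: ~16 GiB.
full_attention = seq_len ** 2 * bytes_per_score

# LSH attention costs roughly O(L log L) in the same units: ~4 MiB.
lsh_attention = seq_len * math.log2(seq_len) * bytes_per_score

print(f"full: {full_attention / 2**30:.1f} GiB, LSH: {lsh_attention / 2**20:.1f} MiB")
```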

I have chosen 100 trillion because it is considered by some to be a median estimate of the number of synapses in the human neocortex.

This question resolves positively if and when a reliable paper, blog post, or other document is published reporting that a deep learning model with at least 100 trillion parameters was trained before January 1st, 2026 (no details other than the parameter count need to be reported). Otherwise, this question resolves negatively.
