Question
Will transformer-derived architectures still be state of the art for language modeling in December 2025?
Total Forecasters: 65
Community Prediction: 90% (85% - 94%)
Authors:
Opened: Jul 27, 2020
Closes: Dec 1, 2025
Scheduled resolution: Dec 1, 2025
Spot Scoring Time: Jul 29, 2020
Related Questions:
Will transformer models be the state-of-the-art on most natural language processing benchmarks on January 1, 2027? (90%)
Will high-impact research on reducing the sample complexity of Large Language Model pretraining be forthcoming before 2026? (60%)
How many billions of parameters will the largest machine learning model trained before 2030 have? (1.04M)