Will transformer derived architectures still be state of the art for language modeling in December 2025?
94% chance
66 forecasters · 7 comments
Opened: Jul 27, 2020
Closes: Dec 1, 2025
Scheduled resolution: Dec 1, 2025
Spot scoring time: Jul 29, 2020
Related questions:
- Will transformer models be the state-of-the-art on most natural language processing benchmarks on January 1, 2027? 98% chance (19 forecasters)
- Will high-impact research on reducing the sample complexity of Large Language Model pretraining be forthcoming before 2026? 50% chance (33 forecasters)
- How many billions of parameters will the largest machine learning model trained before 2030 have? 1.01M B (190k - 7.01M) (51 forecasters)