I have already asked whether a machine learning model with 100 trillion parameters will be trained before 2026. We still have a way to go before reaching that milestone, but the day before I wrote this question, OpenAI published a paper describing GPT-3, a 175 billion parameter transformer. This model is over an order of magnitude larger than the previous largest models, which had roughly 17 billion parameters.
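As a quick sanity check on the scale jump, using the figures above:

\[
\frac{175 \times 10^{9}}{17 \times 10^{9}} \approx 10.3, \qquad \log_{10}(10.3) \approx 1.01,
\]

so GPT-3 is indeed just over one order of magnitude larger than the previous largest models.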
Physical constraints will eventually slow progress, but things can still get interesting before then.
I ask: how many billions of parameters will the largest machine learning model trained before 2030 have? Resolution is determined by a reliable document, blog post, or paper published anywhere on the internet.