
Backpropagation taking a back seat by 2030

Backpropagation is a widely used method for efficiently computing gradients of functions with many inputs and few outputs, which makes it particularly well suited to modern machine learning models such as neural networks. It can be seen as applying the chain rule "from outside to inside", i.e. from the outputs back towards the inputs. It is a special case of automatic differentiation, which imposes no such restriction on the order in which the chain rule is applied; choosing this order appropriately may yield an additional speed-up.
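To make the "outside to inside" sweep concrete, here is a minimal, purely illustrative sketch of scalar reverse-mode automatic differentiation (the class name `Var` and the toy function are inventions for this example, not any framework's API):

```python
class Var:
    """Minimal scalar reverse-mode autodiff node (illustrative sketch only)."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs of (parent_var, local_gradient)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def backward(self):
        # Seed the output with gradient 1, then apply the chain rule from
        # the output back to the inputs -- exactly what backpropagation does.
        self.grad = 1.0
        order, seen = [], set()
        def visit(v):  # topological sort of the computation graph
            if id(v) not in seen:
                seen.add(id(v))
                for p, _ in v.parents:
                    visit(p)
                order.append(v)
        visit(self)
        for v in reversed(order):
            for p, local in v.parents:
                p.grad += v.grad * local

# f(x, y) = x*y + x, so df/dx = y + 1 and df/dy = x
x, y = Var(3.0), Var(4.0)
f = x * y + x
f.backward()
print(x.grad, y.grad)  # 5.0 3.0
```

One backward sweep yields the gradient with respect to every input at roughly the cost of one forward evaluation, which is why this "few outputs, many inputs" regime favours reverse mode.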

At NeurIPS 2022, Geoffrey Hinton presented the forward-forward algorithm, an alternative to backpropagation and, more generally, to automatic differentiation. While it does not yet seem to yield competitive results, it may well attract further research in this direction.
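The core idea can be sketched for a single layer: instead of propagating errors backwards through the whole network, each layer is trained locally to produce high "goodness" (sum of squared activations) on positive data and low goodness on negative data. The sketch below uses invented data, sizes, and hyperparameters; it is a toy illustration of the local-update idea, not Hinton's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

W = rng.normal(scale=0.1, size=(8, 4))   # one layer's weights (toy sizes)
theta = 2.0                              # goodness threshold (assumed value)

def forward(x):
    return np.maximum(W @ x, 0.0)        # ReLU layer

def goodness(x):
    return float(np.sum(forward(x) ** 2))

x_pos = rng.normal(size=4)               # stand-in "positive" sample
x_neg = rng.normal(size=4)               # stand-in "negative" sample

for _ in range(200):
    for x, label in ((x_pos, 1.0), (x_neg, 0.0)):
        y = forward(x)
        g = np.sum(y ** 2)
        p = 1.0 / (1.0 + np.exp(theta - g))  # P(sample is positive)
        # Local gradient of the logistic loss w.r.t. this layer's weights.
        # No chain of gradients through other layers is needed: in a deep
        # network, every layer would be trained independently like this.
        dW = (p - label) * np.outer(2.0 * y * (y > 0), x)
        W -= 0.05 * dW

print(goodness(x_pos), goodness(x_neg))
```

After training, the positive sample's goodness exceeds the negative sample's. Note that the layer still uses a gradient, but only a *local* one; what is avoided is the end-to-end backward pass.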

Will backpropagation take a back seat by 2030?

This question will resolve as Yes if, at any point between January 1, 2023 and January 1, 2030, over 50% of machine learning frameworks use something other than backpropagation or automatic differentiation, with each framework weighted by its relative usage as elicited from , or from whatever we deem to be the closest alternative if that source no longer exists.
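The resolution arithmetic amounts to a usage-weighted share. A small sketch with entirely invented framework names and usage numbers:

```python
# Hypothetical illustration of the resolution check: weight each framework
# by its (made-up) usage share and test whether frameworks using something
# other than backpropagation / automatic differentiation exceed 50%.
frameworks = {
    # name: (usage share, uses backprop or autodiff?)  -- all values invented
    "FrameworkA": (0.55, True),
    "FrameworkB": (0.30, True),
    "FrameworkC": (0.15, False),
}

total = sum(share for share, _ in frameworks.values())
non_ad_share = sum(share for share, uses_ad in frameworks.values()
                   if not uses_ad)

resolves_yes = non_ad_share / total > 0.5
print(f"non-backprop/AD share: {non_ad_share / total:.0%}, "
      f"resolves Yes: {resolves_yes}")
```

With these invented numbers only 15% of usage is non-backprop/AD, so the check fails and the question would not resolve Yes.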

If it is not obvious whether a framework "uses" backpropagation or automatic differentiation (e.g. because multiple methods are implemented, or because the machinery is abstracted away in the code), we look at the "Getting Started" section (or its equivalent) on the framework's website and check whether the first "reasonable" example listed there (no print("Hello world!")) uses backpropagation or automatic differentiation under the hood. If no such page or example exists, resolution will be at the discretion of the moderators, but should involve checking the source code or reaching out to the developers.

Note that automatic differentiation is included because this question is not supposed to resolve positively due to minor local modifications to the order of differentiation which, technically, are no longer backpropagation. Moreover, we use to track progress in machine learning rather than just neural networks, in case neural networks are replaced by a different approach while backpropagation remains the best option for neural networks.
