A major uncertainty in timeline estimates for high-level AI is the minimal computational power necessary to perform the operations that the human brain does.
Estimates in the literature (see appendix A on p. 84 of this paper for a compilation) range over many orders of magnitude in FLOPS, following a variety of methodologies. (For comparison, the Landauer limit at 20 °C sets a minimum energy per bit erasure, illustrated below. However, the author has no clear idea how to convert between bit erasures and FLOPS.)
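As a rough point of reference (not part of the resolution criteria), a minimal worked calculation of the Landauer bound at 20 °C (≈293 K); the per-second figure assumes, purely for illustration, a power budget of about 20 W, roughly that of a human brain:

$$E_{\min} = k_B T \ln 2 \approx (1.38 \times 10^{-23}\,\mathrm{J/K})(293\,\mathrm{K})(0.693) \approx 2.8 \times 10^{-21}\,\mathrm{J\ per\ bit\ erasure},$$

so a 20 W system operating at this limit could perform at most roughly $20\,\mathrm{W} / (2.8 \times 10^{-21}\,\mathrm{J}) \approx 7 \times 10^{21}$ bit erasures per second.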
This huge range will probably eventually be narrowed down to within an order of magnitude or two, and we ask for that number here. Assume that by 2075 there is either (a) a full software emulation of a human brain that can duplicate the basic functionality of a typical adult human of average intelligence; (b) an AI system that can pass a full "strong" Turing test (i.e. the interview is long, adversarial, and includes sensory data); or (c) a computer system that attains "human intelligence parity" by the definition set forth in this Metaculus question. In each case (a, b, c) the number will be evaluated on a state-of-the-art system five years after the first demonstration of a system satisfying the criterion.
What will the computational power of this machine system be, in FLOPS, if it runs at a speed comparable to that of human mental processing?
The point of this question is not really to make a prediction, but rather to serve as a gathering place for estimates.
Fine print: we'll settle for a published estimate accurate to within a factor of 5. The speeds of the systems can be matched by requiring that the delay between a query and its response in the machine system be similar to that of a human, or by scaling to achieve this equivalency. Resolves as ambiguous if none of (a), (b), or (c) occurs by 2075.
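As an illustrative reading of the factor-of-5 tolerance (the symbols $\hat{F}$ and $F$ are introduced here for clarity and do not appear in the official wording): a published estimate $\hat{F}$ counts as accurate against the evaluated value $F$ if

$$\frac{1}{5} \le \frac{\hat{F}}{F} \le 5, \quad\text{i.e.}\quad \left|\log_{10}\!\left(\hat{F}/F\right)\right| \le \log_{10} 5 \approx 0.7.$$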
(Edited 2020-09-13 to fix the evaluation date as 5 years after such a system first appears.)