


Ragnarök Question Series: results so far

It’s dangerous to be alive and risks are everywhere. But not all risks are created equal. Luckily, the Ragnarök Question Series will bring essential clarity to your evaluation of the risks facing humanity. Its aim is to analyse some of the catastrophes most likely to befall us, and to identify potential causes of our collective demise.

Predictions about global catastrophes this century

This table contains all probabilities computed on the 14th of October, 2019, and was generated from 1,018 predictions. The first two columns are computed as the product of the overall probability of a catastrophe and the probability that, if a catastrophe occurs, it is due to the named cause. The third and fourth columns are the products of the earlier columns and the probability that, if the named catastrophe occurs, more than 95% of the population will be lost. The precise specification of each question may be found in the question list below. ">10% decline?" and ">95% decline?" refer to the percentage loss in the Earth's human population, relative to the pre-catastrophe population, over some specified timeframe.
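As a sketch of how the table's columns combine, the per-cause figures are simple products of conditional probabilities. The input values below are assumptions chosen for illustration (they roughly reproduce the nuclear-war figures of 3.24% and 0.36% quoted later), not the actual community or Metaculus predictions:

```python
# Hypothetical illustration of how the table's columns are derived.
# All three input probabilities are assumed values, not real forecasts.
p_catastrophe = 0.30         # P(>10% population loss this century) -- assumed
p_nuclear_given_cat = 0.108  # P(cause is nuclear war | catastrophe) -- assumed
p_95_given_nuclear = 0.111   # P(>95% loss | nuclear catastrophe) -- assumed

# First/second columns: unconditional probability of a nuclear catastrophe
# causing a >10% population loss.
p_nuclear_10 = p_catastrophe * p_nuclear_given_cat

# Third/fourth columns: the above, further multiplied by the probability
# that such a catastrophe escalates to a >95% population loss.
p_nuclear_95 = p_nuclear_10 * p_95_given_nuclear

print(f"P(nuclear catastrophe, >10% loss) = {p_nuclear_10:.2%}")
print(f"P(nuclear catastrophe, >95% loss) = {p_nuclear_95:.2%}")
```

With these assumed inputs the products come out to about 3.24% and 0.36%, matching the shape of the calculation described above.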

Some conclusions regarding the findings updated on the 14th of October, 2019:

  • The Metaculus prediction found nuclear war to be the most severe catastrophic risk, assigning it the highest probability of causing a catastrophe with a >10% human population loss (at 3.24%) and a >95% human population loss (at 0.36%).

  • The risk associated with an AI failure mode is found to be the second-most severe in terms of >95% human population loss (with a 0.34% probability), according to the Metaculus prediction. Moreover, it predicted that, out of all catastrophes that result in a >10% loss of population, an AI catastrophe is the most likely to also result in a >95% loss of population. That is, of all the catastrophes considered, an AI catastrophe, if it were to occur, is the most likely to result in near-to-complete extinction.

  • By contrast, climate or geoengineering catastrophes, if these were to occur, are found to be the least likely to result in a >95% loss of population, according to the Metaculus prediction.

  • The community prediction has remained more gloomy than the Metaculus prediction with regard to both the 10% and 95% reductions in human population this century.


Please have a look at the questions individually and contribute your own insights, analyses, or factorizations.[1]

  1. By 2100 will the human population decrease by at least 10% during any period of 5 years?

  2. Will such a catastrophe be due to either human-made climate change or geoengineering?

  3. Will such a catastrophe be due to a nanotechnology failure-mode?

  4. Will such a catastrophe be due to nuclear war?

  5. Will such a catastrophe be due to an artificial intelligence failure-mode?

  6. Will such a catastrophe be due to biotechnology or bioengineered organisms?

For those with an interest in the fate of humanity in the event that some of the above catastrophes occur, have a look at the following questions.

  1. If a global biological catastrophe occurs, will it reduce the human population by 95% or more?

  2. If an artificial intelligence catastrophe occurs, will it reduce the human population by 95% or more?

  3. If a nuclear catastrophe occurs, will it reduce the human population by 95% or more?

  4. If a global climate disaster occurs by 2100, will the human population decline by 95% or more?


[1] Since multiple catastrophes are possible this century, the sum of the predictions on specific risks may well exceed 100%.
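A toy example of the point in this footnote, using two assumed, independent catastrophe probabilities (the 0.6 values are purely illustrative): the naive sum of per-risk probabilities can exceed 100% even though the probability of at least one catastrophe cannot.

```python
# Illustration of footnote [1]: because several catastrophes can occur in
# the same century, per-cause probabilities need not sum to 100% or less.
p_a = 0.6  # assumed probability of catastrophe A this century
p_b = 0.6  # assumed probability of catastrophe B, independent of A

naive_sum = p_a + p_b                 # exceeds 100%
p_any = 1 - (1 - p_a) * (1 - p_b)     # P(at least one catastrophe) stays below 100%

print(f"sum of individual probabilities: {naive_sum:.0%}")
print(f"P(at least one catastrophe):     {p_any:.0%}")
```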