Metaculus
Will artificial superintelligence precede the achievement of longevity escape velocity, if either occur by year 2300?
92% | 138 forecasters | 9 comments | AGI Outcomes
Human Extinction by 2100, conditional on: OpenAI solves alignment before June 30, 2027?
If yes: 1% | If no: 4%
93 forecasters | 11 comments | AI Safety
Will there be a positive transition to a world with radically smarter-than-human artificial intelligence?
50.8% | 415 forecasters | 98 comments | AGI Outcomes
Ragnarök Question Series: If an artificial intelligence catastrophe occurs, will it reduce the human population by 95% or more?
50% | 238 forecasters | 14 comments | Ragnarök Series
After a (weak) AGI is created, how many months will it be before the first superintelligent AI is created?
30.2 months | 298 forecasters | 48 comments | AGI Outcomes
After a weak AGI is created, how many months will it be before the first superintelligent oracle?
18.9 months | 235 forecasters | 36 comments | AGI Outcomes
Will Any Major AI Company Commit to an AI Windfall Clause by 2025?
Resolved: No (1%) | 121 forecasters | 14 comments | Business of AI
Will transformative AI come with a bang?
5 comments | AI Progress Essay Contest
Leverage AI-centric to AGI and ASI-centric beyond Moore's Law
No comments | AI Progress Essay Contest
The Unsolvable Control Problem, Artificial Sentience, and InfoSec
2 comments | AI Progress Essay Contest