Why it is rational to predict according to your true beliefs

Earlier today, I was making a prediction on a question whose median prediction seemed far too high. Because I want those informed by these predictions to have accurate beliefs, for a brief moment I felt inclined to predict well below the median prediction (and even below my true belief) in order to have the largest impact on the median. But, as some of you might know, this does not make sense.

Here I examine the case of binary questions, where you input a prediction in the [1%, 99%] range, but similar reasoning also applies to numeric-range and date-range predictions.

Suppose my true belief about a question is that it has a probability $p$ of a positive resolution. Suppose that my preferences are such that I prefer the median to be closest to my true belief $p$. After reporting my prediction $r$, all predictions are given by the vector $P = (p_1, \ldots, p_{n-1}, r)$, and the median prediction is $m(P)$. My preferences imply that I want to report a prediction $r$ that minimises the distance between the median and my true belief, which is given by

$$\min_{r} \; \lvert m(P) - p \rvert.$$
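As a concrete illustration of the objective (with made-up numbers, and Python's `statistics.median` standing in for the community median), the quantity being minimised is just the distance between the pooled median and your true belief:

```python
import statistics

# Hypothetical predictions already on the question: the vector P without our report.
others = [0.60, 0.70, 0.75, 0.80, 0.85]
p = 0.40  # our true belief
r = 0.40  # the prediction we choose to report

# Community median m(P) including our report, and the objective |m(P) - p|.
m = statistics.median(others + [r])
distance = abs(m - p)
print(m, distance)  # the median stays well above our belief here
```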

Suppose I report my true belief as my prediction ($r = p$); then there are three possible outcomes:

1. $m(P) = p$,
2. $m(P) > p$,
3. $m(P) < p$.

In case 1, the median is equal to my true belief ($m(P) = p$), and I have no reason to deviate. In case 2, the median is larger than my true belief ($m(P) > p$); since my report already sits below the median, lowering it further leaves the median unchanged, and raising it can only push the median up, so no prediction $r$ will decrease the median below its current value, and I again have no reason to deviate. Similarly, in case 3 ($m(P) < p$), I cannot increase the median above its current value with any prediction $r$, which again gives me no reason to deviate. Hence, predicting your true belief is at least as good as any other prediction (and possibly even better when you're interested in maximising your points!).
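The three-case argument can be sanity-checked with a small brute-force sketch (hypothetical numbers again): scan every allowed report on the [1%, 99%] grid and confirm that reporting the true belief achieves the minimal possible distance between the median and that belief.

```python
import statistics

def median_distance(others, report, true_belief):
    """|m(P) - p|: distance between the pooled median and our true belief."""
    return abs(statistics.median(others + [report]) - true_belief)

# Hypothetical community predictions, and a true belief well below their median.
others = [0.60, 0.70, 0.75, 0.80, 0.85]
true_belief = 0.40

# Try every report allowed on a binary question: 1%, 2%, ..., 99%.
grid = [k / 100 for k in range(1, 100)]
best = min(median_distance(others, r, true_belief) for r in grid)
truthful = median_distance(others, true_belief, true_belief)

# Truthful reporting is (weakly) optimal: no report does better.
assert truthful <= best + 1e-12
```

In this example, deviating below the true belief (say, reporting 0.01) leaves the median unchanged, so it buys nothing; reports above the true belief can only move the median further away.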

I recognise that there might be exceptions to this principle, especially given that predictions on Metaculus are made in sequence, not simultaneously. For instance, suppose you expect that the Metaculus community is likely to be positively biased about a question's chance of resolving positively. Moreover, you know that the anchoring effect of an initial low median will pull future predictions downward, closer toward your true belief. This may plausibly affect the median prediction over the long run. However, I hope that pointing this out will void this strategy, as you now know to update much less on early predictions, given that these might reflect other players' strategic behaviour and not their true beliefs.