Discussion about this post

Neural Foundry

Nice framework for structuring forecasts. The conjunction deflation problem in conditional chains is something I see constantly - people will say each step is "70% likely" without realizing that five steps at 70% each gives you about 17% total. Had a project manager once who kept piling "probably fine" assumptions onto a roadmap and then was shocked when nothing shipped on time.
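A quick sketch of that arithmetic (the 70% and five steps are just the numbers from the comment, assuming the steps are independent):

```python
# Five independent steps, each "70% likely": the chain succeeds only
# if every step does, so the probabilities multiply.
p_step = 0.70
n_steps = 5
p_all = p_step ** n_steps
print(f"{p_all:.1%}")  # ≈ 16.8%, i.e. the ~17% quoted above
```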

Bolton

> 1−(1−1/32)^1 ≈ 3.1%.

This is a nice technique, but the math seems sensitive to the choice of scale. Suppose we had taken our "period" for assessing the per-period probability to be months instead of years. Then we would have calculated a 1/(30*12 + 2) ≈ 0.00276 per-month chance of satellite attack, and substituting values into the equation above, we would get a slightly higher chance:

1 - (1 - 1/(30*12 + 2))^12 ≈ 3.26%

If we had taken it to be a decade, then we would have calculated a 1/(3 + 2) = 0.2 per-decade chance of satellite attack, and substituting in values, we would get something lower:

1 - (1 - 1/(3 + 2))^(1/10) ≈ 2.21%

Is there some art to choosing the period over which we average? Is it a good idea to take the limit as the period gets smaller and smaller?
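The three calculations in this comment can be checked with a small helper. This is an illustrative sketch, assuming the per-period rate comes from Laplace's rule of succession with zero observed events over 30 years, as in the quoted figures; the function name is mine, not from the post:

```python
# Laplace's rule of succession with 0 observed events gives a
# per-period probability of 1 / (periods_observed + 2). Converting
# that to an annual probability depends on how long a "period" is,
# which is what makes the result scale-sensitive.

def annual_risk(periods_observed: float, periods_per_year: float) -> float:
    """Per-period Laplace estimate (0 events), converted to one year."""
    p_per_period = 1 / (periods_observed + 2)
    return 1 - (1 - p_per_period) ** periods_per_year

# 30 years with no attack, measured at three granularities:
print(annual_risk(30, 1))        # years:   1 - (1 - 1/32)^1     ≈ 3.13%
print(annual_risk(30 * 12, 12))  # months:  1 - (1 - 1/362)^12   ≈ 3.26%
print(annual_risk(3, 1 / 10))    # decades: 1 - (1 - 1/5)^(1/10) ≈ 2.21%
```

The spread between the month-level and decade-level answers comes entirely from the "+2" pseudo-counts, whose weight relative to the observed data changes with the period length.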
