Bayes' theorem...

So, Bayes' theorem says that

$$P(A|B) = \frac{P(B|A)\,P(A)}{P(B)}.$$

So as an example: suppose we knew that a jar was equally likely to contain either 3 or 4 numbered jelly beans, and we drew one at random that turned out to be numbered 1. We could then calculate the probability of there being 3 as opposed to 4 jelly beans.

In this case $A$ is the event "there are 3 jelly beans in the jar" and $B$ is the event "we drew the bean numbered 1", and we want $P(A|B)$.

Now the probability of drawing the number 1 jelly bean IF there are 3 jelly beans is $P(B|A) = 1/3$, and the prior probability of there being 3 jelly beans (as opposed to 4) is $P(A) = 1/2$. But what is $P(B)$? Well, we have to add up all the ways we could draw a '1':

$$P(B) = \sum_i P(B|A_i)\,P(A_i),$$

where $A_i \in \{\text{3 beans},\ \text{4 beans}\}$. So then we have

$$P(A|B) = \frac{P(B|A)\,P(A)}{P(B)} = \frac{\frac{1}{3}\cdot\frac{1}{2}}{\frac{1}{3}\cdot\frac{1}{2} + \frac{1}{4}\cdot\frac{1}{2}} = \frac{1/6}{7/24} = \frac{4}{7} \approx 0.57.$$
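
Here is a small Python sanity check (my own sketch, not part of the original post) that evaluates the same posterior with exact fractions:

```python
from fractions import Fraction

# Likelihoods of drawing the bean numbered 1
p_draw1_given_3 = Fraction(1, 3)   # P(B | 3 beans)
p_draw1_given_4 = Fraction(1, 4)   # P(B | 4 beans)
prior = Fraction(1, 2)             # equal prior probability for 3 vs 4 beans

# Total probability of drawing a '1', summed over both possible jar contents
p_draw1 = p_draw1_given_3 * prior + p_draw1_given_4 * prior

posterior_3 = p_draw1_given_3 * prior / p_draw1
print(posterior_3)                 # 4/7, i.e. about 0.57
```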

Now $A$ was drawn from the discrete set of events $\{\text{3 beans},\ \text{4 beans}\}$, but we could instead imagine a continuous set of events labelled by $x$. Then Bayes' theorem becomes

$$P(x|B) = \frac{P(B|x)\,P(x)}{\int P(B|x')\,P(x')\,dx'}.$$

So let's say someone placed you randomly along a line whose length $x$ could be anywhere from 0 to $L$ meters with equal probability… and you found yourself at 1 meter from the start. What does the probability distribution for the length of the line look like now? Well, initially $P(x) = 1/L$ for $0 < x \le L$. Now, the probability of being found at the 1 meter mark is $P(B|x) = 1/x$ for $x \ge 1$, and $0$ otherwise (the line must be at least 1 meter long). So we have

$$P(x|B) = \frac{\frac{1}{x}\cdot\frac{1}{L}}{\int_1^L \frac{1}{x'}\cdot\frac{1}{L}\,dx'} = \frac{1}{x\,\ln L}, \qquad 1 \le x \le L.$$

The attached image shows the prior probability distribution for maximum line lengths of $L = 10$ and $L = 20$ meters, as well as the new probability distributions after finding oneself at 1 meter. Also shown are the 50th percentile and the expected (average) length. For example, if the initial distribution is between 0 and 10 meters, then the modified distribution has an expected length of 3.90 meters, and the length will be less than $\sqrt{10} \approx 3.2$ meters 50% of the time.

The expected value and the 50th percentile depend on our initial guess for the maximum possible line length $L$. For example, if we assume the length could be anywhere from 0 to 20 meters instead, then the 50th percentile is at 4.5 meters and the expected value is at 6.3 meters. In general the expected value goes as $(L-1)/\ln L$ and the 50th percentile goes as $\sqrt{L}$.
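
As a check (my own sketch, not from the original page), the following Python snippet integrates the posterior $1/(x\ln L)$ numerically and compares the result against the closed forms quoted above:

```python
import numpy as np

def posterior_stats(L, n=200_000):
    """Mean and median of the posterior p(x) = 1/(x ln L) on [1, L]."""
    x = np.linspace(1.0, L, n)
    dx = x[1] - x[0]
    p = 1.0 / (x * np.log(L))     # posterior density after being found at 1 m
    p /= p.sum() * dx             # renormalize away discretization error
    mean = (x * p).sum() * dx
    cdf = np.cumsum(p) * dx
    median = x[np.searchsorted(cdf, 0.5)]
    return mean, median

for L in (10, 20):
    mean, median = posterior_stats(L)
    print(f"L={L}: mean={mean:.2f} (exact {(L - 1) / np.log(L):.2f}), "
          f"median={median:.2f} (exact {np.sqrt(L):.2f})")
```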

The above analysis is for a distribution in which there are equal probabilities of being found at any point along the line, and an equal prior likelihood of the line being any length less than $L$. However, we can also consider a system in which the probability of being at any given point grows exponentially with distance (or time), while still keeping the uniform prior $P(x) = 1/L$ on the total length.

For example, if you chose a random human in history, the odds of finding that human living at time $t$, assuming the human civilization grows exponentially with timescale $\tau$ for $x$ years before ending, should be something like

$$P(t|x) = \frac{e^{t/\tau}}{\int_0^x e^{t'/\tau}\,dt'} = \frac{e^{t/\tau}}{\tau\,(e^{x/\tau} - 1)}, \qquad 0 \le t \le x.$$

In this case we have $P(t|x) \approx e^{(t-x)/\tau}/\tau$ (using $e^{x/\tau} - 1 \approx e^{x/\tau}$ for $x \gg \tau$) and $P(x) = 1/L$. So we have

$$P(x|t) = \frac{P(t|x)\,P(x)}{\int_t^L P(t|x')\,P(x')\,dx'} \approx \frac{e^{-(x-t)/\tau}}{\tau\left(1 - e^{-(L-t)/\tau}\right)}, \qquad t \le x \le L.$$

This gives an expected value of roughly $t + \tau$ and a 50th percentile of roughly $t + \tau\ln 2$ (for $L \gg t + \tau$). In this case the expected value is fairly insensitive to $L$: for the analogous case of being found at $t = 1$ with $\tau = 1$, it quickly asymptotes to 2. So in an exponentially growing system, a random selection will be heavily skewed towards the end, and would give the expectation that the system will end within about one more growth timescale.
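
A final numerical sketch (again my own check, assuming $t = \tau = 1$ and the approximate likelihood above) showing how quickly the posterior mean lifetime settles near $t + \tau = 2$ as the prior cutoff $L$ grows:

```python
import numpy as np

def expected_lifetime(L, t=1.0, tau=1.0, n=400_000):
    """Posterior mean of the total lifetime x, given observation at time t,
    using the approximate likelihood p(t|x) ~ exp((t - x)/tau) and a uniform
    prior on x up to L (all in the same time units)."""
    x = np.linspace(t, L, n)
    dx = x[1] - x[0]
    post = np.exp((t - x) / tau)   # likelihood times prior; the constant 1/L cancels
    post /= post.sum() * dx        # normalize the posterior
    return (x * post).sum() * dx

for L in (2, 5, 10, 100):
    print(L, round(expected_lifetime(L), 3))   # rapidly approaches t + tau = 2
```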

So if civilizations continue to grow exponentially for as long as they exist, then we are likely near the end of ours… However, if civilizations cap out at a fixed number of people, then the situation looks more like the uniform line example, and our expected lifetime becomes dependent on the maximum expected civilization lifetime…

So perhaps this is an argument against colonizing other planets? :) Or for curbing population growth?
