## rarity

posted on 13 Nov 2019

The rarity of an event is the number of coin flips you need to make an all-heads result as rare as that event.

Rarity is just a new term for (binary) “self-information” or “surprisal”, which is the logarithm of the inverse of the probability of an event.

Rarity uses log2 instead of the natural log because it is intended to be accessible to programmers for back-of-envelope calculations.

``````R(e) = log2(1/P(e)) = -log2(P(e))
<=>
P(e) = 1 / 2^R(e)
``````
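As a sketch, the definition translates directly into code (the function names here are mine, not from any library):

``````python
import math

def rarity(p):
    """Rarity in bits: the number of coin flips needed to make
    an all-heads result as rare as an event with probability p."""
    return -math.log2(p)

def probability(r):
    """Invert a rarity back into a probability."""
    return 1 / 2**r

print(rarity(1/2))    # 1.0 (flipping heads)
print(rarity(1/6))    # ~2.585 (rolling a 6)
print(probability(3)) # 0.125
``````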

(“Natural rarity” could be the term for rarity measured in nats instead of bits. Again, this is just “surprisal”.)

Events that have a probability less than 0.5 have a rarity greater than 1. Events that have a probability greater than 0.5 have a rarity between 0 and 1.

Here is a table of some events. Suggestions are welcome for more useful reference values. Assume uniform distributions for event spaces if not specified.

``````event space   | event                  | probability | rarity
--------------|------------------------|-------------|-------
'80/20 rule'  | 80% case               |  0.8        | ~0.32
day at random | on Monday-Friday       | ~0.714      | ~0.485
              |                        | ~0.707      |  1/2
flip a coin   | heads                  |  1/2        |  1
              |                        |  1/3        | ~1.585
day at random | on Saturday-Sunday     | ~0.286      | ~1.807
4-way stop    | next car goes north    |  1/4        |  2
'80/20 rule'  | 20% case               |  0.2        | ~2.322
6-sided die   | roll a 6               |  1/6        | ~2.585
              |                        |  0.125      |  3
day in June   | the 13th               | ~0.033      | ~4.907
8 bits        | is a given byte        | ~0.0039     |  8
day of year   | is June 13th           | ~0.0027     |  8.51
``````
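The table values can be checked with the same one-liner (assuming uniform distributions, as the table does):

``````python
import math

rarity = lambda p: -math.log2(p)

print(round(rarity(0.8), 2))    # 0.32  ('80/20 rule', 80% case)
print(round(rarity(5/7), 3))    # 0.485 (random day falls on a weekday)
print(round(rarity(1/30), 3))   # 4.907 (day in June is the 13th)
print(round(rarity(1/365), 2))  # 8.51  (day of year is June 13th)
``````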

Rarity is good for talking about independent events. Flipping heads has a rarity of 1. Rolling a 6 (with a 6-sided die) has a rarity of ~2.585. Therefore flipping heads and rolling a 6 at the same time has a rarity of ~3.585.
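This additivity is just the log of the product rule: for independent events, probabilities multiply, so rarities add. A quick check:

``````python
import math

rarity = lambda p: -math.log2(p)

# Joint probability of heads AND a 6 is (1/2)*(1/6) = 1/12.
joint = rarity(1/2 * 1/6)
print(joint)                      # ~3.585
print(rarity(1/2) + rarity(1/6))  # same value
``````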

### forecast

Rarity is also good for talking about entropy. Entropy is usually expressed in terms of probability, but it is easier to understand in terms of rarity.

``````H(E) = sum{e in E}( P(e) * -log2(P(e)) )
=>
H(E) = sum{e in E}( R(e) / 2^R(e) )
``````
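In other words, entropy is the expected rarity of the event space. A sketch of both formulations, confirming they agree (a fair 6-sided die has entropy log2(6) ≈ 2.585 bits):

``````python
import math

def entropy_from_probability(probs):
    """H(E) = sum over e of P(e) * -log2(P(e))."""
    return sum(p * -math.log2(p) for p in probs)

def entropy_from_rarity(probs):
    """H(E) = sum over e of R(e) / 2^R(e)."""
    rarities = (-math.log2(p) for p in probs)
    return sum(r / 2**r for r in rarities)

die = [1/6] * 6
print(entropy_from_probability(die))  # ~2.585
print(entropy_from_rarity(die))       # same
``````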

This formulation reveals a relationship to the structural encoding of an event space…