# Impromptu gambling with dice

This Riddler puzzle is an impromptu gambling game about rolling dice.

You and I stumble across a 100-sided die in our local game shop. We know we need to have this die — there is no question about it — but we’re not quite sure what to do with it. So we devise a simple game: We keep rolling our new purchase until one roll shows a number smaller than the one before. Suppose I give you a dollar every time you roll. How much money do you expect to win?

Extra credit: What happens to the amount of money as the number of sides increases?
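Before reading on, the game is easy to sample numerically. Below is a small Monte Carlo sketch (my own code, not part of the original post; the function names are mine) that averages many plays of the 100-sided game and compares the result to the closed form $(N/(N-1))^N$ discussed in the comments below:

```python
import random

def play_once(n_sides, rng):
    """Roll an n_sides die until a roll comes up smaller than the one
    before it; return the total number of rolls (= dollars won)."""
    prev = rng.randint(1, n_sides)
    rolls = 1
    while True:
        cur = rng.randint(1, n_sides)
        rolls += 1
        if cur < prev:
            return rolls
        prev = cur

def closed_form(n_sides):
    """Expected winnings (N/(N-1))^N, as noted in the comments."""
    return (n_sides / (n_sides - 1)) ** n_sides

rng = random.Random(1)
trials = 200_000
avg = sum(play_once(100, rng) for _ in range(trials)) / trials
print(f"simulated: {avg:.3f}, closed form: {closed_form(100):.3f}")
# Both should land near 2.732, just above e.
```

Increasing `n_sides` in `closed_form` shows the extra-credit behavior: the expected winnings decrease toward $e \approx 2.718$ as the number of sides grows.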

Here is my solution:

For a more in-depth analysis of the distribution, read on:

## 7 thoughts on “Impromptu gambling with dice”

1. Hector Pefo says:

Nice once again! I eventually found the same thing after some bumbling that led me to notice an empirical similarity between this game and one that seems rather different. The answer $(N/(N-1))^N$ is also $1/(1-1/N)^N$, which is one over the chance of getting zero successes in $N$ trials of an experiment with individual success probability $1/N$ (such as 100 rolls of this Riddler’s die without side 100 coming up), and so is the expected number of runs of $N$ trials before a no-successes run. I wonder if there’s some clever way of thinking about the Riddler game that assimilates it to that one (or maybe it’s just that, like $e$, the binomial distribution tends to show up).

Second, you can also show quickly that the limit is $e$ by considering the continuous case of a die with real values in $[0,1]$. The expected value of a non-game-ending roll #(r-1) is $(r-1)/r$, which is therefore the probability of the game ending on roll #r (if it hasn’t by then), and the probability of the game *not* ending then is $1/r$. The probability (from the start) of the game lasting beyond roll #r, then, is the product of $1/k$ for $k$ from 1 to $r$, which is $1/r!$, and summing these survival probabilities over all $r$ gives an expected number of rolls of $\sum_{r=0}^\infty 1/r! = e$.
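The continuous-case argument above can be summarized compactly (my notation, not the commenter’s; $U_1, U_2, \dots$ denote i.i.d. uniform draws on $[0,1]$):

```latex
% The game survives r rolls exactly when the first r draws are in
% increasing order, which happens with probability 1/r! by symmetry:
P(\text{game lasts beyond roll } r) = P(U_1 \le U_2 \le \cdots \le U_r) = \frac{1}{r!}

% Summing the survival probabilities gives the expected number of rolls:
E[\text{rolls}] = \sum_{r=0}^{\infty} P(\text{game lasts beyond roll } r)
               = \sum_{r=0}^{\infty} \frac{1}{r!} = e
```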

1. Laurent says:

thanks! I made note of the connection between the [0,1] case and the infinite-N case at the end of my post. I found it a lot more intuitive to think about random real numbers in a closed interval rather than infinite integers.

2. Peter Calhoun says:

Great post! I’ve been reading your blog for a few months now and am always impressed by your solutions. One minor typo, when you subtract the two equations, it should be:
$f(n+1) - f(n) = \frac{1}{N} - \frac{f(n)+1}{N}$

1. Laurent says:

thanks! fixed the typo.

3. Jim says:

I’m sorry; I’m having a fundamental problem with this calculation, and it has to do with what seems to me to be the way you’ve established your initial formula, which assumes that N = 1 is undefined and thus doesn’t include a calculation for it.

Empirically speaking, there is a universe where you roll 100 over and over again, forever. That would be the situation if N = 1, but even if N = 100, there’s a nonzero possibility of landing on 100 endlessly, until the heat death of the universe. While the probability of that occurring is extremely low, any nonzero probability of an infinite amount is, by definition, infinite. So your expected value in this particular game would be infinity dollars.

To put it another way: if you’re using a 2-sided die, then you have a traditional St. Petersburg lottery. Using a 100-sided die just gives you a special iteration of a St. Petersburg lottery.

1. Laurent says:

We can both agree that the situation $N=1$ leads to infinite expected winnings, since the game is deterministic: the same number is rolled every turn, so the game never ends.

You’re right that for $N > 1$ the game can in principle last forever, albeit with a vanishingly small probability. However, this is different from the St. Petersburg Paradox in that the expected value is not infinite. In the paradox, you flip coins and keep playing as long as you keep getting Heads and you win $2^n$ where $n$ is the number of flips it took to get your first Tails. The probability of having longer and longer games shrinks exponentially but the amount you can win grows exponentially. The result is that the total expected winnings for the game is infinite:
$\frac{2^1}{2^1} + \frac{2^2}{2^2} + \cdots = 1 + 1 + \cdots = \infty$

This is not the case in the dice problem. The probability of having longer and longer games does shrink exponentially, but the amount you can win only increases linearly. The result is a finite expected value. The analogy would be if the St. Petersburg game only awarded you $n$ dollars for a game that lasts $n$ turns. In that case, your expected winnings would be:
$\frac{1}{2} + \frac{2}{2^2} + \frac{3}{2^3} + \cdots = \sum_{k=1}^\infty \frac{k}{2^k}$

It turns out this sum equals $2$. This can be shown in several different ways. The calculation for the actual dice problem is a bit more complicated than this, but the basic idea is the same.
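As a quick numerical sanity check of that sum (my own snippet; the variable name is mine), the terms shrink so fast that a few dozen of them already pin the value down:

```python
# Partial sum of sum_{k>=1} k / 2^k; the neglected tail beyond k = 59
# is astronomically small, so this matches 2 to floating-point precision.
partial = sum(k / 2**k for k in range(1, 60))
print(partial)
```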

1. Jim says:

That makes sense; thank you so much!