Halloween Puzzle

This week’s Fiddler is about rounding!

You are presented with a bag of treats, which contains $n \geq 3$ peanut butter cups and some unknown quantity of candy corn kernels (with any amount being equally likely). You reach into the bag $k$ times, with $3 \leq k \leq n$, and pull out a candy at random. Each time, it’s a peanut butter cup! How many candy corn kernels do you expect to be in the bag?

My solution:

6 thoughts on “Halloween Puzzle”

  1. Hi Laurent, that’s a nice solution. I used telescoping series since we are dealing with infinite discrete series here. It’s not quite the simple counting argument that you’re seeking. Anyway, I’ll use N = 6, k = 4 to illustrate my approach though we can use general N, k (typing it here in comments would just look messier).

    Let z be the uniform probability of the bag having exactly x corn kernels, 0 <= x < infinity. Let X be a random variable for the number of corn kernels. Let Y be the event of getting k=4 peanut cups (out of N=6) at the start.
    P(Y) = z * [ 4!2!/6! + 4!3!/7! + 4!4!/8! + … ] = 4!z * [ 1/(3·4·5·6) + 1/(4·5·6·7) + 1/(5·6·7·8) + … ]
    Now, 1/(3·4·5·6) can be decomposed as (1/3) * [ 1/(3·4·5) – 1/(4·5·6) ], and the other terms decompose similarly. This decomposition, using the first and last factors of the denominator, works regardless of how many factors the denominator has (i.e., for general N, k).
    Decomposing gives us…
    P(Y) = (4!z / 3) * [ 1/(3·4·5) – 1/(4·5·6) + 1/(4·5·6) – 1/(5·6·7) + 1/(5·6·7) – 1/(6·7·8) + … ]. All terms in the infinite telescoping series except the first one cancel out.
    P(Y) = 8z/(3·4·5) = 2z/15

    P(Y | X=x) = 4!(x+2)! / (x+6)! = 4! / (x+3)(x+4)(x+5)(x+6)
    Decomposing, we get…
    P(Y | X=x) = 4! * (1/3) * [ 1/(x+3)(x+4)(x+5) – 1/(x+4)(x+5)(x+6) ]

    P(X=x | Y) = z * P(Y | X=x) / P(Y) = z * P(Y | X=x) / (2z/15) = (15/2) * P(Y | X=x)

    E(X | Y) = sum of P(X=x | Y) * x, for x from 0 to infinity.
    E(X | Y) = sum of (15/2) * 4! * (1/3) * [ 1/(x+3)(x+4)(x+5) – 1/(x+4)(x+5)(x+6) ] * x, for x from 0 to infinity.
    E(X | Y) = sum of 60 * [ x/(x+3)(x+4)(x+5) – x/(x+4)(x+5)(x+6) ], for x from 0 to infinity.
    Substituting values of x from 0 to infinity, we get…
    E(X | Y) = 60 * [ 1/(4·5·6) – 1/(5·6·7) + 2/(5·6·7) – 2/(6·7·8) + 3/(6·7·8) – 3/(7·8·9) + … ]
    After the first term, consecutive terms can be paired and added. We get…
    E(X | Y) = 60 * [ 1/(4·5·6) + 1/(5·6·7) + 1/(6·7·8) + … ]
    Decomposing, we get…
    E(X | Y) = 60 * (1/2) * [ 1/(4·5) – 1/(5·6) + 1/(5·6) – 1/(6·7) + 1/(6·7) – 1/(7·8) + … ]. All terms in the infinite telescoping series except the first one cancel out.
    E(X | Y) = 30 * [ 1/(4·5) ]
    = 3/2
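
    A quick numerical sanity check of the N = 6, k = 4 answer (a sketch in Python; it truncates the infinite sum at a large cutoff M rather than telescoping, and the flat-prior constant z cancels in the ratio):

    ```python
    # Truncated-sum check of E(X | Y) for N = 6, k = 4. The posterior
    # weight of x kernels is proportional to 1/((x+3)(x+4)(x+5)(x+6));
    # the flat-prior constant z cancels in the ratio.
    M = 100_000  # cutoff; the summands decay fast enough for this to converge
    num = 0.0  # accumulates x * weight(x)
    den = 0.0  # accumulates weight(x)
    for x in range(M):
        w = 1.0 / ((x + 3) * (x + 4) * (x + 5) * (x + 6))
        num += x * w
        den += w
    print(num / den)  # close to 3/2
    ```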

    Thanks for referencing the German tank problem. That’s a great usage of conditional probability, and an amazing WW2 story. Almost as cool as the exploits of the Bletchley Park team.

  2. Hi Laurent, the Euler beta function gives us an integral representation of the inverse of the binomial coefficient,

    C(n+m, k)^(-1) = k * int_0^1 t^(m+n-k) (1-t)^(k-1) dt

    The sums over m can then be done under the integral.
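
    To illustrate, here is a quick numerical check of this Beta-function identity (a sketch; the values of n, m, k are arbitrary small choices of mine, and the integral is done with a crude midpoint rule rather than a quadrature library):

    ```python
    from math import comb

    def beta_integral(a, b, steps=200_000):
        # Midpoint-rule approximation of int_0^1 t^a (1-t)^b dt.
        h = 1.0 / steps
        return h * sum(((i + 0.5) * h) ** a * (1.0 - (i + 0.5) * h) ** b
                       for i in range(steps))

    n, m, k = 6, 5, 4  # arbitrary small test values, not from the comment
    lhs = 1.0 / comb(n + m, k)
    rhs = k * beta_integral(m + n - k, k - 1)
    print(lhs, rhs)  # the two sides agree to many decimal places
    ```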

  3. “Where in the last step, we used the fact that P(m) is the same for all m so we canceled it from the numerator and denominator.”

    I think this doesn’t work. P(m) cannot be a constant for all values of m. (There’s no uniform probability distribution over an infinite set).
    Perhaps you can change the sums to run up to a finite number, say M, and take it to infinity later.

    I think the problem begins in the question itself: “with any amount being equally likely”.

    1. Yes, that’s how a non-normalizable prior is handled: set P(m) = 1/M for m < M and zero otherwise, then take M to infinity at the end. The quick way is to do what Laurent did and just drop P(m); the result is the same, as long as the posterior is normalizable.
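
       As a sketch of that limiting procedure, using the N = 6, k = 4 likelihood from the earlier comment: put P(m) = 1/M on m < M, compute the posterior mean, and watch it settle as M grows:

       ```python
       # Truncated uniform prior P(m) = 1/M for m < M. The 1/M cancels
       # in the posterior ratio, so it is omitted below. With the
       # N = 6, k = 4 likelihood, the posterior mean converges as M grows.
       for M in (10, 100, 1_000, 10_000):
           ws = [1.0 / ((m + 3) * (m + 4) * (m + 5) * (m + 6)) for m in range(M)]
           mean = sum(m * w for m, w in zip(range(M), ws)) / sum(ws)
           print(M, mean)
       ```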
