Today’s puzzle was posed on the Riddler blog, but it’s actually a classic among problem-solving enthusiasts, and is commonly known as the pirate game. Here is the formulation used in the Riddler:

Ten Perfectly Rational Pirate Logicians (PRPLs) find 10 (indivisible) gold pieces and wish to distribute the booty among themselves.

The pirates each have a unique rank, from the captain on down. The captain puts forth the first plan to divide up the gold, whereupon the pirates (including the captain) vote. If at least half the pirates vote for the plan, it is enacted, and the gold is distributed accordingly. If the plan gets fewer than half the votes, however, the captain is killed, the second-in-command is promoted, and the process starts over. (They’re mutinous, these PRPLs.)

Pirates always vote by the following rules, with the earliest rule taking precedence in a conflict:

- Self-preservation: A pirate values his life above all else.
- Greed: A pirate seeks as much gold as possible.
- Bloodthirst: Failing a threat to his life or bounty, a pirate always votes to kill.

Under this system, how do the PRPLs divide up their gold?

Extra credit: Solve the generalized problem where there are P pirates and G gold pieces.

Here is the solution to the main problem:

[Show Solution]

This problem is a perfect example of when to use *recursion*. If there are $P$ pirates, when each pirate contemplates how to vote, they are comparing what would happen if they voted yes (and agreed to the current proposal) to what would happen if they voted no (and the current pirate died, reducing to the $P-1$ case).

Let’s start with two pirates and build up from there. For each case, here is each pirate’s share of the gold, with the pirates ordered from lowest rank to highest rank (captain).

\begin{align}

&\quad\text{Gold distribution} &\text{Votes}&\text{ needed}\\ \hline

v_2 &: \begin{bmatrix} 0 & 10 \end{bmatrix} && 1\\

v_3 &: \begin{bmatrix} 1 & 0 & 9 \end{bmatrix} && 2\\

v_4 &: \begin{bmatrix} 0 & 1 & 0 & 9 \end{bmatrix} && 2\\

v_5 &: \begin{bmatrix} 1 & 0 & 1 & 0 & 8 \end{bmatrix} && 3\\

v_6 &: \begin{bmatrix} 0 & 1 & 0 & 1 & 0 & 8 \end{bmatrix} && 3\\

v_7 &: \begin{bmatrix} 1 & 0 & 1 & 0 & 1 & 0 & 7 \end{bmatrix} && 4\\

v_8 &: \begin{bmatrix} 0 & 1 & 0 & 1 & 0 & 1 & 0 & 7 \end{bmatrix} && 4\\

v_9 &: \begin{bmatrix} 1 & 0 & 1 & 0 & 1 & 0 & 1 & 0 & 6 \end{bmatrix} && 5 \\

v_{10} &: \begin{bmatrix} 0 & 1 & 0 & 1 & 0 & 1 & 0 & 1 & 0 & 6 \end{bmatrix} && 5

\end{align}

Let’s examine this table one row at a time. If there are two pirates ($v_2$), a majority is decided by a single vote. Since the captain gets to vote, they can safely propose to keep all the gold.

If there are three pirates ($v_3$), a majority is decided by two votes. So the captain needs to secure one extra vote. The first pirate (lowest rank) won’t win anything if the captain dies, because we would reduce to the $v_2$ case. The captain must give this pirate at least one gold piece to buy their vote and avoid death, but can keep the remaining $9$ pieces.

Similarly, if there are four pirates ($v_4$), a majority is decided by two votes. This time, the first pirate’s vote costs two gold pieces because they stand to win one gold piece if we reduce to $v_3$. However, the second pirate’s vote now costs one gold piece because they win nothing in the $v_3$ case. A similar argument applies as we continue to add pirates.

**The final solution** is that if we number the pirates $1,2,\dots,10$ in order from lowest rank to highest rank, then pirates $2,4,6,8$ each get one gold piece and the captain gets $6$ gold pieces.
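The recursion above can also be run mechanically. Here is a short Python sketch (my own, not part of the original post; the name `solve` is illustrative) that, for each band size, finds the cheapest set of votes the captain can buy under the three voting rules:

```python
from math import ceil

def solve(n, g):
    """Outcome of the pirate game with n pirates and g gold pieces.
    Pirates are indexed 0 (lowest rank) to n-1 (captain). Returns
    (payoffs, dead): the gold each pirate ends up with, and the set
    of pirates who are killed along the way."""
    if n == 1:
        return [g], set()
    # what happens if the current captain is voted down and killed
    cont_pay, cont_dead = solve(n - 1, g)

    votes_needed = ceil(n / 2)                       # "at least half the votes"
    doomed = [i for i in range(n - 1) if i in cont_dead]   # vote yes for free
    # buying pirate i's vote takes one coin more than the continuation pays him
    prices = sorted((cont_pay[i] + 1, i) for i in range(n - 1)
                    if i not in cont_dead)

    to_buy = max(votes_needed - 1 - len(doomed), 0)  # beyond captain + doomed
    cost = sum(c for c, _ in prices[:to_buy])
    if cost <= g:                                    # captain can afford to live
        alloc = [0] * n
        for c, i in prices[:to_buy]:
            alloc[i] = c
        alloc[n - 1] = g - cost                      # captain keeps the rest
        return alloc, set()
    # not enough gold: the captain is killed and the game recurses
    return cont_pay + [0], cont_dead | {n - 1}

print(solve(10, 10)[0])   # -> [0, 1, 0, 1, 0, 1, 0, 1, 0, 6]
```

Ties among equally cheap voters are broken by rank here, so when several bribe sets would work this prints just one admissible split.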

And here is a solution to the general case:

[Show Solution]

As we saw in the previous solution, it makes sense to recurse on $P$, the number of pirates. If $G$ is the number of gold pieces, the pattern we observed in the simpler case still holds here, and we can continue it until the captain runs out of gold and is no longer able to bribe enough of his fellow pirates. So the table looks like:

\begin{align}

&\quad\text{Gold distribution} &\text{Votes}&\text{ needed}\\ \hline

v_2 &: \begin{bmatrix} 0 & G \end{bmatrix} && 1 \\

v_3 &: \begin{bmatrix} 1 & 0 & G-1 \end{bmatrix} && 2 \\

v_4 &: \begin{bmatrix} 0 & 1 & 0 & G-1 \end{bmatrix} && 2 \\

v_5 &: \begin{bmatrix} 1 & 0 & 1 & 0 & G-2 \end{bmatrix} && 3 \\

v_6 &: \begin{bmatrix} 0 & 1 & 0 & 1 & 0 & G-2 \end{bmatrix} && 3 \\

&\vdots &&\vdots\\

v_{2G+1} &: \begin{bmatrix} 1 & 0 & \dots & 1 & 0 & 0 \end{bmatrix} && G+1\\

v_{2G+2} &: \begin{bmatrix} 0 & 1 & \dots & 0 & 1 & 0 & 0 \end{bmatrix} && G+1

\end{align}

In the last two rows of the table, there are $G$ 1’s. In the case with $2G+2$ pirates, a majority is decided by $G+1$ votes, so the captain wins nothing; they must forfeit all the gold in order to buy the votes necessary to survive. From this point forward, it may seem that if we add any additional pirates, they will be killed because there isn’t enough gold to buy enough votes. However, this isn’t quite the case. Here is what happens if we keep adding pirates:

\begin{align}

&\quad\text{Gold distribution} &\text{Votes}&\text{ needed}\\ \hline

v_{2G+2} &: \begin{bmatrix} 0 & 1 & \dots & 0 & 1 & 0 & 0 \end{bmatrix} && G+1 \\

{\color{red} v_{2G+3}} &: \begin{bmatrix} ? & ? & \dots & ? & ? & ? & ? & {\color{red} 0} \end{bmatrix} && {\color{red} G+2} \\

v_{2G+4} &: \begin{bmatrix} 1 & 0 & \dots & 1 & 0 & 0 & 0 & 0 & 0 \end{bmatrix} && G+2 \\

{\color{red} v_{2G+5}} &: \begin{bmatrix} ? & ? & \dots & ? & ? & ? & ? & ? & ? & {\color{red} 0} \end{bmatrix} && {\color{red} G+3} \\

{\color{red} v_{2G+6}} &: \begin{bmatrix} ? & ? & \dots & ? & ? & ? & ? & ? & ? & {\color{red} 0} & {\color{red} 0} \end{bmatrix} && {\color{red} G+3} \\

{\color{red} v_{2G+7}} &: \begin{bmatrix} ? & ? & \dots & ? & ? & ? & ? & ? & ? & {\color{red} 0} & {\color{red} 0} & {\color{red} 0} \end{bmatrix}\hspace{-2cm} && {\color{red} G+4} \\

v_{2G+8} &: \begin{bmatrix} 0 & 1 & \dots & 0 & 1 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 \end{bmatrix}\hspace{-2cm} && G+4

\end{align}

Once we get to $P = 2G+3$ pirates, the captain will always be killed (deaths are indicated in red) since there isn’t enough gold to buy the required votes. In the $2G+4$ case, however, the second in command is doomed to die if he votes to kill the captain, so he will vote yes *even if offered nothing*. This gives the captain a free vote, which is enough to secure the required $G+2$ votes. The captain must make an offer that compares favorably to the $2G+2$ case since the intermediate case $2G+3$ always leads to $2G+2$.

In the $2G+5$ case, none of the pirates have an incentive to keep the captain alive (since everybody will be safe with $P=2G+4$). The same happens with $2G+6$ and $2G+7$. Once we arrive at $2G+8$, three of the pirates are doomed if the captain dies, so they will vote yes even if offered nothing. Once again, this allows the captain to barely survive. This pattern continues, each time doubling the interval length between safe captains.
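This doubling pattern amounts to a one-line survival test: the initial captain lives exactly when $P \le 2G+2$ or $P-2G$ is a power of two. A small Python sketch (mine, for illustration) makes the pattern visible:

```python
def captain_survives(P, G):
    """True iff the initial captain lives: either the gold suffices
    (P <= 2G+2) or the surplus P - 2G is an exact power of two."""
    n = P - 2 * G
    return n <= 2 or (n & (n - 1)) == 0   # bit trick: power-of-two test

# for G = 10 every band size up to 22 is safe; past that, the gaps
# between safe sizes double
safe = [P for P in range(23, 100) if captain_survives(P, 10)]
print(safe)   # -> [24, 28, 36, 52, 84]
```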

**The solution to the general problem.** Number the pirates $1,2,\dots,P$ in order from lowest rank to highest rank. If we have $G$ gold pieces to distribute, then we consider two cases:

- If $P \le 2G+2$, then:
- If $P$ is odd, pirates $1,3,5,\dots,P-2$ each get one gold piece, and the captain gets the remaining $G-\frac{P-1}{2}$ pieces.
- If $P$ is even, pirates $2,4,6,\dots,P-2$ each get one gold piece, and the captain gets the remaining $G-\frac{P-2}{2}$ pieces.

- If $P > 2G+2$, then the highest ranked captains will die until there are $2G+2^k$ pirates remaining, where $k$ is the largest integer with $2G+2^k \le P$. Then:
- If $k$ is even, pirates $1,3,5,\dots,2G-1$ each get one gold piece and nobody else gets anything.
- If $k$ is odd, pirates $2,4,6,\dots,2G$ each get one gold piece and nobody else gets anything.

In the case where $P \ge 2G+2$, the solution is not unique in the sense that once we arrive at $P=2G+2^k$, there are other admissible ways of dividing up the gold besides the one shown above.
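The case analysis above can be transcribed directly into code. This Python sketch (my own; `closed_form` is an illustrative name) returns one valid distribution together with the number of captains killed beforehand:

```python
def closed_form(P, G):
    """One valid outcome of the pirate game, following the case analysis
    above. Pirates are indexed 0 (lowest rank) to P-1 (captain). Returns
    (payoffs, deaths): the distribution and how many captains die first."""
    if P <= 2 * G + 2:
        alloc = [0] * P
        # odd P bribes pirates 1,3,...,P-2; even P bribes 2,4,...,P-2 (1-based)
        start = 0 if P % 2 == 1 else 1
        for i in range(start, P - 2, 2):
            alloc[i] = 1
        alloc[P - 1] = G - sum(alloc)            # captain keeps the remainder
        return alloc, 0
    # captains die until 2G + 2^k pirates remain, k as large as possible
    k = (P - 2 * G).bit_length() - 1
    alloc = [0] * P                              # dead captains get nothing
    # even k bribes pirates 1,3,...,2G-1; odd k bribes 2,4,...,2G (1-based)
    start = 0 if k % 2 == 0 else 1
    for i in range(start, start + 2 * G, 2):
        alloc[i] = 1
    return alloc, P - (2 * G + 2 ** k)

print(closed_form(10, 10))       # -> ([0, 1, 0, 1, 0, 1, 0, 1, 0, 6], 0)
print(closed_form(100, 20)[1])   # -> 28 captains walk the plank
```

As noted above, this is only one admissible split once we reach $P=2G+2^k$; the code simply picks the alternating pattern shown in the tables.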

If this sort of problem interests you, I recommend taking a crack at the riddle of the blue-eyed islanders, or the unfaithful husbands. More information here as well.

A couple of further thoughts:

1) The solution is very different if you require an absolute majority of the votes to win, not just 50%. For example, with two pirates the captain always dies, and with three the captain grabs all the gold.

2) There are other strategies to examine. Suppose you’re the captain of 100 pirates with 20 pieces of gold. Looks like you’re gonna walk the plank, as are 27 pirates after you. What to do? Your best bet is to quickly grab two pieces of gold and throw them overboard! Now we have 2(18 + 2^5) = 100 and you’re OK. Desperate people do desperate things. Of course, if you start throwing away treasure I suspect that the rules go out the window and you’d be dead anyway.

3) Another strategy: Suppose there are 3 pirates and 10 gold pieces. The “logical” solution is that the captain gets 9 gold pieces and #1 gets 1. But suppose that before the captain says anything, #1 says loudly to the captain: “I’m not settling for 1 piece! Unless you give me all 10 pieces, I’m voting against, and you’ll die!” If the captain then makes the logical proposal and pirate #1 carries out his threat, the captain dies. To stay alive, his top priority, isn’t the captain forced to offer #1 all 10 pieces? But presumably the captain should call the bluff and make the logical proposal anyway, at which point it’s irrational for #1 to do anything other than accept.

Thank you for this great comment! I particularly like item 2, as it adds an entirely new dimension to the problem. It would be interesting to investigate this option further. e.g. not all pirates have an incentive to throw away some of the gold — only the doomed pirates do.

I think item 1 could be tackled using a similar recursive approach to the one I talk about in my post. With regards to item 3, I think things could get very complicated because all sorts of complicated bribes could be proposed. Ultimately, the only 100% safe thing for any pirate to do is to not trust any other pirate!

Wrt problem 3: That just wouldn’t happen, if we take the premises seriously. If the captain proposes 1-0-9, #1 WILL accept it because the other option is to get no gold at all.

Wrt problem 2:

Let’s add an additional constraint to make the question deterministic: a pirate throws gold away if and only if he would otherwise die and throwing it away saves him.

In this situation, the first 2G+2 pirates won’t throw away anything, as can be seen from the original problem.

For P > 2G+2, let N = P-2G. This is the value which has to be a power of two for the murder party to stop. So the question is, can the captain make it a power of two by throwing away some gold?

Throwing away one gold piece decreases G by 1, thereby increasing N by 2. So this is true iff there’s a power of two between N and N+2G, or in our original terms, between P-2G and P. Let’s call that power of two P2. If it exists, the captain will throw away (P2-N)/2 = (P2-P+2G)/2 = G – (P-P2)/2 gold pieces, leaving (P-P2)/2, then get his proposal accepted as in the original problem. If not, the result will be the same as in the original problem.

All in all, when P > 2G+2 (or just 2G), the captain will live if and only if there’s a power of two in the interval [P-2G; P]. If there is, he’ll leave (P-P2)/2 coins and get his proposal accepted. If not, his (and his direct successors’) life will be just as short and sad as in the original case.
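This recipe is easy to compute. Below is a Python sketch of it (my own code, not the commenter’s; note also that tossing a coin changes N = P − 2G in steps of 2, so N’s parity never changes — a detail the derivation above glosses over, handled here as an extra check):

```python
def coins_to_toss(P, G):
    """Fewest coins the captain should throw overboard to save himself,
    per the comment's recipe, or None if no amount of tossing helps.
    Tossing one coin turns (P, G) into (P, G-1), raising N = P - 2G by 2."""
    N = P - 2 * G
    if N <= 2:
        return 0                    # already safe: P <= 2G + 2
    P2 = 1
    while P2 < N:
        P2 *= 2                     # P2: smallest power of two >= N
    if P2 > P or (P2 - N) % 2:      # must fit in [P-2G, P] and match N's parity
        return None                 # doomed either way
    return (P2 - N) // 2

print(coins_to_toss(100, 20))       # -> 2: toss 2, keep 18, and 2*18 + 2**6 == 100
```

For the 100-pirate, 20-coin example from the earlier comment this returns 2, matching the “throw two pieces overboard” advice.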

I think that is only partially correct. If $P > 2G+2$, then the captain is only safe if $P-2G$ is a power of $2$. If we assume the captain can throw gold away in order to save himself, then there are two cases:

Great answer. And I arrived at a similar answer without ‘peeking’ at yours, but I was Googling ‘pirate divide pie’ (a pretty standard class of games) and I landed here eventually, but I didn’t want a spoiler. You nicely hid your solution behind a hyperlink, but temptation… Give us self-learners a chance and maybe wait until Mondays to post your fivethirtyeight.com Riddler solutions? Great links to additional problems; good fun. Excellent blog. You do a great job of mathing-out solutions to these riddles, but I’d rather read these after the fivethirtyeight.com deadline.

That’s a good point — I think that from now on I’ll wait until Monday to post my solution. I thought a good compromise would be to hide solutions behind a link, but I agree that it’s probably too tempting to be just a click away.

If the pirates are perfectly rational, then for 3 pirates wouldn’t the solution end up the same as with 2 pirates? Reasoning thus:

You have v3: [1 0 9]. However, the last pirate – who gets 9 – knows that if he votes to kill the captain he can get 10 gold, maximizing his gold by voting to kill. The second pirate is bloodthirsty and votes to kill because he gets 0 gold either way but he definitely votes to kill. So the solution ends up as [killed 0 10]. There’s nothing the captain can do.

With v4 you have: [0 1 0 9]. But, with the reasoning above pirate 4 will vote kill unless he gets 10 gold. So I get the solution as [0 0 0 10] where pirates 1 and 4 vote yes.

For v5 I get [7 1 1 1 0]. Pirates 2-4 prefer to get something rather than the nothing they’d get by voting to kill, since v4 is [0 0 0 10]. Pirate 5 votes to kill but is outnumbered by the others.

v6: [0 8 2 0 0 0]. Pirates 2 and 3 are bloodthirsty but the +1 from v5 trumps their desire to kill.

v7: [6 1 0 0 1 1 1]. If P5 and P4 know that if they can kill the captain they’ll get to v6 and maximize their gold so the captain has to offer them more than in v6 to vote yes. Since it’s more gold maximizing for the captain he gives them 0 and 1 to everyone else.

v8: [0 0 2 1 1 2 2 2]

v9: [4 1 1 0 2 2 0 0 0]

v10: [3 0 2 2 1 0 0 1 1 0]

I think you may have misinterpreted my solution. I am arranging the pirates in order from lowest rank to highest rank. The captain is always listed last. So when I write:

\begin{align}

&\quad\text{Gold distribution} &\text{Votes}&\text{ needed}\\ \hline

v_2 &: \begin{bmatrix} 0 & 10 \end{bmatrix} && 1\\

v_3 &: \begin{bmatrix} 1 & 0 & 9 \end{bmatrix} && 2

\end{align}

in $v_2$, the captain gets $10$ and the second pirate gets nothing. This is optimal because the captain’s vote alone secures “at least half the votes”, ensuring he never gets killed. In the case with three pirates, the captain is voting to give himself $9$ gold. The second pirate doesn’t like getting zero, so votes against, but the third pirate is getting one gold, which is better than the zero he would get if he voted no, the captain died, and we were down to only two pirates…

Yep, I definitely misread it!

I haven’t understood one point of the solution: why v3 is [1 0 9] and not [0 1 9] or, similarly, why v3-v4 sequence is not [1 0 9] –> [1 0 0 9] or [0 0 1 9]? Isn’t logic preserved in either case?

Each pirate will vote in favor of the plan that gives them the most gold (greed). If there is a tie, they vote for the plan that causes death (bloodthirsty). Take the case of three pirates, and look at the pirate who is last in line. If the captain dies, we reduce to the v2 case, and this last pirate will get 0 gold (solution to v2 is [0 10]). So in order to buy the vote of the last pirate, we need to offer him at least 1 gold. (offering him 0 gold will not sway him because of bloodthirst). If we look at the second-last pirate, however, that pirate stands to get 10 gold if the captain dies. So no amount of gold can buy that pirate’s vote. Luckily for the captain, he only needs one vote (plus his own vote) to have a majority in the v3 case. So he’ll give 1 gold to the last pirate, 0 to the second pirate, and keep 9 for himself.

There’s a problem with your last partition list showing the dead captains in red. Notice that from 2G+2 on, only even-numbered pirates consistently get to cash in on the offered gold, and none of the odd pirates ever get their rewarding plans passed. We should mix it up a bit to avoid the third rule.

The problem is, of course, the bloodthirstiness of these repeatedly rewarded even-numbered lot kicks in again. These lower rank even pirates all know that they’ll eventually reach a plan that gets passed for that lucky “saved captain” (e.g. P=2G+2) and they will all be rewarded with a coin for their votes. Thus, for the next highest 2^k (P=2G+4) you can’t propose to reward these same pirates, as they already know they’ll get a coin at the next successful partition (after killing off a couple of captains or even a multitude of them for higher k). Thus, NONE of the intended bribes will succeed, not a single taker, ALL of those who were offered a coin will vote NO! The even mates can only be persuaded by offering them 2 coins to overcome their bloodthirsty ways, but that is clearly not the way to go (gold is already extremely tight). Fortunately, we know there’s plenty of cheaper bribe takers.

So, the correct viable proposals (after P>2G+2) must attempt to bribe the others who’d get nothing i.e. the even-numbered pirates instead. Continuing to alternate between these even/odd low-ranks for each consecutive increment in k (for only the 2G+2^k successful plans, as the other “doomed” proposals can be anything and will always fail so are irrelevant) seems sufficient to keep them all resetting their expectations back to zero.

Another (perhaps a nitpick): the non-unique solutions mentioned in the next-to-last paragraph actually start with the P=2G+2 case, since in addition to the even-numbered pirates (who’d get zero on the next round), the captain can offer a coin to the first mate too (call it pirate P*=P-1=2G+1, who’d also end up with zero in the next round, where he’d only barely save his neck).

The later issue may not be a mere nitpick given a subtler understanding of the PRPLs’ decision-making under uncertainty. Indeed the introduction of non-unique acceptable proposals even before the first dead captain comes up (“before” in our reasoning but after as the game would hypothetically play out, although it would in fact never get that far), combined with the first rather critical issue I raised above (that of making sure you are careful not to reward the wrong pirates tragically squandering that single serendipitous chance of living) introduces probabilistic reasoning into the problem.

WARNING: What follows is considerably long! It’ll take me quite a bit to explain this thoroughly, but I think it’s worth it to get a complete understanding of all the subtle ways that these strategies can fail. So bear with me for a while, if you’re interested of course (or just stop reading right here and try to work it all out on your own from what I’ve said so far). Apologies in advance for taking up so much space and time out of this blog’s readership.

Off we go then. First, note this pirate I marked as P* is an odd numbered pirate, yet is in fact the first that satisfies the condition 2G+2^k (for k=0). All other odd-numbered pirates are guaranteed zero coins (at this point) and there’s a total of G like this, in addition to our current captain who’ll also end up empty handed (so G+1 pirates get nothing in this round), while there’s the other G+1 (including our odd P*) who have at least some positive chance of being offered a coin, at this point with probability p=G/(G+1) if the captain picks at random. Note that adding both groups totals (G+1)+(G+1)=2G+2, so everyone’s accounted for. These details are important when determining which pirates can beneficially be offered a coin for larger sized pirate bands, as I’ll try to show next.

For the next larger band of pirates who have a viable chance at splitting the loot (P=2G+4 and from here on I’ll name these in terms of the k-th power of 2 added to 2G, so this is P(|k=2)=P(2)=2G+2^2), we can look at the future prospects of 3 distinct cohorts of pirates, in terms of the consequences of passing or rejecting the proposal for each of them:

1) Those who will die for sure if the current distribution plan doesn’t pass, which in this first case include only the current captain and first mate (call this C_1(|k=2)=2 pirates). They will of course vote for the plan despite being offered zero.

2) Those who would survive down to the next successful round (when P(1)=2G+2 pirates would split the booty) but would be guaranteed zero coins there. In this first plan they include the same mostly odd lot totaling C_2(2)=G+1 which I mentioned earlier (the lowest G odd numbers plus the 2G+2 captain in place of his P* first mate). They will reject any current proposal unless offered a coin, seeing as if offered nothing they’d get the same later if this plan fails and they’d rather see some captains offed.

3) Those who’d also survive but have at least some chance of getting offered a coin in the next successful partition if the current one fails. For now these total C_3(2)=G+1 (including our special odd P* pirate, incidentally P*=P(0) so this mate’s a true edge case) each of them currently feeling “entitled” by a relatively high probability p(|k=2)=G/(G+1) (asymptotic but never achieving certainty for high G) of getting a coin in the next successful round after cohort 1 walks the plank.

Those in cohort 3 will probably also accept the current plan if offered a coin (now assured rather than a less than certain chance), but they’re a bit less incentivized to do so than those in cohort 2. Still, such a preference for a guarantee over a mere probability would seem perfectly rational for greedy pirates, and would greatly simplify the job of deciding who gets offered coins: just pick anyone (at random) from cohorts 2 and 3, that is, anyone from pirate number 2G+2 down, in contrast to the tricky even/odd (but for incremental k instead of P) pattern suggested initially.

However, them being pirates it would not be too unreasonable if they were quite comfortable when taking considerable risks. It might not seem that daring for them to bet on a G/(G+1) probability (if sufficiently close to 1 or who knows, maybe even 1/2 triggers their gambling inclinations) and take the risk of being singled out to not get a coin in the next viable split (out of G+1 pirates), with such a risk perhaps being quite worth the pleasure of seeing a bunch of captains walk the plank. I see it as highly likely that at least one of the “randomly chosen” bribes (remember these are half of our pool of candidates) could be one of these risk-seekers, and if just one decides to reject the bribe, cohort 1 dies.

Thus, it would be advisable for the captain deciding who to “bribe for their votes” to avoid this cohort 3 and stick to the low expectations of those in cohort 2 (obviously never offer coins to cohort 1 as those votes are assured for free). This “risk seeking” behavior could be characterized as a subtler rule saying “bloodthirstiness kicks in above some probability p>1/2 of getting a future coin, or an expected value under uncertainty in future rounds exceeding a similar percent p>50% of currently offered number of coins” or some such.

On the other hand, perhaps they’re so greedy (to the point of being miserly) that they’ll accept any small (but assured) offer instead of a higher (though uncertain) expectation of receiving gold in a future split (“a bird in hand” they reason). However, this “risk aversion” possibility doesn’t present itself in any optimal proposal in the analysis of the game with the rules as stated (no one, other than captains with excess gold after trading for votes, can ever expect more than a single coin, yet the possibility does come up if, for example, we require proposals to pass with over 50% of the votes, but that’s a whole ‘nother can of worms to tackle).

In any case, there’s G coins to give and enough pirates to pick from cohort 2 (a tight G+1 at first but we’ll see how to grow this slack for higher k). So a captain wary of any risk-seeking, blood-thirsty, lower ranked mates can safely bribe all he needs, just pick all but one (in our current case of k=2) among those in cohort 2. And the probability for each of them of getting offered a coin in the current plan is once again G/(G+1).

Furthermore, since for more lucrative loots this p(|k=1 or 2) starts off quite close to certainty (e.g. for G=83 there’d be 84 pirates who have a ~99% chance at a coin in plans proposed by captain number 168 and higher), it seems in the high-ranked captains’ interest (170, 174, 182, 198, etc. or just imagine the convoluted thought processes required by ALL pirates for 2G+2^8=422’s plan) to try to lower this expectation as much as they can, by successively increasing the pool from which to pick at random, which they can easily do by adding each cohort 1 into the next cohort 2 from which to pick the G who’ll reap the rewards. Note that this dilution of expectations does not occur in the proposals suggested in the article nor the two amendments introduced by me initially; the probability always remains at ~99% and a single mistake by anyone could be fatal.

The risk I’m raising here is that, due to the introduction of probabilistic thinking and the breakdown of the original simple odd/even pattern at our edge case P*, it seems more likely that one or more perhaps slightly distracted pirates or maybe overly risk-takers or bloodthirsty might make a subtle slip at any step and assume they were one of the “entitled” ones (~99% to get coined next) when they really weren’t (as indeed happened in the article’s otherwise quite thorough analysis). Despite their infinitely deep rationality, probabilities tend to play tricks on sensibly rational agents and this “odd/even now at increasing k not P but wait P* was odd but special” business can get hairy.

So adding (exponentially increasing numbers) to the cohort 2 pool from which the captains sample the coin receivers signals to everyone how they should continuously update (lower) their expectations for larger and larger bands relative to loot size, to the point where just a few near limit cases (which are quicker to reason through) actually involve potentially “entitled” pirates that can’t be successfully bribed with a single coin, leaving as much room for error as possible when complexity really kicks in.

So, I’ll try to generalize these computations of cohort sizes and probabilities and expectations, first for the next larger successful proposal where P(3)=2G+8 pirates can split the loot, and to any higher k and P(k)=2G+2^k by induction. We first notice that a switch of cohorts must happen: this potentially bribed set of pirates in old cohort 2 with a chance at a coin would now constitute the new cohort 3 (they’re potentially entitled now and will be harder to persuade with a single coin), while those in old cohort 3 now are guaranteed not to get offered a coin, so they’ll join the ranks of the new cohort 2. And those in old cohort 1, as we reasoned was prudent to do, will also fall in the new cohort 2 to further dilute their expectations, since these also were guaranteed zero coins (but their lives are safe now) and are susceptible to bribing with a single coin. So our totals in the new cohorts will be:

1) C_1(3)=4 pirates (k=3), we generalize as the increase in pirates, all of whom are newly “doomed” captains:

C_1(k) = P(k) – P(k-1)

= 2G+2^k – (2G+2^(k-1))

= 2^(k-1)*(2-1)

= 2^(k-1)

2) C_2(3)=G+3; when I tried to generalize (e.g. by adding the previous cohorts 1 and 3, or as the difference from the total, P(k)-C_1(k)-C_3(k)) I got this recurrence relation:

C_2(k+1)=C_2(k-1)+2^(k-1) with

C_2(0)=G and C_2(1)=G+1

I couldn’t resolve the recurrence but further research revealed these very cool (to me) relationships:

C_2(k)=G+A(k) where A(0)=0, A(1)=1,

A(k) = 2^(k-1) – A(k-1)

= 2A(k-1) – (-1)^k

= (2^k – (-1)^k)/3

Thus |3A(k)-2^k|=1, tripling these numbers leaves you adjacent to the powers of 2.

(see Jacobsthal numbers http://en.wikipedia.org/wiki/Jacobsthal_number and http://oeis.org/A001045 for a dumbfounding list of properties and relationships)

3) C_3(3)=G+1 and generalize as previous cohort 2:

C_3(k)=G+A(k-1) [from C_2(k-1) see 2) above]

p(3)=G/C_3(3)=G/(G+1) probability of “entitlement” and generalize as:

p(k)=G/C_3(k)

=G/(G+A(k-1)) [see 2) above]

We can check that the recursion formulas do give what we expect for k=4:

P(4)=2G+2^4=2G+16

C_1(4)=2^3=8

C_2(4)=G+(2^4-1)/3=G+5

C_3(4)=G+(2^3+1)/3=G+3
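These hand checks can be automated. A quick Python sketch of my own, using the closed form for A(k) quoted above, verifies the stated identities and that the three cohorts always account for every pirate in the band P(k) = 2G + 2^k:

```python
def A(k):
    """Jacobsthal numbers via the closed form quoted above."""
    return (2 ** k - (-1) ** k) // 3

def cohorts(G, k):
    """Cohort sizes for the band P(k) = 2G + 2^k:
    C1 doomed captains, C2 cheaply bribable, C3 'entitled'."""
    return 2 ** (k - 1), G + A(k), G + A(k - 1)

# the recurrences for A(k) agree with the closed form, tripling A(k)
# lands next to a power of two, and the cohorts partition the band
for k in range(1, 25):
    assert A(k) == 2 ** (k - 1) - A(k - 1)
    assert A(k) == 2 * A(k - 1) - (-1) ** k
    assert abs(3 * A(k) - 2 ** k) == 1
    for G in (1, 10, 83):
        assert sum(cohorts(G, k)) == 2 * G + 2 ** k

print(cohorts(10, 4))   # -> (8, 15, 13), i.e. 2^3, G+5, G+3 as checked above
```

The partition identity C_1(k)+C_2(k)+C_3(k)=P(k) follows from A(k)+A(k-1)=2^(k-1).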

Using the approximation A(k-1) ≈ 2^(k-1)/3 in the expression we had for p(k) leads to:

p(k)=G/(G+2^(k-1)/3)

1/p = 1 + 2^(k-1)/(3G)

2^(k-1)=3G(1/p-1)

= 3G(1-p)/p

And the number of pirates P_min at which no one feels too “entitled” to demand more than a single-coin bribe (setting p=1/2, so (1-p)/p=1):

P_min=2G+2^k

= 2(G+2^(k-1))

= 2(G+3G)

= 8G

Alright that’s a nice and easy way to get a ballpark figure for what it takes to make these rowdy mates a bit more submissive. The higher-ups have enough with trying to save their own skins, don’t need none of this “we demand more than a single coin” attitude. It takes ~80 pirates for a booty of 10 gold coins. Or (roughly) one eighth of the loot, in general. Willing to trust them as 75% entitled? The above guideline yields P_min=4G. Only have three times as many pirates as gold? Then everyone will have to keep very close tabs on who should and should not get bribed at every point, relying strongly on the strategy of alternating cohorts, since about one out of three of them will feel about 86% entitled to a coin at any given time, and I’d bet many of those would reject such a cheap bribe, if offered.

So I’ll leave it there. Once again apologies for the extremely long comment. Hope it was worthwhile to readers.

Thank you for this comment — indeed you are correct. When considering the $2G+4$ case, the pirates will evaluate a proposal by comparing it to what they might win at $2G+2$, since the $2G+3$ case always falls through no matter what is proposed. I updated my solution to reflect the first part of your comment.

The second part of your comment is very interesting, and brings up a whole new dimension to the problem. It might be worth making an entire separate post about it!

Thanks for taking the time to read that whole comment. As I kept writing and writing I kept telling myself “no this is already too long no one will read any of it stop now just stop!” lol and then another cool idea came up and I just had to include it. I even feared it wouldn’t pass moderation but I could then try to pare it down.

But here you are, not only read it all and already incorporated some of it into the article, and you think the other ideas might be worthy of a full blog post. Well that’s great! Glad I could contribute something valuable.

So I re-read my comment and noticed a couple of paragraphs got mangled, so here they are restored.

_____________

Furthermore, since for more lucrative loots this p(|k=1 or 2) starts off quite close to certainty (e.g. for G=83 there’d be 84 pirates who have a ~99% chance at a coin in plans proposed by captain number 168 and higher), it seems in the high-ranked captains’ interest (170, 174, 182, 198, etc. or just imagine the convoluted thought processes required by ALL pirates for 2G+2^8=422’s plan) to try to lower this expectation as much as they can, by successively increasing the pool from which to pick at random, which they can easily do by adding each cohort 1 into the next cohort 2 from which to pick the G who’ll reap the rewards. Note that this dilution of expectations does not occur in the proposals suggested in the article nor the two amendments introduced by me initially; the probability always remains at ~99% and a single mistake by anyone could be fatal.

The risk I’m raising here is that, due to the introduction of probabilistic thinking and the breakdown of the original simple odd/even pattern at our edge case P*, it seems more likely that one or more perhaps slightly distracted pirates or maybe overly risk-takers or bloodthirsty might make a subtle slip at any step and assume they were one of the “entitled” ones (~99% to get coined next) when they really weren’t (as indeed happened in the article’s otherwise quite thorough analysis). Despite their infinitely deep rationality, probabilities tend to play tricks on sensibly rational agents and this “odd/even now at increasing k not P but wait P* was odd but special” business can get hairy.

_____________

In the last paragraph of my long comment I had a brainfart where I said:

“It takes ~80 pirates for a booty of 10 gold coins. Or (roughly) one eighth of the loot, in general.”

The last sentence is obviously a mistake. I meant to say “a loot that’s one eighth of the number of pirates”. Another way of saying it would be “when the ratio of coins per pirate is 1/8”.

By the way, the generic relationship between P, G and the probability can be expressed using the ratio of pirates per coin, r=P/G, like so:

P=2G(1+3(1-p)/p)

r=P/G=2+6(1/p-1)

r=6/p-4

Or if we need the inverse: p=6/(r+4).
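As a numerical sanity check on these formulas (a throwaway Python snippet of mine):

```python
def entitlement_prob(r):
    """p as a function of the pirates-per-coin ratio r = P/G: p = 6/(r+4)."""
    return 6 / (r + 4)

print(entitlement_prob(8))             # -> 0.5, the P_min = 8G guideline
print(entitlement_prob(4))             # -> 0.75, the 4G case
print(round(entitlement_prob(3), 2))   # -> 0.86, "about 86% entitled"
```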