You are playing backgammon (wlog as white). (The rules of backgammon are explained more clearly here.) At any given point during a round, there are six possible outcomes of the round, with corresponding values (in multiples of the current stake):

- white wins by a backgammon (+3)
- white wins by a gammon (+2)
- white wins (+1)
- red wins (−1)
- red wins by a gammon (−2)
- red wins by a backgammon (−3)

At the start of a round, the stake is 1 point, and both players have control of the *doubling cube*.

When it is Alice’s turn, she may offer a *double* if and only if she controls the doubling cube. If Alice offers a double, then Bob must accept or decline. If Bob declines, then Bob forfeits the round and Alice gains the current stake. If Bob accepts, then Alice ceases to control the cube, Bob gains control of the cube, and the stake is multiplied by 2.
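The scoring and cube mechanics above can be sketched in a few lines of Python. The outcome values follow standard backgammon scoring; the function and variable names are my own:

```python
# Value of each outcome to white, in multiples of the current stake.
OUTCOME_VALUES = {
    "white backgammon": +3,
    "white gammon": +2,
    "white win": +1,
    "red win": -1,
    "red gammon": -2,
    "red backgammon": -3,
}

def round_value(outcome: str, stake: int) -> int:
    """Points white gains (negative means white loses) when the round ends."""
    return OUTCOME_VALUES[outcome] * stake

stake = 1    # the stake at the start of a round
stake *= 2   # Alice doubles, Bob accepts: the stake is multiplied by 2
print(round_value("white gammon", stake))  # a gammon at stake 2 is worth 4 points
```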

(There are other rules.)

## The problem

Suppose that both you and your opponent can perfectly calculate the probability of each of the above outcomes.

- It is your turn. Should you offer a double?
- It is your opponent’s turn, and they have offered a double. Should you accept?

For a harder problem, suppose that you assign probabilities *p* to each of the above outcomes, and your opponent assigns different probabilities *q*, but you know both the *p* and the *q*.

Interesting article, Jonny. For the one where you can both perfectly calculate the probabilities:

I think that one should only offer a double when they are more likely to win, but only marginally. It all depends on how risk-averse the players are.

Equally, if a gammon or backgammon is likely then they probably shouldn’t offer it.

In terms of being offered the double, one has to analyse the volatility of the probabilities over the next couple of throws and assess whether the entire game could turn in the next few goes. I think the big reason for not accepting is if the opponent has a 60%+ chance of winning.

It is an interesting situation. One should only really offer the doubling cube if they look set to win, but one should not accept if it’s unlikely they’ll win. So by that logic, it’ll never be used.

This applies if we’re not playing the Jacoby rule. It’s been pointed out to me that everything is different if we are playing a match (best to 15 or 17) rather than for money. If you’re very behind with the score at 13-4, say, then doubling is more advantageous for you than it is for them, because you’re likely to lose the match anyway.

I thought about this solution: Suppose you both can calculate the probabilities. Then you can both calculate the expected outcome for the round, if you don’t double. Call it *E*. Your opponent will take the double if *E* < 1/2, and drop if *E* > 1/2. Thus, you stand to gain from an offer if 0 < *E* < 1/2: you double, your opponent will take, and you can now expect to win 2*E*.
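As a sanity check, that rule is easy to put in code. This is my own sketch, not from the post: it computes *E* from the six outcome probabilities (in units of the current stake) and applies the naive double/take thresholds.

```python
# Outcome values to you, in units of the current stake, ordered:
# [your backgammon, your gammon, your win, their win, their gammon, their backgammon]
VALUES = [3, 2, 1, -1, -2, -3]

def expected_value(probs):
    """E: your expected score for the round if the cube stays where it is."""
    assert abs(sum(probs) - 1.0) < 1e-9
    return sum(p * v for p, v in zip(probs, VALUES))

def should_double(probs):
    """Naive rule: double when 0 < E < 1/2, since the opponent will take
    and your expectation rises from E to 2E."""
    e = expected_value(probs)
    return 0 < e < 0.5

def should_take(probs):
    """From the taker's side: accepting is worth 2E, dropping is worth -1,
    so take whenever E > -1/2 (E computed from the taker's perspective)."""
    return expected_value(probs) > -0.5

# Hypothetical probabilities, purely for illustration.
probs = [0.05, 0.15, 0.40, 0.30, 0.08, 0.02]
print(expected_value(probs))   # E is 0.33 here (up to float rounding)
print(should_double(probs))    # 0.33 lies inside (0, 1/2)
```

Note that `should_take` evaluates *E* from the taker’s own perspective, so its threshold of −1/2 is the mirror image of the doubler’s 1/2 above.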

But you’re right: this solution doesn’t take volatility or skew into account. You might have a situation with 0 < *E* < 1/2, with a small chance of you winning by a backgammon, a large chance of you losing by a point, but only a very small chance of winning by just a point or a gammon. (I can’t think of any such situation at the moment.) I’m not sure whether the expectations argument still holds.
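For what it’s worth, probability vectors with that shape do exist arithmetically; whether any real board position produces one is another matter. The numbers below are purely made up for illustration:

```python
# Made-up numbers: a 30% chance of winning by a backgammon (+3) and a
# 70% chance of losing a single point (-1), with the other outcomes impossible.
probs = [0.30, 0.00, 0.00, 0.70, 0.00, 0.00]
values = [3, 2, 1, -1, -2, -3]

E = sum(p * v for p, v in zip(probs, values))
print(E)  # 0.3 * 3 - 0.7 gives E ≈ 0.2, inside (0, 1/2), so the naive rule says double
```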