Tuesday, August 23, 2016

The Prisoner's Dilemma & Your Campaign

Many of my readers will be familiar with the Prisoner's Dilemma, but let's cover it quickly just to be sure we're on the same page.  This post isn't about the dilemma itself but about its application - a subject virtually ignored in most game theory books, which concern themselves mostly with its mathematical proof - and believe me, there's a lot of math.

Two jewel thieves work together to steal a large gem from a jewelry store and hide it in a cornfield.  Both thieves are caught by the police, who have enough evidence to hold the pair on a lesser charge.  The pair are interrogated separately.  The same deal is offered to each: confess to the crime and be set free, while the partner is given a hard sentence; or don't confess and go to jail on the minimum charge.

There are a number of things at play here: the amount of time spent in jail is one; the fact that confessing and going free will mean the object hidden in the cornfield is found is another.  There is, therefore, an incentive to tell the truth and be set free, and an incentive to do the time and keep the jewel.  If both crooks keep quiet, they'll both go to jail on the minimum charge and split the proceeds later.  If either crook rats out the other, that crook will go free and the jewel will be lost.  If both crooks rat, both will be in jail a long time and the jewel will be lost.
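The incentives above can be written down as a tiny payoff matrix.  The sentence lengths here are my own invented numbers, not anything from the original formulation, but the shape of the result is the point: whatever the partner does, confessing always leaves the individual prisoner better off.

```python
# A minimal sketch of the two thieves' payoffs, with made-up sentence
# lengths (years in jail; lower is better for the prisoner).
# "quiet" = keep silent, "rat" = confess.

SENTENCES = {
    # (my choice, partner's choice): my years in jail
    ("quiet", "quiet"): 1,   # both held on the minimum charge
    ("quiet", "rat"):  10,   # I stay quiet, partner confesses
    ("rat",   "quiet"): 0,   # I confess and go free
    ("rat",   "rat"):   5,   # both confess, both do hard time
}

def best_reply(partner_choice):
    """Return the choice that minimizes my own sentence, given the partner's."""
    return min(["quiet", "rat"], key=lambda me: SENTENCES[(me, partner_choice)])

# Whatever the partner does, confessing is the better individual move,
# even though both staying quiet beats both confessing:
for partner in ("quiet", "rat"):
    print(partner, "->", best_reply(partner))
```

Run it and "rat" comes out as the best reply in both cases - which is exactly the trap: individually rational play lands both crooks in the worst shared outcome.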

The key to the dilemma is not what the crooks do, however, but how the element of time affects the equation.  Let's suppose that both crooks keep quiet and begin their sentence.  Even if this is their decision, it is an unstable situation.  At any time, either crook can decide the sentence is too unpleasant and choose to rat out the other.  The longer the time on the minimum charge, the less stable the situation is and the more likely the arrangement is to fold, leaving us with one crook having given up the jewel.
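The effect of time can be put in rough numbers.  This is a toy calculation of my own, not a model from the literature: assume each prisoner, each month, independently cracks and confesses with some small probability.  The chance the pact survives the whole sentence then shrinks fast as the sentence grows.

```python
# Toy illustration: if each of the two prisoners cracks with probability
# p in any given month, the pact of silence survives n months only if
# neither cracks in any of those months: (1 - p) ** (2 * n).

def pact_survival(p_crack_per_month, months):
    """Probability that neither prisoner has confessed after `months`."""
    return (1 - p_crack_per_month) ** (2 * months)

# The same 2%-per-month crack rate, over longer and longer sentences:
for months in (6, 24, 60):
    print(months, "months:", round(pact_survival(0.02, months), 3))
```

With these made-up numbers, a six-month sentence leaves the pact likely to hold, while a five-year sentence makes a confession the overwhelmingly probable outcome - the longer the clock runs, the less stable silence becomes.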

Once the confession is made, however, the situation becomes more unstable.  Now the betrayed criminal has no incentive to keep quiet and plenty of incentive for revenge.  While the proposition of the dilemma may be seen as a closed puzzle, in actual life we know that the second crook will try to seek ways to take revenge, revealing anything possible to ensure that the first criminal is arrested for some other, previously unknown crime - or potentially using influence to harm or kill the freed criminal.

Therefore, the only stable situation is one where both criminals are severely punished.  The reason this matters has everything to do with nuclear war.  The prisoner's dilemma was first proposed in 1950 by two employees of the RAND Corporation, a think tank whose commitment at the time was to intellectually address the Cold War.  If we think about the situation described above in terms of two nuclear powers, the best possible situation is for both to refrain from total war.  However, since both powers have interests throughout the globe, the desire to drop the bomb to solve the problem is always there, meaning that the situation is unstable.

One power might try to avoid the use of atomic weapons through the use of capital or conventional weapons - whereupon the other side, like the second prisoner, will join in and perhaps go a little farther.  If one side or the other begins to lose, the desire is to escalate the conflict - and thus the unstable situation becomes increasingly unstable until, boom, total nuclear war makes the system completely stable.

It is this escalation that makes the prisoner's dilemma interesting, as one prisoner tries to confess some of the crime while holding back information on where the jewel is actually located; or tries to ascertain what the other prisoner has already confessed, in order to improve their individual advantage.

This is the game we all play, all the time, because in every relationship we have, we are always seeking advantage over those people we perceive to be seeking advantage over us - with an understanding that at any time, we can drop the bomb by revealing some secret that will ruin the other person's life, at the risk of getting ourselves fired or otherwise escalating the situation until total mayhem results.  Intellectually, we understand that we're in the best situation possible when we all agree wholeheartedly, but that's not how we function instinctively.  Instinctively, where there is instability, we strive for stability, and that ultimately ends in disaster.

In application to our role-playing campaigns, both the DM and the Players have a nuclear potential: either can quit outright and the game is wrecked.  At the same time, it is difficult for the DM or the Players to fully trust the other, because the game is rigged in the DM's favor, while the social contract is rigged in the Players' favor (there are many players but only one DM).

If each side respects the other and carefully bows to the needs of the other, the game continues.  But as the DM seeks an edge on controlling the players, or as the players seek an edge on the DM, or as each tries to restrict the behavior of the other, the small amount of instability grows and . . . boom.

The only practical strategy is to accept the instability and try to keep it at a minimum.  The game will never be 100% stable as long as it's ongoing; habitually, without thinking about it, both DM and Players will take actions that increase instability, because we're built this way.  The solution is a formula of open, friendly discussion, where no one tries to bully anyone else, where a frank discussion of needs and expectations is wholeheartedly addressed by both parties and where everyone is completely honest, all the time.  Any and all behavior to the contrary is a risk.

If the reader would like a little more insight into the RAND corporation and game theory's influence on modern day culture, see Adam Curtis' The Trap, a documentary that aired on the BBC in 2007.

1 comment:

Scarbrow said...

While sorely tempted to add my own reasoning here, when it comes to the Prisoner's Dilemma I always turn to Scott Alexander, of Slate Star Codex fame


TL;DR: Society and biology, with their norms for "honor" and "friendship," offer a solution to the dilemma by making us more likely to cooperate. So treat your players nicely, and they will be less likely to backstab your campaign. You know it's true when, after saying it, it looks so obvious you can't even understand why it had to be said.