A Brief Overview of Game Theory

By Andrea Vossler, Scott Friedman, Eliza Friedman, and Mary Owen

When disagreements among family members threaten to boil over, applying insights from game theory can be helpful.

Game theory is a modeling framework for exploring how people interact when they face a problem that forces each of them to choose between cooperating with, or competing against, the partner they are paired with.

A branch of applied mathematics, game theory provides tools for analyzing the decisions each player faces in light of the other players' possible decisions. The games help identify the optimal choices of the players, who may have similar, opposed, or mixed interests, and the outcomes those choices may produce.[1]

Perhaps the best known "game" is the "prisoners' dilemma," in which two individuals are arrested for a petty crime but the police suspect them of a much more serious crime. The police, however, don't have enough evidence to convict on the serious charge and so need a confession. The suspects are placed in separate rooms where they can't communicate with each other, and each is offered a deal to betray the other prisoner by testifying that the other committed the serious crime. The terms of the deal are as follows:

  • If only one prisoner betrays the other, (a) the betrayer gets both crimes dismissed (and serves zero years in prison) but (b) the other prisoner, who has remained silent, gets the maximum penalty (ten years) for the serious crime;
  • If both prisoners betray each other, they both go to jail for the serious crime, but for only five years, since they assisted the police by confessing; and
  • If they both cooperate and keep silent, they both go to jail, but only for one year for the petty crime, and get away with the serious crime.

As a result, there are four possible outcomes for each prisoner: receiving the maximum sentence (ten years), a medium sentence (five years), a short sentence (one year), or no sentence at all. The winner of the game is the prisoner who serves the least amount of jail time. The "dilemma" is that each prisoner is tempted by the prospect that his accomplice will keep silent, in which case, if he confesses and betrays his partner, he wouldn't have to spend any time in jail. The motivation to make a selfish decision and betray a partner is clear, often leading both prisoners to confess—which makes them both worse off than if they had "cooperated" and kept their mouths shut.
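The four outcomes above amount to a small payoff table. The sketch below (in Python, with illustrative names of our own) records the sentence for each pairing of choices and checks the selfish logic described above: whatever the partner does, betraying leaves a prisoner with less jail time.

```python
# Years in prison for (my_move, partner_move); lower is better.
# The numbers come straight from the deal described above.
SENTENCES = {
    ("betray", "silent"): 0,   # I betray, partner stays silent: I walk free
    ("silent", "betray"): 10,  # I stay silent, partner betrays me: maximum penalty
    ("betray", "betray"): 5,   # we both betray: medium sentence for each
    ("silent", "silent"): 1,   # we both keep quiet: one year for the petty crime
}

# Whatever the partner chooses, betraying leaves me with less jail time,
# which is exactly the selfish temptation at the heart of the dilemma.
for partner_move in ("silent", "betray"):
    assert SENTENCES[("betray", partner_move)] < SENTENCES[("silent", partner_move)]
```

Yet if both prisoners follow that individually sensible logic, each serves five years instead of the single year that mutual silence would have cost them.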

This logic helps explain a variety of seemingly irrational behaviors, such as why family members sometimes spend enormous sums litigating their differences out of "spite" or "principle," even though they would have been better off economically (to say nothing of emotionally) reaching a settlement.

Family members, of course, make their choices over an extended period of time, not in a single interaction: again and again, they must decide whether to "cooperate" by acting in the family's collective interest or to make "selfish" decisions in their personal interest. In weighing those choices, it can be helpful to consider a version of the prisoners' dilemma in which the players, instead of playing a single game, play against each other repeatedly and so can react to the other player's behavior.

In these repeated games, unlike one-time games, each player has an opportunity to react to the other's previous decisions—by always cooperating, always being selfish, acting randomly and unpredictably, and so on. It turns out that the winners of the extended version of the game (i.e., those who serve, in the aggregate, the least time in jail) adopt what is known as a "tit for tat" strategy: always begin by cooperating (staying silent rather than pointing a finger at the other prisoner) and then, on every move after that, do whatever the other player did on the last move (either remaining silent or confessing).[2]
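A minimal simulation can illustrate why tit for tat holds up over repeated rounds. The sketch below (Python; the strategy names are our own, and the sentence numbers are the ones from the dilemma above) pits a tit-for-tat player against an always-selfish one and against another tit-for-tat player, totaling jail time over 200 rounds.

```python
# Years in prison for (my_move, partner_move), as in the dilemma above.
SENTENCES = {
    ("betray", "silent"): 0,
    ("silent", "betray"): 10,
    ("betray", "betray"): 5,
    ("silent", "silent"): 1,
}

def tit_for_tat(their_history):
    # Start by cooperating, then mirror the partner's previous move.
    return their_history[-1] if their_history else "silent"

def always_betray(their_history):
    # Pure selfishness: confess every round.
    return "betray"

def play(strategy_a, strategy_b, rounds=200):
    """Total years in prison for each player over repeated rounds (lower wins)."""
    hist_a, hist_b = [], []
    years_a = years_b = 0
    for _ in range(rounds):
        move_a, move_b = strategy_a(hist_b), strategy_b(hist_a)
        years_a += SENTENCES[(move_a, move_b)]
        years_b += SENTENCES[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return years_a, years_b

print(play(tit_for_tat, tit_for_tat))      # (200, 200): mutual cooperation
print(play(tit_for_tat, always_betray))    # (1005, 995): one betrayal, then defense
print(play(always_betray, always_betray))  # (1000, 1000): mutual selfishness
```

Two tit-for-tat players end up serving one year per round apiece, while two always-selfish players serve five times as much; paired against a defector, tit for tat loses only the opening round before it starts defending itself.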

Essentially, those who learn from their mistakes and correct their behaviors do better over time. A related concept is the "Nash equilibrium," named after the Nobel Prize-winning mathematician John Nash (whose life was made famous in the movie A Beautiful Mind). A Nash equilibrium is a stable outcome in which no player can improve his or her result by unilaterally changing strategy, given what the others are doing. In the one-shot prisoners' dilemma, mutual betrayal is the Nash equilibrium—which is precisely why the temptation to defect is so hard to escape without repeated interaction and trust.[3]
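The equilibrium idea can be checked directly against the sentences above: an outcome is stable when neither prisoner can cut his own jail time by switching his move alone. A short sketch (Python, with illustrative names of our own):

```python
# Years in prison for (my_move, partner_move), as in the dilemma above.
SENTENCES = {
    ("betray", "silent"): 0,
    ("silent", "betray"): 10,
    ("betray", "betray"): 5,
    ("silent", "silent"): 1,
}
OTHER = {"silent": "betray", "betray": "silent"}

def is_nash(move_a, move_b):
    # Stable if neither player can reduce his own sentence by switching alone.
    a_stays = SENTENCES[(move_a, move_b)] <= SENTENCES[(OTHER[move_a], move_b)]
    b_stays = SENTENCES[(move_b, move_a)] <= SENTENCES[(OTHER[move_b], move_a)]
    return a_stays and b_stays

print(is_nash("betray", "betray"))  # True: mutual betrayal is stable
print(is_nash("silent", "silent"))  # False: each prisoner is tempted to defect
```

Mutual silence, though better for both, is not an equilibrium: either prisoner could walk free by defecting alone, which is the dilemma in a nutshell.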

As family business advisors specializing in culture, communication, and conflict preemption, D21 Partners applies game theory to help business owners make the right choices for their businesses.

Does your family business struggle with disagreements and decision making? Find out by taking our Family Business Score Card.


[1] See generally Guillermo Owen, Game Theory (3d ed. 1995).

[2] Explaining how a classic “Tit for tat” strategy can be helpful by sending a message by retaliating—but only once—and letting bygones be bygones, at least until the other party fails to cooperate again, Roger Kay writes:

Tit for tat can swing both ways.  It elicits cooperation if you’ve got any inclination, but doesn’t take any guff.  When playing against Jesus, a virtuous cycle of cooperation prevails for all 200 rounds. Against Lucifer, Tit for Tat plays pretty good defense. And it wins, in evolutionary terms . . . [I]n a world where Lucifers dominate, a few Tit for Tat players can take back the night if there are enough of them to run into each other from time to time.

Roger Kay, Generous Tit for Tat: A Winning Strategy, Forbes (Dec. 19, 2011), http://www.forbes.com/sites/rogerkay/2011/12/19/generous-tit-for-tat-a-winning-strategy/#4d0fa6336669. Adam Grant's recent work on "giving," "taking," and "matching" suggests interestingly related findings and conclusions. See generally Adam Grant, Give and Take: Why Helping Others Drives Our Success (2014).

[3] See Ken Grimes, To Trust is Human, New Scientist, May 10, 2003, at 32 (describing games in which individuals show trust when trusted and distrust when threatened, and noting that initial distrust is self-fulfilling in that it leads to more untrustworthy behavior).