The Dieter’s Gamble and the Prisoner’s Dilemma
In an earlier post I talked about how our innate sense of fair play can make the experience of dieting feel unjust. But what does justice or fairness, which involves preventing or resolving conflict between an individual and others, have to do with an individual’s own internal conflict about dieting? If fairness is a social issue, it should be irrelevant for helping us understand the struggle with emotional eating and self-control.
Or is it? I’ll return to the topic of dieting later, but first let’s explore the evolution of social cooperation for some background to answer this question.
Where do our notions of fairness come from? We’re taught at an early age to wait our turn, not to cheat, and to share our toys. It would seem that we need to be educated about these rules so that we don’t act on our childlike impulses to grab every selfish advantage we can instead of playing well with others.
To a large degree, though, we seem to grasp many of these ideas intuitively. For example, we may learn in driver’s education who has the right of way at a four-way stop sign, but we don’t question why that’s fair; we just know it, because it’s obvious that there should be some protocol. And when we learn that it’s a version of first-come-first-served, it seems right. We instantly grasp that it’s not just unsafe but wrong to jump your turn at the intersection, and we instinctively feel angry at someone who does.
I’m also sure you’ve seen very small children who, even though they’re too young to have learned the rules, will instinctively reach out to a crying playmate by offering a toy or sharing a treat. It seems that when we learn rules for getting along in pre-school, it’s more like the driver’s ed class: a formalization and reinforcement of social rules that we already know or can figure out instinctively.
This combination of learned social skills and innate altruism competes with our natural inclination to act in our self-interest. But why are we apparently willing to make any sacrifices for others if the rules are unwritten and unenforceable? And how does a system for social cooperation develop in the first place?
It seems that they’re not entirely unenforceable, at least informally. We struggle with the ongoing tension between taking care of our own needs and setting them aside for the good of the social group that we’re part of. There are times when it’s appropriate to inhibit our self-indulgent impulses, not only for the greater good, but for our own good as well.
I’ve written frequently about our need to maintain a balance between overriding our impulses and letting go of self-restraint. More recently, I’ve come to believe that the tension created by maintaining this internal balance is rooted in and parallels a broader human need to find balance in our interpersonal social interactions. Both conflicts – the individual and social – are similar and are both ultimately driven by self-interest. They require a sense of trust that the sacrifice invested today (in the group or in one’s future self) will result in a greater reward at some later time.
When this conflict between individual needs and group needs is the subject of study, it’s known as the Tragedy of the Commons. The classic example involves herders grazing their flocks on a common pasture: each herder’s short-term interest is to graze as many animals as he can, but if everyone does so the pasture is destroyed. Avoiding the tragedy requires each herder to limit his flock in order to preserve the resource for the benefit of everyone (including, of course, himself).
Some degree of self-limiting behavior is seen in all social groups. There are about fifty species of small fish that live off the tiny parasites they pick from the teeth of larger fish. In the short term, of course, the larger fish could get an easy meal by simply snapping their jaws shut while the little fish are at work. But in the longer term it pays for them to restrain that impulse and benefit instead from this steady symbiotic arrangement. It’s a kind of primitive social contract that nevertheless involves very evolved “human” traits like trust, reciprocity, mutual dependence, self-control, altruism and investment in the future.
Ethologist Konrad Lorenz describes how animals competing for mating opportunities or territory curb their aggression against rivals. Rather than engaging in a fight to the death, such competition is more typically carried out like a ritualized jousting tournament. Why do the animals hold back? Because a rival who survives today can live to be an ally tomorrow, and that may prove useful when competing against other groups. Like Lincoln’s famed “team of rivals,” there are benefits in winning the prize without eliminating the competition.
This kind of strategic restraint has been studied by social psychologists and game theorists using a model of social behavior called The Prisoner’s Dilemma. The name refers to a hypothetical situation in which two partners in crime are arrested for armed robbery, but the police need at least one of them to give up the other in order to have enough evidence for a conviction. Each can choose only one of two moves: COOPERATE and observe the criminal code of silence or DEFECT and implicate the other.
There are four possible outcomes:
They both cooperate. All the police can do is charge them each with breaking and entering and they’ll both get only a year in jail. That’s “the reward for mutual cooperation” and it’s a fairly good outcome.
They each defect. Both are charged with the more serious offense and serve two years in jail. Call that “the penalty for mutual defection” which is a fairly bad outcome.
One defects and goes free, the best outcome of all, which is why it’s called “the temptation to defect”…
…while the other cooperates and gets a three-year sentence for the robbery and obstructing justice. That’s the worst outcome, called “the sucker’s payout.”
In the research setup, points are awarded to reflect the relative value of the different outcomes: 0 for the sucker’s payout, 1 for mutual defection, 2 for mutual cooperation, and 3 for the temptation to defect.
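For readers who like to see the bookkeeping, that scoring can be sketched in a few lines of Python (the names here are mine, not from the research literature):

```python
# One round of the Prisoner's Dilemma, scored on the 0-3 scale above.
# Keys are (player_a_move, player_b_move); values are (a_points, b_points).
C, D = "COOPERATE", "DEFECT"

PAYOFFS = {
    (C, C): (2, 2),  # the reward for mutual cooperation
    (D, D): (1, 1),  # the penalty for mutual defection
    (D, C): (3, 0),  # the temptation to defect vs. the sucker's payout
    (C, D): (0, 3),  # the sucker's payout vs. the temptation to defect
}

def score_round(move_a, move_b):
    """Return the points each player earns for one round."""
    return PAYOFFS[(move_a, move_b)]
```

Notice that the temptation (3) beats the reward (2), which beats the penalty (1), which beats the sucker’s payout (0), and that a round of mutual cooperation shared twice (2 + 2) earns more than one exploitation (3 + 0). Both orderings are what make the game a true Prisoner’s Dilemma.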
In a one-time game it’s always best to defect, because you have a chance at the big score and you can never do worse than your opponent. But in a real social group, members interact repeatedly over a long period of time, which changes the best strategy: if you always defect, others are likely to retaliate later. To study this, researchers play a modified version called the Iterated Prisoner’s Dilemma, in which the same two players face each other over many rounds.
So which strategy works best for the good of all?
Robert Axelrod, a political scientist at the University of Michigan, describes the approach he took to answering this question in his book, The Evolution of Cooperation. Axelrod, who later collaborated with biologist William Hamilton on the evolutionary implications, invited game theorists from around the world to submit computer programs embodying what each believed would be the best strategy. The winner would be the program with the highest total score after facing all the others in a round-robin tournament.
In the end, the winning entry was a program called Tit for Tat, submitted by mathematical psychologist Anatol Rapoport. It was the most successful strategy for fostering a stable social system, and it had the simplest and most intuitive rules of all. Simply put, it’s “I’ll scratch your back if you scratch mine.” But there’s also an implied corollary to this conditional offer: “and if you don’t, you can forget about me tending to your itchy back next time!” The rules are simple: cooperate on your first move, then just copy your opponent’s previous move after that.
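Those two rules fit in a few lines of code. Here’s a minimal Python sketch (the function names are mine; Axelrod’s actual tournament used a different point scale and much longer matches), scored on the 0-3 scale described earlier:

```python
C, D = "COOPERATE", "DEFECT"

# (my_move, opponent_move) -> my points, on the 0-3 scale
POINTS = {(C, C): 2, (D, D): 1, (D, C): 3, (C, D): 0}

def tit_for_tat(opponent_history):
    """Cooperate on the first move, then copy the opponent's last move."""
    return C if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    """A purely selfish strategy, for comparison."""
    return D

def play_match(strategy_a, strategy_b, rounds=10):
    """Play an iterated match and return each side's total score."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_b)  # each side sees the other's past moves
        move_b = strategy_b(history_a)
        score_a += POINTS[(move_a, move_b)]
        score_b += POINTS[(move_b, move_a)]
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

print(play_match(tit_for_tat, always_defect))  # (9, 12)
print(play_match(tit_for_tat, tit_for_tat))    # (20, 20)
```

In any single pairing a relentless defector can eke out a small edge (12 points to 9 here, from that one unanswered first-round betrayal), but two Tit for Tat players earn far more together (20 each), which is how the strategy came out on top across a whole field of entries.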
Here’s a real world example of how this strategy can be used. An investor is asked during a business meeting to write a proposal for a deal. He’s the best qualified person in the group to do it, but he thinks it would be considerate of him to pass a draft around so others can have an opportunity to comment on it. [PLAYER A: COOPERATE?]
But then he considers the possibility that it would just open the door for someone who might begrudge him the privilege of writing the proposal to criticize his work or shame him by making a disparaging comment [PLAYER B: DEFECT?]. He never said during the meeting that he would pass it around, so why risk it? [A: DEFECT?]
He finally concludes, “If someone else wants to act like a jerk, let him! I’m not going to assume there were any jerks in that meeting until one declares himself.” [A: COOPERATE] So he sends it around and gets positive feedback from everyone. [B: COOPERATE] Both sides (he and his potential rivals in the group) receive the “cooperators’ reward” because they’re now a more cohesive team, which will help in negotiating this and any future deal.
The Tit for Tat strategy succeeds because it strikes the right balance between being generous and being selfish. It’s friendly, but it’s no pushover. More forgiving programs, which failed to retaliate after a defection, were exploited. Less generous programs, which anticipated selfish behavior by defecting on the first move, were punished even when the rest of their moves followed Tit for Tat.
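The cost of that opening defection is easy to reproduce. Here’s a hedged Python sketch (the names are mine) pitting ordinary Tit for Tat against a hypothetical “suspicious” variant that defects on its first move but otherwise follows the same copy-the-opponent rule, scored on the 0-3 scale described earlier:

```python
C, D = "COOPERATE", "DEFECT"

# (my_move, opponent_move) -> my points, on the 0-3 scale
POINTS = {(C, C): 2, (D, D): 1, (D, C): 3, (C, D): 0}

def tit_for_tat(opponent_history):
    """Cooperate first, then copy the opponent's last move."""
    return C if not opponent_history else opponent_history[-1]

def suspicious_tit_for_tat(opponent_history):
    """Defect first, then copy the opponent's last move."""
    return D if not opponent_history else opponent_history[-1]

def play_match(strategy_a, strategy_b, rounds=10):
    """Play an iterated match and return each side's total score."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_b)  # each side sees the other's past moves
        move_b = strategy_b(history_a)
        score_a += POINTS[(move_a, move_b)]
        score_b += POINTS[(move_b, move_a)]
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

print(play_match(tit_for_tat, tit_for_tat))             # (20, 20)
print(play_match(suspicious_tit_for_tat, tit_for_tat))  # (15, 15)
```

That single opening defection locks both players into an endless cycle of alternating retaliation, and each finishes with 15 points instead of the 20 that steady mutual cooperation would have earned.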
This is the basis for reciprocal altruism. It’s altruistic because we suspend our own needs for someone else’s benefit. It’s reciprocal because that person will do the same, or else the privileges that come with group membership will be suspended until he cooperates.
Now we can turn back to the question of how this relates to an individual’s conflict around self-control. When a dieter, for example, feels the impulse to indulge in something that’s not allowed on her diet but restrains herself in order to lose weight, how does she expect to benefit from this sacrifice?
It depends on the basis of her motivation to diet. She may view the diet as something that really comes from her and when she sticks to the rules, she genuinely feels that she’s doing exactly what she wants. This does happen sometimes and it’s likely to bode well for the success of her dieting efforts.
Now consider someone who is on a diet because she feels pressured somehow. Perhaps it’s the cultural pressure to be thin or she’s getting ready for the summer or a beach vacation. Even though her conflict is going on in her own head, if she’s trying to comply with perceived social pressure, the Prisoner’s Dilemma is likely to play a role in her diet strategy.
Her opening move in the game was COOPERATE. Her expectation was that she would, at some point, see a return on her investment, like greater social acceptance, success, or attention, because she would lose weight. At first, it’s likely to go well as long as the scale keeps showing steady progress, or the compliments keep coming, or the clothes that she hasn’t been able to wear in a while begin to fit again.
But once those reinforcements stop – and eventually they do – she’ll begin to wonder why she’s doing this. The fact that others are able to maintain their weight without dieting will become more irritating. Most importantly, if the expected improvement in social acceptance and career or relationship success has not come through, she’ll be left with nothing but the deprivation, food rules and low-calorie desserts. It’s the ultimate sucker’s payout.
This is the kind of experience that puts the “emotion” in emotional eating. It’s a sense of betrayal and false promises that generates anger and a defiant abandonment of self-restraint. Her emotional eating, then, is a Tit for Tat DEFECT move: she feels betrayed and angry that her COOPERATE move was not reciprocated. It’s exactly the right play in the Iterated Prisoner’s Dilemma: cooperate first and keep cooperating as long as the other side does, but respond in kind when he defects.
There are two takeaways. First, if your decision to diet truly comes from you, whether to reduce health risks, boost self-esteem, or satisfy any other intrinsic motive, you’ll likely reach your goal and feel good about the process. But dieting just to play the social cooperation game is a setup for feeling betrayed.
Second, if the emotions you experience when you break the rules of the diet feel like resentment or angry defiance, that’s probably a sign that you’re feeling cheated. But it’s like a shell game or Three-card Monte (also known as Find the Lady): once you figure out that it’s a scam, you’re smart enough not to fall for it.