Seemingly negative behaviors may ultimately lead to positives, new research shows.
Fairness may have darker roots than previously believed, according to new research that finds spiteful behavior can lead others to act fairly.
The study is based on a theoretical model, not human experiments, but it opens up the possibility that fairness evolved not out of Kumbaya-style cooperation, but out of a need to get by when others act spiteful. In an economic game, the study found, fair behavior evolved in order to survive in an environment where spiteful players thrived.
"What we found is an alternative evolutionary path towards fair behavior," said study researcher Patrick Forber, a philosopher at Tufts University in Medford, Mass.
Spite is the opposite of altruism. An altruistic person pays a personal price to do something nice for others. A spiteful person pays that price to do something to hurt someone else. [The 10 Most Destructive Human Behaviors]
Forber and his co-researcher, Rory Smead of Northeastern University in Boston, wanted to understand why spite might evolve. They used a famous economics game, called the Ultimatum Game, to find out.
In the Ultimatum Game, there are two players. The first is given a resource — say, $10 — and told to offer part of that resource to the second player. If player two refuses the offer, neither player gets anything. If player two accepts, both get the amounts proposed by player one.
If a person is playing to maximize his or her profits in the Ultimatum Game, the rational thing to do is to offer as little as possible to player two and accept anything offered when in player two's shoes. That's not how people work, however; they regularly make fair, even offers and reject unfair offers. That rejection is an example of spite, because the player refuses a reward in order to punish someone who gave an unfair offer.
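The payoff rules described above can be sketched in a few lines of Python. The $10 pot matches the example in the article; the acceptance threshold is an illustrative assumption, not a parameter from the study:

```python
def ultimatum_round(offer, min_acceptable, pot=10):
    """One round of the Ultimatum Game.

    Player one offers `offer` out of `pot`; player two accepts
    only if the offer meets their threshold `min_acceptable`.
    Returns (player_one_payoff, player_two_payoff).
    """
    if offer >= min_acceptable:
        return pot - offer, offer   # accepted: both players get paid
    return 0, 0                     # rejected: neither player gets anything

# A purely "rational" responder accepts any positive offer...
print(ultimatum_round(offer=1, min_acceptable=0))   # (9, 1)
# ...while a spiteful rejection costs both players everything.
print(ultimatum_round(offer=1, min_acceptable=5))   # (0, 0)
```

The second call shows why rejection counts as spite: the responder gives up $1 purely to deny the proposer $9.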
Easy riders and spiteful bots
Forber and his colleagues set up a model (essentially computers playing the Ultimatum Game against one another) to see what kind of players would evolve. They created a situation in which the players could make fair or unfair offers when in the player one position and could choose to accept or reject offers when in the player two position.
The setup resulted in four possible player types: The "rational" player, who makes unfair offers and accepts any offer that comes his way; the "fair" player, who makes fair offers and rejects unfair offers; the "easy rider," who makes fair offers but accepts any offers; and finally, the "spiteful" player, who makes unfair offers but also rejects unfair offers.
The model was set up so that the most successful players would multiply, mimicking evolutionary dynamics.
When player types are matched randomly, the result is either a population of rational players or some mix of fair players and easy riders, Forber said. But when the game was set up so that players were more likely to interact with types unlike themselves, another pattern emerged.
Under these mixed-up conditions, spite evolved — much to the researchers' surprise, Forber said. And with spite in play, strange things started happening. First, rational and fair players disappeared. Spiteful players rejected rational players' unfair offers, essentially spiting them out of the game. Fair players got duped by spiteful players, who always took their nice offers, but never returned the favor.
Only one type of player could survive the onslaught of spite: the easy rider. These players made fair offers, so spiteful players had no cause to punish them. But they also accepted what they could get from the spiteful, which kept them in the game.
The findings are theoretical, but they open up the question of how fairness evolved in humans, Forber said. (Chimpanzees, too, appear to value fairness when playing the Ultimatum Game.)
"It could be that [fairness] wasn't a solution to solving problems of cooperation," he said. "Instead, it was a solution of handling these anti-social types."
The researchers published their findings today (Feb. 11) in the journal Proceedings of the Royal Society B.