President Donald Trump has been dogged by questions about conflicts of interest. He has declined to divest himself of his assets or put them in a blind trust, as is customary for presidents, news reports say. He has tweeted in defense of his daughter's clothing line. And taxpayer money may go toward the Department of Defense leasing space in Trump Tower — the president's property — to remain close to the president when he is in Manhattan, CNN recently reported.
At the heart of any conflict-of-interest situation is the question of whether to act in your own best interest or do what is best for the greater good. Trump's issues might make a cynic shrug. After all, don't we all look out only for ourselves?
"In the past 20 years, we have discovered that people — all around the world — are a lot more moral and a lot less selfish than economists and evolutionary biologists had previously assumed, and that our moral commitments are surprisingly similar: to reciprocity, fairness and helping people in need, even if acting on these motives can be personally costly for a person," Samuel Bowles, an economist at the Santa Fe Institute and author of "The Moral Economy: Why Good Incentives Are No Substitute for Good Citizens" (Yale University Press, 2016), wrote in an email to Live Science. [No 'I' in Team: 5 Key Cooperation Findings]
Philosophers have been arguing about whether people are inherently selfish for as long as there have been philosophers. In Plato's "Republic," Socrates has a discussion with Plato's older brother Glaucon, in which Glaucon insists that people's good behavior exists only out of self-interest: People do the right thing only because they fear being punished if they get caught. If human actions were invisible to others, Glaucon says, even the most "just" man would act purely for himself and not care if he harmed anyone in the process.
It's the sort of argument that might have appealed to Thomas Hobbes, the 17th-century English philosopher famous for saying that the natural state of man's life would be "nasty, brutish and short." According to Hobbes, humans must form social contracts and governments to prevent their selfish, violent tendencies from taking over.
Not all philosophers have agreed with this dour point of view, however. Philosopher John Locke, for example, thought that humans were inherently tolerant and reasonable, though he acknowledged humanity's capacity for selfishness.
So what does the science say? In fact, people are quite willing to act for the good of the group, even if it's against their own interests, studies show. But paradoxically, social structures that attempt to give people incentives for good behavior can actually make people more selfish.
Take a classic example: In 2000, a study in the Journal of Legal Studies found that trying to punish bad behavior with a fine backfired spectacularly. The study took place at 10 day care centers in Haifa, Israel. First, researchers observed the centers for four weeks, tracking how many parents arrived late to pick up their children, inconveniencing the day care staff. Next, six of the centers introduced a fine for parents who arrived more than 10 minutes late. The four other centers served as a control, for comparison. (The fine was small but not insignificant, similar to what a parent might have to pay a babysitter for an hour.)
After the introduction of the fine, the rate of late pickups didn't drop. Instead, it nearly doubled. By introducing an incentive structure, the day cares had apparently turned late-pickup time into a commodity, the researchers wrote. Parents who might have felt vaguely guilty for imposing on teachers' patience before the fine now felt that a late pickup was simply something they could buy.
The Haifa day care study isn't the only one to find that trying to induce moral behavior with material incentives can make people less considerate of others. In a 2008 review in the journal Science, Bowles examined 41 studies of incentives and moral behavior. He found that, in most cases, incentives and punishments undermined moral behavior.
For example, in one study, published in 2000 in the journal World Development, researchers asked people in rural Colombia to play a game in which they had to decide how much firewood to take from a forest, with the consideration that deforestation would result in poor water quality. This game was analogous to real life for the people of the village. In some cases, people played the game in small groups but couldn't communicate about their decisions with players outside their group. In other cases, they could communicate. In a third condition, the players couldn't communicate but were given rules specifying how much firewood they could gather.
When allowed to communicate, the people in the small groups set aside self-interest and gathered less firewood for themselves, preserving water quality in the forest for the larger group as a whole. Regulations, on the other hand, had a perverse result over time: People gradually began to gather more and more firewood for themselves, risking a fine but ultimately putting their self-interest first.
"People look for situational cues of 'acceptable behavior,'" Bowles said. "Literally dozens of experiments show that if you offer someone a money incentive to perform a task (even one that she would have happily done without pay), this will 'turn on' the 'What's in it for me?' way of thinking, often to such an extent that the person will perform less with the incentive than without."
Though cooperation is ingrained in the human psyche to some extent, it's also obvious to anyone who has worked on a team that not everyone approaches group activities with the same attitude. An increasing focus on individual differences in humans reveals that some people tend to cooperate more than others.
"It has been known for quite a while that people differ quite a lot, and they differ in all kinds of behavioral tendencies," said F.J. Weissing, a theoretical biologist at the University of Groningen in the Netherlands. "But when people conducted experiments, they typically looked at the average behavior and not so much at the variation between subjects." [Top 10 Things that Make Humans Special]
That variation among subjects turns out to be quite important. In 2015, Weissing and his colleagues published a paper in the journal PNAS describing a game in which players could seek out either information about the choices other players made or information about how successful those other players were. People were remarkably consistent about the kind of information they sought, the researchers found: About two-thirds always asked for the same kind of information, whether they preferred to see others' choices or others' successes.
Then, the researchers split people into groups based on the information they preferred: some groups contained only people who sought choice information, others only people who sought success information, and some were mixed. These groups then played games in which cooperation benefited everyone, but a selfish strategy could boost an individual's fortunes while hurting the group.
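The paper doesn't spell out the exact payoffs, but the tension described here is that of a standard public goods game. A minimal sketch, with hypothetical numbers (endowment of 10, pot multiplied by 1.6 and split among four players), shows why free riding pays individually even though it leaves the group poorer:

```python
def public_goods_payoff(contributions, endowment=10, multiplier=1.6):
    """Payoffs for one round of a public goods game.

    Each player keeps whatever they don't contribute; the common pot
    is multiplied and split evenly. When 1 < multiplier < number of
    players, contributing nothing always earns more for the individual,
    but the group as a whole earns most when everyone contributes.
    """
    n = len(contributions)
    share = multiplier * sum(contributions) / n
    return [endowment - c + share for c in contributions]

# Four players who all contribute their full endowment:
all_in = public_goods_payoff([10, 10, 10, 10])    # each earns 16.0
# One free rider among three cooperators:
mixed = public_goods_payoff([0, 10, 10, 10])      # free rider earns 22.0,
                                                  # each cooperator only 12.0
```

The free rider out-earns every cooperator (22 versus 12), yet the group's total falls from 64 to 58 — exactly the structure in which a selfish strategy elevates an individual while hurting the group.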
People who fixated on the success of their teammates were more likely to behave selfishly in these games, the researchers found. This finding shows that this strategy — comparing others' successes and failures — prompts people to engage in behaviors focused on their own gain, the researchers said.
In contrast, people who focus on how the rest of the group is acting, regardless of individual successes, might be more prone to working together, the researchers said.
Both cooperation and selfishness may be important behaviors, meaning that species may be most successful if they have some individuals that exhibit each behavior, Weissing told Live Science. In follow-up experiments that have not yet been published, he and his colleagues have found that in some economic games, mixed groups perform far better than groups made up only of conformists or only of those who look out for themselves.
Fundamental physiological differences between people, including differences in hormone levels and in the organization of the central nervous system, may be at the root of these different social strategies, Weissing said. However, he agreed that situational factors can subtly push people toward cooperation or self-interest. More realistic studies of cooperative and selfish behavior are needed, he said.
"In real life, cooperation looks very, very different from these very, very simplified lab contexts," Weissing said. "And the dominant factor is not really money, but something else. I think that makes quite a difference."
Original article on Live Science.