Over the course of the Pleistocene epoch, between 2.6 million years ago and 11,700 years ago, the brains of humans and their relatives grew. Now, scientists from Tel Aviv University have a new hypothesis as to why: As the largest animals on the landscape disappeared, the scientists propose, human brains had to grow to enable the hunting of smaller, swifter prey.
This hypothesis argues that early humans specialized in taking down the largest animals, such as elephants, which would have provided ample fatty meals. When these animals' numbers declined, humans with bigger brains, who presumably had more brainpower, were better at adapting and capturing smaller prey, which led to better survival for the brainiacs.
Ultimately, adult human brains expanded from an average of 40 cubic inches (650 cubic centimeters) at 2 million years ago to about 92 cubic inches (1,500 cubic cm) on the cusp of the agricultural revolution about 10,000 years ago. The hypothesis also explains why brain size shrank slightly, to about 80 cubic inches (1,300 cubic cm), after farming began: The extra tissue was no longer needed to maximize hunting success.
This new hypothesis bucks a trend in human origins studies. Many scholars in the field now argue that human brains grew in response to a lot of little pressures, rather than one big one. But Tel Aviv University archaeologists Miki Ben-Dor and Ran Barkai argue that one major change in the environment would provide a better explanation.
"We see the decline in prey size as a unifying explanation not only to brain expansion, but to many other transformations in human biology and culture, and we claim it provides a good incentive for these changes," Barkai wrote in an email to Live Science. "[Scholars of human origins] are not used to looking for a single explanation that will cover a diversity of adaptations. It is time, we believe, to think otherwise."
Big prey, growing brains
The growth of the human brain is evolutionarily remarkable, because the brain is a costly organ. The Homo sapiens brain uses 20% of the body's oxygen at rest despite making up only 2% of the body's weight. An average human brain today weighs 2.98 lbs. (1,352 grams), far exceeding the brains of chimpanzees, our nearest living relatives, at 0.85 lb. (384 grams).
Barkai and Ben-Dor's hypothesis hinges on the notion that human ancestors, starting with Homo habilis and peaking with Homo erectus, spent the early Pleistocene as expert carnivores, taking down the biggest, slowest prey that Africa had to offer. Megaherbivores, the researchers argue in a paper published March 5 in the journal Yearbook of Physical Anthropology, would have provided ample calories and nutrients with less effort than foraging plants or stalking smaller prey. Modern humans are better at digesting fat than other primates are, Barkai and Ben-Dor said, and humans' physiology, including stomach acidity and gut design, indicates adaptations for eating fatty meat.
In another paper, published Feb. 19 in the journal Quaternary, the researchers argue that human species' tools and lifestyle are consistent with a shift from large prey to small prey. In Barkai's fieldwork in Africa, for example, he has found Homo erectus sites strewn with elephant bones, which disappear at later sites dating to between 400,000 and 200,000 years ago. The human ancestors at those more recent sites seem to have been eating mostly fallow deer, Ben-Dor wrote in an email to Live Science.
Overall, megaherbivores weighing over 2,200 lbs. (1,000 kilograms) began to decline across Africa around 4.6 million years ago, with herbivores over 770 lbs. (350 kg) declining around 1 million years ago, the researchers wrote in their paper. It's not clear what caused this decline, but it could have been climate change, human hunting or a combination of the two. As the biggest, slowest, fattiest animals disappeared from the landscape, humans would have been forced to adapt by switching to smaller animals. This switch, the researchers argue, would have put evolutionary pressure on human brains to grow larger because hunting small animals would have been more complicated, given that smaller prey is harder to track and catch.
These growing brains would then explain many of the behavioral changes across the Pleistocene. Hunters of small, fleet prey may have needed to develop language and complex social structures to successfully communicate the location of prey and coordinate tracking it. Better control of fire would have allowed human ancestors to extract as many calories as possible from smaller animals, including grease and oil from their bones. Tool and weapon technology would have had to advance to allow hunters to bring down and dress small game, according to Barkai and Ben-Dor.
A fuzzy past
Single hypotheses for human brain evolution haven't held up well in the past, however, said Richard Potts, a paleoanthropologist and head of the Smithsonian's Human Origins Program in Washington, D.C., who wasn't involved in the research. And there are debates about many of the arguments in the new hypothesis. For example, Potts told Live Science, it's not clear whether early humans hunted megaherbivores at all. There are human cut marks on large-mammal bones at some sites, but no one knows whether the humans killed the animals or scavenged them.
The researchers also sometimes use arguments from one time period that might not apply to earlier times and places, Potts said. For example, the evidence suggests a preference for large prey by Neanderthals living in Europe 400,000 years ago, which would have served those human relatives well in winter, when plants were scarce. But the same thing might not have held true a few hundred thousand or a million years earlier in tropical Africa, Potts said.
And when it comes to brains, size isn't everything. Complicating the picture, brain shape also evolved over the Pleistocene, and some human relatives — such as Homo floresiensis, which lived in what is now Indonesia between 100,000 and 60,000 years ago — had small brains. Despite that, H. floresiensis hunted both small elephants and large rodents.
The period over which humans and their relatives experienced this brain expansion is poorly understood, with few fossil records to go on. For example, there are perhaps three or four sites firmly dated to between 300,000 and 400,000 years ago in Africa that are certainly related to humans and their ancestors, said John Hawks, a paleoanthropologist at the University of Wisconsin–Madison who was not involved in the research and was skeptical of its conclusions. The human family tree was complicated over the course of the Pleistocene, with many branches, and the growth in brain size wasn't linear. Nor were the declines in large animals, Hawks told Live Science.
"They've sketched out a picture in which the megaherbivores decline and the brains increase, and if you look at that through a telescope, it sort of looks true," Hawks told Live Science. "But actually, if you look at the details on either side, brain size was more complicated, megaherbivores were more complicated and it's not like we can draw a straightforward relationship between them."
The paper does, however, draw attention to the fact that human species may indeed have hunted large mammals during the Pleistocene, Hawks said. There is a natural bias in fossil sites against preserving large mammals, because human hunters or scavengers wouldn't have dragged an entire elephant back to camp; they would have sliced off packets of meat instead, leaving no evidence of the feast at their home sites for future paleontologists and archaeologists.
"I'm sure we're going to be talking more and more about what was the role of megaherbivores in human subsistence, and were they important to us becoming human?" Hawks said.
Originally published on Live Science.
Stephanie Pappas is a contributing writer for Live Science, covering topics ranging from geoscience to archaeology to the human brain and behavior. She was previously a senior writer for Live Science but is now a freelancer based in Denver, Colorado, and regularly contributes to Scientific American and The Monitor, the monthly magazine of the American Psychological Association. Stephanie received a bachelor's degree in psychology from the University of South Carolina and a graduate certificate in science communication from the University of California, Santa Cruz.