With my phone handy, I feel pretty smart. I’m able to follow my motto of “always be prepared.” What that boils down to, more or less, is that for various endeavors I whip up a checklist. Given how quick and easy one is to create, I don’t think I’m especially orderly or meticulous; I just don’t want to forget any groceries, or to find myself overnighting deep in the wilderness without any matches.
Such cleverness – and preparedness – you could say, owes largely to my exceptionally large brain. No, I don’t think I’m particularly smart either; I’m referring to the human brain in general. Our brain-to-body size ratio, a measure of relative brain size known as encephalization, is roughly three times larger than expected for a primate of our body size. Incredibly, there are an estimated 86 billion neurons, on average, in our noggins. Such encephalization is thought to be a clear hallmark of exceptional human brain evolution and intelligence. Indeed, the cleverness of making lists has been around for millennia, long before the conception of any smart device.
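For the numerically inclined, here’s a minimal sketch of what “larger than expected” means, using Jerison’s classic allometric baseline for mammals. The round-number body and brain masses are my own assumptions, not figures from the studies behind this essay, and the primate-baseline note at the end is an approximation.

```python
# A rough illustration of encephalization (relative brain size).
# Jerison's classic baseline for mammals: expected brain mass ≈ 0.12 * body_mass**(2/3),
# with both masses in grams. The masses below are round-number assumptions.

def expected_brain_mass_g(body_mass_g: float) -> float:
    return 0.12 * body_mass_g ** (2 / 3)

human_brain_g = 1_350.0   # assumed average human brain mass
human_body_g = 65_000.0   # assumed average human body mass (65 kg)

eq = human_brain_g / expected_brain_mass_g(human_body_g)
print(f"Encephalization quotient against the mammal baseline: ~{eq:.1f}")  # roughly 7

# Against a primate-only baseline (a steeper curve), the same kind of comparison
# comes out closer to 3, which is the "three times larger than expected" figure above.
```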
But since brains decay at a rate similar to any other kind of meat, it’s impossible to directly examine the neural intricacies of our ancestors. We can’t obtain any estimates of the number of neurons or neural connections they had. All there is to go on are measurements of the empty space in fossilized skulls, as an index of cranial capacity. And sure enough, the fossil record reveals an extensive pattern of cranial growth over millions of years of human evolution. From the time of the Miocene hominids around 10 million years ago, our mammalian heads expanded towards their current, disproportionately bulky size. Since we last shared a common ancestor with chimpanzees, our brain size has roughly quadrupled.
Yet the apex of that growth is not found in present-day humans. To the chagrin of our typical anthropocentric arrogance, that title belongs to the Cro-Magnons and Neanderthals who lived about 40,000 years ago. Studies of cranial capacity dating back to the 1970s, based on hundreds of skulls, clearly and consistently reveal a decrease in size since then. And not just any decrease. The physical amount of brain lost equates roughly to a chunk the size of a tennis ball. That amounts to a staggering 10 million neurons each passing decade. The rate of decrease is thirty-seven times greater than the rate of expansion that got us there, and it seems to be accelerating.
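To get a feel for where a per-decade figure like that can come from, here’s a back-of-envelope sketch. The volumes and, above all, the assumed timespan are illustrative choices of mine rather than numbers from any particular study, and the per-decade result scales inversely with whatever timespan you plug in.

```python
# Back-of-envelope: neurons lost per decade, under loudly labelled assumptions.
TOTAL_NEURONS = 86e9          # rough estimate of neurons in a modern human brain
BRAIN_VOLUME_CC = 1_350       # rough modern cranial capacity, in cubic centimetres
LOST_VOLUME_CC = 150          # "a chunk the size of a tennis ball", very roughly
YEARS_OF_SHRINKAGE = 10_000   # free parameter; the answer scales inversely with it

# Assume neurons are lost in proportion to the volume lost.
neurons_lost = TOTAL_NEURONS * (LOST_VOLUME_CC / BRAIN_VOLUME_CC)
decades = YEARS_OF_SHRINKAGE / 10

print(f"~{neurons_lost / decades:,.0f} neurons lost per decade")  # ≈ 9.6 million
```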
How is that possible? In the last 40,000 years or so, our species has achieved some of its greatest feats of intelligence. The last 10,000 years alone have witnessed the invention of agriculture, writing, and the transistor, to name a few biggies. How can we reconcile these achievements with such significantly decreased – and decreasing – brain size?
Any such attempt should first be informed by an explanation for the shrinkage, and there are various theories to choose from. Among anthropologists, the reduction in cranial capacity is well known but hardly understood, and it’s treated as something of a pariah topic. Relevant studies I’ve come across often state something like “the decline in cranial capacity is well-known but often ignored or dismissed.” The shrinkage is the elephant in the room.
To start, the simplest explanation argues that it is no more than a component of an overall decrease in body size brought on by a warmer global climate, beginning with the onset of the current Holocene epoch. Humans that lived through colder climates, like ice ages, were bulkier, a trait that afforded better heat retention. With the gradual onset of warmer temperatures, however, bulkiness was likely selected against. Bigger humans, although they stayed warmer, required more food and were less likely to survive shortages. Since the brain consumes more energy than any other human organ, it was particularly costly, and so larger ones were selected out.
Over a broader timespan, however, this explanation doesn’t hold up well. Other warming episodes over the past 2 million years don’t coincide with shrinkages in cranial capacity; instead they show mostly continuous growth. The present decrease is also too large, relative to the decrease in body mass over the same period, to be explained by an overall decline in human body size.
A warmer climate doubtless made survival easier. There were more resources, which, overall, made for better hunting, easier cooking, improved shelters, and so on.
With a greater ability to thrive, reductions in mortality rates meant living in increasingly large groups where families and friends cared for and helped each other throughout their lifespans. The relationship between resources and cooperation can still be seen in the wild. When food is scarce, for example, chimps tend to be more aggressive and hierarchical. Other species, like bonobos, which typically live in lusher, food-abundant environments, tend to be more peaceful.
It’s likely that resource availability had a huge impact on human advancement – and brain size – an impact that came mostly from the creation of collective, group knowledge rather than from the accomplishments of innovative, brainy individuals.
In fact, domesticated animals provide a good model for evolutionary changes toward communal living. Domestication favors traits that can maximize cooperation while minimizing aggression. Despite our atrocities, it’s likely no coincidence that we are exceptionally friendly. There’s consensus that “niceness” is a defining feature of ours.
In an experiment with Siberian foxes, researchers showed that aggressive traits can be selected out after just four generations. The resulting pups wag their tails and have floppy ears, much like many domesticated dogs. In fact, the traits observed to replace aggressive behaviors are those that seem to elicit care from others. Basically, these changes result in phenotypes with prolonged juvenile features, a process called neoteny.
Over time, living in ever larger groups, humans might have self-domesticated. Exceptionally cooperative features in humans could be the outcome of natural selection in resource-rich environments, in what’s been termed “survival of the friendliest.” The idea goes back at least to Charles Darwin, who pondered it. Going with Occam’s razor, genetic changes producing prolonged, care-eliciting juvenile features are relatively simple compared with, say, new mutations that favor increasingly communal living, and could plausibly arise in the span of 40,000 years.
There’s plenty of evidence to suggest that domestication plays a significant role in our shrinking brains. In fact, there’s a consistent pattern of cranial shrinkage in present-day domesticated animals: in each of thirty examined species, there’s been a clear reduction in cranial size relative to their wild counterparts. Furthermore, domestication directly affects the morphology of brain regions related to aggression, and it reduces levels of hormones that can influence both aggression and brain size.
As our communities grew and we became friendlier, history shows that we advanced. These larger-group conditions led to more shared knowledge and ultimately to the evolution of culture.
For example, historical evidence from the time of early European contact in Oceania shows that tool complexity, which advances slowly, was highest on the islands with the most people. Even today, technical innovations are most likely to happen in cities, among groups of people who’ve accumulated cutting-edge knowledge. This pattern suggests that increases in human intelligence and innovation owe less to the genetic traits of high-IQ individuals than to the emergence of larger communities holding more collective knowledge than the smaller communities that preceded them.
Essentially, increases in shared knowledge arise from the offloading of information from individual brains onto collective spheres.
So, despite my feeling “smart,” perhaps it’s a valid question whether smart technology – googleability – is actually making us dumber. Moreover, there’s a broader picture here, because humans have been externalizing information for much longer: arguably since around 150,000 years ago, the age of the oldest known beads, which could communicate information about identity, kinship, and status.
In terms of my ritual preparedness, technically we humans haven’t needed to remember anything that could be put on a list since the invention of writing, around 3200 BCE. Writing was such a huge innovation because it permitted not just the recording and storage of information, but indefinite access to the accumulation of wisdom. Life is easier when you don’t have to remember every detail. But does such extensive externalization, coinciding as it does with shrinking brains, mean we’re actually getting dumber?
The connection between brain size and intelligence is no simple matter. If there were a direct relationship, whales and elephants would be geniuses beyond comprehension, and Albert Einstein would likely have required a much larger-than-average brain to come up with E = mc². Yet numerous studies provide evidence for a robust, moderate correlation, or at the very least a weak one. So it seems the connection between cranial capacity and intelligence is considerable, not dismissible.
The belief that information technology, be it writing, television or Google, is the root of an intelligence drain is nothing new. It dates back at least to Socrates, who could not have stated this foreboding conclusion about writing any more clearly:
If men learn this, it will implant forgetfulness in their souls; they will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks.
We can’t know exactly how our intelligence compares with that of the big-brained Cro-Magnons who lived before the invention of writing systems. Yet some anthropologists believe that externalized knowledge and decreasing cranial capacity inevitably point towards a dumbing down of the human race.
Early European modern humans faced the same basic survival pressures as we do, yet they didn’t have an accumulating wealth of communal knowledge at their fingertips. Ultimately, they lived on what they could remember, largely as individuals. Their survival depended on deep, recallable knowledge of everything from complex tool-making methods and seasonal foraging and hunting practices to dynamic social information about family, friends, and foes. There’s no hard evidence to provide a definitive answer, but perhaps we need to give our ancestors more credit.
Now, the consequences of externalizing information for our intelligence couldn’t be more relevant. Some recent correlational investigations involving handheld devices provide a glimpse into the nature of the relationship. They show that when heavy digital-media users were put to various cognitive tasks, sure enough, they came up short. Compared with infrequent users, they had more lapses in attention while trying to think and reason, forgot information more often, and had a harder time switching between tasks.
But there’s no way to prove that device use actually caused these people to perform worse; the evidence is correlational, not causal. Conversely, it could very well be that our ability to get the information we want, when we want it, actually frees up cognitive resources, allowing us to solve bigger, more important problems. I know that when I go hiking in the mountains that surround my home, if I can monitor my location on a digital map in real time instead of having to work it out with a compass, it’s much easier to focus on the features of the landscape and to form and revise strategies for getting where I want to go.
The information our brains process, consciously or not, is likely to be motivated by present needs and goals, a characterization known as motivated cognition. Perhaps over time, such situational relevance in our thinking also shapes intelligence. Brains more exposed to changing environments and varying resources need to process new types of information to accomplish goals and meet needs. They exercise what’s called cognitive flexibility, a potentially fundamental component of intelligence. When such exercise happens at a younger age, it influences academic achievement and creativity in adulthood. It seems that as our brains mature, increasing exposure to situations that demand cognitive flexibility can lead to lasting gains in cognitive ability, even in our shrunken brains.
If you’re a cognitive acrobat, able to configure solutions in all sorts of different contexts, that ability could matter as much as, or more than, simply having a higher IQ. It could be that the externalization of knowledge into conveniently accessible forms supplements cognitive flexibility, allowing us to be even greater acrobats than our big-brained ancestors.
In any case, I’m still going to thrive on making lists.