A small patch of the brain’s surface is a battleground in a centuries-old debate: do we get our cognitive abilities from our inherited “nature” or from our environment’s “nurture”? Twenty years ago, using the then-new technique of functional magnetic resonance imaging (fMRI), Nancy Kanwisher and colleagues found that a small region of cerebral cortex, the layer of the brain that evolved most recently and is thought to perform our highest mental functions, was most active when its human owner looked at faces.
Since then, a torrent of research has located other “face patches” – parts of the brain involved in seeing faces – in humans and monkeys, and shown that they hold the information needed to recognize individuals. Yet these findings have not explained how we end up with these remarkably specialized neural structures in the first place. Are face patches an evolutionary adaptation, hardwired into our brain by genetics because faces are so important for primate social behavior? Or do they emerge only because we look intensely at faces during childhood, when our brains are most plastic? A new study suggests that, to the developing brain, faces are no more special than entirely unnatural stimuli; we just spend much of our childhood looking at them.
Although we have known for decades that sensory signals shape the developing brain, this work suggests that even our “expert” abilities – like recognizing faces – are unlikely to emerge on their own. Evolution has left us as a species that can learn from its environment, but perhaps not one that “knows,” innately, how to solve any particular problem (unlike species with elaborate instincts). If this model is accurate, our core visual abilities would be reflections of our early experience rather than fixed outcomes of our genetic inheritance.
Michael Arcaro and colleagues at Harvard Medical School raised three macaque monkeys in an environment where they could not see their own reflections, other monkey faces, or the faces of the experimenters (which were always covered by welding masks). Importantly, these monkeys could hear and smell other monkeys, and they had free access to toys and soft fabric “surrogate” mothers; as psychologist Harry Harlow found in the 1950s, monkeys raised with these rich non-facial stimuli were “lively, curious, exploratory, and interactive” and able to integrate normally into juvenile social groups after the end of the experiment. Keeping the monkeys happy was essential, because the researchers wanted to understand the effects of a subtle change in sensory experience without the confounding influences of mood and social stimulation.
The face-deprived monkeys and control monkeys were scanned by fMRI at six months of age to measure their neural responses to faces and other visual stimuli. By that age, control monkeys had face patches; the face-deprived monkeys did not. Patches for other visual categories that both sets of monkeys saw equally, such as hands and bodies, were roughly equivalent between the two groups.
These findings suggest that looking at faces is necessary for the development of face patches; in general, each region of “expertise” probably develops through extensive experience with that type of visual stimulus. In an earlier study, the same research group trained young monkeys to recognize Arabic numerals, Tetris pieces, and other abstract symbols – not things monkeys evolved to see. Yet this experience was enough to produce “symbol patches” in fMRI scans of their brains.
The researchers knew, though, that this finding needed to be reconciled with evidence that face patches and face-looking behavior are innate. Specifically, if face patch development depends on seeing faces, why do the patches normally appear in roughly the same spots across different people and monkeys? There must be some hardwired organization that primes patches to develop in certain places.
In this and a simultaneous study, the authors propose just such an organization. As early in development as they looked, each monkey cortex contained a spatial map of the visual world: some regions of cortex consistently responded to stimuli presented nearer to the visual periphery, whereas others responded to stimuli seen in the center of the monkeys’ field of view. This center-preferring region included the parts of visual cortex that, in control monkeys, eventually became face patches. The authors therefore argue that the early experience of looking directly at faces is what causes the face patches to develop in their typical positions. It makes sense that this underlying structure is programmed by genes: we’ve found maps of where stimuli occur in space, like the “homunculus” map of the body, for many sensory modalities and in many mammalian species.
This spatial organization of the visual cortex is probably not the full extent of hardwiring, though. Faces are not the only objects primates look at directly, and other forms of organization may be lurking beneath the detection threshold of fMRI in the actual firing patterns of cortical neurons. Still, this study sets a clear upper limit on how hardwired the face patches can be.
Answers to the most interesting questions are still to come. The face-deprived monkeys in this study did not prefer to look at faces over other objects, again countering the belief that face-related behaviors are innate. It is not yet known, though, whether these monkeys are any worse than normal at recognizing individual faces or reading social cues. If they are worse – perhaps resembling congenitally face-blind people – it will be interesting to see if they are better at getting information out of other visual stimuli, like hands, or if they are more attuned to other sensory modalities.
On the other hand, if these monkeys see faces normally, a new hunt would be on: if face patches aren’t associated with facial recognition, what behaviors do we need them for? And where should neuroscientists look to understand how we read each other’s faces even without early experience?
As with most nature-nurture debates, the sides will probably meet in the middle. Primates have a remarkable ability to shape their abilities and behavior according to their environment, but evolution has drawn sophisticated blueprints to get us started.
We asked other neuroscientists to respond with some commentary to this article. In a very small way, this is how peer review works in scientific journals. We wanted to give you a taste of what scientific discussion looks like! If you want to know more, feel free to contact the scientists directly via Twitter.
Maya Emmons-Bell: Even though the face-deprived group interacted with other monkeys post-experiment, I wonder if the fine details of their social behaviors were altered in any way. Maybe "face patches" aren't required for perceiving faces, but instead for prioritizing facial perception or some higher-level integration of the image into past experience or a social context?
Bear responds: My hunch is that you’ll be right about that. The face patch neurons may hold the most information about what faces look like, but there’s plenty of evidence that other neurons in visual cortex could be used for facial recognition, too. The only behavior they really checked out in this paper (more will come) is preference for looking at faces, which seems like it would have everything to do with those higher-level tasks or contexts you’re bringing up. We know so little about the brain that, outside of a few areas, most guesses about certain regions being “required for X” end up being wrong.
Danbee Kim: I’m really glad that the research team found keeping the monkeys happy to be essential, because that implies an understanding that the natural behavior is the baseline, and taking away things essential to developing those natural behaviors counts as a confound to the experiment.
Bear responds: I think the researchers would agree with you – and this study raises important, general questions about what we should even consider a “natural” behavior! Is it something that brains have hardwired circuits for doing? Or is it something that would develop in the wild because a certain type of experience is almost inevitable there? There’s also the in-between situation, where some behaviors are learned more easily than others (what Konrad Lorenz called the “innate school-marm”).