Did humans evolve to eat raw, aged, or cooked meat? I tried to test digestion
That's how I ended up feeding raw meat to undergrads
Anthropologists have long debated the importance of meat-eating and cooking in human evolution. One prominent Harvard primatologist, Dr. Richard Wrangham, proposed in 1999 that cooking is the most important adaptation that allowed humans to evolve into who we are today – but what if we didn't need to cook meat to make it more easily digestible? What if simply letting it break down naturally, as it starts to do almost immediately after an animal dies, made it similarly digestible?
As part of my PhD coursework in evolutionary anthropology, I decided to find out. My advisor, Dr. Robert Scott, and I put up posters around Rutgers University to advertise an experiment to test the importance of cooking to the evolution of the human diet. "WANTED: A multidisciplinary Rutgers research team is looking for study participants to eat experimental meals and have their metabolic rates during digestion measured. WARNING! One of those meals is raw beef!!!"
The raw beef in question was one-inch cubes of eye of round roast, meticulously prepared to minimize any risk of food-borne illness (per a Rutgers Institutional Review Board-approved protocol). Our goal: Find some undergrads willing to chow down for science. Hungry yet?
The debate in paleoanthropology about whether early hominins were hunters or scavengers can be mashed up with Wrangham's hypothesis that cooking (especially meat) was an essential adaptation – the key step toward becoming our current small-jawed, big-brained selves. This argument boils down to energy: how do you get the most with the least effort? While hunting has its perks – access to fresher, more desirable meat – it comes with the potential costs of failure and injury. Scavenging means doing less work and therefore expending less energy, but it also means waiting until the predator that did your dirty work is full and leaves the carcass.
Cooking potentially fits well with either a hunting or a scavenging strategy. It has a number of benefits: it breaks down plant toxins, kills surface pathogens, and breaks down the muscle fibers of meat (making them more easily digestible, which means using less energy). But the biggest advantage of proactively breaking down the muscle fibers by cooking, rather than by scavenging meat that has already started to break down naturally, is the killing of any nasty bacteria that might have gotten to the meat first.
Anthropologists aren't the only ones obsessed with how we ate in the past; as the popularity of the paleo diet shows, popular culture is also fascinated with the question of what we "should" be eating. While some argue humans have evolved to eat significant amounts of meat, followers of the raw food movement (as their name suggests) believe that humans' most natural diet is food that has not been cooked to any significant degree.
Bite off an angle you can chew
So I set out to try to figure out what our hominin ancestors might have gained from moving toward Top Chef. Did cooking deliver an evolutionarily beneficial easy-to-digest diet, or would just scavenging less-than-fresh zebra have been equally digestible? I hypothesized that letting meat age would actually have the same effects on the muscle fibers of the meat as cooking – after all, people still age meat for a reason.
A deep dive into the world of academic science journals provided research to back up my hunch. Previous research suggests that both aging and cooking break down muscle fibers and weaken collagen – a major structural protein in tissue – making meat more tender and easier to eat. My literature review suggested that early hominins would have likely used a similar amount of energy chewing and digesting scavenged meat as they did with cooked meat (both of which required less energy than digesting raw meat).
Before we rounded up willing test subjects, my advisor and I began our research by testing the material properties of the meat itself. This meant hooking scissors up to a specialized testing machine and cutting through slices of meat to measure how tough it was. Cutting across the grain of the meat was tougher than cutting with the grain, since it meant cutting through individual muscle fibers. This difference was even larger when comparing cooked meat to raw meat: Surprisingly, the cooked meat was tougher to cut across the grain. These findings contradict the conventional idea that raw meat is harder to chew than cooked meat – it turns out that what might really matter is exactly how large a bite one takes, and at what angle one chews.
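For readers curious how a scissors test like this turns into a toughness number, here is a minimal sketch. It assumes the standard approach for such tests: toughness is the work done by the blades (the area under a force-displacement curve) divided by the area of new surface the cut creates. The function name and all the numbers below are invented for illustration; they are not data from our study.

```python
# Hypothetical sketch of computing cutting toughness from a scissors test.
# Toughness = work done to make the cut / area of new surface created.
# All numeric values below are made up for illustration only.

def cutting_toughness(forces_n, displacements_m, cut_area_m2):
    """Approximate the work under the force-displacement curve with the
    trapezoidal rule, then normalize by the cut cross-section (J/m^2)."""
    work_j = 0.0
    for i in range(1, len(forces_n)):
        dx = displacements_m[i] - displacements_m[i - 1]
        work_j += 0.5 * (forces_n[i] + forces_n[i - 1]) * dx
    return work_j / cut_area_m2

# Invented readings: cutting across the grain takes more force.
disp = [0.000, 0.002, 0.004, 0.006]    # blade travel (m)
across_grain = [0.0, 8.0, 12.0, 6.0]   # force (N)
with_grain = [0.0, 4.0, 6.0, 3.0]      # force (N)
cut_area = 0.002 * 0.01                # 2 mm thick x 10 mm long cut (m^2)

print(cutting_toughness(across_grain, disp, cut_area))  # higher toughness
print(cutting_toughness(with_grain, disp, cut_area))    # lower toughness
```

Under this framing, "tougher across the grain" simply means more work per unit of cut surface, which is why bite size and chewing angle matter so much.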
When the time came to test this idea out in reality, we thankfully recruited some undergrads with a taste for flesh. They were given the opportunity to eat both raw and cooked roast, and we timed how long it took them to finish their experimental meal. Unsurprisingly, we found that the subjects ate faster when the meat was cooked. We weren't able to determine, though, whether this was because the cooked meat was tastier, or whether they weren't able to chew the raw meat at just the right angle to break it down efficiently. Unfortunately, we never made it to feeding the undergrads aged meat.
Since we did this experiment, a great deal more research has been done on early hominin adaptations to meat eating, with perhaps the most convincing study suggesting that slicing can cut down the chewing time and energy needed to eat raw meat, and that tools capable of this work show up in the archaeological record long before the evidence of hearths. As with most experimental work in paleoanthropology, our study added a piece to the puzzle that is our evolutionary past by testing hypotheses rather than telling stories.