In spite of the logic above, I've always thought this was at least a little wrong. Machines have more- and less-optimal operating speeds. I've heard that cars are most efficient at 55 mph*...why would we expect a nature-built machine to be any different? Four-legged animals also have more- and less-efficient paces, and again, we shouldn't expect humans to be different.
*not that it stops me from speeding, because efficiency isn't always key when I want to get the hell to my destination.
This quasi-informed hunch has been vindicated by several studies over the last few years. Here's what we now know: faster is costlier. Yes, just as you might expect, you are less efficient at harder efforts. This is probably true for most endurance/aerobic physical activities--it's not just due to the less-optimal biomechanics that come into play at faster running paces. By "efficient" I mean "cost in calories to run a given distance", also known as "cost of transport" or "cost of running", used interchangeably with "running economy". Let's go with CoT, for "cost of transport". Let's delve into the research.
1. This 2009 paper by a biologist who studies locomotion in animals and humans was, I think, the first to offer good evidence that cost of running varies with pace. In fact, she found a "curvilinear" association between pace and CoT: each study subject was less efficient at slow and fast paces, and had a sweet spot pace. The graphs below show the CoT curves for 6 of the 9 study subjects.
[Figure: CoT curves by pace; from Steudel-Numbers and Wall-Sheffler, 2009.]
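To make the "curvilinear" idea concrete, here's a minimal sketch treating CoT as a parabola in speed and solving for the sweet spot. The coefficients are made up for illustration--they are not from Steudel-Numbers and Wall-Sheffler's data; the point is only that a U-shaped curve has one most-efficient pace at its vertex.

```python
# Hypothetical quadratic model of cost of transport (CoT) vs. running speed.
# Coefficients are illustrative, not fitted to the 2009 paper's subjects.

def cot(speed_ms, a=0.12, b=-0.72, c=4.3):
    """CoT (J per kg per m) at a given speed (m/s), modeled as a parabola."""
    return a * speed_ms**2 + b * speed_ms + c

def sweet_spot(a=0.12, b=-0.72):
    """Vertex of the parabola: the speed where CoT is minimized."""
    return -b / (2 * a)

v_opt = sweet_spot()   # about 3.0 m/s with these illustrative coefficients
for v in (2.0, v_opt, 4.0):
    print(f"CoT at {v:.1f} m/s: {cot(v):.2f} J/kg/m")
# Both slower and faster than the sweet spot cost more per meter.
```

Each subject in the study had their own such curve--their own vertex--which is exactly what "a sweet spot pace" means.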
For me this too makes intuitive sense. Slower paces should also be less efficient--for one, because efficiency in running largely comes from elastic energy storage (and subsequent return to forward motion) in muscle and tendon; very slow paces, and indeed walking (and uphill running! the subject of another post), benefit very little from this elastic-energy-return effect. For me, the problem with this study was in its ultimate conclusion. As this was published in the Journal of Human Evolution, it had an evolutionary argument to make. In brief: persistence hunting, in which human hunters run prey into heat stroke, works in part because humans can choose from a variety of paces, without penalty, while prey animals are constrained to a select few paces. The authors interpreted their results to mean that early humans probably did not engage in persistence hunting because we cannot, in fact, select any pace we choose with equal efficiency. This makes persistence hunting slightly less efficient than we had thought, they argue. Well, the energy difference between paces is quite minimal, and on the caloric scale we're talking about for a persistence hunt (1-2 thousand calories, ballpark), the extra caloric cost of running less-optimal paces during a hunt is a drop in the bucket*. Here, I think, the authors took really cool data and forced an evolutionary significance onto it that didn't fit.
*If persistence hunting worked at all, it was because the reward--a 100,000-calorie animal--far outweighed the cost. And anyway, persistence hunting, if it happened (and like modern hunting techniques used by hunter-gatherers today), probably failed quite often, and was really a high-risk venture made possible by the more reliable foraging efforts of others (women?), serving in part as a display of vigor.
2. One or two other studies found a similar result--that running faster costs a bit more. Shaw et al. (2014) asked a related question (and definitively answered ours as well): can running economy, or CoT, be accurately assessed by measuring oxygen consumption, as many studies have traditionally done? The thinking behind this particular dogma is that any oxygen you take in will be used in the oxidation of carbohydrates and fats, and it's all used--every drop. This part is true. But as studies using oxygen consumption to get at CoT routinely found no difference in CoT based on pace, clearly something was missing.
The enormous graph above shows the CoT at 4 different paces, slower to faster (left to right). Over a hundred trained runners were tested, and these graphs show the range (the bars) and the average (the middle point on each line). The top graph shows the actual energy cost of running, which they calculated by analyzing CO2 exhaled during running. From this they calculated the proportion of carbohydrate vs. fat that was used as fuel, and then worked backwards to figure out the actual energy cost. Cool. See how it goes up with increasing pace? Faster is more expensive. The middle graph shows oxygen consumption. No change with increasing pace, just like every other study using oxygen consumption as a proxy for energy consumption has found. Why wasn't oxygen use going up, given that cost of transport was?
Look at the bottom graph, which shows "RER", the "respiratory exchange ratio". It's a measure of how much CO2 is exhaled relative to the oxygen consumed. This ratio changes based on the proportion of carbohydrates to fats that are being used to power the muscles. As exercise intensity increases, a greater proportion of carbohydrates is used, and relatively less fat is used. (This is a known thing, and it happens because, during hard exercise, energy is needed faster, and carbohydrates oxidize faster than fats.) An RER of 1.0 means that all muscle power is coming from carbohydrate oxidation...and at the harder paces, at the far right of the graph, runners' RER approached 1.
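A quick sketch of how RER maps onto fuel mix, using the standard non-protein RER endpoints (~0.70 for pure fat, 1.00 for pure carbohydrate) as a rough linear interpolation--this is a textbook approximation, not Shaw et al.'s exact method:

```python
def rer(vco2_l_min, vo2_l_min):
    """Respiratory exchange ratio: CO2 produced divided by O2 consumed."""
    return vco2_l_min / vo2_l_min

def carb_fraction(r):
    """Rough fraction of energy from carbohydrate, interpolating between
    the standard non-protein RER endpoints: ~0.70 (all fat) to 1.00 (all carbs)."""
    return min(max((r - 0.70) / (1.00 - 0.70), 0.0), 1.0)

# An easy jog might sit near RER 0.85; a hard run approaches 1.0:
for r in (0.85, 1.00):
    print(f"RER {r:.2f}: ~{carb_fraction(r):.0%} of energy from carbohydrate")
```

So an RER drifting toward 1.0 at the far right of the graph is the runners' muscles going nearly all-in on carbohydrate.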
One more piece here. While fats are a great energy source--they actually net a greater energy return than carbohydrates, and they can be stored in excess all over the body, making them great long-term fuel--they require more oxygen for their breakdown into water and CO2 (this is called oxidation). Why is this? On the right is a simple carbohydrate called glucose, and at left is a fat molecule.
The black atoms are carbon. The carbohydrate, glucose, has a 1:1 ratio of carbon to oxygen, the red atoms. Basically this means that, to break down glucose and turn it into water and CO2, you only need one more oxygen per carbon. Boom, fast energy! The fat, however, has way more carbons than oxygens, and each of those carbons needs two oxygens--from the air you inhale--in order to be broken down for energy.
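You can check the picture's point with combustion stoichiometry. A small sketch--I'm using palmitic acid as a stand-in for "a fat molecule," since the post doesn't name the one pictured:

```python
def o2_needed(c, h, o):
    """Moles of O2 required to fully oxidize one mole of a CxHyOz molecule
    to CO2 and H2O. Balancing the combustion equation gives C + H/4 - O/2."""
    return c + h / 4 - o / 2

def fuel_rer(c, h, o):
    """RER when burning this fuel alone: moles of CO2 out (one per carbon)
    per mole of O2 in."""
    return c / o2_needed(c, h, o)

# Glucose, C6H12O6: six O2 for six carbons -- one O2 per carbon, RER = 1.0
print(o2_needed(6, 12, 6), fuel_rer(6, 12, 6))
# Palmitic acid, C16H32O2: 23 O2 for 16 carbons -- much more O2 per carbon
print(o2_needed(16, 32, 2), round(fuel_rer(16, 32, 2), 2))
```

That ~0.70 RER for pure fat oxidation vs. 1.0 for pure glucose is exactly where the endpoints of the RER scale come from.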
Back to the study. At faster running paces, carbohydrates become the predominant fuel source for muscles. Because carbohydrate breakdown requires less oxygen than fat breakdown, the shifting carb/fat ratio with increasingly hard exercise produces an unchanging oxygen consumption per mile, masking the fact that energy consumption is increasing. Here, then, is why dogma told us for decades that it costs the same to cover a mile fast or slow.
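One way to see the masking effect numerically is the Weir equation, a standard formula (Weir, 1949) that estimates energy expenditure from both the O2 consumed and the CO2 produced. The numbers below are illustrative, not Shaw et al.'s data: hold O2 per mile fixed and let RER climb.

```python
def kcal_weir(vo2_l, vco2_l):
    """Weir equation: energy expenditure (kcal) estimated from liters of
    O2 consumed and CO2 produced."""
    return 3.941 * vo2_l + 1.106 * vco2_l

# Illustrative scenario: O2 cost per mile stays flat at 20 L while the fuel
# mix shifts toward carbohydrate, i.e. RER rises with pace.
vo2_per_mile = 20.0
for rer in (0.85, 0.90, 0.95, 1.00):
    vco2 = rer * vo2_per_mile          # more CO2 out for the same O2 in
    print(f"RER {rer:.2f}: {kcal_weir(vo2_per_mile, vco2):.1f} kcal/mile")
# Oxygen per mile never moved, yet the energy cost crept up a few percent:
# an O2-only proxy reports "no change" while the kilocalories quietly rise.
```

Note how small the spread is--a few percent across the whole RER range--which also previews the next point: yes, paces differ in cost, but not by much.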
So... you are right to feel that different paces are more or less efficient! (Noting, please, that energy differences between paces are tiny.) Does any of this have any bearing on evolutionary questions? I'd say likely not. Running is enormously costly, regardless of pace, and for early humans living on the margins of energy balance, such a behavior would have needed a damn good purpose, just as all animals who move a lot have a good reason for it. I have some ideas about that...