Posted by Dave on April 10, 2014 | 1 Comment
Conventional wisdom is that you can get your best time in a race by starting off a little slower than your goal pace: “Negative splits” are the ideal — times for each mile run should decrease over the course of the race.
So, for example, if you were running a 10k and had a goal of finishing the 6.2 miles in 62 minutes, then you should not, according to conventional wisdom, start out by running your planned average pace of 10 minutes per mile. You should start a little slower, maybe 10:15 per mile, and make up for it by running faster at the end.
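The arithmetic behind that example can be sketched in a few lines of Python. The function and variable names here are my own, for illustration; the point is just that a 10:15-per-mile first half forces a 9:45-per-mile second half to still average 10:00.

```python
def pace_per_mile(total_seconds, miles):
    """Average pace in seconds per mile."""
    return total_seconds / miles

def fmt(sec):
    """Format a pace in seconds as m:ss."""
    m, s = divmod(round(sec), 60)
    return f"{m}:{s:02d}"

goal = pace_per_mile(62 * 60, 6.2)           # 600 s = 10:00 per mile
# Negative-split plan: run the first 3.1 miles at 10:15/mile, so the
# second 3.1 miles must make up the deficit to average 10:00/mile.
first_half = 3.1 * (10 * 60 + 15)            # seconds spent on miles 1-3.1
second_half_pace = (62 * 60 - first_half) / 3.1
print(fmt(goal), fmt(second_half_pace))      # 10:00 9:45
```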
But why? I’ve seen several web pages make the assertion that “every world record from the 1500 meters to the marathon has been set running negative splits” or something similar. But rarely do such assertions come backed by hard evidence. So I was interested to see this paper, which seems to turn that notion on its head:
In fact, most world records at distances above 200 meters have been set with positive splits, not negative splits. The article goes on to argue that the best strategy for 400- and 800-meter races is a positive split, where the finish is slower than the start.
But typical runners are not setting out to break a world record. They just want to do the best they can, perhaps setting a personal record in a race. They visit sites like McMillanRunning.com, which can predict, say, a 10K time from a previous 5K result. If I enter my personal-best 5K time of 17:49, it spits out a projected 10K time of 37:00. But it doesn't offer any strategy for achieving that time other than noting that it works out to an average pace of 5:57 per mile. Supposing I feel like I have a shot at that time, should I start out a little slower, a little faster, or right on pace?
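McMillan's internal model isn't published here, but Pete Riegel's well-known endurance formula, T2 = T1 × (D2/D1)^1.06, is a common way to make this kind of projection and lands close to the same number. A minimal sketch, assuming Riegel's standard exponent of 1.06:

```python
def riegel_predict(t1_seconds, d1, d2, exponent=1.06):
    """Riegel's formula: predicted time T2 = T1 * (D2/D1)**exponent."""
    return t1_seconds * (d2 / d1) ** exponent

t_5k = 17 * 60 + 49                  # 17:49 = 1069 seconds
t_10k = riegel_predict(t_5k, 5.0, 10.0)
m, s = divmod(round(t_10k), 60)
print(f"Predicted 10K: {m}:{s:02d}")  # about 37:09
```

That's within ten seconds of McMillan's 37:00 projection, so the two models roughly agree; neither says anything about how to pace the race.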
I can see that most men's world records at 10,000 meters were set with positive splits, suggesting I should start out a little faster. But is that realistic for a recreational runner?
I decided to take a look at some real-world results and see what runners like me do. The Ukrop's Monument Avenue 10K in Richmond, Virginia, was run a couple of weeks ago and has a large field of runners. I decided to look at the second page of men's results (since the race leaders may have been running a strategic race rather than going for an all-out best time). Here's what I found:
The men in this group ran the race in an average time of 36:31, and their pace over the first 5K was 3 seconds faster than over the second 5K. Twenty-six of the 44 runners had positive splits, while only 18 had negative splits. Maybe there is something to the idea that a positive split is better for a 10K after all.
So I decided to look at the women’s results as well:
The pattern for women is similar, only more pronounced: the women in this sub-elite group ran the second half 23 seconds slower than the first half of the race.
So perhaps starting out fast is the best way to finish fast. But there might be some problems with this data. What if the runners who started fast were actually capable of going even faster, but made an error and paid for it with a slow finish?
To test for this possibility, I eliminated runners whose splits were positive or negative by more than 3 percent. When I did, the numbers evened out a bit: 19 men had positive splits versus 14 with negative splits, and 13 women had positive splits versus 14 with negative splits. Overall, more runners still had positive splits, but the margin was much closer.
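That filtering step can be sketched as follows. I'm assuming the 3-percent cutoff is measured as the difference between halves relative to total time, and the half-splits below are made-up examples, not the actual race data:

```python
def split_type(first_half_s, second_half_s, cutoff=0.03):
    """Classify a race as a 'positive' or 'negative' split, returning
    None when the halves differ by more than `cutoff` of total time
    (such runners are dropped, as in the 3-percent filter above)."""
    total = first_half_s + second_half_s
    diff = second_half_s - first_half_s
    if abs(diff) / total > cutoff:
        return None                   # likely a pacing error; exclude
    if diff > 0:
        return "positive"
    return "negative" if diff < 0 else "even"

# Hypothetical half-splits in seconds (not the actual race data):
runners = [(1085, 1106), (1110, 1098), (1020, 1130)]
print([split_type(a, b) for a, b in runners])
# ['positive', 'negative', None]
```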
Interestingly, though, this race has a harder first half than second half: the first half is a gradual uphill. So even a runner who recorded precisely even splits was actually exerting more effort in the first half than in the second.
This suggests to me that starting out slow in a 10K is not a good idea. An even pace, or even slightly positive splits (though no more than 5 seconds or so per mile) will probably generate the best results.
Reardon, J. (2013). Optimal pacing for running 400- and 800-m track races. American Journal of Physics, 81(6), 428. DOI: 10.1119/1.4803068