Note: I got a bit carried away, and the length ran up on me. If you don't want the technical details, read the summary at the end, but the justification's in the center.
Some college guys did a test on a Dodge pickup a few years ago related to tailgates and covers. Here's what they found (complete with photos of the air flow):
--Snip--
Their results run counter to what others claim: they found a decrease in drag with the tailgate down but an increase with it off.
Being a newly graduated aerospace engineer myself, I can vouch for the theory behind their experiments. The best configuration for reducing Cd would be an open tailgate with a cap that started at the roofline and sloped down to meet the tailgate at the very end. The more flat vertical area at the rear of the vehicle, the more drag (wind resistance).
In practice, real-world results tend to differ a bit from experimental modeling, and using scale models always introduces some error. Their figures for drag coefficient, for example, are with the mirrors removed. I'd be interested to see a full-size truck placed in one of the big wind tunnels in Tullahoma to see how close their figures are.
But I digress. At any rate, I can say with certainty that reducing your speed is the best way to gain mileage. To a point. I'm an airplane guy, so the calculations are a bit different, but if you make a plot of drag versus velocity, it climbs steeply with increasing speed. The well-known drag equation (I'll spare you the specifics) shows that drag increases with the square of velocity. Basically, if you go twice as fast, you get four times the drag; three times as fast, nine times; four times as fast, sixteen times.
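If you want to see some actual numbers, here's a rough back-of-the-envelope sketch in Python (my own illustration, not anything from the study). The Cd and frontal area are placeholder guesses for a full-size pickup, not Chrysler's figures; the only point is how drag, and the power needed to overcome it, climbs with speed.

# Back-of-the-envelope: drag D = 0.5 * rho * v^2 * Cd * A.
# Cd and frontal area below are placeholder guesses for a pickup,
# NOT measured values -- the point is just the v-squared scaling.

RHO = 1.225    # air density at sea level, kg/m^3
CD = 0.50      # assumed drag coefficient (guess)
AREA = 3.0     # assumed frontal area, m^2 (guess)

def drag_force(v_mps):
    """Aerodynamic drag in newtons at speed v (m/s)."""
    return 0.5 * RHO * v_mps ** 2 * CD * AREA

def drag_power_hp(v_mps):
    """Power needed just to overcome aero drag, in horsepower."""
    return drag_force(v_mps) * v_mps / 745.7   # power = force * velocity

for mph in (30, 60, 90):
    v = mph * 0.44704                          # mph -> m/s
    print(f"{mph:2d} mph: drag {drag_force(v):5.0f} N, aero power {drag_power_hp(v):5.1f} hp")

Note that the power needed to push through the drag actually goes with the cube of speed, since power is drag force times velocity, which is why the mileage hit at highway speeds is even worse than the drag numbers alone suggest.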
There are further considerations for drag produced as a consequence of lift (induced drag) that apply more to airplanes than to autos, but they do factor in. This message is already overlong, though, and I'll not bore you with more.
Basically, the amount of drag is what determines how much horsepower the engine needs to produce to sustain a given speed, and the BHP determines the fuel consumption (that's where the BSFC comes into play).
Given the charts for BSFC vs. RPM, we can determine the RPMs where the engine runs most efficiently. BSFC is in units of lbm/(hp*hr): how many pounds of diesel per hour are required for each horsepower produced. That's the sticky part. Yeah, you might have the best engine efficiency at a certain RPM, but even if some other RPM gives you a 10% lower BSFC, if running there means the truck needs 50% more power, you're actually less efficient overall--the fuel flow rate will increase. If we also had a chart for drag force vs. speed, we could relate the two and get a direct correlation of vehicle speed to fuel consumption.
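To make that concrete, here's a quick sketch with made-up numbers (not from any real BSFC chart) showing how a 10% better BSFC can still lose to a 50% jump in required power:

# All numbers here are invented for illustration -- not from any real chart.

def fuel_flow_lb_per_hr(bsfc, power_hp):
    """Fuel flow (lb/hr) = BSFC (lbm/hp*hr) * power required (hp)."""
    return bsfc * power_hp

case_a = fuel_flow_lb_per_hr(bsfc=0.40, power_hp=40)   # cruise RPM, modest power
case_b = fuel_flow_lb_per_hr(bsfc=0.36, power_hp=60)   # 10% better BSFC, 50% more power

print(f"Case A: {case_a:.1f} lb/hr")    # 16.0 lb/hr
print(f"Case B: {case_b:.1f} lb/hr")    # 21.6 lb/hr -- more fuel despite better BSFC

Despite the better BSFC figure, Case B burns more fuel per hour--that's the trap.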
Personally, I'd love to throw something like this together (yeah, some geeks drive big trucks, too), but I don't think Chrysler would give me their aerodynamic data. And booking an hour in a vehicle-size incompressible flow wind tunnel makes a week or ten on the dyno look cheap.
-------
Okay, this is way long now. In summary:
BSFC relates how much fuel is burned *per horsepower* at a given RPM.
Drag increases with the *square* of velocity.
Without knowing the drag/velocity performance of the vehicle, it is impossible to get any real benefit from the BSFC figures.
In general, going slower will *almost always* give you better mileage.
Okay. I'm done. If anyone actually read this far and wants clarification on any of this, let me know. And I should offer a disclaimer:
No, I don't drive slow--it takes too long, so I get 15-16 mpg. And just because I've got an engineering degree doesn't necessarily mean I know what I'm doing. I've got the theory down, but the thought of doing something like changing my injectors still scares the hell out of me. You guys are helping with that, though--thanks.
--Ty