I’m going to state right up front that I don’t have all the data on this engine’s operating and cooling characteristics to back my opinion.
Having driven with and observed all kinds of drivers, from friends and family to so-called “professional” journalists, I’ll say without hesitation that America is the land of luggers. Most Americans simply won’t downshift a manual trans when they should. Must be something in the water.
What we do know as fact from physics:
Dropping a gear (or two) is the right thing to do - if for no other reason than the increased mechanical advantage that lower gear ratios provide to ease the load on the engine. The increase in fan speed might be a little bonus.
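The mechanical-advantage point can be sketched in a few lines. The gear ratios, final drive, and tire radius below are made-up illustrative numbers, not from any particular transmission:

```python
# Hypothetical ratios for illustration only -- not a real gearbox spec.
GEAR_RATIOS = {3: 1.32, 2: 1.89}   # assumed 3rd and 2nd gear ratios
FINAL_DRIVE = 4.43                  # assumed ring-and-pinion ratio
TIRE_RADIUS_FT = 1.0                # assumed loaded tire radius, feet

def engine_torque_needed(wheel_force_lbf, gear):
    """Engine torque (lb-ft) required to produce a given thrust at the tires."""
    overall_ratio = GEAR_RATIOS[gear] * FINAL_DRIVE
    return wheel_force_lbf * TIRE_RADius_FT / overall_ratio if False else \
           wheel_force_lbf * TIRE_RADIUS_FT / overall_ratio

# Say the climb demands 500 lbf of thrust at the contact patch:
for g in (3, 2):
    print(f"Gear {g}: {engine_torque_needed(500, g):.1f} lb-ft from the engine")
```

Same hill, same thrust required - but the lower gear asks noticeably less torque of the engine, which is exactly why downshifting eases the load.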
As far as the physics go, it’s going to take a certain amount of work (energy) to climb a given hill. Regardless of what gear it’s done in, the work expended is the same.
For the aerodynamics, it will take a given amount of power to maintain a given speed, and that power is proportional to the cube of speed. So let’s say you wanted to double speed from 30 mph to 60 mph: it will take 8 times as much power to maintain 60 mph as it did 30 mph.
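The cube law is a one-liner to check:

```python
# Aerodynamic power scales with the cube of speed: P ∝ v^3.
def aero_power_ratio(v1, v2):
    """Factor by which aero power grows going from speed v1 to v2."""
    return (v2 / v1) ** 3

print(aero_power_ratio(30, 60))  # doubling speed -> 8.0x the power
```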
So what is unknown is where the engine is operating at peak efficiency. If we had a dyno curve for a specific engine, we would know. But without one, it’s difficult to know exactly where the sweet spot is in the real world.
To do the work of the climb, we want peak torque. But to maintain high speeds (highway speed) we want max horsepower. I’m not sure exactly where the sweet spot is for any random hill, at any particular speed, with a given engine.
Then with respect to engine cooling, so much of that depends on the operational tune of the engine - as stated, timing advance is going to have an effect, as will AFR. Running lots of advance, with a lean mix, under high load is a death sentence for an engine.
Cooling is harder to pin down - air-cooled engine cooling is governed primarily by the temperature difference between the ambient air and the cylinder head temp. I recently posted a chart of the cooling coefficient of air vs. air velocity. The bottom line is that it isn’t linear: doubling the air velocity from the fan doesn’t double the cooling effectiveness. So at some point, increased fan speed gains less and less additional cooling. (Chart in link below)
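That diminishing return can be sketched with a common forced-convection correlation. The exponent here (n ≈ 0.8, typical for turbulent flow) is an assumption for illustration - the real exponent for a finned cylinder head would have to come from data like the chart:

```python
# Forced-convection cooling coefficient grows roughly as h ∝ v^n with n < 1.
# n = 0.8 is a common turbulent-flow assumption, NOT measured head data.
def cooling_gain(velocity_ratio, n=0.8):
    """Relative cooling coefficient after multiplying air velocity."""
    return velocity_ratio ** n

print(cooling_gain(2.0))  # doubling fan speed -> about 1.74x, not 2x
```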
http://www.914world.com/bbs2/index.php?sho...amp;pid=3097424

Here’s a part that is counterintuitive. The hotter the heads get, the more effectively they cool. But of course we can’t have heads at 600F or we run into other issues like dropping valve seats or stretching valves.
And ambient air temp - we can’t do much about that, other than to recognize that if the cooling is dominated by the difference between head temp and ambient air, the ambient temp isn’t the biggest contributor. Example: original baseline heads 350F, ambient air 90F, difference 260F.
Now assume cooler ambient air of 50F. Temp difference: 300F. So that cooler ambient air only increases cooling effectiveness by about 15% (300/260 ≈ 1.15). Not as big as you might have thought? We almost halved the ambient air temp but only got a 15% cooling increase.
Now let’s do the same thing with increased head temps. Heads 400F, ambient air 90F, difference 310F. That results in about a 19% cooling increase vs. the baseline with 350F heads (310/260 ≈ 1.19). But wait - we only increased the head temperature 14% (350 to 400F), yet we got a 19% increase in cooling.
See why air cooling is a bit weird? Actually cools better when head temps have increased.
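The arithmetic above is just Newton’s law of cooling - heat rejection proportional to the head-to-ambient temperature difference, all else equal - and it reproduces both percentages:

```python
# Newton's law of cooling: heat rejection ∝ (head temp - ambient temp),
# with all other factors (airflow, fin area, etc.) held equal.
def cooling_change_pct(head_f, ambient_f, base_head_f=350, base_ambient_f=90):
    """Percent change in delta-T versus the 350F-head / 90F-ambient baseline."""
    base_delta = base_head_f - base_ambient_f   # 260F baseline difference
    new_delta = head_f - ambient_f
    return 100 * (new_delta - base_delta) / base_delta

print(f"{cooling_change_pct(350, 50):.0f}%")   # cooler air:  ~15% more cooling
print(f"{cooling_change_pct(400, 90):.0f}%")   # hotter heads: ~19% more cooling
```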
So back to the question: when is it better to downshift to find the sweet spot for optimum cooling? Who knows, without some very complicated modeling or fully instrumented road testing.