Thermodynamics of Engine Cooling
Some time back, a few questions about thermostats and heating were posted to the EBR. They were:
A question about thermostats: what exactly does the rated temp mean? Do they begin to open at the rated temp, or is that the temp at which they are fully open?
I've tried two different 160 degree high-performance thermostats in a week, and the temperature gauge settles at about 180 for both of them. Is my gauge reading 20 degrees too hot?
These questions sparked off some discussion on the thermodynamics of engine cooling that you may find interesting.
TLG2@aol.com writes in response:
The easiest way to overheat an engine is to install a 160 thermostat. Your gauge is right; your radiator just can't cool to 160. To answer your question: the rated temp is the temperature at which the thermostat begins to open; fully open is usually about 15 degrees higher.
Choose your thermostat based upon the requirements of the engine. Most modern engines run better warm. High compression (>10:1) engines may need to be a little cooler, but they don't need the extra heat for better ignition because they have the extra pressure.

Think about the job of a thermostat: to stay closed until the engine reaches a set temperature, then open. What happens to the water in the radiator while the thermostat is closed? It sits in the radiator, cooling. If you put in a thermostat that is too cold, it will be wide open at all times; the water will run through the radiator with no time to cool, go back through the engine, and get hotter, until eventually it overheats. A properly operating thermostat will show a variable engine temperature; you can watch it open and close. My 192 warms to 192, then to 200, then it opens and cools down to 192, then it starts over. This shows that I have excess cooling available from my radiator. I doubt that the 160 will stay below 180 at anything but an idle, on a cool day.

Another advantage of a high temp thermostat: greater cooling capacity. The object of cooling the engine is to get rid of excess BTUs. The greater the difference between the hot and cold temperatures, the faster the heat transfer (don't all our Broncos cool better on a cold day than a hot one?). Since a 192 thermostat is 12 degrees hotter than a 180, I will lose as many BTUs on an 82 degree day as a Bronco with a 180 will on a 70 degree day, and as many as you will on a 50 degree day. Many people are afraid to run a 192, and I will admit the 205 degrees I see at wide open throttle is very close to the 210 where it starts to overheat the carb.
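Tom's equal-difference arithmetic is easy to check with a quick sketch; the day temperatures are the ones from his examples, and the only claim made is that the three coolant-to-air differences come out equal:

```python
# If heat rejection scales with the coolant-to-air temperature difference,
# these three setups should all cool at the same rate (Tom's claim).
scenarios = {
    "192 stat on an 82 degree day": 192 - 82,
    "180 stat on a 70 degree day": 180 - 70,
    "160 stat on a 50 degree day": 160 - 50,
}
for setup, delta in scenarios.items():
    print(f"{setup}: delta-T = {delta} degrees")

# All three differences are the same 110 degrees.
assert len(set(scenarios.values())) == 1
```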
Jamison Gray (Jamison.Gray@Eng.Sun.COM) replies:
Umm...I'm not sure this makes sense to me. Your point about choosing the proper thermostat for the best operating temperature of the engine makes sense, but I don't see how a *lower* temperature thermostat can cause overheating. A circulating coolant doesn't need to stop and start to do its job -- the coolant doesn't need extra time to cool in the radiator if it's continually flowing. The faster and more continually a coolant circulates, the more heat it can transfer from the hot side to the cold side.
To demonstrate this, let's use the fact you noted below: that the rate of heat transfer between objects increases with the difference in their temperatures. Let's say I have a tub of hot water, and the only way I can cool it down is by filling a tin cup with it, placing it on the breezy 80 degree windowsill, then dumping it back. Okay, I fill up the first steaming 180 degree cup and set it out. At first, it's shedding heat like crazy, let's say 0.25 BTU/minute. But then it starts to cool. By the time it gets down to 130 ten minutes later, the temperature difference is cut in half, so it's losing much less heat. If I let it reach ambient temperature, the heat loss in my system drops to zero before I dump it back in the tub and draw another cup.
But if I adopt a different strategy and return the cup to the tub every minute, then I'm continually putting a high-temperature cup up on the windowsill, and my heat loss will be at that maximum 0.25 BTU/minute. By the time it drops a degree or two and the heat loss rate starts to fall, I'm replacing it with another hot cup. Even if I only leave each cup out for ten seconds, it's still out there with that maximum temperature difference, dissipating heat at the highest possible rate every moment. (There's certainly a point of diminishing returns, where I'm working twice as hard switching cups to get only a tiny increase in cooling.)
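Jamison's cup experiment is easy to put numbers on. Here is a minimal sketch under Newton's law of cooling; the rate constant k and all the temperatures are made-up illustrative values, not measurements:

```python
import math

def heat_shed(cycle_min, hours=1.0, t_tub=180.0, t_air=80.0, k=0.05):
    """Heat shed (in cup-degrees) over `hours` if the cup is dumped back
    into the tub and refilled every `cycle_min` minutes.
    Newton's cooling: T(t) = t_air + (t_tub - t_air) * exp(-k * t)."""
    cycles = int(hours * 60 / cycle_min)
    # Heat lost per cycle is proportional to how far the cup's temp drops.
    drop = (t_tub - t_air) * (1 - math.exp(-k * cycle_min))
    return cycles * drop

print(heat_shed(10))   # lazy: swap the cup every 10 minutes
print(heat_shed(1))    # busy: swap it every minute
print(heat_shed(0.1))  # frantic: every 6 seconds
```

With these numbers, swapping every minute sheds roughly a quarter more heat per hour than swapping every ten minutes, while swapping every six seconds gains only a couple of percent over the one-minute swaps, which matches the diminishing-returns point.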
I don't see that a higher thermostat temp gives you greater cooling capacity, either. If the truck with the 160 thermostat ever did get up to 205 under some condition, it would be able to cool itself as quickly as the one with the 192; once they both got down to 192, the former would just keep going instead of stopping like the latter.
And Tom again:
Let's define a properly operating cooling system. I think it is one that cools faster than the engine heats; a measure of that would be that the water in the radiator is cooler than the water in the engine. If the water circulates at full speed, the water in the radiator is the same temp as the water in the engine, and not only does it not get cooled by the air, it also does a poor job of removing heat from the engine.
What you have to do is establish a cooling rate that is faster than your heating rate. In your example, you must remember that the sink is heating the water all the time; let's assume it's heating at X BTU/min. The cup on the shelf is losing heat at Y*(thermostat temp - air temp) BTU/min. If you put 192 degree water on the shelf on an 80 degree day, you lose Y(112) BTU/min; if you put 160 degree water on the shelf, you lose Y(80) BTU/min. This is fine as long as X is less than Y(80). But what usually happens is that X is greater than Y(80), so the stat goes wide open (why have one if it's always going to be open?), the coolant goes through the radiator in less than a minute (a guess I'm making), so the amount of heat lost is less than Y(80), and the temperature just keeps climbing until the radiator gets so hot that it can lose enough heat to stabilize. However, if X is less than Y(112), then the coolant stays in the rad for at least the full minute, we lose the whole Y(112) (which is greater than X), and the coolant we send back is even cooler than what we started with.

(Quoting Jamison above:) "I don't see that a higher thermostat temp gives you greater cooling capacity, either. If the truck with the 160 thermostat ever did get up to 205 under some condition, it would be able to cool itself as quickly as the one with the 192; once they both got down to 192, the former would just keep going instead of stopping like the latter."
Here is where you are right: once both engines are hotter than the wide-open temperature of the thermostat, the cooling is equal. But as soon as the 192 gets below 192, the thermostat will close, the water will stay in the radiator longer and cool more, and the water in the radiator will be cooler than the water in the engine sooner than if the thermostat were a 160 and wide open.
Ten years ago I would have agreed with you. I put a 160 in my Jeep, which would overheat at idle on very hot days, >100 (I started with a 180 thermostat), and the problem got worse: the thermostat opens up, the water is too cool to lose much heat and moving too fast to sit long enough to lose it, and so it just gets hotter. I put a 192 in it and it never got over 200 (the wide-open position). Another benefit was that it had MORE POWER: the 8:1 compression engine ran better at 192 than at 160. It wasn't noticeable going from 180 to 192, but from 160 to 192 there was a difference. I hope I explained it to your satisfaction. It is a difficult concept to grasp, but the two things (low temperature difference and high flow rate) combine to give poor cooling; your example didn't account for both.
X  = BTUs produced by the engine per minute
Y  = cooling capacity of the radiator (BTU per minute per degree of difference)
Z  = time spent in the radiator (min)
Tt = temp of water from the radiator (thermostat temp until hot)
Ta = air temp (80)

If X > Y(Tt - Ta)Z, the engine gets hotter.
If X = Y(Tt - Ta)Z, the engine stays the same temp (thermostat temp).
If X < Y(Tt - Ta)Z, the engine cools.
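Tom's balance condition can be written as a small function. The numbers in the example are invented placeholders (a hypothetical engine making 100 BTU/min, with Y and Z set to 1); the point is only the comparison between the two thermostat temps:

```python
def engine_trend(X, Y, Z, Tt, Ta=80.0):
    """Tom's criterion: compare engine heat input X (BTU/min)
    against radiator rejection Y * (Tt - Ta) * Z."""
    rejected = Y * (Tt - Ta) * Z
    if X > rejected:
        return "getting hotter"
    if X < rejected:
        return "cooling"
    return "steady at thermostat temp"

# Hypothetical engine making 100 BTU/min, with Y = 1 and Z = 1:
print(engine_trend(100, 1.0, 1.0, Tt=192))  # rejection 112 -> cooling
print(engine_trend(100, 1.0, 1.0, Tt=160))  # rejection 80 -> getting hotter
```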
The above approach is the correct one to take.
The radiator and engine can be looked at as two coupled thermal systems, each with its own transfer rate. Water flow can be considered constant in velocity, switched on and off as the thermostat cycles.
Three possible conditions can exist: thermostat always closed, thermostat always open, or cycling.
Cycling is the condition of interest, and the RMS temperature of the motor will stabilize at some point depending upon the thermostat temp, engine thermal input, radiator thermal output, and water flow rate.
Since this is (at least) a second-order system, it is not unconditionally stable, and adjustment of any of the above parameters could easily send a stable system into instability (overheating). It is quite possible for a lower temp thermostat to actually drive the system to overheating.
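The cycling condition can be sketched as a lumped two-compartment model: an engine mass and a radiator mass exchanging coolant when the thermostat is open, with the radiator shedding heat to the air. Every rate constant below is invented for illustration, and with these particular values both thermostats happen to hold their setpoints; whether a real system stays stable depends on the actual parameters, which is exactly the point above.

```python
def simulate(t_set, minutes=120.0, dt=0.01):
    """Engine temp under a thermostat that opens at t_set and closes
    5 degrees below it. Returns the peak engine temperature seen."""
    Ta = 80.0          # ambient air
    Te = Tr = Ta       # engine and radiator both start cold
    q_in = 8.0         # engine heat input (degrees/min equivalent)
    k_flow = 4.0       # engine<->radiator exchange rate when stat is open
    k_rad = 0.5        # radiator-to-air rejection rate
    stat_open = False
    peak = Te
    for _ in range(int(minutes / dt)):
        # Thermostat with a 5-degree hysteresis band.
        if Te >= t_set:
            stat_open = True
        elif Te <= t_set - 5.0:
            stat_open = False
        flow = k_flow if stat_open else 0.0
        dTe = q_in - flow * (Te - Tr)            # engine heats, sheds to rad
        dTr = flow * (Te - Tr) - k_rad * (Tr - Ta)  # rad gains, sheds to air
        Te += dTe * dt
        Tr += dTr * dt
        peak = max(peak, Te)
    return peak

print(simulate(192.0))  # peaks just over 192, then cycles in the band
print(simulate(160.0))  # peaks just over 160, then cycles in the band
```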
The math is really not *that* difficult, but requires some thought.