The amount of heat energy needed to warm a tank of water by x degrees is fixed by the quantity of water heated and the temperature rise x. If you know the tank size and the temperature rise, it is easily calculated.
1 cubic centimeter of water heated by one degree C requires 1 calorie of energy. To get from calories to watt-seconds (joules), multiply by 4.2. Then divide by 3,600 to get watt-hours, and by 1,000 again for kilowatt-hours.
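As a quick sketch of that conversion chain, here is the arithmetic for an assumed example tank of 100 litres heated by 50 degrees C (the tank size and rise are illustrative, not from the post):

```python
# Calorie -> joule -> kWh conversion described above.
# Assumed example figures: a 100-litre tank, 50 degree C rise.
litres = 100
rise_c = 50

calories = litres * 1000 * rise_c   # 1 cal warms 1 cm^3 (1 ml) by 1 degree C
joules = calories * 4.2             # 1 cal is about 4.2 J (watt-seconds)
watt_hours = joules / 3600          # 3,600 seconds in an hour
kwh = watt_hours / 1000             # 1,000 Wh in a kWh

print(round(kwh, 2))  # about 5.83 kWh
```

So a full reheat of a tank that size from cold costs roughly six kWh, before any standing losses.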
The longer the tank stays hot, the more heat you lose.
So, as a pure physics question, it must take more energy (gas or electricity) to keep a tank hot continuously than to heat it up only when you want to use the hot water.
Of course, in practice, the rate of heat loss depends on how well the tank is lagged, and that determines how much you will save by not keeping the water hot all the time.
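To put a rough number on the argument above: the energy that goes into the water you actually draw off is the same either way, so the difference between the two regimes is just the standing loss during the extra hours the tank is kept hot. The loss rate and hours below are assumed figures for illustration only, not measurements:

```python
# Hedged illustration of continuous vs on-demand heating.
# Assumed figures (not from the post): a lagged tank losing
# 0.1 kWh per hour while hot.
loss_per_hour_kwh = 0.1     # assumed standing loss while hot
hours_hot_continuous = 24   # tank kept hot all day
hours_hot_on_demand = 2     # tank hot only around use

# The drawn-off hot water costs the same in both regimes;
# the penalty is the standing loss over the extra hot hours.
extra_kwh = loss_per_hour_kwh * (hours_hot_continuous - hours_hot_on_demand)

print(round(extra_kwh, 1))  # 2.2 kWh/day extra, on these assumptions
```

A better-lagged tank shrinks `loss_per_hour_kwh` and with it the saving, which is exactly the point about lagging.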
P.S. I once had a galvanized steel cold tank in the loft, and my hot copper cylinder failed three times due to steel-copper electrolysis. I changed it, then lost one more going porous, and none since. The existing one has so far easily outlasted the previous three.
Last edited by: busbee on Sun 23 May 10 at 18:28