A small economy.

Yes, surely having the lights on more would have a greater effect than the type of bulb one uses.

Surely?

A filament bulb uses about 4x the energy of an equivalent CFL or LED. So you could switch from filament to CFL/LED, leave the lights on four times as long, and see near-zero effect on the bill.
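The arithmetic behind that claim can be sketched in a few lines; the specific wattages and hours below are illustrative assumptions, not figures from the post:

```python
# Sketch of the 4x argument: a filament bulb vs. an LED drawing ~1/4 the
# power. Wattages and hours are illustrative assumptions.
FILAMENT_W = 60          # typical filament bulb, assumed
LED_W = 15               # equivalent LED at ~1/4 the draw, assumed

hours = 100                                  # filament on-time per month
filament_kwh = FILAMENT_W * hours / 1000     # 6.0 kWh
led_kwh = LED_W * (hours * 4) / 1000         # 4x the on-time, still 6.0 kWh

print(filament_kwh == led_kwh)   # True: same energy, near-zero bill change
```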

That's a LOT more 'lights on' time, for no effect. How do you come to your conclusion?


I stand by my original statement: a person is unlikely to be able to parse out the bulb-type savings amid all the variability in a month-to-month bill. Better to make a good estimate of usage and calculate it on a spreadsheet.

-ERD50
 
In the process of probing the power supply of my TV while fixing it, I found that its entire internal video circuit runs all the time, even when the TV appears to be off.

Modern electronics always draw some phantom power, basically to keep at least the remote-control function alive so they can respond to a "Power On" command. But because this TV takes a minute to boot up its digital processor, the designers keep the whole thing active.

I had not thought to measure the power draw before, so I was shocked to find that this thing draws 60W when off. Turned on, it draws 210W. The 60W standby draw works out to about $5/month in electricity cost. It is not a lot of money, but it is still a waste. I am going to unplug it when not in use and suffer the 1-minute boot time.
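The $5/month figure checks out at a typical residential rate; here is a quick sketch, where the ~$0.115/kWh rate is an assumption implied by the post's numbers:

```python
# Standby-cost arithmetic from the post: 60 W drawn around the clock.
STANDBY_W = 60
RATE = 0.115             # $/kWh, assumed residential rate

kwh_per_month = STANDBY_W * 24 * 30 / 1000   # 43.2 kWh/month
cost = kwh_per_month * RATE
print(f"${cost:.2f}/month")                  # ≈ $5/month
```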

This 10-year-old TV was fairly advanced in its time and cost $4000 then. A modern replacement would cost less than $1000, but as we do not watch much TV, I find it difficult to get rid of something that still works.
 

The newer TVs are much better, though. I measured the OFF draw at the power strip with a 55" TV, DVD player, and receiver plugged in: ~1W for all combined. The 55" TV (a Vizio, purchased Dec 2012) by itself drew ~80 watts when ON.

-ERD50
 
Having worked to make set-top boxes Energy Star compliant, I can tell you that newer, Energy Star TVs are going to put every chip and circuit to "sleep" that they can. (The set-top box I was involved with was not a DVR, so we didn't get the easy savings of turning off the hard drive.) Depending on the TV, there may be more to it: a smart TV may need to listen to more than the remote. It may be listening to the internet or the upstream plant, which is another set of circuits that can't be turned off.
 

You should use a power strip with an on/off button to make that easy.

You have encouraged me to find someone to lend me one of those Kill A Watt meters to measure the draw.
 
I should mention that we recently installed a switch for the flat screen and associated equipment as well. I am sure that helped.

 

If you're serious about saving power a Kill-A-Watt meter is the best $20 investment you can make. A few years ago after getting one I realized that my old Sears freezer was drawing so much juice that if I bought a new one to replace it, it would pay for itself in 3 years or so.
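A payback calculation like that is simple to sketch; every number below (price, kWh figures, rate) is a made-up assumption for illustration, not a figure from the post:

```python
# Hypothetical replace-the-old-freezer payback sketch. All figures are
# assumptions for illustration only.
NEW_PRICE = 400          # $ for a new freezer, assumed
OLD_KWH_YR = 1300        # old unit's annual use (Kill-A-Watt reading), assumed
NEW_KWH_YR = 300         # new unit's annual use, assumed
RATE = 0.12              # $/kWh, assumed

savings_per_year = (OLD_KWH_YR - NEW_KWH_YR) * RATE   # $120/yr
payback_years = NEW_PRICE / savings_per_year
print(f"payback: {payback_years:.1f} years")          # ≈ 3.3 years
```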
 

I agree that a Kill-a-watt meter is a great investment.

Using it, I realized that my ancient fridge and freezer really were not too bad, and it was not worth replacing them. A new model might save me $30-$40/year. That would be a long payback, and I fear the new models may not be as reliable or long-lasting as these proven beasts (my 1988-model freezer uses ~$75 of electricity per year; in all those years I have only replaced a $20 thermostat and some little plastic vent caps).

I also learned that most of the stuff I was concerned about drawing 'phantom' power wasn't bad at all, and I no longer bother to shut it off. But I found a few old things that drew as much power 'off' as 'on', so those got shut off.

-ERD50
 
I have read on other forums that some libraries lend out Kill A Watt meters. We bought one, and combined with hourly electric-usage reports, we have really been able to slash our energy bills. I also bought a thermal leak detector.

One of our biggest surprises was the power drawn by Bose speakers when not actively being used.
 