I’ve made a discovery! We recently retired and moved full time into what was our vacation home, and my wife recently remarked, “Our electricity bill sure has gone up a lot since we moved in.” It had more than doubled. I didn’t think much of it at the time: “OK, we’re here full time now, so that makes sense.” It just kept bugging me, though. This place is half the size of our previous home, I had made a point of replacing all the light bulbs with LEDs, and all of our appliances are fairly new and energy efficient. In fact, we had just purchased a new washer and dryer set for that specific reason; the old set was 20 years old. What was using all the energy?
Now keep up with me here, this is actually very easy. There are only a few pieces of information you need to figure this out. First off, you need to know that an electric device rated at 1000 watts, run for one hour, uses 1 kilowatt-hour (kWh) of energy. So your 1000 watt microwave, operating for one hour, uses one kWh. **NOTE: The wattage is usually found on a label on the back of the device, near where the power cord enters it. If your device shows amps instead of watts, simply multiply amps times volts. In my case: 8 amps * 120 volts = 960 watts, which we’ll call 1000 watts for simplicity.
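If you’d rather let a script do the arithmetic, the two conversions above look like this in Python (a rough sketch using my microwave’s label numbers; your device’s label will differ):

```python
# Amps-to-watts conversion from the label on the back of the device.
amps = 8
volts = 120
watts = amps * volts  # watts = amps * volts
print(watts)          # 960

# A 1000 W device run for one hour uses 1 kilowatt-hour (kWh).
hours = 1
kwh = 1000 / 1000 * hours  # divide watts by 1000 to get kilowatts
print(kwh)                 # 1.0
```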
So, let’s say I run that microwave for one hour and my energy costs me 12 cents per kilowatt-hour (this rate can be found on your electric bill); that means I owe the electric company 12 cents for running that microwave for an hour. Simple, right? Let’s take that out further and say that microwave runs 24 hours a day for 30 days: $0.12 * 24 hr = $2.88 a day, and $2.88 * 30 days = $86.40 a month. Say what??? OK, obviously we don’t run the microwave 24/7, which raises the question: what do we run 24/7? All kinds of things: the chargers for your laptops and tablets, those little wall warts (power adapters) for all kinds of other gadgets, and in my case one very large and power-hungry desktop computer.
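The same monthly math, as a small Python sketch (the `monthly_cost` helper is just something I made up for illustration, using my $0.12/kWh rate):

```python
# Hypothetical helper: monthly cost of a device left on around the clock.
def monthly_cost(watts, hours_per_day=24, days=30, rate=0.12):
    kwh = watts / 1000 * hours_per_day * days  # energy used, in kWh
    return kwh * rate                          # dollars at rate $/kWh

print(f"${monthly_cost(1000):.2f}")  # $86.40 -- the 24/7 microwave
```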
I’m an IT guy, a Network Administrator by trade, and I’ve got a pretty robust desktop computer, but we’ll come back to that. My wife and I each have our own laptop, and they pretty much stay on their chargers 24/7. My wife’s laptop has a 90 watt charger while mine uses a 45 watt charger, so let’s figure out what they cost us to run:
90 watts * 24 hours = 2,160 watt-hours (2.16 kWh) per day; * 30 days = 64.8 kWh; * $0.12 = $7.78 monthly
45 watts * 24 hours = 1,080 watt-hours (1.08 kWh) per day; * 30 days = 32.4 kWh; * $0.12 = $3.89 monthly
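Those two charger figures can be double-checked with a quick loop (same $0.12/kWh rate as before):

```python
# Monthly cost of each laptop charger left plugged in 24/7.
for watts in (90, 45):
    kwh = watts * 24 * 30 / 1000   # kWh used in a 30-day month
    print(f"{watts} W charger: {kwh} kWh -> ${kwh * 0.12:.2f}")
```

That prints the 90 W charger at $7.78 and the 45 W charger at $3.89 a month.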
Now comes my surprise discovery: my desktop computer, not including the 25” monitor, runs at 600 watts (yes, it’s a pig) 24 hours a day. **Note: most desktop computers run at about 250-300 watts.
600 watts * 24 hours = 14,400 watt-hours (14.4 kWh) per day; * 30 days = 432 kWh; * $0.12 = $51.84 monthly. KA-CHING!!
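Here is the same formula applied to my desktop, alongside a more typical machine (275 W is just my assumed midpoint of that 250-300 watt range, not a measured figure):

```python
# Monthly cost of a desktop left running 24/7, at $0.12/kWh.
for watts in (600, 275):
    kwh = watts * 24 * 30 / 1000   # kWh used in a 30-day month
    print(f"{watts} W: {kwh} kWh -> ${kwh * 0.12:.2f}")
```

The 600 W machine comes out to $51.84 a month; the typical one, around $23.76.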
Whoa!! $51.84 a month!! Now granted, it sits and idles probably half the time, but when I am using it I push it pretty hard. Then add in the big monitor, a commercial HP color LaserJet printer, and a half dozen other peripherals, and I’m pretty sure I get close to that $50 a month. All that stuff now gets turned off when not in use.
I realize all of this may seem too technical to those who don’t deal with this stuff every day. If that’s the case, you can get what’s called a Kill-A-Watt meter, usually for about $25.00. It’s a pretty simple little device to use: you just unplug the device you want to test, plug in the Kill-A-Watt meter, then plug your device into the meter. The Kill-A-Watt has a digital readout that tells you the power usage in kilowatt-hours. I have one that I use on the job; guess I should start using it here at home…..