Posted on 12/28/2011 4:54:12 AM PST by GreenAccord
Thanks for the ping, LVD.
You’ve got the cooling efficiency inverted. It’s more like the cost to run the device, plus 1/3 again in increased air-conditioning costs. Maybe less if you have modern high-efficiency a/c.
If AC is 75% efficient, I’m impressed.
You need to understand the definition of EER, the energy efficiency ratio: how many BTUs of heat are removed per watt-hour of electricity consumed.
Your original statement implied that it took three units of electrical energy to remove one unit of heat (from the electrical energy used by the TV).
An air conditioning system sold today has a minimum SEER of 13. Let's assume a big plasma TV draws 1 kW, just to have a round number to work with. That electrical energy becomes 3413 BTU/h of heat that must be removed by the A/C. Doing the math,
3413 BTU/h ÷ 13 BTU/h-W ≈ 262 W, or only about a quarter of the TV's electrical draw to remove its heat from the space via the A/C. Thus the total cost per hour in summer to run that TV would be 1.262 kW × cost/kWh.
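The arithmetic above can be sketched in a few lines. This is just a check of the post's numbers: a SEER of 13 and a 1 kW TV are the assumed inputs, not measurements.

```python
BTU_PER_KWH = 3413  # 1 kWh of electricity dissipated as heat = 3413 BTU

def ac_overhead_watts(device_watts, seer):
    """Watts the A/C must draw to remove a device's waste heat.

    SEER is in BTU per watt-hour, i.e. BTU/h removed per watt of
    A/C input power, so dividing heat (BTU/h) by SEER gives watts.
    """
    heat_btu_per_hour = device_watts * BTU_PER_KWH / 1000
    return heat_btu_per_hour / seer

overhead = ac_overhead_watts(1000, 13)      # ~262.5 W of A/C power
total_kw = (1000 + overhead) / 1000         # ~1.26 kW billed per TV-hour
print(overhead, total_kw)
```

So the A/C penalty is roughly a 26% surcharge on the TV's own consumption, not three times it.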