Free Republic
Browse · Search
News/Activism
Topics · Post Article

To: Revel

The original ENERGY STAR® rating for TVs measured only standby power (the power the set draws when it's turned off), which accounts for a small fraction of the power TVs consume. The ENERGY STAR 3.0 rating, which took effect on November 1, 2008, also factored in power consumption while the TV is on, giving a much clearer picture of how much power a TV actually uses.

Version 4.1, which takes effect on May 1, 2010, sets a more stringent standard for on-mode power consumption and promises consumers a savings of 40 percent over models currently on the market. Version 5.1, which takes effect on May 1, 2012, promises a savings of 65 percent over sets that were on the market before the version 4.1 standards debuted.

So I am a year ahead of my time, hippie ;-)


120 posted on 03/11/2011 5:16:06 PM PST by MPJackal ("From my cold dead hands.")
[ Post Reply | Private Reply | To 110 | View Replies ]


To: MPJackal

That information is irrelevant to how much power TVs consume in running mode. Look at the wattage rating. My brother just bought a plasma TV; it draws up to 800 watts on a bright scene, while his 27-inch CRT uses 105 watts. Not hard to figure out what happened to his electric bill.
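
For a rough sense of the bill difference described above, here is a back-of-the-envelope sketch. The wattage figures come from the post; the viewing hours and electricity rate are assumed for illustration, not taken from the thread:

```python
# Rough monthly-cost comparison for the two sets mentioned above.
PLASMA_WATTS = 800   # worst-case draw on a bright scene (from the post)
CRT_WATTS = 105      # 27-inch CRT (from the post)
HOURS_PER_DAY = 5    # assumed viewing time
RATE_PER_KWH = 0.12  # assumed electricity rate, dollars per kWh

def monthly_cost(watts, hours_per_day=HOURS_PER_DAY, rate=RATE_PER_KWH, days=30):
    """Dollar cost of running a set for one month at the given draw."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * rate

extra = monthly_cost(PLASMA_WATTS) - monthly_cost(CRT_WATTS)
print(f"Plasma: ${monthly_cost(PLASMA_WATTS):.2f}/mo, "
      f"CRT: ${monthly_cost(CRT_WATTS):.2f}/mo, extra: ${extra:.2f}/mo")
```

Under these assumptions the plasma's worst-case draw works out to roughly an extra $12.50 a month; actual bills depend heavily on scene brightness and real viewing hours.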


123 posted on 03/11/2011 5:26:08 PM PST by Revel
[ Post Reply | Private Reply | To 120 | View Replies ]



FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson