The original ENERGY STAR® rating for TVs only measured standby power (the amount of power the set consumes when it’s turned off), which accounts for a small fraction of the power TVs consume. The ENERGY STAR 3.0 rating, which took effect on November 1, 2008, also factored in power consumption when the TV is on, providing a much clearer idea of how much power a TV uses.
Version 4.1, which takes effect on May 1, 2010, sets a more stringent standard for on-mode power consumption. It promises consumers a savings of 40 percent over models currently on the market. Version 5.1, which takes effect on May 1, 2012, promises a savings of 65 percent over sets that were on the market prior to the debut of the Version 4.1 standards.
So I am a year ahead of my time, hippie ;-)
That information is irrelevant to how much power TVs consume in running mode. Look at the watts rating. My brother just bought a plasma TV. It uses up to 800 watts on a bright scene. His 27-inch CRT uses 105 watts. Not hard to figure out what happened to his electric bill.
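The back-of-the-envelope math behind that comment can be sketched as follows. The viewing hours and electricity rate here are hypothetical assumptions, not figures from the comment, and the plasma's 800 watts is a peak on bright scenes, so its real average draw would be lower.

```python
# Rough estimate of the electric-bill impact described above.
# Assumed values (hypothetical, not from the comment): 5 hours of
# viewing per day and a rate of $0.12 per kWh.

def monthly_cost(watts, hours_per_day=5, rate_per_kwh=0.12, days=30):
    """Approximate monthly electricity cost for a device at a given draw."""
    kwh = watts / 1000 * hours_per_day * days  # energy used in a month
    return kwh * rate_per_kwh

plasma = monthly_cost(800)  # plasma at its 800 W peak (worst case)
crt = monthly_cost(105)     # 27-inch CRT at 105 W
print(f"Plasma: ${plasma:.2f}/month, CRT: ${crt:.2f}/month")
# Plasma: $14.40/month, CRT: $1.89/month
```

Even under these rough assumptions, the gap is large enough to show up on a bill, which is the commenter's point.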