Samsung, Vizio, others accused of gaming TV energy standards, publishing inaccurate results

Energy Star ratings are supposed to tell consumers how much power their devices use under normal operation. But they only work if the tests and ratings reflect real-world use. A new study of Ultra HD televisions has revealed that some of these devices use far more electricity than claimed if operated even slightly outside the preprogrammed conditions manufacturers use for testing, and that the end result can be substantially higher electrical bills over the lifetime of the device.

A new report by the Natural Resources Defense Council (NRDC) criticizes many high-end UHD televisions for the way they implement power-saving features.

In some cases, manufacturers may use motion-detection dimming (MDD) to falsely inflate the power savings they claim. This feature cuts backlight power when on-screen content changes rapidly, but it saves far more power on the test clips used by the Department of Energy than on typical content. That may reflect a weakness of the test itself, which switches from one type of content to another more rapidly than most broadcasts, or even sporting events, do.

The NRDC team created its own content tests to nail down how the Samsung and LG televisions performed on the official IEC test versus real-world content. It found that the IEC test contains many more jump cuts, with the average scene lasting just 2.29 seconds. The average scene in the real-world content tests was longer, at 3.89 seconds, and those tests produced dramatically different (and less flattering) results.
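
To make the scene-length comparison concrete, here's a minimal sketch of the arithmetic involved: average scene length is just total clip duration divided by the number of scenes the cuts create. The cut timestamps below are hypothetical, chosen only to produce averages near the NRDC's 2.29 s and 3.89 s figures; this is not the NRDC's actual tooling.

```python
# Illustrative only: hypothetical cut timestamps, not the NRDC's data.
# Average scene length = clip duration divided by the number of scenes.

def average_scene_length(cut_times_s, clip_end_s):
    """Mean duration of the scenes delimited by the given cut timestamps."""
    boundaries = [0.0] + sorted(cut_times_s) + [clip_end_s]
    durations = [b - a for a, b in zip(boundaries, boundaries[1:])]
    return sum(durations) / len(durations)

# A 60-second clip cut every ~2.3 s (IEC-like) vs. every ~3.9 s (real-world-like):
iec_like = [2.3 * i for i in range(1, 26)]         # 25 cuts -> 26 scenes
real_world_like = [3.9 * i for i in range(1, 15)]  # 14 cuts -> 15 scenes

print(f"IEC-like clip:        {average_scene_length(iec_like, 60.0):.2f} s/scene")
print(f"Real-world-like clip: {average_scene_length(real_world_like, 60.0):.2f} s/scene")
```

The shorter the average scene, the more often MDD gets to slash the backlight, which is why a cut-heavy test clip flatters the feature.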

Confusing standards, significant discrepancies

Not all of the problems here are a function of the test content used by the DoE. In some cases, shifting the display from its default mode into Vivid or Cinematic modes also disables automatic brightness control, or the aforementioned MDD.

Part of the problem is that different manufacturers use identical labels to mean different things. LG, for example, uses Auto Power Save to describe its power-saving feature; setting the TV to "Standard" completely disables all power-saving functions. On a Samsung TV, selecting "Standard" means all power-saving options are enabled, but only so long as you change nothing else. Some Samsung TVs apparently disable all power-saving options if you so much as tweak the brightness or contrast settings, even within the "Standard" profile, and they don't inform you that the power-saving features have been switched off.

Now, it’s fair to ask how much all of this matters and how great the discrepancy is between what a TV set scores on a test versus its real-life power consumption. The NRDC compared results on the SAM 7100 and SAM 9000 with Automatic Brightness Control (ABC) and motion-detection dimming (Samsung calls this Motion Lighting) enabled versus disabled, using the DoE’s own test setup.

The gaps here aren’t small, and keep in mind that these Samsung televisions turn the options off the instant you change Brightness or Contrast, even if you’ve selected the power-saving “Standard” setting. The difference in yearly operating cost is fairly small: at the average US rate of 12.6 cents per kilowatt-hour and an estimated five hours of operation per day, it works out to about $15 per year. At the same time, there’s no denying that the energy efficiency promised on the sticker isn’t being delivered to the consumer, at least not without a great many caveats and assumptions about operating conditions that may or may not be made clear, either by salespeople or within the device’s own documentation.
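
For readers who want to check that math, here's a minimal sketch of the calculation. The ~65 W consumption gap is an assumption inferred from the article's own $15-per-year figure, since the underlying measurement chart isn't reproduced here.

```python
# Sanity-checking the back-of-the-envelope math above. The ~65 W gap is
# inferred from the stated $15/year figure, not quoted from the study.

RATE_USD_PER_KWH = 0.126  # average US rate cited above
HOURS_PER_DAY = 5         # assumed daily viewing time

def annual_cost_delta(extra_watts):
    """Added yearly cost of drawing `extra_watts` more while the TV is on."""
    extra_kwh = extra_watts / 1000 * HOURS_PER_DAY * 365
    return extra_kwh * RATE_USD_PER_KWH

print(f"${annual_cost_delta(65):.2f} per year")  # ~$14.95, i.e. "about $15"
```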

While not many people choose their devices solely on the basis of their energy efficiency, many people do take it into account when choosing between products. If Samsung and LG both make a well-regarded UHD TV, and one is listed as more energy efficient than the other, the smart consumer will choose the device they calculate will save them money over the long term. Energy use, like vehicle mileage, is difficult to capture in a single test because how you use the device will necessarily affect the overall figure.

One takeaway from the NRDC’s results, however, is that the IEC test suite doesn’t seem to map well to real-world content, and that has knock-on effects for consumers who want to take actual energy efficiency into account. Vizio also gets dinged for the way it communicates certain features. It tells consumers to use one mode to save power, but then says “Calibrated” mode produces the best picture. While this may be factually true, it implies that using the device in efficiency mode necessitates a worse image, and it offers no way for customers to fine-tune the settings to their liking.

HDR content sends power consumption spiking

One added wrinkle to this debate is high dynamic range, or HDR. HDR has many benefits: it captures a much broader range of brightness than standard dynamic range, it increases color accuracy, and it can improve video games and offer greater detail without requiring more powerful video cards (or at least, not more powerful in the conventional sense of pushing higher frame rates).

Unfortunately, at least in early models, those benefits come with a whopping increase in power consumption, as captured below.

According to the NRDC, average TV power consumption during the movie rose from 106.9W to 149.4W when HDR was enabled. Samsung estimates that its Samsung 9000 (the UN55JS9000, to be precise) uses 142 kWh per year. This rises to an estimated 389 kWh if you assume identical usage but with HDR instead of SDR. That’s an extra $31 per year in operating cost, or roughly $155 over a five-year period. Of course, if you live in an area with higher power costs, those differences will be correspondingly larger. Devices with higher power consumption also emit more heat, which can require additional cooling, which also costs money.
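
Those cost figures follow directly from the numbers above; here's a short sketch reproducing the arithmetic using only figures cited in the article.

```python
# Reproducing the HDR cost arithmetic from figures cited in the article.

RATE_USD_PER_KWH = 0.126

sdr_kwh_per_year = 142  # Samsung's estimate for the UN55JS9000 in SDR
hdr_kwh_per_year = 389  # NRDC's estimate for identical usage with HDR

extra_kwh = hdr_kwh_per_year - sdr_kwh_per_year  # 247 kWh per year
extra_cost = extra_kwh * RATE_USD_PER_KWH        # ~$31.12 per year

print(f"Extra cost: ${extra_cost:.2f}/yr, ${extra_cost * 5:.2f} over five years")
```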

The bottom line is this: If we’re going to go to the trouble of creating energy tests and incentivizing companies to publish the results, those numbers need to be accurate. Based on these results, there are clear deficiencies in the current test materials. Manufacturers also need to clearly communicate how power-saving features work and make them easier to adjust; Samsung’s practice of disabling power-saving features if the end user touches the brightness or contrast setting is particularly ridiculous (the company has pledged to discontinue doing this in future displays).
