LG C2 OLED High power consumption in standby

I checked my 42-inch LG C2's power consumption using a smart plug with power monitoring and found that, despite LG's specs saying it should use less than half a watt in standby, it was still using 12 watts hours after I turned it off. The strange thing is that with Quick Start and Always On etc. enabled it uses only 8 watts.

After the initial power-on via the smart plug it always drops back to half a watt, but once you let it load its software by powering it on with the remote, it never drops back to half a watt again.

I checked against my older LG C8 OLED and it never uses more than 1 watt in standby, even with all the smart features enabled.

After six weeks still no response from LG customer care.


Hi @Ericc, welcome to the community.

Plug-in power-use monitors aren't overly accurate, especially when measuring low wattages.

This article is worth reading:

Choice has also reviewed some in the past:


Which is why I compared it with my other LG OLED, and with two different meters.
I also checked all 18 of my appliances that have a standby mode, and all complied pretty much with the manufacturers' specs except this LG TV.


I would consider that even though the display has turned off, the non-graphics part of the processor is still actively using power doing something. What that something could be, who knows. Could be some setting, or could be a loop that is not supposed to occur during true standby.


It depends on the device the meter is plugged into. Devices which are on standby or are energy efficient can affect the reading of a meter to make them inaccurate. The article I posted earlier tries to explain why.

Comparing readings from one (older) TV with another (newer) TV may not be comparing apples with apples.

At the low wattages you are trying to measure, errors can be more noticeable, as being out by even a watt or two has a significant impact on the reading.

The only way to truly test standby power accurately is to use more sophisticated equipment than a domestic-type usage meter. The purpose of the meter you have is to help manage electricity use and minimise it, not to accurately test an appliance's or device's power consumption. You are trying to use the meter for something it isn't really intended for, causing you to think there is a problem with the TV. The issue will most likely lie with the meter and how it measures usage, especially very low usage.

A device may also take some time to fully power down. If you are measuring immediately after it enters standby, this can also give false readings and cause overestimation of power use.


I couldn’t disagree more with what you are saying.

Meters are designed to very accurately measure things like voltages, and in the case of a power measurement, current.

Measuring milliamps should be a doddle for even cheap meters, and the difference between one watt and twelve watts is a twelve-fold factor.
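To put those milliamps in perspective, here's a quick back-of-the-envelope sketch. It assumes a nominal 230 V supply and a purely resistive load; the helper function is illustrative, not from any meter's firmware:

```python
# Rough currents involved at a nominal 230 V mains supply, assuming a
# purely resistive load (power factor of 1) for simplicity.
NOMINAL_VOLTS = 230.0

def current_ma(power_watts: float, volts: float = NOMINAL_VOLTS) -> float:
    """Return the RMS current in milliamps for a given real power draw."""
    return power_watts / volts * 1000.0

for watts in (0.5, 1.0, 12.0):
    # 0.5 W is roughly 2.2 mA; 12 W is roughly 52 mA
    print(f"{watts:>5.1f} W -> {current_ma(watts):6.1f} mA")
```

Even the 0.5 W spec figure corresponds to a couple of milliamps, which decent current-sensing hardware can resolve; the harder part, as discussed below, is the power factor.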

The meter in question measures usage in watts.

There are many factors which affect the accuracy of how this is done. The article/paper linked previously explains why this can be the case.

An example: most domestic-type meters don't have a power factor correction adjustment for each different device attached to the meter. They use a fixed correction which may or may not be applicable to the device.
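As a hypothetical illustration of that fixed-correction problem (the function names and numbers below are made up for the sketch, not taken from any real meter):

```python
# Illustrative sketch: a meter that applies one fixed power factor to every
# load will misreport real power when the load's true power factor differs.
def meter_reading_w(volts: float, amps: float, assumed_pf: float = 1.0) -> float:
    """What a simple meter might show: V x I scaled by its built-in pf."""
    return volts * amps * assumed_pf

def true_power_w(volts: float, amps: float, true_pf: float) -> float:
    """Actual real power, using the load's true power factor."""
    return volts * amps * true_pf

volts, amps = 230.0, 0.052                 # roughly a 12 VA standby load
print(meter_reading_w(volts, amps))        # meter shows about 12 W
print(true_power_w(volts, amps, 0.05))     # real draw is about 0.6 W
```

With a very poor standby power factor, the gap between what such a meter shows and what is actually drawn can be an order of magnitude.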

Most domestic-type usage meters are about presenting usage figures to allow a consumer to reduce usage. For example, if x is done, the meter shows lower usage; if y is done, the meter shows higher usage. Such meters were never intended to test the actual usage of everything they are attached to. Meters which can do that are very different, and are used by testing authorities or technical specialists. I used to work for an organisation which used such equipment to look for earth leakage, induced voltages, etc.


As @Ericc has clearly said, tests were done using two measuring devices on a number of different electrical devices, and the measurements confirmed what the specs said they should be. Clearly they know what they are doing, and clearly the meters are doing what they are designed to do: measuring current flow at the nominal AC voltage at the power point to calculate watts of usage.

Except this one TV in standby. Possibly this TV has a very strange power usage method that can cause incorrect measurement as in the Dutch article you posted. But I really doubt it.

The TV is chewing up power in standby that it should not be using. Why is it doing that?


And as I said in my original post, everything's fine until it loads its software.


It usually takes about 5 minutes to do its cleaning etc. at 20 to 30 watts, then drops to 12 watts. I've checked after days and it's still at 12 watts.


I have explained why measurements of two different TVs aren't comparing apples with apples.

It isn’t known the TV is chewing power. The only thing which is known is a meter shows different measurements for different TVs and TV settings. The values shown on the meter may not show accurate usage, especially at low wattages.

I will give you an example. We have a microwave oven. Using a digital usage meter, it showed a usage of 80W when it wasn't being used for heating (only the LED clock was on). I wanted to check there wasn't a problem with the microwave. One of the guys at work dropped by on the way home and tested the microwave oven with proper testing equipment, and the usage was around 13W (which was close to the value in the specifications). He explained that the usage meter didn't allow for power factor correction, hence the overestimation of the usage. Testing showed the microwave oven was fine and in good working order. BTW, we now turn the microwave off at the power point between uses to save that 13W. It is an older oven where efficiency was a lesser consideration.

The usage meter read okay for incandescent light bulbs, fridge and other higher wattage uses, but struggled measuring things with different power factors or low usages.

The domestic meter being used is no different.

It can be 'dangerous' to use such meters assuming they are accurate for every measurement. The meters aren't designed for such purposes; they are about measuring relative power use (it goes up or down by doing x or y). They aren't test equipment, and using them as such is outside their intended purpose.

I would not rely on a domestic usage meter for quantifying an appliance's power use. This is most likely why LG has not responded, as they also know this is the case.

It is a bit like using a sound level meter app on a smartphone and thinking it accurately measures noise levels (it does happen, as a mate who is an acoustical engineer has indicated that some think an app all of a sudden makes them a sound expert). It isn't accurate, because it isn't testing equipment; it is something which can be used to measure relative noise levels, a noise being louder or softer, if the phone's microphone can capture the frequencies of the noise. If the frequencies sit outside the specs of the microphone, even relative noise levels may not be possible and the app's readings are meaningless.

The software will most likely include energy management. Energy management can mess with meter readings as indicated in the paper linked above.


It's worth considering that the meter might be wrong.
On the other hand, there is no more evidence that the meter is wrong than that it is right. I partly agree with @phb regarding the uncertainty around what the meters are saying, but for slightly different reasons.

Some plug-adapter power meters (I have one) also measure VA and reactive power, i.e. power factor. Hence they correctly report a measurement in watts within their stated accuracy.

Could there be an anomaly in how the newer TV loads the mains in standby? One that causes a large discrepancy.

Technical content: reporting a 12W load at 230/240V when LG suggests 0.5W? One answer is a power factor of roughly 0.05 on standby, combined with several different meters not able to measure real (true) power. I'll leave it to others more knowledgeable to say whether this is normal and how it comes about, leading or lagging.

There remains the possibility of a minor fault with this individual TV's power supply. Is it truly drawing approx 12W, is there a component failure causing a large phase shift, or could there be, as has been suggested, a power-down software issue?

A better-equipped electrician or specialist TV repairer would have a suitable model of Fluke recording meter to determine the real power for any load condition, and any phase difference (apparent power). I doubt there is a definitive answer without better information.


I assumed small appliances like TVs would have a power factor of around 0.9, so the difference when measuring the power with the smart plug would be minimal. But if I understand correctly, the power factor in standby is so poor that the smart plug reads 12 watts, which begs the question: why is the power factor so poor in this TV, and which figure am I actually paying for?


Residential power meters measure real power. The uncertainty is whether your plug adapter is measuring real (V × I × pf) or apparent (V × I) power. Hopefully the latter, if the pf of your new TV really is that poor under low load.
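Putting the thread's numbers into that formula (a sketch only; the ~12 VA and 0.5 W figures come from the posts above, and the power factor they imply is simple arithmetic):

```python
# Real vs apparent power, using the figures discussed in this thread.
def apparent_power_va(volts: float, amps: float) -> float:
    """Apparent power: V x I, ignoring power factor."""
    return volts * amps

def real_power_w(volts: float, amps: float, pf: float) -> float:
    """Real power: V x I x power factor (what a residential meter bills)."""
    return volts * amps * pf

# If the plug meter is really showing apparent power (~12 VA) while the TV
# draws LG's claimed 0.5 W of real power, the implied standby power factor is:
implied_pf = 0.5 / 12.0
print(round(implied_pf, 3))   # about 0.04, close to the 0.05 mentioned above
```

If the plug adapter reports apparent rather than real power, the 12 W figure would not be what appears on the electricity bill.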


My guess is a software bug. Which would raise the question as to whether you have the latest software in the TV and/or whether it will automatically update itself.

I measured my own TV with an actual power meter (not professional standard but probably better than a plug / powerboard), starting with the TV fully and normally operating, and immediately, on going into standby, power dropped to 12W. That’s probably the effect of switching off the panel. Then the TV will execute its shut down procedure and within, say, 10 seconds power had dropped to 0.5W. (This latter figure is exactly as claimed in the manual but, as @phb says, I would be suspicious of the power meter’s ability to be accurate at the really low end. For example, even after switching the TV off using the on-off switch on the TV i.e. no longer on standby, the power meter still said 0.2W.)

Some things to think about:

  • what ports are connected to the TV permanently? how many HDMI ports are connected? how many USB ports? LAN port? (In my case, the answer is “all of the above” but that doesn’t prevent measured power in standby matching the documentation.)
  • if the TV has network capability, has it dropped off the network (no longer pingable) when it is consuming 12W?
  • are you using any timer features of the TV? (if it has any)
  • you might try to reset the TV to factory defaults (if that’s not too painful) and see whether the behaviour persists?

For the benefit of @phb :slight_smile: , I have tested my power meter with a wide range of appliances, including those where the documented power consumption should be, say, 12W. I am comfortable that there is no “6x” error, let alone a “24x” error.


Tried those in your sequence and power usage dropped to 10 watts with nothing plugged in.
One thing I forgot to mention is that if I cycled Always Ready on or off, the first time the TV went to standby it would drop to <1 watt, which suggested software to me.

The factory reset did the trick: it drops to <1 watt almost immediately. I will now get it back to where it was to see if it stays that way. I should have thought of it; it seems fairly logical now, but it's still crickets from LG Customer "support". Thanks.


'Always ready' mode suggests to me that the TV is running as usual, receiving and processing input from the aerial and/or network, just with the display and sound output turned off:
receiving software updates, extracting program data from broadcast streams to update the program guide, etc.

I wouldn’t call that standby at all.

Always Ready optionally turns off the screen or displays wallpaper. With it enabled and the screen off, it drew 8 watts. Fast Start is something else again, which I eventually found in a totally different part of the menu. With both turned off, it drew 12 watts in standby.

Software bug. For sure. Reset fixed it as you say.

Apparently I’m working for LG Customer “Support” now. :rofl:
