Does Lowering The Monitor Refresh Rate Save Battery Life?

6

2

Is this a myth or is this for real?

I've read about this and was wondering if anyone knows whether it's true, or has tried it.

Going from 60 Hz to 40 Hz

Rachel Nark

Posted 2013-05-23T14:04:53.623

Reputation: 649

You can test this empirically: charge your laptop overnight, clear caches, full reboot, unplug, do some specific thing for a few hours - e.g. watch a movie or play a repetitive game (puzzle, racing, etc), check battery level, then change the refresh rate and do the same thing again the next day (same movie or same game, same playstyle). – Foo Bar – 2015-05-12T20:26:11.847
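
If you want to automate the battery reading during such a test, here is a minimal logging sketch using the cross-platform psutil library (the filename, sampling interval, and duration are arbitrary choices, not anything prescribed):

```python
# battery_log.py - record the battery percentage over time so two test
# runs (e.g. 60 Hz vs 40 Hz with the same workload) can be compared.
# Requires: pip install psutil
import csv
import time

import psutil

LOG_FILE = "battery_log.csv"   # arbitrary output name
INTERVAL_S = 60                # sample once a minute
DURATION_S = 3 * 60 * 60       # log for three hours

start = time.time()
with open(LOG_FILE, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "percent", "plugged_in"])
    while time.time() - start < DURATION_S:
        batt = psutil.sensors_battery()
        if batt is None:
            raise RuntimeError("no battery found on this machine")
        writer.writerow([round(time.time() - start), batt.percent, batt.power_plugged])
        f.flush()
        time.sleep(INTERVAL_S)
```

Run it once at each refresh rate with the same workload, then compare the slope of the two percent-versus-time curves.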

There are plenty of reports around of it having a positive effect, but those also involved switching to 16-bit color, not just lowering the refresh rate. – nerdwaller – 2013-05-23T14:12:08.107

Based on intuition, I'd say that the amount of power the screen uses over any given time unit would remain the same, since lowering the refresh rate just means each frame is shown for a longer period. However, it could be that redrawing the screen less frequently results in lower power consumption. That would be my best educated guess, but I'm by no means an expert on the subject. – AcId – 2013-05-23T14:17:02.057

Answers

0

Yes, but it can also affect your eyesight. Graphics cards and monitors are becoming more energy-efficient anyway. Also note that the CPU consumes considerably more power, so don't expect the overall percentage of energy savings to be large. And simply carrying an extra battery spares you the risk of eye problems, I think. Good luck.

STTR

Posted 2013-05-23T14:04:53.623

Reputation: 6 180

Eye problems could be possible on a CRT monitor; however, on a laptop LCD, I can hardly see a difference going from 60 to 40 Hz. Good point, though, about relative battery consumption – Blaine – 2015-05-12T14:54:59.267

@Blaine The problems are individual. I kept hurting my eyes on different LCD monitors until I found a suitable one. Some people need a glass-surface LCD; some work better with pinhole glasses, some in the dark. Those who don't eat meat and are short of vitamin D see better in the dark. Some people can't sit at a monitor for long. Sweeping, categorical opinions often get in the way of finding what is comfortable for the individual. Thank you for your feedback. – STTR – 2015-05-12T18:05:04.320

9

LED: Refresh rate should have a minimal effect on energy efficiency. The LEDs only draw power when they are energized, and the overhead from the control circuitry should be more or less constant. (Note: this refers to 'true' LED displays, not LED-backlit LCD displays, which may or may not actually be commercially available at the time one is reading this.)

LCD: Refresh rate should have a minimal effect here, too. The backlight, which is on whenever the screen is active, is by far the biggest power draw. Since the liquid crystals only need to change orientation when the color or brightness changes (a function of what is being displayed, not of how quickly it's refreshed), I would expect any difference to be negligible.

CRT: Here I would expect reducing the refresh rate to have some appreciable benefit. The picture is constructed by firing electrons at phosphor spots on the front glass, so reducing the refresh rate reduces the number of electrons fired per second and thus the energy required to (a) generate the electrons and (b) slew the deflection coils that aim the electron beam. Of course, if the tube is tuned for optimal energy efficiency at a given refresh rate, its overall consumption might actually increase at lower rates, with the efficiency loss more than offsetting the savings from generating fewer electrons.
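
To put the LCD case in rough numbers, here is a back-of-envelope sketch. Every wattage and ratio below is an illustrative assumption for a small laptop panel, not a measurement:

```python
# Rough power-budget sketch for a laptop LCD panel.
# All numbers are illustrative assumptions, not measurements.
BACKLIGHT_W = 3.0      # assumed: backlight, on whenever the screen is active
PANEL_LOGIC_W = 0.5    # assumed: row/column drivers and timing controller
REFRESH_SHARE = 0.5    # assumed: fraction of panel logic that scales with refresh

def panel_power(refresh_hz, baseline_hz=60.0):
    """Estimate total panel power at a given refresh rate."""
    scaling = PANEL_LOGIC_W * REFRESH_SHARE * (refresh_hz / baseline_hz)
    fixed = BACKLIGHT_W + PANEL_LOGIC_W * (1 - REFRESH_SHARE)
    return fixed + scaling

p60, p40 = panel_power(60), panel_power(40)
print(f"60 Hz: {p60:.2f} W, 40 Hz: {p40:.2f} W, "
      f"saving: {100 * (1 - p40 / p60):.1f}%")
# With these assumptions the saving is only about 2-3% of panel power,
# because the backlight term dominates and does not scale with refresh.
```

The exact numbers don't matter; the point is that as long as the fixed backlight term dominates, the refresh-dependent term can only ever move the total by a few percent.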

Some related questions on energy efficiency can be found here and here; and one on monitors and eyestrain can be found here.

hBy2Py

Posted 2013-05-23T14:04:53.623

Reputation: 2 123

2

It is true in certain circumstances - specifically, those circumstances in which the display driver (GPU) is a significant part of the power consumption, and the display itself is not capable of self-refresh (which is still a new and non-widespread feature).

As other answers note, the power consumed by a display (possibly excluding CRTs) is largely unaffected by how often it refreshes, since the power consumption is dominated by producing a certain amount of light (and, as an unfortunate side-effect, heat) rather than the activity required to perform the refresh. This only considers the display itself though, and the display is only one part of the full pipeline that leads to an image being produced.

Looking at the bigger picture, a non-trivial amount of power is consumed by the GPU, and in fact even refreshing a static display can consume a meaningful amount of power - nowhere near the power needed to render a complex scene, to be sure, but still enough that it can be worth trying to save it. Panel self-refresh is a way to save this energy by allowing the GPU to skip frames which would be unchanged, and letting the display handle redrawing the static content where necessary. This is more likely to be a decent saving on a small and low-power display (such as on a phone) than on an enormous power-hungry monitor.
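
Conceptually, the saving works something like the sketch below. This is illustrative pseudocode of the idea, not any real driver's API; the hash is a stand-in for whatever change detection the hardware actually performs:

```python
import hashlib

def frame_digest(frame_bytes: bytes) -> str:
    """Cheap stand-in for the hardware's change detection."""
    return hashlib.sha256(frame_bytes).hexdigest()

def scanout_loop(frames, panel_supports_self_refresh=True):
    """Decide, per frame, whether the GPU must transmit it to the panel."""
    last_digest = None
    for frame in frames:
        digest = frame_digest(frame)
        if panel_supports_self_refresh and digest == last_digest:
            # Content unchanged: the GPU and the link can power down while
            # the panel keeps redrawing from its own local framebuffer copy.
            print("skip transmit (panel self-refresh)")
        else:
            print("transmit frame")
        last_digest = digest

# A static desktop: only the first frame needs the GPU's attention.
scanout_loop([b"desktop"] * 5)
```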

There is a quick introduction to panel self-refresh at http://www.anandtech.com/show/7208/understanding-panel-self-refresh; the rationale it gives covers the topic of this question.

Nye

Posted 2013-05-23T14:04:53.623

Reputation: 633

2

If your graphics card is drawing frames 40 times a second rather than 60, it renders about 33% fewer frames, so it is less busy and draws less power (though the savings won't be a full 33%, since not all of the card's power draw scales with frame rate). This is probably the source of any power savings.

If your monitor refresh rate is set to 40Hz and a game you are playing is set to wait for vsync, then this certainly applies. Otherwise your graphics card is just drawing frames as fast as it can.
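
As a toy illustration of why capping at the refresh rate saves work: the render loop sleeps until the next refresh tick instead of spinning. Real vsync is handled by the driver and swap chain, not by sleeping, and the 2 ms frame time here is made up:

```python
import time

def render_frame():
    """Placeholder for actual GPU work."""
    time.sleep(0.002)  # pretend a frame takes 2 ms to render

def run(seconds, refresh_hz=None):
    """Render for `seconds`; cap at refresh_hz if given (vsync-like)."""
    frame_time = 1.0 / refresh_hz if refresh_hz else 0.0
    frames = 0
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        start = time.monotonic()
        render_frame()
        frames += 1
        if frame_time:
            # Wait out the rest of the refresh interval; the GPU idles
            # here, which is where the power saving comes from.
            leftover = frame_time - (time.monotonic() - start)
            if leftover > 0:
                time.sleep(leftover)
    return frames

print("uncapped:", run(1))       # as many frames as the loop manages
print("40 Hz cap:", run(1, 40))  # ~40 frames, mostly idle in between
```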

If you have Aero/desktop composition enabled, Windows is using the graphics card to render the desktop. I would imagine that Windows draws its windows in step with the refresh rate, but I'm not sure; if it does, you'll save power there too. Turning off Aero/composition shifts rendering to the CPU and may actually increase power consumption.

As for the signal from whatever is generating video to the LCD: the technique used is Low-Voltage Differential Signaling (LVDS), which works by comparing the voltages of two complementary signals. You might want to ask on electronics.stackexchange.com to be sure, but I believe this means a roughly constant amount of power flows through the wire regardless of what is transmitted, since what represents the data is not the amount of power but the difference between the two signals. So the number of frames sent over the wire shouldn't affect the power used.
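
To illustrate that constant-drive idea with a toy model (the voltage levels are assumptions loosely in the range of typical LVDS figures, not taken from any spec):

```python
# Toy model of differential signaling: each bit becomes a pair of
# complementary voltages, so the total drive on the pair is the same
# for a 0 as for a 1. Levels are illustrative assumptions.
COMMON_MODE = 1.2   # volts (assumed)
SWING = 0.35        # volts (assumed)

def encode(bit: int):
    """Map a bit to a (V+, V-) pair around the common-mode voltage."""
    offset = SWING / 2 if bit else -SWING / 2
    return (COMMON_MODE + offset, COMMON_MODE - offset)

def decode(pair):
    """The receiver only cares about the sign of the difference."""
    v_plus, v_minus = pair
    return 1 if v_plus > v_minus else 0

bits = [1, 0, 1, 1, 0]
pairs = [encode(b) for b in bits]
assert [decode(p) for p in pairs] == bits
# The sum of each pair is 2 * COMMON_MODE for every bit, i.e. the drive
# is symmetric regardless of the data pattern being transmitted.
print(pairs)
```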

LawrenceC

Posted 2013-05-23T14:04:53.623

Reputation: 63 487

I might be wrong, but my logic was that it takes GPU cycles to draw a frame, so fewer frames means less to draw. Don't GPUs generate more heat and use more power under load than when idle? An idle GPU is a cool, low-power GPU, I thought. – LawrenceC – 2015-05-12T19:04:50.403

Aero only redraws when required, not at the refresh rate; see https://blogs.msdn.microsoft.com/greg_schechter/2006/06/09/how-underlying-wpf-concepts-and-technology-are-being-used-in-the-dwm/ – Ray – 2017-10-02T14:40:48.920

2

The best way to find out is to measure it yourself, using something like this:

[image]

Ben

Posted 2013-05-23T14:04:53.623

Reputation: 319