How much power do video cards use after Windows turns your display off?

Modern video cards seem to use 150-200 watts at idle. Does this mean that this is the minimum power the video card will ever draw while your computer is on? Clearly, if you are sitting at the Windows desktop and not doing much, this is the power you'll be drawing, and if you're playing a game you'll draw more. But what about when the computer is idle long enough to trigger Windows' "Turn off the display" event? Will the video card use negligible power during that time, or still draw its idle power?

To be clear, I am not talking about the entire computer entering sleep or standby mode. I'm also not talking about simply pushing the power button on the monitor. I'm talking about the computer being on, visible on the network, possibly performing background or server tasks, but with the display off as a result of Windows power settings.
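For concreteness, the trigger I mean is the power plan's display timeout. On Windows 7 it can be set from a command prompt with powercfg; the 20 below (minutes, on AC power) is just an example value:

    powercfg -change -monitor-timeout-ac 20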

Greg

Posted 2011-03-24T14:08:55.550

Reputation: 659

Does Windows' "Turn off the display" event have anything to do with the video card? – sdadffdfd – 2011-03-24T14:23:15.910

I love Anandtech, but I feel obligated to point out that those are high-end gaming video cards you link there. I have a machine across the room with a great card that uses 12 watts at idle; it just won't play Crysis. – Shinrai – 2011-03-24T14:40:57.570

@victor23k - I don't believe it does. I think the answer is 'if the machine is awake the card is still pulling that much juice'. I'm not 100% on this though so I won't answer the question. – Shinrai – 2011-03-24T14:41:46.493

Those aren't GPU power usage numbers in the chart. It is the power usage for the whole system at the wall. To get to GPU power usage you need to take into account the loss of AC to DC conversion and then the power usage of the rest of the hardware. – Mr Alpha – 2011-03-24T14:51:54.447

@victor23k Yes, it does, because Windows achieves this by telling the video card drivers to kill the display output. And, as I've experienced, if you have buggy versions of nVidia drivers, you can run into situations where turning off the display does not work properly (or you can have it garbled when it comes back on). – Greg – 2011-03-24T15:32:42.723

@MrAlpha - You're right. But today's CPUs use 15W at idle, power supplies are 80% efficient, and whole systems can use 25-50W at idle, so the graphics card is far and away the largest portion of the power shown (there's a worked estimate after this comment thread). – Greg – 2011-03-24T15:35:57.233

@Shinrai - That's basically the reason I'm asking. I want to know if I can have the best of both worlds, or if getting a card that will play Crysis will use 10 times as much juice when I'm away from the computer (it's left on 24/7 to share printers, images, music, etc.). – Greg – 2011-03-24T15:40:51.380

@Greg True, the graphics card probably is the single biggest. But on the other hand that is hardly a low power platform they are testing it on. Although the SSD probably gets pretty low. But most of the platform is tuned towards performance and overclocking, not power savings. And in order to run those big multi-card tests they probably have a massive PSU in that build, which means the idle load of 160W falls far outside peak efficiency. – Mr Alpha – 2011-03-24T16:12:09.010

@Greg bigger does use more. The bigger the card, the more power it will consume at idle (although this may change in the future, with semiconductor feature sizes still shrinking). If you want to find the power consumption of your own card at idle, find some reviews of it, or get a multimeter and measure the current drawn on each rail. P = V*I; just add up the power taken by each positive line (see the sketch below). – Breakthrough – 2011-03-25T11:01:21.810
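To put numbers on the two comment threads above, here is a minimal Python sketch: the first half estimates GPU idle draw from a wall reading using the figures quoted in the comments (the rest-of-system draw is an assumption), and the second half sums per-rail multimeter readings with P = V*I (the rail readings are made-up example values):

    # Estimate GPU idle draw from a whole-system wall reading.
    wall_idle_w = 160        # system idle draw at the wall (AC), from the chart
    psu_efficiency = 0.80    # assumed PSU efficiency at this load
    rest_of_system_w = 40    # CPU, board, drives, fans at idle (assumed)

    dc_power_w = wall_idle_w * psu_efficiency
    gpu_estimate_w = dc_power_w - rest_of_system_w
    print(f"Estimated GPU idle draw: {gpu_estimate_w:.0f} W")   # ~88 W

    # Direct measurement: sum P = V * I over the card's positive rails.
    # The (volts, amps) pairs below are hypothetical multimeter readings.
    rails = [
        (12.0, 4.1),   # PCIe 6-pin connector, 12 V
        (12.0, 2.3),   # PCIe slot, 12 V
        (3.3, 1.2),    # PCIe slot, 3.3 V
    ]
    measured_w = sum(v * i for v, i in rails)
    print(f"Measured card draw: {measured_w:.1f} W")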

Answers

Many modern cards use very little power when idle. For example, my Sapphire HD 5770 consumes 18 W at idle and 108 W at its maximum. I can't speak for any specific card, but in general newer cards consume less power in the idle state. Even when Windows turns off the display, it cannot turn off the GPU or the video RAM, so there will be little to no difference between the idle and display-off states.

There is one exception: Hybrid SLI (onboard and discrete graphics running in SLI mode). In that configuration, Windows can completely power down the discrete graphics card, such as a 9800 GX2. You find this kind of setup mostly in notebooks, where the discrete GPU can be switched off to save battery.
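If you want to check this on your own hardware and you have an NVIDIA card whose driver exposes power readings, recent drivers let you query the draw directly (older cards, including the 9800 GX2, may not report power at all):

    nvidia-smi --query-gpu=power.draw --format=csv

Run it once at the desktop and once after the display has timed out, and compare the two readings.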

kaykay

Posted 2011-03-24T14:08:55.550

Reputation: 881

That very much depends on the graphics card in your system.

I can only say this with certainty for ATI Radeon graphics cards.

Some graphics cards do have a 'low power' mode where they reduce their power consumption considerably when they activate DPMS (Display Power Management Signaling).

From the Linux Radeon manual page:

Option "DynamicPM" "boolean"
      Enable dynamic power mode switching.  This can help reduce  heat
      and  increase  battery  life  by  reducing  power usage when the
      system is idle (DPMS active). The default is off.
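If your card uses this driver, the option goes in the Device section of xorg.conf; a minimal example (the Identifier is whatever your existing configuration uses):

    Section "Device"
        Identifier "Radeon"
        Driver     "radeon"
        Option     "DynamicPM" "on"
    EndSection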

I don't believe anyone has published power-consumption tests of this low-power state; all the tests I have found were run with the cards at maximum load. It would be an interesting project to test various cards in the running, idle, and DPMS states.
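As a starting point, here is a minimal sketch of such a test, assuming nothing more than a wall watt-meter you read by eye; the state names and sample count are just illustrative:

    # Average hand-entered wall watt-meter readings for three GPU states.
    states = ["load", "idle", "dpms"]   # full load, desktop idle, display off
    results = {}

    for state in states:
        print(f"Put the machine in the '{state}' state, then enter readings.")
        readings = [float(input(f"  reading {n + 1} (W): ")) for n in range(3)]
        results[state] = sum(readings) / len(readings)

    for state, watts in results.items():
        print(f"{state:>5}: {watts:6.1f} W")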

Majenko

Posted 2011-03-24T14:08:55.550

Reputation: 29 007

The power consumption of the graphics card depends on the application being run. Switching the monitor off will not affect the card's power consumption. For example, if I run a game and then switch off the monitor, the graphics card will still be doing the rendering work for the game even while the monitor is off. The only thing that determines the graphics card's load is the application being used.

Van Helsing

Posted 2011-03-24T14:08:55.550

Reputation: 21