Modern video cards seem to use 150-200 watts at idle. Does this mean that this is the minimum power the video card will ever draw while the computer is on? It's clear that if you are sitting at the Windows desktop and not doing much, this is the power you'll be drawing, and if you're playing a game, you'll draw more. But what about when the computer is idle long enough to trigger Windows' "Turn off the display" event? Will the video card use negligible power during this time, or still draw its idle power?
To be clear, I am not talking about the entire computer entering sleep or standby mode. I'm also not talking about simply pushing the power button on the monitor. I'm talking about the computer being on, visible on the network, possibly performing background or server tasks, but with the display off as a result of Windows power settings.
Does Windows' "Turn off the display" setting have anything to do with the video card? – sdadffdfd – 2011-03-24T14:23:15.910
I love Anandtech, but I feel obligated to point out that those are high-end gaming video cards you link there. I have a machine across the room with a great card that uses 12 watts at idle; it just won't play Crysis. – Shinrai – 2011-03-24T14:40:57.570
@victor23k - I don't believe it does. I think the answer is 'if the machine is awake the card is still pulling that much juice'. I'm not 100% on this though so I won't answer the question. – Shinrai – 2011-03-24T14:41:46.493
Those aren't GPU power usage numbers in the chart. It is the power usage for the whole system measured at the wall. To get to GPU power usage you need to take into account the loss from AC-to-DC conversion and then subtract the power usage of the rest of the hardware. – Mr Alpha – 2011-03-24T14:51:54.447
@victor23k Yes, it does, because Windows achieves this by telling the video card drivers to kill the display output. And, as I've experienced, if you have buggy versions of nVidia drivers, you can run into situations where turning off the display does not work properly (or you can have it garbled when it comes back on). – Greg – 2011-03-24T15:32:42.723
@MrAlpha - You're right. But, today's CPUs use 15W at idle, power supplies are 80% efficient, and whole systems can use 25-50W at idle, so the graphics card is far and away the largest portion of the power shown. – Greg – 2011-03-24T15:35:57.233
@Shinrai - That's basically the reason I'm asking. I want to know if I can have the best of both worlds, or if getting a card that will play Crysis will use 10 times as much juice when I'm away from the computer (it's left on 24/7 to share printers, images, music, etc.). – Greg – 2011-03-24T15:40:51.380
@Greg True, the graphics card probably is the single biggest consumer. But on the other hand that is hardly a low-power platform they are testing it on (though the SSD probably keeps its own draw pretty low). Most of that platform is tuned toward performance and overclocking, not power savings. And in order to run those big multi-card tests they probably have a massive PSU in that build, which means an idle load of 160W falls far outside its peak efficiency. – Mr Alpha – 2011-03-24T16:12:09.010
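The back-of-the-envelope arithmetic in the two comments above can be made explicit. This sketch uses the figures mentioned in the thread (160W at the wall, ~80% PSU efficiency, ~15W CPU idle); the 35W allowance for the rest of the system is an assumed illustrative value, not something measured in the thread.

```python
# Estimate the GPU's share of an idle wall-power reading.
# All figures are the example numbers from the discussion; the
# "rest of system" allowance is a made-up illustrative value.
wall_power_w = 160.0      # whole-system idle draw measured at the wall
psu_efficiency = 0.80     # assumed PSU efficiency at this (light) load
cpu_idle_w = 15.0         # assumed CPU idle draw
rest_of_system_w = 35.0   # assumed drives, RAM, motherboard, fans

# Power actually delivered on the DC side, after AC-to-DC conversion loss
dc_power_w = wall_power_w * psu_efficiency

# Whatever is left over is attributed to the graphics card
gpu_idle_w = dc_power_w - cpu_idle_w - rest_of_system_w

print(f"Estimated GPU idle draw: {gpu_idle_w:.0f} W")
```

Note that, as Mr Alpha points out, an oversized PSU may be well below 80% efficiency at a 160W load, which would shrink the DC-side figure and thus the estimated GPU share.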
@Greg A bigger card does use more. The bigger the card you have, the more power it will consume at idle (although this may change in the future as semiconductor process sizes keep shrinking). If you want to find the power consumption of your own card at idle, find some reviews done on it, or get a multimeter and measure the current being drawn on each rail. P = V*I, so just add up the power delivered by each positive line. – Breakthrough – 2011-03-25T11:01:21.810
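The per-rail calculation in the last comment is just P = V*I summed over the supply rails. A minimal sketch, with made-up example multimeter readings (the voltages are the standard PC rail voltages; the currents are hypothetical):

```python
# Sum per-rail power, P = V * I, as described in the comment.
# Voltages are the standard PC supply rails; the current readings
# are invented example values standing in for multimeter measurements.
rails = [
    (12.0, 2.5),   # (volts, amps) measured on the 12 V line(s)
    (5.0, 1.2),    # 5 V line
    (3.3, 0.8),    # 3.3 V line
]

total_w = sum(volts * amps for volts, amps in rails)
print(f"Total measured draw: {total_w:.2f} W")
```

With these example readings the rails contribute 30W + 6W + 2.64W, for a total of 38.64W.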