How can I determine and set my primary graphics card?

I have a Lenovo W520 laptop with two graphics cards:

Device Manager showing "Intel(R) HD Graphics Family" and "NVIDIA Quadro 1000M"

I think Windows 7 (64-bit) is using my Intel graphics card, which I believe is the integrated one, because I have a low graphics rating in the Windows Experience Index. Also, the Intel card has 750 MB of RAM while the NVIDIA card has 2 GB.
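For what it's worth, you can also ask Windows directly what it sees for each adapter. The sketch below is my own addition (not part of the original question): it assumes Python is installed and simply shells out to the built-in wmic tool, so it only lists the adapters and what they report; it does not show which one is rendering a given window.

    # List the display adapters Windows knows about, with their reported
    # adapter RAM and current resolution. (CurrentHorizontalResolution is
    # typically blank for an adapter that is not currently driving a display.)
    import subprocess

    print(subprocess.check_output(
        ["wmic", "path", "Win32_VideoController",
         "get", "Name,AdapterRAM,CurrentHorizontalResolution"],
        universal_newlines=True,
    ))

Both adapters will normally appear here, since Device Manager already shows both of them as enabled.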

  1. How do I know for certain which card Windows 7 is really using?
  2. How do I change it?
  3. Since this is a laptop and the display is built in, how would changing the graphics card affect the built-in display?

Jonas Stawski

Posted 2011-09-01T02:15:55.193

Reputation: 910

I think it knows which card to switch to based on demand. 3D games should use the NVidia, and most everything else should use the much lower-power Intel built-in video. – geoffc – 2011-09-01T02:24:08.840

I think you're on to something there, geoffc. @jstawski, is there any Lenovo-brand software running in the system tray, particularly one that manages power or other advanced features? – Hand-E-Food – 2011-09-01T02:55:45.497

You can also disable Nvidia Optimus in the BIOS. :) – Johnny – 2012-08-21T21:12:29.800

Answers

geoffc is right. I found out from exploring the BIOS that my machine is using NVIDIA Optimus, a "new" technology for saving battery life. The general idea is that it lets the driver pick the right graphics card based on demand, e.g. a 3D game will use the NVIDIA card, while surfing the net in Chrome will use the integrated Intel card.
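As a quick sanity check of which adapter actually owns the desktop and the built-in panel, the sketch below (my addition, not part of the original answer, assuming Python on Windows) enumerates the display devices through the Win32 EnumDisplayDevices API:

    # Enumerate display adapters via the Win32 API and show which ones are
    # attached to the desktop and which one is marked as the primary device.
    import ctypes
    from ctypes import wintypes

    DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001
    DISPLAY_DEVICE_PRIMARY_DEVICE = 0x00000004

    class DISPLAY_DEVICE(ctypes.Structure):
        _fields_ = [
            ("cb", wintypes.DWORD),
            ("DeviceName", wintypes.WCHAR * 32),
            ("DeviceString", wintypes.WCHAR * 128),
            ("StateFlags", wintypes.DWORD),
            ("DeviceID", wintypes.WCHAR * 128),
            ("DeviceKey", wintypes.WCHAR * 128),
        ]

    i = 0
    while True:
        dd = DISPLAY_DEVICE()
        dd.cb = ctypes.sizeof(dd)
        if not ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dd), 0):
            break
        attached = bool(dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
        primary = bool(dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
        print("%-40s attached=%s primary=%s" % (dd.DeviceString, attached, primary))
        i += 1

On Optimus designs it is usually the integrated Intel adapter that stays attached to the built-in panel, with the NVIDIA GPU rendering on demand and handing frames back to it; that also answers question 3, since changing which GPU renders does not change which adapter drives the display.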

There are two ways to manually use a specific card:

  1. Set it at the BIOS level.
  2. Change it in the NVIDIA Control Panel:

    NVIDIA Control Panel

Jonas Stawski

Posted 2011-09-01T02:15:55.193

Reputation: 910

This is becoming a common way to pack greater graphics capabilities into laptops without sacrificing battery life. Running a full 3D accelerated graphics system in a laptop, even when it's just showing desktop stuff, uses significantly more power. By using "switchable" graphics, the more powerful device can be turned off and on as needed. – music2myear – 2011-09-01T18:08:58.647

That seems ill thought-out. Instead of putting two graphics adapters in a laptop, they should just make sure that vid mfgs design their chips to use as little power as needed. A high-performance graphics adapter should not be using more power if it is doing simple rendering. – Synetech – 2011-09-02T03:13:19.763

@Synetech inc. I agree! – Jonas Stawski – 2011-09-02T19:25:49.397

@Synetech: The discrete adapter has its own GDDR memory chips, etc., which draw just as much power in 2D mode as at full load, even though barely any of the memory is in use. Clock generators, which Intel HD Graphics shares with the CPU cores, are separate on a discrete GPU and consume more power. Idle power for a discrete GPU just can't be as low as an integrated one's, no matter how much you optimize. – Ben Voigt – 2012-05-06T17:48:59.553

@Ben, that may be true for Intel HD Graphics, but what about laptops that use other architectures like AMD? Do they also use dual adapters? I don’t know about these days, but I recall AMD specifically designing its mobile chipsets to be low-power in the past. – Synetech – 2012-05-06T20:25:58.250

@Synetech: Same thing there. Intel HD and AMD APU graphics idle with less power because they can take advantage of circuitry shared with the CPU for clocks and memory access. AMD and NVidia discrete GPUs, even the "mobile" editions that are optimized for power consumption, will always need more power, if only a little. – Ben Voigt – 2012-05-06T21:56:49.633