
Trying to hook up a Dell PowerEdge T620 to two 27" monitors via a PNY Nvidia NVS 310.

The server is running Windows Server 2008 R2 Standard SP1.

Nothing fancy, but I need a very large desktop area (2 x 2560 x 1440).

Fitted the card into the 'SLOT2 PCIE_G3_X16(CPU1)' slot. (The server has two CPUs, and each CPU has its own set of PCIe slots; not sure if this is relevant, but I haven't installed a graphics card in a multi-CPU system before.)

Installed the drivers "331.65-quadro-tesla-grid-winserv2008-2008r2-2012-64bit-international-whql", as recommended on the Nvidia drivers page, with the default options.

The graphics card now appears to be installed and detected by Windows: it shows up as "NVIDIA NVS 310" in Device Manager alongside the on-board "Matrox G200eR" adapter, with no errors/issues reported.
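
For reference, both adapters and their driver status can also be confirmed from a script. This is just a sketch using the standard Win32_VideoController WMI class via the stock wmic tool (present on Server 2008 R2); the Python wrapper is purely illustrative:

```python
# Sketch: list every video controller Windows knows about, via the
# standard WMI class Win32_VideoController and the stock wmic tool.
import subprocess

result = subprocess.run(
    ["wmic", "path", "Win32_VideoController",
     "get", "Name,DriverVersion,Status"],
    capture_output=True, text=True, check=True,
)
# Expect two entries: the on-board Matrox G200eR and the NVIDIA
# NVS 310, both with Status "OK" if Device Manager shows no errors.
print(result.stdout)
```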

The card is connected to the monitors using DVI cables.

Now... the problem is:

  • The 'Screen Resolution' screen cannot detect any monitors on the Nvidia card; it only lists the monitor connected to the on-board/Matrox VGA port.
  • The advanced settings only show settings for the Matrox display.
  • Right-clicking on the desktop and opening 'nView Desktop Manager' does nothing; no new processes start.

Disabling the Matrox display adapter in Device Manager has no effect on the Nvidia card.

The only relevant options in the BIOS setup appear to be enabling/disabling the server's PCIe slots and the embedded display. I could disable the embedded display from there, but I worry that I'd lose all display output.

ETA

Tried the drivers (clean install) that came with the card, version 310.90. No change.

There are two Nvidia control panel items: 'NVIDIA nView Desktop Manager', which doesn't do anything, and 'NVIDIA Control Panel', which launches nvcplui.exe (Control Panel Application) and reports 'NVIDIA Display Settings are not available. You are currently not using a display attached to an NVIDIA GPU.'

There are several Nvidia processes running in the background (a quick way to confirm them is sketched after this list):

  • 1 x nvvsvc (Driver helper service)
  • 2 x nvwmi64 (WMI Provider Core)
  • 2 x nvxdsync (User Experience Driver Component)
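
(A rough way to confirm those counts, assuming the process names above; tasklist is the stock Windows tool and the Python wrapper is just a sketch:)

```python
# Sketch: count the Nvidia helper processes listed above by
# filtering the output of the stock "tasklist" tool.
import subprocess

output = subprocess.run(
    ["tasklist"], capture_output=True, text=True, check=True,
).stdout

for name in ("nvvsvc", "nvwmi64", "nvxdsync"):
    count = sum(1 for line in output.splitlines()
                if line.lower().startswith(name))
    print(f"{name}: {count} instance(s)")
```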

Tried swapping the card over to another PCIe slot - SLOT7 PCIE_G3_X16(CPU2) - sadly no change.

ETA (again)

Tried using DisplayPort cables (rather than the DisplayPort-to-DVI adapter cables). No difference.

MJF

1 Answer


Try disabling the on-board graphics adapter in the BIOS setup. If you are worried that you'll lose all video, document all the settings in there so you can safely reset the CMOS in case it doesn't work. Removing the Nvidia card should also revert output to the onboard adapter.
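
To verify after the reboot, something like this sketch (using the standard Win32_VideoController WMI class; the 2560 x 1440 values are just the resolutions from the question) should show only the Nvidia card driving the displays:

```python
# Sketch: after disabling the on-board adapter and rebooting, confirm
# which adapter is active and at what resolution, via the standard
# WMI class Win32_VideoController.
import subprocess

result = subprocess.run(
    ["wmic", "path", "Win32_VideoController",
     "get", "Name,CurrentHorizontalResolution,CurrentVerticalResolution"],
    capture_output=True, text=True, check=True,
)
# With the Matrox disabled, only the NVS 310 should appear,
# reporting 2560 x 1440 per attached display.
print(result.stdout)
```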

David W
  • Actually, a quick Google search turns up this: [link](http://en.community.dell.com/support-forums/servers/f/956/t/19502455.aspx) "With most Dell servers, they won't support using a 3rd party video card in the server. The 12th generation servers differ in the fact that you can use the 3rd party video cards in them, but in a GPGPU use. Not for the sake of video display. So they support 3rd party video cards, not for using for display, but as a graphic processor." – David W Nov 28 '13 at 14:52
  • That did the trick. I wasn't too hopeful after reading that Dell support forum thread, but I noticed that the option to disable the on-board graphics could only be set if the Nvidia card was fitted. Once disabled and rebooted, the server started using the Nvidia card for all display output. – MJF Dec 02 '13 at 14:04