3

Setting up a development machine, I was wondering about the way I should connect the displays. The machine has two monitors and two graphics cards (2x GeForce 9800 GTX+), each with 2 DVI ports. What I have been wondering is whether it is better to connect both monitors to a single card or one monitor to each card. Is one configuration definitively better than the other, and if not, what would be the benefits & drawbacks of each?

voretaq7
rmoore
  • You don't say what OS you're using. This is more of a Superuser (coming soon) question, anyway. – Dennis Williamson Jun 28 '09 at 00:01
  • Superuser would be a great fit if it were up, but I felt this was close enough to go here for now; I've got tons of other questions I'm holding on to until then, though. – rmoore Jun 28 '09 at 04:12

6 Answers

5

You don't say which OS or whether the two cards are in the same type of slot.

On Windows, a large virtual desktop spanning multiple monitors can be set up no matter which way you connect them in your circumstances. I don't have any multi-monitor experience on Linux, and I've forgotten what I knew about OS X.

Regardless of the OS, I would think that some operations that span the two monitors would have better performance if they stay on one card and don't involve the bus or the OS. So I would recommend that you use one card until you get more monitors.

However, performance depends on the particular combination of factors in a specific configuration. The OS, drivers, slot type(s), bus speed, etc. all interact. The only way to tell for sure is to run a well-designed benchmark. I would find something that does a lot of OpenGL and displays the frame rate, run it so that it spans the two monitors (if it will let you), and try it in each potential configuration.
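
For example, on a Linux box this could be as simple as the rough sketch below (mesa-utils and the apt-get command are Debian/Ubuntu specifics; on Windows any windowed 3D benchmark or game with an FPS counter does the same job):

    # Install a trivial OpenGL test app that prints its frame rate every few seconds
    sudo apt-get install mesa-utils

    # Run it, drag/resize the window so it straddles the monitor boundary,
    # then compare the reported FPS between the one-card and two-card hookups
    glxgears -info

Anything comparable that reports a frame rate while spanning the seam will tell you the same thing.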

Dennis Williamson
  • +1 for the testing suggestion – Unkwntech Jun 28 '09 at 01:09
  • The specifics would be Windows 7, same monitors, graphics cards, slots, etc., but I'm definitely going to test out the benchmarking idea, since you're right that there probably isn't a one-size-fits-all solution. The point about spanning across two monitors on different cards is a very good one that I wasn't thinking of, and it's probably going to be the determining factor in my case. – rmoore Jun 28 '09 at 04:06
3

I can't be sure on this, but one monitor to each card would be the logical answer, distributing the load across both cards. I should think it makes minimal performance difference, but that's the way I'd do it.

Adam Gibbins
0

One of the guys in the office has three monitors on his desk: two out of one card, one out of the other. Multi-monitor handling has more to do with your graphics driver and OS support. If you're creating one big virtual desktop split down the middle, I believe that has to be done on a single card. If instead you're putting two different desktops virtually right next to each other, either way will work. Both approaches will give you that great dual-head feel; one will open new centered windows in the middle of one monitor, while the other will leave new centered windows bisected by your monitor split. I know which way I'd go.

For simple desktop work without heavy 3D action, I don't think it matters much. If you put both monitors on a single card, you'll have to set up CrossFire/SLI in order to use both cards, so one on each may be a simpler config.

sysadmin1138
  • You won't need SLI/Crossfire for multiple monitors on one card. I have 2 on a single card here with no second card. – Unkwntech Jun 28 '09 at 01:08
0

Multiple monitors are pretty standard for a NOC environment, and my experience setting them up and configuring them under Linux varies wildly. Some of the new support under Ubuntu Linux can be quite fantastic; it makes installing the proprietary nVidia drivers very simple indeed. Invoke 'nvidia-settings' as root afterward and you'll find a very simple way to configure the right orientation for your monitors; it works perfectly for two monitors on similar cards.
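
Roughly, on a recent Ubuntu that boils down to something like the sketch below (the driver package name is only an example and changes from release to release; the graphical 'Hardware Drivers' tool does the same thing):

    # Install the proprietary driver (example package name; varies by Ubuntu release)
    sudo apt-get install nvidia-glx-180

    # Then run the vendor's configuration tool as root to arrange the monitors
    sudo nvidia-settings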

Going above 'dual head' at the moment is moderately complex under Linux, though, sadly, dead simple under Windows. You'll probably be looking at using TwinView on each 'Device' within xorg.conf and then tying them together with Xinerama. This may hurt performance.
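
For illustration only, the xorg.conf shape ends up something like the sketch below; the BusID values and layout are made up (get yours from lspci), and nvidia-xconfig or nvidia-settings can generate most of it for you:

    # One Device section per card (PCI IDs are placeholders - check lspci)
    Section "Device"
        Identifier "Card0"
        Driver     "nvidia"
        BusID      "PCI:1:0:0"
        Option     "TwinView" "true"       # both monitors on this card act as one screen
    EndSection

    Section "Device"
        Identifier "Card1"
        Driver     "nvidia"
        BusID      "PCI:2:0:0"
        Option     "TwinView" "true"
    EndSection

    Section "Screen"
        Identifier "Screen0"
        Device     "Card0"
    EndSection

    Section "Screen"
        Identifier "Screen1"
        Device     "Card1"
    EndSection

    Section "ServerLayout"
        Identifier "Layout0"
        Screen     0 "Screen0"
        Screen     1 "Screen1" RightOf "Screen0"
        Option     "Xinerama" "true"       # glue the separate screens into one desktop
    EndSection

Expect to hand-edit whatever the tools generate, and keep a backup of a working xorg.conf before experimenting.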

My best suggestion is what I put in my first paragraph, with lots of testing :-)

nixgeek
0

I would go with the single card to reduce noise and power consumption. I am sure a single 9800 can drive any practical resolution, at least for everyday computing and software development.

-3

I've got three myself. The optimal way is to have one monitor per video card. Just be careful about what your motherboard supports: some only support ATI cards, and others only support NVidia.

Steve French
  • I've NEVER in my >10 years in IT heard of a motherboard that only supports either ATI or nVidia; some may only support SLI or Crossfire, but that is not needed for this type of setup. – Unkwntech Jun 28 '09 at 01:07
  • Ah, I wasn't clear earlier. I could not get both NVidia cards to work on an ATI Crossfire board, just one. I switched to two ATI Crossfire-enabled cards and now both cards work perfectly. – Steve French Jun 29 '09 at 23:11