Can I use both my internal and external graphics card in same system?

5

My motherboard has integrated ATI Radeon HD 3000 graphics, and I also have an Nvidia GT 610 2 GB discrete graphics card. My system: Gigabyte GA-78LMT-S2 motherboard, AMD FX-4300 4 GHz AM3+ processor, 450 W PSU.

Can I use both of them at once, running apps and games simultaneously across the two graphics systems?

sarath

Posted 2014-04-04T05:14:52.367

Reputation: 163

Answers

2

In PCI and PCIe systems, it is certainly possible. The BIOS has a setting to choose which adapter is primary if both are present. It is subject to the usual issues of mixing graphics cards from different vendors: the drivers may be allergic to each other. At the very least, don't install the bundled "utilities" from both vendors, as they may fight over the same settings.

Before I switched to PCIe (in 2008), the built-in graphics was mutually exclusive with the AGP slot, but it was still possible to add PCI graphics cards, and I did so when serious non-AGP cards were becoming hard to find.

The lingering memory of not having both is from AGP.

The flip side is that if you're not using the built-in graphics because nothing is plugged into that connector, you should turn it off in the BIOS to free up resources (typically the system RAM it claims). Many people are running both without knowing it.

JDługosz

Posted 2014-04-04T05:14:52.367

Reputation: 597

I'm pretty sure modern CPUs will disable the on-chip GPU if nothing is connected at boot anyway, but it might have been a special case. At least I've had that experience with a ZBox: I wouldn't get any HDMI output unless the screen was present when turning that thing on. – Mario – 2015-03-05T08:05:40.663

Yeah, it might have an "Auto" option as well as Enable/Disable, so people don't lose the RAM it claims. Detection is not reliable with older connection types, but it certainly is with HDMI: my HTPC has fits when I turn the TV or receiver on or off. With DVI it used to not care and happily kept windows on the screen I couldn't see. – JDługosz – 2015-03-05T08:42:39.683

Setting up a new motherboard today, I looked at the options: The onboard graphics in integrated peripheral configuration had CPU IGFX Enabled, Disabled, and Auto. – JDługosz – 2015-03-11T11:16:11.423

Yeah, I most likely experienced the auto setting – Mario – 2015-03-11T11:41:28.633

2

A few notes, as I ran into a similar setup recently (two displays connected to two different GPUs, one of them integrated).

On my MSI motherboard, there is indeed a BIOS setting (well, UEFI, but you know what I mean) in the graphics-adapter section that lets you choose whether the integrated GPU should be used at all (it mentions something about multi-display setups). This way you can drive one (main) display via the "external" card and the other via the integrated one.

Windows 10 doesn't seem to have any problem with this setup. Ubuntu 16.04, however, does. It fails to send a video signal to the secondary monitor connected to the integrated GPU, even though the display is in use (the desktop space for windows is there).
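One thing worth checking on the Ubuntu side is whether X.org actually sees both GPUs and has them linked as output providers. A minimal sketch, assuming X11 with the open-source drivers (the provider numbers below are examples; yours come from the listing):

```shell
# List the providers (GPUs) the X server knows about; with both GPUs
# active you should see two entries, e.g. the IGP and the discrete card.
xrandr --listproviders

# Tell X to let provider 1 act as an output sink for provider 0, so the
# desktop can extend onto displays attached to the other GPU.
# (Provider numbers vary per machine -- take them from the listing above.)
xrandr --setprovideroutputsource 1 0
```

If the second command helps, it suggests the missing signal was a provider-linking issue rather than a driver failure.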

The interesting thing is that when I forced a game (under Windows, with the new AMD "external" card) that used to run really poorly on the integrated GPU (back when I had no external card) to go fullscreen on the display connected to the integrated GPU, it ran smoothly, evidently still being rendered by the AMD card. I think this might point to a way to fix the Ubuntu problem mentioned above, but I haven't figured it out yet. I'll probably just use a DVI-to-D-sub adapter and connect the old display to the AMD card as well.
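For the record, the Linux analogue of that "render on the fast card, display elsewhere" behaviour is PRIME render offload with the open-source Mesa drivers. A minimal sketch, assuming `glxinfo` is installed (mesa-utils package); the game binary name is just a placeholder:

```shell
# Show which GPU renders by default, then which renders with offload on;
# each line names the active OpenGL renderer.
glxinfo | grep "OpenGL renderer"
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"

# Launch an application on the discrete GPU ("./game" is a placeholder).
DRI_PRIME=1 ./game
```

If the two `glxinfo` lines name different devices, offloading works and any program can opt in via the environment variable.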

edison23

Posted 2014-04-04T05:14:52.367

Reputation: 121

Another thing: although I couldn't confirm the exact conditions leading to this (probably a restart rather than waking from sleep, I think), I was able to get video on both displays (one on the AMD GPU's DVI, the other on the IGP's VGA), but on the primary display the mouse cursor flickered/disappeared (it looks similar to this: https://bugs.launchpad.net/ubuntu/+source/xorg/+bug/1278223 )

– edison23 – 2017-02-18T13:16:29.657